This tutorial demonstrates how to upload a Pandas DataFrame to Google BigQuery using the pandas-gbq library (https://pypi.org/project/pandas-gbq).
The demonstration uses a Jupyter Notebook in Python and shows how to connect your notebook to the Google BigQuery API via your Google account.
Once you have uploaded your data to BigQuery, you can easily manipulate it using standard SQL commands and functions by writing SQL directly in the Google Cloud Console. This is shown in the last step of the video.
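As a taste of that last step, the same standard SQL you would type into the Cloud Console can also be run from Python through pandas-gbq. The sketch below is illustrative only: the project id, dataset, table, and column names are placeholders, not values from the video.

```python
# Illustrative sketch: querying a BigQuery table with standard SQL via
# pandas-gbq. All identifiers below (project, dataset, table, columns)
# are placeholder assumptions -- substitute your own.
def query_scores():
    import pandas_gbq  # pip install pandas-gbq

    sql = """
        SELECT name, AVG(score) AS avg_score
        FROM `my-gcp-project.my_dataset.my_table`
        GROUP BY name
        ORDER BY avg_score DESC
    """
    # read_gbq runs the query in BigQuery and returns the result
    # as a Pandas DataFrame; on first use it prompts you to
    # authenticate with your Google account.
    return pandas_gbq.read_gbq(sql, project_id="my-gcp-project")
```

Calling `query_scores()` requires a Google Cloud project with billing enabled and an authenticated account, so it is wrapped in a function rather than run at import time.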
BigQuery is, as Google puts it, a serverless, highly scalable, and cost-effective cloud data warehouse with an in-memory BI engine and built-in machine learning.
The video is divided into the following parts:
Introduction to BigQuery.
Slide #1. What is Google BigQuery? (0:05)
Slide #2. Google Trends Insights. (0:09)
Slide #3. Which Companies use Google BigQuery? (0:13)
Step #1. Install the pandas-gbq library in your Python environment. (0:17)
Step #2. Create and prepare a Google Cloud Platform project. (0:55)
Step #3. Prepare a Pandas DataFrame from a local CSV file. (1:36)
Step #4. Connect your Python environment to the Google BigQuery API. (2:43)
Step #5. Upload the Pandas DataFrame to Google BigQuery. (4:26)
Step #6. Check BigQuery for the newly uploaded data. (6:54)
Step #7. Run SQL queries on BigQuery with ease. (7:32)
Note: pandas-gbq can be installed into your Python environment via pip: pip install pandas-gbq
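The core of Steps #3 to #5 can be sketched in a few lines. This is a minimal sketch, not the exact code from the video: the CSV content, project id, dataset, and table names are placeholder assumptions, and in practice you would read your own file with pd.read_csv("your_file.csv").

```python
# Sketch of the workflow: load a CSV into a Pandas DataFrame (Step #3),
# then push it to BigQuery with pandas-gbq (Steps #4-#5).
import io

import pandas as pd

# An inline CSV stands in for a local file here, purely for illustration.
csv_data = io.StringIO("name,score\nalice,90\nbob,85\n")
df = pd.read_csv(csv_data)


def upload(frame: pd.DataFrame) -> None:
    import pandas_gbq  # pip install pandas-gbq

    # On first call, pandas-gbq opens a browser window so you can
    # authenticate with your Google account, then writes the DataFrame
    # to dataset.table in the given project. The names below are
    # placeholders -- substitute your own.
    pandas_gbq.to_gbq(
        frame,
        destination_table="my_dataset.my_table",
        project_id="my-gcp-project",
        if_exists="replace",  # overwrite the table if it already exists
    )
```

The upload itself is kept inside a function because it needs an authenticated Google Cloud project; everything up to that point runs locally.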
If I missed something, please share your insights in the comments. Any feedback is welcome.