You can create an Import-mode query with a Google BigQuery (GCP) connection in Power BI, but there are important limitations and considerations depending on the Power BI version and the connector used.
✅ Import Mode with BigQuery – Supported
Power BI supports Import mode from Google BigQuery using the built-in connector, as long as:
- You’re using Power BI Desktop
- You have credentials (usually Google account OAuth2 or service account)
- The dataset isn’t too large, since Import loads data into Power BI’s in-memory model
Steps to Use Import Mode with BigQuery
- Open Power BI Desktop
- Go to Home > Get Data > More…
- Choose Google BigQuery (under Database category)
- Authenticate (with Google account or service account key)
- Navigate through your GCP project > Dataset > Tables
- In the navigator:
- Select tables or views
- At the bottom, choose Load (this is Import mode) or Transform Data to go through Power Query
- Power BI will load the data into memory and store it in the .pbix file
Difference Between Import and DirectQuery
| Feature | Import | DirectQuery |
| --- | --- | --- |
| Performance | Fast for analysis (in-memory) | Slower – queries sent to BigQuery live |
| File Size | Limited by RAM (Power BI file grows) | Lightweight – no large data stored |
| Refresh | Needs scheduled refresh (via Gateway) | Live data, but limited transformations |
| Transformations | Full Power Query and DAX available | Limited – many transformations restricted |
⚠️ Considerations and Limitations
- Data size: BigQuery tables can be huge; import only what you need.
- Cost: Importing large data can incur BigQuery query costs (pay-per-query).
- Gateway: BigQuery is a cloud source, so scheduled refresh in the Power BI Service generally works without an on-premises data gateway; a gateway (or a VNet data gateway) is only needed when traffic must pass through a private network.
- Query folding: Power Query may or may not fold your transformations back to BigQuery; when folding breaks, the full data is pulled first and transformed locally, which hurts both performance and cost.
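Because on-demand BigQuery billing is based on bytes scanned, it helps to estimate the cost of a full-table import before clicking Load. A minimal Python sketch, assuming on-demand pricing of $6.25 per TiB scanned (verify against the current GCP pricing page; in practice `bytes_scanned` would come from a dry-run query or the table's metadata):

```python
# Rough cost estimate for an on-demand BigQuery query, based on bytes scanned.
# The $6.25/TiB rate is an assumption; check current GCP pricing before relying on it.
PRICE_PER_TIB_USD = 6.25
TIB = 2**40  # 1 TiB in bytes

def estimate_query_cost_usd(bytes_scanned: int) -> float:
    """Estimate on-demand query cost from the number of bytes scanned."""
    return bytes_scanned / TIB * PRICE_PER_TIB_USD

# Importing a table that scans ~500 GiB:
print(round(estimate_query_cost_usd(500 * 2**30), 2))  # → 3.05
```

A dry run (supported by the BigQuery API and `bq` CLI) reports bytes scanned without running the query, so you can feed that number into an estimate like this before importing.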
💡 Tip: Reduce Query Cost and Improve Performance
- Use Custom SQL or BigQuery views to pre-aggregate/filter data before loading
- Use incremental refresh (for large tables with date-based partitions)
- Keep imports small or use DirectQuery/Hybrid mode when live data is essential
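To illustrate the first tip, here is a sketch that builds a pre-aggregated query you could paste into the connector's native SQL statement box (or use to define a BigQuery view) instead of importing a raw fact table row by row. The table and column names (`my-project.sales.orders`, `order_date`, `amount`) are hypothetical:

```python
# Sketch: generate a monthly pre-aggregation query for BigQuery,
# so Power BI imports one row per month instead of the raw fact table.
# All table/column names here are illustrative placeholders.
def monthly_sales_sql(table: str, date_col: str, amount_col: str) -> str:
    return (
        f"SELECT DATE_TRUNC({date_col}, MONTH) AS month, "
        f"SUM({amount_col}) AS total_amount "
        f"FROM `{table}` "
        f"GROUP BY month "
        f"ORDER BY month"
    )

print(monthly_sales_sql("my-project.sales.orders", "order_date", "amount"))
```

Aggregating before import shrinks the Power BI in-memory model, and filtering on a partition column in the same query reduces the bytes BigQuery scans (and therefore the cost).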