How to Integrate Facebook Ads into BigQuery: Top 3 Methods


If you’re running Facebook advertising campaigns, accurately evaluating their performance is essential. Since Meta’s built-in tools can be limited, many turn to external solutions for deeper insights to better optimize campaigns and ad spend. That’s where Google BigQuery comes into play.

BigQuery is a powerful data warehouse that streamlines the storage and analysis of large datasets, making it an ideal choice for handling Meta Ads data.

The question is: how do you send Facebook Ads data to BigQuery—a task that can be especially challenging for those without coding experience? This blog has you covered, outlining three methods to integrate Facebook Ads with BigQuery: using automated tools, manual uploads, or custom scripts.

Let’s get started. 

Why send Facebook Ads data to BigQuery?

Integrating Facebook Ads with BigQuery helps you overcome the limitations of Meta’s native analytics while unlocking several key benefits:

  • Deeper insights and control: By sending your Facebook Ads data to BigQuery, you can run advanced queries to uncover trends, patterns, and better understand relationships between different metrics.
  • Unified data source: You can combine Facebook Ads data with other platforms such as Google Ads, GA4, CRMs, email marketing tools, and more, into one unified data warehouse. This helps you create a single source of truth to measure channel impact and overall business performance.
  • Time-series analysis: BigQuery enables tracking performance over time to identify long-term trends and seasonality.
  • Machine learning capabilities: BigQuery offers powerful ML features to predict outcomes, refine targeting, and optimize campaigns for better results.

Use cases of Facebook Ads to BigQuery integration 

Let’s explore a few practical ways of leveraging Meta Ads analytics with BigQuery.

  • Integrating Facebook Ads with BigQuery enables you to fine-tune your targeting strategies for better user response and engagement. You can upload audience data from Facebook into BigQuery and run queries to reveal patterns in demographics and user behavior.
  • Thanks to BigQuery ML’s clustering algorithms (such as k-means), you can create more granular and targeted ad campaigns based on age group, interests, location, or user interactions. By segmenting audiences and delivering relevant messaging to each group—and combining these insights with your business data—you can develop more effective and personalized marketing strategies.
  • You can use industry benchmarks or publicly available datasets to perform a competitive analysis. By comparing your Facebook attribution data with external performance metrics, you can identify how your ads stack up against the competition. These insights reveal performance or content gaps and highlight areas for strategic improvement.

Limitations of Facebook Ads Data Transfer

When sending Facebook Ads data to BigQuery through the native BigQuery Data Transfer service, it’s important to be aware of a few limitations that may affect your setup and scheduled data updates:

  • The shortest time allowed for repeating data transfers is 24 hours. So, you can’t set Facebook Ads transfers to happen more than once per day. 
  • BigQuery Data Transfer Service for Facebook Ads only supports a fixed set of predefined tables—such as AdAccounts, AdInsights, and AdInsightsActions—and does not support importing custom reports. Any additional tables or fields outside this supported list will not be ingested automatically. As a workaround, you’ll need to use API + custom script or third-party ETL tools like Windsor.ai.
  • Each transfer has a six-hour time limit. If a run takes longer, it fails, so keep your data volume and query complexity within that window.
  • The AdInsights and AdInsightsActions tables don’t support incremental updates. Every time you schedule a transfer for these tables, BigQuery pulls the full dataset for the selected date. This happens even if some of that data was already moved earlier. Again, using Windsor.ai helps you overcome this limit.
  • The refresh window for these tables is limited to one day. When you first run a transfer, it will only fetch data available within that one-day window. With Windsor.ai, you can schedule updates daily, hourly, or at custom intervals, giving you more flexibility and control over your data syncs.
  • Long-lived user access tokens required for Facebook Ads transfers expire after 60 days. Once expired, you must manually generate a new token by editing your existing transfer and following the same steps listed in the Facebook Ads setup instructions.
  • Lastly, if your virtual machine and network attachment are in different regions, data will move between them. This kind of setup can slow things down. It might also lead to extra charges depending on your provider.
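Because AdInsights pulls are full refreshes per date, a common pattern when building your own pipeline instead is to overwrite one date partition at a time using BigQuery’s `table$YYYYMMDD` partition decorator, so re-pulled days replace only their own partition. A minimal sketch (the helper names are illustrative, not part of any SDK):

```python
from datetime import date, timedelta

def partition_decorator(table_id: str, day: date) -> str:
    """Build a BigQuery partition decorator (table$YYYYMMDD) so a load job
    overwrites just one day's partition instead of the whole table."""
    return f"{table_id}${day.strftime('%Y%m%d')}"

def partitions_to_refresh(days_back: int, today: date) -> list:
    """List the recent dates to re-pull, e.g. to re-sync the Facebook
    attribution window after conversions are restated."""
    return [(today - timedelta(days=n)).strftime("%Y-%m-%d")
            for n in range(1, days_back + 1)]

# Example: refresh the last few days of a hypothetical ad_insights table
print(partition_decorator("ad_insights", date(2024, 5, 1)))
```

Loading each day into its own partition with a truncate disposition keeps the table free of duplicates even though the source always sends the full day again.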

Before you transfer Facebook Ads data to BigQuery: things to know

Types and formats of Facebook Ads data that BigQuery supports 

Before uploading your Facebook Ads data to BigQuery, make sure it’s in a supported format. For instance, if the API provides data in XML, you’ll need to convert it to a format that BigQuery can read.

For loading data, BigQuery accepts several file formats (including Avro, Parquet, and ORC), but for Facebook Ads exports the two you’ll most commonly use are:

  • CSV
  • JSON (newline-delimited)

In addition to format, it’s essential to ensure that the data types used in your dataset are compatible with BigQuery. Using unsupported or mismatched data types can lead to errors during the import process.

BigQuery’s most commonly used data types for ads data include:

  • STRING
  • INTEGER
  • FLOAT
  • BOOLEAN
  • DATE
  • RECORD (nested structures)
  • TIMESTAMP

If you’re unsure how to properly prepare your data, refer to the official BigQuery documentation. It provides step-by-step instructions to help you format and structure your data correctly, making the upload process smoother and error-free.
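As a concrete illustration of the format requirements above, here is a small sketch that converts raw Graph API records (which arrive as strings) into newline-delimited JSON with BigQuery-compatible types. The field names and casts are assumptions you’d adapt to your own report:

```python
import json

# Map raw Facebook API strings onto BigQuery-friendly Python types.
# These field names are illustrative; adjust them to the fields you request.
CASTS = {
    "impressions": int,    # -> INTEGER
    "clicks": int,         # -> INTEGER
    "spend": float,        # -> FLOAT
    "campaign_name": str,  # -> STRING
}

def to_ndjson(records):
    """Return newline-delimited JSON, the JSON layout BigQuery loads."""
    out = []
    for rec in records:
        # Cast known fields, stringify the rest, and drop nulls
        row = {k: CASTS.get(k, str)(v) for k, v in rec.items() if v is not None}
        out.append(json.dumps(row))
    return "\n".join(out)

sample = [{"campaign_name": "Spring Sale", "impressions": "1024", "spend": "9.75"}]
print(to_ndjson(sample))
```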

Facebook Ads data extraction: tools, APIs, and setup

Meta offers a robust and detailed API to pull comprehensive data for reporting and analytics. However, with this power comes complexity—the API consists of multiple components and enforces strict rules, including important security features you need to understand.

Accessing the API typically happens through Facebook’s official SDKs, including:

  • PHP SDK
  • Python SDK

You can also use community-supported SDKs for languages like:

  • R
  • JavaScript
  • Ruby

For a customized setup, you can make direct RESTful requests using tools like:

  • cURL
  • Postman
  • HTTP clients (e.g., Apache HttpClient for Java, Spray-client for Scala, Hyper for Rust, Ruby rest-client)

All ad data comes through the Graph API, which exposes specific endpoints (called “edges”) to retrieve insights for different ad objects, including:

  • ad account/insights
  • campaign/insights
  • ad set/insights
  • ad/insights

These endpoints return performance data for each object. For example, to fetch campaign-level stats using cURL, you might use a request like:

curl -F 'level=campaign' -F 'fields=[]' -F 'access_token=<ACCESS_TOKEN>' https://graph.facebook.com/v22.0/<CAMPAIGN_ID>/insights

Synchronous requests like this return JSON. If you instead schedule an asynchronous report, Facebook prepares it in XLS or CSV format; once it’s ready, you can download it from a URL that contains your report ID and access token.

If you need live updates, you can set up a real-time data stream using Facebook Webhooks. Webhooks enable your system to listen for data changes and automatically pull new information, helping keep your data warehouse continuously synced with the latest ad results.

Understanding all these tools and methods is essential to building an effective data extraction workflow from Facebook Ads into your analytics environment, like BigQuery.
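One detail worth planning for in that workflow: insights responses are paginated, with a paging.next URL on every page except the last. A small helper like the sketch below (response shape per the Graph API, helper name hypothetical) keeps the fetch loop simple:

```python
def parse_insights_page(payload):
    """Pull the data rows and the next-page URL out of one Graph API
    insights response. Returns (rows, next_url); next_url is None on the
    last page, which is how the fetch loop knows to stop."""
    rows = payload.get("data", [])
    next_url = payload.get("paging", {}).get("next")
    return rows, next_url

# Example: the calling loop keeps requesting next_url until it is None.
page = {"data": [{"clicks": "3"}],
        "paging": {"next": "https://graph.facebook.com/next-page"}}
rows, next_url = parse_insights_page(page)
```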

3 methods to move Facebook Ads data to BigQuery

Moving data from Meta Ads to BigQuery is quite a straightforward task—if you pick the right method and follow the provided setup steps. The best option depends on your specific needs and expertise. 

Here are three common ways to implement this integration:

Method 1. Using ETL/ELT tools for automated Facebook Ads to BigQuery data integration

Method 2. Manual data export and upload

Method 3. API + custom script

Method 1: Using ETL/ELT tools for automated Facebook Ads to BigQuery data integration

  • Prerequisites: Facebook Ads access, BigQuery project in Google Cloud, Windsor.ai account
  • Tools used: ETL/ELT platform like Windsor.ai
  • Type of solution: Fully managed, automated, and no-code
  • Mechanism: Connect your Facebook Ads account to Windsor.ai, and it will automatically transfer your data to BigQuery, supporting auto-refreshes and schema mapping. No manual steps or coding involved
  • Best suited for: Marketing teams, analysts, and data engineers who need a quick and reliable setup that scales easily and doesn’t need constant maintenance
  • Limitations: Product subscription required

While manual methods of syncing Facebook Ads data with BigQuery can be time-consuming and demand technical expertise, ETL/ELT platforms offer a much faster and easier alternative, accessible to both technical and non-technical users.

Here’s how these tools automate the entire data pipeline:

  • Data extraction, transformation, and loading
  • Schema mapping and formatting
  • Data blending with other sources
  • Dataset refreshes on a set schedule
  • Continuous error monitoring and logging
  • Scalability as your data volume and reporting needs grow

With Windsor.ai’s no-code ELT/ETL connectors, you can import your Meta Ads data into BigQuery in just minutes—no programming skills required. After a quick three-step setup, Windsor takes care of the entire process, including authentication, schema mapping, error handling, and scheduled syncs.

Windsor securely accesses your Facebook Ads account via OAuth and allows you to extract hundreds of Meta metrics and dimensions, including campaign performance, audience insights, and account-level data. Then, it automatically transforms this data into a BigQuery-compatible structure, ready for analysis.

Here’s what your Facebook Ads to BigQuery integration looks like with Windsor.ai:

✅ Connect your Facebook Ads account securely with one click.

✅ Automate data formatting and schema mapping for BigQuery.

✅ Schedule data updates as often as you need (hourly, daily, etc.).

✅ Enhance performance with backfill, clustering, or partitioning options.

✅ Blend Facebook Ads data with 325+ other sources within the same BigQuery project.

Try your first Facebook Ads integration through Windsor.ai with a 30-day free trial. Unlock more data sources for only $19/month.

Method 2: Manual data export and upload

  • Prerequisites: Facebook Business Manager account with Marketing API access, Facebook App with Ads Management permission, Google Cloud Platform (GCP) account with BigQuery enabled, Service account credentials with BigQuery Admin role
  • Tools used: Facebook Marketing API, Google Cloud Console, BigQuery REST API, curl (or scripts)
  • Type of solution: Manual, code-based integration using public APIs and GCP tools
  • Mechanism: Extract Facebook Ads data through the Facebook Marketing API and manually upload it into BigQuery using REST API calls or scripts. 
  • Best suited for: Technical users who enjoy hands-on coding and prefer building custom solutions. If you like full control over your data pipeline—your system, your rules—this method offers maximum flexibility without relying on third-party tools.
  • Limitations: Takes time to set up and requires ongoing maintenance. Real-time or scheduled updates aren’t available out of the box—you’ll need to build and maintain automation scripts yourself.

Step-by-step instructions:

Step 1: Set up Facebook Marketing API

  1. Go to the Facebook Developers site.
  2. Click “New app” and select “Business” as the type to get started.
  3. Next, select and include the Marketing API in your application.
  4. Create an access token next. Make sure it has ads_read permission.
  5. Get:
  • The Access Token
  • Your Ad Account ID (something like act_1234567890)

Step 2: Extract Facebook Ads data

Use the Meta Graph API to fetch campaign performance data. 

Example request to retrieve campaign details:

https://graph.facebook.com/v22.0/act_YOUR_AD_ACCOUNT_ID/campaigns?fields=id,name,status,start_time,end_time,daily_budget&date_preset=last_30d&access_token=YOUR_ACCESS_TOKEN

This request returns campaign data in JSON format, including the campaign ID, name, status, start and end times, and daily budget for the last 30 days.

To fetch data for other ad objects—be it ad sets, ads, or insights—change the endpoint path and query parameters accordingly. Visit Meta’s Graph API documentation for expanded field options and supported parameters.
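Rather than hand-assembling long query strings, you can build the request URL programmatically. A minimal sketch mirroring the example request above (the helper name and default fields are illustrative):

```python
from urllib.parse import urlencode

GRAPH_BASE = "https://graph.facebook.com/v22.0"

def build_insights_url(ad_account_id, access_token,
                       fields=("impressions", "clicks", "spend"),
                       date_preset="last_30d"):
    """Assemble a Graph API insights request URL for an ad account.
    urlencode handles escaping, so commas in the field list stay valid."""
    params = urlencode({
        "fields": ",".join(fields),
        "date_preset": date_preset,
        "access_token": access_token,
    })
    return f"{GRAPH_BASE}/act_{ad_account_id}/insights?{params}"

print(build_insights_url("1234567890", "YOUR_ACCESS_TOKEN"))
```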

Step 3: Set up a Google BigQuery project

1. Turn on the BigQuery API:

  • Head over to your Google Cloud Console.
  • Click on “APIs & Services,” then open the “Library.”
  • Find and enable the BigQuery API there.

2. Create your Dataset and Table:

  • Open the BigQuery Console.
  • Pick your project and click “Create Dataset.”
  • Once that’s done, open it and click “Create Table.”
  • You can upload your schema or let BigQuery figure it out based on your Facebook data.

3. Create a BigQuery Service Account:

  • In the Cloud Console, go to “IAM & Admin” and select “Service Accounts.”
  • Create a new service account and assign the BigQuery Admin role to it.
  • Then, create a JSON key file and download it.
  • You’ll need this file later when connecting through the API.

Step 4: Load data into BigQuery

After transforming the Facebook Ads JSON into the required format, use the BigQuery API to insert it:

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  --data-binary @facebook_ads_data.json \
  "https://bigquery.googleapis.com/bigquery/v2/projects/{PROJECT_ID}/datasets/{DATASET_ID}/tables/{TABLE_ID}/insertAll"

Swap out {PROJECT_ID}, {DATASET_ID}, and {TABLE_ID} with your real GCP project and BigQuery table info.

Also, make sure facebook_ads_data.json is a local file containing your cleaned-up data in the insertAll request body format: a JSON object with a "rows" array, where each row wraps a record under a "json" key.
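The insertAll endpoint expects records wrapped in a specific envelope rather than bare JSON objects. A short sketch of producing that file (the helper name is illustrative):

```python
import json

def to_insert_all_payload(records):
    """Wrap plain record dicts in the envelope the BigQuery tabledata
    insertAll endpoint expects: {"rows": [{"json": {...}}, ...]}."""
    return {"rows": [{"json": rec} for rec in records]}

records = [{"campaign_id": "123", "clicks": 42, "spend": 9.5}]
payload = to_insert_all_payload(records)

# Write it to the file referenced by the curl command
with open("facebook_ads_data.json", "w") as f:
    json.dump(payload, f)
```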

Optional automation:

You don’t need to repeat these steps manually each time. You can automate the process by setting up a cron job, Google Cloud Function, or using Cloud Scheduler. 

Just make sure your script:

  • Logs into Facebook’s API
  • Grabs the latest ad data
  • Puts it in BigQuery format
  • Sends it to BigQuery

That’s it. You decide what to sync, when to sync, and how to catch errors if something breaks. 

Method 3. API + custom script

  • Prerequisites: Facebook Developer App with ads_read permission, Google Cloud project with BigQuery enabled, Service account with BigQuery Admin role, knowledge of APIs, Python, and Google Cloud tools
  • Tools used: Facebook Graph API, Python (requests, google-cloud-bigquery), Google Cloud Console
  • Type of solution: Fully customizable, code-driven integration
  • Mechanism: Write a script that pulls data from the Facebook Graph API, processes and formats the data as needed, and then uploads it to BigQuery using the appropriate Google Cloud client library. This approach gives you full control over every step of the pipeline
  • Best suited for: Technical users who need full flexibility and control over their data pipelines
  • Limitations: Requires coding skills and a solid understanding of authentication, scheduling, and handling API errors independently

This method is ideal for implementing custom data workflows, advanced automation, and scalable long-term solutions. It gives you full control over the data pipeline—when to upload, how to clean, and what data to retrieve are entirely up to you.

However, keep in mind that token management, retry logic, and code maintenance are additional complexities you’ll need to handle yourself.

Step-by-step instructions:

Step 1. Set up Facebook Ads API

  • Head over to the Facebook Developers site; that’s your starting point.
  • Create a new app, and when it asks, pick the “Business” option.
  • Toss the Marketing API into your app; it’s a must-have for this.
  • Grab an access token; make sure it includes ads_read.
  • Save your App ID, App Secret, and Ad Account ID (like act_1234567890).

Step 2: Set up your Google Cloud project

  • Log in to Google Cloud Console.
  • Turn on the BigQuery API for your project.
  • Go to IAM & Admin -> Service Accounts.
  • Create a new service account.
  • Give it the BigQuery Admin role.
  • Download the JSON key file for later use.

Step 3: Prepare your environment

  • Pick where you want the script to run: on your laptop, server, or cloud. 
  • Grab the Python tools you’ll need by running this quick install:
pip install requests google-cloud-bigquery
  • Ensure Python is installed and properly configured on your side.

Step 4: Fetch Facebook Ads data

  • You can fetch data in two ways:
  1. Using Ads Manager to export reports manually
  2. Using the Graph API for automated extraction
  • To automate the extraction process, call the API like this:
https://graph.facebook.com/v22.0/act_<AD_ACCOUNT_ID>/insights?fields=impressions,clicks,spend&access_token=<ACCESS_TOKEN>
  • This call returns JSON data with impressions, clicks, and spend.
  • You can add more fields or filters based on your needs.
  • Once you get the data, clean it. Remove empty records. Format the fields correctly.
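The cleaning step can be as simple as dropping empty records and casting the string numbers the Graph API returns. A minimal sketch, assuming the three fields requested above:

```python
def clean_insights(rows):
    """Drop empty records and coerce the string numbers the Graph API
    returns into proper numeric types before loading to BigQuery."""
    cleaned = []
    for row in rows:
        if not row:  # skip empty records
            continue
        cleaned.append({
            "impressions": int(row.get("impressions", 0)),
            "clicks": int(row.get("clicks", 0)),
            "spend": float(row.get("spend", 0.0)),
        })
    return cleaned

raw = [{"impressions": "150", "clicks": "7", "spend": "3.20"}, {}]
print(clean_insights(raw))
```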

Step 5: Prepare data for BigQuery import

  • Choose what columns you need to transfer. For example:

date, campaign_id, clicks, impressions, spend

  • Choose a supported data format: CSV or JSON

For CSV, make sure the first row has headers. For JSON, each object should represent a record.

Also, define a BigQuery schema like this:

[
  {"name": "date", "type": "DATE", "mode": "REQUIRED"},
  {"name": "campaign_id", "type": "STRING", "mode": "NULLABLE"},
  {"name": "clicks", "type": "INTEGER", "mode": "NULLABLE"},
  {"name": "impressions", "type": "INTEGER", "mode": "NULLABLE"},
  {"name": "spend", "type": "FLOAT", "mode": "NULLABLE"}
]

This tells BigQuery how to interpret your data.
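Before loading, it can help to validate each record against the schema locally, so bad rows fail fast in your script instead of failing the load job. An illustrative sketch (the validation rules here are simplified assumptions, not BigQuery’s full type rules):

```python
# Schema mirroring the JSON definition above
SCHEMA = {
    "date": ("DATE", "REQUIRED"),
    "campaign_id": ("STRING", "NULLABLE"),
    "clicks": ("INTEGER", "NULLABLE"),
    "spend": ("FLOAT", "NULLABLE"),
}

# Acceptable Python types for each BigQuery type (simplified)
PYTHON_TYPES = {"DATE": str, "STRING": str, "INTEGER": int, "FLOAT": (int, float)}

def validate_record(record):
    """Return a list of problems; an empty list means the record is loadable."""
    problems = []
    for name, (bq_type, mode) in SCHEMA.items():
        value = record.get(name)
        if value is None:
            if mode == "REQUIRED":
                problems.append(f"missing required field: {name}")
            continue
        if not isinstance(value, PYTHON_TYPES[bq_type]):
            problems.append(f"bad type for {name}: {type(value).__name__}")
    return problems

print(validate_record({"date": "2024-05-01", "clicks": 12, "spend": 3.4}))
```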

Step 6: Upload data to BigQuery

  • Use the service account key to authenticate your script.
  • Then, create a BigQuery client using the Python library and upload your data:

from google.cloud import bigquery

# Authenticate with the service account key file downloaded earlier
client = bigquery.Client.from_service_account_json("key.json")

# Load the cleaned records (a list of dicts) into your table
table_id = "your_project.dataset.table"
job = client.load_table_from_json(data, table_id)
job.result()  # wait for the load job to complete
  • Preview your data in BigQuery once the job finishes.
  • Use the BigQuery console to run test queries and check the upload.

Step 7: Automate the process

  • On Linux or a server? Use a cron job.
  • For serverless, try Google Cloud Scheduler with Cloud Functions.
  • Set it to run daily, weekly, or whenever you need.
  • Don’t forget to refresh your Facebook token.
  • Add error handling for rate limits.
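For the rate-limit handling mentioned above, exponential backoff is the usual approach: wait longer after each failed attempt before retrying. A minimal sketch (the retry wrapper is illustrative, not part of any SDK):

```python
import time

def with_retries(fetch, max_attempts=5, base_delay=1.0, sleep=time.sleep):
    """Retry a fetch callable with exponential backoff, the usual way to
    ride out Graph API rate limits. `fetch` should raise on failure."""
    for attempt in range(max_attempts):
        try:
            return fetch()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # give up after the last attempt
            sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...

# Example with a flaky callable that fails twice before succeeding
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("rate limited")
    return "ok"

print(with_retries(flaky, sleep=lambda s: None))
```

Injecting `sleep` as a parameter keeps the wrapper testable without real delays; in production you’d leave the default.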

Step-by-step guide: How to use Windsor.ai to connect Facebook Ads to BigQuery 

If you’re wondering whether it’s really possible to integrate Meta Ads data into BigQuery with Windsor.ai in under 5 minutes, this guide will show you exactly how to do it, step by step.

Windsor.ai’s BigQuery Integration Documentation: https://windsor.ai/documentation/how-to-integrate-data-into-google-bigquery/.

1. Sign up for Windsor.ai and start your 30-day free trial.

2. In the data sources list, select Facebook Ads and grant access to your account(s).

3. Once the connection is completed, view the list of your available Facebook Ads account(s) and select the one(s) you want to pull data from.

4. Click on the “Next” button to proceed with moving Facebook Ads data to BigQuery.

5. Next, configure your dataset by selecting the desired date range and specific fields you want to stream into BigQuery. Preview your extracted data in Windsor.


6. Scroll down to the Data Destinations section and choose BigQuery.

7. Click “Add Destination Task” and enter the following details:

  • Task name: Enter any name you wish.
  • Authentication type: You can authorize via a Google Account (OAuth 2.0) or a Service Account file.
  • Project ID: This can be found in your Google Cloud Console.
  • Dataset ID: This can be found in your BigQuery project.
  • Table name: Windsor.ai will create this table for you if it doesn’t exist.
  • Backfill: You can backfill historical data when setting up the task (available only on the paid plans).
  • Schedule: Define how often data should be updated in BigQuery (e.g., hourly, daily; available on Standard plans and above).


Keep in mind that while Windsor.ai auto-creates the table schema on first setup, it does not automatically update the schema when fields are added or edited later. If you add new fields or edit the query after the table was initially created, you must manually update the schema in your BigQuery table. Alternatively, you can create a new destination task, and Windsor.ai will auto-create a new table with the full schema.

8. Select advanced options (optional).

Windsor.ai supports clustering and partitioning for BigQuery tables to help you improve query performance and reduce costs by optimizing how data is stored and retrieved.

You can combine table clustering with table partitioning to achieve fine-grained sorting for further query optimization.

9. When completed, click “Test connection.” If the connection is set properly, you’ll see a success message at the bottom; otherwise, an error message will appear.

When successful, click “Save” to run the added destination task to BigQuery.

  • See the task running in the selected data destination section. The green “upload” button with the status “ok” indicates that the task is active and running successfully.

10. Now, when you open your specified BigQuery project, you should see your Meta Ads data uploaded there.

Summary: Windsor.ai vs other integration methods 

| Feature | Manual method | API + custom script | ETL/ELT tools (Windsor.ai) |
|---|---|---|---|
| Technical skills needed | No coding needed | Requires coding expertise | No coding – ready to use |
| Automation | Manual process only | Fully automated with scripts | Fully automated and fully managed |
| Real-time data sync | Not available | Real-time updates | Near real-time (scheduled syncs) |
| Scalability | Not scalable (manual repetition) | Highly scalable | Highly scalable |
| Customization | Limited options | Fully customizable with code | Advanced filters & mappings for extensive customization |
| Setup complexity | Medium | Very complex (API setup required) | Very simple (even for non-techies) |
| Ongoing maintenance | High effort required (repetitive) | Some upkeep required (script updates) | No maintenance required; handled by the tool |
| Cost | Free | Depends on the engineering resources used | Paid subscription; starts from $19/month |
| Data transformation | Manual adjustments needed | Can be automated with code | Automatic formatting & cleaning |
| Ideal use case | One-time or rare transfers | Tech teams that need full control | Marketing and tech teams that need automated data pipelines of varying complexity |

Conclusion

Connecting Facebook Ads to BigQuery helps you gain deep insights and make smarter decisions. 

To set up the integration, you’ve got three options: Windsor.ai for full automation, manual exports and uploads for one-time tasks, or custom scripts for total flexibility. Each approach works for different needs and skill levels. Pick what fits you best.

Try Windsor.ai for free today: https://onboard.windsor.ai/ — and unlock the full potential of your Facebook Ads to BigQuery integration.

 

FAQs

Is there an automated way to extract data from Facebook Ads?

Yes, you can use Windsor.ai to connect your Facebook Ads account and automatically extract campaign data without writing code.

How do I sync Facebook Ads with BigQuery?

There are three main ways to integrate Facebook Ads data with BigQuery: using ETL/ELT tools like Windsor.ai, uploading data manually, or building a custom pipeline with the API and scripts. Windsor.ai offers the easiest path—thanks to its no-code setup, your Meta Ads data can start flowing into BigQuery in minutes. No technical complexity, no headaches—just smooth, automated data transfer.

Why should I choose Windsor to connect Facebook Ads to BigQuery?

Windsor.ai takes care of the entire data integration process. It extracts data from your account, maps fields, checks for errors, and keeps your data automatically updated. You don’t need to stress about anything. And budget-wise, compared to other popular ETL/ELT solutions, Windsor.ai’s pricing plans for data integration into BigQuery are significantly more affordable, starting at just $19/month.

How often can the data transfer from Facebook Ads to BigQuery be run?

You choose with Windsor.ai! Sync hourly for near real-time insights or daily for regular updates. Either way, your data stays fresh and ready to use.

Tired of juggling fragmented data? Get started with Windsor.ai today to create a single source of truth

Let us help you streamline data integration and marketing attribution, so you can focus on what matters—growth strategy.