Before You Begin
Before connecting BigQuery, we recommend the following:

- Create a dedicated service account — Set up a service account specifically for Conversion with read-only access to the tables you need. This limits exposure and makes it easy to revoke access if needed.
- Use a read replica if possible — Running queries against a replica prevents sync operations from impacting your production database.
- Sync only the data you need — Select only the columns you’ll use, and always filter with `last_sync_time` to limit results to changed rows.
Step 1: Create a Service Account
Conversion connects to BigQuery using a service account. Create one in the Google Cloud Console.

- Go to IAM & Admin → Service Accounts in the Google Cloud Console
- Click Create Service Account
- Enter a name (e.g., “Conversion Sync”) and description
- Click Create and Continue
- Add the following roles:
- BigQuery User — Allows running queries
- BigQuery Job User — Allows creating query jobs
- BigQuery Data Viewer — Allows reading table data (grant this on specific datasets if you want to limit access)
- Click Continue, then Done
- Find your new service account in the list, click the three dots under Actions, and select Manage keys
- Click Add Key → Create new key
- Select JSON and click Create
- Save the downloaded JSON file — you’ll need it when connecting to Conversion
For tighter security, grant BigQuery Data Viewer only on the specific datasets you want Conversion to access, rather than at the project level.
Step 2: Allow Conversion’s IP Addresses
If your BigQuery project uses VPC Service Controls or other network restrictions, allow connections from Conversion’s IP addresses:

| Region | IP Addresses |
|---|---|
| US | 35.239.90.161, 35.188.167.166, 34.56.101.43, 34.122.97.230, 34.29.176.66, 35.226.154.44 |
Step 3: Connect BigQuery to Conversion
- In Conversion, go to Settings → Integrations → Data Warehouse
- Click Add Integration and select Google BigQuery
- Enter a name for this connection (e.g., “Production BigQuery”)
- Paste the contents of your service account JSON file into the Service Account Credentials field
- Click Connect
Step 4: Create a Sync
After connecting, create a sync to start pulling data:

- Click Add Sync
- Select the destination type:
- Contacts — Create or update contact records
- Custom Objects — Create or update custom object records (with optional relationships)
- Events — Record events on contacts
- For Custom Objects, select the specific object type
- Enter your SQL query (see examples below)
- Click Run Query to preview results
- Set the sync schedule (how often to run)
- Click Enable
Writing Queries for BigQuery
Contact Sync
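A minimal contact sync query might look like the following sketch; the `users` table and column names are illustrative, and `email` serves as the unique key for contacts:

```sql
SELECT
  email,        -- unique key: identifies the contact to create or update
  first_name,
  last_name,
  plan_tier
FROM `my_project.crm.users`
WHERE updated_at > TIMESTAMP_SECONDS(last_sync_time)
```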
Custom Objects (Without Relationships)
To sync objects without relating them to contacts, omit the `email` column:
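For example, a Product sync without contact relationships might look like this sketch (table and column names are illustrative):

```sql
SELECT
  id,        -- unique key for the Product object
  name,
  sku,
  category,
  price
FROM `my_project.catalog.products`
WHERE updated_at > TIMESTAMP_SECONDS(last_sync_time)
```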
Custom Objects with Relationships
To sync custom objects and their relationships to contacts in a single query, include the contact’s `email` and use JSON_OBJECT for relationship-specific fields:
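A sketch of such a query, assuming hypothetical `products`, `orders`, and `users` tables:

```sql
SELECT
  p.id,          -- unique key for the Product object
  p.name,
  p.sku,
  p.category,
  p.price,
  u.email,       -- contact to relate the object to
  JSON_OBJECT(
    'role',     o.buyer_role,   -- fields stored on the relationship itself
    'quantity', o.quantity
  ) AS relationshipFields
FROM `my_project.catalog.products` p
JOIN `my_project.sales.orders`     o ON o.product_id = p.id
JOIN `my_project.crm.users`        u ON u.id = o.user_id
WHERE o.updated_at > TIMESTAMP_SECONDS(last_sync_time)
```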
- Object fields (`id`, `name`, `sku`, `category`, `price`) are stored on the Product object
- `email` identifies which contact to create a relationship with
- `relationshipFields` contains fields stored on the relationship itself (this contact’s role, quantity, etc.)
If multiple rows share the same object `id` but have different `email` values, Conversion creates one object with multiple relationships, one per contact.

Events
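An event sync query might look like the following sketch; the table, column, and event names are illustrative, and `message_id` acts as the unique key that lets Conversion deduplicate events:

```sql
SELECT
  CAST(order_id AS STRING)   AS message_id,   -- unique key for the event
  email,                                      -- contact the event belongs to
  'order_completed'          AS name,         -- hypothetical event name
  UNIX_SECONDS(completed_at) AS timestamp     -- Unix seconds
FROM `my_project.sales.orders`
WHERE completed_at > TIMESTAMP_SECONDS(last_sync_time)
```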
BigQuery SQL Reference
Converting Timestamps
Conversion expects Unix timestamps for date/time fields. Use `UNIX_SECONDS()` to convert BigQuery TIMESTAMP values; for DATETIME columns, convert to TIMESTAMP first:
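For example (table and column names are illustrative):

```sql
SELECT
  email,
  UNIX_SECONDS(created_at)            AS signed_up_at,  -- TIMESTAMP → Unix seconds
  UNIX_SECONDS(TIMESTAMP(trial_ends)) AS trial_ends_at  -- DATETIME: convert to TIMESTAMP first
FROM `my_project.crm.users`
```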
Using last_sync_time
The `last_sync_time` variable is a Unix timestamp. To compare it against a BigQuery TIMESTAMP column, convert it with the `TIMESTAMP_SECONDS()` function:
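For example, `TIMESTAMP_SECONDS()` turns the Unix-second value into a TIMESTAMP that can be compared directly (table and column names are illustrative):

```sql
SELECT email, first_name
FROM `my_project.crm.users`
WHERE updated_at > TIMESTAMP_SECONDS(last_sync_time)
```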
Building Nested Objects with JSON_OBJECT
BigQuery doesn’t support dot notation for creating nested objects. Use `JSON_OBJECT` to build nested structures like `relationshipFields`:
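For example, building a `relationshipFields` object (table and column names are illustrative):

```sql
SELECT
  id,
  name,
  email,
  JSON_OBJECT(
    'role',      member_role,
    'joined_at', UNIX_SECONDS(joined_at)
  ) AS relationshipFields
FROM `my_project.crm.memberships`
```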
Handling NULLs
Use `COALESCE` or `IFNULL` to provide default values:
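For example (column names are illustrative):

```sql
SELECT
  email,
  COALESCE(first_name, 'Unknown') AS first_name,   -- default when NULL
  IFNULL(order_count, 0)          AS order_count   -- IFNULL is the two-argument form
FROM `my_project.crm.users`
```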
Casting Types
Use `CAST` to convert between types:
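For example (column names are illustrative):

```sql
SELECT
  email,
  CAST(user_id AS STRING)  AS external_id,    -- INT64 → STRING
  CAST(ltv     AS FLOAT64) AS lifetime_value  -- NUMERIC → FLOAT64
FROM `my_project.crm.users`
```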
Querying Across Datasets
Use fully-qualified table names to join across datasets:

Troubleshooting
“Permission denied” errors
Ensure your service account has the required roles:

- BigQuery User
- BigQuery Job User
- BigQuery Data Viewer (on the relevant datasets)
“Could not connect” errors
Check that Conversion’s IP addresses are allowed if you use VPC Service Controls or firewall rules.

Sync taking too long
- Ensure you’re filtering by `last_sync_time` to reduce rows
- Select only the columns you need
- Consider reducing sync frequency for large datasets
- Use partitioned tables if your data is time-based
Duplicate data appearing
If you’re using time-buffered queries (subtracting a buffer from `last_sync_time` to catch edge cases), Conversion handles duplicates automatically based on the unique key (`email` for contacts, `id` for objects, `message_id` for events).
Frequently Asked Questions
What BigQuery permissions does Conversion need?
At minimum, the service account needs BigQuery User, BigQuery Job User, and BigQuery Data Viewer roles. For tighter security, grant Data Viewer only on specific datasets rather than the entire project.
Does Conversion write data back to BigQuery?
No. Conversion only reads from BigQuery. The service account only needs read permissions.
How much does this cost in BigQuery?
BigQuery charges for the amount of data scanned by your queries. To minimize costs: select only the columns you need, filter with `last_sync_time`, and consider partitioned tables.
What if a contact doesn't exist yet?
If you sync a custom object with an `email` that doesn’t exist in Conversion, we create the contact automatically. You can enrich that contact’s profile through a separate contacts sync.