BigQuery Integration
Connect Dunmore to BigQuery to stream all payment events into a data warehouse for advanced analytics.
Setup
- Create a BigQuery dataset and table for Dunmore events
- Create a service account with BigQuery Data Editor role
- Generate an access token for the service account
- Add a BigQuery connector in the Dunmore console
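The setup steps above can be sketched with the Google Cloud CLI. This is an illustrative transcript, not the only way to provision access: the project ID `my-gcp-project` and the service-account name `dunmore-connector` are placeholder assumptions.

```shell
# Create the dataset that will hold Dunmore events
bq mk --dataset my-gcp-project:dunmore_events

# Create a service account and grant it the BigQuery Data Editor role
gcloud iam service-accounts create dunmore-connector \
  --project my-gcp-project
gcloud projects add-iam-policy-binding my-gcp-project \
  --member "serviceAccount:dunmore-connector@my-gcp-project.iam.gserviceaccount.com" \
  --role "roles/bigquery.dataEditor"

# Generate a short-lived access token for the service account
gcloud auth print-access-token \
  --impersonate-service-account dunmore-connector@my-gcp-project.iam.gserviceaccount.com
```

Note that tokens printed this way are short-lived; plan to rotate the connector's `accessToken` before it expires.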
Table Schema
Create a BigQuery table with the following schema:

| Column | Type | Description |
|---|---|---|
| `event_id` | STRING | Dunmore event ID (`evt_xxx`) |
| `event_type` | STRING | Event type (e.g., `payment.settled`) |
| `timestamp` | TIMESTAMP | Event timestamp |
| `data` | STRING | JSON-encoded event data |
| `project_id` | STRING | Dunmore project ID |
| `environment` | STRING | Environment (`production`/`test`) |
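The table can be created from a schema file matching the columns above. This is a sketch: the file name `events_schema.json` and the REQUIRED/NULLABLE modes are assumptions, not Dunmore requirements.

```json
[
  {"name": "event_id",    "type": "STRING",    "mode": "REQUIRED"},
  {"name": "event_type",  "type": "STRING",    "mode": "REQUIRED"},
  {"name": "timestamp",   "type": "TIMESTAMP", "mode": "REQUIRED"},
  {"name": "data",        "type": "STRING",    "mode": "NULLABLE"},
  {"name": "project_id",  "type": "STRING",    "mode": "REQUIRED"},
  {"name": "environment", "type": "STRING",    "mode": "REQUIRED"}
]
```

Applied with, for example, `bq mk --table my-gcp-project:dunmore_events.events events_schema.json`.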
Supported Events
All event types are streamed to BigQuery. This connector does not filter; it receives every event so the warehouse holds a complete record.
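Because the `data` column stores the event payload as a JSON string, consumers decode it after querying. A minimal sketch, with illustrative row values (the payload fields shown are assumptions about an event's shape):

```python
import json

# A row as it might come back from a BigQuery query (illustrative values)
row = {
    "event_id": "evt_1a2b3c",
    "event_type": "payment.settled",
    "timestamp": "2024-05-01T12:00:00Z",
    "data": '{"amount": 1000, "currency": "USD"}',
    "project_id": "proj_123",
    "environment": "production",
}

# Decode the JSON-encoded payload into a dict for analysis
payload = json.loads(row["data"])
print(payload["amount"], payload["currency"])  # 1000 USD
```

Keeping `data` as a string keeps the table schema stable across event types; individual fields can still be extracted in SQL with BigQuery's JSON functions.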
Configuration
```json
{
  "type": "bigquery",
  "config": {
    "projectId": "my-gcp-project",
    "datasetId": "dunmore_events",
    "tableId": "events",
    "accessToken": "ya29...."
  },
  "eventTypes": ["*"]
}
```
API Example
```bash
curl -X POST https://api.dunmore.xyz/api/projects/{projectId}/connectors \
  -H "Authorization: Bearer {token}" \
  -H "Content-Type: application/json" \
  -d '{
    "type": "bigquery",
    "config": {
      "projectId": "YOUR_GCP_PROJECT",
      "datasetId": "dunmore_events",
      "tableId": "events",
      "accessToken": "YOUR_ACCESS_TOKEN"
    },
    "eventTypes": ["*"]
  }'
```
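Once events are flowing, the stream can be spot-checked from the CLI. A sketch, assuming the dataset and table names from the configuration above:

```shell
# Count streamed events by type (standard SQL)
bq query --use_legacy_sql=false \
  'SELECT event_type, COUNT(*) AS events
   FROM `my-gcp-project.dunmore_events.events`
   GROUP BY event_type
   ORDER BY events DESC'
```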