BigQuery

Connect to a BigQuery database in AirOps

Last updated 11 months ago

Set up a Google Cloud Platform Service Account and Grant Access (for New Users)

To add a Google BigQuery Data Warehouse as a Data Source in AirOps, use a Google Cloud Platform service account with read access to the desired datasets and tables:

1. Create a Google Cloud Platform Service Account

  1. Go to the Google Cloud Platform Console.

  2. Click on the project dropdown and select the project that contains your BigQuery dataset.

  3. Navigate to the IAM & Admin page, and then click Service accounts.

  4. Click + CREATE SERVICE ACCOUNT at the top of the page.

  5. Enter a name for your service account and click Create.

  6. On the Grant this service account access to the project page, select the BigQuery Data Viewer and BigQuery Job User roles.

  7. Click Continue, and then click Done.

2. Generate a JSON key for the Service Account

  1. In the Service accounts page, find the service account you just created.

  2. Click the Actions menu (three vertical dots) and select Create key.

  3. Select JSON as the key type, and click Create.

  4. Save the generated JSON key file to your computer.
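Before uploading, you can sanity-check the downloaded key file. The sketch below is a minimal example using only the Python standard library; the field names are the standard structure of a GCP service account key, and the placeholder values (and the idea of loading from a hypothetical `key.json`) are illustrative, not part of the AirOps product:

```python
import json

# Fields a Google service account key file is expected to contain.
REQUIRED_FIELDS = {"type", "project_id", "private_key_id", "private_key", "client_email"}

def validate_key(key: dict) -> list:
    """Return a list of problems found in a parsed service account key."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - key.keys())]
    if key.get("type") != "service_account":
        problems.append("'type' should be 'service_account'")
    return problems

# Example with a placeholder key; in practice use: json.load(open("key.json"))
sample = {
    "type": "service_account",
    "project_id": "my-project",
    "private_key_id": "abc123",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_email": "bigquery-reader@my-project.iam.gserviceaccount.com",
}
print(validate_key(sample))  # an empty list means the key looks structurally valid
```

An empty result means the file has the expected shape; a missing `private_key` or a `type` other than `service_account` usually means the wrong kind of key (for example, an OAuth client ID file) was downloaded.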

3. Enter the JSON Key File Content in AirOps

  1. Open the JSON key file with a text editor.

  2. Copy the entire content of the JSON key file.

  3. In AirOps, paste the JSON key file content into the JSON Key field.
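The key file is pretty-printed across many lines, which some paste fields handle poorly. If you run into formatting issues when pasting, re-serializing the key as one compact JSON line can help. This is a standard-library sketch; whether the AirOps JSON Key field requires a single line is an assumption, and the inline JSON literal stands in for reading a hypothetical `key.json`:

```python
import json

def compact_key(raw: str) -> str:
    """Re-serialize a service account key as a single compact JSON line.

    json.dumps escapes the newlines inside the private_key string,
    so the result contains no literal line breaks and round-trips
    to the same data as the original file.
    """
    return json.dumps(json.loads(raw), separators=(",", ":"))

# Example usage; in practice use: compact_key(open("key.json").read())
raw = '{\n  "type": "service_account",\n  "private_key": "-----BEGIN PRIVATE KEY-----\\nMII...\\n-----END PRIVATE KEY-----\\n"\n}'
print(compact_key(raw))
```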

Now, your Google BigQuery Data Warehouse should be connected and ready for use within AirOps.
