Redshift

Connect to a Redshift database in AirOps

Remember to allowlist our IP address: 52.71.87.39

If your database is protected by a firewall, allow inbound access from the AirOps IP address above on your database port.

To connect, provide the following details:

  1. Host or IP - The endpoint of your AWS Redshift cluster. Example: examplecluster.abcdefghijkl.us-west-2.redshift.amazonaws.com

  2. Port - The port on which your Redshift cluster is listening. Default is 5439.

  3. User - The Redshift user that AirOps will connect to the database with. It is best practice to create a new user for AirOps (instructions below), but any user with SELECT privileges can be used.

  4. Password - Password for the above user.

  5. Database - The Redshift database that AirOps will connect to.
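
AirOps only needs to read from the cluster with these credentials. As a hypothetical illustration (the schema and table names are placeholders, not values AirOps requires), the statements it runs are plain SELECT queries:

-- A read-only query like this is all the SELECT-only user needs to support.
SELECT *
FROM "<schema>"."<table>"
LIMIT 100;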

Set Up a User and Grant Access (for New Users)

To add an AWS Redshift database as a Data Source in AirOps, use an existing user with read access, or create a new user with read access to the schemas and tables you want AirOps to query:

1. Create AirOps User

CREATE USER airops_user
PASSWORD '<secure-password>';

2. Grant USAGE and SELECT privileges to AirOps user

Replace <schema> and <table> with the appropriate schema and table names that you would like AirOps to interact with.

GRANT USAGE ON SCHEMA "<schema>" TO airops_user;

GRANT SELECT ON TABLE "<schema>"."<table>" TO airops_user;

If you want to grant access to multiple tables, you can run the GRANT SELECT statement for each table.
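
Alternatively, if you are comfortable exposing every existing table in the schema, Redshift also supports granting SELECT on all of them in one statement:

-- Grants SELECT on every table that currently exists in the schema.
GRANT SELECT ON ALL TABLES IN SCHEMA "<schema>" TO airops_user;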

3. (Optional) Grant access to future tables in the schema

If you want to grant access to all future tables in a schema, use the following command. Replace <schema> with the appropriate schema name.

ALTER DEFAULT PRIVILEGES IN SCHEMA "<schema>"
GRANT SELECT ON TABLES TO airops_user;
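
To confirm the grants took effect, you can check the new user's access with Redshift's HAS_TABLE_PRIVILEGE function (the schema and table names below are placeholders):

-- Returns true if airops_user can SELECT from the table.
SELECT has_table_privilege('airops_user', '<schema>.<table>', 'select');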