Run in Snowflake

Access your AirOps Workflows directly in Snowflake


If you have a Snowflake warehouse, you may want to integrate some of your Workflows (particularly those that are SQL-driven) directly into your account. This guide walks through how to create and run a Snowflake External Function that calls any AirOps Workflow.

These instructions can also be found in the "Snowflake" tab under the "Integrate" section of your Workflow.

Create the API Integration in Snowflake

First, you must create an API connection between Snowflake and AirOps. To do so, run the statement below in Snowflake as a user with the ACCOUNTADMIN role.

create or replace api integration airops_api_integration
 api_provider = aws_api_gateway
 api_aws_role_arn = 'arn:aws:iam::843219976259:role/rails_app_api_gateway'
 api_allowed_prefixes = ('https://x4xrin77vf.execute-api.us-east-1.amazonaws.com/production')
 enabled = true;
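
Optionally, you can verify that the integration was created and is enabled before moving on. This check is not part of the original setup steps; it is just a standard Snowflake statement that lists the integration's properties.

-- Optional sanity check: shows ENABLED, the allowed prefixes, and other properties
describe integration airops_api_integration;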

Create the External Function

Second, you must create a function for each Workflow you wish to use in Snowflake. When running the CREATE command, you’ll add the bearer token for authentication, specify the Workflow ID, and give the function a name. The function can be given any name, but we recommend including “AIROPS” so it is easily identifiable (e.g. AIROPS_SENTIMENT_ANALYZE).

Find Your Workspace Bearer Token

You will need the bearer token to create the External Function. The bearer token should be in the following format:

Bearer YOUR_API_KEY

Get the Workflow ID

You will also use your Workflow ID during the Create External Function step.

The Workflow ID is the number found in the app's URL. For instance, in the URL:

https://app.airops.com/{your_workspace}/apps/123

The Workflow ID is:

123

Set the Workflow Parameters

AirOps Workflows require specific parameters in order to function properly. The parameters are the inputs that are passed into the Workflow when it runs.
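
As an illustration only (using the Language Translate Workflow from the execution example later on this page), the parameters are a set of named inputs such as:

{
  "input_text": "<the text to translate>",
  "from_language": "<the source language, e.g. Detect>",
  "to_language": "<the target language, e.g. Spanish>"
}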

Create the External Function

You’re almost there! With the Workflow ID, API key, and parameters, create the external function:

create or replace external function AIROPS_<APP NAME>(data_row object)
returns variant
api_integration = AIROPS_API_INTEGRATION
HEADERS = ( 'Authorization' = 'Bearer <YOUR_API_KEY>', 'Content-Type' = 'application/json' )
MAX_BATCH_ROWS = 25
as 'https://x4xrin77vf.execute-api.us-east-1.amazonaws.com/production/snowflake-execute-data-app/{YOUR_WORKFLOW_ID}';
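
For illustration only, here is how the statement might look for the Language Translate Workflow used below, assuming a Workflow ID of 123 (the example ID from earlier) and a placeholder API key:

-- Example only: substitute your own function name, API key, and Workflow ID
create or replace external function AIROPS_LANGUAGE_TRANSLATE(data_row object)
returns variant
api_integration = AIROPS_API_INTEGRATION
HEADERS = ( 'Authorization' = 'Bearer YOUR_API_KEY', 'Content-Type' = 'application/json' )
MAX_BATCH_ROWS = 25
as 'https://x4xrin77vf.execute-api.us-east-1.amazonaws.com/production/snowflake-execute-data-app/123';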

Workflow Execution

To execute a Workflow from your newly created Snowflake function, you must first construct an object with the request parameters. The object should contain the parameter name and the column or value to be passed into it.

In the example below, a request object is created using the OBJECT_CONSTRUCT function in Snowflake. The order of parameters in the object does not matter.

The constructed object passes:

  • text_column into the input_text parameter

  • from_language_column into the from_language parameter

  • to_language_column into the to_language parameter.

select 
    OBJECT_CONSTRUCT('input_text', text_column, 'from_language', from_language_column, 'to_language', to_language_column) as request_data
from
    language_data;

This will create an object to pass into the function. Below is an object created by the above query for the Language Translate Workflow:

{
  "input_text": "Translate this text!",
  "from_language": "Detect",
  "to_language": "Spanish"
}

The object is then used in the function, which will pass the mapped fields to the parameters in the AirOps Workflow.

select 
    object_construct('input_text', text_column, 'from_language', from_language_column, 'to_language', to_language_column) as data_row
    , AIROPS_LANGUAGE_TRANSLATE(data_row) as result
from
    language_data;
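
The function returns a variant. As a sketch only (the exact shape of the response depends on your Workflow's output, so the "output" field name below is hypothetical), you can extract individual fields from the result with Snowflake's path syntax:

-- Sketch only: assumes the Workflow response contains a field named "output" (hypothetical)
select 
    object_construct('input_text', text_column, 'from_language', from_language_column, 'to_language', to_language_column) as data_row
    , AIROPS_LANGUAGE_TRANSLATE(data_row):output::string as translated_text
from
    language_data;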

More information about Snowflake API integrations can be found here.

AirOps API requests require authentication using your workspace Bearer token, which forms part of the request. This can be found in the "Settings" page of your Workspace.

For more information on ensuring you have the correct parameters for your call, please see our documentation page on API Endpoints.