Run in Snowflake

Access your AirOps Workflows directly in Snowflake

If you have a Snowflake warehouse, you may want to integrate some of your Workflows (particularly those that are SQL-driven) directly into your account. We'll walk through how to create and run a Snowflake External Function that can call any AirOps Workflow.

These instructions can also be found in the "Snowflake" tab underneath the "Integrate" section of your Workflow:

Create the API Integration in Snowflake

First, you must create an API integration between Snowflake and AirOps. To do so, run the statement below in Snowflake using a role with ACCOUNTADMIN privileges.

create or replace api integration airops_api_integration
 api_provider = aws_api_gateway
 api_aws_role_arn = 'arn:aws:iam::843219976259:role/rails_app_api_gateway'
 api_allowed_prefixes = ('https://x4xrin77vf.execute-api.us-east-1.amazonaws.com/production')
 enabled = true;
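To confirm the integration was created successfully, you can inspect it before moving on. This is an optional sanity check; the integration name matches the statement above:

```sql
-- Verify the integration exists and is enabled
describe integration airops_api_integration;

-- Or list matching API integrations in the account
show api integrations like 'airops_api_integration';
```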

More information about API integrations can be found in the Snowflake documentation.

Create the External Function

Second, you must create a function for each Workflow you wish to use in Snowflake. When running the CREATE command, you'll supply the bearer token for authentication, the Workflow ID, and a name for the function. The function can be given any name, but we recommend including "AIROPS" so it is easily identifiable (e.g. AIROPS_SENTIMENT_ANALYZE).

Find Your Workspace Bearer Token

AirOps API requests are authenticated with your workspace Bearer token, which is sent as part of each request. You can find it on the "Settings" page of your Workspace.

You will need the bearer token to create the External Function. The bearer token should be in the following format:

Bearer YOUR_API_KEY

Get the Workflow ID

You will also use your Workflow ID during the Create External Function step.

The Workflow ID is the number found in the app's URL. For instance, in the URL:

https://app.airops.com/{your_workspace}/apps/123

The Workflow ID is:

123

Set the Workflow Parameters

AirOps Workflows require specific parameters in order to function properly. The parameters are inputs that are passed into the model.

For more information on ensuring you have the correct parameters for your call, please see our documentation page on API Endpoints.

Create the External Function

You're almost there! With the Workflow ID, API key, and parameters in hand, create the external function:

create or replace external function AIROPS_<APP NAME>(data_row object)
returns variant
api_integration = AIROPS_API_INTEGRATION
HEADERS = ( 'Authorization' = 'Bearer <YOUR_API_KEY>', 'Content-Type' = 'application/json' )
MAX_BATCH_ROWS = 25
as 'https://x4xrin77vf.execute-api.us-east-1.amazonaws.com/production/snowflake-execute-data-app/{YOUR_DATA_APP_ID}';
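As a concrete sketch, here is the statement filled in for a hypothetical translation Workflow with ID 123 (the function name, Workflow ID, and API key are illustrative placeholders):

```sql
create or replace external function AIROPS_LANGUAGE_TRANSLATE(data_row object)
returns variant
api_integration = AIROPS_API_INTEGRATION
-- Replace YOUR_API_KEY with your workspace bearer token
HEADERS = ( 'Authorization' = 'Bearer YOUR_API_KEY', 'Content-Type' = 'application/json' )
MAX_BATCH_ROWS = 25
-- 123 is the example Workflow ID from the URL above
as 'https://x4xrin77vf.execute-api.us-east-1.amazonaws.com/production/snowflake-execute-data-app/123';
```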

Workflow Execution

To execute a Workflow from your newly created Snowflake function, you must first construct an object with the request parameters. The object should contain the parameter name, and the column or value to be passed into it.

In the example below, a request object is created using the OBJECT_CONSTRUCT function in Snowflake. The order of parameters in the object does not matter.

The constructed object is passing:

  • text_column field into the input_text parameter

  • from_language_column into the from_language parameter

  • to_language_column into the to_language parameter.

select
    object_construct('input_text', text_column, 'from_language', from_language_column, 'to_language', to_language_column) as request_data
from
    language_data;

This will create an object to pass into the function. Below is an object created by the above query for the Language Translate Workflow:

{
  "input_text": "Translate this text!",
  "from_language": "Detect",
  "to_language": "Spanish"
}

The object is then used in the function, which will pass the mapped fields to the parameters in the AirOps Workflow.

select
    object_construct('input_text', text_column, 'from_language', from_language_column, 'to_language', to_language_column) as data_row
    , AIROPS_LANGUAGE_TRANSLATE(data_row) as result
from
    language_data;
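Because the function returns a VARIANT, you can pull individual fields out of the response with Snowflake's colon path syntax. The field name below ("output") is a hypothetical example; the actual keys depend on what your Workflow returns:

```sql
-- Extract one field from the variant response and cast it to a string;
-- "output" is an assumed key, not a guaranteed part of the response
select
    AIROPS_LANGUAGE_TRANSLATE(
        object_construct('input_text', 'Translate this text!', 'from_language', 'Detect', 'to_language', 'Spanish')
    ):output::string as translated_text;
```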
