Memory Stores

Semantically search your knowledge, documents, and data with Memory Stores

What is a Memory Store?

Memory Stores allow you to perform semantic searches while your application is running. They let Agents and Workflows retrieve relevant information based on semantic similarity, rather than the exact keyword matching of typical searches (such as those in SQL). Under the hood, Memory Stores are managed Pinecone databases (Pinecone is a leading vector database provider) and use OpenAI embedding models.

Semantic databases differ from normal keyword-based databases such as SQL databases: they return results with the closest semantic meaning, not just exact matches. For example, searching for "vanilla ice cream" would return results like "vanilla ice cream", "vanilla custard", "chocolate ice cream", and "ice cream sundae". This makes them a great fit for retrieval use cases where you want to find content relevant to a particular topic or user request, and it is the foundation of virtually every chatbot "trained on your data".
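To make this concrete, here is a minimal sketch of how embedding-based semantic ranking works, assuming the openai Python package, an OPENAI_API_KEY in your environment, and an illustrative model name and toy corpus; inside a Memory Store, AirOps manages the embedding and retrieval for you.

```python
# A minimal sketch of semantic search: embed the texts, then rank by cosine
# similarity. The model name and toy corpus are illustrative assumptions.
import math
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

corpus = [
    "vanilla ice cream",
    "vanilla custard",
    "chocolate ice cream",
    "ice cream sundae",
    "quarterly sales report",
]

def embed(texts: list[str]) -> list[list[float]]:
    """Turn texts into embedding vectors with an OpenAI embedding model."""
    response = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [item.embedding for item in response.data]

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Higher values mean closer semantic meaning."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

corpus_vectors = embed(corpus)
query_vector = embed(["vanilla ice cream"])[0]

# The ice-cream and custard entries rank highest even without exact keyword
# matches, while the unrelated sales report falls to the bottom.
ranked = sorted(
    zip(corpus, corpus_vectors),
    key=lambda pair: cosine_similarity(query_vector, pair[1]),
    reverse=True,
)
for text, vector in ranked:
    print(f"{cosine_similarity(query_vector, vector):.3f}  {text}")
```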

Configuring a Memory Store

Setting up a Memory Store is easy. All you need to have ready is your content.

Supported Content Types

AirOps currently supports the following content types for populating Memory Stores:

  1. File Uploads -- Add individual .pdf, .csv, and .txt files

  2. Sync Google Drive File -- Connect a Google Drive file (supports Google Docs, Google Sheets, and .pdf files)

  3. Sync SQL Database -- Import data directly from your connected database(s) via a SQL query

  4. Import from URL -- Import content directly from a specific URL

  5. Import Sitemap -- Crawl and import content from the pages listed in a sitemap

Currently, individual documents must be loaded one at a time. If you have more advanced content-loading requirements, please speak with our team.

Filtering Memory Stores with Metadata

If you are dealing with a large amount of data, you can scope your search using metadata filters. When you upload a CSV or Google Sheet, any additional columns are attached to each Memory Store chunk as metadata, and you can then scope your searches to only the chunks whose metadata matches your filter, as shown in the sketch below.
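
Here is a minimal, self-contained sketch of how metadata scoping narrows a search, assuming chunks shaped like those produced from a CSV upload (text plus the extra columns as metadata); the column names used ("country", "category") are hypothetical examples, not required field names.

```python
# A minimal sketch of metadata scoping: filter chunks by metadata first, then
# run the semantic ranking (see the embedding example above) over the smaller,
# more relevant subset. Field names here are hypothetical.
chunks = [
    {"text": "Refund policy for EU customers", "metadata": {"country": "EU", "category": "policy"}},
    {"text": "Refund policy for US customers", "metadata": {"country": "US", "category": "policy"}},
    {"text": "Holiday ice cream flavors",      "metadata": {"country": "US", "category": "blog"}},
]

def scope(chunks: list[dict], **filters: str) -> list[dict]:
    """Keep only chunks whose metadata matches every filter key/value pair."""
    return [
        chunk for chunk in chunks
        if all(chunk["metadata"].get(key) == value for key, value in filters.items())
    ]

# Only the US policy chunk survives the filter; semantic search then ranks
# just these chunks instead of the whole store.
us_policy_chunks = scope(chunks, country="US", category="policy")
print([chunk["text"] for chunk in us_policy_chunks])
```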

For more information on configuring and filtering on Metadata, please see our Memory Stores Metadata documentation page.

Where to Use Memory Stores

Memory Stores are core building blocks for Agents and also play a role in many Workflows.

Using metadata makes Memory Stores particularly useful in content creation workflows, where you want to provide your LLM with relevant context so it can complete its request.

Memory Stores can also be used to maintain state in a workflow. For example, if you want an LLM step to remember what decision it made in a previous execution, you can use a Memory Store to achieve that.
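
Here is a minimal sketch of that state-keeping pattern, using a plain Python list as a stand-in for the Memory Store; in AirOps the write and lookup would be steps in your workflow, and the record shape and names used here are illustrative assumptions, not an AirOps API.

```python
# A minimal sketch of keeping state across executions: write the decision with
# metadata at the end of one run, look it up at the start of the next.
# The list below is only a stand-in for a Memory Store.
from datetime import datetime, timezone

store: list[dict] = []  # stand-in for a Memory Store

def remember_decision(step: str, decision: str) -> None:
    """End of an execution: persist what the LLM step decided, tagged with metadata."""
    store.append({
        "text": decision,
        "metadata": {"step": step, "recorded_at": datetime.now(timezone.utc).isoformat()},
    })

def recall_latest_decision(step: str) -> str | None:
    """Start of the next execution: fetch the most recent decision for this step."""
    matches = [record for record in store if record["metadata"]["step"] == step]
    if not matches:
        return None
    return max(matches, key=lambda record: record["metadata"]["recorded_at"])["text"]

remember_decision("outline_review", "Approved outline v2; expand the FAQ section next run.")
print(recall_latest_decision("outline_review"))
```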

Example

Watch our video walkthrough to see how quickly you can configure a Memory Store, and where it can be applied.
