# Genie Invokers
Background tasks in the Genie framework are "invoked" using a growing set of Invokers. Many of these Invokers talk to external systems such as Large Language Model APIs or databases. They are the Genie way of executing a longer-running process in the background.
## Interface
The paradigm of an Invoker is deliberately left simple: string in, string out. This means that, for example, for a straightforward LLM Invoker that just interprets and responds to a prompt, the input can be the verbatim prompt that is sent to the LLM. The response, again, is a string with the result of the LLM call.
More complex Invokers differ. The Searx Invoker, for example, which we use for web searches, takes the search query as a string but returns a JSON result: technically still a string, but one with semantics that can be interpreted.
If an Invoker requires a more structured input, it typically expects a JSON object containing these details. But this depends on the Invoker: the chat completion Invokers, for instance, expect YAML, because with long texts the YAML format is easier on the developers' eyes.
Each Invoker documents how it expects its input and how it returns its response.
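To make the string-in, string-out contract concrete, here is a minimal sketch of what an Invoker could look like. The class and method names below are illustrative assumptions, not the actual `genie_flow_invoker` API:

```python
import json


class EchoInvoker:
    """Illustrative Invoker sketch: the real base class and method
    names in genie_flow_invoker may differ."""

    def invoke(self, content: str) -> str:
        # string in ...
        result = {"echo": content, "length": len(content)}
        # ... string out: structured results are serialized, e.g. as JSON
        return json.dumps(result)


invoker = EchoInvoker()
response = invoker.invoke("what is an invoker?")
parsed = json.loads(response)  # the caller interprets the string's semantics
```

The point is that even a structured result crosses the Invoker boundary as a plain string; the caller decides how to interpret it.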
## Configuration
The Genie framework uses templates to construct the input that goes into an Invoker. These templates sit in a directory, together with a file called `meta.yaml`. This is the file in which the invoker is configured.
For example:

```yaml
invoker:
  type: genie_flow_invoker.invoker.ollama.OllamaChatInvoker
  ollama_url: localhost:11434
  model: gemma3:1b
```
This file tells the Genie framework that all templates in that same directory are to be used with the Ollama Chat Invoker. It then informs the Ollama Invoker of the necessary parameters, such as the `ollama_url` and `model`.
For deployment purposes, many of these parameters can be overridden by environment variables. For instance, defining an environment variable `OLLAMA_URL` will override whatever has been specified as the `ollama_url` in any of the `meta.yaml` files.
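The override mechanism can be pictured roughly like this. The helper below is a hypothetical sketch of the resolution order, not the framework's actual code:

```python
import os


def resolve_param(name: str, meta_config: dict):
    """Hypothetical sketch: an environment variable named after the
    upper-cased parameter takes precedence over the meta.yaml value."""
    return os.environ.get(name.upper(), meta_config.get(name))


# values as they would be parsed from meta.yaml
meta = {"ollama_url": "localhost:11434", "model": "gemma3:1b"}

os.environ["OLLAMA_URL"] = "ollama.internal:11434"  # deployment override

url = resolve_param("ollama_url", meta)  # env var wins
model = resolve_param("model", meta)     # falls back to meta.yaml
```

This lets one set of templates run unchanged across environments, with only the deployment's environment variables differing.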
Some invokers also allow their parameters to be overridden in the template itself. For instance:
```yaml
invoker:
  type: genie_flow_invoker.invoker.weaviate.WeaviateSimilaritySearchRequestInvoker
  connection:
    http_host: localhost
    http_port: 8080
    http_secure: false
    grpc_host: localhost
    grpc_port: 50051
    grpc_secure: false
  query:
    collection_name: MyDemoCollection
    top: 3
    horizon: 0.75
    operation_level: -1
    parent_strategy: replace
```
This file sets some of the query parameters for a similarity search on a Weaviate server. The template for the query, however, may look like this:
```json
{
    "filename": "some_filename",
    "tenant_name": "Tenant-{{ session_id }}",
    "query_embedding": {{ map_value }},
    "top": 12,
    "horizon": 0.8
}
```
Here we can see that the parameters `top` and `horizon` are overridden by values in the template.
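Conceptually, the invoker lays the template's payload over the `meta.yaml` query defaults. The merge below is an illustrative sketch of that precedence, not the Weaviate invoker's actual implementation:

```python
# defaults as parsed from the query section of meta.yaml
meta_query = {
    "collection_name": "MyDemoCollection",
    "top": 3,
    "horizon": 0.75,
    "operation_level": -1,
    "parent_strategy": "replace",
}

# values as parsed from the rendered JSON template
template_payload = {
    "filename": "some_filename",
    "top": 12,
    "horizon": 0.8,
}

# template values take precedence over the meta.yaml defaults;
# keys the template does not mention keep their configured value
effective = {**meta_query, **template_payload}
```

With this precedence, `top` and `horizon` come from the template while `operation_level` and `parent_strategy` keep their `meta.yaml` values.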
## Hierarchical Configuration
The values for a `meta.yaml` file are compiled by rolling up through the templates directory. That means you can define general values in a directory and more specific values in subdirectories below it.

So if this `meta.yaml` lives in a directory `search/`:
```yaml
invoker:
  type: genie_flow_invoker.invoker.weaviate.WeaviateSimilaritySearchRequestInvoker
  connection:
    http_host: localhost
    http_port: 8080
    http_secure: false
    grpc_host: localhost
    grpc_port: 50051
    grpc_secure: false
  query:
    collection_name: MyDemoCollection
```
And the subdirectory `search/wide/` would have a `meta.yaml` like this:
```yaml
invoker:
  query:
    top: 3
    horizon: 0.75
    operation_level: -1
    parent_strategy: replace
```
Then the resulting configuration for templates in `search/wide/` would be the same as the consolidated example above.
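The roll-up can be sketched as a recursive merge of parent and child `meta.yaml` contents, where child values win and nested mappings are merged rather than replaced. This `deep_merge` helper is an assumption about the behavior, not the framework's implementation:

```python
def deep_merge(parent: dict, child: dict) -> dict:
    """Sketch of the assumed roll-up: child values win; nested
    dicts are merged recursively instead of being replaced."""
    merged = dict(parent)
    for key, value in child.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged


search_meta = {  # as parsed from search/meta.yaml
    "invoker": {
        "type": "genie_flow_invoker.invoker.weaviate."
                "WeaviateSimilaritySearchRequestInvoker",
        "query": {"collection_name": "MyDemoCollection"},
    }
}
wide_meta = {  # as parsed from search/wide/meta.yaml
    "invoker": {"query": {"top": 3, "horizon": 0.75}}
}

config = deep_merge(search_meta, wide_meta)
# config["invoker"]["query"] now holds the collection name
# from the parent plus top and horizon from the subdirectory
```

Because the merge is per-key, the subdirectory only needs to state what differs from its parent.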
## List of Invokers
We currently have the following invokers in place:
| Invoker | Description | Package |
|---|---|---|
| API | Generic API GET invoker | genie-flow-invoker-api |
| Azure AI Search | Conduct similarity search using Azure AI Search | genie-flow-invoker-azure-ai-search |
| Doc Proc | Process documents: parse, clean, chunk, embed, search | genie-flow-invoker-docproc |
| MS SQL Server | Retrieve, insert, and upsert data into MS SQL Server tables | genie-flow-invoker-ms-sql-server |
| Neo4j | Run Neo4j Cypher queries | genie-flow-invoker-neo4j |
| Ollama | Generation, chat completion, and embedding using Ollama | genie-flow-invoker-ollama |
| OpenAI | Generation, chat completion, and images, using OpenAI and OpenAI on Azure | genie-flow-invoker-openai |
| Redis | Insert and retrieve data from the Redis in-memory database | genie-flow-invoker-redis |
| Searx | Internet metasearch engine using SearXNG | genie-flow-invoker-searx |
| Weaviate | Advanced vector insertion and search using the Weaviate vector store | genie-flow-invoker-weaviate |