We're excited to announce the launch of the Riza Tools API!
The Tools API builds on our Code Interpreter API, allowing you to save functions and execute them on demand. It also adds support for structured inputs. Now, you can easily deploy and execute saved functions written by your users or an LLM!
If you're building an AI agent, you may find the Tools API useful as a deployment target for user-defined tools that your agent can invoke. Or going one step further, you could use the Tools API to allow an AI agent to write its own tools for later use. We'll explore this idea in a future blog post.
Head over to the API reference for detailed information about our new endpoints, or read on for a quick walk-through of setting up and executing a tool.
Creating a Tool
You can create a tool using the Create Tool API endpoint.
A Tool's source code must be a Python, JavaScript, or TypeScript function named `execute`. Let's create a simple tool that greets a user by name, using Python.
```python
def execute(input):
    print(f"Hello, {input['name']}!")
    return {"name": input["name"], "greeted": True}
```
We'll also include an `input_schema` to describe the shape of the `input` argument. This is a JSON Schema object, and it's used to validate inputs during executions.
Note: This `input_schema` is compatible with OpenAI's Function Calling and Anthropic's Tool Use APIs, allowing you to easily pass a Riza Tool to an LLM.
```json
{
  "type": "object",
  "properties": {
    "name": {
      "type": "string",
      "description": "The name of the person to greet"
    }
  },
  "required": ["name"]
}
```
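To make the validation step concrete, here's a rough, stdlib-only sketch of what a JSON Schema check like this does. Riza performs the real validation server-side; this toy `validate_input` helper is an illustration, not Riza's implementation, and it covers only required keys and basic string types (the JSON Schema spec covers far more).

```python
def validate_input(schema: dict, input: dict) -> list[str]:
    """Illustrative-only JSON Schema check: required keys plus basic
    string type matching. The real spec covers much more."""
    errors = []
    # Every key listed in "required" must be present
    for key in schema.get("required", []):
        if key not in input:
            errors.append(f"missing required property: {key}")
    # Each provided property must match its declared type
    for key, subschema in schema.get("properties", {}).items():
        if key in input and subschema.get("type") == "string":
            if not isinstance(input[key], str):
                errors.append(f"property {key} must be a string")
    return errors


schema = {
    "type": "object",
    "properties": {
        "name": {
            "type": "string",
            "description": "The name of the person to greet",
        }
    },
    "required": ["name"],
}

print(validate_input(schema, {"name": "David"}))  # []
print(validate_input(schema, {}))  # ['missing required property: name']
```

With a schema like this attached, an execution with input `{"name": "David"}` passes, while `{}` is rejected before your function ever runs.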
We'll create this Tool using our Python SDK and call it `greet_user`.
```python
import os

from rizaio import Riza

TOOL_CODE = """
def execute(input):
    print(f"Hello, {input['name']}!")
    return {"name": input["name"], "greeted": True}
"""

client = Riza(
    api_key=os.environ.get("RIZA_API_KEY"),
)

tool = client.tools.create(
    code=TOOL_CODE,
    language="PYTHON",
    name="greet_user",
    input_schema={
        "type": "object",
        "properties": {
            "name": {
                "type": "string",
                "description": "The name of the person to greet",
            }
        },
        "required": ["name"],
    },
    description="Greet a user by name",
)
```
Executing a Tool
You can execute your Tool using the Riza Execute Tool API endpoint. You'll need the `id` of the Tool you created.
Here's how we can execute our `greet_user` Tool using our Python SDK:
```python
import os

from rizaio import Riza

TOOL_ID = "..."  # The id of the Tool we created

client = Riza(
    api_key=os.environ.get("RIZA_API_KEY"),
)

response = client.tools.exec(
    id=TOOL_ID,
    input={"name": "David"},
)
```
The response will look like this:
```json
{
  "output": {
    "name": "David",
    "greeted": true
  },
  "execution": {
    "exit_code": 0,
    "stdout": "Hello, David!\n",
    "stderr": ""
  }
}
```
The `output` property in the API response is the return value of the Tool's `execute` function. You can also see the output of any print statements within the function under `execution['stdout']`.
The Execute Tool API endpoint also supports the same HTTP authentication features as our Execute API endpoint, so your Tool executions can securely make outbound HTTP requests.
Creating and editing Tools in the Riza Dashboard
We built a UI for managing your Tools in the Riza Dashboard. You can create, edit, or execute your Tools from your browser.
Using Secrets in Tool Executions
We've also introduced a Secrets API that allows you to securely manage secrets for your Tools. You can store secrets on Riza using this API and attach them to Tools at execution time, rather than passing secrets in each Tool execution.
To use a stored Secret in a Tool execution, use the `secret_id` parameter instead of the regular `password` or `token` parameters when configuring allowed `http` hosts.
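As a sketch of the idea, the difference is whether the execution request carries a raw credential or a reference to a stored Secret. The field names below (`allow`, `auth`, `bearer`, and the example host) are illustrative assumptions, not verified against the API; consult the API reference for the actual shape of the `http` configuration.

```python
# Hypothetical http configuration shapes -- field names are assumptions
# for illustration; see the Riza API reference for the real structure.

# Before: a raw token is embedded in every execution request
http_config_inline = {
    "allow": [
        {"host": "api.example.com", "auth": {"bearer": {"token": "raw-token-123"}}}
    ]
}

# After: the token lives in Riza's Secrets store and is referenced by id,
# so the credential itself never appears in the execution request
http_config_with_secret = {
    "allow": [
        {"host": "api.example.com", "auth": {"bearer": {"secret_id": "secret_abc"}}}
    ]
}
```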
Using Tools with LLMs
Our Tools API integrates seamlessly with the function calling APIs of the most popular LLMs. Here's an example of how to call a Riza Tool using OpenAI's Function Calling API.
```python
import json

from openai import OpenAI
from rizaio import Riza

TOOL_ID = "..."  # The id of the tool we created

riza_client = Riza()
tool = riza_client.tools.get(TOOL_ID)

openai_client = OpenAI()

tools = [
    {
        "type": "function",
        "function": {
            "name": tool.name,
            "description": tool.description,
            # Our input schema is a JSON Schema object, which matches
            # OpenAI's (and Anthropic's) function calling API
            "parameters": tool.input_schema,
        },
    }
]

messages = [{"role": "user", "content": "Greet David"}]

completion = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    tools=tools,
)
messages.append(completion.choices[0].message)

# Assuming the LLM called our tool in its response...
tool_call = completion.choices[0].message.tool_calls[0]
arguments = json.loads(tool_call.function.arguments)

# Actually execute the tool on Riza
tool_exec = riza_client.tools.exec(tool.id, input=arguments)
print(tool_exec)

# Return the result of the tool call to the LLM
function_call_result_message = {
    "role": "tool",
    "content": json.dumps(tool_exec.output),
    "tool_call_id": tool_call.id,
}
messages.append(function_call_result_message)

completion = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=messages,  # The LLM will now have the result of the tool call
)
```
The above is a very simple example. You could also use Riza's Tools API to maintain a dynamic set of hosted Tools and pass them to your LLM calls the same way:
```python
# Get all of our Riza Tools
riza_tools = riza_client.tools.list()

# Convert them to the format expected by the LLM's function calling API
llm_tools = [
    {
        "type": "function",
        "function": {
            "name": tool.name,
            "description": tool.description,
            "parameters": tool.input_schema,
        },
    }
    for tool in riza_tools.tools
]

messages = [{"role": "user", "content": "Greet David"}]

completion = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    tools=llm_tools,
)
print(completion)
```
Next steps
Getting started with the API is free, so head to the Riza Dashboard to get an API key, or pop into Discord to ask questions. And if you build something fun with the API, be sure to let us know!