Function calling
Certain chat models, like OpenAI's, have a function-calling API that lets you describe functions and their arguments and have the model return a JSON object with a function to invoke and the inputs to that function. Function calling is extremely useful for building tool-using chains and agents, and for getting structured outputs from models more generally.
LangChain comes with a number of utilities to make function calling easy. Namely, it comes with:
- simple syntax for binding functions to models
- converters for formatting various types of objects to the expected function schemas
- output parsers for extracting the function invocations from API responses
We'll focus here on the first two bullets. To see how output parsing works as well, check out the OpenAI Tools output parsers.
Defining functions
We'll focus on the OpenAI function format here, since as of this writing OpenAI is the main model provider that supports function calling. LangChain has a built-in converter that can turn Python functions, Pydantic classes, and LangChain Tools into the OpenAI function format:
```python
%pip install -qU langchain-core langchain-openai
```
Python function
```python
import json

from langchain_core.utils.function_calling import convert_to_openai_tool


def multiply(a: int, b: int) -> int:
    """Multiply two integers together.

    Args:
        a: First integer
        b: Second integer
    """
    return a * b


print(json.dumps(convert_to_openai_tool(multiply), indent=2))
```
```json
{
  "type": "function",
  "function": {
    "name": "multiply",
    "description": "Multiply two integers together.",
    "parameters": {
      "type": "object",
      "properties": {
        "a": {
          "type": "integer",
          "description": "First integer"
        },
        "b": {
          "type": "integer",
          "description": "Second integer"
        }
      },
      "required": [
        "a",
        "b"
      ]
    }
  }
}
```
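To make the converter feel less magic, here is a simplified, stdlib-only sketch of the kind of transformation `convert_to_openai_tool` performs on a plain Python function. The real converter does much more (parsing `Args:` docstring sections into per-parameter descriptions, handling Pydantic classes and Tools, nested types); `to_openai_tool` and `_JSON_TYPES` are illustrative names, not LangChain APIs.

```python
import inspect
import json
from typing import get_type_hints

# Map of basic Python annotations to JSON Schema types; the real converter
# supports far more (lists, optionals, nested models, etc.).
_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}


def to_openai_tool(func):
    """Build an OpenAI-format tool schema from a typed Python function."""
    hints = get_type_hints(func)
    hints.pop("return", None)
    sig = inspect.signature(func)
    properties = {
        name: {"type": _JSON_TYPES[hints[name]]} for name in sig.parameters
    }
    # Parameters without defaults are required.
    required = [
        name
        for name, p in sig.parameters.items()
        if p.default is inspect.Parameter.empty
    ]
    return {
        "type": "function",
        "function": {
            "name": func.__name__,
            # First docstring line becomes the description; per-arg
            # descriptions from the docstring are omitted in this sketch.
            "description": (func.__doc__ or "").strip().split("\n")[0],
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }


def multiply(a: int, b: int) -> int:
    """Multiply two integers together."""
    return a * b


schema = to_openai_tool(multiply)
print(json.dumps(schema, indent=2))
```

The output has the same overall shape as the `convert_to_openai_tool` result above, minus the per-parameter descriptions that the docstring parser would add.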
Pydantic class
```python
from langchain_core.pydantic_v1 import BaseModel, Field


class multiply(BaseModel):
    """Multiply two integers together."""

    a: int = Field(..., description="First integer")
    b: int = Field(..., description="Second integer")


print(json.dumps(convert_to_openai_tool(multiply), indent=2))
```
```json
{
  "type": "function",
  "function": {
    "name": "multiply",
    "description": "Multiply two integers together.",
    "parameters": {
      "type": "object",
      "properties": {
        "a": {
          "description": "First integer",
          "type": "integer"
        },
        "b": {
          "description": "Second integer",
          "type": "integer"
        }
      },
      "required": [
        "a",
        "b"
      ]
    }
  }
}
```
LangChain Tool
```python
from typing import Any, Type

from langchain_core.tools import BaseTool


class MultiplySchema(BaseModel):
    """Multiply tool schema."""

    a: int = Field(..., description="First integer")
    b: int = Field(..., description="Second integer")


class Multiply(BaseTool):
    args_schema: Type[BaseModel] = MultiplySchema
    name: str = "multiply"
    description: str = "Multiply two integers together."

    def _run(self, a: int, b: int, **kwargs: Any) -> Any:
        return a * b


# Note: we're passing in a Multiply object, not the class itself.
print(json.dumps(convert_to_openai_tool(Multiply()), indent=2))
```
```json
{
  "type": "function",
  "function": {
    "name": "multiply",
    "description": "Multiply two integers together.",
    "parameters": {
      "type": "object",
      "properties": {
        "a": {
          "description": "First integer",
          "type": "integer"
        },
        "b": {
          "description": "Second integer",
          "type": "integer"
        }
      },
      "required": [
        "a",
        "b"
      ]
    }
  }
}
```
Binding functions
Now that we've defined a function, we'll want to pass it to our model.
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo")
llm.invoke("what's 5 times three", tools=[convert_to_openai_tool(multiply)])
```

```text
AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_JvOu9oUwMrQHiDekZTbpNCHY', 'function': {'arguments': '{\n  "a": 5,\n  "b": 3\n}', 'name': 'multiply'}, 'type': 'function'}]})
```
And if we want this tool to be passed in every time we call the model, we can bind it to the model:
```python
llm_with_tool = llm.bind(tools=[convert_to_openai_tool(multiply)])
llm_with_tool.invoke("what's 5 times three")
```

```text
AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_cwRoTnD1ux1SnWXLrTj2KlWH', 'function': {'arguments': '{\n  "a": 5,\n  "b": 3\n}', 'name': 'multiply'}, 'type': 'function'}]})
```
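The model never executes anything itself; it only returns which function to call and with what arguments, and it's up to our code to parse the `tool_calls` entries in `additional_kwargs` and invoke the matching Python function. A minimal, stdlib-only sketch of that dispatch step (the payload mirrors the output above; the call `id` is illustrative):

```python
import json


def multiply(a: int, b: int) -> int:
    return a * b


# A tool_calls payload like the one in additional_kwargs above.
tool_calls = [
    {
        "id": "call_cwRoTnD1ux1SnWXLrTj2KlWH",
        "function": {"arguments": '{\n  "a": 5,\n  "b": 3\n}', "name": "multiply"},
        "type": "function",
    }
]

# Look up each named function and call it with the JSON-decoded arguments.
registry = {"multiply": multiply}
results = [
    registry[call["function"]["name"]](**json.loads(call["function"]["arguments"]))
    for call in tool_calls
]
print(results)  # [15]
```

In practice LangChain's output parsers (covered in the Next steps links) handle the extraction half of this for you.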
We can also enforce that a tool is called using the `tool_choice` parameter.
```python
llm_with_tool = llm.bind(
    tools=[convert_to_openai_tool(multiply)],
    tool_choice={"type": "function", "function": {"name": "multiply"}},
)
llm_with_tool.invoke(
    "don't answer my question. no do answer my question. no don't. what's five times four"
)
```

```text
AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_sWjLyioSZAtYMQRLMTzncz1v', 'function': {'arguments': '{\n  "a": 5,\n  "b": 4\n}', 'name': 'multiply'}, 'type': 'function'}]})
```
The `ChatOpenAI` class even comes with a `bind_tools` helper function that handles converting function-like objects to the OpenAI format and binding them for you:
```python
llm_with_tool = llm.bind_tools([multiply], tool_choice="multiply")
llm_with_tool.invoke("what's 5 times three")
```

```text
AIMessage(content='', additional_kwargs={'tool_calls': [{'id': 'call_LCdBa4cbhMJPRdtkhDzpRh7x', 'function': {'arguments': '{\n  "a": 5,\n  "b": 3\n}', 'name': 'multiply'}, 'type': 'function'}]})
```
Legacy args `functions` and `function_call`
Until the fall of 2023, the OpenAI API expected the arguments `functions` and `function_call` instead of `tools` and `tool_choice`, and they had a slightly different format than `tools` and `tool_choice`. LangChain maintains utilities for using the old API if you need to use that as well:
```python
from langchain_core.utils.function_calling import convert_to_openai_function

convert_to_openai_function(multiply)
```

```text
{'name': 'multiply',
 'description': 'Multiply two integers together.',
 'parameters': {'type': 'object',
  'properties': {'a': {'description': 'First integer', 'type': 'integer'},
   'b': {'description': 'Second integer', 'type': 'integer'}},
  'required': ['a', 'b']}}
```
```python
llm_with_functions = llm.bind(
    functions=[convert_to_openai_function(multiply)],
    function_call={"name": "multiply"},
)
llm_with_functions.invoke("what's 3 times a million")
```

```text
AIMessage(content='', additional_kwargs={'function_call': {'arguments': '{\n  "a": 3,\n  "b": 1000000\n}', 'name': 'multiply'}})
```
```python
llm_with_functions = llm.bind_functions([multiply], function_call="multiply")
llm_with_functions.invoke("what's 3 times a million")
```

```text
AIMessage(content='', additional_kwargs={'function_call': {'arguments': '{\n  "a": 3,\n  "b": 1000000\n}', 'name': 'multiply'}})
```
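Note that the legacy `functions` schema is just the inner `"function"` object of the newer `tools` schema, so migrating a legacy schema by hand is a matter of wrapping it. A stdlib-only sketch (the `legacy` dict is copied from the `convert_to_openai_function` output above):

```python
# The legacy `functions` entry, as produced by convert_to_openai_function.
legacy = {
    "name": "multiply",
    "description": "Multiply two integers together.",
    "parameters": {
        "type": "object",
        "properties": {
            "a": {"description": "First integer", "type": "integer"},
            "b": {"description": "Second integer", "type": "integer"},
        },
        "required": ["a", "b"],
    },
}

# The newer `tools` entry wraps the same object with a "type" tag.
tool = {"type": "function", "function": legacy}

# Likewise, forcing a call: function_call -> tool_choice.
function_call = {"name": "multiply"}
tool_choice = {"type": "function", "function": function_call}
```

This is why `convert_to_openai_tool(multiply)` earlier produced the same parameter schema as `convert_to_openai_function(multiply)`, just nested one level deeper.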
Next steps
- Output parsing: See OpenAI Tools output parsers and OpenAI Functions output parsers to learn about extracting function-calling API responses into various formats.
- Tool use: See how to construct chains and agents that actually call the invoked tools in these guides.