In the chapter about tagging, when I try to run the examples locally with the latest version of LangChain, I get a deprecation warning that convert_pydantic_to_openai_function is going away; the message says to use convert_to_openai_function instead. I was able to convert the extraction example to use that function together with the @tool decorator. However, it only ever pulls a single name/age pair from the text, independently of what I pass to the model. Any ideas would be appreciated. Here is the code:
from langchain.tools import tool
from langchain_core.prompts import ChatPromptTemplate
from langchain.output_parsers.openai_functions import JsonOutputFunctionsParser
from langchain_core.utils.function_calling import convert_to_openai_function
from pprint import pprint
from langchain_openai import ChatOpenAI
model = ChatOpenAI(temperature=0)
from typing import Optional

@tool
def info(name: str, age: Optional[int] = None):
    """Information to extract."""
    return [{"name": name, "age": age}]
def with_decorator():
    print(info.name)
    print(info.description)
    print(info.args)

tools = [info]
extraction_functions = [convert_to_openai_function(t) for t in tools]
pprint(extraction_functions)
extraction_model = model.bind(functions=extraction_functions, function_call={"name": "info"})
# print(info.run({"name": "Tom", "age": 20}))
# print(extraction_model.invoke("Joe is 30, his mom is Martha"))
prompt = ChatPromptTemplate.from_messages([
    ("system", "Extract the relevant information, if not explicitly provided do not guess. Extract partial info"),
    ("human", "{input}")
])
extraction_chain = prompt | extraction_model | JsonOutputFunctionsParser()
# print(extraction_chain.invoke({"input": "Joe is 30, his mom is Martha"}))
print(extraction_chain.invoke({"input": "Tom is 20. Joe is 30"}))