Difference between running the tool function locally and letting AISuite run it

Hi,

I am trying to understand the example in Module 3 of the Agentic AI course (M3_UGL_1), specifically the differences between the two ways of generating the output.

When we include the code to run the tool locally:

response2 = None

# Create a condition in case tool_calls is in the response object
if response.choices[0].message.tool_calls:
    # Pull out the specific tool metadata from the response
    tool_call = response.choices[0].message.tool_calls[0]
    args = json.loads(tool_call.function.arguments)

    # Run the tool locally
    tool_result = get_current_time()

    # Append the result to the messages list
    messages.append(response.choices[0].message)
    messages.append({
        "role": "tool", "tool_call_id": tool_call.id, "content": str(tool_result)
    })

    # Send the list of messages with the newly appended results back to the LLM
    response2 = client.chat.completions.create(
        model="openai:gpt-4o",
        messages=messages,
        tools=tools,
    )

print(response2.choices[0].message.content)

We are doing this to run the tool locally.

I say this because the information regarding the tool (the Python function get_current_time()) is defined earlier and is actually provided to the aisuite client:

response = client.chat.completions.create(
    model="openai:gpt-4o",
    messages=messages,
    tools=[get_current_time],
    max_turns=5
)
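For reference, a tool like get_current_time can be an ordinary Python function; aisuite builds the tool description it sends to the model from the function itself. The exact definition in the course notebook may differ, but a minimal sketch looks like this:

```python
from datetime import datetime

def get_current_time() -> str:
    """Return the current local time as an ISO-8601 string."""
    # A plain Python function with a docstring and type hints is
    # enough for it to be passed in the tools=[...] list.
    return datetime.now().isoformat()
```

The docstring and annotations matter here, since they are what the library can use to describe the tool to the LLM.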

So the difference is that here we are running the function locally and controlling the function call ourselves, whereas in the first implementation the tool is executed by the aisuite library.

My understanding is that in both cases the execution is done locally, but in one case it is executed directly in our code and in the other indirectly through the logic inside aisuite. Am I right, or is there something else?

Thank you in advance.

Hi Jesus_Martos,

Good question.

Your interpretation is in line with the aisuite docs: GitHub - andrewyng/aisuite: Simple, unified interface to multiple Generative AI providers. Under "2. Automatic Tool Execution", the steps taken by aisuite are summarized:

When max_turns is specified, aisuite will:

1. Send your message to the LLM
2. Execute any tool calls the LLM requests
3. Send the tool results back to the LLM
4. Repeat until the conversation is complete or max_turns is reached

If you want to see the code, have a look at aisuite/aisuite/client.py at main · andrewyng/aisuite · GitHub, where you will find the following snippet:

while turns < max_turns:
    # Make the API call
    response = provider.chat_completions_create(model_name, messages, **kwargs)
    response = self._extract_thinking_content(response)

    # Store intermediate response
    intermediate_responses.append(response)

    # Check if there are tool calls in the response
    tool_calls = (
        getattr(response.choices[0].message, "tool_calls", None)
        if hasattr(response, "choices")
        else None
    )

    # Store the model's message
    intermediate_messages.append(response.choices[0].message)

    if not tool_calls:
        # Set the intermediate data in the final response
        response.intermediate_responses = intermediate_responses[
            :-1
        ]  # Exclude final response
        response.choices[0].intermediate_messages = intermediate_messages
        return response

    # Execute tools and get results
    results, tool_messages = tools_instance.execute_tool(tool_calls)

    # Add tool messages to intermediate messages
    intermediate_messages.extend(tool_messages)

    # Add the assistant's response and tool results to messages
    messages.extend([response.choices[0].message, *tool_messages])

    turns += 1

You can see that aisuite calls the model, then parses the output, and if the output contains a tool call, executes the tool locally (i.e., on the deeplearning.ai platform if run there, or truly locally if the notebook runs on your own device). Then it adds the tool results to the messages.
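To make "executes the tool locally" concrete, here is a minimal sketch of what that dispatch step amounts to. The names (execute_tool_calls, registry) and the plain-dict tool-call shape are mine for illustration; aisuite's real execute_tool works with response objects, but the idea is the same: look up the requested function by name, parse its JSON arguments, call it, and wrap the result in a "role": "tool" message.

```python
import json

def execute_tool_calls(tool_calls, registry):
    """Dispatch each tool call to a local Python function and
    collect the results as "tool" messages for the next LLM turn."""
    tool_messages = []
    for call in tool_calls:
        fn = registry[call["function"]["name"]]           # find the local function
        args = json.loads(call["function"]["arguments"])  # arguments arrive as JSON text
        result = fn(**args)                               # the actual local execution
        tool_messages.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": str(result),
        })
    return tool_messages

# Usage: one fake tool call asking for the time
registry = {"get_current_time": lambda: "2024-01-01T12:00:00"}
calls = [{"id": "call_1",
          "function": {"name": "get_current_time", "arguments": "{}"}}]
print(execute_tool_calls(calls, registry))
```

Whether you write this loop yourself (the first snippet in your post) or let aisuite run it for you (max_turns), the Python function ultimately executes in the same process as your notebook.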

Your understanding about local execution is consistent with this logic.