I apologize, but I don't have enough specific information in the provided knowledge sources to say exactly why you're seeing a "symbols etc error" with the OpenAI function calling API in LlamaIndex. However, here is some general guidance on using OpenAI function calling with LlamaIndex that may help:
- Model Support: Make sure you're using an OpenAI model that supports the function calling API. As noted in the LlamaIndex documentation, you need a model like "gpt-3.5-turbo-0613" or newer.
- LLM Configuration: When setting up your LLM, ensure you're using the OpenAI class from LlamaIndex correctly. For example:

```python
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo-0613")
```
- Function Definition: Make sure your functions are defined in the format the API expects. LlamaIndex provides utilities to help with this, such as `to_openai_function`.
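To illustrate what such a utility produces, here is a minimal sketch of the OpenAI function-calling schema in plain Python (no LlamaIndex required). The `get_weather` tool, its parameters, and its description are made up for illustration; `to_openai_function` generates this kind of structure for you from a Pydantic model.

```python
# A hand-built example of the JSON-schema-style dict that the OpenAI
# function calling API expects for each tool. Utilities like
# to_openai_function produce this format automatically; the
# "get_weather" tool below is hypothetical.
def make_function_schema():
    return {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "City name, e.g. 'Paris'",
                },
            },
            "required": ["city"],
        },
    }

schema = make_function_schema()
print(schema["name"])  # the tool name the model will call
```

If your function definitions don't match this shape (for example, a malformed `parameters` object), the API typically rejects the request, which can surface as an opaque error.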
- Error Checking: The LlamaIndex code includes checks to ensure you're using a compatible model. For instance:

```python
if not llm.metadata.is_function_calling_model:
    raise ValueError(
        f"Model name {llm.model} does not support function calling API."
    )
```
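As a stand-in for that check, here is a small sketch that mirrors the logic with an illustrative, hard-coded model list. The authoritative list lives inside LlamaIndex's LLM metadata; the models named below are only examples.

```python
# Mimics llm.metadata.is_function_calling_model with an illustrative,
# hard-coded set; LlamaIndex maintains the real mapping internally.
FUNCTION_CALLING_MODELS = {
    "gpt-3.5-turbo-0613",
    "gpt-4-0613",
}

def is_function_calling_model(model: str) -> bool:
    return model in FUNCTION_CALLING_MODELS

def validate_model(model: str) -> None:
    # Fail early with a clear message, like the LlamaIndex check,
    # instead of getting an opaque error at call time.
    if not is_function_calling_model(model):
        raise ValueError(
            f"Model name {model} does not support function calling API."
        )

validate_model("gpt-3.5-turbo-0613")  # passes silently
```

Running this kind of validation up front turns a confusing runtime failure into an explicit `ValueError` that names the incompatible model.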
If you're still encountering errors, it would help to see the exact error message and the code you're running. The error might relate to how you're defining or calling your functions, or there could be an issue with your OpenAI API key or permissions.