Hi, I'm trying to use https://llamahub.ai/l/llama_packs-dense_x_retrieval?from=llama_packs I'm wondering whether a llama_pack is a ready-to-use package, or more like a template we can build on? For example, I'd like to add a storage context for the vector store index, but I'm not sure whether I should contribute that to the llama_pack or just add the code locally.
Hi, I have a question about using LlamaIndex with Guidance, but for Ollama. Currently I think Guidance only supports OpenAI, llama.cpp, or transformers models. Would it be possible to use a LangChain model as well? LangChain has Ollama model support, so it could bridge the gap.
Hello, I'm trying to use LlamaIndex with tool calling. I'm wondering, is it possible to annotate a function parameter with a description? Like:
```python
from typing import Annotated

def get_temperature(location: Annotated[str, "The location"]) -> Annotated[float, "Degree"]:
    pass
```
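To make the question concrete: this is the behavior I'm hoping the framework could support. A plain-stdlib sketch of reading those descriptions back out of the signature (`describe_params` is a hypothetical helper, not a llama-index API):

```python
from typing import Annotated, get_type_hints

def get_temperature(location: Annotated[str, "The location"]) -> Annotated[float, "Degree"]:
    pass

# Hypothetical helper: pull the Annotated descriptions out of the signature,
# the way I'd expect a tool-calling framework to when building the tool schema.
def describe_params(fn):
    hints = get_type_hints(fn, include_extras=True)
    return {name: hint.__metadata__[0] for name, hint in hints.items()}

print(describe_params(get_temperature))
# -> {'location': 'The location', 'return': 'Degree'}
```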
Hi, with the release of the planning agent, I feel we could generalize the idea and have a team of agents playing different roles to handle a given task, similar to CrewAI or AutoGen. Would that be something the team is interested in building?
Quick question: Is there a public roadmap for llama-index as a framework? Or are features added as new use cases are discovered? I'd be interested in the general future direction of the framework: what are we trying to achieve in, say, 2024?
Hi, is there a guide for adding a custom agent based on AgentRunner and AgentWorker? Currently only OpenAIAgent and ReActAgent are supported, but I'd like to add a custom agent for Ollama as well.
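For context, here's roughly the shape I'm imagining, in plain Python. All class and method names below are hypothetical sketches of the runner/worker split, not real llama-index APIs:

```python
# Hypothetical sketch: a worker does one reasoning step; a runner drives it.
# Neither class is a real llama-index API.
class MyOllamaAgentWorker:
    """Would subclass llama-index's AgentWorker in practice."""

    def __init__(self, llm):
        # llm: any callable taking a prompt string, e.g. an Ollama client wrapper
        self.llm = llm

    def run_step(self, task: str) -> str:
        # One step: prompt the model and return its raw output.
        return self.llm(f"Task: {task}\nAnswer:")


class MyAgentRunner:
    """Would be llama-index's AgentRunner orchestrating the worker."""

    def __init__(self, worker):
        self.worker = worker

    def chat(self, message: str) -> str:
        return self.worker.run_step(message)


# Usage with a stub LLM standing in for Ollama:
runner = MyAgentRunner(MyOllamaAgentWorker(lambda prompt: "stub answer"))
print(runner.chat("What is 6 * 7?"))  # -> stub answer
```

A guide showing where the real subclassing hooks are (step creation, state, streaming) would be super helpful.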
Hi, is there a plan to integrate https://python.useinstructor.com/ with llama-index? I'm looking for a way to stream structured output from our API server to the frontend.
Hi, is there any plan to upgrade to pydantic v2? I tried using Field from pydantic v2 as a function parameter in a tool, and llama-index gives a serialization error.