Anyone have any information on how to run/create OpenAI agents running against a local LLM (llamacpp)?
hansson0728 · 12 months ago
Anyone have any information on how to run/create OpenAI agents running against a local LLM (llamacpp)?
WhiteFang_Jr · 12 months ago
You want to use a local LLM?
I guess defining a service_context with the local LLM and embed model should work.
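A minimal sketch of that idea, assuming the legacy llama_index ServiceContext API with a LlamaCPP model and a HuggingFace embedding model; the model path, embedding model name, and data directory below are placeholders, not anything from the thread:

```python
# Sketch: point a ServiceContext at a local llama.cpp model instead of OpenAI.
from llama_index import ServiceContext, VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms import LlamaCPP
from llama_index.embeddings import HuggingFaceEmbedding

# Any local GGUF model works here; the path is a placeholder.
llm = LlamaCPP(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
)

# Local embedding model so nothing calls out to OpenAI.
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)

# Build an index and query it entirely against the local models.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()
print(query_engine.query("What is in these documents?"))
```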
hansson0728 · 12 months ago
maybe
LORKA · 12 months ago
It may not work as intended due to the LLM's limitations. Somewhere in the LlamaIndex docs there is a table of popular LLMs and their capabilities.
LORKA · 12 months ago
Can't find it now.
WhiteFang_Jr · 12 months ago
Yeah, it won't work as well as OpenAI. Compatibility report on open-source LLMs:
https://docs.llamaindex.ai/en/stable/module_guides/models/llms.html#open-source-llms
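For the agent part of the question: OpenAIAgent relies on the OpenAI function-calling API, so a llama.cpp model is usually paired with a ReAct-style agent instead. A hedged sketch under that assumption, again using the legacy llama_index imports and a placeholder model path:

```python
# Sketch: a ReAct agent driven by a local llama.cpp model rather than OpenAIAgent.
from llama_index.agent import ReActAgent
from llama_index.llms import LlamaCPP
from llama_index.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# Placeholder path to a local GGUF model.
llm = LlamaCPP(model_path="./models/llama-2-7b-chat.Q4_K_M.gguf")

# The function's docstring becomes the tool description the agent reasons over.
agent = ReActAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply)],
    llm=llm,
    verbose=True,
)
print(agent.chat("What is 12 times 7?"))
```

How well the ReAct loop holds together still depends on the model, which is what the compatibility table linked above is reporting on.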
LORKA · 12 months ago
Thanks, I was still looking for it.