romilly
How can I load a persisted index when I'm using a local LLM?
I am using this code:

storage_context = StorageContext.from_defaults(persist_dir="../data/datastore")
index = load_index_from_storage(storage_context, llm=llm)
where llm is an instance of LlamaCPP.
I get this error: Could not load OpenAI model. If you intended to use OpenAI, please check your OPENAI_API_KEY.