Hello, I'm facing an issue when it comes to installing the libraries for my app

Hello, I'm facing an issue when it comes to installing all the libraries needed for my app to run. I don't understand how to install these libraries:
[Attachment: image.png]
Hi, did you try a fresh installation of llama_index?
That is, either removing the previous installation completely and then reinstalling it, or installing it in a fresh new env?
Seems like your llama-index-core is not updated.
Try doing this: pip install llama-index-core
I would suggest a fresh env and then installing all of these!
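(For anyone following along: a fresh env plus the relevant integration packages might look roughly like this on Windows. The env name is arbitrary, and the Azure OpenAI / MongoDB packages are assumed from the imports that appear later in this thread.)

python -m venv fresh-env
fresh-env\Scripts\activate
pip install --upgrade llama-index-core llama-index-llms-azure-openai llama-index-embeddings-azure-openai llama-index-vector-stores-mongodb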
Hello, yes I tried that and with some research it worked, but now I'm facing this issue when it comes to running the app:
Import it like this:
from llama_index.core import StorageContext
What is "load_index_from_storage"? Is it something like "set_global_service_context"?
No, actually I had this line in my code. My bad, I copied the whole thing 😅
Oh okay, I see 😄
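(Side note for later readers: load_index_from_storage is not a service-context setter; it is the llama_index.core helper for reloading a previously persisted index. A minimal sketch, assuming an index was persisted to a ./storage directory:)

from llama_index.core import StorageContext, load_index_from_storage

# Rebuild the storage context from the directory the index was persisted to
storage_context = StorageContext.from_defaults(persist_dir="./storage")
# Reload the persisted index so it can be queried again
index = load_index_from_storage(storage_context)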
I'll show you what I have.
Try commenting out the first line and then run it again.
It didn't work. I'll give you the whole error:
Traceback (most recent call last):
File "c:\Projets\IA Chat Local2\Sources\AzureOpenAI\LocalIndex.py", line 7, in <module>
from llama_index.core.indices.vector_store import VectorStoreIndex
File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\indices__init.py", line 4, in <module> from llama_index.core.indices.composability.graph import ComposableGraph File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\indices\composability__init.py", line 4, in <module>
from llama_index.core.indices.composability.graph import ComposableGraph
File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\indices\composability\graph.py", line 5, in <module>
from llama_index.core.base.base_query_engine import BaseQueryEngine
File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\base\base_query_engine.py", line 7, in <module>
from llama_index.core.base.query_pipeline.query import (
File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\base\query_pipeline\query.py", line 23, in <module>
from llama_index.core.callbacks.base import CallbackManager
File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\callbacks__init.py", line 4, in <module> from .token_counting import TokenCountingHandler File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\callbacks\token_counting.py", line 6, in <module> from llama_index.core.utilities.token_counting import TokenCounter File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\utilities\token_counting.py", line 6, in <module> from llama_index.core.llms import ChatMessage, MessageRole File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\llms__init.py", line 12, in <module>
from llama_index.core.llms.custom import CustomLLM
File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\llms\custom.py", line 19, in <module>
from llama_index.core.llms.llm import LLM
File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\llms\llm.py", line 21, in <module>
from llama_index.core.base.query_pipeline.query import (
ImportError: cannot import name 'InputKeys' from partially initialized module 'llama_index.core.base.query_pipeline.query' (most likely due to a circular import) (C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\base\query_pipeline\query.py)
Also import service_context like this:
from llama_index.core import set_global_service_context
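(A minimal sketch of how ServiceContext and set_global_service_context are typically wired up with Azure OpenAI. The model names, deployment names, endpoint, and key below are placeholders, not values from this thread.)

from llama_index.core import ServiceContext, set_global_service_context
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding

# Placeholder Azure settings -- replace with your own deployment details
llm = AzureOpenAI(
    model="gpt-35-turbo",
    deployment_name="my-gpt-deployment",
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_key="...",
    api_version="2023-07-01-preview",
)
embed_model = AzureOpenAIEmbedding(
    model="text-embedding-ada-002",
    deployment_name="my-embedding-deployment",
    azure_endpoint="https://my-resource.openai.azure.com/",
    api_key="...",
    api_version="2023-07-01-preview",
)

# Build a service context and register it globally so indices pick it up
service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
set_global_service_context(service_context)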
Can you share the code so that I can correct the imports?
Okay, I'll give it to you.
import os
import pymongo
import ModelDef


#from llama_index.core.storage import StorageContext
from llama_index.core.indices.vector_store import VectorStoreIndex
from llama_index.core.service_context import ServiceContext
from llama_index.core import StorageContext
from llama_index.core.readers import SimpleDirectoryReader
from llama_index.core.service_context import set_global_service_context
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding
from llama_index.vector_stores.mongodb import MongoDBAtlasVectorSearch
=========================================================================

just the imports
from llama_index.core import VectorStoreIndex
from llama_index.core import ServiceContext, set_global_service_context
from llama_index.core import StorageContext
from llama_index.core import SimpleDirectoryReader
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding
from llama_index.vector_stores.mongodb import MongoDBAtlasVectorSearch
Now I have this error:

Traceback (most recent call last):
File "c:\Projets\IA Chat Local2\Sources\AzureOpenAI\LocalIndex.py", line 5, in <module>
from llama_index.core import VectorStoreIndex
ImportError: cannot import name 'VectorStoreIndex' from 'llama_index.core' (unknown location)
Are you trying with a fresh env?
Just tried with a fresh installation on Colab and I'm able to import:
[Attachment: image.png]
I don't really understand what you mean by "fresh env".
Create a new Python environment.
I just uninstalled llama-index-core and installed it again, and it works.
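(Presumably something like the following:)

pip uninstall llama-index-core
pip install llama-index-core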
Yeah, this was my guess here too 😅
Oh, I'm sorry, I didn't read it correctly 🙏
No worries, I'm glad it is working now
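(For completeness, once the packages import cleanly, a minimal sketch of how the imports above typically fit together to build an index backed by MongoDB Atlas. The connection string, database, collection, index name, and data directory are placeholders, not values from this thread.)

import pymongo

from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.mongodb import MongoDBAtlasVectorSearch

# Placeholder Atlas connection string and names
mongo_client = pymongo.MongoClient("mongodb+srv://<user>:<password>@<cluster>/")
vector_store = MongoDBAtlasVectorSearch(
    mongo_client,
    db_name="my_db",
    collection_name="my_collection",
    index_name="vector_index",
)

# Load local documents and index them into the Atlas-backed vector store
documents = SimpleDirectoryReader("./data").load_data()
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)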