ImportError: cannot import name 'root_validator' from llama_index.core.bridge.pydantic

Hello, I'm facing a small issue with the code I use to index documents into my database. It returns the following error:
Traceback (most recent call last):
File "c:\Projets\test IA in DB\local index\LocalIndex.py", line 11, in <module>
from llama_index.llms.azure_openai import AzureOpenAI
File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\llms\azure_openai__init__.py", line 1, in <module>
from llama_index.llms.azure_openai.base import (
File "C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\llms\azure_openai\base.py", line 5, in <module>
from llama_index.core.bridge.pydantic import Field, PrivateAttr, root_validator
ImportError: cannot import name 'root_validator' from 'llama_index.core.bridge.pydantic' (C:\Users\sxd-i\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\llama_index\core\bridge\pydantic.py)

Here is the code:
import os
import json
import pymongo
import ModelDef
from flask import Flask, request
from llama_index.core import VectorStoreIndex, StorageContext, SimpleDirectoryReader, Settings
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding
from llama_index.vector_stores.mongodb import MongoDBAtlasVectorSearch

# ---------------------------
# Application constants
# ---------------------------

# Definition of the AI deployment models

_modelGPT4o = ModelDef.ModelGPT("gpt-4o", "gpt-4o", "2024-10-21")
_modelAda2 = ModelDef.ModelGPT("ada2", "text-embedding-ada-002", "2024-10-21")

_server4o = ModelDef.ServerGPT("gpt4o", "https://.azure.com", "26f3ea90247b1a9286057d53c2539", "c59f1006a64015a7b083ed29", "eastus2", _modelGPT4o, _modelAda2)

_models = [ _server4o]
_model = _models[0]

# Constants

_index = "LeJourSeLeveIFC"
_directory = "C:\\Projets\\ifcdoctest"
_mongoURI = os.environ["MONGO_URI"] = "url of mongo data base"

# ----------------------
# Program start
# ----------------------

# OpenAI initialization

print("Initialisation OpenAI...")

llm = AzureOpenAI(
    #model=_model.ChatModel.Model,
    #deployment_name=_model.ChatModel.Name,
    api_key=_model.Key1,
    azure_endpoint=_model.Server,
    api_version=_model.ChatModel.ApiVersion,
)

embed_model = AzureOpenAIEmbedding(
    model=_model.LearningModel.Model,
    deployment_name=_model.LearningModel.Name,
    api_key=_model.Key1,
    azure_endpoint=_model.Server,
    api_version=_model.LearningModel.ApiVersion,
)
#Settings.llm = AzureOpenAI(model=llm)
#Settings.embed_model = AzureOpenAIEmbedding(model=_modelAda2)
#service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
#set_global_service_context(service_context)
# Initialize the parameters for queries against MongoDB Atlas
print("Initialisation MongoDB...")

mongodb_client = pymongo.MongoClient(_mongoURI)
store = MongoDBAtlasVectorSearch(mongodb_client, db_name=_index)
storage_context = StorageContext.from_defaults(vector_store=store)

# Iterate over each file
print("Démarrage de l'importation...")

reader = SimpleDirectoryReader(_directory, recursive=True, encoding="latin-1", required_exts=[".pdf", ".docx", ".pptx", ".csv", ".txt"])

for docs in reader.iter_data():
        print("F > " + docs[0].metadata['file_name'])
        VectorStoreIndex.from_documents(docs, storage_context=storage_context)
# End of program

print("Terminée.")
6 comments
Seems like your llama-index-core version is out of date? Running pip freeze | grep llama will list all of your currently installed llama-index package versions.
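The file paths in the traceback suggest Windows, where grep is not available in the default shell; a rough Python equivalent of that check, using only the standard library (Python 3.8+), might be:

# Print every installed llama-* distribution with its version.
from importlib.metadata import distributions

for dist in distributions():
    name = dist.metadata["Name"] or ""
    if "llama" in name.lower():
        print(f"{name}=={dist.version}")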
llama-index-core is at version 0.12.5; I don't think it's out of date.
I tried reinstalling my packages for the third time, this time using the command from your Google Colab, and the error is fixed.
I guess I did it wrong before, which explains why I had this error.
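For anyone hitting the same error, a quick way to confirm that llama-index-core and the Azure OpenAI integration are back in sync after reinstalling is to retry the import that failed in the traceback:

# If this import succeeds, the root_validator mismatch between
# llama-index-core and llama-index-llms-azure-openai is resolved.
from llama_index.llms.azure_openai import AzureOpenAI
print("AzureOpenAI import OK")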