Embedding Format Mismatch with HuggingFaceInferenceAPIEmbedding

Has anyone used the HuggingFaceInferenceAPIEmbedding module from LlamaIndex?
I am running into an issue where the embedding returned by the model is not in the format LlamaIndex expects.
I am using the CodeBERT model with the task set to sentence embedding.
Would love to get some guidance if possible.
CODE:
embed_model = HuggingFaceInferenceAPIEmbedding(
    model_name='https://*******.us-east-1.aws.endpoints.huggingface.cloud',
    token='*******',
)
text_embedding = embed_model.get_text_embedding("Your text here")
print(text_embedding)

Embedding format the model returns:
response: b'{"embeddings":[0.02895120158791542,0.4713986814022064,0.41316112875938416,0.2680184543132782,0.4315677583217621 ,...........]
Error:

TypeError: float() argument must be a string or a real number, not 'dict'
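The error suggests LlamaIndex is calling float() on the whole JSON object instead of the values inside the "embeddings" key. A minimal sketch of a manual workaround, assuming the endpoint always returns a JSON body shaped like the response shown above (the byte string here is a truncated illustration, not the real output):

```python
import json

# Raw bytes as returned by the custom CodeBERT endpoint (shape assumed
# from the response shown above; values truncated for illustration).
raw = b'{"embeddings":[0.02895120158791542,0.4713986814022064,0.41316112875938416]}'

# LlamaIndex expects a flat list of floats, so unwrap the "embeddings"
# key before passing the vector to anything that calls float() on each element.
payload = json.loads(raw)
embedding = [float(x) for x in payload["embeddings"]]
print(len(embedding), embedding[0])
```

Calling the endpoint directly (e.g. with requests) and unwrapping the response like this would sidestep the mismatch until the library handles this response shape.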
3 comments
Seems like it's not handling the response type properly here? Not sure if HuggingFace changed the response type or if it's specific to this model.
Would need a PR to fix.
@Logan M I am using the CodeBERT model.
Do you want me to raise an issue on GitHub?
https://huggingface.co/microsoft/codebert-base