HuggingfaceInferenceAPI

Hi, is it possible to set a custom API URL for HuggingFaceInferenceAPIEmbedding? I don't see any option in the code.
You wanna set a custom URL for text embeddings inference, other than the default?
You can paste in a URL directly for the model_name

As long as it points to an inference endpoint it will work
@Logan M oh, I didn't know that. Can you share an example for huggingface jinaai model?
@WhiteFang_Jr yeah it's correct
Do you have jinaai deployed on the inference api?
I run it locally
oh then you can use the TEI embed model class
from llama_index.embeddings import TextEmbeddingsInference

embed_model = TextEmbeddingsInference(
    model_name="BAAI/bge-large-en-v1.5",  # required for formatting inference text
    timeout=60,  # timeout in seconds
    embed_batch_size=10,  # batch size for embedding
    base_url="http://127.0.0.1:8080",
)
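For what it's worth, TextEmbeddingsInference is just a thin client over the TEI server's REST API, so you can also sanity-check a local server without llama_index. A minimal sketch, assuming a TEI server at the same address as above (POST /embed with an {"inputs": [...]} payload is TEI's documented route; build_embed_request and embed are my own helper names):

```python
import json
import urllib.request

TEI_URL = "http://127.0.0.1:8080"  # same local server as in the snippet above


def build_embed_request(texts, base_url=TEI_URL):
    """Build the URL and JSON body for TEI's POST /embed route."""
    return f"{base_url}/embed", json.dumps({"inputs": texts})


def embed(texts, base_url=TEI_URL, timeout=60):
    """Send the request and return one embedding vector per input text."""
    url, body = build_embed_request(texts, base_url)
    req = urllib.request.Request(
        url,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())
```

With the server running, embed(["hello world"]) should return a list containing one embedding vector.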
nice, let me try it