HuggingfaceInferenceAPI
San Nguyen
last year
Hi, is it possible to set a custom API URL for HuggingFaceInferenceAPIEmbedding? I don't see any option in the code.
WhiteFang_Jr
last year
You wanna set a custom URL in Text Embeddings Inference, other than the default?
Logan M
last year
You can paste in a URL directly for the model_name. As long as it points to an inference endpoint, it will work.
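To illustrate the point above: model_name accepts either a plain Hugging Face model id or a full inference-endpoint URL. The helper below is hypothetical (not part of llama_index) and just shows how the two forms differ; the endpoint URL is a made-up example.

```python
def is_endpoint_url(model_name: str) -> bool:
    """True if model_name is a full URL rather than a Hugging Face model id."""
    return model_name.startswith(("http://", "https://"))

# A plain model id resolves through the Hugging Face Hub:
print(is_endpoint_url("BAAI/bge-large-en-v1.5"))   # → False
# A URL is used directly as the inference endpoint:
print(is_endpoint_url("https://my-endpoint.example.com"))  # → True
```

In practice you would pass the URL straight in, e.g. HuggingFaceInferenceAPIEmbedding(model_name="https://my-endpoint.example.com"), assuming your endpoint serves the embedding model.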
San Nguyen
last year
@Logan M oh, I didn't know that. Can you share an example for the Hugging Face jinaai model?
San Nguyen
last year
@WhiteFang_Jr yeah, that's correct
Logan M
last year
Do you have jinaai deployed on the inference api?
San Nguyen
last year
yes
San Nguyen
last year
I run it locally
San Nguyen
last year
using TEI
Logan M
last year
oh then you can use the TEI embed model class
Logan M
last year
from llama_index.embeddings import TextEmbeddingsInference  # import path may vary by llama_index version

embed_model = TextEmbeddingsInference(
    model_name="BAAI/bge-large-en-v1.5",  # required for formatting inference text
    timeout=60,  # timeout in seconds
    embed_batch_size=10,  # batch size for embedding
    base_url="http://127.0.0.1:8080",
)
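For context on what the snippet above wraps: TEI serves embeddings over a plain HTTP API, and the class is essentially posting to the server's /embed route. A minimal sketch of the raw request, assuming the same local TEI server; the actual network call needs a running server, so it is left commented out:

```python
import json

base_url = "http://127.0.0.1:8080"  # local TEI server, as in the snippet above

# TEI's /embed route takes an "inputs" field: a string or a list of strings.
payload = {"inputs": ["hello world", "jina embeddings via TEI"]}
body = json.dumps(payload)

# With a TEI server running, the call would look like (requires `requests`):
# resp = requests.post(f"{base_url}/embed", json=payload, timeout=60)
# vectors = resp.json()  # one float vector per input string
print(body)
```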
S
San Nguyen
last year
nice, let me try it