QueryWrapper

I'm using HuggingFaceLLMPredictor with StabilityAI/stablelm-tuned-alpha-3b.
I'm currently using this prompt, since StableLM requires it in this format:
```python
query_wrapper_prompt = SimpleInputPrompt(
    "<|SYSTEM|>Below is an instruction that describes a task."
    "Write a response that adequately completes the request.\n\n"
    "<|USER|>{query_str}\n<|ASSISTANT|>"
)
```


My question is: can we have more inputs than just query_str here? For example, providing the context separately and then the user query?
Also, if I'm using HuggingFaceLLMPredictor and I pass a text_qa_template when creating the query_engine instance, will it make any difference?
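To make the first question concrete, here is a library-free sketch of a wrapper template with two input slots. The variable names (context_str, query_str) mirror LlamaIndex conventions, but this uses plain string formatting rather than the SimpleInputPrompt API, and the example strings are made up:

```python
# Hypothetical wrapper template with two placeholders instead of one.
wrapper_template = (
    "<|SYSTEM|>Below is an instruction that describes a task."
    "Write a response that adequately completes the request.\n\n"
    "<|USER|>Context:\n{context_str}\n\nQuestion: {query_str}\n<|ASSISTANT|>"
)

# Plain str.format fills both slots independently.
prompt = wrapper_template.format(
    context_str="StableLM is a language model released by Stability AI.",
    query_str="Who released StableLM?",
)
print(prompt)
```

Whether the library itself fills both slots for you depends on which template hook you use, which is what the answer below addresses.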
1 comment
So, maybe the variable naming here is a little misleading lol.

The query wrapper prompt WRAPS the entire text QA template / refine template.
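In other words, the text_qa_template is formatted first (with the retrieved context and the user's question), and that whole formatted string is what lands in the wrapper's {query_str} slot. A minimal sketch of that composition, using plain string formatting with illustrative template text rather than the library's actual internals:

```python
# Inner template: receives the retrieved context and the user question.
text_qa_template = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context, answer the question: {query_str}\n"
)

# Outer wrapper: adds the StableLM control tokens around the inner result.
query_wrapper_prompt = (
    "<|SYSTEM|>Below is an instruction that describes a task.\n\n"
    "<|USER|>{query_str}\n<|ASSISTANT|>"
)

# Step 1: fill the QA template with context + question (values are made up).
inner = text_qa_template.format(
    context_str="StableLM was released by Stability AI.",
    query_str="Who released StableLM?",
)

# Step 2: the fully formatted QA template becomes the wrapper's query_str.
final_prompt = query_wrapper_prompt.format(query_str=inner)
print(final_prompt)
```

So yes, passing a text_qa_template does make a difference: it controls the inner content, while the query wrapper prompt only supplies the model-specific framing around it.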