define a custom PromptHelper and set max_chunk_overlap=0
jerryjliu0
2 years ago
define a custom PromptHelper and set max_chunk_overlap=0 (you can see an example here: https://gpt-index.readthedocs.io/en/latest/how_to/custom_llms.html)
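A minimal sketch of that setup, based on the PromptHelper constructor from that era of gpt-index/llama_index; the max_input_size and num_output values are assumptions, and how you pass the helper to an index varies by library version (see the linked docs for the canonical example):

```python
# Sketch only: constructor argument order and the index kwarg reflect the old
# gpt-index API referenced in the linked docs; values here are assumptions.
from llama_index import PromptHelper

max_input_size = 4096      # assumed LLM context window
num_output = 256           # assumed tokens reserved for the LLM's output
max_chunk_overlap = 0      # the setting recommended above: no overlap between chunks

prompt_helper = PromptHelper(max_input_size, num_output, max_chunk_overlap)

# In versions of that era the helper was passed to the index constructor
# (e.g. GPTSimpleVectorIndex(documents, prompt_helper=prompt_helper));
# later versions route it through a ServiceContext instead.
```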
4 comments
Mikko
2 years ago
Are those also the defaults? Or how do I see the defaults per each LLM?
jerryjliu0
2 years ago
by default the chunk overlap is 1/10 of whatever the max_input_size is, up to 200
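As a rough illustration of that rule (this helper is purely illustrative and not part of the library):

```python
def default_chunk_overlap(max_input_size: int) -> int:
    """Illustration of the stated default: 1/10 of max_input_size, capped at 200."""
    return min(max_input_size // 10, 200)

print(default_chunk_overlap(4096))  # 200 (409 would exceed the 200 cap)
print(default_chunk_overlap(1024))  # 102
```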
Mikko
2 years ago
so by default not related to chunk_size_limit?
jerryjliu0
2 years ago
nope!