
Python 3.13 Support Across Llama-index

Any idea when Python 3.13 will be supported across llama-index?
llama-index-vector-stores-postgres==0.2.6 depends on asyncpg>=0.29.0,<0.30.0 and asyncpg 0.30.0 is where they add support for 3.13.
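A minimal illustration of why that pin blocks 3.13 — a pure-Python sketch of the `>=lower,<upper` range check a resolver performs (`satisfies` is a hypothetical helper for illustration, not part of pip or Poetry):

```python
def satisfies(version: str, lower: str, upper: str) -> bool:
    """Sketch of a `>=lower,<upper` range check on simple dotted version strings."""
    as_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return as_tuple(lower) <= as_tuple(version) < as_tuple(upper)

# llama-index-vector-stores-postgres 0.2.6 pins asyncpg>=0.29.0,<0.30.0,
# so asyncpg 0.30.0 -- the first release with Python 3.13 support --
# can never be selected under that constraint.
print(satisfies("0.29.2", "0.29.0", "0.30.0"))  # inside the pinned range
print(satisfies("0.30.0", "0.29.0", "0.30.0"))  # excluded by the upper bound
```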
python is truly a nightmare
no current ETA
really recommend just using 3.12 for now
Got it. Thanks, Logan.
@Logan M - assuming you guys accept PRs from the community, do you have a Py3.13 roadmap?
Always accept community PRs yea

And yea eventually. We just have 500+ packages so it takes some time
I love this project. You have done great work. With 500+ packages... might you need to rethink some of the architecture and/or strategy? How are you planning on handling that w/o burning out and losing momentum?
actually this mono-repo setup is 1000x better than having a single package
Since packages/integrations can be versioned and released independently. We have CI/CD set up to automatically run tests across all packages
If someone merges a PR that bumps a package, it automatically publishes
To update to 3.13, I need to write a script that will
  • bump the package version and python version in each pyproject.toml
  • test poetry lock after bumping
  • if poetry lock works, run poetry publish
Not so bad, just not the highest priority at the moment
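The steps above could be sketched roughly like this — assuming each package's pyproject.toml carries a `version = "x.y.z"` line and a single `python = "..."` constraint line. `bump_pyproject`, the regexes, and the default range are illustrative guesses, not the actual llama-index tooling:

```python
import pathlib
import re
import subprocess

def bump_pyproject(text: str, new_python: str = ">=3.9,<3.14") -> str:
    """Bump the patch version and widen the python range in a pyproject.toml body."""
    # bump the patch component of `version = "x.y.z"`
    text = re.sub(
        r'^version = "(\d+)\.(\d+)\.(\d+)"',
        lambda m: f'version = "{m[1]}.{m[2]}.{int(m[3]) + 1}"',
        text,
        flags=re.M,
    )
    # widen the supported python range to include 3.13
    return re.sub(r'^python = ".*"$', f'python = "{new_python}"', text, flags=re.M)

def update_all(repo_root: str) -> None:
    """Walk every package, rewrite its pyproject, and publish only if it still locks."""
    for pyproject in pathlib.Path(repo_root).rglob("pyproject.toml"):
        pyproject.write_text(bump_pyproject(pyproject.read_text()))
        # only publish when the widened constraints actually resolve
        if subprocess.run(["poetry", "lock"], cwd=pyproject.parent).returncode == 0:
            subprocess.run(["poetry", "publish", "--build"], cwd=pyproject.parent)
```

The `poetry lock` gate mirrors the step in the list: packages whose third-party pins can't resolve under 3.13 are simply skipped instead of published broken.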
Got it. Have you also considered migrating to uv? I made the switch after I saw their final support for running scripts. In my hands it's so much nicer. I love Poetry, but uv is going to win the war.
Yea I've used uv quite a lot. But uv doesn't have a utility for publishing sadly 😭
If they added that, it would be a complete drop-in for poetry
(unless I'm missing something)
oh hey, they added it!
I guess the current pyproject.toml files are not drop-in replacements for uv though, need to move how the dependencies are defined I think
someone must have written a script for this somewhere lol
Yeah. I converted some of my projects. It's not too bad. Plus w/the help of scripts or Cursor/LLM you can convert the dependency lists to PEP621. That's the most annoying part -- but totally automatable.
Yeah, I couldn't jump on the uv bandwagon until they added scripts. They just did that.
I wonder ... could llama-index benefit from the workspace concept in uv?
possibly, but I think workspaces assume that all packages in your repo are compatible with each other

However in practice, I would fully expect some integrations to have conflicts due to 3rd-party dependencies (e.g. maybe the anthropic api client conflicts with the openai api client deps), so not sure it's feasible. Might be worth a shot though just to see
ah, they even note that in their docs
I saw that the tool we use for cicd (pants) is adding uv support in an upcoming release too. Probably a good time to switch when that lands πŸ™‚