The post asks about future plans for releasing a Go version of the LlamaIndex library. Community members discuss the challenges of such a port, noting that it would currently require wrapping the Python functions, which would defeat the purpose of using Go. They suggest that Go would first need its own libraries for Torch and Transformers before a native Go version of LlamaIndex could be built. However, they also note that a Go version may not be necessary when using pre-hosted models, since those are reached through plain API calls. Overall, the community expresses interest in eventually having a compiled-language version of frameworks like LlamaIndex, but no clear answer or plan emerges from the discussion.
you don't need torch or transformers if using models that are already hosted though, then it's just API calls (i.e. TGI, TEI, vLLM, Ollama, OpenAI, Anthropic, etc.)