Maybe I'm getting this all wrong, but I've been banging my head against a simple example for days now...
I'm working on an M1 Mac with 32 GB of RAM.
I'm trying to get this example to run:
https://github.com/run-llama/python-agents-tutorial/blob/main/2_local_agent.py
But all I get is "Process finished with exit code 0".
I tried using a smaller model, but that also doesn't generate a response.
When I chat with the LLM directly, it does deliver output.
Are these examples even meant to run on local machines?
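In case it helps pinpoint the problem: here is a stripped-down, llama-index-free sketch of the async pattern I believe the tutorial follows (the `run_agent` function is a hypothetical stand-in for the real agent call). If the final `print` were missing, or the coroutine were never awaited, the script would produce no output and simply finish with exit code 0, which matches what I'm seeing:

```python
import asyncio

async def run_agent() -> str:
    # Hypothetical stand-in for the tutorial's agent call;
    # the real example talks to a local model via Ollama.
    return "agent response"

async def main():
    response = await run_agent()
    # Without this print, the script shows nothing and
    # exits cleanly with code 0.
    print(response)

asyncio.run(main())
```

So my question boils down to: does the real example actually print anything, or does it silently depend on the model responding in time?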