
marvanni
Joined December 10, 2024
So maybe I've got this all wrong, but I've been banging my head against a simple example for days now...

I work on an M1 Mac with 32 GB of RAM.

I'm trying to get this example to run:

https://github.com/run-llama/python-agents-tutorial/blob/main/2_local_agent.py

But all I get is "Process finished with exit code 0".

I tried using a smaller model, but that also doesn't generate a response...

When I chat with the LLM directly, it does deliver output.

Are these examples even meant to run on local machines?
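
One thing I'm wondering: if the example runs the agent asynchronously, could I be dropping the result somewhere? Here's a minimal sketch of the pattern I mean, using plain asyncio (`run_agent` is just a hypothetical stand-in, not the tutorial's actual code):

```python
import asyncio

async def run_agent():
    """Hypothetical stand-in for an async agent call."""
    return "Here is the agent's answer"

# Bug pattern: calling the coroutine without awaiting it.
# run_agent() just creates a coroutine object; the agent never runs,
# nothing is printed, and the script exits cleanly with code 0.

# Working pattern: drive it with the event loop and print the result.
result = asyncio.run(run_agent())
print(result)
```

Could "exit code 0 with no output" just mean the response is never awaited or printed, rather than the model failing?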
12 comments