LLM Module: how to use another LLM model?
-
Hi Everyone,
I am new to the module.
I want to switch the LLM model as new ones become available
(e.g. llama3.2-1b-prefill-ax630c or qwen2.5-1.5b-ax630c).
In short, I can't succeed in loading another model. Any suggestions, or a pointer to the proper documentation? (I did not find any topic about changing the model via Arduino.)
What I did so far:
- log into the LLM module via serial
- ip a
- connect via ssh root@ip (via the Ethernet companion board)
- load my SSH public key, then connect over SSH
- successfully install other models via apt-get install xxx
- reboot (just in case)
Then I tested via serial text (using an M5Stack Core (grey) with a simple Serial-to-Serial2 forwarding app).
The sequence, starting with a reset:
{ "request_id": "11212155", "work_id": "sys", "action": "reset" }
{"created":1746310691,"data":"None","error":{"code":0,"message":"llm server restarting ..."},"object":"None","request_id":"11212155","work_id":"sys"}
{"request_id": "0","work_id": "sys","created": 1746310696,"error":{"code":0, "message":"reset over"}}
Then loading a model:
{ "request_id": "3", "work_id": "llm", "action": "setup","object": "llm.setup", "data": { "model": "qwen2.5-1.5b-ax630c", "response_format": "llm.utf-8.stream", "input": "llm.utf-8", "enoutput": true, "max_token_len": 256, "prompt": "You are a knowledgeable assistant capable of answering various questions and providing information." } }
{"created":1746310710,"data":"None","error":{"code":-5,"message":"Model loading failed."},"object":"None","request_id":"3","work_id":"llm"}
...but it works with:
{ "request_id": "3", "work_id": "llm", "action": "setup", "object": "llm.setup", "data": { "model": "qwen2.5-0.5B-prefill-20e", "response_format": "llm.utf-8.stream", "input": "llm.utf-8", "enoutput": true, "max_token_len": 256, "prompt": "You are a knowledgeable assistant capable of answering various questions and providing information." } }
{"created":1746310813,"data":"None","error":{"code":0,"message":""},"object":"None","request_id":"3","work_id":"llm.1004"}