Running an LLM locally on the RPi Zero 2 W

A project log for Chatbot Zero - AI on Raspberry Pi Zero

An AI chatbot built on the Raspberry Pi Zero 2 W, with a small LCD screen, speaker, microphone and battery

jdaie 08/04/2025 at 03:18

I tried running Ollama on the Zero 2W, and the output speed was actually quite decent. I also ran a chatbot program at the same time — while it did slow things down a bit, it shows that the Zero 2W does have potential! That said, the core temperature quickly rose to 68°C, so it’s clearly struggling a bit.
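
For reference, here is a minimal sketch of what a chatbot loop against a local Ollama instance can look like, along with a temperature check. It assumes Ollama is serving on its default port (11434) and that a small model such as tinyllama has already been pulled; the model name and the sysfs temperature read are illustrative, not the exact code used in this build.

```python
# Minimal chatbot loop against a local Ollama server (sketch only).
# Assumes Ollama is running on its default port and a small model
# ("tinyllama" here) has already been pulled with `ollama pull`.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "tinyllama"  # assumption: any small model that fits in the Zero 2 W's 512 MB RAM


def read_soc_temp_c():
    """Read the SoC temperature from sysfs (same value vcgencmd reports)."""
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read().strip()) / 1000.0


def ask(prompt):
    """Send one prompt to Ollama and return the full (non-streamed) reply."""
    payload = json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    while True:
        question = input("you> ")
        if not question:
            break
        print("bot>", ask(question))
        print(f"(SoC temperature: {read_soc_temp_c():.1f} °C)")
```

On the Zero 2 W a non-streamed call can take a while to return; Ollama's streaming mode would show tokens as they are generated instead.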

Install Ollama by following this tutorial: https://github.com/Gilzone/Installing-a-LLM-on-Raspberry-Pi-Zero-2-W
