A GPU-powered Pi for more efficient AI?
124,628 views · 3 months ago · 5,990 likes
The Raspberry Pi is a compelling low-power option for running GPU-accelerated LLMs locally.
For my main test setup, here's the hardware I used (some links are affiliate links):
- Raspberry Pi 5 8GB ($80):
- Raspberry Pi 27W Power Supply ($14):
- 1TB USB SSD ($64):
- Pineboards HatDrive! Bottom ($20):
- JMT M.2 Key to PCIe eGPU Dock ($55):
- OCuLink cable ($20):
- Lian-Li SFX 750W PSU ($130):
- AMD RX 6700 XT ($400):
And here are the resources I mentioned for setting up your own GPU-accelerated Pi:
- Blog post with AMD GPU setup instructions:
- Blog post with llama.cpp Vulkan instructions:
- Llama Benchmarking issue:
- AMD not supporting ROCm on Arm:
- Raspberry Pi PCIe Database:
- Home Assistant Voice Control:
- James Mackenzie's video with RX 580:
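The llama.cpp Vulkan setup linked above boils down to a build with the Vulkan backend enabled, then running with all layers offloaded to the GPU. A rough sketch follows; package names, the model file, and the layer count are placeholders to adapt to your own distro and model (on older llama.cpp checkouts the CMake flag was `LLAMA_VULKAN` rather than `GGML_VULKAN`):

```shell
# Install Vulkan build dependencies (Debian/Raspberry Pi OS package names assumed)
sudo apt install -y cmake glslc libvulkan-dev vulkan-tools

# Build llama.cpp with the Vulkan backend
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j"$(nproc)"

# Confirm the eGPU is visible to Vulkan before benchmarking
vulkaninfo --summary

# Benchmark with all layers offloaded to the GPU (-ngl 99);
# the model path here is a placeholder for your own GGUF file
./build/bin/llama-bench -m models/your-model-q4_k_m.gguf -ngl 99
```

If `vulkaninfo` only lists a software rasterizer (llvmpipe), the GPU driver is not set up correctly and llama.cpp will fall back to CPU inference.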
Support me on Patreon:
Sponsor me on GitHub:
Merch:
2nd Channel:
3rd Channel:
Contents:
00:00 - Why do this on a Pi
01:33 - Should I even try?
02:06 - Hardware setup
04:34 - Comparisons with Llama
05:43 - How much is too much?
06:52 - Benchmark results
07:41 - Software setup
09:13 - More models, more testing