A GPU-powered Pi for more efficient AI?

Views: 124,670   |   Uploaded: 3 months ago
Jeff Geerling
The Raspberry Pi is a compelling low-power option for running GPU-accelerated LLMs locally.

For my main test setup, here's the hardware I used (some links are affiliate links):

- Raspberry Pi 5 8GB ($80):
- Raspberry Pi 27W Power Supply ($14):
- 1TB USB SSD ($64):
- Pineboards HatDrive! Bottom ($20):
- JMT M.2 Key to PCIe eGPU Dock ($55):
- OCuLink cable ($20):
- Lian-Li SFX 750W PSU ($130):
- AMD RX 6700 XT ($400):
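
Once the dock, OCuLink cable, and card are connected, a quick sanity check is to confirm the Pi actually enumerated the GPU over PCIe. This is a sketch only — it assumes Raspberry Pi OS with `pciutils` installed and the `amdgpu` kernel driver available; device names and IDs will vary by card:

```shell
# List PCIe devices and look for the graphics card (the RX 6700 XT
# should appear as a VGA/Display controller entry).
lspci | grep -iE 'vga|display'

# Check that the amdgpu kernel driver initialized (may require sudo).
sudo dmesg | grep -i amdgpu | tail -n 5
```

If `lspci` shows nothing, recheck the HatDrive and OCuLink connections before debugging drivers.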

And here are the resources I mentioned for setting up your own GPU-accelerated Pi:

- Blog post with AMD GPU setup instructions:
- Blog post with llama.cpp Vulkan instructions:
- Llama Benchmarking issue:
- AMD not supporting ROCm on Arm:
- Raspberry Pi PCIe Database:
- Home Assistant Voice Control:
- James Mackenzie's video with RX 580:
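
As a rough sketch of the llama.cpp Vulkan route covered in the linked blog posts (package names and the model filename below are illustrative placeholders — follow the posts for the exact, tested steps):

```shell
# Install build tooling and the Vulkan development packages.
sudo apt install -y cmake build-essential libvulkan-dev glslc

# Build llama.cpp with its Vulkan backend enabled.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j"$(nproc)"

# Run a quantized model; -ngl 99 offloads all layers to the GPU.
# The model path here is a placeholder, not a specific recommendation.
./build/bin/llama-cli -m models/your-model.Q4_K_M.gguf -ngl 99 -p "Hello"
```

The Vulkan backend matters here because, as noted above, AMD does not support ROCm on Arm, so Vulkan is the practical path to GPU offload on the Pi.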

Support me on Patreon:
Sponsor me on GitHub:
Merch:
2nd Channel:
3rd Channel:

Contents:

00:00 - Why do this on a Pi
01:33 - Should I even try?
02:06 - Hardware setup
04:34 - Comparisons with Llama
05:43 - How much is too much?
06:52 - Benchmark results
07:41 - Software setup
09:13 - More models, more testing
