this post was submitted on 27 Jan 2024
23 points (96.0% liked)

LocalLLaMA


Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge open-source neural network technology together.

Get support from the community! Ask questions, share prompts, discuss benchmarks, get hyped at the latest and greatest model releases! Enjoy talking about our awesome hobby.

As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive, constructive way.


On my machine I'm running openSUSE Tumbleweed and have the amdgpu driver installed. I use it for gaming, and recently I've become interested in running LLMs, so I would like to keep a balance of both without compromising too much on performance.

I know that there are proprietary drivers for AMD cards, but I'm hesitant to install them, as I've heard they perform worse in games than the open-source driver.

I'm mainly confused about this ROCm thing. Is it not included with the open-source amdgpu drivers, or is it available as a separate package?

So which driver should I use?

Or perhaps, is it possible to run oobabooga or Stable Diffusion within a Distrobox container (with the proprietary drivers) and still keep using the open-source GPU drivers for the host operating system?
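For what it's worth, here's the kind of sanity check I'd want to pass inside such a container (a rough sketch, assuming a ROCm build of PyTorch; as I understand it, the container also needs the host's /dev/kfd and /dev/dri device nodes exposed to it):

```python
import os
import torch

# ROCm userspace talks to the host kernel's amdgpu driver through these
# device nodes, so they have to be visible inside the container.
for node in ("/dev/kfd", "/dev/dri"):
    print(node, "present" if os.path.exists(node) else "missing")

# ROCm builds of PyTorch expose the AMD GPU through the usual torch.cuda API.
print("GPU visible to PyTorch:", torch.cuda.is_available())
```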

[–] Falcon@lemmy.world 1 points 1 year ago

Basically, ROCm and CUDA allow one to do math on the GPU. Most linear-algebra operations (which is what LLMs, neural networks, and ML in general boil down to) can be parallelized on a GPU, which is much faster than running them on a CPU.
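As a concrete illustration, here's a minimal PyTorch sketch of "doing math on the GPU" (assuming a CUDA or ROCm build of PyTorch is installed; ROCm builds reuse the "cuda" device name):

```python
import torch

# Use the GPU if a CUDA/ROCm-enabled PyTorch build can see one, else the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Two large matrices; multiplying them is the bread and butter of neural nets.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

# The matrix multiply is parallelized across the GPU's compute units.
c = a @ b
print(c.shape, c.device)
```

The same code runs on an NVIDIA or an AMD card; only the backend underneath changes.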

To perform calculations on the GPU, one needs some sort of interface from their programming language of choice. NVIDIA has CUDA, which is written in C++ and has bindings for Python (PyTorch, TensorFlow, etc.) and Julia (Flux, etc.).

ROCm is AMD's solution; its bindings are young and not widely supported.

My advice: play around with Flux ROCm and PyTorch ROCm just to get an idea. Suffice it to say, when I started doing RL and LLMs more seriously I gave up my Colab and sold my AMD cards to fund a 3060.
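If you want to confirm a ROCm PyTorch install actually sees your card before sinking time into it, a quick check like this is enough (a minimal sketch, assuming the ROCm wheel of PyTorch):

```python
import torch

# On ROCm builds, torch.version.hip is set and the GPU appears via torch.cuda.
print("HIP version:", getattr(torch.version, "hip", None))

if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
    x = torch.ones(1024, 1024, device="cuda")
    print("Sum computed on GPU:", x.sum().item())
else:
    print("No GPU visible to PyTorch.")
```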