It all depends on the size of the model you're running. If it can't fit in GPU memory, data has to be shuttled back and forth between the host (CPU memory, or even disk) and the GPU, which is extremely slow. This is why some people run LLMs on Macs: they can have a large amount of memory shared between the GPU and CPU, making it viable to fit some larger models entirely in memory.
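A rough back-of-the-envelope sketch of why this matters (the numbers and the 20% headroom factor are my own assumptions, nothing framework-specific):

```python
def weights_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate memory needed for the model weights alone, in GB."""
    return params_billion * bytes_per_param  # 1e9 params * bytes, / 1e9 = GB

def fits_in_memory(params_billion: float, bytes_per_param: float,
                   mem_gb: float, overhead: float = 1.2) -> bool:
    """Leave ~20% headroom for KV cache and activations (rough assumption)."""
    return weights_gb(params_billion, bytes_per_param) * overhead <= mem_gb

# A 70B-parameter model at 4-bit quantization (~0.5 bytes/param) needs
# roughly 35 GB for weights alone, so it can't fit on a 24 GB card and
# layers get offloaded to CPU RAM or disk -- that round trip is the
# slow part.
print(fits_in_memory(70, 0.5, 24))   # False -> offloading, very slow
print(fits_in_memory(70, 0.5, 128))  # True on e.g. a Mac with 128 GB unified memory
```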
In any case, it's a real hassle. The bank can presumably be accessed via the computer. MobilePay probably isn't possible, but luckily you can use a credit card in many places. For MitID, maybe there are alternatives for people without a phone? Would I bother with it myself? No, honestly it's too much trouble. But if you wanted a middle ground, it might be worth looking at Linux phones. I haven't dug very deep into it myself, but it might be a valid alternative to Google/Apple smartphones.
If anyone here has tried or is using a Linux phone, I'd really love to hear what it's like to use day to day, so please do write a review!
Oh, I thought you could get 128 GB of RAM or more, but I can see it doesn't make sense with the <24 GB… sorry for spreading misinformation, I guess. In that case, a GPU with the same amount of RAM would probably be better.