this post was submitted on 05 Aug 2025
27 points (100.0% liked)
Free Open-Source Artificial Intelligence
I believe the full-size DeepSeek-R1 requires about 1200 GB of VRAM. But there are many configurations that require much less: quantization, MoE, and other hacks. I don't have much experience with MoE, but I find that quantization tends to decrease performance significantly, at least with models from Mistral.
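As a rough sanity check, here's a back-of-the-envelope sketch of where a number like that comes from and how quantization shrinks it. This assumes roughly 671B parameters for full DeepSeek-R1 and counts only the weights, ignoring KV cache, activations, and runtime overhead, so real requirements are somewhat higher:

```python
# Back-of-the-envelope VRAM estimate for holding model weights only.
# Ignores KV cache, activations, and framework overhead, which add more on top.

def weight_vram_gb(num_params_billion: float, bits_per_weight: float) -> float:
    """Approximate gigabytes needed just to store the weights."""
    bytes_total = num_params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal GB

if __name__ == "__main__":
    params_b = 671  # assumed total parameter count for full DeepSeek-R1 (MoE)
    for label, bits in [("FP16", 16), ("8-bit", 8), ("4-bit", 4)]:
        print(f"{label:>5}: ~{weight_vram_gb(params_b, bits):.0f} GB")
```

FP16 works out to roughly 1.3 TB just for weights, which is in the same ballpark as the 1200 GB figure, while 4-bit quantization brings it down to around 340 GB.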
That's a lot of VRAM.