Local LLM Memory Calculator
Estimate the memory required to run LLMs locally. Compare models and quantization levels, and see what fits on your hardware.
1. Select Model
2. Configure
3. Your Hardware
On Apple Silicon, this is your unified memory; on a discrete GPU, it is your GPU's VRAM.
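The estimate behind a calculator like this typically follows a simple rule of thumb: model weights take roughly (parameter count × bits per weight ÷ 8) bytes, plus extra headroom for the KV cache and runtime buffers. The sketch below illustrates that formula; the function name, the 10% overhead figure, and the example model are illustrative assumptions, not this tool's actual implementation.

```python
def estimate_memory_gb(params_billions: float,
                       bits_per_weight: float,
                       overhead_frac: float = 0.10) -> float:
    """Rough LLM memory estimate (hypothetical sketch):
    weights at the chosen quantization, plus a fractional
    overhead for KV cache and runtime buffers (assumed 10%)."""
    # 1B parameters at 8 bits/weight is about 1 GB of weights.
    weight_gb = params_billions * bits_per_weight / 8
    return weight_gb * (1 + overhead_frac)

# Example: a 7B-parameter model at 4-bit quantization.
print(round(estimate_memory_gb(7, 4), 2))  # → 3.85
```

Real runtimes vary: the KV cache grows with context length and batch size, so long-context use can need substantially more than a fixed overhead fraction suggests.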