Before you download a 35 GB model and watch your laptop give up: CanIRun.ai is a reality check for the local-AI-curious. Yes, an M2 with 8 GB of RAM still has options: Llama 3.1 8B is a “tight fit,” while Qwen 3.5 2B “runs great.”
- CanIRun.ai checks which AI models your machine can actually run, grading them from S to F based on RAM, memory bandwidth, and estimated token speed.
- On an M2 with 8 GB, most popular models score F. Llama 3.1 8B scrapes a C. DeepSeek R1 needs 343 GB.
- A useful gut-check before the local AI hype meets your actual hardware.
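For the curious, the kind of arithmetic behind a grade like this is easy to sketch. The snippet below is a back-of-envelope version under common assumptions (weights dominate memory use, decoding is memory-bandwidth-bound so each token streams every weight once); the `estimate` function, the 20% overhead factor, and the example numbers are illustrative, not CanIRun.ai's actual formula.

```python
def estimate(params_b: float, bits: int, ram_gb: float, bandwidth_gbs: float):
    """Rough local-inference feasibility check (illustrative, not CanIRun.ai's method).

    params_b: model size in billions of parameters
    bits: quantization level (e.g. 4, 8, 16)
    """
    weights_gb = params_b * bits / 8        # bytes per parameter = bits / 8
    # Assume ~20% extra for KV cache and OS headroom (a guess, not a measured figure).
    fits = weights_gb * 1.2 <= ram_gb
    # Memory-bound decoding: each generated token reads every weight once,
    # so tokens/sec is roughly bandwidth divided by model size.
    tok_per_sec = bandwidth_gbs / weights_gb
    return weights_gb, fits, tok_per_sec

# Llama 3.1 8B quantized to 4-bit on a base M2 (8 GB RAM, ~100 GB/s bandwidth):
size, fits, tps = estimate(8, 4, ram_gb=8, bandwidth_gbs=100)
print(f"{size:.1f} GB weights, fits={fits}, ~{tps:.0f} tok/s")
```

On those numbers the 8B model squeezes in at ~4 GB of weights with little room to spare, which matches the "tight fit" verdict above.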