CAN I RUN AI?
Find out which AI models your PC can run
We detect your hardware automatically and show the best models for you.
Recommended Models
116 models
Learn to Run Local AI
Tutorials, demos and benchmarks from top channels

Ready to run?
Install Ollama and run any model with a single command. Click "ollama pull" on the cards above to copy the command.
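For example, a typical quick start looks like this (the model name below is illustrative — use the one copied from the card):

```shell
# Install Ollama (macOS/Linux; Windows has an installer at ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model — "llama3.2" is just an example name
ollama pull llama3.2

# Chat with it locally
ollama run llama3.2
```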
Detect via Terminal
One command. Detects GPU, CPU and RAM, shows the top 3 models for you and opens the site with everything filled in.
npx can-i-run-ai
Zero dependencies
Uses only Node.js built-ins. Nothing to install beyond Node.js itself.
Cross-platform
Windows, macOS and Linux. Detects hardware via native OS commands.
100% open source
Open source on GitHub. No data sent to servers.
Native apps — coming soon
Windows 10 and 11 — portable .exe
Apple Silicon & Intel — native app
Ubuntu, Fedora, Arch — AppImage
About CanIRunAI
CanIRunAI helps you find out which AI models can run on your computer. We detect your hardware automatically and calculate compatibility with dozens of popular open-source models.
Auto detection
GPU, RAM and CPU detected via WebGL and browser APIs.
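A minimal sketch of how browser-side detection like this can work — this illustrates the standard web APIs involved, not the site's actual code; `detectHardware` and the inputs are hypothetical:

```javascript
// Illustrative only. In a real page, `nav` is window.navigator and `gl` is a
// WebGL context obtained via canvas.getContext('webgl').
function detectHardware(nav, gl) {
  const cpuThreads = nav.hardwareConcurrency || 1; // logical CPU cores
  const ramGB = nav.deviceMemory ?? null;          // coarse RAM hint (Chromium-only)
  let gpu = 'unknown';
  if (gl) {
    // WEBGL_debug_renderer_info exposes the real GPU name where the browser allows it
    const ext = gl.getExtension('WEBGL_debug_renderer_info');
    if (ext) gpu = gl.getParameter(ext.UNMASKED_RENDERER_WEBGL);
  }
  return { cpuThreads, ramGB, gpu };
}
```

Browsers deliberately keep these hints coarse (e.g. `deviceMemory` is capped and rounded), which is why the results are an estimate rather than an exact inventory.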
Estimated performance
Estimated tokens per second (tok/s) and response time, scaled by your GPU's memory bandwidth.
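The scaling idea can be sketched as follows — the function name and figures are illustrative assumptions, not the site's actual formula; the RTX 3060's memory bandwidth is roughly 360 GB/s:

```javascript
// Token generation is largely memory-bandwidth-bound, so a simple estimate
// scales a measured baseline by the bandwidth ratio. Illustrative sketch only.
const BASELINE_BANDWIDTH_GBS = 360; // RTX 3060, approx.

function estimateTokPerSec(baselineTokPerSec, gpuBandwidthGBs) {
  return baselineTokPerSec * (gpuBandwidthGBs / BASELINE_BANDWIDTH_GBS);
}

// A model measured at 45 tok/s on the 3060, run on a 1000 GB/s GPU:
console.log(estimateTokPerSec(45, 1000)); // 125
```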
100% local
No data leaves your browser. Everything runs client-side.
Compatible with Ollama, LM Studio and other local runtimes. Benchmark data is based on real measurements, with an RTX 3060 as the baseline.