I Built a Powerful PC to Run AI Models. Here’s Why.
You might think of Gemini or ChatGPT when you think about AI, but other players include Perplexity, Claude, Grok, and Mistral. The industry is booming, with many AI models available, and plenty of them don't require an internet connection at all.
Local AI models are models that run entirely on your own hardware, with no internet connection required. Unlike ChatGPT and Gemini, they don't need to connect to OpenAI's or Google's servers.
There are both advantages and disadvantages. Privacy is the major advantage: you can ask a local AI embarrassing or sensitive questions, or have it analyze private material, without worrying about Big Tech watching. It's also effectively unlimited. You can keep asking questions for as long as the machine is powered on, or at least until you run out of memory.
I decided to build the most powerful PC I could in the smallest form factor available. Could I have used a larger motherboard that supports much more RAM instead? Sure. But I doubt most people want a full-size tower in their home or small business, so I chose the equivalent of a powerful, compact gaming PC. Here are the specs and approximate retail prices:
- AMD Ryzen 9 9950X3D ($660)
- Nvidia GeForce RTX 5090 ($2,400)
- MSI MPG Edge Wi-Fi Motherboard ($290)
- 64GB Crucial Pro DDR5 RAM ($140)
- 2x 1TB Crucial NVMe Gen5 solid-state drives ($150 each)
- Corsair SF1000 Power Supply ($270)
- Fractal Design Terra 10.4-liter case ($180)
This is a very expensive system, costing $4,240 in total, because some local AI models need serious horsepower. Not all models require such expensive hardware, though. Some smaller models work remarkably well and can run on a laptop with a decent amount of power. OpenAI's GPT-OSS, for example, is an open-weight model that brings a ChatGPT-like experience to local hardware.
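To get a feel for why hardware matters, a common rule of thumb is that a model's memory footprint is roughly its parameter count times the bits stored per weight, plus some overhead for activations and context. Here's a minimal sketch of that back-of-the-envelope math (the function name and the 20% overhead factor are my own assumptions, not an official formula):

```python
def estimate_model_memory_gb(params_billions: float,
                             bits_per_weight: int,
                             overhead: float = 1.2) -> float:
    """Rough memory footprint in GB: weights only, multiplied by an
    assumed overhead factor for activations and context. A rule of
    thumb, not an exact figure."""
    weight_gb = params_billions * bits_per_weight / 8
    return round(weight_gb * overhead, 1)

# A hypothetical 20-billion-parameter model:
print(estimate_model_memory_gb(20, 16))  # 16-bit weights -> 48.0 GB
print(estimate_model_memory_gb(20, 4))   # 4-bit quantized -> 12.0 GB
```

By this estimate, a 20B model at full 16-bit precision would overflow a 24GB or 32GB graphics card, while a 4-bit quantized version of the same model fits comfortably, which is why quantized models are the usual choice for local setups.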
I also believe the most powerful models will become more efficient over time. A version of DeepSeek R1 that demanded serious hardware in 2025 might run on modest hardware by 2027.
