I Finally Got To Test The Most Powerful Home AI Model Ever Made!



What would a “dumb” (`deepseek-r1:8b`) Bart De Wever say about the new government policy and what would a smarter one (`deepseek-r1:70b`) say?

So I actually ran DeepSeek on my own normal PC to find out.
Since it was my first time running any LLM locally, some personal takeaways:

🧠 Smarter = needs more RAM
For running an LLM locally (“inference”), the main hardware requirement is RAM, **not** the GPU.
The model weights need to be loaded into RAM to be practically workable. Lighter/smaller models = less capable answers.
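To put a number on that floor: the weights alone dictate the minimum RAM. A back-of-the-envelope sketch (my own arithmetic, not output from any tool; it ignores the KV cache and runtime overhead, and assumes the ~4-bit quantization that locally distributed models commonly use):

```python
# Rough estimate of RAM needed just for a model's weights.
# params_billion: parameter count in billions; bits_per_weight: quantization level.
def weights_ram_gb(params_billion: float, bits_per_weight: int) -> float:
    bytes_total = params_billion * 1e9 * bits_per_weight / 8  # bits -> bytes
    return bytes_total / 1e9  # bytes -> GB

# deepseek-r1:8b and deepseek-r1:70b at an assumed 4-bit quantization:
print(weights_ram_gb(8, 4))   # 4.0  -> ~4 GB of weights
print(weights_ram_gb(70, 4))  # 35.0 -> ~35 GB of weights
```

So the 8b model fits comfortably in 64GB of RAM, and even the 70b fits, which is why it runs at all on a machine without a big GPU.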

🏎️ Faster = more processing power & bus capacity
I was surprised to see the GPU not being used at all. Think instead about DDR4 vs DDR5 memory, multiple RAM slots and their bus capacity on the motherboard. Of course, a decent GPU with the necessary GBs of VRAM is the better option for speed.
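The reason memory speed dominates: generating each token reads (roughly) every weight once, so single-stream speed is capped near memory bandwidth divided by model size. A minimal sketch, using assumed typical dual-channel bandwidth figures rather than measurements:

```python
# Theoretical single-stream ceiling: each generated token streams all
# weights through memory once, so tps <= bandwidth / model size.
def max_tps(bandwidth_gb_s: float, model_size_gb: float) -> float:
    return bandwidth_gb_s / model_size_gb

# Illustrative assumptions: dual-channel DDR4-3200 (~51 GB/s) vs
# dual-channel DDR5-6000 (~96 GB/s), against a ~35 GB 4-bit 70b model.
print(round(max_tps(51, 35), 1))  # 1.5 -> ~1.5 tps ceiling on DDR4
print(round(max_tps(96, 35), 1))  # 2.7 -> ~2.7 tps ceiling on DDR5
```

Those ceilings are in the same ballpark as the ~1 tps I measured on the 70b model, with the rest lost to compute and overhead.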

⚙️ No need for large hardware investments
Most software companies that provide a service will not need large hardware investments to make AI part of their offering. Their cost will mainly be engineering time & opex for using or running a model. Training a model is costly & hardware-intensive, but running costs are a function of the speed and complexity requirements.

On my decent-but-not-so-special workstation with an i7-9700K, 64GB RAM and an 8GB RX 580 GPU, I got these speeds (in tokens per second, tps):
– ~3 tps on the r1-8b model
– ~1 tps on the r1-70b model
Pretty doable!
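To translate those rates into wall-clock time, a quick sketch (the 500-token answer length is just an assumption for illustration):

```python
# How long a generation takes at a given tokens-per-second rate.
def gen_time_min(tokens: int, tps: float) -> float:
    return tokens / tps / 60  # seconds -> minutes

# A hypothetical ~500-token answer at the speeds I measured:
print(round(gen_time_min(500, 3), 1))  # 2.8 -> r1-8b: ~3 minutes
print(round(gen_time_min(500, 1), 1))  # 8.3 -> r1-70b: ~8 minutes
```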

Final text output here:

Do comment if you have a different experience running the LLMs!
Happy to answer any more questions!

The School of Logistics by Dockflow helps freight forwarders perform, offer better service to customers and take their business to the next level. We educate, inspire and build awareness for the new era of freight forwarding. Our channel brings you the latest tips & tricks, how-to guides, case studies and interviews in bite-size videos.

Want to see your logistics and supply chain questions answered? Reach out to us on social media.



Disclaimer
The content published on this page is sourced from external platforms, including YouTube. We do not own or claim any rights to the videos embedded here. All videos remain the property of their respective creators and are shared for informational and educational purposes only.

If you are the copyright owner of any video and wish to have it removed, please contact us, and we will take the necessary action promptly.
