Practical AI – Episode #112

Building a deep learning workstation

Get Fully-Connected with Chris and Daniel


What’s it like to try and build your own deep learning workstation? Is it worth it in terms of money, effort, and maintenance? And once it’s built, what’s the best way to utilize it? Chris and Daniel dig into these questions today as they talk through Daniel’s recent workstation build. He built a two-GPU workstation for his NLP and speech work, and it has been serving him well (minus a few things he would change if he did it again).



Linode – Get $100 in free credit to get started on Linode – our cloud of choice.

Changelog++ – You love our content and you want to take it to the next level by showing your support. We’ll take you closer to the metal with no ads, extended episodes, outtakes, bonus content, a deep discount in our merch store (soon), and more to come. Let’s do this!

Fastly – Our bandwidth partner. Fastly powers fast, secure, and scalable digital experiences. Move beyond your content delivery network to their powerful edge cloud platform.

Notes & Links


Daniel’s workstation components:

  • CPU - AMD YD292XA8AFWOF Ryzen Threadripper 2920X
  • CPU cooler - Noctua NH-U12S TR4-SP3, Premium-Grade CPU Cooler for AMD sTRX4/TR4/SP3
  • Motherboard - GIGABYTE X399 AORUS PRO
  • Memory - Corsair Vengeance LPX 16GB modules (two 2×16GB kits), 64GB total
  • Storage 1 - Samsung (MZ-V7S1T0B/AM) 970 EVO Plus SSD 1TB
  • GPU 1 - RTX 2080 Ti
  • GPU 2 - Titan RTX
  • Case - Lian Li PC-O11AIR
  • Power Supply - Rosewill Hercules
  • Case fan(s) - Cooler Master 8mm
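
Once a multi-GPU build like this is up and running, a first sanity check is confirming both cards are actually visible to the system. Here's a minimal sketch (assuming the NVIDIA driver and `nvidia-smi` are installed; the function name `visible_gpus` is our own) that lists the GPUs the driver reports — on Daniel's machine you'd expect to see the RTX 2080 Ti and the Titan RTX:

```python
import shutil
import subprocess

def visible_gpus():
    """Return a list of GPU names reported by nvidia-smi, or [] if unavailable."""
    # If the NVIDIA driver tools aren't installed, there's nothing to query.
    if shutil.which("nvidia-smi") is None:
        return []
    try:
        # Query only the GPU names, one per line, without the CSV header.
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout
        return [line.strip() for line in out.splitlines() if line.strip()]
    except subprocess.CalledProcessError:
        # nvidia-smi exists but failed (e.g. driver not loaded).
        return []

gpus = visible_gpus()
print(f"{len(gpus)} GPU(s) visible: {gpus}")
```

If a card is missing from the list, it's usually a seating, power, or driver issue rather than anything framework-specific — worth ruling out before debugging PyTorch or TensorFlow.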

Daniel’s NUC 9 Extreme machine
