Changelog & Friends – Episode #79

The state of homelab tech (2025)

with Techno Tim

All Episodes

Techno Tim joins Adam to catch up on the state of Homelab for 2025, the state of AI at home and on-prem (AI Homelab) and where that’s heading, building a creator PC, choosing the parts for your build, GPU availability, Windows being user hostile, and why Tim is happy to be using Windows, Mac AND Linux.

Featuring

Sponsors

Retool – The low-code platform for developers to build internal tools — Some of the best teams out there trust Retool…Brex, Coinbase, Plaid, Doordash, LegalGenius, Amazon, Allbirds, Peloton, and so many more – the developers at these teams trust Retool as the platform to build their internal tools. Try it free at retool.com/changelog

Temporal – Build invincible applications. Manage failures, network outages, flaky endpoints, long-running processes and more, ensuring your workflows never fail. Register for Replay in London, March 3-5 to break free from the status quo.

DeleteMe – Text CHANGELOG to 64000 to join DeleteMe. They make it quick, easy, and safe to remove your personal data online.

Fly.io – The home of Changelog.com — Deploy your apps close to your users — global Anycast load-balancing, zero-configuration private networking, hardware isolation, and instant WireGuard VPN connections. Push-button deployments that scale to thousands of instances. Check out the speedrun to get started in minutes.

Notes & Links

📝 Edit Notes

Chapters

1 00:00 Let's talk! 00:38
2 00:38 Sponsor: Retool 02:45
3 03:23 Skipping breakfast 02:14
4 05:37 Tim snacks 01:00
5 06:36 PEE CAN or Pecan? 02:05
6 08:41 Berdoll Pecans FTW 01:13
7 09:53 Let's talk Homelab 02:04
8 11:57 How Homelab began 01:17
9 13:14 Smaller and less power is the way 03:43
10 16:57 What motivates Tim? 03:29
11 20:26 Script it OR wing it? 02:44
12 23:10 Cameras for filming 05:04
13 28:13 Content practices and workflows 04:16
14 32:30 Changelog core beliefs 03:43
15 36:12 Homelab stuff that's resonating with Adam 02:21
16 38:34 AI Homelab 04:51
17 43:25 On-prem AI is the future 01:45
18 45:12 Sponsor: Temporal 02:02
19 47:14 Self-hosting AI 08:33
20 55:47 AI at home is the next frontier 00:32
21 56:19 GPUs on eBay 01:42
22 58:02 Local AI and Ollama are the way 06:14
23 1:04:16 Make Ollama first-class integration 02:10
24 1:06:26 AI as a Service on my LAN (AIaaS) 01:54
25 1:08:20 GPT is the first draft word calculator 06:28
26 1:14:49 AI builds from Tim 00:31
27 1:15:20 Building a creator PC 05:14
28 1:20:34 BUT Windows is user hostile 06:35
29 1:27:09 Paying the Apple tax 02:51
30 1:30:00 Why not BOTH Windows and Mac? 01:23
31 1:31:25 Sponsor: DeleteMe 01:52
32 1:33:17 Adam's creator PC build 05:18
33 1:38:35 Understanding the various GPUs 02:44
34 1:41:19 Planning for PCIe lanes 04:09
35 1:45:28 Gamers pushed the innovation 04:07
36 1:49:35 The GPU bottleneck 02:58
37 1:52:33 Tim's Linux Workstation 04:03
38 1:56:35 The hard drive conspiracy! 02:59
39 1:59:35 Should we do this more often? 02:54
40 2:02:29 Wrapping up 00:46
41 2:03:16 Closing thoughts and stuff 01:58

Transcript

📝 Edit Transcript

Changelog

Play the audio to listen along while you enjoy the transcript. 🎧

So Tim, no breakfast. You’re not a breakfast guy?

No, no. I don’t know why. I just stopped eating breakfast a while ago.

There’s no reason for it? You just don’t do it?

No health reasons?

No health reasons.

No optimizations? No biohacking?

No, no. It kind of slows me down… I think it goes back to, I don’t know, high school, not having enough time in the morning… And same with college, just rushing to class, and so I just never picked up anything and ate along the way.

Gotcha. So you must be young enough to the point where you still reference high school and college. Because I’m so far away from those two things that I can’t even like –

I’m far away. I’m far away. I’m just – I’m far away, too. I’m just trying to figure out how it started.

I rarely ever do, unless someone’s like, I don’t know, talking about their kids, and I’m like “Oh yeah, I remember those days.”

So you’re not an intermittent faster. You don’t intermittent-fast, or do your sort of eating windows… Or do you practice a special diet of any sort?

I intermittent-fast, but not knowingly. I’ve been doing it like half my life before it was a thing, because I don’t eat breakfast. I usually skip lunch, and then I just eat dinner. And so it kind of started happening this way a long time ago, because on weekends I would do it; on weekends I would just be so focused on whatever I was working on - home lab, gaming, World of Warcraft, you name it… That I would just say “Ah, I can make it to dinner”, you know? And so I used to do that on the weekends, and then ever since like work from home, it kind of carried over. So…

Right on.

Yeah, so it’s just dinner for me. A big dinner.

What do you do for, I guess – if you’re not eating anything, are you just drinking water?

I am. Yeah.

Just water only.

Water only.

No coffee?

Oh, coffee, yeah. You got me there. Yeah, yeah. So I wake up, two cups of coffee, a Nalgene of water, and then probably two or three more of these throughout the day. Then it’s dinner time. Then probably one or two more of these.

Wow. Okay…

Every now and then I’ll grab a handful of nuts. Like, I just did right before we started.

Yeah. Right on. Nuts are good. Okay, so you’re a healthy eater then, it seems.

I guess so.

[unintelligible 00:05:42.28] versus a candy bar, or let’s say some gummies, or something like that. I don’t know.

Don’t get me wrong, if we had candy in the house, it’d be gone. Actually, when I went down to the cupboard just a second ago, my wife had chocolate-covered almonds in there. I’m like “What is going on?” If I knew about these, they’d be half gone by now.

Oh, man… Yes. I am a sucker for chocolate-covered almonds, or pecans.

I prefer pecans, because almonds - they can crack your teeth. Unless they’re roasted, and they’re like a little soft.

I just had chocolate-covered pecans. No, cashews. Chocolate-covered cashews with sea salt, the first time ever, last week. And I was like “These are so great.” But yeah, I’ve got a thing for candy. I definitely have a sweet tooth. I ask my wife, she’s like “Oh, my gosh.” She has to take it away from me. [laughs]

Let me make a recommendation on pecans, just in case – now, do you say PEE-can, or do you say pecan?

Pecans. I do, but I don’t enunciate it like you do. So I say pecan.

Pecan. Okay. So is it pecan pie?

Yeah, pecan pie.

Or is it like PEE-can pie? Because some people say PEE-can pie, and those are not Texans. I live here in Texas, but I’m a Yankee, as they say. I’m from Pennsylvania.

And so I never really cared how you said pecan, but Texans really care. And so you can’t say PEE-can pie. You have to say pecan pie.

Wow, I thought pecan would have been a Southern way of saying it.

It’s Southern, but it’s not Texan. So Texas is South, but it’s not Southern.

That’s right. Gotcha.

Southern is kind of Louisiana and East, and not Florida. So basically, Louisiana, Alabama, Georgia, the Carolinas; that’s what is considered the South. Kentucky, of course, Tennessee - those are the Southern states. Texas is South, but not quite Southern… In that regard, at least.

No, it’s funny you mentioned that, because Indiana I feel like is like half Southern. So I’m from Indiana originally, but I’m from the Northwest near Chicago… But Indiana is this long –

Long state, yes. It’s a tall state.

Man, it’s like nine hours to drive through it, or something crazy like that. I don’t know. But some people in Indiana have a Southern accent. And growing up, some people would have the Southern accent and I’m like “Dude, you live like two blocks from me. How did this happen?” You know what I mean?

[08:13] Right. Where did they go wrong…?

No, I’m not saying it went wrong. I’m just saying, you know, somehow my family got the Midwest, the Chicago-style accent, and two blocks down some people got a Southern accent. It probably has to do with their upbringing and their family, but that’s how diverse accents are in Indiana… Because as you get around Indianapolis, you’re either North or South. I think that’s the border.

Well, let me make a recommendation on these pecans - because I say pecans - and then we’ll get into some home lab stuff, and maybe some AI home lab stuff… We’ll see, if we get a little crazy around here. The brand I want to recommend is a Texas brand. It’s a family-operated brand called Berdoll. B-E-R-D-O-L-L.

Berdoll. I like it.

The best pecans you’ll ever have. You can just get the pecans for Christmas if you want. You can get a bag, or you can get the chocolate-covered ones. Whichever flavor you want to go with, but their pecans are legit the best pecans ever. This is not sponsored, just a tried and true, loved, beloved Texas brand called Berdoll.

I’ll have to check it out.

And damn you Buc-ee’s for dropping them. Buc-ee’s is a big – have you ever heard of Buc-ee’s?

Oh, my gosh. Well, I can’t take you there in this podcast. We’ll have to do it as an after show or something like that, but… Buc-ee’s is not a gas station, it is a destination, let me just say. It’s on most Texas highways, it’s a beloved Texas brand… Buc-ee’s. Tim, I’m going to teach you some things, man. I’m going to teach you some things.

Alright, let’s get into the meat of the matter. We’re friends, we’re talking about home lab, we’re talking about some creator PC stuff, maybe some Linux workstations, some AI… You just did a hardware tour, a software tour, a what’s practical to run as AI in your home lab/home, whatever… Where should we begin? Should we begin with the state? What is the state of home lab as you see it going into 2025? Is it growing? Is it stagnating? Is it more diversified? What’s the state of home lab?

No, I think it’s growing. And I think I said this last year, that I think it’s growing… And I think last year I said that we’d see a trend towards more mini PCs. And I think that’s right. And we definitely will.

Right. That’s true.

And we are still seeing it. So I think it’s still growing. I think the trend is growing, because a lot of people are saying either they want to post up at home, they want data sovereignty… Probably not a majority of people, but a lot of people will start having their movie collection at home, and then that’s a gateway to get into home lab-ing… Because like me, that’s kind of how I started. I want to have a media server, I have this extra compute… What else can I do? What else can I throw on there? And then all of a sudden you have a server rack in your basement.

So I think it’s still growing. I think it is. And Geerling just a little bit ago - Jeff Geerling - had a video on 10-inch mini racks… Which - I have a couple up there, behind me. And I think that kind of opened up the doors for a lot of people to say “Hey, I don’t need a full-blown PC. I don’t need a big server. I can have a whole entire home lab in 4U of rack space that fits on my desk.” As you can see, I have two mini server racks back there. And so I think that’s going to be a bigger trend. It already is a big trend over in Europe, 10-inch racks. But in the US they’re just starting to catch on.

Yeah, I’m kind of tired of the massive rack… Major power, too. Having to have this massive UPS to deal with it… I feel like it’s – it was a good start. Home lab kind of began by bringing what was enterprise to the home, to play with it.

[12:14] That’s right.

Repurposing older enterprise servers, older enterprise workstations, desktops, etc., because there wasn’t a lot happening. Now there are a lot of new cases out there. Fractal is going crazy with all their different designs. There are so many others out there to compare. SilverStone is one of my other favorite brands. I think when I was watching your Linux build, I think we were using the same - what do you call that? …case, I guess. SilverStone RM42-502, I believe is what it was…

Well –

What are you using for that? I’m jumping the gun a little bit, but…

Yeah, that one is just – I don’t even remember the brand name. It’s not a popular brand for that [unintelligible 00:12:52.28]

It wasn’t a SilverStone?

No. Although I –

It looked very similar.

Yeah, it’s actually right there behind me. I’m actually going to probably move that into a Sliger case. And Sliger cases are super-awesome. They’re one of the few that focus on rack-mount systems. And yeah, I agree. I overindexed in the beginning on compute. I used enterprise servers… You know, decently-priced, or cheap, in general…

But most of the time that’s way overkill on compute and power and everything else. Heat, noise, you name it. And so I reeled it back, I don’t know, two years ago, and moved to mini PCs for my Kubernetes cluster and some of the things that I’m hosting at home.

And even before that, I got 1U servers and kind of – I kind of toned it back a little bit, even before I moved to those mini PCs. So yeah, it’s wild. But at the same time, me personally, I’d rather have my stuff in a rack. So having my stuff in a rack doesn’t make it high-energy, or high-power usage, because I have a lot of stuff in my rack right now and I think I’m at like 500 watts… Which is a lot, don’t get me wrong, but nowhere near what it could be or what it was, 800-900.

So for me personally, I like putting things in a rack because it keeps things organized. It keeps things where they need to be. Cords have to go in a certain spot. Power has to go in a certain spot. These machines have to stack in a certain way, you know? And when I close the doors on the server rack, there can’t be anything hanging out.

So for me personally, I like to compartmentalize all of that in one space. That’s how my workstation is right here. I have a Mac Studio that’s racked, just because I don’t want to have to deal with cords and all of that. I want it to look nice, and give it a proper place to put it. Otherwise it’d be sitting on the floor, or some shelf, and… Who knows…?

Yeah, I made a mistake, actually, when I mentioned my SilverStone. I think it was – not that you might know this, because you said it’s not a SilverStone… But it’s a SilverStone RM44. That’s actually my Plex server case. It’s got three fans in the front, great ventilation, just in case I spin up that 4K footage and need to transcode it to 17 clients, all that good stuff to manage the cooling of it… But it’s a SilverStone RM44, and it’s a 4U case. And I’m the same. I’ve got a – my preference is rack mount, really. And if I can’t rack-mount it, it might look nice – for example, the Fractal Design North; a lot of people are liking that. But I’m like “Can I get a rack mount version of that?”

Yeah, exactly.

[15:52] I would – all day.

Yeah, all day.

Are you there with me on that one?

Oh, yeah, man. Metal, wood, and stone is all you need.

Right? A little wood is just so nice.

Yeah, if it says Fractal and it has wood in it, I automatically like anything that they post, so…

Yeah, for sure.

Yeah, they have great designers there. I like it. Yeah, definitely minimal.

Do you pay attention to case designers? I know a lot of people that really steep themselves – you’re a YouTuber, you’re a creator… Do you get into that realm of like “I pay attention to a case designer, and when they go from XYZ company to Fractal and they design their next case, I’m on it.” Is that the kind of person you are? Or are you just sort of like “Nah…”?

No, I just know what I like when I see it. And embarrassingly, maybe, I don’t know the names of any of the designers who do this stuff. I know that there’s a great designer behind all of this stuff, but for the most part, I just see the brand, where it lands. So no, I couldn’t tell you any designer’s name.

Well, good for you. You’re not that deep then. What is it that motivates you to do what you do? I think you – did you begin this as a sort of a side gig, to a hobby – because you were a software developer, and you still are… But you kind of like inched into this, and maybe did it well, and got popular, and had some good thoughts, and then you’re like “Man, this is kind of cool.” Is that kind of your story, or what makes you get up in the morning and every day make content?

Yeah, so that is kind of how it happened. That is kind of how it happened. Like, I’ve been doing this as a hobby for a long time. Instead of home labbing, I’m starting to call it infrastructure as a hobby. That’s what it has been for me, for a long time.

I did infrastructure for work, but then, as you mentioned, got into software development. I’ve been a software developer for a long time now. So now infrastructure is my hobby. So everything I do at work is cloud, 100% cloud, 100% virtual. I don’t touch anything except for my laptop.

So I always have this itch for infrastructure. And it comes from two things. One, the media server that I mentioned earlier, but two, just having an access point and a router at home. Anyone can look at their access point or router and say either they love it or they hate it, you know? And if they say they hate it, it’s probably rented from whoever their ISP is. If they say they love it, they put some time into it and bought their own. And that’s kind of where I like to play - a little bit in the networking, and somewhat in compute and storage… So even if I weren’t totally in, as I am now, I’d still be doing this kind of stuff.

But the thing that motivates me in the morning to get up – like, this morning, I thought “Ah, I get to play with this stuff.” As soon as I opened my eyes, I thought “Sweet. Did my print job complete on my 3D printer? Because I need that stuff for these mini racks behind me.” That’s one thing I got into in the last six months, 3D printing. Or actually, two months. 3D printing. I’m in deep. But…

Is that right?

Oh, man. Yeah, it’s wild. But that’s what gets me up in the morning. I think just having a variety of stuff to work on… And if it’s not hardware, it’s definitely software. Whether it’s writing software, or using software, I’m in this kind of weird space where I can write software, and I do, and I build that software and run it in my home lab… But I also use software from other companies, whether it be closed or open source, in my home lab. I have hardware that I built and scrapped together myself for custom builds, and then I have hardware from vendors that’s full solutions that I use. And so I’m kind of playing in that world. So if there’s a Venn diagram of all of that, that’s what I like. It’s hardware, software, open source, closed source… It doesn’t matter. Closed source, it has to be a really compelling story and a good solution, better than anything out there… That’s why some of that stuff might be closed, is because they’re making a profit and they’re making a good product. And I’m not afraid to support good products.

[20:10] But yeah, that’s what gets me going, is really just playing with stuff, and thinking about what I’m going to tell people about the stuff that I’m working on. That’s kind of what goes through my head.

Do you go as far as writing scripts, too? What’s your process? Give me a one-minute, two-minute version of how you do what you do. Can you compress it in two minutes, or is that too hard?

I can. Cut me off if I go too long.

No, you can go far if you want to. I was just kind of giving you… Just a limit.

I could talk all day. I used to wing it. I used to wing it. And that was fun, except for I was messing up every other sentence, and it got super-painful to edit later. So for me, writing a script was more of an optimization for later, so I didn’t have to edit longer. And I noticed the more I messed up, the longer I’d have to edit, so I started writing scripts.

So I have two or three ways that I do it. Sometimes I’ll have a script, sometimes I’ll have an outline. It depends on the topic. If it’s a tutorial, outline. Intro has a script, tutorial, an outline, outro has a script. That’s kind of how I work now. But there are other videos where I have a full script.

So how does it start? Well, I have a list of ideas. Every video I kind of vote in my head what’s next. I have commitments from – external commitments, maybe from brands, I have the things that I want to do, and then I have the things that I kind of think will do well. And then I have the things that are relevant, super-relevant to right now. So I have to kind of juggle that.

So I usually pick one of those, I start depending – depending on the topic, I’ll either start experimenting, so I can think through what I’m going to say as I’m doing it… Or I dive right into writing, because I know what I’m going to say. I’ll film it… I do use a teleprompter if I need to. And then I’ll start editing. While I’m editing, I kind of think of what the thumbnail is going to be. And while I’m editing, I’m trying to think, “Are there pieces I can cut out to be more concise?” Because sometimes even if I review the script that I’m going to record, there are still things that slip in, that I think later on when I’m editing, I don’t need to say. And then – yeah, then it’s a polish it up, put effects on it, upload… You name it. Subtitles, thumbnails… Three thumbnails now, for A/B/C testing. It’s tough. Yeah, YouTube’s a lot of work. And I’m not complaining at all… It’s just, it’s more work than people think it is.

Oh, yeah, it is. It’s a lot of iterating, and I think that’s what you mean by work. Even the A/B/C testing, that’s iterations. Like, which one is the one that makes people really excited about this video, or connects with the people that should watch it?

Are you tapping into the iPhone camera at all, or are you strictly staying like mirrorless, or DSLR? What’s your flavor of how you shoot? Have you got any preferences?

I use kind of whatever works. So I have – well, now I have two cameras that I work with. Actually, three. So when I’m doing top-down kind of stuff, it’s back there. That one’s mounted, because I replaced it with this one. This is a Sony mirrorless FX30 camera. I’m not a huge photographer. I realized that a lot of people that are on YouTube are like former photographers, or former camera people… Not me. I’m a tech person who turns my camera on. So anyways, I have that. And then my iPhone. My iPhone is actually really good at shooting B-roll. I have some videos where 90% of it is iPhone, and no one ever balks at it one bit.

Oh, yeah.

[24:04] So yeah, being able to record this easily, as long as I have enough light, it has stabilization built in… No one even knows. I know, because I know what to look for. But no one knows, or no one cares. And to be able to do that is a game changer. I really wish mirrorless cameras or DSLR cameras would kind of catch up. And not so much on the optics, because they’re way beyond, but more so on connectivity, like storing stuff: automatically send it somewhere when I’m done. Or even do NDI. Send it over the network somewhere to my NAS, so I don’t even have to worry about SD cards or anything anymore. So I don’t know, I’m hoping cameras take a huge leap at some point. They’re already there, in a sense; super-expensive, high-end cameras can do all of that… I just feel like – I don’t know, the interface is just so old. Whenever you look at a camera, the interface is just so bad and so old.

Yeah, I agree with that. It’s primed for disruption; self-disruption, hopefully. I’d love to see maybe even Ethernet cable on a camera, connected; real-time bandwidth, 10-gigabit, 100-gigabit, 50-gigabit, whatever you’ve got on your network, straight to your NAS. Direct record, like a computer. It is a computer. Why not be a computer?

I agree. I agree. And it would be awesome. So if we’re going to go down this path, make them as dumb as possible, and then put software on my machine, so that I can push software to it. Basically a controller. You make the camera dumb, and then make some kind of software smart, so you can just push settings and push everything to that camera. It just captures it and dumps the video wherever you want. But yeah, similar to access points. They’re pretty dumb. You use a nice GUI to configure them, push settings to it… Software-defined network. Camera-defined? No, I don’t want that. Software-defined cameras. There we go.

There you go. Yeah. I’m with you on that. I’ll vote on that, if you can get that bill passed.

[laughs] Yeah, no, the camera makers will not have it, because then they become a commodity, right?

Yeah. They’ve gotta control the market. It’s all about my processor, and my – yeah, it’s a game, for sure.

Yeah. I don’t blame them, but…

Do you – one more thing on the iPhone-style shooting. Do you do just the straight camera application, or do you do something else? Blackmagic, I think, has a pretty sophisticated one, where you can mess with the f-stop, and shutter speed… Do you go crazy with it, or are you just sort of like open up the camera app and you’re done?

Yeah, so for me it’s a camera app, Auto, and I’m done.

Keep it simple.

Yeah. So the one thing I do use - you can kind of see it back there - is my gimbal, a DJI gimbal. And they have their camera app built in too, but I don’t change the settings. I just hit Record and it’s Auto. It burns me sometimes on the white balance and stuff like that, but… You know, I started thinking about like – like, my goal this year is to turn around content quicker. And so I need to stop sweating the small stuff. That’s big for me. I was just talking about white balance. No one is going to care about the white balance. And if they care about the white balance being a little bit off, they’re either super-interested, or maybe not the target audience. You know what I mean?

Or it’s that obvious. So I need to stop sweating the small stuff on a lot of my videos and just get them out quicker. That’s a trend I see, too. People are starting to gravitate more and more towards less polish, more informal, more spur of the moment type videos, you know? I kind of do a bit of all three, and so just trying to figure that out for this year. I just want to be able to turn videos around a lot quicker, and I think the way that I do that is to make a lot more informal videos, you know?

[28:11] Yeah, I’m with you on that. As I mentioned in the pre-call - I think it might’ve been in the show, I’m not sure… that this year, 2025, for Changelog proper is video-first. We’re taking our podcast full-length video. That’s Changelog News, our two flavors of our long-form show, both on YouTube full length… Our news show is nine minutes or less… And it’s already hitting close to 2,000 views. Like, 2,000 views is pretty good. We just started this year, and it’s approaching 2,000 views for that news show consistently.

Our shorts and clips, those things are always in the hundreds, maybe sub one thousands. There’s some that are breakout hits in the 50, 60 thousands. We’ve got one video out there, that we’re gonna actually recycle, because it’s just that good, about AI, and IP law, and stuff like that. And music. It’s got all the right touches and feels, you know… That one has done more than a million views.

Yeah. I remember it, yeah.

So you’ve got this really great piece of content… Let’s just say “What’s in my hardware rack for 2025.” Let’s say that’s what you’ve done. Let’s just – hypothetically, maybe you’ve done a video like that. You put a lot of work into that video, and maybe it’s 20-ish minutes. You’ve got good chapters in there, you’ve got all the right things. You don’t just leave it there, though. You can turn your camera, your iPhone camera, let’s say, vertical, and do a short. And it is a side promotion of that content. A little informal… I feel like we need this hub and spoke mentality. You’ve got this hub, which is this larger, longer form, kind of thought out content that’s chaptered and linked up in the description… Then you get these sidecars. You’ve got to go on LinkedIn, you’ve got to go into the shorts, you’ve got to pull some clips from it maybe… You’ve got to find some way to promote the hub. Do some spokes to promote the hub, essentially. Are those some things you’re doing?

It is, it is absolutely. And yeah, I agree, shorts are always, for me, super-informal. Sometimes I might write from a script just because - well, I used to only have 60 seconds. Now you get three minutes. YouTube finally folded into what reels and TikTok’s been doing, so you get three minutes now… Which I think is awesome. Because before I was like trimming off seconds, half-seconds between words just to get it into 59.9 seconds, just in case… Because then they wouldn’t consider it a short. But yeah, that’s exactly what I do. I have my long-form, polished content, on YouTube proper. And then I take it even one step further; I live stream on Twitch every Saturday. Totally informal, totally Q&A, AMA style. I never know what’s coming. I promote that other places… And then yeah, absolutely, I upload my videos to different – LinkedIn, X, Facebook, you name it. Upload those there as well. And then Instagram reels, too.

And then there’s content all the time that I made from that week of shooting the video. So yeah, it’s kind of weird, because you just – it’s a lot of sharing stuff, and hoping that stuff sticks, or gets shared again. And it’s kind of weird, I don’t know… For me it sometimes gets kind of weird, because it just feels like sometimes – and I don’t want people to ever think like I’m showing off, you know what I mean? And that’s kind of what – that’s not me at all. Like, I never want to show off stuff. But if you see my YouTube stuff, you’re like “That dude’s a show-off. What the heck is he talking about? Why is he showing me all this stuff?”, maybe. And some people, who didn’t know who I was and didn’t know that I do all of this, maybe just saw my video, the vertical one, for the first time. So it’s just a weird space for me to be in, because it’s like self-promotion on steroids, all the time. And in my private life, that’s not how I am.

[32:26] Yeah. I do feel that angst as well, honestly… And there’s two pieces of - I wouldn’t call it advice; maybe just conversation. One is give them what they came for. That’s one of our pillars. Let’s just say a core belief, so to speak, around here. Give them what they came for. It’s got a lot of things, like, don’t make the intro so long that it takes too long to get to the content. Time to content is essential. Whatever is valuable, the hook etc. Give them what they came for.

I think people come to you and they want that start of the year, they want that maybe mid-year, and kind of content from you which is show-off content. “Show me what you’ve got, Tim. Show me the choices you’re making”, which is like that show-off feeling, but really you’re just giving them what they came for. You’ve got an audience who cares about your opinion, and you kind of have to show off, and so I kind of feel that for you.

And when you do that, if your default gear in life as a human being and your persona and your personality is not a show-off personality, then that’s going to be a bit foreign to you. You’re going to feel a little icky. You’re going to feel a little too self-promotional, and you’ll get some haters. And my second piece of advice is let them. Now, that’s not my thing. That’s Mel Robbins. Do you know Mel Robbins?

No, I don’t.

Oh, my gosh. She’s awesome. She wrote this book… This is non-sponsored. I can’t wait to read it. I’ve heard talk about it, I’ve been paying attention for a long time, but I don’t know how to describe Mel Robbins [unintelligible 00:34:59.08] I think she might be a psychologist, she might be a motivational guru, I’m not really sure… But she has been through the wringer, let’s just say, in life, and she’s bounced back, and she’s got lots of life advice because of the way she’s changed. And she has this new book out called “Let Them.” It’s called “The Let Them Theory”, as a matter of fact. I haven’t read it yet, but I identify with the premise, which is you’ve got some haters, you’ve got some people who want to say something about you that - “Oh, Tim, yeah, you’re just showing off.” Let them. Somebody doesn’t like you? Let them. Let them not like me, I don’t care… Because we care so much as human beings more than anything about what other people think about us. And the reality is in almost every case they’re not thinking about us at all, they’re thinking about themselves. And they want to hate on you. You want to change your ways or not show up for your audience, or show up in the ways that you want to, to just be the person you are on YouTube, or the different places you’re at, and be the human being you are… Someone wants to hate on that? Let them. Let them hate on it. You know?

Yeah, it’s good, man. I like it. You know, I have that internal struggle. And even if people didn’t say anything too, I personally am just like – you know, would I walk outside and be like “Hey guys, come look at all this stuff I have running in my basement”? That’s kind of how I feel like sometimes on social media, and it’s like “Ah…” But no, it’s awesome. No complaints whatsoever, because - man, most days of the week I get to wake up and do whatever the heck I want, and it’s awesome.

What a joy ride. What an absolute blessing. And yes, it’s almost like if someone says “Tim, how’s your day?” Well, am I healthy? Is my wife healthy? Y’all don’t have kids, but I have kids… Are my kids healthy? Is my dog even – is my dog healthy? Are my people healthy? Do I have the opportunity to do what I want to do this day? Man, it’s a great day. That’s a great day, right?

[36:00] I agree. I agree. Yeah, I’ve been blessed for a long time with people being healthy in my family. And at the same time, having so many opportunities that I’ve had.

Well, at the risk of getting too deep in the details here, man, let’s truly talk about home lab. I feel like for me there’s a couple of things that I’ve been resonating with, I would say middle last year, tail of last year, and the beginning of this year. One, I’m desperate to create a creator PC. But then I think “I’ve got to put Windows on this thing” and I’m just like “Forget it.” I want to build it, and I want to tinker with all the parts, the GPU, the CPU, the motherboard, all the parts. I want to have fun building it. I want it to look cool. I want it to be a showpiece, but I also want it to be performant. But then I’m like “Man, Windows?” I’m a Mac guy, through and through. So I’m like, I can go buy a Mac mini that’s basically – sure, it’s overpriced on some of the hardware, but I don’t have to worry about drivers, and BIOS, and configuration… Which I don’t mind doing in the Linux world, but forget it on a Windows PC. That’s [unintelligible 00:37:08.05] one, is I really want to build a creator’s PC.

And then my second thing is - we’ve talked around the software of artificial intelligence on our show for years now. We have a show called Practical AI that is part of the network… It’s been doing it since before it was called artificial intelligence.

It was data science then, machine learning then. It was not artificial intelligence. Now it is. That’s how long we’ve been steeped in this world of AI. Pre-GPT, pre-anything, really. But I really want to run AI in my home, like a lot of people. And thanks to Ollama, Open Web UI and a lot of these advancements around open source models - or at least openly available models, whether truly open source or not - I feel like it has now crossed the chasm, to the point that I’m compelled to build an AI home lab. And right now I have one. It’s so embarrassing… It is an Intel NUC. It runs everything on the CPU, because there is no GPU in it… And Ollama will not recognize the iGPU, at least on this thing. So it’s just straight-up CPU. I can run some 1.5, maybe 5 billion parameter models okay… But anything beyond that is just like forget it.

So those are my two [unintelligible 00:38:20.07] is like creator PC, and some form of AI home lab. I feel like I want to self-host AI in perpetuity, at some point. So I’m going to crack the nut on this machine. So those are my two subjects. Which one do you want to talk about first?

Let’s go with the AI one, because that’s – yeah. Let’s go with that. So I’ve been doing it for a little while, and it doesn’t take as much as you think it does. You’ve already been doing it, and you’ve been doing it with these small 1.5 billion parameter models, and it’s running okay. And for the most part, you can run it in your home. I mean, you talked about Ollama, which is fantastic, which kind of opens the gates for all of these other LLMs, and gives you an easy way to swap them out… And then Open Web UI really takes that to the next level, where it’s like “Okay, now I have GPTs and helpers, and basically a UI to do all this stuff with Ollama.” And so you could do it very easily.

If you have an old gaming card - if you had an old gaming PC with, I don’t know, a 20 or 30-series RTX - perfect. It’s probably only going to have about eight gigs of VRAM, but that’s more than enough for some of the smaller LLMs. And that’s the most important thing, I think, in GPUs: your VRAM. Because if it’s small, you can’t fit the whole model inside, and so it’s going to be paging it out. So you want to look for cards that have enough VRAM to fit models in them.
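As a rough rule of thumb, the VRAM a model needs is its parameter count times the bytes per weight (set by quantization), plus headroom for the KV cache and runtime. A minimal back-of-the-envelope sketch - the 20% overhead factor here is a ballpark assumption, not a measured constant:

```python
def approx_vram_gb(params_billions, bits_per_weight=4, overhead=1.2):
    """Rough estimate: weight bytes plus ~20% headroom for KV cache and runtime.
    The overhead factor is an assumption, not a measured number."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return round(weight_bytes * overhead / 1e9, 1)

print(approx_vram_gb(7))                       # 4-bit 7B model  -> 4.2 (GB)
print(approx_vram_gb(13))                      # 4-bit 13B model -> 7.8 (GB)
print(approx_vram_gb(7, bits_per_weight=16))   # fp16 7B model   -> 16.8 (GB)
```

By this estimate a 4-bit 7B model fits comfortably on an 8 GB card, a 13B model is already pushing it, and an unquantized 7B model doesn't fit at all - which is why quantized models and VRAM capacity dominate the card choice.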

[39:54] And so there are expensive options, like even 3090, 4090, 5090, but there are budget options. I think the 3070… So in that Linux workstation build video I did - and that was focusing on AI, or really LLMs… I wanted to build a workstation to play with them. I think 3070 is the budget pick right now. One, because it has 12 gigabytes of VRAM. And two, because of the price.

So when I say budget, we’re talking sub $300. That’s still a lot of money, don’t get me wrong, but it’s not the five or eight thousand dollars that you’re going to pay for cards where the compute end doesn’t really matter. So with 3070s, if you can find one, especially used, or something, you can load big LLMs on there. And not only one, you could run multiple.

So I’ve been running LLMs at home for a little while now. I’ve been running Open Web UI to kind of have my own local ChatGPT, if you will… And I’ve also done some stuff, even as simple - maybe not so simple, but voice transcription, with Home Assistant. And now Home Assistant has an AI voice assistant… But prior to that, I hooked up Ollama to Home Assistant to make Home Assistant even smarter.

The most basic example is I could ask Home Assistant “How many lights are on in the house?” Simple question. Before I hooked up an LLM, Home Assistant would say “I don’t know what room that is.” It has no idea what I’m even saying. So then I hooked it into Ollama - use any model you want, really, honestly - and asked that same question, and it said “You have 17 lights on in your house.” And it just blows my mind. Hopefully, most things we run will let you plug in your own model or engine - API endpoint, if you will… Because that’s what it is. And that’s yet another thing I want to get into a little bit more: I can do this stuff in the GUI with Open Web UI, and Midjourney, and generate graphics… But for me, the next piece, besides trying to train, is having an LLM API. So really, using the REST API that’s on Ollama, I could feed it text, and do sentiment analysis, or ask it to summarize stuff, through an API endpoint… And then I can build any tools I want, in the code I want to write, and have that backed with an LLM of my choosing.
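The Ollama REST API Adam is describing is just an HTTP endpoint on localhost. A minimal sketch of calling it from Python with only the standard library - it assumes a local Ollama server on its default port 11434, and the model name is a placeholder for whatever you've pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint; change the host if you run it elsewhere
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="llama3.2"):
    """Ollama's /api/generate takes a JSON body; stream=False returns one object."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ollama_generate(prompt, model="llama3.2"):
    """Send one prompt to the local Ollama server and return its text response."""
    req = urllib.request.Request(OLLAMA_URL, data=build_payload(prompt, model),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# e.g. sentiment analysis over your own data, no cloud API involved:
# print(ollama_generate("One word - positive or negative: 'The build was painless.'"))
```

The same endpoint covers summarization, sentiment analysis, or anything else: you only change the prompt, and swap the `model` field to another local model.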

So that’s where I think it’s super-powerful, and that’s why the whole DeepSeek thing is kind of blowing up, too. But being able to do some of this… And I don’t even want to say at home. I just want to say like being able to self-host your own LLM, whether that’s in the cloud, in your home, at work, but having control over that LLM, and not going out to Open AI and using their API, but using your own API, on your own model, that’s trained, or maybe a public open source one, is I think going to be a huge game changer for a lot of people, for a lot of companies; a lot of companies first, I will say. But it will impact all of our lives, as it already is.

Yeah. I think on-prem AI is the necessary next step. And I would potentially even pay for a license. Like, if OpenAI is the winner, or if DeepSeek is the winner… Whoever is the trending one - we choose Intel, we choose AMD, we choose a brand… So whatever brand we as a society, or techies, or geeks want to choose, if it’s not going to be open source - literally where I can download it myself - I would love it if I could at least license it, like you would software, and say “Okay, well, if o3 truly is the best, or o3-mini is truly the best, and you can give me an on-prem version, and I have a licensing scheme or something, and I know that you’re not hoarding my data, or sniffing my stuff, or training other things on top… If you can give me some agency”, then I’m for that, too.

[44:18] My preference really is – but it’s hard to have this as a hard preference, because I know how much money goes into training these models. There was speculation that DeepSeek was only a few million, which everyone was like “There’s no way. How did they do this at such a cheap cost?” Well, that was actually just the GPU cost; that was not the true cost of training, which was speculated to really be in the billions, similar to OpenAI. So that’s the one major trend this year. And thanks to Jerod, my business partner and co-host on the show, because he shared that in Changelog News on Monday… Which was really – I love that show. I love paying attention to our own content. I’m up on the latest with dev news when I listen to Changelog News on Mondays… Every single Monday, by the way.

Break: [45:06]

I want to self-host. I’m seeing the future of where this is going. I feel like I want to self-host AI, and I want to go as far as I want to have quad GPUs. Okay, Tim? I want to build an open rack, with good ventilation… I want to deck it out. I’m seeing the future where this is going… And that build is maybe like $4,000, $5,000 for that kind of build. But so far, all you can do is do some of the things that you’re doing now, which is like Home Assistant automation, and stuff like that. What do you think about this world where we’re going to eventually have an appliance, maybe even? I feel like someone’s going to be like “Let me simplify this for most of these home users. Let me one-single-button this thing, because we’re not all Techno Tims and Adams out there. They’re not going to build these machines. They may pay the price for it”, but… I kind of feel like the next major trend, even for non-geeks like us, people who are just like everyday folks - they’re going to eventually get to the point where they’re saying “You know what? I want to have some agency over my AI, and I want to have an appliance in my house that I know I can trust.” What do you think about that?

Yeah, I agree. I agree. And I see this slipping in a little bit, so it’s starting to creep in. So a lot of, I guess, NAS – I shouldn’t say a lot… But some NAS vendors are now positioning their NAS as “Oh, it can also do AI”, because they can put a video card in there, run an LLM, and there you go. Although that’s still developer mode for most people. But if you install Open Web UI, Ollama, and it comes in a nice package, in a Synology, or whatever NAS you want, and it has a video card in there… It’s like, okay, cool. That works. That’s a full product. Right?

And so I think that people wanting storage at home, and also those companies seeing that “Hey, AI at home is cool, too.” I think it’s going to start going in that way. But I will say that – well, also, NVIDIA had that, whatever it was, I forget what it is… H100, or whatever it is. It basically has like crazy –

That tiny thing?

Yeah. Basically like crazy, crazy compute, for a reasonable price, I would say. It would destroy anything. H100? I can’t remember.

I think it was a Jetson Nano… Is that what you’re talking about?

No. No, they announced –

Oh, this is something different.

Oh, yeah. They announced this, I don’t know, two weeks ago. It was right before – they had a press conference right before I think CES.

Okay. You keep talking, I’ll do some googling.

So there’s that. But again, this is like a developer tool, and so it’s going to take developers to do this. But that thing is small and compact, and so before you go spending four or five thousand on GPUs, I’d say look at that first… Because this is like compute on steroids for anybody to do it themselves. And it’s way beyond anything you could possibly build for the money.

But I still think there’s going to be – well, two things. I still think it’s going to take a killer app. It’s going to take a company to put it together in a package, and create a full product. It’s not going to be “Hey, use this software along with this video card, and put it in this hardware, and then you’re going to have AI.” No. I think it’s really going to take a company that’s going to do a full product. And that could be anyone. It could be companies we know already, or it could be companies that are just starting out.

But I will say though that I think where this becomes super-useful as soon as we get - I think we have it now - action support. Action support is really going to change our lives even more than just LLMs already have. So if I’m able to tell a helper or a GPT to go do something for me, that’s where this is going to be huge. And I think ChatGPT just announced something about –

Operators, yeah.

Yeah. So they’re already getting action support. And so I would love to be able to, like I do in Home Assistant, say “Hey, turn on these four lights” or “Turn off these four lights.” I would love to be able to say things that are kind of tedious to do, but I want them done. And I don’t know, maybe it’s Google and Gemini, but… I don’t know, “Hey, summarize my emails. Tell me the most important ones right now.” You know what I mean? Or “Do I have any emails that are super-important or critical that I should look at?” This is all business use case… But imagine if you had that helper on your desktop. And this might be going too far for some people, right? But imagine if you had software running on your desktop that was your helper, your AI helper.

[52:00] Yeah. I mean – well, yeah, but I mean, it could do things; like, it could launch VS Code, it could launch whatever. It could go and patch your server. Like, this is getting probably too techie, but there are so many implications of like having a helper and action support be able to do things for you that I think it’s going to drastically change stuff. Like, even right now with Home Assistant. Well, Jarvis can, I think… But when I hooked up the LLM to Home Assistant, it couldn’t do any action through the LLM, because - I don’t know if [unintelligible 00:52:33.29] They were like “Yeah, this only works with Google Home” and something else. I’m like “Wait, you can’t turn off my lights on my thing because I told you to? But I have to go through Google Home to tell Google Home to do it?” Kind of silly. I’m sure they’re working it all out, but I don’t know. And I’m just thinking of just tiny actions here, commands. I’m sure the list goes on and on and on, of things that people can think of or do.

I don’t know. For me personally, I’m excited about that. Having a helper, to say “Yeah, go do these couple of things really quick and let me know when you’re done…” You probably can relate. I’m a person of one, you’re a person of one, you have a show and a company to run, and I have a channel to run, along with work on the side, and I can use all the help I can get. And so if I can rattle off a few commands to an AI that I trust to do those things for me, and do them right - a lot of stipulations there - that would be great. That would be great for me.

And so I’m really looking forward to some of this, because - yeah, as you know, I could use more help. And so far, having an LLM like ChatGPT for me personally, or even a local Ollama, has been super-helpful for a creator, just to have some ideas, just to have like an assistant that understands everything you say to it, that has, maybe or maybe not, the same perspective that you do, to be able to ask it questions and get feedback on exactly what you’re doing has just been game-changing for me personally.

I don’t even Google anymore half the time, because – I don’t know if I should go this far; I’m kind of rambling about AI, but I rarely Google anymore for things, because I don’t want to see advertisements. I don’t want to see Google’s Gemini slowly type in things. I don’t even want to click on the link that they suggest, even if it’s right, because then I’ll have to sift through that website to find what I want.

And so I’ve gotten a lot more efficient by using GPTs, because I can get, for the most part, a really good answer, really quick, and I don’t have to shift focus. I shift focus once to say ChatGPT, and I’m back to doing what I was doing. If I go to Google - say I went to Google, now I’m getting ads, now I’m clicking on this thing that wasn’t the right link… Oh, it took me to Reddit. What are all these people saying? Let me fish through these comments. No. You know what I mean?

I’m so with you.

There’s so many chances for it to steal my focus. And that’s marketing in general. That’s what they want. That is exactly what they want. And so I’m glad that ChatGPT is here to kind of disrupt that… Because I don’t want to be served up advertisements. I just want to focus on what I want to focus on, and if I need some help from an expert, that’s what they’re there for. That’s what I feel like.

[55:46] I think it is the next frontier. That’s why I bring it up here in this home labby Friends conversation, is because I feel like once you have – I know the GPUs are even hard to find. They’re hard to find, they’re expensive, I feel like it’s a racket… Like, there’s something happening there. I get it, everybody needs some GPU to do the next big thing they want to do, whether it’s personally or corporately, in the artificial intelligence world. They want to run an LLM. I get it. They’re expensive, they’re hard to find… You can find a 3090, or a [unintelligible 00:56:21.26] like you mentioned before, on eBay pretty easily. 3090s are in the $900 to $1,500 range for a decent used one. And if you know how to eBay, then you won’t get scammed, or you won’t buy the wrong one, or buy somebody’s junk, basically.

I’m thankful that eBay has gotten better at weeding out the poor sellers… And you still will have someone who doesn’t know how to eBay well buy from a poor seller. They don’t understand how to use the ranking system, or look at feedback to make sure they’re a proper seller… And even that is still hard to do; you can still - not get scammed, exactly, but just buy something that’s less than what they say it is. Thankfully, the eBay guarantee - this is not an ad for them, but I use eBay a lot for aftermarket products. I usually win auctions I involve myself in; I can tell you how I do that if you want to know… But you’ve got to learn how to look at a seller and evaluate them well. The eBay guarantee does say if they say it’s new, or it’s an open box, or if they misidentify what it is… Or let’s say they say it’s a version two, but it’s actually a version one - well, the eBay guarantee protects you as a buyer from that. And eBay has gotten so much better at enforcing this that for me it’s a fairly trustworthy place to buy things. Albeit, you can still buy the wrong thing from the wrong person, and maybe you have some challenges with getting your money back, or getting a replacement. But my adventures in eBay lately have been mostly positive. Mostly positive. And [unintelligible 00:57:56.07]

That’s my caveat to say like GPUs are hard to access. But I feel like the next frontier really is “Okay, I can self-host–” Even geeks like us can, for now, begin to flesh out this world. Running AI locally on a decent box, or a really beefy box, to me seems like the next frontier. Ollama seems to be the centerpiece enabling most, if not all of this, because it’s the vehicle. It’s the index online - ollama.com; you can go there and find the latest models, you’ve got downloads, you’ve got the parameters, you’ve got clipboards where you can copy and paste to your terminal, or inside of Open Web UI… They’re making it really easy to find models you can play with locally.

And the API that Open Web UI offers, or even Ollama offers - that whole API scenario, being able to tap into Home Assistant… Your video on that opened my eyes, because I was like “Wow, you can now voice-command Home Assistant where once before you could not.” And all you need is probably a really simple model that understands - it can be a very small parameter model. So maybe that Home Assistant box is a very small box. Maybe it’s all CPU even, because it’s such low-pressure AI. It’s just like “How many lights?”, “Turn on the kitchen.” Give me the smarts where the smarts were not there before, because the API is there.

Then for me, I’m thinking, as a person who wants to simplify my time - we quote somebody on working with us at the sponsorship level or partner level every single day. And it is a time-consuming task, because I haven’t found a way to automate it with the level of high touch I want to give everybody who works with us.

I can easily just copy-paste from something, but I feel like everybody needs a little bit of extra attention and detail… I would love it if I could train an AI, or just RAG it, essentially, and tell it “Okay, these are our proposals. This is our pricing scheme. This is how we do things”, and then just tell it what I need, and the format I want it back in, and it gives that back to me pretty much errorless. Pretty much. I’m still going to review it…
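The "just RAG it" idea boils down to: retrieve the most relevant internal documents, then stuff them into the prompt as context for a local model. Here's a deliberately toy sketch that ranks documents by word overlap - real setups use embeddings and a vector store, and the document names and contents below are made up for illustration:

```python
def retrieve(docs, query, k=2):
    """Toy retrieval: rank documents by how many query words they share.
    Real RAG pipelines use embedding similarity instead of word overlap."""
    words = set(query.lower().split())
    ranked = sorted(docs, key=lambda name: -len(words & set(docs[name].lower().split())))
    return ranked[:k]

def build_prompt(docs, query, k=2):
    """Assemble a context-grounded prompt from the top-k retrieved documents."""
    context = "\n\n".join(f"[{name}]\n{docs[name]}" for name in retrieve(docs, query, k))
    return f"Using only this context:\n{context}\n\nAnswer: {query}"

# Hypothetical internal docs: a pricing scheme and a proposal style guide
docs = {
    "pricing": "Sponsorship pricing scheme: partner level includes four episodes per quarter.",
    "style": "Proposals open with a personal note, then terms, then timeline.",
}
print(build_prompt(docs, "What does partner level include?", k=1))
```

The point is that the model answers from the supplied context (your proposals, your pricing) rather than from whatever it memorized in training - and when the prompt goes to a self-hosted Ollama endpoint, the contracts never leave your network.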

[01:00:09.15] Those are really good applications I can see myself doing, but ChatGPT is not trying to optimize for that for me. Maybe I can do it, maybe they’re trying to do it, but at the same time now I’ve got to give them all my data. And sure, we’ve been googling forever, and they’ve got all of our information anyways, and sure, they know exactly where you’re at in the world, according to Changelog News this past Monday… But I feel like this locally-run, privacy-focused scenario, with the things we’re asking AI these days, is sort of the next frontier. And I’m hopeful for you, because you’re steeped in this home lab stuff, so people are probably just pouring into your channel, thinking “How do I run this stuff?” Is that what you’re seeing?

Yeah. You know, it’s an odd topic on YouTube, I’ll say that. It’s an odd topic.

Yeah, odd. Yeah, yeah. Because it has the potential to go, I think, either way. Either people are burned out and don’t want anything to do with it – well, there are people who don’t want anything to do with it. There are people who are burned out about hearing it. And then there are people who might want to hear about it and want to do it. So it’s always – like every video, it’s a crapshoot talking about those things… But there are people, yeah. Absolutely. I mean, that video for me is performing pretty well, and people have asked, and there are other creators talking about this, too… Especially the DeepSeek thing really got people interested in self-hosting AI, because they wanted to figure out and play with this DeepSeek thing. And the easiest way to do that without sending all your data wherever it would be online was to host it yourself.

So oddly enough, I think that DeepSeek helped people realize that they can self-host LLMs, probably more so than Ollama’s ever done. You know what I mean? Because it did two things. One, it let people know that it’s a possibility, and it also let people know that “Hey, it’s private.” And those are two things that I don’t think a lot of people knew were possible. And nothing against Ollama. It’s fantastic, it’s great. But it’d be hard to market them, or that product or service, in the way that DeepSeek I think marketed it. All the DeepSeek hype really was marketing for other stuff too, which drove so many people getting interested in this.

But yeah, I think that’s what people are going to do. I honestly still feel like there’s an opportunity for a company to put a bow on it. And there’s going to be that opportunity, I think, for a long time - to do a whole product. But in the meantime, we can do this ourselves. I just hope that companies continue to allow us to plug in our own models.

And one of the things that I talked about in that video too was that – I kind of hinted at it, but I use Mac Whisper for my subtitles and stuff, to get them transcribed right - because I care about them - and I think a lot of people should, too; that’s my pitch for accessibility in general. You should care about them, because there are a lot of people who can’t hear, or are hard of hearing, or just don’t speak English, and they want to be able to read English.

So anyways, Mac Whisper is an app. It’s freemium, you can get it, you can download it, run it on your Mac… It does great with the small models. But in there, I saw – I paid for it. I paid for it myself, so I use it. And no relationship to them whatsoever, but I kind of talked about them in the video, where I said “Hey, it would be great if a lot of companies let you plug in your own endpoint, or plug in your own models.” And I kind of showed their screenshot of them only allowing ChatGPT… Well, sure enough, shortly after, now it allows Ollama. I probably had nothing to do with it, it was probably on the roadmap…

[01:04:14.16] But that’s what I want to see. Instead of application developers thinking “Hey, how can I hook into ChatGPT so everybody can now use ChatGPT?”, think about Ollama and other things, too. Think about how you can offer that option. And I think that’s a huge differentiator, too. If someone developing software right now says “Yeah, we can plug into Ollama”, that’s a differentiator. Every software company right now is saying “Yeah, we can hook into ChatGPT. Just put in your API key. Here you go.” But I also think that they should be including Ollama, for sure.

For sure. Because Ollama can be in the cloud, too. You can spin up an Azure server… I mean, I was even tempted – like, okay, if I want to play with an H100, or just some sort of GPU I can’t afford or have access to, I can go to Azure, I can spin up even a Windows box - which is kind of crazy. You can spin up an actual Windows machine in the Azure cloud, and play with it as if it’s a local desktop. You can VNC into it, or remote desktop into it… And you can install things on it, obviously; you can give it access to GPUs… You can do a lot of cool stuff. And you might spend a hundred bucks on that, because you’re maybe renting a really expensive GPU, but it’s better than the thousands you may not have, or want to spend on a GPU you can’t even get access to. Right?

Yeah, yeah. That’s for sure. Yeah.

But I feel like that’s kind of cool that you can do that. I feel like Ollama is – and maybe thanks to DeepSeek… We’ve been talking about Ollama for a while, but I’ve just never actually been curious enough to play with it. For whatever reason I just haven’t, let’s just say. But now I’m in this kick, like I mentioned, of – I’ve got two things resonating with me.

I really want to build some sort of AI home lab, and thus far I’m just using what I have… Because I think that’s where you should begin, right? If you want to know where to begin - what do you have? Play with that. Even if it can’t run anything super-powerful, begin where you are, basically. But now I’m curious enough to be like “Well, would it make sense for me to take my hard-earned dollars and invest in hardware?” One, for just curiosity. Two, maybe we can get a sponsor to pay for it. Three, maybe we can make some content from it. But four, just have this AI as a service with models I’m going to swap out as this becomes more and more popular, on my own network. You know, tie in Home Assistant, like you’ve done; maybe even side-train a model, where I can take a really small model and say “This is the way we propose things. And these are all of our contracts over the last two years”, and let it have that source of truth. And now it’s just super-smart with the way we do business.

I don’t want to give that kind of – sure, I could probably do that with ChatGPT… But man, wow, what an exposure point. Here’s all of our contracts over the last two or three years… Could you imagine that? No. I would feel much more comfortable doing it… And I think that’s the angst, is like everyone has to give up some version of privacy to play with artificial intelligence; or you have some version of plagiarism. Well, I’m sorry to tell everybody, like, AI is here to stay. There’s no putting that genie back in the bottle. And you can be against it if you want to be, and you can ignore it if you want to, but you will be behind. You will. Because young folks - and I don’t want to say just young folks, but people that are born into the world today, with the way technology is, they don’t know any different. You and I, Tim, we grew up probably in the dial-up days. You probably remember how AOL sounds, right?

Yeah. Goodbye… Yeah, I got that down.

Right? We’ve got that. They have no idea what that is.

[01:07:50.09] No, I agree. Yeah, I totally agree. And for the people who are against AI, I feel like there’s a stigma around those things you talked about. Plagiarism, it’s not your own thoughts, you’re not being unique, you’re not using your brain… You know, all of these things I hear people say every now and then. Your brain’s going to turn to mush… It’s things we’ve heard about TV for [unintelligible 01:08:11.13]

Truth. Or metal. Or hard rock.

That’s right. Yeah, exactly. But what I see is - and I think I said this last time, but… When I use a GPT, or a helper - ChatGPT, whatever, Open Web UI… When I use an LLM to ask it questions, I’m usually thinking of it like a rough draft. Anything that it spits out to me is a rough draft. It might be ideas, it might give me new ideas, it might give me new perspectives… But I’m never going to copy and paste that thing and put it in my email directly. I might copy and paste it and change some words, but that is always going to be a rough draft to me. So I feel like if people can understand that - it’s something that’s feeding me ideas, not doing the work for me - that’s the perspective I have. I kind of relate it to 3D printing. Not at home, but say you’re a machine shop, and you want to be able to produce something really quick, and you have some ideas, and you want to test something out… You 3D-print it. You look at it, you test it out, you think “Will this work?”, you make some adjustments… And then if you like it, that goes to production and you produce the real thing. Well, ChatGPT is kind of like that for me. “Help me out, give me some ideas… Maybe there are things I’m not even thinking about.” That’s my rough draft. I’m going to take that and probably use it in my final product, but it’s not the final product. And that’s the way I look at it.

And you can get so much done so quick, just by having the right answers, almost all the time. I tell my wife, because she’s just very early on – she’ll ask me questions about it. She’s only used it a handful of times. But I say to her, “Imagine if you had a friend on Slack that knew everything you were saying to it, that understood almost everything you could possibly say to it, that could help you out with anything you ask it, and is usually right, and will give you a pretty concise answer every time. Imagine if you had that friend in Slack that you could DM on the side.” That’s how I kind of treat ChatGPT, or any LLM… Because it’s just so refreshing to have – I don’t know, I work on really weird stuff. I work on home lab stuff. I work on things where the same question has probably been asked twice, ever. You know, “Will this GPU fit here? Because I only have this amount of clearance in my server rack.” I’m not going to find that answer on Reddit, and it’s going to take me a long time to go and get the dimensions, go and measure, do all this stuff. Where if I could ask an LLM that, and it knows, it’s “Oh, here are the dimensions. Here’s the height on this case. Yeah. Yeah. That’ll fit.” It’s fantastic. I’m not trying to talk it up, but…

You are. You should.

I know. It saved me so much time.

Don’t feel bad about it, man. I’m with you.

I know.

We’re simpatico.

Alright, man. Because it saved me so much time, so much time, from not going to Google, and not wasting my time on Reddit, and not going somewhere else and getting advertised to… It saved me a lot of time. I mean, even simple things. My wife and I started saying “Give me interesting pizza recipes.” Because we make pizza every week, and we have the same kind of pizzas… But we’re like “Let’s branch out.” “Give me interesting pizzas”, whatever. Obviously, it’s going to go to the web, it’s going to look… But it’s going to find all of them, and it spits them out. There’s one that’s Barbecue Cauliflower, and we’re like “Yeah, that’s kind of interesting.” We made pizza the other day and it was fantastic. And it’s kind of like “Hey, tell me more. Tell me more about this recipe.” And then they give you the whole entire recipe.

[01:12:10.04] I’m to the point now where I turn on voice. And I’ve gotten kind of bad with it a little bit lately – not bad, but I’ve been relying on it a lot, while I’m working on stuff. I can just be like “Hey–” A perfect example for me… We were talking about the UI of cameras earlier. I cannot find menu items in a camera. If you have a Sony camera, you can relate.

So many pages…

Where is the image format? Like, I have no idea. So I’ll have voice on sometimes and I’ll just be like “Hey, can you tell me how to get to this setting in the camera? Here’s the camera app.” “Sure. Go here, here and here.” “Awesome.” And then I’m like “If I change this setting, is this going to affect the frame rate?” “No, it’s not.” You know what I mean? So I will have a pretty in-depth conversation about one thing I’m working on with an LLM. And it’s so great. It’s so great for the things you don’t understand, or… You know, I’m a huge fan. I’m a huge fan. I talk to it, because it’s faster than typing.

Have you heard it be called a word calculator yet?

Word calculator?

No, not yet.

That’s what I call it. That’s what we call it around here.

It’s become the way I think about it, is it’s a word calculator.

Yeah. Yeah, it’s guessing the percentage of what – yeah, it’s trying to determine what to say based on percentage of what it knows.

Well, the same way you use a calculator, you’re trying to figure something out, right? But it calculates with… It’s a word – it uses words. It uses understanding, reasoning even, in the latest models. I think of it like that. It’s like “Okay, here is–” I’ll just paste in a bunch of stuff and be like “Tell me all the numbers in here and add them up.”

If I copied 15 lines from my bank statement, for example, online, and I’m like “I want to know what these add up to”, but the copy and paste on my Bank of America web UI is just terrible. And I’ll want to pull that into Sublime Text, and pull out all the – I’m not gonna waste my time with that. I’ll throw it into a GPT. There’s no information there that’s really shareable. I would much rather do it locally, given all the things we’ve just talked about… But like “Hey, you see all these numbers here after the dollar sign? Those are all figures I want you to add up, and de-dupe, and tell me the right answer.” It’s gonna do it. In a second.
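For anyone who’d rather keep that fully local, the same “add up and de-dupe the figures after the dollar sign” task is a few lines of Python. A minimal sketch — the pasted statement lines below are made-up sample data, not a real bank format:

```python
import re

# Hypothetical pasted bank-statement lines (made-up sample data)
pasted = """
01/02 COFFEE SHOP        $4.50
01/03 GROCERIES          $62.18
01/03 GROCERIES          $62.18
01/05 STREAMING SERVICE  $9.99
"""

# Grab every figure that follows a dollar sign
amounts = re.findall(r"\$([\d,]+\.\d{2})", pasted)

# De-dupe while preserving order, then total them up
unique = list(dict.fromkeys(amounts))
total = sum(float(a.replace(",", "")) for a in unique)

print(unique)           # ['4.50', '62.18', '9.99']
print(round(total, 2))  # 76.67
```

That said, the point stands: asking an LLM is faster when the paste is messy, since it doesn’t need the lines to follow any regular pattern.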

And that’s what I mean about word calculator. You could probably gush about AI for forever. Let’s not do that. Let’s talk about two more things… Especially if you have time. Do you have more time?

Dude, I have all day, man. I’m yours.

Okay, sweet. Let’s go deep then. Okay, so let’s close with – I think I would like to see some AI builds from you. I’d like to see low-tier, mid-tier, high-tier, AI homelab builds that might give people a gateway into this world.

That’d be cool. It might be hard. I think that’s really up your alley. That’s where I’ll leave that at. Let’s talk about the creator PC. And so you mentioned the dev workstation, which I watched your video on that. So you built a Linux workstation, the ultimate Linux workstation… 78,000 views. I think people are resonating with what you’re saying here. I mentioned I’m resonating with building a creator PC. I kind of just want it – I just want to build another machine. And I don’t have a need for another machine, but I’ve got this itch. I want to build another PC from scratch. I love it. It’s just so much fun, honestly. But I don’t want to put Windows on it. And I don’t think Windows – or sorry, I don’t think the year of Linux desktop is here for video editors, audio editors… It may be for developers. It’s here and it has been here forever. So my PC building, my creator PC building has been stifled until I can figure out how to put macOS on it. Not going to happen…

[01:16:10.28] Apple, if you’re listening – somebody at Apple, if you’re listening, I would love it if you can just make it so that if you love open source (I think you might) make it so that macOS can live on a PC that is built for Linux. Treat it like Linux. I want to build your hardware too, but gosh, I want to build my own hardware sometimes. And I don’t want to give up macOS.

No, you’ve got to pay for the dongle, which is the whole machine.

That’s right. Or the RAM. Did you see that speculation about the Mac Mini basically being free, the lowest tier basically being free? Because if you, I think, added a couple of things to it, it’s like double the price?

Yeah, yeah. [unintelligible 01:16:45.02]

That’s interesting. Yeah, so… creator PC. What would you do here? What are your thoughts on this?

Yeah, so creator PC… So to me, I think “Okay, it’s going to do video editing, video capturing. It might do Photoshop, edit – it might do raster, graphics, or whatever. Vector…” But it’s the same thing. It’s the same thing I think of for gaming PC, and it’s the same thing I think of for an AI machine. It’s all going to boil down to your video card. So your first choice, most people’s first choice is going to be CPU. And so you have two choices. I think most people right now are going to pick AMD. I picked Intel, and that was my choice in that video, because I wanted a couple of things. One, I wanted QuickSync, because QuickSync is awesome for transcoding… And when I say transcoding, I don’t mean just for Plex, or anything like that, but you’re able to take advantage of that on Windows, while you’re editing machines, or decoding/encoding. Like CPU, it can go either way. You can’t go wrong right now. But a lot of people are leaning towards AMD, because Intel’s kind of been, I don’t know, all over the place. But I did choose Intel.

The Core Ultra.

Yeah, the Core Ultra, I think, is great for everything but gaming. And let me say this, it’s still great for gaming, but if you’re going to build a gaming PC, you might as well go with AMD right now. But it’s great for multitasking. Low heat, low power… Intel kind of turned it around with the CPU, even though it’s doing poorly. It uses less power, less heat, whatever. But you’re going to choose your CPU. RAM - it’s going to be DDR5. Don’t choose 4. A lot of people want to choose 4, because it’s cheaper. Don’t. If you’re building anything new today, 5.

So that means you’ve got to find a motherboard that has the right socket and DDR5. Which - pretty easy to find those. I’m an Asus fan. I’m a big fan of Asus motherboards, so I usually choose from Asus. I’m like “What do they have that I can buy?” It’s not “Let me look at all motherboards.” It’s similar to – you know, when I’m brand-loyal to stuff, that’s what I do. Like Samsung flash drives. That’s all I’ll ever buy. So I’m like “What does Samsung have right now?” Same with TVs. I’m brand loyal to Samsung. What TVs does Samsung have? It’s not just like “What’s on sale?” You know. So anyways, I’m the same way with motherboards. It’s always going to be for me Asus. It doesn’t mean I’ll never buy something else, but that’s what I prefer.

Typically, on that board, I will look for a chipset, possibly that’s Intel, that has an Intel NIC on it, mainly because those play with Linux a lot better than Realtek chipsets. So if you are thinking about it - yeah, definitely look for one with an Intel NIC built into the board. That’s the way I look at it, because it does better with Linux.

[01:19:46.12] And then you’re going to want 2.5 gig networking on there. Honestly, if you’re building a creator machine, you’re going to want 10 gig. 2.5 is borderline, pretty good. You could probably edit 4k with maybe a couple of stutters… But you want to look for 10-gig. And if you can’t get it on board, you could do aftermarket parts and add it there. So that’s a lot of requirements up front.

I know that Asus has their Media Art, or Creator Art PC – ProArt. ProArt. I knew art was in the word. But they have those, that are dedicated, they say, to creators. But really, what it means is a lot of space for fast storage, fast network cards, and we support the latest RAM and CPUs. And so honestly, I would say – I know you’re against Windows. I run everything.

I’m not against it, I’m just sad that that’s the only option.

I know, I know.

So I played with it. Before I did my AI home lab on my same NUC - because it’s the only extra machine I have to like play with something on bare metal. So I installed Windows 11, and thankfully, they let you install it and play with it for free. They don’t make you have to have a license. Now, you can’t change the desktop and do – there’s limitations to what you can do, and I think they nag you a little bit… But then I was like – I mean, there’s a lot of extra software on there… It’s not the worst ever. I just think “Gosh, if I’m going to build a creator PC, I’ve already got so many efficiencies and workflows in the macOS world, software I’ve bought, things that I just can’t live without…” Raycast I know is something I use on Mac. I think they have a plan for a Windows client, but they’re not there yet, to my knowledge. Maybe they are, I don’t know for sure. But I’m thinking “What do I have to give up to move to this different world?” Okay, well, Adobe Creative Suite - at least my license there doesn’t limit me to platform. I can use Windows or Mac. So that’s cool. But then - man, everything else is just like these weird hoops.

And then Windows seems to be, I think, user hostile. Absolutely user hostile. To go and have fun and build and spend the money on a really awesome creator PC and have to install Windows to be a creator on that same machine, and have a user hostile operating system… Not saying macOS is that much better, but it is. It’s that much better hostility-wise. It’s at least not doing all sorts of crazy stuff like Windows does, and like AI-ing everything. I think Apple Intelligence might be the next frontier for them, and hopefully they don’t push that button too hard. They’re sort of backpedaling a little bit…

It’s already here. It’s on my Mac Studio.

I mean, you could tell me what the experience is like. I don’t have that. But I’m hoping that they don’t Apple Intelligence me too much. I feel like Windows is just user-hostile, really.

Yeah. I mean, they have a long history of like introducing features that people don’t want, they say they don’t want, and then they prove they don’t want, because then they get taken away… And then at the same time, taking away features that people still want, because they think that’s what’s best for the user, and it ends up not being what’s best for the user, and so they backpedal. Start menu, classic example.

So yeah, it’s… I don’t know. Personally, I use Mac, Windows, Linux… I have Windows here, too. I actually enjoy Windows. After it’s installed, you strip out the stuff, you make it exactly the way you want, and then you get WSL running on it, so Windows Subsystem for Linux… I feel like at that point it’s almost everything that I need, outside of iMessenger, or Messenger. Outside of there, like being able to text people while I’m on my machine, it does everything. Like, the WSL side lets me run Linux, in a terminal, that lets me do – basically, I’m in a terminal for Linux, and I can do everything I want to do. I don’t even have to think about PowerShell, or anything like that. I can be a developer and run developer tools.

But then if I want to launch Adobe, whatever, Creative Suite - boom, I’m there. I’m editing. It’s working great. Windows drivers with NVIDIA or even QuickSync or whatever works fantastic, because they’re getting drivers and iterations so much faster, because of just the volume of people. Then if I want to launch a game, it’s right there. I launch a game, boom, I’m in the game.

[01:24:02.16] The other thing is compatibility with hardware. Like, you cannot match Windows compatibility with hardware. It works with anything you can plug into it. And most things are built to work with Windows. And again, this is brought to you by a Mac right now. And I’ve had to jump through a lot of hoops because of Apple, because dongles, because no PCI Express, because whatever the reason. So the things that you think are getting taken away from you, imagine what people lose when they go to a Mac.

So I would flip that around and say, you lose so much freedom as a tinkerer, as a builder, as a custom rig builder, as a gamer, as whatever, going to Mac. You lose so much. You lose a lot.

Oh, yeah. Oh yeah. I mean, I can’t even plug in a PCI Express card. Mac doesn’t allow that. They don’t allow any video card. You’ve got to use theirs. And so what I’ve had to do is get this USB-C dongle that powers this thing that allows me to plug in PCI Express cards, so I can capture video and you can see me right now. That’s what I do. I mean, that’s the lengths people have to go to get things to work that just work in Windows. You get them on the board, you get PCI Express, you plug the card in - boom, it works. Drivers are already there.

So I’ve gone both ways. I use both, and I flip flop on both all the time. It was honestly harder for me to go to Mac than to go to Windows… Obviously, probably because I ran it for 20 years. But making that switch I realized Mac is great, stable, fantastic… I never have to worry about it waking up or it staying on. Like, it just works. Apps are so stable all the time, with the exception of editing every now and then. And I don’t know, I kind of have a beef with Apple’s biggest release – their software release with everything. I feel like every single thing I use from Apple right now has a bug and we just need to get past this point. But going to Mac, I kind of sometimes miss Windows, if that’s a thing. For me, it is. Sometimes I do miss Windows. To be able to just, I don’t know, do whatever I want, because Windows works with everything.

So free. Yeah.

Well, you’re encouraging me to bite the bullet, as they might say. So I’m running a – to give some context to why I’m in this struggle, is one, I like to build machines. And two, I’m running an M1 Max machine, that literally is maxed out.

64 gigs is the M1 Max. It’s the initial M series MacBook Pro. I’ve got four terabytes of onboard storage, just because we wanted to max things out when we purchased these. I think back in 2021. I don’t even know. So I’ve obviously been a Mac user for a while… And I feel like I want the freedom to build my own machine, I want the freedom to choose my own video card, I want the freedom to choose AMD versus Intel, I want the freedom to choose DDR5 and not spend $10,000 on Apple RAM… I’m being facetious there, but it’s expensive. RAM is expensive, storage is expensive… And so you can go to Samsung and get one of the 900 series… The NVMEs, what are those?

The 980s, yeah.

The 980s, yeah.

[unintelligible 01:27:39.00]

The Samsung 980, M.2, NVMe… Super-fast. You can get those so much cheaper than you would to even try and double your storage on a Mac build. It’s very expensive.

[01:27:53.00] Yeah, you could put 10 of them in a machine if you wanted to; if you have enough lanes.

Precisely. So you pay this Apple tax. And the tax is to some degree simplicity, in the fact that it just works. It’s a pretty stable system. I really haven’t had a lot of issues. But you’ve got to pay that dollar tax. And then, obviously, my family is an iMessage family, and so I’ve got to have that somewhere. I can’t just have it on my phone… I’ve gotta have it on my desktop, too. Who wants to text only on their phone? Forget that. That’s a terrible world.

Yeah. I lived that world for a while, and that’s why I was saying, if I ever left Mac and went 100% back to Windows, that’s the only thing I’d miss, to be honest. That was the only thing that I’d miss. I mean, notes too, now that I’m in the ecosystem. I’m like “Yeah, notes are super-easy. They’re right here. I’ll type a note in here.”

Photos, man. I reference photos on my desktop frequently.

The syncing between Photos app on my phone to the desktop is just, you know…

And then there’s the other tax there. You get your iCloud tax. And I don’t know about you, I like to back up my photos, but then I also like the Apple Cloud for those photos too, because I share a lot of photos with my wife. We’ve got kids and we just have history. We love to look back at photos that are 5, 10 years old, frequently. Because as a dad, one of the things I’ll do with my kids is – we’ll do story time at night, but we’ll also look through some photos of things we’ve done a year ago or two years ago, or have a memory with somebody. For kids, that’s grounding. That’s their identity. “Who are we? Why do you love me, dad?” It’s a reminder. I know they know I love them, but it’s a reminder of the fun things we did. Just because we didn’t get to do something fun this week doesn’t mean we haven’t done it before. And it’s just this remembrance of where they’ve been, loved ones that are not here anymore, or someone we don’t get to see too frequently. Just reminding them how much they matter to us. And that’s how we use photos. Not just, obviously, for B-roll, like you’re doing. It’s more than that, you know? We live in it. It’s a life thing for us. So maybe I just need to be a Windows and Mac family. Maybe it’s like “Why one or the other? Why not both?”

That’s what I say. Why not both? Exactly that. Because I know a lot of people who use a Mac all day, all day. Me, personally, I used to be Windows at home, Mac at work. No, sorry. Other way around. Mac at home, Windows at work.

That’s how it was for a long time as a developer. Every enterprise had Windows. “Don’t bring those Macs in here…” You know? And then it flip-flopped. Then it got to be “Well, now I’m a developer, they just hand me a Mac.” So I’m using a Mac at work. Well, I’m going to use Windows at home, because I game, you know? So it can be like that…

If you think of the computer you’re using like a tool, and kind of what it is… It’s hard to think of it like a tool, because it’s so versatile. But if you say “Here’s my editing machine, and here’s my everything else machine”, then it kind of might make sense. But you’re still going to have a MacBook or a laptop. And so you’ll have your laptop, plus a workstation. It is hard going that way, but I honestly think it’s hard going the other way, because you give up so much. Yeah, you give up so much to move to Apple.

Break: [01:31:19.10]

Humor me. Let’s build a creator PC. I know you just shared your video, and I know you’re running Linux on it, you’re not running Windows on it. But that’s okay because the build itself, the hardware itself is probably very similar. So let me tell you what I would like to build, and let’s compare it to what choices you’ve made, and why. And this is not an exhaustive of all the components. It’s the core things. It’s the case, it’s the motherboard, it’s the CPU, it’s the cooler. Things like that. You can go into RAM. I think that’s pretty… You may have opinions about which brand maybe is cheapest, whatever. Obviously, you’re a Samsung lover on the NVMe storage section… But the case I would like to use is the ProArt PA602.

Yeah, ProArt. That’s what I was saying.

The motherboard… ProArt again. ProArt Z790. CPU so far is the Intel 14900K.

14th gen. You’re going with last gen.

Okay, so that I didn’t even know. Is that last gen? I’m not like you, Tim. I’m not on the – that’s why you’re here, man. Keep me on the edge.

No, it’s totally fine. But yeah, 14th gen –

Is Ultra the new hotness then?

It is. It is. It’s their latest.

Okay. I wasn’t sure what Ultra was. It was new. I was like, “Is that for – who is that for?” Okay, so Ultra is the new hotness.

But it’s not really new, because they launched it on laptops a while ago, but now they launched it on desktops… And it’s super-confusing, because you’ll see a laptop, for reference, with Core Ultra, but then desktop processors now are out that are Core Ultra. So I don’t know. There’s a lot of cross-checking before you buy stuff, because you don’t want to buy the wrong thing… So that’s one thing to look into. But yeah, 14th gen - they’re great. Run hot, run power hungry… Then there was all those problems with them, but that’s all fixed in microcode now, and firmware updates.

Okay. So I’ll [unintelligible 01:35:01.09] them on the CPU. I’ll take some advice. Maybe the 14900K is the old hotness, and I need the new hotness. Who knows…? Cooler - I was advised on the Arctic Liquid Freezer III 420. 420 millimeter, you know… All that good stuff. That’s the biggest thing you can put into this. I think it fits in the ProArt case as well. Thermaltake Toughpower, 1200 watts, 80 PLUS Platinum… And then the GPU… I mean, aside from the conversation we had earlier about AI home lab stuff, I think a GPU is just hard to find. And so I’ll pay through the nose for this if I can get this one, but a 4090. I can’t buy a 5090, because one, it’s not available, and two, it’s probably just like five, seven, ten thousand dollars, because of scalpers, or whatever.

[01:35:52.22] So the jury is out on the 5090, but I do want the 90 series of the 30 or the 40… And so I was thinking the MSI Gaming GeForce RTX 4090. I could also go with the ASUS TUF version of that, the RTX 4090 there, or the 3090. I think the 4090 and 3090 kind of compare pretty well. So if it’s availability and price, maybe the 3090 TUF edition from ASUS, or this MSI GeForce RTX 4090, if I can find it. But that’s gonna be pricey. That’s gonna be like 2,000, 2,500 bucks.

So that’s what I’m saying - if I have to spend this much money to build, for fun, this machine… I’ve gotta put Windows on it? Okay, fine… You’ve made me think that maybe there’s a world I can live in that has both Mac and Windows in my life. That’s the rough of what I’d like to build.

I wouldn’t mind AMD. I do have an AMD AI home lab, but your video on your Linux workstation made me think maybe my AI home lab should be this creator PC workstation. Maybe I can blend the worlds. I don’t know. What do you think?

Oh, yeah. It absolutely could. Because you could use that GPU for Ollama or for anything else while you’re not using it, or even while you’re using it. Because it’s just going to use the CUDA cores if you’re using NVIDIA. And so it’s going to use CUDA cores and VRAM. But if you’re typing a document, you don’t need that. You don’t need any of that, you know?

So yeah, you could do both. You could have it run both. And then you could keep the AI local. And I think there’s desktop applications that you could just install it and do it all local, local; not even on a server in your home, but on the machine you’re using. So yeah, that would totally work. It would totally work.

3090 - yeah. I have a 3090. I got pretty lucky, because right before the pandemic, or right as it started, that launched, and everybody wanted the 3070 and 3080, I think. And I wanted the 3080, but it just so happened Best Buy had one of those in stock, the 3090. And I thought, “Oh my gosh, I’m going to spend like a thousand dollars on a GPU?” It turned out to be the best purchase ever because you couldn’t find them after that. And I used it all through the pandemic, for all of my videos and everything. It was great. But I still think they’re solid.

I like the founder’s edition. If you’re going to buy something, I feel like the founder’s edition directly from Nvidia, just the design and everything, is fantastic. I understand why people don’t choose that, but that’s just me; my personal preference.

Yeah. This world of GPUs is hard to understand as an outsider coming in. Founder’s edition. OC. TUF Gaming. And I know these are subbrands of certain brands… Then you’ve got MSI, and you’ve got - EVGA, I believe, is another prominent brand that you can get. Now, I think that might be availability maybe pushing that brand, because MSI or somebody else might not be available… I don’t know. From the outsider coming in, someone who’s never been a gamer - I’ve never been a gamer. I’ve never built a PC to game on… So building even a machine to utilize a dedicated GPU is foreign to me. I’ve never done it. So even selecting which GPU, or having this history… Thankfully, we’ve got people like you out there, and my other good friend - I don’t know his name; I can’t remember his name. Tech Notice. Great dude.

He’s always on it with like the latest… And he’s strictly creator PC guy. Like, he’s not gamer PC guy. Now, he will talk about how it may influence or not influence if you’re a gamer too, but he’s primarily giving advice, generally on the tip of technology… And he’s got lots of videos out there. So he’s covered most of everything, really, from RAM, to storage, to whatever. And it’s all from a creator PC or a creator lens, not a gamer lens. Not that it’s a bad thing, but a lot of people are trying to build their best possible rig for creating, because the Mac has limitations, or they really want to push the boundaries, or they need more than one GPU for whatever reason.

[01:40:03.00] For me, if I really had unlimited funds, I would love to build an AMD Ryzen Threadripper Pro machine, that has a workstation-level motherboard, tons of PCI lanes, and I would love to have multiple, if not four… I think once you get to five it’s kind of hard, especially on like power and cooling… But at least two, maybe four GPUs. Like, if this AI theory – I know you’re laughing… If this AI theory plays out, I feel like, wow, I can build this rig… It might be expensive, but long-term it might play out, because I think AI will become the centerpiece of home labbers here soon enough. If models become, as we’ve talked about before - if they remain or become more open source, or available on Ollama, and Ollama becomes a first-class citizen when it comes to integrations for platforms… I think if it’s like I integrate with ChatGPT and Ollama - if that becomes a real thing… Well then, that kind of machine will pay us dividends over time, as AI becomes more and more advanced, and as we allow it to inject itself more and more into this private world we have, into our home lab world.

I probably wouldn’t build an AMD Threadripper Pro machine for my own personal creation, like creator level. I think the Intel Core Ultra, or the Intel 14900K would be just great in that world… But AMD has some compelling things about it, it seems. Threadripper… That’s a cool name. Right? Threadripper’s cool.

Yeah, ever since I heard it the first time, I’m like “Heck yeah, I want to rip some threads, man.”

“I want that.” Yeah.

Who doesn’t want to rip threads? So yeah, you touched on a good point, and I’m glad you mentioned that, because it’s totally an oversight on my part, something that I always think about when building a machine… It’s PCI Express lanes. So if you’re going for a creator machine - yeah. There is a huge market, I think, for workstation level machines. And Threadripper is what we have from AMD… I don’t know what Intel did. They had one, now they don’t have one, and who knows what’s going on… But workstation level machines, I think, should be a focus for, I think, both platforms… Mainly because you’re limited in PCI Express lanes. I think on Intel now you get 24, so they caught up with AMD… But what does that mean in reality? It means you put a video card and you get two NVMe drives. That’s not enough for most people, especially creators. And it used to be 20. So you’d put a video card in there - that’s 16 lanes. Then you’d put an NVMe drive in there, that’s four lanes. And you’re maxed out. As soon as you build a PC, you’re maxed out.

So that’s something to keep in mind, too. Any desktop class processor you go with AMD or Intel, you’re going to be limited to 24 lanes. I think they’re both 24 now. And so that means a video card and two NVMe drives.

You could bifurcate those, though. You can drop it down to eight lanes. Less bandwidth, I think. It’s not speed, it’s bandwidth. Ain’t that what it is?

Well, bifurcation - yeah. You can do… And it’s really just dividing up the lanes. So yeah, you could maybe turn that 16 into two eights. Or maybe even go down to, four fours, to get to 16. Depending on the cards, depending on the motherboard… There’s a big if in there. But at the end of the day, you’re still getting a total number of 24. And so your video card is going to take 16. And NVMe drive, one of them, your OS probably, is going to take four of that. So then you’re left over with another four for maybe your media. And then it means whatever else you want to plug in there - yeah, it’s going to be shared, or who knows, maybe [unintelligible 01:43:54.26] It’ll most likely be shared.
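The lane budget being described can be sketched as simple arithmetic — a toy model assuming a desktop-class CPU with 24 usable lanes, a GPU at x16, and x4 per NVMe drive (illustrative numbers; real boards also route extra devices through the chipset):

```python
# Toy PCIe lane budget for a desktop-class CPU (illustrative numbers)
CPU_LANES = 24

devices = {
    "GPU (x16)": 16,
    "NVMe OS drive (x4)": 4,
    "NVMe media drive (x4)": 4,
}

used = sum(devices.values())
print(used, "of", CPU_LANES, "lanes used")  # 24 of 24 lanes used
print("lanes left:", CPU_LANES - used)      # lanes left: 0

# Bifurcation divides a slot's lanes rather than adding more:
# an x16 slot can often run as two x8 links, or four x4 links,
# but the total lane count never grows.
x16_split = [8, 8]
assert sum(x16_split) == 16
```

Which is exactly the squeeze: one GPU and two NVMe drives and the budget is spent, and bifurcation only re-slices the same 24 lanes.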

[01:44:02.12] So yeah, I don’t know… I feel like gaming influenced this whole thing to why we don’t have PCI Express lanes. I kind of feel like it. I don’t know, this is my theory, is that manufacturers saw “Hey, most people just want to put a video card in and one NVMe drive and call it a day.” And so I think motherboard manufacturers started seeing that and they’re like “Okay, well, we’re going to chop our motherboards down and make them smaller. We’re just going to give you two slots.” And then case manufacturers saw that and they’re like “Okay, we’re going to make our cases smaller.” And then CPUs, they were trying to see how many lanes the minimum they could get away with, and they’re like “20 sounds good.” I don’t know…

I feel like they optimized for the wrong thing. They didn’t optimize for me. And maybe I’m the outlier. Maybe we’re the outliers. Like, we want PCI Express lanes. We want to add in cards. And…

That’s the whole point, right?

I mean, yeah.

I want a NIC… Maybe my motherboard comes with a decent one. That’s okay. But if it doesn’t…

Yeah, you’re going to throw in 10 gigs…

If they’re not pushing 10 gig or more, I want to put a card in. Maybe I want to do an HBA, because I want to have – maybe I’m building a NAS. Maybe I also want a GPU in there too, and I want room for it; not just ability to put it in there for the lanes, but like, give me some room.

I think you’re going to want to do some things like that, and we’re not the outliers here. I don’t know. Everything that is GPU-related, even motherboards lately – until ProArt and things like this started to push creators, not gamers – almost everything is gamer-focused… And I know you love RGB, I know you do… But you’re also a gamer too, so you’ve got that in your blood. I love games, but I’ve never been a PC gamer. I like to play Nintendo Switch with my kids. We love Mario Party. It’s all the rage in our house. We love that. And Mario Kart, of course, too. But gamers really influenced, I think, PC builds, because everything’s gamer edition.

And it’s all gaming influence; it’s all pushing what can happen in gaming. But I think you need that, though. You need some sort of killer application, or killer thing to happen… I think what happened with Ollama is DeepSeek was the killer app for Ollama. It was “Let’s make Ollama more useful.” Well, now we have a model that’s comparable to others.

But in the PC world, gamers kind of pushed that world for a while. Like, you don’t have anybody who has to have access to cloud docs and stuff like that needing a GPU. That workstation is not your everyday person. Maybe you have somebody who’s got spreadsheet-itis, and they’ve got spreadsheets out the wazoo, and they need a better CPU for that… But that’s the limit. They’re not pushing GPU stuff. So all of that PCI Express lane innovation and GPU innovation was happening because gamers were pushing the innovation, really.

Yeah, yeah, absolutely. Yeah. To get the fastest machine, to get the best frame rates, to get the lowest latency. Yeah, it pushed a lot of things forward. But I also feel like, man, that kind of pigeonholed what a machine is capable of, and everyone optimized probably for cost, to get fewer lanes, smaller motherboards. It’s been happening over time for a long time. But yeah, it’s crazy to think - like, everything depends on the GPU now, too. No matter what you want to do. I mean, when you think of AI, when you think of Bitcoin, when you think of video rendering, creation… Everything depends on the video card. And everyone’s competing for them, at least with the 5080, I think. The 5080, what NVIDIA did was basically say “No, this one’s specifically for gaming.” They basically gave it gaming performance, like the previous generation, for the cost - I think less than the previous generation - but without the CUDA cores for the ML and AI.

[01:48:03.27] So I think – I mean, I don’t know if it’s the right thing to do, but it was a smart thing to do, for NVIDIA to say “Nah, this one’s for gaming. If you’re going to do AI, this is not the one to get.” I think that that kind of segmented their audience. I wish they would go wider. I wish they would go wider. Say “Hey, here’s the creator edition. Here’s the AI edition”, or Bitcoin edition, whatever. Call it whatever you want. “And here’s the gamer edition.” And then optimize for those things.

Honestly, it’s hard to discern between media creator and gamer, because really, you just need GPU - you need that encoder. You don’t need the 3D aspect, but you need that encoder, and you need video RAM. So you don’t need any – well, the funny thing is you still need some AI capabilities, even for creation today. If you think of Photoshop… when you say remove, or blur, or anything like that - they’ve been doing it for a long time. So that’s offloaded to your GPU if you have one: Gaussian blurs, or blurs, or even replace, or auto-select this person, or cut this person out of the photo. So those are still needed, even outside of video, if you’re just doing static art… But yeah, I don’t know. I wish they would – I don’t know. I don’t know what they should do.

“I don’t know. I don’t know.”

Yeah. I don’t know [unintelligible 01:49:29.10]

But I think my pushback would be - or my response on that would be - I think, especially with the GPU scenario, literally at the creation of the hardware, there’s a limitation. From what I understand - I can’t recall the company’s name, but there’s one particular company that can do the fabrication of the… It’s not a CPU, it’s like the chip on the GPU card. And forgive my lack of familiarity, because I really haven’t played much with GPUs, honestly. But from my understanding, there’s a limitation on that, because there’s such a demand, and there’s only one tried and true company who can do this well, and there’s a bottleneck.

And so maybe the lack of SKUs, which is kind of what you’re hinting at, is like you want a gamer edition, you want an AI edition, a creator edition. The lack of SKUs, especially now, is the pressure on the ability to crank these things out. That might be it.

I would think maybe you just need levels. Like, they have like the 70, the 80 and the 90. It’s like “Well, the 70 is for budgetary, less needs. And the 90’s got AI, it’s got gaming, it’s got creator in it.” I kind of feel like you can blend those worlds. It’s like almost tiers of type. I want to do gaming, I want to do AI and I want to do creation stuff. So the 90 may be better for you. And the 90 guarantees you’ll have 24 gigs of VRAM, or more, and maybe you can add more to it… And the 80s sort of put you in this VRAM scenario with certain technologies in it… And the 70 is budgetary and more limiting, but still quite capable for dedicated GPU scenarios. I don’t know… As an outsider who’s just learning, that’s maybe how I’d skew it. That’s probably how I would think about it, personally.

Yeah. That’s a good way to put it. It’s just odd now though, because this is the first time I think that I’ve seen where they’re like “Yeah, the 80 is specifically for gaming, because we’re taking out a lot of the AI and ML stuff”, which I think was probably a good move on their part, to say “Nope. Gamers, you’re going to get this card. This is the one to get.” But yeah, it’s interesting to see. Yeah, it is, for sure.

[01:51:37.13] And honestly, on media creation, you could get by; you could get by with a lot less than you think. A lot less than you think. And I will say, just doing media creation a lot, a lot of the time you don’t even need to transcode anything or convert anything until the end. You just have to have good encoders and a fast network to be able to process that video. I edit in 4k. I used to – when I had a worse machine, I would create proxies. It’s kind of an insider term for footage transcoded down to a lower resolution, so I can edit at a lower resolution, because my machine’s not that strong. But nowadays it’s like, you can get by with surprisingly little for content creation. There’s some people who even just use the iGPU, and that’s enough to be able to do the encoding and decoding as they edit, you know?
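The proxy workflow described here - transcoding 4K footage down so a weaker machine can edit smoothly - boils down to one ffmpeg pass per clip. A minimal sketch; the filenames, the 1080p target, and the x264 settings are illustrative assumptions, not Tim’s actual preset.

```python
# Build an ffmpeg command that turns a 4K source clip into a
# low-resolution editing proxy. Paths and encoder settings are
# illustrative, not any particular editor's preset.
from pathlib import Path

def proxy_command(source: str, proxy_dir: str = "proxies") -> list[str]:
    out = Path(proxy_dir) / (Path(source).stem + "_proxy.mp4")
    return [
        "ffmpeg", "-i", source,
        "-vf", "scale=-2:1080",       # downscale to 1080p, keep aspect ratio
        "-c:v", "libx264", "-preset", "fast", "-crf", "23",
        "-c:a", "copy",               # pass audio through untouched
        str(out),
    ]

cmd = proxy_command("A001_4k_clip.mov")
print(" ".join(cmd))
```

The editor then cuts against the lightweight proxies and relinks to the 4K originals only for the final export, which is why the full-resolution transcode can wait until the end.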

Yeah. Well, I’m gonna link out to your “Building my ultimate Linux workstation” video, but because I said “Let’s create a creator PC”, give me your spec list. Give me the motherboard, the CPU… Give me a rough build list for this machine you’ve built.

Oh, the one I built in there?

Okay. Oh man, I’d have to kind of look at my notes…

You don’t know from heart?

Oh, man…

Come on, Tim…

Dude, that was like three, four weeks ago, five weeks ago…

Oh, that was like yesterday, basically.

[laughs] I’m like on to five more things since then. And I’m terrible at specs, but I can tell you probably off the top of my head. I know it’s an Intel Core Ultra. I think it’s a 7 or a 5. It’s a Core Ultra 5. Not the latest. And that was really just for budgetary constraints, because I didn’t see much use in going with the top tier… Which is totally against what I’ve done my whole entire life. I’ve always been a “buy an i7, i9, the max one, figure it out later” person. Then I went with an Asus motherboard. I couldn’t tell you the model name, but –

The Asus Prime Z890-P WIFI.

Sorry. I got your video up. I got your back.

Right. I’ll pull it up, too. Corsair RAM. I know I went with Corsair RAM. DDR5. I think it was – I don’t remember the speed. 7,000, maybe? Now I’m making up numbers. We’ll use your videos as a backlog. I mean, we don’t have to be perfectly accurate. I was just curious, can we build a creator – like, I shared my spec with you only because I had it listed and I’ve been dreaming about it and thinking about it, and deliberating and hemming and hawing, as people know that I do whenever I think about change or something new… I write it down, and I marinate for a while on a lot of choices that I make in my life. And building a multi-thousand dollar machine is not easy from a dollars standpoint. So like “I’m going to think about this thing for a while. I’m going to survey my favorite creators…”, you’re one of them, “and see what their choices are, and compare, and contrast.” And the only change I’m making personally is this Core Ultra consideration… But maybe AMD. So I thought maybe you can rattle off your dream list, so to speak, for your workstation.

Yeah. I kind of built it with that. I’d probably bump it up – if it’s my dream list… I mean, if it’s my dream list, it’s a CPU that doesn’t even exist, that I have a workstation level processor. But for my Linux workstation - yeah, it was a Core Ultra 5, which I think is great. It’s great for multitasking, it’s great for coding, it’s great for compiling… It’s great for the things I’m going to do as a developer. Is it the best for gaming? No. I think we talked about that earlier. But it can still do it great. It’s just not the leader in that space anymore, like they used to be. But great for multitasking.

You know, it’s DDR5. The fastest DDR5 I can find. Motherboard to me, again, doesn’t really matter. I generally don’t want WiFi and Bluetooth on it, but it comes with every single one. I need four slots for DDR5. And it supports up to 192 gigabytes, which is such a weird number… And at the same time now RAM kits come in weird numbers to get to that 192. Weird, weird times we’re in.
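The “weird” 192 GB ceiling falls out of DDR5 modules built on 24 Gbit dies, which yield non-power-of-two 24 GB and 48 GB DIMMs - so four of the big sticks land exactly on 192. A quick sanity check:

```python
# DDR5 kits now ship in non-power-of-two sizes (24 GB and 48 GB
# DIMMs, built from 24 Gbit DRAM dies), so a four-slot board maxes
# out at an odd-looking 192 GB rather than 128 or 256.
dimm_gb = 48
slots = 4
print(slots * dimm_gb)  # 192 -- the board's stated maximum in GB
```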

[01:55:52.10] Yeah, and I already talked about this, but it’s super-fast NVMe drives for me. That’s Samsung 980s. It’s the Pros. And then one’s going to be for OS and one’s going to be for everything else. And then I want 10-gig networking, because I have a 10-gig network backbone… Even though I don’t even need it. Honestly, if this is my dev workstation, I don’t need it at all. I’ll stick with the 2.5-gigabit that comes on it, and that’ll be fine, because I’m rarely going to transfer things to and from this machine.

Yeah, you’ll never saturate that.

No. It’s for writing code, man. I mean, maybe models when I download stuff… But no, not even that, because my home network is Gigabit. Sorry, my ISP is Gigabit. I won’t put any spinning hard drives in anything I ever buy anymore, except for my NAS. So that’s off the table. And even NAS, I’m kind of like still questioning it, like “Why are we still using spinning drives?”
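The point that the gigabit ISP, not the LAN, bounds those model downloads is easy to put numbers on. A rough sketch using raw line rates only (protocol overhead ignored); the 40 GB model size is an assumed example, not a specific download from the conversation.

```python
# Time to move a file at various link speeds, ignoring protocol
# overhead. File sizes in gigabytes, link rates in gigabits/second.
def transfer_seconds(size_gb: float, link_gbps: float) -> float:
    return size_gb * 8 / link_gbps  # GB -> gigabits, then divide by rate

model_gb = 40  # illustrative large local-model download
for label, gbps in [("1 GbE / gigabit ISP", 1.0),
                    ("2.5 GbE", 2.5),
                    ("10 GbE", 10.0)]:
    print(f"{label}: {transfer_seconds(model_gb, gbps):.0f} s")
```

At these rates the 40 GB pull takes about 320 s over gigabit versus 32 s over 10 GbE - but since the file arrives through the gigabit ISP link either way, the faster LAN never gets exercised, which is the argument for sticking with onboard 2.5 GbE.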

Because they’re big.

I know…

Because they’re big and less expensive than something else that’s big and really expensive.

Yeah, I don’t know… I want to kind of feel like “Is this people –”

Is this real?

[unintelligible 01:56:59.24]

They’re holding us back on purpose, do you think? It’s a conspiracy?

Big storage. I don’t know.

Big storage…

Big storage is out to get us… [laughter] I don’t know.

I mean, do we need to stay on spinning? I get it that there’s more capacity, but could we make that more capacity on SSDs? Is it possible? Yes, it’s physically possible. Is it cost-effective? I don’t know. Maybe if we did it more. I don’t know. But anyway, spinning drives… Well, I heard from someone that spinning drives will never go away, because they’ll always be more dense, and they’ll always have more capacity. But I feel like that doesn’t always have to be true… But I don’t know. Maybe that’s me not understanding flash, and NAND flash, and all that.

What do you gain though from – I mean, obviously, there’s challenges with [unintelligible 01:57:49.08] because you’ve got… Vibrations can cause read/write errors… You’ve got lots of things that can happen. But generally, if you have a pretty good machine and a good build, those aren’t true challenges. They can be challenges if you’re not proactive in making them not challenges. And if the density is always there and you really don’t need – maybe you need more than… What is it, six gigabit per second per disk? Is that usually what it is?

If you have a decent backbone on your PCIe lanes, then you’re gonna – you’d be hard-pressed to saturate that, in a lot of cases… Unless you’re doing some major transfers. And maybe your home lab is super-enterprise and maybe mine is less… But the main thing I’m moving around on my network is Plex movies. And it’s usually when I rip it to the NAS, and never again. And then obviously, whenever it comes off those disks to stream… But I don’t need that level of saturation. So disks for me work.
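The “six gigabit per second per disk” figure mentioned above is the SATA III interface limit; with 8b/10b encoding that works out to roughly 600 MB/s usable, and a spinning drive tops out well below it - so the disk, not the link, is the bottleneck. A rough check, where the HDD throughput is an assumed typical value for a 7200 RPM drive:

```python
# SATA III line rate vs. what a typical spinning drive can push.
# 8b/10b encoding spends 10 bits on the wire per data byte.
sata_gbps = 6.0
usable_mb_s = sata_gbps * 1000 / 10   # ~600 MB/s of real payload
hdd_mb_s = 250                        # assumed typical sequential HDD rate

print(f"SATA III usable: ~{usable_mb_s:.0f} MB/s")
print(f"Typical HDD:     ~{hdd_mb_s} MB/s")
print(f"Link headroom:   ~{usable_mb_s - hdd_mb_s:.0f} MB/s")
```

Which is why, for a NAS mostly serving occasional rips and streams, the interface never becomes the constraint.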

Yeah… For me it’s heat, it’s simplicity…

Yeah. Heat, noise, power… I could do away with most of my fans. And they’re loud, in general; they make noises. My flash SSDs make zero noise. They give off almost zero heat. They take up a quarter of the space, you know? And so because of that, my NAS is 4U, because it needs to fit these drives. I don’t know. I wish we were just all SSD, all flash storage.

One day, Tim. One day.

I know, I know. But it’s nothing more than me just like kind of wanting to be done with it, for those reasons. But they’re efficient. They’re large… And they’re cheap.

What’s left to say? I’ve got to go in like two minutes here. I’ve got a hard stop personally. I’d love to keep just going deeper if we could… We’ll have to do this more frequently, something like that. Who knows? Maybe more than once a year. I’d guest on Techno Tim talks, but I don’t think that’s your style. You don’t do that there. Do you have guests? I don’t know.

Yeah, I can. I absolutely have guests. I’ve had before.

Yeah, I’ll geek out with you.

Yeah, it’s usually on Twitch. I’ve kind of been switching stuff up a little bit… I don’t know, Twitch accidentally banned me, twice.

I saw that.

Dude…

The API.

Dude. Yeah. I’m kind of like over it, in my head… Because I’m like “Really?” That gave me the opportunity to stream on YouTube live and realize the opportunity there, and the audience there.

Yeah. Well, you’re already there, so you could just like tap into existing subscriber base.

Exactly. Whereas Twitch - you know, it’s “Hey, guys, come check out my Twitch.”

That’s why I don’t hang out there, honestly. I would probably at least lurk in your lives… Whereas I’m not gonna go on Twitch, personally.

[02:00:40.14] I get it. If you don’t type in that URL and you don’t go there or have someone to watch, you just don’t go there. It’s just not in your routine, ever. So I totally get it. It’s just like, you know, that’s kind of where I started. I started out live streaming before YouTube, and it was on Twitch, and it was playing games… So I just have a soft spot in my heart for it.

Well, I mean, you have done some cool stuff there, but whenever you disrupt somebody’s normal habit and flow, you give them a reason to ponder change. And sometimes that means the negotiating goes the opposite way, and they leave your space. And so maybe Twitch is in your past, and YouTube lives are in your future. But either way, I’d love to talk to you more, as it makes sense. I think we can geek out quite a bit about this stuff. And I think it’s fun, really, honestly. I think it’s fun to just dig into it with somebody else… Because, as you can tell, I make my dream lists, and I ponder them myself… And I might pay attention to people, but I’m not having a conversation with anybody really deeply about my choices or why I’m making these choices. And it’s just… Maybe after this conversation I might be okay with having both, Windows and Mac, in my life. Maybe.

I think you’ll be okay. I think you’ll be okay. And then maybe your kids one day will have a gaming machine. They’ll be like “Yeah, now I’ve got a gaming machine.”

“Dad gave me the hand-me-down”, yeah.

Yeah…

“I just got the 4090 in there, oh my gosh…” Or the 3090. “What’d you do, dad? The 3090, really? You couldn’t get the 4090?” Well, son, let me tell you what happened, okay…? AI changed everything, okay? And GPUs were hard to find, and super-expensive.

Yeah. “Back in my day we didn’t have this AI thing stealing all of our GPUs. It was Bitcoin.”

That’s right. That’s right. Well, Tim, it’s been fun geeking out with you, man. Thank you for hanging out for a bit, and… Anything left? Anything else?

No. Just thank you for having me.

Any self-promotion, any plugs, anything?

I mean, no.

I’ll link it all up for you, don’t you worry.

Oh, yeah. Thank you. No, I appreciate being here. I appreciate the time to talk about it. I say this on my live stream on Twitch… I rarely get to talk about this kind of thing to people in real life. It’s either on my live show, or to people in chat. So it’s nice to be able to talk to someone who understands what I’m talking about. So I appreciate it, man.

A real human. I’m not AI. If you thought I was, I’m not. This is real. I’m the real.

Yeah, I appreciate it, man.

Alright. Bye, friends.

Bye, friends.


Our transcripts are open source on GitHub. Improvements are welcome. 💚
