Practical AI – Episode #251

AI predictions for 2024

Get Fully Connected with Daniel & Chris

We scoured the internet to find all the AI-related predictions for 2024 (at least from people who might know what they're talking about), and in this episode we talk about some of the common themes. We also take a moment to look back at 2023, commenting with some distance on a crazy AI year.

Sponsors

Changelog News – A podcast+newsletter combo that’s brief, entertaining & always on-point. Subscribe today.

Fly.io – The home of Changelog.com — Deploy your apps and databases close to your users. In minutes you can run your Ruby, Go, Node, Deno, Python, or Elixir app (and databases!) all over the world. No ops required. Learn more at fly.io/changelog and check out the speedrun in their docs.

Chapters

1 00:00 Welcome to Practical AI
2 00:43 Happy New Year
3 01:35 AI adoption curve
4 02:31 2023 Recap
5 03:50 Keeping up with development
6 05:29 AI by API
7 07:20 Human differentiation
8 07:44 2023 Highlights
9 09:57 Integrating the AI workflow
10 13:34 AI in daily life
11 16:35 Mainstream productivity / Lessening AI fear
12 20:43 AI controversies
13 21:33 Regulations and policies
14 23:29 Daniel's spicy AI prediction
15 27:01 Chris' spicy AI prediction
16 29:56 Sponsor: Changelog News
17 31:47 Prediction for 2024
18 35:03 Models better than GPT-4?
19 36:50 Clem's prediction / Finances will matter
20 40:49 Using software dev practices in LLMs
21 42:58 Looking forward to 2024
22 44:04 Outro

Transcript

Play the audio to listen along while you enjoy the transcript. 🎧

Welcome to another episode of Practical AI, and Happy New Year 2024. Chris and I are starting out this new year with a Fully Connected episode, which are those episodes of our podcast where we keep you fully connected with everything that’s happening in the AI news, and trends, and help you level up your machine learning and AI game. I’m Daniel Whitenack, I’m the founder and CEO at Prediction Guard, and I’m joined as always by my co-host, Chris Benson, who is a tech strategist at Lockheed Martin. How are you doing, Chris?

I’m doing very well, Daniel. Excited for 2024. I suspect – you know, we keep talking about how 2023 was just such an amazing year for AI, and we’re about to go through some of that… But I’m pretty sure 2024 will be even better.

It’s interesting, I was just having a conversation with one of my friends and mentors yesterday, and we were talking about the kind of curve of adoption of this technology, and where we thought we were in it… He kind of sketched out this trend of early days, where really what you’re investing in is research and development… And then there’s sort of a usage expansion/adoption period, where what you’re investing in is usually developers, to actually build out things at scale. And then you kind of like reach this place where what you’re investing in is lawyers, because everything is consolidating, and there’s more regulation, there’s a lot of mergers and acquisitions, and so everything’s kind of consolidating in together… I’m curious to know what your perspective is in terms of – maybe let’s scope it to generative AI, since that’s pretty much what 2023 was kind of about, and certainly not what all of AI is about, but…

I mean, 2023 was the year that big tech dived into AI for real. They had been talking about it and doing it, but suddenly the whole world went AI, with ChatGPT catching on the way it did, and its competitors. So big tech is all over it, but there are many challenges to that. And you kind of just talked about that process of working your way through the challenges… And so there hasn’t been the killer app; there have been really interesting things to come along, and some of them occasionally are revolutionary, a lot of them are evolutionary… But there’s still a lot of organizations, a lot of businesses specifically out there trying to figure out “How does this work for us?” and there have been some stumbles along the way. So I think 2024, in that sense, is more of the same - investing a lot in AI, not so much for today, but for getting ready for figuring it out for tomorrow. So I think we’ll see a lot of that two steps forward, one step back.

Yeah, I kind of had a similar thought. I saw some study - I actually don’t remember which one it was off the top of my head, but it was one of those that surveyed a bunch of enterprises, or something like that… And something like 15% to 20% of enterprise companies actually had either kind of something prototyped, or some type of integration to an AI system. Maybe that’s just a simple integration with an API, or something… But if you think about that, we’re definitely kind of at that beginning. I would say we’re kind of past R&D in the sense that – not that R&D is not important, and new things will be discovered, as they continually are, but in terms of the technology that has sort of dominated the news cycle in 2023, we’re moving past the heavy R&D investments, more into the “How do you scale this with software engineers and development?” Like, if you’re a company developing these technologies, you probably are not going to win by doing R&D, you’re gonna win by creating, like you say, the products, the applications, the platforms, the APIs, the systems that power this kind of growing adoption within the enterprise. So I think where you saw that kind of adoption start in 2023, that’s going to expand very rapidly in 2024, and those that keep up with that in terms of actual software and system development will likely be in a good place.

Kind of extending that a little bit, another huge trend we saw was - in years past, we’ve talked a lot about the process of doing the R&D, and how you’re kind of self-hosting, or how you’re using cloud, but you’re still driving it yourself in a lot of ways… And what we’re really seeing now as the marketplace for AI integrations is exploding, is that these large tech companies, the Microsofts, and the Googles, and the AWS’es, the big cloud companies are really making a lot of AI usage about API usage. It’s AI by API. That’s essentially Microsoft’s entire primary play right now, is to take OpenAI technologies and package them up in their own products, and push them out.

[00:06:14.17] And we’re seeing a lot of uptake on that, from organizations that maybe struggled with their R&D a bit in broad… And I’m not talking about necessarily the leaders in the AI space, the most select group, but the second, the third tier, and the fourth tier companies out there kind of going “I’ll just use what they’ve made available to us via API, and that’s good enough.” And a lot of strategy, a lot of business strategy is being built around API access. So it’s becoming software, like so much other software.

Yeah. And some of that will likely get more commoditized as other API products have been sort of commoditized over time… And so –

Open source is coming.

Yeah, it’ll be interesting to see how people develop their competitive moats within that space in 2024. I think it’s a year where probably there’ll be some people that really come out ahead, and some people that suffer very badly, because certain things are just becoming more commoditized.

I think you’re starting to see companies realize they can’t just say “We have AI in our product” now. We had a period where having AI, being able to say “We have AI” was a marketing ploy into itself. But as it becomes more ubiquitous and more common, that’s not going to work anymore. Your point there about differentiation is going to be key, and human creativity to find what those opportunities are.

I was looking back at some of the developments that happened in 2023, and it’s amazing how much was packed into that year. Yeah, some of that I was like “Did that really happen in – did that also happen in 2023?”

It was a year, man…

So much was packed in… I’m wondering, what are some of the highlights that you saw in terms of things that happened in 2023, that we either commented on, or didn’t comment on, or that are kind of highlights in your mind from things that we discussed in 2023, that are kind of informing your perspective as we go into 2024.

A couple of the big highlights - and we can dive into whatever we want - are at the tail end of 2022 ChatGPT was released on GPT-3.5, and early in ‘23 I think it was that we had GPT-4 added in underneath it. And that kind of kicked off the firestorm at the beginning of the year. And we saw Google scrambling to try to catch up. They got Bard out, and then they came along late in the year with Gemini to back it up… We saw Meta coming out with LLaMA 2, we saw Anthropic coming out with Claude 2… And so there’s now a whole industry; going into 2023 there weren’t so many options, really. And ChatGPT kicked off the arms race in the industry. And it’s interesting - I know that for me, as we were going through the year and all these crazy things were happening, it was a new week and a new model to try. And I had my standard tests, I had the things that I cared about, which were usually avoiding the things everybody else was doing. Everybody would ask for Python code from the models, and that’s the best-supported language you can possibly ask for. So I would ask for Rust, and they were all failing all over the place. You could have all these top models, and they were doing great in Python, because that’s what they were really centered around, but in Rust they were really falling all over themselves. It’s starting to change a little bit now, but… That and a few other things.

[00:09:57.18] We both may or may not be able to comment on everything that is actually integrated in our daily work, but you as a person, a developer, a strategist, as we went through 2023 and ended up at the end of 2023, your own usage of AI products - how was it impacting your daily work in ways that were different at the end of 2023, than at the beginning of 2023?

I think 2023 was truly the year where I never put down the various model interfaces…

You mean like the ChatGPT or Claude or whatever was up in a window, right?

Yeah, things like that. Exactly. And now, when I open a browser, I have a kind of a bookmarking app in my Chrome that does it, and across the top of the screen I have all the models that I use. And on any given day, I probably use every one of those. And often for a problem I will go to all those things. So it really became the year that it integrated into every part of my workflow. And it didn’t matter if I was coding, or if I was having to write something… And it could be technical-related or not technical-related for my animal nonprofit that we talk about occasionally… I use that for that, for a whole bunch of different things… So it wasn’t all centered around the AI world or the coding world, it was kind of every aspect. At the dinner table, my daughter, who’s 11, she’s in sixth grade, and I would challenge her – she’d say, “Well, my teacher doesn’t want me to use that interface.” And I was inevitably saying “Well, that teacher’s dumb. Because this is part of our life. They need to get used to–” This needs to be integrated into teaching, instead of being afraid of it, or something. But I started showing her how she could learn more, and learn faster by using the tools. And so it was a dinner table thing for us.

So that was it. I mean, I’m living kind of the same experience that so many other people are, that using these technologies is now completely built into everything that I do, no matter what that activity is, short of swinging a hammer. I think that’s the biggest thing for me, is it hit work, it hit side job, the nonprofit, it hit all my personal life… It hit my kids’ life. And so anyway, yeah, it’s the year that changed me from an AI perspective.

Yeah. That’s interesting, because we’ve been doing this for - what, five years now? And most of that – it has certainly been the impact from deep learning, machine learning over time in the enterprise setting, and value that companies have definitely gained over time with those technologies… But mostly, it sounds like what you’re saying is mostly, like, you were working in that space, and helping develop some of those things… But in terms of the way that you did your job, it wasn’t like those things were tightly integrated into your activities… Is that a good way to put it?

That’s a good way to put it. I would set aside time to do activities in the space, and that was part of either my day job, or us two on the podcast, or whatever. And then I would turn away, in years prior to 2023. So I think the fact that it didn’t just integrate into one of my workflows, it integrated into all of my workflows, across all the opportunities therein, and using one against the other, and taking the output of one and putting it the other and things like that, and just trying things out… That’s what it was. And I suspect that it was the same for many of our listeners.

It’s interesting, Chris, that you mentioned impacting kind of your whole family’s life… So one of the things that I did for Christmas - I have a bunch of nieces and nephews, and it’s always… Well, my wife and I don’t have kids. A lot of my life isn’t dominated by thinking about what kids are interested in these days… So sometimes it’s hard to figure out Christmas gifts, and all of that. I have to do a good bit of inquiries with my brothers to figure out what to do. But one thing I did this year was I just created like a framed picture… So I know each of them have certain interests; one’s really into electric guitar right now, and one of my nieces is really into Frozen, or whatever.

[00:14:21.16] So taking something from each of those - I took an artistic picture of someone shredding guitar on stage, and then basically deep-faked my nephew into the photo, and stylized it with – I did all that with ClipDrop from Stability… So you can go in, you’ve got like “Oh, here’s the –” I can go in first and just remove the background from the image, and then generate the scene, and then face-swap the thing, and then you can clean up certain areas, or remove objects, or change the lighting… And all of that happened very seamlessly for me. So it’s just interesting even in that context that’s what I turned to.

You know, it’s funny… And I did the same. A lot of picture generation and stuff, lots of raccoons and foxes and stuff in various scenarios… I’ll share a two-second touching moment that just caught me off guard. I work with a wildlife vet in the animal nonprofit, and they do a lot of stuff for free, because we’re a nonprofit, and we’re doing all this good, and they love these animals, and so they just help. And I’m very thankful for that. But one day I had come home after taking some animals in for them to help that were beyond my ability, and I just send them a text saying “Thank you”, but in the process I went and generated an image that showed their veterinary practice and all that, and some animals around it… And I sent it, and it was like a two-second thing for me. It was just like “Oh, I hope you like this.” And they received it, but they’re not people in AI, they’re not as tech-focused, obviously, as we are. And she had it printed out and framed, and put in the main office, where people could see it, and everything. And it had never occurred to me, but it made me realize that even these completely non-tech things - AI is something that can help people that are not focused on AI still find value in their activities in ways that maybe some of us who were talking about this all the time don’t think about. So it was kind of an amazing moment to see something so trivial turned into something that was meaningful for them.

Yeah. And one major change for me, which in the past – so those that have listened to the show for some time, you might be aware that I’ve always coded in Vim; that’s been my IDE of choice. This was the year that I was finally motivated to – so I actually changed my editor. So I’m using VS Code now.

Oh, my… He came over. I can’t believe it.

Yes. With Codeium. So I’ve tried VS Code in the past; I’m kind of like “Okay, cool.” There’s things like the searching, or completion, or function finding stuff - that’s all useful, but you can do all that in Vim easily enough. And I know there’s also integrations of AI stuff in Vim, but I’ve found that that really tight workflow, and having even a chat interface within VS Code through Codeium, was actually just so efficient for me.

So I felt like I was able to be way more productive as an individual contributor this year, even in ways that would have been very difficult for me before. So writing different TypeScript stuff, or other things that – it’s not natural for me, I don’t know a lot about that, whether it’s that screen in VS Code where Codeium is, or hopping over usually… Like you, I also have my ChatGPT window up, and I’m asking questions back and forth there… So it’s kind of a combination of these things. It’s not maybe completely seamless, but it’s just so efficient.

[00:18:12.28] And I love now - you kind of saw that progression, it was really powerful for developers, I think, this year… And then recently, now you’re seeing more privacy-conserving options popping up. There’s a project called Continue.dev, or Continue, I’m not sure what they prefer, but it’s a sort of open source integration of this kind of VS Code type of interface, but with – you can choose the model that you want to use, you can integrate an open model like WizardCoder, or Code LLaMA, or something like that. So I think it’s cool that there’s a lot of really seamless configuration of that for individual contributors. And the feeling I never got was “Oh, this is taking over what I’m able to do”, but by embracing that I was able to be so much more productive as an individual contributor.

I was, too. I mean, I’ll have periods where I’m coding, and I’ll have periods where I’m not, and it kind of goes in and out. This year I definitely dipped back into coding, and there’s always the kind of catch-up to where you were when you’ve taken time off… But it was different this year, because by embracing these tools that we’re talking about, I was able to catch up and do things that I would have taken time on and struggled with before. And we’re seeing that across so many things, and you don’t have to be a developer to benefit from this.

So I think 2024 hopefully will be the year where people discover productivity, instead of just kind of entertainment. Because most of the things I’ll see on Facebook are people that are not really technical, and they’ve discovered how to generate images and stuff like that, and they’ll do that. But they haven’t learned how to really change their lives the way we’ve been talking about. I’m hoping that that starts to transform folks, and takes maybe some of the fear of AI away… Because that was the other thing I really noticed a lot of this year, was fear of AI.

I’ll walk into a room and I’m ready to talk with anybody about AI, and all these cool things… And I kept hitting walls of fear with people who are not in the industry. Every time I did it, it always surprised me. I think it’s because I’m so upbeat about things. That was another thing, is maybe 2024 is the year I’m hoping that some of that fear gets mitigated, and people discover productivity instead.

I mean, there were definitely a good number of things in 2023 that led to a chaotic feel in the industry, whether that be the firing and rehiring of Sam Altman, or disclosures of data breaches of one kind or another… We ended the year, of course, with the New York Times - I don’t know if it was 2024 or 2023; I’m assuming they had to file things in 2023… Where the New York Times is suing OpenAI, which is definitely in the news now, over copyright. So yeah, there’s a lot of – if those are the sound bites that you’re getting, it definitely doesn’t lend itself to thinking that this space is reliable, and safe, and trustworthy.

The flipside of the coin on the fear is that this was also the year that we saw significant policy and regulation initiatives being put in place. We had the executive order that came about in the US, and within Europe they had the AI Act late in the year, and that made a big difference… And we have talked about a lot of this, by the way, obviously on the show, and I would refer listeners back - many of these things we’ve discussed here you can find in a dedicated episode on our show. So look back through that. But in general - and I mentioned this on a previous episode - when I go hang out with family, or with some folks that have nothing to do with AI, they don’t know what to believe.

[00:22:16.18] Another thing that got pointed out to me several times is that you’ll have big names in our industry, that are big enough to make it into mainstream, people like Yann LeCun - he famously considers it ridiculous, that there’s fear of AI and stuff like that. And then you had Geoffrey Hinton on the other side, another one of the major names, who left Google so that he could talk honestly about the dangers of AI. And so you’re talking about two global luminaries in the space that were actually recognized together recently as pioneers, and yet they have polar opposite views about that. And so I think that’s hard if you’re in the industry. But if you’re not in the industry, you have a lot of trouble trying to figure out who should you believe when you get a New York Times article or something like that that addresses these issues, and you go “Well, how do I handle that?”

So I’m hoping that in 2024 maybe we can pick up a few new listeners, maybe some of those that are interested in the topic, but don’t work in the industry, but maybe they can get educated a little bit more on kind of what the space looks like.

Yeah. Well, thinking about more forward-looking things, and into 2024, I know we want to talk about some of the maybe hot takes, or non-hot takes that people had in making predictions about 2024… Before we do that, I’m wondering, Chris, if you have – I can give mine first, but if you have any hot take or spicy opinion that’s maybe not represented in kind of the overall, like what everyone is saying… I can give mine first, and no worries if you don’t have one… But mine - I think that I saw a lot of people posting on Twitter, or LinkedIn, “Here’s my predictions for AI in 2024”, mostly all of them having to do with generative AI, and utilizing those models as a key piece of a workflow. I started a company to do this, so I don’t disagree that that’s gonna be a big focus… But maybe my spicy take, that’s kind of different from many people’s, is that we’re gonna start to see - there were a couple of takes that I saw that said something about the software engineering element of building out these systems being a key piece of what will be important in 2024, not just making prompts… I would kind of build on that a little bit, and propose that I think there’s going to be some people that are really going to win by combining the kind of “traditional data science” and machine learning algorithms, or models, or systems, with generative AI systems in a sort of hybridized way. And I say that because in our own client work I’ve found that to be very much the case, and a very, very powerful approach is “Hey, in this case we’re generating responses for customer emails”, or something. And I don’t want to generate a response when the customer is really frustrated, or something; I’d rather a human respond in that case, and not the AI, right?
Well, the best way to figure that out is, I think, with a sentiment analysis model that we know how to do really, really well, and we can run on a CPU with very little cost… And then we can use the generative AI to answer when it makes sense, or maybe even informed by the sentiment label.
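That routing idea can be sketched in a few lines. To be clear, this is a minimal illustration rather than anyone's actual implementation: the keyword-based sentiment scorer is a toy stand-in for a real CPU-friendly sentiment model, and the router returns a decision instead of calling an actual LLM.

```python
# Hybrid routing: a cheap sentiment check decides whether a generative
# model drafts the reply or a human takes over.

# Toy cue list standing in for a real sentiment model.
NEGATIVE_CUES = {"frustrated", "angry", "unacceptable", "terrible", "refund"}

def sentiment_label(text: str) -> str:
    """Placeholder sentiment classifier; returns 'negative' or 'neutral'."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return "negative" if words & NEGATIVE_CUES else "neutral"

def route_email(email: str) -> dict:
    """Escalate negative emails to a human; otherwise hand off to the LLM."""
    label = sentiment_label(email)
    if label == "negative":
        return {"handler": "human", "sentiment": label}
    # A real system would call the generative model here, passing the
    # sentiment label along as extra context for the draft.
    return {"handler": "llm", "sentiment": label}

print(route_email("I am really frustrated, this is unacceptable!"))
print(route_email("Could you tell me when my order ships?"))
```

The point of the pattern is that the cheap classifier runs on every email, while the expensive generative call happens only when it is appropriate.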

[00:26:03.26] I think that’s only a very simple example, but combinations of recommender systems, gradient-boosting machines, time series analysis with generative AI models, either large language models or other types of generative AI models - I think that’s a really powerful combination, that many people are kind of ignoring. It’s like they’ve sort of moved on from the past, “Now we’re in this zone.” I kind of have the opinion that we’re going to see a little bit more of that kind of make a resurgence in 2024, and be combined in interesting ways, or with hybridized systems. You may not see it in the news as much, but I think on the battle lines within enterprises it’s something that you’re gonna see a lot.

I think that’s a fantastic – not only a fantastic prediction, but a very practical prediction as well. So you caught me off guard with the word “spicy” a little bit… There are many predictions I would make, but many of them are fairly mundane and in alignment. They’re kind of the logical thing. So while you were talking, I kind of wiped all those off the slate, because you said “spicy” and “different”. So I’ll make two. One is I think Prediction Guard is going to really take off in 2024.

Let’s hope.

I’m sure it will. But here’s my spicy one. Because I think that that was just a given; that’s inevitable, because you’ve done a great job with that. The spicy one is I think that there is generally going to be a resurgence of interest that we’ve started to see develop again in 2023, around Artificial General Intelligence, AGI. And I’ll tell you why I’m predicting that. Because we have seen in the past year these models make such a leap, depending on how you want to measure what a model is capable of in terms of different measures of intelligence… People talk about there’s intelligent, there’s super-intelligent, they’re almost as intelligent, and I think it depends on what kind of metric you’re using for that. But I think as a generalization, kind of looking at all those, we’re seeing these models that are incredibly productive. And if you’re measuring intelligence in terms of productivity, and you’re comparing that against what a human would be able to reasonably do in the same time period, we’re seeing output from these models that we’ve been talking about that’s just amazing… Which is why you and I have integrated them so heavily into our lives.

So if you take that for a moment as a measure of intelligence, and then you say “Well, there are roughly a dozen different ideas on what consciousness would be in the space.” They don’t agree with each other, we’ve made it nowhere, but there’s a lot more fear now. And fear tends to drive priority, as I’ve learned in the industry that I’m in… And so the general fear out there is when you have such capable models, if there is a worry that we don’t understand – we see in nature all around us that consciousness arises in animals all over the place; it’s probably mathematical in nature, but we don’t have anything. So I think that there will be a resurgence of research. I think that research will not come in the AI space, it will come in the neuroscience space, trying to understand… Because the big fear is what happens when we stumble upon it, and you already have such productive models. So I’ve run into that fear over and over during 2023. So I’m predicting, not terribly practically, that there’ll be a focus, at least in some quarters, on how we ensure that we don’t hit a moment that comes with big surprises in the large. I like that we have very, very different predictions on this one… But that’s my spicy prediction.

Break: [00:29:56.21]

Well, Chris, one of the things that we did leading up to this conversation was take a look at dozens of these “This is what I’m predicting for AI in 2024” posts on Twitter and LinkedIn, and kind of crystallized down or distilled down some trends of what people were predicting. So I’m gonna take what I kind of distilled down from all of these posts, and I’ll just put it out there, and we can comment on any of those… And then it may be interesting to look at a couple of these from specific people; that might be interesting, because a lot of people have been making these predictions.

So I did not do this with AI, but if you actually just look through the internet at these posts, you’ll see some trends pop up of what people are predicting… And I think both you and I looked at these and said “Yeah, these are kind of what would be expected to be predicted based on what we’ve seen and the conversations we’ve had recently.”

So the common points that were predicted by many different people across the interwebs - I put them in five categories here. So I’ll just read them off, and then we can make a comment on any of them if you want. So RAG, or retrieval-augmented generation, will continue to be a focus and will experience various improvements. So that’s number one. Number two, open models will beat GPT-4 in 2024. Number three, productivity in work will be enhanced by AI, rather than replaced by AI. Number four, multimodal models will be more of a focus in 2024. I actually think that was one that I predicted last year when we did our predictions, so maybe I was a year off… So there you go, I was a year off. And then number five, there’ll be more focus on small language models, rather than large language models, because of economic and compute efficiency.

So those are the five that were kind of distilled out of a bunch of different Twitter, LinkedIn blog posts. Any of those strike you as particularly interesting, Chris?
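As a refresher on that first item, the core RAG loop is small enough to sketch. The documents and the word-overlap "retriever" below are toy stand-ins for a real corpus, embedding model, and vector store:

```python
import re

# Toy corpus standing in for a real document store.
DOCS = [
    "Mixtral is a mixture-of-experts model released by Mistral.",
    "RAG grounds a model's answers in documents retrieved at query time.",
    "Small language models trade capability for compute efficiency.",
]

def tokens(text: str) -> set:
    """Lowercase word set; a real system would use embeddings instead."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def retrieve(query: str) -> str:
    """Return the document sharing the most words with the query."""
    return max(DOCS, key=lambda d: len(tokens(query) & tokens(d)))

def build_prompt(query: str) -> str:
    """Assemble the augmented prompt; a real system would now call an LLM."""
    return f"Context: {retrieve(query)}\n\nQuestion: {query}\nAnswer:"

print(build_prompt("How does RAG ground answers?"))
```

A real pipeline would embed the query and documents, take the nearest neighbors, and send the assembled prompt to a language model, but the shape of the loop - retrieve, augment, generate - stays the same.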

I agree with all of them. And these were some of the ones that I disregarded for the spicy one that I made, that would probably make half of our audience roll their eyes. [laughter] Maybe all of our audience.

What’s the good in listening to a podcast if you can’t roll your eyes sometimes?

If you can’t roll your eyes from time to time. But yeah, I think these are all very, very practical, fairly safe predictions… I think that probably most of us that follow the industry closely would tend to say – actually, since they came from many posts, would tend to say that’s the thing. And I’m looking forward to that. The multimodal thing in particular is something that I’ve been waiting for in 2023. I was like “Okay, but… Come on…” So that - yes, I agree with everyone that’s there, and I think that’s the logical progression. I would be very surprised if they don’t all come through this year.

Yeah. Probably a lot of these are a given. There’s definitely some open models that already “beat” GPT-4 in certain respects, certain tasks or something like that. If you think about maybe generating SQL queries based on a schema, or if you think about doing this particular thing in this language that’s not English, or if you consider specific domains, or other kinds of specific tasks, I think you already see that to some degree.

Now, GPT-4 is this sort of general-purpose model that does all of these things at a pretty incredible level… But I think we’ll see open models get much, much closer to that, and you’ve already seen a lot of that kind of being hinted at with models like Mixtral, from Mistral, which is a mixture-of-experts model - similar, in that sense, to GPT-4.

[00:36:03.20] And I think we’re already kind of seeing a lot of that happening… To your point, if GPT-4 remains king at the moment, it’s not king in everything, and different tasks work better on different models. There are some things I’ve found LLaMA 2 does the best at; there’s Gemini, which is still quite new, just a few weeks old from its release, and it’s good… I think right now I’m getting better Rust out of Gemini than any of the others. And so that’s one of those things where we’re kind of learning what model to go to for different tasks, with GPT-4 probably having the best overall return rate in a generalized sense still, but that will certainly change this year. It will probably change multiple times.

I’ve found some of the comments on specific ones of these interesting from particular people that are especially well-positioned to comment on some of these items. So for example, the number five one, the focus on small language models, but also with the perspective of becoming more economical, and cost and compute-efficient… Clem from Hugging Face, the CEO, made a video which is really nice; I recommend everyone watch that on Twitter, or other places that it’s posted… But he made some comments about his prediction that one of the hyped AI companies - certainly a lot of them now - would go bankrupt in 2024, or get acquired for a low price. And he tied that in with the comments along the lines of cost efficiency, and focusing on cost of running these models… Because yeah, you likely have a lot of startups that have raised big money; their compute costs are probably astronomical, because they’re running these large models at scale, and hoping that their margins get better over time… But once you make the shift to open models and cost-efficient models, that may not work out in their favor, and so the ability for people to run models in their own infrastructure, run more cost-efficient models - that’s not going to play out well for certain people. But it will play out well generally for the costs of running these sorts of systems in enterprises, whether that be – it still could be a software system that runs LLMs and is self-hosted within an enterprise, but it’s going to be much more cost-efficient to do that, especially for those that are wanting to pull some of that in, not rely on external systems, be more privacy-conserving, not have data leave their infrastructure… That’s going to become more and more possible.

So yeah, I thought it was interesting how Clem tied together some of his predictions around yes, being more cost and compute-efficient, which is a benefit to the climate, for those that are thinking about those things, but also cost-efficient in terms of enterprise and operational costs… And how the focus on that will not work out that great for certain of these kinds of hyped AI plays.

Kind of as a follow-up to that, I saw - it was a few days ago, and I’m gonna paraphrase, because I don’t have it in front of me… But he did a social media post that basically was an appeal to teams out there, and saying “Listen, if this year you’re at the end of your financial run, and you want to keep doing the work and you want to keep your team together, reach out to us at Hugging Face, and maybe you can join our team and keep doing some of the same stuff in that way, with our infrastructure.” Which I think is a very natural follow-up to him pointing out that we’d see some crashing and burning otherwise in there… And a smart move on his part as the CEO.

[00:40:10.23] Yeah, yeah, definitely. It’s great to see that. I think there will be some of that, some of the hints of that this year. I don’t think we’re in that sort of “hire lawyers and consolidate” phase yet; we’re still in that kind of building and engineering phase… But just the economics of how things are shifting will shift, I think, in 2024, which will be interesting.

Yeah. I think the finances of it all will matter; instead of just building and building, we’ll see a building, but a building with a practical eye on “How do we sustain this over time?”

Yeah, for sure. And I know one of the other people that I think is very well positioned to comment on these things is Jerry from LLaMA Index, who was on the show… Well, Clem was on the show too, but Jerry more recently… And of course, LLaMA Index is one of the key frameworks that’s being used within RAG workflows… And he also made one of the comments around – the way he phrased it, “As every AI engineer still needs to have strong software engineering fundamentals…”

So shipping - this is quoting him - “Shipping an LLM app to production reduces to software engineering, and clean, extensible abstractions, testability, monitoring in production etc.” So yeah, I think that insight is very fair, and it gets to this real need for software development practices to gather around the LLM practices and model calls within 2024.
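The testability and monitoring points can be made concrete with a thin, dependency-injected wrapper around a model call. The model client here is a hypothetical interface, not any real SDK; the idea is just that injecting the call makes the surrounding logic testable without hitting a paid API:

```python
import logging
import time
from typing import Callable

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("llm_app")

def answer_question(question: str, complete: Callable[[str], str]) -> str:
    """A production-style wrapper: the model call is injected so tests can
    stub it, and latency is logged for monitoring in production."""
    prompt = f"Answer concisely: {question}"
    start = time.monotonic()
    reply = complete(prompt)
    log.info("llm_call latency=%.3fs prompt_chars=%d",
             time.monotonic() - start, len(prompt))
    return reply.strip()

# In a test, stub the model instead of calling a real one:
def fake_model(prompt: str) -> str:
    return "  42  "

assert answer_question("what is 6 * 7?", fake_model) == "42"
print("test passed")
```

Nothing here is LLM-specific, which is exactly the quoted point: the clean abstraction boundary, the stubbed test, and the latency log are ordinary software engineering applied to a model call.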

So I’m glad that you brought that up… Just as a quick add-on to that, we’ve seen a lot of kind of AI-specific language around producing AI capabilities and such as that. A lot of phrases have been coined in that way. What I think is important is that at the end of the day AI remains a really cool new capability within the larger software space. And to do anything with it, you have to have a software capability, and those two are gradually merging. And someday, when we’re past kind of the coolness of AI, and we’re all just like “Oh, yes, we’ve been doing this for a while, and it’s not quite such a big deal”, it’ll be software again. And all software will have it, and it will just be another aspect of software. We’re not there yet; we’re very much in the cool space at this point… But software skills remain important. And some of those may be human-driven, and some of those may be driven by the software with models… But that doesn’t change going forward. You’re still going to need it.

Yeah. I’m looking forward to learning with you in 2024, Chris, and talking through whatever comes, which will certainly be different than what we just predicted, as is always the case every year that we try to do this; it’s always different. But I am looking forward to navigating that journey… And thank you to our listeners for being loyal, and engaged in 2023, and really happy to continue bringing you this content and learning with you all as well in 2024.

If you haven’t yet, make sure you go to changelog.com/community. You can join the Slack channel where you can chat with us if you like, and connect with us on Twitter and LinkedIn, and all the places where…

BlueSky.

Yeah, exactly, BlueSky. And we’d love to hear about guests that you want on the show, or topics that you want discussed, and chat with you about all the cool stuff that you’re doing. So thanks for a great 2023, and Happy New Year 2024.

Happy New Year to you too, and everyone listening.

Our transcripts are open source on GitHub. Improvements are welcome. 💚