Kelsey Hightower

Automation is the serialization of understanding

and understanding is based on the fundamentals, not the tools

What you’re about to read is a fancified excerpt from Ship It! #44. You should read the original transcript or listen to the entire conversation for more context. To set the stage, this is Kelsey’s response to Gerhard asking what the biggest takeaway should be from their conversation.


The reason we find ourselves as practitioners (as an industry as a whole) constantly migrating between different platforms, journeys, and digital transformations…

We do this because I don’t think we ever understood the fundamentals.

The fundamentals are very clear. If you’re thinking about application delivery, for example, we know what the fundamentals are.

Ideally, we’re versioning the source code we write, and using tools that enable us to build and package applications in a reproducible way. 15 years ago maybe you were just creating WAR files or ZIP files. Maybe you were sophisticated and you were creating RPMs or DEBs so you could use a package manager. Those practices have always been good ideas. Packaging is a by-product, or the artifact, of assembling code and your dependencies, and getting your applications ready for deployment in a repeatable way. If you did that 15 years ago, adopting something like Docker is not hard to do. You say…

“Okay, we’re gonna take the existing RPM and use it to create a container image. Instead of running yum install on a server to install the RPM, we can now call the same command from a Dockerfile, and produce a container image.”

It’s just another packaging step, and if you decouple packaging from deployment, then you get the ability to change just the last mile.
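To make that concrete, here’s a minimal sketch of that Dockerfile, assuming a hypothetical myapp RPM produced by your existing build pipeline and a yum-based base image:

```Dockerfile
# Assumption: a yum-based base image matching what your servers already run.
FROM centos:7

# The RPM your existing pipeline already builds (hypothetical name and version).
COPY myapp-1.0.0-1.x86_64.rpm /tmp/myapp.rpm

# The same install command you would have run on a server, now producing an image layer.
RUN yum install -y /tmp/myapp.rpm && yum clean all

# Start the application the way the RPM's service did (hypothetical binary path).
CMD ["/usr/bin/myapp"]
```

Running docker build against that file gives you a container image built from the exact artifact you already trusted on servers.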

So even if you’re just using RPMs today, you can layer on container packaging. Even if you’re using something like Puppet to deploy those RPMs, once you have a container image, you can actually swap out the Puppet step for the Docker step, and it works.
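In other words, the packaging artifact stays put and only the delivery step changes. A rough before/after sketch, with made-up names:

```sh
# Before: Puppet installs the RPM on a long-lived server.
puppet apply -e 'package { "myapp": ensure => installed }'

# After: the same application, now wrapped in a container image,
# deployed by pulling and running it instead (hypothetical registry).
docker run -d --name myapp registry.example.com/myapp:1.0.0
```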

But you have to understand the fundamentals and the boundaries between these concepts.

I think as an industry we’ve been pushing…

Automate. Automate. Automate.

And we haven’t been saying…

Understand. Understand. Understand.

Because if you understand what you’re doing, you can automate if you want to.

Because if you really have a clean process that says…

“Okay, we’ve automated the build process of a Docker container image, and look, we don’t really have more than two environments.”

So the QA process can look like this…

“Docker run this container image. It works. We go to production. Docker run the same container image. It works.”

And maybe that’s all your team needs to do! And look, that’s okay. But I think understanding allows us to make that decision, and then we can decide what automation tool is the best for the job.
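As a sketch of that two-environment flow (the image name is hypothetical), the whole pipeline is the same command run twice:

```sh
# QA: run the image the build produced and smoke-test it.
docker run -d --name myapp-qa registry.example.com/myapp:1.0.0
# ...tests pass...

# Production: run the exact same immutable image. Nothing is rebuilt in between.
docker run -d --name myapp-prod registry.example.com/myapp:1.0.0
```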

These fundamentals can be applied using different tools. The tools are not the fundamentals. It’s the ideas and concepts that we have been talking about today.



Discussion


2022-04-17T10:05:01Z

I really dig the sentiment. I think often the problem is that tooling is trying to solve the problem for us.

Packaging is a good example of this; take the Python ecosystem. Most of the time, the advice is ‘follow best practices, and packaging should just work’. But when it doesn’t, you’re diving deep into configuring setuptools. You’ll probably not take the time to learn what the package formats (sdist, wheel) are, or what you’re actually trying to achieve.

As a result, the problem-solving process is more about searching for a magic incantation of config than methodically working towards a solution, which leaves everyone confused. The process of making it work doesn’t build understanding.
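For what it’s worth, the formats themselves are approachable if you poke at them directly. A minimal sketch, assuming a hypothetical myapp project with a pyproject.toml:

```sh
# Build both artifacts explicitly with the PyPA build frontend.
python -m pip install build
python -m build   # writes dist/myapp-1.0.0.tar.gz (sdist) and dist/myapp-1.0.0-py3-none-any.whl (wheel)

# An sdist is a tarball of your source; a wheel is a zip laid out for direct installation.
tar -tzf dist/myapp-1.0.0.tar.gz
python -m zipfile -l dist/myapp-1.0.0-py3-none-any.whl
```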
