Observing the power of APIs
Jean Yang's research on programming languages at Carnegie Mellon led her to realize that APIs are the layer that makes or breaks quality software systems. Unfortunately, developers are underserved by tools for dealing with, securing & understanding APIs.
That realization led her to found Akita Software, which led her to join Postman by way of acquisition. That move, at least in part, also led her to join us on this very podcast. We think you're going to enjoy this interview; we sure did.
Jean Yang: Something I'm really excited about is low code with APIs. Because part of me is like "Let's all just be really honest about what we're doing here… basically gluing together APIs." So I've been a big Zapier fan for many years now, and I'm also a really big fan of Postman's new low-code product called Flows. But as a programming languages person, I know that if your language or your builder abstractions are at a higher level of abstraction, it's always easier to analyze what's going on. And so from my point of view, we have to do all this stuff with API observability right now, because we have to reverse-engineer all the API traffic and really piece together all of the API interactions… But if you're just straight up using a low-code tool, that's just right there. And so that's something that's really interesting and compelling to me. I think that's very clean from an abstraction standpoint, and it also just enables more software builders, which I think is very cool.
So to me, that cleaning up… Right now, calling APIs from low-level code kind of feels like you're mixing assembly with some other stuff. You're at a low level of abstraction. So lifting the whole abstraction layer to something that's API-centric is very exciting to me. And then you would only need something like us for the messy stuff that you customize… You know what I mean? But all the other stuff is cleaner to begin with. So that's something that's really exciting to me.
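To make that abstraction argument concrete, here is a toy sketch (this is not Postman Flows or Zapier code; every endpoint, field, and function name here is a hypothetical illustration): when the glue is declared as data, a tool can enumerate the API calls without running anything, whereas the same glue written imperatively is only visible by watching traffic at runtime.

```python
# Toy model of the abstraction argument: a "flow" declared as data
# versus the same API glue buried in imperative code.
# All endpoints and step names are hypothetical.

import requests

# Declarative: the API interactions are right there in the structure,
# so a tool can list them statically, without executing the flow.
flow = [
    {"step": "fetch_user", "method": "GET",  "url": "https://api.example.com/users/42"},
    {"step": "notify",     "method": "POST", "url": "https://hooks.example.com/notify"},
]

def analyze(flow):
    """Statically enumerate every API call the flow will make."""
    return [(step["method"], step["url"]) for step in flow]

# Imperative: the same glue, but the calls are hidden inside code,
# so you can only recover them by observing traffic at runtime --
# which is exactly the API observability problem described above.
def imperative_glue():
    user = requests.get("https://api.example.com/users/42").json()
    requests.post("https://hooks.example.com/notify", json={"user": user})

# Prints the two (method, url) pairs without any network access.
print(analyze(flow))
```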
And then there needs to be a better solution for legacy stuff. Legacy subsystems today are like toxic waste: they're just sitting there, waiting for a big bug or vulnerability to really cause things to spill over… And the work we're doing is one piece of what allows people to make inroads into legacy software. I think there's some work that Docker is doing that's really interesting, helping people containerize legacy software… The reason I'm excited about that is if you have legacy software that's just sitting somewhere, running on your own systems, on a non-standard tech stack, it's really hard to make sense of it. But the minute you virtualize it, you can start poking and prodding at it in a black-box way. That supports some of the stuff we're doing, actually.
So we can only watch things if they're sufficiently virtual… Well, this is a gray area, but we could also install our agent on bare metal, etcetera, etcetera. But the minute things get containerized, things are easier. So the push to containerize and standardize infrastructure will, I think, help some of the legacy problem. But a lot of software tooling discussions really gloss over the fact that we have growing amounts of legacy code that are never going to be part of this future they're describing. What do we do with all of that code?
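As a rough illustration of that black-box poking (a minimal sketch, not Akita's agent or Docker's tooling; the port, image name, and candidate paths are all assumptions): once a legacy service is containerized and its port is exposed, you can probe it from the outside and record what you observe, without ever reading the legacy code itself.

```python
# Minimal sketch of black-box probing against a legacy service that has
# been containerized and exposed, e.g. via something like:
#   docker run -p 8080:8080 legacy-app   (image name is hypothetical)
# We send requests to the exposed port and record the responses we see.

import requests

BASE = "http://localhost:8080"                        # assumed mapped port
CANDIDATE_PATHS = ["/", "/health", "/api/v1/orders"]  # hypothetical guesses

observed = {}
for path in CANDIDATE_PATHS:
    try:
        resp = requests.get(BASE + path, timeout=2)
        observed[path] = (resp.status_code, resp.headers.get("Content-Type"))
    except requests.RequestException as exc:
        observed[path] = ("unreachable", str(exc))

# Each entry is a piece of recovered knowledge about an opaque system.
for path, result in observed.items():
    print(path, "->", result)
```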