The future of compute could be very exciting
I love to geek out about futuristic things. If you follow me on Twitter, you know I'm always following things like spaceflight and electric cars, and I love being an early adopter of apps and software projects. I get a certain excitement from looking at a new paradigm or product and thinking about all of the things that can now be built because someone took the time to create something new. For almost two years I've been constantly thinking about one thing in particular: how frickin' exciting the next version of "serverless" compute could be.
I'm not talking about whatever new feature of Lambda Amazon is going to introduce, or whatever enhancements to Cloud Run will be announced at the Next conference; I'm talking about an actual evolution of the serverless concept. I have a specific idea of what that next step looks like, and I know others do too. My particular version is based around WebAssembly, but I'm not going to talk about that here; I want to keep it conceptual. It doesn't really matter which implementation wins out. The possibilities are so exciting that I just want to geek out with you.
The name I've been using for this new evolution is 'decomposed computing', but I don't love it, so I'm sure someone will think of a better name. It's the idea that your software, both client and server, can be shipped together, and the network will just figure out how to run it. 'The network' in this scenario is everything from your central cloud in us-east-1 to the hardware actually doing work for an end purpose (and whatever is in between). Wouldn't it be incredibly cool if you could push to your main branch and have every component of your product upgraded within milliseconds, all around the globe? That's what gets me excited.
So how would we get there? Well, that's where decomposition comes in. If you think about how we build software today, there's data being stored somewhere, there's compute doing operations on that data, and there's some consumer at the other end providing and/or using that data. Whether it's a mobile app running a social network application, a tiny IoT device in the basement of some factory, or a car loading up directions to your destination, it's some variation of that pattern. There are various architecture paradigms, networking protocols, and data formats involved, but at the end of the day if you draw it on a napkin it'll have a similar shape. So how can we build something to operate across that entire stack?
I said I wasn't going to talk about WebAssembly, so let's just say we'd need a portable execution format of some kind. We'd need the ability to execute your business logic in a cloud provider's data center, on an edge network, on a mobile device, on a teeny tiny moisture sensor, in a tractor's onboard computer, on a....
You get the point. If we can execute your code anywhere without shipping a dozen different versions of it, things become a hell of a lot easier. Using this portable execution format, you start compiling your business logic. You write modules for CRUD, for transforming data, for ML inference. You write more modules to display things in an app, to do search, to make recommendations... and then you push.
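To make that concrete, here's a toy sketch of what authoring those modules might feel like. Nothing here is a real API; every name is invented. The key idea is just that each module is a portable unit of business logic that declares what it needs from its host, rather than where it should run:

```python
# Hypothetical sketch: business logic as small portable modules that
# declare what they need, so the network can later decide placement.
# All module names and capability names below are made up.
from dataclasses import dataclass, field

@dataclass
class Module:
    name: str
    # Capabilities the module needs from wherever it ends up running:
    # data stores, sensors, accelerators, a screen, and so on.
    needs: list = field(default_factory=list)

# One portable format, very different workloads:
modules = [
    Module("users-crud",      needs=["primary-db"]),
    Module("thumbnailer",     needs=["object-store"]),
    Module("recommender",     needs=["ml-accelerator", "feature-store"]),
    Module("sensor-reporter", needs=["moisture-sensor"]),
    Module("app-ui",          needs=["display"]),
]
```

Notice that nothing in the declaration says "cloud" or "device" — the modules describe requirements, and placement stays someone else's problem.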
What happens next is why I think this is the next evolution. As soon as that code hits your GitHub account, things are put into motion. This isn't the normal CI/CD pipeline you have today, but instead an intelligent analysis of what you've built and a determination of where it needs to run. This would be a system that pushes your backend modules into GCP, sends your edge modules to your CDN, and notifies devices and apps that an update is available, and everything starts working in tandem. As the network updates, the new version naturally rolls out. No canaries or blue/green needed, just a network absorbing, orchestrating, and executing the latest and greatest version of the software you've spent your long hours building.
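The "intelligent analysis" step could be arbitrarily sophisticated, but the core decision can be sketched in a few lines. This is a deliberately naive toy, with invented capability names and placement rules: map each module's declared requirements onto the tier of the network that can actually satisfy them, and let pure compute run wherever is closest to the user:

```python
# Toy placement logic (all capability names and rules invented):
# decide which tier of the network can satisfy a module's needs.
DEVICE_ONLY = {"moisture-sensor", "display"}
CLOUD_ONLY  = {"primary-db", "ml-accelerator", "feature-store"}
EDGE_OK     = {"object-store"}

def place(needs):
    """Return the tier a module must run on, given its declared needs."""
    needs = set(needs)
    if needs & DEVICE_ONLY:
        return "device"   # it touches hardware only a device has
    if needs & CLOUD_ONLY:
        return "cloud"    # it needs centralized state or accelerators
    if needs & EDGE_OK:
        return "edge"     # replicated storage is fine at the edge
    return "edge"         # pure compute: run close to the consumer

print(place(["primary-db"]))    # -> cloud
print(place(["object-store"]))  # -> edge
print(place(["display"]))       # -> device
```

A real system would weigh latency, data gravity, cost, and trust boundaries, but the shape is the same: the push triggers analysis, the analysis yields placements, and the network does the rest.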
Your app downloads the new modules. Hundreds of machines at edge locations around the globe do the same. The final few modules slide into place and the whole thing snaps together like the most satisfying Lego model ever built, while the data just keeps flowing as if nothing even happened. Your jobs keep running, your app keeps making requests, your moisture sensor keeps reporting values.
The important part is that the network will be deciding where everything runs based on what each module needs to access, so you as the developer or operator won't need to. You won't need to care how your compute is being scheduled because the network will take care of it. That's what makes decomposition so powerful. Cloud, edge, and devices becoming one cooperative network.
Until then, wear a mask!