And if not, will the move to ARM encourage coders who love macOS to go “serverless” (incl. FaaS) even faster?
I am a real Mac fan. I haven’t always been; I came to be one over time. First for simple things like personal use, later as a productivity machine. And over the last five years or so, I even slowly switched my (mostly prototyping) development efforts from Linux to the Mac – only .NET (Core) and Azure coding remained the prerogative of Windows 10.
Now, Apple is moving the Mac to ARM. This has some exciting aspects: Much better hardware/software integration for native apps, for example. Interchangeable code between iOS, iPadOS, and macOS. Probably much better battery life. And likely an even tighter integration of “Apple-style” productivity – things like Handoff and Sidecar, which I admire.
However, there is also a downside. And it’s a big one. In my team, many people use Intel Macs for software development, targeting our web and cloud platforms. Of course, the world is moving step by step towards “serverless” (at least modern organizations are), but major workloads are still based on VMs and Docker. And VMs and Docker are based on Intel/AMD technology stacks – in many ways.
x86 and x64 platforms provide many virtualization techniques down to the CPU level (especially for Type 2 hypervisors). As such, they pass native hardware capabilities through to their VMs. For Docker, the situation is similar: Docker uses kernel-level isolation to run dedicated containers, but because kernels are hardware-specific, Docker does not provide any hardware abstraction. In other words: if you put ARM chips in your Mac, and there is no magic x64 translation in either the chip (which could theoretically be okay in terms of performance) or in the OS (which will probably never be “fast” at all), every hypervisor and container stack is limited to providing ARM-based operating systems.
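To make that concrete: because containers share the host kernel, it is the architecture of the machine – not Docker – that decides which images can run natively. A minimal Python sketch of that reasoning (the classification function and its labels are my own illustration, not a real Docker API):

```python
import platform

def image_compatibility(host_arch: str) -> str:
    """Roughly classify whether typical x86/x64 cloud images can run
    natively on a host with the given kernel architecture.
    Illustrative sketch only; real tooling inspects image manifests,
    not just the host architecture."""
    if host_arch in ("x86_64", "AMD64"):
        return "native"          # image and host architecture match
    if host_arch in ("arm64", "aarch64"):
        return "emulation-only"  # x86/x64 images need translation
    return "unknown"

# Containers share the host kernel, so this is decided by the machine
# you are on, not by the container runtime:
print(image_compatibility(platform.machine()))
```

On an Intel Mac this prints “native”; on an ARM Mac it would print “emulation-only” – which is exactly the problem.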
How bad is it? Well, pretty bad. If the base of web and cloud technology on the VM/container level (and very likely also in the serverless world) remains x86/x64 when it comes to the metal underneath, the code you intend to build for those platforms…
a) will, at least initially, no longer run on macOS.
b) might run in a sub-optimal way at some point in the near future.
c) will never be a real 1:1 reproduction of your production environment.
That’s really sad. It means that for serious developers, many of the great tools that are (finally) available on the Mac platform (Atom.io, VSCode, Docker, Vagrant, GitHub, etc.) now have at least a question mark attached to them. Of course, you can natively run code (Python, Node.js, Ruby, etc.) on macOS for ARM – no problem. And of course tools like Docker and Parallels (maybe even VMware, although they have been awfully quiet so far) will continue to run dev containers/machines locally with the right execution environments inside (many Linux systems support ARM – the question is on which level).
But those local developer stacks will in fact be different from your production environments (unless you run ARM servers in the cloud) – and it is highly questionable whether the images you are using on ARM can “reproduce” exactly the same results that your DevOps pipeline and your cloud systems yield.
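One way to at least make that drift visible is to compare the architecture your local images are built for with the one production runs on. A hypothetical sketch (the function name and the `amd64` default are my assumptions, not part of any real tool):

```python
def arch_drift(local_image_arch: str, production_arch: str = "amd64") -> bool:
    """Return True if a locally built image targets a different CPU
    architecture than production, i.e. local runs may not reproduce
    production behavior 1:1."""
    return local_image_arch != production_arch

# Hypothetical scenario: an ARM Mac building arm64 images for an
# amd64 cloud fleet.
if arch_drift("arm64"):
    print("Warning: local images are not a 1:1 match of production")
```

A check like this could run in a DevOps pipeline to warn developers when their local stack has silently diverged from the production architecture.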
In the end, it means that the next level of abstraction – again – is imminent. If “code” is the real value, all that OS/kernel-level plumbing (which my teams have to do, because it generates quite a lot of value at the moment) has to become less and less important eventually.
With VMs and Docker, we unified local and server-side development, and for many years, that sounded like we had finally solved that fu**ing “runs on my machine” problem. In fact, VMs and Docker delivered the next level of what the JVM and .NET (Core) tried to do: to provide a universal level of abstraction for developers, regardless of the actual environment.
That was possible, because we went back to the lowest possible level (CPU/OS/Kernel) as common denominator, and made it work “side-by-side” with the fantastic and individual desktop experiences we all enjoy. However, now Apple is changing that denominator – and that makes it at least difficult for Macs to remain part of the “let’s code for the (X86/X64) cloud”-team.
My hope is that we further embrace “code” as the true common denominator. Again, the JVM and .NET (Core) tried to do that, and neither has given up yet. But in the end, it seems to me that “serverless” is the only real path towards a higher level of value generation in the long term, because it promises that you don’t have to build something for the cloud you are using – instead, the cloud provides everything to execute your individual developer stack. But for sure, we are not there yet – as of today, while “serverless” platforms and especially the big FaaS providers take away a lot of complexity from the DevOps teams, they also take away a lot of power and freedom.
Now, what does it mean for the Mac? I honestly believe that quite a few developers will switch to alternatives for at least some time – unless they build for iOS, iPadOS, or macOS, of course.
Cloud developers will move to either Linux (if they build stuff for AWS, Google Cloud, Heroku, etc.) or Windows (if they build stuff for Azure), and it will take some time before they come back – either because the VM/container situation is improving, or because they went serverless.
So, what’s the answer to the question I raised in the beginning? Did Apple just kill the Mac as the go-to machine for cloud developers? And if not, will the move to ARM encourage coders (on macOS) to go “serverless” even faster?
I believe, for some of us, the answer to both questions will be: