Thursday, 19 November 2020

Apple M1 the bigger picture

This will be an attempt at a short but slightly different analysis of the Apple M1 release. I will try not to repeat what has already been said a hundred times.

I see a lot of raving about the M1, and to be fair it seems amazing. On the other hand, for those of us who have been around computing since the '90s, this is nothing new. In fact, we used to get similar news and 50% improvements every year in the mid-to-late '90s. In some areas (e.g. GPUs) development has been fantastic even in recent years. But with CPUs we got constant improvements, just not much for mainstream customers to rave about, until Apple released the M1. So what have they really done?

On a high level they have done something very simple: they broke rules that didn't even exist. At its core (pun intended) there is nothing new; I don't see a single piece of technology invented in the M1. Not even the integration aspect of the M1 is new, it has been done for "ages" in mobile phones. What is new is the design decision to use this highly integrated approach in a traditional computer.

For most mainstream users of computing the machine is simply a black box: you don't change its internals once you've bought it. Yet PC designs have been highly modular, carrying all the costs and issues that come with modular design while reaping few of the benefits. This is more tradition than anything else. One manufacturer creates a dozen or so CPU designs that roughly fit everything from tablets to data centers and supercomputers (with a tweak), and these are then combined with memory, storage and frequently graphics to build the consumer machine.

This was great for the average consumer back when, from one year to the next, you could upgrade to four times the memory capacity, or buy a new CPU and extend the lifetime of the machine by a few more years. But as the improvements hit diminishing returns, fewer and fewer upgraded, and fewer and fewer saw any need to do so. Modularity became a feature the majority of customers never put to real use; they basically sponsored other groups of customers. However, this approach to delivering systems was such a well-oiled machine, and worked so well for the system builders, that nobody questioned its necessity, or its cost...

What cost? Modularity is great! Yes, when you use the flexibility frequently. Being able to tow a trailer behind your car is great when you do it a few times a year; if you never do it you don't need the tow hitch, and if you do it every day a pickup might be better. For a CPU, having external memory modules is great if you actually change memory chips. If not, the cost is reduced memory bandwidth, higher latency and more power used, all for nothing. Some computing problems require more or less memory, but the 16 GB you can get in an M1 has been roughly the standard in home computing for the last five years or so anyway.

There are problems that simply crave 64 GB of memory to be computed efficiently, but if that is your use case then you are well outside the target audience for this chip. There are customers who benefit greatly from upgrading their computers with new graphics cards (or even multiple), or who just need the extra RAM. But the M1 is not a computing revolution; the fundamentals of how computers process problems have not changed. The M1 is a great optimization for a large group of mainstream Apple customers: a well-designed system on a single chip that will be amazing for those it fits, not a silver bullet for all of them. Even as a computer enthusiast I appreciate what Apple has done. The M1 (and its successors) will greatly benefit a huge number of customers, customers who previously didn't really have products designed specifically for them, and there will likely be options for the rest of us later.

A concern with highly integrated products is that the whole package needs to last equally long. Even if development in CPU performance and memory requirements has leveled off, we may soon see a big shift in AI/ML processing; if that happens, the whole device might feel very aged unless an external AI/ML processor can be added via the USB-C connector. In a traditional modular PC there are more options available for such upgrades. In a worst-case scenario, dominant manufacturers will drive development specifically to make older devices obsolete on purpose. A strong second-hand market, and options to upgrade or replace specific components such as CPUs and batteries, would be a reasonable mitigation of this if priced fairly.

If Apple has done their homework (and they usually do) they have just created an integrated version of what was previously available as stand alone components and well known approaches in mobile phone technology. If they've done the maths properly this chip will cater to and a be a big benefit for around 80% of their users, to deal with the remaining 20% they will likely offer another chip that is great for 80% of that group and the remaining users will likely get a different and possibly more traditionally modular and likely very expensive offering.
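To make the tiering above concrete, here is a minimal arithmetic sketch of that hypothetical "80% per tier" split. The percentages are my own speculation from the paragraph above, not Apple's actual figures:

```python
# Hypothetical user-base split: each product tier serves ~80% of the
# customers not covered by the previous tier (illustrative numbers only).
total = 1.0

tier1 = 0.80 * total            # served by the first chip (M1)
tier2 = 0.80 * (total - tier1)  # served by a hypothetical second chip
tier3 = total - tier1 - tier2   # remaining high-end / modular offering

print(f"M1 tier:     {tier1:.0%}")  # 80%
print(f"Second tier: {tier2:.0%}")  # 16%
print(f"Top tier:    {tier3:.0%}")  # 4%
```

So under this assumption, a modular and expensive high-end offering only has to satisfy roughly the last 4% of customers, which is exactly why it can afford to be expensive.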

When Apple released the iPhone, some companies and people laughed at it, and I think most people recognize the same pattern this time. Most likely history will repeat itself: just as Android became a competitor to iOS devices, we will likely see similar developments in PCs. Will it be similarly integrated machines from Microsoft, Ubuntu or Chromebooks? Hard to say, but this change has been long overdue and will likely result in similar devices from other manufacturers, hopefully sooner rather than later, for the benefit of all customers.