Thursday 31 December 2020

Apple M1, future opportunities and challenges

Previous experiences

For many, the introduction of the M1 chips by Apple felt like a revolution, or at least a significant innovation. For some of us it is just a modern implementation of a very old concept; it can be compared to electric cars, which were first built in the 1890s, went out of fashion and have now returned. What can we learn from previous attempts at the same technical approach, what can we guess the next steps for Apple's M1 chips will be, and what opportunities and challenges lie ahead? This article is a stand-alone second part to "Apple M1 the bigger picture".

Historical context

For those of you familiar with the technical architecture of the Amiga or similar computers of its time, feel free to skip this section.

In essence, in the early nineties there were several computers available whose makers, in addition to relying on CPU manufacturers (e.g. Motorola and Intel), developed additional pieces of silicon to complement the main CPU. Commodore did this to a huge extent in the Amiga chipsets, but others, including Apple, did so too. Some of these processors were pretty simple and handled disk IO and other fairly basic tasks for the CPU, things that have since moved onto motherboards or into controller chips inside hard disk drives. A common strategy at the time was to use shared memory between these chips and let them execute fairly high-level subroutines themselves, to avoid bogging down the low-power CPUs of the time with these tasks.

Even floating point calculations weren't integrated into the CPUs of this era; if you wanted that, you had to buy an additional co-processor. Memory expansion was possible, but there was a difference between expansion memory that could only be used by the CPU (called "Fast RAM" on the Amiga) and chipset memory that was shared with the co-processors.

The results were pretty good: despite the limitations of the CPUs, these machines could still run graphical user interfaces nicely and multitask several time-sensitive operations in parallel (e.g. play games, display video and play music in sync). All while using just a small amount of power; cooling CPUs with a heatsink was not common, and active cooling with fans wasn't a thing.

And they all died

Newer and faster CPUs appeared. Despite their improved performance there were still many things they couldn't do as well: multimedia and multitasking on early PCs were poor (to some extent due to software as well, to be fair), UIs were slow and unresponsive, and power use was very high. For many years there were things that huge, expensive, power-hungry PCs couldn't do as well as the old computers they gradually replaced.

Yet the flexibility of added CPU power created the ability to handle new problems. The modular approach, with discrete sound cards and graphics cards, complemented the CPU with the capabilities needed to handle graphics, sound and IO. The new computers had motherboards allowing the expansion of RAM to previously unheard-of amounts.

Some of the most fundamental co-processors became targets for integration into the growing CPUs (e.g. floating point units), and many CPUs of today contain all kinds of multimedia extensions and graphics capabilities.

The pace of innovation was incredible, and the old school of highly integrated chipsets died away; it just wasn't flexible enough to keep up.

What is different now?

The biggest difference is the maturity of the computing industry; not much has really changed in the last few years.

Back in the day, the introduction of a file format like JPEG was a huge challenge to older computers, but today radically different new codecs are few and far between. It will be much easier for Apple to keep up with such changes through tweaks to their CPUs and to which codecs they accelerate via specialized processor features, and most of what can be handled by an Intel/AMD CPU can be handled by the M1 anyway, as they are fairly similar in computing power.
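As a small, slightly tangential illustration of what codec acceleration looks like from the software side, here is a minimal Swift sketch using Apple's VideoToolbox API (macOS 10.13 or later) that asks whether the current machine exposes a hardware decoder for a couple of common codecs; the codec list is just an example, not a statement about what the M1 accelerates:

    import VideoToolbox

    // Codecs to probe; chosen as examples only.
    let codecs: [(String, CMVideoCodecType)] = [
        ("H.264", kCMVideoCodecType_H264),
        ("HEVC", kCMVideoCodecType_HEVC),
    ]

    for (name, codec) in codecs {
        // VTIsHardwareDecodeSupported reports whether the OS exposes a
        // hardware decoder for the given codec on this machine.
        let supported = VTIsHardwareDecodeSupported(codec)
        print("\(name): hardware decode \(supported ? "yes" : "no")")
    }

Applications that use the system frameworks get whatever acceleration the chip provides for free, which is part of why such changes are easier to absorb today than in the JPEG era.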

When it comes to RAM, it used to double every year: what was great when you bought it was a critical problem two years later, so modularity and expansion were key benefits. Virtual memory existed, but swapping memory to rotating disks was a very poor workaround for having more RAM. The consumer PCs of the last decade have all been working pretty well with 8-16GB of RAM, and few applications run orders of magnitude faster from having more. Swapping to SSDs is still slower than RAM, but the slowdown compared to spinning drives is much less noticeable; if this happens a few times a day it's not a big deal.

Computing has shifted from being a desk-only activity to something that is assumed to be portable. As the emphasis on portability has increased, PCs have become less and less modular themselves. Few laptops today offer a generic internal connector for a high-speed co-processing unit; the only ones I can think of are sockets for RAM and M.2 slots, which are typically already occupied. The risk is much lower that a huge shift will be introduced by some amazing M.2 card that everybody buys to radically increase the capabilities of their devices; in the bad old days, upgrading add-on cards was common (e.g. sound, graphics, hard drive controllers).

It boils down to this: when did you last hear of a new type of application that required a computer upgrade? Most consumers can hardly remember this happening; the pace of innovation is still fast but doesn't require you to change your device. This is a huge benefit when optimizing for efficiency and integration.

Opportunities and challenges

Multitasking is something we all take for granted; compared to 30 years ago there is very little benefit in specialized processors for it today. Most computers, even on the low end, have multiple CPU cores, and software is good enough to make up for any lack of parallel processing. The opportunity for integration is smaller, but so is the risk. Possibly some new AI processing devices will change that soon, but any non-modular PC design will be on an equal playing field anyway, and this isn't where Apple's M1 is competing.

When it comes to memory capacity there will be a small challenge in catering with enough flexibility to professional users who might want (or even really need) more RAM. The old "Fast RAM" trick can of course be applied and would probably be an efficient and flexible solution, letting those who wish add another 32-64GB of RAM. The other option is to introduce a CPU model with 32GB of RAM on the package, which would increase both the capacity and likely also the performance; it sounds simple, but it would require space and use more power, so it's not without issues.

When it comes to performance, very close integration between the CPU and its memory has a significant benefit: with a low-latency connection you get even better improvements from adding faster RAM. The M1 uses LPDDR4X today; this could be upgraded in the future to any of several higher-performance options (e.g. HBM or DDR5/6).

From a marketing perspective, when you control the integrated features in the processor as well as the software used on all machines, you have a much faster and easier path to optimizing specific workflows. I'd expect future Apple marketing to boast specifically about acceleration of particular workloads, with numbers at least in the 200%-plus region. These improvements will be impressive from a marketing point of view and offer huge gains for some customers. It will be a challenge over time to keep finding optimizations that matter to enough customers, though, as fewer and fewer things ever feel slow.

There will be a challenge in offering high-power graphics capabilities. For now it seems like Apple is mostly staying out of that, but eventually they might want a piece of the gaming market. Today there is still a general acceptance that you do gaming on a different device (e.g. a console or a gaming PC). There are two risks in this area. If interest in this market expands heavily, it might force Apple back into supporting external graphics cards; not a big deal, but some of the current benefits of avoiding them would diminish. The second, less likely but higher-impact risk is that a new type of killer app emerges that benefits enormously from having a high-power graphics card even in your laptop; then Apple will need to adapt very quickly.

Summary

The opportunity to beat everything else on efficiency for a few years is significant; given what has been released so far, the question for all competitors is how long Apple keeps this exclusive advantage. For a large chunk of the mainstream consumer computer market this looks like a very rational and sensible direction. The near-term challenges are minor and have several well-known solutions. In the long term there is no knowing how fast Apple will react and adapt to shifts in the market, or whether they will even be leading and driving that change.

Apple M1, the modern version of the computer I used to love.

Thursday 19 November 2020

Apple M1 the bigger picture

This is an attempt at a short yet slightly different analysis of the Apple M1 release. I will try not to repeat what's already been said 100 times.

I see a lot of raving about the M1, and to be fair it seems amazing; on the other hand, for those of us who have been around computing since the 90's this is nothing new. In fact, we used to get similar news and 50% improvements every year in the mid/late 90's. In some areas (e.g. GPUs) the development has been fantastic even in later years. But in CPUs, well, we got constant improvements but not much to rave about for mainstream customers, until Apple released the M1. So what have they really done?

On a high level they have done something very simple: they broke rules that didn't even exist. At its core (pun intended) there is nothing new; I don't see a single new piece of technology invented in the M1. Not even the integration aspect of the M1 is new, as it has been done for "ages" in mobile phones. What is new is the design decision to use this highly integrated approach in a traditional computer.

For most mainstream users of computing the machine is simply a black box: you don't change its internals once you've bought it. Yet PC designs have been highly modular, with all the costs and issues that come with modular design but reaping few of the benefits. This is more tradition than anything else: one manufacturer creates a dozen or so CPU designs that roughly fit everything from tablets to data centers and supercomputers (with a tweak), and these are then combined with memory, storage and frequently graphics to build the consumer machine.

This was great for the average consumer back when, from one year to the next, you could upgrade to four times the memory capacity or buy a new CPU and extend the lifetime of the machine by a few more years. But as the improvements reached diminishing returns, fewer and fewer upgraded, and fewer and fewer saw any need to do so. Modularity became a feature not put to real use by the majority of customers; they basically subsidized other groups of customers. However, the approach to delivering systems was such a well-oiled machine, and worked so well for the system builders, that nobody questioned the necessity of it, or the cost...

What cost? Modularity is great! Yes, when you use the flexibility frequently. Being able to tow a trailer with your car is great when you do it a few times a year; if you never do it you don't need the tow hook, and if you do it every day a pickup might be better. For a CPU, having external memory modules is great if you actually change memory modules; if not, the cost is reduced memory bandwidth, higher latency and more power used, all for nothing. Some computing problems require more or less memory, but the 16GB that you can get in an M1 is roughly what has been the standard in home computing for the last 5 years or so anyway.

There are problems that simply crave 64GB of memory to be computed efficiently, but if that is your use case then you are way outside the target audience for this chip. There are customers who benefit greatly from upgrading their computers with new graphics cards (or even multiple ones) and who just need the extra RAM, but the M1 is not a computing revolution; the fundamentals of how computers process problems have not changed. The M1 is a great optimization for a large group of mainstream Apple customers, a well-designed system on a single chip that will be amazing for those it fits, not a silver bullet for all of them. Even as a computer enthusiast I appreciate what Apple has done. The M1 (and its successors) will be a great benefit for a huge number of customers, customers who previously didn't really have products designed specifically for them, and there will likely be options for the rest of us later as well.

A concern with highly integrated products is that the whole package needs to last equally long. So even if development has leveled off in CPU performance and memory requirements, maybe we will soon see a big shift in AI/ML processing; if this happens, the whole device might feel very aged unless an external AI/ML processor can be added via the USB-C connector. In a traditional modular PC there are more options available for such upgrades. In a worst-case scenario, dominant manufacturers will drive specific development to make older devices obsolete on purpose. A strong second-hand market and options to upgrade or replace specific components such as the CPU and battery would be a reasonable mitigation of this, if priced fairly.

If Apple has done their homework (and they usually do), they have just created an integrated version of what was previously available as stand-alone components and well-known approaches from mobile phone technology. If they've done the maths properly, this chip will cater to and be a big benefit for around 80% of their users. To deal with the remaining 20%, they will likely offer another chip that is great for 80% of that group, and the remaining users will likely get a different, possibly more traditionally modular, and likely very expensive offering.

When Apple released the iPhone, some companies and people laughed at it; I think most people recognize the same pattern this time. And most likely history will repeat itself: just as Android became a competitor to iOS devices, we will likely see similar developments in PCs. Will it be similarly integrated machines from Microsoft, Ubuntu or Chromebooks? Hard to say, but this change has been long overdue and will likely result in similar devices from other manufacturers, hopefully sooner rather than later, for the benefit of all customers.