“The whole widget” ~Steve Jobs
…to stay competitive, Apple had to make and control everything: the software, the hardware, the user experience, and the chips that power it all. Jobs referred to this as “the whole widget.”
For Apple, the iPhone represented a chance to start afresh.
“Steve used to say that we make the whole widget,” Joswiak told me. “We’ve been making the whole widget for all of our products, from the iPhone, to the iPads, to the watch. This was the final element to making the whole widget on the Mac.”
WHY THE M1 MATTERS
- Modern computing is changing. Software has become an endpoint for data, working through application programming interfaces (APIs).
- Chips have become so complex that integration and specialization are needed to control power consumption and improve performance.
- Apple’s chip, hardware, and software teams work together to define future systems and integrate them tightly.
- The future of computing is moving beyond textual interfaces: visual and aural interfaces are key.
- Machine learning will define the capabilities of the software in the future.
The M1 uses Apple’s Unified Memory Architecture (UMA): a single pool of memory (DRAM) sits on the same chip as the various components that need to access it, such as the CPU, GPU, image signal processor, and Neural Engine. As a result, all of these components can access data without copying it between one another over interconnects, giving them very low latency and higher bandwidth. The result is much better performance with lower power consumption.
Arete Research’s Richard Kramer pointed out that being the world’s first 5-nanometer chip put the M1 a generation ahead of its x86 rivals. “Apple is producing world-leading specs over x86, and it is doing so at chip prices less than half of the $150-200 paid by PC OEMs, while Apple’s Unified Memory Architecture (UMA) allows it to run with less DRAM and NAND,” Kramer noted.
The news of the M1 focusing on the lower-end machines got some tongues wagging. Still, according to Morgan Stanley research, these three machines together represent 91 percent of trailing-twelve-month Mac shipments.
Apple CEO Tim Cook pointed out that one in two new computers sold by Apple is bought by a first-time Mac buyer. The Mac business grew by nearly 30 percent last quarter, and the Mac is having its best year ever. Apple sold over 5.5 million Macs in 2020 and now has a 7.7 percent share of the market.
Intel and AMD have to talk about gigahertz and power because they are component providers and can only charge more by offering higher specifications. “We are a product company, and we built a beautiful product that has the tight integration of software and silicon,” Srouji boasted. “It’s not about the gigahertz and megahertz, but about what the customers are getting out of it.”
“I believe the Apple model is unique and the best model,” he said. “We’re developing a custom silicon that is perfectly fit for the product and how the software will use it. When we design our chips, which are like three or four years ahead of time, Craig and I are sitting in the same room defining what we want to deliver, and then we work hand in hand.”
In large part because of mobile devices, which are always connected, computers now must start up instantly, letting the user glance at them, interact, and move on. These devices have low latency, and they are efficient. There is a higher emphasis on privacy and data protection. They can’t have fans, run hot, make noise, or run out of power. This expectation is universal, and as a result, software has had to evolve along with it.
Modern software has many entry points. If you look at recent mobile OS changes, you can see the emergence of new approaches such as App Clips and Widgets. They are slowly going to reshape what we think of as an app, and what we expect from one. What they show is that apps are two-way endpoints (application programming interfaces) reacting to data in real time.
You don’t worry about the CPU specs; instead, you think about the job. “Architecturally, how many streams of 4K or 8K video can you process simultaneously while performing certain effects? That is the question video professionals want an answer to. No spec on the chip is going to answer that question for them.”
Modern graphics are no longer about rendering triangles on a chip. Instead, rendering is a complex interplay between various parts of the computer’s innards. Data needs to be shunted between the video decoder, image signal processor, and the render, compute, and rasterization stages, all at rapid speeds. That means a lot of data is moving.
“If it’s a discrete GPU, you’re moving data back and forth across the system bus,” Federighi points out. “And that starts to dominate performance.” This is why you see computers get hot, fans spin up like turbochargers, and a demand for more memory and more powerful chips. The M1 — at least, in theory — uses UMA to eliminate the need to move data back and forth. On top of that, Apple has a new optimized approach to rendering, which renders multiple tiles in parallel and has allowed the company to remove complexity around the video systems.
“Most of the processing once upon a time was done on the CPU,” Srouji said. “Now, there is lots of processing done on the CPU, the graphics and the Neural Engine, and the image signal processor.”
At a human level, all of this means your screen will be on by the time you finish flipping it open. Your computer won’t burn your lap during Zoom calls. And the battery won’t run out in the middle of a call with Mom.