
Publication | Legaltech News

Nervous System: The CPU ARM Race

David Kalat

June 7, 2021

Apple’s recent transition from Intel-based processors to a new M1 processor using so-called ARM technology has shaken the microprocessor world. But as this month’s history of cybersecurity explores, ARM technology is not a recent phenomenon.

ARM processors are prevalent throughout our digital world and are the default processors in smartphones. Oddly enough, though, this seemingly cutting-edge technology has its origins in a children’s TV show made in 1982.

The Computer Programme was a British series produced by the British Broadcasting Corporation (BBC). As the home personal-computing revolution got into full swing, the BBC wanted to educate British children about this brave new world. The producers envisioned on-air demonstrations of how computers worked and what they could do.

Had the show been produced in America, its makers might have opted to fake the demonstrations with special effects; it never occurred to the makers of Tron, after all, to actually digitize Jeff Bridges into a video game.

The creators of The Computer Programme, by contrast, went looking for a real computer that could meet the needs of the show. Upon finding that no existing machine fit their specifications, they decided to have one custom built.

Cambridge-based Acorn Computers was selected to design and build the bespoke computer, which was not just used for the show but also sold commercially to the public. The BBC Micro was a state-of-the-art PC for its time, pioneering a number of engineering features that later became industry standards. Its most lasting legacy, though, was the processor design work that its success made possible at Acorn.

The central processing unit, or CPU, is the brain of any computer: it does the work of computing. CPUs come manufactured with a built-in set of operations, also called instructions. At the time the BBC went shopping, the dominant family of CPUs was designed for complex instruction set computing (CISC). As the name implies, these processors had complex instruction sets built into them, which meant that certain complex functions did not need to be coded into the software, because they were already part of the processor. This let programmers write simpler code, although that code placed a heavier demand on the processor executing it.

Reduced instruction set computing (RISC) ran in the opposite direction. These processors had fewer instructions built in, which meant the software had to do more of the heavy lifting, but each instruction was simple enough to execute quickly, and the chip itself could be smaller, cheaper, and less power-hungry.
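
To make the trade-off concrete, here is a minimal sketch in Python of the classic textbook comparison: multiplying two numbers held in memory. The function names, the toy memory model, and the mnemonic "instructions" in the comments are illustrative assumptions only; they do not correspond to the actual instruction sets of any Intel or Acorn chip.

    # Toy illustration of the CISC/RISC trade-off described above.
    # Both routines compute the same result; they differ in how much work
    # each notional "instruction" does.

    memory = {"a": 6, "b": 7, "result": 0}

    def cisc_style_multiply(mem):
        # One complex, memory-to-memory operation: the program is short,
        # but the processor must do all the work within a single instruction.
        mem["result"] = mem["a"] * mem["b"]   # e.g., a single MULT-style operation

    def risc_style_multiply(mem):
        # Several simple steps: load the operands into registers, compute,
        # then store the result. The program is longer, but each step is
        # simple and uniform, so the processor itself can be simpler.
        reg1 = mem["a"]          # LOAD a into register 1
        reg2 = mem["b"]          # LOAD b into register 2
        reg1 = reg1 * reg2       # MULtiply the registers
        mem["result"] = reg1     # STORE register 1 back to memory

    cisc_style_multiply(memory)
    print(memory["result"])      # 42
    risc_style_multiply(memory)
    print(memory["result"])      # 42

The point of the comparison is not the arithmetic but the division of labor: the CISC approach pushes complexity into the chip, while the RISC approach keeps each instruction simple and leaves more of the work to the software.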

While most of the rest of the PC world committed to CISC processors, Acorn’s engineers, emboldened by the success of the BBC Micro, set out to design a RISC chip of their own. Embedding one acronym inside another, the chip was called the Acorn RISC Machine, or ARM. Compared to Intel’s processor chips, the ARM was smaller, less complex, and less demanding of resources, and it performed measurably better. Despite these advantages, Acorn remained an anomaly, and the rest of the computing world stayed in Intel’s CISC camp for decades to come.

Engineers developing smaller, more portable computing devices, such as Apple’s Newton in 1993, were drawn to ARM chips. The biggest breakthrough came in 2007, when ARM processors inside the first iPhone helped propel smartphones into mainstream popularity. ARM processors became the chips of choice for mobile devices, due to their low power consumption and reduced heat profile.

Apple’s decision in 2020 to switch its laptops and desktops from Intel CISC processors to ARM processors brought ARM technology back to the full-size computing world for which it was originally invented. The computer that started it all, the BBC Micro, sold over 1.5 million units through the 1980s. Many of its users were artists, who relied on the Micro to create electronic music (including such bands as Depeche Mode, Yazoo, and Erasure) or to generate early computer graphics and special effects for shows like Doctor Who.

The Computer Programme ran for ten episodes in 1982. It spawned a sequel of sorts, Making the Most of the Micro, which was more or less a televised instruction manual for the computer.

The makers of the original show needed a working computer to use as a demonstration tool on air. The producers commissioned one customized to their needs, the BBC Micro, and in doing so helped set in motion the engineering of an alternative to the CPUs used by IBM and Apple. That unconventional CPU turned out to have special advantages, such as efficient operation and low power consumption, that positioned it well for the needs of the smartphone era, decades after the show was televised.

Find out more at Legaltech News (subscription required).

The views and opinions expressed in this article are those of the author and do not necessarily reflect the opinions, position, or policy of Berkeley Research Group, LLC or its other employees and affiliates.
