Why I Believe Apple's Swift Going Open Source Gives Hope For The Future Of Computing

First, let's look back at the history of some of the technology we use today, and often take for granted, to understand how a good programming language and a good compiler can make the world shake.

Turing Machine

Alan Turing was the first to formalize the Turing machine model, back in 1936. During the Second World War, he went on to design the Bombe, the machine used at Bletchley Park to decrypt Nazi communications encrypted with the Enigma cipher.

Turing machine sketch

It was an abstract machine that read symbols from a tape, processed them according to a table of rules, and wrote the results back to the tape.
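To make that concrete, here is a minimal sketch of a Turing machine in Swift (a toy of my own; the `Rule` type, the blank symbol "_", and the bit-flipping example are all invented for illustration). The machine repeatedly reads the cell under the head, looks up a transition, writes a symbol, moves, and halts when no rule applies:

```swift
// One transition of the machine's rule table.
struct Rule {
    let write: Character
    let move: Int          // -1 = move left, +1 = move right
    let next: String       // next state
}

func run(tape input: String, start: String, rules: [String: [Character: Rule]]) -> String {
    var tape = Array(input)      // the tape, as an array of symbols
    var head = 0                 // read/write head position
    var state = start            // current state
    while let rule = rules[state]?[tape[head]] {           // halt when no rule matches
        tape[head] = rule.write                            // write back to the tape
        head += rule.move                                  // move the head
        if head < 0 { tape.insert("_", at: 0); head = 0 }  // grow the tape leftward
        if head == tape.count { tape.append("_") }         // grow the tape rightward
        state = rule.next                                  // jump to the next state
    }
    return String(tape)
}

// Example: a one-state machine that flips bits until it hits a blank ("_").
let flip: [String: [Character: Rule]] = [
    "scan": ["0": Rule(write: "1", move: 1, next: "scan"),
             "1": Rule(write: "0", move: 1, next: "scan")]
]
print(run(tape: "1011_", start: "scan", rules: flip))   // prints "0100_"
```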

Von Neumann Machine

Later, John von Neumann, who also worked on the Manhattan Project, described his own computer architecture.

Von Neumann machine architecture

The von Neumann architecture is the ancestor of most modern computers: a memory unit holds both programs and data, and a Central Processing Unit (CPU) combines an arithmetic unit with a control unit that knows what to run next.
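As a toy illustration (my own sketch, with a made-up two-word instruction encoding, not any real instruction set), here is the fetch-decode-execute cycle that the control unit runs:

```swift
// A toy von Neumann machine: program and data share one memory, the
// control unit tracks the next instruction with a program counter (pc),
// and the arithmetic unit is a single accumulator register.
var memory = [1, 10, 2, 32, 0, 0]   // LOAD 10, ADD 32, HALT (invented encoding)
var accumulator = 0                 // arithmetic unit's register
var pc = 0                          // control unit: what's next to run

machine: while true {
    let opcode = memory[pc], operand = memory[pc + 1]   // fetch
    pc += 2                                             // advance to next instruction
    switch opcode {                                     // decode and execute
    case 1: accumulator = operand                       // LOAD immediate
    case 2: accumulator += operand                      // ADD immediate
    default: break machine                              // HALT
    }
}
print(accumulator)   // prints 42
```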

The Big Machines

More hardware was added to computers as time went on, such as rotating hard drives, buses, and multiple tape drives.

Since computers were sets of chips and hardware wired together, each piece of hardware had its own instruction set and limitations, and programmers had to configure everything themselves; that is why operating systems were mostly written in machine code. This came at a cost: computing power was expensive, so time-sharing operating systems were invented. They simply let terminals connect to one centralized mainframe and share its computing power through a network, much like what we now call cloud computing.

Birth Of UNIX & C

After working on Multics, a time-sharing operating system built with General Electric and MIT in machine code, Bell Labs engineers Ken Thompson and Dennis Ritchie set out to build their own system, and along the way Ritchie invented the C programming language.

Dennis Ritchie and Ken Thompson

The two engineers built UNIX as a time-sharing operating system written in C, which made the OS modular, easy to debug, and easily portable to other machines.

UNIX is the precursor of Linux, Android, iOS, and Mac OS X, the systems that power your phones and servers today.

Era Of Personal Computing

Born out of military research and expensive to build, computers were at first accessible only to big corporations and governments.

Hobbyist computer clubs started emerging.

In the 1970s, BASIC was one of the most popular programming languages among hobbyists; it was as popular then as JavaScript is today.

In high school, Bill Gates and Paul Allen started Traf-O-Data, a startup that built a computer to process road-traffic monitoring data. MITS, a computer manufacturer, adopted the Intel 8080 for its Altair 8800, which made the processor popular among hobbyists. Gates and Allen implemented BASIC for the 8-bit Intel 8080 so the Altair 8800 could run it, and Gates soon became famous in the hobbyist world. In 1975, MITS officially marketed their implementation as Altair BASIC and began buying services from them, which led the two to found Microsoft and focus on software only.

At the time, Atari was emerging as a leader of the video game machine revolution.

Back in those days, games were built from multiple chips and hard-wired logic. Programming was avoided because each game needed its own audio and video features. Wires and more wires, no code.

Steve Jobs and Steve Wozniak

Steve Jobs knew that Atari wanted to develop Breakout, a single-player take on Pong, while optimizing its use of transistors. Steve Wozniak optimized the design like hell and built the game with as few chips as possible, which made them both good cash.

MOS Technology, a small independent team, introduced the 6502 8-bit processor in 1975; it was the cheapest on the market, the Raspberry Pi or Arduino of its day. To make his name just like Bill Gates, Steve Wozniak wrote a BASIC for the 6502 chip and shipped it with the Apple I to earn the respect of the hobbyist community.

"But I had in the back of my head that I could be a star, that I could get a little fame in the hobby world, like Bill Gates, if I created the first BASIC for the 6502" — Steve Wozniak

Wozniak was learning to make video games in BASIC; frustrated by the hardware's limits, he upgraded it (audio, timers, colors). He then made a video game in BASIC that was configurable.

In other words, he could write a video game and iterate on its design in half an hour, work that would have taken Atari years of hardware engineering. It was a revolution: the Apple II was born, and it made Apple's success in the computer market.

Then personal computers and smartphones happened.

Portability Issue

To open up new possibilities, engineers invent new sensors, processors, and memory technologies every year. Still, making software compatible with new hardware, and getting developers to adopt it, remains a tough challenge in the information technology industry.

Java tried to solve this problem with its Java Virtual Machine, while many interpreted languages, such as PHP, Python, and Ruby, followed suit with Just-In-Time compilers.

With these languages, we were able to build more desirable software with far less time and frustration, but none of them could match the power of low-level, statically compiled languages such as C and C++.

At the macro level, on servers and desktops, performance is just one cost in the equation. But when we program hand-held devices, and now Internet of Things devices with limited memory and CPU power, we have no other option than to get close to the metal.

That is simply because writing a language is a tough job; there is a lot of voodoo work to do: building a compiler, a debugger, an optimizer, and a translator (back-end) for each target platform.

Summoning LLVM

In 2002, Chris Lattner created LLVM, an infrastructure for programming languages: a common framework for anyone who wants to write a compiler, a debugger, an emulator, or even a virtual machine.

As a result, he was awarded software's equivalent of the Nobel prize, the ACM Software System Award, in 2012, the same award previously given to UNIX, TCP/IP, and the World Wide Web.

LLVM has since displaced GCC as the de facto C compiler on several platforms, and it is now used everywhere, from porting Nintendo games and running video games inside Firefox to creating beautiful languages such as Apple's Swift.
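If you have a Swift toolchain installed, you can see this layering for yourself: the Swift front-end lowers your source to LLVM IR, and LLVM's back-ends turn that IR into machine code for each target. A minimal sketch (the exact IR you get varies by compiler version and optimization flags):

```swift
// add.swift: run `swiftc -emit-ir add.swift` to print the LLVM IR that
// the Swift front-end generates before LLVM's back-end emits machine code.
func add(_ a: Int, _ b: Int) -> Int {
    return a + b
}
print(add(40, 2))   // prints 42
```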

It is partly thanks to LLVM that the Xcode toolchain and iOS have been developers' bread and butter all these years.

Low Level Virtual Machine

Chris Lattner joined Apple in 2005 and played a huge role in the making of iOS and the Xcode developer tool-chain. He started working on Swift in 2010, and he is now Senior Director of Xcode and Apple's developer tools.

This week, Apple announced that Swift is open source. I hope we'll be able to easily build programs for embedded systems and Internet of Things devices.
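As a small taste, here is the classic first program; with the newly open-sourced toolchain, the same file should build on Linux as well as on a Mac (a minimal sketch, assuming the `swiftc` compiler from the release is on your path):

```swift
// hello.swift: compile with `swiftc hello.swift -o hello`, then run `./hello`.
// The open-source toolchain makes this work on Linux as well as OS X.
print("Hello from open-source Swift!")
```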

Just as it unfolded for Wozniak and Bill Gates, I hope history will repeat itself.
