The New Age Of Software-Defined

Sam Lin
7 min read · May 29, 2021

Before Internet Explorer won the 1st browser war, Netscape was the king of the 90s. For those too young to know, it was the 1st commercially successful browser, and it changed the way we access the World Wide Web forever. Marc Andreessen not only co-founded it, but also has plenty of first-hand success in tech. So when Mr. Andreessen wrote "Why Software Is Eating the World" in 2011, everybody listened. That eating is the result of being SoftWare-Defined(SWD).

Since then, many have reframed the quote around other trending technologies such as mobile, Big Data, AI, etc. But generally speaking, we can still take the quote as-is. Allow me to explain why the trend will continue into the foreseeable future, what I mean by software-defined, and what the 3 big waves are: programmable, democratization & transforming 🌊.

Wikipedia.org: Browser_wars 1996–2009

Software-Defined

I prefer to generalize the concept as SoftWare-Defined(SWD). tl;dr: it's just defining, aka implementing, a new feature in SW instead of in a piece of new hardware. Nothing new here, really. For example, you must know the iconic case: the iPhone in 2007. Just listen to how Jobs introduced the iPhone as an iPod + phone + Internet communicator & why the new SW keyboard would be superior to all HW keyboards.

iPhone 1 — Steve Jobs MacWorld keynote in 2007

But wait, it wasn't even the 1st successful application of SWD. As Jobs said: "we’ve solved it with a computer 20 years ago. We solved it with a big Mac screen, that can display anything we want, put any user interface up. And a pointing device, we solved it with a mouse…". It won't be the last one either. In the simplest terms, it all boils down to the flexibility of SW, aka being programmable.

Don't get me wrong, I do like good design & tailor-made HW. The only problem is that they are fixed & will be "outdated" as soon as they ship. Design is all about trading off intelligently. I just worry that car makers are leaning too much on the HW aspect they are used to, at the expense of SW investment & app complexity. Luckily, there are mavericks like Tesla, so consumers have a different choice. No, it's not about following in Apple's footsteps; it's more that histories end up taking similar paths. Yep, I know it's hard to spot the next star based on today's success models. But no worries, it'll become obvious in the future when we look back 🤫.

www.tesla.com/models

The 1st Wave: Programmable

Ada Lovelace was the 1st computer programmer, even before a computer was built. In her Note G, published in 1843, she created an algorithm to compute Bernoulli numbers on the Analytical Engine, a general-purpose computer proposed by Charles Babbage. Thanks to her great vision & contribution, new doors & windows have been opening up for general-purpose computers ever since.

Bleeding-edge innovation has always been a messy business. It was even more so back then, when computation had to take many different forms because of primitive tech. Luckily, visionary pioneers still improvised to push forward. Ada used her brilliant mind & imagination even when pen & paper were the only weapons at her disposal. Later, Alan Turing leveraged electromechanics.

  1. In 1940, Turing & his team created the 1st Bombe, an electromechanical machine, which broke the German Enigma cipher & changed the course of WWII.
  2. Finally, in 1945, came the Electronic Numerical Integrator and Computer(ENIAC), the 1st programmable, general-purpose, digital computer: a Turing-complete machine that could solve many numerical problems through reprogramming.

From then on, we haven't needed to build a new machine for every new problem; we just write a new program for it. So, SW started eating the world.
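As a small tribute to Note G: computing Bernoulli numbers, which once required imagining a room-sized Analytical Engine, now takes a few lines of Python. This is a minimal sketch using the standard textbook recurrence, not Ada's exact tabulated procedure.

```python
# A minimal sketch: Bernoulli numbers via the standard recurrence
#   sum_{k=0}^{m} C(m+1, k) * B_k = 0  for m >= 1, with B_0 = 1.
# (Not Ada's exact Note G procedure, just the textbook recurrence.)
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))  # solve the recurrence for B_m
    return B

print(bernoulli(8))
# [Fraction(1, 1), Fraction(-1, 2), Fraction(1, 6), Fraction(0, 1), ...]
```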

www.lookfar.com/blog/2017/10/14/ada-lovelace-awards-backstory

The 2nd Wave: Democratization

Bear with me as I go against common wisdom: I prefer to focus on democratizing computing power instead of specific market sections, because that is what changes more lives. In this frame, the only test is whether a device is "programmable" by any 3rd party, most of the time including users. If yes, I typically call the device "smart". The key reason I love this lens is that it is about democratizing computation power to whoever wants to solve whatever problems they care about. So the Personal Computer, smartphone, tablet, etc. are all in this game together. And yes, it all started with a PC.

In 1971, Intel released the 1st commercially produced microprocessor: the Intel 4004. A complete general-purpose CPU on a single chip marked the dawn of the personal computing revolution. Soon after, in 1974, MITS introduced the Altair 8800, the 1st "true PC", powered by the Intel 8080 CPU. Its 1st programming language was Altair BASIC, which was Microsoft's first product. Today, MS is a $1.9T company & nothing basic at all.

A fun fact: BASIC is the 1st programming language I learned & taught. Thank you, M$ 😉. In 2021, there are 5.3B mobile phone users & 4.7B Internet users according to DataReportal. It took just 50 years. What a fast & furious ride. I'm feeling lucky, aren't you 😉?

www.asymco.com/2018/05/07/just-in-time/

The 3rd Wave: Transforming

It's very difficult to get this right, or even close to what the 3rd wave will be like, because it's very early. Even my wildest dreams are limited by what I know & what I know I don't know. But there's no harm in a thought experiment, right?

In 2018, I took the time to learn a few new tricks in Machine Learning & realized the transformation of computing may be at an inflection point: the programming model became more data-driven than algorithm-driven, and probability is the new math for answers rather than deterministic equations.
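To make the contrast concrete, here is a minimal sketch in plain Python with NumPy (my choice for illustration): the classical model encodes a rule as a deterministic equation, while the data-driven model recovers approximately the same rule from nothing but noisy examples.

```python
# A minimal sketch of the two programming models, using NumPy.
import numpy as np

def f_to_c_classical(f):
    # Algorithm-driven: the programmer writes the deterministic rule.
    return (f - 32.0) * 5.0 / 9.0

# Data-driven: only noisy (fahrenheit, celsius) examples are available...
rng = np.random.default_rng(0)
f_obs = rng.uniform(-40.0, 120.0, size=200)
c_obs = f_to_c_classical(f_obs) + rng.normal(0.0, 0.5, size=200)

# ...so fit a linear model c ≈ w*f + b by least squares.
A = np.stack([f_obs, np.ones_like(f_obs)], axis=1)
(w, b), *_ = np.linalg.lstsq(A, c_obs, rcond=None)

print(f_to_c_classical(100.0))  # exact: 37.77...
print(w * 100.0 + b)            # learned: approximately 37.8
```

The learned rule comes with uncertainty & is only as good as the data, but the very same fitting code works for any mapping you can collect examples of, which is exactly what opens up the new application space.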

To be clear, it's not going to fully replace the classical programming models. Rather, it opens up a huge application space that used to be difficult or impossible. So what could be a better computing architecture to power this?

Let's focus on consumer computing, where my passion is. In fact, we are no strangers to computing architecture: CPU, GPU + specialized accelerators, with everything else on top defined by SW. Maybe the Apple M1 is the "Intel 4004" of this phase change. I guess so because of 2 key hypotheses.

1. Complementary Computing Power Through Parallelism

Most ML programming will continue to be based on high-level languages, such as Python, & frameworks. Today, they can be accelerated by many different specialized chips. But in the longer run, many accelerators will be generalized & consolidated into fewer types of generic programmable processors, so that a new type of acceleration can easily be added/programmed whenever needed. That way, an ecosystem may prosper because "the innovation stack" is defined by SW from different players. The GPU happens to be a good candidate for ML & for many computations that would take a CPU a long time, as the sketch below shows.

Mythbusters Demo GPU versus CPU
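As a sketch of what that consolidation could look like from the programmer's side (assuming the JAX framework, my choice for illustration): the same few lines of high-level Python are compiled for whichever backend happens to be present, so swapping in a GPU or another accelerator needs no change to the program itself.

```python
# A minimal sketch (assuming JAX is installed): the same high-level
# Python code runs unchanged on CPU, GPU, or TPU, because the layer
# between the program & the chip is defined by SW.
import jax
import jax.numpy as jnp

@jax.jit  # compile for whichever accelerator backend is available
def predict(weights, inputs):
    # A tiny dense layer: one matrix multiply + a nonlinearity.
    return jnp.tanh(inputs @ weights)

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
weights = jax.random.normal(k1, (128, 64))
inputs = jax.random.normal(k2, (32, 128))

print(predict(weights, inputs).shape)  # (32, 64)
print(jax.devices())  # e.g. [CpuDevice(id=0)] or a GPU device list
```

Whether the backend is a CPU, a GPU, or something more exotic, the program does not change; the acceleration is defined by SW.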

2. The Economies Of Scale

Of course, there are other strong programmable candidates for accelerating ML. For example, the Field-Programmable Gate Array(FPGA) can be a strong candidate, with advantages in low power & high performance. But in the mass consumer computing market, size does matter. The GPU is already a must for most computing devices with a GUI, which puts it in a better position on economies of scale, with a lower marginal cost & better reusability of SW investments.

omdia.tech.informa.com: 28nm, a long lived node for IC applications in the next 5 years.

Embrace Software-Defined 🦾

Copy when you can. Invent when you must.
- Jim McKelvey, The Innovation Stack

To be clear, there are always niches for specialized accelerators & controllers. But if you are in a "middle-class" market & want to be "Smarter" on a new Digital Transformation journey, you'd better architect your technology stack properly to maximize the leverage, e.g. taking a free ride from an adjacent market. Knowing what to steal and when to invest is the art one should master.

Remember, SW is a talent-intensive industry. Even if you can afford the one-off investment to get your product out the door, the true competitive advantage will be your capability to keep up with ever-changing new demands & competition. Psst! Smarter car players: "The soul of a smarter car will be a software product… rather than a hardware part".

Full Disclosure

The opinions stated here are my own, not those of my company. They are mostly extrapolations from public information. I don't have insider knowledge of those companies, nor am I an expert of any kind.

www.worldsciencefestival.com/infographics/a_history_of_computer_science


Sam Lin

A Taiwanese living in Silicon Valley since 2014, with my own random opinions to share. And they are my own, not those of the companies I work for.