A few thoughts on innovation


I started thinking about a post on innovation after an agency meeting about a possible project. My friend Nigel Scott has been researching the venture capital industry, and his ideas sparked some of the thoughts in this post.

It caused me to reflect again on innovation and the way we think about it.

Innovation rewards hard work?

We are often told that innovators work really hard and strive to achieve their goals. Where Wizards Stay Up Late describes the engineering culture that drove the early Internet: late nights and takeout food, with engineers building new technologies at break-neck speed as they went.

The problem is that for many jobs there is no 9-to-5 any more. When I worked in agencies, 12+ hour days were typical depending on the client load. Yet we weren’t pulling Cannes Lions award-winning work out of our butts.

In China, many companies now work to ‘996’: 9am to 9pm, six days a week, as core hours. This is a basic minimum requirement for engineers. Somewhere like Huawei tries to build a ‘wolf’ mentality: staff are worked much harder still and are expected to retire at 45 – presumably physically and mentally burned out.

Working hard is a hygiene factor; technology has made it that way. Your typical Uber driver is gamed by the driver app into putting in excess of 12 hours a day. Knowledge workers and unskilled workers alike suffer a similar level of time poverty.

Innovation is like buses

Long-suffering public transport users in the UK compare many services to buses. Thanks to road traffic and scheduling, you often wait an overly long time for a bus to arrive. When it eventually does, another two follow very closely behind.

You can see a similar thing with innovation.

We’re used to thinking of John Logie Baird as the inventor of television, and Baird did work very hard on it. The reality is that television was built on a series of inventions from the middle of the 19th century onwards.

At least 20 different inventors have some claim to coming up with the light bulb, but it was Edison who created the first commercially successful one. British schoolchildren are taught about Joseph Swan’s carbon filament bulb, which was let down by the vacuum process used in manufacturing and by poor-quality electricity supplies, so the bulbs didn’t last very long. Swan eventually solved these problems and changed the filament.

It was only at this point that Edison started his research into electric light bulbs.

More recently, I was talking to an agency about a piece of work that didn’t come off in the end. The discussion turned to a drug that had launched very recently. The problem was that although the company was first to market, it wasn’t the only inventor. A large rival had pursued approvals for its own product in markets the original firm hadn’t focused on for its initial approvals. Another two companies were immediately behind them, and likely to drop their prices (and profit margins) to make up for later market entry.

Think about the modern computer with its graphical user interface: it was created by layers and layers of innovation. Doug Engelbart, whilst working at SRI International, demonstrated the following to an audience at the Fall Joint Computer Conference in 1968:

  • Graphical user interface (GUI)
  • Mouse pointing device
  • Text manipulation
  • Collaborative editing
  • Video conferencing (à la Skype)

Xerox PARC (Palo Alto Research Center) refined Engelbart’s concepts further, with a complete modern office environment by 1973. Steve Jobs and his team got in to see it, which drove work on the Lisa and then the Macintosh. Microsoft got in too and eventually came up with Windows; it also learned from building software applications for the Macintosh.

Digital Research developed its own GUI layer called GEM, which was demoed at Comdex in 1984, right about the time Apple launched the Macintosh. Commodore launched the Amiga in 1985 and also added multitasking – the ability to run two or more applications at the same time.

These are just a few examples, kept short for the sake of brevity. But the image of the inventor slaving away in isolation to come up with something uniquely innovative is not rooted in evidence, even though intellectual property law is built on that myth. I don’t want to belittle the work done, but it is as if there is a certain amount of predestination to invention, based on prior innovations.

Innovation happens

This predestination of technological progress is something that Kevin Kelly labelled the Technium. In his book What Technology Wants he posited that technological progress can be slowed, but that nothing short of an apocalypse can stop it completely. Here’s what Kelly said in an interview with Edge.org supporting the launch of What Technology Wants:

The technium is a superorganism of technology. It has its own force that it exerts. That force is part cultural (influenced by and influencing of humans), but it’s also partly non-human, partly indigenous to the physics of technology itself.

We understand the innovation process?

Nigel Scott has done some research on the historic records of venture capital companies. A key finding was that Silicon Valley venture capital firms do a ‘random walk’ down Sand Hill Road, which implies that much of the advice they dispense is survivorship bias or post-rationalisation.

You hear the phrase ‘pivot’, which means changing the business model in search of profitability. Old-time VCs used to talk about investing in people or teams, which goes some way to explaining why research by Boston Consulting Group found that women get less funding than male entrepreneurs.

Venture capitalists have the monetary incentive and the budgets to develop a thorough understanding of innovation, yet they don’t seem to apply it successfully. Which raises the question: how much do we really understand about innovation?

Innovation: did software really eat the world?

Back in 2011, Marc Andreessen wrote an op-ed (opinion piece) in the Wall Street Journal, ‘Why Software Is Eating The World’:

Six decades into the computer revolution, four decades since the invention of the microprocessor, and two decades into the rise of the modern Internet, all of the technology required to transform industries through software finally works and can be widely delivered at global scale.

Over two billion people now use the broadband Internet, up from perhaps 50 million a decade ago, when I was at Netscape, the company I co-founded. In the next 10 years, I expect at least five billion people worldwide to own smartphones, giving every individual with such a phone instant access to the full power of the Internet, every moment of every day.

On the back end, software programming tools and Internet-based services make it easy to launch new global software-powered start-ups in many industries — without the need to invest in new infrastructure and train new employees. In 2000, when my partner Ben Horowitz was CEO of the first cloud computing company, Loudcloud, the cost of a customer running a basic Internet application was approximately $150,000 a month. Running that same application today in Amazon’s cloud costs about $1,500 a month.

As one can see, Andreessen’s title is a bit of a misnomer. Software is only the front end of a technology stack that is transforming the world, and that transformation started before the web and before broadband infrastructure, with the rise of integrated circuits. Machine learning is doing some impressive things, but they are part of a continuum. Machine learning in data mining builds on work done in academia in the 1980s, and it replicates work done in the 1990s on decision support systems and business intelligence software.

Even back in the early 1990s, commercial chemical labs were using software to guide product development. Rather than having to test every combination exhaustively, you input formulations and results; the software would then extrapolate across possible combinations and narrow in on an ideal formulation much more quickly.
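As a rough illustration of that kind of formulation guidance – a minimal sketch, not the software those labs actually ran – the idea is to fit a simple surrogate model to the formulations already tested and let it nominate the next blend to try. The ingredient fractions, measured scores and quadratic model below are all invented for the example.

```python
import numpy as np

# Hypothetical illustration of software-guided formulation work:
# fit a quadratic surrogate model to blends already tested, then
# suggest the untested blend the model predicts will perform best.

# Each row: (fraction of ingredient A, fraction of ingredient B); made-up data.
tested = np.array([
    [0.10, 0.30], [0.20, 0.20], [0.30, 0.40], [0.40, 0.10],
    [0.50, 0.30], [0.25, 0.35], [0.15, 0.15],
])
scores = np.array([0.42, 0.55, 0.61, 0.48, 0.52, 0.63, 0.44])

def features(x):
    """Quadratic basis: 1, a, b, a^2, b^2, a*b."""
    a, b = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(a), a, b, a**2, b**2, a * b])

# Least-squares fit of the surrogate model to the measured results.
coef, *_ = np.linalg.lstsq(features(tested), scores, rcond=None)

# Score every candidate blend on a coarse grid and pick the best prediction.
grid = np.array([[a, b] for a in np.linspace(0.0, 0.6, 25)
                         for b in np.linspace(0.0, 0.6, 25)])
predicted = features(grid) @ coef
best = grid[np.argmax(predicted)]
print(f"Next formulation to try: A={best[0]:.2f}, B={best[1]:.2f}")
```

Real labs would have used far more sophisticated design-of-experiments methods, but the loop is the same: measure, fit, predict, then test the prediction.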


As for machine learning in consumer products, it mirrors the fuzzy logic of the late 1980s. Fuzzy logic came out of a 1965 research paper by Lotfi A. Zadeh at the University of California, Berkeley.

Japanese manufacturers built lifts that optimised for the traffic flow of people, microwaves that set their own defrosting timers, washing machines that customised spin cycles based on the drum load, and televisions that adjusted their brightness to the ambient conditions of the room. (When similar technology was rolled out on early Intel MacBook Pro screens and keyboard backlights, it was billed as game-changing.) Fuzzy logic also removed a lot of blur from camcorder videos. None of these applications is a million miles away from smart homes and consumer technology today, and they improved energy efficiency through precise lighting, heating and cooling.
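To make the fuzzy logic idea concrete, here is a minimal sketch of the sort of controller those appliances used, reduced to a washing machine choosing a spin speed from the drum load. The membership functions, rules and speeds are invented for illustration rather than taken from any manufacturer’s firmware.

```python
# Toy fuzzy-logic controller in the spirit of the late-1980s appliances:
# pick a washing machine spin speed from the drum load. All numbers here
# are made up for the example.

def triangular(x, left, peak, right):
    """Degree of membership (0..1) in a triangular fuzzy set."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def spin_speed(load_kg):
    """Fuzzify the load, apply the rules, then defuzzify to an rpm value."""
    # How 'light', 'medium' and 'heavy' is this load?
    light = triangular(load_kg, -1.0, 0.0, 4.0)
    medium = triangular(load_kg, 2.0, 5.0, 8.0)
    heavy = triangular(load_kg, 6.0, 10.0, 14.0)

    # Rules: light loads spin fast, heavy loads spin slowly.
    rules = [(light, 1400), (medium, 1000), (heavy, 600)]
    total = sum(weight for weight, _ in rules)
    if total == 0:
        return 800  # fallback outside the modelled range
    # Weighted average of the rule outputs (centroid-style defuzzification).
    return sum(weight * rpm for weight, rpm in rules) / total

for load in (1.0, 4.5, 9.0):
    print(f"{load:.1f} kg load -> ~{spin_speed(load):.0f} rpm spin")
```

The appeal in the late 1980s was that a handful of human-readable rules like these could stand in for a precise mathematical model of a messy, noisy input.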

A western analysis of Japanese technology companies usually attributes their ‘defeat’ by Silicon Valley to an apparent lack of software skills. I’d argue that this lacks an understanding of Japanese software capabilities. From gaming to rock-solid RTOS (real-time operating systems), Japanese products met Andreessen’s definition of software; what the Japanese didn’t manage to do was sell enterprise software in the same way as Silicon Valley. That is something to bear in mind given the current glut of machine learning-orientated businesses in Silicon Valley. Does it mean that we won’t get the kind of general AI applications we’ve been promised? No, far from it, though a technological idea often takes several tries before it breaks through.

What becomes apparent is that software making an impact is merely the last stage of a chain of previous innovations. The problem with Andreessen’s model is that it portends what Judy Estrin described as innovation entropy.

Andreessen’s model couldn’t exist without:

  • Packet-switched networks – 1960 (RAND)
  • Unix-type operating systems – mid-1960s (MIT, AT&T Bell Labs, General Electric)
  • C programming language – 1972 (Unix development team)
  • Optical fibre networks – 1965 (Telefunken)
  • Internet router – 1966 (UK National Physical Laboratory)
  • ADSL – 1988 (Bellcore)
  • DOCSIS – 1997 (CableLabs)

So the core technologies that Andreessen’s software relied upon to eat the world were between 15 and 50 years old. It also relied on a massive overinvestment in optical fibre; the dark fibre was laid as part of a telecoms boom that occurred around the same time as the dot-com boom. Software isn’t eating the world, it’s just the cherry on top of the innovation that has gone before. More importantly, software seems to be an end point; it doesn’t seem to extend the base of innovation further.

A second problem is that the phenomenal progress in semiconductors and integrated circuits is slowing down. Part of the problem is that more money is being dumped into disrupting supply and demand in service industries, rather than into funding the start-ups that will power the next wave of underlying innovation that future software will rely on.