25 technologies that have come to prominence during the past quarter century and have changed the world. CNET came up with their own list. That inspired me to take a run at it and make my own list of 25 technologies.
SMS and instant messaging
IoT (internet of things)
DOCSIS & DSL
VPN (virtual private network)
SaaS (Software as a Service)
VoIP (voice (and video) over IP)
Global navigation satellite systems
OSS (open source software)
XML (eXtensible Markup Language)
MPEG (Moving Picture Experts Group)
NFC (near field communications)
2FA (2 factor authentication)
Looking over the list now, I can see that my choices skew towards the more foundational technologies required to make the modern technology environment work. I have also taken a more sceptical view in picking my 25 technologies.
Bitcoin and blockchain didn’t make the cut. Most of the applications that companies like IBM look at call for a ‘private blockchain’, which negates the distributed-ledger benefit, and a blockchain can’t handle transactions in the same volume, or as quickly, as an Oracle database. Digital currency may be a thing, and central banks have been actively thinking about it, but I am less convinced by cryptocurrencies. They are also exceptionally energy inefficient, which matters in a world trying to move towards a low-carbon economy.
With quantum computing it is just too early to tell. The technology is probably only where the digital computer was back in the 1940s. IBM have form in backing alternative forms of computing that haven’t panned out, such as Josephson junctions, optical computing and gallium arsenide based computing, none of which have made it into mainstream computing.
Back in the 1980s progress was made at a furious rate on superconducting materials, with the promise of room-temperature superconductors at some point in the future. Although the research gave us some novel materials, it has mostly made a difference in the sensors of heavy, hospital-based medical equipment. Hence my hesitation to get excited about technologies still in their relative infancy.
Common items in the list of 25 technologies
Wi-Fi – like many technologies, Wi-Fi didn’t suddenly spring forth from the ether. It was the child of several developments over three decades. The name itself came about in 1999, created by branding agency Interbrand. It meant nothing in and of itself except as a pun on ‘hi-fi’. The name and logo were important at the time as they were signs of compatibility: a laptop with Wi-Fi could log on and use a network, given the right security details.

This changed IT and buildings dramatically. Before Wi-Fi, you needed an ethernet cable, a modem, modem cables and socket adaptors. And you’d still need your laptop power brick, as battery life was a lot poorer back then. Wi-Fi was easy to install and changed spaces at home, at work and in between. If you had internet at home before broadband, you were tethered to the telephone port, or the modem was. The internet was used in a fixed space. At work you were tied to your desk, and forget about working in a coffee shop if you needed to log on. Even the term ‘log on’ implies a time when going on the internet was an active thing to do. Wi-Fi redefined all that: you could work wherever you wanted to in the house and connect whatever devices you wanted. I still use ethernet at home for my computer and Apple TV, but I don’t have to. My laptop switches over to the Wi-Fi network when I move away from my desk.

Wi-Fi was also critically important for smartphones. Mobile networks are patchy, even more so indoors, but with smartphones came the ability to route cellular calls over Wi-Fi. This was first of use to BlackBerry users and is now an option that can be turned on in most modern smartphones. Logging on no longer had to be an active state; we became always on, all the time. Along the way Wi-Fi had to see off competition from a European standard called HiperLAN2. I worked on promoting Ericsson’s home hubs for that standard: lovely product design, but it was going nowhere.
Bluetooth – while the origins of Bluetooth owe a lot to a couple of Ericsson engineers in the late 1980s, much of what we now think of as Bluetooth is down to a partnership between Ericsson and IBM in 1997. They looked to incorporate a short-range wireless connection between a laptop and a cellular phone, with the phone then used as a modem for basic email. At the time, the other options were a cable, or IrDA, an infra-red connection. IrDA was supposed to support a one-metre point-to-point connection; I found in practice that you had to be within a third of that distance most of the time. This limitation at least made it secure. Bluetooth eventually made it into phones, laptops and headsets during the dot-com boom. A key driver in this was the more compact nature of lithium-ion batteries. People found it disconcerting that someone would be next to them apparently talking to no one, so Bluetooth headsets didn’t really take off until people stopped using voice on phones so much. I was fortunate to go to the US on a business trip in 2006 and picked up a Jawbone headset. It was a major improvement in noise reduction and call quality, but I only ever used it for Skype calls at home as I didn’t want to look like a douchebag boiler-room sales professional.
What’s amazing now is the sheer ubiquity of Bluetooth: industrial computing networks, medical technology, consumer electronics, gaming and electronic fences.
Social networking – social networking as a concept has existed for as long as consumers have gone online. There was the bulletin board culture, forums and services that helped you build your own sites. Chat rooms served much the same role that Twitter hashtags do. Ten years ago, social networking was a place of interesting experiments: localised solutions for different markets, with Japan and Korea way out in front doing mobile social. Mass adoption changed things. Now social is ingrained in the fabric of society, like a bunion. What we didn’t get was the digital utopian dream of a harmonious global village, but the same grubby aspects of society accelerated through a digital domain. The truth is no longer a universal concept.
Apps – apps, or more accurately an online app store and signed apps, have changed computing. The app store first appeared in the early 1990s on NeXT computers. It was designed to manage intellectual property rights on digital media and software, and it built on a class of Unix-like system tools called package managers. Palmix was an Indian web-based app store aimed at PDA users. NTT DoCoMo launched i-mode, an online integrated app store for mobile phones, in 1999. Vodafone, KPN and Nokia followed with stores soon after. Handango released the first on-device store similar to the Apple App Store experience of today.
Apple quickly realised that the app store was a winner and put it front and centre of its marketing.
RFID – it was originally used to track boxes in a warehouse and containers in a port. Technology brought the cost down so that it could track most items in a shop, or books in a library. Security guards walking a beat could tap and go at checkpoints, and so could credit card payments. Pets could be returned to their owners thanks to an RFID pellet injected below the skin. In a secure lab that I worked in, it took a certain knack to swipe your card through the magnetic stripe reader and open the door; with RFID, it would be tap and go. In a post-9/11 world RFID tags went into every passport, changing the immigration experience of air travel forever.
On a more prosaic level it sparked off several stored-value transport cards, including the Oyster card in London and the Octopus card in Hong Kong. On average, they still get you through the turnstile faster than a phone app and NFC.
The rest of the 25 technologies
SMS and instant messaging – the UK and US developed in very different ways during the late 1990s and into the 2000s, thanks to the EU spending so much research and development money on getting second-generation networks up and running. Meanwhile in the US there was a plethora of cellular network standards, and mobile roaming was a nightmare. Instead the US established an internet culture earlier: free local calling made dial-up internet popular. This meant that Americans developed an instant messaging culture, whilst Europe saw a similar surge around SMS messaging. Both provided training wheels for the adoption of our current mobile messaging culture. SMS is still used as a lingua franca for smartphone messaging, by everyone from Amazon to airlines. Like email, tales of its demise are premature.
Mobile broadband – GSM, or 2G, democratised mobile phone usage, but it was limited by data bandwidth and latency. Whilst it was rated as similar to a dial-up modem, it often felt way slower. It was only 2.5G (EDGE or EGPRS), 3G and 4G that made possible what we now take for granted as a mobile experience. With 2G, getting anything done took a real effort. Downloading text emails was painfully slow, and data was expensive. Mobile connections were worthwhile for specialist applications, like news and sports photographers needing to get images as fast as possible for sale to picture desks. 3G promised video calls and TikTok-esque sports highlights. The reality was passable email and access to maps. It was around about this time that I no longer carried an A-to-Z atlas of London with me everywhere. You couldn’t have Instagram, WhatsApp, Google Maps, Siri or the weather without mobile broadband. These networks not only empowered services like downloads, streaming and video, but changed our relationship with the internet. Our relationship with bandwidth, and real-terms price drops, were as responsible for our always-on life as WhatsApp and Skype.
Voice recognition – as with many of our 25 technologies, voice recognition as we understand it now started with work done at Bell Labs. Back in the early 1950s, they managed to train a system to recognise a single voice dictating digits. From there voice recognition evolved in fits and starts, with innovation predominantly driven by the telephone companies and the defence industry. 1990 was a pivotal year: Dragon Dictate, a personal computer based system, was launched, and AT&T deployed its Voice Recognition Call Processing service, which allowed calls to be routed without the involvement of a receptionist. This kind of system is now usually the first line of a call centre experience, or is used in phone banking to validate online banking payments.
It has become more important as smartphone interfaces have hidden the number pad on calls. Voice is also an area that phone interfaces and home devices have tried to tap into, and for many people they have worked reasonably well. I have personally found the results more inconsistent. My Ericsson T39 from 2001 was able to recognise ‘Call <insert name>’ consistently, associating the name with a person in my speed-dial list. That is something Siri struggles to do now. Siri manages to play me the headlines from the BBC, and Google doesn’t seem to understand me at all.
The benefits of speech recognition move forward in fits and starts. The UK may prove trickier due to the relative volume of accents compared to the size of the population. And then you have people like me, with an accent that has changed over time as I have moved around, unconsciously adapting to my environment and losing some of the edges of my North of England and Irish upbringing.
Search – like most people who have been using the internet since the mid-1990s, my experience divides not into before and after Facebook, but before and after Google. Originally the web was so small that the original search engines worked remarkably well. I remember using them as part of my research process during my degree. As the internet grew, the original search engines like HotBot, AltaVista and Excite struggled to keep up. Onto the scene came Google.
Google changed the way that we found things on the web. Concepts like web rings and directories are now ancient history. Our relationship with the web was mediated through its search box, and it became our gateway to the web. Search also changed our relationship with our devices. It inspired journaled indexes of computer drives, as consumers expected to find items on their computer with the same ease as on the web. Search is now the primary way that I navigate my Mac and my iPhone. It is a design metaphor that will be with us for a long time.
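What set Google apart was PageRank, which treated a link from one page to another as a vote of confidence and ranked pages accordingly. As an illustration only (this is a textbook power-iteration sketch, not Google’s production code, and the `web` link graph and 0.85 damping factor are the standard worked-example assumptions):

```python
# Minimal PageRank power-iteration sketch.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: spread rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                # share rank among linked-to pages
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A hypothetical three-page site: "home" receives the most links.
web = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(web)
```

The pages most linked to by other well-linked pages float to the top, which is why gaming inbound links became an industry in itself.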
VoIP – voice over Internet Protocol (VoIP) was first used in the early 1970s to pipe instructions into a flight simulator over the ARPANET. It really found its feet in 1991 with the first software programme allowing VoIP communications. The following year Communique was released, an early analogue of Zoom. As the commercial internet rolled out in the US, Israeli firm VocalTec released its ‘Internet Phone’ application, and soon after the ITU started looking at VoIP standards. The rise of the internet led to alternative telcos that routed voice minutes over data networks – a mix of old and new telecoms.
I started my agency career working on one such alternative telco, which used technology from Israeli VoIP start-up deltathree. At this time the price of voice calls declined precipitously, particularly for international calling, at the expense of quality. The industry attracted numerous spivs. The SIP standard was developed as an analogue to SS7 signalling for voice and video calls.
3G phones and a modicum of good interface design drove VoIP calls over services like Skype and Vonage. These were displaced in popularity by a new generation of mobile-first services like Viber, WhatsApp and FaceTime. Zoom built on this base for its conference call platform. In the meantime, telecoms providers have tried to reinvent themselves, some with more success than others.
Global navigation satellite systems – The US highlighted the impact of GPS during the first Gulf War.
After the Gulf War, non-defence usage came into focus: telematics and navigation. GPS also provided timing to a diverse range of technologies, from mobile networks to ATMs. In the early 2000s, PDA manufacturers like Fujitsu managed to integrate GPS modules into their PDA (personal digital assistant) devices. Nokia’s N95 smartphone was the first popular device with a built-in GPS receiver, and this spurred the adoption of maps on a smartphone.
Now the use cases are limitless, as smartphone apps can tap into location data when a person is outside a building. The next step is accurate indoor positioning – albeit no longer relying on satellite signals.
OSS – Open Source Software (OSS) is pervasive in the modern day. This blog runs on OSS (Linux, Apache, MySQL and PHP). The Mac that I write this post on is based on OSS (Darwin, the Mach microkernel, FreeBSD). The web browser is based on a branch of KDE Konqueror’s engine called WebKit, and that’s the same with the iPhone and iPad as well. If you’re using an Android phone, it’s based on Linux. Even smart home light bulbs run Linux.
The rise of OSS went hand-in-hand with the web. Widespread adoption started in server software that worked with open standards. Pretty soon you saw attempts to put it elsewhere: desktop Linux, including netbooks – lightweight, low-power laptops ideal for checking your email or surfing the web. At the same time Apple had transitioned from the ‘Classic’ Mac OS to something based on NeXTSTEP, acquired with NeXT Computer. Motorola and other manufacturers put Linux into mobile phones, forerunners of the modern smartphone. From there it went into the Sony PlayStation 3 console. As globalisation drove electronics manufacturing to China, manufacturers of all kinds of gadgets saw the benefits of Linux – even if they didn’t honour the law and spirit of open source, cough, cough Huawei…
Email – despite Facebook owning all our data, email is the key identifier: the identifier with which you log into your Amazon account, log on to Netflix and access countless other services. Despite email supposedly being dead, with countless other services layered on top to replace it, it’s still very much alive. My own email account has selected correspondence that goes back to 2001.
Email marketing statistics show declining engagement, yet it’s still a very effective medium. Just look at businesses like ASOS.
Our relationship with email changed. When I left college, I signed up for an online account with Yahoo! so I could keep in touch with friends and apply for jobs. The email address went on my CV, and I went to a cyber cafe in Liverpool with a disc full of email messages to send every Saturday. I usually had coffee and carrot cake with a friend whilst I sent them. We’d then go into the shopping district of central Liverpool to chat and do some window shopping.
Working in an office, I could check my personal email at lunch time. Home broadband meant that I could check my account at home. Move forward ten years and email is in the palm of our hands, everywhere we go. I managed to get email to work on a Nokia 6600. You can see a surge in Gmail accounts that coincides with the rise in popularity of smartphones.
MPEG – the Moving Picture Experts Group is responsible for pretty much every form of audio and video format that we use today. Whilst the technology might come from a multitude of sources, the standards MPEG sets are invaluable. Whether it’s digital radio, online radio, digital physical media like Blu-ray and DVD, or streaming media, MPEG has had an outsized influence. It also relates directly to voice and video communications codecs, hence its place in the 25 technologies. If you’ve done a FaceTime call, listened to Spotify or watched a movie, you can thank MPEG.
NFC – near-field communication offers a way of using devices for authentication. It has really come into its own in smartphones, where it serves contactless digital wallets, access passes and digital car keys. Admittedly mobile wallets have a poor experience, and it’s frightening to think that you might not be able to get into your car because someone couldn’t be bothered to maintain the Android or iOS app. Yet whether we like it or not, NFC has become part of our tech ecosystem. I would have preferred not to put it into this list of 25 technologies, but I had to acknowledge its impact.
2FA – over the past ten years, two-factor authentication (2FA) has gone from being an enterprise-level security tool to consumer-grade security. The traditional RSA dongle with its constantly changing number codes was a status symbol of the corporate road warrior, alongside Tumi luggage and a BlackBerry. Now we get those numbers via a smartphone app or by SMS. This has happened as online identity theft and data breaches have become commonplace and massive databases of passwords have been cracked.
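Those constantly changing codes come from the time-based one-time password (TOTP) algorithm standardised in RFC 6238, which most authenticator apps implement. A minimal sketch in Python using only the standard library (the Base32 secret below is the RFC’s published test key, not a real credential):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 time-based one-time password (SHA-1 variant,
    as used by most authenticator apps)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Count 30-second windows since the Unix epoch.
    counter = int((at if at is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 test secret: the ASCII bytes "12345678901234567890" in Base32.
code = totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ")
```

Because both the server and the app derive the same code from a shared secret and the current time window, nothing needs to travel over the network in advance, which is also why SMS delivery of codes is the weaker option.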
Strong cryptography – it’s hard to convey how pervasive strong cryptography has become. Up to a quarter of users online currently use a VPN application, which encrypts their web traffic. Web connections between a site and a browser are now encrypted more often than not. If you’ve ever done online banking, or bought something online with your credit card, you’re using strong cryptography. My laptop uses Apple’s FileVault to encrypt the drive completely. Messaging via iMessage, WhatsApp, Signal or Silent Phone all uses strong cryptography. Back in the early 1990s, strong cryptography was treated as a weapon and its export was restricted. I strongly recommend reading Steven Levy’s Crypto to find out how we got here. I remember when Lotus Notes came with weaker encryption outside the US during the dot-com era. Now I am leery of using any communications platform that doesn’t have strong cryptography.
OCR – optical character recognition is a technology that has been around for decades. In its modern sense, it starts around 1974 with entrepreneur Ray Kurzweil. Now it’s a foundational technology for many leading-edge applications:
- Interpreting the real world (billboards, road signs, automatic number plate readers)
- Real-time translation (using Google Translate to read restaurant menus etc.)
- Digitisation of books and manuscripts (Google Books)
- Handwriting recognition and pen computing
- Making digitised documents searchable
All of this helps technology to interact with the real world in near real time. You need it for many of the wide range of future technologies that are envisaged. The slow rise of a web-of-no-web, where the real world is blended with the online world, is possible because of multiple technologies, from GPS and QR codes to optical character recognition.
Machine learning – when people talk about artificial intelligence they usually mean machine learning. Google and other companies are applying techniques that were developed at the University of Toronto in the 1980s, during an AI winter. The idea is that if you show a computer programme enough pictures of cats, it will recognise cat attributes as a pattern and recognise them in the future. It’s a very particular skill, which is the reason why machine learning has offered so much promise and let us down at the same time.
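The cat example boils down to summarising labelled examples and comparing new inputs against that summary. As a deliberately tiny illustration (the two-number feature vectors and their values are made up for this sketch; real systems learn from millions of raw pixels), here is a nearest-centroid classifier in Python:

```python
# Toy nearest-centroid classifier: "learn" what a cat looks like from
# labelled feature vectors, then label new samples by proximity.
def centroid(vectors):
    """Average each feature across a class's examples."""
    return [sum(dim) / len(vectors) for dim in zip(*vectors)]

def classify(sample, centroids):
    """Return the label whose centroid is closest to the sample."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Hypothetical features, e.g. [ear pointiness, whisker prominence].
cats = [[0.9, 0.8], [0.8, 0.9], [0.95, 0.85]]
dogs = [[0.2, 0.1], [0.1, 0.3], [0.25, 0.2]]
centroids = {"cat": centroid(cats), "dog": centroid(dogs)}

print(classify([0.85, 0.9], centroids))  # prints "cat"
```

It works beautifully on inputs that resemble the training data and falls apart on anything that doesn’t, which is the pattern-matching promise and the let-down in miniature.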
I talked about an AI winter. That’s a period when there was a dearth of spending on artificial intelligence research. We’ve had several cycles of massive government investment and withdrawal as AI historically failed to deliver.
So under the right circumstances, machine learning can count craters in lunar photography or spot likely cancerous tumours in X-ray imagery. Yet machine intelligence struggles to recognise what I ask, and AI-driven ad platforms get targeting hilariously wrong. It mirrors some of the fuzzy logic capabilities of Japanese consumer electronics: the autofocus camera, lifts that optimise for traffic flow in tall buildings, or the microwave that knows how long to cook your food for. That was based on a mathematical paper on fuzzy sets published in 1965 by Lotfi Zadeh, an academic at UC Berkeley.
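Fuzzy logic works by mapping a crisp sensor reading into degrees of membership in overlapping sets, then acting on those degrees rather than a single hard threshold. A minimal sketch in Python (the ‘steam sensor’ reading and the set boundaries are invented for illustration; a real microwave’s controller tunes these empirically):

```python
# Triangular membership functions, the simplest building block of fuzzy logic.
def triangular(x, left, peak, right):
    """Degree of membership (0.0 to 1.0) in a triangular fuzzy set."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

# Hypothetical steam-sensor reading from a microwave, on a 0-100 scale.
reading = 70
membership = {
    "dry":   triangular(reading, -1, 0, 50),
    "moist": triangular(reading, 30, 50, 80),
    "wet":   triangular(reading, 60, 100, 101),
}
# The reading is partly "moist" and partly "wet" at the same time;
# a fuzzy controller would blend the cooking rules for both sets.
```

That graded blending is what let 1990s appliances behave sensibly across a continuum of conditions without a neural network in sight.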
Moore’s Law and the worry of digital disruption have pushed machine learning adoption. The results may disappoint, but realpolitik will keep it in play.
USB – computing before USB was messy. There was a range of ports for different things: connecting a printer, connecting a keyboard and connecting an external hard drive or CD-ROM drive all required different-sized cable connectors. When you were setting up a computer, each port’s function would be clearly labelled on the back of the machine. The cables that came with the Macs I owned had ideograms moulded on top of them. USB collapsed all of this into a single standard connector.
CMOS sensors – CCD sensors were invented over 50 years ago. They were well understood and had been incorporated into video cameras since at least the early 1980s. CCD sensors offered better quality but had issues with lag; techniques designed to deal with this helped the performance of CMOS sensors. CMOS sensors were invented at NASA’s Jet Propulsion Laboratory, building on work that Olympus did in the 1980s. First they went into mice, then into low-end cameras. The technology got better all the time, doing more in less space with less power. Eventually CMOS sensors went into webcams and cellphones. Nowadays you’re only likely to see CCDs in very particular use cases. CMOS sensors are everywhere in modern life, even in high-end photography equipment like Phase One’s.
What would be in your 25 technologies, and how would they differ from mine or CNET’s?