Category: materials | 原料 | 원료 | 原材料

Materials are as important as technology and innovation. Without access to hydrocarbons you wouldn’t just lose the car as transport, but also the foundational products of modern life.

Added to the list of important materials would be the likes of:

  • Lithium – current battery technology and in some alloys
  • Helium – inert atmosphere for chemical reactions and lighter than air craft including blimps, airships and weather balloons
  • Silicon – semiconductors
  • Cobalt – a key material in batteries
  • Titanium – similar applications to steel but with a higher strength-to-weight ratio. Also hypoallergenic in nature
  • Carbon fibre – high-strength, lightweight materials

Rare earth metals and other key materials include:

  • Dysprosium – magnets, lasers, nuclear control rods
  • Erbium – lasers, particularly in telecoms fibre optics cables and optronics
  • Europium – interest in using it to develop memory for quantum computers
  • Holmium – magnets, lasers and quantum computer memory
  • Neodymium – high strength magnets
  • Praseodymium – magnets
  • Yttrium – catalyst in some chemical processes
  • Thorium – future safer nuclear fuel source
  • Thulium – portable x-ray devices, ceramics used in microwave equipment
  • Scandium – high strength lightweight alloys
  • Ytterbium – manufacture of stainless steel, atomic clocks
  • Uranium – nuclear fuel

In addition to innovation in materials science and chemistry with these raw materials, there is also the benefit of recycling and reusing existing products once they have finished their useful life. The Tokyo Olympics of 2020 saw an unprecedented peacetime effort to find precious metals in e-waste and junk that could then be processed into the winners’ medals.

A desire to lower the carbon footprint will require ingenuity in systems, design and materials use if it is to be successful.

  • Business cards

    The Financial Times opined on the obsolescence of business cards. This has been a common theme for the past quarter of a century, so whether or not it’s actually news is up for debate.

    Business cards have been a surprisingly accurate marker of my career’s evolution. Before college, when I was working in laboratories to save up, business cards were strictly for management. If anyone needed to reach me, they’d receive my name and extension number scribbled on a company compliments slip.

    Fast forward to my early agency days, and changing my business cards became the immediate priority after receiving a promotion letter. I vividly recall discussing new cards with our office manager, Angie, to reflect my new title: from Account Executive to Senior Account Executive. While that promotion enabled me to buy my first home, it was the tangible act of updating my business cards that truly solidified that future title for me in my memory.

    Building a network was an important part of development in the early part of my career and my manager at the time would ask us each week how many business cards we’d given out as a way of quantifying that development.

Business cards had a symbolism and status that was captured famously in Bret Easton Ellis’ American Psycho and in a memorable scene of its subsequent film adaptation.

Even today in Asian countries, business cards come loaded with cultural symbolism and a distinct etiquette of exchange. The exchange is handy as it allows you to lay out a map of who is around a meeting table based on the collected cards, facilitating easier communication in meetings.

    Personal organisers

    In the mid-1990s, the personal organiser was a staple, its prevalence varying depending on location and budget. These organisers typically featured loose-leaf pages for schedules, an address book, and a system for storing and archiving business cards, even those of people who had moved on. However, by 2001, the media was already concerned about the impending demise of the personal organiser and its potential impact on the business card’s future.

    Filofax

Filofax has the reputation of being the most British of brands. It originally started off as an importer of an American product, Lefax. Lefax was a Philadelphia-based business which made organisers popular within industry, including among power plant engineers in the early 20th century.

At that time electricity was considered to be the enabler that the internet is now, and Lefax helped to run power plants effectively and reliably. Filofax eventually acquired Lefax in 1992. During the 1980s, the Filofax became a symbol of professionalism and aspirational upward mobility. I was given one as soon as I started work; I still have it at my parents’ home. Its leather cover didn’t even develop a patina, despite the beating it took in various parts of my work life: in night clubs, chemical plants and agency life. Filofax even became part of cinematic culture in the James Belushi film Taking Care of Business, also known as Filofax in many markets.

    Day-Timer

In the US, there was the Day-Timer system, which came out of the requirements of US lawyers in the early 1950s and became a personal management tool for white collar workers in large corporates like Motorola – who appreciated its whole-system approach. Day-Timer was as much a lifestyle as a product, in the same way that David Allen’s Getting Things Done® (GTD®) methodology became in the mid-2000s to 2010s. Customers used to go and visit the personal organiser factory and printing works for fun. Along the way, other products such as At-A-Glance and Day Runner appeared as substitutes. Day-Timer inspired the Franklin Planner system; a similar mix of personal organiser and personal management philosophy launched in 1984.

By the mid-1990s, Day-Timer had a skeuomorphic PC programme that mirrored the real-world version of the Day-Timer. At the time, this and competitor applications would allow print-outs that would fit in the real-world Day-Timer organiser. Day-Timer’s move to mobile apps didn’t go well, and it now exists in a paper-only form catering to people wanting to organise their personal lives, and to home-workers.

    Rolodex

While the Filofax allowed you to take your world with you, the Rolodex allowed you to quickly thumb through contacts and find the appropriate name.

Back when I started my first agency job, I was given my first Rolodex frame. I spent a small fortune on special Rolodex business card holders. At my peak usage of the Rolodex as a repository for my business contacts, I had two frames that I used to rifle through names of clients, suppliers and other industry contacts.

Rolodex became a synonym for one’s personal network; you even heard of people being hired for ‘their Rolodex’. For instance, here’s a quote from film industry trade magazine The Hollywood Reporter: Former British Vogue Chief Eyes September for Launch of New Print Magazine, Platform (May 8, 2025):

    …to blend “the timeless depth of print with the dynamism of digital” with coverage of top creative forces, no doubt leaning into Edward Enninful’s enviable Rolodex of A-list stars, designers and creators gathered through years spent in the fashion and media space with tenures at British Vogue and as European editorial director of Vogue.

If I was thinking about moving roles, the first thing I would do was take my Rolodex frames home on a Friday evening. The fan of business cards is as delicate as it is useful. It doesn’t do well being lugged around in a bag or rucksack. Each frame would go home in a dedicated supermarket shopping bag.

The Rolodex was anchored to the idea of the desk worker: the knowledge worker with a workstation that they used every day. Hot-desking, as much as the computer, is the enemy of the Rolodex. My Rolodex usage stopped when I moved to Hong Kong. My frames are now in boxes somewhere in my parents’ garage. Doomed not by their usefulness, but by their lack of portability.

    Personal information management

The roots of personal information management software go back to ideas in information theory, cognitive psychology and computing that gained currency after the Second World War.

    As the idea of personal computers gained currency in the 1970s and early 1980s, personal information software appeared to manage appointments and scheduling, to-do lists, phone numbers, and addresses. The details of business cards would be held electronically.

At this time laptops were a niche computing device. Like the Rolodex, the software stayed at the office or in the den at home. NoteCards hybridised hypertext linkages with the personal information models of the real world. NoteCards was developed and launched in 1987, prefiguring applications like DEVONthink, Evernote and Notion by decades.

    As well as providing new links to data, computers also allowed one’s contacts to become portable. It started off with luggable and portable laptop computers.

    Putting this power into devices that can fit in the hand and a coat pocket supercharged this whole process.

    Personal digital assistants

Personal digital assistants (PDAs) filled a moment in time. Mobile computer data connections were very slow and very niche on GSM networks. Mobile carrier pricing meant that data only worked for certain niche uses, such as sports photographers sending their images through to their agency for distribution to picture desks at newspapers and magazines. While the transfer rate was painfully slow, it was still faster than burning the images on to CD and sending them by motorcycle courier to the picture agency.

    The PDA offered the knowledge worker their address book, calendar, email and other apps in their pocket. It was kept up to date by a cradle connected to their computer. When the PDA went into the cradle information went both ways, contacts and calendars updated, emails sent, content to be read on the PDA pushed from the computer. IBM and others created basic productivity apps for the Palm PDA.

    IrDA

By 1994, several proprietary infrared data transmission formats existed, none of which spoke to each other. This was pre-standardisation on USB cables. IrDA was a standard created by an industry group looking to combat all the proprietary systems. The following year, Microsoft announced support in Windows, allowing laptops to talk with other devices and the creation of a simple personal area network.

This opened the possibility of having mice and other input devices unconstrained by connecting cables. It also allowed PDAs to beam data to each other via ‘line of sight’ connections. The reality of this was frustrating. You would often have to hold two devices an inch from each other for an eternity while the data crawled across. It wasn’t until 1999 that the first devices with Bluetooth or wi-fi appeared, and it took a couple more years for them to become ubiquitous. Unsolicited messages over Bluetooth, aka bluejacking, started to appear in the early 2000s.

    But IrDA provided a mode of communication between devices.

    versit Consortium

The versit Consortium sorted another part of the puzzle. In the early 1990s, the blending of computer systems with telephony networks was gaining pace. A number of companies including Apple, IBM and Siemens came together to put together common standards for computer systems and telephony. In 1995, they came up with the versitcard format for address book contacts, better known now as ‘vCards’. These were digital business cards that could be exchanged by different personal information management software on phones, computers and PDAs. For a while in the late 1990s and early 2000s I would attach my vCard to emails to new contacts. I still do so, but much less often.
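
    For illustration, a minimal vCard record looks like the following (a version 3.0 record; the 1995 original was version 2.1, and the person and details below are made up):

        BEGIN:VCARD
        VERSION:3.0
        N:Smith;Jane;;;
        FN:Jane Smith
        ORG:Example Agency Ltd
        TITLE:Senior Account Executive
        TEL;TYPE=WORK,VOICE:+44 20 7946 0000
        EMAIL:jane.smith@example.com
        END:VCARD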

    The following year the same thing happened with calendar events as well.

    Over time, the digital business card came to dominate, via device-to-device exchanges until the rise of LinkedIn – the professional social network.

    Faster data networks allowed the digital business card sharing to become more fluid.

    A future renaissance for the business card?

While business cards are currently seen as outdated in the West, could they enjoy a renaissance? There are key changes in behaviour that indicate trends which would support a revitalisation of business cards.

    Digital detox

While information overload is a term that has been with us since the arrival of personal computers, digital detox is a newer phenomenon that first started to gain currency in 2008, according to Google Books data. It has manifested itself in people taking a break from their screens, including smartphones, and as a concept it has continued to gain common currency.

This creates a need for tangible contact details, in the form of a business card, in certain contexts.

    The pivot of personal organisers

Day-Timer and Filofax didn’t disappear completely. While Day-Timer is no longer a professional ‘cult’, it now helps remote workers organise their own work day at home. They also tap into the needs of people organising their own wedding. The paper plans also give them a memento of the event in a largely digital world.

    If personal organisers continue to exist then real-world business cards would also make sense in those contexts.

    Bullet-journaling

Ryder Carroll is known as the ‘father’ of the bullet journal, a home-made organisation method similar to the kind of task lists I was taught to pull together in my first agency role. There were aspects of it that would be familiar to Day-Timer advocates as well.

When the world was going digital, Carroll used paper to help organise himself. He tapped into the fact that even computer programmers use paper, including notebooks and post-it notes, to manage projects and personal tasks within those projects. Carroll took his ‘system’ public via a Kickstarter project in 2013.

    Bullet journaling provided its users with simplicity, clarity and an increased sense of control in their life. What is of interest for this post, is the move from the virtual back into paper organisation.

    Changing nature of work

Hybrid working, remote working and growing freelance communities in industries such as advertising have affected one’s professional identity. This has huge implications for personal standing and even mental health. Human connection becomes more important via virtual groups and real-world meet-ups. Controlling one’s own identity via a business card at these meet-ups starts to make an increasing amount of sense.

    The poisoning of the LinkedIn well

On the face of it LinkedIn has been a wonderful idea. Have a profile that’s part CV, part portfolio, which allows your social graph of professional connections to move with you through your career. Services were bolted on, like advertising, job applications and corporate pages, to attract commercial interest and drive revenue.

Over time, LinkedIn has increased the amount of its creator functions, driving thought leadership content that is a prime example of enshittification. 2025 saw ‘thought leaders’ publishing generative AI-created posts as entirely their own work.

    LinkedIn has become devalued as a digital alternative to the humble business card.

    More related posts can be found here.

  • Intelligence per watt

    My thinking on the concept of intelligence per watt started as bullets in my notebook. It was more of a timeline than anything else at first and provided a framework of sorts from which I could explore the concept of efficiency in terms of intelligence per watt. 

    TL;DR (too long, didn’t read)

    Our path to the current state of ‘artificial intelligence’ (AI) has been shaped by the interplay and developments of telecommunications, wireless communications, materials science, manufacturing processes, mathematics, information theory and software engineering. 

    Progress in one area spurred advances in others, creating a feedback loop that propelled innovation.  

    Over time, new use cases have become more personal and portable – necessitating a focus on intelligence per watt as a key parameter. Energy consumption directly affects industrial design and end-user benefits. Small low-power integrated circuits (ICs) facilitated fuzzy logic in portable consumer electronics like cameras and portable CD players. Low power ICs and power management techniques also helped feature phones evolve into smartphones.  

    A second-order effect of optimising for intelligence per watt is reducing power consumption across multiple applications. This spurs yet more new use cases in a virtuous innovation circle. This continues until the laws of physics impose limits. 

    Energy storage density and consumption are fundamental constraints, driving the need for a focus on intelligence per watt.  

As intelligence per watt improves, there will be a point at which the question isn’t just what AI can do, but what should be done with AI? And where should it be processed? Trust becomes less about emotional reassurance and more about operational discipline. Just because it can handle a task doesn’t mean it should – particularly in cases where data sensitivity, latency, or transparency to humans is non-negotiable. A highly capable, off-device AI might be fine at drafting everyday emails, but a questionable choice for handling your online banking.

    Good ‘operational security’ outweighs trust. The design of AI systems must therefore account not just for energy efficiency, but user utility and deployment context. The cost of misplaced trust is asymmetric and potentially irreversible.

    Ironically the force multiplier in intelligence per watt is people and their use of ‘artificial intelligence’ as a tool or ‘co-pilot’. It promises to be an extension of the earlier memetic concept of a ‘bicycle for the mind’ that helped inspire early developments in the personal computer industry. The upside of an intelligence per watt focus is more personal, trusted services designed for everyday use. 

    Integration

    In 1926 or 27, Loewe (now better known for their high-end televisions) created the 3NF[i].

It was not a computer; instead it integrated several radio parts in one glass-envelope vacuum valve. The 3NF had three triodes (early electronic amplifiers), two capacitors and four resistors. Inside the valve, the extra resistor and capacitor components went inside their own glass tubes. Normally each triode would be inside its own vacuum valve. At the time, German radio tax laws were based on the number of valve sockets in a device, making this integration financially advantageous.

    Post-war scientific boom

Between 1949 and 1957, engineers and scientists from the UK, Germany, Japan and the US proposed what we’d think of as the integrated circuit (IC). These ideas were made possible when breakthroughs in manufacturing happened. Shockley Semiconductor built on work by Bell Labs and Sprague Electric Company to connect different types of components on one piece of silicon to create the IC.

Credit is often given to Jack Kilby of Texas Instruments as the inventor of the integrated circuit. But that depends on how you define the IC, with what is now called a monolithic IC being considered a ‘true’ one. Kilby’s version wasn’t a true monolithic IC. As with most inventions, it was the child of several interconnected ideas that coalesced at a given point in time. In the case of ICs, it was happening in the midst of materials and technology developments, from data storage and computational solutions such as the idea of virtual memory through to the first solar cells.

Kilby’s ICs went into an Air Force computer[ii] and an onboard guidance system for the Minuteman missile. He went on to help invent the first handheld calculator and thermal printer, both of which took advantage of progress in IC design to change our modern way of life[iii].

TTL (transistor–transistor logic) circuitry was invented at TRW in 1961, which licensed it out for use in data processing and communications – propelling the development of modern computing. TTL circuits powered mainframes. Mainframes were housed in specialised temperature and humidity-controlled rooms and owned by large corporates and governments. Modern banking and payments systems still rely on the mainframe.

    AI’s early steps 

What we now think of as AI had been considered theoretically for as long as computers could be programmed. As semiconductors developed, a parallel track opened up to move AI beyond being a theoretical possibility. A pivotal moment was a workshop held in 1956 at Dartmouth College. The workshop focused on a hypothesis: ‘every aspect of learning or any other feature of intelligence can be so precisely described that a machine can be made to simulate it’. Later that year, a meeting at MIT (Massachusetts Institute of Technology) brought together psychologists and linguists to discuss the possibility of simulating cognitive processes using a computer. This was the origin of what we’d now call cognitive science.

    Out of the cognitive approach came some early successes in the move towards artificial intelligence[iv]. A number of approaches were taken based on what is now called symbolic or classical AI:

    • Reasoning as search – essentially a step-wise, trial-and-error approach to problem solving that was compared to wandering through a maze and back-tracking if a dead end was found (see the sketch after this list). 
    • Natural language – where related phrases existed within a structured network. 
    • Micro-worlds – solving for artificially simple situations, similar to economic models relying on the concept of the rational consumer. 
    • Single layer neural networks – to do rudimentary image recognition. 
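
    As a rough illustration of ‘reasoning as search’, the Python sketch below solves a toy maze by depth-first search, back-tracking when it hits a dead end. The maze and coordinates are invented for the example:

        # Reasoning as search: depth-first search with back-tracking in a toy maze.
        # '#' is a wall, ' ' is open floor; positions are (row, column) pairs.
        MAZE = [
            "#####",
            "#  G#",
            "# ###",
            "#S  #",
            "#####",
        ]

        def solve(maze, pos, goal, visited=None):
            visited = visited if visited is not None else set()
            if pos == goal:
                return [pos]
            visited.add(pos)
            r, c = pos
            for step in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if maze[step[0]][step[1]] != "#" and step not in visited:
                    path = solve(maze, step, goal, visited)
                    if path:                     # a way forward was found
                        return [pos] + path
            return None                          # dead end: back-track

        print(solve(MAZE, (3, 1), (1, 3)))       # start at 'S', finish at 'G'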

    By the time the early 1970s came around, AI researchers had run into a number of problems, some of which still plague the field to this day:

    • Symbolic AI wasn’t fit for purpose when solving many real-world tasks, like crossing a crowded room. 
    • Trying to capture imprecise concepts with precise language.
    • Commonsense knowledge was vast and difficult to encode. 
    • Intractability – many problems require an exponential amount of computing time. 
    • Limited computing power available – there was insufficient intelligence per watt available for all but the simplest problems. 

By 1966, US and UK funding bodies were frustrated with the lack of progress on the research undertaken. The axe fell first on a project to use computers for language translation. Around the time of the OPEC oil crisis, funding to major centres researching AI was reduced by both the US and UK governments. Despite the reduction of funding to the major centres, work continued elsewhere.

    Mini-computers and pocket calculators

ICs allowed for mini-computers due to the increase in computing power per watt. As important as the relative computing power, ICs made mini-computers more robust, easier to manufacture and easier to maintain. DEC (Digital Equipment Corporation) launched the first commercially successful minicomputer, the PDP-8, in 1965. The cost of mini-computers allowed them to run manufacturing processes, control telephone network switching and control laboratory equipment. Mini-computers expanded computer access in academia, facilitating more work in artificial life and what we’d think of as early artificial intelligence. This shift laid the groundwork for intelligence per watt as a guiding principle.

    A second development helped drive mass production of ICs – the pocket calculator, originally invented at Texas Instruments.  It demonstrated how ICs could dramatically improve efficiency in compact, low-power devices.

    LISP machines and PCs

AI researchers required more computational power than mini-computers could provide, leading to the development of LISP machines: specialised workstations designed for AI applications. Despite improvements in intelligence per watt enabled by Moore’s Law, their specialised nature meant that they were expensive. AI researchers continued with these machines until personal computers (PCs) progressed to the point that they could run LISP quicker than LISP machines themselves. The continuous improvements in data storage, memory and processing that enabled LISP machines continued on and surpassed them, as the cost of computing dropped due to mass production.

The rise and decline of LISP machines was not only down to Moore’s Law, but also to Makimoto’s Wave. While Gordon Moore outlined an observation that the number of transistors on a given area of silicon doubled every two years or so, Tsugio Makimoto observed 10-year pivots between standardised semiconductor processors and customised processors[v]. The rise of personal computing drove a pivot towards standardised architectures.

PCs and workstations extended computing beyond computer rooms and laboratories to offices and production lines. During the late 1970s and 1980s, standardised processor designs like the Zilog Z80, MOS Technology 6502 and the Motorola 68000 series drove home and business computing, alongside Intel’s x86 processors.

Personal computing started in businesses when office workers brought in a computer to use early programmes like the VisiCalc spreadsheet application. This allowed them to take a leap forward in not only tabulating data, but also seeing how changes to the business might affect financial performance.

    Businesses then started to invest more in PCs for a wide range of uses. PCs could emulate the computer terminal of a mainframe or minicomputer, but also run applications of their own. 

Typewriters were being replaced by word processors that allowed the operator to edit a document in real time without resorting to correction fluid.

    A Bicycle for the Mind

Steve Jobs at Apple was as famous for being a storyteller as he was for being a technologist in the broadest sense. Internally with the Mac team he shared stories and memetic concepts to get his ideas across in everything from briefing product teams to press interviews. A 1990 filmed interview with Steve Jobs articulates the context of the ‘bicycle for the mind’ saying particularly well.

    In reality, Jobs had been telling the story for a long time through the development of the Apple II and right from the beginning of the Mac. There is a version of the talk that was recorded some time in 1980 when the personal computer was still a very new idea – the video was provided to the Computer History Museum by Regis McKenna[vi].

The ‘bicycle for the mind’ concept was repeated in early Apple advertisements of the time[vii] and even informed the Macintosh project codename[viii].

    Jobs articulated a few key concepts. 

    • Buying a computer creates, rather than reduces, problems. You needed software to start solving problems and making computing accessible. Back in 1980, you programmed a computer if you bought one. This was the reason why early personal computer owners in the UK went on to birth a thriving games software industry, including the likes of Codemasters[ix]. Done well, there should be no seam in the experience between hardware and software. 
    • The idea of a personal, individual computing device (rather than a shared resource). My own computer builds on the years over which I have grown to adapt and use my Macs, from my first sit-up-and-beg Macintosh to the MacBook Pro that I am writing this post on. This is even more true for most people and their use of the smartphone. I am of an age where my iPhone is still an appendage and emissary of my Mac. My Mac is still my primary creative tool. A personal computer is more powerful than a shared computer in terms of the real difference made. 
    • At the time Jobs originally did the speech, PCs were underpowered for anything but data processing (through spreadsheets and basic word processor applications). But that didn’t stop his idea for something greater. 

Jobs’ idea of the computer as an adjunct to the human intellect and imagination still holds true, but it doesn’t neatly fit into the intelligence per watt paradigm. It is harder to measure the effort spent developing prompts, or that expended evaluating, refining and filtering generative AI results. Of course, Steve Jobs’ Apple owed a lot to the vision shown in Doug Engelbart’s ‘Mother of All Demos’[x].

    Networks

Work took a leap forward with office networked computers, pioneered by Apple’s Macintosh Office[xi]. This was soon overtaken by competitors. It facilitated workflow within an office, and its impact can still be seen in offices today, even as components from print management to file storage have moved to cloud-based services.

At the same time, what we might think of as mobile was starting to gain momentum. Bell Labs and Motorola came up with much of the technology to create cellular communications. Martin Cooper of Motorola made the first phone call on a cellular phone, to a rival researcher at Bell Labs. But Motorola didn’t sell the phone commercially until 1983, as a US-only product called the DynaTAC 8000x[xii]. This was four years after Japanese telecoms company NTT launched their first cellular network for car phones. Commercial cellular networks were running in Scandinavia by 1981[xiii].

In the same way that the networked office radically changed white collar work, the cellular network did a similar thing for the self-employed, from plumbers, electricians and photocopier repair men to travelling salespeople. If they were technologically advanced, they may have had an answering machine, but it would likely have to be checked manually by playing back the tape.

Often messages were taken by a receptionist in their office, if they had one. Or, more likely, by someone back home. The cell phone freed homemakers in a lot of self-employed households to go out into the workplace and helped raise household incomes.

    Fuzzy logic 

The first mainstream AI applications emerged from fuzzy logic, introduced by Lotfi A. Zadeh in a 1965 mathematical paper. Initial uses were for industrial controls in cement kilns and steel production[xiv]. The first prominent product to rely on fuzzy logic was the Zojirushi Micom Electric Rice Cooker (1983), which adjusted cooking time dynamically to ensure perfect rice.

(Image caption: rice cooker with fuzzy logic, 3,000 yen, available end of June.)

Fuzzy logic reacted to changing conditions in a similar way to people. Through the 1980s and well into the 1990s, the power of fuzzy logic was underappreciated outside of Japanese product development teams. A spokesperson for the American Electronics Association’s Tokyo office told the Washington Post[xv]:

    “Some of the fuzzy concepts may be valid in the U.S.,”

    “The idea of better energy efficiency, or more precise heating and cooling, can be successful in the American market,”

    “But I don’t think most Americans want a vacuum cleaner that talks to you and says, ‘Hey, I sense that my dust bag will be full before we finish this room.’ “

By the end of the 1990s, fuzzy logic was embedded in various consumer devices: 

    • Air-conditioner units – understood the room, the temperature difference inside-and-out, and the humidity, switching on-and-off to balance cooling and energy efficiency.
    • CD players – enhanced error correction on playback, dealing with imperfections on the disc surface.
    • Dishwashers – understood how many dishes were loaded and their type of dirt, then adjusted the wash programme.
    • Toasters – recognised different bread types and the preferred degree of toasting, and performed accordingly.
    • TV sets – adjusted the screen brightness to the ambient light of the room and the sound volume to how far away the viewer was sitting from the set. 
    • Vacuum cleaners – adjusted vacuum power as they moved from carpeted to hard floors. 
    • Video cameras – compensated for the movement of the camera to reduce blurred images. 

Fuzzy logic sold on the benefits and concealed the technology from western consumers. Fuzzy logic embedded intelligence in the devices. Because it was applied to relatively simple, dedicated purposes, it could rely on small, lower-power specialist chips[xvi], offering a reasonable amount of intelligence per watt some three decades before generative AI. By the late 1990s, kitchen appliances like rice cookers and microwave ovens had reached ‘peak intelligence’ for what they needed to do, based on the power of fuzzy logic[xvii].
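
    As a minimal sketch of the idea, assuming a made-up rice cooker heater controller: fuzzy logic replaces a hard on/off threshold with overlapping ‘membership’ grades that are blended into a single smooth output. The temperatures and rules below are invented for illustration:

        # Fuzzy logic sketch: blending two rules by membership grade.
        def grade(x, lo, hi):
            """Membership rising linearly from 0 at lo to 1 at hi."""
            return min(1.0, max(0.0, (x - lo) / (hi - lo)))

        def heater_power(temp_c):
            too_cool = 1.0 - grade(temp_c, 90, 105)    # how strongly 'needs heat' applies
            too_hot = grade(temp_c, 100, 115)          # how strongly 'overshooting' applies
            # Rule 1: if too cool -> full power. Rule 2: if too hot -> no power.
            weights = too_cool + too_hot
            if weights == 0:
                return 0.5                             # neither rule fires strongly: hold steady
            return (too_cool * 1.0 + too_hot * 0.0) / weights

        for t in (80, 95, 102, 112):
            print(t, round(heater_power(t), 2))        # power eases off as the rice nears done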

    Fuzzy logic also helped in business automation. It helped to automatically read hand-written numbers on cheques in banking systems and the postcodes on letters and parcels for the Royal Mail. 

    Decision support systems & AI in business

Decision support systems, or business information systems, were being used in large corporates by the early 1990s. The techniques used varied, but some used rules-based systems. These were used in at least some capacity to reduce manual office work tasks. For instance, credit card approvals were processed based on rules that included various factors such as credit scores. Only some credit card providers had an analyst manually review the decision made by the system. However, setting up each use case took a lot of effort, involving highly-paid consultants and expensive software tools. Even then, vendors of business information systems such as Autonomy struggled with a high rate of projects that failed to deliver anything like the benefits promised.
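
    As a hedged sketch of the kind of rules-based processing involved (the thresholds and factors below are invented, not any provider’s real criteria):

        # Illustrative rules-based credit approval in the spirit of 1990s
        # decision support systems. All thresholds are made up.
        def assess_application(credit_score, annual_income, existing_debt):
            if credit_score < 550:
                return "decline"
            if existing_debt > annual_income * 0.4:
                return "refer to analyst"        # the rules can't decide safely
            if credit_score >= 700 and annual_income >= 25_000:
                return "approve"
            return "refer to analyst"

        print(assess_application(720, 30_000, 5_000))   # approve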

Three decades on, IBM had a similar problem with its Watson offerings, with a particularly high-profile failure in mission-critical healthcare applications[xviii]. A second problem was that a lot of tasks were ad-hoc in nature, or required transposing data across disparate systems.

    The rise of the web

    The web changed everything. The underlying technology allowed for dynamic data. 

    Software agents

Examples of intelligence within the network included early software agents. A good example of this was PapriCom. PapriCom had a client on the user’s computer. The software client monitored price changes for products that the customer was interested in buying. The app then notified the user when the monitored price reached a threshold determined by the customer. The company became known as DealTime in the US and UK, and Evenbetter.com in Germany[xix].
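
    A sketch of the idea behind such an agent follows; the fetch_price() function is a placeholder for whatever scraping or querying the real client did, and the polling interval is invented:

        # Sketch of a PapriCom-style price-watching agent.
        import time

        def fetch_price(product_url):
            raise NotImplementedError("scrape or query the retailer here")

        def watch(product_url, target_price, poll_seconds=3600):
            """Poll a product's price and alert once it falls to the target."""
            while True:
                price = fetch_price(product_url)
                if price <= target_price:
                    print(f"Price alert: {product_url} is now {price}")
                    return price
                time.sleep(poll_seconds)         # check again later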

The PapriCom client app was part of a wider set of technologies known as ‘push technology’, which brought content that the netizen would want directly to their computer, in a similar way to mobile app notifications now.

    Web search

The wealth of information quickly outstripped netizens’ ability to explore the content. Search engines became essential for navigating the new online world. Progress was made in clustering vast numbers of cheap Linux-powered computers together and sharing the workload amongst them to power web search. As search started trying to make sense of an exponentially growing web, machine learning became part of the developer toolbox.

Researchers at Carnegie-Mellon looked at using games to help teach machine learning algorithms, based on human responses that provided rich metadata about the given item[xx]. This became known as the ESP game. In the early 2000s, Yahoo! turned to web 2.0 start-ups that used user-generated labels called tags[xxi] to help organise their data. Yahoo! bought Flickr[xxii] and del.icio.us[xxiii].

    All the major search engines looked at how deep learning could help improve search results relevance. 

Given that the business model for web search was advertising-based, reducing the cost per search while maintaining search quality was key to Google’s success. Early on, Google focused on energy consumption, with its (search) data centres becoming carbon neutral in 2007[xxiv]. This was achieved by a whole-system effort: carefully managing power in the silicon, storage, networking equipment and air conditioning to maximise intelligence per watt. All of it was built using optimised versions of open-source software and cheap general-purpose PC components ganged together in racks and operating in clusters.

General-purpose ICs for personal computers and consumer electronics allowed easy access to relatively low-power computing. Much of this was down to process improvements being made at the time. You needed the volume of chips to drive innovation in mass production at a chip foundry. While application-specific chips had their uses, commodity mass-volume products used for everything from embedded applications to early mobile and portable devices and computers drove progress in improving intelligence per watt.

    Makimoto’s tsunami back to specialised ICs

When I talked about the decline of LISP machines, I mentioned the move towards standardised IC design predicted by Tsugio Makimoto. This led to a surge in IC production, alongside other components including flash and RAM memory. From the mid-1990s to about 2010, Makimoto’s predicted cycle was stuck in its ‘standardisation’ phase. It just worked. But several factors drove the swing back to specialised ICs.

    • Lithography processes got harder: standardisation got its performance and intelligence per watt bump because there had been a steady step change in improvements in foundry lithography processes that allowed components to be made at ever-smaller dimensions. The dimensions are a function of the wavelength of light used. The semiconductor industry hit an impasse when it needed to move to EUV (extreme ultraviolet) light sources. From the early 1990s on, US government research projects championed the development of key technologies that allow EUV photolithography[xxv]. During this time, Japanese equipment vendors Nikon and Canon gave up on EUV. Sole US vendor SVG (Silicon Valley Group) was acquired by ASML, giving the Dutch company a global monopoly on cutting-edge lithography equipment[xxvi]. ASML became the US Department of Energy’s research partner on EUV photolithography development[xxvii]. ASML spent over two decades trying to get EUV to work. Once it was in client foundries, further time was needed to get commercial levels of production up and running. All of which meant that the production process improvements which drive IC intelligence per watt slowed down, and IC manufacturers had to start thinking about systems in a more holistic manner. As foundry development became harder, there was a rise in fabless chip businesses, and alongside them fewer foundries: GlobalFoundries, Samsung and TSMC (Taiwan Semiconductor Manufacturing Company Limited). TSMC is the world’s largest ‘pure-play’ foundry, making ICs for companies including AMD, Apple, Nvidia and Qualcomm. 
    • Progress in EDA (electronic design automation). Production process improvements in IC manufacture allowed for an explosion in device complexity as the number of components on a given size of IC doubled every 18 months or so. In the mid-to-late 1970s this led technologists to think about the idea of very large-scale integration (VLSI) within IC designs[xxviii]. Through the 1980s, commercial EDA software businesses were formed. The EDA market grew because it facilitated the continual scaling of semiconductor technology[xxix]. Secondly, it facilitated new business models. Businesses like ARM and LSI Logic allowed their customers to build their own processors based on ‘blocks’ of proprietary designs like ARM’s cores. That allowed companies like Apple to focus on optimising their custom silicon and integrating it with software to help improve the intelligence per watt[xxx]. 
    • Increased focus on portable devices. A combination of digital networks, wireless connectivity, the web as a communications platform with universal standards, flat-screen displays and improving battery technology led the way in moving towards more portable technologies. From personal digital assistants, MP3 players and smartphones, to laptop and tablet computers – disconnected mobile computing was the clear direction of travel. Cell phones offered days of battery life; the PalmPilot PDA had a battery life allowing for a couple of days of continuous use[xxxi]. In reality it would do a month or so of work. Laptops at the time could do half a day’s work when disconnected from a power supply. Manufacturers like Dell and HP provided spare batteries for travellers. Given changing behaviours, Apple wanted laptops that were easy to carry and could last most of a day without a charge. This was partly driven by a move to a cleaner product design that did away with swappable batteries. In 2005, Apple moved from PowerPC to Intel processors. During the announcement at the company’s worldwide developer conference (WWDC), Steve Jobs talked about the focus on computing power per watt moving forwards[xxxii]. 

Apple’s first in-house designed IC, the A4 processor, was launched in 2010 and marked the pivot of Makimoto’s wave back to specialised processor design[xxxiii]. This was a point of inflection in the growth of smartphones and specialised computing ICs[xxxiv].

New devices also meant new use cases that melded data on the web, on the device, and in the real world. I started to see this in action working at Yahoo!, with location data integrated on to photos and social data like Yahoo! Research’s ZoneTag and Flickr. I had been the Yahoo! Europe marketing contact on adding Flickr support to Nokia N-series ‘multimedia computers’ (what we’d now call smartphones), starting with the Nokia N73[xxxv]. A year later the Nokia N95 was the first smartphone released with a built-in GPS receiver. William Gibson’s speculative fiction story Spook Country came out in 2007 and integrated locative art as a concept in the story[xxxvi].

Real-world QR codes helped connect online services with the real world, for uses such as mobile payments or reading content online like a restaurant menu or a property listing[xxxvii].

I labelled the web-world integration as a ‘web-of-no-web’[xxxviii] when I presented on it back in 2008, as part of an interactive media module I taught to an executive MBA class at Universitat Ramon Llull in Barcelona[xxxix]. In China, wireless payment ideas would come to be labelled O2O (offline to online), and Kevin Kelly articulated a future vision for this fusion which he called Mirrorworld[xl].

    Deep learning boom

Even as there was a post-LISP machine dip in the funding of AI research, work on deep (multi-layered) neural networks continued through the 1980s. Other areas were explored in academia during the 1990s and early 2000s due to the large amount of computing power needed. Internet companies like Google gained experience in large clustered computing and had a real need to explore deep learning. Use cases included image recognition to improve search, and dynamically altered journeys to improve mapping and local search offerings. Deep learning is probabilistic in nature, which dovetailed nicely with prior work Microsoft Research had been doing since the 1980s on Bayesian approaches to problem-solving[xli].

A key factor in deep learning’s adoption was having access to powerful enough GPUs to handle the neural network compute[xlii]. This has allowed various vendors to build large language models (LLMs). The perceived strategic importance of artificial intelligence has meant that intelligence per watt has become a tertiary consideration at best. Microsoft has shown interest in growing data centres with less thought given to the electrical infrastructure required[xliii].

Google’s conference paper on attention mechanisms[xliv] highlighted the development of the transformer model. As an architecture it got around problems in previous approaches, but is computationally intensive. Even before the paper was published, the Google transformer model had created fictional Wikipedia entries[xlv]. A year later, OpenAI built on Google’s work with the generative pre-trained transformer model, better known as GPT[xlvi].
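
    The core of the attention mechanism in that paper reduces to a few lines. Below is a plain NumPy sketch of scaled dot-product attention, not the full transformer; the n-by-n score matrix it builds is one reason the architecture is so computationally intensive:

        # Scaled dot-product attention: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k))V
        import numpy as np

        def attention(Q, K, V):
            d_k = Q.shape[-1]
            scores = Q @ K.T / np.sqrt(d_k)                   # query-to-key similarity
            weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
            weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
            return weights @ V                                # weighted mix of values

        rng = np.random.default_rng(0)
        Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
        print(attention(Q, K, V).shape)                       # (4, 8)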

Since 2018 we’ve seen successive GPT-based models from Amazon, Anthropic, Google, Meta, Alibaba, Tencent, Manus and DeepSeek. All of these models were trained on vast amounts of information sources. One of the key limitations on building better models was access to training material, which is why Meta used pirated copies of e-books obtained using BitTorrent[xlvii].

These models were so computationally intensive that the large-scale cloud service providers (CSPs) offering these generative AI services were looking at nuclear power access for their data centres[xlviii].

    The current direction of development in generative AI services is raw computing power, rather than having a more energy efficient focus of intelligence per watt. 

    Technology consultancy / analyst Omdia estimated how many GPUs were bought by hyperscalers in 2024[xlix].

    Company           | Nvidia GPUs bought | AMD GPUs bought | Self-designed custom chips bought
    ------------------|--------------------|-----------------|----------------------------------
    Amazon            | 196,000            | –               | 1,300,000
    Alphabet (Google) | 169,000            | –               | 1,500,000
    ByteDance         | 230,000            | –               | –
    Meta              | 224,000            | 173,000         | 1,500,000
    Microsoft         | 485,000            | 96,000          | 200,000
    Tencent           | 230,000            | –               | –

These numbers give an indication of the massive deployment of GPT-specific computing power. Despite the massive amount of computing power available, services still weren’t able to cope[l], mirroring some of the service problems experienced by early web users[li] and the Twitter ‘fail whale’[lii] phenomenon of the mid-2000s. The race to bigger, more powerful models is likely to continue for the foreseeable future[liii].

    There is a second class of players typified by Chinese companies DeepSeek[liv] and Manus[lv] that look to optimise the use of older GPT models to squeeze the most utility out of them in a more efficient manner. Both of these services still rely on large cloud computing facilities to answer queries and perform tasks. 

    Agentic AI

Thinking on software agents goes back to work done in computer science in the mid-1970s[lvi]. Apple articulated a view[lvii] of a future system dubbed the ‘Knowledge Navigator’[lviii] in 1987, which hinted at autonomous software agents. What we’d now think of as agentic AI was discussed as a concept at least as far back as 1995[lix]; this was mirrored in research labs around the world and captured in a 1997 survey of research on intelligent software agents[lx]. These agents went beyond the vision that PapriCom implemented.

A classic example of this was Wildfire Communications, Inc., which created a voice-enabled virtual personal assistant in 1994[lxi]. Wildfire as a service was eventually shut down in 2005 due to an apparent decline in subscribers[lxii]. In terms of capability, Wildfire could do tasks that are currently beyond Apple’s Siri. Wildfire did have limitations: it was an off-device service that used a phone call rather than an internet connection, which limited its use to Orange mobile subscribers on early digital cellular networks.

Almost a quarter of a century later, we’re now seeing devices that look to go beyond Wildfire, with varying degrees of success. For instance, the Rabbit R1 could order an Uber ride or groceries from DoorDash[lxiii]. Google Duplex tries to call restaurants on your behalf to make reservations[lxiv] and Amazon claims that it can shop across other websites on your behalf[lxv]. At the more extreme end is Boeing’s MQ-28[lxvi] and the Loyal Wingman programme[lxvii]. The MQ-28 is an autonomous drone that would accompany crewed combat aircraft into battle, once it’s been directed to follow a course of action by its human colleague in another plane.

The MQ-28 will likely operate in an electronic environment that could be jammed. Even if it wasn’t jammed, the length of time taken to beam AI instructions to the aircraft would negatively impact aircraft performance. So, it is likely to have a large amount of on-board computing power. As with any aircraft, the size and power draw of computing resources is a trade-off with the amount of fuel or payload it can carry. So, efficiency in terms of intelligence per watt becomes important to develop the smallest, lightest autonomous pilot.

As well as a more hostile world, we also live in a more vulnerable time in terms of cyber security and privacy. It makes sense to have critical, more private AI tasks run on a local machine. At the moment, models like DeepSeek can run natively on a top-of-the-range Mac workstation with enough memory[lxviii].
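
    Back-of-the-envelope arithmetic shows why memory, not just compute, is the constraint here. Assuming 4-bit quantised weights at roughly half a byte per parameter (an assumption for illustration, not the reviewer’s exact configuration):

        # Rough sizing for holding a large model's weights in local memory.
        params = 671e9              # DeepSeek R1 parameter count
        bytes_per_param = 0.5       # 4-bit quantisation, overhead ignored
        print(f"{params * bytes_per_param / 1e9:.0f} GB for the weights alone")
        # ~336 GB: far beyond any phone, feasible on a big-memory workstation.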

This is still a long way from the vision of completely local execution of ‘agentic AI’ on a mobile device, because intelligence per watt hasn’t scaled down to a level that is useful given the vast range of possible uses that would be asked of the agentic AI model.

    Maximising intelligence per watt

    There are three broad approaches to maximise the intelligence per watt of an AI model. 

    • Take advantage of the technium. The technium is an idea popularised by author Kevin Kelly[lxix]. Kelly argues that technology moves forward inexorably, each development building on the last. Current LLMs such as ChatGPT and Google Gemini take advantage of the ongoing technium in hardware development, including high-speed computer memory and high-performance graphics processing units (GPUs). They have been building large data centres to run their models in. They build on past developments in distributed computing going all the way back to 1962[lxx]. 
    • Optimise models to squeeze the most performance out of them. The approach taken by some of the Chinese models has been to optimise the technology just behind the leading-edge work done by the likes of Google, OpenAI and Anthropic. The optimisation may use both LLMs[lxxi] and quantum computing[lxxii] – I don’t know about the veracity of either claim. 
    • Specialised models. Developing models by use case can reduce the size of the model and improve the applied intelligence per watt. Classic examples of this range from the fuzzy logic used for the past four decades in consumer electronics, to Mistral AI[lxxiii] and Anduril’s Copperhead underwater drone family[lxxiv]. A sketch of the underlying metric follows this list. 
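
    However it is maximised, the metric itself is simple to state. A sketch, with entirely invented figures, of comparing two deployments on intelligence per watt, here crudely proxied as tokens generated per joule:

        # 'Intelligence per watt' as a crude proxy: tokens per joule.
        # Both sets of figures below are invented for illustration.
        def tokens_per_joule(tokens_per_second, power_watts):
            return tokens_per_second / power_watts   # a watt is one joule per second

        cloud_gpu = tokens_per_joule(tokens_per_second=120, power_watts=700)
        laptop_npu = tokens_per_joule(tokens_per_second=25, power_watts=15)
        print(f"cloud GPU:  {cloud_gpu:.2f} tokens/J")
        print(f"laptop NPU: {laptop_npu:.2f} tokens/J")
        # The small device can win on efficiency even while losing on raw speed.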

    Even if an AI model can do something, should the model be asked to do so?

    AI use case appropriateness

We have seen a clear direction of travel over the decades towards more powerful, portable computing devices – which could function as an extension of their user once intelligence per watt allows AI to be run locally.

Having an AI run on a cloud service makes sense where you are on a robust internet connection, such as using the wi-fi network at home. This works for general everyday tasks with no information risk, for instance helping you complete a newspaper crossword when there is an answer you are stuck on and the intellectual struggle has gone nowhere.

A private cloud AI service would make sense when working with, accessing or processing data held on the service. An example of this would be Google’s Vertex AI offering[lxxv].

On-device AI models make sense when working with one’s personal private details, such as family photographs, health information or the apps within your device. Apps like Strava, which share data, have been shown to have privacy[lxxvi] and security[lxxvii] implications. (I am using Strava as an example because it is popular and widely known, not because it is a bad app per se.)

While businesses have the capability and resources to maintain a multi-layered security infrastructure to protect their data most[lxxviii] of[lxxix] the[lxxx] time[lxxxi], individuals don’t have the same security. As I write this, there are privacy concerns[lxxxii] being expressed about Waymo’s autonomous taxis. An individual’s mobile device, however, is rarely out of physical reach, and for many their laptop or tablet is similarly close. All of these devices tend to be used in concert with each other. So, for consumers, having an on-device AI model makes the most sense. All of which results in a problem: how do technologists squeeze their most complex models into a laptop, tablet or smartphone?


    [i] Radiomuseum – Loewe (Opta), Germany. Multi-system internal coupling 3NF

    [ii] (1961) Solid Circuit(tm) Semiconductor Network Computer, 6.3 Cubic inches in Size, is Demonstrated in Operation by U.S. Air Force and Texas Instruments (United States) Texas Instruments news release

    [iii] (2000) The Chip that Jack Built Changed the World (United States) Texas Instruments website

    [iv] Moravec H (1988), Mind Children (United States) Harvard University Press

    [v] (2010) Makimoto’s Wave | EDN (United States) AspenCore Inc.

    [vi] Jobs, S. (1980) Presentation on Apple Computer history and vision (United States) Computer History Museum via Regis McKenna

    [vii] Sinofsky, S. (2019) ‘Bicycle for the Mind’ (United States) Learning By Shipping

    [viii] Hertzfeld, A. (1981) Bicycle (United States) Folklore.org

    [ix] Jones, D. (2016) Codemasters (United Kingdom) Retro Gamer – Future Publishing

[x] Engelbart, D. (1968) A Research Center For Augmenting Human Intellect (United States) Stanford Research Institute (SRI)

[xi] Hormby, T. (2006) Apple’s Worst Business Decisions (United States) OSnews

[xii] Honan, M. (2009) From Brick to Slick: A History of Mobile Phones (United States) Wired

    [xiii] Ericsson History: The Nordics take charge (Sweden) LM Ericsson.

    [xiv] Singh, H., Gupta, M.M., Meitzler, T., Hou, Z., Garg, K., Solo, A.M.G & Zadeh, L.A. (2013) Real-Life Applications of Fuzzy Logic – Advances in Fuzzy Systems (Egypt) Hindawi Publishing Corporation

    [xv] Reid, T.R. (1990) The Future of Electronics Looks ‘Fuzzy’. (United States) Washington Post

    [xvi] Kushairi, A. (1993). “Omron showcases latest in fuzzy logic”. (Malaysia) New Straits Times

    [xvii] Watson, A. (2021) The Antique Microwave Oven that’s Better than Yours (United States) Technology Connections

    [xviii] Durbhakula, S. (2022) IBM dumping Watson Health is an opportunity to reevaluate artificial intelligence (United States) MedCity News

    [xix] (1998) PapriCom Technologies Wins CommerceNet Award (Israel) Globes

    [xx] Von Ahn, L., Dabbish, L. (2004) Labeling Images with a Computer Game (United States) School of Computing, Carnegie-Mellon University

    [xxi] Butterfield, D., Fake, C., Henderson-Begg, C., Mourachov, S., (2006) Interestingness ranking of media objects (United States) US Patent Office

    [xxii] Delaney, K.J., (2005) Yahoo acquires Flickr creator (United States) Wall Street Journal

    [xxiii] Hood, S., (2008) Delicious is 5 (United States) Delicious blog

    [xxiv] (2017) 10 years of Carbon Neutrality (United States) Google

    [xxv] Bakshi, V. (2018) EUV Lithography (United States) SPIE Press

    [xxvi] Wade, W. (2000) ASML acquires SVG, becomes largest litho supplier (United States) EE Times

    [xxvii] Lammers, D. (1999) U.S. gives ok to ASML on EUV effort (United States) EE Times

[xxviii] Mead, C., Conway, L. (1979) Introduction to VLSI Systems (United States) Addison-Wesley

    [xxix] Lavagno, L., Martin, G., Scheffer, L., et al (2006) Electronic Design Automation for Integrated Circuits Handbook (United States) Taylor & Francis

    [xxx] (2010) Apple Launches iPad (United States) Apple Inc. website

    [xxxi] (1997) PalmPilot Professional (United Kingdom) Centre for Computing History

    [xxxii] Jobs, S. (2005) Apple WWDC 2005 keynote speech (United States) Apple Inc.

    [xxxiii] (2014) Makimoto’s Wave Revisited for Multicore SoC Design (United States) EE Times

    [xxxiv] Makimoto, T. (2014) Implications of Makimoto’s Wave (United States) IEEE Computer Society

    [xxxv] (2006) Nokia and Yahoo! add Flickr support in Nokia Nseries Multimedia Computers (Germany) Cision PR Newswire

    [xxxvi] Gibson, W. (2007) Spook Country (United States) Putnam Publishing Group

    [xxxvii] The O2O Business In China (China) GAB China

    [xxxviii] Carroll, G. (2008) Web Centric Business Model (United States) Waggener Edstrom Worldwide for LaSalle School of Business, Universitat Ramon Llull, Barcelona

    [xxxix] Carroll, G. (2008) Web of no web (United Kingdom) renaissance chambara

    [xl] Kelly, K. (2018) AR Will Spark the Next Big Tech Platform – Call It Mirrorworld (United States) Wired

    [xli] Heckerman, D. (1988) An Empirical Comparison of Three Inference Methods (United States) Microsoft Research

    [xlii] Sze, V., Chen, Y.H., Yang, T.J., Emer, J. (2017) Efficient Processing of Deep Neural Networks: A Tutorial and Survey (United States) Cornell University

    [xliii] Webber, M. E. (2024) Energy Blog: Is AI Too Power-Hungry for Our Own Good? (United States) American Society of Mechanical Engineers

    [xliv] Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, L., Polosukhin, I. (2017) Attention Is All You Need (United States) 31st Conference on Neural Information Processing Systems (NIPS 2017)

    [xlv] Marche, S. (2024) Was Linguistic A.I. Created By Accident? (United States) The New Yorker.

    [xlvi] Radford, A. (2018) Improving language understanding with unsupervised learning (United States) OpenAI

    [xlvii] Heath, N. (2025) Authors outraged to discover Meta used their pirated work to train its AI systems (Australia) ABC (Australian Broadcast Corporation)

    [xlviii] Morey, M., O’Sullivan, J. (2024) In-brief analysis: Data center owners turn to nuclear as potential energy source (United States) Today in Energy published by U.S. Energy Information Administration

    [xlix] Bradshaw, T., Morris, S. (2024) Microsoft acquires twice as many Nvidia AI chips as tech rivals (United Kingdom) Financial Times

    [l] Smith, C. (2025) ChatGPT’s viral image-generation upgrade is ruining the chatbot for everyone (United States) BGR (Boy Genius Report)

    [li] Wayner, P. (1997) Human Error Cripples the Internet (United States) The New York Times

    [lii] Honan, M. (2013) Killing the Fail Whale with Twitter’s Christopher Fry (United States) Wired

    [liii] Mazarr, M. (2025) The Coming Strategic Revolution of Artificial Intelligence (United States) MIT (Massachusetts Institute of Technology)

    [liv] Knight, W. (2025) DeepSeek’s New AI Model Sparks Shock, Awe, and Questions from US Competitors (United States) Wired

    [lv] Sharwood, S. (2025) Manus mania is here: Chinese ‘general agent’ is this week’s ‘future of AI’ and OpenAI-killer (United Kingdom) The Register

    [lvi] Hewitt, C., Bishop, P., Steiger, R. (1973). A Universal Modular Actor Formalism for Artificial Intelligence. (United States) IJCAI (International Joint Conference on Artificial Intelligence).

    [lvii] Sculley, J. (1987) Keynote Address On The Knowledge Navigator at Educom (United States) Apple Computer Inc.

    [lviii] (1987) Apple’s Future Computer: The Knowledge Navigator (United States) Apple Computer Inc.

    [lix] Kelly, K. (1995) Out of Control: The New Biology of Machines (United States) Fourth Estate

    [lx] Nwana, H.S., Azarmi, N. (1997) Software Agents and Soft Computing: Towards Enhancing Machine Intelligence Concepts and Applications (Germany) Springer

    [lxi] Rifkin, G. (1994) Interface; A Phone That Plays Secretary for Travelers (United States) The New York Times

    [lxii] Richardson, T. (2005) Orange kills Wildfire – finally (United Kingdom) The Register

    [lxiii] Spoonauer, M. (2024) The Truth about the Rabbit R1 – your questions answered about the AI gadget (United States) Tom’s Guide

    [lxiv] Garun, N. (2019) One year later, restaurants are still confused by Google Duplex (United States) The Verge

    [lxv] Roth, E. (2025) Amazon can now buy products from other websites for you (United States) The Verge

    [lxvi] MQ-28 microsite (United States) Boeing Inc.

    [lxvii] Warwick, G. (2019) Boeing Unveils ‘Loyal Wingman’ UAV Developed In Australia (United Kingdom) Aviation Week Network – part of Informa Markets

    [lxviii] Udinmwen, E. (2025) Apple Mac Studio M3 Ultra workstation can run Deepseek R1 671B AI model entirely in memory using less than 200W, reviewer finds (United Kingdom) TechRadar

    [lxix] Kelly, K. (2010) What Technology Wants (United States) Viking Books

    [lxx] Andrews, G.R. (2000) Foundations of Multithreaded, Parallel, and Distributed Programming (United States) Addison-Wesley

    [lxxi] Criddle, C., Olcott, E. (2025) OpenAI says it has evidence China’s DeepSeek used its model to train competitor (United Kingdom) Financial Times

    [lxxii] Russell, J. (2025) China Researchers Report Using Quantum Computer to Fine-Tune Billion Parameter AI Model (United States) HPC Wire

    [lxxiii] Mistral AI home page (France) Mistral AI

    [lxxiv] (2025) High-Speed Autonomous Underwater Effects. Copperhead (United States) Anduril Industries

    [lxxv] Vertex AI with Gemini 1.5 Pro and Gemini 1.5 Flash (United States) Google Cloud website

    [lxxvi] Untersinger, M. (2024) Strava, the exercise app filled with security holes (France) Le Monde

    [lxxvii] Nilsson-Julien, E. (2025) French submarine crew accidentally leak sensitive information through Strava app (France) Le Monde

    [lxxviii] Arsene, Liviu (2018) Hack of US Navy Contractor Nets China 614 Gigabytes of Classified Information (Romania) Bitdefender

    [lxxix] Wendling, M. (2024) What to know about string of US hacks blamed on China (United Kingdom) BBC News

    [lxxx] Kidwell, D. (2020) Cyber espionage for the Chinese government (United States) U.S. Air Force Office of Special Investigations

    [lxxxi] Gorman, S., Cole, A., Dreazen, Y. (2009) Computer Spies Breach Fighter-Jet Project (United States) The Wall Street Journal

    [lxxxii] Bellan, R. (2025) Waymo may use interior camera data to train generative AI models, but riders will be able to opt out (United States) TechCrunch

  • Liberation day + more things

    Liberation day

    Liberation Day was a glorified press conference at which the Trump administration revealed its tariff schedule for every country around the world. Weirdly enough, Russia wasn’t tariffed. Here’s some of the interesting analysis I saw before and after the event.

    Liberation day social media post.

    The Trump administration leant into an aesthetic influenced by patriotic memes, the steeliness of The Apprentice and generative AI – a look I call Midjourney Modern. Liberation Day was no exception.

    The Economist did a hot take that called the whole thing a ‘fantasy’.

    America’s Cultural Revolution by Stephen Roach | Conflict – Stephen Roach was formerly an Asia-focused chief economist at Morgan Stanley. The American Cultural Revolution narrative is something I have heard from a few contacts in China, and Roach echoes that perspective in this article.

    China says weaponising agriculture in US trade war should be off-limits | South China Morning Post – agricultural price shocks in the past have led to civil disruption in China

    Liberation Day and The New World Order | Fabricated Knowledge

    Opinion | I Just Saw the Future. It Was Not in America. – The New York Times – President Trump is focused on what teams American transgender athletes can race on, and China is focused on transforming its factories with A.I. so it can outrace all our factories. Trump’s “Liberation Day” strategy is to double down on tariffs while gutting our national scientific institutions and work force that spur U.S. innovation. China’s liberation strategy is to open more research campuses and double down on A.I.-driven innovation to be permanently liberated from Trump’s tariffs.

    Beijing’s message to America: We’re not afraid of you. You aren’t who you think you are — and we aren’t who you think we are. – Thomas Friedman – Overall, I would agree with the sentiment, BUT you have to remember that what he’s been shown is the best view of what China can do, and reality is much more complex. I still think that a lot of the future is being made in places like France, Finland, Latvia, Japan, Singapore, South Korea and Taiwan – as well as China. What China does best is quantity at a scale all of its own, something America has historically excelled at.

    Consumer behaviour

    Bachelors Without Bachelor’s: Gender Gaps in Education and Declining Marriage Rates by Clara Chambers, Benjamin Goldman, Joseph Winkelmann :: SSRN

    Culture

    Montreal DJs move clubbing from midnight to morning, adding coffee and croissants | Trendwatching – early-morning clubbing; it reminds me of Marky J‘s mornings at the Baa Bar in Liverpool.

    Health

    Is Gen Z more mentally ill, or do they just talk about it more? | Doomscrollers

    Europe Rapidly Falling Behind China in Pharma, Astra Chief Warns – Bloomberg

    Ideas

    I’m Tired of Pretending Tech is Making the World Better | Joan Westenberg

    Innovation

    Samsung Develops Groundbreaking Achromatic Metalens With POSTECH – Samsung Global Newsroom

    Korea

    South Korean movie theater launches monthly knit-while-you-watch screenings | Trendwatching

    Luxury

    Counterfeit luxury goods: London raids miss the target | Dark Luxury

    Vogue Business Index top 10: Preppy is back and so is Ralph Lauren | Vogue Business

    Polène: The global success of the French handbag made with love | Le Monde

    Marketing

    X-tortion: How Advertisers Are Losing Control Of Media Choice | Forrester – I am surprised how ‘on the nose’ Forrester is in this post.

    Technicolor, Parent Company of The Mill, MPC, and Mikros, Facing Potential Closure | LBBOnline – this hit the creative industries like a lightning bolt.

    Influencer Marketing: The quiet reset in the influencer economy | ET BrandEquity – the total number of influencers has shot up from 962,000 in 2020 to 4.06 million in 2024, reflecting a staggering 322% growth.
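    The quoted growth figure checks out; a quick sanity check, using the numbers as reported by ET BrandEquity:

    ```python
    # 962,000 influencers in 2020 -> 4.06 million in 2024
    growth = (4_060_000 - 962_000) / 962_000
    print(f"{growth:.0%}")  # prints 322%
    ```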

    Materials

    DIY Birkin? China’s Gen Z 3D print dupes, share on RedNote | Jing Daily – armed with affordable 3D printers and free design templates, young consumers are crafting their own versions of iconic luxury accessories. Homage flowerpots or penholders rather than ‘dupes’, but 3D printing feels mainstream.

    Online

    Revealed: Google facilitated Russia and China’s censorship requests | The Guardian – After requests from the governments of Russia and China, Google has removed content such as YouTube videos of anti-state protesters or content that criticises and alleges corruption among their politicians. Google’s own data reveals that, globally, there are 5.6m items of content it has “named for removal” after government requests. Worldwide requests to Google for content removals have more than doubled since 2020, according to cybersecurity company Surfshark.

    The reason you feel alienated and alone | Madeline Holden – your Dunbar number is filled with para-social relationships rather than social relationships.

    China’s fragile online spaces for debate | Merics

    AI Discoverability: Amazon’s Mistakes | NN Group

    Retailing

    Lidl TikTok Shop launch sells out in under 20 minutes | Retail Gazette – I am curious about Lidl’s fulfilment approach.

    Security

    Military delegates lose sway at China’s signature political gathering | FT

    Putin is Unlikely to Demobilize in the Event of a Ceasefire Because He is Afraid of His Veterans | Institute for the Study of War – which poses economic challenges in Russia and a greater incentive to attack outside Ukraine once the conflict winds down

    Exclusive: Secretive Chinese network tries to lure fired federal workers, research shows | Reuters

    FBI raids home of prominent computer scientist who has gone incommunicado – Ars Technica

    Technology

    Google’s Sergey Brin Asks Workers to Spend More Time In the Office – The New York Times – 60-hour weeks are the productivity sweet spot, according to Sergey Brin. Silicon Valley looks more and more like Hangzhou.

    Alibaba exec warns of overheating AI infrastructure market • The Register

    Telecoms

    SoftBank and Ericsson agree to collaborate on next-gen telco tech

    Web-of-no-web

    Meta announces experimental Aria Gen 2 research smart glasses | CNBC

    WeRide to open driverless taxi service in Zurich | EE News – the Chinese operator is set to launch a fully unmanned taxi service in Zurich in the next few months. This follows the launch of its latest-generation Robotaxi, the GXR, for fully unmanned paid autonomous ride-hailing services in Beijing. The GXR, with an L4-level redundant drive-by-wire chassis architecture, is WeRide’s second Robotaxi model to achieve fully driverless commercial operations in the city following pilot trials.

    Wireless

    London’s poor 5G blamed on spectrum, investment, Huawei ban • The Register – the comments nail it

  • Apple Intelligence delayed + more

    Apple Intelligence delayed.

    Apple announced that the Siri-enhancing features showcased during the 2024 WWDC would be delayed. Apple Intelligence being delayed represents a serious breach of trust for Apple’s early adopters and the developer community. On its own, whilst that’s rare from Apple, it’s survivable.

    Sad Mac icon

    Apple has made other FUBARs: the Newton, some of the Performa model Macintosh computers in the 1990s, the Apple Pippin, the Apple QuickTake cameras and the Power Mac G4 Cube from 2000.

    The most recent game-changing product has been the AirPods line of headphones, which has become ubiquitous on the Tube and in client video calls. But there has been a definite vibe shift in perceptions of Apple.

    • Recent product upgrades to the MacBook Air were given a muted welcome. Personally, I think Apple came out with a banger of a product: the M4 processor in the MacBook Air form-factor at the Intel MacBook Air price of $999.
    • The Vision Pro goggles are at best a spoiler on the high-end market for Meta’s VR efforts, and an interesting experiment once lens technology catches up with their concept. At worst they are a vanity project for Tim Cook with a very limited audience.
    • Conceptually, Apple Intelligence told a deceptively good story: let others develop the underlying LLMs that would power it. This solution was partially forced on Apple by the mutually exclusive needs of China and its other markets, but it also meant that Apple had a smaller AI challenge than other vendors. On-device intelligence would work out the best way to solve a problem and handle easier requests without the latency of consulting a cloud service; more complex problems would then be doled out to off-device services, with privacy as a key consideration – see the sketch after this list. The reality is that Apple Intelligence is delayed until 2027 because of technical challenges.
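    Apple hasn’t published how that on-device/cloud hand-off actually works, so what follows is a minimal sketch of the routing idea only – every name in it (estimate_complexity, OnDeviceModel, CloudModel) is a hypothetical stand-in, not an Apple API:

    ```python
    # Hypothetical sketch of hybrid on-device/cloud request routing.
    # None of these classes or functions are Apple APIs.

    from dataclasses import dataclass

    @dataclass
    class Request:
        prompt: str

    class OnDeviceModel:
        def answer(self, request: Request) -> str:
            # Small local model: low latency, nothing leaves the device.
            return f"[on-device] {request.prompt}"

    class CloudModel:
        def answer(self, request: Request) -> str:
            # Larger off-device model; a privacy layer would strip
            # identifying data before anything is sent.
            return f"[cloud] {request.prompt}"

    def estimate_complexity(request: Request) -> float:
        # Crude proxy: longer, multi-step prompts score higher.
        words = len(request.prompt.split())
        steps = request.prompt.count(",") + request.prompt.count(" then ")
        return min(1.0, words / 30 + steps / 5)

    def route(request: Request, threshold: float = 0.5) -> str:
        """Handle easy requests locally; hand harder ones off-device."""
        if estimate_complexity(request) < threshold:
            return OnDeviceModel().answer(request)
        return CloudModel().answer(request)

    print(route(Request("Set a timer for ten minutes")))  # stays on-device
    print(route(Request("Summarise my meetings, draft replies, then book travel")))  # goes to cloud
    ```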

    Key commentary on the Apple Intelligence delay:

    Chungking Express.

    One of my best-loved films is Chungking Express, directed by Wong Kar-wai. It was one of the reasons that I decided to take up the opportunity to live and work in Hong Kong. This YouTube documentary cuts together some of the oral history of the making of the film. The story of the production is nuts.

    Drone deliveries

    Interesting documentary by Marques Brownlee on the limited use cases and massive leaps in innovation going into drone delivery systems.

    Effective Marketing for Financial Services

    Les Binet presents a financial services-specific view on marketing effectiveness. It has some interesting nuances, in particular how brand building is MORE important in subscription services.

    Tony Touch set

    Tony Touch did a set for Aimé Leon Dore. It’s an impeccably programmed set.

    LUCID Air focus on efficiency

    It’s rare to hear the spokesperson for an American car company quoting Colin Chapman’s ‘simplify, then add lightness’ design philosophy – which he shared with Norman Foster.

  • Clutch Cargo + more things

    Clutch Cargo

    Clutch Cargo was an animated series first broadcast on American television in 1959. It was created by Cambria Productions, a start-up animation studio that used a number of techniques to radically reduce the cost of producing the series.

    Clutch Cargo

    A key consideration was reducing the amount of movement that needed to be animated. There were some obvious visual motifs used to do this:

    • Characters were animated from waist height up for the majority of the films, which reduced the need to animate legs, walking or running.
    • Much of the movement came from moving the camera around, towards or away from, a static picture.
    • To show an explosion, they shook the camera rather than animate the concussive effect of the blast.
    • Fire wasn’t animated; instead, smoke would be put in front of the camera. Fake snow was sprinkled so that bad weather didn’t need to be drawn.
    • Cameraman Ted Gillette came up with the idea of Syncro-Vox: the voice actor’s head would be held steady, vivid lipstick would be applied, and they would say their lines on camera. Gillette then superimposed their filmed mouths on top of the animated figures. Cambria made use of the technique in all its animations, with the exception of The New Three Stooges – an animated series that allowed Moe Howard, Larry Fine and Joe DeRita to work as voice actors after their movie contracts had finished and they were affected by ill health. A minimal sketch of the compositing idea follows this list.
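    Syncro-Vox was, in effect, image compositing: one static drawing plus a strip of cheap live-action mouth frames replaces redrawing the whole character. A rough illustration of the idea using the Pillow imaging library – the file names and mouth position here are hypothetical:

    ```python
    # Paste a filmed mouth frame onto a static cel so only the mouth 'moves'.
    from PIL import Image

    def syncro_vox_frame(cel_path: str, mouth_path: str,
                         mouth_pos: tuple[int, int]) -> Image.Image:
        """Composite one live-action mouth frame onto a static drawing."""
        cel = Image.open(cel_path).convert("RGBA")
        mouth = Image.open(mouth_path).convert("RGBA")
        frame = cel.copy()
        # Using the mouth image as its own mask keeps its oval cut-out shape.
        frame.paste(mouth, mouth_pos, mouth)
        return frame

    # One static drawing + 24 mouth frames = one second of 'animation'.
    frames = [
        syncro_vox_frame("clutch_cel.png", f"mouth_{i:03d}.png", (140, 210))
        for i in range(24)
    ]
    frames[0].save("clutch.gif", save_all=True,
                   append_images=frames[1:], duration=42, loop=0)
    ```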

    These choices meant that Clutch Cargo cost about 10 per cent of what it would have cost Disney to animate. The visual hacks to cut costs were also helped by the way the scripts were developed. Clutch Cargo avoided comedy, instead focusing on Tintin-like adventures; physical comedy gags create a lot of movement to animate. Because the focus was on storytelling, the young audience weren’t bothered by the limited animation – they were captivated enough to suspend their disbelief.

    Culture

    jwz: NEUROBLAST: Dispatch From The Cyberpunk City – Contemporary view of San Francisco through the lens of cyberpunk literature

    Energy

    ‘Hydrogen nanoreactors’ can create breakthrough in Green Hydrogen | EE News Europe

    FMCG

    Ozempic Could Crush the Junk Food Industry. But It Is Fighting Back. – The New York Times – Lars Fruergaard Jorgensen, the chief executive of Novo Nordisk, which makes Ozempic and Wegovy, told Bloomberg that food-industry executives had been calling him. “They are scared about it,” he said. Around the same time, Walmart’s chief executive in the United States, John Furner, said that customers on GLP-1s were putting less food into their carts. Sales are down in sweet baked goods and snacks, and the industry is weathering a downturn. By one market-research firm’s estimate, food-and-drink innovation in 2024 reached an all-time nadir, with fewer new products coming to market than ever before.

    Ozempic users like Taylor aren’t just eating less. They’re eating differently. GLP-1 drugs seem not only to shrink appetite but to rewrite people’s desires. They attack what Amy Bentley, a food historian and professor at New York University, calls the industrial palate: the set of preferences created by our acclimatization, often starting with baby food, to the tastes and textures of artificial flavors and preservatives. Patients on GLP-1 drugs have reported losing interest in ultraprocessed foods, products that are made with ingredients you wouldn’t find in an ordinary kitchen: colorings, bleaching agents, artificial sweeteners and modified starches. Some users realize that many packaged snacks they once loved now taste repugnant.

    Gadget

    TIM Brasil unveils a wearable pin to combat phone theft at music festivals | Trendwatching – surprised mobile phone companies haven’t implemented something similar for London

    Marketing

    Madison Avenue has a Price Problem — Too Much Work for Meager Fees — Rather than a Cost Problem Requiring Chronic Downsizings – So why are cost reductions the go-to strategies for holding companies, who must surely know better? Downsizings stress and liquidate talent; they do nothing to improve the quality of agency services.

    IPG predicts 1-2% revenue drop for 2025, eyes savings of $250m ahead of Omnicom merger – interesting financial move as Omnicom deal closes.

    Apple resumes advertising on Elon Musk’s X after 15-month pause – 9to5Mac – the negative reaction to this that I have seen from Mac and iPhone users I know is interesting. It’s as if the scales have dropped from their eyes about Apple’s performative progressive values. Yet the signs have been out there for years – in particular with regard to anything that is even tangentially connected to China.

    Materials

    German startup achieves industrial-scale graphite recovery for lithium ion batteries | EE News Europe

    Media

    Zuckerberg’s rightward policy shift hits Meta staffers, targets Apple | CNBC – employees who might otherwise leave because of their disillusionment with policy changes are concerned about quitting now because of how they will be perceived by future employers, given that Meta has said publicly that it’s weeding out “low performers.” Meta, like many of its tech peers, began downsizing in 2022 and has continued to trim around the edges. The company cut 21,000 jobs, or nearly a quarter of its workforce, in 2022 and 2023. Among those who lost their jobs were members of the civic integrity group, which was known to be outspoken in its criticism of Zuckerberg’s leadership. Some big changes are now taking place that appear to directly follow the lead of Trump at the expense of company employees and users of the platforms, the people familiar with the matter said.

    Security

    Helsing ramps up drone factories across Europe | EE News Europe

    SCAR: Modernizing Satellite Communications at the Speed of War | Soldier Systems Daily

    Singapore

    Why Asia’s young women are going ‘boy sober’ and swiping left on romance | South China Morning Post – more Singaporean young women are opting out of traditional dating and marriage, prioritising career and personal freedom over societal expectations.

    Software

    The End of Programming as We Know It – O’Reilly

    Web-of-no-web

    Nissan to ship driverless cars in 2027 | EE News Europe