Search results for: “throwback gadget”

  • Dumb internet

Over the past 20 years, has the modern web become a dumb internet? That’s essentially a less nuanced version of what media theorist Douglas Rushkoff proposed.

    Douglas Rushkoff at WebVisions 2011
    Douglas Rushkoff at WebVisions 2011 taken by webvisionevent

In his essay “The Internet Used to Make Us Smarter. Now, Not So Much”, Rushkoff outlines the following points:

• Too much focus and analysis has been put on the new, new thing. Novelty gets the attention over human impact
    • Consumer movements or subcultures become fads when they lose sight of their purpose
• Rushkoff thinks that netizens let go of the social / intellectual power of the web. That created the opening for the web to become yet another large corporate business
    • Bulletin boards, messaging platforms and email lists facilitated non-real time or asynchronous communications. Asynchronous communications channels allowed people to be ‘smarter’ versions of themselves.
    • The move to an ‘always-on’ medium has been detrimental
      Going online went from an active choice to a constant state of being. The resulting disorientation is self-reinforcing.

    Rushkoff’s commentary is interesting for a number of reasons. He had been a herald of how online culture would change society and consumer behaviour.

    But his essay posits a simple storyline. It wasn’t people that ruined the internet, it was big business that did it when people weren’t looking. So I wanted to look at the different elements of his hypothesis stage-by-stage.

Too much focus and analysis has been put on the new, new thing

With most technologies we see the thing and realise that it has potential. But it is only when it reaches the consumer that we truly see its power.

Different cultures tend to use technology in very different ways. Let’s think about some examples to illustrate this. Technology research giants like Bell Labs and BT Research had science fiction writers on board to provide inspirational scenarios for their researchers. So it was no surprise that mobile wireless-based communications and computing were envisaged in Star Trek.

    Tricorder
    A replica of a Science Tricorder from Star Trek by Mike Seyfang

And yes, looking back, Star Trek saw that the computer was moving from something the size of a filing cabinet to something that would be a personal device. They realised that there would be portable sensing capabilities and wireless communications. But Star Trek didn’t offer a lot in terms of use cases apart from science, exploration and telemedicine.

    These weren’t games machines, instead the crew played more complex board games. Vulcan chess seemed to be chess crossed with a cake stand.

    Yes, but that’s just the media, surely technolgists would have a better idea? Let’s go to a more recent time in cellphones.

    Here’s Steve Ballmer, at the time the CEO of the world’s largest technology company. Microsoft Research poured large amounts of money into understanding consumer behaviour and tech developments. In hindsight the clip is laughable, but at the time Balmer was the voice of reason.

    Nokia e90 and 6085
    The Nokia E90 Communicator and Nokia 6085 that I used through a lot of 2007

    I was using a Nokia E90 Communicator around about the time that Ballmer made these comments.

I was working in a PR agency at the time, and the best-selling phones amongst my friends in the media industry were:

    • The Nokia N73 I’d helped launch right before leaving Yahoo! (there was an integration with the Flickr photo sharing service)
    • The Nokia N95 with its highly tactile sliding cover and built in GPS

The Danger Sidekick was the must-have device for American teenagers. Japanese teens were glued to keitai phones that offered network-hosted ‘smartphone’ services. Korea had a similar eco-system to Japan, with digital TV. Gran Vals by Francisco Tárrega was commonplace as the Nokia ringtone, from Bradford to Beijing. Business people toted BlackBerry, Palm or Motorola devices which were half screen and half keyboard.

    The iPhone was radical, but there was no certainty that it would stick as a product. Apple had managed to reinvent the Mac. It had inched back from the brink to become ‘cool’ in certain circles. The iPod had managed to get Apple products into mainstream households. But the iPhone wasn’t a dead cert.

The ideas behind the iPhone weren’t completely unfamiliar to me. I’d had a Palm Vx PDA, the first of several Palm touch screen devices I’ve owned. But I found that a Think Outside Stowaway collapsible keyboard was essential for productive work on the device. All of this meant that, at the time, Ballmer seemed to me to be talking the most sense.

    Ballmer wasn’t the only person wrong-footed. So was Mike Lazaridis of Research In Motion (BlackBerry) repeatedly under-estimated the iPhone. Nokia also underestimated the iPhone too.

So often organisations have the future in their hands; they just don’t realise it yet, or don’t have the corporate patience to capitalise on it. A classic example is Wildfire Communications and Orange. Wildfire Communications was a start-up that built a natural-language, software-based assistant.

In 1994 they launched an ‘electronic personal secretary’. The Wildfire assistant allowed users to use voice commands on their phone to route calls and handle messaging and reminders. The voice prompts and sounds gave the assistant a personality.

Orange bought the business in 2000 and then closed it down five years later, as it didn’t have enough users taking it up. Part of this was that the product was orientated towards business users, as cellphones had been in the 1980s and early 1990s.

But growth took off when the cellphone bridged into consumer segments with the idea of a personal device. There wasn’t a horizon-scanning view taken of the service, such as what the impact of lower network latency from 3.5G and 4G networks would be.

Orange had been acquired by France Telecom, and there were no longer executives advocating for Wildfire.

    Demo of Wildfire’s assistant that I found on the web

In retrospect, with the likes of Siri, Alexa and Google Assistant, Wildfire was potential wasted. Orange weren’t sufficiently enamoured with the new, new thing to give it the time to shine, and the potential of the service wasn’t fully realised through further development.

The reason why the focus might be put on the new, new thing is that it’s hard to pick winners, and even harder to see how those winners will be used.

    Consumer movements or subcultures become fads when they lose sight of their purpose

    I found this to be a particularly interesting statement. Subcultures don’t necessarily realise that they’re a subculture until the label is put on them. It’s more a variant of ‘our thing’.

    • The Z Boys of Dogtown realised that they were great skaters, but probably didn’t realise that they were a ‘subculture’.
    • Shawn Stüssy printed up some t-shirts to promote the surf boards he was shaping. He did business the only way he knew how. Did he really realise he was building the foundations of streetwear culture of roadmen and hype beasts?
    • Punks weren’t like the Situationists with a manifesto. They were doing their thing until it was labelled and the DIY nature of doing their thing became synonymous with it.
• The Chicago-based producers making electronic disco music for their neighbourhood clubs didn’t envisage building a global dance music movement. Neither did the London set who had such a good time in Ibiza that they decided to keep partying between seasons at home.

    Often a movement’s real purpose can only be seen in hindsight. What does become apparent is that scale dilutes, distorts or even kills a movement. When the movement becomes too big, it loses shape:

    • It becomes too loose a network
    • There are no longer common terms of reference and unspoken rules
    • The quality goes down

    But if a community doesn’t grown it ossifies. A classic example of this is The WeLL. An online bulletin board with mix of public and private rooms that covered a wide range of interests. Since it was founded in 1985 (on dial-up), it has remained a disappointing small business that had an outsized influence on early net culture. It still is an interesting place. But its size and the long threads on there feel as if the 1990s have never left (and sometimes I don’t think that’s a bad thing).

When you bring everyone into a medium, that has an effect. The median of society is lowbrow. This idea of the lowbrow segment of society was well documented as a concept in George Orwell’s 1984 and Aldous Huxley’s Brave New World. Tabloid newspapers like The Sun or the National Enquirer write to a reading age of about 12 years old for the man in the street. Smart people do stupid things, but stupid people do stupid things more often.

It is why Hearst, Pulitzer and Beaverbrook built media empires on yellow journalism. It is why radio and television were built on the back of long-running daytime dramas (or soap operas) that offer a largely stable, unchanging backdrop, in contrast to a fast-changing world.

    Netizens let go of the social / intellectual power of the web

When I thought about this comment, I went back to earlier descriptions of netizens and the web. Early netizen culture sprang out of earlier subcultures. The WELL came out of The Whole Earth Catalog:

• A how-to manual
    • A collection of essays
• Product reviews – a tradition that Kevin Kelly keeps alive with his Cool Tools blog posts

The Whole Earth Catalog came out of the coalescence of the environmental lobby and the post-Altamont hippy movement’s turn back to the land. Hippy culture didn’t die, but turned inwards. Across the world, groups of hippies looked to carve out their own space. Some were more successful than others at it. The Whole Earth Catalog was designed as an aid for them.

The hippy back-to-the-land movement mirrored earlier generations of Americans who had gone west in the 19th century: emigrants who had sailed for America seeking a better life, and even post-war GIs and their families who headed out to California from the major east coast cities.

The early net offered a similar kind of open space to make your own, not bounded by geographic constraints. Underpinning that ethos was a certain amount of libertarianism. The early netizens cut a dash and created net culture. They also drew from academia: software was seen as shareable knowledge, just like the contents of The Whole Earth Catalog. Which gave us the open source software underpinnings that this website and my laptop both rely on.

The virtual space that was attractive to netizens also meant boundless space for large corporates to move into. Since there was infinite land to stake out, the netizens didn’t let go of power.

    To use the ‘wild west’ as an analogy; early netizens stuck with their early ‘ranch lands’, whilst the media conglomerates built cities that the mainstream netizens populated over time.

    The netizens never had power over those previously unmade commercial lands which the media combines made.

    Asynchronous communications channels allowed people to be ‘smarter’ versions of themselves

Asynchronous communications at best do allow people to be smarter versions of themselves. That is fair to a point. But it glosses over the large chunks of the web that were about being dumb: flame wars, classes in Klingon and sharing porn. Those things have happened on the net for a long, long time.

Being a smarter version of yourself requires a desire to reflect that view to yourself, if not to others. I think that’s the key point here.

    The tools haven’t changed that much. Some of my best discussions happen on private Facebook groups. Its about what you choose to do, and who you choose to associate with.

    In some ways I feel like I am an anachronism. I try and read widely. I come from a family where reading was valued. My parents had grown up in rural Ireland.

    I remember that my Dad brought home a real mix of secondhand books from Modern Petroleum Technology and US Army field manuals for mechanics to Grimm’s Fairy Tales and Hammond Innes.

This blog is a direct result of that wider reading and the curiosity that it inspired. I am also acutely aware that I am atypical in this regard. Maybe it is because I come from a family of émigrés, or that Irish culture prizes education in the widest sense. My Mum was an academically gifted child; books offered her a way off the family farm.

My father had an interest in mechanical things. As the second son, he had to think about a future beyond the family smallholding that his older brother would eventually inherit.

    Being erudite sets up a sense of ‘otherness’ between society at large and yourself. This shows up unintentionally in having a wider vocabulary to draw from and so being able to articulate with a greater degree of precision. This is often misconstrued as jargon or complexity.

    I’d argue good deal of the general population doesn’t want to be smarter versions of themselves. They want to belong, to feel part of a continuum rather than a progression. And that makes sense, since we’re social animals and are hardwired to be concerned about difference as an evolutionary trait. Different could have got you killed – an enemy or an infectious disease.

    The move to an ‘always-on’ medium has been detrimental

Rushkoff and I both agree that the ‘always-on’ media life has been detrimental. Where we disagree is that Rushkoff believes it is a function of platforms such as Twitter. I see it more as a continuum derived directly from network connectivity that drove immediacy.

Before social media was a problem, we had email bankruptcy and information overload. Before widespread web use, 24-hour news broadcasting drove a decline in the editorial space required for analysis, which changed news for the worse.

    James Gleick’s book Faster alludes to a similar concept adversely affecting just about every aspect of life.

    Dumb Internet

I propose that the dumb internet has come about as much from human factors as from technological design. Yes, technology has had its place: algorithms create reductive, personalised views of content based on what they think is the behaviour of people like you, and then vend adverts against that. Consumers are both the workers creating content and the product in the modern online advertising eco-system, as Jaron Lanier’s You Are Not a Gadget succinctly outlines.
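To make the ‘people like you’ mechanic concrete, here is a minimal, purely illustrative Python sketch. It is not any platform’s actual algorithm and every name and data point in it is hypothetical; it just shows the reductive loop being described: rank content by how much your history overlaps with other users’ histories, so the feed converges on whatever similar people already consumed, and adverts get sold against that attention.

```python
# Purely illustrative sketch of a naive "people like you" recommender.
# All data and names are hypothetical; this is not any platform's real system.
from collections import Counter

def recommend(user_history, all_histories, top_n=5):
    """Rank unseen items by how often users with overlapping histories engaged with them."""
    scores = Counter()
    for other in all_histories:
        overlap = len(user_history & other)  # how similar is this user to us?
        if overlap == 0:
            continue
        for item in other - user_history:    # only suggest things we haven't seen
            scores[item] += overlap          # weight by similarity
    return [item for item, _ in scores.most_common(top_n)]

# The more your history overlaps with a cluster of users, the more the feed
# converges on that cluster's content - the reductive loop described above.
histories = [
    {"cats", "sport", "politics"},
    {"cats", "memes"},
    {"sport", "politics", "outrage"},
]
print(recommend({"cats", "sport"}, histories))  # ['politics', 'memes', 'outrage']
```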

The tools that we have, like Facebook, do provide a path of least resistance to inform and entertain us. Although it ends up being primarily entertainment and content that causes the audience to emote.

    But there is a larger non-technological pull at work as well. An aggregate human intellectual entropy that goes beyond our modern social platforms.

    If we want a web that makes us smarter, complaining about technology or the online tools provided to us isn’t enough:

    • We need to want to be smarter
    • We need to get better at selecting the tools that work for us as individuals
    • We need to use those tools in a considered, deliberate way
  • My decade of the iPhone

A decade of the iPhone – last week saw people looking back at the original launch. At the time, I was working at an agency that looked after the Microsoft business. I used a Mac, a Nokia smartphone and a Samsung dual-SIM feature phone. I had had an Apple-hosted email address for six years by then, so I was secure within the Apple eco-system. I accessed my email via IMAP on both my first-generation MacBook Pro and the Nokia smartphone.

    Nokia had supported IMAP email for a few years by then. There were instant messaging clients available to download. Nokia did have cryptographic signatures on app downloads, but you found them on the web rather than within an app store.

At the time BlackBerry was mostly a business device, though BlackBerry messaging seemed to take off in tandem with the rise of the iPhone. The Palm Treo didn’t support IMAP in its native email application; instead it relied on a New Zealand-based software developer and their paid-for app SnapperMail.

Microsoft had managed to make inroads with some business users; both Motorola and Samsung made reasonable-looking devices based on Windows Mobile.

The iPhone launch went off with the characteristic flair you would expect from Steve Jobs. It was a nice-looking handset. It reminded me of the Palm Vx that I used to have, but with built-in wireless. Whilst the Vx had a stylus, I had used my fingers to press icons and write Graffiti to input text. It looked good, but it wasn’t the bolt from the blue in the way that others had experienced it.

    But in order to do work on the Palm, I had a foldable keyboard that sat in my pocket.

    By the time that the iPhone launched, I was using a developer version of the Nokia E90 which had an 800 pixel wide screen and a full keyboard in a compact package.

    Nokia e90 and 6085

I had Wi-Fi, plus 3G and 3.5G cellular wireless. I could exchange files quickly with others over Bluetooth – at the time cellular data was expensive, so being able to exchange things over Bluetooth was valuable. QuickOffice software allowed me to review work documents, and there was a calendar that worked with my Mac and a contacts app. There was GPS and Nokia Maps. I got a couple of days’ usage from a battery charge.

    By comparison when the iPhone launched it had:

• GSM with GPRS and EDGE only – which meant that cellular data was slower than on 3G handsets
    • Wi-Fi
    • Bluetooth (but only for headphone support)
• No battery hatch – which was unheard of in phones (but was commonplace in PDAs)
• No room for an SD, miniSD or microSD card – a step away from the norm. I knew people who migrated photos, message history and contacts from one phone to another via an SD card of some type

    I wasn’t Apple’s core target market at the time, Steve Jobs used to have a RAZR handset.

    As the software was demoed some things became apparent:

• One of the key features at the time was visual voicemail. This allowed you to access your voicemails in a non-linear order. It required deep integration with the carrier, and in the end the feature hasn’t been adopted by all carriers that support the iPhone. I still don’t enjoy that feature. I was atypical at the time, as I had a SIM-only contract with T-Mobile (now EE), but it seemed obvious that Apple would pick carrier partners carefully
    • There was no software developer kit, instead Apple encouraged developers to build web services for the iPhone’s diminutive screen. Even on today’s networks that approach is hit-and-miss
    • The iPhone didn’t support Flash or Flash Lite. It is hard to explain how much web functionality and content was made in Adobe Flash format at the time. By comparison Nokia did support Flash, so you could enjoy a fuller web experience
    • The virtual keyboard was a poor substitute for Palm’s Graffiti or a hardware keyboard – which was the primary reason that BlackBerry users held out for such a long time
    • The device was expensive. I was used to paying for my device but wasn’t used to paying for one AND being tied into an expensive two year contract
    • Once iPhones hit the street, I was shocked at the battery life of the device. It wouldn’t last a work day, which was far inferior to Nokia

I eventually moved to the Apple iPhone with the 3GS. Nokia’s Achilles’ heel had been its address book, which would brick when you synced over 1,000 contacts into it.

    By comparison Apple’s contacts application just as well as Palm’s had before it. Despite the app store, many apps that I relied upon like CityTime, MetrO and the Opera browser took their time to get on the iPhone platform. Palm already was obviously in trouble, BlackBerry had never impressed me and Windows phone still wasn’t a serious option. Android would have required me to move my contacts, email and calendar over to Google – which wasn’t going to happen.


  • SEO marketer crime + more

SEO Marketer Sentenced to More than Three Years in Federal Prison for Extorting Money from a Local Merger and Acquisitions Firm | Department of Justice – just wow. Who would have thought that an SEO marketer would be sent down for a crime that sounds more like the work of a mafioso. You’ll never look at an SEO marketer in the same way again

    The Many Ways Of WeChat: How Messaging Is Eating The World | TechCrunch – quite a nice primer on WeChat.

    Microsoft’s Cortana Gets Baked Into Cyanogen’s Forked Version Of Android | TechCrunch – this should give Google cause for concern

    SMARTPHONES: Hungry Huawei Eyes US Smartphone Market – Huawei’s move into the US smartphone market looks like a logical and necessary step to consolidating its place as a top global brand, but will require years of major investment to succeed.

    Why the 2012 non-Retina MacBook Pro still sells – Marco.org – The better question isn’t why anyone still buys the 101, but why the rest of the MacBook lineup is still less compelling for the 101’s buyers after almost four years, and whether Apple will sell and support the 101 for long enough for newer MacBook models to become compelling, economical replacements

    Circus Ponies – great Alphabet Inc parody by defunct Mac developer

    Sennheiser’s 3D audio will finally make VR complete – Engadget – the problem is about getting to a standard, immersive sound options have been around for a while

    Nextbit’s cloud-savvy smartphone ships on February 16th – Engadget – interesting cloud based focus, reminds me of the Danger Sidekick in that respect

    Yahoo prepping to lay off 10% or more of workforce – Business Insider – you can’t cut yourself to growth

    WeChat Unveils New Calling Feature to Start 2016 – Thatsmags.com – To cap off the holiday season and ring-ring-ring in the New Year, WeChat has released a new calling feature that allows users to call mobile and landline phones around the world.

I, Cringely 2016 Prediction #1 – Beginning of the end for engineering workstations – I, Cringely – a Citrix-style model for graphical computing. It depends on network latency, resiliency and redundancy, so it may not work in many markets

    Casio’s First Android Wear Watch Will Be Ruggedized And Have Up To A Month Of Battery Life & Razer Nabu Watch – Digital Watch with Smart Functions – closer to the minimum spec I would look at, if they nail 200 metres water resistance, a large G-Shock style form factor and a compelling use case I would re-examine my smart watch usage

    NSA hacked two key encryption chips | TechEye – bad news for the likes of Gemalto

    The Huawei Mate 8 Review – Anandtech – incremental rather than revolutionary changes, interesting how Qualcomm takes a pasting in the article comments

    Qualcomm Pushes Snapdragon Chip Beyond Phones | WSJ – maturation of smartphone market forces expansion (paywall)

    Daring Fireball: Samsung Galaxy Tab Pro S | DaringFireball – move to Windows rather than Android – Surface competitor. This probably says a lot about Android on the desktop

    Brian Eno Tells The Origin Story For Ambient Music » Synthtopia – Eno on the origins of ambient

    IMF Chief Economist’s 2016 Warning: Watch Out for China and Emerging-Market Volatility | WSJ – add to this interest rate rises in the developed world and you have a toxic mix of circumstances

richard sapper (1932-2015) DesignBoom – Richard Sapper was the designer behind Lenovo’s, and before that IBM’s, ThinkPad machines. The ThinkPad was influenced by bento boxes. You had the butterfly keyboard of the ThinkPad 701 and a titanium chassis before the famous PowerBook

  • Immersive as well as interactive experiences

This post may take a while to get into, so please bear with me, but I want to take two examples that showcase where I was going with immersive experiences.

    User experience problems barring the way for immersive experiences

    I gave my parents my first iPad in September last year so that we could stay in touch, and detailed some of the challenges that they faced in getting to grips with the device. There were two things that sprang out of this that I found of interest:

    • Special purpose devices like the digital TV EPG (electronic programme guide) or a satellite navigation device interface seemed to be easier to grasp
    • Modern interfaces weren’t as intuitive as we think

    All of this is ironic given that the long term goal of HCI is to design systems that minimise the gap between the user’s cognitive model of what they want to do and the computer’s understanding of the user’s task.

From the late 1980s through to the dot com boom, technology was genuinely exciting. We got a whole new genre of fiction: cyberpunk. There were tremendous advances and glorious failures in innovation.
    Sony Glasstron
    Devices like Sony’s Glasstron display made wearable computing seem like just around the corner. Computer performance leapt forward, you could really feel the speed difference between processor chips or going from one games console generation to another.  And there was a large degree of form-factor experimentation in computing:

    These weren’t necessarily accessible to the average consumer, they were aspirational in nature. Culture including film, art and music promised an immersive cybernetic experience from The Lawnmower Man to Cyber Dog club wear. Virtual reality arcade games to the PowerGlove for Nintendo’s NES meant that William Gibson’s vision of the internet seemed just around the corner. Yet despite the early promise of this technology we ended up with mere interactive experiences that put up a barrier between the user and technology.

    If we come forward to 2013, the killer applications of the smartphone aren’t a million miles away from the proto-instant messaging and chat services provided by CompuServe and AOL before the modern internet got started. In order to get the technology to work better it is time to break down the cognitive barriers and revisit immersive experiences.

    There are two ways of providing immersive experiences:

    • Immersing the consumer in the device
    • Immersing the data in the environment

Examples of immersing the consumer in the device are easy to find: from Sony’s Glasstron headsets, to the augmented reality application Layar, to Apple’s iOS 7.

One of the problems with virtual reality helmets back in the early 1990s was the feeling of motion sickness they induced. This also seems to be happening with iOS 7; in fact there is now a name for the condition: cyber-sickness.

An example of immersing the data in the environment would be 3D projection mapping, a cinema screen, or digital taxi adverts with geofenced campaigns.
The problem is one of scale. Incorporating the data into the environment at least doesn’t make people ill. Where this will take us all is exciting and largely uncharted territory.
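As a rough illustration of the geofenced taxi advert example above, and assuming a simple radius-based fence rather than any vendor’s actual system (all names and coordinates below are hypothetical), the trigger usually reduces to a distance check between the screen’s location and the campaign’s target point:

```python
# Illustrative sketch: should a taxi-top screen show a geofenced advert?
# Hypothetical names and thresholds; not any ad platform's actual API.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    d_phi = radians(lat2 - lat1)
    d_lam = radians(lon2 - lon1)
    a = sin(d_phi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(d_lam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def in_geofence(screen_lat, screen_lon, fence_lat, fence_lon, radius_m):
    """True if the screen is inside the campaign's circular geofence."""
    return haversine_m(screen_lat, screen_lon, fence_lat, fence_lon) <= radius_m

# Example: only show the advert within roughly 500 m of a chosen landmark.
print(in_geofence(51.5101, -0.1340, 51.5074, -0.1278, 500))
```

A real campaign system would typically layer scheduling, frequency capping and reporting on top, but the core spatial trigger is this kind of test.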

    More information