Jakobsson was a co-founder of user interface (UI) advisory firm TAT (The Astonishing Tribe), which was bought by RIM in 2010. TAT's work formed the basis for much of the UI of the BlackBerry 10 OS. Jakobsson left RIM in December 2012 and is a co-founder of a mobile enterprise software company.
How will we change the way we interact with smartphones? Since Apple popularised touchscreens in 2007, the mobile world has converged on a roughly 4-inch phone with few buttons. Before the iPhone, and in the first few years after it, designers in the industry discussed hidden buttons, haptics [touch feedback], keyboards and add-on keyboards to improve input. But nothing really happened. Instead, Apple revolutionised the phone by focusing on ease of use instead of features.
The hope now rests on making devices 'intelligent': letting us talk to them, or having them 'understand' us. This is futile. Speech-to-text might work for dictation, but I still need to take the phone out of my pocket, and I wouldn't want it announcing things when other people are near.
What will happen in the coming two years then? Will Google Glass let us walk around in an augmented world where you are one word away from Googling the people you meet? Will we get flexible screens that fold out to give us full-sized keyboards or newspaper-sized displays? I doubt it will be mainstream. I think it will be too much information and, honestly, too little interesting information. If you could get text in front of your eyes at any time, what would it say? It can't be static, and it can't be information-heavy material you need to interact with (like your email), as the glasses won't have a very rich input. It needs to be information that highlights your surroundings, but what would that be that wouldn't be spammy? Discounts at nearby stores, or people's names floating above their heads, sound distracting. There are 'use cases' [examples of how people will use them], such as visual manuals for auto mechanics, or the route you should walk to reach the right gate, but these are still niche.
I think the coming two years will be the years of accessories. The innovation will happen in watches, wristbands, dongles and key rings. These will extend the physical input interface to our Tamagotchis [handheld digital pets] and life controls. And phones will spill out further onto existing screens: your computer, your TV and, perhaps in the near future, public screens. Administration, analysis and overview are still best done 'lean-forward' at your computer, and enjoying movies, pictures or music is still done 'lean-backward' in front of your TV. Technology changes fast, but human behaviour moves slowly.
What is needed is a philosophy that helps us turn on the real world, to be more in the 'now'. Devices should tell us if there is something important, really important, to attend to. Not just a Facebook post, a tweet, a news article to read or an email to reply to. Our grandparents' generation fled the assembly line, only for us to create a digital one for ourselves. We work through these self-created chores, hoping to be free at the end of the day. The best use I can see for Google Glass is removing the information around me (logos, text and advertisements) so I can focus. I guess that's not in Google's interest.
The question we should ask ourselves is: Free to do what? Why aren't you doing that now? What are you afraid to miss by not looking at your phone or, god forbid, turning it to mute? Try it for a week and see what happens.