Wearable tech – working together

Posted on March 19, 2014


I read that Google now has a version of Android for wearable gear; kinda makes sense to me.

However, what I want to see is greater intercommunication between wearable kit.

If I have a screen projected on my glasses, I do not want to be touching my head, or waving my hands around in the air (like I just don't care). This is not a natural thing, not part of the mental model for using a screen.
You might say, well, you'll learn; however, social norms of behaviour (in many parts of the world) discourage making gestures in public, and touching your head can be awkward, for example if you want to be discreet.

So I am unconvinced that these are the way to go. As I was always told, you have to be constructive when being critical … so, what I'd like is to be able to use my watch or fitness fob/bracelet to control my screen. This is an easier and far less obvious action.
I should declare an interest of sorts: I wore a watch for <breathes a sigh> many decades, and as such the actions of checking, or fiddling with, a watch are natural to me. So that may be the reason.

So I guess what I am saying is, let's have an open, common API that allows the various devices to intercommunicate; that would help a lot.
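To make the idea concrete, here is a minimal sketch of what such a common API might look like: a shared event bus where any device can publish an action and any other device can respond to it. Everything here (the `WearableBus` name, the event shape, the "scroll" action) is an illustrative assumption, not an existing standard.

```typescript
// Hypothetical sketch of a common wearable API: an event bus that
// lets one device (e.g. a watch) drive another (e.g. a glasses display).
// All names are illustrative assumptions, not a real specification.

type WearableEvent = { source: string; action: string };
type Handler = (e: WearableEvent) => void;

class WearableBus {
  private handlers: Map<string, Handler[]> = new Map();

  // A device registers for the actions it is willing to respond to.
  subscribe(action: string, handler: Handler): void {
    const list = this.handlers.get(action) ?? [];
    list.push(handler);
    this.handlers.set(action, list);
  }

  // Any device publishes events; the bus routes them to subscribers.
  publish(e: WearableEvent): void {
    for (const h of this.handlers.get(e.action) ?? []) {
      h(e);
    }
  }
}

// Usage: a discreet tap on the watch scrolls the glasses display,
// with no head-touching or mid-air gesturing required.
const bus = new WearableBus();
let glassesScrollCount = 0;
bus.subscribe("scroll", (e) => {
  glassesScrollCount += 1;
});
bus.publish({ source: "watch", action: "scroll" });
```

The point of the sketch is that neither device needs to know about the other directly; they only need to agree on the shared vocabulary of actions, which is exactly where an open standard would live.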

And with that comes a responsibility for ensuring the user understands the causality of events. If I am not sure why something is happening (or even how), then it will lead to dissonance and a poor user experience. It will not be easy to ensure that the user's mental model is consistently applied across interconnected devices. So perhaps there needs to be a new discussion, and some underlying standards in this space, to allow new ideas to flourish without creating a confused user.

Oh, and is it OK to say that I love the round screens in the Google demo? Somehow it just makes it all seem more futuristic.

[Image of a Hangouts message]


Posted in: life, mobile