At last! It’s time to write a commentary on something I actually may have an expert opinion on. Maybe. I received my pre-ordered Myo last week all the way from Thalmic Labs in Toronto.
Interestingly, I only got it a month or so after people I knew who had pre-ordered it significantly before me (around the time it was first announced).
Goes to show it is possible for tech companies to sort out their production to meet current demand (*cough* OnePlus *cough*).
The Myo is an armband or chunky bracelet that sits high on the forearm, just below the elbow. It allows the wearer to control software applications and physical devices through movements that are recognised by its 9-axis IMU (combined gyro, accelerometer and magnetometer).
Its unique feature is its use of the electromyography (EMG) signals of the muscles in the forearm.
Basically, the Myo is able to detect different forearm muscle contraction patterns and link them with motions such as making a fist, spreading the fingers or pivoting the hand. Such recognition can occur without the forearm (and Myo) moving at all. Combined with the IMU, this provides the Myo with the ability to sense control gestures and actions of the hand and fingers.
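Conceptually, this kind of snapshot recognition boils down to feature extraction plus a classifier. The toy below is purely my own illustration, not Thalmic’s (proprietary) pipeline: it computes an RMS amplitude per EMG channel over a short window, then picks the nearest stored gesture “centroid”. All the numbers and gesture names are made up.

```python
# Toy snapshot-style EMG gesture recognition, assuming 8 EMG channels
# sampled in short windows. This is a nearest-centroid sketch of the idea,
# NOT Thalmic's actual algorithm.
import math

def rms_features(window):
    """Root-mean-square amplitude per channel for one window of samples.

    window: list of samples, each sample a list of 8 channel readings.
    """
    n_channels = len(window[0])
    feats = []
    for ch in range(n_channels):
        feats.append(math.sqrt(sum(s[ch] ** 2 for s in window) / len(window)))
    return feats

def classify(features, centroids):
    """Return the gesture label whose stored centroid is nearest."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Hypothetical centroids learned during a calibration phase (made-up values).
centroids = {
    "fist":           [0.9, 0.8, 0.1, 0.1, 0.7, 0.8, 0.2, 0.1],
    "fingers_spread": [0.1, 0.2, 0.9, 0.8, 0.1, 0.2, 0.8, 0.9],
}

window = [[0.85, 0.75, 0.15, 0.1, 0.65, 0.8, 0.2, 0.1]] * 10
print(classify(rms_features(window), centroids))  # → fist
```

The key point the sketch illustrates is that a snapshot classifier only needs the activity pattern at one instant, which is why it works even when the forearm isn’t moving.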
Easy parallels can be drawn to the gloved system in Minority Report or the gesture-based computers at Tony Stark’s disposal in Iron Man and Avengers.
I did my postgraduate studies on using the EMG signal as a control method for prostheses and exoskeletons – a slightly more difficult problem, because the aim was continuous control, for example reproducing an intended trajectory, which requires constant monitoring of the changing activity of antagonistic muscle pairs.
The Myo tackles a simpler problem (and is therefore consumer-ready) because it only needs to recognise particular snapshots of muscle activity. And it does, really well.
For a new technology, Thalmic have done an extraordinary job in facilitating the process of getting the Myo out of its nicely and efficiently packaged box, onto your forearm and recognising control gestures.
Instructions are very clear, and it’s the little things, like checks to make sure the Bluetooth dongle is attached properly and demonstration videos that exemplify each step, which really make the difference.
Any Luddite would be able to do it – and that signifies a fantastic user experience.
The Myo itself works. There is no over-promising of its ability to recognise gestures. Out of the box, give it 2-4 minutes to stabilise and you’re good to go.
The gesture library is currently quite limited, but there are a few to choose from. Obviously, the muscle activity required to produce these gestures is the easiest to differentiate, but it shows Thalmic have carefully considered which gestures to use and how to use them.
This is reinforced by the “wake-up” command (double tap of the thumb and middle finger), which tells the Myo to start listening for a gesture.
The feature prevents false positives, or unintended gestures from carrying through to the device being controlled.
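The locking behaviour can be pictured as a tiny state machine that drops everything until the wake-up gesture arrives, then forwards exactly one command. This is a sketch of the behaviour as I understand it from using the device; the class and method names are my own, not the Myo SDK’s.

```python
# Sketch of the wake-up/lock behaviour: gestures are dropped while "locked",
# and the double-tap unlocks the armband for the next command.
# Hypothetical names throughout, not the Myo SDK.
class GestureGate:
    WAKE = "double_tap"

    def __init__(self):
        self.unlocked = False

    def handle(self, gesture):
        """Return the gesture to forward, or None if it should be ignored."""
        if gesture == self.WAKE:
            self.unlocked = True        # start listening
            return None
        if self.unlocked:
            self.unlocked = False       # relock after one command
            return gesture
        return None                     # false positives never get through

gate = GestureGate()
print(gate.handle("fist"))        # → None (locked, ignored)
print(gate.handle("double_tap"))  # → None (wake-up itself isn't forwarded)
print(gate.handle("fist"))        # → fist (forwarded to the app)
```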
Combinations of gestures can be used to generate a secondary tier of commands. For example, an opened hand followed by a clenched fist could generate an action on its own.
This opens up further possibilities for control and actions once a user becomes proficient at stringing different commands together.
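A combination scheme like that might look like the following sketch, where a two-gesture sequence landing within a short window maps to a secondary command. The gesture names, the timeout and the whole class are my own assumptions, used only to illustrate the idea:

```python
# Hypothetical two-gesture "chords": a short sequence of base gestures maps
# to a secondary command. Not part of the Myo software.
import time

class ComboRecogniser:
    def __init__(self, combos, timeout=1.0):
        self.combos = combos        # (gesture, gesture) tuple -> command name
        self.timeout = timeout      # max seconds between the two gestures
        self.last = None            # (gesture, timestamp) of previous input

    def feed(self, gesture, now=None):
        """Feed one recognised gesture; return a combo command or None."""
        now = time.monotonic() if now is None else now
        if self.last and now - self.last[1] <= self.timeout:
            command = self.combos.get((self.last[0], gesture))
            if command:
                self.last = None
                return command
        self.last = (gesture, now)
        return None

combos = {("fingers_spread", "fist"): "grab"}
r = ComboRecogniser(combos)
assert r.feed("fingers_spread", now=0.0) is None
print(r.feed("fist", now=0.5))  # → grab
```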
The Myo itself is comfortable. I’ve worn it for longer periods of time (20-30 min) and have not yet experienced discomfort.
It also comes with adjustment clips in case you have smaller forearms. I don’t think there’s an alternative for larger forearms, but all I’ve had to deal with are those skin depressions that fade after a while.
I’m hoping to eventually see how it feels after an hour or more of use.
Thalmic has its own Myo app market, called the Myo Market. It contains the “connectors” that are currently compatible with the Myo and allow you to use it to move your mouse, manipulate your web browser (all the good ones are supported: Chrome, Firefox and Safari), change presentation slides or control your media player (Spotify, VLC, iTunes and WMP are all supported).
The connectors allow you to integrate Myo with a host of other applications and games, such as Minecraft, Popcorn Time, and even Saints Row IV, but I have yet to test them out. There’s even a connector for Trello, which at the time of writing is the only productivity tool supported.
The Myo is relatively new, and although their developer program has been active for a while (still a little butthurt I didn’t get in…), I expect more and more connectors to come out soon to further populate the Market.
At the moment, I feel there isn’t enough functionality for a user to completely integrate the Myo into their interfacing experience. It’s not much use when it only works with a handful of apps.
My other main issue is that there’s no over-arching system that ties all the connectors together. For example, to use the Spotify connector, the Spotify window has to be in focus.
I therefore need to either alt-tab or click on the Spotify window before I can perform the gesture I want. Cutting the Myo out of the process is faster and easier.
However, if I could tell Spotify specifically to listen for the next gesture, and other apps to ignore it, I wouldn’t need to disrupt my current workflow. To me, this calls for some kind of app selector, perhaps like the weapon-selection wheel common in games where the protagonist can carry a gazillion weapons.
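A routing layer like this could be quite thin. The sketch below is purely my own idea (classes and names are hypothetical, nothing to do with the actual Myo software): one app is “active” at a time, and recognised gestures are dispatched only to it, so window focus never matters.

```python
# Sketch of the app-selector idea: one active gesture target at a time,
# chosen explicitly (e.g. via a selection wheel), no window focus needed.
# Everything here is hypothetical.
class GestureRouter:
    def __init__(self):
        self.handlers = {}   # app name -> callback taking a gesture
        self.active = None

    def register(self, app, handler):
        self.handlers[app] = handler

    def select(self, app):
        """What the 'weapon wheel' would do: pick the gesture target."""
        self.active = app

    def dispatch(self, gesture):
        """Route a recognised gesture to the active app only."""
        if self.active in self.handlers:
            self.handlers[self.active](gesture)

router = GestureRouter()
events = []
router.register("spotify", lambda g: events.append(("spotify", g)))
router.register("slides", lambda g: events.append(("slides", g)))

router.select("spotify")
router.dispatch("wave_right")   # goes to Spotify only
router.select("slides")
router.dispatch("wave_right")   # now goes to the presentation instead
print(events)  # → [('spotify', 'wave_right'), ('slides', 'wave_right')]
```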
I’m going to take this idea to the developer forums to see if it can gain some traction.
As I said before, the gestures are quite simple, and I expect additional base gestures to be added as the recognition algorithms of the Myo are improved. More complex movements will allow more to be done and faster.
At the moment, because the gestures are built around extremes of wrist motion, it can get tiring quickly when several are used in sequence. Even more so given that this range of wrist motion isn’t often used day to day.
The IMU works well and can accurately sense movement. Its main function so far is to operate as a dial when rotating the forearm (for example as a volume knob), or as a panning tool (for example moving a cursor).
The problem I’ve found, especially with panning, is that when it’s combined with a gesture there is quite a bit of drift, which makes it difficult to use the two features in tandem.
I gave up trying to click on buttons using the mouse connector. Some sort of compensation is required.
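For what it’s worth, even a naive compensation scheme illustrates the problem: suppress angular rates below a small dead zone so slow drift never accumulates, while deliberate motion still registers. The threshold and the whole approach below are guesses on my part, not how Thalmic actually handles it.

```python
# Naive drift suppression for IMU-based panning: ignore tiny angular rates
# (a dead zone) so a slow gyro bias never accumulates into cursor motion.
# The threshold is an assumption for illustration only.
DEAD_ZONE = 0.02   # rad/s below which the arm is treated as stationary

def integrate(rates, dt=0.01, dead_zone=DEAD_ZONE):
    """Integrate gyro rates into a cursor offset, suppressing slow drift."""
    position = 0.0
    for rate in rates:
        if abs(rate) > dead_zone:
            position += rate * dt
    return position

# A slow constant bias (drift) is suppressed; a deliberate motion is kept.
drift_only = [0.01] * 100              # below the dead zone the whole time
motion = [0.01] * 50 + [0.5] * 20      # drift, then a real 0.5 rad/s sweep

print(integrate(drift_only))          # → 0.0
print(round(integrate(motion), 3))    # → 0.1
```

A real fix would probably recentre the cursor whenever a gesture begins, since that is exactly when the drift becomes noticeable, but the dead zone alone shows the principle.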
It is also limited by its position close to the elbow. The data available to it isn’t as rich as it would be if it were around the palm, for example.
This is particularly evident during rotation, as the wrist rotates far more naturally than the upper forearm; getting the upper forearm to rotate without straightening the elbow is slightly awkward. I am, however, interested to see how this might be overcome.
Finally, if you watch the promotional video of the Myo and compare it to what you can do now, you’ll find that some of the applications were only ideas.
While definitely feasible, not all of them are available yet, such as smart home or robotics control. However, I believe support for them is coming – just don’t get your hopes up too early.
The cons are really points for improvement, and Thalmic and the developer community are probably working hard to address them.
The Myo is certainly an innovative product and has the potential to reshape how we interact with technology. Much of its success depends on developers, and on how they exploit its features to provide a seamless, intuitive and natural interfacing method.
I especially look forward to more connectors being developed for productivity apps. Hopefully I’ll have the time to provide some input, as I do want to see it grow and improve.
I really want to be able to wear it constantly and have my environment respond to my wishes without need for visual displays or feedback. Not only that, but I believe the Myo has the potential to really enhance our existing interfacing experiences. The hard part is figuring out how.
Demo of using the Myo to control the Parrot AR.Drone quadcopter. The implementation is basic (it relies mostly on the IMU data), but it’s a good example of where the technology could be used.
Last Updated on Mar 9, 2015