NextMind's brain-computer interface is ready for developers

Finally, one you might actually want to wear.

If science fiction has taught us anything, it’s that we harbor an innate fantasy of controlling things with our minds. Whether it’s duping Stormtroopers about which droids they’re looking for or Professor X’s mutant powers, we clearly hope that magical powers lie dormant within. The truth is, the power is definitely there; you might just need something like NextMind’s headset to use it (for now).

NextMind is the latest in a long line of companies trying to harness the brain as a means of controlling our digital world. At first, its take on things may seem familiar: Don a headset that places a sensor on the back of your head, and it’ll detect your brainwaves, which can then be translated into digital actions. One area where NextMind differs is that its sensor seems more practical than many we’ve seen, and it won’t leave you looking like a shower cap-wearing lab rat. In fact, the wearable can just as easily clip onto the rear of a snapback.

Beyond size and aesthetics, NextMind’s technology also seems fairly mature. I tried a demo (via the developer kit, which goes on sale today for $399) and was surprised by how polished the whole experience was. Setup involved just one basic “training” exercise, and then I was up and running, controlling things with my mind. The variety of demos made it clear that NextMind is thinking way beyond simple mental button pushes.

There’s still a slight learning curve to getting the “knack,” and it won’t replace your mouse or keyboard just yet. That’s mostly because we’ll need to wait for a library of apps to be built for it, but also because it’s still a new technology, and it takes some practice to become “fluent” with it, as my terrible performance in a mind-controlled game of Breakout can attest. Still, the diverse and creative demo applications I experienced hold a lot of promise.

NextMind brain-computer interface (Image: James Trew / Engadget)

Right now, the applications are pretty simple: mostly controlling media, games and so on. But NextMind’s founder and CEO, Sid Kouider, is confident the technology will evolve to the point where you can, for example, simply think of an image to search for it. There are also complementary technologies, like AR, where this sort of control not only seems apt but almost essential. Imagine donning a pair of augmented reality glasses and being able to choose menu items or move virtual furniture around your room with just a glance.

The technology driving things is familiar enough: The sensor is an EEG that gently rests against the back of your head. That position is key, according to Kouider, as it’s where your visual cortex’s signals can most easily (and comfortably) be reached. It’s these signals that NextMind uses, interpreting whatever you’re looking at as the item to be acted upon. In its simplest form, that’s a button or trigger, but the demos also show how it can be used to DJ, to copy and paste, and even to augment (rather than simply replace) other inputs, such as the mouse or game controller you’re already using.
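If you’re a developer wondering what that looks like in practice, the basic pattern is easy to sketch. To be clear, the TypeScript below is our own illustration, not NextMind’s SDK: every name in it, and the idea of a tunable confidence threshold, is an assumption. The gist is simply that a decoder periodically reports which tagged object you appear to be looking at, and an app fires an action once that report is confident enough.

```typescript
// Purely illustrative sketch of the "visual focus as trigger" idea; none of
// these names come from NextMind's actual SDK. An (imagined) decoder reports
// which tagged object the wearer appears to be looking at, with a confidence
// score, and a target fires its action once that score clears a threshold.

interface FocusEstimate {
  targetId: string;   // which tagged object the decoder thinks you're watching
  confidence: number; // 0..1: how sure it is
}

class FocusTarget {
  constructor(
    public readonly id: string,
    private readonly onTrigger: () => void,
    private readonly threshold: number = 0.8, // assumed to be tunable per app
  ) {}

  // The runtime would call this on every decoder update.
  handle(estimate: FocusEstimate): void {
    if (estimate.targetId === this.id && estimate.confidence >= this.threshold) {
      this.onTrigger();
    }
  }
}

// A "mental button push" is just a target whose action runs when triggered...
const playButton = new FocusTarget("play-button", () => console.log("play"));

// ...while augmenting an existing input (a mouse, a game controller) means
// combining this event stream with that device's, not replacing it outright.
playButton.handle({ targetId: "play-button", confidence: 0.92 }); // fires
playButton.handle({ targetId: "play-button", confidence: 0.41 }); // ignored
```

However NextMind actually exposes this under the hood, the takeaway is the same: from a developer’s seat, “mind control” reduces to familiar event-driven code, which goes some way to explaining why a $399 dev kit feels viable at all.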

Perhaps more intriguing is the technology’s potential application in the home. As everyday household items become “smart,” the ways we interact with them can change. Kouider said that since this year’s CES, his company has been speaking with around 25 enterprise companies, mostly in the entertainment and gaming industries, but also companies that want to build NextMind’s technology into their own products. “This is something very, very exciting that we are working on right now to integrate this in real, physical objects,” Kouider added. Clapping twice to turn off the lights will seem more passé than ever once you can simply glance at the switch and do it that way.

If you’re the suspicious type, worried that your most intimate thoughts could become the tech world’s next hot data commodity, Kouider wants to remind you that NextMind isn’t decoding your thoughts (that would be a much more incredible technology). It’s simply interpreting your visual focus.

As with the iPhone, which popularized the touchscreen as a primary interface, how this technology fits into our lives will evolve as users (and, in turn, developers) form new expectations and demands for it. Time isn’t quite up for the trusty keyboard-and-mouse combo, but it could start to seem a little quaint.