We checked out a prototype head-mounted display from Seebright, demoed at GDC.
Most virtual-reality headsets have their own onboard hardware and screens. An as-yet-unnamed headset from Santa Cruz, Calif., startup Seebright does it a little differently — it uses your smartphone's screen and onboard gyroscope and accelerometer to create virtual-reality and augmented-reality experiences.
This headset, a prototype for which was revealed at the Game Developers Conference (GDC) in San Francisco, is basically just a head-mounted container for your smartphone, and comes with an eyepiece that reflects the screen's images onto your eyes. It's an interesting idea, especially if it shapes up to be a functional, low-cost virtual-reality device, but as it currently exists, the headset is clunky, ill-fitting and uncomfortable on the eyes — both the wearer's and everyone else's.
We tested out the headset at GDC, where Seebright's CEO and founder John Murray explained that the headset was designed with two goals: one, to repurpose a smartphone as a virtual-reality device, and two, to keep the wearer's eyes visible. That's why the bulk of the device sits at the top of the wearer's forehead, at about a 30-degree angle above the wearer's line of sight.
To get the smartphone's image in front of your eyes, you clip on a pair of "beam-splitters," essentially glasses-shaped mirrors that reflect the smartphone's image toward your eyes. There was also a second pair of beam-splitters with 20 percent transparency, meaning the real world behind the image was visible.
We played a rough spaceship simulator running on an iPhone 5, using a custom Seebright controller to aim and fire at asteroids. Turning our head revealed more of the cockpit we were in. Because the Seebright headset itself doesn't have any hardware or software, all of this motion was calculated on the iPhone itself, using its accelerometer and gyroscope. This is similar to the Rescape, a gun-shaped iPhone attachment that creates augmented-reality experiences.
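The head tracking described above can be sketched in a few lines: the phone's gyroscope reports a head orientation, and the app rotates the virtual camera to match, so turning your head reveals more of the cockpit. This is a minimal, hypothetical illustration in Python, not Seebright's actual code; the function name and angle convention are our assumptions.

```python
import math

def camera_forward(yaw_deg, pitch_deg):
    """Convert a head orientation (as reported by the phone's
    gyroscope) into a unit "look" vector for the virtual camera.
    Yaw turns the head left/right; pitch tilts it up/down."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    return (
        math.cos(pitch) * math.sin(yaw),   # x: left/right
        math.sin(pitch),                   # y: up/down
        math.cos(pitch) * math.cos(yaw),   # z: depth into the scene
    )

# Facing straight ahead looks down the +z axis:
print(camera_forward(0, 0))   # (0.0, 0.0, 1.0)
# Turning the head 90 degrees swings the view toward +x:
print(camera_forward(90, 0))
```

Everything here runs on the phone itself, which is the point: the headset contributes only optics and a place to put the handset.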
The demo itself was a bit clunky, however. More importantly, the image appeared slightly off, as if it were faintly doubled.
We then watched a trailer for "Puss in Boots: The Three Diablos" that appeared in 3D, showing that the headset can be used for more than just gaming. However, once again, the image was slightly off.
The Seebright folks eventually realized this was because the device kept slipping on our head, which they said was due to our "long hair." Eventually, after much readjustment, and with one representative holding the back of the device as a counterweight to its heavy front, the screen appeared normal to our eyes.
Of course, this headset is still very much a rough prototype. Designing for any smartphone complicates matters further: the weight at the front of the device will vary depending on which phone is loaded into it.
Using a smartphone comes with other design challenges as well. For one, you have to remove the phone from the headset every time you want to change something on it. Improving the Seebright controller might make this a non-issue, but in the build we saw, it definitely hurt usability.
Also, even though you can theoretically use any smartphone in the Seebright device, you can't use just any app. However, Seebright says it's easy to make apps compatible. The company is developing a software-development kit (SDK) that basically adds two functions: it doubles the screen (creating one image for each eye), and it provides connectivity to the remote control. It's up to app developers to use this SDK, and Seebright is already in talks with several.
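Going by Seebright's description, the screen-doubling half of that SDK amounts to splitting the phone's landscape display into two side-by-side viewports and rendering the scene once into each, so the beam-splitters deliver one image per eye. A hypothetical sketch of that layout math (the function name and tuple layout are our assumptions, not Seebright's API):

```python
def split_viewports(screen_w, screen_h):
    """Divide a landscape phone screen into left-eye and right-eye
    viewports, each returned as (x, y, width, height). The app
    renders the same scene into both halves, optionally with a
    small horizontal camera offset to produce a stereo 3D effect."""
    half = screen_w // 2
    left_eye = (0, 0, half, screen_h)
    right_eye = (half, 0, half, screen_h)
    return left_eye, right_eye

# An iPhone 5 screen is 1136 x 640 pixels in landscape:
left, right = split_viewports(1136, 640)
print(left)   # (0, 0, 568, 640)
print(right)  # (568, 0, 568, 640)
```

The cost of this approach is that each eye sees only half the phone's horizontal resolution, a trade-off shared by every phone-based stereo viewer.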
We're not ready to write off the Seebright headset just yet: The virtual-reality field is still young, and none of the headsets feel entirely natural yet. But the difference in going from Sony's Project Morpheus or the Oculus Rift to the Seebright is striking, and not in a good way.