
Now that Google Glass and Oculus Rift have entered the zeitgeist, might we start to see VR and AR products popping up on every street corner? Perhaps, but Meta has just launched an interesting take on the concept by marrying see-through, stereoscopic display glasses with a Kinect-style depth sensor. That opens up the possibility of putting virtual objects into the real world, letting you "pick up" a computer-generated 3D architectural model and spin it around in your hand, for instance, or gesture to control a virtual display appearing on an actual wall. To make it work, you connect a Windows PC to the device, which consists of a pair of 960 x 540 Epson displays embedded in the transparent glasses (with a detachable shade, as shown in the prototype above) and a depth sensor attached to the top. That lets the Meta 1 track your gestures, individual fingers, and walls or other physical surfaces, all of which are processed on the PC with motion-tracking tech to give the illusion of virtual objects anchored to the real world.

Apps can be created via Unity3D and an included SDK on Windows computers (other platforms will arrive later, according to the team), with developers able to publish their apps on the upcoming Meta Store. The group has launched the project on Kickstarter with the goal of raising $100,000 to get developer kits into the hands of app coders, and though it's no Google, Meta is a Y Combinator startup and has several high-profile researchers on the team. As such, it's asking for exactly half of Glass' Explorer Edition price as a minimum pledge to get in on the ground floor: $750. Once developers have had their turn, the company will turn its attention toward consumers and more sophisticated designs -- so if you like the ideas peddled in the video, hit the source to give them your money.

Source: http://www.kickstarter.com/projects/551975293/meta-the-most-advanced-augmented-reality-interface
