Osterhout Design Group (ODG) has been revolutionizing wearable technologies for the consumer market since the turn of the century. The San Francisco-based incubator, founded by Ralph Osterhout, who got his start designing props for James Bond films, is now bringing science fiction to life by pioneering the next generation of mobile augmented-reality smart glasses.

ODG carried the momentum from debuting its flagship product, the R-7 Smartglasses, at CES 2015 by partnering with NASA to bring AR glasses into space, as well as with brands like MINI of the BMW Group to showcase the manufacturer’s cars.

The first half of 2016 has been just as fruitful for ODG. In June, they collaborated with OTOY, a Los Angeles-based cloud rendering company, to develop mobile products and holographic experiences for virtual cinema. In May, they partnered with NuEyes to offer people suffering from macular degeneration, glaucoma, retinitis pigmentosa, and other vision-robbing conditions a chance at renewed eyesight. The company’s Project Horizon, which provides a mobile, theater-like see-through display, also won best-in-show for AR at Augmented World Expo.

Nima Shams, ODG’s vice president of head-worn products, joined [a]listdaily to discuss how they’re reaching consumers with augmented reality.

Prince William demos ODG’s R-7 Smartglasses at the Founders Forum in London earlier this year.

How has AR evolved to where we are now? Where do you think the space is headed in the future?

Whatever your beliefs are about Google Glass and Oculus, both of those companies brought attention to the AR market. Even though we were quiet at the time, just shipping product to government, heavy industry, and enterprise sectors, we weren’t announcing to the world what we were doing because it wasn’t ready for the world to know. About two years ago was the first time ODG came out and told people about our journey and experience. AR has been around for a long time. The phone form factor is not the right form factor for AR. You’re not going to walk around holding a device in your hand. It just needs to be natural. True AR needs to blur the line where the digital world and the real world interact. Once you can blur that line, people really can’t tell the difference, and that’s when it becomes magical. We think AR will actually be the next step in the evolution of mobile computing.

How can brands use AR activations to reach consumers?

AR can be done just like everything else: well, and not very well. You can have stuff flying out and being distracting, or it can be helpful. If I go to a movie theater and look at a poster, the preview of the movie can start playing. Or, we have a company in our group right now where, if you grab a cereal box and look at it, it gives you all of the nutritional facts. It tells you how much sugar it has and what allergens people should know about. It just enhances people’s lives, and by doing that, it lets you interact with that product. If I’m on the other side and I’m selling that product, it allows deeper storytelling. Now people don’t just see the object; they have almost like a personal assistant that can show them content or information.

With ODG, what are some of the brands that you’ve worked with and can discuss?

Our partnerships span everywhere from heavy industry to consumer-centric devices. We’re seeing a few consumer verticals emerging, such as in-flight entertainment. You might be in a fancy jet, but you have this old screen that was maybe the best thing in 1980 when the plane was first built. You wear AR glasses, and you have that information in front of your eyes at cinema quality, and it’s private. We also have a few companies, like NuEyes, that are helping people with visual impairments. It’s very powerful.

What do you think is the best way to use AR?

My goal at ODG is to fuse what’s real and what’s virtual. AR has a lot of nuances: there’s mixed reality, light fields, assisted reality. All of these terms basically come down to advanced computing on the head that visualizes the world and brings digital content into it. It’s a disruptive new technology. We never knew healthcare would be so huge. We have surgeons performing procedures while wearing our glasses, and it improves their ability to perform a surgery because they can visualize all of the vital information augmented around the operating table. They don’t have to look around the room at vital-sign monitors anymore.


How are you separating yourself from the peripherals competition?

ODG stands on three major pillars. The first one is cinema-quality optics. It’s the equivalent of being in a movie theater. You don’t see pixelation or shuttering effects. It’s high resolution and independently driven to each eye. The second is full integration. Our product doesn’t need an external device; there’s no cable, and it’s all built in. There’s a Qualcomm Snapdragon 805 processor inside; it has 64 gigabytes of internal memory, 4 gigabytes of PoP RAM, Wi-Fi, and GNSS, so any technology that’s in your phone is in our device. And it’s under five ounces. The third one is extreme mobility. You can use this product outdoors in bright sunlight, and you can use it indoors at a show with bright screens.

How are you enabling developers?

We’re investing very heavily in the developer side, because the hardware is the barrier to entry; it’s the software and the content that make it magical. We have developer programs that subsidize the glasses for developers, and we have an SDK for them to use on our product. And because it’s Android, most developers feel very comfortable with the platform.
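Because the device runs Android, most of a developer’s existing code carries over unchanged. As a hypothetical illustration only (this uses stock Android APIs, not ODG’s SDK, whose interfaces aren’t detailed in the interview), a standard activity like the sketch below would run on the glasses much as it runs on a phone:

// Minimal sketch of a stock Android activity, unchanged from phone development.
// Hypothetical example: no ODG-specific APIs are used here, only standard Android ones.
package com.example.smartglasses

import android.app.Activity
import android.os.Bundle
import android.widget.TextView

class HelloGlassesActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // A standard view hierarchy renders on the glasses' display
        // the same way it would on a phone screen.
        val label = TextView(this).apply {
            text = "Hello from a standard Android activity"
            textSize = 24f
        }
        setContentView(label)
    }
}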

What are some of the use-case scenarios for AR to attract consumers?

It’s not only AR; it’s mobile computing, wearable computing, head-worn computing. What I mean by that is, AR is fantastic, but I can also do email and watch YouTube on this device. It can guide me to where I want to go. The challenge with the phone is that the screen can’t get any larger than it is now. It’s not sunlight readable, and it’s not body-position independent: you have to hold the device. With glasses, you put them on once and they’re just persistent. They enhance your surroundings.

How do you think AR can be used toward gaming?

I’m a big gamer myself, and the difference is you don’t need a screen now. You don’t need a set location. Your world becomes your screen for playing a first-person shooter. You can augment your friends, looking at any creature you want. You can scan the room you’re in, interact with it, and run around in it. There’s no longer the concept of ‘hey, I’m going to be in this screen.’ Your world becomes your reality; it becomes the canvas on which you game. I’m very excited about the future of AR.

Follow Manouk Akopyan on Twitter @Manouk_Akopyan