At last month’s Augmented World Expo I had a chance to sit down with David Smith, CTO and co-founder of Wearality, for a conversation about how the Sky open source headset came to be. I had never heard of David or Wearality prior to arriving at the conference. During my first lap of the Exhibitor Expo I came across the Wearality booth, which was really just a high table with a poster hanging behind it and a few early production samples scattered about. The appearance was decidedly more start-up than established business. Next to the table was a man (who turned out to be David Smith) exuberantly extolling the virtues of his product to an attendee. The other company representatives were also deeply engaged, so I poked around at the plastic lens framesets sitting on the table but didn’t have any context to understand what I was looking at. It didn’t seem that I would be able to get anyone’s attention any time soon, so I moved on.
Shortly thereafter I found myself at the Smart Glass Introductions conference session where founders and executives from nine AR and VR glasses makers were given a few minutes each to promote the progress on their products. When it was David’s turn, he took the stage with no slides to show, no demo and no rehearsed spiel — a disposition that matched the sparse vibe of his booth. With the confidence of a quarterback, he launched into a brief promotion of the Wearality Sky that made the product sound absolutely revolutionary. “We think this is a game changer, in fact we know this is,” he pronounced.
I was intrigued. I made my way back to the booth the following day and found the staff to be equally busy with curious show-goers. I patiently waited until I could get a chance to do a demo. What I experienced was like nothing I had ever seen before. I was seeing vivid 3D imagery resembling the Grand Theft Auto game, projected from a phone screen through lenses enveloping my entire field of view. The Sky apparatus felt a bit toy-like, which turned out to be by design, as the product is intended to be lightweight and collapsible so it can fit in a pocket. The more I examined the Sky, the more I appreciated how exquisite the engineering on these things was. I had to learn how they came to exist. David agreed to sit down with me for this interview at the end of that day. What I heard was a fascinating account of invention, brought about by a serendipitous visit to a bookstore by a man with a long history of thinking outside the box. This is a story about how things get invented. Here is how the discussion unfolded:
(A)ugmera: Tell me about how the Wearality venture was conceived.
(D)avid Smith: I started the company a couple of years ago while working at Lockheed Martin where I was the Chief Innovation Officer and Senior Fellow. I was hired by Lockheed to run the program referred to as “Create the Holodeck”, which was all about creating next generation wearable devices using augmented reality and virtual reality. This was a significant project and we had an extraordinary team. Lockheed was one of the very few places that got AR and VR. This stuff is really hard to do. We had access to an incredible combination of technologies and we had millions and millions of dollars worth of infrastructure and technological expertise to pull it off.
A: Was Lockheed’s interest in flight simulation?
D: Flight simulation and all kinds of training, actually. Flight simulation is one of the things we do. We built the big flight Sim Domes. We also make the F35 JSF, which is the most advanced fighter on the planet. And it turns out that the F35 requires even more time in the simulator than normal: it’s so advanced that you really can’t train for every situation in the cockpit itself. The only way you can do it is simulation. Simulated training became the centerpiece of that whole (F35) platform.
A: Is mechanical maintenance training also one of the use cases?
D: Our focus was three-fold. First, there were display technologies. Second, there were augmented reality technologies for (aircraft) maintenance guidance — which is basically guiding a technician through the process of how to fix, maintain and modify. And of course there was maintenance training, what we call the “remote subject matter expert”: grabbing a 3D model dynamically in real time and delivering that across the Internet to a subject matter expert so he can help work on the project and literally do digital graffiti on the part.
A: Like a telestrator?
D: That’s right. But it’s 3D. He (the remote subject matter expert) is literally drawing on the surface of something and communicating maneuvers like “draw around the circle” and “turn this 3 degrees”. He’s literally drawing on the part from his workstation and the remote tech is seeing that locally. So it was a pretty neat system.
A: What was the display technology like for this system?
D: I’ve been doing head mounts for 30+ years, and it was pretty clear they all sucked. The best thing out there was probably the technology developed by Eric Howlett about 30 years ago, called LEEP Optics, and it was really expensive, very heavy, and the resolution was quite poor by today’s standards.
We could do 320 x 200 in an LCD about the same size as our phones today, and it was just impractical. Then the landscape started changing. First of all, phones got pretty good. When we started this project 5 years ago, phones were reasonably equipped, but now they’re amazing and they’re about to go well beyond amazing. So we weren’t thinking in terms of phones when we started the project; we were thinking in terms of other kinds of displays. But we knew that LCD displays were getting close to what we needed them to be able to do.

The other thing that was really important to us was that field of view really mattered, especially in defense, where the bad guys are in the periphery, in the areas where you can’t see. You’ve probably seen images or videos of people doing night vision and they’re scanning with their head. That’s because the field of view of night vision is so narrow that they have to turn their heads really far over to the left and right to get a sense of location. So it was pretty clear that we needed a field of view that was a closer match to what humans would do, both on the AR and VR side.

So we built two kinds of lenses, the one that you saw being the VR lens — the double Fresnel. That was an idea I came up with. Sometimes it’s just accidental. I saw something and said, “Oh shoot, that might work,” and it turned out that it did.
A: What’s your background and how did you come to understand the ins and outs of optics?
D: I started out in mathematics but wound up in software. I’ve done a number of things: I wrote the first real time 3D adventure game, The Colony, in 1987 — so I know 3D really well. I wrote a product called Virtus Walkthrough, which was the first real time design tool for the PC and for the Macintosh. I co-founded Red Storm Entertainment with Tom Clancy. It was even my idea to do Rainbow Six. So I’ve been very focused on the 3D user experience across the board. I’ve been working with Alan Kay, the father of the personal computer, since 1990. He’s the guy who led the team at Xerox PARC that created the Alto, which is the father of the Macintosh, the PC and even the iPhone. So I’ve been doing a lot of hard core 3D stuff. I did my first head mount 30 years ago. I was doing telepresence with a Puma 560 robotic arm that could manipulate things and pick them up, and it actually had a stereo camera that could do eye-hand coordination. That was when I did my first head mount, which was a terrible, terrible system, but it worked.
A: What brought you to Lockheed?
D: I started another company that did 3D training for defense. We sold that to Lockheed, but I didn’t join at that time because I was working on another startup in Silicon Valley. Lockheed wound up convincing me to come on board. So my background is in mathematics and physics, but it’s very broad — you kind of learn what you need as you go. I think it’s important, in a sense, not to be an expert in the field sometimes, because you’re not aware of the limits, and in my case I didn’t know much about optics. I knew what I was looking for and I knew what the requirements were. So in the case of these lenses, I saw something that happened to work that no one had seen before. I brought it to the optics people at Lockheed Missiles and Fire Control and they said, “We think we can make that.”
A: And that’s the double Fresnel lens?
D: That’s the double Fresnel.
A: As a layman, I’ve never come across such a lens…
D: No one has.
A: What’s your frame of reference? What is a double Fresnel lens and how did you come to know about it to even bring it into your scope of possibility when you were developing this?
D: The first key to invention is knowing what you’re looking for. I knew what I wanted, and in this case, this was sort of an accident waiting to happen. So what happened: I was in the bookstore, and you know those little credit card Fresnel lenses that old people get so they can see? They had a sale on them…
A: Fresnel is the name of the maker or the guy who invented it?
D: Fresnel is the guy who invented it — a French guy, which is why I say freh-NELL instead of frez-NELL.
He actually invented it to create lenses for lighthouses. The problem is that if you made a lens for a lighthouse out of a single piece of glass it would be a foot or more thick, and the molten glass would never cool off in the center. It would just be a disaster. It’s almost impossible to make these things, but you need them. So he came up with the idea. The really interesting thing about light and lenses is that light only bends when it hits the interface between two materials — between air and glass, for example. Light bends as it enters the glass, passes straight through the lens, and then bends again on the opposite side when it goes from the glass back into air. That’s an interesting thing. So you can imagine there’s this curved lens, and you can slice it like this [makes a chopping motion]. Then there are these squares of glass that aren’t contributing anything, so you take them out, flatten the whole thing out, and that is a Fresnel lens.
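The bending David describes at each air-glass interface is governed by Snell’s law. This is a standard optics relation, not something stated in the interview; the refractive index values below are typical textbook figures:

```latex
% Snell's law: refraction at the interface between two media,
% where n_1, n_2 are refractive indices and the angles are measured
% from the surface normal.
n_1 \sin\theta_1 = n_2 \sin\theta_2
% Typical values: n_{\mathrm{air}} \approx 1.00,\quad
% n_{\mathrm{acrylic}} \approx 1.49.
```

Inside a uniform medium the ray travels straight, so only the surface angles matter. That is why a Fresnel lens can discard the bulk of the glass and keep just thin rings that preserve the original surface slopes.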
What was interesting was that when you put a Fresnel right up to your eye, you can see what you’re magnifying but you can’t focus on the rings — they’re too close. When I saw that I went out and bought like ten of these Fresnel lenses, because they’re two bucks each at the bookstore, and started playing with them. I tried one and had to have the phone about a foot away to see the screen. I held up two and could bring the screen closer, maybe to half that distance. What’s really neat is when you put four of these together you can hold it up close to the eye. The problem is that it was great in the center but out of focus at the edges. I thought, this was an interesting thing but not quite right. These cheap lenses are made of bendable vinyl, so I took one in my fingers and just sort of bent it around my eye, because I thought it would be cool if you could have a surround image. And when I did that, first of all I did get the surround effect — which was cool — and the other thing is that it was all in focus. I was like, “oh shit, that’s it!” I was changing the focal length between the lens and the display in a way that made it in focus everywhere. We captured that idea, and it was like, “Okay, now can we make this?” And we could. We did!
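David’s stacking experiment matches the idealized thin-lenses-in-contact approximation — a textbook relation he doesn’t cite directly, offered here only to illustrate why each added lens let him bring the phone closer:

```latex
% For thin lenses held in contact, optical powers (1/f) add:
\frac{1}{f_{\mathrm{total}}} = \frac{1}{f_1} + \frac{1}{f_2}
  + \dots + \frac{1}{f_N}
% For N identical lenses of focal length f this gives
% f_{\mathrm{total}} = f / N.
```

Under that approximation, two identical card lenses roughly halve the focal length and four quarter it — consistent with the account above of one lens at about a foot, two at half that distance, and four held close to the eye.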
A: Fresnels are traditionally flat lenses, right? Not just the reading aid but the lighthouse lens as well.
D: That’s right, and what we did was we took that Fresnel and we curved it, which is totally weird. And that’s where the magic happened — when you curved the Fresnel. The image just popped.
A: Is that patentable?
D: The patent is applied for and we are literally a day or two or weeks away from having that one granted. But it’s pretty unique so we’re a hundred percent confident that we’ll get it.
A: How did you go about manufacturing this unique optical effect you invented?
D: It turned out no one had ever done a curved Fresnel at that level of quality before, and our manufacturing partner looked at it and was like, “We don’t know how to make this.” It took us 6 months to go through the design process. Basically you use a diamond turner, a very high quality tool, to make the mold. You must make this with an injection mold — you can’t make it out of glass. It’s impossible. You have to do it out of a material like acrylic, which is what we used. The guy I worked with at Lockheed (to design the lens) figured out how to do this double convergent Fresnel that gives you an extraordinary collimated light field. One of the nice things you’ll notice is that as you move your eye around inside, it’s always in focus, it’s always clear, and the image on the periphery is focused. He just nailed it. He took the idea and ran with it. Getting it made was hard, but 6 months later we had them in hand and they were way better than we ever expected. So that’s how those lenses came to be. Once we knew we could make it, we then had to figure out how to take it to market.
A: At what point did Google Cardboard come into the picture and what led to your decision to hitch your wagon to it?
D: A little over a year ago we saw Cardboard and it was pretty clear. By the way, Cardboard is based on an earlier version of a similar thing done by Mark Bolas, who was a good friend of mine. In fact Mark was probably my primary collaborator over the last couple of years on designs to use these lenses. So this particular design that we’ve got now for the Wearality Sky is really greatly influenced by him.
Cardboard was really interesting in that it was a consumer approach. We liked that a lot. We could see that these guys get it. They understand that the phone is extraordinarily capable. Our original focus was tablets. We saw what they (Google) could do there and we realized there might be a play here. When you look at it from a business perspective instead of a technology perspective: I could get an extraordinarily wide field of view — 170 degrees — with the tablet. But tablets are not where the action is anymore, and the phones were becoming extraordinary. I had a Galaxy S3, then I got an S4, then an S5, then a Note, and I saw that these things are fantastic. Tablets are dead. The big phablets are the place to be. So I designed a version of our head mount for my Note.
I saw that Cardboard was perfect for creating interest, but it was a terrible system with terrible lenses. It’s just a toy. I thought I could do something way better than a toy. What’s cool is they were going to make their API open, so we were the first people to embrace that and in fact I was working with Google trying to get our lenses to work with their Cardboard. And of course the problem we had is that Google gave me a tool to help get the lens model right and it crashed their program because we’re so much wider (field of view) than they were. It was kind of funny. I couldn’t get it to work so I had to go maximize out whatever their edge was, and it was pretty good but it was not quite what we needed. But it was so cool that you could take Cardboard apps and run them on our display. That was interesting — fundamentally interesting. And so we’ve been basically working with Google from the very point where they had anything working with a third party.
Google’s evolving very quickly. We’re big supporters of it, because first of all there’s hundreds and hundreds of apps now that use Cardboard and our head mount works with them. That’s simple. That’s really cool and that’s wonderful. We’re going to have THE premium Cardboard experience. Nobody’s better than us on that. And yet there’s billions and billions of phones that can use that today and that makes it an interesting space. A fundamentally interesting space. So everything else — tablets, big head mounts for PCs are fun, they’re interesting and important but they’re not a market, they’re technologies.
A: Let’s talk about the future of your product. So you’re ready to go to market with it?
D: Yeah, we’re getting close. The first delivery for the Kickstarter is going to be next week and we’re looking at November for large scale distribution. We’re still doing some re-design. This particular design we like, but it’s not as consumer ready as it needs to be so we’re working on that. We’ve got a lot of people who want straps of some sort for the head so we’re looking at that.
A: That was my reaction when I tried it. I want it to be hands-free. No one’s going to use it for any length of time if they have to hold it to their face.
D: That’s right. Actually, we’re looking at where we’re going, and watching movies — that’s what you want. That’s what I want. What are we going to do with it? We’re going to play games some, but we’re going to watch movies a lot. To give you an idea, IMAX theaters give you between 65 and 110 degrees of field of view, depending on where you sit in the audience. We’re 150 degrees. Movies are spectacular on our device. People sitting on the airplane looking at this tiny little screen (holds up his phone) — this is bullshit. I can give you a screen that’s bigger than IMAX, anywhere, any time. My kids might be sitting in the back seat. What are they going to be doing? They can be watching a bigger screen than an IMAX. They can watch Little Mermaid or whatever they’re watching these days in super high quality, on a super large image. It’s just crazy.
A: What phones is the Wearality Sky currently compatible with?
D: Any 5-inch or 6-inch Android phone, and the iPhone 6+.
A: Do you intend to sell them directly?
D: We’ve got a number of options and we’re pursuing them all. We’re talking to people about distribution. We have the product completed and the product rocks. We think that in a sense calling it VR is wrong. It’s far broader than that. When you think about it, there’s billions of hours of 2D and 3D video content out there (Look at YouTube), and we’re the best player in the world for that. Stereo (vision) — we do that. Bubls — you know totally immersive video — we do that. 3D worlds — we do that. But the way we see it, there’s going to be the hard core guys that buy the Oculus and the Vive, and that’s important, we need those. But they’re going to be four to five to eight hundred dollars for the device and about two to three to four thousand for the PC. And that’s great, we want that, but we see that the real market space is going to be portable. Everybody’s got a phone. EVERYBODY’S got a phone. Our device works with just about everybody’s phone. It’s portable. It folds up, and that was intentional. That’s crucial. If you say you want to be part of the “phone ecosystem”, if you come in a case that’s a foot square, you’re not part of the phone ecosystem. You’re not even an add on. You’re nothing. You have to avoid that. You have to be the same size as the phone world that you’re going to be part of. And so we looked at that (factor) and said “well, we can do that”. It became part of the design constraint. How do you make it small enough to fit right next to your phone? Just open it up and you’ve got this incredible experience.
A: How about sound? I don’t see headphones integrated into your product.
D: There’s a reason for that. I like sound. I have very high expectations for what my headphones will do. I have these Audeze LCD-XC which are $1500 headphones — they’re incredible. The lenses of the Sky are the best in the world, why would I pair that with crappy earphones or earbuds? Why would I do that to anybody? Everybody has a phone and everyone who cares has a pretty reasonable headset. Why would I try to bundle them together when the quality of my lenses matches the quality of the highest end headphones in the world? I’m not going to do that. That’s stupid.
A: You’d have to charge way too much.
D: That’s right. That breaks the business model. I’m looking at $69 for this thing.
A: Maybe as a follow-on release you do a deluxe version.
D: You can do that. That’d be great. But I’m going to use the Bose QuietComforts or the LCD-XCs. I’m certainly not going to use cheapo earbuds, but other people might. And go ahead, it’ll be great. It will be a fantastic experience no matter what headphones you use. You can pay for that experience whatever you want to pay. All I can do is guarantee that the experience you get with the lenses is going to be unparalleled.
A: Anything else you’d like to add?
D: It’s going to be a really interesting time over the next year or two. We’re going to see 4k phones. We’re going to see much higher performance. What you have to understand is that the rate that phones are increasing in capability is far outstripping the rate that PCs are increasing in capability. And there’s a reason for that. It’s market size. You look at companies like Nvidia and you look at where they’re spending their money and everything’s mobile.
A: Chip companies are all pouring their money into mobile.
D: You’re going to see this intersection where here’s PC power and here’s phones (David draws a line chart in the air). Some people say, “No, phones will never catch up to PCs, and PCs will never catch up to minis, and minis will never catch up to mainframes.” But I’m sorry, you just don’t understand history. Five years from now, PCs may not even exist outside of servers. Why would you have that when you have something you can carry around with you and look at? VR is going to be very much a part of that ecosystem. That is way better than anything you have today on a PC or laptop or anything else. That’s the future.
A: I fully expect to come home, sit down at my desk, and cradle my phone; hooked up to the cradle is my full size keyboard and my big monitor. This phone goes with me everywhere, and my data is in the cloud, so losing the device is not fatal. I see this as the evolution of the Chromebook.
D: That’s the real future. The way I look at it, my competition can go after the high end (of VR devices), because there’s going to be a race down to the phone. Whoever’s on the phones is going to win.