The date is November 4th, 2015. It’s a brisk, sunny day in New York City. Melissa Tait and I have just stepped out of an Uber on West 39th Street. Both of us are wondering if we’re in the right place.
This can’t possibly be right…there’s an abandoned body shop over there, and a decrepit-looking brick building right next to it. Check the invite again, I think to myself as we walk toward the building.
A man in what looks like a mechanic’s outfit, standing in a dark corner, notices Melissa and me looking at our phones, then at the building, then back at our phones. He must be able to tell we’re lost. As we pass by him, he motions to us and asks quietly, “You here for the Microsoft event?”
“Actually, yes we are! Do you know where it is?” Melissa asks with a chuckle.
“Right this way,” the man responds, as he begins walking toward the door next to him.
I notice a small, inconspicuous sign that seems to indicate there’s an event at this location. No indication as to what it’s for other than the tiny Microsoft logo.
We walk through the door and around the corner of what looks like a parking garage. The man gestures for us to follow his lead and step up onto the giant utility elevator in front of us. We enter the elevator and he pulls the metal gate down. Soon we are ascending to what I can only assume is either a well-planned mugging or a very well-hidden HoloLens demo. Luckily for us, as the elevator comes to a screeching halt, the man pulls the metal gate back up and we’re greeted warmly by a young woman, who ushers us out of the elevator and into an open warehouse room.
The room is large, but well laid out. This no longer looks like the decrepit brick building I saw from the outside. There is a reception booth in front of us where we’re asked to sign in. The woman explains that we’re scheduled for the 12:15pm demo and should proceed to the right, where a group is sitting on a comfortable-looking couch. They’re watching a short instructional video that seems to illustrate what they’re going to see and do once they begin the actual demo. Just behind them is a fairly large buffet complete with all sorts of snacks…I’m quick to scope the table and notice tasty-looking mini-quiches. After all, I’m a developer; it’s practically in my job description to take advantage of free food!
Fast forward 15 minutes: it’s time to experience first-hand what’s being touted as the next generation of personal computing, or “holographic computing” as Microsoft calls it. A stocky young gent in a plaid shirt and thick-framed glasses gestures for me to enter the ad-hoc living-room enclosure they’ve got set up in the middle of the warehouse floor. There are several of them set up around the area to keep things moving smoothly. The room is about 10’x10’ with no windows, though there is a couch to my right and a chair in the corner.
A controlled environment for optimal results, I think to myself.
The man turns around and I’m able to see the headset for the first time. He points at the headset and says, “Okay – the game you’ll be playing today is called RoboRaid. It’s a mixed-reality shooter that’s intended to highlight some of the coolest features of HoloLens. Basically, there will be robots coming out of the walls all around you and you have to try to shoot them. You shoot them by looking directly at them, holding your index finger and thumb out in front of your face, and making a pinching gesture.”

“That’s cool…why the pinch, though?” I ask.
“Well, that’s actually the universal gesture for selecting in HoloLens…like a mouse click.”
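Microsoft hasn’t shared how its targeting code works, but the “look at it, then pinch” mechanic boils down to a classic piece of geometry: cast a ray from your head along your gaze direction and check whether it passes close enough to the target. Here’s a minimal Python sketch of that idea (the names `gaze_hits`, `Vec3`, and the numbers are mine, not anything from HoloLens):

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float
    def sub(self, o): return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)
    def dot(self, o): return self.x * o.x + self.y * o.y + self.z * o.z

def gaze_hits(head_pos, gaze_dir, target_center, target_radius):
    """True if a gaze ray from head_pos along the (unit) direction
    gaze_dir passes within target_radius of the target's center."""
    to_target = target_center.sub(head_pos)
    # How far along the gaze ray the target's center projects.
    along = to_target.dot(gaze_dir)
    if along < 0:
        return False  # target is behind the user
    # Squared distance from the center to the closest point on the ray.
    closest_sq = to_target.dot(to_target) - along * along
    return closest_sq <= target_radius ** 2

# Looking straight down the z-axis at a robot 3 m away,
# slightly off-center but within a 25 cm "hit sphere":
head = Vec3(0.0, 0.0, 0.0)
gaze = Vec3(0.0, 0.0, 1.0)
robot = Vec3(0.1, 0.0, 3.0)
print(gaze_hits(head, gaze, robot, 0.25))  # → True
```

In a real engine the pinch gesture would simply trigger this test against each robot, firing at whichever one the ray hits first.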
He walks over to me, hands me the headset and says “Go ahead and put it on!”
As I slide the headset onto my head, it feels kind of like wearing a heavy baseball cap, but it’s very well balanced compared to the bulky VR headsets I’ve been wearing lately. The VR headsets are all very front-heavy; in contrast, this headset just seems to fit right. It’s immediately apparent that Microsoft has put some time into thinking about comfort and ‘wearability’. After all – if this is the future of personal computing, you’ve got to be able to wear it for hours and not feel fatigued.
Additionally, it’s important to note that this is an augmented/mixed-reality headset. There is a nearly transparent visor that the wearer can see through while using the device, so whatever environment the user is in remains clearly visible. All digital elements (graphics, videos, etc.) are overlaid on the room you’re actually standing in. This is one of the biggest differentiators between AR and VR: where VR aims to immerse you in another reality, AR’s goal is to “augment” your existing one.
My instructor (we’ll call him Mr. Plaid) walks back over to a laptop in the corner of the room and warns me that the game is about to start. The headset lights up and he instructs me to look around the room. “The headset is now scanning the environment,” he says. “This is how it knows what the room looks like, where the furniture is, etc. The robots will use objects and furniture in the room to their advantage.”
A red grid maps the room in front of my eyes as he explains the process. It looks like the headset is building a digital representation of the room I’m in…and in fact that’s exactly what it’s doing. It reminds me of the original Terminator movie, when we see Arnold’s POV as he scans the rooms he enters. It takes about 10 seconds, and then the real fun begins. A loud BANG behind me; I turn around quickly and there’s a giant hole in the wall! Not a real hole, this is all part of the game, but at first glance it’s fairly convincing. The computer-generated imagery is aligned almost perfectly with the environment (the wall, in this case). The sound is noticeably binaural: I perceive sounds coming from different directions, just as we hear things in the real world. I look toward Mr. Plaid with a smirk that must have been wider than the Cheshire Cat’s.
“This is pretty amazi…”
Just then my world goes red for an instant and my visor is temporarily cracked. An arrow in front of me prompts me to turn around again. As I look behind me, a nasty-looking robot is floating in the middle of the room, swaying from side to side, preparing to shoot at me again.
“OH YOU SON OF A…YOU’RE MINE!” I exclaim as I line him up and make the pinching gesture.
The bot explodes in front of me, but before I can celebrate, another explosion happens to my right. I turn to see another giant hole in the wall, with scorpion-like robots crawling out and creeping across the wall.
BANG (pinch)! Another one down.
I turn to line up a third time but this one is sneaky…he’s hiding behind the couch, out of my view!
“Not for long!” I say as I shuffle awkwardly to the left and tilt my head just far enough so I can see the little guy. I shoot (pinch) and miss. My shot makes a small hole in the wall behind the couch.
“Whoops – I’ll repair that!” I say to Mr. Plaid.
It’s interesting how unaccustomed I was to being able to move around computer-generated objects positioned in the 3D space all around me. Imagine walking around a chair in the middle of a room: the chair doesn’t move, and you see it from different perspectives depending on where you’re standing. In this case, I was dealing with floating robots that actually did move and fly around the room. This is what augmented reality is all about! It seamlessly merges my room with computer-generated 3D objects, characters, robots…and the robots interact with the room as if they were actually in it!
Pretty neat stuff, right? In fairness, I should mention that it’s not all rainbows and unicorns in Microsoft’s secret demo warehouse. I had heard a lot about the potential limiting factors of the headset prior to the demo. A limited field of view was one of the most frequently discussed downsides of this mixed-reality approach. I had also heard a lot about why, with explanations ranging from the processor not being powerful enough to the physics of how light is channeled down through the lenses of the headset. Regardless of those drawbacks, when the game started up, I most certainly recognized that we were not dealing with a 114-degree field of view (roughly the span over which humans normally see in 3D). Instead, it seemed as though I was looking through a window. Some users have described the effect as similar to holding a smartphone horizontally 6–8 inches from your face. Anything that happens in the headset happens within this “window”.
Fortunately, that doesn’t tell the whole story. You’re still free to look around the room, anywhere, at any time. Holograms and computer-generated imagery maintain their position within the room even outside your active field of view. So, for example, that big hole in the wall that was behind me when the demo first started stays behind me, in the same exact spot, for the duration of the demo. It doesn’t matter if I look away, walk away, or leave the room; that big ugly hole will always be there when I return.
This is the other magical feature of HoloLens that I discovered during the demo. Whatever Microsoft is doing behind the curtain to map out rooms is working, and it’s working very, very well. The headset maintains a working memory of the room you’re in and of where computer-generated objects sit in 3D space relative to that room. It does all this in real time, and it does so very convincingly.
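The mapping and tracking tech itself is Microsoft’s secret sauce, but the reason the hole “stays put” is conceptually simple: holograms are stored in room (world) coordinates, and only their on-screen position is recomputed from your head pose each frame. A simplified Python sketch of that transform (function names and the 2D-yaw simplification are mine, not a HoloLens API):

```python
import math

def world_to_view(p_world, cam_pos, cam_yaw):
    """Transform a point from fixed room (world) coordinates into the
    camera's view space, given the camera position and yaw in radians."""
    # Translate so the camera sits at the origin...
    dx = p_world[0] - cam_pos[0]
    dz = p_world[2] - cam_pos[2]
    # ...then rotate by the inverse of the camera's yaw.
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    return (c * dx + s * dz, p_world[1] - cam_pos[1], -s * dx + c * dz)

# The hole in the wall lives at one fixed spot in room coordinates.
hole = (0.0, 1.5, -2.0)  # 2 m behind the starting gaze direction

facing_forward = world_to_view(hole, (0, 0, 0), 0.0)
turned_around = world_to_view(hole, (0, 0, 0), math.pi)

# The view-space z flips sign as the head turns 180 degrees,
# but the stored world position `hole` never changes.
```

Turning your head only changes `cam_yaw`; because `hole` is never touched, the hologram reappears in exactly the same spot whenever you look back.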
As my demo comes to an end, I thank Mr. Plaid for his time and the opportunity to try this amazing piece of technology. They say seeing is believing and after that demo, I believe. I believe that some version of this headset is going to change the world of computing as we know it, and I’m looking forward to being at the front of that wave as it happens.
Be sure to check out the next post in this series as we receive and unbox the HoloLens!