HoloLens, Unity and Recognition with Vuforia (Part 1)

NB: The usual blog disclaimer for this site applies to posts around HoloLens. I am not on the HoloLens team and I have no details on HoloLens other than what is on the public web, so what I post here is just from my own experience experimenting with the pieces that are publicly available. You should always check out the official developer site for the product documentation.

HoloLens and the Windows Holographic platform that it runs provide a lot of capability for immersing your user in mixed reality, but there are a number of scenarios where you might need to identify real-world objects in the user’s environment and that’s something the platform doesn’t have built in. That said, from scanning this API list it looks like the UWP APIs for face detection and for OCR should be present, although I have yet to try them myself on HoloLens.
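
Just to give a flavour of what I mean by those APIs, here’s a very rough sketch of the sort of UWP code I have in mind for OCR and face detection. I haven’t tried this on HoloLens and it assumes that a SoftwareBitmap has already been captured from somewhere (e.g. a camera frame); that capture side of things isn’t shown here.

```csharp
// Rough sketch only - untried by me on HoloLens. Assumes a SoftwareBitmap has
// already been captured from the camera somewhere else in the app.
using System.Threading.Tasks;
using Windows.Graphics.Imaging;
using Windows.Media.FaceAnalysis;
using Windows.Media.Ocr;

static class ImageInspector
{
    public static async Task InspectAsync(SoftwareBitmap bitmap)
    {
        // OCR using the languages from the user's profile (can return null if
        // no suitable language pack is installed).
        var ocrEngine = OcrEngine.TryCreateFromUserProfileLanguages();

        if (ocrEngine != null)
        {
            var ocrResult = await ocrEngine.RecognizeAsync(bitmap);
            System.Diagnostics.Debug.WriteLine($"OCR text: {ocrResult.Text}");
        }

        // Face detection wants the bitmap in a supported pixel format (e.g. Gray8).
        var faceDetector = await FaceDetector.CreateAsync();
        var grayBitmap = SoftwareBitmap.Convert(bitmap, BitmapPixelFormat.Gray8);
        var faces = await faceDetector.DetectFacesAsync(grayBitmap);

        System.Diagnostics.Debug.WriteLine($"{faces.Count} face(s) found");
    }
}
```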

Beyond that, you might reach out to the cloud and use something like Cognitive Services to do some of the work although, clearly, you’d have to apply some filtering there as it’s not entirely practical in terms of performance or price to call a cloud service at 60 frames per second. If you had a way of selecting the right images to send to the cloud, though, then Cognitive Services could definitely help with its vision APIs (and perhaps with speech, knowledge and language too).
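
As a sketch of what I mean by ‘filtering’, the snippet below only lets a frame go up to the Computer Vision ‘analyze’ endpoint every few seconds rather than on every frame. The endpoint URL, region, key and the OnFrameCaptured entry point are all placeholders of my own based on my reading of the Cognitive Services docs, so check them against the real documentation rather than taking them from here.

```csharp
// Rough sketch only: throttle camera frames and send an occasional one to the
// Cognitive Services Computer Vision 'analyze' endpoint. The endpoint, region
// and key below are placeholders/assumptions.
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class CloudVisionThrottle : MonoBehaviour
{
    // Placeholder values - substitute your own region and subscription key.
    const string endpoint =
        "https://westus.api.cognitive.microsoft.com/vision/v1.0/analyze?visualFeatures=Description";
    const string subscriptionKey = "YOUR-KEY-HERE";

    // Only call the cloud at most once every few seconds rather than per-frame.
    const float secondsBetweenCalls = 5.0f;
    float lastCallTime = -1000.0f;

    // Hypothetical entry point - wire this up to however you're grabbing frames
    // (e.g. photo capture) elsewhere in the app.
    public void OnFrameCaptured(byte[] jpegBytes)
    {
        if (Time.time - lastCallTime < secondsBetweenCalls)
        {
            return; // too soon, skip this frame
        }
        lastCallTime = Time.time;
        StartCoroutine(this.AnalyzeAsync(jpegBytes));
    }

    IEnumerator AnalyzeAsync(byte[] jpegBytes)
    {
        var request = new UnityWebRequest(endpoint, "POST")
        {
            uploadHandler = new UploadHandlerRaw(jpegBytes)
            {
                contentType = "application/octet-stream"
            },
            downloadHandler = new DownloadHandlerBuffer()
        };
        request.SetRequestHeader("Ocp-Apim-Subscription-Key", subscriptionKey);

        yield return request.Send();

        // JSON description of the image comes back in the response body.
        Debug.Log(request.downloadHandler.text);
    }
}
```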

If you need something that happens on the device in real time then one possibility is an SDK like Vuforia. I’ve known of this SDK for quite a long time but I’ve never tried it, so I wanted to see whether I could get something working with the Vuforia SDK to recognise objects in a holographic app. Vuforia can recognise a number of different types of things, as listed in the ‘Features’ section here.

The first step with Vuforia is to go and get the SDK;

https://developer.vuforia.com/downloads/sdk

and, at the time of writing, I downloaded the 6.2 SDK for Unity which brings down a Unity package onto my disk.

I then went and got a development license key;

Vuforia License Keys

There’s then a developer guide on the Vuforia site;

Developing Vuforia Apps for HoloLens

and that feels largely like a grouping together of these docs on the HoloLens developer site (or vice versa);

Vuforia development overview

Getting started with Vuforia

The role of extended tracking

Binding the HoloLens Scene Camera

Building and executing a Vuforia app for HoloLens

and so I had a pretty good read of those.

Trying the Vuforia Sample

I went and downloaded the Unity sample from the ‘Digital Eyewear’ section of this page;

Vuforia Samples

and that gives you a Unity package. I made a blank 3D project in Unity 5.5, brought in the HoloToolkit-Unity (as in this post), set up my project, scene and capability settings (including the spatial perception capability) and then brought in the Vuforia package.
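
For what it’s worth, the capability side of that setup can also be done from a small editor script rather than by clicking through the player settings UI. This is just a sketch of the idea; the menu name and the exact set of capabilities here are my own choices rather than anything that the HoloToolkit or Vuforia mandates.

```csharp
// Rough sketch only - an editor script (needs to live in an 'Editor' folder)
// that flips on the UWP capabilities I otherwise set by hand in the Unity UI.
using UnityEditor;
using UnityEngine;

public static class HoloLensProjectSettings
{
    [MenuItem("Tools/Configure For HoloLens")]
    public static void Configure()
    {
        // Spatial perception plus the webcam/microphone capabilities that a
        // camera-based SDK like Vuforia is likely to need.
        PlayerSettings.WSA.SetCapability(PlayerSettings.WSACapability.SpatialPerception, true);
        PlayerSettings.WSA.SetCapability(PlayerSettings.WSACapability.WebCam, true);
        PlayerSettings.WSA.SetCapability(PlayerSettings.WSACapability.Microphone, true);

        Debug.Log("Capabilities set - remaining settings (VR support, quality, etc.) done via the HoloToolkit menus or by hand");
    }
}
```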

In importing the Vuforia samples package, I wasn’t 100% sure what I did or didn’t need and so I went for most of it, missing out only the pieces that seemed to be specific to iOS/Android.


I then found a scene within the sample named ‘Vuforia-2-Hololens.unity’, so I opened that up and tried to see whether I could build the project and deploy it to my device. That worked out just fine, except that I realised I’d forgotten to put my newly acquired license key into the configuration.


I tried again with the license key plugged in and then got a little bit stuck, in that the sample scene I was trying to view seemed to contain a couple of teapots.


When I ran it, though, I seemed to see nothing and I wasn’t entirely sure what was meant to be happening; i.e. I had the sample but didn’t know what I was meant to do with it.

In the docs here, there is a mention that I’m supposed to;

“follow the steps in the Building and Executing the sample section”

but I couldn’t find a ‘Building and Executing the Sample’ section. I made sure that all 3 of the sample scenes were part of my build settings.


This still left me with a blank scene, though, and, frankly, I spent a good 5-10 minutes scratching my head wondering what the sample was supposed to do. I finally figured that maybe the sample was waiting for me to do something, in the sense of presenting it with an image that it could recognise, as those 2 teapots are sitting on two image targets named ‘stones’ and ‘chips’. So I did a web search and found the Vuforia PDF containing those images;

Vuforia Target Images

and I printed them out in colour and put one on my desk here and, sure enough, Vuforia did its thing and a teapot appeared.


So the main difficulty I had in getting the sample to work was in understanding what it was trying to do; i.e. to look for one of the two images that it knew about and draw a teapot on top of whichever one it located. I printed the other image out as well and I got a differently coloured teapot when I looked at that image.
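
Having poked around in the sample afterwards, my understanding is that this ‘show the teapot when the image is found’ behaviour comes from a handler sitting on the image target that listens for Vuforia’s tracking state changes (the sample has its own DefaultTrackableEventHandler for this). The sketch below is my own cut-down version of that idea rather than the exact code that ships with the sample.

```csharp
// Rough sketch only - my cut-down reading of what the sample's trackable event
// handler does: hang this off an ImageTarget and it toggles the child renderers
// (the teapot) on/off as Vuforia finds/loses the printed image.
using UnityEngine;
using Vuforia;

public class TeapotToggler : MonoBehaviour, ITrackableEventHandler
{
    TrackableBehaviour trackableBehaviour;

    void Start()
    {
        this.trackableBehaviour = this.GetComponent<TrackableBehaviour>();

        if (this.trackableBehaviour != null)
        {
            this.trackableBehaviour.RegisterTrackableEventHandler(this);
        }
    }

    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        // Treat detected/tracked/extended-tracked as "found", anything else as "lost".
        bool found =
            newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED ||
            newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        foreach (var childRenderer in this.GetComponentsInChildren<Renderer>(true))
        {
            childRenderer.enabled = found;
        }
    }
}
```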

I think the next step would be to set up an example from scratch and figure out whether I can do a different kind of object/image recognition. I’ll put that into another post to avoid this one getting overly long…
