PerceptionColor/Depth/InfraredFrameSources – Access to Camera Streams

NB: The usual blog disclaimer for this site applies to posts around HoloLens. I am not on the HoloLens team. I have no details on HoloLens other than what is on the public web and so what I post here is just from my own experience experimenting with pieces that are publicly available and you should always check out the official developer site for the product documentation.

A very short post which is really just a reminder to myself.

Quite a while ago, I wrote this post;

Kinect V2, Windows Hello and Perception APIs

where I’d used the Perception classes (now obsolete, I think) PerceptionColorFrameSource, PerceptionDepthFrameSource and PerceptionInfraredFrameSource to try and grab frames of data across the Color/Depth/IR streams from the cameras attached to a system. At the time, I was running that code on a system which had a Kinect V2 camera attached.
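For context, the general shape of that older API looked something like the sketch below. This is a minimal, illustrative sketch of the Windows.Devices.Perception classes in a UWP app rather than the code from that original post;

```csharp
// Minimal sketch only (not the original post's code) of enumerating
// colour sources via the now-obsolete Windows.Devices.Perception classes.
using System;
using System.Threading.Tasks;
using Windows.Devices.Perception;

static async Task ListColorPerceptionSourcesAsync()
{
    // Access has to be requested before frames can be read.
    var status = await PerceptionColorFrameSource.RequestAccessAsync();
    System.Diagnostics.Debug.WriteLine("Access: " + status);

    // The Depth/Infrared source classes follow the same pattern.
    var sources = await PerceptionColorFrameSource.FindAllAsync();

    foreach (var source in sources)
    {
        System.Diagnostics.Debug.WriteLine("Source: " + source.DisplayName);
    }
}
```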

As I say, I think these APIs are now marked obsolete and you’re meant to use MediaFrameSource instead but, at the time of writing, they do still seem to ‘work’.

I remembered this code in the light of this forum thread around access to these types of streams on a HoloLens device;

Is there any access to HoloLens depth sensor

Not that I agree with the sentiment of the thread, but it reminded me that I had run this ‘what type of camera streams have I got?’ code on a HoloLens in the early days and yet didn’t seem to have written down the results anywhere. For the record, the result was that the code essentially didn’t gain access to any of the streams.

I tried to “modernise” that code to use MediaFrameSourceGroup to discover the frame source groups that I could access. Using that API instead, I find the same results: if I query for the groups available on a HoloLens, I see;

[screenshot: the frame source groups reported by the query on HoloLens]

Suggesting that the only sources available here are the ones for VideoPreview (Color) and VideoRecord (Color).
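For reference, that kind of discovery query with the newer API looks something like the sketch below. Again, this is a minimal sketch of MediaFrameSourceGroup.FindAllAsync in a UWP app rather than my exact code;

```csharp
// Minimal sketch: discover the frame source groups (and the kind and
// stream type of each source within them) available on the device.
using System;
using System.Threading.Tasks;
using Windows.Media.Capture.Frames;

static async Task ListFrameSourceGroupsAsync()
{
    var groups = await MediaFrameSourceGroup.FindAllAsync();

    foreach (var group in groups)
    {
        System.Diagnostics.Debug.WriteLine("Group: " + group.DisplayName);

        foreach (var sourceInfo in group.SourceInfos)
        {
            // SourceKind is Color/Depth/Infrared; MediaStreamType is
            // VideoPreview/VideoRecord/etc.
            System.Diagnostics.Debug.WriteLine(
                "  " + sourceInfo.SourceKind + " / " + sourceInfo.MediaStreamType);
        }
    }
}
```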

Naturally, the spatial mesh is made available via other APIs but that’s another story.

Future Decoded 2017 Resources

NB: The usual blog disclaimer for this site applies to posts around HoloLens. I am not on the HoloLens team. I have no details on HoloLens other than what is on the public web and so what I post here is just from my own experience experimenting with pieces that are publicly available and you should always check out the official developer site for the product documentation.

At this year’s Future Decoded show, I presented a couple of sessions with my colleague Pete around Windows Mixed Reality.

  • Windows Mixed Reality – Bootstrapping Your Development
  • Windows Mixed Reality – Developing with Unity

Neither of these was ‘official’ in the sense that Pete and I don’t work on the HoloLens or Mixed Reality teams at Microsoft (we aren’t even on the same continent) and so you should always go and check the official developer site for everything ‘official’ and refer back to it in case of any doubt.

Our intention with the first of these sessions was to provide the list of topics/resources that we’d want to know about if we were starting in Mixed Reality development today. We’ve since screen-captured that session and you can find the video of it below. Apologies for the audio; it’s a little “flat”, which comes from recording straight into a PC with a cheap microphone.

The slides that we made/used for that session are downloadable in PDF form from here.

The intention of the second session was to make some of what we’d shown in the first session “come to life” by putting together some rough pieces in Unity with the Mixed Reality Toolkit (MRTK) and the Mixed Reality Design Labs (MRDL). We’ve screen-captured that session too and it’s available below;

The sharp-eyed amongst you might notice that the video jumps between vertical resolutions of 1620 and 1920; apologies for that – we’re aware of it.

As you’ll hear in the session, we worked from a slightly strange starting point because the MRTK and the MRDL were in the process of ‘coming together’ at the time that we made the session.

Since the session, that work has progressed and so we would start from a different place today than we did when we put the session together. See this pull request for the gory/glory details 🙂 and expect that there is more work to be done to bring some more of the MRDL goodness (e.g. bounding box) into the MRTK.

All mistakes are ours – feel free to let us know about them 🙂

Butterflies ‘Hello World’ Demo (across holographic and immersive headsets)

NB: The usual blog disclaimer for this site applies to posts around HoloLens. I am not on the HoloLens team. I have no details on HoloLens other than what is on the public web and so what I post here is just from my own experience experimenting with pieces that are publicly available and you should always check out the official developer site for the product documentation.

Just a quick post to share a ‘hello world’ style demo that I made to try and illustrate a few pieces in the Mixed Reality Toolkit for Unity that could be used to make code which runs across both immersive and HoloLens headsets with a sprinkling of common features including;

  • Gaze
  • Gesture
  • Voice
  • Spatial mapping
  • Spatial sound
  • Boundary and additional content for an immersive headset

I found a couple of models on Remix3D and on the Unity asset store and I set something up around the theme of butterflies which fly around, banging into walls and responding to a couple of global/local voice commands and gestures. The flying of the butterflies is completely dubious but I quite liked the way it worked out.
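As a flavour of how one of those pieces might be wired up, a global voice command in a Unity project of that era could be handled with the KeywordRecognizer from UnityEngine.Windows.Speech. This is a minimal sketch with hypothetical keywords rather than the actual code from the demo;

```csharp
// Minimal sketch (not the demo's actual code) of a global voice command
// in Unity using the Windows speech APIs.
using UnityEngine;
using UnityEngine.Windows.Speech;

public class VoiceCommands : MonoBehaviour
{
    KeywordRecognizer recognizer;

    void Start()
    {
        // Hypothetical keywords; the demo's own commands may differ.
        recognizer = new KeywordRecognizer(new string[] { "fly", "land" });
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start();
    }

    void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        Debug.Log("Heard keyword: " + args.text);
    }

    void OnDestroy()
    {
        if (recognizer != null)
        {
            recognizer.Dispose();
        }
    }
}
```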

I suspect I made a few errors and it wasn’t quite perfect (my teleporting isn’t quite right outside of the editor) but I screen-captured it here in case it’s of any use to others;

and you can get the code (feel free to fix it 🙂) from here.

There’s nothing innovative in here other than the various mistakes which I made – those are uniquely mine.