Experimenting with Research Mode and Sensor Streams on HoloLens Redstone 4 Preview (Part 2)

NB: The usual blog disclaimer for this site applies to posts around HoloLens. I am not on the HoloLens team. I have no details on HoloLens other than what is on the public web and so what I post here is just from my own experience experimenting with pieces that are publicly available and you should always check out the official developer site for the product documentation.

This is a follow-on from this previous post;

Experimenting with Research Mode and Sensor Streams on HoloLens Redstone 4 Preview

and so please read that post if you want to get the context and, importantly, for the various caveats and links that I had in that post about working with ‘research mode’ in the Redstone 4 Preview on HoloLens.

I updated the code from the previous post to provide what I think is a slightly better experience. I removed the attempt to display multiple streams from the device at the same time and, instead, switched to a model where the app on the device has a notion of the ‘current stream’ that it is sending over the network to the receiving desktop app.

In that desktop app, I can then show the initial stream from the device and allow the user to cycle through the available streams as per the screenshots below. The streams are still not being sent to the desktop at their actual frame rate but, as before, on a timer-based interval which is hard-wired into the HoloLens app for the moment.
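The actual app is C#/UWP, but the timer-based sending model is easy to sketch in a language-agnostic way. Here's a minimal Python illustration (the names here, e.g. `StreamSender`, are mine for illustration, not anything from the repo): the device-side code keeps only the newest frame from the current stream and forwards it on a fixed interval, rather than at the sensor's native frame rate;

```python
import threading
import time


class StreamSender:
    """Forwards the newest frame of the 'current' stream on a fixed
    timer interval, rather than at the sensor's native frame rate."""

    def __init__(self, send, interval_seconds=0.25):
        self.send = send                  # callable taking raw frame bytes
        self.interval = interval_seconds  # hard-wired, as in the HoloLens app
        self.latest_frame = None
        self.lock = threading.Lock()

    def on_frame_arrived(self, frame_bytes):
        # Called at the sensor's real frame rate; only the newest frame is kept.
        with self.lock:
            self.latest_frame = frame_bytes

    def run(self, ticks):
        # Wake on a timer and send whatever frame is newest at that moment.
        sent = 0
        while sent < ticks:
            time.sleep(self.interval)
            with self.lock:
                frame = self.latest_frame
            if frame is not None:
                self.send(frame)
                sent += 1
```

A frame that arrives twice between timer ticks is simply overwritten, which is why the desktop app sees a lower, steady rate rather than the stream's true frame rate.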

Making these changes meant altering the code such that it no longer selects one depth and one infrared stream but, instead, attempts to read from all depth, infrared and colour streams. When the desktop app connects, it is returned the descriptions for these streams and it then has buttons to notify the remote app to switch on to the next/previous stream in its list.
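The next/previous cycling described above amounts to keeping an index into the list of stream descriptions that the device returned at connect time. A minimal Python sketch of that idea (again, `StreamSelector` and these names are mine, not from the repo):

```python
class StreamSelector:
    """Tracks the 'current stream' index over the descriptions the
    device reported at connect time; next/previous wrap around."""

    def __init__(self, descriptions):
        self.descriptions = list(descriptions)
        self.index = 0  # the initial stream shown on connect

    @property
    def current(self):
        return self.descriptions[self.index]

    def next(self):
        # Advance, wrapping back to the first stream after the last one.
        self.index = (self.index + 1) % len(self.descriptions)
        return self.current

    def previous(self):
        # Step back, wrapping to the last stream from the first one.
        self.index = (self.index - 1) % len(self.descriptions)
        return self.current
```

In the real app the desktop side would send a next/previous message over the network and the device would switch which stream it is reading and forwarding; the wrap-around indexing is the essence of it.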

Here’s how that looks across the 8 different streams that I am getting back from the device.

This first one appears to be an environment tracking stream looking more or less ‘straight ahead’, although the image appears to be rotated 90 degrees anti-clockwise;

[Screenshot 1]

This second stream again appears to be environment tracking, taking in a scene to the left of my gaze and again rotated 90 degrees anti-clockwise;

[Screenshot 2]

This next stream is a depth view, looking forward although it can be hard to see much in there without movement to help out.

I’m not sure that I’m building the description of this stream correctly: my code says 15fps, whereas the documentation seems to suggest that depth streams run at either 1fps or 30fps, so perhaps I have a bug here. This depth stream feels like it is at a wider aperture, so perhaps it is the stream which the docs describe as;

“one for high-frequency (30 fps) near-depth sensing, commonly used in hand tracking”

but that’s only a guess based on what I can visually see in this stream;

[Screenshot 3]

and the next stream that I get is an infrared stream at 3 fps with what feels like a narrow aperture;

[Screenshot 4]

with the follow-on stream being depth again at what feels like a narrow aperture;

[Screenshot 5]

and then I have an environment view to the right side of my gaze rotated 90 degrees anti-clockwise;

[Screenshot 6]

and another environment view which feels more or less ‘straight ahead’, rotated 90 degrees anti-clockwise;

[Screenshot 7]

and lastly an infrared view at 3 fps with what feels like a wider aperture;

[Screenshot 8]

This code feels a bit more ‘usable’ than what I had at the end of the previous blog post, and I’ve tried to make it a little more resilient: should one end of the connection drop, the other app should pause and be capable of reconnecting when its peer returns.
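That pause-and-reconnect behaviour boils down to a loop that keeps trying to (re)establish the connection rather than tearing the app down when the peer drops. Here's a hedged Python sketch of the shape of it (the function and parameter names are mine, not from the repo):

```python
import time


def run_with_reconnect(connect, serve, should_continue, retry_delay=1.0):
    """Keep (re)connecting to the peer: if the connection drops, pause
    briefly and try again rather than tearing everything down."""
    while should_continue():
        try:
            connection = connect()   # blocks until the peer is available
            serve(connection)        # returns or raises when the peer drops
        except ConnectionError:
            pass                     # peer went away; fall through and retry
        time.sleep(retry_delay)      # back off briefly before reconnecting
```

The real app is async C#/UWP, so the details differ, but the structure is the same: a dropped connection feeds back into another wait-for-connection rather than an app exit.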

The code for this is committed to master in the same repo as I had in the previous post;

https://github.com/mtaulty/ExperimentalSensorApps

Feel free to take that, experiment with it yourself and so on but keep in mind that it’s a fairly rough experiment rather than some polished sample.

4 thoughts on “Experimenting with Research Mode and Sensor Streams on HoloLens Redstone 4 Preview (Part 2)”

  1. This is excellent. I bet the two black images have data in them; they are just showing colours very close to black, and are thus hard to observe against a black background.

    1. Yes, there is data in the 2 ‘black’ images – you can see it much more easily when moving.

  2. First of all, great job! Thank you for your experimentation.
    I just wonder if you know how to fix a problem with the DesktopApp. When I start it, I get an exception at the line
    await this.messagePipe.WaitForConnectionAsync(
    TimeSpan.FromMilliseconds(-1)); with the error: System.InvalidOperationException: “Sequence contains more than one matching element”.
    In the debugger I can see that all members of this.messagePipe are false or null, except for connectionMask: Task: id=2 and advertisement: MAGIC_HEADER:4277006349, MS_GBLUETOOTH_LE_ID:6.
    I made a pair between the HoloLens and my notebook. I can’t see what could be wrong.
    If you have any idea, please give me a tip.
    Thanks!

    1. I wonder if it’s the code in IPAddressExtensions.GetLocalForInternetProfile() coming back with more than one address when it expected one? I’d suggest stepping into that call to WaitForConnectionAsync and seeing where the exception is actually being thrown, rather than where it ends up uncaught. I hope that helps.

Comments are closed.