NB: The usual blog disclaimer for this site applies to posts around HoloLens. I am not on the HoloLens team. I have no details on HoloLens other than what is on the public web and so what I post here is just from my own experience experimenting with pieces that are publicly available and you should always check out the official developer site for the product documentation.
A very short post which is really just a reminder to myself.
Quite a while ago, I wrote this post;
where I’d used the Perception classes (now obsolete, I think) PerceptionColorFrameSource, PerceptionDepthFrameSource and PerceptionInfraredFrameSource to try to grab frames of data from the cameras attached to a system across the Color/Depth/IR streams. At the time, I was running that code on a system which had a Kinect V2 camera attached.
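For reference, enumerating those sources looked roughly like the sketch below. This isn’t the original post’s code, just an illustration assuming a UWP app with access to the (deprecated) Windows.Devices.Perception namespace;

```csharp
// Sketch only - the Windows.Devices.Perception classes are deprecated
// and this is not the code from the original post.
using System.Diagnostics;
using System.Threading.Tasks;
using Windows.Devices.Perception;

static async Task DumpPerceptionSourcesAsync()
{
    // Each FindAllAsync call enumerates the available sources of that kind.
    var colorSources = await PerceptionColorFrameSource.FindAllAsync();
    var depthSources = await PerceptionDepthFrameSource.FindAllAsync();
    var infraredSources = await PerceptionInfraredFrameSource.FindAllAsync();

    Debug.WriteLine(
        $"Color: {colorSources.Count}, " +
        $"Depth: {depthSources.Count}, " +
        $"IR: {infraredSources.Count}");
}
```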
As I say, I think these APIs are now marked obsolete and you’re meant to use the MediaFrameSource classes instead, but I think the old ones still ‘work’ at the time of writing.
I remembered this code in the light of this forum thread around access to these types of streams on a HoloLens device;
Not that I agree with the sentiment of the thread, but I did remember running this ‘what type of camera streams have I got?’ code on a HoloLens in the early days and yet I didn’t seem to have written down the results anywhere. The result was that the code essentially didn’t gain access to any of the streams.
I tried to “modernise” that code to use MediaFrameSourceGroup to discover the frame source groups that I could access. Using that API instead, I find the same result in that if I query for the groups available on a HoloLens then I see;
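The “modernised” query is roughly the sketch below. Again, this is an illustration rather than my exact code, assuming a UWP app (with the webcam capability declared) using the Windows.Media.Capture.Frames APIs;

```csharp
// Sketch of enumerating frame source groups and their sources
// via the MediaFrameSourceGroup API. Assumes a UWP app with the
// webcam capability declared in its manifest.
using System.Diagnostics;
using System.Threading.Tasks;
using Windows.Media.Capture.Frames;

static async Task DumpFrameSourceGroupsAsync()
{
    // Enumerate every frame source group the system exposes.
    var groups = await MediaFrameSourceGroup.FindAllAsync();

    foreach (var group in groups)
    {
        Debug.WriteLine($"Group: {group.DisplayName}");

        // Each group contains one or more sources, each with a
        // stream type (e.g. VideoPreview, VideoRecord) and a
        // source kind (Color, Depth, Infrared, Custom).
        foreach (var info in group.SourceInfos)
        {
            Debug.WriteLine($"  {info.MediaStreamType} / {info.SourceKind}");
        }
    }
}
```

On my HoloLens, that enumeration produced only the two Color entries mentioned below.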
Suggesting that the only sources that are available here are the ones for VideoPreview (Color) and VideoRecord (Color).
Naturally, the spatial mesh is made available via other APIs but that’s another story.