Experiments with Shared Holograms and Azure Blob Storage/UDP Multicasting (Part 2)

NB: The usual blog disclaimer for this site applies to posts around HoloLens. I am not on the HoloLens team. I have no details on HoloLens other than what is on the public web and so what I post here is just from my own experience experimenting with pieces that are publicly available and you should always check out the official developer site for the product documentation.

Following up on my previous post, I wanted to move into Unity and create the bare bones of a shared hologram test scene using the messaging library from that post.

My aim for that scene is to build out a re-usable API and ‘infrastructure’ that allows for creating/deleting a hologram such that it will be shared with other devices over the network. I can then package those pieces (with the messaging library) and maybe use some/all of them again in further projects.

The Sketch

That sounds relatively unambitious but I think the steps involved are something along the lines of…

  • Offer some “CreateHologram” method which can;
    • Capture the ‘type’ and transform of the hologram being created – e.g. Cube/Sphere/Dragon – and create it.
      • The object should have a stable ID that can represent it on all the devices that come to know about it.
    • Determine whether there is already a suitable object in the scene which can provide a WorldAnchor to parent this new GameObject and, if suitable, re-use it to try to avoid creating ‘too many’ world anchors.
      • For these posts, I will do this by trying to find an existing, world-anchored GameObject within 3m of the new GameObject.
      • If a WorldAnchor does not exist, create one at the location of the GameObject and give it a stable ID that can represent it on all devices that come to know about it.
    • Parent the hologram inside of the selected WorldAnchor, offsetting it as necessary.
    • Wait for that WorldAnchor component to signify (via its isLocated flag) that it is located in space.
    • Export that WorldAnchor to get the blob (byte[]) that represents it.
    • Send that blob to some store that can persist it for access by other devices
      • For these posts, I’m going to send these world anchors to Azure Blob Storage but it would not be difficult to abstract this and plug in other mechanisms
    • Send a network message to other devices in the scene to inform them of;
      • The new hologram ID
      • The new hologram type
      • The parent anchor ID
      • The transform relative to the parent anchor object
  • Offer some “DeleteHologram” method which can;
    • Locate a hologram in the scene by its ID
    • Remove it
    • Send a network message to other devices in the scene to inform them of;
      • The deleted hologram ID
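The “find a nearby world anchor or create one” step in the sketch above can be written down fairly directly. The class and method names below are my own invention for illustration (as is the idea of keeping the anchored objects in a list) but WorldAnchor is the real Unity component involved, and 3.0f matches the 3m heuristic;

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.WSA;

static class AnchorLocator
{
    // Re-use an existing anchored GameObject within 'radius' metres of
    // 'position' or create (and record) a new, world-anchored GameObject there.
    public static GameObject FindOrCreateAnchor(
        List<GameObject> anchoredObjects, Vector3 position, float radius)
    {
        foreach (var candidate in anchoredObjects)
        {
            if (Vector3.Distance(candidate.transform.position, position) <= radius)
            {
                return (candidate);
            }
        }
        // The GameObject's name doubles as the stable ID that travels to
        // other devices.
        var anchorObject = new GameObject(System.Guid.NewGuid().ToString());
        anchorObject.transform.position = position;
        anchorObject.AddComponent<WorldAnchor>();
        anchoredObjects.Add(anchorObject);

        return (anchorObject);
    }
}
```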

That feels like more than enough to be getting on with for one blog post 🙂

Additionally, there’s a need to have some code which responds to these messages such that;

  • When a “CreateHologram” message arrives;
    • The hologram of the right type is created (e.g. Cube/Dragon/etc) with the same ID as the originating hologram.
    • The world-anchored parent for the hologram is determined by the ‘parent anchor ID’
      • If this parent object is not already in the scene then there’s a need to
        • Download the blob representing the anchor from Azure blob storage
        • Create a new blank GameObject to represent the world anchor
        • Import the anchor to that GameObject
    • The transform of the new hologram is set correctly relative to the world-anchored GameObject which parents it.
  • When a “DeleteHologram” message arrives;
    • Remove the hologram with the corresponding ID from the scene.
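On the receiving side, the “create a blank GameObject and import the anchor” steps lean on Unity’s WorldAnchorTransferBatch. The method shape below is my own sketch but ImportAsync and LockObject are the real Unity APIs;

```csharp
using System;
using UnityEngine;
using UnityEngine.XR.WSA.Sharing;

static class AnchorImporter
{
    // Import a downloaded anchor blob and lock a new, blank GameObject to it,
    // handing back null if the import fails.
    public static void ImportAnchor(string anchorId, byte[] blobBits,
        Action<GameObject> completed)
    {
        WorldAnchorTransferBatch.ImportAsync(
            blobBits,
            (reason, batch) =>
            {
                if (reason != SerializationCompletionReason.Succeeded)
                {
                    completed(null);
                    return;
                }
                // Create the blank GameObject and lock it to the anchor that
                // was exported under this ID on the originating device.
                var anchorParent = new GameObject(anchorId);
                batch.LockObject(anchorId, anchorParent);
                batch.Dispose();

                completed(anchorParent);
            });
    }
}
```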

Ok, that’s definitely enough for one post 🙂

The Implementation

Ok, so how does this look in reality?

Firstly, I should say that for this experiment I am building with Unity 2017.3. I’m not at all sure that this is (yet) the recommended version of Unity for Mixed Reality development, but I had a specific problem with the Unity 2017.2* versions in that I could not debug any code, so I moved forward to 2017.3 for this set of blog posts and everything I needed has worked to date. You may get different results.

I set up a blank Unity project, configured it to build for Windows UWP and Mixed Reality (without adding the Mixed Reality Toolkit) and added in my messaging library using the “placeholder” approach that I talked about here; you can see the two libraries in the Plugins folder of my solution below;


I then built out 3 sets of ideally re-usable scripts that you can see in this screenshot below;


The Messages Script Folder

The Messages folder contains four classes which, ultimately, exist to present two message-derived classes for use with the messaging library from my previous post.


and so there are two base classes in here, with the two concrete classes being CreatedObjectMessage and DeletedObjectMessage. These line up with what I sketched out in that they carry the right pieces of data for those create/delete pieces of functionality.
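As a sketch of the shape of those two concrete classes (the exact field names and base class in the repo may well differ; this is just what the data sketched earlier implies);

```csharp
using UnityEngine;

// Announces a newly created shared hologram to other devices.
public class CreatedObjectMessage // : derives from the library's message base class
{
    public string ObjectId;           // stable ID for the hologram on all devices
    public string ObjectType;         // e.g. "Cube", "Sphere"
    public string ParentAnchorId;     // identifies the world anchor blob to download
    public Vector3 LocalPosition;     // transform relative to the parent anchor
    public Quaternion LocalRotation;
    public Vector3 LocalScale;
}

// Announces the deletion of a shared hologram.
public class DeletedObjectMessage
{
    public string ObjectId;
}
```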

The Azure Blobs Script Folder

The Azure Blobs folder contains a number of scripts intended to make the uploading/downloading of blobs to/from Azure storage relatively simple from Unity.


These scripts really surface one API, exposed by the class AzureBlobStorageHelper, which has public methods to upload/download blobs from Azure.

The rest of the code is just “infrastructure” and it leans very heavily on code that I ‘borrowed’ from my colleague Dave, who has a repo of this type of code over on GitHub. I hope that he doesn’t mind 🙂 and I hope that I commented the code appropriately to say where it (mostly) comes from.

In order to make use of Azure blob storage, there’s a need for some endpoint/connection details, so there’s a type in this folder named AzureStorageDetails which stores them; I’ll come back to its intended use.
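As a sketch of that type (the member names here are my guess at what such a type needs rather than the exact shape of the code in the repo);

```csharp
using System;

[Serializable] // so the details can be filled in via the Unity editor
public class AzureStorageDetails
{
    public string AccountName;   // e.g. "mystorageaccount"
    public string AccountKey;    // the storage account access key
    public string ContainerName; // the blob container to put anchors into

    // e.g. https://{account}.blob.core.windows.net/{container}/{blob}
    public string UriForBlob(string blobName)
    {
        return (string.Format(
            "https://{0}.blob.core.windows.net/{1}/{2}",
            this.AccountName, this.ContainerName, blobName));
    }
}
```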

The General Script Folder

The General scripts folder contains, again, mostly infrastructure with only perhaps 2 classes in here intended for actual use – namely, the SharedHologramsController class and the SharedCreator class.


The idea is that the SharedHologramsController is a MonoBehaviour intended to be dropped once into a project and it provides access to two key properties as seen below;


There’s the AzureStorageDetails, which is intended to be configured in the editor with the Azure storage account name, key and container name, as in this screenshot;


and I can easily copy those details from the Azure Storage Explorer or from the portal etc.


That SharedHologramsController instance also provides access to an instance of the other significant type here which is called SharedCreator and it is this type which has methods to Create/Delete shared holograms and perform the logic that was sketched out earlier in the post.
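One of the chunkier steps that SharedCreator has to perform is exporting a located WorldAnchor into a byte[] before it can be uploaded. In isolation, that step looks something like the sketch below; the method shape is my own but WorldAnchorTransferBatch and ExportAsync are the real Unity APIs;

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.WSA;
using UnityEngine.XR.WSA.Sharing;

static class AnchorExporter
{
    // Export the (located) WorldAnchor on 'anchoredObject' to a byte[] blob,
    // handing back null if the export fails.
    public static void ExportAnchor(GameObject anchoredObject, string anchorId,
        Action<byte[]> completed)
    {
        var anchor = anchoredObject.GetComponent<WorldAnchor>();
        var bits = new List<byte>();
        var batch = new WorldAnchorTransferBatch();
        batch.AddWorldAnchor(anchorId, anchor);

        WorldAnchorTransferBatch.ExportAsync(
            batch,
            data => bits.AddRange(data),
            reason =>
            {
                batch.Dispose();
                completed(reason == SerializationCompletionReason.Succeeded ?
                    bits.ToArray() : null);
            });
    }
}
```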

At the time of writing, the SharedCreator takes a string to identify the type of hologram that you want to create (Cube/Dragon/etc) and it only knows how to create primitives right now (Cube, Sphere, etc). It would be far-from-rocket-science to adapt it to interpret that string in other ways, e.g. by loading resources or asset bundles or similar in Unity. It’s just not something that I’ve added yet and I daresay some “IResolveHolograms” interface could easily be cooked up to do such a thing.
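That “IResolveHolograms” idea is purely hypothetical at this point, but a sketch of what it could look like might be;

```csharp
using System;
using UnityEngine;

// Hypothetical extension point - map a type string to a new GameObject.
public interface IResolveHolograms
{
    GameObject CreateGameObject(string objectType);
}

// The behaviour that exists today: interpret the string as a Unity primitive.
public class PrimitiveResolver : IResolveHolograms
{
    public GameObject CreateGameObject(string objectType)
    {
        var primitiveType =
            (PrimitiveType)Enum.Parse(typeof(PrimitiveType), objectType);

        return (GameObject.CreatePrimitive(primitiveType));
    }
}
```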

The Unity Package

I made a Unity Package out of the scripts and added it to the repo – it’s just an export of the scripts including the plugins.

The Package Downgrade Issue

This might be one big red herring, but I noticed that when I build my solution from Unity 2017.3 the generated projects look to be referencing V5.0.0 of the Microsoft.NETCore.UniversalWindowsPlatform package as shown below;


and I noticed that my messaging library project seems to be referencing V6.0.1 as below;


Confused? Yes, I am 🙂

This seems to manifest itself as a build warning NU1605 when I come to build the Unity solution inside of Visual Studio;


which I read as something like;

You have a project using version X of a NuGet package, and that project makes use of a library which has been built against a later version (>X) of the same package.

Now, of course, I tried to get around this by simply ignoring it but I then got bitten by a runtime error;


and I essentially pinned this down to the fact that my messaging library, built against UWP package 6.0.1, was expecting to load System.Net.Sockets.dll V4.1.0.0 whereas the build process had emitted System.Net.Sockets.dll V4.0.6.0, and that didn’t match.

So, it wasn’t so easy to ignore.

I don’t know whether this was caused by some mistake I made inside of my Unity project setup or whether it would be reproducible if I were to make another Unity project.

For the moment, I have worked around this by manually changing the Nuget package of the Unity projects to be 6.0.1 as shown below;
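For reference, one way of making that change by hand is to edit the project.json in each generated UWP project to reference the later package; something along these lines (trimmed right down, and your generated file will have more in it than this);

```json
{
  "dependencies": {
    "Microsoft.NETCore.UniversalWindowsPlatform": "6.0.1"
  },
  "frameworks": {
    "uap10.0": {}
  }
}
```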


Whether this is the ‘right’ thing to do, I’m unsure, but it gets me around the build-time warning and the runtime error for now. I’m grateful to whoever added that NuGet package warning because I spent some time trying to figure out what was going on here and it would have taken a lot longer without it 🙂

An extra note here – I found that if by chance I had deployed the application containing this mismatch of UWP packages to a device then I had to make sure that I uninstalled that application before attempting to fix things. That is, just switching the version numbers of the UWP packages in Visual Studio and asking to build/deploy didn’t seem to be enough; I had to make sure the application was wiped from the device.

The Usage

In terms of usage, I created a blank test project in Unity and set it up for the basics of UWP/HoloLens development, specifically;

  1. Moving the camera to the origin.
  2. Changing the camera’s clear flags to a solid black colour and its near clipping plane to 0.8.
  3. Changing the build platform to UWP, the device to HoloLens, the SDK version to 14393 and selecting the “C# projects” option.
  4. Changing the backend scripting engine to .NET.
  5. Making sure that Windows Mixed Reality was set up within the XR settings.
  6. Making sure that my UWP capabilities included Internet Client/Server and Private Networks (although I’m not 100% sure yet that I need both of those, so this is possibly overkill) and also Spatial Perception.

I didn’t go to town on this – I just went with what I thought was the minimum. I then imported my Unity package that I made earlier in the blog post and which is also in the repo’s top level folder.

With that all imported, I added an empty GameObject to my scene and added the Shared Holograms Controller script to that GameObject as below;


and I filled in the details of my Azure storage account.

I then added a script named TestScript to my empty GameObject to see if I could write the following logic;

  • A tap on an empty space will create a green cube as a shared hologram.
  • Looking at a cube will turn it red, looking away will revert to green (locally, these colour changes are not intended to synchronise across devices).
  • A tap on a focused cube will delete the shared hologram.

There’s no UX around the various delays involved in creating the shared holograms, which a real-world app would definitely need, but this is just for testing.

The TestScript ended up looking as below;

using System;
using SharedHolograms;
using UnityEngine;
using UnityEngine.XR.WSA.Input;

public class TestScript : MonoBehaviour
{
    void Start()
    {
        this.recognizer = new GestureRecognizer();
        this.recognizer.Tapped += OnTapped;
        this.recognizer.StartCapturingGestures();
    }
    void OnTapped(TappedEventArgs obj)
    {
        // If we are staring at a cube, delete it. Otherwise, make a new one.
        if (this.lastHitCube == null)
        {
            this.CreateSharedCube();
        }
        else
        {
            this.DeleteSharedCube();
        }
    }
    void DeleteSharedCube()
    {
        SharedHologramsController.Instance.Creator.Delete(this.lastHitCube);
        this.lastHitCube = null;
    }
    void CreateSharedCube()
    {
        var forward = Camera.main.transform.forward;

        var position = Camera.main.transform.position + forward * 2.0f;

        // Note - there's potentially quite a long time here when the object has
        // been created but we're still doing network stuff so we'd need to really
        // make a UX that dealt with that which I haven't done here.
        SharedHologramsController.Instance.Creator.Create(
            "Cube",
            position,
            new Vector3(0.1f, 0.1f, 0.1f),
            cube =>
            {
                ChangeMaterial(cube, this.GreenMaterial);
            });
    }
    void Update()
    {
        RaycastHit rayHitInfo;

        // Are we looking at a cube?
        if (Physics.Raycast(
            Camera.main.transform.position,
            Camera.main.transform.forward,
            out rayHitInfo))
        {
            this.lastHitCube = rayHitInfo.collider.gameObject;
            ChangeMaterial(this.lastHitCube, this.RedMaterial);
        }
        else if (this.lastHitCube != null)
        {
            ChangeMaterial(this.lastHitCube, this.GreenMaterial);
            this.lastHitCube = null;
        }
    }
    static void ChangeMaterial(GameObject gameObject, Material material)
    {
        gameObject.GetComponent<Renderer>().material = material;
    }
    Material GreenMaterial
    {
        get
        {
            if (this.greenMaterial == null)
            {
                this.greenMaterial = new Material(Shader.Find("Legacy Shaders/Diffuse"));
                this.greenMaterial.color = Color.green;
            }
            return (this.greenMaterial);
        }
    }
    Material RedMaterial
    {
        get
        {
            if (this.redMaterial == null)
            {
                this.redMaterial = new Material(Shader.Find("Legacy Shaders/Diffuse"));
                this.redMaterial.color = Color.red;
            }
            return (this.redMaterial);
        }
    }
    Material greenMaterial;
    Material redMaterial;
    GestureRecognizer recognizer;
    GameObject lastHitCube;
}

and so there’s not much code and most of it has nothing to do with shared holograms – there’s just two calls in there to SharedHologramsInstance.Create and Delete and that’s pretty much it. The rest is just Unity work to change colours and so on.

Testing – A Challenge with One Device 🙂

At the time of writing, this is an experiment mostly done ‘for fun’ in the down time between Xmas and New Year and I have one HoloLens device which I can use to try things out.

Because of that, I had to write some extra code in order to use the one HoloLens as both sender and receiver for these messages, so I added another project to the test apps folder of the messaging library solution that I described in the previous blog post;


and this acts as a ‘recorder’ for the CreateObject/DeleteObject messages with a limited ability to play those messages back over the network.

This means that I can use my one HoloLens to position a number of cubes around a space (and delete some of them as well), use this console app to record that flow of messages, then restart the app on the HoloLens and play back those messages to check whether the holograms get re-created in the right places and deleted at the right time.
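The real test app works in terms of the messaging library but, as a sketch of the record/playback idea, a console app doing this directly against a raw UDP multicast socket might look something like the code below; the multicast group and port are just example values and not the ones the library actually uses;

```csharp
using System;
using System.Collections.Generic;
using System.Net;
using System.Net.Sockets;

class RecorderSketch
{
    static void Main()
    {
        var group = IPAddress.Parse("239.0.0.1"); // example multicast group
        var port = 49152;                         // example port
        var recorded = new List<byte[]>();

        using (var client = new UdpClient(port))
        {
            client.JoinMulticastGroup(group);
            Console.WriteLine("Recording - press a key to stop and replay");

            // Naive capture loop - a real app would receive asynchronously.
            while (!Console.KeyAvailable)
            {
                if (client.Available > 0)
                {
                    var endpoint = new IPEndPoint(IPAddress.Any, 0);
                    recorded.Add(client.Receive(ref endpoint));
                }
            }
            // Play the captured messages back onto the same group.
            foreach (var message in recorded)
            {
                client.Send(message, message.Length, new IPEndPoint(group, port));
            }
        }
    }
}
```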

That seems to work reasonably well but, naturally, it’d be nice to also try this out on multiple devices.

Testing – The Editor

While I did try to build the messaging library and the other pieces so that as much as possible might run in the editor, I haven’t paid much attention to this yet, as there’s a limited amount that I think you can do with spatial anchors there. The essence is present but largely untested so far.

Wrapping Up

I (hopefully) removed my Azure storage connection details from the Unity project and I checked it, the Unity package and the underlying messaging library into GitHub.

Source on Github

Feel very free to take it, play around with it, etc. Once again, this is mainly written ‘for fun’ and for me to perhaps get some re-use out of in the future, so don’t expect super high quality from it; apply a pinch of salt to what you see.

What’s Next?

At the end of this post, I think I’ve got the basics to create/delete holograms and have them show up in a ‘shared manner’ across multiple devices albeit with a very limited user experience and the trade-offs that come with using the UDP multicast mechanism and Azure blob storage.

The mechanism is meant to support automatically creating world anchors as they are needed and the API is reduced down to a couple of calls to Create/Delete.

There’s one (small) Unity package to import into a solution and just one object to drop into the Unity scene.

So, there’s some basic pieces there but it would be nice to;

  • Create objects other than primitives 🙂
  • Transform objects after they are created and have those transformations mirrored to other devices.
  • Have some ‘memory’ of messages that a client has missed such that not all clients have to join a scene at the same time in order to view the shared content.

I’m not sure whether I’ll have time to get through all of that but if I do then you’ll see some more posts in this series looking at some of those areas.