Blog Post #2500 and A Change of Scene

the blue badge…

Today’s one of those ‘mixed emotion’ days in that I’m both happy and a little sad to be writing that I have reached my 2500th blog post and it coincides with me moving on from my role at Microsoft after almost 20 years with the company.

I joined Microsoft in late 2000 with 10 years of software development under my belt and I can remember being very unsure whether I’d make it through the challenging first few months of the job. The endless cycle of new technology soon got a grip on me, though, and those months quickly turned into years and then decades.

In 2003, I started to experiment with this blog site and, once again, time has raced on – I find myself posting blog entry number 2500 after sixteen years of writing on a wide variety of technical topics, spanning from the SPV Smartphone to HoloLens 2.

In 2000, I was hired into developer consultancy, working mostly behind closed doors to help UK software vendors build on the Microsoft platform. It was a role that proved to be both deep and broad within a really skilled team and it gave me great opportunities to learn. Over time, the team which was initially known as “Premier Support for Developers” ping-ponged between being part of different Microsoft organisations (i.e. Support vs Services) which was my first taste of how things can change when a re-organisation arrives 🙂

Around 2005, I made a move into developer advocacy and had a lot of fun being involved in publicly communicating Microsoft’s innovations across development languages, frameworks, tools and platforms. Those years coincided with a lot more posts to this site and to others and to doing a lot of in-person work spanning from large first and third party conferences to small user groups and events for specific companies/industries.

Once again, organisations changed around me – I was initially hired into a “DotNet Developer” (or DnD) group but in the couple of months it took to take up the role it had become the “Developer & Platform Evangelism” group (or DPE) and at a later point that morphed into a “Developer eXperience” group (or DX). I managed to mostly survive by focusing on providing a service to the end developer as the organisation around me tried to figure out whether I was “too technical”, “not technical enough” or “just right” which made for some interesting discussions 🙂

In 2018, the DX group came to an end and employees were re-organised into various roles around the company with a lot of the more code-focused people moving to a newly formed group with a mission to directly engage on development projects alongside customers/partners – the Commercial Software Engineering group (or CSE).

By that point, my path through client-side and ‘more personal computing’ technologies had led to me working with Microsoft HoloLens since late 2016. I’d had a lot of fun working in .NET and Unity, showing the technology to the community via in-person events, blogging etc. and also mentoring UK companies as they enrolled in the Mixed Reality Partner Programme, and I’d been lucky enough to work on some HoloLens 2 apps before the device was announced and shown at Mobile World Congress at the start of 2019.

A new role in a new division came with the tantalising promise that I could continue to code for HoloLens 2 and the Azure Mixed Reality Cloud with Microsoft’s partners and customers, and we formed a Mixed Reality team within the bounds of that wider CSE group.

In reality, finding Mixed Reality projects that lined up with the mission of the CSE group proved to be quite a challenge. That led to a period of project-hunting where I felt that my time wasn’t being well used and, ultimately, to my decision to move on from the group and the company.

Microsoft has been a great home to me and, over my time, I’ve been lucky to work with a tremendous set of colleagues and to make a huge set of new friends from among the broad communities that form Microsoft’s customers, partners and the wider ecosystem.

I’d like to say a big ‘thank you’ for reading these posts, coming along to events, watching videos, coding with me on projects and generally being part of my journey and I hope that I’ll catch up with some of you further on up the road as I move into the community of Microsoft alumni.

Now, with all that said, does anyone know how I pay for an Office365 subscription? 😉

“The Future’s So Bright, I Gotta Wear…Glasses?”

With reference to Timbuk3 – I know, it’s a cheap line, but it popped into my head the other day and won’t go away so maybe writing it down will help.

“The Future’s So Bright” by Timbuk3

I’ve been wearing glasses for about 35 years, often swapping them for contact lenses, but it’s only in recent years, watching what’s happening with mobile computing, that I’ve come to the full realisation that we’re on a path where perhaps everyone is likely to join me and we’ll all end up wearing glasses somewhere down the line.

We’re headed to a level playing field of “eyewear-for-everyone” (TM) or maybe it’s eyeware-for-everyone.

As an aside, I thought I’d just made up the term ‘eyeware’ but then Googled it to check and, of course, there’s already a Swiss company with that name doing some type of 3D eye tracking – it’s hard to come up with anything original these days!

Recent (and long term) rumours and actual products suggest that, one way or another, all the big players are working on some form of ‘glasses’;

and Google has had something out there for a long time and, of course, there’s also a tonne of other players (like Magic Leap, North, RealWear, etc.) out there too.

Naturally, you might argue that Microsoft has been blazing a trail here in shipping devices and services in this space for some years and far be it from me to stop you making that argument 😉

I saw this YouTube session go by on my timeline the other day which presents some analysis of the today/tomorrow state of the ‘glasses’ market and talks through some of the motivations of the players involved. It’s not Earth-shattering but I thought it was a good watch;

Naturally, not every rumour will come to fruition and, clearly, not all ‘glasses’ are made equal but, regardless, one version of the future promises to deliver glasses-for-all.

Glasses will become the great leveller as we all have to remember to clean them, not to sit on them and find them when we’ve misplaced them.

The only division is likely to be that some of us will still pay more than others to put prescription lenses into ours or maybe the future will solve that problem too 🙂

More Scene Understanding – “Put This on the Big Table”

NB: The usual blog disclaimer for this site applies to posts around HoloLens. I am not on the HoloLens team. I have no details on HoloLens or Azure Mixed Reality other than what is on the public web and so what I post here is just from my own experience experimenting with pieces that are publicly available and you should always check out the official developer site for the product documentation.

Following up on these previous two posts;

and, purely in the vein of ‘just for fun’, I wanted to try out the idea I’d floated of using a particular plane identified by the ‘Scene Understanding SDK’ to ‘anchor’ an object in space over time.

Or…in human terms I want the device to be able to;

“Put this hologram on the left hand, front edge of the largest table in the room and store its position for me until I move it”

So, looking at the screenshot below, what I’ve done here is to run up my app code and let it place the full-sized model of the office in the middle of the table as below, with its position and orientation coming from the plane that the Scene Understanding SDK has handed to me;

office model positioned in its default, start-up position

and then I’ve moved the model over here to the front, left-hand edge of the desk as below;

office model re-sized and positioned on the corner of the desk

and then, when I run the application again, it remembers this position (approximately – I never said it was as good as spatial anchors!) and puts the model back at the same position, orientation and scale;

office model position stored and restored relative to centre of largest platform in the room

and the app is doing this by simply relying on the Scene Understanding SDK’s ability to find ‘the largest platform in the room’ and to come up with the same position for it each time the app runs, such that all I need to do is store the relative transform of my model across invocations of the app.
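Purely as an illustrative sketch (this isn’t the exact code from my app – the type and member names below come from the publicly documented Microsoft.MixedReality.SceneUnderstanding SDK as I understand it), finding ‘the largest platform in the room’ might look something like;

```csharp
using System.Linq;
using System.Threading.Tasks;
using Microsoft.MixedReality.SceneUnderstanding;

public static class LargestPlatformFinder
{
    // Ask the Scene Understanding runtime for the scene within a radius of
    // the user and return the platform with the largest quad area.
    public static async Task<SceneObject> FindLargestPlatformAsync(float searchRadiusMetres)
    {
        var querySettings = new SceneQuerySettings()
        {
            EnableSceneObjectQuads = true,      // we want the planar quads for platforms
            EnableSceneObjectMeshes = false,
            EnableOnlyObservedSceneObjects = false
        };

        Scene scene = await SceneObserver.ComputeAsync(querySettings, searchRadiusMetres);

        // Order the platforms by the area of their largest quad & take the biggest.
        return scene.SceneObjects
            .Where(so => so.Kind == SceneObjectKind.Platform)
            .OrderByDescending(so => so.Quads.Max(q => q.Extents.X * q.Extents.Y))
            .FirstOrDefault();
    }
}
```

The nice property for this post is that, for a static room, the SDK hands back much the same transform for that platform on each run, which is what makes ‘store the relative transform’ viable at all.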

Was that hard to do? No, not based on what I already had at the end of the last post – I amended the scene that I used in that post in just a few ways.

Firstly, I moved my LargePlatformPositioningBehaviour from my office model to its parent so that this parent becomes the object that the code attempts to place ‘in the centre of the largest platform’ when the application first runs up.

making the parent object the one which moves to the centre of the largest platform on startup

Secondly, I added a new LocalPositionMemoryBehaviour to the office model itself as below;

office model now remembers its position relative to the parent

and then I made sure that I was handling the ‘manipulation ended’ event from the toolkit such that I could intervene and get hold of any modifications that had been made to the local scale, position or rotation of the office model relative to its parent;

adding a handler for the manipulation ended event
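As an aside, that event can also be wired up in code rather than in the Unity editor. A minimal sketch, assuming MRTK v2’s ManipulationHandler component and the LocalPositionMemoryBehaviour from this post both sit on the office model, might be;

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class ManipulationWiringBehaviour : MonoBehaviour
{
    void Start()
    {
        // Both components are assumed to be on this same (office model) object.
        var handler = this.GetComponent<ManipulationHandler>();
        var memory = this.GetComponent<LocalPositionMemoryBehaviour>();

        // When the user stops manipulating the model, store its new local
        // scale, rotation and position away.
        handler.OnManipulationEnded.AddListener(
            eventData => memory.OnManipulationEnded());
    }
}
```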

and I wired this through to a method on my new LocalPositionMemoryBehaviour, implemented below, which stores the local position, rotation and scale values into a simple Player Preferences dictionary whenever they change and attempts to restore those values when the application starts;

using System.Linq;
using UnityEngine;

public class LocalPositionMemoryBehaviour : MonoBehaviour
{
    // Key under which the serialised scale, rotation, translation is stored.
    const string storageKeyName = "localSRT";

    void Start()
    {
        if (PlayerPrefs.HasKey(storageKeyName))
        {
            var value = PlayerPrefs.GetString(storageKeyName);

            Debug.Log($"MT: Read SRT to string of {value}");

            this.StringToLocalSRT(value);
        }
    }
    public void OnManipulationEnded()
    {
        // Store away our local position, rotation and scale in settings type storage.
        var srtToString = this.LocalSRTToString();

        Debug.Log($"MT: Written out SRT to string of {srtToString}");

        PlayerPrefs.SetString(storageKeyName, srtToString);
    }
    string LocalSRTToString()
    {
        var t = this.gameObject.transform.localPosition;
        var s = this.gameObject.transform.localScale;
        var r = this.gameObject.transform.localRotation;

        return ($"{Vector3ToString(s)} {QuaternionToString(r)} {Vector3ToString(t)}");
    }
    void StringToLocalSRT(string value)
    {
        var pieces = value.Split(' ').Select(s => float.Parse(s)).ToArray();

        // Values were written in scale (3), rotation (4), translation (3) order.
        this.gameObject.transform.localScale = Vector3FromStrings(pieces, 0);
        this.gameObject.transform.localRotation = QuaternionFromStrings(pieces, 3);
        this.gameObject.transform.localPosition = Vector3FromStrings(pieces, 7);
    }
    static Quaternion QuaternionFromStrings(float[] pieces, int v) =>
        new Quaternion(pieces[v], pieces[v + 1], pieces[v + 2], pieces[v + 3]);

    static Vector3 Vector3FromStrings(float[] pieces, int v) =>
        new Vector3(pieces[v], pieces[v + 1], pieces[v + 2]);

    static string Vector3ToString(Vector3 v) => $"{v.x} {v.y} {v.z}";

    static string QuaternionToString(Quaternion q) => $"{q.x} {q.y} {q.z} {q.w}";
}

and so we have the office model remembering its position relative to its parent, and the Scene Understanding SDK helping us to put that parent back on the ‘largest platform in the room’, which means the office model remembers its position relative to that platform and stays on the edge of the table.

As I said in the previous post, it would be interesting to now pick up the table and move it within the room or even to another room (so long as it remained the largest platform in that room) and to see if the behaviour worked and the office model stayed on the table. I suspect it would but there’s no way I’m moving that table 😉

The code is where it was previously. Naturally, I’m not suggesting that this is for anything other than ‘fun’ but I was quite impressed with the stability that I got in my one simple test and, of course, you have multiple flavours of spatial anchor that can help you with these scenarios too.