Windows 10 and the UWP via Objective-C: Rough Notes on the (preview) Windows Bridge for iOS

I need to start out by saying that I’m not an iOS developer. I don’t think I’ve even written ‘Hello World’.

I also should say that I’m definitely not an Objective-C developer. I’ve never been there although I have spent a lot of years with C-like languages from C, through C++ and on into a bit of Java and then C# but, to date, I’ve not really had the pleasure of writing anything in Objective-C.

So, I’m not at all qualified to write anything much about developing for iOS but, regardless, I was still interested to take a look at the preview of the “Windows Bridge for iOS” project that’s set up to take Objective-C code (and other assets) and build a Universal Windows Platform app from it.

By way of background, there are details around the Windows UWP bridges on this site;

Bridges for Windows 10

and there’s a lot more detail about the “Windows Bridge for iOS” in this video from //Build 2015 which I watched around the time of the live event;

Since then, more information appeared on the web;

Windows Bridge for iOS

and the project has been open sourced and is now hosted on GitHub;

and so I thought I’d give it a whirl and see if I could make any sense of it despite my lack of iOS background. As a precursor to that, I had a good read of this post;

Windows Bridge for iOS- Let’s open this up

because it talks a little around projections of WinRT APIs into Objective-C and it also talks around the use of a XAML compositor which ties ‘CALayers’ into XAML elements. I had to go read what a CALayer was;

CALayer class reference

and that led me off to this discussion around Core Animation;

Core Animation Basics

and it would seem that in iOS all views are backed by layers and views are ‘thin wrappers around layer objects’, whereas it seems that’s not the case in OS X. It also seems that a layer’s contents can be provided by giving it an object, providing it with a callback, or subclassing it and taking an ‘owner draw’ style approach.
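Just to get that model straight in my own head, here’s a little C++ sketch of those three content-provision routes (all the names here are mine, invented for illustration, and not the real CALayer API);

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <utility>

// A toy 'Layer' illustrating the three ways iOS appears to let a CALayer
// get its contents (invented names, not the real CALayer API).
class Layer {
public:
    // 1. Hand the layer a ready-made contents object
    //    (like assigning an image to layer.contents).
    void setContents(const std::string& bitmap) { contents_ = bitmap; }

    // 2. Provide a delegate-style callback that draws on demand.
    void setDrawCallback(std::function<std::string()> callback) {
        drawCallback_ = std::move(callback);
    }

    // 3. Subclass and override draw() in an 'owner draw' style.
    virtual std::string draw() { return {}; }
    virtual ~Layer() = default;

    // Resolve the contents via whichever mechanism was supplied.
    std::string display() {
        if (!contents_.empty()) return contents_;
        if (drawCallback_) return drawCallback_();
        return draw();
    }

private:
    std::string contents_;
    std::function<std::string()> drawCallback_;
};

// The subclassing route.
class OwnerDrawLayer : public Layer {
public:
    std::string draw() override { return "owner-drawn"; }
};
```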

With those bits of reading skimmed through, I downloaded the SDK bits (rather than cloning the source code repo) from the site, unzipped them onto my desktop and I flicked through the requirements for use which are (copied from the github page);

  • Windows 10
  • Visual Studio 2015 with Windows developer tools. Visual Studio 2015 Community is available for free here. Select (at least) the following components during installation:

    1. Programming Languages -> Visual C++

    2. Universal Windows App Development Tools (all)

    3. Windows 8.1 and Windows Phone 8.0/8.1 Tools (all)

and I thought that I had all of those so I tried to press ahead and opened up the project named WOCCatalog for Windows 10 that lives in the samples folder (this is all as directed on the website).

I had a bit of a poke around the source which (in the first instance) seemed a bit unwelcoming and so I figured I’d be better off running the sample which I did on my desktop;


and then I tried to run it on a Windows 10 Mobile device where I got blocked around errors relating to something called “XamlCompositorCS” which seemed to be a missing reference.

I should have really read the instructions up front because it only took a couple of minutes back on the website to find out that this is a known limitation clearly stated as;

“x86 only today ARM support coming soon”

and so I figured “ok, then I can run on the phone emulator?” because the phone emulator is x86. I did give that a whirl but haven’t (to date) had much success with it. It looks very promising initially and the app seems to build, deploy and run but then it seems to immediately exit and I get an error;


and if I try and launch it from the start screen on the Phone then it seems to just exit.

However, this might be a temporary gremlin or it might be that my emulators are now a little out of date as I haven’t updated them for a while.

Rather than get bogged down with that issue, I flicked through the various tabs of the app running on the desktop;


and by this point I was really starting to scratch my head and wonder what exactly it was I was looking at on the screen.

In this sample there’s a bunch of references to UWP contracts;


and then there’s quite a few Objective-C source files in the project;


and a single C++ source file as far as I could initially see;


which seems to be bootstrapping the whole process.

I figured that I might learn more by trying to debug the code rather than trying to read through it and so I launched the debugger.

Experiment 1 – Debugging without the Framework Source

I opened up that ConsumeRuntimeComponent.cpp file and set a breakpoint;


and then I stepped into Main which turned out to be Objective-C, which was a little like falling through the looking glass;


and I felt a little lost already but I figured this was saying “run the application with the callbacks being in the AppDelegate class” and so I dug that class out and set a breakpoint;


and that seemed to be saying “Let’s use a MenuTableViewController” as the ‘root controller’ and so I found that class and had a look at it and I think even I could understand this bit (maybe!);


and the viewDidLoad ‘handler’ seems to populate an array of menu items with other view controllers like this one;


and so I dived into that SBButtonsViewController and tried to see what it was doing – it seems to be a UITableViewController;


and then there’s some code in the implementation files which didn’t seem to quite line up with the header file but my main thought was more along the lines of;

“ok, where is UIButton coming from? Presumably, that’s UIKit but where’s the implementation here?”

and so I explored a little more and in the lib folder from the download, I can see;


and using dumpbin /exports on that library didn’t find me a UIButton export as such but it did show me a _OBJC_CLASS_UIButton and so I guess I’m prepared to believe that the implementation of UIButton and other UI* APIs are coming from that DLL.

But, how does that UIButton work and what’s doing the drawing?

That moved me on towards…

Experiment 2 – Debugging with Visual Studio’s Live Visual Tree

My next thought was to point the Live Visual Tree Explorer at it from Visual Studio and see how much of it was/wasn’t XAML and whether that helped me figure things out a little more.

At the top level, the content looks like;


and then if I zoom into something like a label on a button I see;


and so it feels like this CALayerXaml might be being used to display the ‘primitives’ of drawing here and perhaps of event handling too – if I go and look into UISwitch as an example then I can see that a UISwitch translates into;


and so it’s a nested tree of CALayerXaml panels containing primitive Rectangle, TextBlock, Image controls rather than (e.g.) some re-templated version of a XAML ToggleButton which might have been my first thought as to how this might have been implemented.

That led me on to…

Experiment 3 – UIKit Views and XAML Controls

I got quite interested in these 2 different views in the UI here. This one displays some UIKit controls like UISlider;


and if I look at that carefully with the Live Visual Tree explorer then I see that it is a hierarchy of 18 elements;


and so it’s rooted by a CALayerXaml and then broken down into rectangles, textblocks and images and each piece is wrapped into a CALayerXaml. The code to produce this view is dealing in terms of UIKit elements, i.e.;

    if (indexPath.row == 0) {
        // switch
        CGRect frame = CGRectMake(5.0, 12.0, 94.0, 27.0);
        UISwitch *switchCtrl = [[UISwitch alloc] initWithFrame:frame];

        // in case the parent view draws with a custom color or gradient, use a transparent color
        switchCtrl.backgroundColor = [UIColor clearColor];

        cell.accessoryView = switchCtrl;
        cell.textLabel.text = @"UISwitch";

If I take a look at this other view that displays XAML controls;


then I can see that the Slider there is just a real XAML slider wrapped into a CALayerXaml element;


and the code here is introducing ‘raw’ XAML elements and hosting them in a UIView;

    else if (indexPath.row == 4) {
        WXCSlider *slider = [WXCSlider create];
        slider.requestedTheme = WXApplicationThemeDark;
        slider.minimum = 0.0;
        slider.maximum = 100.0;
        slider.value = 25.0;
        slider.smallChange = 5.0;
        slider.largeChange = 20.0;
        UIView *sliderView = [[UIView alloc] initWithFrame: CGRectMake(0.0f, 0.0f, 300.0f, cell.frame.size.height)];
        [sliderView setNativeElement:slider];
        cell.textLabel.text = @"Slider";
        cell.accessoryView = sliderView;

and so it feels like this maps onto what was written in that blog post by Salmaan Ahmed that I referred to earlier and also the iOS documentation around UIView/CALayer in the sense that it gives the impression of;

  1. There’s an implementation of CALayer (CALayerXaml) which is wired in to display any XAML element.
  2. UIView sits on top of CALayer.
  3. UIImageView sits on top of UIView by using a XAML Image element to display the image.
  4. This allows for a control like a UISlider to implement itself in terms of 3 or 4 UIImageView sub controls.

which all seems to fit together but it still left me wondering how all this got bootstrapped and plugged together.
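If I try to capture those two hosting styles in a quick sketch (again in C++ with invented names, rather than anything from the bridge itself), the idea seems to be that a layer either composes primitives or wraps one native XAML element;

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical sketch of the two hosting styles observed in the visual tree:
// a CALayerXaml-like panel either composes primitive children (the UISwitch
// case) or wraps one real XAML element (the Slider case). Invented names.
struct XamlLayer {
    std::vector<std::string> primitives;  // Rectangle / TextBlock / Image ...
    std::string nativeElement;            // e.g. a wrapped XAML Slider

    void addPrimitive(const std::string& primitive) {
        primitives.push_back(primitive);
    }
    void setNativeElement(const std::string& element) {
        nativeElement = element;
    }

    // Describe what this layer would render.
    std::string describe() const {
        if (!nativeElement.empty()) return "native:" + nativeElement;
        std::string out = "primitives:";
        for (const auto& primitive : primitives) out += primitive + ";";
        return out;
    }
};
```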

That led me on to…

Experiment 4 – Debugging with the Framework Source

Given that the project is open-sourced, it felt like it was time to have a look at the source code to try and figure out more about what’s going on. My next step was to clone the repository from GitHub and build the SDK locally using the provided solution file, which took just a few minutes on a Surface Pro 3 to make a debug build and completed without any errors across the 17 projects within.

I then opened up the same sample project that I’d already been trying out but this time I opened it up and built it within the folder structure of the cloned Git repository, hoping that it would pick up the debug libraries that I’d just built so that I might better try and debug them.

That worked out fine and so I could single step my way through the sample, watching it start-up and so on.

Debugging this application is fascinating to me as someone who knows nothing about Objective-C or iOS apps – there’s an unusual combination of familiar/foreign in that I can kind of figure what’s going on conceptually but I can feel my ‘conscious incompetence’ around a lot of the details.

I started from ‘Main’ and tried to debug ‘forwards’ to see how the application spins up.

The first real entry point into the app seems to be in ConsumeRuntimeComponent.cpp, function EbrDefaultXamlMain() but it’s kind of ‘fun’ to look at the call stack that gets us to that early point;


This function calls straight into Objective-C code in main.m which has its own main which calls UIApplicationMain, passing it the AppDelegate class along with argv and argc.


UIApplicationMain initialises COM, and then calls Windows.UI.Xaml.Application.Start() passing its own App class and so we’re now on the way to spinning up a UWP Xaml app;


and that App class has an OnLaunched override which does;


and so, at one level that feels kind of ‘familiar’ and not too alien at all perhaps apart from the calls to IWSetXamlRoot (Islandwood?) and IWRunApplicationMain.

What do they do? They do quite a lot but, in short, they seem to do something like;


  • creates an instance of CAXamlCompositor
    • this implements a CACompositorInterface defined in CACompositorInterface.h and seems to be the abstraction that separates the calling code from the details of composition. The interface deals in terms of DisplayNode, DisplayAnimation and DisplayTransaction and the actual implementation subclasses DisplayNode to make a DisplayNodeXaml.
  • the compositor is ‘registered’ with the module via a global function that puts the instance behind global/static Get/Set methods.
  • does some work to register an input handler XamlUIEventHandler (for pointer up/down/moved and key down) into a C# module (CALayerXaml.cs) that provides the CALayerXaml implementation derived from Panel and exported as a WinRT component.
    • the CALayerXaml handles input by delegating the calls down to this XamlUIEventHandler before marking those events as having been handled.

In terms of how this layers, as far as I can understand it, we have a layering like;

  • UISlider is a UIView
  • UIView has a CALayer
  • CALayer seems to draw through the CACompositorInterface (and Core Graphics) implementation that it is given which, in this case, is the CAXamlCompositor and its associated DisplayNode, DisplayTransaction, DisplayAnimation types.

I think that’s how it works. As an example, if I look into UIButton there’s a method called createLabel and part of that sets the background colour;

[self->_label setBackgroundColor:[UIColor clearColor]];

and if I step into that then I see it calling into UILabel to setBackgroundColor which does;

  [super setBackgroundColor:colorref];

and that involves the call to the base class (UIView) which has the layer and so it does;

[layer setBackgroundColor:[(UIColor*) priv->backgroundColor CGColor]];

and the layer creates/uses a CATransaction;

[CATransaction _setPropertyForLayer: self name: @"backgroundColor" value: (NSObject *) color];

and calls into _setPropertyForLayer which involves getting hold of the compositor;

GetCACompositor()->setDisplayProperty([self _currentTransaction]->_transactionQueue, layer->priv->_presentationNode, [propertyName UTF8String], newValue);

which I take as a bundle of things to be done and then we call setNeedsDisplay;

[self setNeedsDisplay];

which calls back into the compositor to tell it that the ‘display tree has changed’;


and that ends up calling through to UIApplication viewChanged which seems to call into NSRunLoop on either a main or current loop.
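Pulling that chain together into one sketch (all invented names and simplified signatures rather than the bridge’s actual code), the view delegates to its layer, the layer queues the change on a transaction, and a later commit hands the whole batch to the compositor;

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical end-to-end sketch of the chain walked through above:
// View.setBackgroundColor -> its Layer -> a transaction queue -> the
// compositor. All names here are invented stand-ins.
struct Compositor {
    std::vector<std::string> applied;
    void setDisplayProperty(const std::string& node, const std::string& name,
                            const std::string& value) {
        applied.push_back(node + "." + name + "=" + value);
    }
};

struct Transaction {
    std::vector<std::string> queued;  // the 'bundle of things to be done'
    void setPropertyForLayer(const std::string& layer, const std::string& name,
                             const std::string& value) {
        queued.push_back(layer + "|" + name + "|" + value);
    }
    // Triggered by something like setNeedsDisplay telling the compositor
    // that the display tree has changed; flushes the batch in order.
    void commit(Compositor& compositor) {
        for (const auto& entry : queued) {
            auto first = entry.find('|');
            auto second = entry.find('|', first + 1);
            compositor.setDisplayProperty(
                entry.substr(0, first),
                entry.substr(first + 1, second - first - 1),
                entry.substr(second + 1));
        }
        queued.clear();
    }
};

struct ViewLayer {
    std::string name;
    // Setting a property queues it rather than drawing immediately.
    void setBackgroundColor(Transaction& transaction, const std::string& color) {
        transaction.setPropertyForLayer(name, "backgroundColor", color);
    }
};

struct View {
    ViewLayer layer;  // every view is backed by a layer
    void setBackgroundColor(Transaction& transaction, const std::string& color) {
        layer.setBackgroundColor(transaction, color);  // delegate down
    }
};
```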

I lost the plot a little bit there but the chain of calls from here seems to ultimately lead through to NSRunLoop which (if I read it more or less correctly) sits waiting on a number of sockets, one of which can be signalled by the _wakeUp method. That seems to cause execution of the function XamlTimedMultipleWait in StarboardWSCompositor.cpp, which picks up the event in question and does some kind of switching/dispatching between 2 ‘fibers’ which seem to be running a WinObjcMainLoop and a XAML dispatcher at the same time;

    auto dispatcher = CoreWindow::GetForCurrentThread()->Dispatcher;
    Windows::System::Threading::ThreadPool::RunAsync(ref new WorkItemHandler(
        [&retval, &retValValid, events, numEvents, timeout, sockets, dispatcher](IAsyncAction ^action) {
            //  Wait for an event
            retval = EbrEventTimedMultipleWait(events, numEvents, timeout, sockets);
            retValValid = true;

            //  Dispatch it on the UI thread
            ref new DispatchedHandler([]() {
Again, I lost the call stack but I think this then ultimately leads through to a call being dispatched into the CALayerXaml class to actually do the work, i.e. the call stack here;


and that’s an interesting thing in terms of the C++/C#/Objective-C call stack.
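Stripped of the fibers and the WinRT types, the ‘wait on a worker, then dispatch the result back to the UI loop’ shape of that code might be sketched like this with plain std::thread (this is just my illustration of the pattern, nothing like the real StarboardWSCompositor code);

```cpp
#include <cassert>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>

// A minimal 'dispatcher': work items are queued from any thread and drained
// on the thread that owns the loop, mimicking dispatching onto a UI thread.
class Dispatcher {
public:
    void post(std::function<void()> work) {
        std::lock_guard<std::mutex> lock(mutex_);
        queue_.push(std::move(work));
        cv_.notify_one();
    }
    // Block until one item arrives, then run it here (enough for the sketch).
    void runOne() {
        std::unique_lock<std::mutex> lock(mutex_);
        cv_.wait(lock, [this] { return !queue_.empty(); });
        auto work = std::move(queue_.front());
        queue_.pop();
        lock.unlock();
        work();  // executed on the 'UI' thread
    }

private:
    std::mutex mutex_;
    std::condition_variable cv_;
    std::queue<std::function<void()>> queue_;
};

// A worker thread waits for an 'event' (the stand-in for something like
// EbrEventTimedMultipleWait) and dispatches the handling back to the loop.
int waitThenDispatch(Dispatcher& dispatcher) {
    int handled = 0;
    std::thread worker([&dispatcher, &handled] {
        dispatcher.post([&handled] { handled = 42; });
    });
    dispatcher.runOne();  // the 'UI thread' processes the dispatched work
    worker.join();
    return handled;
}
```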

I also stepped through IWRunApplicationMain and, in as much as I can tell without spending too much time on it, this goes through via calls to IWAppInit and IWStartUIRunLoop which, respectively, seem to;

  • IWAppInit
    • seems to get hold of the temporary folder and the local folder for the app and store them somewhere for later use
    • drops back into Objective-C code in order to execute SetXamlUIWaiter();


    • which I haven’t fully dug into at this point.
  • IWStartUIRunLoop
    • this seems to set up 2 ‘fibers’, with one seeming to be the XAML dispatcher and the other assigned to run the function WinObjcMainLoop;


WinObjcMainLoop sets up some kind of Windows event to wait upon before calling into UIApplicationMainStart, which is a meatier-looking piece of work that seems to set up the default orientation, some widths, heights and scales, and a tablet setting before calling UIApplicationMainInit and UIApplicationMainLoop.

UIApplicationMainInit seems to call back into where I was in Experiment 1 in that this method calls back into AppDelegate.didFinishLaunchingWithOptions;


and then it activates the application via AppDelegate.applicationDidBecomeActive (which the sample doesn’t do anything specific to handle) and sets its windows to be visible.

The UIApplicationMainLoop then uses NSRunLoop to run the main loop of the application.

At this point, I felt that I was getting a somewhat fuzzy but basic grip on what was going on here and so I dropped out of debugging this for a while and tried something else…

Experiment 5 – Importing a Project

I wanted to step back from trying to figure out the code and get more of a feel for what it’s like to import an Xcode project here because the project that I’d played with so far had been imported for me.

I had a look at the wiki around how this process works and then I thought I’d try an Apple sample to see if I could apply the process to a standard sample.

I went over to Apple’s sample code and I thought that I’d try out a UIKit sample like this CollectionView-Simple sample and so I downloaded that.

I picked that one out partly at random but also because it said it was a UIKit sample. I’m not sure how samples targeting other areas like NetworkExtension or CoreMotion would or wouldn’t work as I don’t think those APIs are currently part of the preview, so I suspect there’d be a lot of work to do there to try and get those samples to build.

I extracted it out to my desktop and then ran the vsimporter.exe tool on it as directed and as per below;


and then I opened up the solution generated in Visual Studio and that seemed to be ok. I got a bunch of source;


and then compilation was a little more tricky in that I got a few errors;


the first of which was that the code was using @import and this issue said what to do about that and so I changed all the @import directives into #import directives and that got me down to;


and that left me puzzling a bit over why a Cell which is a UICollectionViewCell didn’t seem to have a selectedBackgroundView property.

I looked at the reference here and, sure enough, it looks like there should be a selectedBackgroundView property from iOS 6.0 onwards.

Looking into the source code for the bridge, in UIKit/ I could find some references to a selectedBackgroundView but it didn’t seem to be a public property.

Temporarily, I commented out that line of code and tried to build but I hit;


I hadn’t realised that this sample contained storyboards and the wiki says that Storyboards are not yet supported.

Now, with a detailed knowledge of what Storyboards in an iOS application actually involve, I might be able to work around this but, for the moment, I was blocked on importing that project and getting it to build.

I’m going to take a look at a few more of the Apple samples to see if I can get those to import here but, meantime, I thought I’d share these notes in case they’re of use to other folks (including, of course, anyone who wants to correct some of the mistakes I’ve probably made along the way here!).