Speculating About Microsoft’s Project N

November 25, 2013


Project N has been making headlines for a few days, after a mysterious demo showed the startup time of the Fresh Paint Windows Store app improving considerably when the app was magically compiled with “Project N” technology. In this post I’ll try to speculate based on the very little information that has been disclosed publicly, hoping to shed light on what this project could be.

Please note that although I have just come back from the MVP Summit, the text below is not based on any discussions or information from Microsoft. No NDA material has made it into this post.

Pretty much the only “official” statement we have about Project N is the following: “We don’t do JIT (just-in-time compile) and use optimization that we use for C++ code.”

The idea of precompiling managed code to a native-only binary that doesn’t require JIT services isn’t new. Because Apple’s kernel memory protection on iOS makes JIT compilation technically impossible, Xamarin.iOS (formerly MonoTouch) has been compiling C# to native ARM instructions for several years now.

To clarify, compiling C# to native code doesn’t mean there are no managed runtime services when the application runs. The garbage collector still has to collect memory, metadata is still available for services such as Reflection, and various other “runtime” features are still present. What’s not available at runtime — for performance reasons — is the JIT compiler itself.
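To make this concrete, here is a minimal sketch in plain C# (my own illustration, not code from Project N or any specific toolchain) that relies on the garbage collector and on Reflection metadata, yet never requires the JIT to produce new code at runtime:

using System;
using System.Reflection;

class RuntimeServicesWithoutJit
{
    static void Main()
    {
        // Garbage-collected allocation works as usual; the GC is a runtime
        // service that doesn't care how the code was compiled.
        var buffer = new byte[1024 * 1024];
        GC.Collect();
        GC.KeepAlive(buffer);

        // Reflection over metadata also works, because the metadata ships
        // alongside the precompiled native code.
        foreach (PropertyInfo prop in typeof(DateTime).GetProperties())
        {
            Console.WriteLine(prop.Name);
        }
    }
}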

Precompiling to native code and excluding the JIT from the runtime services entirely comes with a set of limitations. For example, you can’t emit new C#/IL code using APIs like DynamicMethod and then execute it at runtime. However, these limitations aren’t new to Windows Store and Windows Phone managed app developers — most runtime compilation services are prohibited in the Windows Runtime anyway. Another limitation concerns instantiating new generic types that weren’t available to the precompiler, using an API such as Type.MakeGenericType. For example, if the precompiler only produced native versions of List<int> and List<string>, then at runtime doing something like typeof(List<>).MakeGenericType(new[] { typeof(double) }) would require the JIT to kick in.
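Here is a minimal sketch (again, only an illustration) of the two patterns above; both of them force the runtime to produce code for method bodies or generic instantiations that no precompiler could have seen in advance:

using System;
using System.Collections.Generic;
using System.Reflection.Emit;

class RequiresJitAtRuntime
{
    static void Main()
    {
        // 1. Emitting brand-new IL at runtime with DynamicMethod: there is
        //    no precompiled native body for it, so only a JIT can run it.
        var add = new DynamicMethod("Add", typeof(int),
                                    new[] { typeof(int), typeof(int) });
        ILGenerator il = add.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0);
        il.Emit(OpCodes.Ldarg_1);
        il.Emit(OpCodes.Add);
        il.Emit(OpCodes.Ret);
        var adder = (Func<int, int, int>)add.CreateDelegate(typeof(Func<int, int, int>));
        Console.WriteLine(adder(2, 3));

        // 2. Closing a generic type over an argument the precompiler never
        //    saw: if only List<int> and List<string> were precompiled,
        //    List<double> has no native code to fall back on.
        Type closed = typeof(List<>).MakeGenericType(new[] { typeof(double) });
        object list = Activator.CreateInstance(closed);
        Console.WriteLine(list.GetType());
    }
}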

Native precompilation of Windows Store or Windows Phone apps, however, doesn’t necessarily mean the JIT can’t be loaded as a fallback if necessary. Unlike the iOS kernel, the Windows kernel allows allocating memory pages that are both writable and executable, which is what runtime JIT compilation requires. In fact, the NGen technology has always relied on precompiling IL to native code while falling back to the JIT compiler if a method wasn’t precompiled in advance.
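Here is a minimal sketch (a plain Win32 P/Invoke illustration, unrelated to Project N itself) of how a JIT obtains such a writable and executable region on Windows, which is exactly what an iOS app is not allowed to do:

using System;
using System.Runtime.InteropServices;

class ExecutablePageSketch
{
    const uint MEM_COMMIT = 0x1000;
    const uint MEM_RESERVE = 0x2000;
    const uint PAGE_EXECUTE_READWRITE = 0x40;

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern IntPtr VirtualAlloc(IntPtr lpAddress, UIntPtr dwSize,
                                      uint flAllocationType, uint flProtect);

    static void Main()
    {
        // A JIT writes freshly compiled machine code into a region like this
        // and then jumps into it; iOS forbids such writable+executable pages
        // for store apps, while Windows grants them readily.
        IntPtr page = VirtualAlloc(IntPtr.Zero, (UIntPtr)4096u,
                                   MEM_COMMIT | MEM_RESERVE, PAGE_EXECUTE_READWRITE);
        Console.WriteLine(page == IntPtr.Zero
            ? "Allocation failed"
            : "Writable and executable page at 0x" + page.ToInt64().ToString("X"));
    }
}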

An interesting question about Project N is how exactly the CLR team is going to use the C++ optimizer to precompile IL code. I see several options here, but my speculation is the following: the IL code would be converted to the same kind of intermediate CIL .obj files that the C++ compiler emits when Whole Program Optimization (WPO) is enabled. This CIL format (the C++ compiler’s own intermediate representation, not to be confused with .NET’s IL) is more flexible than plain native-code .obj files and allows better optimizations to take place at link time. Microsoft’s recent focus on Profile Guided Optimization (PGO) for Windows Store apps, which is based on the same WPO CIL .obj technology, might be another clue that this direction is being actively explored. PGO has been part of Visual Studio for almost 10 years, and it would be incredible if the CLR could find another use for it as an optimizing backend.

Wrapping up, full native compilation of managed code is entirely possible and has been available since .NET 1.0 with NGen. Microsoft has been exploring full-application precompilation since NGen 2.0, and has experimented with compiling in the cloud to “almost native” MDIL code (“Triton”, “Compiler in the Cloud”) for Windows Phone 8. Using the state-of-the-art Visual C++ optimizing backend for this precompilation process sounds like the logical next step. At the same time, Microsoft has been evolving the JIT compiler with the RyuJIT project. That’s why a combination of ahead-of-time optimization to native code and runtime fallback to a vastly improved JIT engine sounds like a winner to me.


I am posting short links and updates on Twitter as well as on this blog. You can follow me: @goldshtn


5 comments

  1. anthony, November 30, 2013 at 1:46 PM

    Hi Sasha,

    I agree with your opinion, but somehow I think there is more to the story. For example, what about the rumor that Microsoft is considering native C# for the client (app store software) and JIT (RyuJIT) for the server (back-end stuff)?

    Or do you think that eventually we will have a “unified” solution with JIT and native compilation in the same CLR, depending on the environment?

    Thanks for the post!

    1. Sasha Goldshtein, December 4, 2013 at 5:28 PM

      I think if you can deliver native precompilation with full optimization, it would be valuable on both server and client. Server application startup, especially in dense hosted environments, is super-important. First-request latency is a known problem with IIS hosting.

  2. Joe, February 7, 2014 at 10:03 AM

    Is this also for desktop apps, for example WinForms? Or only for Metro-style apps?

    What’s the point of that?

  3. Pingback: .NET Native Performance and Internals

  4. S Ten, August 7, 2014 at 10:54 AM

    Visual Basic 6 (VB6) has done this for years. It combines better productivity (by using the VB6 RAD) with the performance of C++ (by using the C++ compiler).

    Pre-compiling C#, rather than JIT compiling, will speed up loading time, though not necessarily performance. This is long overdue.
    Of course, the backend compiler for this is the C++ compiler – just like the one the VB6 programming language has used since 1998!

    Everything old is new again
