I just learned that the C# compiler is being ported to C# and the VB.NET compiler is being ported to VB.NET. (Right now they are both written in C++, of course :-))
The beginning of the presentation focused on the three trends – dynamic, declarative, concurrent – that are prevalent in the latest releases of the .NET languages.
Luca Bolognese showed an example of converting an imperative for-loop to a LINQ query, which is very declarative. Next, he used the AsParallel() extension method (from PLINQ) to parallelize the query – which demonstrates how declarative programming makes it easier for the system to parallelize the query for you, without your having to worry about synchronization: in a way, writing declarative code is a prerequisite for many parallelism features. (See my ParallelFX category for more interesting examples of the upcoming parallelism support in .NET 4.0.)
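A minimal sketch of the kind of transformation he demonstrated (my own toy example, not his demo code): the same filter-and-transform written imperatively, declaratively with LINQ, and in parallel with PLINQ's AsParallel():

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    static void Main()
    {
        int[] numbers = Enumerable.Range(1, 100).ToArray();

        // Imperative: spell out how to iterate and accumulate.
        var squares = new List<int>();
        foreach (var n in numbers)
        {
            if (n % 2 == 0)
                squares.Add(n * n);
        }

        // Declarative: say what we want; the runtime decides how.
        var query = from n in numbers
                    where n % 2 == 0
                    select n * n;

        // Parallel: AsParallel() lets PLINQ partition the work across cores.
        var parallelQuery = from n in numbers.AsParallel()
                            where n % 2 == 0
                            select n * n;

        Console.WriteLine(squares.Sum());       // 171700
        Console.WriteLine(query.Sum());         // 171700
        Console.WriteLine(parallelQuery.Sum()); // same result, computed in parallel
    }
}
```

Note that nothing else changes between the sequential and parallel queries – because the query says *what*, not *how*, PLINQ is free to reorder and partition the work.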
The next example used the new dynamic programming support in C# 4.0 and the CLR. The example was a classic “dynamic bag” (a variation on the expando object), enabling a line from a CSV file to have typed properties deduced from the CSV file’s header row. This is an example of dynamic API design without actually writing the API 🙂
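A sketch of what such a dynamic bag might look like using ExpandoObject (my own illustration – the demo presumably used a custom DynamicObject implementation, and the CSV header and values here are made up):

```csharp
using System;
using System.Collections.Generic;
using System.Dynamic;

class Program
{
    static void Main()
    {
        // Hypothetical CSV input: a header row and one data row.
        string header = "Name,Age,City";
        string row = "Ada,36,London";

        string[] fields = header.Split(',');
        string[] values = row.Split(',');

        // Build one "record" whose members come from the header row.
        dynamic record = new ExpandoObject();
        var bag = (IDictionary<string, object>)record;
        for (int i = 0; i < fields.Length; i++)
            bag[fields[i]] = values[i];

        // These members were never declared anywhere,
        // yet they resolve via the DLR at run time.
        Console.WriteLine(record.Name); // Ada
        Console.WriteLine(record.City); // London
    }
}
```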
In the second part of the presentation, Luca started talking about future thoughts for next releases of the languages (without promising anything, of course).
The first area they are exploring is opening up the compiler. Currently, the compilers are black boxes – you do have an opportunity for runtime code-generation, but it’s basically pushing source files in and getting IL out. The possibility of using the same data structures the C# compiler uses (the AST) is very interesting, and enables e.g. a REPL app for C#, which Anders Hejlsberg showed at PDC ‘08.
Luca showed an example that takes a piece of VB.NET code and performs a refactoring to change the order of parameters. Writing this refactoring today would involve writing a lexer and then a parser for the language – and the grammar of VB.NET is fairly complex. With a possible future compiler, it could be done with a visitor that rewrites the abstract syntax tree (AST), swapping the order of the parameters – about 100 lines of code instead of a full parser for the language. [There were attempts to do similar things in the past – cf. the introspection engine used by Code Analysis (FxCop) rules – but the ability to modify the compiler’s AST was never exposed before.]
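To make the idea concrete, here is a toy visitor over a hypothetical, drastically simplified AST (these node types are my invention for illustration, not the real compiler's data structures) that swaps the first two parameters of a method declaration and the corresponding arguments at a call site:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical AST node types (illustration only).
class MethodDecl
{
    public string Name;
    public List<string> Parameters = new List<string>();
}

class CallExpr
{
    public string Method;
    public List<string> Arguments = new List<string>();
}

class SwapFirstTwoParamsRewriter
{
    // Rewrite a declaration of the target method.
    public void Visit(MethodDecl decl, string target)
    {
        if (decl.Name == target && decl.Parameters.Count >= 2)
        {
            string tmp = decl.Parameters[0];
            decl.Parameters[0] = decl.Parameters[1];
            decl.Parameters[1] = tmp;
        }
    }

    // Rewrite a call site of the target method to match.
    public void Visit(CallExpr call, string target)
    {
        if (call.Method == target && call.Arguments.Count >= 2)
        {
            string tmp = call.Arguments[0];
            call.Arguments[0] = call.Arguments[1];
            call.Arguments[1] = tmp;
        }
    }
}

class Program
{
    static void Main()
    {
        var decl = new MethodDecl { Name = "Move", Parameters = { "x", "y" } };
        var call = new CallExpr { Method = "Move", Arguments = { "10", "20" } };

        var rewriter = new SwapFirstTwoParamsRewriter();
        rewriter.Visit(decl, "Move");
        rewriter.Visit(call, "Move");

        Console.WriteLine(string.Join(",", decl.Parameters)); // y,x
        Console.WriteLine(string.Join(",", call.Arguments));  // 20,10
    }
}
```

The point is that the rewriter only touches the two node types it cares about; lexing, parsing, and unparsing would all come for free from the opened-up compiler.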
The second area they are exploring is asynchronous programming. Because of the declarative programming features in the languages, it’s become easier to write parallel code, but long-running asynchronous invocations that involve message-passing or task coordination are still not easy. The example starts with a console app that performs HTTP requests and returns the content-length of each returned resource. It’s synchronous – it runs on one thread – and it’s easy to make it faster by performing multiple HTTP requests in parallel. The first attempt at parallelization doesn’t require any new language features – just add an AsParallel() and it works.
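A sketch of the shape of that demo (my own reconstruction – GetContentLength here is a fake stand-in that just sleeps instead of issuing a real HTTP request, and the URLs are made up):

```csharp
using System;
using System.Linq;
using System.Threading;

class Program
{
    // Fake stand-in for the HTTP request in the demo: it sleeps to
    // simulate a slow, blocking network call, then returns a
    // pretend content-length derived from the URL.
    static int GetContentLength(string url)
    {
        Thread.Sleep(100);
        return url.Length * 100;
    }

    static void Main()
    {
        var urls = new[] { "http://a.example", "http://bb.example", "http://ccc.example" };

        // Synchronous: one request after another, on a single thread.
        var lengths = from url in urls
                      select GetContentLength(url);

        // Parallel: the only change is AsParallel().
        var parallelLengths = from url in urls.AsParallel()
                              select GetContentLength(url);

        Console.WriteLine(lengths.Sum());         // 5100
        Console.WriteLine(parallelLengths.Sum()); // 5100, but faster
    }
}
```

The catch, of course, is that AsParallel() still *blocks* threads on I/O – it parallelizes, but doesn’t make the calls asynchronous, which is where the language extension below comes in.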
The language extension involved would allow you to specify that when a method is called, it should release the current thread and invoke a callback when the operation completes. This is achievable today using the APM (e.g. asynchronous delegates), but if you want to coordinate multiple invocations and think about exception propagation, it quickly becomes a PITA. [But this is what Tasks in the TPL are all about. I’m not altogether sure this is a reason to add a language feature, at least to the major languages, even though it could be made more readable than Task-based code. Cf. Axum.]
The keyword Luca used (in an experimental, prototype version of the C# compiler) is yield, placed in front of a method invocation, to specify that the call should run asynchronously; the invocation then returns a new Async&lt;T&gt; type that represents a future (like Task&lt;T&gt; in the TPL).
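For a rough feel of how such code might read, here is the equivalent written with what this feature later became – C# 5's async/await, where Task&lt;T&gt; plays the role of Async&lt;T&gt; and await replaced the yield prefix. (A sketch only: the URLs are made up, and Task.Delay stands in for the real HTTP call.)

```csharp
using System;
using System.Threading.Tasks;

class Program
{
    // Stand-in for the asynchronous HTTP call: Task.Delay simulates I/O,
    // and the returned "content length" is just the URL's length.
    static async Task<int> GetContentLengthAsync(string url)
    {
        await Task.Delay(10);
        return url.Length;
    }

    static async Task Main()
    {
        // Start both operations; the current thread is released while
        // they are in flight, and the continuations run as callbacks.
        Task<int> a = GetContentLengthAsync("http://a.example");
        Task<int> b = GetContentLengthAsync("http://bb.example");

        // Coordinating multiple invocations (and their exceptions)
        // reads like straight-line code.
        int total = await a + await b;
        Console.WriteLine(total); // 33
    }
}
```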