Over the last few months, a small team from Sela has developed course materials, exercises, and demos for the DPE Metro Program that show how to develop with the new features of Windows 7. As you may know, all of the new features of the system are exposed to native developers through the Win32 API and COM. To make them usable from C# and .NET, we developed a set of wrappers. Sasha developed the Taskbar wrappers and also wrote about them on his blog, Dima developed the Sensor & Location Platform wrappers, and I developed the Shell Libraries wrapper and the Multi-touch wrappers.
You can download these wrappers from here:
For the Taskbar & Libraries wrappers we used the Vista Bridge project with a few modifications; the other wrappers don't depend on the Vista Bridge. The Windows Bridge team is currently in contact with us and is looking at our wrappers as a prototype for the next version of the bridge.
So, what's in it for you?
Sasha has written a lot about the Taskbar, so I will talk briefly about the other wrappers; in future posts I will dive into the gory details.
Sensor & Location Platform
The Sensor API is designed for working with sensors (light, accelerometer, buttons, temperature, etc.) in a vendor-independent manner. In previous Windows versions you had to learn and target each vendor’s specific API.
The Location API builds upon the Sensor API and provides location information for location-aware applications. The location information may come from many different sources: GPS, Wi-Fi hotspots, or GSM cell triangulation. The Windows 7 Sensor & Location Platform automatically selects the best source for you, with no need to even understand the different providers. For the developer, this new set of APIs abstracts the underlying infrastructure, freeing you to focus on high-level application logic.
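As a rough illustration, consuming location data through a managed wrapper might look like the following sketch. The type and member names here (LatLongLocationProvider, LocationChanged, the report properties) are assumptions for illustration only; the actual wrapper API may differ:

```csharp
// Hypothetical sketch: subscribe to location updates via a managed
// wrapper. Names are illustrative, not the real wrapper API.
var provider = new LatLongLocationProvider();
provider.LocationChanged += (sender, report) =>
{
    // The report arrives the same way regardless of whether the fix
    // came from GPS, Wi-Fi positioning, or cell triangulation.
    Console.WriteLine("Lat: {0}, Long: {1}", report.Latitude, report.Longitude);
};
```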
The API has many benefits (a partial list):
1. It frees you from the burden of interpreting the various protocols used by GPS receivers (NMEA with non-standard extensions, NMEA 2000, and others).
2. It is not bound to any particular technology or vendor, and there is no need for serial ports or port emulation.
3. Multiple applications may use the information simultaneously.
The S&L (Sensor & Location) wrapper is a framework that provides managed access to the S&L Platform. It is a framework because its design allows hardware vendors to extend the wrapper to support new .NET sensor types.
Unlike the native API, which relies on COM GUIDs for all type information, the .NET wrapper is strongly typed. This means that the developer knows the type of the sensor she uses and the exact report she gets back. For example, an ambient light sensor generates a specific data report with a strongly typed property giving the light-intensity reading in lux. In a future post I will talk more about the wrapper.
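To make the strongly typed design concrete, here is a minimal sketch of what working with an ambient light sensor could look like. The AmbientLightSensor class, SensorManager, and the event and property names are illustrative assumptions, not the actual wrapper API:

```csharp
// Hypothetical sketch of the strongly typed sensor design; the class,
// event, and property names are illustrative assumptions.
AmbientLightSensor sensor = SensorManager.GetSensors<AmbientLightSensor>().First();
sensor.DataReportChanged += (s, e) =>
{
    // No GUIDs or property keys to decode: the report exposes
    // a typed lux value directly.
    Console.WriteLine("Ambient light: {0} lux", e.Report.IlluminanceInLux);
};
```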
Shell Libraries
Windows 7 introduces a new concept, Libraries: destinations where users can work with their files as collections of items that may span multiple locations, even across computers. Libraries supersede the known folders of previous Windows versions (e.g. Documents, Pictures, Music) and replace them as the main “storage” destination. With the Library API, applications have a straightforward way to interact with Libraries programmatically; they can create Libraries, work with them, and support them as first-class items in their experiences.
In previous versions of Windows, each application had its own proprietary library. For example, Windows Media Player had a different set of scopes than iTunes, and neither was consistent with your Music folder. With the Library API, applications can define and consume a consistent set of user-defined scopes.
Libraries may also contain network folders, which enables a better user experience both at work and at home. Whenever the user opens a common file dialog, he or she gets an aggregated view of all of the library's locations.
The Shell managed wrapper can be found in Windows7.DesktopIntegration.dll. It contains wrappers for the new Windows Shell Taskbar and the new Windows Shell Libraries, both of which depend on the Windows Vista Bridge. The Vista Bridge that comes with this project is based on version 1.4 and is slightly modified to support some of the new Windows 7 features.
To support the Windows Shell Libraries operations, the Windows 7 Integration Library contains the ShellLibrary class. The class has a few static methods to create, load, or delete a library, and many instance methods to work with the library instance itself.
Using the class is just a matter of calling one of the static methods to get a ShellLibrary instance and then calling methods on that instance. The following code opens an existing library and adds a new folder location to it:
public static void AddFolder(string name, string folderPath)
{
    // Load the existing library for writing (false = not read-only).
    using (ShellLibrary library = ShellLibrary.Load(name, false))
        library.AddFolder(folderPath);
}
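Creating a brand-new library follows the same pattern: get an instance from a static method, then work with it. The following is a hedged sketch; the CreateLibrary method name and its signature are assumptions for illustration and may differ in the actual wrapper:

```csharp
// Hypothetical sketch: create a new library and add a folder to it.
// CreateLibrary (and its overwrite flag) are illustrative assumptions.
using (ShellLibrary library = ShellLibrary.CreateLibrary("Specs", true))
{
    library.AddFolder(@"D:\Work\Specs");
}
```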
More on Shell Libraries in future posts.
Multi-Touch
One of the more exciting features of Windows 7 is multi-touch support. First, I have to admit that I had a lot of fun writing code and cool demos for multi-touch. I have two multi-touch machines: an HP TouchSmart, a 25.5” dual-touch all-in-one computer that runs the 64-bit version of the Windows 7 beta, and a Dell Latitude XT, which has true multi-touch support from N-trig and runs the 32-bit version of Windows 7.
Windows applications can target one of three levels of touch integration:
1. Good: no specific touch APIs are used, but the application UI is appropriately sized and works well with the built-in gestures.
2. Better: the gesture APIs are supported to give smooth, natural interactions.
3. Best: a deep, touch-focused experience designed to take full advantage of multi-touch.
Windows 7 User32 controls have built-in support for multi-touch; for example, you can scroll text in a window using your fingers.
When you want to add touch support to your application, you need to get touch information from Windows, and there are two different approaches to choose from. The easier approach is to let Windows interpret the user's intent. The user uses her fingers to perform actions such as rotate, translate, scale, two-finger tap, and so on, and you can ask Windows to translate these actions into a simple window message called a gesture that informs your application of the specific action along with its related data; for example, the user performed a scaling operation with a scale factor of 2.5. There are two main problems with this simple approach. The first is that you get only one interpretation of the user's action: if the user rotates and scales at the same time, you will get only one gesture, not both. The second is that if the user needs to manipulate more than one object on the screen, say by using both hands, or another user's hand, you will not get the correct interpretation at all.
To overcome these problems you should register for the low-level window message, WM_TOUCH, which gives you the raw touch information. You need to decode the message; there is a good chance that a single message contains many different touch points, one for each finger simultaneously touching the screen. The next step is to decide which touch point belongs to which item: suppose you have many pictures on the screen, you need to correlate each touch message with its target picture, which you can do by hit-testing using the location information of the touch event. Now you have to figure out the user's intent. This is a hard task if you need to do it yourself, but don't worry: Windows 7 provides a Manipulation Processor that does it for you. All you need to do is pass the raw touch messages to the Manipulation Processor and handle the Manipulation Delta event it raises, which is very similar to the gesture information but includes all the actions together. If you also want inertia on each of the objects, you can use the Inertia Processor. This processor is a little more complicated, and I will write about it in a future post.
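The pipeline described above (decode WM_TOUCH, hit-test, feed a per-object manipulation processor, react to the delta) can be sketched roughly as follows. All class, method, and event names here are illustrative assumptions, not the actual wrapper or Win32 API:

```csharp
// Hypothetical sketch of the WM_TOUCH pipeline; names are illustrative.
void OnTouchDown(TouchEventArgs e)
{
    // 1. Hit-test: find which picture this finger landed on.
    Picture picture = HitTest(e.Location);
    if (picture == null)
        return;

    // 2. Route the raw touch data to that picture's own processor,
    //    keyed by the touch-point id so multiple fingers stay separate.
    picture.ManipulationProcessor.ProcessDown((uint)e.Id, e.Location);
}

void OnManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    // 3. Unlike a gesture message, the delta combines translation,
    //    rotation, and scaling in a single event.
    ApplyTransform(e.TranslationDelta, e.RotationDelta, e.ScaleDelta);
}
```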
All of these messages and APIs are native, Win32 and COM based. WPF 4.0 will have full support for touch-enabled applications, including manipulation and inertia. If you don't want to wait for WPF 4.0, or you need to create WinForms-based touch-enabled applications, you can use our wrapper. The wrapper lets you create a touch handler for WinForms or WPF (or even for any User32 window handle). This handler can be a GestureHandler or a TouchHandler. After creating the handler, you register for the gesture or touch events. If you use the TouchHandler, you will probably also create at least one instance of ManipulationProcessor or ManipulationInertiaProcessor, which wrap the native COM-based processors.
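For the simpler gesture route, using the wrapper might look like the sketch below. The factory method and event names are assumptions for illustration; check the wrapper's actual surface for the real names:

```csharp
// Hypothetical sketch: attach a GestureHandler to a WinForms form and
// react to zoom and rotate gestures. Names are illustrative assumptions.
GestureHandler handler = GestureHandler.Create(this); // 'this' is a Form
handler.Zoom += (s, e) =>
{
    // e.ZoomFactor is assumed to be the relative scale since last event.
    ScaleImage(e.ZoomFactor);
};
handler.Rotate += (s, e) => RotateImage(e.Angle);
```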
If you have a WPF 3.5 application, you can consume the stylus events as touch events; just use the WPF handler to enable WPF touch events.
To get a better understanding of how to use the touch wrapper, we created many demos for WinForms, WPF, and even a C++/CLI Win32 application that uses the wrapper. I will write a future post with more details.
Windows 7 is a cool operating system. End users get a better experience when using the Shell and the built-in applications, but it is our task to bring these cool features into our own applications. .NET developers will get better support in the future, whether in .NET 4.0 or from the Windows Bridge team, but for now you can enjoy the wrappers that we developed for DPE.