Archive for the 'The Art of Programming' Category

It’s Done™ (a.k.a. it Works™)

Posted in Campfire Stories, The Art of Programming on June 10th, 2007

Have you ever had a disagreement about when something is actually done (as opposed to “not done yet”)?

I started learning about “done” when my parents started making me do chores around the house.

Doing dishes: Walk to sink, wash dishes, put in drainer. Done. Am I right? I thought I was right.

“Did you empty out the dishpan?” my mom asked.

“No.”

“Well, you’re not done yet.”

Later, I say “now I’m done.”

“Did you wipe down the sink, counters, and stove?”

“That’s part of ‘washing the dishes’?” Even a ten-year-old understands the concept of scope creep. The answer was yes, but through my push-back I won a change in the name of the task. From then on “Doing the dishes” had to be called “Doing the dishes and cleaning up the kitchen.”

The bottom line: in a team environment the implementer can influence but does not own the determination of when a task is done. Like project scope, doneness is a value negotiated with the customer of the work.

I was recently asked to define “done” in my organization. This is a clear sign that something might be broken. Maybe it just means that there have been recent disagreements about done?

What is Not Done?

I often find that a helpful way to define a positive value is by defining the corresponding negative value.

Not Done is also “it doesn’t work.” To me this is an obvious corollary but I find that this does not occur to some people. The reverse is certainly true.

Not Done is when code “Works on My Machine.” I cringe when I hear “it’s working on my machine,” especially since this is typically a response to someone saying “I can’t get it to work.” This is ESPECIALLY bad when heard in response to “I can’t get it to build.” When a developer says “it works on my machine” they are really saying:

I do not know what this software depends on.

I do not know what’s involved in moving this software to a clean machine.

I have not checked in all the required code into the source code control system (and I probably don’t know what’s missing).

I do not really care that it does not work on your machine. I’m hoping that’s your problem…

Not Done is when code has not been tested. Untested code does not really work: code that has not been exercised invariably has issues when it finally is. I consider unit testing the absolute bare minimum level of testing before code can be considered done. An alternative to unit testing? Sure, a full-blown test team.

Not done is when code cannot be reliably deployed. This is related to the “it works on my machine” issue. Reliable deployment means the developer has thought about all the steps, interactions and dependencies their code has and has documented them in detail sufficient to allow installation. Maybe the developer even went wild and wrote some install scripts or set up a full featured installer. It could happen.

Documentation

Is documentation required before code can be considered Done? If the code cannot be replicated, tested, or deployed without some written assistance then yes, it cannot be considered done until enough documentation has been written.

I find that explaining to a developer that a piece of code is theirs to work on forever and ever, until they write enough documentation to hand it off, is a powerful motivator for getting documentation written.

So What does Done mean?

Based on the discussion above I think I’ve outlined my definition of Done:

It has been tested.
It is reproducible and deployable on all supported environments.
It is documented sufficiently.
Most importantly (recalling my experience with the dishes), the customer agrees that it does what they want.

A simple WPF demo and architecture discussion

Posted in Windows Details on April 24th, 2007

This post is based on a short demo/talk I gave at Vertigo recently.

I have two main goals in this discussion:

  • How VS2005 handles WPF projects/apps (showing some Blend / VS2005 gotchas)
  • A way to look at the role of XAML in WPF architecture

Petzold books

Charles Petzold’s books have always been a huge help to me in getting all the nitty gritty details in Windows UI technology. I started my Chuck P. habit way back with his Programming Windows book. The Book for Win32 Windows programming. Ok, I do have other books too but this is the one I’d want if I were lost on a desert island alone with a C compiler…

More recently, Petzold has written authoritative .NET WinForms books and I started to notice a trend. Especially in Programming Microsoft Windows Forms, his .NET 2.0 WinForms book, Petzold really pushes a code-everything-yourself approach as opposed to using any of Visual Studio’s “helpful” tools. His point: if you do not understand the code the various visual designer/code generators in VS create, you really do not know what is going on.

I see this as just another statement of the well-known Weasley’s Law:

Never trust anything that can think for itself

if you can’t see where it keeps its brain!

Arthur Weasley — Harry Potter and The Chamber of Secrets

I’m going to focus this very small discussion on how XAML fits in to the WPF dev process and WPF architecture and point out a few gotchas in the current WPF dev tool set.

The WPF dev process

I’m going to step through building a baby application modeled closely on the sample application Microsoft provides at MSDN in Get Started Using Windows Presentation Foundation but I’m going to walk through it highlighting aspects Petzold points out in his new WPF book Applications = Code + Markup.

Prerequisites

If you want to build and execute the demo code you will need a working WPF dev environment which requires the following: the Microsoft .NET Framework version 3.0 and the Windows Software Development Kit (SDK), Visual Studio 2005 (C# Express ok), Visual Studio 2005 extensions for .NET Framework 3.0 (WCF & WPF, November 2006 CTP or newer), and Microsoft Expression Blend (RC Version or greater).

First let’s play with a trivially simple WPF project. Source code here.

Step 1

We start with an empty WPF project in VS2005, which I’ve named WPFDemo.

In Solution Explorer, click the Show All Files button.

Select the files: (all the code files in the Step 1 folder)

App.xaml

ExpenseReportPage.xaml

ExpenseReportPage.xaml.cs

HomePage.xaml

HomePage.xaml.cs

Right-click on them and Include In Project.

Look at App.xaml:

    <Application
      xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
      StartupUri="HomePage.xaml">
    </Application>

App.xaml declares the application starting point. So let’s try to build. Assuming you followed the steps exactly you will see the following build error:

WPFDemo.exe does not contain a static ‘Main’ method suitable for an entry point

This is because by default when App.xaml was included by Visual Studio its Build Action was set to Page.

Select App.xaml and change its Build Action to Application Definition. Confirm that the application now builds and runs correctly.

Note: App.xaml is the most common name for this file but there’s nothing special about the name itself. What’s important is its content and that it is configured as the Application Definition. Since the error message we got above ties this configuration to the Main() method, it should also make sense that an application can have only one file configured as the Application Definition.
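You can also see this configuration directly in the project file. A minimal sketch of the relevant .csproj fragment (the file names are from this demo; the item types are the standard WPF build actions):

```xml
<!-- App.xaml must be the single ApplicationDefinition item;
     the other XAML files stay as ordinary Page items. -->
<ItemGroup>
  <ApplicationDefinition Include="App.xaml" />
  <Page Include="HomePage.xaml" />
  <Page Include="ExpenseReportPage.xaml" />
</ItemGroup>
```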

The steps above walk through the discussion found on pp. 479-481 of Petzold’s Applications = Code + Markup.

Where’s Main()?

After a quick examination of all the files we included in the project so far you should be asking about Main() – like, where is it? App.xaml defines the entry point to the application and if you dig into the generated files for the application you will find an App.g.cs file in the obj\Debug folder:

Examining this file we find a Main():

    /// <summary>
    /// GeneratedApplication
    /// </summary>
    public partial class GeneratedApplication : System.Windows.Application {

        /// <summary>
        /// InitializeComponent
        /// </summary>
        [System.Diagnostics.DebuggerNonUserCodeAttribute()]
        public void InitializeComponent() {

            #line 3 "..\..\App.xaml"
            this.StartupUri = new System.Uri("HomePage.xaml", System.UriKind.Relative);

            #line default
            #line hidden
        }

        /// <summary>
        /// Application Entry Point.
        /// </summary>
        [System.STAThreadAttribute()]
        [System.Diagnostics.DebuggerNonUserCodeAttribute()]
        public static void Main() {
            XamlGeneratedNamespace.GeneratedApplication app = new XamlGeneratedNamespace.GeneratedApplication();
            app.InitializeComponent();
            app.Run();
        }
    }

The #line 3 directive causes any compile error occurring here to be directed to the file indicated: back to the XAML file. The #line default, #line hidden directives restore error indication back to normal. These directives also affect debug step-through.

The key takeaways from this example are:

  • VS2005 creates all kinds of generated files you should be aware of.
  • Even a “pure XAML” app involves additional generated code and files.
  • VS2005 patched for WPF is not (yet) fully XAML aware.

Obviously it would be great if we could wait for the next release of Visual Studio, codenamed “Orcas” (Here’s a great site by Scott Guthrie all about Orcas). But here at Vertigo we’re delivering WPF applications today so we need to be aware of the issues – and build anyway.

I’ll jump ahead by simply pasting in more UI content to the application and demonstrate a typical but very simple Blend/VS2005 development scenario. Source code here.

Step 2

Without making any changes to the structure of the app but adding UI code we get:

The idea is that you select a name on the list and click the button to view that person’s expense report. If you’re curious: the data displayed is hard coded as XML structured data in the HomePage.xaml file.

The first thing to do is to name the Button element in the HomePage.xaml file. To me it’s interesting that we already built and ran this app and yet nothing squawked about the button being an unnamed element.

Being a Visual Studio-centric kind of guy I naively click on the button in the Design view and look at the control properties.

First I look for the “Name” property. Huh, not there.

Ok, next I look for the “ID” property. Double-Huh, that’s not there either?

To name the button you need to go to the XAML code and, in the <Button> tag, add Name="btnView". Note that there is Intellisense support within the XAML code in the Button tag.
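After the edit the Button tag would look something like this (only the Name attribute is the point; the Content value here is illustrative, not taken from the sample):

```xml
<!-- Name is the new attribute; Content is just an illustrative label -->
<Button Name="btnView" Content="View" />
```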

Now I want to wire the Click event.

Can I access control events in the Properties panel? No.

Can I double-click on the button in the design view to add a click handler? No.

However, I can add a reference to the “Click” property in the Button tag but this does not automagically wire up an event handler stub.

What I can do is this: rebuild the app (not just Build but Rebuild) so that the Name I gave the button is in-scope for Intellisense and then add the handler in the XAML file’s associated .cs file.

In HomePage.xaml.cs, I add the event handler by starting to type “btn…” whereupon Intellisense shows me the btnView and I add the event handler and let VS2005 create an event handler stub.

All looks ok – and utterly familiar. I then edit the handler to call goToExpenseReportPage() which I have already prepared and the application is finished.

Now let’s add a twist. Go to the event handler, select the btnView_Click name, and use Refactor / Rename to rename the event handler to any other name. I used ShowExpenseReport. Rebuild the code and confirm the application still works correctly.

Mixing it up with Blend

Now open the same solution in Blend.

Select the button in the design view and select the events list in the properties panel.

Note: from Blend’s perspective the button’s click event is not wired up. This is because Blend manipulates the application by only looking at a project’s XAML files, not its .cs files. Therefore, it is excruciatingly easy at this point to go ahead and wire the click event up AGAIN.

Just to make a point, let’s do just that: let’s wire up a second event handler to the button.

In Blend double-clicking in the click event box shown above wires up the event and creates the event handler stub in HomePage.xaml.cs. This sequence should open and/or foreground VS2005 to the created stub. We can add a call to goToExpenseReportPage() and the code builds fine but may, or may not, run fine. On some machines the second event handler fails when called because navigation has already moved to the second page, disposing of objects needed by the handler code so you get null reference errors.

If we examine HomePage.xaml.cs we note that it still shows only the first event handler, the one we created with VS, and shows no connection to the second, Blend-created handler.

If we examine the XAML file we find the button tag has a reference to the Blend-created handler (Click="btnView_Click") but no reference to the VS-created handler. To add more confusion, VS takes the Blend-created event handler reference and generates the expected hookup in the generated file obj\Debug\HomePage.g.cs:

    #line 81 "..\..\HomePage.xaml"
    this.btnView.Click += new System.Windows.RoutedEventHandler(this.btnView_Click);

Again the #line directive redirects debug and error output.

Given that we’re building our apps with a combination of Visual Studio and Blend what should we do?

Blend/VS2005 recommendations:

  1. Use Blend to wire the events and create stubs. This means the event will be seen in the element’s tag in the XAML.
  2. Use the default name for the event handler.

Number 1 implies that a typical WPF app will have event references attached to controls in the XAML file, the handlers in the code-behind file, and the actual event wire-up code in the generated .cs files (the code-behind-behind files).

If you look purely at the code-behind file (the .xaml.cs file above) you are going to see the handlers with nothing more than their names to hint at which control they bear on.

This is why I say (Number 2) renaming event handlers away from the default [control name]_[event name] pattern is a bad idea.

Why emphasize and favor using Blend? Currently Blend behaves in design mode the way we wish Visual Studio would. I’m sure this will all have to be revisited once Orcas ships but for now this is the approach I recommend.

XAML in WPF

At the beginning I made some claims that I would discuss XAML’s role from a more architectural point of view.

In architecture docs at MSDN I found two statements that seem to bear on this:

If you work the first half of Petzold’s Applications = Code + Markup, that is, the non-XAML part, you find that building a WPF app is properties, properties, properties. Set the 85 bagillion properties of a Window, of a Button, etc.

But look at it this way: properties are data. Data can be easily described by XML. Enter XAML. Once you have all the properties in your applications held in XAML it becomes really easy for any number of tools to traverse and modify this data.
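As a tiny concrete illustration (the element and values here are hypothetical, not from the demo app), every attribute in a XAML tag is just a property assignment expressed as data:

```xml
<!-- Each attribute is a property assignment. In code it would read:
     btn.Name = "btnView"; btn.Width = 120; btn.Content = "View Expense Report"; -->
<Button Name="btnView" Width="120" Content="View Expense Report" />
```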

We just demonstrated this by using Blend and Visual Studio to act on the same XAML files. The process is not perfect yet but it is a powerful technique I think we will see more and more often.

My little piece of Windows Vista

Posted in Campfire Stories, Windows Details on February 6th, 2007


“In addition to our summer and winter estate, he owned a valuable piece of land. True, it was a small piece, but he carried it with him wherever he went.”

From Woody Allen’s Love and Death.

So, what HAVE I been spending my time on? My little piece of the Windows Vista operating system.

For the last 20 months I’ve been building the digital locker assistant (DLA), a dedicated download client that works with Microsoft’s online digital locker, which is in turn part of Microsoft’s Windows Marketplace. Windows Marketplace is where that mysterious “Windows Catalog” link on your Start/Programs menu goes to.

Windows Marketplace supports direct browser based downloading. However, when the download is greater than 1-2 Gigabytes using the DLA is a much better way to go. The most popular use of the DLA so far has been buying and downloading entire copies of Windows Vista and Office 2007.

We were rather skeptical that users would want to download Vista or Office since they are really big downloads. However, earlier in the year the success selling and downloading super large games from Windows Marketplace convinced everyone that downloading Vista would be attractive to consumers. And indeed it has!

You can get the digital locker assistant two ways: If you have Windows XP, go to the Windows Marketplace website, create an account and download and install the MSI. It’s only a 1 meg download.

Or, it’s built into every copy of Windows Vista (except Server versions).

Actually building a part of Windows Vista was a huge effort but it’s really neat to install Vista and see my little piece in there. When I say “my,” it’s more like I’m using the Pluralis Majestatis, the Royal We. I was part of a team and We had LOTS of help.

I was the dev lead for the DLA for XP and Vista. Two very senior Windows developers with me at Vertigo, Chris Idzerda and Ralph Arvesen, rounded out the dev team (that is, they actually did most of the work). Initially, I was dev lead and PM but soon we needed more help with the process and got a full-time program manager, Anne Warren, who was also PM for the Windows Marketplace (WMP) website. The website dev team was some 15 developers and we had a build team of one (that should have been three). Our test team was in India so the dev/test cycle was almost 24/7, something like 24/6 – we’d hand off work in the afternoon and it would be tested all (our) night with a nice bug list waiting for us in the morning.

And then there’s the rest of the Vista team at Microsoft: really a cast of thousands. I think they ALL emailed me at least once. The High DPI functionality team. The Localization team (“do you know your UI looks really bad in Arabic?”), the Group Policy team, Remote Desktop team… you get the picture.

Let’s look at the app

In Vista there are two ways the digital locker assistant (DLA) may be invoked. The primary way is when you buy something at Windows Marketplace and it’s in your digital locker and you click “download.”

You may also browse to the DLA by finding it under the Vista Control Panel.

Then look under Programs or Programs and Features (using Classic view) and find “Manage programs you buy on line.” If you open this link you will invoke the DLA and if you have never sync’d up with your online digital locker you will see this:

If you have software in your online digital locker you can see it listed here by clicking on “Sign in if you already have a digital locker account.” Digital locker accounts are Windows LiveID (a.k.a. Passport) accounts mapped to a Windows Marketplace account. You’ll get a login prompt:

and after synchronizing with your online digital locker you’ll see all your purchased, free, and trial items listed. In my case (below) I clearly have a bunch of games in my locker. These were just to test downloading large items. Right.

Technology under the covers

The DLA is built in Win32/C++ as an ATL Windows application but we get some goodies from WTL as well. For those going “huh?” look at my post ATL and WTL resources.

At this point most people are asking me: why C++? Why not .NET and/or WPF? Or, if you’re using C++, why not MFC?

The DLA started (and is still available) as a downloadable application for XP. Our target users are what Alan Cooper would call “permanent beginners” (like that relative that always calls you for tech support…) — with a modem.

This means making the download as small as possible. Vertigo is a premier .NET shop but we could not use .NET because the 22 MB .NET runtime install kills us (that .NET never made it into the XP Service Packs… argh). Fortunately, we happen to have a few developers around (i.e., old geezers) who can do C++. We used ATL again to keep the size of the executable small.

In hindsight, it was just as well that the XP effort started in C++. Once we expanded the project to include being built into Vista we found that, in Windows System programming and the Vista source tree, C++ is expected and still king (See my post Has Microsoft flipped the Bozo bit on .NET? for a full discussion).

This meant that we could develop one source code base and, with some care, make it build in the Windows OS build system for Vista and VS2005 for XP.

Single source is nice but why not make a single binary that runs on Vista and XP? Sigh. We do — sort of, but it’s complicated.  From a programmer’s perspective, Vista makes one dramatic change from traditional Win32 applications and that’s in how localized resources are loaded.

To handle localization, traditional Windows practice is to create an RC file for all resources (dialogs, images, sounds, strings, keyboard shortcuts, etc.), which is compiled into a resource DLL. Localization teams produce localized RC files based on your master RC file, and these are all built into a suite of resource DLLs. At run time the application loads the appropriate resource DLL based on logic you have to write that looks at the calling thread’s locale settings.

Internal to the application is a language-neutral block of resources (typically English-US based) and if an appropriate external resource DLL cannot be found for the current locale settings, this internal block is used instead. This is known as “fallback” behavior.

Here’s the new twist in Vista: in Vista the OS loader (not the app) picks the resource DLL and locates it in memory where the app thinks its internal fallback resources are. This is expected behavior and currently only appears to work for a native Vista-built application so our “legacy” resource loading technique as used in XP was not acceptable to those who guard the Windows Source tree. Did I mention all the code reviews? Making Vista-style resource loading work in XP, while theoretically possible, was a task we did not choose to take on. So we ended up with one set of source code feeding two build processes; one for XP and one for Vista. Through careful coding there are remarkably few “if Vista do this, if XP do that” points in the code. 

While we currently block running the XP installer on Vista (in theory blocking installation of the XP DLA on Vista), it turns out that the XP DLA runs fine on Vista. I should not be surprised by this because we did quite a bit of casual sanity testing of it, but it was not initially part of the test matrix. We found out somewhat by accident as users upgraded their XP machines (where they had added the XP DLA) to Vista and then ran the XP DLA.

For our downloading mechanism we hand off all download jobs to Microsoft BITS (Background Intelligent Transfer Service). While BITS works well for us I still think Microsoft is tempting the gods by including “Intelligent” in their product name. BITS is the guts behind how Microsoft Updates are downloaded. I’ve also discovered that Google Updater uses BITS as well. What we gained by using BITS was automatic download management including background downloads, downloads that persist when our application is not running, downloads that seamlessly restart when the machine is rebooted, and lots of error-handling algorithms that we did not have to write or maintain. I’d use BITS again if needed. We did have to build a simple HTTP download as well because some modem-based accelerators do not play nice with BITS.

Overall it was a great experience. While it was sometimes chaotic and exhausting, it was a lot of fun too. 

I’d do it again.

Really.

After I’ve had a couple years to rest.

Using a UDL file to generate and test database connection strings

Posted in Windows Details on December 19th, 2006

One of the most common gotchas in getting any data-driven application working is the database connection string. How many times in your development life have you heard “I can’t connect to the database” and the problem ends up being a minor detail in the connection string?

When I was working on a project with IdeaBlade one of the developers showed me a neat trick: keep a UDL file on your desktop. 

While there are whole websites devoted to connection string details, a simple UDL file on a Windows system gives you a really easy way to configure and test a connection string directly against the database you want to use.

In Windows the extension “.udl” is registered as a “Microsoft Data Link” and the default program is OLE DB Core Services. Not very intuitive, so let me walk through the basics.

A UDL file is actually a text file so start by creating a text file on your desktop. Right-click on the desktop and select New and Text Document.  Note: I did this on Windows Vista but this works the same way on any modern version of Windows (I’ve checked as far back as NT 4.0).

Name your new file Connection.udl – or whatever, but you need the “.udl” extension. Ignore the “If you change a filename extension…” warning. The file’s icon will change to show it is no longer treated as a plain text file. Here’s how it looks on Vista

And XP

Now double click on (or right-click Open) the file.  You should see a familiar database connection configuration dialog.

To connect to your local SQLExpress database go to the Provider tab and select Microsoft OLE DB Provider for SQL Server.

Now go to the Connection tab and select the database to connect to. You may browse through the list, but SQLExpress instances do not always show up; you may also type the server name into the listbox. For SQLExpress it will usually be [machine name]\SQLEXPRESS; if you have a full SQL 2005 instance installed it will just be the machine name. In this case I’m connecting to the local SQLExpress database on my machine HAVOCVISTA.

Select the authentication you want to use. If using a username and password choose whether you would like to embed the password in the connection string.

Now select the particular database. If the dropdown is populated when you click on it you already have a good connection. If not, then the problem is going to be the name and/or the authentication provided. Assuming all’s well, select the database and click the Test Connection button.

Now close by clicking the OK button.

Here’s the “Ah Ha” step.

Now open the file in notepad – remember, the .udl file is simply a text file. You should see something like this:
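A UDL file’s contents look roughly like this. The provider and data source below match the choices made above, but the database name (Northwind) and the exact string are illustrative; yours will reflect whatever you configured:

```ini
[oledb]
; Everything after this line is an OLE DB initstring
Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=Northwind;Data Source=HAVOCVISTA\SQLEXPRESS
```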

In the file is the connection string you just tested. Copy and paste where needed in your application and you should be good to go.

ATL and WTL resources

Posted in Windows Details on December 18th, 2006

The story of ATL and WTL (fit for a cocktail napkin):

In the beginning (at least in *this* beginning), there was COM. As developers embraced COM programming for Windows applications, Microsoft created the Active Template Library (ATL) as a framework to simplify and envelop the routine tasks in the creation of COM components.

Using ATL, developers were happy with the ease with which small, fast components could be created. They were unhappy with all the code wrappers they had to write to use any Windows controls. And, of course, everyone wrote their wrappers differently, so this became an entropy generator on projects. While ATL does provide “Window” classes, they really don’t help much outside of COM; they’re mainly intended for COM control property pages.

In response, developers within Microsoft developed (in an unsupported way) the Windows Template Library (WTL). WTL extends ATL and provides a framework of light wrappers to use with Windows controls. Developers competent with ATL find WTL a great framework to quickly build small (in terms of KB EXE size) applications. Versions 7.0-7.5 are available from Microsoft but WTL has been released in the public domain and is maintained at SourceForge (see links below).

On a recent project I found these resources useful in catching up on how to work with ATL/WTL:

ATL resources:

Code:

ATL is part of the Windows SDK. 

Web:

New MFC and ATL Features - MSDN
ATL Samples - MSDN
ATL Server Samples - MSDN 

Code Project: Active Template Library (ATL)

Books:

ATL Internals: Working with ATL 8, 2nd Edition (New! July, 2006) (Amazon), by Christopher Tavares, Kirk Fertitta, Brent Rector, Chris Sells.  Very good but does not have any material on ways to build ATL Window apps without WTL (Note: I stand corrected, see Chris’s post below). Chris Sells’ web page for this book 

Beginning ATL 3 COM Programming (1999)(Amazon), by Julian Templeman, Richard Grimes, Alex Stockton, Karli Watson, George V. Reilly. This is “beginning” as in “beginning cliff diving” – has an ATL Windows app section.

Professional ATL Com Programming (1998) (Amazon), by Richard Grimes.

Developer’s Workshop to COM and ATL 3.0 (2000) (Amazon), by Andrew Troelsen.
 

WTL resources:

Code:

Windows Template Library (WTL) 7.0 4/2/2002
Windows Template Library (WTL) 7.1 12/9/2003
Windows Template Library (WTL) 7.5 6/13/2006
WTL open source at SourceForge. V 7.5 is released, 8.0 in development.

Web:

Chris Sells: Whitepapers, sample code and walkthroughs. Based on an old version of WTL but still the best starting point. WTL Makes UI Programming a Joy, part 1 and part 2, 6/2000 

Yahoo! Tech Group: WTL – online group discussion and help.

Code Project: Windows Template Library (WTL)
Code Project: Michael Dunn’s series on WTL for MFC Programmers is outstanding. Something like ten parts. Starts with WTL for MFC Programmers, Part I – ATL GUI Classes 

Books:

None as far as I know.

Has Microsoft flipped the Bozo bit on .NET?

Posted in Blogs on blogs, The Art of Programming on March 18th, 2006

My colleague Jeff Atwood also writes about this at his excellent Coding Horror.

No, Microsoft has not.

I have recent experience with software development in Vista (literally IN Vista). I work in a .NET shop but we did not build our Vista app in .NET. We used good old unmanaged Win32/C++.

Richard Grimes is a frequent contributor to many professional software journals and his Wrox books Beginning ATL COM Programming and Professional DCOM Programming saved my ass and made me look smart when I was building a DCOM based system working over satellite networking for Ford Motor Company in the late 90’s.

Richard has some great posts on .NET, on Vista, and specifically on .NET and Vista.

He shows, and laments, that there is virtually no .NET beyond the .NET runtime in Vista.  Further, he tracks that the amount of managed code within the .NET framework itself has measurably gone down with each release.

From this he concludes that Microsoft has lost confidence in .NET.

Grimes has missed the point. In building Vista, virtually every RECENT decision (over the last 18 months) has been driven by Microsoft ruthlessly pursuing stability and security. After umpteen intense code reviews this priority is now burned on my butt. At the same time Microsoft is also insisting on broad compatibility with existing applications and systems.

As Grimes demonstrates, the Microsoft OS (Vista) and application code base (Office, etc.) is almost entirely Win32/C/C++. What Grimes neglects to mention is that the Microsoft code base is also incredibly mature in terms of contained bug fixes and work-arounds.

Porting functionality to .NET would have directly increased security. However, porting any major function or system to .NET while also faithfully replicating all the complicated legacy details it contains is very expensive and error prone.

Microsoft does not have infinite resources and they cannot push hard on everything at the same time. Pursuing increased security and stability (in C/C++ those are directly and tightly coupled objectives) had the effect of reducing the goal of “more .NET” from a MUST into a “nice to have.”

The goal of a solid platform is worthy. Improving the security of the Windows platform is also a good idea. Looking at the history of Vista there is a clear pattern of tossing any new feature (WinFS, Avalon, …) if it cannot be guaranteed rock solid.

It’s gutsy that Microsoft has been willing to forgo attractive customer-facing features to pursue greater stability (which is only visible by a lack of instability…) and security (ditto).

It’s wimpy that they have not explained this well to the developer community.

The real issue Microsoft has to tackle is getting their marketing rhetoric in line with reality.

I do not entirely agree with Grimes but he represents a very important outside opinion that many eminent software developers share.

If Microsoft does not address the concerns of people like Grimes or the community he represents Microsoft could lose the “mindshare” race in the long run.