Category: Visual Studio

Fastest Disk for Running Virtual Machines?

I read a discussion the other day about a laptop review as well as comments about running virtual machines on a Solid State Disk (SSD).

The two comments that made me think were (to paraphrase):

1) “I heard running virtual machines can cause issues with your SSD”

2) “I don’t notice a large performance difference between running a virtual machine on an SSD and a normal disk”

These comments were of interest to me since I run my primary virtual machines off my SSD disk. Am I potentially trashing it for no performance improvement?

I put together my own speed test based on a few disks I have lying around.

The contenders are:

Disk                                       Connection   Comment
Intel X25-M SSD (160GB)                    SATA         My Solid State Disk in a laptop caddy
Western Digital 7,200 RPM 2.5” (200GB)     e-SATA       Generic external enclosure
Western Digital Passport (1TB)             USB 3        My latest external disk
Western Digital Passport (80GB)            USB 2        My first external disk, still going strong after six years

(From left to right) Intel X25-M SSD, generic WD e-SATA, WD Passport USB3, WD Passport USB2


This test is definitely not exhaustive, and applies only to Hyper-V and my particular laptop configuration.

In addition, I tested only some general aspects of the newly released Windows 8 (Developer Preview). The comparisons for the operations that interest you may be quite different; mine were made simply to see whether there was a trend between the underlying disks.

A better test would run similar comparisons across VirtualBox, VMware, Virtual PC, and so on.

Finally, the SSD had the advantage of being directly connected to the main SATA bus. I could have tested a normal disk in the caddy as well, but I felt e-SATA should be nearly as fast (if not just as fast).

Test Setup

I have created a 20GB Hyper-V disk file with Windows 8 installed. The same file is copied to each of the disks above.

The Hyper-V machine for each disk has 4 virtual processors and 4096MB of RAM. The configurations are identical.

Just to make things fair, I tested each machine twice by shutting down and then starting again.

The laptop was a Lenovo W520 with an i7 CPU and 16GB of RAM. The operating system was Windows Server 2008 R2 Standard Edition.

Nothing was running except for the Hyper-V process itself, and only the machine being tested was running.

A list of my Hyper-V machines. Windows 8 test machines at the bottom


I just tested and timed some simple operations in Windows 8 that anyone can do out of the box.


The results in seconds are below:

Test                                          Intel X25-M   WD e-SATA   WD Passport USB3   WD Passport USB2
Windows Startup
Windows Login
Launch Visual Studio 11
Build basic HTML5 solution in Visual Studio
Launch Expression Blend 5

There is a considerable speed advantage to using a Solid State Disk for running your Hyper-V virtual machines.

e-SATA still proved to be slightly faster than USB 3.

Surprisingly, USB 2 was not extremely slow compared to its USB 3 successor.

What Next?

I recognize that by using my SSD to run virtual machines, I am potentially reducing the life of the disk quite considerably.

At the time of writing an Intel SSD with 160GB is retailing at USD $300. Therefore the productivity advantage (for me) seems to outweigh the cost of the disk itself.

I will still run some machines (such as Active Directory and betas) on a ‘normal’ disk.

I’m also likely to use USB 3 more from now on. Although my e-SATA disks are a little faster, I find the connection more temperamental, and needing two cables (data + power) is inconvenient.

Bing Maps in my SharePoint? It is more likely than you think…

Actually, there are many mapping solutions for SharePoint, both commercial and open source.

What I didn’t understand was why it was so hard to find a ‘Hello World’ example for a JavaScript-based map.

Microsoft provide an interactive SDK that allows you to quickly get maps up and running in HTML.

To do the same in SharePoint 2010, I will use this SDK as the basis of an example WebPart built in Visual Studio 2010.

The main problem with integrating these examples is that the map has to be created after the page has finished loading, hence the definition of the <Body onload=””> tag.

It is really hard to add this onload attribute in SharePoint, since we do not have ready access to the body tag. In addition, by hacking away at the underlying template you might break something else.

The solution is not to use onload, but jQuery’s $(document).ready function, which will also call back once the document has finished loading.
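To see why the ready pattern is a better fit here, remember that a body onload attribute holds exactly one handler, while jQuery keeps a queue of callbacks. Here is a toy sketch of that queue idea (a simplified model for illustration, not jQuery’s actual implementation):

```javascript
// Toy model of jQuery's ready queue: <body onload=""> can hold only one
// handler, so two WebParts would overwrite each other's initializer.
// A callback queue lets every part register independently.
var readyQueue = [];
function ready(fn) { readyQueue.push(fn); }      // like $(document).ready(fn)
function fireDomReady() {                        // stands in for the DOM-ready event
    readyQueue.forEach(function (fn) { fn(); });
}

var calls = [];
ready(function () { calls.push('mapPart'); });   // our Bing Maps initializer
ready(function () { calls.push('otherPart'); }); // some other WebPart's script
fireDomReady();
// Both initializers ran, in registration order: ['mapPart', 'otherPart']
```

Because each WebPart simply appends its own initializer, none of them needs to own (or modify) the master page’s body tag.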

It does mean some extra configuration of your code however.. follow the steps below:

Step 1: Create a SharePoint WebPart project in Visual Studio 2010

Step 2: Add a SharePoint Mapped Folder to your project

Step 3: In the following dialog, add the Templates->Layouts folder

This will add a corresponding folder to your WebPart solution.

Step 4: Create a unique folder under Layouts, and add JavaScript files

Create a unique folder here (I have used the name ‘FlightTracks’).

In this folder, add the jQuery JavaScript file. Download from here. (Hint: use the uncompressed development version for easy debugging.)

Then create a new, empty JavaScript file such as ‘FlightTracks.js’.

Step 5: Add content to the custom JavaScript file

function GetMap() {
    map = new VEMap('myMap');   // 'map' is declared in the script the code-behind injects
    map.LoadMap();
}

$(document).ready(function () {
    GetMap();
});

Step 6: Add code to the UserControl’s CodeBehind file

        protected void Page_Load(object sender, EventArgs e)
        {
            HtmlGenericControl bingMapsScript = new HtmlGenericControl("script");
            bingMapsScript.Attributes["type"] = "text/javascript";
            bingMapsScript.Attributes["src"] = "";

            HtmlGenericControl jQueryScript = new HtmlGenericControl("script");
            jQueryScript.Attributes["type"] = "text/javascript";
            jQueryScript.Attributes["src"] = ResolveClientUrl("../../_LAYOUTS/FlightTracks/jquery-1.4.4.js");

            HtmlGenericControl flightTracksScript = new HtmlGenericControl("script");
            flightTracksScript.Attributes["type"] = "text/javascript";
            flightTracksScript.Attributes["src"] = ResolveClientUrl("../../_LAYOUTS/FlightTracks/FlightTracks.js");

            HtmlGenericControl script = new HtmlGenericControl("script");
            script.Attributes["type"] = "text/javascript";
            script.InnerText = "var map = null; ";

            // Register the script tags on the control so they render into the page
            Controls.Add(bingMapsScript);
            Controls.Add(jQueryScript);
            Controls.Add(flightTracksScript);
            Controls.Add(script);
        }


Please note that if you haven’t named your JavaScript folder ‘FlightTracks’, you will need to adjust the paths above accordingly.

Step 7: Define the HTML entry of the map

Bing Maps will take over a defined DIV to turn into a map. Add this line to your custom control (ascx) file:

<div id='myMap' style="position:relative; width:600px; height:400px;"></div>

ESRI Javascript API: Integration with ASP.NET

Update: May 5th, 2012: The CodePlex code I posted was not working because ESRI had changed the example it was based on. A new version has been updated on CodePlex.


Esri have great examples about how to write map based applications with their Javascript API.

The examples are easy when you are adept at:

  • JQuery
  • Dojo JavaScript library

As far as I can tell, there are not too many resources that tell you (for example) how to integrate with an ASP.NET page.

Therefore I have created a project to show how to do this.

My first project extends an Esri Javascript example and places the search results in an ASP.NET DataView.

In order to achieve this, I created a Visual Studio 2010 project that:

  1. Placed example into an ASP.NET page
  2. Parsed the ESRI map search result set
  3. Created a JSON data object with JQuery
  4. Passed the object back to the code behind after the search was completed
  5. Deserialized the JSON object into C# classes
  6. Bound the classes to the DataView

The code to get the results was simple enough. I just modified the ‘showResults’ function to build a JSON object, set the value to an input field and click a hidden button for a postback.

        function showResults(results) {

            //This function works with an array of FindResult that the task returns
            var symbol = new esri.symbol.SimpleFillSymbol(esri.symbol.SimpleFillSymbol.STYLE_NULL, new esri.symbol.SimpleLineSymbol(esri.symbol.SimpleLineSymbol.STYLE_SOLID, new dojo.Color([255, 255, 0]), 2), new dojo.Color([0, 0, 0, 0]));

            var myString = "[";

            //Build one JSON object per search result
            for (var i = 0, il = results.length; i < il; i++) {
                myString += "{\"ID\":\"";
                myString += results[i].feature.attributes.TLID;
                myString += "\",\"OWNER1\":\"";
                myString += results[i].feature.attributes.OWNER1.replace(/,/g, '');
                myString += "\",\"OWNER2\":\"";
                myString += results[i].feature.attributes.OWNER2.replace(/,/g, '');
                myString += "\",\"OWNER3\":\"";
                myString += results[i].feature.attributes.OWNER3.replace(/,/g, '');
                myString += "\",\"VAL\":";
                myString += results[i].feature.attributes.TOTALVAL;
                myString += "},";

                var graphic = results[i].feature;
            }

            // Trim the trailing comma and close the JSON array
            myString = myString.slice(0, -1);
            myString += "]";

            // Find the text box
            var test = dojo.byId("textbox1");

            // Set the input value to the JSON string
            test.value = myString;

            //Zoom back to the initial map extent

            // Click the hidden button to trigger a postback
            __doPostBack('Button1', '');
        }

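As an aside, the hand-built string could also be produced with JSON.stringify, which handles quoting, escaping, and commas automatically (it is native in modern browsers, or available via the json2.js shim). A sketch, with a mocked results array standing in for what the ESRI find task returns:

```javascript
// Build the same JSON payload with JSON.stringify -- no manual escaping
// or trailing-comma slicing needed. 'results' is mocked here; on the
// real page it comes from the ESRI find task callback.
var results = [
    { feature: { attributes: { TLID: "101", OWNER1: "SMITH, J",
                               OWNER2: "", OWNER3: "", TOTALVAL: 250000 } } }
];

var items = [];
for (var i = 0; i < results.length; i++) {
    var a = results[i].feature.attributes;
    items.push({
        ID: a.TLID,
        OWNER1: a.OWNER1.replace(/,/g, ''),  // strip commas, as in the original
        OWNER2: a.OWNER2.replace(/,/g, ''),
        OWNER3: a.OWNER3.replace(/,/g, ''),
        VAL: a.TOTALVAL
    });
}

var myString = JSON.stringify(items);
// '[{"ID":"101","OWNER1":"SMITH J","OWNER2":"","OWNER3":"","VAL":250000}]'
```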
On the server side, I deserialize the JSON string with the following:

    private List<CityProperty> GetProperties(string jsonString)
    {
        List<CityProperty> properties = new List<CityProperty>();

        JavaScriptSerializer ser = new JavaScriptSerializer();

        object o = ser.DeserializeObject(jsonString);

        object[] oList = o as object[];

        for (int i = 0; i < oList.Length; i++)
        {
            CityProperty cityProp = new CityProperty();

            Dictionary<string, object> foo = oList[i] as Dictionary<string, object>;

            cityProp.Id = foo["ID"] as string;
            cityProp.Owner1 = foo["OWNER1"] as string;
            cityProp.Owner2 = foo["OWNER2"] as string;
            cityProp.Owner3 = foo["OWNER3"] as string;

            object value = foo["VAL"];

            if (value != null)
                cityProp.TotalValue = (int)value;

            properties.Add(cityProp);
        }

        return properties;
    }

I have a feeling the deserialization process could be easier. There is a highly recommended library called JSON.NET, but since I was only using simple types I didn’t want to add the complexity of another dependency.

Clone Detective

If my brain was a CPU, it would definitely be single core. When I’m in the ‘zone’ of development, I focus on one task at a time.

When I can just feel that new feature around the corner, I really want to see it work as quickly as possible. In such a scenario I may repeat code from other places just to make it work, with a promise to consolidate it into a single place later on.

Enter Clone Detective. This add-in for Visual Studio analyses your C# code and determines whether any code is repeated. Really useful.

Check it out:


I don’t like linking to every single tool that comes my way. The question I always ask myself is “If my laptop crashed would I look for, download and install this tool again?”

Some tools are so unassuming and so obviously make sense that I have to recommend them. RockScroll, which Scott Hanselman has so kindly provided to the world outside of Microsoft, is one of them.

In an ideal world a C# class would fit onto your screen, since it should be short and functional. However, in the real world you are going to inherit bad code, and also look at generated code from time to time. You can easily spend your day navigating one monster file. Hence RockScroll:


This is also good for those really long SQL files and PowerShell scripts.


Software Development Lifecycle in a Box

A new website from Microsoft presents some guidance around an example project lifecycle.

It is interesting because it is not so much about Visual Studio and Team Foundation training. It tries to demonstrate the entire software lifecycle, and mixes Microsoft and third party tools as a method to achieve this.

Quite often on software projects, the process is in fact just the tools and nothing else. It is good to see Microsoft pushing the bigger picture here.

WPF Part 2 – Using a Grid Control

UPDATE: 04/24/2009

After a year this still gets a few views! I’ve updated the code to make it look better.


UPDATE: 12/21/2007

Hello Channel9 ppl 🙂

I have no idea why this article has got into a debate about web design!

Anyway, this is not about a complete user interface. I might address complex user interfaces later, but frankly my original ‘pain point’ was the lack of a simple grid demo in WPF.

Please don’t read this as more than it is supposed to be ^_^


In Part 1, I discussed the need for a compelling front end to your demonstration.

Here is my first concept.. just some basic WinForms UI elements required for the job at hand.

Winforms Look n’ feel

I was happy to see this work because I knew the WCF communication code behind, and I saw this as a manifestation of that. 

However, would you show this to a customer?

Hence, I decided to go with a WPF frontend for my next iteration:
WPF Aero Look n’ feel

It still needs some work to become a good ‘User Interface’, but this is already 100% better on the eyes.

Doc with XAML code embedded

To give you an idea how easy it is, here is the XAML code I defined to add a progress bar to the Infragistics Grid Control above:

<Grid Width="{TemplateBinding Width}" Height="{TemplateBinding Height}">
	<ProgressBar Minimum="0" Maximum="100" Value="{Binding RelativeSource={RelativeSource TemplatedParent}, Path=Content}" ToolTip="{Binding RelativeSource={RelativeSource TemplatedParent}, Path=Content}" MaxHeight="20"/>
</Grid>

And here is another snippet, showing how to add a button, with a click event:

<Grid Width="{TemplateBinding Width}" Height="{TemplateBinding Height}">
	<Button Height="26" Width="26" Tag="{Binding RelativeSource={RelativeSource TemplatedParent}, Path=Content}" ToolTip="{Binding RelativeSource={RelativeSource TemplatedParent}, Path=Content}" Click="Button_Click">
		<Image Source="D:\Projects\avasmall.png"/>
	</Button>
</Grid>

This was all done in the newly released Visual Studio 2008.

Get into WPF – Part 1

Wouldn’t it be great if you could create a nice WCF Services demo, and everyone loved it?

Well, that is the problem.. I wanted a spartan interface because I wanted to show off the services and NOT the UI.

Nevertheless, after showing it to a few people, I realised that the ‘plain’ Windows Forms client actually put people off. An attractive UI, although irrelevant to my goal of demonstrating the services, has the effect of making people more interested.

Hence I wanted something to look good, and I thought I would give WPF a shot. As a learning curve, it was actually pretty easy to get into, especially if you are used to putting together ASP.NET web pages.

The important thing is creating a good XAML structure, which is given equal prominence with the GUI design window. I’m sure VB developers will hate this, but frankly the source is a great deal cleaner to me than the way it used to be done. In the past, if I made a mistake in generated Visual C++ MFC or C# WinForms code, it was often easier to throw the form away and start from scratch.

Anyway I’m lazy… I could spend a week going through WPF tutorials, or I could dive right in. I discovered that Infragistics have a free Grid control for WPF:

And that Derek Harmon had a good example program about how to configure some controls inside it:

It took me about an hour to be a WPF data grid designer 🙂

Later this week I’ll show the steps required to build the Infragistics DataGrid with custom controls.

Visual Studio 2008 Project Compatibility

UPDATE: 16th December: I felt it fair to update the post with my findings


At my company we have a great DSL based tool for generating WCF Services.

However, because of various reasons, the tool will only run under Visual Studio 2005 and not the all new and great Visual Studio 2008.

That is a pity, because I wanted to show off a new concept and hence wanted to use the WPF designer in 2008 to make an impression for the client.

Well, I thought I was in luck! Apparently I can use the same projects in VS 2005 and VS 2008, and vice versa.

So you don’t have to abandon VS 2005 to try out the new features in VS 2008 in your projects… this is good stuff Microsoft!

I should have read the article more closely: I opened up my VS 2005 C# library project in VS 2008.. and it gave me an upgrade wizard 😦

Luckily I was able to use my assemblies from VS 2008 with no trouble, but it could have been so much better.

The State of Visual Studio Team System

This article from November 20th has a nice critique of how Team Foundation Server and Team System are faring on the advent of the VS 2008 launch.

Microsoft has made a great start with Team Foundation Server, and it has everyone’s attention in the Microsoft space.

The question is really how much momentum they can keep behind it. The lack of any integration tools with existing software project management / quality systems could well prove to be a problem in the long run.

Still, buying up other companies that make great tools, like TFS Web Access, is a good way to impress customers.

If they could make it easier to install, back up and administer, that would also be a great step forward. TFS 2005 was frankly too scary for any small project, and it seems TFS 2008 does address these issues to an extent.