Monthly Archives: April 2010

Windows Azure

The Windows Azure Platform has a rich diagnostics infrastructure to enable logging and performance monitoring in the cloud.

When you run a cloud app locally in the DevFabric, you can view diagnostic trace output in the console window that is part of the DevFabric UI. For example, say I have the line

System.Diagnostics.Trace.TraceInformation("Old Trace called at {0}.", DateTime.Now);

That line gives a result like this in the DevFabric UI:

Screenshot of DevFabric UI

If you can’t see the DevFabric UI, you can enable it here after starting your cloud app from Visual Studio:

Screenshot of the option to enable the DevFabric UI

Using the DevFabric UI is the most basic way of viewing Trace output. Looking at each individual console window doesn’t scale well if you have many instances, and, furthermore, the console window of an instance is not available in the cloud. For those scenarios there is a special TraceListener-derived class: Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener. It sends the trace output to a Windows Azure diagnostics monitor process that can store the messages in Windows Azure Storage. Here is a look at a trace message using the Azure Storage Explorer:

Screenshot of Azure Storage Explorer
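
These messages only end up in storage when the diagnostics monitor is running and scheduled to transfer the logs. In a web role this is typically done in OnStart; here is a minimal sketch against the SDK 1.x API (the one-minute transfer interval is my choice, and "DiagnosticsConnectionString" is the setting name the Visual Studio templates use):

using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Start from the default configuration and have the trace logs
        // transferred to Windows Azure Storage every minute.
        var config = DiagnosticMonitor.GetDefaultInitialConfiguration();
        config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
        DiagnosticMonitor.Start("DiagnosticsConnectionString", config);
        return base.OnStart();
    }
}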

This trace listener is enabled through web.config:

  <system.diagnostics>
    <sources>
      <source name="Diag">
        <listeners>
          <add name="AzureDiagnostics" />
        </listeners>
      </source>
    </sources>
    <sharedListeners>
      <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
          name="AzureDiagnostics" />
    </sharedListeners>
    <trace>
      <listeners>
        <add name="AzureDiagnostics" />
      </listeners>
    </trace>
  </system.diagnostics>
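
If you prefer code over configuration, the same listener can also be added programmatically; a minimal sketch (the class and method names are mine):

using System.Diagnostics;
using Microsoft.WindowsAzure.Diagnostics;

public static class AzureTraceSetup
{
    public static void RegisterListener()
    {
        // Equivalent to the <trace><listeners> registration in web.config above.
        Trace.Listeners.Add(new DiagnosticMonitorTraceListener());
    }
}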

You can also see a message “TraceSource called at …”. This message was output using a TraceSource instance:

private static readonly TraceSource ts = new System.Diagnostics.TraceSource("Diag", SourceLevels.All);

protected void TraceMeButton_Click(object sender, EventArgs e)
{
    ts.TraceEvent(TraceEventType.Information, 2, "TraceSource called at {0}.", DateTime.Now);
    System.Diagnostics.Trace.TraceInformation("Old Trace called at {0}.", DateTime.Now);
}

However, note that the similar message “TraceSource called at …” didn’t show up in the DevFabric UI. You might wonder what is going on. You might also wonder why I want to use a TraceSource instead of Trace in the first place. The reason is that this MSDN article states:

One of the new features in the .NET Framework version 2.0 is an enhanced tracing system. The basic premise is unchanged: tracing messages are sent through switches to listeners, which report the data to an associated output medium. A primary difference for version 2.0 is that traces can be initiated through instances of the TraceSource class. TraceSource is intended to function as an enhanced tracing system and can be used in place of the static methods of the older Trace and Debug tracing classes. The familiar Trace and Debug classes still exist, but the recommended practice is to use the TraceSource class for tracing.

Also check out this blog post.
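
One practical benefit of TraceSource over the static Trace class is per-source filtering. For example, you could add a switchValue to the source defined above so that only warnings and errors get through (a sketch; the level is my choice):

  <source name="Diag" switchValue="Warning">
    <listeners>
      <add name="AzureDiagnostics" />
    </listeners>
  </source>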

The reason you don’t see the message from the TraceSource in the DevFabric UI is that the DevFabric magically adds a special TraceListener for the “old-fashioned” Trace class, but not for your TraceSource instance. I put together a cloud app solution (Visual Studio 2010) that shows this through a simple web role, which has the configuration you see above in its web.config file. If you run this simple web role in the DevFabric, you’ll see:

Screenshot of sample web role app

Note that Trace has a Microsoft.ServiceHosting.Tools.DevelopmentFabric.Runtime.DevelopmentFabricTraceListener instance registered, while the TraceSource doesn’t. To remedy this, I’ve created a small class that adds the DevFabricTraceListener to a TraceSource if one is registered for Trace:

using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

public static class TraceSourceFixer
{
    private const string DevFabricTraceListenerFullName = "Microsoft.ServiceHosting.Tools.DevelopmentFabric.Runtime.DevelopmentFabricTraceListener";

    // Copies the DevFabric trace listener that the DevFabric registered for
    // the static Trace class over to the given TraceSource, unless the
    // TraceSource already has one.
    public static void AddDevFabricTraceListener(TraceSource traceSource)
    {
        var alreadyInTraceSource = GetDevFabricTraceListeners(traceSource.Listeners);
        if (alreadyInTraceSource.Any())
            return;

        var alreadyInTrace = GetDevFabricTraceListeners(Trace.Listeners);
        var devFabricTraceListener = alreadyInTrace.FirstOrDefault();
        if (devFabricTraceListener != null)
        {
            traceSource.Listeners.Add(devFabricTraceListener);
        }
    }

    private static IEnumerable<TraceListener> GetDevFabricTraceListeners(TraceListenerCollection listeners)
    {
        return listeners.Cast<TraceListener>().Where(IsDevFabricTraceListener);
    }

    // The listener type lives in a DevFabric assembly we don't reference,
    // so it is matched by full type name rather than by a compile-time type.
    private static bool IsDevFabricTraceListener(TraceListener listener)
    {
        return listener.GetType().FullName == DevFabricTraceListenerFullName;
    }
}
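
In the sample, a minimal click handler for the Register DevFabric Listener button wires this up (the handler name is mine; ts is the TraceSource field shown earlier):

protected void RegisterDevFabricListenerButton_Click(object sender, EventArgs e)
{
    TraceSourceFixer.AddDevFabricTraceListener(ts);
}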

If you click the Trace Me button after pressing Register DevFabric Listener, you’ll see both trace messages show up in the DevFabric UI:

Screenshot of DevFabric UI detail

You can download my solution DiagnosticsService.zip to try it yourself.

Wow, April 2010 proves to be a jam-packed month of releases from Microsoft. A short list that is bound to be incomplete:

  • .NET Framework 4
  • Visual Studio 2010
  • Office 2010
  • SharePoint 2010
  • Data Protection Manager 2010
  • System Center Essentials 2010
  • Dynamics GP 2010
  • Enterprise Library 5.0
  • SQL Server 2008 R2

SQL Server 2008 R2 banner

The trial versions of SQL Server 2008 R2 are now available for download. Here are the highlights of what’s new in 2008 R2:

  • PowerPivot: a managed self-service analysis solution that empowers end users to access, analyze and share data across the enterprise in an IT-managed environment, using Excel 2010 and SharePoint Server 2010.
  • Master Data Services: helps IT organizations centrally manage critical data assets companywide and across diverse systems, and enables more people to securely manage master data directly, and ensure the integrity of information over time.
  • Application and Multi-server Management: helps organizations proactively manage database environments efficiently at scale through centralized visibility into resource utilization and streamlined consolidation and upgrade initiatives across the application lifecycle.
  • Report Builder 3.0: report authoring component with support for geospatial visualization. This new release provides capabilities to further increase end user productivity with enhanced wizards, more powerful visualizations, and intuitive authoring.
  • StreamInsight: a low latency complex event processing platform to help IT monitor, analyze and act on the data in motion to make more informed business decisions in near real-time.

More details about the release can be found on the team blog.


The Microsoft Enterprise Library has always been one of the most popular things to come out of the patterns & practices team. Yesterday p&p reached a major milestone by releasing version 5.0 of EntLib.

The improvements are too numerous to sum up here, but let me mention one: this release has full .NET 3.5 SP1 and .NET 4 compatibility and works great from both Visual Studio 2008 SP1 and Visual Studio 2010 RTM.

Full details can be found in Grigori Melnik’s blog post on this release. Or you can go straight to the download page or the documentation.

Long before I joined the company, Brad Abrams was one of the first people who put a human face on Microsoft for me.

I felt sad when I read his blog post announcing that Brad is leaving Microsoft. But at the same time I feel happy about all he has accomplished for the company. Check his blog post for everything he has been involved in. Not on that list, but no less important in my mind, is the book Framework Design Guidelines that he co-authored.

Brad, I wish you all the best in your next endeavors.

.NET Framework in Windows Azure

As you probably know, Visual Studio 2010 and .NET Framework 4 will RTM on April 12, 2010, and will be available for download on MSDN Subscriptions Downloads the same day.

The Windows Azure team is committed to making .NET Framework 4 available in Windows Azure within 90 days of the RTM date.

A lesser-known fact is that the latest available Windows Azure build already has a .NET 4 version installed, namely the RC bits. Although this version cannot yet be used to run applications on (.NET 4 is not yet exposed in the Windows Azure dev tools), you can use this build to test whether the presence of .NET 4 has any impact on existing .NET 3.5 apps running on Windows Azure.

Read the official announcement here.

[Updated 2010-04-25: The lightweight view for MSDN has been fixed, so the content of this post is no longer relevant]
[Updated 2010-04-07: Updated one screenshot, because the lightweight view has changed]

MSDN has recently switched the default online view mode to “lightweight view”. Although this dramatically improves loading times for the online docs, this currently breaks parts of the Windows Azure Documentation. For example, when you view http://msdn.microsoft.com/en-us/library/ee393295(v=MSDN.10).aspx in lightweight mode, the namespaces are not hyperlinks and the menu is broken:

Screenshot of the broken lightweight view

To fix this, you have to click on the “Preferences” link in the upper right corner of the page:

Screenshot of the Preferences link

Choose “Classic” on the page that appears:

Screenshot of the view preference page with “Classic” selected

Click OK, and you’ll get back to a view that actually works:

Screenshot of the classic view

We are working on getting the “lightweight view” fixed for this part of the Azure documentation.

Visual Studio 2010 Ultimate logo

Last week we published a new major version of the Visual Studio Performance Testing Quick Reference Guide. The effort of creating this document was led by Geoff Gray, a senior consultant in the Microsoft Services Labs in the US who specializes in performance testing using Visual Studio. I was part of the Rangers team that contributed articles to this guide.

From time to time I do performance testing engagements. Last week, for example, I worked for a customer and load tested their new Internet-facing website before it goes live later this month. Performance testing uncovers issues in the software and the infrastructure that might otherwise go unnoticed until the public hits the site, and that is a bad time to discover performance issues.

Creating and executing real-world performance tests can be a tricky job, and this quick reference guide gives lots of tips & tricks from consultants from Microsoft Services and the Visual Studio product team on how to tackle challenges you might encounter.

Visual Studio has a rich extensibility model for performance testing, and that’s why it is interesting for development-minded people like me. You can create plug-ins in C# to do advanced parameterization of web tests and load tests (example on MSDN). The QRG has lots of examples of when and how this is useful.
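
To give a flavor of that extensibility, here is a minimal web test plug-in sketch (the class name and context parameter are mine, not from the guide); it puts a per-iteration value into the test context, which requests can then reference as {{MyTimestamp}}:

using System;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class TimestampPlugin : WebTestPlugin
{
    // Called once before each iteration of the web test.
    public override void PreWebTest(object sender, PreWebTestEventArgs e)
    {
        // Expose a unique value to the requests as {{MyTimestamp}}.
        e.WebTest.Context["MyTimestamp"] = DateTime.UtcNow.Ticks.ToString();
    }
}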

Load testing is part of Visual Studio 2008 Team Suite or Test Edition, and of Visual Studio 2010 Ultimate. With VS2010, if you need to simulate more than 1,000 users, you can purchase “virtual user packs”. These enable you to simulate higher loads using a test rig consisting of one test controller and several test agents; with VS2008 you previously had to license each agent. You can’t be sure in advance how many users you can simulate with one agent, because that depends on many factors like hardware and the complexity of the application and the test itself.

Many organizations forgo the effort of performance testing and take a “wait and see” stance. This reactive approach can lead to costly repairs and bad publicity. My advice is to take a proactive stance and be confident about the performance of your application before it goes live. In many cases there are quick wins that improve performance, such as enabling better caching (BLOB caching for SharePoint, for example). Even simple things like enabling HTTP compression are still often forgotten.
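
To illustrate how small such a quick win can be: on IIS 7, compression is a one-element addition to web.config (a sketch; dynamic compression requires the corresponding IIS module to be installed):

  <system.webServer>
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>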