A Fun ValidationAttribute Bug


I tweeted about a bug that I recently helped fix in System.ComponentModel.DataAnnotations.ValidationAttribute. As I said, it's a bug resulting from code I wrote for that class years ago. I was honestly surprised there was any interest in this, but there was! Since I piqued your interest, I thought it only fair that I quench your thirst and show you the details of the bug.

History and Backwards Compatibility

The first implementation of ValidationAttribute had a method with the following signature:

public abstract bool IsValid(object value);

 

Any class that inherited from ValidationAttribute had to override the IsValid method and put the validation logic in place.  This was fine and dandy until .NET 4.0 when I worked on a set of features to introduce context-aware validation attributes using new classes called ValidationContext and ValidationResult.  Using ValidationContext, validation attributes could perform complex business logic using application services or even calls into a database.  With this, we wanted to add an overload to IsValid to allow the following signature:

public abstract ValidationResult IsValid(object value, ValidationContext validationContext);
 

Of course we couldn’t add a new abstract method to an already-shipped class, as that would break every existing implementation.  So instead, we looked into adding the following:

public virtual ValidationResult IsValid(object value, ValidationContext validationContext) {
    ValidationResult result = ValidationResult.Success;
    
    if (!this.IsValid(value)) {
        string[] memberNames = validationContext.MemberName != null ? new string[] {
            validationContext.MemberName
        } : null;
        
        result = new ValidationResult(
            this.FormatErrorMessage(validationContext.DisplayName),
            memberNames);
    }

     return result;
}

 

This introduced a new problem: new attributes that want to use the ValidationContext must now override both overloads of IsValid, and that would be rather confusing.  We wanted new attributes to only have to override the ValidationContext-based IsValid overload.  To get there, we’d change the old boolean-based IsValid method from abstract to virtual and document that it should no longer be overridden.  We’d change that method to the following:

public virtual bool IsValid(object value) {
    // Call the ValidationContext-based method and if it's successful, return true
    return this.IsValid(value, validationContext: null) == ValidationResult.Success;
}

 

This is as far as I had gotten in the code before I introduced the bug.  We’ll cover that next.
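To make the intended pattern concrete, here’s a minimal sketch of a derived attribute (the attribute name and validation rule are invented for illustration) that overrides only the ValidationContext-based overload and never touches the boolean-based one:

```csharp
using System;
using System.ComponentModel.DataAnnotations;

// Hypothetical attribute for illustration; only the ValidationContext-based
// overload is overridden.
public class EvenNumberAttribute : ValidationAttribute
{
    protected override ValidationResult IsValid(object value, ValidationContext validationContext)
    {
        if (value is int && ((int)value) % 2 == 0)
        {
            return ValidationResult.Success;
        }

        string[] memberNames = validationContext.MemberName != null
            ? new string[] { validationContext.MemberName }
            : null;

        return new ValidationResult(
            this.FormatErrorMessage(validationContext.DisplayName),
            memberNames);
    }
}

public static class Program
{
    public static void Main()
    {
        var attribute = new EvenNumberAttribute();
        var context = new ValidationContext(new object()) { DisplayName = "Number" };

        // ValidationResult.Success is null, so a null result means valid.
        Console.WriteLine(attribute.GetValidationResult(4, context) == ValidationResult.Success);
        Console.WriteLine(attribute.GetValidationResult(5, context) == ValidationResult.Success);
    }
}
```

Note that the caller goes through GetValidationResult, the preferred public entry point, rather than either IsValid overload directly.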

Ensuring One Overload is Overridden

This is an unusual situation.  We want to introduce a new overload that calls into the original method if it’s overridden.  But we want to make the original method virtual and have it call into the new overload if it’s overridden.

Let’s state that again, because it can be confusing:

  1. If the original method is overridden, have the new overload’s base implementation call into it
  2. If the new overload is overridden, have the original method’s base implementation call into it

A third way of stating it is:

  1. Allow implementers to override either method
  2. Call into the overridden method from whichever base implementation remains

Needless to say, there’s a risk of infinite recursion (each base implementation calling the other) that we need to prevent too.  The way I solved this was to use a private field and a lock to track whether a base implementation was in the middle of making a call to an overridden implementation.  You can see this in the .NET Framework reference source for the System.ComponentModel.DataAnnotations assembly’s ValidationAttribute class.  Here’s the snippet too:

/// <summary>
/// Gets the value indicating whether or not the specified <paramref name="value"/> is valid
/// with respect to the current validation attribute.
/// <para>
/// Derived classes should not override this method as it is only available for backwards compatibility.
/// Instead, implement <see cref="IsValid(object, ValidationContext)"/>.
/// </para>
/// </summary>
/// <remarks>
/// The preferred public entry point for clients requesting validation is the <see cref="GetValidationResult"/> method.
/// </remarks>
/// <param name="value">The value to validate</param>
/// <returns><c>true</c> if the <paramref name="value"/> is acceptable, <c>false</c> if it is not acceptable</returns>
/// <exception cref="InvalidOperationException"> is thrown if the current attribute is malformed.</exception>
/// <exception cref="NotImplementedException"> is thrown when neither overload of IsValid has been implemented
/// by a derived class.
/// </exception>
#if !SILVERLIGHT
public
#else
internal
#endif
virtual bool IsValid(object value) {
    lock (this._syncLock) {
        if (this._isCallingOverload) {
            throw new NotImplementedException(DataAnnotationsResources.ValidationAttribute_IsValid_NotImplemented);
        } else {
            this._isCallingOverload = true;

            try {
                return this.IsValid(value, null) == null;
            } finally {
                this._isCallingOverload = false;
            }
        }
    }
}

#if !SILVERLIGHT
/// <summary>
/// Protected virtual method to override and implement validation logic.
/// <para>
/// Derived classes should override this method instead of <see cref="IsValid(object)"/>, which is deprecated.
/// </para>
/// </summary>
/// <param name="value">The value to validate.</param>
/// <param name="validationContext">A <see cref="ValidationContext"/> instance that provides
/// context about the validation operation, such as the object and member being validated.</param>
/// <returns>
/// When validation is valid, <see cref="ValidationResult.Success"/>.
/// <para>
/// When validation is invalid, an instance of <see cref="ValidationResult"/>.
/// </para>
/// </returns>
/// <exception cref="InvalidOperationException"> is thrown if the current attribute is malformed.</exception>
/// <exception cref="NotImplementedException"> is thrown when <see cref="IsValid(object, ValidationContext)" />
/// has not been implemented by a derived class.
/// </exception>
#else
/// <summary>
/// Protected virtual method to override and implement validation logic.
/// </summary>
/// <param name="value">The value to validate.</param>
/// <param name="validationContext">A <see cref="ValidationContext"/> instance that provides
/// context about the validation operation, such as the object and member being validated.</param>
/// <returns>
/// When validation is valid, <see cref="ValidationResult.Success"/>.
/// <para>
/// When validation is invalid, an instance of <see cref="ValidationResult"/>.
/// </para>
/// </returns>
/// <exception cref="InvalidOperationException"> is thrown if the current attribute is malformed.</exception>
/// <exception cref="NotImplementedException"> is thrown when <see cref="IsValid(object, ValidationContext)" />
/// has not been implemented by a derived class.
/// </exception>
#endif
protected virtual ValidationResult IsValid(object value, ValidationContext validationContext) {
    lock (this._syncLock) {
        if (this._isCallingOverload) {
            throw new NotImplementedException(DataAnnotationsResources.ValidationAttribute_IsValid_NotImplemented);
        } else {
            this._isCallingOverload = true;

            try {
                ValidationResult result = ValidationResult.Success;

                if (!this.IsValid(value)) {
                    string[] memberNames = validationContext.MemberName != null ? new string[] { validationContext.MemberName } : null;
                    result = new ValidationResult(this.FormatErrorMessage(validationContext.DisplayName), memberNames);
                }
                return result;
            } finally {
                this._isCallingOverload = false;
            }
        }
    }
}

You’ll notice a fun detail: for Silverlight (we cross-compile this code to .NET and Silverlight), we made the original method internal instead of public.  ValidationAttribute was a brand new class for Silverlight, so there was no reason to even introduce the legacy method on the public surface area; Silverlight would only have the ValidationContext-based approach.

Locks Are Bad

So this is where the code landed.  I had it code reviewed by about a dozen smart people—all much smarter than me in fact.  We all felt sick to our stomachs about it, but we couldn’t think of a better way to accomplish it.  I got code review sign-off, checked in, and this code has been in place for several years now.

Recently though, our team that helps service older code found that this lock I created is quite the bottleneck when validating a lot of attributes.  I don’t know the specific scenario, but it doesn’t really matter—that lock that I used was a bad idea and it’s causing a performance bottleneck for some customers.  It needed to be fixed.

A Cleaner Approach

A couple of weeks ago, Miguel Lacouture worked up a much cleaner approach to solving this problem that doesn’t use locks.  He asked if I would code review his fix since I had written the original code.  I had some review feedback for him, but his approach seems significantly superior to what I had come up with long ago.  With the feedback I sent him, here’s the proposed new implementation for these two methods:

private bool _hasBaseIsValid = false;
private bool _hasBaseIsValidWithContext = false;

public virtual bool IsValid(object value) {
    // Track that this overload wasn't overridden
    if (!_hasBaseIsValid) {
        _hasBaseIsValid = true;
    }

    // That means that the other overload must be overridden
    // And if it hasn't been, then throw a NotImplementedException
    if (_hasBaseIsValidWithContext) {
        throw new NotImplementedException(DataAnnotationsResources.ValidationAttribute_IsValid_NotImplemented);
    }

    // We know the other overload was overridden
    // So call it to produce the result
    return this.IsValid(value, null) == null;
}

protected virtual ValidationResult IsValid(object value, ValidationContext validationContext) {
    // Track that this overload wasn't overridden
    if (!_hasBaseIsValidWithContext) {
        _hasBaseIsValidWithContext = true;
    }

    // That means that the other overload must be overridden
    // And if it hasn't been, then throw a NotImplementedException
    if (_hasBaseIsValid) {
        throw new NotImplementedException(DataAnnotationsResources.ValidationAttribute_IsValid_NotImplemented);
    }

    // We know the other overload was overridden
    // So call it to produce the result
    ValidationResult result = ValidationResult.Success;

    if (!this.IsValid(value)) {
        string[] memberNames = validationContext.MemberName != null ? new string[] {
            validationContext.MemberName
        } : null;

        result = new ValidationResult(
            this.FormatErrorMessage(validationContext.DisplayName),
            memberNames);
    }

    return result;
}

The general idea is that the lock was simply unnecessary: we can just record which of the base implementations still exist, and then detect when both of them get invoked.  The lock was guarding against multiple threads temporarily flipping a bit to true while a base implementation was calling into an overload.  But the new approach only ever turns its switches on, so multiple threads coming in won’t cause any problems.
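Here’s a standalone sketch of that flag-based dispatch (the class names, the string result type standing in for ValidationResult, and the demo are invented for illustration; this isn’t the framework code itself).  It shows that a derived class overriding only one overload still gets called from the other, and that overriding neither produces the NotImplementedException:

```csharp
using System;

// Simplified stand-in for ValidationAttribute: a string result plays the
// role of ValidationResult, where null means success.
public abstract class Validator
{
    private bool _hasBaseIsValid;
    private bool _hasBaseIsValidWithContext;

    public virtual bool IsValid(object value)
    {
        // Reaching this base implementation means this overload wasn't overridden.
        _hasBaseIsValid = true;

        // If the other base implementation has also run, neither overload is overridden.
        if (_hasBaseIsValidWithContext)
        {
            throw new NotImplementedException();
        }

        return this.IsValid(value, null) == null;
    }

    public virtual string IsValid(object value, object context)
    {
        _hasBaseIsValidWithContext = true;

        if (_hasBaseIsValid)
        {
            throw new NotImplementedException();
        }

        return this.IsValid(value) ? null : "invalid";
    }
}

public class EvenValidator : Validator
{
    // Overrides only the original boolean-based overload.
    public override bool IsValid(object value)
    {
        return value is int && ((int)value) % 2 == 0;
    }
}

public class EmptyValidator : Validator
{
    // Overrides neither overload.
}

public static class Program
{
    public static void Main()
    {
        var even = new EvenValidator();
        Console.WriteLine(even.IsValid(2, null) == null); // base dispatches to the override
        Console.WriteLine(even.IsValid(3));               // the override runs directly

        try
        {
            new EmptyValidator().IsValid(1);
        }
        catch (NotImplementedException)
        {
            Console.WriteLine("NotImplementedException");
        }
    }
}
```

The flags persist for the lifetime of the instance, which is exactly the kind of state the post invites you to poke holes in.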

Look Good?

This code is unique—ensuring that 1 of 2 overloads is overridden—and the existing bug has been there for years.  What do you think of the new implementation?  See any holes in it?  Of course, Miguel and the rest of his team will test the heck out of the new implementation, but if you see any issues with it, let me know.

Kudos to Miguel for finding a novel approach to solving the problem.  I wish I’d thought of this approach back when I was writing this code for RIA Services.

author: Jeff Handley | posted @ Tuesday, April 15, 2014 1:29 AM | Feedback (2)

Common NuGet Misconceptions: Package Restore


Package Restore is one of NuGet’s most popular features.  This has been especially true since NuGet 2.7 introduced Automatic Package Restore in Visual Studio and the nuget.exe restore command for simple command-line package restoration.  We hear many compliments about the feature and how it is transforming the way developers reference 3rd party libraries.  However, we also hear quite a few “bug reports” of Package Restore failing to perform a couple of functions that it was never designed to perform.  This post is to clear up the common misconceptions about Package Restore.

Content Files Are Not Restored

This is probably the number one misconception about NuGet all-up, let alone just Package Restore.  Many people think that NuGet Package Restore will bring back packages’ content files into their project folders when those content files have been omitted from source control.

It won’t.

Package Restore checks the solution’s /packages folder to ensure that every package referenced by the project(s) exists there.  Any package that is missing gets downloaded and unpacked into the /packages folder.  That is all.  Package Restore has never inspected packages to see what content files they include and then copied those files into your project when they are missing.  And it won’t.

Why Not?

There are several reasons.  To be brief, I’ll cover them tersely here:

  1. Adding content files to your project is a design-time action, not a build-time action
  2. Some packages add content files that you’re expected to edit – while many packages’ contents files are not meant to be edited, NuGet doesn’t presently have a way to differentiate the two scenarios
  3. Some packages use the source code transformation feature of NuGet to replace tokens in the content files with project properties, which wouldn’t be possible during Package Restore
  4. Packages’ content files can vary by target framework – knowing which target framework’s content files to copy into your project requires inspecting the project to know its target framework, which is beyond the scope and capability of Package Restore
  5. Some packages use install.ps1 PowerShell scripts to further manipulate content files after NuGet initially adds them to the project, and PowerShell scripts aren’t run during Package Restore either

Long story short, manipulating project content files is beyond the scope of Package Restore.  With the frequency at which I’ve been hearing about this misconception recently, I expect we’ll end up adding a feature to restore content files into projects at some point.  It looks like this issue on CodePlex is the best match if you’d like to upvote it.

PowerShell Scripts Aren’t Run

Wait a second!  Up in #5 above, did you say that install.ps1 PowerShell scripts aren’t run during Package Restore either‽

I sure did.  Misconception number two about Package Restore is that people expect it to run PowerShell scripts (specifically install.ps1) when restoring packages.

It won’t.

As stated above, Package Restore simply downloads and unpacks any missing packages in the /packages folder.  That is all.  Package Restore has never executed PowerShell scripts for packages after restoring them.  And it won’t.

Why Not?

Again, there are several reasons.  And again, here they are in terse form:

  1. Installing packages into a project is a design-time action, not a build-time action
  2. Install.ps1 scripts are meant to be run one time as a post-processing step of installing a package and aren’t meant to be run again
  3. Running an install.ps1 script requires that:
    1. The project be loaded in Visual Studio
    2. The NuGet PowerShell Console has been initialized, creating a PowerShell workspace primed with NuGet’s PowerShell module from the Visual Studio extension
    3. Visual Studio’s DTE object is exposed in the PowerShell workspace as well, so that the install.ps1 can access it to manipulate the project (or anything else in Visual Studio)

Besides the points made by bullets 1 and 2, the requirements for bullet 3 won’t be met either.  At least not from the nuget.exe restore command-line Package Restore.  The requirement would be met for Automatic Package Restore in Visual Studio, but again with bullets 1 and 2, there’s no need to execute the install.ps1 script again.

Now, there is a caveat around packages that contain an Init.ps1 script.  NuGet also doesn’t run any Init.ps1 scripts from packages after executing Package Restore in Visual Studio.  Instead, users have to re-open the solution to have the Init.ps1 scripts executed.  That is really just an oversight and a bug, and we plan to fix it in NuGet 2.9.  Here’s the issue on CodePlex for that.

Download and Unpack

NuGet Package Restore is in place to simply download and unpack any missing packages.  The primary reason for this is that projects hold assembly references pointing into the /packages folder.  Take a really basic package, NuGet.Core, for example.  This package includes NuGet.Core.dll in the \lib folder.  When installing the NuGet.Core package into a project, the project will end up with a reference to something like ../packages/NuGet.Core.2.7.2/lib/net40-Client/NuGet.Core.dll.  But if you decide to omit the /packages folder from source control, that reference will fail at build time.  To alleviate that, NuGet Package Restore runs before build to download and unpack NuGet.Core into the /packages folder, putting NuGet.Core.dll in place before msbuild goes looking for it.

The end result is that your build-time reference needs are taken care of by Package Restore.  Project files are never modified.  Project contents are never touched or restored.  Install.ps1 scripts are never executed.  Your packages folder is simply rehydrated.  Nothing more.


author: Jeff Handley | posted @ Monday, December 9, 2013 4:09 PM | Feedback (3)

Open RIA Services – Let the fun begin!


On September 19th, I made the initial contribution to Open RIA Services, committing the RIA Services codebase. The project is officially under way, giving developers even greater input into a framework that they love and reinforcing existing investments.

Right off the bat, I’d like to thank you for your patience during this process. I’d also like to thank Colin Blair, the Open RIA Services project lead, for his patience and persistence. Colin and I reached an informal agreement over 11 months ago that he would be the Open RIA Services project lead if Microsoft would donate the source code to the Outercurve Foundation. Since then, Colin has been preparing project plans, planning work items, and building the community around the project to ensure a successful launch. Well done, and thank you, Colin!

WCF RIA Services vs. Open RIA Services

As I mentioned when I first announced the open-sourcing of RIA Services, there is a distinction between Microsoft’s “WCF RIA Services” and the new “Open RIA Services” at Outercurve Foundation. Both are great technologies for creating n-tier applications with the .NET platform, and both will continue to empower developers to use the architecture that they prefer.

WCF RIA Services

Microsoft supports your existing investments in WCF RIA Services and Silverlight. The latest version of Silverlight will be supported for 10 years, and WCF RIA Services will be supported too. WCF RIA Services V1.0 is included with Visual Studio 2013 for customers building new and existing applications. We will ensure that you can bring your existing applications forward into new versions of Visual Studio with compile-time Silverlight code generation, as well as the runtime functionality that makes Domain Services tick. WCF RIA Services is alive and well with Microsoft standing behind it.

Open RIA Services

The Open RIA Services project will be led by Colin Blair, a Microsoft MVP who has focused on RIA Services for several years. Colin has created a project site at http://www.openriaservices.net, as well as the CodePlex project site at http://openriaservices.codeplex.com. The published roadmap lists better Silverlight 5 compatibility, change notifications, NHibernate, Multi-Client support, a new code generation strategy, and many more features across versions 4.x, 5.x, and 6.x. There’s a lot to do, and the project can use your help. We look forward to seeing how Open RIA Services will grow in the future!


author: Jeff Handley | posted @ Monday, October 14, 2013 8:01 AM | Feedback (1)

RIA Services Silverlight NuGet Packages


As we look forward to the Open RIA Services project, we want to make sure we’re helping lay out a transition path for those who want to make the switch.  As part of that transition plan, we have just published two new NuGet packages for RIA Services: the Silverlight Client and the Silverlight DomainDataSource control.  These packages can be used instead of adding references from the WCF RIA Services MSI installation.  By referencing these libraries via NuGet instead of from the MSI, you’ll be better poised to switch to Open RIA Services packages when they exist, and you’ll have fewer dependencies on the bits included in the MSI.

These two Silverlight NuGet packages round out the NuGet-based offering of RIA Services.  We now have the server-side and Silverlight client-side libraries all available, allowing you to build RIA Services applications without needing the MSI installed on your machine.  To see all of the packages we’ve published, you can view the profile page for the ‘riaservices’ user on nuget.org at https://nuget.org/profiles/riaservices/.

RIAServices.Silverlight

RIAServices.Silverlight provides the System.ServiceModel.DomainServices.Client and System.ServiceModel.DomainServices.Client.Web assemblies for your Silverlight projects.

https://nuget.org/packages/RIAServices.Silverlight/

RIAServices.Silverlight.DomainDataSource

RIAServices.Silverlight.DomainDataSource provides the DomainDataSource control for Silverlight, providing a declarative model for binding UI controls to a DomainContext.

https://nuget.org/packages/RIAServices.Silverlight.DomainDataSource/


author: Jeff Handley | posted @ Monday, July 15, 2013 3:21 PM | Feedback (2)

RIA Services is Getting Open-Sourced


It’s about time, right‽  In fact, it has been 3 and a half years since I first declared that getting RIA Services open-sourced was my stretch goal. Since then, I’ve seen dozens of forum posts, hundreds of tweets, and over 13,000 page-views for my original declaration.  There was even a time during a LIDNUG call when Scott Guthrie was directly asked what it was going to take to get RIA Services open-sourced.  This has been an important topic to a lot of people for a long time, and I am finally happy to announce it’s happening!

Open RIA Services


Let’s get right to it and cover some salient details:

  1. Microsoft will donate the source code of RIA Services to the Outercurve Foundation under the project name of “Open RIA Services”
  2. We expect to make this code donation later this summer—the legal process has been underway for several months and is nearing completion
  3. Open RIA Services will be submitted into the ASP.NET Open Source Gallery alongside NuGet, Orchard, DotNetOpenAuth, xUnit.net, and others
  4. The source code will be hosted on CodePlex at http://openriaservices.codeplex.com
  5. Colin Blair will be the project lead, and he will be driving the project toward the multi-version project plan he put together, with more information available at http://openriaservices.net/
  6. The Apache 2 license will be applied and the project will accept contributions
  7. Several other community members have voiced interest in contributing to the project and growing it beyond what it’s capable of today
  8. Microsoft will collaborate with the Open RIA Services project to ensure it gets off to a strong start and developers are able to successfully transition if desired

WCF RIA Services

Separately from the open source project, Microsoft will continue to support WCF RIA Services V1.0.  This is the product that shipped in the box with Visual Studio 2010 SP1, Visual Studio 2012, and is also shipping in the box with Visual Studio 2013.  We will service WCF RIA Services to ensure that the product enables you to bring your existing applications forward into new versions of Visual Studio with compile-time Silverlight code generation, as well as the runtime functionality that makes Domain Services tick.

In addition to being included with Visual Studio 2013, we also published several NuGet packages to http://nuget.org back in December of 2012, under the riaservices account.  You can read more about those packages from my post when they went out.  Many developers are choosing to use these NuGet packages to provide easier bin-deployment of applications and to get support for Entity Framework 5.0.0, Windows Azure, and the SOAP and JSON endpoints, among other features.

Visual Studio 2013

For Visual Studio 2013, we have reduced the design-time capabilities of WCF RIA Services, with the following design-time features being removed:

  1. The “Business Application Project” template
  2. The “Domain Service Class” template/wizard
  3. Toolbox items for the Silverlight designer surface
  4. Data Sources Window integration for the Silverlight designer surface

These feature reductions from Visual Studio 2013 were made with Open RIA Services in mind.  Any design-time features included in Visual Studio 2013 would have been tied to WCF RIA Services and would have made transitions to Open RIA Services more challenging.  To help provide a smooth transition from WCF RIA Services to Open RIA Services, these design-time features were removed and we are encouraging developers to use the aforementioned NuGet packages to build their RIA Services projects.  Then when the Open RIA Services project publishes its set of NuGet packages, the transition should be relatively smooth and there won’t be any design-time features that break down.

Note that even though the Business Application Project template has been removed, you can still create a new RIA Services project—just create a Silverlight application using one of the other project templates and choose to enable the RIA Services link for the project.  To create a new Domain Service class, you will just need to include the appropriate namespaces and derive from the DomainService class that suits your data access layer.
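For example, a minimal hand-written Domain Service might look like the following sketch (the class and entity names are invented, and it assumes the RIA Services server assemblies are referenced, so treat it as illustrative rather than a drop-in sample):

```csharp
using System.ComponentModel.DataAnnotations;
using System.Linq;
using System.ServiceModel.DomainServices.Hosting;
using System.ServiceModel.DomainServices.Server;

public class Product
{
    [Key]
    public int Id { get; set; }
    public string Name { get; set; }
}

// Deriving directly from DomainService; for Entity Framework models,
// LinqToEntitiesDomainService<TContext> is the more typical base class.
[EnableClientAccess]
public class ProductDomainService : DomainService
{
    public IQueryable<Product> GetProducts()
    {
        // Replace with calls into your data access layer.
        return new[] { new Product { Id = 1, Name = "Sample" } }.AsQueryable();
    }
}
```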

Additionally, WCF RIA Services fully supports round-tripping through Visual Studio 2010, 2012, and 2013, which allows you to create a new project in Visual Studio 2010 or Visual Studio 2012 and then work on it in Visual Studio 2013, Visual Studio 2012, or Visual Studio 2010.  The design-time features listed above are all fully supported in Visual Studio 2010 and Visual Studio 2012.

Entity Framework 6.0.0

With Entity Framework 6.0.0, namespaces changed and there were quite a few other changes that affected how RIA Services interacts with it, breaking the RIAServices.EntityFramework package.  Knowing that Open RIA Services was going to kick off, it seemed prudent to make that one of the first goals of the Open RIA Services project.  We will work with the Open RIA Services project to help make that happen around the same time that Entity Framework 6.0.0 reaches RTM.

Thank You!

I would like to thank every one of you who has asked me over the years about getting RIA Services open-sourced.  Without the strong demand you have voiced, I’m not sure this would have happened.

I would also like to thank Colin Blair.  Colin has long been the biggest advocate for RIA Services and he is famous for answering more questions on the forums than anyone else (including the entire product team); Colin really understands RIA Services through and through.  Beyond that, Colin invested significant time into a project plan for RIA Services and he has illustrated that the future is bright for Open RIA Services.  Colin’s investment and commitment to RIA Services has been vital to the project’s success and future possibilities, and I tip my hat to him.  If you would like to contribute to Open RIA Services, please contact Colin Blair and check out the new project site at http://openriaservices.net/.


author: Jeff Handley | posted @ Wednesday, July 3, 2013 12:49 PM | Feedback (14)