September 2008 Archives

SQL Server 2008 has a great new feature called Table-Valued Parameters (TVPs). I think there are a lot of applications for this new technology, but what about using it with some of Redmond's other new inventions, namely the Entity Framework (EF) and LINQ to SQL? Specifically, can you drag and drop a stored procedure that takes a TVP from the Server Explorer onto the Entity Data Model Designer (in the case of EF) or the Object Relational Designer (in the case of LINQ to SQL)? The answer to this question is yes, but with a catch. Once you try to build the solution, you'll get one of the following compilation errors:

  • DBML1005: Mapping between DbType 'Structured' and Type 'System.Object' in Parameter 'source_key_list' of Function 'dbo.stp_GetCustomerSK' is not supported.
  • The function 'stp_GetCustomerSK' has a parameter 'source_key_list' at parameter index 0 that has a data type 'table type' which is not supported, the function was excluded.

The first error is what you'll get when using TVPs with LINQ to SQL, and the second is what you'll get when using EF. Both were reported by Visual Studio 2008 SP1.
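Until the designers catch up, one workaround is to bypass the generated code and call the stored procedure with plain ADO.NET, which does support TVPs against SQL Server 2008. The following is only a minimal sketch: the parameter name comes from the errors above, but the connection string, the user-defined table type (dbo.SourceKeyList), and its column are my assumptions about the schema.

using System.Data;
using System.Data.SqlClient;

// Build a DataTable whose shape matches the user-defined table type.
// The column name here is hypothetical; it must match the table type's definition.
var sourceKeys = new DataTable();
sourceKeys.Columns.Add("SourceKey", typeof(int));
sourceKeys.Rows.Add(1);
sourceKeys.Rows.Add(2);

using (var connection = new SqlConnection(connectionString)) // connectionString is assumed
using (var command = new SqlCommand("dbo.stp_GetCustomerSK", connection))
{
    command.CommandType = CommandType.StoredProcedure;

    // AddWithValue alone would infer the wrong type; marking the parameter as
    // Structured and naming the table type is what makes SqlClient send a TVP.
    var tvp = command.Parameters.AddWithValue("@source_key_list", sourceKeys);
    tvp.SqlDbType = SqlDbType.Structured;
    tvp.TypeName = "dbo.SourceKeyList"; // hypothetical table type name

    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // process each returned customer surrogate key
        }
    }
}

Of course, this gives up the strongly typed goodness that the designers would otherwise provide, but it does work today.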

This state of affairs reinforces my opinion about these two ORM technologies. I think EF is an immature version-one technology that should be avoided in most cases. (Try back with version two or three.) As for LINQ to SQL, I think it should only be used on products with a short life span, considering that Microsoft is no longer investing any R&D dollars in it and will probably deprecate it in the next release of the .NET Framework.

This week my colleague and I were trying to run a dozen load tests in sequence. What we wanted was to kick off a list of tests, go home, and come back the next day to the results. Unfortunately, this is not possible with Visual Studio 2008 SP1. Apparently, there is a bug (one that doesn't seem to have been fixed in the first service pack) that prevents you from selecting multiple load tests in the Test View and executing them all. Dennis Stone suggested in a Microsoft forum post that you can work around this bug by queuing up the load tests one by one. The problem with this proposal, however, is that it only works if you are using a remote test controller; the local controller only supports executing one load test at a time, so a second can't be queued after the first has started.

The workaround that we found to execute the load tests one after the other was to use a batch file. Once we learned how to run a load test from the command line, it was short work to put together a script that would slave away for us all night. One gotcha that nabbed us was omitting the test run configuration information. Without it, our auxiliary files weren't copied to the proper directory and some of our tests failed, causing our task to take two nights instead of one :-(

So, to run the load tests in order, call MSTest from a batch script like this:

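rem Each mstest.exe invocation blocks until its run completes, so the tests execute sequentially.
rem The /runconfig switch is the part we missed at first; without it, auxiliary files aren't deployed.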
mstest.exe /runconfig:localtestrun.testrunconfig /testcontainer:LoadTest1.loadtest
mstest.exe /runconfig:localtestrun.testrunconfig /testcontainer:LoadTest2.loadtest
mstest.exe /runconfig:localtestrun.testrunconfig /testcontainer:LoadTestN.loadtest

After MSTest completes, the test results will be stored in the load test database, and you will be able to analyze them as usual using Visual Studio or SSRS.

I was listening to an episode of Hanselminutes the other day about unit testing frameworks, and one of the panelists, Brad Wilson from xUnit.NET, mentioned something that really caught my attention. He said that "...there's an interesting product that most people don't really think about called SoftGrid, which...allows you to create a semi-private/semi-virtualized environment inside of an existing copy of Windows." To me, it sounded like chroot for Windows. After some digging, I've learned that that is a pretty good comparison (if you have that frame of reference). I've also learned that Microsoft has purchased SoftGrid and has RTMed a new version under the name App-V. As you can read on the App-V product page, this lightweight virtualization solution provides a pseudo-environment for applications to execute within. Specifically, this per-user make-believe world includes a virtual registry, file system, GAC (and DLLs in general), COM, IPC, configuration files, and fonts. (Virtualized IPC is mind-blowing to me. I wonder how that is done.)

The idea Wilson set forth in that episode of Scott's talk show was to use this tool for testing. Rather than using a CI application like TeamCity or ABS to start and stop entire virtual machines, Wilson suggested using these CI tools in conjunction with App-V to "chroot" (if you will) a section of the test machine for verification of a candidate build. This would hasten the setup of test environments. While I see Wilson's point and agree that this tool deserves a good look to learn exactly how it can be used in concert with existing testing frameworks, I have a couple of points I'd like to raise:

  1. Starting virtual machines isn't a time-consuming task, especially when they're being automatically instantiated as part of a build process that takes place late at night.
  2. Licensing of App-V could be costly. (I don't know what Microsoft is charging or how the licensing model works.)
  3. It's new and unknown, whereas machine-based virtualization platforms like VMware are well understood (relatively speaking).

There are other interesting applications besides testing. For example, it can potentially lower the TCO for large application deployments. Specifically, virtualized applications could be packaged as MSIs and pushed to the workstations via some sort of distribution system (e.g., AD). Later, when a user clicks the shortcut on their desktop, the application is streamed to them from the virtual application server. Once enough of the bytes arrive, the application starts and the results are cached in some administrator-configured caching location. These are just two possible ways to use this technology.  

I doubt that I'll ever mess with App-V more than this cursory investigation, but it highlights the paradigm shift that we've recently made in the computing industry as virtualization continues to redefine the way software is created, tested, managed, deployed, and used. Just think about the levels of virtualization in use today: entire computers are virtualized (i.e., hardware virtualization); applications written in high-level languages run in virtual machines (VMs) such as the JVM, the .NET CLR, V8, and TraceMonkey; and applications themselves are being virtualized with products such as App-V. Virtualization is being used at every level and in an increasing number of ways.

As I've said before, it sure is an exciting time to be a programmer!