Code Coverage via OpenCover and ReportGenerator

Visual Studio has code coverage built in, but only if you run VS Premium or better. Sometimes this isn't available, which is what sent me looking for a free alternative.

OpenCover fits this goal perfectly. The output isn't as pretty as VS, but no worries: ReportGenerator integrates with OpenCover's output and creates some really nice reports. Below is the output and the simple steps that worked for me.

Summary page, with clickable links to details for each class.
Code coverage summary of Monitored Undo Framework

Class level summary, including cyclomatic complexity and per-method details.
Code coverage summary of a class.


Source code lines with highlighting for quick visual inspection.
Code coverage source code highlighting.

Getting the Tools

As with most things these days, NuGet is the answer. There are packages for both of these tools that pull the bits down into your project's "packages" folder.

PM> install-package OpenCover
PM> install-package ReportGenerator

This creates some version-specific folders in your .\packages folder. In my case, "OpenCover.4.5.1604" and "ReportGenerator.". Awesome! Couple that with a solution that uses NuGet package restore and you don't even have to check them in, which makes Git happy!

Generating the Reports

To get the report, you basically need to do 3 steps:

  1. Build the solution (and restore nuget packages)
  2. Run unit tests via OpenCover
  3. Run ReportGenerator on the output

Let's break that down… We'll use my "Monitored Undo Framework" project as a real-world example. I added the NuGet packages and enabled NuGet Package Restore on the solution.

For step 2, I crafted a command line that tells OpenCover to run my MSTest unit tests, saving the output to a "report" directory. (Note: The caret "^" simply allows the statement to span multiple lines, rather than one long single-line statement.)

REM Run unit tests through OpenCover
REM This allows OpenCover to gather code coverage results
REM (The OpenCover path and flags below are reconstructed; adjust the
REM package version and paths to match your own layout.)
.\packages\OpenCover.4.5.1604\OpenCover.Console.exe^
 -register:user^
 -target:mstest.exe^
 -targetargs:"/noresults /noisolation /testcontainer:..\releases\Latest\NET40\MonitoredUndoTests.dll"^
 -output:.\report\output.xml

The above statement passes the arguments needed by OpenCover and MSTest to make it all happen. In this case, I have a single unit test project/dll and I run all tests, so it's simple. If you have NUnit unit tests, you simply substitute the alternate "-target" and "-targetargs" parameters to OpenCover.
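As a hypothetical example (the package versions, paths, and test dll name here are illustrative, not from my actual project), an NUnit-based run might look like:

```batch
REM Hypothetical NUnit variant -- adjust package versions and paths to your setup.
.\packages\OpenCover.4.5.1604\OpenCover.Console.exe^
 -register:user^
 -target:.\packages\NUnit.Runners.2.6.2\tools\nunit-console.exe^
 -targetargs:"MyProject.Tests.dll /noshadow"^
 -output:.\report\output.xml
```

The key point is that OpenCover doesn't care which runner it hosts; it just profiles whatever "-target" process you give it.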

For step 3, I pass the output of OpenCover into ReportGenerator:

REM Generate the report
REM (ReportGenerator path and arguments reconstructed; adjust the package
REM version and path to match your install.)
.\packages\ReportGenerator.1.9.1.0\ReportGenerator.exe^
 -reports:.\report\output.xml^
 -targetdir:.\report
The above uses the "output.xml" file, plus some command line parameters to create a pretty nice looking code coverage report. ReportGenerator does some nice things to make these reports extra awesome.

  • It reviews the PDB to locate the relevant source code.
  • It pulls the source into the report, with green / red highlights to show what's covered.
  • It calculates coverage percentages and cyclomatic complexity.
  • It presents the info in a nice, hierarchical table with bar charts showing the coverage percentage for each class.

Putting it All Together

Getting this to work via a single (ok double) click looks like the following batch file, which I put in the same directory as my solution. In my case, I happen to have a different output directory than usual, but you can adjust to your tastes.

REM Bring dev tools into the PATH.
call "C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\Tools\VsDevCmd.bat"

mkdir .\report

REM Restore packages
msbuild .\.nuget\NuGet.targets /target:RestorePackages

REM Ensure build is up to date
msbuild "..\src\Monitored Undo Framework.sln"^
 /property:Configuration=Release

REM Run unit tests through OpenCover
REM This allows OpenCover to gather code coverage results
REM (The OpenCover/ReportGenerator paths and versions below are reconstructed;
REM adjust them to match your packages folder.)
.\packages\OpenCover.4.5.1604\OpenCover.Console.exe^
 -register:user^
 -target:mstest.exe^
 -targetargs:"/noresults /noisolation /testcontainer:..\releases\Latest\NET40\MonitoredUndoTests.dll"^
 -output:.\report\output.xml

REM Generate the report
.\packages\ReportGenerator.1.9.1.0\ReportGenerator.exe^
 -reports:.\report\output.xml^
 -targetdir:.\report

REM Open the report
start .\report\index.htm


Next Steps

If you're ambitious, you could set this up to run with each build by simply adding it as a build step in your projects. And with the files under your full control, you could do whatever other processing / checks you'd like to do as part of a CI build too.

For larger projects, you'll find the other command line parameters useful. You can easily:

  • Narrow down what coverage results are included in the report. This could be helpful to exclude 3rd party code, unit tests, etc.
  • Filter the list of tests that are actually executed. MSTest lets you filter by "category" or "trait" using the "/category:" parameter. You can even use logical operators to include / exclude traits as needed. The trick is getting the command line to properly handle the embedded quotes. Using two double quotes next to each other seems to do the trick: "MSTest.exe [...] /category:""!Performance&!Web Service Integration"""

Until next time...


author: Nathan Allen-Wagner | posted @ Thursday, August 15, 2013 4:49 PM | Feedback (11)

Healthy Debate on “Software Craftsmanship” and Elitism

A good debate between Ted Neward and Uncle Bob Martin regarding "Software Craftsmanship".

I find the debate to be very worth the (long) read.
- Ted points out that we need to not be Elitist and egotistical about our approach and skills. Amen.
- Bob's goal is to institute some principles that help the industry mature; that help us all improve.

Ted's Original Article:
"On The Dark Side Of Craftsmanship"

Bob's Response:
The Craftsman And The Laborer

Ted's Reply:
More On Craftsmanship

Bob's Reply:
The Humble Craftsman

Ted's Final Thoughts:
Last Thoughts On Craftsmanship

Bob's clarification of Craftsmanship (not a direct reply, but related):
The Principles Of Craftsmanship

Can we find a way to move things forward without thumping our own chests and putting others down? Can we improve ourselves and find pride in our work without being too inwardly focused? Are we able to “Git’r done” with some “low quality code” if that’s what the client really needs? I think so, but it takes awareness of the issues and the world around us. It requires that we pull our heads out of the code sand to see what’s appropriate for the task at hand.

GK Chesterton loved paradox. I think that is fitting here. We are best when we can simultaneously embrace both seemingly opposing sides… when we can embrace a seeming paradox. We find balance, not by watering either side down, but by letting both positions flourish.

In the end, I appreciate the above debate because it forced some skeletons out of the closet. And as the last response from Bob shows, it forced some clarification and specificity around this “hot new thing” called the Craftsmanship movement.

Thanks Ted and Bob!


author: Nathan Allen-Wagner | posted @ Monday, February 11, 2013 8:00 AM | Feedback (4)

Going Offline-Online with VS Missed a File

I really appreciate Visual Studio’s “Go Offline” with TFS feature. Saves me a bunch of headache when I’m disconnected. And for the most part it works great.

Yesterday, I discovered that somehow, the “Go Online” had not picked up all my changes. Why?

It seems that the “Go Online” logic traverses the solution tree to find files that it should consider. That’s all well and good, but I have some files that are conditionally included in the solution. VS always seems to treat these as not part of the solution. But they are in source control and they are part of the compilation.

I edited these files while offline and then, when going online, they were skipped. Now TFS had a different copy than the one on my local machine. Had I not seen some strange behavior, I probably never would have caught it.

My mistake, and yes, I see that this is how the feature works, but I still stumbled on this one.


author: Nathan Allen-Wagner | posted @ Thursday, December 13, 2012 8:07 PM | Feedback (0)

Standing to Code

Today I tried a Standing Desk for the day. I've heard about this a number of times, mostly on Twitter. The idea's not new, but it is gaining attention due to the health concerns around so much sitting.

I started standing at 7am. It's currently 3pm, and I'm still standing here. So far so good. My muscles are a little tired, but my mind feels more awake than most other days!

Another thing I'm noticing is that it's a lot easier to switch gears. If I need to step over for a cup of coffee, I'm already standing. No big deal. And getting back to coding is just as easy. Just walk up and get to it. No need to try and find that comfortable position in the chair, scoot it up to the desk, etc.

As for my desk… Well, it's the same desk I usually use… plus some boxes and a board to elevate my laptop to standing height.

Cost = $0. Adding years to my life and energy to my day… priceless.

(I'll report back soon to let you know how things are going after a week of standing)


author: Nathan Allen-Wagner | posted @ Tuesday, March 20, 2012 3:12 PM | Feedback (3)

SqlTransaction Ain't Always Transactional

Q: "SQL run using a SqlTransaction is part of a SQL Server Transaction and can be rolled back."

A) True
B) False

The answer is of course "A", right? That's what I thought too.

Basic SqlTransaction Usage

Most of us assume that we can write this:

using (var con = new SqlConnection(".."))
{
  con.Open();
  using (var tran = con.BeginTransaction())
  {
    var cmd = con.CreateCommand();
    cmd.CommandText = "Update dbo.Customer ...";
    cmd.Transaction = tran;
    cmd.ExecuteNonQuery();
    tran.Commit();
  }
}

This is plain and simple SqlTransaction based logic. Exec the command. If it completes without error, call Commit(). If it has an issue, then call Rollback();

But ... What Does This Do?

I challenge you to predict the behavior of the following "Retry" scenario. The code may seem a bit odd… Why would you execute more SQL after an exception? But it shows what one might do if the code implemented Retry semantics inside a transaction.

For retries, one might want to open a transaction, run a set of statements, and then retry any statements that had failed. The SQL transaction should remain in force in spite of any errors thrown by SQL server. For example, a foreign key violation does not automatically abort the transaction. Rather, the code would catch the FK violation and queue it for subsequent execution after completing the 1st pass.
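Sketched in C# (illustrative only -- `statements` and the `SqlRunner.RunSQL` helper stand in for whatever execution plumbing you use; `SqlRunner` is defined in the full listing below), the retry idea looks roughly like this:

```csharp
// Hypothetical retry-inside-a-transaction sketch, not production code.
using (var con = new SqlConnection(connectionString))
{
   con.Open();
   using (var tran = con.BeginTransaction())
   {
      var retryQueue = new Queue<string>();

      // 1st pass: run everything, queueing failures (e.g. FK violations) for later.
      foreach (var sql in statements)
      {
         try { SqlRunner.RunSQL(con, tran, sql); }
         catch (SqlException) { retryQueue.Enqueue(sql); }
      }

      // 2nd pass: retry the failed statements inside the same transaction.
      while (retryQueue.Count > 0)
         SqlRunner.RunSQL(con, tran, retryQueue.Dequeue());

      tran.Commit();
   }
}
```

This works on the assumption that a failed statement leaves the transaction alive -- which, as we'll see, is not always true.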

This does in fact work… most of the time. I say most of the time, because deadlock errors in SQL Server cause some interesting behavior.

Here's a snippet that should easily run in LinqPad without too much work. The only pre-requisite is to create a DB in SqlExpress named "Scratch".

Pay close attention to lines 144-145. How will they behave after the deadlock? What is their impact on the database?

   1:      void Main()
   2:      {
   3:         var connectionString = @"Data Source=.\sqlexpress;Initial Catalog=Scratch;Integrated Security=True";
   5:         var sqlSchema = @"
   6:           DROP TABLE dbo.orders;
   7:           DROP TABLE dbo.customer2;
   8:           DROP TABLE dbo.customer;
  10:           CREATE TABLE [dbo].[customer](
  11:              [customerid] [int] NOT NULL,
  12:              [firstname] [nvarchar](256) NULL,
  13:              [lastname] [nvarchar](256) NULL,
  14:              CONSTRAINT [PK_customer] PRIMARY KEY CLUSTERED
  15:                 ([customerid] ASC) ON [PRIMARY]
  16:           ) ON [PRIMARY];
  18:           CREATE TABLE [dbo].[orders](
  19:              [orderid] [int] NOT NULL,
  20:              [customerid] [int] NOT NULL,
  21:              [shippingid] [int] NOT NULL,
  22:              [otherid] [int] NULL,
  23:              CONSTRAINT [PK_orders] PRIMARY KEY CLUSTERED
  24:              ([orderid] ASC) ON [PRIMARY]
  25:           ) ON [PRIMARY];
  27:           ALTER TABLE [dbo].[orders] WITH CHECK
  28:              ADD CONSTRAINT [FK_orders_customer] FOREIGN KEY([customerid])
  29:              REFERENCES [dbo].[customer] ([customerid]);
  31:           CREATE TABLE [dbo].[customer2](
  32:              [customerid] [int] NOT NULL,
  33:              [firstname] [nvarchar](256) NULL,
  34:              [lastname] [nvarchar](256) NULL,
  35:              CONSTRAINT [PK_customer2] PRIMARY KEY CLUSTERED
  36:                 ([customerid] ASC) ON [PRIMARY]
  37:           ) ON [PRIMARY];
  39:           insert into customer (customerid, firstname, lastname) values (111, 'FN', 'LN');
  40:           insert into orders (orderid, customerid, shippingid, otherid) values (221, 111, 0, 0);
  41:           insert into customer2 (customerid, firstname, lastname) values (111, 'FN', 'LN');
  42:           insert into customer2 (customerid, firstname, lastname) values (112, 'FN', 'LN');
  43:        ";
  45:         // Change this to a different value in order to see
  46:         //whether this run affected rows in the database.
  47:         var runId = "9";  
  49:         // The following statements will force a deadlock.
  50:         var sql1a = @"
  51:           UPDATE Customer SET LastName = 'John_Updated" + runId + @"' WHERE CustomerId=111
   52:           WAITFOR DELAY '00:00:03' -- Wait 3 seconds
  53:           UPDATE Orders SET OtherId = " + runId + @" WHERE OrderId = 221";
  55:         var sql2a = @"
  56:           UPDATE Orders SET ShippingId = 1" + runId + @" WHERE OrderId = 221
   57:           WAITFOR DELAY '00:00:03' -- Wait 3 seconds
  58:           UPDATE Customer SET FirstName = 'Mike_Updated" + runId + @"' WHERE CustomerId=111";
  61:         // These two statements affect a table that should not be locked by the deadlock.
  62:         var sql1c = @"UPDATE Customer2 SET LastName = 'Updated 1_" + runId + @"' WHERE CustomerId = 111";
  63:         var sql2c = @"UPDATE Customer2 SET LastName = 'Updated 1_" + runId + @"' WHERE CustomerId = 112";
  66:         using (var con1 = new SqlConnection(connectionString))
  67:         using (var con2 = new SqlConnection(connectionString))
  68:         using (var con3 = new SqlConnection(connectionString))
  69:         {
  70:            con1.Open();
  71:            con2.Open();
  72:            con3.Open();
  73:            using (var tran1 = con1.BeginTransaction(System.Data.IsolationLevel.RepeatableRead))
  74:            using (var tran2 = con2.BeginTransaction(System.Data.IsolationLevel.RepeatableRead))
  75:            {
  76:               try
  77:               {
  78:                  // Create tables and insert sample data.
  79:                  SqlRunner.RunSQL(con3, null, sqlSchema);
  81:                  // Show sample data
  82:                  SqlRunner.GetDataTable(con3, null, "Select * from CUSTOMER")
  83:                           .Dump("Data in CUSTOMER before changes");
  84:                  SqlRunner.GetDataTable(con3, null, "Select * from ORDERS")
  85:                           .Dump("Data in ORDERS before changes");
  86:                  SqlRunner.GetDataTable(con3, null, "Select * from CUSTOMER2")
  87:                           .Dump("Data in CUSTOMER2 before changes");
  89:                  Console.WriteLine("Running SQL 1A");
  90:                  var t1 = Task.Factory.StartNew(() => SqlRunner.RunSQL(con1, tran1, sql1a));  
  92:                  // Allow slight delay to ensure deadlock.
  93:                  System.Threading.Thread.Sleep(1000);
  95:                  Console.WriteLine("Running SQL 2A");
  96:                  var t2 = Task.Factory.StartNew(() => SqlRunner.RunSQL(con2, tran2, sql2a));
  98:                  Console.WriteLine("Waiting");
  99:                  Task.WaitAll(t1, t2);   // This should throw SqlException for Deadlock.
 100:                  t1.Wait();
 102:                  // The following should not run.
 103:                  // But it's what would run if things succeeded.
 104:                  Console.WriteLine("Committing");
 105:                  tran1.Commit();
 106:                  tran2.Commit();
 107:               }
 108:               catch (Exception ex)
 109:               {
 110:                  Console.WriteLine("Error:");
 111:                  Console.WriteLine(ex.ToString());
 113:                  var sqlEx = ex.InnerException as SqlException;
 114:                  if (null != sqlEx)
 115:                  {
 116:                     Console.WriteLine("SqlException Details:");   // Should be deadlock
 117:                     Console.WriteLine("Class     = {0}", sqlEx.Class);   // 13
 118:                     Console.WriteLine("Number    = {0}", sqlEx.Number); // 1205 = Deadlock
 119:                     Console.WriteLine("Procedure = {0}", sqlEx.Procedure); // ""
 120:                     Console.WriteLine("Server    = {0}", sqlEx.Server); // .\sqlexpress
 121:                     Console.WriteLine("Source    = {0}", sqlEx.Source); // .Net SqlClient Data Provider
 122:                     Console.WriteLine("State     = {0}", sqlEx.State);    // 51
 123:                  }
 125:                  // Should be open. Deadlocks don't close the connection.
 126:                  Console.WriteLine("Connection States (Should be Open):");
 127:                  Console.WriteLine("Con1 State = {0}", con1.State);
  128:                  Console.WriteLine("Con2 State = {0}", con2.State);
 130:                  Console.WriteLine("Transaction States Before 'C' SQL Statements:");
 131:                  SqlRunner.GetDataTable(con1, tran1,
 132:                           "SELECT @@TRANCOUNT TranCount, XACT_STATE() TranState")
 133:                           .Dump("Con1 State After Exception, Before '1C'");        
 134:                  SqlRunner.GetDataTable(con2, tran2,
 135:                          "SELECT @@TRANCOUNT TranCount, XACT_STATE() TranState")
 136:                          .Dump("Con2 State After Exception, Before '2C'");        
 138:                  try
 139:                  {
 140:                     // ******************************************
 141:                     // THESE ARE THE IMPORTANT LINES.
 142:                     // How will they behave after the deadlock
 143:                     // ******************************************
 144:                     SqlRunner.RunSQL(con1, tran1, sql1c);        
 145:                     SqlRunner.RunSQL(con2, tran2, sql2c);        
 146:                  }
 147:                  finally
 148:                  {
 149:                     Console.WriteLine("Transaction States After 'C' SQL Statements:");
 150:                     SqlRunner.GetDataTable(con1, tran1,
 151:                              "SELECT @@TRANCOUNT TranCount, XACT_STATE() TranState")
 152:                              .Dump("Con1 State After Run '1C' SQL");        
 153:                     SqlRunner.GetDataTable(con2, tran2,
 154:                              "SELECT @@TRANCOUNT TranCount, XACT_STATE() TranState")
 155:                              .Dump("Con2 State After Run '2C' SQL");        
 156:                  }
 158:                  try { Console.WriteLine("Rolling Back 1"); tran1.Rollback(); }
 159:                  catch (Exception ex1) {
 160:                    Console.WriteLine("Rollback 1 Error:"); Console.WriteLine(ex1.ToString()); }
 161:                  try { Console.WriteLine("Rolling Back 2"); tran2.Rollback(); }
 162:                  catch (Exception ex2) {
 163:                    Console.WriteLine("Rollback 2 Error:"); Console.WriteLine(ex2.ToString()); }        
 165:                  //try { Console.WriteLine("Committing 1"); tran1.Commit(); }
 166:                  //catch (Exception ex1) {
 167:                  //  Console.WriteLine("Commit 1 Error:"); Console.WriteLine(ex1.ToString()); }
 168:                  //try { Console.WriteLine("Committing 2"); tran2.Commit(); }
 169:                  //catch (Exception ex2) {
 170:                  //  Console.WriteLine("Commit 2 Error:"); Console.WriteLine(ex2.ToString()); }        
 172:                  SqlRunner.GetDataTable(con3, null, "Select * from CUSTOMER")
 173:                           .Dump("Data in CUSTOMER after ROLLBACK");
 174:                  SqlRunner.GetDataTable(con3, null, "Select * from ORDERS")
 175:                           .Dump("Data in ORDERS after ROLLBACK");
 176:                  SqlRunner.GetDataTable(con3, null, "Select * from customer2")
 177:                           .Dump("Data in CUSTOMER2 after ROLLBACK");
 178:               }
 179:            }
 180:         }  
 181:      }
 183:      // Define other methods and classes here
 186:      static class SqlRunner
 187:      {
 189:         public static int RunSQL(SqlConnection con, SqlTransaction tran, string sql)
 190:         {
 191:            SqlCommand cmd = con.CreateCommand();
 192:            cmd.CommandText = sql;
 194:            if (null != tran)
 195:               cmd.Transaction = tran;
 197:            return cmd.ExecuteNonQuery();
 198:         }
 200:         public static DataTable GetDataTable(SqlConnection con, SqlTransaction tran, string sql, params object[] parameters)
 201:         {
 202:            var cmd = con.CreateCommand();
 203:            cmd.CommandText = sql;
 205:            if (null != tran)
 206:               cmd.Transaction = tran;
 208:            if (null != parameters && parameters.Length > 0)
 209:            {
 210:               for (int i = 0; i < parameters.Length; i++)
 211:               {
 212:                  cmd.Parameters.AddWithValue("@p" + i.ToString(CultureInfo.InvariantCulture), parameters[i]);
 213:               }
 214:            }
 216:            SqlDataAdapter da = new SqlDataAdapter(cmd);
 217:            DataTable tbl = new DataTable();
 218:            tbl.Locale = CultureInfo.CurrentCulture;
 219:            da.Fill(tbl);
 220:            return (tbl);
 221:         }
 223:      }

(Download from Pastebin)

Shocked, Stupefied, and Generally Stumped

If you run the above code and look at the data that ends up in [dbo].[customer2], you'll notice a very strange thing:

The transactions were rolled back, but one of the "C" statements was still applied!


Short Answer…

The Deadlock aborts the underlying transaction on SQL Server, but SqlTransaction and SqlCommand happily continue to execute more commands outside of any transaction.

Hindsight is 20/20

After diagnosing and fixing this issue, it almost makes sense, but it's not at all intuitive up front. I asked a couple of colleagues to guess the outcome of the code, and both guessed wrong. I suspect that most of you will also be surprised by the results. Hindsight is 20/20, but up front, this one makes no sense.

Deadlocks in SQL server mean that SQL will kill one of the transactions and roll it back. This frees the locks and allows the other transaction to proceed. In the scenario above, we force a deadlock, meaning that one of the transactions is ROLLED BACK on the server. It's not just "zombied", which would cause "SELECT XACT_STATE()" to return "-1". No, in this case, the transaction is done. There is no more transaction.

So, how does one of those "C" statements get applied? Basically, SqlCommand doesn't know that there is a problem with the transaction on the server, nor does it ask the SqlTransaction object to check. Instead, it just sends the statement to the server, assuming that the server still has a transaction open on the current connection. But there is no transaction. As a result, those statements are executed with no transaction at all. The subsequent Rollback() has no means to undo their work.

Is This a Bug?

Calling something a bug is rather severe. In this case, I suspect that the framework can do better.

The system does some checks to ensure proper & safe usage of the SqlTransaction, but not enough. After a deadlock, the client still has a reference to a SqlTransaction object. This object still thinks that it has a transaction on the server. It will even allow you to call Rollback() on this transaction object after the deadlock. But it won't let you call Commit(). It seems that SqlTransaction knows that the transaction can't commit, and throws an error back at you. Additionally, if you use a SqlConnection with a SqlCommand, but don't pass the associated SqlTransaction, then it will throw an error.

So the system does some internal checks, but it doesn't check for the deadlock scenario. Well, I think it should do better. Before executing a command, it should check whether the associated transaction is gone. If the server transaction is gone, it shouldn't even run the command. It should just error. Otherwise, SqlCommand will continue to run the statements outside of any transaction.
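One way to approximate the check I'm wishing for, in application code (a hypothetical guard, not anything the framework provides):

```csharp
// Hypothetical guard: verify the server-side transaction still exists
// before running a command through a SqlTransaction.
static void EnsureServerTransactionAlive(SqlConnection con, SqlTransaction tran)
{
   var cmd = con.CreateCommand();
   cmd.Transaction = tran;
   cmd.CommandText = "SELECT @@TRANCOUNT";
   var tranCount = (int)cmd.ExecuteScalar();
   if (tranCount == 0)
      throw new InvalidOperationException(
         "The server-side transaction is gone (e.g. rolled back by a deadlock). " +
         "Refusing to run the command outside a transaction.");
}
```

The extra round trip makes this too expensive to do before every statement, but it's a useful sanity check at key points in a long-running unit of work.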

I don't know if this is possible, or if it has some severe side effects, but the current behavior of SqlTransaction is a very dangerous, sharp edge on the API surface of the .NET Framework. A transaction that doesn't actually roll back the associated commands is not good.

Solution: Catch Deadlock Errors and Halt

The solution seems to require catching the SqlException, inspecting the Number to see if it's a deadlock, and then taking an alternate course of action. In the case of retry logic, the processing should immediately stop and rollback to prevent any subsequent commands from executing outside a transaction.

To catch a deadlock, look for SqlException.Number == 1205. This indicates a deadlock occurred.
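In C#, that check might look something like this (a minimal sketch; the `QueueForRetry` helper is hypothetical, and the retry/abort policy is up to your application):

```csharp
try
{
   SqlRunner.RunSQL(con, tran, sql);
}
catch (SqlException ex)
{
   if (ex.Number == 1205)   // 1205 = deadlock victim
   {
      // SQL Server has already rolled back the server-side transaction.
      // Stop immediately: any further commands on this connection would
      // run OUTSIDE a transaction. Dispose the SqlTransaction and retry
      // the whole unit of work from scratch.
      throw;
   }

   // Other errors (e.g. an FK violation) leave the transaction alive,
   // so queuing the statement for a later retry pass is still safe.
   QueueForRetry(sql);   // hypothetical helper
}
```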

Happy (Transaction) Coding!


author: Nathan Allen-Wagner | posted @ Saturday, March 03, 2012 10:03 PM | Feedback (26)

Updated version of Monitored Undo Framework released

Today I released an updated version of the Monitored Undo Framework.

Changes include:

  1. A new parameter on the DefaultChangeFactory's methods for specifying the "description of the change". This can be helpful in cases where the UI shows a list of the undo / redo changes.
  2. A new WPF sample that shows, in a simpler codebase, how to use the framework.
  3. Updated build script that compiles all solutions, runs the tests, and packages the NuGet package.

These changes are also available via NuGet.


author: Nathan Allen-Wagner | posted @ Tuesday, January 31, 2012 10:24 AM | Feedback (0)

"tfpt unshelve /migrate" Doesn't Account for Renames

A quick note about Team Foundation Power Tools (TFPT)'s "Unshelve" command…

tfpt unshelve has a /MIGRATE switch that will allow unshelving changes into a different branch / location than the one where they were originally shelved. I won't re-iterate the many articles describing how to use this command.

One issue that I ran into is unshelving a set of changes into a branch where some of the shelved files had been renamed / moved. It seems that the /migrate switch doesn't go back into TFS version history to locate the renames. Instead, it simply pulls out the changes and re-applies them to the new location.

For Example:

Let's say that I start with Branch A. One user (John Doe) gets latest and starts making changes.

Branch A

Then… a different user (Jane Smith) renames File1.txt and checks that in.

Branch A

Then… (for some reason), they need to move John's changes from Branch A to another branch. (Let's pretend that Branch A is the main branch, but the changes are too large for this iteration, so they want to move them to a separate branch for a future release.) So they create Branch B, which looks like this:

Branch B

Then… John shelves his changes and they attempt to use tfpt unshelve /migrate to move the changes to Branch B.

But a strange thing happens. They end up with this:

Branch B

Huh? How'd that happen?

Someone who knows version control well could probably explain why, and suggest that this is obvious. But it wasn't to me, and as a result, some of the changes in the shelveset did not make it into the target solution.

How I should have done it…

If I was going to do this again, I think I'd:

  1. Get the source and the target branches to the same version. (Merge changes as needed between the branches.)
  2. In the source branch, get latest version so that the local workspace has the latest version.
    1. This should propagate any pending renames / moves that were made in that source branch, or its parent branches.
  3. Shelve the changes that should be moved.
  4. In the target branch, get latest version.
  5. Run tfpt unshelve /migrate …
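Sketched as a command sequence (server paths, workspace folders, and the shelveset name here are all hypothetical examples):

```batch
REM In the source branch workspace: sync, then shelve the pending changes.
cd C:\src\BranchA
tf get /recursive
tf shelve "MoveToBranchB" /recursive

REM In the target branch workspace: sync, then unshelve with /migrate.
cd C:\src\BranchB
tf get /recursive
tfpt unshelve "MoveToBranchB" /migrate /source:"$/Project/BranchA" /target:"$/Project/BranchB"
```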



author: Nathan Allen-Wagner | posted @ Thursday, January 26, 2012 4:14 PM | Feedback (0)

A Visual Studio Extension to make Scrolling Better!

Hi All,

A friend of mine (John Nastase) saw a tweet or two of mine complaining about scrolling in Visual Studio.

What did he do? He wrote a VS extension to make it happen!!!

Here's the extension that he just published in the VS extension gallery.

It basically adds a "cursor buffer" space to the top and bottom of the window so that as you arrow down the page, the page starts scrolling before the cursor hits the VERY BOTTOM OF THE PAGE. Instead, it starts scrolling a little earlier so that you can see what's coming. This saves you from scrolling too far and then having to arrow back up to where you want to be.



author: Nathan Allen-Wagner | posted @ Monday, January 23, 2012 10:37 PM | Feedback (0)

EF Inserts Failing Because of Missing Association in SSDL

I just found that Entity Framework (4.0) can stumble on One-To-One relationships if the EDMX's store schema layer (SSDL) is missing the association that represents the Foreign Key in the database. I suppose this makes sense, but it's only obvious to me in hindsight.

I don't know how my EDMX ended up missing the SSDL association for the Foreign Key, but it was missing. However, the Conceptual Layer had the relationship, so the diagram made it look like everything was associated. And because this is a supported scenario, VS didn't complain that I was missing anything.


When I went to save the records, EF stumbled on the insert. In my case, I had two entities that were related by a "1 to 0..1" relationship. I'm guessing that EF saw this, evaluated the SSDL layer's metadata and decided that order of insert didn't matter. As a result, it inserted them in alphabetical order… and failed. It failed because in my DB, I did have a foreign key.

The fix was to manually add the association into the SSDL layer.
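For reference, an SSDL association has roughly this shape (a hand-written sketch with hypothetical names -- yours must match the entity types, roles, and key columns in your own store model):

```xml
<!-- Inside the SSDL <Schema> element. All names here are hypothetical. -->
<Association Name="FK_OrderDetail_Order">
  <End Role="Order" Type="Model.Store.Order" Multiplicity="1" />
  <End Role="OrderDetail" Type="Model.Store.OrderDetail" Multiplicity="0..1" />
  <ReferentialConstraint>
    <Principal Role="Order">
      <PropertyRef Name="OrderId" />
    </Principal>
    <Dependent Role="OrderDetail">
      <PropertyRef Name="OrderId" />
    </Dependent>
  </ReferentialConstraint>
</Association>

<!-- Plus a matching AssociationSet inside the SSDL <EntityContainer>. -->
<AssociationSet Name="FK_OrderDetail_Order"
                Association="Model.Store.FK_OrderDetail_Order">
  <End Role="Order" EntitySet="Order" />
  <End Role="OrderDetail" EntitySet="OrderDetail" />
</AssociationSet>
```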

Note: Editing EDMX files by hand SUCKS!

author: Nathan Allen-Wagner | posted @ Thursday, January 19, 2012 4:37 PM | Feedback (0)

Microsoft's MVP program and Rob Eisenberg

This is a long read, but it's got some detailed insight on the MVP program at Microsoft. I've heard these sentiments before, and I'm sure that it's not black and white. Regardless, Microsoft's actions in this case are unacceptable.

If you don't know Rob Eisenberg, he's the creator of the Caliburn and Caliburn Micro frameworks for XAML based platforms (WPF, Silverlight, WP7). If you want to have your mind blown, take a look at his source code for Caliburn Micro (and even more so for Caliburn). While you're at it, watch his "Build your own MVVM framework" presentation. I reviewed a lot of WPF frameworks, but Caliburn Micro stands alone in its elegance, simplicity, and amazing use of the C# / .NET platform.

I seriously hope that MS addresses whatever is behind this story. We need people like Rob! They should be rewarded for their amazing contributions to the community.

author: Nathan Allen-Wagner | posted @ Thursday, January 05, 2012 2:34 PM | Feedback (0)