Updating disconnected Entity Framework child collections

One pain point I have with Entity Framework is dealing with updating child collections on disconnected entities. The most common scenario I run into is in web APIs. I have a web page which allows a user to edit an entity which includes a child collection. New children can be added, existing children edited, and existing children deleted. All the actions are saved to the server at once. When I POST this to my API, I end up writing a lot of boilerplate to figure out what changed. To prevent that, I came up with this method. A sample usage is below. This is for EF 6.1.
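The gist of the approach can be sketched like this (the Parent/Child model and context are illustrative stand-ins, not the actual helper from this post; EF 6 assumed):

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

// Illustrative model; the real method generalizes this per-collection diff.
public class Child  { public int Id { get; set; } public string Name { get; set; } }
public class Parent { public int Id { get; set; } public virtual ICollection<Child> Children { get; set; } }

public class MyDbContext : DbContext
{
    public DbSet<Parent> Parents { get; set; }
    public DbSet<Child> Children { get; set; }
}

public static class ParentUpdater
{
    // Merge a detached parent (e.g. deserialized from a POST body) into the database.
    public static void Update(MyDbContext db, Parent incoming)
    {
        var existing = db.Parents
            .Include(p => p.Children)
            .Single(p => p.Id == incoming.Id);

        db.Entry(existing).CurrentValues.SetValues(incoming);

        // Children in the database but missing from the request were deleted client-side.
        foreach (var child in existing.Children.ToList())
            if (!incoming.Children.Any(c => c.Id == child.Id))
                db.Children.Remove(child);

        foreach (var child in incoming.Children)
        {
            var target = existing.Children.SingleOrDefault(c => c.Id == child.Id);
            if (target == null)
                existing.Children.Add(child);                    // added
            else
                db.Entry(target).CurrentValues.SetValues(child); // edited
        }

        db.SaveChanges();
    }
}
```

The boilerplate lives in the two loops; the method in this post factors that diffing out so it can be reused for any child collection.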

We Are Generation Earth

I stumbled across the BBC documentary mini-series Supersized Earth on Netflix and it’s really quite fascinating. Host Dallas Campbell explores how humans have changed the face of the Earth over the past 100 years by visiting some of the largest engineering projects around the world. They are just mind-boggling. In Hong Kong, over 3.5 million people live above the fourteenth floor. That’s like lifting the entire city of Chicago into high-rises. Our open-pit mines dive even deeper into the earth than our cities rise. We have dammed over 1/3 of the world’s river flow capacity. And our cities don’t flood because we can divert rivers through underground caverns with pumps that could drain a swimming pool in a second.

The pace of change is increasing too. In 1936 Hoover Dam was the tallest dam in the world. Today it doesn’t even make the top 25. The South-to-North aqueduct under construction in China, designed to relieve water shortages in the north, will be one of the longest rivers in the world–longer than the width of the continental US. China is also leading in highway construction. In the last 20 years they’ve built more highways than exist in the US.

Another fascinating feat: a boat designed to transport the untransportable. Campbell visits the Blue Marlin which is preparing to transport an oil rig across the Pacific Ocean. Because the oil rig cannot be lifted, the Blue Marlin must sink 10 meters underwater to scoop it up.

Overall the documentary is very well produced, with slick animations woven with satellite images and some very impressive views. Campbell keeps it interesting too, undertaking challenges at each stop, like downhill bike racing, cleaning windows on the world’s tallest building, and detonating explosives at a mine. It’s since been removed from Netflix, but you can still see parts of it on YouTube.

Episode 1 (can’t find it), Episode 2, Episode 3

Measure What Matters To Customers

In Measure What Matters To Customers, Ron Baker challenges several common notions held by professional service firms including “costs drive prices” and “productivity can be measured by timesheets”. Too many firms, Baker says, are focused on optimizing production and lowering costs to the detriment of effectively serving their customers.

To be successful in today’s information economy, executives must shift their focus to ensuring the success of customers. In this new model, executives must increase the firm’s intellectual capital, price, and effectiveness. Baker advocates developing firm-wide Key Predictive Indicators–forward-looking predictors of customer success, not backward-looking performance measures. If you are helping your customers be successful, it’s likely you will be as well. KPIs should be generated by hypothesis and periodically tested. If a KPI isn’t actually predicting your firm’s outcomes, go back to the whiteboard.

Baker presents Gordon Bethune’s transformation of Continental Airlines as an example of the new business model. In the ’90s, Continental was a budget airline so cheap nobody wanted to fly on it. It ranked last in all performance measures for airlines. All efforts had been made to reduce the cost per seat mile traveled. Bethune shifted the focus to customer metrics: on-time arrivals, luggage lost, and complaints received. The airline quickly won more customer satisfaction awards than any other airline in the world and the stock price increased 25X.

Baker also discusses the rise of the intellectual worker. He regards the timesheet as a remnant of Taylorism. Knowledge workers are not like workers of the industrial revolution. They are paid for their ideas, not hours worked. Setting billable hour quotas is demoralizing. Knowledge workers should be, at least in part, compensated for the results they produce in the form of bonuses or stock options.

Without timesheets, how should services be billed? Simple. Set the price of the service relative to its value to the customer. With a price set upfront, the firm can tailor its service’s cost appropriately. Decoupling the price from hours worked can lead to innovation within the company. By taking on the risk of a fixed-price contract, the firm gains the ability to earn far more than margin on labor.*

I recommend this book to every professional services manager. Baker provides insight into where some of our widely held beliefs originated. I’m confident that following his advice will help firms find a profitable future serving others.

*For more on this topic, see his book Implementing Value Pricing.

Generating SQL from expression trees, Part 2

As promised, here is a much improved version of the where clause builder from my last post. This version generates parameterized queries so it isn’t vulnerable to SQL injection. Using parameters also allowed me to simplify the logic since I don’t need to worry about stringifying the values or using “IS” instead of “=” for null checking.

I moved all the string concatenation into a separate class called WherePart. These objects are composable in a structure similar to the source expression tree. Extracting this class is my favorite part of the refactoring.

I’m still not happy with how I’m handling the LIKE queries. I have to pass a prefix and postfix parameter down to the next level of recursion which clutters up the method signature. It might be better to just build the string in place.
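To illustrate the shape of the refactoring (a hypothetical sketch, not necessarily the exact class from this post), each WherePart carries its SQL fragment plus the parameter values it introduced, and nodes compose the same way the expression tree does:

```csharp
using System.Collections.Generic;

// Hypothetical sketch of a composable WherePart node.
public class WherePart
{
    public string Sql { get; set; }
    public Dictionary<string, object> Parameters { get; set; }
        = new Dictionary<string, object>();

    // A raw SQL fragment, e.g. a column reference like "[Name]".
    public static WherePart IsSql(string sql)
    {
        return new WherePart { Sql = sql };
    }

    // A value, emitted as a numbered parameter rather than inline SQL.
    public static WherePart IsParameter(int count, object value)
    {
        var part = new WherePart { Sql = "@" + count };
        part.Parameters.Add(count.ToString(), value);
        return part;
    }

    // Combine two parts with an operator, merging their parameter maps.
    public static WherePart Concat(WherePart left, string @operator, WherePart right)
    {
        var part = new WherePart
        {
            Sql = "(" + left.Sql + " " + @operator + " " + right.Sql + ")"
        };
        foreach (var p in left.Parameters) part.Parameters.Add(p.Key, p.Value);
        foreach (var p in right.Parameters) part.Parameters.Add(p.Key, p.Value);
        return part;
    }
}
```

For example, `WherePart.Concat(WherePart.IsSql("[Name]"), "=", WherePart.IsParameter(1, "Main"))` yields the SQL `([Name] = @1)` with `"Main"` carried along as a parameter value, never concatenated into the string.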

Continue reading

Generating SQL from expression trees

Is this further down the rabbit hole than IL generation? I’m not sure but I went there.

(Note: This was an experiment. It doesn’t generate safe SQL. I’ll follow up with a better version.)

An expression tree is an abstract representation of code as data. In .NET they are primarily used for LINQ-style code. In C#, lambda expressions can be decomposed into expression trees. Here is an example of a lambda and its expression tree:

x => x.PosId == 1 && x.Name == "Main"


There are five key types of expression tree nodes.

UnaryExpression: An operation with a single operand such as negation.

BinaryExpression: An operation with two operands, such as addition, &&, or ||.

MemberExpression: Accessing a property, field, or method of an object or a variable. (Variable references in lambda expressions are implemented as fields on a class generated by the compiler.)

ConstantExpression: A node that is a constant value.

ParameterExpression: An input to a lambda function.
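Most of these node types can be seen by pulling the earlier lambda apart in code (the Station type here is a made-up stand-in for whatever x is):

```csharp
using System;
using System.Linq.Expressions;

public class Station { public int PosId { get; set; } public string Name { get; set; } }

public static class TreeDemo
{
    public static void Main()
    {
        Expression<Func<Station, bool>> expr =
            x => x.PosId == 1 && x.Name == "Main";

        var and  = (BinaryExpression)expr.Body;    // BinaryExpression: AndAlso
        var left = (BinaryExpression)and.Left;     // BinaryExpression: Equal
        var prop = (MemberExpression)left.Left;    // MemberExpression: x.PosId
        var one  = (ConstantExpression)left.Right; // ConstantExpression: 1
        var x    = expr.Parameters[0];             // ParameterExpression: x

        Console.WriteLine(and.NodeType);     // AndAlso
        Console.WriteLine(prop.Member.Name); // PosId
        Console.WriteLine(one.Value);        // 1
        Console.WriteLine(x.Name);           // x
    }
}
```

Note that because "Main" is a literal here it shows up as a ConstantExpression; a captured local variable would instead appear as a MemberExpression over a compiler-generated closure field, as described above.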

The following code recursively walks an expression tree and generates the equivalent SQL where clause, for sufficiently simple expressions. One area that was a bit tricky is SQL’s handling of NULL. I have to check the right side of a binary expression for NULL so I can generate “x IS NULL” instead of “x = NULL”. I used parentheses liberally to ease composing the expressions. Handling negation was done naively; it could be cleaned up by propagating the negation into the child node.
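A condensed sketch of the idea (not the full code, and with the same caveat: values are stringified inline, which is exactly the injection problem the follow-up post fixes):

```csharp
using System;
using System.Linq.Expressions;

public static class WhereBuilder
{
    // Handles only equality, AND, members, and constants; enough to show the recursion.
    public static string Visit(Expression e)
    {
        var b = e as BinaryExpression;
        if (b != null && b.NodeType == ExpressionType.AndAlso)
            return "(" + Visit(b.Left) + " AND " + Visit(b.Right) + ")";
        if (b != null && b.NodeType == ExpressionType.Equal)
        {
            var rc = b.Right as ConstantExpression;
            if (rc != null && rc.Value == null)
                return "(" + Visit(b.Left) + " IS NULL)"; // SQL needs IS NULL, not = NULL
            return "(" + Visit(b.Left) + " = " + Visit(b.Right) + ")";
        }
        var m = e as MemberExpression;
        if (m != null)
            return "[" + m.Member.Name + "]";
        var c = e as ConstantExpression;
        if (c != null)
            return c.Value is string ? "'" + c.Value + "'" : c.Value.ToString();
        throw new NotSupportedException(e.NodeType.ToString());
    }
}
```

Given the body of `x => x.PosId == 1 && x.Name == null`, this produces `(([PosId] = 1) AND ([Name] IS NULL))`.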

Continue reading

Playing with IL generation

This week I started learning how to generate CIL (Common Intermediate Language, the .NET runtime’s equivalent of assembly). The .NET classes for doing this looked a bit intimidating so I chose to try out FluentIL first. FluentIL is a helper library to simplify generating IL. I’m not sure if it helped or hurt–I had to mentally translate all the online samples I looked at into FluentIL and I also found an opcode (castclass) which wasn’t implemented in the fluent helpers.

I used the cheater method to write the example below. First I wrote a strongly typed C# method which did exactly what I wanted the IL to do and built it in release mode. Then I used dotPeek to view the generated IL and wrote an equivalent generic version.

One of the difficulties of writing IL is debugging it. If the program isn’t perfectly valid, the runtime gives you a very generic invalid program error. Also, if you have a runtime type violation in the IL you get an exception saying the operation could make the runtime unstable. There is no stack trace to help you find the error.

This code provides a way to set properties on an object directly without using reflection. It would be useful when you have an object whose type is only known at runtime. This isn’t production-ready code.
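For flavor, here is roughly what the same idea looks like with the raw System.Reflection.Emit API instead of FluentIL (a sketch of the technique, not the code from this post):

```csharp
using System;
using System.Reflection;
using System.Reflection.Emit;

public static class Setters
{
    // Build a delegate that assigns a property without per-call reflection.
    public static Action<object, object> BuildSetter(Type type, string propertyName)
    {
        PropertyInfo prop = type.GetProperty(propertyName);
        var dm = new DynamicMethod("set_" + propertyName, null,
            new[] { typeof(object), typeof(object) }, type, true);
        ILGenerator il = dm.GetILGenerator();

        il.Emit(OpCodes.Ldarg_0);              // load the target object
        il.Emit(OpCodes.Castclass, type);      // cast to the concrete type
        il.Emit(OpCodes.Ldarg_1);              // load the (boxed) value
        il.Emit(prop.PropertyType.IsValueType  // unbox value types,
            ? OpCodes.Unbox_Any                // cast reference types
            : OpCodes.Castclass, prop.PropertyType);
        il.Emit(OpCodes.Callvirt, prop.GetSetMethod());
        il.Emit(OpCodes.Ret);

        return (Action<object, object>)dm.CreateDelegate(typeof(Action<object, object>));
    }
}
```

Reflection is still used once to build the delegate, but each invocation afterward is a direct call; `Unbox_Any` vs `Castclass` is the runtime type check that, if emitted wrong, triggers the unhelpful “could make the runtime unstable” exception mentioned above.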
Continue reading

Compiling legacy TypeScript

Last week I installed Visual Studio 2015. This week I was unable to use Visual Studio 2013 to rebuild a legacy app that included TypeScript 1.0 code originally developed in Visual Studio 2012. Here’s how I sorted it out.

VS2015 installed TypeScript 1.7. I won’t be using TypeScript going forward so I uninstalled the TypeScript 1.7 compiler in Programs & Features.

That alone isn’t enough though. Now I received an “unknown option: ‘noEmitOnError'” error during builds. TypeScript compilers (as of this writing) share a common MSBuild targets file which still contains the 1.7 options even after you uninstall. The file is at C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v12.0\TypeScript\Microsoft.TypeScript.jsproj.targets (adjust the v12.0 as appropriate for your version of VS). I ended up overwriting the entire TypeScript folder (keep a backup!) with a copy from another developer who had not yet upgraded. However, I did discover that you can copy the targets file from an older version folder as well. (The one from v11.0 worked for me too, but it is slightly different.)

If you need to compile multiple versions of TypeScript, I think you’ll have to make a copy of the TypeScript folder and edit your csproj file to point to the correct one. The relevant line is:

<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)\TypeScript\Microsoft.TypeScript.targets" />
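For example, if you copied the shipped targets into a side-by-side folder named, hypothetically, TypeScript-1.0, the import for that project would point there instead:

```xml
<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)\TypeScript-1.0\Microsoft.TypeScript.targets" />
```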

Was $30,000 a good asking price?

Yesterday the internet broke. Well, at least a lot of JavaScript builds did. But I don’t want to focus on that part of the story.

In the published conversation, Kik asked if they could compensate Azer for transferring control of the kik project name on npm. Azer asked for $30,000. Kik walked away from the negotiation and asked npm to transfer the name since they owned the trademark to Kik.

First, I don’t believe Azer was violating Kik’s trademark. Kik’s trademark is for mobile software, instant messaging, and a website. Azer’s kik project is a software bootstrapping tool. I’m not a lawyer, but I don’t see how Azer’s use of the name is trademark infringement. There are many other companies that use the Kik name. The USPTO’s TESS system has 15 records for Kik. On this basis, it’s fair to say that Kik should compensate Azer for the name.

Now, was Azer’s asking price of $30,000 fair? Kik obviously did not think so. To an individual developer this is probably a sizable amount of money. But what is the value to Kik? While they don’t exactly say what software tool they are releasing, since it’s being released on npm it’s likely a tool for extending their service. The more extensions there are to Kik, the more reasons users will find to become engaged with it. Growing the user base has a huge return in terms of valuation potential for a VC-backed startup. Kik’s argument was that if they had to compete with an unrelated kik package, it could confuse developers (likely reducing adoption). So my question is: what is it worth to reduce that confusion? I don’t know Kik’s financials (they’ve publicly raised $120M on a $1B valuation), but I do know that software development projects aren’t cheap. At a company Kik’s size, anything worth releasing is probably costing tens, if not hundreds, of thousands in developer time, plus management, marketing, and overhead. If reducing developer confusion could lead to higher user numbers and thus lead to higher valuation, I could probably be convinced $30,000 was a fair price to pay. And if it isn’t, they could always counter.

Finally, if Kik’s goal was to get this project name for cheap, it was critical for them to set an anchor price early. By asking Azer to give the first offer they lost control of the negotiation.

Gordon Bethune on motivating your employees

I’m about to start reading From Worst to First by Gordon Bethune so I found this presentation to get an introduction. He describes how he transformed Continental from the worst performing airline to the best by focusing on how to motivate the employees to serve customers. I’d love to work for a company that shares the kind of culture he built at Continental.

Note: He has a few off-color remarks but look past those to the lesson he gives.

The Heavy Water War

If you enjoy WWII history, check out The Heavy Water War on Netflix. The Norwegian miniseries tells the true story of the Allied efforts to sabotage a factory which produced a critical ingredient for the Nazi nuclear weapons program–deuterium oxide, better known as heavy water. A small team of English and Norwegian soldiers was tasked with sneaking explosives into the basement of the guarded facility in the height of Scandinavian winter. They performed feats of unbelievable courage and made difficult decisions to carry out their orders.