Custom metric in Application Insights

Tracking custom metrics in Application Insights is easy. I wanted to track how long our cash register takes to print receipts so I could compare performance across hardware, make better recommendations to our sales team, and diagnose customer issues related to printing speed.

You will need a TelemetryClient instance. Use the GetMetric() method to get or create a metric by name. You can use the overloads to provide names for additional custom dimensions. In this case I am tracking the receipt number and the number of images printed on the receipt.

Call TrackValue() to add a new measurement. The TelemetryClient will aggregate the metrics over time and report them to Application Insights.
The default interval appears to be 54 seconds.

In my case, aggregation is not doing much since each printed receipt has unique dimensions and a register is not likely to print more than one receipt every 54 seconds.

var metric = _telemetry.GetMetric("PrintReceiptDurationMs", "ReceiptNumber", "ImageCount");
metric.TrackValue(sw.ElapsedMilliseconds, receipt.ReceiptNumber, imageCount.ToString());
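
For context, here is a minimal sketch of the surrounding code. The Receipt type, the _printer field, and its Print() call are hypothetical stand-ins for whatever actually does the printing; only the two telemetry lines are from the snippet above.

public void PrintAndMeasure(Receipt receipt, int imageCount)
{
	var sw = Stopwatch.StartNew(); // System.Diagnostics
	_printer.Print(receipt);       // hypothetical call being timed
	sw.Stop();

	var metric = _telemetry.GetMetric("PrintReceiptDurationMs", "ReceiptNumber", "ImageCount");
	metric.TrackValue(sw.ElapsedMilliseconds, receipt.ReceiptNumber, imageCount.ToString());
}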

In Log Analytics you can now query for the results.

customMetrics 
| where name == 'PrintReceiptDurationMs'
| extend receipt_number = tostring(customDimensions.ReceiptNumber)
| extend image_count = todouble(customDimensions.ImageCount)
| project value, receipt_number, image_count
| order by receipt_number desc

Or you could plot a chart.

customMetrics 
| where name == 'PrintReceiptDurationMs'
| summarize avg(value) by todouble(customDimensions.ImageCount)
| render barchart

You can query across all metrics if they share common custom dimensions.

customMetrics 
| where customDimensions.ReceiptNumber == 'RC-00092261-7'
| project name, value, timestamp 
| order by name

Prototype: Generating Vue forms from JSON

I’m fascinated with generated code. I love to write code, but when it comes to repetitive CRUD screens, nothing beats a template. Being able to quickly generate screens builds confidence with clients and gets you right into the meat of the application.

I used to build applications primarily in ASP.NET MVC. Recently I’ve started using Vue and immediately missed having input and form builders. Since I still use a C# Web API on the back end, I needed a creative way to get the C# model from the server to the client. I did this using a modified JSON Schema. I tried several libraries but wasn’t happy with the extensibility of any of them. This proof of concept uses NJsonSchema.

You’re probably here for the client side. Here’s a demo. The code is in the second and third tab.

This form requires two objects from the server: a schema object describing the form and an object containing the form data.

The type attribute comes from the C# property type. Where NJsonSchema’s vocabulary was limiting, I added an xtype attribute so I could control how DateTimes and option lists are rendered. Select list options come from a property on the formData object, mapped from optionsFromProperty in the schema.
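
For illustration, a schema fragment along these lines; the property names here are invented to match the description above:

{
	"properties": {
		"birthDate": { "type": "string", "xtype": "date", "title": "Birth Date" },
		"state": { "type": "string", "xtype": "select", "optionsFromProperty": "stateOptions" }
	}
}

The matching formData object would then carry both the field values and a stateOptions array to populate the select list.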

You can find the (currently) very ugly server-side model here:
https://gist.github.com/ryanohs/781a62717325dd2897addaeb14459e98

Improvements:

For simplicity I published the demo as a single component, but I did break it into several components in my own code.

I will probably end up writing my own schema generator so I’m not constrained by the assumptions of existing ones. JSON Schema is designed for validating JSON data, not building UIs, so I’m really stretching the use case here. I would prefer to use the DataAnnotations attributes whenever possible since many tools, like EF, Newtonsoft, and ASP.NET data validation, are already capable of generating and interpreting them.

I couldn’t generate enum drop-downs in this demo because NJsonSchema renders them as $ref properties, which I didn’t want to interpret client-side.

It would also be great to have sensible default attributes so you can build a form directly from a plain class or EF table object without manually defining labels and enum/list data types.

In a production build scenario, you could precompile the schema as a JSON dependency file so only the form data is downloaded at run-time.

Thanks for reading! Let me know what features would be useful to you.

Discovering connections in code via Reflection, part 2

Ayende has a really neat post about using an AST visitor to generate workflow diagrams from code. I used that as inspiration to modify my previous pub/sub documentation generator to output GraphViz syntax. It was a trivial change.

Console.WriteLine($"{messageUsage.Publisher} -> {handler}");
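
To make the output a complete DOT document, you can wrap the edge lines in a digraph declaration. A sketch, where edges is a hypothetical list of the publisher/handler pairs the test already discovers (quoting matters because the method names contain dots and parentheses):

Console.WriteLine("digraph Messages {");
foreach (var edge in edges)
{
	Console.WriteLine($"\t\"{edge.Publisher}\" -> \"{edge.Handler}\";");
}
Console.WriteLine("}");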

I copy the output into the demo at https://www.planttext.com/ and it generates a diagram of all the messages passed in my application.

In the future, I may pull in the GraphViz NuGet package and generate the diagram inside my own code.

A Start menu alternative

I recently installed Launchy on my machine to automate common actions. Launchy is a program that lets you quickly execute shortcuts via the keyboard. To activate it, press Alt+Spacebar then type your shortcut. It has autocomplete and defaults to the last shortcut you ran.

It indexes your Start Menu and Quick Launch. I created an additional index and added frequently visited Chrome bookmarks as well as some batch and PowerShell scripts that I regularly use.

Some of my shortcuts:

  • JIRA (link to current sprint)
  • Backlog (link to backlog)
  • Pull Requests
  • Prod Insights (my production activity monitor)
  • Script Out DBs (batch file regenerates EF Core files from a SQL DB)
  • Vue Debug (launches a background Vue compiler and watches for changes)
  • ES6 cheat sheet

Discovering connections in code via Reflection

I wrote a .NET application that makes heavy use of the publish/subscribe pattern. In order to help other developers learn about the code base I wrote a unit test that finds all publishers and subscribers and describes how they are connected.

Each published message is a class implementing IMessage.

Each subscriber implements ISubscribeTo<TheMessageType>.
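
The contracts themselves aren’t shown in this post, but their shapes are roughly this (the Handle method name is an assumption):

public interface IMessage { }

public interface ISubscribeTo<TMessage> where TMessage : IMessage
{
	void Handle(TMessage message);
}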

This code uses an IL reflector (source code here) to find each location a message type is constructed (before it’s published) and type reflection to find all its subscribers. Then it builds a text document describing what methods publish each message type and what subscribes to it.

The output looks like this. One improvement would be to remove the return type from the method signature so it reads more naturally.

{Method} publishes  ==>  {message type}
	-> handled by {subscriber type}

AccountViewModel.Void Execute_NewAccountSelectedCmd()  ==>  AccountSelected
	-> CustomerDetailViewModel
AddCustomerAccountsViewModel.Void Execute_CloseCmd()  ==>  AccountSelected
	-> CustomerDetailViewModel
AppliedTenderViewModel.Void Execute_RemoveTenderCmd()  ==>  RemoveTender
	-> TransactionViewModel
AuthorizationService.User ValidateUser(System.String, System.String)  ==>  LogoutRequested
	-> RegisterViewModel
	-> TransactionViewModel
	-> HardwareService
BasketIdViewModel.Void Execute_ApplyCmd()  ==>  ApplyBasketId
	-> TransactionViewModel

The code:

public void ListOfAllPublishersAndSubscribers()
{
	Console.WriteLine("{Method} publishes  ==>  {message type}");
	Console.WriteLine("\t-> handled by {subscriber type}");
	Console.WriteLine();
	Console.WriteLine("Discovered via Reflection. Duplicates not removed.");
	Console.WriteLine();
	Console.WriteLine();

	var domain = typeof(App).Assembly;
	var pos = typeof(TransactionViewModel).Assembly;
	var assemblies = new List<Assembly>() { domain, pos };

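	// Build a map from message type name to the names of the types that subscribe to it.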
	var handlerType = typeof(ISubscribeTo<>);
	var handlersByType = assemblies
		.SelectMany(s => s.GetTypes())
		.SelectMany(s => s.GetInterfaces(), (t, i) => new { Type = t, Interface = i })
		.Where(p => p.Interface.IsGenericType && handlerType.IsAssignableFrom(p.Interface.GetGenericTypeDefinition()))
		.GroupBy(t => t.Interface.GetGenericArguments().First().Name)
		.ToDictionary(g => g.Key, g => g.Select(x => x.Type.Name));

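	// Scan the IL of every method for constructor calls on IMessage types;
	// each construction site is treated as a publish location.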
	var imessage = typeof(IMessage);
	foreach (var messageUsage in assemblies
		.SelectMany(s => s.GetTypes())
		.Where(type => type.IsClass)
		.SelectMany(cl => cl.GetMethods().OfType<MethodBase>(), (t, mb) => new { t, mb })
		.SelectMany(a => MethodBodyReader.GetInstructions(a.mb), (a, i) => new { Publisher = $"{a.t.Name}.{a.mb.ToString()}", op = i.Operand as ConstructorInfo })
		.Where(a => a.op != null)
		.Where(a => imessage.IsAssignableFrom(a.op.DeclaringType))
		.OrderBy(a => a.Publisher)
		.ThenBy(a => a.op.DeclaringType.Name))
	{
		Console.WriteLine($"{messageUsage.Publisher}  ==>  {messageUsage.op.DeclaringType.Name}");
		if (handlersByType.ContainsKey(messageUsage.op.DeclaringType.Name))
		{
			foreach (var handler in handlersByType[messageUsage.op.DeclaringType.Name])
			{
				Console.WriteLine($"\t-> {handler}");
			}
		}
		else
		{
			Console.WriteLine("\t-> NO HANDLERS");
		}
	}
}

Updating disconnected Entity Framework child collections

One pain point I have with Entity Framework is updating child collections on disconnected entities. The most common scenario I run into is in web APIs: I have a web page that allows a user to edit an entity with a child collection. New children can be added, existing children edited, and existing children deleted, and all the changes are saved to the server at once. When I POST this to my API, I end up writing a lot of boilerplate to figure out what changed. To avoid that, I came up with this method. A sample usage is below. This is for EF 6.1.
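
The method itself isn’t reproduced in this excerpt, but a rough sketch of the general approach, diffing the loaded children against the incoming ones by key, might look like this (the IHasId interface is an assumption; this is not the original code):

// using System.Collections.Generic; using System.Data.Entity; using System.Linq;

public interface IHasId
{
	int Id { get; }
}

public static void UpdateChildCollection<TChild>(DbContext db,
	ICollection<TChild> existing, ICollection<TChild> incoming)
	where TChild : class, IHasId
{
	// Delete children that no longer appear in the incoming set.
	foreach (var child in existing.Where(e => incoming.All(i => i.Id != e.Id)).ToList())
	{
		db.Entry(child).State = EntityState.Deleted;
	}

	foreach (var child in incoming)
	{
		var match = existing.FirstOrDefault(e => e.Id == child.Id);
		if (match == null)
		{
			existing.Add(child); // new child, will be inserted
		}
		else
		{
			db.Entry(match).CurrentValues.SetValues(child); // edited child
		}
	}
}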

We Are Generation Earth

I stumbled across the BBC documentary mini-series Supersized Earth on Netflix and it’s really quite fascinating. Host Dallas Campbell explores how humans have changed the face of the Earth over the past 100 years by visiting some of the largest engineering projects around the world. They are just mind-boggling. In Hong Kong, over 3.5 million people live above the fourteenth floor. That’s like lifting the entire city of Chicago into high-rises. Our open-face mines dive even deeper into the earth than our cities rise. We have dammed over a third of the world’s river flow capacity. And our cities don’t flood because we can divert rivers through underground caverns with pumps that could drain a swimming pool in a second.

The pace of change is increasing too. In 1936, Hoover Dam was the tallest dam in the world. Today it doesn’t even make the top 25. The South-to-North aqueduct under construction in China, designed to relieve water shortages in the north, will be one of the longest rivers in the world, longer than the width of the continental US. China is also leading in highway construction: in the last 20 years they’ve built more highways than exist in the US.

Another fascinating feat: a boat designed to transport the untransportable. Campbell visits the Blue Marlin which is preparing to transport an oil rig across the Pacific Ocean. Because the oil rig cannot be lifted, the Blue Marlin must sink 10 meters underwater to scoop it up.

Overall the documentary is very well produced, with slick animations woven with satellite images and some very impressive views. Campbell keeps it interesting too, undertaking challenges at each stop, like downhill bike racing, cleaning windows on the world’s tallest building, and detonating explosives at a mine. It’s since been removed from Netflix, but you can still see parts of it on YouTube.

Episode 1 (can’t find it), Episode 2, Episode 3

Measure What Matters To Customers

In Measure What Matters To Customers, Ron Baker challenges several common notions held by professional service firms, including “costs drive prices” and “productivity can be measured by timesheets”. Too many firms, Baker says, are focused on optimizing production and lowering costs to the detriment of effectively serving their customers.

To be successful in today’s information economy, executives must shift their focus to ensuring the success of customers. In this new model, executives must increase the firm’s intellectual capital, pricing, and effectiveness. Baker advocates developing firm-wide Key Predictive Indicators: forward-looking predictors of customer success, not backward-looking performance measures. If you are helping your customers be successful, it’s likely you will be as well. KPIs should be generated by hypothesis and periodically tested. If a KPI isn’t actually predicting your firm’s outcomes, go back to the whiteboard.

Baker presents Gordon Bethune’s transformation of Continental Airlines as an example of the new business model. In the ’90s, Continental was a budget airline so cheap nobody wanted to fly on it. It ranked last in every airline performance measure. Every effort had been made to reduce the cost per seat mile traveled. Bethune shifted the focus to customer metrics: on-time arrivals, luggage lost, and complaints received. The airline quickly won more customer satisfaction awards than any other airline in the world, and the stock price increased 25x.

Baker also discusses the rise of the intellectual worker. He regards the timesheet as a remnant of Taylorism. Knowledge workers are not like workers of the industrial revolution. They are paid for their ideas, not hours worked. Setting billable hour quotas is demoralizing. Knowledge workers should be, at least in part, compensated for the results they produce in the form of bonuses or stock options.

Without timesheets, how should services be billed? Simple: set the price of the service relative to its value to the customer. With a price set upfront, the firm can tailor the cost of its services appropriately. Decoupling cost from hours worked can lead to innovation within the company. By taking on the risk of a fixed-price contract, the firm gains the ability to earn far more than a margin on labor.*

I recommend this book to every professional services manager. Baker provides insight into where some of our widely held beliefs originated, and I’m confident that following his advice will help others find a profitable future serving their customers.

*For more on this topic, see his book Implementing Value Pricing.

Generating SQL from expression trees, Part 2

As promised, here is a much improved version of the where clause builder from my last post. This version generates parameterized queries, so it isn’t vulnerable to SQL injection. Using parameters also allowed me to simplify the logic since I don’t need to worry about stringifying the values or using “IS” instead of “=” for null checks.

I moved all the string concatenation into a separate class called WherePart. These objects are composable in a structure similar to the source expression tree. Extracting this class is my favorite part of the refactoring.
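
The full class is in the post’s code, but its shape is roughly this sketch (names and details may differ from the original):

// using System.Collections.Generic; using System.Linq;

public class WherePart
{
	public string Sql { get; set; }
	public Dictionary<int, object> Parameters { get; set; } = new Dictionary<int, object>();

	// A leaf that is pure SQL, like a column name.
	public static WherePart IsSql(string sql)
	{
		return new WherePart { Sql = sql };
	}

	// A leaf that becomes a numbered parameter instead of an inlined value.
	public static WherePart IsParameter(int count, object value)
	{
		return new WherePart
		{
			Sql = $"@{count}",
			Parameters = new Dictionary<int, object> { { count, value } }
		};
	}

	// Combine two parts with an operator, merging their parameters.
	public static WherePart Concat(WherePart left, string @operator, WherePart right)
	{
		return new WherePart
		{
			Sql = $"({left.Sql} {@operator} {right.Sql})",
			Parameters = left.Parameters.Concat(right.Parameters)
				.ToDictionary(kv => kv.Key, kv => kv.Value)
		};
	}
}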

I’m still not happy with how I’m handling the LIKE queries. I have to pass a prefix and postfix parameter down to the next level of recursion which clutters up the method signature. It might be better to just build the string in place.


Generating SQL from expression trees

Is this further down the rabbit hole than IL generation? I’m not sure, but I went there.

(Note: This was an experiment. It doesn’t generate safe SQL. I’ll follow up with a better version.)

An expression tree is an abstract representation of code as data. In .NET they are primarily used for LINQ-style code. In C#, lambda expressions can be decomposed into expression trees. Here is an example of a lambda and its expression tree:

x => x.PosId == 1 && x.Name == "Main"

(Figure: the expression tree for the lambda above)

                      && (AndAlso)
                     /            \
             == (Equal)          == (Equal)
             /        \          /        \
        x.PosId        1      x.Name      "Main"
       (Member) (Constant)   (Member)  (Constant)

Here x is the ParameterExpression, x.PosId and x.Name are MemberExpressions, and 1 and "Main" are ConstantExpressions.

There are five key types of expression tree nodes.

UnaryExpression: An operation with a single operand such as negation.

BinaryExpression: An operation with two operands, such as addition, &&, or ||.

MemberExpression: Accessing a property, field, or method of an object or a variable. (Variable references in lambda expressions are implemented as fields on a class generated by the compiler.)

ConstantExpression: A node that is a constant value.

ParameterExpression: An input to a lambda function.

The following code recursively walks an expression tree and generates the equivalent where clause in SQL, for sufficiently simple expressions. One of the areas that was a bit tricky is SQL’s handling of NULL: I have to check the right side of a binary expression for NULL so I can generate “x IS NULL” instead of “x = NULL”. I used parentheses liberally to ease composing the expressions. Handling negation was done naively; it could be cleaned up by propagating the negation into the child node.
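
The code itself follows in the full post; a heavily simplified sketch of the recursive walk might look like this (the real version handles more node types and negation, and as noted above it inlines values, which is unsafe):

// using System; using System.Linq.Expressions;

private static string Visit(Expression node)
{
	switch (node)
	{
		case BinaryExpression b when b.NodeType == ExpressionType.AndAlso:
			return $"({Visit(b.Left)} AND {Visit(b.Right)})";
		case BinaryExpression b when b.NodeType == ExpressionType.OrElse:
			return $"({Visit(b.Left)} OR {Visit(b.Right)})";
		case BinaryExpression b when b.NodeType == ExpressionType.Equal:
			// SQL needs "x IS NULL" rather than "x = NULL".
			var right = Visit(b.Right);
			return right == "NULL"
				? $"({Visit(b.Left)} IS NULL)"
				: $"({Visit(b.Left)} = {right})";
		case MemberExpression m when m.Expression is ParameterExpression:
			return $"[{m.Member.Name}]"; // property access on the lambda parameter
		case ConstantExpression c:
			return c.Value == null ? "NULL"
				: c.Value is string s ? $"'{s}'"
				: c.Value.ToString();
		default:
			throw new NotSupportedException(node.NodeType.ToString());
	}
}

Calling Visit on the Body of the example lambda above yields (([PosId] = 1) AND ([Name] = 'Main')).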
