Update Local Development Environment Certificates

One day my local development environment stopped working. I had set it up a year ago for a specific project, and it uses certificates for client-server communication. The cause was clear: both the ADFS token-signing certificate and the self-signed SSL certificate had expired.

A small problem! But the real problem is that I could not remember what should be done, step by step. A year is long enough to forget an occasional task. Not anymore! This time I am documenting it here for … the next years.

Generate a self-signed certificate

Microsoft has a very detailed page on it. In my case, I just need this piece of code:

New-SelfSignedCertificate -DnsName "tad.local" -CertStoreLocation "cert:\LocalMachine\My"

Once the certificate is generated, go to the personal certificate store, then:

  1. Export the certificate to file.
  2. Import it into the trusted authority store.
  3. Delete the old certificate.
  4. Update the SSL certificate in IIS.
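The first three steps can also be done from PowerShell. This is a sketch, not the exact commands from back then; the file path and the thumbprint placeholder are examples, and the IIS binding is still updated by hand:

```powershell
# Pick the newest certificate for the dev host name
$cert = Get-ChildItem Cert:\LocalMachine\My -DnsName "tad.local" |
        Sort-Object NotAfter -Descending | Select-Object -First 1

# 1. Export the certificate to a file
Export-Certificate -Cert $cert -FilePath "C:\temp\tad.local.cer"

# 2. Import it into the trusted root authority store
Import-Certificate -FilePath "C:\temp\tad.local.cer" -CertStoreLocation Cert:\LocalMachine\Root

# 3. Delete the old, expired certificate (replace with its real thumbprint)
Remove-Item Cert:\LocalMachine\My\<old-thumbprint>

# 4. In IIS Manager: site > Bindings... > https > pick the new certificate
```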

ADFS token-signing and token-decrypting certificates

Open PowerShell on the ADFS server and run these commands:

Run each command in PowerShell
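The original screenshot of the commands is not reproduced here. Assuming the default self-signed ADFS certificates, the renewal likely looked like this:

```powershell
# Renew the self-signed token-signing and token-decrypting certificates.
# -Urgent promotes the new certificate to primary immediately.
Update-AdfsCertificate -CertificateType Token-Signing -Urgent
Update-AdfsCertificate -CertificateType Token-Decrypting -Urgent

# Verify the new certificates and note the thumbprints for the app configs
Get-AdfsCertificate
```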

The rest is to update the application configs to reflect the changes.

Hidden Cost of an Architecture

On a normal day in a developer's life, I was hunting a performance issue and a memory leak. It sounds mysterious but, after all, it is just another bug, another issue to solve.

When it comes to performance and memory issues, PerfView is the tool to reach for. It gives a very detailed picture of what is going on in memory, at a level of abstraction a developer can understand.

The system is a WCF service built around DataContract. From the profiler, I found out that if a returned value is 10MB in size, it costs the OS about 50MB, several times the original size in extra cost. That does not count the memory the WCF framework consumes to serialize the contract.

Note that I do not judge the architecture as good or bad. There were good reasons why it was designed that way.

A very simplified version looks like this:
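The original listing is not included here. Based on the text, a minimal sketch of such a service might look like the following; the names IFileService, Response, and the body of Download are my assumptions, while BinaryDataContractSerializer is the helper the post mentions:

```csharp
using System.IO;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class Response
{
    // Every operation returns its payload pre-serialized as a byte array
    [DataMember]
    public byte[] Data { get; set; }
}

[ServiceContract]
public interface IFileService
{
    [OperationContract]
    Response Download(string fileName);
}

public class FileService : IFileService
{
    public Response Download(string fileName)
    {
        byte[] file = File.ReadAllBytes(fileName);
        // The payload is serialized into yet another byte array before returning
        return new Response { Data = BinaryDataContractSerializer.Serialize(file) };
    }
}
```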

With that service in place, here is a simple console app that consumes it:
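The client listing is also missing; a self-contained sketch might look like this, with the contract re-declared on the client side and the endpoint address being an assumption:

```csharp
using System;
using System.Runtime.Serialization;
using System.ServiceModel;

// Client-side copy of the assumed service contract
[DataContract]
public class Response
{
    [DataMember]
    public byte[] Data { get; set; }
}

[ServiceContract]
public interface IFileService
{
    [OperationContract]
    Response Download(string fileName);
}

class Program
{
    static void Main()
    {
        var factory = new ChannelFactory<IFileService>(
            new BasicHttpBinding { MaxReceivedMessageSize = int.MaxValue },
            new EndpointAddress("http://localhost:8080/FileService"));

        IFileService client = factory.CreateChannel();
        Response response = client.Download("BigFile.zip");
        Console.WriteLine($"Downloaded {response.Data.Length} bytes");
    }
}
```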

Here is the result of downloading a file of 74MB. The total memory consumed in the heap is 146MB.

There are 2,784 objects with a total of 146MB of memory

Where is all that extra cost coming from? It comes from the BinaryDataContractSerializer.Serialize method:

  1. The memory consumed by the DataContractSerializer.
  2. The memory consumed by the MemoryStream used to return an array of bytes.
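The helper's body is not shown in the post; a likely shape (the class name comes from the post, the implementation is my assumption) makes the two allocations visible:

```csharp
using System.IO;
using System.Runtime.Serialization;

public static class BinaryDataContractSerializer
{
    public static byte[] Serialize(object graph)
    {
        var serializer = new DataContractSerializer(graph.GetType());
        using (var stream = new MemoryStream())
        {
            // First cost: the serializer writes the whole graph into the stream,
            // whose internal buffer grows by doubling (each resize copies the data)
            serializer.WriteObject(stream, graph);

            // Second cost: ToArray() copies the internal buffer into a brand-new array
            return stream.ToArray();
        }
    }
}
```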

In many cases, with modern hardware, this is not a big problem. The Garbage Collector takes care of reclaiming the memory, and if both request and response are small, you do not even notice. Well, of course, until one day in production there are many requests.

There are a couple of potential issues with consuming more memory:

  1. If an object is larger than 85K (85,000 bytes), it ends up on the Large Object Heap (LOH) and is only collected with Gen 2. I would suggest you read more about memory allocation, especially the LOH; I am too much of an amateur to explain it well.
  2. It causes memory fragmentation. Memory keeps increasing, and the GC has a very hard time reclaiming it.
  3. As a result, the system is not in good shape.

How could we solve the problem without changing the design, and with the least impact?

We know that some operations consume lots of memory, such as downloading a file or returning a data set. For those, instead of returning the byte array, we extend the response to carry the object itself. We could do that for all operations and get rid of the byte array entirely. However, there are hundreds of operations, and we want to keep the contract simple with as few changes as possible.

So an improved version looks like this:

Design a LargeObject contract to reduce the serialization cost
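The improved listing is not reproduced either. Based on the caption, a sketch might look like this; the LargeObject name comes from the post, the rest is assumed:

```csharp
using System.Runtime.Serialization;

[DataContract]
public class LargeObject
{
    // The raw payload travels as-is; DataContractSerializer writes it
    // directly, skipping the intermediate serialize-to-byte-array step
    [DataMember]
    public byte[] Data { get; set; }
}

[DataContract]
public class Response
{
    // Small payloads keep the old pre-serialized path
    [DataMember]
    public byte[] Data { get; set; }

    // Memory-heavy operations (file download, data sets) fill this instead
    [DataMember]
    public LargeObject Large { get; set; }
}
```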

Run the application and check the memory again:

There are 366 objects with a total of 73MB of memory

Comparing the two, there is a big win: 2,784 objects vs 366 objects; 146MB vs 73MB.

With the increasing power of hardware, RAM and disk are no longer seen as problems. With the support of managed languages (such as C#/.NET), developers write code without caring much about memory and memory allocation. Not all developers, of course; however, I believe there are many who do not care much about the issue.

It is about time we cared about every single line of code we write, shall we? We do not have to learn and understand every detail. These topics are good enough to start with:

  1. Memory allocation in Heap, Gen 0, Gen 1, and Gen 2.
  2. Memory fragmentation. Just like disk fragmentation.
  3. Memory profiler at abstract level, such as using dotMemory, PerfView.
  4. The Garbage Collector. Just getting a feel for it is a good start.

I am sure you will be surprised by how much fun it is, and how far it takes you.

Design Tips: Allow Consumers To Handle Exceptions

I ran into a piece of code that implemented a kind of retry pattern: it retries a call if the first call throws an exception. A simple version looks like this (not production or real code):
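The original listing is missing; a sketch of that kind of helper, using the ServiceCallHelper name from the post with an assumed body, would be:

```csharp
using System;

public static class ServiceCallHelper
{
    public static T Invoke<T>(Func<T> serviceCall)
    {
        try
        {
            return serviceCall();
        }
        catch (TimeoutException)
        {
            // The first failure is swallowed here; the consumer never sees it
            return serviceCall();
        }
    }
}
```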

One problem with the above code is that when the first timeout exception occurs, the consumer cannot see it anywhere. At the very least, ServiceCallHelper should give consumers a chance to deal with exceptions.

How do we do that with the least impact on the design? We should strive for a solution that does not introduce a new dependency. I have seen a tendency to inject a logger. It might, and will, work. However, now you have a dependency on the logger, and the consumer has to know about it.

Now take a look at the new version that I propose:
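The proposed listing is not shown in this copy. Given the closing remark about Func and Action, it plausibly passes an optional callback so the consumer can observe the swallowed exception; the exact signature is my assumption:

```csharp
using System;

public static class ServiceCallHelper
{
    public static T Invoke<T>(Func<T> serviceCall, Action<Exception> onError = null)
    {
        try
        {
            return serviceCall();
        }
        catch (TimeoutException ex)
        {
            // Extension point: let the consumer see the first failure
            onError?.Invoke(ex);
            return serviceCall();
        }
    }
}
```

A consumer can then log, measure, or rethrow as it sees fit, for example: `ServiceCallHelper.Invoke(() => client.Download("file.zip"), ex => Console.WriteLine(ex.Message))` (the client call here is hypothetical).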

With that simple change, we have introduced an extension point that consumers will be happy with, and we ensure that our API does not hide exceptions.

If you are designing an API, consider your extension points, and whether the API gives consumers a chance to be aware of exceptions.

The .NET Framework comes with the powerful built-in delegates Func and Action. Taking advantage of the two might help you simplify your design.