Recently I was optimizing some methods that used Entity Framework to fetch entities from the database. I put a Stopwatch in place and started measuring the performance gain. Saying that the query was not optimal would be an understatement. It was terrible.
What drew my attention the most was that after the query finished, there was a foreach loop that updated part of the viewmodel. To my surprise, removing the nasty foreach, which asked the database for the result of a simple Count on every iteration, reduced the execution time more than threefold. To get rid of those additional queries, I had to denormalize the data model a little.
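The pattern looked roughly like this (a sketch with hypothetical entity and property names, not the original code): a loop that fires one extra Count query per row, and the fix that projects the count inside the original query instead.

```csharp
using System.Linq;

// BEFORE: one query for the invoices, then one extra COUNT query
// per viewmodel fired inside the loop (the classic N+1 problem).
foreach (var vm in viewModels)
{
    // Each Count() here is a separate round trip to the database.
    vm.ItemCount = context.InvoiceItems.Count(i => i.InvoiceId == vm.Id);
}

// AFTER: project the count as part of the same query,
// so everything comes back in a single round trip.
var viewModels = context.Invoices
    .Select(inv => new InvoiceViewModel
    {
        Id = inv.Id,
        ItemCount = inv.Items.Count()
    })
    .ToList();
```

EF translates the projected `Count()` into a subquery inside the single SQL statement, which is what makes the difference measured with the Stopwatch.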
Normalization vs Denormalization
I have the impression that denormalization doesn't get enough attention at universities. The aim of normalization is to reduce data redundancy and improve data integrity. One gets the impression that normalization is the standard and should be applied anytime, anywhere. It turns out that a bit of deliberate denormalization and data redundancy can significantly increase the efficiency of our application.
The simplest example is storing the sum of the invoice items. The cost of an additional column is small compared to the cost of recalculating this value on every read. Of course, you have to remember to update the stored value whenever the items change. And when a conflict occurs, the recalculated sum has the highest priority: the query over the items is the source of truth, and the column is corrected to match it.
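A minimal sketch of this idea, with hypothetical class and member names: the denormalized Total column is kept in sync at write time, and a reconciliation helper restores it from the items when the two disagree.

```csharp
using System.Collections.Generic;
using System.Linq;

public class InvoiceItem
{
    public int Id { get; set; }
    public decimal Amount { get; set; }
}

public class Invoice
{
    public int Id { get; set; }
    public decimal Total { get; set; }                  // denormalized sum of Items
    public List<InvoiceItem> Items { get; set; } = new();

    // Keep the cached Total in sync in the same unit of work
    // that modifies the line items.
    public void AddItem(InvoiceItem item)
    {
        Items.Add(item);
        Total += item.Amount;
    }

    // On conflict, the sum over the items wins:
    // recalculate and overwrite the cached column.
    public void Reconcile()
    {
        Total = Items.Sum(i => i.Amount);
    }
}
```

Reads now cost a single column fetch instead of an aggregate over the items table, at the price of a small, well-contained update rule on every write.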