Preparing for Dapper (Part 1)

When I got myself into writing a new library for data access layers, NOT choosing Microsoft's Entity Framework was an easy decision. Likewise, choosing the micro-ORM Dapper was an easy one.

I actually had a good one that I created about 10 years ago. It was inspired by Microsoft's Enterprise Library and was used alongside another set of libraries inspired by CSLA.Net. They're good and have served their purpose well.

However, I figured it was time to escape the old and start on something new. I looked at ORMs and how they could work for me. I also looked back at my experience with the old libraries to decide what I could keep and what I could improve on.

One of the things the old libraries promoted was the use of stored procedures. The team that used them is made up of developers with solid SQL skills, something they can be proud of. Their applications are mostly designed to consume and maintain data from a single database, and the team is comfortable using a lot of SQL in their applications. Reviewing their experience over the past 10 years, the team noticed that it was easier to support applications that almost exclusively used stored procedures to define business logic and rules. Everything about the business requirements could be found in one place (note that even those innocent-looking JOIN and WHERE clauses were actually business rules). In addition, those applications were easier to maintain and to deploy to test environments and to production. So, in violation of DDD, the stored procedures are here to stay. More than that, the team decided they would use stored procedures, as much as possible, as the place to store all, yes, ALL, business logic and rules.

The decisions to use SQL and stored procedures are hands down the most controversial ones. Most developers shun SQL and stored procedures for the glory of DDD and TDD compliance. In some arguments the concern really comes down to questioning the developers' SQL skills, which is understandable. This team, however, has no problem with SQL. So the real question is how they can ensure that all (or at least most) of the business rules live in just one place.

With the old libraries, jumping between C# code and stored procedures to understand a complex business rule is a terrible waste of time, especially when solving time-critical issues with our applications. Even applications that tried to avoid stored procedures were painful to support. Jumping from one piece of C# code to another to find the queries, and then somewhere else to see how they are used, is a headache. And when different developers have thrown in abstractions of their own here and there, differing styles and techniques have mixed over the years, and cowboy quick fixes have patched time-crunched critical production issues, you suddenly have more learning to do than you thought the backlog assigned to you required.

Was it just bad practice? Was it bad design? Was the team doing it all wrong? Not really. These applications started off with good intentions. The deviations were simply scars from the harshness of the real world.

However, note that the team also has applications that are darlings to support. By simply searching the database for keywords, which lists every matching object, including stored procedures, they could put the whole picture together using only one tool. They could see everything they needed to see in one place, and they didn't even have to open any C# code! Because SQL is just SQL, with no one developer's custom abstractions layered on top, everyone, even new members of the team, could easily decipher what the original business requirements were. That is what the team wants, and that's what the team wants to keep.

But good ORMs can "write" the SQL for you, right? Well, yes and no. SQL is one of the team's strengths, and I would not throw those skills away just because an ORM can translate LINQ into SQL. Besides, I don't think an ORM's "SQL skills" could compete with the performance-optimized SQL that the team can come up with. Instead of being at the mercy of the ORM's own SQL limitations, I would rather trust the team to produce better-performing, higher-quality SQL.

Of course, it can be argued that ORMs support stored procedures and in-line SQL, but that's where the problem can start. When you start off with LINQ and then switch to SQL in some parts, you are already scattering the business logic and rules everywhere. When the time comes to support the application, you need to be ready to open all of your tools just to put the picture back together again. Can you imagine an if (...) construct in C#, complemented by JOIN, WHERE and GROUP BY clauses in SQL, whose result gets looped over and filtered some more by a LINQ statement, then passed to a method with its own mix of LINQ and SQL, all to define one piece of a complex business rule? One word: Eeewww!!!
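To make that mess concrete, here is a contrived sketch of the kind of scattering I mean. It assumes Dapper and a SQL Server connection, and every name in it (Order, OrderService, GetPriorityOrders, the tables) is made up purely for illustration; it is not from any real application of the team's.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Linq;
using Dapper;
using Microsoft.Data.SqlClient;

public class Order
{
    public int Id { get; set; }
    public string Region { get; set; } = "";
    public decimal Total { get; set; }
}

public class OrderService
{
    private readonly string _connectionString;
    public OrderService(string connectionString) => _connectionString = connectionString;

    public IEnumerable<Order> GetPriorityOrders(bool includeWholesale)
    {
        // Rule fragment #1: hidden in a C# conditional.
        var minimumTotal = includeWholesale ? 1000m : 250m;

        using IDbConnection db = new SqlConnection(_connectionString);

        // Rule fragment #2: hidden in the JOIN and WHERE clauses of in-line SQL.
        var orders = db.Query<Order>(
            @"SELECT o.Id, o.Region, o.Total
              FROM dbo.Orders o
              JOIN dbo.Customers c ON c.Id = o.CustomerId
              WHERE c.IsActive = 1 AND o.Total >= @minimumTotal",
            new { minimumTotal });

        // Rule fragment #3: hidden in a LINQ filter layered on top of the query.
        var priority = orders.Where(o => o.Region != "TEST").ToList();

        // Rule fragment #4: handed off to yet another method with its own LINQ/SQL mix.
        return ApplyRegionalRules(priority);
    }

    private static IEnumerable<Order> ApplyRegionalRules(IEnumerable<Order> orders) =>
        orders; // ...and so on: one business rule, four hiding places.
}
```

One business rule, four hiding places, and at least two different tools just to read it end to end.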

Given that the team's new applications use only a database as their data store, the team can use stored procedures to centralize the business logic and rules as well. Application or system rules can end up everywhere; but as much as possible, and as much as the team can manage, the business rules will be in one place.

So where does Dapper come into the picture? Well, because everything will go through stored procedures, Microsoft's Entity Framework is... well, too heavy. The team needs something lighter and yet just as capable. Also, knowing EF's performance issues, the team wants something that is known to perform faster than EF out of the box. Micro-ORMs are the answer, yes, but it was specifically Dapper that ultimately fit the bill: it is actively supported, it is proven in live production use, and it's fast, fast, fast and, you guessed it, fast!
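To give a feel for where this is heading, here is a minimal sketch of Dapper calling a stored procedure. The procedure name (dbo.usp_GetActiveCustomers), the Customer POCO, and the parameter are all hypothetical; the point is that the C# side shrinks to mapping parameters in and rows out, while the business rules stay in the procedure.

```csharp
using System.Collections.Generic;
using System.Data;
using Dapper;
using Microsoft.Data.SqlClient;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
}

public static class CustomerRepository
{
    public static IEnumerable<Customer> GetActiveCustomers(string connectionString, int regionId)
    {
        using IDbConnection db = new SqlConnection(connectionString);

        // Dapper executes the command as a stored procedure call and maps each
        // row of the result set onto a Customer by column name.
        return db.Query<Customer>(
            "dbo.usp_GetActiveCustomers",
            new { RegionId = regionId },
            commandType: CommandType.StoredProcedure);
    }
}
```

The repository methods stay thin and boring by design; the interesting logic lives in the procedure, which is exactly what the team wants.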
