Category: Sql Server

Fast & Simple Data Access using Dapper

Dapper is a micro-ORM: it gives you strongly typed results by providing an easy way to work with your database. It is simple to use, it does simple stuff, and essentially it just extends the IDbConnection interface. So if you are looking for a way to query your data really fast (yes, this is about as fast as it gets, barring raw ADO.NET), you should check it out. It is developed by some smart folks over at Stack Overflow! (Another good reason to try it out!)

Installing Dapper

Dapper is simple to install; you can just use the NuGet Package Manager to add it to your project. From Visual Studio, click Tools > Library Package Manager > Package Manager Console.
Type in Install-Package Dapper and hit Enter.

To demonstrate Dapper, we will use the standard Microsoft sample database Northwind.

In the Northwind database, the Customers table represents customers and the Orders table represents the orders placed by those customers.

Here are some POCOs we will use:

    internal class Customer
    {
        public string CustomerId { get; set; }
        public string CompanyName { get; set; }
        public string ContactName { get; set; }
        public IEnumerable<Order> Orders { get; set; }
    }

    internal class Order
    {
        public int OrderId { get; set; }
        public DateTime OrderDate { get; set; }
    }

A useful helper class that gets us a valid, open SqlConnection object:

    internal static class SqlHelper
    {
        public static SqlConnection GetOpenConnection()
        {
            var connectionString = ConfigurationManager.ConnectionStrings["NorthWind"].ConnectionString;
            var connection = new SqlConnection(connectionString);
            connection.Open();
            return connection;
        }
    }

And a data access class that actually does some work!

    internal class CustomerDao
    {
        public IEnumerable<Customer> GetAll()
        {
            const string sql = "select * from Customers";
            using (var connection = SqlHelper.GetOpenConnection())
            {
                return connection.Query<Customer>(sql);
            }
        }

        public IEnumerable<Customer> GetById(string customerId)
        {
            const string sql = "select * from Customers where CustomerId = @Id";
            using (var connection = SqlHelper.GetOpenConnection())
            {
                return connection.Query<Customer>(sql, new { Id = customerId });
            }
        }

        public IEnumerable<Order> GetOrders(string customerId)
        {
            const string sql = @"select o.OrderId, o.OrderDate
                                 from Customers c inner join Orders o
                                 on c.CustomerId = o.CustomerId and c.CustomerId = @Id";

            using (var connection = SqlHelper.GetOpenConnection())
            {
                return connection.Query<Order>(sql, new { Id = customerId });
            }
        }
    }


The class above illustrates a straight select and a simple join. You can use the same strategy to run your insert and update queries. Dapper also supports multi-mapping, where you can auto-map your aggregates, and it handles multiple result sets as well. For more, visit the Dapper page on Google Code.
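As a rough sketch of those last two points (this is not from the Dapper docs; the class and method names are hypothetical, and it assumes the SqlHelper class and the Customer/Order POCOs above), an insert via Execute and a multi-mapped join might look like this:

```csharp
    // Sketch only: assumes the Dapper package plus the SqlHelper and
    // Customer/Order classes defined earlier in this post.
    internal class CustomerCommands
    {
        // Inserts and updates use Execute instead of Query;
        // parameters map to the object's property names.
        public int Add(Customer customer)
        {
            const string sql = @"insert into Customers (CustomerId, CompanyName, ContactName)
                                 values (@CustomerId, @CompanyName, @ContactName)";
            using (var connection = SqlHelper.GetOpenConnection())
            {
                return connection.Execute(sql, customer); // rows affected
            }
        }

        // Multi-mapping: each joined row is split into a Customer and an
        // Order at the OrderId column, then grouped back into aggregates.
        public IEnumerable<Customer> GetAllWithOrders()
        {
            const string sql = @"select c.CustomerId, c.CompanyName, c.ContactName,
                                        o.OrderId, o.OrderDate
                                 from Customers c inner join Orders o
                                 on c.CustomerId = o.CustomerId";
            using (var connection = SqlHelper.GetOpenConnection())
            {
                var lookup = new Dictionary<string, Customer>();
                connection.Query<Customer, Order, Customer>(sql, (c, o) =>
                {
                    Customer customer;
                    if (!lookup.TryGetValue(c.CustomerId, out customer))
                    {
                        customer = c;
                        customer.Orders = new List<Order>();
                        lookup.Add(c.CustomerId, customer);
                    }
                    ((List<Order>)customer.Orders).Add(o);
                    return customer;
                }, splitOn: "OrderId");
                return lookup.Values;
            }
        }
    }
```

Both methods need a live Northwind database to run, so treat this as a shape to follow rather than copy-paste code.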

Here are some tests that exercise the code above:

    [TestFixture]
    public class DapperTest
    {
        private CustomerDao _customerDao;

        [TestFixtureSetUp]
        public void TestFixtureSetUp()
        {
            _customerDao = new CustomerDao();
        }

        [Test]
        public void CreateConnection_ReturnsNotNull()
        {
            var connection = SqlHelper.GetOpenConnection();
            Assert.That(connection, Is.Not.Null);
        }

        [Test]
        public void GetAllCustomers_Returns_Records()
        {
            var allCustomers = _customerDao.GetAll();
            Assert.That(allCustomers.Count(), Is.Not.EqualTo(0));
        }

        [Test]
        public void GetCustomers_With_Where_Clause()
        {
            const string testId = "BERGS";
            var customers = _customerDao.GetById(testId);
            Assert.That(customers.Count(), Is.Not.EqualTo(0));
        }

        [Test]
        public void GetOrders_For_Customer()
        {
            const string testId = "BERGS";
            var orders = _customerDao.GetOrders(testId);
            Assert.That(orders.Count(), Is.Not.EqualTo(0));
        }
    }


Massive and PetaPoco are two other micro-ORMs that compete with Dapper.

(I haven't tried Massive, but I have played with PetaPoco. I find the auto-mapping features in PetaPoco better than Dapper's.)

I hope you enjoyed this post! Happy Programming!

My gripe about SQL Server 2008 installer…

You need to keep your fingers crossed when you install any Microsoft product; the installer behaviour changes with every version of every product I have used!!! And this software is huge in size too… I am talking about developer products here, like Visual Studio, SQL Server, etc. I was trying to install SQL Server 2008 Developer edition on one of my machines today and was hit with the infamous “VSShellInstalledRule”. OK, I agree it's a rule and a dependency, but my point is that I had to click many steps deep into the installer wizard before I got hit with this wrench. Urghhh!!!

The error is complaining that I need Visual Studio 2008 SP1 installed. I have Visual Studio 2010 on my machine (and also VS 2008), so I assumed the installer would be smart enough to acknowledge the presence of VS 2010 and live with it. Am I asking for too much here?

Fifteen seconds with Google pointed me to the comments section of a blog post with a tip on how to skip these rules when setup is run (thanks, Paul).

Basically you do this:

Setup /ACTION=install /SkipRules=VSShellInstalledRule

Until your next installation…peace 🙂

[update] I thought this would fix my issue. It did let me go ahead with the installation, but it ultimately failed to install the components that require Visual Studio integration (like Integration Services). So now my challenge is to make this stupid installer realize that I have a newer version of Visual Studio (2010), or to install Visual Studio 2008 SP1 (which takes forever).

sad, I am 😦

[update] So I installed VS 2008 SP1. I had ended up with a messed-up installation from the previous attempts, so I removed it completely, removed my SQL Server 2005 Management Studio Express (that was another error in the list), and tried again. I still got an error regarding Reporting Services. This is because the earlier attempt had already installed the RS catalog db, so setup wanted me to install using a “files only mode”. Google to the rescue again…

setup /q /ACTION=install /RSINSTALLMODE="FilesOnlyMode"

BTW, I also found an MSDN link that explains command-line installation of SQL Server 2008 => link.

I rest this case now.

A broad look at Web Application Performance

So, does your web application perform well? Is it responsive enough to improve the user's experience? The fact is that users can only relate to what they see and what they click, so the user interface (UI) and the experience with it (UX) depend largely on the performance of the overall application. And what are the complaints we usually hear? “This page is slow”, “this grid takes forever to load”, “we click on that button and then we go for a coffee break”… and the UI is always blamed first. We all know perception plays a big part here, but the services guy ignores it and the database developer closes his eyes too, until someone proves where the actual bottleneck is.

A close friend and ex-colleague called me the other morning asking for my help to figure out why his application was timing out; users were screaming and cursing one particular search page, which had suddenly started timing out the previous week. They had tried all the usual tricks, iisreset, checking logs, checking db connections, even a few server restarts, to no avail. So that morning we started looking through the logs and stack traces, and we could easily spot the culprit: a stored procedure in SQL Server had gone rogue and slow. In this case we need to look closely at that proc; probably the data has grown so much in one of the tables that a bad join is now causing performance issues.

I shared that experience to point to the fact that performance, and thus performance tuning, has to start at the very tail end (or the head, depending on how you look at the layered architecture :). So now let's take the layers: there is the presentation layer, the business layer, the data access layer (persistence layer), and then there is the database itself.

Let's start with the database. What performance optimizations can you think of? Here are some starters:

  • Good logical design helps in how the db is physically stored. Think through your normalization levels depending on your usage, and make sure you don't over-normalize; that will kill your joins and lead to poorly performing queries.
  • Good key selection: have a good candidate for the primary key, and have proper foreign keys that are indexed (create a non-clustered index).
  • When writing queries, do not return all columns (i.e., no select * …)
  • Use SET NOCOUNT ON, and think about providing SQL hints.
  • There are two camps on temp tables: some hate them, some love them. In my opinion, a temp table can be very useful when you are dealing with a large number of tables to join: create temp tables to divide and conquer. Always think about the number of rows in the table and the potential growth of the data when you make these decisions. Check query plans, and avoid table scans.
  • Use sp_executesql when you want to execute dynamic SQL.
  • Reduce round trips to the server, try to do bulk operations.
  • These are just a few I can think of now; Google database tuning and I am sure you will find tons of good tips from database gurus.
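To make the sp_executesql tip concrete, here is a minimal sketch (plain ADO.NET; the method name and the Orders query are hypothetical) of running dynamic SQL through sp_executesql so the value travels as a typed parameter, which lets SQL Server cache the plan and avoids string concatenation of values:

```csharp
    using System;
    using System.Data;
    using System.Data.SqlClient;

    internal static class DynamicSqlSample
    {
        // @stmt is the dynamic statement, @params declares its parameters,
        // and @since supplies the value; sp_executesql runs the whole thing.
        public static int CountOrdersSince(SqlConnection connection, DateTime since)
        {
            using (var cmd = new SqlCommand("sp_executesql", connection))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@stmt",
                    "select count(*) from Orders where OrderDate >= @since");
                cmd.Parameters.AddWithValue("@params", "@since datetime");
                cmd.Parameters.AddWithValue("@since", since);
                return (int)cmd.ExecuteScalar();
            }
        }
    }
```

The same pattern works for any statement you would otherwise build by gluing values into the SQL string.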

Now how about the Data Access Layer (DAL)? So what is a DAL? A DAL is nothing but a mechanism that provides some CRUD capability along with transactions, querying, and concurrency control. In simpler systems, ADO.Net provides a database-independent way to achieve this, so ADO.Net is our DAL. Or you might choose to abstract the core features provided by ADO.Net into another library where you have tighter control and the ability to switch databases at will, a DB services factory. If you are following Domain-Driven Design, then your OR/M plays the role of the DAL.

Let's see some ADO.Net-related performance tips:

  • For connection pooling to work, always use the same connection string. By default the pool size is 100; you can increase this (you will rarely need to).
  • Use the db server's IP rather than the server name in the connection string to avoid a DNS lookup. (This is a performance tip, not a governance tip; for better governance, always use the DNS name.)
  • Always close the connections explicitly
  • Connect using a service account. Use windows authentication.
  • Use stored procedures, return only what you need.
  • Use some caching strategy to cache frequently used static data.
  • Again, this is the tip of the iceberg; for more complete ADO.Net performance tuning tips, refer to “Improving .NET Application Performance and Scalability” from Microsoft patterns & practices.
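A minimal sketch pulling a few of those tips together (the connection string, proc name, and parameter are hypothetical): one shared pooled connection string, Windows authentication, a stored procedure that returns only what is needed, and deterministic close via using:

```csharp
    using System.Data;
    using System.Data.SqlClient;

    internal static class AdoNetTipsSample
    {
        // The same string on every call keeps the connection pool effective;
        // Integrated Security uses Windows authentication (a service account).
        private const string ConnectionString =
            "Data Source=10.0.0.5;Initial Catalog=NorthWind;Integrated Security=SSPI;";

        public static string GetCompanyName(string customerId)
        {
            // using guarantees Close/Dispose, returning the connection to the pool.
            using (var connection = new SqlConnection(ConnectionString))
            using (var cmd = new SqlCommand("GetCompanyName", connection))
            {
                cmd.CommandType = CommandType.StoredProcedure; // proc, not ad hoc SQL
                cmd.Parameters.AddWithValue("@CustomerId", customerId);
                connection.Open(); // open as late as possible
                return (string)cmd.ExecuteScalar(); // return only what we need
            }
        }
    }
```

Add a caching layer in front of a method like this for data that rarely changes, per the caching tip above.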

We will continue exploring performance from the OR/M layer perspective, then the business layer, and finally the UI layer in a future post.

    Until then Cheers and Happy Programming.

    StackOverflow featured as Microsoft Case Study

    I bumped into this Microsoft Case Study showcasing Stack Overflow site and its ASP.Net MVC based architecture. [link]. The point of interest for me was the infrastructure specs for that site…

  • Presentation. Customers browse pages served by Windows Server 2008 Standard, through IIS 7.5. The Web servers are computers with Xeon quad core CPUs and 8 gigabytes (GB) RAM. When a user submits a question or an answer, their information is written to the database using the lightweight LINQ to SQL database abstraction layer.
  • Database. The database is hosted on a dedicated server with 2 Intel Xeon quad core CPUs and 48 GB of RAM, and runs Microsoft SQL Server 2008 Enterprise. Stack Overflow takes advantage of the Online Indexing feature, introduced with SQL Server 2008 Enterprise, which enables table re-indexing while the database continues to run.

    Look at the number of horses that run that SQL Server backend… 48GB of RAM !!!!!!

    How to share transaction between classic ADO and ADO.Net

    Level: Advanced

If you want to share a transaction between your ADO connection and your ADO.Net connection, you can use bound connections. Now you might ask why you would ever want to do that. I was faced with a scenario where I was calling my .Net modules from existing VB code using COM interop. (I am planning to discuss COM interop sometime soon!) In my case, VB/ADO was starting a transaction and then calling a .Net module to do something that needed the data that was part of the VB/ADO transaction. The only way for me to get to that data was by enlisting my ADO.Net calls in the same transaction.
Now to the point; there are two steps:
Step 1: Get a unique identifier for the ADO transaction using sp_getbindtoken.
Step 2: Enlist the ADO.Net session in that transaction using the identifier and sp_bindsession.
Note: You will need to pass the transaction identifier to your .Net code.

Example: VB code:

    Set cmd = New ADODB.Command
    sSPName = "sp_getbindtoken"
    cmd.CommandText = sSPName
    cmd.ActiveConnection = oDBConnection
    cmd.CommandType = adCmdStoredProc
    cmd.Parameters.Append cmd.CreateParameter("@out_token", adVarChar, adParamOutput, 255)
    cmd.Execute
    transactionID = cmd.Parameters(0).Value

Pass this transactionID to the .Net code.

On the .Net side (C#):

    string sql = "EXEC sp_bindsession '" + sToken + "'";
    SqlCommand cmd = new SqlCommand(sql, sqlConnection);
    cmd.CommandType = CommandType.Text;
    try
    {
        cmd.ExecuteNonQuery();
    }
    finally
    {
        cmd.Connection = null;
    }

When you are executing sp_bindsession, SQL Server expects the T-SQL statement to be executed as plain text through the provider. If you use any other format to execute it, it will not work!!!

Once you have completed these two steps, your ADO.Net session is enlisted in the parent ADO transaction, and you will be able to see the data that was changed/inserted by ADO from within the ADO.Net process.

If you are using SqlHelper (from the Microsoft Data Access Application Block), you will have to modify the SqlHelper class to take this transaction identifier as one of the parameters to the SqlHelper methods and call sp_bindsession wherever you use a transaction.
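As a rough sketch of that modification (the class and method names here are hypothetical; the real SqlHelper has many overloads you would thread the token through), a bind-token-aware execute might look like this:

```csharp
    using System.Data;
    using System.Data.SqlClient;

    internal static class BoundSqlHelper
    {
        // Binds this connection's session to the ADO transaction identified
        // by bindToken (obtained in VB via sp_getbindtoken), then runs the
        // command inside that transaction.
        public static int ExecuteNonQuery(SqlConnection connection,
                                          string bindToken,
                                          string commandText)
        {
            using (var bind = new SqlCommand(
                "EXEC sp_bindsession '" + bindToken + "'", connection))
            {
                bind.CommandType = CommandType.Text; // must be plain text
                bind.ExecuteNonQuery();
            }

            using (var cmd = new SqlCommand(commandText, connection))
            {
                // This now sees (and participates in) the parent ADO transaction.
                return cmd.ExecuteNonQuery();
            }
        }
    }
```

The token comes from the VB side, so this only makes sense when called over COM interop as described above.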