Fix database issues
Databases are one of the main sources of performance issues in web applications. As a database has to deal with huge amounts of data, even a minor algorithmic error when preparing a query can lead to serious consequences. This is especially true when using ORM systems (the examples in this chapter use Entity Framework).
DPA detects database issues related to long query execution times, large number of database connections, large number of the same database commands, and large number of records in a response. The issues are grouped under the Database tab in the Dynamic Program Analysis window.
In this chapter, you will find examples of code design that may lead to such issues and tips on how you can fix them.
DB command time
If a command execution time exceeds the specified threshold, DPA marks the code that runs the command with a DB Command Time issue. It is difficult to single out any one reason for the long execution time of a particular command. Such issues could be related to a complicated resulting query, to network connection problems, and so on. It's also quite possible that a long command time is not an issue at all: it's just the way the application works, and nothing needs to be done to fix the situation.
How to fix
As there could be numerous reasons behind the issue, the only advice that could be given is to:
Find the problematic code and open the corresponding issue in the Dynamic Program Analysis window.
In the issue details, check the SQL query.
If it's still not clear what causes the problem, try running the query directly on the database server and rule out possible causes one by one, e.g., communication issues, limited cache size, non-indexed columns, table locks, deadlocks, and others.
DB connections
If the number of simultaneously open connections to a database exceeds the threshold, DPA marks the code that opens a connection with a DB Connections issue. For example, during execution, a function opens and closes 100 connections, then 200 connections, then 150 connections. If the threshold is set to 50 connections, the resulting issue value will be '200 connections'.
Below you will find some possible reasons for connection leaks.
Connection leaks in 'try' blocks
Consider the following code:
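A minimal sketch of the problematic pattern (the connection string and query text are placeholders):

```csharp
// The connection is closed only on the success path:
var connection = new SqlConnection(connectionString);
try
{
    connection.Open();
    var command = new SqlCommand("SELECT COUNT(*) FROM Posts", connection);
    var count = command.ExecuteScalar();   // may throw SqlException
    connection.Close();                    // skipped if the command throws
}
catch (SqlException)
{
    // the exception is handled, but the connection silently stays open
}
```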
Though we close the connection after running the command, if the command throws an exception, the connection will remain open. What is worse, we will handle the exception without even knowing that a connection was left open.
How to fix
You must close the connection in any case, either in a finally block or, better, by wrapping the connection in a using statement.
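A sketch of the same code rewritten with a using statement; Dispose closes the connection on every code path, including exceptions:

```csharp
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    var command = new SqlCommand("SELECT COUNT(*) FROM Posts", connection);
    var count = command.ExecuteScalar();   // even if this throws...
}                                          // ...Dispose still closes the connection
```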
Connection leaks via SqlDataReader
If you use SqlDataReader for reading a stream of rows from a database, make sure you're using the correct CommandBehavior. Consider an example:
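A sketch of the leaking pattern (query text is a placeholder):

```csharp
var connection = new SqlConnection(connectionString);
connection.Open();
var command = new SqlCommand("SELECT * FROM Posts", connection);
using (var reader = command.ExecuteReader())   // default CommandBehavior
{
    while (reader.Read())
    {
        // process the row...
    }
}
// disposing the reader did not close the connection; it is still open here
```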
By default, an SqlDataReader instance doesn't close the connection after executing a command. If such a reader is created many times, this spawns a corresponding number of open connections.
How to fix
Make sure you use an instance of SqlDataReader that closes the connection via CommandBehavior.CloseConnection. The problem here is that an SqlDataReader with the default or another behavior may be required by some other code. In this case, you should refactor the code so that it creates SqlDataReader instances with the appropriate behavior for each use case.
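With CommandBehavior.CloseConnection, disposing the reader also closes the underlying connection. A sketch:

```csharp
var connection = new SqlConnection(connectionString);
connection.Open();
var command = new SqlCommand("SELECT * FROM Posts", connection);
using (var reader = command.ExecuteReader(CommandBehavior.CloseConnection))
{
    while (reader.Read())
    {
        // process the row...
    }
}   // disposing the reader now closes the connection as well
```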
DB commands
If the number of executions of the same command exceeds the threshold, DPA marks the code that runs the command with the DB Commands issue.
The main reason why this check exists is to prevent the well-known N+1 problem. For example, suppose you have a table of blogs, and each Blog has a number of Posts inside; Blog to Post is thus a one-to-many relationship. Suppose you want to get the list of all posts in all blogs. The straightforward way to do this with Entity Framework would be:
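A sketch of such code (entity and property names are illustrative; lazy loading of Blog.Posts is assumed):

```csharp
// 1 query to load all blogs...
var posts = new List<Post>();
foreach (var blog in dbContext.Blogs)
{
    // ...plus 1 query per blog to lazily load its posts
    posts.AddRange(blog.Posts);
}
```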
The code above will result in N+1 queries, where N is the number of blogs (one query to select all blogs, plus one query per blog to select its posts).
How to fix
You should try getting all the required data in a single request to the database. For example:
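A sketch using eager loading (names follow the example above; Include requires the Entity Framework namespace):

```csharp
// A single JOIN query loads all blogs together with their posts:
var blogs = dbContext.Blogs
    .Include(blog => blog.Posts)
    .ToList();

// No additional queries: the posts are already in memory
var posts = blogs.SelectMany(blog => blog.Posts).ToList();
```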
DB records
If a database command returns a number of records that exceeds the threshold, DPA marks the code that runs the command with the DB Records issue. In some cases, getting many records is implied by design. But sometimes, this may happen accidentally due to suboptimal code patterns.
Casting IQueryable to IEnumerable
IQueryable implies querying an external data source, while IEnumerable queries only in-memory data. So, if you query an IEnumerable collection, the application will first get all related data from the database and then apply the query to the in-memory data. With IQueryable, the application will send the query directly to the external database. Consider the following example.
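A sketch of such code (the filter body is illustrative, but the call matches the query discussed below):

```csharp
// The parameter type silently downgrades the query to LINQ to Objects:
IEnumerable<Post> CustomFilter(IEnumerable<Post> posts)
{
    return posts.Where(post => post.Content != null);
}

var count = CustomFilter(dbContext.Posts.Where(_ => _.PostId > 0)).Count();
```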
All we want to get in the example above is the number of posts matching some condition. In theory, this could be done with a SELECT COUNT database query. In practice, as CustomFilter accepts only IEnumerable collections, our query is cast from IQueryable to IEnumerable. As a result, the line CustomFilter(dbContext.Posts.Where(_ => _.PostId > 0)).Count() loads all posts into memory with:
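The generated SQL is a plain SELECT rather than a SELECT COUNT; a sketch of what Entity Framework would emit (the exact column list depends on the Post entity):

```sql
SELECT [p].[PostId], [p].[BlogId], [p].[Content]
FROM [Posts] AS [p]
WHERE [p].[PostId] > 0
```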
Next, the filter is applied to the in-memory collection of posts. A DPA issue shows how many records we receive due to this query.
The example shown here is deliberately obvious. In a real application, such casting may be hidden inside numerous calls (for example, inside a chain of query filters).
How to fix
The filter function must explicitly use IQueryable collections. For example:
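A sketch of the fixed filter; because both the parameter and the return type are IQueryable, the condition stays part of the translatable expression tree:

```csharp
IQueryable<Post> CustomFilter(IQueryable<Post> posts)
{
    return posts.Where(post => post.Content != null);  // illustrative condition
}
```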
Now, as CustomFilter works with IQueryable collections, the CustomFilter(dbContext.Posts.Where(_ => _.PostId > 0)).Count() query is translated to:
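A sketch of the resulting query (any conditions from the filter itself would be appended to the WHERE clause):

```sql
SELECT COUNT(*)
FROM [Posts] AS [p]
WHERE [p].[PostId] > 0
```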
So all the filtering happens on the server side, and the application receives only one record with the count result.