#0176 – SQL Server – Creating a database without the log backup file – Error Msg. 5120


Recently, one of my friends (new to SQL Server) was attempting to deploy the AdventureWorks2012 sample database, available for download from the CodePlex community, on his laptop for study purposes, but was facing an error and hence called me up.

When I reached his place, he told me that he had downloaded the AdventureWorks2012 database from the following website: http://msftdbprodsamples.codeplex.com/releases/view/55330#DownloadId=165399

When attempting to attach the database, he was facing the following error:

Attach Database Wizard showing an error arising out of a missing log file

Error message #5120

A quick glance at the screenshot shows us the error –

Unable to open the physical file “E:\SQL Databases\AdventureWorks\AdventureWorks_Log.ldf”. Operating system error 2: “2(The system cannot find the file specified.)”. (Microsoft SQL Server, Error: 5120)

Clearly, the error occurs because the sample databases available on the CodePlex website ship as data (.mdf) files only; a transaction log file is not supplied by CodePlex. To attach a database that comes without a log file, one can rebuild the transaction log by using the FOR ATTACH_REBUILD_LOG option of the CREATE DATABASE command. For example:

USE [master]
GO

CREATE DATABASE [AdventureWorks] 
ON  PRIMARY ( NAME = N'AdventureWorks_Data', 
              FILENAME = N'E:\SQL Databases\AdventureWorks\AdventureWorks_Data.mdf', 
              SIZE = 3072KB,
              MAXSIZE = UNLIMITED,
              FILEGROWTH = 1024KB )
FOR ATTACH_REBUILD_LOG
GO

Some finer points about ATTACH_REBUILD_LOG:

  • When rebuilding the transaction log, a collation name cannot be specified to change the collation of the database
  • This option is only available for read/write databases
  • Because the transaction log file is being rebuilt, we do not have control over its location, initial size or file growth increment. The new log file will be created at the default log path and will have an initial size of 1 MB
  • Naturally, this operation breaks the log chain
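To confirm where the rebuilt log file was created, one can query sys.master_files after the attach completes. This is a small sketch, assuming the database was attached as [AdventureWorks] per the example above:

-- Inspect the files of the newly attached database
-- (size is reported in 8 KB pages, hence the conversion to MB)
SELECT name,
       physical_name,
       type_desc,
       size * 8 / 1024 AS size_mb
FROM sys.master_files
WHERE database_id = DB_ID('AdventureWorks')
GO

The row with type_desc = 'LOG' will show the rebuilt log file sitting at the default log path.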

You can also find this script with a lot of other SQL Server, C# and XML scripts at the Scripts Module on BeyondRelational.com: http://beyondrelational.com/modules/30/scripts.aspx?s=stream&tab=scripts

Until we meet next time,

Be courteous. Drive responsibly.

#0175-BeyondRelational.com-Introducing the new Scripts module! Share and find code snippets and scripts


Don’t reinvent the wheel. Period.

The evolution of the human race is based on our ability to reuse an existing solution to one problem in order to solve even bigger problems. In software engineering, we know these as design patterns: general, reusable solutions to commonly occurring problems. In the world of T-SQL, they are downloadable scripts which, with a finite amount of customization, can be reused in any administrator’s environment or any developer’s T-SQL code. In the world of C# or other programming languages and frameworks, they are blocks or snippets of code that help realize a business requirement.

I have often seen developers and administrators searching for hours on search engines to download scripts or code snippets which they can use for a variety of purposes:

  1. To learn about a particular feature
  2. To implement an industry-standard practice in their environments
  3. To compare a pre-existing solution with industry-standard best practices
  4. To use as a reference in order to get out of “writer’s block”
  5. And many more…

The BeyondRelational.com platform has come up with a revolutionary new concept, the “Scripts” module (http://beyondrelational.com/modules/30/scripts.aspx?s=stream&tab=scripts), where one can share scripts and code snippets that demonstrate solutions to a given engineering problem.

So, if you are a developer or administrator or an enthusiast and use a piece of code frequently to address a business problem, the “Scripts” module is the right platform for you to share your solution with the community.

So, head over to the Scripts module today – share your scripts while learning something new!

BeyondRelational.com will be celebrating a “scripts week” from July 09, 2012 – July 15, 2012. Do not forget to visit the Scripts module during this time for fresh, new and extremely useful scripts on each day of the week!

Until we meet next time,

Be courteous. Drive responsibly.

#0174 – SQL Server – sys.dm_exec_sessions – Identify a session based on the Windows Process Id


Recently at the office, a member of the automated performance testing team came up to me and asked a very interesting question: based on the Windows process ID, how can one determine the number of sessions that have been initiated by a process? The question is interesting because their application is a client-server application capable of running on a terminal server. Multiple instances of the same application may be running from the same machine, so simply tracing based on the host name would not be sufficient.

Depending upon the requirement, there are two methods to answer this question:

  1. If a T-SQL based method is required for logging and other purposes, the DMV: sys.dm_exec_sessions can be used
  2. If the intention is to troubleshoot or monitor a process (i.e. a one-time activity), the SQL Server Profiler can be used

Using the DMV: sys.dm_exec_sessions

The sys.dm_exec_sessions DMV returns one row per authenticated session on SQL Server. Therefore, the view will return us one record for each session that an application establishes with the SQL Server.

Because the sys.dm_exec_sessions DMV maps to the sys.sysprocesses system table (and it’s a best practice not to query system tables directly), the columns returned contain information about the client/host process, including (but not limited to):

  1. Host Name
  2. User context used to establish the session
  3. Program name, and
  4. Host Process Id

The column of interest for us is “host_process_id”. This column holds the Windows process ID of the client program that initiated the session. Please note that this column shows NULL for internal sessions (those established by SQL Server itself).
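As a quick sketch, the sessions initiated by a given Windows process can be counted directly off this column (the PID value of 2152 is illustrative):

-- Count the sessions opened by a specific client process
-- (internal sessions have host_process_id = NULL and are excluded by the filter)
SELECT host_name,
       program_name,
       host_process_id,
       COUNT(*) AS session_count
FROM sys.dm_exec_sessions
WHERE host_process_id = 2152
GROUP BY host_name, program_name, host_process_id
GO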

Using the SQL Server Profiler

The SQL Server Profiler, being a dedicated debugging and performance analysis tool, has to have a mechanism to track processes based on the Windows process ID.

Editing a profiler trace allows us to select data columns of our choice (If you are new to Profiler, or need instructions on how to customize a Profiler/SQL Trace, please follow my series of tutorials on the subject at: http://beyondrelational.com/modules/12/tutorials/631/getting-started-with-sql-server-profiler.aspx?tab=tutorials&ts=46&bs=57). The column – “ClientProcessId” captures the (windows) process ID of the application calling SQL Server.

Filtering on the required ClientProcessId column will give us a way to monitor the required session.

Example

Let’s take an example – assume we want to monitor the sessions initiated by the SQL Server Management Studio on my machine.

To begin, get the Process ID (PID) of the required process.

  1. An application might use system functions to obtain this information
  2. If you wish to manually get this information, you can use the Task Manager (Press: Ctrl + Alt + Del, launch the Task Manager)
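Incidentally, if the process in question is your own query window, the DMV itself can tell you its PID without a trip to the Task Manager, as this small sketch shows:

-- The Windows PID of the client program behind the current session
SELECT host_process_id
FROM sys.dm_exec_sessions
WHERE session_id = @@SPID
GO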

Based on the screenshot below, you can see that the SSMS process on my test workstation is running with PID = 2152.

Armed with this information, let us go to SQL Server Management Studio and use the following query to list the open sessions initiated by SSMS:

SELECT *
FROM sys.dm_exec_sessions
WHERE host_process_id = 2152

Running this query yields 2 records as shown below:

(Remember that the SSMS can open multiple connections to the SQL Server: http://beyondrelational.com/modules/2/blogs/77/posts/11275/the-multiple-connections-of-ssms.aspx)
Now, let us launch the SQL Server profiler. Choose a template of your choice to create the SQL trace.


As part of the customization, choose “Show all columns” and select the “ClientProcessID” column for display in the trace output.
Click on “Column Filters” and filter on the required host process Id (i.e. PID = 2152).

Once done, click “OK” to start the trace.
Navigate back to SSMS, open a new query window and run any test query. For example,

USE AdventureWorks2012
GO
SELECT * FROM HumanResources.Employee
GO

Look at the output of the sys.dm_exec_sessions (query above) and the Profiler. Note that both show the presence of a new user session. The Profiler also shows the associated activity.



Until we meet next time,

Be courteous. Drive responsibly.

#0173-SQL Server-SSMS-Productivity improvement feature-Open new queries in SQLCMD mode


Most of our deployment queries (including the build and deployment output from the Visual Studio 2010 database solution, i.e. SQL Server Data Tools (SSDT) for SQL Server 2012 “Denali”) generate SQL that is designed to be executed in SQLCMD mode. Many of our pre-designed debugging scripts are also written to be executed in SQLCMD mode.


(*If you are new to SQLCMD mode, you may want to explore my post: http://beyondrelational.com/modules/2/blogs/77/posts/11317/sqlcmd-mode-in-ssms-final-part-in-the-series-underappreciated-features-of-microsoft-sql-server.aspx)


To simplify the day and save the couple of steps of going to the Query menu and choosing the SQLCMD mode, SSMS on each team member’s workstation has the “Open new queries in SQLCMD mode” switch set by default. To the best of my knowledge, this feature is available in SSMS for SQL Server 2008 and above (I have not checked SSMS for SQL Server 2005). Once set, any new query editor window opened (after restarting SSMS) will open in SQLCMD mode.
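For context, here is a minimal script that relies on SQLCMD mode and fails with syntax errors in a regular query window (the server name and variable value are placeholders):

:CONNECT localhost
:SETVAR DatabaseName "AdventureWorks2012"

USE [$(DatabaseName)]
GO
SELECT DB_NAME() AS CurrentDatabase
GO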


To set this switch, follow these simple steps:



  1. Launch SSMS
  2. Go to Tools –> Options
  3. Navigate to the options for “Query Execution”
  4. Check the switch “Open new queries in SQLCMD mode”
  5. Click OK
  6. Exit out of SSMS

If you use SQLCMD mode a lot, I recommend that you keep this switch checked in your environment too.


Until we meet next time,


Be courteous. Drive responsibly.

#0172-SQL Server-Changing compatibility level of a database causes recompilation of cached plans


It is a universal expectation that, depending upon the nature of the product, organizations support at least one or two prior releases of their product. Microsoft SQL Server allows users to run older databases on the latest releases of the server through a property called the compatibility level.

Compatibility levels provide partial backward compatibility with earlier versions of SQL Server. A compatibility level affects the behavior of a specific database, not the entire server; that way, most databases can continue to leverage the newly introduced features of SQL Server, while only those databases that have not yet been made compatible need the lower compatibility level setting to function.

Here’s a quick table showing the compatibility levels supported by SQL Server 2012 (code named “Denali”):

Compatibility Level | Corresponding SQL Server Version supported
90                  | SQL Server 2005
100                 | SQL Server 2008 / 2008 R2
110                 | SQL Server 2012

The default compatibility level for SQL Server 2012 is 110. All databases created in SQL Server 2012 have this compatibility level, unless the model database has a lower one (because all databases, including tempdb, are created as copies of the model database).
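The current compatibility level of every database on an instance (including model) can be checked from the sys.databases catalog view, for example:

-- List the compatibility level of each database on the instance
SELECT name,
       compatibility_level
FROM sys.databases
ORDER BY name
GO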

Please note that support for compatibility level 80 has been discontinued in SQL Server 2012. You can refer to my post: http://beyondrelational.com/modules/2/blogs/77/Posts/14429/0156-sql-server-2012-deprecated-features-valid-compatibility-levels-compatibilitylevel-80-support-ms.aspx for more details.

Now, when upgrading a database from a prior version of SQL Server to the latest version, one of the things that needs to be changed is the compatibility level. Today, I would like to draw your attention to the fact that changing the compatibility level causes all of the database’s cached plans to be recompiled when the related queries/batches are executed again.

For a quick demonstration, run through the following set of queries step-by-step.

-- 0. Create the DB
CREATE DATABASE CompatLevelTest
GO

-- 1. Set the compatibility level
ALTER DATABASE CompatLevelTest SET COMPATIBILITY_LEVEL = 90
GO

-- 2. Clean the buffers
DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE
GO

-- 3. Create the stored procedure
USE CompatLevelTest
GO
CREATE PROCEDURE proc_AddNumbers (@num1 INT, @num2 INT)
AS 
BEGIN
    SELECT (@num1 + @num2) AS Summation
END
GO

--4. Execute the procedure
USE CompatLevelTest
GO
EXEC proc_AddNumbers 2, 5
GO

-- 5. Check the plan_generation_num
USE CompatLevelTest
GO
--Check the plan generation 
SELECT execution_count, sql_handle, plan_handle, *
FROM sys.dm_exec_procedure_stats
CROSS APPLY sys.dm_exec_sql_text(sql_handle)
WHERE text like '%proc_AddNumbers%'
  AND database_id = DB_ID('CompatLevelTest')

-- 6. Change the compatibility level
ALTER DATABASE CompatLevelTest SET COMPATIBILITY_LEVEL = 100
GO

--7. Execute the procedure
USE CompatLevelTest
GO
EXEC proc_AddNumbers 2, 5
GO

-- 8. Check the plan_generation_num
USE CompatLevelTest
GO
--Check the plan generation 
SELECT execution_count, sql_handle, plan_handle, *
FROM sys.dm_exec_procedure_stats
CROSS APPLY sys.dm_exec_sql_text(sql_handle)
WHERE text like '%proc_AddNumbers%'
  AND database_id = DB_ID('CompatLevelTest')
  
  
--Repeat steps 7 & 8 again to verify the outputs
-- 9. Cleanup!
USE master
GO
DROP DATABASE CompatLevelTest
GO

The points of interest are the outputs of the DMV sys.dm_exec_procedure_stats. This DMV is similar to sys.dm_exec_query_stats, except that it returns aggregate performance statistics for cached stored procedures. The view returns one row for each cached stored procedure plan, and the row lives for as long as the stored procedure remains cached. When a stored procedure is removed from the cache, the corresponding row is eliminated from this view. If a plan is being reused, the value in the execution_count column goes up.

Here is the abridged result of the above test.

Compatibility Level | Iteration | Execution Count | Type | Type_Desc
90                  | 1         | 1               | P    | SQL_STORED_PROCEDURE
100                 | 2         | 1               | P    | SQL_STORED_PROCEDURE
100                 | 3         | 2               | P    | SQL_STORED_PROCEDURE

The above table shows us that when the procedure was executed the second time, i.e. after changing the compatibility level, the procedure was recompiled: the execution_count value reset to 1 instead of incrementing to 2. On the third execution (with no further change), the plan was reused and the count rose to 2.
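Another quick way to observe the recompilation (a sketch against the same demo database) is to watch the cached_time column of the same DMV; it records when the current plan entered the cache, and it jumps to a new value after the compatibility-level change forces a recompile:

-- cached_time records when the current plan entered the cache;
-- it changes when the procedure is recompiled
SELECT OBJECT_NAME(object_id, database_id) AS procedure_name,
       cached_time,
       execution_count
FROM sys.dm_exec_procedure_stats
WHERE database_id = DB_ID('CompatLevelTest')
GO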

Why did I write about it?

The reason this observation caught my attention is that many ISVs fail to convey this to their end customers. They upgrade the database and change its compatibility level (the customer would already be using a higher version of SQL Server), and then complaints start pouring in about unusually slow response times from the server, when, in the customer’s view, the response time should have been faster than in the prior release. This is expected behavior, because SQL Server has to recompile all the cached plans. If end users are educated to expect a temporary performance degradation, a lot of customer support calls can be avoided.


Until we meet next time,

Be courteous. Drive responsibly.