Tag Archives: HowTo

All posts tagged “HowTo” for Microsoft SQL Server.

#0405 – SQL Server – Msg 5133 – Backup/Restore Errors – Directory lookup for file failed – Operating System Error 5(Access is denied.).


We recently got a new server, and one of my colleagues ran into an error when restoring a database. The error was quite generic (reformatted below for readability):

Msg 5133, Level 16, State 1, Line 1
Directory lookup for the file 
"C:\Users\SQLTwins\Documents\AdventureWorks2014\AdventureWorks2014_Data.mdf" 
failed with the operating system error 5(Access is denied.).

Msg 3156, Level 16, State 3, Line 1
File 'AdventureWorks2014_Data' cannot be restored to 
'C:\Users\SQLTwins\Documents\AdventureWorks2014\AdventureWorks2014_Data.mdf'. 
Use WITH MOVE to identify a valid location for the file.

My immediate reaction was to review the restore script.

USE [master];
GO
RESTORE DATABASE [AdventureWorks2014]
FROM DISK = 'C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\Backup\AdventureWorks2014.bak'
WITH
MOVE 'AdventureWorks2014_Data' TO 'C:\Users\SQLTwins\Documents\AdventureWorks2014\AdventureWorks2014_Data.mdf',
MOVE 'AdventureWorks2014_Log' TO 'C:\Users\SQLTwins\Documents\AdventureWorks2014\AdventureWorks2014_Log.ldf';
GO

All looked well, so I subsequently moved on to the environmental aspects of troubleshooting. It was a new server, and we had just created the target folders to which the database was to be restored.

We attempted a restore to the default database locations configured during SQL Server installation, and it worked. One thing thus became clear: the SQL Server service account did not have appropriate security rights on the destination folder.
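As an aside, the default data and log folders configured during installation can be confirmed directly from T-SQL. The query below is a minimal sketch (these SERVERPROPERTY values are available from SQL Server 2012 onwards):

SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS [DefaultDataPath],
       SERVERPROPERTY('InstanceDefaultLogPath')  AS [DefaultLogPath];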

The Solution

Once we had determined that the issue was the security rights on the destination folder, the only thing remaining was to grant those rights. Here’s how we did it.

  1. Cross-check the user set as the service account for the Microsoft SQL Server Database Engine (use the SQL Server Configuration Manager to interact with the SQL Server service – here’s why). A query-based alternative for this step is sketched just after this list.
  2. Under the folder’s properties, ensure that this user has full security rights (or at least rights equivalent to those assigned on the default database folders specified at the time of installation).
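If you prefer T-SQL over the Configuration Manager UI for identifying the service account, a minimal sketch is shown below (sys.dm_server_services is available from SQL Server 2008 R2 SP1 onwards and requires VIEW SERVER STATE):

SELECT [servicename],
       [service_account],
       [status_desc]
FROM sys.dm_server_services;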

Here are detailed screenshots showing the above process.

Identifying the user running the SQL Server Database Engine service

Navigating into file system folder security options to grant access to the SQL Server service

Choosing the appropriate account running the SQL Server service

Applying appropriate rights to folder

By the way: if you encounter similar issues in accessing your backup files, the root cause and solution are the same. Check the permissions on the folders housing your backups, and you will likely find that the database engine does not have the necessary security rights.
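For example, a backup like the one below (the target path is purely hypothetical) would fail with a similar “Access is denied” error if the service account cannot write to the destination folder:

BACKUP DATABASE [AdventureWorks2014]
TO DISK = 'D:\SQLBackups\AdventureWorks2014.bak'
WITH INIT, STATS = 10;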

Further Reading

Maintaining permissions on the data folders and the appropriate registry entries is handled by the SQL Server Configuration Manager when you change the service account under which the database engine runs. If you use services.msc (the old way), this is not done, and your SQL Server may stop working.

  • Changing SQL Server Service Account or Password – Avoid restarting SQL Server [Blog Link]
  • Blog Post #0344 – SQL Server – Missing Configuration Manager on Windows 8 [Blog Link]

Until we meet next time,

Be courteous. Drive responsibly.


#0401 – SQL Server – Script to validate object naming convention


A few weeks ago, I ran into a question on one of the forums asking for a script that can help the team validate object naming conventions. Immediately, I was able to sympathize with the team.

What happens is that when developers create objects using the graphical (GUI) tools in SQL Server Management Studio (SSMS) or via a simple script, they often fail to specify a name for each individual constraint. These slips are not intentional – developers often don’t realize that each constraint is an independent object, because constraints are ultimately related to another user-defined object (a table).

However, when a name is not explicitly specified for a constraint, Microsoft SQL Server generates a name by combining the following:

  1. A standard prefix indicating the constraint type (e.g. “DF” for default constraints)
  2. The first 9 characters of the parent object (table) name
  3. The first 5 characters of the field (column) name
  4. Finally, the unique Id of the constraint object, represented in hexadecimal format

While this format will always generate a unique value, the resulting names may not be intuitive. It is therefore a common practice to review the database code for compliance with the naming conventions defined for the product/project.

This logic can be leveraged during code reviews/audits to identify objects where standard project naming conventions are not met.

To demonstrate the functionality of the script, I create a table with a variety of constraints – none of which has a name specified.

USE [tempdb];
GO
IF OBJECT_ID('dbo.ConstraintsWithoutNames','U') IS NOT NULL
BEGIN
    DROP TABLE dbo.ConstraintsWithoutNames;
END
GO

CREATE TABLE dbo.ConstraintsWithoutNames 
    ([RecordId]     INT          NOT NULL IDENTITY(1,1) 
                                 PRIMARY KEY CLUSTERED,
     [RecordName]   VARCHAR(255)     NULL,
     [RecordStatus] TINYINT      NOT NULL DEFAULT (0) 
                    CHECK ([RecordStatus] IN (0, 2, 4, 8))
    );
GO
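Querying the system catalog for this demo table shows the names that SQL Server generated for the unnamed constraints. This is a minimal sketch; the trailing hexadecimal portion of each name will differ on your system:

USE [tempdb];
GO
SELECT [name]      AS [ConstraintName],
       [type_desc] AS [ConstraintType]
FROM sys.objects
WHERE [parent_object_id] = OBJECT_ID('dbo.ConstraintsWithoutNames')
  AND [type] IN ('PK', 'D', 'C'); --Primary key, default and check constraints
GO

The names follow the pattern described above – the default constraint, for instance, comes out looking something like DF__Constrain__Recor__ followed by the hexadecimal object Id.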

Now, the following script is a simple string search that looks for object names ending with the hexadecimal representation of their own object Id.

USE [tempdb];
GO
SELECT * 
FROM [sys].[objects] AS [so]
WHERE [so].[is_ms_shipped] = 0 --Considering user objects only
  AND [so].[name] LIKE ('%' + REPLACE(CONVERT(NVARCHAR(255),CAST([so].[object_id] AS VARBINARY(MAX)),1),'0x',''))
                        --Only those objects whose names end with the hexadecimal
                        --representation of their object Id

Objects given default constraint names
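If reviewers also want to know which table a flagged constraint belongs to, a small variation along the same lines (a sketch) adds the parent object name to the output:

USE [tempdb];
GO
SELECT [so].[name]                          AS [ConstraintName],
       [so].[type_desc]                     AS [ObjectType],
       OBJECT_NAME([so].[parent_object_id]) AS [ParentObjectName]
FROM [sys].[objects] AS [so]
WHERE [so].[is_ms_shipped] = 0
  AND [so].[name] LIKE ('%' + REPLACE(CONVERT(NVARCHAR(255), CAST([so].[object_id] AS VARBINARY(MAX)), 1), '0x', ''));
GO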

I hope you found this script useful. Please do share any ideas/scripts that you may be using in your day-to-day activities.

Until we meet next time,

Be courteous. Drive responsibly.

#0395 – SQL Server – SSIS – Adding date and time to a file name after processing


As we have been discussing over the last couple of weeks, moving data from a “source” to a “destination” is the crux of SSIS. The data source can be literally anything – a SQL Server database, a web service, a script, an XML file or even a conventional delimited file.

Picking up files from a folder and processing them is one of the most common use-cases that I have seen. In order to avoid processing the same file over and over again, a frequent requirement is to append a date-time value to the file name. In this post, I will provide an expression I use to achieve this goal.

Appending a Date/Time and renaming a file in SSIS

The easiest way to append a date/time value to a file name and rename the file in SSIS is via SSIS expressions.

Put very simply, an expression is a formula – a combination of variables, literals and operators that work together and evaluate to a single value.

We can use the following expression to yield the file name (without its extension) suffixed with a date value:

SUBSTRING(@[User::FileName], 1, FINDSTRING(@[User::FileName],".",1) -1 ) + (DT_WSTR,4)YEAR(GETDATE()) + (DT_WSTR,2)MONTH(GETDATE()) + (DT_WSTR,2)DAY(GETDATE())

This can then be appended to the file name and a simple File System Task can rename the file for us on the file system. Let’s see this in action with an example.

Demo

Assume a scenario where I have a set of files in a folder, and I need to do some processing on them. After processing, I need to update the file names with the date. For the sake of brevity, in this example I will not be performing any operation on the files other than renaming them.

My folder containing the input files looks like this:

Folder with input files which need to be renamed once processing is complete

Now, I create an SSIS package that contains a simple For Each file iterator and a File System task.

For Each Loop configuration

The For Each Loop (configured as a file iterator) uses an expression populated by a variable to get the source directory information.

The file name and extension are fetched into a variable which will be used during the processing and subsequent renaming.

Overall package configuration showing the For Each File Iterator

For Each Iterator – Collection configuration using a variable for the source directory

Fetching the individual file name & extension into a variable

Now, I create a new user variable “OutputFileName” and use the expression below to generate the output file name. The expression has essentially 3 distinct parts:

  1. Fetch the file name (without the extension)
  2. Append the date to this string
  3. Fetch the file extension and append it to the modified/new file name
@[User::SourceFolder] + 
SUBSTRING(@[User::FileName], 1, FINDSTRING(@[User::FileName],".",1) -1 ) + (DT_WSTR,4)YEAR(GETDATE()) + (DT_WSTR,2)MONTH(GETDATE()) + (DT_WSTR,2)DAY(GETDATE()) + 
SUBSTRING(@[User::FileName], FINDSTRING(@[User::FileName],".",1), LEN(@[User::FileName]))
Output File Path expression
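To illustrate with purely hypothetical values: if @[User::SourceFolder] evaluates to C:\InputFiles\, @[User::FileName] to SampleFile.txt, and the package runs on, say, 2 May 2016, the expression evaluates to C:\InputFiles\SampleFile201652.txt. Note that the (DT_WSTR,2) casts do not zero-pad single-digit months and days; if a fixed-width yyyyMMdd suffix matters to you, a common adjustment is to wrap the month and day parts in something like RIGHT("0" + (DT_WSTR,2)MONTH(GETDATE()), 2).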

This variable is now used in the configuration of the file system task which is responsible for renaming the file.

File System Task configuration

The file system task is an extremely flexible task component in SSIS. It can operate on the file system not only by using file connections but also on the basis of variables! For our problem, we will leverage this flexibility of the File System task.

As can be seen from the screenshot below, my File System task has been configured as follows:

  • Source File Path is a variable
  • Destination File Path is a variable
  • File Operation type = “Rename File”

The “Rename File” operation takes the file specified by the old (source) file path and renames it to the name specified by the new (destination) file path.

File System Task configuration using “Rename File” mode where input & output file names are supplied via variables

Once the package is executed, we can see that the files in the source folder are now updated as expected.

Source Folder with the File Names updated as expected

Further Reading

Until we meet next time,

Be courteous. Drive responsibly.

#0394 – SQL Server – SSIS – ForEach Loop – Controlling processing order


One of the SSIS-related questions that I often see in the forums is about the ability to control the sequence in which a For Each Loop processes the files from a directory. When performing data movement as part of an ETL or an integration, it may be necessary to work asynchronously: the “sender” of the data places the files in a folder for the data processing to pick up. These files generally contain ordered data that needs to be loaded in a specific sequence.

The problem is that the SSIS For Each Loop container does not provide a way for the user to specify the order in which the files are to be processed. Today, I present one of the workarounds that I came up with (there may be others, and I would be very interested to know what you prefer in these scenarios).

Environment Prep

To demonstrate the ordered loading of files, I created a set of simple files and a table to which I will be adding the data read from these files.

USE [tempdb];
GO

IF OBJECT_ID('dbo.ImportedData','U') IS NOT NULL
BEGIN
    DROP TABLE dbo.ImportedData;
END
GO


CREATE TABLE dbo.ImportedData 
   (RecordId           INT          NOT NULL IDENTITY(1,1),
    ProcessedFileName  NVARCHAR(255) NOT NULL,
    RecordValue        NVARCHAR(100)     NULL,
    ProcessedDate      DATETIME     NOT NULL,
    CONSTRAINT pk_ImportedData PRIMARY KEY CLUSTERED (RecordId, ProcessedDate)
   );
GO

To demonstrate the workaround, I will not be processing the files in the order of their file names; instead, I will be processing them in the order of the date on which they were last saved. The screenshot below shows the order in which the files will be processed.

Folder showing files to process in the required non-alphabetical order

Core Logic – Identify the Order of the Files to Process

The For Each container is just a mechanism to loop through a set (of objects/files/records). It does not provide a way to control the order in which the set is iterated.

Hence, the workaround that I came up with was to supply a pre-ordered set to the For Each container. I used a Script Task to order the file names into an object variable, which is then passed to the For Each container.

As shown below, I pass the path to the folder where the files are located, and accept the sorted list into an object.

High-level usage of the File Ordering script task

The script used within the Script Task – the heart of this solution – is provided below.

// Requires the System.IO, System.Collections.Generic and System.Linq namespaces
// (add the corresponding using directives at the top of the script if missing).
public void Main()
{
    // Read the folder to scan from the SSIS package variable
    DirectoryInfo di = new DirectoryInfo((string)Dts.Variables["User::FolderPath"].Value);
    List<string> files = new List<string>();

    // Enumerate the files ordered by their last write time (oldest first)
    foreach (var fi in di.EnumerateFiles().OrderBy(t => t.LastWriteTime))
    {
        files.Add(fi.FullName);
    }

    // Hand the pre-ordered list back to the package via the Object variable;
    // the For Each container will iterate it in exactly this order
    Dts.Variables["User::FileListing"].Value = files;

    Dts.TaskResult = (int)ScriptResults.Success;
}

Process the File

The next stage is simply to process each file.

  • The sorted set of files is iterated upon by the For Each container
  • I have a script task that simply displays a message box showing the file currently being processed
  • The data flow task simply reads from the file and writes to a conventional OLE DB destination

Screenshots showing these bits of the configuration are provided below.

For Each Loop Iteration Configuration

For Each Loop File Name configuration

Setting the file connection based on the File Name set by For Each Iterator

If I execute the SSIS package and then query the destination table, we see that the files were indeed processed in the expected order.

USE [tempdb];
GO
SELECT * 
FROM dbo.ImportedData 
WHERE RecordValue LIKE '%1%';
GO

Output of script to confirm that data was loaded in the required sequence
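If you want to see the load sequence more explicitly, a small follow-up query (a sketch) groups the rows by file and orders by the identity value, which is assigned in insert order:

USE [tempdb];
GO
SELECT [ProcessedFileName],
       MIN([RecordId])      AS [FirstRecordId],
       MIN([ProcessedDate]) AS [FirstProcessedDate]
FROM dbo.ImportedData
GROUP BY [ProcessedFileName]
ORDER BY MIN([RecordId]);
GO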

Further Reading

Until we meet next time,

Be courteous. Drive responsibly.

#0382 – SQL Server – SSMS – Debugging a T-SQL DML Trigger


I often receive questions related to debugging code using SSMS, and I have written a number of blog posts on the topic in the past. However, I was recently asked the following question:

How can I debug a DML trigger in SSMS?

I realized that while I had encountered the same question and researched it in my initial days (when I worked on SQL Server 2000), I had never written about it. So, here goes.

In order to demonstrate how to debug a trigger, all we need to do is debug the statement that initiates the trigger.

For this demonstration, I will fire an update on the [Sales].[SalesOrderHeader] table of the AdventureWorks database.

USE AdventureWorks2012;
GO
UPDATE soh
SET soh.SubTotal += 2
FROM Sales.SalesOrderHeader AS soh
WHERE soh.SalesOrderNumber = 'SO43697';

When we debug this query (using F10), the SSMS debugger launches and we can step through the code using the same key combinations as we have in Visual Studio.

Debugging T-SQL script using SSMS

Executing the update should fire the DML trigger [uSalesOrderHeader] which can be debugged like any other T-SQL code by stepping into the trigger (press F11 when executing the update).

Stepping into a DML trigger during a SSMS T-SQL Debugging session

That’s it. Debugging a trigger is no different than debugging a stored procedure or any other T-SQL script.
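As a quick aside: if you are not sure which trigger(s) a statement will fire, the catalog view sys.triggers lists the DML triggers defined on a table. A minimal sketch:

USE AdventureWorks2012;
GO
SELECT [tr].[name]                  AS [TriggerName],
       [tr].[is_disabled]           AS [IsDisabled],
       [tr].[is_instead_of_trigger] AS [IsInsteadOfTrigger]
FROM sys.triggers AS [tr]
WHERE [tr].[parent_id] = OBJECT_ID('Sales.SalesOrderHeader');
GO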

In case you would like to learn more about debugging in SSMS, please do refer to my previous posts (links below). If there is anything specific that you would like to know more about, do drop me a line.

Further Reading

  • T-SQL Debugger – Underappreciated Features of Microsoft SQL Server [Blog Link]
  • Enabling T-SQL Debugger in SQL Server Management Studio (SSMS) [Blog Link]
  • SSMS – Tutorial – Part 03 – The T-SQL Debugger (L200) [Blog Link]
  • T-SQL Debugging – Connection Errors & Firewall settings [Blog Link]
  • T-SQL Debugging – SSMS errors – MSDBG2.DLL [Blog Link]
  • Setting, Labeling, Using & Searching Conditional Breakpoints [Blog Link]
  • Export/Import Breakpoints [Blog Link]
  • Using the “When Hit, Do Something” breakpoint configuration [Blog Link]
  • Using Data-Tips [Blog Link]
  • Editing, Exporting and Importing Data Tips [Blog Link]

Until we meet next time,
Be courteous. Drive responsibly.