
Long Daily Processes

What is Performance Tracing?

Setting up a trace file lets you monitor and track the SQL events taking place on the server. This is very useful for recording every procedure being run along with its start and stop times. Once that data is imported into a table, it is fairly easy to send alerts or daily reports when system slowness starts becoming an issue.



How to Create a Trace File

Creating the trace file is surprisingly easy, since Microsoft provides the procedures required to do it. The syntax is below:

DECLARE @TraceID INT
DECLARE @Options INT
DECLARE @TraceFile NVARCHAR(245)
DECLARE @MaxFileSizeMB BIGINT

SET @Options = 2                               -- TRACE_FILE_ROLLOVER
SET @TraceFile = N'C:\TraceFiles\tracedbname'  -- SQL Server appends the .trc extension automatically
SET @MaxFileSizeMB = 100

EXEC sp_trace_create @TraceID OUTPUT, @Options, @TraceFile, @MaxFileSizeMB

The parameters are important for understanding what is going on:

  1. Trace ID - The procedure returns a trace ID as an output, and you need to save that off (into a configuration table of some sort, as sketched after this list) so you can reference your trace later.
  2. Options - You can read about the option values on Microsoft's website, but an option value of 2 is pretty standard. It means that when the trace file reaches its maximum size, a new trace file is created with the same name and an integer appended to the end (tracefile1.trc, tracefile2.trc, etc.).
  3. Trace File - This is the path that the file will be saved to.
  4. Max File Size - This is the maximum size (in MB) that the file can reach before rolling over.

There are some additional parameters, but we will just start with these to get you going.
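
As an example, here is a minimal sketch of saving that trace ID off, assuming it runs in the same batch as the sp_trace_create call above and that a hypothetical dbo.TraceConfig table is an acceptable place to keep it:

-- Hypothetical configuration table for remembering which trace belongs to this monitoring job
CREATE TABLE dbo.TraceConfig
(
    TraceName NVARCHAR(128) NOT NULL PRIMARY KEY,
    TraceID   INT           NOT NULL,
    CreatedOn DATETIME      NOT NULL DEFAULT GETDATE()
)

-- Save the ID returned by sp_trace_create so later scripts can stop or close this trace
INSERT INTO dbo.TraceConfig (TraceName, TraceID)
VALUES (N'DailyProcessTrace', @TraceID)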



Adding Events and Filters

Now that our trace file has been created, we need to add events to monitor. Yet again, Microsoft has made this easy on us with a built-in procedure. To monitor stored procedure start and end times, we can add these events/columns:

EXEC sp_trace_setevent @TraceID, 43, 1, 1    -- TextData (the procedure text)
EXEC sp_trace_setevent @TraceID, 43, 14, 1   -- StartTime
EXEC sp_trace_setevent @TraceID, 43, 15, 1   -- EndTime
EXEC sp_trace_setevent @TraceID, 43, 13, 1   -- Duration (microseconds)
EXEC sp_trace_setevent @TraceID, 43, 35, 1   -- DatabaseName

Parameter Definitions

  1. The trace ID of the trace being run.
  2. The event to monitor (43 = SP:Completed, which carries the timing columns we want).
  3. The column being recorded (1 = the procedure text, 14 = start time, 15 = end time, 13 = total duration in microseconds, 35 = database name).
  4. Whether the column is turned on for this event (1 = on).

There are a lot of events and columns that can be added. You can find the full list here:
Microsoft Trace - Events and Columns
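
One related note: procedure calls made directly from client applications are often logged as remote procedure calls rather than T-SQL batches, so many tuning traces also capture RPC:Completed (event 10) alongside SP:Completed. A sketch of adding it with the same columns might look like this:

EXEC sp_trace_setevent @TraceID, 10, 1, 1    -- TextData
EXEC sp_trace_setevent @TraceID, 10, 14, 1   -- StartTime
EXEC sp_trace_setevent @TraceID, 10, 15, 1   -- EndTime
EXEC sp_trace_setevent @TraceID, 10, 13, 1   -- Duration
EXEC sp_trace_setevent @TraceID, 10, 35, 1   -- DatabaseName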

The last thing we need is to add a couple of filters: one to monitor a specific database, and one to set a minimum duration an event must run before it is logged to the file. This helps prevent the files from becoming overloaded with too much data.

DECLARE @MinDuration BIGINT
SET @MinDuration = 5000000   -- filter values must match the column's data type; Duration is a BIGINT measured in microseconds

EXEC sp_trace_setfilter @TraceID, 13, 0, 4, @MinDuration   -- Duration >= 5 seconds
EXEC sp_trace_setfilter @TraceID, 35, 0, 0, N'DB1'         -- DatabaseName = 'DB1'

These parameters are as follows:

  1. The trace ID that this filter applies to.
  2. The column that the filter works against.
  3. The logical operator applied to the filter (0 = AND, 1 = OR).
  4. The comparison operator (equal, greater than or equal, less than or equal, LIKE, and so on).
  5. The value being compared against.

Our two filters above translate to: only log events with a duration of at least 5 seconds (5,000,000 microseconds) that occurred in the DB1 database. The full list of operators can be found here:
Microsoft Trace - Filters



Turning Traces On and Off

The set status procedure turns our traces on and off. An example of how to start the trace is below:

EXEC sp_trace_setstatus @TraceID, 1

You can stop the trace by using a 0 instead of a 1. The other option is to close the trace and delete its definition from the server by using a 2 (the trace must be stopped before it can be closed).
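
If you ever lose track of which traces are defined on the server, the sys.traces catalog view is a quick sanity check; it shows each trace's ID, file path, and current status:

-- List every trace defined on the server (status 1 = running, 0 = stopped)
SELECT id, status, path, max_size, event_count
FROM sys.traces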



Extracting the Data

After your trace file has been running and collecting data, you will want to review it. The best way to do that is to import it into a table, and there is a nice built-in function for doing that, as shown below.

-- fn_trace_gettable takes the trace file path (not the trace ID) and the number of rollover files to read (default reads them all)
SELECT *
FROM ::fn_trace_gettable('C:\TraceFiles\tracedbname.trc', default)

This will output all of the data in the file, and you can insert it into a table that is set up with the columns you have been tracking.
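
For example, here is a minimal sketch of loading the file into a permanent table; the dbo.TraceResults name is just a placeholder:

-- Pull the completed-procedure rows out of the trace file into a reporting table
SELECT TextData, DatabaseName, StartTime, EndTime, Duration
INTO dbo.TraceResults
FROM ::fn_trace_gettable('C:\TraceFiles\tracedbname.trc', default)
WHERE EventClass = 43   -- keep only the SP:Completed rows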

With all of the data in a table, you can now run reports and create job alerts that will let you know if there are performance issues with your database.
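
As a rough sketch, a daily SQL Agent job could run something like the query below against that table (the dbo.TraceResults name and the 30 second threshold are just examples) and email out anything it finds:

-- Procedures from the last day that ran longer than 30 seconds (Duration is stored in microseconds)
SELECT TextData, DatabaseName, StartTime, EndTime, Duration / 1000000.0 AS DurationSeconds
FROM dbo.TraceResults
WHERE StartTime >= DATEADD(DAY, -1, GETDATE())
  AND Duration >= 30000000
ORDER BY Duration DESC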




