Monthly Archives: January 2012

SSIS Configurations From a DBA Perspective

The title might be a little misleading. “DBA Perspective” here means solely my own perspective in dealing with SSIS up to this point.

The configuration options I have tallied for how to proceed are as follows:

  • XML Configurations – XML file that contains the SSIS configurations (can include multiples)
  • Environment Variable – environment variable that contains configurations
  • Registry Entry – a registry key contains the configuration value
  • Parent Package Variable – This config type is more often used to update child packages
  • SQL Server Table – a table inside SQL Server contains the configurations (can include multiples)

Now, the irony, if you will, is that in my specific case we currently have a number of package variations that use many different methods. To me, that equals a monstrosity.

The two styles that seem easiest to me are the XML configurations and the SQL Server table configuration. My preferred method is to store them all in a SQL Server table.

I have three tables that I use:

  • SSIS_Configurations
  • SSIS_File_Spec_Configuration
  • SSIS_Log_Details

I’ll break down the schema of each table:

SSIS_Configurations

  • ConfigurationFilter – name of the process the row belongs to
  • ConfiguredValue – the value being applied (a server instance, for example)
  • PackagePath – the path to the package property being configured (often a connection property)
  • ConfiguredValueType – the data type of the value (String, etc.)
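For reference, this layout mirrors the configuration table that the SSIS package configuration wizard generates; a minimal DDL sketch, with the table name adjusted to the SSIS_Configurations name used here (column sizes are the wizard's defaults):

```sql
-- Sketch of the configurations table; same layout the SSIS
-- configuration wizard generates, renamed to SSIS_Configurations.
CREATE TABLE dbo.SSIS_Configurations (
    ConfigurationFilter NVARCHAR(255) NOT NULL,  -- name of the process
    ConfiguredValue     NVARCHAR(255) NULL,      -- value applied (e.g., server instance)
    PackagePath         NVARCHAR(255) NOT NULL,  -- path to the package property
    ConfiguredValueType NVARCHAR(20)  NOT NULL   -- data type (String, etc.)
);
```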

SSIS_File_Spec_Configuration

  • Pkey – identifies multiple entries for one package
  • JobIdentifier – name of the package
  • Apptype – which application it is part of
  • InputFileType – .txt, .csv, etc.
  • InputFilePathType – whether it is a file share, SQL Server, etc.
  • InputFilePath – location the input file comes from
  • SProcName – name of the stored procedure utilized
  • OutputFileType – .xlsx, .csv, etc.
  • OutputFilePath – location the output file will be sent to
  • InputFileArchivePath – archive location for the input file
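A minimal DDL sketch of SSIS_File_Spec_Configuration based on the columns above; the data types and sizes are my assumptions, not the actual definition:

```sql
CREATE TABLE dbo.SSIS_File_Spec_Configuration (
    Pkey                 INT IDENTITY(1,1) PRIMARY KEY,  -- allows multiple entries per package
    JobIdentifier        NVARCHAR(255) NOT NULL,         -- name of the package
    Apptype              NVARCHAR(100) NULL,             -- owning application
    InputFileType        NVARCHAR(10)  NULL,             -- .txt, .csv, etc.
    InputFilePathType    NVARCHAR(50)  NULL,             -- file share, SQL Server, etc.
    InputFilePath        NVARCHAR(500) NULL,             -- where the input file comes from
    SProcName            NVARCHAR(255) NULL,             -- stored procedure utilized
    OutputFileType       NVARCHAR(10)  NULL,             -- .xlsx, .csv, etc.
    OutputFilePath       NVARCHAR(500) NULL,             -- where the output file is sent
    InputFileArchivePath NVARCHAR(500) NULL              -- archive location for the input file
);
```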

SSIS_Log_Details

  • ExecStartTime – start time of the SSIS run
  • ExecEndTime – end time of the execution
  • TaskName – name of the task being executed
  • Status – current status of the package (success, failure, etc.)
  • ErrorDescription – the error the package encountered, if any
  • Duration – duration of the package run
  • PkgName – package name
  • SubPkgName – child package name
  • BatchId – identifier for the batch run
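A corresponding DDL sketch for SSIS_Log_Details; again, the data types are assumptions based on the column descriptions above:

```sql
CREATE TABLE dbo.SSIS_Log_Details (
    ExecStartTime    DATETIME      NOT NULL,      -- start time of the SSIS run
    ExecEndTime      DATETIME      NULL,          -- end time of the execution
    TaskName         NVARCHAR(255) NULL,          -- task being executed
    Status           NVARCHAR(50)  NULL,          -- success, failure, etc.
    ErrorDescription NVARCHAR(MAX) NULL,          -- error encountered, if any
    Duration         INT           NULL,          -- e.g., duration in seconds
    PkgName          NVARCHAR(255) NOT NULL,      -- package name
    SubPkgName       NVARCHAR(255) NULL,          -- child package name
    BatchId          INT           NULL           -- identifier for the batch run
);
```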

This process has proved helpful for me; each DBA is different, though, and I can only say to find what works best for you in your particular situation and environment.


Where’s that Baseline?

Everybody needs one, and each business is different, so what are you going to do? If you’re like me and the first thing you thought of was to call the Ghostbusters, I’d advise against it (although re-living the Bill Murray glory days might not be a bad idea). In reality, as DBAs, we will need to trend patterns over time. It is inevitable and should be a way of life for us in most instances. How do we do it; or shall I say, how am I going to do it? I can’t tell you specifically how to meet your business needs; however, after attending the PASS Summit this past year and getting some great ideas, I think I’ve come up with something that works for me as a DBA.

I definitely recommend checking out Chris Shaw’s Utility Databases presentation, which he has made available on his blog. This idea spawned from attending his session at PASS. I am part of a DBA team in the tax/banking industry. We heavily support the tax side of things, so for six months out of the year it is a slamming great time!! Needless to say, the busy season is small compared to 52 weeks out of the year, so capturing data is important in order to trend it out and see what is happening in the system.

I have chosen to capture a few things that will help me troubleshoot and examine problem areas in the future. Due to regulations I’m not going to dump my findings into my prod environment; instead I’ll dump them into my local environment (I can do this only because I do not have live production data in my local environment; just statistics, etc.).

Needless to say, I have my tables and schema set up to capture the following criteria (date/time stamps of entries are recorded):

Server Level:

  • Average stalls per read, per write, and per total I/O for each database file
  • CPU utilization by database
  • Total buffer usage by database for the current instance
  • Top waits for the server instance since last restart or statistics clear
  • Signal waits for the instance
  • CPU utilization history for the last 256 minutes
  • Basic information about memory amounts and state
  • SQL Server process address space info
  • Memory clerk usage for the instance
  • Single-use, ad-hoc queries that are bloating the plan cache
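As an example of the style of capture query behind these items, here is a sketch of the top-waits item, adapted from the widely circulated DMV diagnostic queries (the list of benign wait types to exclude is abbreviated here and would be longer in practice):

```sql
-- Top waits for the instance since last restart or statistics clear.
-- The WHERE clause filters out a few benign system waits; a real
-- baseline script excludes many more.
SELECT TOP (10)
    wait_type,
    wait_time_ms / 1000.0 AS wait_time_sec,
    100.0 * wait_time_ms / SUM(wait_time_ms) OVER () AS pct_of_total
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP', N'WAITFOR',
                        N'SQLTRACE_BUFFER_FLUSH', N'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;
```

In a utility-database setup, each run of a query like this is inserted into a date-stamped table rather than just selected, so the results can be trended over time.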

Database Level:

  • Individual file sizes and space available for the current database
  • I/O statistics by file for the current database
  • Top cached SPs by execution count
  • Top cached SPs by average elapsed time
  • Top cached SPs by total worker time
  • Top cached SPs by total logical reads
  • Top cached SPs by total physical reads
  • Top cached SPs by total logical writes
  • Top statements by average I/O usage for the current database
  • Possible bad nonclustered indexes (writes > reads)
  • Missing indexes for the current database by index advantage
  • Missing index warnings for cached plans in the current database
  • When statistics were last updated on all indexes
  • Fragmentation info for all indexes above a certain size in the current database (our maintenance job takes care of fragmentation, but I want to see what it looks like after our jobs run and before the next run)
  • Index read/write stats
In the end, I’ll have a plethora of information to review once the “season” ends. If you don’t have a “utility” database, I suggest you get one.


Pro-Activeness vs. Passiveness

I was posed this question the other day by a fellow DBA of mine. We tend to bounce ideas off each other randomly, and the question in simple terms was this: “Do you consider yourself a pro-active DBA or a passive DBA?” First, I want to explain the term passive so it is not taken out of context of the conversation I had. Passive is not meant to be degrading in any manner. The question simply meant: do you wait for things to happen and then attack the situation?

In my career I have seen various DBA methods utilized by many different people. I myself tend to fall into the pro-active category. I want to head off as many items as possible before they happen rather than react to them. This doesn’t mean some won’t slip through the cracks; of course they will, we are human. It simply means that every day I come into the office, I give 110% in every aspect to make my environment as secure and efficient as possible.

So, I pose this question to my fellow DBAs, along with a challenge: which category do you fall into, and do you think it is the right category to be in?

This simple question turned into an hour-long conversation over lunch on the whys and why-nots. Challenge yourself in 2012; motivate yourself to become a better person and a better DBA.

What Is In Your Tool Belt

Every DBA that I have ever known has his or her own tool belt. Usually a tool belt consists of the tools you need for the project you are currently working on. I take this in a literal sense: if I am working on framing, I’ll have my hammer, nails, screwdriver, drill, etc. Taking that concept into the DBA realm, when working on a project I still like to go in with my tool belt. I work in an environment that is generous enough to afford our team the Red Gate tool belt, which includes:

  • SQL Backup
  • SQL Compare
  • SQL Comparison SDK
  • SQL Data Compare
  • SQL Data Generator
  • SQL Dependency Tracker
  • SQL Doc
  • SQL Monitor Installer
  • SQL Multi Script
  • SQL Object Level Recovery Native
  • SQL Packager
  • SQL Prompt
  • SQL Search
  • SQL Source Control
  • SQL Tab Magic

While all these tools are effective in their own right, I went one step further and grabbed the freebie from Red Gate called SQL Scripts Manager. I won’t go into much detail here on what SQL Scripts Manager is, but I highly recommend you check it out; in it you will find a plethora of information via scripts that you can tailor to meet the needs of your respective business. Authors such as Tim Ford, Gail Shaw, and Phil Factor have contributed many scripts for tasks such as backups, diagnostics, DMVs, and indexes.

Take time to check out this great tool; it’s another instance of how great the SQL community really is at sharing ideas and thoughts.

Policy Based Management With a CMS Twist

One of the many perks of being on a DBA team is the number of ideas that get bounced around to find different solutions. One project we have run across and are working on implementing is enforcing policies based on our newly stood-up CMS server. The CMS server is nice simply because it allows rapid deployment across some 90 servers.

Not all of these servers will have all of the policies pushed out to them; some might only have 3, some might have 23, who knows. To alleviate some of that pressure, we split our servers into 4 groups within our CMS server, basically segregating out test boxes, prod boxes, some straggling SQL Server 2000 boxes (due to third-party apps), and boxes the group doesn’t maintain but that are still nice to keep visible.

I’m big on having a plan and road map in place and then executing that plan; our team is deciding which group we’d like to tackle first, then choosing the standard, out-of-the-box policies to push first. We can come back in a separate phase and tackle custom policies.

Another great aspect I like about the CMS server side of things is that we now have a single central repository with all of the server information we are responsible for. In the repository table we capture the following pieces of information:

  • Server Name
  • Instance Name
  • Server Name \ Instance Name (reporting purposes)
  • Domain
  • IP Address
  • Port
  • Memory Size
  • CPU Count
  • Current Version of Windows
  • Windows Name
  • SQL Server Product Name
  • SQL Server Version
  • SQL Server Edition
  • SQL Server Service Pack
  • SQL Server Language (eh why not throw it in)
  • SQL Server Root Directory
  • SQL Server Collation
  • Methods
  • Business Owner
  • DBA Notes (in case we need to add special info on the servers)
  • Maintained (Do we maintain this server or not; in our industry some servers we do not maintain but we are aware that they are there and in place)
  • Category (prod, test, dev, etc)
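Many of these columns can be gathered by running a query like the following against every registered server through the CMS; this is a sketch covering a subset of the columns (the rest, such as memory size, Windows version, and business owner, come from other DMVs, WMI, or manual entry):

```sql
-- Gather several repository columns from the instance itself
SELECT
    SERVERPROPERTY('MachineName')    AS ServerName,
    SERVERPROPERTY('InstanceName')   AS InstanceName,
    @@SERVERNAME                     AS ServerInstanceName,  -- reporting purposes
    SERVERPROPERTY('ProductVersion') AS SQLServerVersion,
    SERVERPROPERTY('Edition')        AS SQLServerEdition,
    SERVERPROPERTY('ProductLevel')   AS SQLServerServicePack,
    SERVERPROPERTY('Collation')      AS SQLServerCollation,
    osi.cpu_count                    AS CPUCount
FROM sys.dm_os_sys_info AS osi;
```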

Having one central location with all this information has helped tremendously. In the next few posts I’ll go over policies that are considered best practices. Each business is different, so tailoring them to meet your needs is the fun part.

Let the games begin!!

It’s a New Year

I haven’t been able to post as much as I would like, and I am working on changing that this upcoming year by building it more into my schedule. As this year begins, I want to take a moment and implore my readers to learn something new this year. As a SQL DBA, I am always trying to evolve my techniques, learn new ways of doing things, enhance productivity, streamline processes, etc. To do this I have to stay on top of my game: always reading articles and blogs, doing extra work on my own time, and attending conferences and SQL Saturdays.

So I ask the question: what are you going to do in 2012? Will we stagnate? I hope not. Take this not as a rallying cry but more as encouragement; there are so many opportunities out there to get involved in the community, help others, and just learn in general.

As a SQL DBA, my door is always open to help, as are so many others’. Good luck this year, and I wish everyone the best in their endeavors.