Monthly Archives: April 2012

Drop and Create Databases ~ SQLCMD Style

Sometimes there are instances when I need to drop 6 or 7 databases on my local instance. Currently, I run 3 instances locally: two for OLTP duties and one for reporting. One method I like for dropping the databases is based on just a couple of SQL files and a .bat file. First, I make sure that replication is turned off on my instances; if replication is not turned off, the process below will error out when trying to drop the databases.
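If you want to verify that first, a quick sanity check can confirm nothing is still marked for replication; and if a database is still marked and you truly want replication gone, stripping its settings is an option. A minimal sketch (db1 standing in for any database below):

-- List databases still flagged for any form of replication
SELECT name, is_published, is_subscribed, is_merge_published
FROM sys.databases;

-- Remove all replication settings from a database you intend to drop
EXEC sp_removedbreplication N'db1';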

SQL FILE CREATION FOR DROPPING AND CREATING THE DATABASES

In my case I need to drop multiple databases so my SQL file resembles:

DROP DATABASE db1

DROP DATABASE db2

DROP DATABASE db3

DROP DATABASE db4

DROP DATABASE db5

DROP DATABASE db6

DROP DATABASE db7

DROP DATABASE db8

Once complete, I create a generic directory on my C:\ drive and save this file as DropDatabases.sql. The create database script is the same script; just replace each DROP with a CREATE. Keep in mind that on the CREATE DATABASE statement you can specify additional defaults and settings, but I will not go into those here. You can also tinker with this script and add some standards, such as checking whether the database exists before dropping or creating it; a sketch of that check follows below. This is just an idea ~ how you use it is up to you.
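As a minimal sketch of that existence check (db1 standing in for any of the databases above; the SINGLE_USER step is optional, but it keeps an open connection from blocking the drop):

-- In DropDatabases.sql: only drop the database if it actually exists
IF DB_ID(N'db1') IS NOT NULL
BEGIN
    ALTER DATABASE db1 SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE db1;
END
GO

-- In CreateDatabases.sql: only create the database if it is not already there
IF DB_ID(N'db1') IS NULL
    CREATE DATABASE db1;
GO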

CREATING THE .BAT FILE(s)

I use the sqlcmd utility in my .bat files to make calls to the server instance and the specific file. Simply open up a text editor such as Notepad and type the following commands:

sqlcmd -S (local) -i "C:\CI\DropDatabases.sql"

sqlcmd -S (local) -i "C:\CI\CreateDatabases.sql"

Then save each command in its own file, as DropDatabases.bat and CreateDatabases.bat respectively.

The two statements call either the DropDatabases.sql file or the CreateDatabases.sql file.
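Since I run three instances locally, the same pattern can point each .bat at whichever instance needs the refresh. For example (REPORTING is a made-up instance name here, so substitute your own):

sqlcmd -S (local)\REPORTING -i "C:\CI\DropDatabases.sql"

sqlcmd -S (local)\REPORTING -i "C:\CI\CreateDatabases.sql"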

—————————————————-

I believe the SQLCMD utility is a pretty useful tool, and I will eventually incorporate it into my continuous integration initiatives. Those of you who know me or have followed my blog for a while know that I am a Red Gate fan. Because of this I have many useful tools at my disposal, and I am currently setting up a form of CI following some of their practices, which can be found here.

MSDN provides a thorough listing of SQLCMD syntax and variables, which you can view here.

I enjoy trying different ways of doing things that help daily performance initiatives. Hope this helps someone else in the future.


To Document or Not To Document ~ That is the question

Recently, I approached a person in a business unit and asked if they had some standard documentation on one of their processes. The reply I received in general terms was, “No we do not have any standard documentation but we do continue to build on our processes”.

Knowing how I am, I began to read into it and couldn't get past that statement. Then it dawned on me: of course everyone is different, and we all have different mindsets. I feel my DBA processes must be documented, but others will not necessarily share that same viewpoint, especially in other business units.

I've always heard that documenting is an age-old battle; most people I know don't like doing it, but I believe it is a necessity. I'm not the best at it, but having standards in place, whether for a team or a solo act, is beneficial for the next person who comes in. Several years ago one of my old mentors told me that if he walked out to lunch and something happened to him, he would want me to be able to pick up a process I didn't know, look at his documentation, and be comfortable completing the task. Somewhere along the way that stuck with me. Some of the standard documentation we have put in place for the DBA team that I'm on includes:

  • Backup Procedures
  • Job Retention
  • Security
  • Maintenance
  • Server Installations
  • Code Promotions
  • Object Naming Conventions
  • Job Notifications

The list could go on, but those are some high-level ones that I just threw out there. Documentation made it onto SQL Server Pedia's "10 Things Every DBA Should Do"; the article came from John Sansom, who has a great blog. I suggest you check him out.

One of the things I like to do when starting new employment is to look at the documentation; in some cases it doesn't really exist. I like to use it to gain knowledge of the systems and what they do. "To Document or Not To Document ~ That is the question." To me it is a no-brainer.

Drop me a line and let me know whether you think documentation is important or not, and share some standards that you may have in place.

Someone Forget to Logoff that Remote Desktop Session?

 

Did Someone Forget To Logoff?

Phone rings… alerts start going off… coffee hasn't been made yet… it's 2:13 a.m., and the issue requires you to remote into the server to perform some task, only to find out that the server has already exceeded the maximum number of users. Has that happened to you before? It has to me; either other people are on the box looking at something else, or someone forgot to log off when they were done using it.

When this occurs I like to use the QWINSTA/RWINSTA utilities from the command line to help me identify who is on the box and then graciously log them off. I know there are other methods to accomplish the same task; however, I tend to roll with this one as it is ingrained in my head.

To display information about Terminal Sessions, type the following command at the cmd prompt:

  • qwinsta /server:[servername]

This command will give you the session name, username, ID, state, type, and device.

This information is good; however, how can you get someone off the server? That can be accomplished with a similar cmd line; however, I DO RECOMMEND that you check with the individual prior to just booting them off, unless it is an extreme emergency.

  • rwinsta /server:[servername] [sessionid]

You need to make sure that you capture the session ID you saw from the qwinsta command in order to log them off.
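To put the two together, here is the whole flow end to end (SQLPROD01 and the session ID of 2 are made-up values; use your server name and whatever ID qwinsta actually shows you):

qwinsta /server:SQLPROD01

rwinsta /server:SQLPROD01 2

The first command lists the sessions; once you spot the stale one, the number in its ID column is what you hand to rwinsta.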

I know Windows Administrators go the Terminal Services Manager route, but if you're like me and enjoy some good command-line tools, this can be another option for you, especially when you are woken up at night and the other people logged in could be fast asleep!

SQL Saturday #122 – Pre cons

Just a reminder of the pre-cons that are available this July 20th, 2012. Kevin Kline will have a session on Leadership Skills for IT Professionals, Bill Pearson will have a session on Practical Self Service BI With PowerPivot, and Dave Fackler will have a session on SSIS – Live it, Love it, Learn it.

Cost is $100 per pre-con session.

The main event will be held July 21st, 2012, at the University of Louisville School of Business.

University of Louisville

2301 S 3rd St

Louisville, KY 40208

You won't want to miss this event. A lot of good industry speakers will be at your disposal; visit the main site to register today.

Are you a fit DBA?

Over my career I have seen many articles on what it takes to be a DBA. I always enjoy reading other viewpoints on the topic; so much so that I've wanted to share my mentality on how I attack being a DBA. Every person is different and unique in their own way; however, some of the principles, I believe, can remain the same.

How does a DBA stay on top of his/her game?

Those who know me personally know that I enjoy playing the game of basketball. I grew up with it, played up through college, and once college was over I still dabble in it for fun. I can remember putting in countless hours in the gym shooting thousands of jump shots, running hundreds of laps, waking up at 4 in the morning to work out, then going to class, then back to practice after classes. The reason I trained so hard was to be the best that I could be and squeeze the best out of my potential.

The other day as I was driving down the road, I thought about how my career as a DBA really is no different. To reach my goals, stay on top of my game, and enhance my skill set, I need to condition myself in the realm of DBA expertise. How do I become a better DBA? To me, I need to take my conditioning mentality and turn it into a SQL conditioning mentality. As I stated before, each person is unique, so your goals might be different from mine, but the underlying base is similar in many ways.

TRAINING YOURSELF

In any training program you have to continually work at it. My jump shot didn't just happen overnight; I couldn't even begin to tell you how many hours I put in at the gym before it became second nature to me. I was no longer thinking through the mechanics of "Step, Bend, Follow-Thru"; it became automatic. So why would my SQL training be any different? I took 5 skills (my magic number; yours may be different) and looked at how I could improve each of them. Then guess what: training begins! Train yourself and keep training. Training for a day here or a day there doesn't mean you will improve. You have to keep at it and train regularly, just like you would when training for a race, lifting weights, or, in my case, putting a ball through a hoop.

EAT HEALTHY

Part of my training consisted of eating healthy, and eating certain foods before big games to maximize my energy. The same goes for my SQL skill sets. You will see referenced on this blog many bloggers whom I look up to.

The list could go on, but I try to gain as much knowledge as I can from these guys I look up to and be like a sponge. They provide a plethora of information, and you know what? I found that all of them are as eager to help as I am eager to learn. The SQL Community is abundant with tips and resolutions from its many sites and forums.

To grow in this industry, take advantage of what is at your disposal. Chances are the issues or questions you have will be answered by someone, or by some corner of the SQL Community. That is why I enjoy the SQL Community so much; it is one big family. Eat knowledge regularly; grow your knowledge just like an infant eats and grows as he or she gets bigger.

MAXIMIZE YOUR ABILITIES ~ FOURTH QUARTER!

In high school (fourth quarter) and in college (second half) I had this mental ritual I would do. I'd come out, slap both hands on my chest, and then smack the floor. By this point I was dog-tired; I had pushed myself as hard as I could the whole game, and then I had to find the strength (this is where conditioning comes in) to make it through that quarter or half and stay focused when I was tired.

Have you ever pushed your limits with SQL? If not, then how are you growing? I challenged myself, and I will challenge you: push your limits and get out of your comfort zone. Learn something you didn't know before; add some skill sets to your arsenal. You can't expect to go out and knock out every inch of transactional replication if you've never seen it. Think of it in terms of taking small baby steps, but keep moving forward.

HELP OTHERS

I could chalk this up to getting older, but the longer I'm in the SQL industry, the more I enjoy helping others achieve their goals along the way. Take advantage of the many forums out there and share your knowledge; who knows, maybe one day you can help others as others have helped me. I think a lot of times it is easy for us in the industry to take for granted the ones who are at the forefront, such as the Grant Fritcheys, the Paul Randals, the Glenn Berrys, the Chris Shaws, the Brent Ozars, and the list could go on and on. These guys get it and have worked hard so that people like you and I could also get it.

As I close this thought: I recently read a post from Paul Randal. I can't find it now, but in essence it spoke of a request he received for advice that didn't even say please or thank you. I implore you to not only challenge yourself but also remember that a lot of times these guys help us for free. A simple please and thank you can go a long way. Take the time to thank them for what they do. You may have others that you follow, or mentors ~ have you thanked them lately for their advice, or for being a sounding board as you maneuver through your career? If not, think about it.

How many .MDF Files did you say?

How many File Groups?

It isn't abnormal in my line of business to run across a vendor who has set up a SQL database, trained the users on the application, and then departed off to the wild blue yonder, leaving the DBs in the hands of the company's on-site DBA team. Not long ago this very scenario came about. Once the vendor had departed, I was curious what they had done; and since I'm now responsible for the maintenance, issues, etc., I figured it would behoove me to check it out.

I found a 3 GB database, which is cool; checking out the filegroups, I discovered over 300 .MDF data files had been created. My initial gut reaction was to ask why, but as I got into further research I could see where they were going with it. Upon my asking, the vendor gave two reasons for utilizing the sliding window approach: one, they had one architect and that is what he used; and two, that is how their app interacts with the product on the back end. Given their answer, I figured a more prudent search was merited for my own purposes. I was intrigued…
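If you want to run the same spot check, one quick way (of several) is to list a database's files and the filegroups they sit in; run this in the context of the database in question:

-- Log files have no filegroup, hence the LEFT JOIN
SELECT f.name AS logical_name,
       f.type_desc,
       fg.name AS filegroup_name,
       f.physical_name
FROM sys.database_files AS f
LEFT JOIN sys.filegroups AS fg
    ON f.data_space_id = fg.data_space_id;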

In researching, I came up with a few reasons that multiple data files would be utilized:

  1. Performance increases
  2. Parallelism
  3. Rolling off obsolete data

Now, I'm sure there are more than just three gains that could be brought up; however, in my case these were the ones that stood out.

I can see that if tables and indexes were placed onto separate filegroups, it would allow for better performance on complex queries, in that SQL Server can utilize parallel I/O operations across the files. I can also see that if the app is rolling off obsolete data, partitioning across filegroups can be utilized pretty heavily.
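As a minimal sketch of what that layout might look like (VendorDb and every filegroup and object name below are made up for illustration, not the vendor's actual design):

-- One filegroup per year, each with its own data file
ALTER DATABASE VendorDb ADD FILEGROUP FG2011;
ALTER DATABASE VendorDb ADD FILEGROUP FG2012;
ALTER DATABASE VendorDb ADD FILE
    (NAME = N'VendorDb_2011', FILENAME = N'C:\SQLData\VendorDb_2011.ndf')
    TO FILEGROUP FG2011;
ALTER DATABASE VendorDb ADD FILE
    (NAME = N'VendorDb_2012', FILENAME = N'C:\SQLData\VendorDb_2012.ndf')
    TO FILEGROUP FG2012;
GO

-- RANGE RIGHT: each boundary date begins a new partition
CREATE PARTITION FUNCTION pfByYear (datetime)
    AS RANGE RIGHT FOR VALUES ('20110101', '20120101');

-- Three partitions (pre-2011, 2011, 2012 onward) map to three filegroups
CREATE PARTITION SCHEME psByYear
    AS PARTITION pfByYear TO ([PRIMARY], FG2011, FG2012);

Tables built on psByYear can have an old year's partition switched out and dropped, which is the sliding window idea the vendor described.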

All in all, I found that there are some perks to having multiple filegroups. I'm not totally sold on having over 300 data files for a 3 GB database; however, I do see some valid points for using this type of methodology against a very large database with complex queries hitting it.

I think it goes without question that thorough testing should be completed when using multiple filegroups. Just because it works well in one environment does not mean it will work as well in another, and I also want to STRONGLY recommend that you don't decide to make this jump straight in a prod environment. I would hope that is a given, but I always like to reiterate: some formal testing, in my opinion, should occur first.