Static Data

Static Data Scripts


Ever had to wipe out tables only to put the same data back in, such as lookup tables or tables that rarely change, maybe on a yearly basis?

To me, static data scripts are integral; they keep you from re-inventing the wheel over and over again. In thinking about this topic I realized that I had some gaps myself I could shore up, so I created some quick scripts so that when the time comes I have them at my disposal.

A basic template to go by can be something like the one below:

/***************************************
***  Static data management script  ***
***************************************/

PRINT 'Updating static data table [dbo].[tabl1]'

-- Turn off affected rows being returned
SET NOCOUNT ON

-- Change this to 1 to delete missing records in the target
-- WARNING: Setting this to 1 can cause damage to your database
-- and cause a failed deployment if there are any rows referencing
-- a record which has been deleted.
DECLARE @DeleteMissingRecords BIT
SET @DeleteMissingRecords = 0

-- 1: Define table variable
DECLARE @tblTempTable TABLE (
    [Id]    INT,
    [Info1] BIT
)

-- 2: Populate the table variable with data
INSERT INTO @tblTempTable ([Id], [Info1])
VALUES (1, 1)

-- 3: Delete any missing records from the target
IF @DeleteMissingRecords = 1
BEGIN
    DELETE FROM [dbo].[tabl1]
    FROM [dbo].[tabl1] LiveTable
    LEFT JOIN @tblTempTable tmp ON LiveTable.[Id] = tmp.[Id]
    WHERE tmp.[Id] IS NULL
END

-- 4: Update any modified values with the values from the table variable
UPDATE LiveTable
SET LiveTable.[Info1] = tmp.[Info1]
FROM [dbo].[tabl1] LiveTable
INNER JOIN @tblTempTable tmp ON LiveTable.[Id] = tmp.[Id]

-- 5: Insert any new items into the table from the table variable
INSERT INTO [dbo].[tabl1] ([Id], [Info1])
SELECT tmp.[Id], tmp.[Info1]
FROM @tblTempTable tmp
LEFT JOIN [dbo].[tabl1] tbl ON tbl.[Id] = tmp.[Id]
WHERE tbl.[Id] IS NULL

PRINT 'Finished updating static data table [dbo].[tabl1]'

GO

I know this type of methodology has saved me hours here and there in times past, and if you have data that you know will not change, especially when you push from one environment to another, it makes life a little easier.
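On SQL Server 2008 and later, the delete/update/insert steps in the template can also be collapsed into a single MERGE statement. Here's a sketch against the same [dbo].[tabl1] table and table variable used above:

```sql
-- Sketch: same static data sync as the template, as one MERGE
-- (requires SQL Server 2008 or later)
DECLARE @tblTempTable TABLE (
    [Id]    INT,
    [Info1] BIT
)

INSERT INTO @tblTempTable ([Id], [Info1])
VALUES (1, 1)

MERGE [dbo].[tabl1] AS LiveTable
USING @tblTempTable AS tmp
    ON LiveTable.[Id] = tmp.[Id]
-- step 4: update modified values
WHEN MATCHED AND LiveTable.[Info1] <> tmp.[Info1] THEN
    UPDATE SET LiveTable.[Info1] = tmp.[Info1]
-- step 5: insert new items
WHEN NOT MATCHED BY TARGET THEN
    INSERT ([Id], [Info1]) VALUES (tmp.[Id], tmp.[Info1])
-- step 3: the equivalent of @DeleteMissingRecords = 1;
-- remove this clause to keep rows not present in the source
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```

The same warning applies to the DELETE clause here as to the @DeleteMissingRecords flag: any rows referencing a deleted record will cause the deployment to fail.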

 

Sessions For PASS 2012 Released

I was excited to see that the sessions have been released for this year's PASS 2012 Summit.

If you've been involved in the SQL community for any length of time, then hopefully PASS has come up here or there. I can say from personally attending in the past that the knowledge gained and the networking you will do are invaluable.

Some of my peers have asked what the pros are and why they should attend such a Summit. Again, I can only say that you get out of it what you put into it. Go into the Summit with the mindset of learning and gaining some knowledge from the best speakers the SQL community offers ~ well, you'll walk away amazed.

I think PASS does a great job of detailing what they are about, why you should attend, and who should attend. You can check all that out here.

Everything you will need to know about the Summit, from lodging to pricing, is available on their site. If you are attending ~ GREAT; if you are on the borderline, then I implore you to go; and if you don't think you are going, I ask you to take a hard look at it in hopes of changing your mind.

Let’s make this PASS Summit the best yet. Hope to see you there!

Granting View Definition to All Procedures

The other day I came across a situation where I needed to grant view definition rights to a specific user for one single DB. Now, there are several different ways to achieve this, but I decided to just write some code to take care of it for me. Others might have something very similar or different mechanisms to do this, but I used the following code.

The basic gist is as follows:

  1. Create Temp Table – you can use a # table just drop it when done with it
  2. Declaration of Variables
  3. Insert data into temp table – only grab the procedures, and nothing with a dt_ prefix
  4. Grab all of the procedures found in the DB and dynamically set the grant permissions to a SQL string
  5. Execute the SQL string
  6. Restart the counter and reset the name variable just for good measure
  7. Select the temp table to view what was just granted view definition to

Code in doing this:

DECLARE @PermTable TABLE
(
    PK_ID INT IDENTITY(1, 1) NOT NULL PRIMARY KEY CLUSTERED,
    [name] SYSNAME
)

-- declare variables
DECLARE @name SYSNAME,
    @RowCount INT,
    @RecordCount INT,
    @strSQL VARCHAR(2000)

INSERT INTO @PermTable
(
    [name]
)
SELECT ROUTINE_SCHEMA + '.' + ROUTINE_NAME
FROM INFORMATION_SCHEMA.ROUTINES
WHERE ROUTINE_TYPE = 'PROCEDURE'
    AND ROUTINE_NAME NOT LIKE 'dt_%'

-- counters for the while loop
SET @RecordCount = ( SELECT COUNT(*)
                     FROM @PermTable )

SET @RowCount = 1
WHILE ( @RowCount < @RecordCount + 1 )
BEGIN
    SELECT @name = [name]
    FROM @PermTable
    WHERE PK_ID = @RowCount

    SET @strSQL = 'GRANT VIEW DEFINITION ON '
        + RTRIM(CAST(@name AS VARCHAR(128))) + ' TO [User]'

    -- Execute the SQL
    EXEC (@strSQL)

    -- Increment the counter
    SET @RowCount = @RowCount + 1

    -- Never know, so why not reset
    SET @name = NULL
END

SELECT *
FROM @PermTable
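The loop can also be replaced with a set-based approach that concatenates all of the GRANT statements into a single string. A sketch, keeping the same [User] placeholder (the string-concatenation SELECT relies on a common but undocumented T-SQL behavior, so test it in your environment first):

```sql
-- Sketch: build every GRANT VIEW DEFINITION statement in one string
-- instead of looping row by row
DECLARE @strSQL VARCHAR(MAX)
SET @strSQL = ''

SELECT @strSQL = @strSQL
    + 'GRANT VIEW DEFINITION ON '
    + QUOTENAME(ROUTINE_SCHEMA) + '.' + QUOTENAME(ROUTINE_NAME)
    + ' TO [User];' + CHAR(13)
FROM INFORMATION_SCHEMA.ROUTINES
WHERE ROUTINE_TYPE = 'PROCEDURE'
    AND ROUTINE_NAME NOT LIKE 'dt_%'

-- Print for review, then execute
PRINT @strSQL
EXEC (@strSQL)
```

Worth noting: if the goal is view definition rights on everything in the database (not just procedures), a single database-scoped GRANT VIEW DEFINITION TO [User] will cover it without any dynamic SQL.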

Hope this helps; cheers.

 

“Make It Right”

Over the years I have grown accustomed to watching a show called Holmes Inspection. I enjoy working on projects at the house ~ am I good at it? Eh, some things yes, some things no, but nevertheless I do enjoy it.

The premise of the show is that Mike Holmes goes into homes in Canada where contractors have come in and basically not done the job they should have done, or where families have bought homes and the inspector hadn't caught obvious issues to report back to the family. His biggest quote, and what he is known for, is "Make It Right"; he is also a big believer in "Do it right the first time and you won't have to come back to it". He has built his show and business around three things:

  1. Quality
  2. Integrity
  3. Trust

I've started to sit and dwell on these things and how they apply to the DBA work that I enjoy doing.

Quality – without a doubt, the quality of a DBA is very important. In some instances it can make or break a company, from security down to the backup of the tiniest of databases. If a DBA is not on top of his/her game daily, potentially catastrophic events can occur. This in no way means that the DBA will be perfect; we hold ourselves to very high standards, but at the end of the day we are human and mistakes will be made. When those mistakes do occur, how do we handle them?

Integrity – one definition found in the dictionary says this about integrity: "possession of firm principles: the quality of possessing and steadfastly adhering to high moral principles or professional standards." I can't think of a better synopsis of a DBA.

Trust – as a DBA, do people trust you? To me this is an obvious question… if you are a DBA, you have all the ins and outs of the database where all the data is stored. A whole topic could be delved into on trust alone.

I've done things in the past, and have viewed other DBAs' work that was done before me at jobs, where I sit back in amazement and think, "Why in the world is it done this way?" or "What was I thinking?"

I find it very prudent to "Do It Right" the first time, even if it means taking a bit longer to accomplish the task and ensuring all aspects (if applicable) have been thought out. Along those same lines, if I come upon something that I know isn't right, then "Make It Right" ~ quality, integrity, and trust.

 

T-SQL Tuesday #31 – Logging

Looking at my schedule, I realized after discussing with fellow DBAs that I was missing out on something ~ T-SQL Tuesday. I think this is a fantastic idea, and to host a T-SQL Tuesday you can contact Adam Machanic.

Today is T-SQL Tuesday #31, hosted by Aaron Nelson, on Logging.

Logging

In general, to me, logging is an essential part of my everyday activities as a DBA. It helps me in many ways, from tracking deployment changes made by fellow DBAs and myself to bringing back statistical information we use to determine growth.

I recently came across two real-world examples that have helped me with logging.

  1. A DBA friend was in the middle of a deployment; after several minutes the deployment was labeled complete and everyone went on their merry way. A few days later some catastrophic events unfolded, and everyone wanted to know why. Because we were logging the who, what, and when along with the commands issued, we quickly traced back the change and what was changed by the user.
  2. I'm a huge fan of the Utility Database idea; have been ever since I sat in on it at PASS (Chris Shaw presented it). After getting back to the office it was my mission to get the idea implemented. Currently, we are building out an enterprise edition that encapsulates a plethora of information: statistics, heartbeats, file sizes, space on drives, and the list goes on. Logging this information will help guide planning for sizing and space.

I’ve only touched on a couple of topics regarding logging; there is so much more out there.
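For the "who, what, when" style of change logging in the first example, one common approach is a database-level DDL trigger that captures EVENTDATA() into a log table. A minimal sketch; the [dbo].[DDLChangeLog] table and trigger names here are my own, not from a specific deployment:

```sql
-- Sketch: log who made what DDL change, and when, plus the command text
CREATE TABLE [dbo].[DDLChangeLog] (
    [LogId]     INT IDENTITY(1, 1) PRIMARY KEY,
    [EventTime] DATETIME      NOT NULL DEFAULT GETDATE(),
    [LoginName] SYSNAME       NOT NULL,
    [EventType] NVARCHAR(128) NOT NULL,
    [Command]   NVARCHAR(MAX) NOT NULL
)
GO

CREATE TRIGGER [trgLogDDLChanges]
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    SET NOCOUNT ON

    DECLARE @evt XML
    SET @evt = EVENTDATA()

    -- Pull the who, what, and command text out of the event XML
    INSERT INTO [dbo].[DDLChangeLog] ([LoginName], [EventType], [Command])
    VALUES (
        @evt.value('(/EVENT_INSTANCE/LoginName)[1]', 'NVARCHAR(128)'),
        @evt.value('(/EVENT_INSTANCE/EventType)[1]', 'NVARCHAR(128)'),
        @evt.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]',
                   'NVARCHAR(MAX)')
    )
END
GO
```

With something like this in place, tracing a deployment change back to a user and a command is a simple SELECT against the log table.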

Roles, Schemas, and Drops Oh My!

I recently ran into an issue where a development manager was creating a new role in SQL for his data warehouse initiative and cube building. He granted the role ownership of a newly created schema, then changed his mind on a few items.

I explained to him that I had run into this in the past, and provided the following so that he could transfer the owned schema to another user or role and then drop the role he created.

If you try to drop the role while the schema is owned by it, SQL will produce the error, "The database principal owns a schema in the database, and cannot be dropped".

In this particular instance we only had one schema tied to this role, but if a user or role were tied to many, you can run the query below to determine which schemas are tied to the principal:

SELECT s.name
FROM sys.schemas s
WHERE s.principal_id = USER_ID('testname')

The next query will transfer the schemas you want over to dbo, for example, so you can remove the role and then re-assign them as needed:

ALTER AUTHORIZATION ON SCHEMA::SchemaName TO dbo
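The two steps can also be combined: find every schema the principal owns, transfer each back to dbo with dynamic SQL, and then drop the role. A sketch, using the same example principal name 'testname' from above:

```sql
-- Sketch: transfer every schema owned by 'testname' to dbo,
-- then drop the role
DECLARE @strSQL NVARCHAR(MAX)
SET @strSQL = N''

SELECT @strSQL = @strSQL
    + N'ALTER AUTHORIZATION ON SCHEMA::'
    + QUOTENAME(s.name) + N' TO dbo;'
FROM sys.schemas s
WHERE s.principal_id = USER_ID('testname')

EXEC (@strSQL)

DROP ROLE [testname]
```

Once no schemas are owned by the principal, the DROP ROLE succeeds without the error above.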

As always I recommend testing locally or in a test environment prior to doing any type of prod related activity.