QA, Utility Databases, and Job Executions

Sometimes we, as data professionals, have to think outside the box. I know, crazy idea, right? Every shop and situation is different; in most cases there will be several different ways to arrive at a solid solution.

 

This post has a few intentions behind it:

  • It is not a “take this solution; it’s the only way” post.
  • Generate some thought and additional methods to reach a goal.
  • This is not intended for a production environment.

Good, now that we have those few things out of the way, let’s get to the meat of the topic. A situation arises where you want to give teams a bit more control to execute jobs without granting full access to SQL Server Agent. In that case a good utility database may come in handy.

Example of an issue: a QA team needs to kick off jobs to test in a specific environment. Keep in mind that each shop is different, which also means security levels will vary from shop to shop. A few choices may come to mind with this issue:

  • Fire off an email to the DBA team and wait for them to kick the job off.
  • Fire off an email to someone with access and wait for them to kick the job off.
  • Wait for the job’s predefined schedule and let SQL Server Agent kick it off.

Another method would be to utilize a utility database. You can give it whatever name meets your criteria; in this case we will just call it TestingJobs. Let’s look at the overall picture below and how this all fits together:

Things you’ll need

  • Utility database
  • Three stored procedures
  • Table
  • Agent job

Step 1: Create the TestingJobs database (I won’t go into specifics on proper setup here; assume this is already created).

Step 2: Create a table called ControlJobs inside the TestingJobs database.

USE [TestingJobs]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

SET ANSI_PADDING ON
GO

CREATE TABLE [dbo].[ControlJobs](
    [JobControlID] [INT] IDENTITY(1,1) NOT NULL,
    [JobName] [VARCHAR](500) NOT NULL,
    [RunStatus] [BIT] NOT NULL DEFAULT ((0)),
    [LastRanBy] [VARCHAR](50) NOT NULL,
    [LastRanByApp] [VARCHAR](150) NULL,
    [Date_Modified] [DATETIME] NOT NULL DEFAULT (GETDATE()),
    [Active] [BIT] NOT NULL DEFAULT ((1))
) ON [PRIMARY]

GO

SET ANSI_PADDING OFF
GO

Step 3: Stored procedure creation for table insertion (note the parameter @JobName).

USE [TestingJobs]
GO

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

CREATE PROCEDURE [dbo].[test_StartJobs] ( @JobName VARCHAR(500) )
AS
BEGIN

    /************************************************************************
    This procedure inserts the record needed to kick off agent jobs.
    ************************************************************************/

    INSERT INTO [TestingJobs].[dbo].[ControlJobs]
            ( [JobName] ,
              [RunStatus] ,
              [LastRanBy] ,
              [LastRanByApp] ,
              [Date_Modified] ,
              [Active]
            )
    VALUES  ( @JobName ,
              1 ,
              SUSER_SNAME() ,   -- capture who queued the job
              APP_NAME() ,      -- and from which application
              GETDATE() ,
              1
            );

END;
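With the procedure in place, queuing a job is a single EXEC call. A minimal sketch of how a team member might use it (the job name here is purely illustrative):

```sql
USE [TestingJobs];
GO

-- Flag a job to run; the agent job will pick it up on its next poll
EXEC dbo.test_StartJobs @JobName = N'QA - Refresh Test Data';

-- Confirm the row is queued
SELECT JobName, RunStatus, Date_Modified
FROM dbo.ControlJobs
WHERE RunStatus = 1;
```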

Step 4: Set up the stored procedure that will run the pending jobs.

USE [TestingJobs]

GO

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[RunPendingJobs]
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @JobName VARCHAR(500) ,
            @RC INT;

    DECLARE cur_RunJobs CURSOR
    FOR
        SELECT  JobName
        FROM    dbo.ControlJobs
        WHERE   RunStatus = 1
        ORDER BY JobName;

    OPEN cur_RunJobs;

    FETCH NEXT FROM cur_RunJobs INTO @JobName;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        PRINT 'Checking to see if job is currently running.';

        EXEC @RC = dbo.GetCurrentRunStatus @job_name = @JobName;

        IF @RC = 0
            EXEC msdb.dbo.sp_start_job @JobName;
        ELSE
            PRINT @JobName + ' is currently running.';

        UPDATE  ControlJobs
        SET     RunStatus = 0 ,
                Date_Modified = GETDATE()
        WHERE   JobName = @JobName;

        FETCH NEXT FROM cur_RunJobs INTO @JobName;
    END;

    CLOSE cur_RunJobs;
    DEALLOCATE cur_RunJobs;
END;

Step 5: Set up the stored procedure that checks whether a job is already running.

USE [TestingJobs];
GO

SET ANSI_NULLS ON;
GO
SET QUOTED_IDENTIFIER ON;
GO
CREATE PROCEDURE [dbo].[GetCurrentRunStatus] ( @job_name sysname )
AS
BEGIN
    SET NOCOUNT ON;

    /* Execution status values for the jobs:
       0 - Returns only those jobs that are not idle or suspended.
       1 - Executing.
       2 - Waiting for thread.
       3 - Between retries.
       4 - Idle.
       5 - Suspended.
       7 - Performing completion actions */

    DECLARE @job_id UNIQUEIDENTIFIER ,
            @is_sysadmin INT ,
            @job_owner sysname ,
            @Status INT;

    SELECT  @job_id = job_id
    FROM    msdb..sysjobs_view
    WHERE   [name] = @job_name;
    SELECT  @is_sysadmin = ISNULL(IS_SRVROLEMEMBER(N'sysadmin'), 0);
    SELECT  @job_owner = SUSER_SNAME();

    CREATE TABLE #xp_results
        (
          job_id UNIQUEIDENTIFIER NOT NULL ,
          last_run_date INT NOT NULL ,
          last_run_time INT NOT NULL ,
          next_run_date INT NOT NULL ,
          next_run_time INT NOT NULL ,
          next_run_schedule_id INT NOT NULL ,
          requested_to_run INT NOT NULL , -- BOOL
          request_source INT NOT NULL ,
          request_source_id sysname COLLATE DATABASE_DEFAULT NULL ,
          running INT NOT NULL , -- BOOL
          current_step INT NOT NULL ,
          current_retry_attempt INT NOT NULL ,
          job_state INT NOT NULL
        );

    INSERT  INTO #xp_results
            EXECUTE master.dbo.xp_sqlagent_enum_jobs @is_sysadmin, @job_owner,
                @job_id;

    SELECT  @Status = running
    FROM    #xp_results;

    DROP TABLE #xp_results;

    SET NOCOUNT OFF;

    RETURN @Status;
END;

Step 6: Job Creation

Create a SQL Server Agent job that calls RunPendingJobs in the TestingJobs database. For this test you can set the schedule to run every three minutes.
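If you prefer to script the agent job rather than click through SSMS, it might look something like the sketch below; the job name, schedule name, and three-minute interval are all illustrative, so adjust them to your environment:

```sql
USE [msdb];
GO

-- Job that polls ControlJobs for pending work
EXEC dbo.sp_add_job
    @job_name = N'TestingJobs - Run Pending Jobs';

EXEC dbo.sp_add_jobstep
    @job_name = N'TestingJobs - Run Pending Jobs',
    @step_name = N'Run pending jobs',
    @subsystem = N'TSQL',
    @database_name = N'TestingJobs',
    @command = N'EXEC dbo.RunPendingJobs;';

EXEC dbo.sp_add_jobschedule
    @job_name = N'TestingJobs - Run Pending Jobs',
    @name = N'Every 3 minutes',
    @freq_type = 4,              -- daily
    @freq_interval = 1,
    @freq_subday_type = 4,       -- repeat on a minute interval
    @freq_subday_interval = 3;   -- every 3 minutes

EXEC dbo.sp_add_jobserver
    @job_name = N'TestingJobs - Run Pending Jobs',
    @server_name = N'(LOCAL)';
```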

The Benefit

Now think of a QA team member sitting at their desk running multiple tasks. This does take some coordinated effort in gathering the job names, but with the basics set up the team member can execute test_StartJobs, which places the necessary information into the ControlJobs table. Of course, the proper security would need to be set up in order for the user to be added (think AD groups). By utilizing the above method the team can be self-sufficient in a non-production environment, streamlining some of the inefficiencies that have plagued groups in the past.

Summary

A few things to note here:

  • Don’t ever take code off the internet without testing it. This is just a thought-provoking post, and some things in it depend on your own setup and testing.
  • I realize there are multiple ways to accomplish this. This is just an avenue to explore and test with some thinking outside the box.
  • Don’t limit yourself to “I can’t” or “This will not fly at my shop”; challenge yourself to become innovative and think of ways to tackle problems.

T-SQL Tuesday #89 Invitation – The times they are a-changing

This month’s topic by Koen Verbeeck (B|T) is based around times that are a-changing. It was inspired by a blog post Kendra Little (B|T) put together, Will the Cloud Eat My DBA Job. With the ever-changing world of technology, Koen wants to know what kind of impact it has had on you and how you plan to deal with the future of data management and analysis.

To The Cloud

I think one of the topics I’ve seen a gradual change in is the cloud. Working in the financial sector, cloud talk is not always welcome; it is a myth to some. I am glad that a while back I heeded Grant Fritchey’s (B|T) advice when he said you had better start learning cloud techniques sooner rather than later. The cloud discussion is not always a welcomed one, but it is one that needs to be had to keep up with innovative technologies.

Once you find the best solution for you and your respective area, the cloud does allow for flexibility and control. The main issues I see most shops running into are security concerns around a cloud model, along with costs tied to data size and the like.

With the proper planning and oversight, the cloud is a viable option that should not scare away data professionals.

Third Party Utilities

I think over time some of the third-party tools have become game changers in a lot of shops. Vendors such as SQL Sentry and Red Gate have evolved to help streamline and provide better efficiencies around data management and monitoring. Staying cutting edge keeps the users of these third-party tools eager and wanting more. Tying all of this into automation of daily tasks, and not just on-premises monitoring but cloud monitoring, has been a huge plus for the community.

The Way Business Interprets Data

Data is what drives us; it is what a lot of decisions, and the direction of companies and shops, are derived from. The data, and how we look at it, is ever-changing. Take Power BI, for example. The methods we used years ago have morphed into a far better approach to delivery for businesses. I never thought I would be watching a professional sports game and see them pull out their tablets to review live data between innings or sets of downs.

PowerShell

No, this isn’t a tribute to Mike Fal (B|T) or Drew Furgiuele (B|T), but I do appreciate their nudge in getting me to utilize PowerShell for some of my everyday work. This could fall under the third-party utilities section above, but I thought it beneficial to state that in some cases it has been a game changer. Stumbling upon the dbatools website has been a blessing in disguise; I love getting to work with technology I may not have utilized as much in the past.

Do You Feel Endangered?

No, and neither should you. I should qualify that: don’t be afraid of change, for it allows us to learn new technologies and grow on our journey. There will always be opportunities to learn; each day you should strive to learn something new that you didn’t know before. A wise SQL community member once told me, “The data will always be here; will you?”

T-SQL Tuesday

For those who are not aware, T-SQL Tuesday is a monthly blog party started by Adam Machanic (B|T). Each month a community member picks a topic, then gathers all the blogs that participated in the event and provides a recap. It’s a great way to share knowledge and an avenue to give back to the community. If you are an avid blogger and would like to be a host, do reach out to Adam via the methods provided.

Don’t Duck On Responsibilities

Being a data professional, you assume a certain amount of responsibility. It often requires having the right attitude and an action plan for finding solutions to the problems at hand. Too many times we attack the symptoms but overlook the root cause. The quick Band-Aid fixes are found many times over, whereas our job should be identifying the real issues that lie beneath the symptoms. Now, don’t get me wrong – I understand at times you have to stop the bleeding. In the end, though, one should uncover the root cause and make the permanent fix.

Prioritize the issue at hand

Chances are you, dear reader, encounter many problems throughout the day. Never try to solve all the problems at once; instead, make them line up for you one by one. It might seem odd, but make them stand in a single-file line and tackle them one at a time until you’ve knocked them all out. You may not like what you find when uncovering root causes, but that is part of the process. Be cognizant that what you uncover may or may not be the root of all the problems.

Take time and define the problem

In its simplest form, take time out and ask yourself this question – “What is the problem?” Sounds easy enough, doesn’t it? You’d be amazed at the many knee-jerk reactions data professionals make all over the world. You may be thinking to yourself that there has to be more to it than that. Think about it in four easy steps:

  • Ask the right questions – if you only have a vague idea of the situation, don’t ask general questions. Do not speculate; instead, ask process-related questions about things such as trends or timing. What transpired over the course of the week that may have led to this issue?
  • Talk to the right people – you will inevitably face people who have the all-knowing and all-correct way that things should be done. Be cautious, as you may find resistance to change and blind spots in these individuals. Creativity is, at times, essential to problem-solving.
  • Get the “set in stone” facts – once the facts are all laid out and defined you may find that the decision is pretty concise and clear on action that should be taken.
  • Be involved – don’t just let the first three steps define you; get involved in the process of being the solution.

Questions to ask yourself regarding the problem

  • Is this a real problem?
  • Is it urgent?
  • Is the true nature of the problem known?
  • Is it specific?
  • Are all parties who are competent to discuss the issue involved?

Build a repository

Once you’ve come to a conclusion and provided a solution to the issue – document it. I know I just lost several readers there. Believe it or not, documentation will save your bacon at some point. Maybe not next week or next month, but at some point down the line it will. Some things to consider:

  • Were we able to identify the real cause to the problem?
  • Did we make the right decision?
  • Has the problem been resolved by the fix?
  • Have any key people accepted the solution?

I am reminded by a saying I once ran across:

Policies are many, Principles are few, Policies will change, Principles never do

Summary

Each day we encounter issues and problems. Don’t let them define you; rather, you define the issue. Oftentimes we overlook the root cause; remember to go through your process, policy, and standards in rectifying the problems at hand. It is better to tackle problems when they are known than to sweep them under the rug for the next data professional, who is then faced with fixing them.

Hopefully this short post will provoke you to think about the issues you deal with on a daily basis and how best to tackle them.

Are You Wasting Energy?

Oftentimes we as leaders within our respective shops tend to waste our time focusing on the wrong things. Think about that for a moment, and think about the team you are on or that you are leading. I’ve seen it happen among some very talented teams where we (yes, I include myself in this) are not tapping into the expertise of our teams and bringing out the most potential in each team member.

Over the course of time I’ve come across some things that have helped me in a leadership role that may help other data professionals out there who are starting out, or maybe even a seasoned vet.

Want Results?

  • Eliminate distractions – you have to define what matters most. What are you or the team doing that may prevent you from focusing on the real tasks at hand?
  • Get real – face it; there will be times when awkward conversations are needed. Hold each other accountable if you are on a team; and if you are a lone DBA, as some of my friends out there are, build a base of trusted advisors in the community and bounce ideas off them.
  • Point out what is not working – this may seem simple enough, but believe me, it’s not always easy to overcome. Constantly review processes and procedures to make sure you are driving forward, not drifting backward.
  • Set some goals – do this with your team, individuals, or yourself – put the emphasis on with.

Change is hard; it is never easy. That’s where coaching comes in; you have to stick with it. Trust me; if it were easy, everyone would be doing it.

What Are Some Ways To Define Success?

  • Respect and leverage – I’ve personally found that when teams respect each other and can strategically leverage each person’s talents, watch out. You are about to witness something special take place.
  • Management has focus – as I typed this I had to take a moment and reflect on the team I’m blessed to lead. Am I positively focused on leading the group? I do believe I have their best interests at heart, but that doesn’t mean “we” won’t make mistakes. I include we because, dear reader, you may be in this category with me.
  • Does your team matter – your colleagues and teammates, the ones you get in the trenches with on a daily basis, should feel like they matter.
  • Ability to be innovative – one of the key success points I’ve experienced is turning a team loose and just saying, “be innovative”. At the end of the day I’ve got your back, and guess what: you will fail. Let me repeat myself; you will fail. However, if you are not being innovative, or your team or colleagues are scared to try anything new for fear of backlash, then are you truly pushing forward?
  • Good enough isn’t really good enough – a saying that has stuck with me my whole life is a simple one: somewhere someone is practicing getting better, and when you meet that person one on one, will you rise to the challenge? It is okay to set the bar high, and it is also okay to keep working hard toward an end goal. On the flip side, it is also okay to learn from your mistakes and let that be the fuel for the fire to keep getting after it.

Summary

I challenge you to embrace your aspirations today. If you lead a team of data professionals, take a long hard look at how you are leading your group. Let innovation, collaboration, and engagement with others turn into respect, leveraged talent, and building on success. If we are bragging or dwelling on the past, that may mean we are not doing enough in the present.

How’s that fuel on the fire; are you passionate about succeeding? These are just some thoughts I’ve jotted down over the years that may help you along your journey as a data professional. Time to get after it and make it happen ~ BE THE CHANGE.

Is There A Threat Inside?

If you’ve been involved in technology for any length of time, you are aware of outside threats to your network or databases. You read about some of these threats in the news: hacking, breaches, and the like.

All of these outside threats are pertinent and require our attention to detail as data professionals, but are you also considering threats that could occur on the inside? Every shop should have some form of guidelines, documentation, and regulations around their processes.

Inside threats come from employees, ex-employees, and trusted partners. Some of these threats are accidental, while others can be malicious in nature. In either circumstance the consequences can be devastating for a company. Below are some things to think about within your own environment to prevent such actions from occurring.

Secure User Access

  • Stop unauthorized access – in all honesty, this means button up the shop. If you have SA access across the board, you are doing it wrong. Think about utilizing role-based security, AD groups, etc. You are responsible for the data, so don’t make this an afterthought.
  • Manage the threat of shared passwords – fifteen people shouldn’t have access to critical accounts. Check into a secure user and password utility such as Secret Server; there are a number of companies out there that provide such products. Who is accessing these accounts, and why?
  • Organizational critical assets – a company’s data is one of the most important and integral pieces of the puzzle, and it needs to be treated as such. This can mean many different things on many different levels. Do you know who is accessing your data and why?
  • Immediate response to suspicious behavior – what do you do when you find activity that raises concern? If you don’t have a process in place for reporting it, I suggest you think about getting one. Standards for such events are important; trust me on this. The time will come (and it will come) when threats become real. Procedures should be in place and gone over with all related data teams.

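As one small sketch of the role-based approach mentioned above, a database role mapped to an AD group keeps individual logins out of the picture entirely; the database, role, and domain group names here are all hypothetical:

```sql
USE [YourDatabase];
GO

-- Role that carries only the permissions this group of users needs
CREATE ROLE [ReportReaders];
GRANT SELECT ON SCHEMA::dbo TO [ReportReaders];

-- Map an AD group (name is illustrative) into the role;
-- membership is then managed in Active Directory, not in SQL Server
CREATE LOGIN [DOMAIN\ReportUsers] FROM WINDOWS;
CREATE USER [ReportUsers] FOR LOGIN [DOMAIN\ReportUsers];
ALTER ROLE [ReportReaders] ADD MEMBER [ReportUsers];
```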
I ran across an article from Simple Talk some time back and found it very fruitful: How to Get SQL Server Security Horribly Wrong. When you get time, do check it out. In many cases I have run across, security is an afterthought – don’t let it be.

Define Areas of Vulnerability

This is a key component in getting started with taking your data seriously. Accessibility to information is a key deliverable in most shops; the data is the heartbeat. Face it; we live in a data-driven world today, and many decisions throughout every minute of the day are based on the integrity of the data. Without addressing security in the design around the data, you will be left open to potential threats.

  • Network File Shares
  • Legacy Permissions
  • Logging and Monitoring
  • Change Control

These are just a few of the potential vulnerabilities a shop can be exposed to.

Summary

We, as data professionals, need to take control and secure our data. But even more importantly we need to educate our end users on best practices and standards within the companies and shops we are associated with. Security can no longer be an afterthought.

If this means changing some things and rattling some cages, then so be it; it may just save you in the end from a major security breach. We are often aware of external threats; what most people tend to overlook are the threats from within the walls of a company.

It is imperative to take preventative measures, and even the highest level of clearance should be monitored in some form or fashion. Think about the DBA for a second, and not just because I am one. They have the keys to the kingdom, so to speak, the same as a lot of sysadmins. There should be transparency in their actions; auditing should occur as to the what, when, and why.

Taking it a step further would be conducting data forensics (that would be a fun topic of discussion).

Bottom line: I encourage you to start taking security around your data seriously; if you don’t, someone else will.

How I Became A…SQL Server Data Professional

Last night I saw a tweet from Matt Gordon (b|t); the topic caught my eye – “How I Became A…SQL Server Data Professional”. The original idea spawned from Kevin, who is known as the SQL Cyclist (b|t), over at this post.

I’ll have to fill in the blank the same as Matt Gordon did with “How I Became A SQL Server Data Professional”.

It definitely was not a bed of roses to get where I am now. My story is the same as many others across the world in that I became a DBA pretty much by accident. I was a developer right out of college writing code in a language called Progress (think Visual Basic). I still remember to this day seeing variables in the code, passed down for a while, named after Gilligan’s Island characters; you know, skipper = minnow + gilligan.

Needless to say, I spent three years at that job and enjoyed it, but with growth, and thinking I could take on the world, I wanted to spread my wings and fly a little bit, which landed me the next eight years at a place where my groundwork for getting into SQL would really flourish. I was doing SQL development work sprinkled in with some .NET and quickly realized that I wanted to stick with the core engine. I’m not sure why, but SQL just stuck; it was intriguing to me. How to make queries run faster, how to get the most out of the engine, why queries were taking so long to execute – all these things kept running through my head. Doing database work was building me into the DBA that would bring me to my next job.

It was at this job where I got my feet wet with what the SQL community, PASS Summit, and mentorship were all about. It’s been five solid years since getting involved, and it has been one heck of a ride. As I sit back and look at where I started versus where I am today, I would never have guessed it. People from my past would tell you that when I was younger, if it wasn’t sports-oriented, I would have nothing to do with it. Becoming a SQL data professional is not just a job to me; it’s a passion.

As I sit here and reflect back to all those memories I’m thankful for each one of them. No, it hasn’t always been easy. I believe the harder times have molded me and made me into a stronger more durable Data Professional. Whatever road you take to become what you are remember one thing – it is your journey; your story. You are the CEO of your destiny. Rise and grind – get it done.

Summary

I encourage you to take Kevin’s initial request to heart and think about when you became what you are now. How did you get there? What roads did you travel? I hope you had a good time reading this post; it sparked a lot of memories for me. I wouldn’t trade any of them for anything in the world.

What Is SQL Saturday?

I was recently approached and asked, “What is SQL Saturday?” I actually get asked that question more often than not by people within and outside the community who have never had the pleasure of attending an event. I’d like to take a moment and, from my own perspective, lay out the what, the why, and the how come you should familiarize yourself with these great events across the globe.

The What

You can read the official “What is SQL Saturday About” here.

The Yates version is what you will find in this blog post: what it means to me as a data professional. Oddly enough, I attended my first SQL Saturday event four years ago, and that stemmed from going to my first PASS Summit six years ago. Why the gap, you ask? I don’t have a good reason, but what I can say is I was enamored by the fact that I could go to this free event and learn from top-tier speakers. Usually there is a minimal fee involved for lunch, and in some cases I’ve seen where you bring your own. These events usually consist of DBA, BI, and professional development tracks split up across a full day. In them you’ll find everyone from speakers you would see at PASS Summit to local and regional speakers. The good thing about these events is that, on top of the learning, you can network and share experiences and knowledge with other data professionals from all walks of the profession.

Depending on the size of the event, you will have a chance to talk to various sponsors about the products they offer. Not all events will have sponsors, and that’s okay. The purpose of these events is to “help” people continue to learn, and for others to pay forward the opportunities they have experienced.

The Why

The why is important. If you don’t get or read anything else in this post, I want you to stop, open your eyes, and take this in. I can sum it up in one word for you – PEOPLE.

I’ve found myself being a co-organizer for our local event in Louisville, KY – you can read about our upcoming one here. This is no easy undertaking, and I’ve seen the value and the difference it makes in data professionals. Speaking of giving back what has been afforded to you, this is one way I believe I can make an impact on the community. There is nothing like seeing a light bulb go off, or seeing someone who has attended these events come up to you and say that something finally clicked. It’s about the attendees; the people.

Listen, it’s not a life of glitz and glamour. It’s hard work, it’s dedication, and it requires you to have a drive such that when you are faced with adversity, you overcome it. These events provide avenues for data professionals who can’t travel to the big conferences. Providing good quality learning is key to developing and cultivating our growing SQL community ~ I’m a huge proponent that each one reach one. Stop and think about that for a second. Imagine how many people we have in our SQL community. If each person reached one other person, my oh my; and if we help even one person at these events, then it is worth it in my eyes.

How Come

In my finite mind it started with a vision Andy Warren, Brian Knight, and Steve Jones had back in 2007. Knowing two of the three people listed, I know that the mindset was geared toward helping others learn, because I know how much they have invested in me over the years. I look at SQL Saturdays differently than I used to; over time it has grown from learning and soaking up all the knowledge I could, to speaking, volunteering, and helping at these events.

Believe it or not, speakers are people too. It is always encouraging to see new speakers submit abstracts to these events. It allows for the development and growth of upcoming rising stars and lets their talents shine through.

Conclusion

If you haven’t attended a SQL Saturday, then why not start? You can get a full listing of events here.

I do request one thing of you if you attend. You will see volunteers at these events; remember one thing: they do this for free. We aren’t paid major-league contracts with huge salaries; instead, they (we) put in blood, sweat, and tears to put on a good event for you to come learn at. I encourage you to seek one of them out and just say thank you; you have no idea how much it will mean to them.

If you need help getting plugged into a SQL Saturday near you let me know. I’d love to talk to you and help you get started on your journey to further learning – just leave me a comment and I’ll reach out to you.

If you are a new and upcoming speaker, again, give me a shout. I’ll be happy to provide some insights and tricks that have helped me over the years.

Let’s get after it and make it happen – each one reach one. Let’s Roll