Backups Over Time

I’ve written about backups at various times, including responsibility, problems with systems, and home protection. This is a subject that’s important to me, as I consider the responsibility for protecting data to be first and foremost. Security, performance, and more all matter, but backups and restores are the most important items to pay attention to.

As data professionals, whether looking at organizational systems or our personal data, we will find that our backup strategies change over time. We may also find that systems come and go, and it can be easy to forget about older ones. I know I’ve had to track down tapes and restore decommissioned systems years after they were reformatted or powered off.
I ran across an interesting post from someone who went through their old backup media and moved it all to newer media, as well as cloud storage. While I’m not sure I really want to go through old hard drives and keep old code or data, it’s an interesting exercise to think about.
Do you worry about losing data from old backups? This probably applies more to home systems than organizations, but perhaps you have regulatory requirements to keep seven (or more) years of backups around. Maybe you want to be sure that your old code, projects, pictures, and more are saved elsewhere. Maybe you even want to ensure that newer formats of media are being used.
What techniques might you use to accomplish this? I know I have a Windows Home Server that receives copies of pictures/music/video and a CrashPlan account that backs up various machines in the house. That seems to work well, though I do consider taking those pictures/videos and putting them on DVDs for long-term storage out of the house. I’m hoping .jpg and other formats don’t fall out of popularity anytime soon.
Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (2.8 MB) podcast or subscribe to the feed at iTunes and LibSyn.

Promotions and Conflicts of Interest

I noticed that my co-worker, friend, and PASS board member, Grant Fritchey, posted a note on members of the PASS Board of Directors (BoD) presenting pre-cons at SQL Saturday events. It’s potentially a legal issue, and a conflict of interest. I’m glad the issue is being raised and discussed publicly. Here’s Grant’s question:

The question is simple, for a PASS branded event, should a member of the PASS board receive payment?

There are two parts to this, because there are two kinds of events: those the organization runs and takes legal and financial responsibility for, and those associated with PASS but run by others who bear that responsibility.

My short answer is yes to local events, and no to events run by the PASS board. I’ve read through some of the comments and I have some thoughts.

PASS Run Events

First, events run by the organization, such as the Summit and the BAC, are different. The BoD can vote on aspects of these events and can override decisions on which individuals are chosen. That being the case, I think there is a clear conflict of interest, and for the limited time BoD members serve, they should not be a part of these events. Whether they receive direct payment or not, I’d say no.

There was a conflict over this a few years ago, and I think the concern was justified. If you serve, you can’t present a pre-conference (or post-conference) session. You have other duties, and a responsibility here.

If your business or your employer wants you to be a part of this event in a different way, resign one position or the other.

SQL Saturdays

Really, this could be any event that PASS might support or lend its name to but doesn’t have any financial (or, likely, legal) responsibility for. This is trickier, as the ability to bestow favors on the organizers of these events, in terms of choosing them for PASS-run events, certainly exists. However, I’d say this is very unlikely, and I haven’t seen this power abused. If that changes, I’d change my opinion.

I think that the BoD members are still speakers and well-respected trainers, and I really have no issue with them being accepted to present a pre-con and being paid by the events. I’d like the fact disclosed, but this doesn’t seem like a conflict of interest to me at this point.

It’s also good for the community.

Guess the RTM Build

I’m going to take a page from Joey D’Antoni and start a contest. We’ve had quite a few releases of SQL Server 2016, including, now, four Release Candidates (RCs). RC3 was released recently and was noted as being feature complete. I am assuming this is the last RC, though certainly any big bugs might result in an RC4 (or 5).

This week, for a bit of fun, I’m going to start a pool for the final RTM build number. If you post in the comment thread for this editorial and guess correctly, I’ll send you a prize. Likely this will be something from Amazon, though I haven’t picked anything yet. Suggestions are welcome, though I am thinking of a new SSD. We all need more fast storage.

I’ll use Price is Right rules: the closest guess, without going over, wins. Ties go to the first person posting, based on thread order. If the final build is 13.00.9500.12 and you post 13.00.9500.13, you’re out of the running.

The RCs have been almost 100 builds apart. However, there’s no guarantee that the RTM build will follow any pattern, though previous releases have tended to end on an easy-to-remember build number like 12.00.2000, 11.00.2100, 10.0.1600, etc.

Take your guess, and if you’re right, I’ll contact you with a private message to redeem your prize. Contest ends when SQL Server 2016 RTMs.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (2.4 MB) podcast or subscribe to the feed at iTunes and LibSyn.

Crack that Encrypted Data

Ransomware appears to be gaining traction as a trend. For a while in my career, it was viruses designed to mail themselves to all your contacts. Then it was infections that turned your computer into part of a botnet. Now it’s malware that encrypts your files and demands a payment for the password.

I’m starting to think that a) I need to ensure I have solid backups on all devices, and b) I should pay attention to, and be aware of, decryption programs. I’d love to say I could build a decryption program, as someone did, but as much as I’m interested in encryption and study it, that’s a little outside my wheelhouse.

I’m actually starting to think that this might be a way that people in communities, like the SQL Server community, can help each other. We can be aware of potential ransomware threats, like the one that hit this hospital, and potentially share ways to recover from the incident, or even decrypt the drives. In fact, I suspect it might be worth keeping a system handy to practice decryption techniques, if you can determine the attack vector.

I’m sure many organizations wouldn’t want to share details of attacks and infections, but this is exactly the type of information that we, as data professionals, should be sharing. It’s incredibly difficult to keep up with all the threats and attacks, not to mention the techniques to recover systems. I’d urge all of you to ask your employers if you can at least help others, even if you can’t disclose how or where you gained the knowledge. If nothing else, the information needs to be shared more publicly to allow us to better protect our systems and be effective custodians of data.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (2.2 MB) podcast or subscribe to the feed at iTunes and LibSyn.

Getting Started with Encryption in SQL Server 2016

The release of SQL Server 2016 gives developers a number of ways to securely encrypt and protect their data. In this introductory session, you will learn about the encryption options in SQL Server 2016 and see how Always Encrypted, TDE, Row Level Security, Dynamic Data Masking, and server-side column encryption can secure and protect your data from unauthorized users.

Length: 60-75 minutes

Code: Github

Powerpoint deck: Getting Started with Encryption in SQL 2016.pptx


Merging Code with KDiff3

As I write more code, especially database code, and collaborate with others (or with myself), I find that I need to merge code more and more. It’s inevitable, and since I work across a couple of machines, I even find that I need to merge my own code at times.

One way to do this well is to use a merge tool of some sort. KDiff3 is a popular, free tool, and it’s one that integrates nicely with SQL Source Control, which I also use for various tasks.

If you install KDiff3 (use “choco install kdiff3”), you get a basic tool that allows you to compare files. When you start it, a simple interface lets you select multiple files and view them together. If you don’t use version control (a mistake in itself), you probably manage code something like this:
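KDiff3 can also be driven from the command line, which is handy for scripting comparisons. A minimal sketch, assuming kdiff3 is on your PATH; the file names here are just examples:

```shell
REM Install KDiff3 with Chocolatey (from an elevated prompt)
choco install kdiff3

REM Compare two versions of a script side by side
kdiff3 GetOrders_dev.sql GetOrders_prod.sql

REM Three-way merge: a common base plus two edited copies,
REM writing the result out to merged.sql
kdiff3 base.sql mine.sql theirs.sql -o merged.sql
```

The three-way form is the one source control tools typically invoke for you when a merge conflicts.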

[screenshot: manually versioned files]

Or you might have this:

[screenshot: another set of manually versioned files]

Either way creates problems. If you have multiple people doing this, then you have other issues.

KDiff3 gives you a nice interface for seeing the differences between files. As you can see below, the changes are marked in each file.

[screenshot: KDiff3 showing differences between two files]

I can edit the files directly, or choose to merge them together. If I do the merge, I’ll get a third window that has the merged code.

[screenshot: KDiff3 merge window with the merged code]

As you can see, if I click the merge area, I can choose the code from either of the original files.

On the right is a scroll bar that lets me know where in the file the differences are.

[screenshot: KDiff3 scroll bar marking the location of differences]

Software developers think nothing of these tools, but DBAs aren’t as used to working with them. I’d suggest you download KDiff3 and give it a chance the next time you need to reconcile code between team members or servers.

And start using Version Control. It’s easy and incredibly useful.

The Hidden Trigger

I ran across a post recently where someone asked about using a trigger for a particular issue. There were quite a few responses, and one person brought up the fact that triggers have various downsides, all of which I agreed with. However, I didn’t see one downside mentioned, and it’s a big one for me.

Triggers are hidden.

It’s not that triggers are hidden from DBAs by design; after all, there’s a Triggers folder under each table in Object Explorer. But how often do you open that folder? It’s just not obvious when a trigger exists on a table. Triggers are uncommon enough that it’s easy to forget they exist, even if you’re the person who added them to the database.
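If you want to check for lurking triggers without clicking through folders, the sys.triggers catalog view lists every DML trigger in a database. A quick sketch using sqlcmd; the instance and database names (.\SQL2016, SandboxDB) are placeholders for your own:

```shell
REM List all DML triggers in a database, with their parent tables
sqlcmd -S .\SQL2016 -E -d SandboxDB -Q "SELECT OBJECT_NAME(parent_id) AS TableName, name AS TriggerName, is_disabled FROM sys.triggers WHERE parent_class_desc = 'OBJECT_OR_COLUMN';"
```

Running this during troubleshooting at least rules triggers in or out early.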

This is really my one big complaint about triggers: I can’t easily tell when a trigger is firing, which sometimes leads to issues. I’ve troubleshot strange behavior on SQL Server instances, only to realize hours later that a trigger was firing based on some action.

Perhaps it’s just me, but do many of you think of triggers first when something strange happens in SQL Server? I think I’ve gotten better over the years at looking for them, but I still get stymied and waste time debugging issues. I really wish a “Triggers” tab would appear in SSMS, maybe after the Execution Plan tab, whenever a trigger fired. It could hold a copy of inserted and deleted, or even just the trigger code. That alone might clue me in quicker.

I think triggers can be valuable in database systems, but because they seem to be so rarely used, I’ve grown to distrust and dislike them over the years.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (1.9 MB) podcast or subscribe to the feed at iTunes and LibSyn.

End to End Always Encrypted in SQL Server 2016

Abstract: Protecting our data from unauthorized access becomes more important all the time; however, it has been difficult to ensure sensitive data is encrypted in SQL Server. The new Always Encrypted feature in SQL Server 2016 makes this much simpler for developers and DBAs, with a framework for protecting data at the client, across networks, and inside the database. This new feature allows for limiting access to the data, even from the DBAs and sysadmins that may control the database instance itself. Learn how to implement and use Always Encrypted in your applications.

Length: 60-75 Minutes

Demos:

  • Always Encrypted setup
  • Working with data in a client application
  • Certificate creation and transfer
  • Certificate rotation

Powerpoint Deck: End to End Always Encrypted.pptx

Code: Github

What’s Your Downtime?

That’s the question this week: What’s your downtime?

I thought about this as I read this piece on Azure’s downtime being greater than its rivals’ in 2014. You do need to provide an email to read the article, but essentially it builds on tracking from CloudHarmony to show that Azure had more downtime for the year than its rivals, at about 54 hours. In Europe, however, the high water marks were 5.97 hours for compute and 1.31 hours for storage, so the European Azure cloud fared better than the others.

That’s pretty good. Certainly individual machines went down, and services were unavailable for short periods during failovers, but keeping a hardware platform to around five hours of downtime a year is good. I’m not sure many of the companies I’ve worked for have done this, though to be fair, it’s mostly been patches from Microsoft that caused downtime for Windows machines.

However, let’s look at your database instances. Do you know what your downtime is? I’d encourage you to track it and report on it, or at least have the data in case the boss asks. I don’t know what our SSC total downtime is, but I can tell you that our cluster rebooted in Aug 2015 and Mar 2016, brief outages for some patching. In the past, I’ve typically seen our database instances run the better part of a year, usually coming down only for a Service Pack.
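Even without formal monitoring, the DMVs will tell you when an instance last started, which is a rough first cut at uptime. A sketch using sqlcmd; the instance name .\SQL2016 is a placeholder:

```shell
REM When did this instance last start, and how many days has it been up?
sqlcmd -S .\SQL2016 -E -Q "SELECT sqlserver_start_time, DATEDIFF(day, sqlserver_start_time, SYSDATETIME()) AS days_up FROM sys.dm_os_sys_info;"
```

This only covers the current run, of course; for a yearly downtime number you’d need to log restarts over time.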

If you have monitoring in place, and you should, then I hope you have some idea of downtime for your main databases. If you can share, see if you might set the high water mark for the SQLServerCentral community in the last year.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (2.4 MB) podcast or subscribe to the feed at iTunes and LibSyn.

Basic SQLCMD–#SQLNewBlogger

Another post from me that is simple and hopefully serves as an example for people trying to get started blogging as #SQLNewBloggers.

I had the need to connect from the command line recently, and decided to make a quick post on using SQLCMD, as I had an issue.

SQLCMD is a command-line utility that comes with SQL Server. I know many people don’t use the command line, but it is handy at times. I recently opened a command prompt:

[screenshot: command prompt]

I then typed SQLCMD. After a delay, I got this:

[screenshot: SQLCMD connection error]

The issue here is that I don’t have a default instance on this machine; all of mine are named instances. I need to provide a -S parameter with a server name (and possibly an instance name).

[screenshot: SQLCMD connected using -S]

I do that, and I’m connected. By default, SQLCMD (and osql) try to use Windows Authentication. The 1> prompt indicates that the utility is ready for T-SQL queries. You need to know your language here, as there’s no help.
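In text form, the connection looks like this. The server and instance names, and the login, are placeholders for your own:

```shell
REM Named instance, Windows Authentication (-E, the default)
sqlcmd -S MYSERVER\SQL2016 -E

REM Or with a SQL Server login instead (you will be prompted if -P is omitted)
sqlcmd -S MYSERVER\SQL2016 -U steve
```

Note that the parameters are case sensitive: -S is the server, while -s sets the column separator.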

I can enter code, and check my user name. I do this, and get a 2>. The end of a batch is indicated with “GO” and this will execute the batch. You can see how this works below:

[screenshot: a query and GO executing in SQLCMD]

I can use this to make my code easier to read. I can format code as I would in an editor, though be aware that you can’t go back and edit previous lines.

[screenshot: a formatted multi-line query in SQLCMD]

If I enter go, I’ll get this:

[screenshot: a wide result set in the console]

Not so easy to read. I have to scroll up just to figure out what’s being displayed:

[screenshot: scrolled output showing the column headers]

As you can see, using SELECT *, or retrieving too many columns, makes results hard to read. You would be wise to pick only the columns you need to return.
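A couple of flags also help keep output readable, and -Q lets you run a single query and exit. A sketch, with .\SQL2016 again a placeholder instance name:

```shell
REM -W trims trailing spaces, -s sets the column separator,
REM and naming specific columns keeps the width down
sqlcmd -S .\SQL2016 -E -W -s "," -Q "SELECT name, create_date FROM sys.databases;"
```

With -W and a separator, the output is close to CSV, which is handy for piping to a file.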

To leave SQLCMD, you can type exit, which will return you to the command prompt.

[screenshot: exiting SQLCMD back to the command prompt]

This is a short look at SQLCMD. The older osql utility works the same way, and both are good, lightweight ways to connect to your SQL Server instance.