Using TRY..CATCH in SQL Prompt

I learned something new recently about SQL Prompt. I was working on documenting and experimenting with snippets and found a quicker way to write code.

One structure I think is great to use is TRY..CATCH. I don't use it enough, but I want to do so more, and make it smooth. SQL Prompt includes a tc snippet, which gives me this code when I type t-c-TAB.

[Screenshot: the code generated by the tc snippet in SSMS]

You’ll notice that my cursor is in the CATCH block. Why? Well, if you pause after t-c, you’ll see the snippet and code (unless you’ve turned off the window). Notice the $CURSOR$.

[Screenshot: the tc snippet preview showing the $CURSOR$ placeholder]

This means the cursor appears there. Not too handy. However, also notice the $SELECTEDTEXT$ snippet. This is handy.
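If you can't see the screenshots, the tc snippet's definition looks something like this (the exact template may vary by Prompt version, but the placeholders are the important part):

```sql
BEGIN TRY
    $SELECTEDTEXT$
END TRY
BEGIN CATCH
    $CURSOR$
END CATCH
```

$SELECTEDTEXT$ is replaced by whatever code you had highlighted, and $CURSOR$ marks where the caret lands after the snippet expands.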

Let’s change our code and include an update statement. For example, suppose as part of a procedure, I’m writing this code:

[Screenshot: a stored procedure containing an UPDATE statement with no error handling]

I’ve got a procedure, and I’ve forgotten to include TRY..CATCH. Certainly I can surround the highlighted code with the structure, but that’s cumbersome, even using Prompt’s intellisense.

There’s a better way.

Notice the little SQL Prompt icon in the side bar. Prompt is active, and if I press CTRL, I'll get this:

[Screenshot: the SQL Prompt actions list for the highlighted text]

I’ve activated SQL Prompt in the context of my highlighted text. Now I can type t-c and see the snippet.

[Screenshot: the tc snippet appearing in the suggestion list]

If I hit TAB, I get this:

[Screenshot: the UPDATE statement wrapped in the TRY..CATCH structure]

Pretty cool. My update is surrounded by the snippet code, and placed where the $SELECTEDTEXT$ placeholder was used.
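With a hypothetical UPDATE as the highlighted text, the wrapped result looks roughly like this (the table and column names here are my own placeholders, not from the post):

```sql
BEGIN TRY
    -- the previously highlighted code lands where $SELECTEDTEXT$ was
    UPDATE dbo.Orders
       SET OrderStatus = 'Shipped'
     WHERE OrderID = 42;
END TRY
BEGIN CATCH
    -- the cursor lands here, ready for error handling code
END CATCH
```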

Use this. Write code quicker and make it better.

Try a SQL Prompt evaluation today and then ask your boss to get you this productivity-enhancing tool. Or, if you're already using the tool, practice using tc the next time you need to add error handling to your code.

You can see a complete list of SQL Prompt tips at Redgate.


Monday Night Networking Dinner Tonight

If you are attending the PASS Summit, or are even just in the Seattle area, we have a SQL Server pro networking dinner tonight at The Yardhouse on 4th Avenue. This is a buy-your-own event, but you'll have the chance to meet and network with other professionals.

We’ve split the dinner up into four times to lessen the load on the restaurant, and we’d like you to sign up so we can plan better.

The times are:

5:30 – Sign up

6:30 – Sign up

7:30 – Sign up

8:30 – Sign up

I’ll be there for a while, along with Andy Warren and a few volunteers. Hope to see you there.


Your Technology Budget

After I wrote about a home lab recently, someone was curious about the amount of money that others invest in their technology lab, habit, or hobby. I thought that was fairly interesting, so I decided to ask others to get an idea of what amounts people spend and even on what.

How much do you spend on technology in a year?

Let’s skip cell phone and Internet service bills, but if you buy cell phones, tablets, computers, Kindles, or other gadgets, let us know. I’d be curious about your spending on your career, which might be travel and events, training, books, etc. versus the spending on technology for fun.

For me the amounts vary dramatically. The vast majority of my spending on technology is for replacement pieces and parts. I will buy hard drives or monitors or memory, but rarely anything more fun in most years. I don’t spend too much on learning as I get a Pluralsight subscription, and of course, I read SQLServerCentral. I rarely take training, but my generous employer usually covers that. In fact, most years, my total expense for technology is likely a few hundred dollars at the outside.

I have purchased a few Kindle devices for reading, and I’ve gotten cases and batteries for my cell phones. I do go through keyboards and mice every year or two, so those are expenses, but rarely anything more interesting. I did buy a Pebble watch a few years ago. This year was an exception, as I spent a bit to rebuild a desktop that had been aging. I didn’t get everything I wanted, but I substantially improved the speed, which was the important thing. I’ll also likely add a new tablet next year to replace my aging iPad Mini (non-Retina) to help with coaching volleyball. I’d like a drone for some fun camera work with kids and sports, but I suspect that is not likely to happen in the next year. Those gadgets are expensive for a complete indulgence.

Let us know this week what you’re spending money on in technology and if any of those items have proven themselves really valuable, or just lots of fun.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (4.2MB) podcast or subscribe to the feed at iTunes and Libsyn.


Meet and Greet Data Pros at #sqlsummit

Next Monday night is the networking dinner that Andy Warren and I have put together for the last 6 years. Once again, we’ve arranged for everyone in Seattle to come join in.

If you don’t have plans for Monday night, pick a time, and join us at the Yardhouse. We’ve got free tickets that are just being used for planning purposes. There’s no charge, other than the food or drink that you want to order.

Or buy for someone you meet.

This is a great networking event, so please pass the word and attend if you can.


#RedgateRocks at the PASS Summit

If you are attending the PASS Summit next week in Seattle, and want to attend a fun party on Thursday night, come by the Redgate Software booth in the expo hall for your #RedgateRocks bracelet. This is your ticket to admission Thursday night.

Walk down to 1927 3rd Ave between 7:30 and 10:30 Thursday night and come in. We’ll buy the first drink and have some nibbles, as the Brits say.

Grant and I will be down there for at least part of the night along with plenty of Redgaters enjoying the night. We might not look quite the same, but we’ll be having some fun.


There’s plenty of other Redgate contests and fun for the week, so come see us at the Summit.


Where Do You Run Your R Scripts?

I know most of you don’t work with the R language. In fact, plenty of you might not know anything about R other than a cursory understanding of it as some sort of data analysis language. If you want to know more, here’s what the R Project is.

Microsoft wants you to use R Services in SQL Server, or the R Server product available as a standalone system. However, I saw someone ask why anyone would run their R scripts inside SQL Server, since those are expensive CPU cycles to burn on analysis. Someone else noted that Microsoft loves your licensing dollars, so the push to use R Services is perhaps a little self-serving.

Pushing the intelligence to the data makes sense. Isn’t that what we do with large data warehousing queries or SSAS cubes? We’re trying to get the analysis done at scale without having to move the data elsewhere, especially considering we’ve (usually) already moved the data in some sort of ETL (or ELT) process. Gaining insights from our ever increasing scales of data requires some computational cycles somewhere.
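As a sketch of what pushing the intelligence to the data looks like with R Services, here’s a minimal sp_execute_external_script call. The dbo.Sales table and Amount column are my own placeholders, not anything from a real system:

```sql
-- Requires R Services installed and 'external scripts enabled' set to 1.
-- dbo.Sales and its Amount column are hypothetical.
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'OutputDataSet <- data.frame(AvgAmount = mean(InputDataSet$Amount))',
    @input_data_1 = N'SELECT Amount FROM dbo.Sales'
WITH RESULT SETS ((AvgAmount FLOAT));
```

The query in @input_data_1 runs on the instance and feeds the R script as a data frame, so the data never leaves the server.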

What’s the alternative? Large queries that pull data to some client? I think that’s fine, and it might even be a better option, since simple queries that pull data don’t burn as many CPU cycles as those that perform analysis. I certainly understand that the licensed CPU cycles for a SQL Server instance are expensive, and we want to be careful how they are used. Adding complex R scripts might not be the best use of our licensing dollars. On the other hand, if I can perform the analysis more quickly where the data lives, perhaps I can eliminate some of the other random queries analysts want to run against my database.

Ultimately I think that R Services make some sense in SQL Server, but not as some experiment. I would suggest that the R client is the way to experiment, preferably on a copy of data that allows someone to build scripts and determine if there is insight to be gained from a particular set of data. Build a Proof of Concept (POC), and only deploy it to a SQL Server if you find it provides value.

And if you do so, continue to experiment. That R script you run today might not be as useful in six months as your application, database, and business evolve. Data analysis isn’t a set-it-and-forget-it, but rather an ongoing, iterative process.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (3.8MB) podcast or subscribe to the feed at iTunes and Libsyn.


Creating a Logon Trigger–#SQLNewBlogger

Another post for me that is simple and hopefully serves as an example for people trying to get started blogging as #SQLNewBloggers.

Suppose you want to audit logins for your SQL Server instance. There are multiple ways to do this, but logon triggers have a few advantages. First, they get the data into a SQL Server table, a format most of us are familiar with. Second, they guarantee that the event is captured on the instance as long as the trigger is enabled.

There are plenty of other uses for these triggers, but beware that you can cause problems if your code doesn’t execute flawlessly.

I’ll show you how to create a basic logon trigger here. Note that these are server-level items, and you’ll need to be able to create the trigger in the master database.

The structure of this code is similar to other triggers. We’ll use the CREATE TRIGGER DDL. Where this differs from DML triggers is that we use the ON ALL SERVER command. For auditing, I tend to set this as an EXECUTE AS ‘sa’, but you may choose a different type of account.

Here’s my basic code:

CREATE TRIGGER LogonTrigger
ON ALL SERVER
WITH EXECUTE AS 'sa'
FOR LOGON
AS
BEGIN
    -- Column names here are illustrative; match them to your auditing table
    INSERT DBAAdmin.dbo.LoginAuditing
        (LoginName, LoginTime)
    VALUES
        (ORIGINAL_LOGIN(), GETDATE());
END;
Note that in this code I’m specifying a specific table to store the data in. This table has to exist before the trigger fires.
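The post doesn’t show the table’s definition, but a minimal shape that would support this kind of auditing might look like the following; the column names and types are my assumptions:

```sql
-- Hypothetical auditing table; adjust columns to whatever you want to capture
CREATE TABLE DBAAdmin.dbo.LoginAuditing
(
    LoginName SYSNAME  NOT NULL,
    LoginTime DATETIME NOT NULL DEFAULT GETDATE()
);
```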

I also use the FOR LOGON event. You can scope triggers for other events, at the server or database levels, and read more about DDL triggers in BOL.

Once a user attempts to log on, the trigger fires and a simple insert takes place. If the insert fails, the user may not be able to log on.

There is a lot to write about logon triggers, but for this short piece, I’m just showing how to get started. Please, please, please, be careful with these and read the documentation carefully. Be sure you understand how they work and how to disable them. If you implement one, test it extensively.
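If a buggy logon trigger starts blocking logins, you can connect over the Dedicated Admin Connection (DAC) and disable it. Assuming the trigger name used above, the command is:

```sql
-- Run this from a DAC connection (admin:servername) if normal logins are blocked
DISABLE TRIGGER LogonTrigger ON ALL SERVER;
```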


I had a small issue building a logon trigger and thought I’d get a few posts written on the topic. This was 5 minutes’ work since I had the code and just wanted to describe things, but I’ve got a few more posts sketched on this topic as I’ve learned more.

Learning something, solving a problem, writing about it. This is a good way to show someone you are learning about a topic and developing some skill.


Fun and Contests at Summit 2016

I’ll be attending the 2016 PASS Summit as a part of the Redgate Software crew. I’m not speaking this year, so most of my time will be at the Redgate booth, showing you a few tricks and tips with products and maybe convincing you that we can help your company produce software just a little bit better.

As a part of the Redgate experience this year, we have a few fun contests for you, an amazing new product, and a party as well. First, the contests.

Win Coffee

Many of us like coffee. Even if you don’t, I bet someone in your life does. Redgate is giving you the chance to win Death Wish Coffee. In fact, you can win a year’s supply. Just the thing to help you get through those long weeks at work.

Come by the Redgate booth and learn the answers to a few questions. Then enter online.

Oh yeah, you’ll also win a SQL Toolbelt + support.

Win a GoPro Hero 5 Session

We have a different flyer that you can fill out and return to win a GoPro Hero 5 Session. That’s a neat little camera that looks like a lot of fun. In fact, maybe I need to get one of these for some mini movies in my life.

SQL Clone

I’m excited about SQL Clone. In fact, I saw a demo, using some mocked-up basic PoSh, over a year ago, and I knew this was a great idea. It’s finally coming to fruition, with a beta release next month (Nov 2016).

Grant is presenting a session on SQL Clone on Thursday, Oct 27, at 3:15pm. Come see this talk and learn why I think this will really help with database development (and other tasks).

Redgate Rocks

Thursday night, Oct 27, Redgate Software is hosting a party. This will be from 7:30-10:30 at 1927 Events on 3rd Ave. With drinks and snacks, this will be a fun time and a good chance to unwind and chat with colleagues.

You’ll need a wristband to attend, so be sure to stop by the Redgate booth and pick one up.

Follow @redgate

We’re on twitter as @redgate, and we’ll be posting updates during the week, so keep in touch with us and hope to see you next week.


One Million

Is a million writes/sec a large number? I guess it’s all relative, but I’d consider it a fairly large number. Our peak transfers for SQLServerCentral are in the hundreds/sec, though we’re not really running a busy database server. Some of you might have much higher numbers, and if you’re in the 100,000/sec range and can let us know, I’m sure others would be interested in hearing about your experiences.

I ran across a piece on the Uber database infrastructure that I found impressive. No, Uber doesn’t use SQL Server (they use Cassandra), but they have worked to build a large scale infrastructure. Their goals:

  • 1 in 100,000 requests can fail
  • 1,000,000 writes/sec
  • > 100,000 reads/sec

Those are quite impressive. While I’m not sure they’ve ever achieved these levels in production, I’m glad they’re testing at these points. I think far too many people forget to test the limits of where their systems might grow and only stick with where they are today. Or where they were a month ago when they refreshed a test environment from production. Test at larger-than-production levels, at least once in a while.

There’s something impressive about one million. Getting to a MB, roughly 1mm bytes, was impressive to me. Not such a big deal now (with pictures requiring > 1MB), but 1mm MB is a terabyte, and while I carry that in my pocket, it’s still an impressive size. Crossing one million members at SQLServerCentral was impressive. I think $1mm is a lot of money. One in a million still seems like a very small chance of an event. At the recent Data Science Summit, we saw SQL Server score over 1mm fraud predictions/sec.

Achieving 1mm of anything in a database system is still a large number. I know many people have tables with over a billion rows, but I’d still say a million is large. Perhaps you disagree, but I’m still a little awed at seeing SQL Server process a query of 1mm rows in less than a second.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (3.1MB) podcast or subscribe to the feed at iTunes and Libsyn.


DB CI: Getting Test Results in VSTS

One of the things I struggled with a bit was getting test results to appear in VSTS from my builds.

Here’s my original Build flow:

[Screenshot: the original build definition in VSTS]

However, when I’d run a build, I got this in the Build summary:

[Screenshot: the build summary with no test results shown]

That’s not great, and I knew this could work better, so I started to look at where my test results were. Here’s what I did to get the results to appear.

I added a Publish Test Results task to the end. This is in the “Test” section of the Task catalog.

[Screenshot: the Publish Test Results task in the Task catalog]

When I looked in the output of the build tasks, I found these files:

[Screenshot: the files in the build output folder]

I kept thinking I needed to get the .trx file to load, but that wasn’t correct. I really wanted the “*reports.junit.xml” file. After a little experimenting, I configured the task like this:

[Screenshot: the configured Publish Test Results task]
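For reference, in the YAML pipeline syntax Microsoft later added, the equivalent of that task configuration would look roughly like this. It’s a sketch, not a copy of my actual settings:

```yaml
# Hypothetical YAML equivalent of the Publish Test Results task settings
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/*reports.junit.xml'
    mergeTestResults: true
```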

Once I saved this (with a comment), and ran a new build, I could see test results.

[Screenshot: test results appearing in the build summary]

I could have just read the XML file in my build result folder, but that’s a pain. Typically we don’t care unless there is a failed test, but when there is, it’s nice to see what actually failed.

Plus, having the results in the console means I can show the other developers and management that tests are being run and are checking code.
