Your Technology Budget

After I wrote about a home lab recently, someone was curious about how much money others invest in their technology lab, habit, or hobby. I thought that was a fairly interesting question, so I decided to ask around and get an idea of how much people spend, and on what.

How much do you spend on technology in a year?

Let’s skip cell phone and Internet service bills, but if you buy cell phones, tablets, computers, Kindles, or other gadgets, let us know. I’d be curious about your spending on your career, which might be travel and events, training, books, etc. versus the spending on technology for fun.

For me the amounts vary dramatically. The vast majority of my spending on technology is for replacement pieces and parts. I will buy hard drives or monitors or memory, but rarely anything more fun in most years. I don’t spend too much on learning as I get a Pluralsight subscription, and of course, I read SQLServerCentral. I rarely take training, but my generous employer usually covers that. In fact, most years, my total expense for technology is likely a few hundred dollars at the outside.

I have purchased a few Kindle devices for reading, and I’ve gotten cases and batteries for my cell phones. I do go through keyboards and mice every year or two, so those are expenses, but rarely anything more interesting. I did buy a Pebble watch a few years ago. This year was an exception, as I spent a bit to rebuild a desktop that has been aging. I didn’t get everything I wanted, but I substantially improved the speed, which was the important thing. I’ll also likely add a new tablet next year to replace my aging iPad Mini (non-Retina) to help with coaching volleyball. I’d like a drone for some fun camera work with kids and sports, but I suspect that is not likely to happen in the next year. Those gadgets are expensive for a complete indulgence.

Let us know this week what you’re spending money on in technology and if any of those items have proven themselves really valuable, or just lots of fun.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (4.2MB) podcast or subscribe to the feed at iTunes and Libsyn.

Posted in Editorial | Tagged | 1 Comment

Meet and Greet Data Pros at #sqlsummit

Next Monday night is the networking dinner that Andy Warren and I have put together for the last 6 years. Once again, we’ve arranged for everyone in Seattle to come join in.

If you don’t have plans for Monday night, pick a time, and join us at the Yardhouse. We’ve got free tickets that are just being used for planning purposes. There’s no charge, other than the food or drink that you want to order.

Or buy for someone you meet.

This is a great networking event, so please pass the word and attend if you can.

Posted in Blog | Tagged , , | Leave a comment

#RedgateRocks at the PASS Summit

If you are attending the PASS Summit next week in Seattle, and want to attend a fun party on Thursday night, come by the Redgate Software booth in the expo hall for your #RedgateRocks bracelet. This is your ticket to admission Thursday night.

Walk down to 1927 3rd Ave between 7:30 and 10:30 Thursday night and come in. We’ll buy the first drink and have some nibbles, as the Brits say.

Grant and I will be down there for at least part of the night along with plenty of Redgaters enjoying the night. We might not look quite the same, but we’ll be having some fun.


There’s plenty of other Redgate contests and fun for the week, so come see us at the Summit.

Posted in Blog | Tagged , | Leave a comment

Where Do You Run Your R Scripts?

I know most of you don’t work with the R language. In fact, plenty of you might not know anything about R other than a cursory understanding of it as some sort of data analysis language. If you want to know more, here’s what the R Project is.

Microsoft wants you to use R Services in SQL Server, or the R Server product available as a standalone system. However, I saw someone ask why anyone would run their R scripts inside SQL Server, since those are expensive CPU cycles to burn on analysis. Someone else noted that Microsoft loves your licensing dollars, so their push to use R Services is perhaps a little self-serving.

Pushing the intelligence to the data makes sense. Isn’t that what we do with large data warehousing queries or SSAS cubes? We’re trying to get the analysis done at scale without having to move the data elsewhere, especially considering we’ve (usually) already moved the data in some sort of ETL (or ELT) process. Gaining insights from our ever-increasing scales of data requires some computational cycles somewhere.
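As a rough sketch of what that looks like in practice, R Services lets you run an R script right where the data lives via sp_execute_external_script. The table and column names here are my own illustration, not from any real system:

```sql
-- Sketch: compute a summary inside SQL Server with R Services rather than
-- shipping rows to a client. You'd need 'external scripts enabled' set via
-- sp_configure first. Table and column names are illustrative only.
EXEC sp_execute_external_script
    @language = N'R',
    @script   = N'OutputDataSet <- data.frame(AvgAmount = mean(InputDataSet$Amount))',
    @input_data_1 = N'SELECT Amount FROM dbo.Sales'
WITH RESULT SETS ((AvgAmount FLOAT));
```

The alternative the question implies is pulling all of dbo.Sales across the wire and computing the mean in a client-side R session instead.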

What’s the alternative? Large queries that pull data to some client? I think that’s fine, and it might be a better alternative, since simple queries that pull data don’t burn as many CPU cycles as those that perform analysis. I certainly understand that the licensed CPU cycles for a SQL Server instance are expensive, and we want to be careful how they are used. Adding complex R scripts might not be the best use of our licensing dollars. On the other hand, if I can perform more useful analysis quicker, then perhaps I can eliminate some of the other random queries analysts want to run against my database.

Ultimately I think that R Services make some sense in SQL Server, but not as some experiment. I would suggest that the R client is the way to experiment, preferably on a copy of data that allows someone to build scripts and determine if there is insight to be gained from a particular set of data. Build a Proof of Concept (POC), and only deploy it to a SQL Server if you find it provides value.

And if you do so, continue to experiment. That R script you run today might not be as useful in six months as your application, database, and business evolve. Data analysis isn’t a set-it-and-forget-it, but rather an ongoing, iterative process.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (3.8MB) podcast or subscribe to the feed at iTunes and Libsyn.

Posted in Editorial | Tagged , | 2 Comments

Creating a Logon Trigger–#SQLNewBlogger

Another post for me that is simple and hopefully serves as an example for people trying to get started blogging as #SQLNewBloggers.

Suppose you want to audit logins for your SQL Server instance. There are multiple ways to do this, but logon triggers have a few advantages. First, they get the data into a place most of us are familiar with: a SQL Server table. Second, they guarantee that the event is captured on the instance if the trigger is enabled.

There are plenty of other uses for these triggers, but beware that you can cause problems if your code doesn’t execute flawlessly.

I’ll show you how to create a basic logon trigger here. Note that these are server-level objects, and you’ll need to be able to create the trigger in the master database.

The structure of this code is similar to other triggers. We’ll use the CREATE TRIGGER DDL. Where this differs from DML triggers is that we use the ON ALL SERVER clause. For auditing, I tend to set this to EXECUTE AS ‘sa’, but you may choose a different type of account.

Here’s my basic code:

CREATE TRIGGER LogonTrigger
ON ALL SERVER
WITH EXECUTE AS 'sa'
FOR LOGON
AS
BEGIN
    -- The column list here is illustrative; match it to your auditing table
    INSERT DBAAdmin.dbo.LoginAuditing
        (LoginName, LoginTime, HostName)
    SELECT ORIGINAL_LOGIN(), GETDATE(), HOST_NAME();
END;
Note that in this code I’m specifying a specific table to store the data in. This table has to exist before the trigger is created.
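The post doesn’t show the table itself, so as a minimal sketch (the column names here are my own assumption), something like this would work:

```sql
-- Hypothetical schema for the auditing table the trigger inserts into.
-- Capture whatever details you care about; these columns are just examples.
CREATE TABLE DBAAdmin.dbo.LoginAuditing
    ( LoginName SYSNAME       NOT NULL
    , LoginTime DATETIME      NOT NULL DEFAULT GETDATE()
    , HostName  NVARCHAR(128) NULL
    );
```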

I also use the FOR LOGON event. You can scope triggers for other events, at the server or database levels, and read more about DDL triggers in BOL.

Once a user attempts to log on, the trigger fires and a simple insert takes place. If there are errors in inserting this data, the user may not be able to log on.

There is a lot to write about logon triggers, but for this short piece, I’m just showing how to get started. Please, please, please be careful with these and read the documentation carefully. Be sure you understand how they work and how to disable them. If you implement one, test it extensively.


I had a small issue building a logon trigger and thought I’d get a few posts written on the topic. This was five minutes of work since I had the code and just wanted to describe things, but I’ve got a few more posts sketched on this topic as I’ve learned more.

Learning something, solving a problem, writing about it. This is a good way to show someone you are learning about a topic and developing some skill.

Posted in Blog | Tagged , , , | 2 Comments

Fun and Contests at Summit 2016

I’ll be attending the 2016 PASS Summit as a part of the Redgate Software crew. I’m not speaking this year, so most of my time will be at the Redgate booth, showing you a few tips and tricks with products and maybe convincing you that we can help your company produce software just a little bit better.

As a part of the Redgate experience this year, we have a few fun contests for you, an amazing new product, and a party as well. First, the contests.

Win Coffee

Many of us like coffee. Even if you don’t, I bet someone in your life does. Redgate is giving you the chance to win Death Wish Coffee. In fact, you can win a year’s supply. Just the thing to help you get through those long weeks at work.

Come by the Redgate booth and learn the answers to a few questions. Then enter online.

Oh yeah, you’ll also win a SQL Toolbelt + support.

Win a GoPro Hero 5 Session

We have a different flyer that you can fill out and return to win a GoPro Hero 5 Session. That’s a neat little camera that looks like a lot of fun. In fact, maybe I need to get one of these for some mini movies in my life.

SQL Clone

I’m excited about SQL Clone. In fact, I saw a demo using some mocked-up basic PoSh over a year ago, and I knew this was a great idea. It’s finally coming to fruition, with a beta release next month (Nov 2016).

In the meantime, Grant is presenting a session on SQL Clone on Thursday, Oct 27, at 3:15pm. Come see this talk and learn why I think this will really help with database development (and other tasks).

Redgate Rocks

Thursday night, Oct 27, Redgate Software is hosting a party. This will be from 7:30 to 10:30 at 1927 Events on 3rd Ave. With drinks and snacks, this will be a fun time and a good chance to unwind and chat with colleagues.

You’ll need a wristband to attend, so be sure to stop by the Redgate booth and pick one up.

Follow @redgate

We’re on Twitter as @redgate, and we’ll be posting updates during the week, so keep in touch with us. We hope to see you next week.

Posted in Blog | Tagged , , | 3 Comments

One Million

Is a million writes/sec a large number? I guess it’s all relative, but I’d consider it a fairly large number. Our peak transfers for SQLServerCentral are in the hundreds/sec, though we’re not really a busy database server. Some of you might have much higher numbers, and if you’re in the 100,000/sec range and can let us know, I’m sure others would be interested in hearing about your experiences.

I ran across a piece on the Uber database infrastructure that I found impressive. No, Uber doesn’t use SQL Server (they use Cassandra), but they have worked to build a large scale infrastructure. Their goals:

  • 1 in 100,000 requests can fail
  • 1,000,000 writes/sec
  • > 100,000 reads/sec

Those are quite impressive. While I’m not sure they’ve ever achieved these levels in production, I’m glad they’re testing at these points. I think far too many people forget to test the limits of where their systems might grow and only stick with where they are today. Or where they were a month ago when they refreshed a test environment from production. Test at larger-than-production levels, at least once in a while.

There’s something impressive about one million. Getting to a MB, roughly 1mm bytes, was impressive to me. Not such a big deal now (with pictures requiring > 1MB), but 1mm MB is a terabyte, and while I carry that in my pocket, it’s still an impressive size. Crossing one million members at SQLServerCentral was impressive. I think $1mm is a lot of money. One in a million still seems like a very small chance of an event. At the recent Data Science Summit, we saw SQL Server scoring over 1mm fraud predictions/sec.

Achieving 1mm of anything in a database system is still a large number. I know many people have tables with over a billion rows, but I’d still say a million is large. Perhaps you disagree, but I’m still a little awed at seeing SQL Server process a query of 1mm rows in less than a second.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (3.1MB) podcast or subscribe to the feed at iTunes and Libsyn.

Posted in Editorial | Tagged | Leave a comment

DB CI: Getting Test Results in VSTS

One of the things I struggled with a bit was getting test results to appear in VSTS from my builds.

Here’s my original Build flow:

[Screenshot: original build flow in Visual Studio Team Services]

However, when I’d run a build, I got this in the Build summary:

[Screenshot: build summary with no test results]

That’s not great, and I knew this could work better, so I started to look at where my test results were. Here’s what I did to get the results to appear.

I added a Publish Test Results task to the end. This is in the “Test” section of the Task catalog.

[Screenshot: the Publish Test Results task in the Task catalog]

When I looked in the output of the build tasks, I found these files:

[Screenshot: files in the build task output]

I kept thinking I needed to get the .trx file to load, but that wasn’t correct. I really wanted the “*reports.junit.xml” file. After a little experimenting, I configured the task like this:

[Screenshot: Publish Test Results task configuration]
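This build was configured through the VSTS web UI, but for anyone doing the same thing today with YAML pipelines, the equivalent task would look roughly like this. Only the JUnit format and the *reports.junit.xml pattern come from my setup; treat the rest as an assumption:

```yaml
# Sketch of the Publish Test Results configuration in YAML pipeline syntax.
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/*reports.junit.xml'
    searchFolder: '$(Build.SourcesDirectory)'
```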

Once I saved this (with a comment), and ran a new build, I could see test results.

[Screenshot: test results shown in the build summary]

I could have just read the XML file in my build result folder, but that’s a pain. Typically we don’t care unless there is a failed test, but when there is, it’s nice to see what actually failed.

Plus, having the results in the console means I can show the other developers and management that tests are being run and are checking code.

Posted in Blog | Tagged , , , | Leave a comment

Archaic Commands

I was watching the San Diego Technology Immersion Group meeting in September and found myself transported back into the past. This month was the first meeting of the Linux chapter, which will spend a few months working through The Linux Command Line.

From home, I watched the hosts walk through the command line in Linux, showcasing various basic file commands. Seeing them explain and execute commands like ls, rm, alias, cat, tail, less, and others in a text session reminded me of learning Unix at university on SunOS. Slowly, various bits of knowledge came back to me from my own experiments learning how to work with files full of text data, interspersed with my regular visits to the “man” pages when something didn’t work. In fact, seeing the man pages come up in the YouTube session made me smile.

It’s funny to think how much the world has changed. Even now, I have an Ubuntu VM that I run at times, but it’s mostly a GUI shell, and my interactions with the command line are fairly minimal. That’s completely the opposite of my time at university and my first job, where I pretty much lived in the command line. My code editors were vi and emacs, which I ran across a dozen X Windows sessions open on various machines in the college computer lab. My interactions were really working with text and text commands. Even now, I might use SSMS and Visual Studio, but I’ll pop into a command line to check my git repo or spend time playing with PowerShell and executing commands in text sessions. While working with Python, I’m as likely to use the REPL from a command window as I am to use an editor like VS or PyCharm. I really just like the interaction of the command line.

Not everyone feels that way, and that’s fine. I agree that GUI-based systems are more convenient at times. In fact, I often use various GUI items in SSMS to build up a task, such as an Extended Events session, and then script the result to give me code that I can execute. GUIs do some things extremely well, but allowing for repeatability is not one of them. They can also easily hide the actual commands and defaults being executed.

I think everyone working in technology needs some skill in working with a command line. Even if you want to use T-SQL to script all your tasks, you should be familiar enough with something like SQLCMD to execute a script in case you run into a situation where SSMS might not work. This might be especially true as you learn to automate more of the work your systems require and perhaps execute those scripts through other tools, like an enterprise scheduler.
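For example, executing a script with SQLCMD from a command prompt takes one line. The server, database, and file names below are placeholders:

```shell
# Run a T-SQL script file against an instance, using Windows auth (-E).
# Server, database, and script names here are placeholders.
sqlcmd -S MyServer\SQL2016 -d Sandbox -E -i my_script.sql
```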

If nothing else, the command line teaches you how to work in a methodical, step-by-step process. I’d also argue it develops new skills that might lead you to solve a problem in more creative ways. Perhaps you’ll even grow to use a command-line language like PowerShell to accomplish some repeatable task. You might find it works better than the old way you used to work.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (4.1MB) podcast or subscribe to the feed at iTunes and Libsyn.

Posted in Editorial | Tagged | Leave a comment

Webinar: Database Deployment with Git, Jenkins, and Octopus Deploy

I’ve got a webinar later today that looks at how Redgate Software’s DLM Automation and other SQL Toolbelt tools can be used to smoothly move development changes from one database to another.

Sign up and spend an hour with me this morning.

This is part of a monthly series that Redgate is running, where we’re changing the platform out every month. In this case, I’m going to use git as my VCS, storing changes in there as I make them to a database.

From there, I will push the changes to a remote git repo and have Jenkins automatically pick up the changes and build, test, and deploy the changes to another database. I’ll make table changes, data changes, proc changes, anything you want. Just ask.

However, that’s not all. From there, we’ll see Octopus Deploy take a package of changes and deploy them out to other environments, letting me manage and track the process with the click of a button.

I’ve been doing these demos for a few years, across multiple platforms, but never on Jenkins. I’m pleased that things work as smoothly here as they do with TeamCity or VSTS.

Building software is building software, including database software.

If you’ve got some time today, join me and see how easy this can be.

Sign up and see database changes made live.

Posted in Blog | Tagged , , , , , | Leave a comment