Python Command Line Calls

There was a time I worked at a company where we lived in the command line. This was in the early 90s, prior to Windows 3.1 being released, and we primarily used DOS on a Novell network.

We also had paper phone books on every desk for the 1,000+ people on the property. However, as you might guess, the phone books went out of date and were only updated a couple of times a year. We might get a new sheet, but mostly people learned to cross out a name and write in a new extension for the people they dealt with regularly.

However, updates to the master list happened regularly, every few days. These were made in a small Lotus 1-2-3 file that an administrative assistant maintained. As a network person, I knew where this file lived, and I arranged a text export of it every night with a scheduled task.

Why text? Well, I’d come from a university where we had our phone details in a text file and would grep the file to find information. In DOS, I knew we could do the same thing with FIND. However, rather than typing out the FIND command with its parameters each time, I put a batch file on our global share that called FIND with the needed parameters and the path to the phone book. I called it 411.bat. When I needed a phone number, I could type

411 "Andy Warren"

I’d get Andy’s name, location, and phone number back. It was a trivial piece of programming for me, but the rest of the network team, all non-programmers, were thrilled. I even added a /? check to return help information to the user.
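I no longer have the original batch file, but the same idea as a small shell script over a grep-able text export would look something like this (the file layout, paths, and phone book contents here are invented for illustration):

```shell
#!/bin/sh
# 411 - look up someone in a phone book text export, like the old 411.bat.

lookup() {
    phonebook="$1"
    query="$2"
    # A /? check to return help information, as the batch file had
    if [ "$query" = "/?" ] || [ -z "$query" ]; then
        echo "Usage: 411 \"name to search for\""
        return 0
    fi
    # Case-insensitive search, the equivalent of DOS FIND /I
    grep -i "$query" "$phonebook"
}

# Demo against a throwaway phone book
printf 'Andy Warren, Bldg 2, x4411\nSteve Jones, Bldg 1, x1234\n' > /tmp/phonebook.txt
lookup /tmp/phonebook.txt "Andy Warren"
```

The wrapper buys the same thing the batch file did: nobody has to remember the search command's parameters or where the export lives.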

With my playing with Python last week, I decided to do this for myself as well. I took my Python program to send tweets and changed it to send a tweet when the program was called, with the parameter as the tweet text. The code looked like this:

import sys
import tweepy

def send_a_tweet(tweettext):
    consumer_key = "X1GWqgKpPP4OuqWJZ7hw6"
    consumer_secret = "QW3EkMHlyzF69AHn8DjyWyRG5CAQ0wjK9RqUZ2"
    access_token = '146009-MqpeZFslj0VGgTSik1fq5klvJpqc1x6HAsiu'
    access_token_secret = '8wIZlPbIqXtczc2LOsJgjMP3dCRRw5ajMvkjEspF'

    auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
    auth.set_access_token(access_token, access_token_secret)

    api = tweepy.API(auth)
    api.update_status(status=tweettext)

if __name__ == '__main__':
    send_a_tweet(sys.argv[1])

I placed this in a "\tools" folder that I have in my path. I also added a "tweet.cmd" file in this folder with this code:

python c:\tools\ %1

Since Python.exe is in my path as well, I can do this:

[Screenshot: Command Prompt]

And I’ll see this on my timeline. I guess you’ll all be able to see this as well.

[Screenshot: the tweet on the @way0utwest Twitter timeline]


Why bother? Well, it was partly playing around. As I have been learning Python, I have mostly been working in an IDE, solving small problems, but not really doing anything useful. I also like the idea of command line tools, since I find them quick. Tweetdeck is big and bloated, and if I want to send a tweet from my desk, this is a quick way to do it. I could do a "readtweets" as well, and may.

However, I also learned how to call Python programs from the command line, which is a good step toward building more useful programs that I can customize. This is also the start of being able to schedule a program, and perhaps build more automation into my life with Python.

Mostly, however, it was just fun.

Test if a File Exists with PowerShell

Another post for me that is simple and hopefully serves as an example for people trying to get blogging as #SQLNewBloggers.

A while back I wrote a PowerShell (PoSh) script to download the SQL Saturday XML files used for the Guidebook application. These contain a bunch of the information that I wanted to collect, transform, and keep around.

However, the script wasn’t great. It basically looped a set number of times and re-downloaded the files. Not the most efficient solution, especially if I want this to run regularly.

One of the enhancements I wanted to make was to check whether the file exists, and if not, download it. If it does exist, I’ll skip the file. I know this means I don’t get updated files if schedules change, which is possible, but in that case I can just delete the file from my desktop and go from there.

A quick search turned up a few links to the Test-Path cmdlet. Essentially you give it a file path and it returns true or false: almost exactly what I need. This is what I added to my code:

if (Test-Path $DestinationFile) {

  #do something

}


However, I want to take action if the file doesn’t exist. In most languages, I’d add a ! in front to signify "not". However, that doesn’t work in PoSh, just as > doesn’t mean greater than.

Another search showed me the -NOT operator. That’s what I need, but I can’t do this:

if (-NOT Test-Path $DestinationFile) {

  #do something

}


Instead, I need to give -NOT a single expression, which means more parentheses. Not a big deal. I used this code:

if (-NOT (Test-Path $DestinationFile)) {

  #do something

}


That worked great and now I only download the files I need. If I want to re-download (for newer events), I just delete those files and re-run the script.
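Putting the pieces together, the relevant part of the script ends up looking roughly like this (a sketch, not the actual script; the variable names, folder, and URL placeholder are my own):

```powershell
foreach ($eventnum in $start..$end) {
    $DestinationFile = "C:\SQLSat\sqlsat$eventnum.xml"
    if (-NOT (Test-Path $DestinationFile)) {
        # Only download files we don't already have
        Invoke-WebRequest -Uri "$BaseUrl/$eventnum" -OutFile $DestinationFile
    }
}
```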


This post came about when I started working on the script. It actually took longer to write this post than to find the solution, add it to the code, and test it. That process took about 5 minutes.

This post took about 10 minutes to write. I also had a second post based on similar modifications to the script, so I did that in another 5 minutes.


A few of the links I used:

Python and Tweepy

One of the projects that’s been on my list lately is to programmatically access Twitter for a few ideas I want to play with. Since I’ve been trying to learn some Python, I thought I would take a look using Python to update status and read status.

A quick Google search showed me lots of Python clients, but Tweepy caught my eye for some reason. I’m not sure why, but I ended up popping a command line open and downloading the library.

From there, I saw a short tutorial over at Python Central. I started by creating an app at Twitter for myself, which was very simple. Once that was done, I had a set of consumer tokens (key and secret), that I could use. Another click got me to the access key and secret. Note, the easy way to do this is over at

My first attempt was using this sample code.

import tweepy

consumer_key = "ABCDE"
consumer_secret = "12345"
access_token = 'asdfasdf'
access_token_secret = '98765'

auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_token_secret)

api = tweepy.API(auth)

public_tweets = api.user_timeline()
for tweet in public_tweets:
    print(tweet.text)

This gives me a list of my tweets. At least, the last 20.

[Screenshot: the last 20 tweets listed in the IDE]

That’s progress.

I then went to make an update by using this:

api.update_status('Hello, Pyton')

However that returned an error:

[Screenshot: the error message]

Hmmm, I double-checked the API, and this should work, but I’m guessing there’s another issue. I searched, and found there’s a bug. However, I should be using named parameters anyway, so no big deal. Change the code.

api.update_status(status='Hello, Pyton')

Now it works.

[Screenshot: the tweet on the @way0utwest Twitter timeline]

This is the first step for me to look at building an app that might send some tweets on my behalf, perhaps with the data stored somewhere, like, I don’t know, maybe a database?

Viewing Extended Properties for Information

Another post for me that is simple and hopefully serves as an example for people trying to get blogging as #SQLNewBloggers.

I’ve been working a little with extended properties, adding and updating them for various objects. However, in addition to altering properties, viewing the properties on an object is often necessary. This post will look at how to get the properties in a few different ways.

The easiest way to see extended properties is to look at the properties of an object in the SSMS Object Explorer. For example, I can right click on a table in OE.

[Screenshot: right-clicking a table in Object Explorer]

Once I click Properties, I get a dialog with a lot of items on the left. The bottom one is for Extended Properties, with a simple add/edit/delete grid. Here I can see the property(ies) I’ve added.

[Screenshot: Table Properties - SalesHeader_Staging]

However, this is cumbersome for me. I’d much rather find a way to query the information, which is what I’d need to do from an application of some sort. I’d think sp_help would work, but it doesn’t. If I run it, I get the following result sets:

  • header with owner, type, and creation date
  • column list with metadata
  • identity property information
  • RowGuid column information
  • filegroup storage location
  • messages with index, constraint, FK, and schemabinding relations

Not very helpful in this case.

I do know that extended property information is in sys.extended_properties. I can query this view, which gives me some information, but I need to join it with sys.objects for easy-to-understand information.
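A query along these lines does the job (a sketch; the exact columns you pull back are a matter of taste):

```sql
SELECT o.name  AS ObjectName
     , ep.name AS PropertyName
     , ep.value AS PropertyValue
  FROM sys.extended_properties AS ep
    INNER JOIN sys.objects AS o
      ON o.object_id = ep.major_id;
```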

[Screenshot: query results in SSMS]

This works, and this is one of the ways in which I do query properties in various tSQLt tests.

There is one other way I’ve seen to query extended properties. When perusing the BOL page for sp_updateextendedproperty, I found sys.fn_listextendedproperty. This is a DMF, a function that you can use to query for property values. Since it’s a TVF, I need to use it in a query as a functional object.

[Screenshot: query results in SSMS]

There are lots of parameters in this function. However, you can guess what they are after working with the other extended property procedures. In fact, the first time I started this post, I was disconnected and had to experiment with the function, adding parameters until it ran without an error.

The first parameter is the name of the property. This can be NULL, in which case you’ll get all the properties that exist.

[Screenshot: query results in SSMS]

The rest of the parameters correspond to the level 0, 1, and 2 types and names that you use to filter the results. This is actually a good technique to use with this function, and I’ll be using it more in the future.
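For example, to list the properties on one of my tables, the call looks something like this (the schema and table names are from my earlier examples; yours will differ):

```sql
SELECT objtype, objname, name, value
  FROM sys.fn_listextendedproperty(
         NULL                              -- property name; NULL returns all
       , 'schema', 'dbo'                   -- level 0 type and name
       , 'table', 'SalesHeader_Staging'    -- level 1 type and name
       , NULL, NULL);                      -- level 2 type and name (e.g. 'column')
```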


This post followed on from the previous ones. In this case, I started disconnected, using the knowledge I had to write the basics with SSMS and the system table. That took about 20 minutes to document, and then I spent 5 minutes experimenting with the function, whose name I had in an open browser tab. Once I worked through that, I spent another 5 minutes writing.

Thirty minutes to a post. You can do this.


A few items from BOL:

sp_help –

sys.extended_properties –

sys.fn_listextendedproperty –

Back to SQL Saturday Washington DC

I’ve been to the SQL Saturday in Washington, DC a few times and I’m looking forward to going back. It’s a good event and since I have family in the area, I can usually take a day or two of vacation and visit.

This year the event is on Dec 5, 2015, and it’s SQL Saturday #470. Wow, we’re closing in on #500, which is amazing.

I’ve got two talks I’m giving, both of which I really like. I’ve revamped them both a little, incorporating a few new items in there and I hope everyone enjoys them.

  • Branding Yourself for a Dream Job
  • Get Testing with tSQLt

The room I’m in will be the room for the day, with both myself and Brent Ozar giving two sessions around Wayne Sheffield talking about T-SQL window functions. I know there are other great sessions, but feel free to camp out in this room.

If you’re in the area, come join us for a free day of training. The event is in Chevy Chase at the Microsoft building.

Thankful in November

I’m taking a short break from technical talk today to spend a few minutes thinking about life. My wife started the Thankfulness challenge in November, and while I didn’t participate, I’ve been thinking about life as I watch her posts.

This is a career and learning blog, and much of my life has been about growing my career. However I’ve also had many things outside of my career that are important to me and occupy much of my time. My wife, my kids, my health and fitness, and various hobbies keep me busy, not to mention occasionally filling in as the ranch hand for my wife. I even got some unexpected, but very appreciated thanks this week.

However there is something that I was amazed by this week. The outpouring of support for my call to make a #sqlwish come true was incredible. I was truly touched that so many people came together and donated money that allowed us to raise the entire amount in a day. Not even a day. This is one of those things that makes me think the SQL Server community, the #sqlfamily, is truly a special and amazing thing.

Life is Hard

I do struggle at times with the overall load. I struggle with stress at times. I have plenty of guilt when I’m away from home and something happens. I definitely find that I don’t always have the time I’d like for myself amid all the chaos in life.

However I’m very lucky. I have a wonderful wife I love dearly. I have three kids, well two kids and an adult child, that are all healthy and successful in life. I have a good job that I enjoy. My family is financially secure. We don’t really have any complaints, and I definitely appreciate all that I have. I truly do try to stop and think about the amazing times I’ve had. I use Timehop to revisit memories in my life. I stop as I travel and reflect on the good and bad I see, while considering how I can leave the world a little better than what I found.

I wrote about a fundraiser yesterday, trying to make a #sqlwish come true. I can’t imagine for a moment how my life would be different, or how much things would change if someone in my family were sick. I can’t really imagine the situation, and my heart goes out to Lucy and her family.

As great as it’s been to make the wish come true, I also would like you to reflect a moment on the good things in your life. Your family, your health, your beliefs, the people around you, I’m sure there is some good around you. Stop for a minute and give thanks.

I do hope that all of you reading this are well, and you are able to create a great career doing something you love. I will help where I can, and hopefully SQLServerCentral does that. However I know that life isn’t always easy or smooth. I hope that when you experience tough times that you have someone to help pull you through. If you really need an ear, the #sqlfamily is great. On Twitter, or in our wild and wacky thread, there are plenty willing to listen.

Git Pull in SQL Source Control

I wrote recently on Git Push in SQL Source Control (SOC), and now wanted to add the other side of the story, pull. This process is even simpler.

I’ll be working with a new database, one that has two client systems set up already with Git support. I know I haven’t tackled that part of SOC and Git, but I’ll do that in a new post.

In this case, I’ve got a repository on Github for my Redgate and SQL in the City demos. It’s a sample database, but I’ve been making multiple changes to it over time, using version control for continuity and to be able to roll back changes or see history.

I’ve got a connection in a VM to a second development machine. When I launch my VM and SSMS, I go to the second SOC tab, the one labeled “Get Latest.” You can see the image below; notice that there are two buttons in the upper left. There is the familiar “Apply changes to database,” which has been the way we pull changes from TFS and SVN, but now there is a new “Pull from remote repository.” This is the one specific to Git.

[Screenshot: the Get Latest tab in SQL Source Control]

In this case, I’d previously refreshed my database, so it’s in sync with the local Git repo. Therefore the button is grayed out. However, I don’t know if there are remote changes until I click the other button.

I do that and get the familiar four-step dialog from SOC that checks for changes and then compares them to the local database. What you don’t see is that a “git pull” is issued first, pulling changes from the remote repo. In my case, I had one change I’d made in London and hadn’t synced with my VM.

[Screenshot: the pulled changes shown in SQL Source Control]

This is the familiar SOC view now, where I see the changes and can use “Apply changes to database” to compile this code in my local database.

In some sense, this is less exciting than push, but it’s nice to be able to do this in the client.

I’ll write some more on the workflow as I develop changes in the coming posts, which are going to be aimed at building a small database in Azure SQL database and moving changes from my local system to Git for tracking, and then to Azure for production use.

Webinar: Unit Testing with SQL Server and tSQLt

I ran into Sebastian Meine at the PASS Summit a few weeks ago and we were talking testing. Sebastian is the founder and developer of tSQLt, which I really like using. We’ve done some teaching together and I’ve delivered a number of sessions on tSQLt at various events, but we wanted to get more people interested in testing code.

I had a session at PASS, which was very well attended. 150+ people came, which was stunning to me. I was expecting to see 20, and afterwards Sebastian and I started talking about what else we could do.

We’ve decided to do a webinar, but one driven by you. We are looking for you to ask questions about code you’d like tested, or which you’re unsure of how to approach. Leave a comment here, or put your question in the webinar registration. The details are:

Unit Testing in SQL Server with tSQLt
Thurs, Nov 19, 2015 11:00 AM – 12:00 PM EDT

Join unit testing experts Steve Jones and Sebastian Meine for this exciting opportunity to learn about unit testing and the tSQLt framework to improve your T-SQL code quality and maintainability. If this day/time is not good for you, register anyway so you receive a link to the recording when it is available.

Updating Extended Properties

Another post for me that is simple and hopefully serves as an example for people trying to get blogging as #SQLNewBloggers.

I wrote recently about adding extended properties. Updating them is very similar. There’s an analogous procedure called sp_updateextendedproperty that changes the value of properties.

The arguments are, again, unintuitive, but the more I work with extended properties, the more comfortable I become. In this case, I have the same name and value, and then the level 0, 1, and 2 items, each with both a type and a name.

I highly suggest, however, that you name your parameters in your calls, so programmers running across the T-SQL aren’t depending on position to understand each parameter.

If I look at the table from the previous post, I can update the value of my property with this code:

EXEC sys.sp_updateextendedproperty
    @name = 'PKException'
  , @value = 0
  , @level0type = 'schema'
  , @level0name = 'dbo'
  , @level1type = 'table'
  , @level1name = 'SalesHeader_Staging';

However, the property needs to exist. If I call this procedure with the wrong property, I get an error.

[Screenshot: the error message]

This means that you need to be sure that the property exists before you update it. Good code would have the error handling somewhere.
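One way to handle that, as a sketch using the same property and table as above, is to check for the property first and add it if it isn’t there:

```sql
IF EXISTS ( SELECT 1
              FROM sys.fn_listextendedproperty('PKException'
                     , 'schema', 'dbo'
                     , 'table', 'SalesHeader_Staging'
                     , NULL, NULL) )
  EXEC sys.sp_updateextendedproperty
      @name = 'PKException', @value = 0
    , @level0type = 'schema', @level0name = 'dbo'
    , @level1type = 'table', @level1name = 'SalesHeader_Staging';
ELSE
  EXEC sys.sp_addextendedproperty
      @name = 'PKException', @value = 0
    , @level0type = 'schema', @level0name = 'dbo'
    , @level1type = 'table', @level1name = 'SalesHeader_Staging';
```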


After writing the previous post, this one took only about ten minutes to do the typing. I’d been working with extended properties, so I had the code and just needed to take the screenshot.


A few items from BOL

sp_updateextendedproperty –

Poor Data Modeling – T-SQL Tuesday #72

This month Mickey Stuewe hosts the T-SQL Tuesday and she has a great topic. Data modeling is something few people seem to do, especially developers, and it often can go wrong. That’s the topic, and I have a short story.

T-SQL Tuesday is a monthly blog party, on the second Tuesday of the month. The rules are on Mickey’s blog, and you can read them there, but essentially you write a post every month.

Or later.

The Aging Model

I once worked at a company that shall remain nameless. We had a system built before my time which had a hierarchical set of entities. I can’t disclose the exact industry, but imagine that we had a list of items like this:

  • Manufacturers of products
  • Distributors of products
  • Warehouses holding products
  • Vendors selling products

In this structure, we have a strict linkage where each item is contained within the one above it. In other words, a manufacturer works with specific distributors and only those distributors. Distributors don’t work with other manufacturers.

I know this sounds contrived, and it is for a supply chain. However, it wasn’t for the industry in which I worked. So imagine we’re 100 years ago, when power was more concentrated in supply chains.

This resulted in tables like this:

CREATE TABLE Manufacturers
(
  manufacturerid INT IDENTITY(1, 1) CONSTRAINT manufacturer_PK PRIMARY KEY ( manufacturerid )
, manufacturername VARCHAR(200)
, primarycontactid INT
-- ...
);
GO
CREATE TABLE Distributors
(
  distributorid INT IDENTITY(1, 1) CONSTRAINT distributor_PK PRIMARY KEY ( distributorid )
, manufacturerid INT CONSTRAINT Distributor_Manufacturer_FK FOREIGN KEY REFERENCES dbo.Manufacturers ( manufacturerid )
, distributorname VARCHAR(200)
, PrimarySalesPersonid INT
-- ...
);
GO
CREATE TABLE Warehouses
(
  warehouseid INT IDENTITY(1, 1) CONSTRAINT warehouse_PK PRIMARY KEY ( warehouseid )
, distributorid INT CONSTRAINT Warehouse_Distributor_FK FOREIGN KEY REFERENCES dbo.Distributors ( distributorid )
, warehousename VARCHAR(200)
, regionid INT
-- ...
);
GO


Each of these links to the item above it. This means I might have a Manufacturers table like this:

manufacturerid  manufacturername  ...
--------------  ----------------
1               Acme
2               Big Product Corp

With warehouses linked as shown.

warehouseid  manufacturerid  warehousename     ...
-----------  --------------  ----------------
1            1               Denver Central
2            1               Kansas City East
3            2               Santa Fe
4            1               Billings Depot

This would mean that I used distributors that worked with a warehouse, and their data might look like this:

distributorid  warehouseid  distributorname           ...
-------------  -----------  ------------------------
1              1            Denver City
2              1            Denver South
3              1            Aurora
4              2            Kansas City Distributors
5              3            New Mexico Wholesale

If I wanted to get a list of the distributors that carried a manufacturer’s products, I’d have to join through the warehouse table.

SELECT m.manufacturerid
     , ...
  FROM dbo.Manufacturers AS m
    INNER JOIN dbo.Distributors AS d
      ON d.manufacturerid = m.manufacturerid
    INNER JOIN dbo.Warehouses AS w
      ON w.distributorid = d.distributorid
...

Not a big deal, but we had 5 levels of this appear over time, which meant that queries needing a higher level had to cross the tables in between to join data. These also weren’t narrow tables; there was a decent amount of metadata for each entity. Indexing helped, but we certainly needed better rollups of data, and performance suffered as the amount of legacy data grew.

What’s more, over time, we learned that business changes. A warehouse might start to work with multiple distributors, and our model couldn’t cope.

Eventually we embarked upon a project that set up link tables between entities, so that we had just IDs in a table showing linkages where appropriate. This was ongoing when I left, but it was a major disruption and problem for the company.

I’ve tried to avoid embedding hard relationships based on business contracts in my models. Certainly some FKs make sense, like order details linked to a specific order, but many other relationships aren’t as tightly bound, and it’s possible they will change. It doesn’t cost me a lot in modeling to treat suspect relationships as many-to-many, and if application development does this from the start, it’s an easy architecture to incorporate.
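For the tables above, a link table treating the warehouse/distributor relationship as many-to-many might be sketched like this (the names are my own invention):

```sql
CREATE TABLE WarehouseDistributors
(
  warehouseid INT NOT NULL
    CONSTRAINT WarehouseDistributors_Warehouse_FK
      FOREIGN KEY REFERENCES dbo.Warehouses ( warehouseid )
, distributorid INT NOT NULL
    CONSTRAINT WarehouseDistributors_Distributor_FK
      FOREIGN KEY REFERENCES dbo.Distributors ( distributorid )
, CONSTRAINT WarehouseDistributors_PK PRIMARY KEY ( warehouseid, distributorid )
);
```

A row here is just the pair of IDs, so when a warehouse starts working with a second distributor, that’s an insert rather than a schema change.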

Data modeling is always a struggle, as we often work with incomplete information, and an organization will morph over time. It is, however, worth taking some time to think about the ways things might change in the future and allow for some flexibility in your model where possible.