Maybe Security is Harder than Rocket Science

I was giving a talk last year about software development and I made an off-hand remark that most developers out there shouldn’t be writing authentication schemes for their applications. My reasoning was that most people aren’t good at writing these systems, and there are plenty of good authentication schemes already written that you can incorporate into a system.

However there are some bad ones as well. While I hope you don’t pick any of them, I also know that many of you might just build a poorly architected system because your focus isn’t on authentication. Your focus is on some other aspect of your application. I’m sure you know some of the good advice for building systems, as well as the best ways to handle passwords, but do you follow it? Under deadlines? When you have pressure to focus on more important aspects of your system? Or do you implement anti-patterns because it’s easy?

The European Space Agency (ESA) is full of rocket scientists. Literally, as they send rockets and astronauts into orbit around the Earth. However, they were hacked recently, and the disclosures aren’t pretty. Not only was personal information released, but passwords were stored in plain text. What’s worse, 39% of the passwords were three letters.

Three.

I’m sure many of the people working on ESA systems are smart individuals, and they may be great web developers who build beautiful, useful sites. However, their security programming is extremely poor, and really, there’s no excuse. Not even the pressure of scientists who want simple, easy logins.

It’s 2016. No short passwords, no arbitrary complexity restrictions such as banning special characters (one site recently didn’t allow a “,” for me), and no storage in a reversible format. There are lots of best practices, but they require some effort to learn, understand, and implement, as well as revision over time to keep up with changing guidelines.
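To make “no storage in a reversible format” concrete, here’s a minimal sketch of salted, iterated password hashing using only Python’s standard library. The storage format and iteration count are my own illustrative assumptions, not a vetted scheme; a maintained authentication library remains the better choice.

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative; tune to current guidance

def hash_password(password: str) -> str:
    # A unique random salt per password defeats precomputed (rainbow) tables.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt.hex() + ":" + digest.hex()

def verify_password(password: str, stored: str) -> bool:
    salt_hex, digest_hex = stored.split(":")
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), ITERATIONS
    )
    # Constant-time comparison avoids leaking timing information.
    return hmac.compare_digest(candidate, bytes.fromhex(digest_hex))
```

Note that the original password can never be recovered from what’s stored, which is exactly the property a plain-text scheme gives up.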

Or, as I suggested, just stop implementing this yourself. Use some authentication scheme that’s been shown to work well with few vulnerabilities.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (2.6MB) podcast or subscribe to the feed at iTunes and LibSyn.

Training at the Top

Many of us see flaws and problems in the way we code securely as we build software, as well as in the way our infrastructure security is configured. There has been no shortage of times in my career when I, or a coworker, wondered why our company didn’t work to implement better security in its systems.

Perhaps it wasn’t us. Perhaps it’s not a lack of desire, but a lack of knowledge. I ran across a piece in Enterprise Security that argues security training should start at the top, with our C-level executives. Far too many of them don’t necessarily understand the nature of the threats, because many of these threats didn’t exist 20, or even 10, years ago. Often we have management that has never faced these kinds of vulnerabilities.

I think there’s certainly room for most of us to learn more about security, especially database security and SQL Injection, as these are fundamental issues around some of our most important assets: our data. However, when we want to implement stronger security, or limit access, we need the support of management, who themselves need to understand the issues, not just respond to whoever makes the best case or complains the loudest.

The world has changed, in that our valuable assets can be transferred to our competitors, or to common criminals, without our being aware of the disclosure. Or perhaps worse, our enemies could change some data and we might never know without the ability to perform comprehensive audits of our systems, something many of us might not be able to do. We certainly need technical capabilities, but also time and support from management.

I think there is a good case for asking our management to make an effort to understand cybersecurity, and I’d urge you to pass this link along to your management.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (2.3MB) podcast or subscribe to the feed at iTunes and LibSyn.

Security Convenience

I wrote a question of the day recently that seemed to catch many people out. The question had to do with mapping users when a login isn’t specified in the call: the behavior is to auto-match an existing login with the same name. About 60% of the people answering the next day got it right, but a third missed it, expecting an error to be thrown.

One of the commenters was surprised that more people didn’t know this. I’d hope people knew it, though to be fair, I bet lots of people manage security through SSMS or another GUI and never write security code. I know I did for years early on. However, I really think the third of respondents who got the behavior wrong are actually right about how SQL Server security should work.

We do not want ambiguity when we configure security. We should be sure that rights granted (or removed) are exactly those that we expect. A strong security system should not tolerate any unexpected behaviors.

Security should require specificity.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (1.6MB) podcast or subscribe to the feed at iTunes and LibSyn.

Hacking to Hide

It’s probably no surprise to you that the black boxes for ships are vulnerable to hacking. These are the Voyage Data Recorders (VDR) that should capture telemetry, audio recordings, and more. These devices are really computers now, connected to the onboard networks used for satellite communications and physically accessible in many vessels.

It was surprising to hear that some of these VDRs are running Windows XP. While I get that there is some ease of development in using Windows systems, that OS wasn’t what I’d call robust and stable for stressful and rugged environments. Some systems use real-time OSes, which seems like a better compromise, but Linux might be the best choice for a system that is both tolerant of a variety of conditions and easy to build applications for.

However no matter what the choice, I’d hope that the developers building software for these systems would treat them with the importance they deserve. While lives aren’t at stake from these applications, liability is. These systems are used in legal proceedings, so the data they collect, in an autonomous fashion, should be protected to ensure its integrity.

Apparently that doesn’t happen, as there are incidents of these devices being hacked and data being erased, corrupted, interrupted, or simply accessed by those on the ships. That’s not surprising, as it seems people always find ways to take advantage of the computer systems they physically control. Ultimately I’d hope that we might constantly transmit some of this data off the ship to ensure there are backup copies, but that brings to mind the problems of securing data in transit, preventing unauthorized access or disclosure, and more.

However these systems might provide a good testbed for researchers looking to better build and architect auditing systems. This is a challenging environment, with high stakes, and if we can develop ways to ensure auditing data is intact when we have lost physical control of the device for long periods of time, perhaps we can find ways to build this same auditing into other platforms.
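One building block for audit data that must survive hostile physical control is a hash chain, where each record commits to the hash of its predecessor, so any tampering with an earlier record invalidates everything after it. This Python sketch is a simplified illustration (the record format and field names are my own assumptions, not how any real VDR works):

```python
import hashlib
import json

GENESIS = "0" * 64  # sentinel hash for the first record

def append_record(log: list, data: dict) -> None:
    # Each entry commits to the previous entry's hash, so altering or
    # deleting any earlier record breaks every hash that follows it.
    prev_hash = log[-1]["hash"] if log else GENESIS
    payload = json.dumps(data, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"data": data, "prev": prev_hash, "hash": entry_hash})

def verify_chain(log: list) -> bool:
    prev_hash = GENESIS
    for entry in log:
        payload = json.dumps(entry["data"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True
```

On its own this only makes tampering detectable, not preventable; periodically transmitting the latest hash off-ship, as suggested above, is what keeps an attacker from quietly rebuilding the whole chain.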

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (2.4MB) podcast or subscribe to the feed at iTunes and LibSyn.

The End of 2015

I’m off on vacation, so I won’t be able to respond, but I decided to take a few minutes and look back at 2015 before leaving for the year.

This was a year that seems to have been marked by lots of data loss and breaches. Certainly these were on my mind as I worked on the SQL in the City keynote, but as I’ve been following Troy Hunt, it seems that every week or two he’s uploading more data to HaveIBeenPwned.com. We had some larger ones, with tens or hundreds of millions of account records released. Despite all the press from the Sony hack over a year ago, it seems that few companies have bothered to update their security. In fact, it seems that until they’re hacked, no one bothers to fix fundamental security issues in their systems. Perhaps some companies are doing so in the background, and aren’t being attacked, but I’m not so sure.

We didn’t have any SQL Server releases this year, but we had lots of news, and an unexpected Service Pack. Service Pack 3 for SQL Server 2012 appeared out of the blue for me. I hadn’t seen much news on it, and had given up on bothering Microsoft about new SPs for versions. Perhaps they’ll continue to build these yearly for supported versions, which is what I hope, but we will see. It does seem that Cumulative Updates have been appearing regularly, with relatively few issues in them, but I’m still wary of using those as the main patching mechanism for critical servers.

We did have a lot of growth in the SQL Server space, with many features being announced and starting to take shape. If you’re looking to get up to speed, check out our collection of Learning SQL Server 2016 topics, where we maintain a growing set of links to help you learn. I am excited to see some of the growth of SQL Server to include newer features that people want in their data platform. I’m also glad that things like Stretch Database can be used to help manage the ever-growing amount of data we have. Of course, encryption is big on my list, and Always Encrypted is something I am hoping gets lots of adoption.

We’ve also seen Microsoft really pushing the envelope in terms of data analysis. There is a constant stream of articles and blogs written about data scientists, and some of us are moving to learn more about how to better analyze data. Microsoft continues to help, with their forays into Machine Learning, the expansion of the Power set of products (Power BI, Power Query, Power View, Power Pivot, etc.), R language integration, and more. I suspect that more and more of us will get the chance to play with some interesting data analysis tools if we want to. Even if you don’t use those to help your business, I have already seen these tools being used to perform index analysis, DBA monitoring, and more. I’d urge you to let your imagination run wild and see what things you might build.

It does also seem that more companies are starting to realize that their data, and its integrity and management, are important. Data professionals are becoming more valued, but the skills required are growing. There are so many services and tools to help us manage systems that I think the simple DBA who only administers backups and security is really on the decline. At some point, our employers will demand more.

It’s been a good year, and I look forward to 2016. If there are things you want us to help you learn about, leave a comment here, and I’ll review them when I get back from vacation. Have a safe, Happy New Year, and I’ll see you in 2016.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (3.9MB) podcast or subscribe to the feed at iTunes and LibSyn.

Why Use the Principle of Least Privilege?

This editorial was originally published on April 12, 2011. It is being re-run as Steve is away on vacation.

SQL Injection is not the fault of SQL Server. Brian Kelley pointed that out, and reminded me that SQL Injection isn’t a case of malformed SQL. It’s legitimate code, including SQL commands that we might use from any query connection, especially an administrative one. We regularly issue update and delete commands from our applications, and SQL Injection takes advantage of this to issue an update that we might not be expecting.

Would you expect this handwritten injection input from an application? Or this table-guessing attempt? You wouldn’t, but they can come through data entry in your application if the input isn’t well sanitized. Someone setting all your prices to $0.01 or all of your customers to “W3 0wnz U!” isn’t what you want to happen. You can’t necessarily prevent all of these patterns or check for every permutation, but you can prevent things like ‘shutdown’ or ‘drop table’ from being run by your application. Even adding a new user to the database system isn’t something I would want to allow.
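The standard defense against splicing hostile input into a statement is to pass user values as parameters, so the driver sends them as data rather than as SQL text. A minimal sketch using Python’s built-in sqlite3 module, standing in for whatever driver and schema your application actually uses:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, price REAL)")
conn.execute("INSERT INTO products VALUES ('widget', 9.99)")

def find_product(conn, name: str):
    # The ? placeholder sends the value as a bound parameter, never as
    # SQL text, so input like "x'; DROP TABLE products; --" cannot
    # alter the statement -- it is just an oddly named product to search for.
    cursor = conn.execute(
        "SELECT name, price FROM products WHERE name = ?", (name,)
    )
    return cursor.fetchall()
```

This doesn’t replace input validation or least privilege; it simply removes the string-concatenation hole that injection depends on.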

Education is the key here. As Andy Leonard (blog | @AndyLeonard) would say, design patterns are important. When developers understand the issue, many of these problems will be avoided. Having standard ways to begin building an application, check for bad input, and set up database users and permissions should make this easy for anyone who wants to code against a database. We still have work to do here to build better frameworks, and ORM tools that require elevated permissions to the database are not the answer. They might become the answer, but they aren’t a better solution right now.

Grant Fritchey wrote a nice piece about developers and DBAs, noting that we both have the same goals but need to learn to communicate better. This is one area where we ought to make that effort: pass along education about security issues, and work to make life easier for developers working with a database.

That also means teaching them to work with the minimum privileges needed in order to make an application work, just in case someone plans on submitting some input you didn’t expect.

Steve Jones

Correct Old Mistakes

I ran across this piece on the VTech hack that recently occurred. It’s almost a classic example of what not to do in data storage. You can read the piece, and also look at Troy Hunt’s analysis, but we can clearly see poor encryption, unencrypted communications, plain-text storage of passwords, and more. What’s especially disconcerting is that we have kids’ information disclosed, plenty of which could be problematic years down the road as these kids grow up.

Apart from all of the technology issues, there are certainly responsibility issues. I expect that VTech will deny knowledge of issues, and certainly limit the amount of time they admit to knowing about the security issues. After all, they’re a corporation and if they can deny liability it could certainly limit the number of actions taken against them. However I’m hoping that the developers and operational people that manage this technology realize they made mistakes while building these systems.

There’s a certain immaturity that’s apparent in the analysis of this system. I’m guessing that developers were under pressure to get websites up and running, in concert with product launches, and that plenty of code was shared among their various sites and web domains. However, I would hope that the current developers at VTech have learned more about building robust applications, and are looking to rewrite and rebuild their systems to be more secure with more current technologies. I hope that any of you running Flash-based systems, or using MD5, or any other well-known poor security practice, would be pressing your management to correct those deficiencies and giving them more secure solutions. You might also give them a copy of the article linked above.

Like most companies out there, I’m guessing VTech’s management doesn’t want to spend money rebuilding systems that work, regardless of security flaws. Likewise, developers who have learned how to better code public-facing sites don’t have the time to spend rewriting old code when they have new systems to develop. However, I think the old code that lives out there, the poorly built code most of us have written in our past, should get updated over time as part of the cost of doing business. This is especially true for anyone using encryption, where upgrades should be regular and mandated, as Moore’s law and better mathematics steadily erode the security provided by older algorithms.
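One low-cost pattern for retiring an old algorithm is to tag each stored credential with the scheme that produced it, and transparently re-hash at the next successful login, the one moment the plain-text password is legitimately available. A hypothetical sketch in Python; the `md5$`/`pbkdf2$` record format and helper are my own illustration, not anything from VTech’s systems:

```python
import hashlib
import os

def upgrade_on_login(password: str, stored: str) -> str:
    """Verify a credential and, if it used a legacy unsalted MD5 hash,
    return a replacement record using salted PBKDF2-SHA256."""
    scheme, _, rest = stored.partition("$")
    if scheme == "md5":
        if hashlib.md5(password.encode()).hexdigest() != rest:
            raise ValueError("wrong password")
        # Login succeeded: re-hash with a stronger, salted scheme
        # while we briefly hold the plain-text password.
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac(
            "sha256", password.encode(), salt, 200_000
        )
        return "pbkdf2$" + salt.hex() + "$" + digest.hex()
    return stored  # already on the current scheme; leave unchanged
```

The same tagging trick means the next migration, away from PBKDF2 parameters that have aged out, needs no schema change at all.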

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (2.5MB) podcast or subscribe to the feed at iTunes and LibSyn.

The $90,000 Laptop

A hospital got the opportunity to pay $90k for a lost laptop.

There’s no excuse for this. If you have a Windows laptop, enable BitLocker today. If you have OS X, set up FileVault. If you’re on Linux, choose dm-crypt or something else. Go ahead, get that set up, save off your keys as a backup, and come back. I’ll wait.

Now, don’t you feel better? I’m sure you do, and you are somewhat more protected from the random theft, misplacement, or loss of your laptop. All of those things happen regularly. Not to each of us, or many of us, but as the collective set of developers and DBAs around the world, we lose laptops regularly.

Certainly some of us take precautions like never carrying data around. That’s good, and I’d recommend it. Those who need to develop on the go might carry a curated set of test data to use for development. That’s fine. We should make an investment in building test data and use that data for unit, integration, and system testing.

We should invest in ensuring that our systems can not only be encrypted, including encryption of backups and network traffic, but also restored in a disaster. Make the investment in ensuring recovery, and everyone is more likely to accept encryption. We need to invest in a process for managing keys, revoking them, and re-issuing them. We need to be sure we can upgrade encryption algorithms over time. What is secure today might be easily broken tomorrow.

I’d urge you to make all those investments, but until you do, at the very least, encrypt your laptops and desktops. While any random theft could result in a lost laptop, it’s not unheard of for anyone walking through your office to pilfer a drive lying around, or even one inside a desktop.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (2.1MB) podcast or subscribe to the feed at iTunes and LibSyn.

Security Decisions

How many of you have written code that results in a security issue or data breach? It’s possible many of you have, but are unaware of the issues. I’m sure a few of you have been hacked, have had to clean up data or code, but often it’s not your code. Maybe it’s something you inherited. Maybe it’s code your team wrote together. In any case, the feedback loop between your action (writing code) and the result (a security incident) isn’t tightly coupled.

I ran across a post from Bruce Schneier on how people learn about cybersecurity. The piece links to a research paper, and it’s an interesting read. It turns out the researchers found most non-experts learning from news, friends, and websites, often biased toward threats that have had immediate negative consequences, but not necessarily those that are more serious.

That has me wondering about us, as supposed expert, or expert-ish, developers and DBAs. How many of us get security training, or get that training updated? How many of us learn from friends or websites and re-use this knowledge over and over in our work, without necessarily understanding, or ensuring, that we are building strong security into our systems? I suspect many of us just try to get by, writing the minimal level of security that “works” for us, without really understanding where there might be flaws or holes in our system.

Our code, our configurations, and our systems have a much farther-reaching impact than ourselves. In some sense, I think a fundamental flaw in information technology is the lack of security practices and architectures built into our platforms and applications from the start. While convenience may get the boss off our back, or allow greater profit for our companies, it’s not helping our industry, or even our companies, in the long term.

I just wish I had some idea on how to change things.

Steve Jones

The Auditor Attack Vector

The phone on the desk buzzed. The CEO picked it up, expecting his assistant to let him know his next appointment had arrived. Instead he was told a person had called and wanted to discuss why his managers were paid less than some of their direct reports.

The CEO was puzzled, and worried, so he accepted the call.

“Did you know that you have programmers making more than some of their managers?” the caller asked, quoting the specific people and their salaries.

The CEO did know, and acknowledged it, but declined to discuss the matter. Instead he asked who was on the phone, and how they knew the salaries of his employees.

The caller declined to give their name, but said they had found a USB thumb drive on the street outside and had plugged it into a computer. A number of spreadsheets were on the drive, one containing the salaries and organizational structure of the company. The caller left the story there, promising to mail the drive back to the CEO.

The CEO was upset, and worried, but waited a few days until a package arrived in the mail. He had been planning to terminate someone for carelessness. However, when he opened the package, he realized none of his employees was to blame. Instead, the drive was a device given to an auditor who was verifying the accounting practices of the company.

I don’t know the rest of the story, but it was told to me by someone who runs a decent-sized company. It’s a scary story, and it highlights a concern that has nothing to do with most of us who work in technology departments. However, it does show that there are always holes in our processes and practices. We need to consider the fact that many of the businesspeople we work with value convenience much more than security. We need to be sure we take precautions where possible, such as encrypting all data at rest and in transit, wherever we can.

It might not be our fault, and it might not be something we’re blamed for, but I certainly would feel some guilt if I had copied the data onto the USB drive without providing additional security, such as encryption or at least a password.

Steve Jones

The Voice of the DBA Podcast

Listen to the MP3 Audio (2.2MB) podcast or subscribe to the feed at iTunes and LibSyn.