How to sync user logins across SQL Server instances – dbatools is brilliant

This blog post is about how brilliant dbatools is – in this case, for syncing logins between SQL Server instances.

Background:

Whenever I do my “DevOps and the DBA” talk I dedicate a minute or two talking about dbatools.

You can find the tools here:

https://dbatools.io/

Simply download and install on your server – or a machine that has visibility to what you want to connect to.

When building new servers, the most important thing after restoring and securing the databases is syncing up the logins. This is especially important for Availability Groups, as SQL-authenticated logins require the SIDs to be the same on every replica.
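A quick way to see whether your SIDs actually line up is to query sys.server_principals on each instance and compare the output – a minimal sketch, looking only at SQL-authenticated logins:

-- Run on both instances and compare side by side; the same login name
-- with a different SID on each replica is what breaks SQL-authenticated
-- users after a failover
SELECT name, sid
FROM sys.server_principals
WHERE type = 'S'            -- SQL-authenticated logins only
  AND name NOT LIKE '##%';  -- skip internal certificate-based logins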

In the past I had some very long-winded code that would do the sync – a mixture of T-SQL and PowerShell. It worked, but you know – it was cumbersome.

So I effectively practiced what I preached recently and used the Copy-DbaLogin command – actually I looked at what it did first by running:

Get-Help Copy-DbaLogin -Detailed

For what I needed to do – I needed both the SQL-authenticated logins AND all the Windows users/groups – so I just ran this:

Copy-DbaLogin -Source OLDSERVER -Destination NEWServer

Which results in the following example:

[Screenshot: the Copy-DbaLogin results – this is brilliant]
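If you don't want every login copied, the command also takes filters – a quick sketch based on the parameters described in Get-Help (check the behaviour against your dbatools version):

# Copy only specific logins
Copy-DbaLogin -Source OLDSERVER -Destination NEWServer -Login AppUser1, AppUser2

# Copy everything except a few, overwriting logins that already exist on the destination
Copy-DbaLogin -Source OLDSERVER -Destination NEWServer -ExcludeLogin OldAppUser -Force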

The upshot of all this is that syncing logins between SQL Server instances has never been easier – and it means I can throw away my terribly written script.

Yip.

 


How to change the TFS Agent _work folder

This blog post is about how to change the default work folder (_work) that TFS agents use when building a solution.

Background:

I’m now a consultant – which is awesome – it means I get to visit clients and make a difference for them.

One particular client had installed TFS and their remote build agent was installed in C:\TFSAgent.

Methodology:

By default when installing a TFS agent you can choose the location of the work folder (_work) – normally it goes under the root directory where you installed the agent. So in this example they had the agent work folder at:

C:\TFSAgent\_work

Which was fine – until the builds were kicking off regularly (thanks to good Continuous Integration practices they were doing builds almost hourly) and C:\ started running out of space.

So a D:\ was added to the server.

But how do you change the work folder to D:\TFSAgent\_work?

A lot of posts on the internet say to just remove the old agent and install it again. That to me seems a bit drastic.

If you’ve read my previous blog post on changing agent settings, you will know about the hidden file .agent.

[Screenshot: the .agent file is our friend for changing settings]

Except the settings file is set out in JSON.

Which caught me out – I made the change with plain backslashes (D:\TFSAgent\_work) and the agent was not happy at all.

So to change the default _work folder to D:\TFSAgent\_work you need to:

1. Stop the agent service

2. Open the .agent file which will look something like this:

{
  "agentId": 10,
  "agentName": "BUILDAGENT",
  "poolId": 3,
  "serverUrl": "https://YourtfsURL.something.local/tfs/",
  "workFolder": "_work"
}

3. Edit it like this:

"workFolder": "D:\\TFSAgent\\_work"

Note the double backslashes – backslashes have to be escaped in JSON.

4. Start the agent service, kick off a build and watch TFS populate the directory with what it needs.
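If you have more than one agent to update, the whole thing scripts up nicely. Here is a minimal PowerShell sketch of steps 1-4 – the vstsagent* service name pattern and the C:\TFSAgent path are assumptions (check Get-Service and your install directory first), and ConvertTo-Json takes care of the backslash escaping for you:

# Assumes the agent lives in C:\TFSAgent and runs as a Windows service
# whose name starts with vstsagent - verify both on your build server
$agentRoot = 'C:\TFSAgent'
$service   = Get-Service -Name 'vstsagent*'

$service | Stop-Service

# .agent is plain JSON, despite the dot-prefixed name
$settingsPath = Join-Path $agentRoot '.agent'
$settings = Get-Content $settingsPath -Raw | ConvertFrom-Json
$settings.workFolder = 'D:\TFSAgent\_work'   # ConvertTo-Json escapes the backslashes
$settings | ConvertTo-Json | Set-Content $settingsPath

$service | Start-Service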

That really is it – but hopefully by reading this post it will save you time and energy by NOT having to reinstall your agent.

Yip.

Use Boot Diagnostics to see what your Azure VM is doing at boot time

This blog post is about how to diagnose what your Azure VM is doing while it is booting.

I have a DEMO VM that is hosted in Azure – it is where I have Visual Studio, the Redgate tools, and all my DEMO systems and presentations for when I speak. That way when I go to a venue to speak I only need (at worst) internet connectivity – and I have a phone with fantastic internet if the venue doesn’t.

What I do is keep the VM up to date with Windows patches, and I make a point, 2 days out from an event, of making sure all outstanding patches are installed.

Hint: this might tell you where this post is headed.

So 2 days out from speaking in Spokane (DevOps & the DBA) I made sure to start up my VM and check things were good. The only complicating factor was that this was a day before I was due to give a session to TheDevOpsLab – so I thought: best to get this out of the way and practice my database unit test session that was going to be recorded.

So I went into the Azure Portal and down to Virtual Machines and clicked “start”:

[Screenshot: let’s start up the VM and get started]

Normally this start-up process takes about 3-5 minutes.

However after 10 minutes I still could not connect. After 15 minutes I started to become worried. So I clicked on the Virtual Machine in the Azure Portal to see what usage was happening.

Things were happening alright:

The keen eye will note that is 2 hours’ worth of activity…..

Yip – my VM was busy doing heaps of stuff for 2 hours and the whole time I could NOT log onto it via RDP. Which is when I discovered “Boot Diagnostics” in the Azure Portal for Virtual Machines. It allows us to see the console of the VM.

Simply click on your VM and choose “Boot Diagnostics”:

[Screenshot: Boot Diagnostics – let’s see what the VM is doing]

Which gave me an insight into what my VM was spending so much time doing:

[Screenshot of the VM console: ugh… Windows Updates.]

So I waited for 2 hours whilst my VM applied these patches (to be fair it was a very big update).

The good thing was I could monitor the progress via Boot Diagnostics.
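You don’t have to use the portal to grab that console screenshot either – it (and the serial log) can be pulled down with PowerShell. A sketch using the AzureRM module, with placeholder resource group/VM/path names (double-check the cmdlet against your module version):

Login-AzureRmAccount

# Download the latest console screenshot and serial log for the VM
Get-AzureRmVMBootDiagnosticsData -ResourceGroupName 'DemoRG' `
                                 -Name 'DemoVM' `
                                 -Windows `
                                 -LocalPath 'C:\Temp\BootDiag'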

BTW – in normal running this is the console of my VM:

[Screenshot of the normal console: you should consider visiting New Zealand]

Which is the view out my front gate. If you have never considered coming to New Zealand – hopefully the console of my VM hosted in Azure will help you decide.

Or consider submitting to SQL Saturday South Island:

http://www.sqlsaturday.com/712/EventHome.aspx

We’re still open until 26th April 2018 (I extended it today) and, to be honest, if it’s past that date and you are an international speaker – hit me up on Twitter (@theHybridDBA) if you want to speak. We’re relaxed here in New Zealand and love people who give their time to the community.

I’ll even give up my own speaking slot.

Yip.

Authentication issues with GitHub: An error occurred while sending the request ---> System.Net.WebException: The request was aborted: Could not create SSL/TLS secure channel.

This blog post is about an error that you may receive if you’re using git commands in PowerShell and authenticating against GitHub.

My GitHub account uses 2 Factor Authentication so I thought it might be that – however I could use GitHub Desktop fine.

I was cloning a new project for a client:

git clone https://github.com/Client/Documentation.git

and I got a prompt for my username/password, but when I entered both I got an error. I knew my username and password were OK, and I knew 2FA was working too.

The error (taken from event logs) was:

System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> System.Net.WebException: The request was aborted: Could not create SSL/TLS secure channel.

So I looked at my Git for Windows version:

PS C:\Yip> git version
git version 2.14.2.windows.1

So I decided to upgrade – I had been meaning to do this for a while, and GitHub had recently deprecated support for the older TLS 1.0/1.1 protocols, which I suspect is exactly what was tripping up my older client.

So I downloaded the latest version for Windows – 2.16.2.

I then ran the same command – which prompted me for my GitHub username/password. I entered them, it asked me for my 2 Factor code, I put it in and hooray!! – it worked.

I successfully cloned the repo and can now do work for a client who has stuff stored in GitHub.
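As an aside – if upgrading isn’t an option straight away, one workaround worth knowing about is pointing Git for Windows at the native Windows TLS stack, which negotiates TLS 1.2. This is a sketch, and assumes your build of Git for Windows ships the schannel backend (it appeared as an option around 2.14):

# Tell Git for Windows to use the native Windows TLS stack (schannel)
# instead of the bundled OpenSSL
git config --global http.sslBackend schannel

# Then retry the clone
git clone https://github.com/Client/Documentation.git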

Yip.

Redgate SQL Test utility isn’t showing up in SSMS

I had recently installed the SQL Toolbelt from Redgate at a client site on a laptop they’d supplied me.

(Fantastic product that SQL Toolbelt BTW.)

Things were going swimmingly – in fact thanks to SQL Compare, SQL Source Control and Team Foundation Server I had implemented an automated Continuous Delivery pipeline for the client’s databases and applications.

The next thing I wanted to do was start implementing unit testing for both the applications and the database. DEV were going to do the application side (this client is a start-up, so it made sense that they had little to no unit tests yet) and I’d do the database side.

Except in SSMS I couldn’t find SQL Test…??

I knew to click on the drop-down icons to find the other utilities, but it wasn’t there either:

[Screenshot of the Redgate toolbar in SSMS: not here either…. what have I done wrong??]

So, as a Windows user, I naturally looked in the Windows Apps list:

[Screenshot of Windows Apps: hmmm…. nothing here either]

At this point I decided it had to be my machine, as I did a Google search and looked on forums and no one seemed to have experienced this.

So I uninstalled everything – SSMS and the Toolbelt.

Reinstalled everything.

And got this:

It was while clicking around like a madman I found this:

Phew.

And of course now I can do this:

[Screenshot: now in my happy place – let’s start unit testing!!]

 

So if you have recently installed Redgate SQL Toolbelt and can’t find SQL Test – hopefully this blog post will help you.

By the way, I do think there was something wrong with the laptop the client gave me, as now when I right-click I get the ability to run tests:

[Screenshot of the SQL Test right-click context menu: this was definitely not there before the uninstall/reinstall fiasco]

So now I can start my unit tests – and the good news is the DEV team have started theirs and are really getting behind it. I think they’ve got on with it to stop me talking about unit tests…!!

We now have 4 minutes of unit tests per checked-in build, and that is definitely something I’ll respond to with:

Yip.

Automation and cleaner code/data are the key to future success


This month’s T-SQL Tuesday #100 is hosted by Adam Machanic (B | T) who started T-SQL Tuesday 8 years ago and has invited the community to look forward 8 years at what will be T-SQL Tuesday #200…

 

Before I do that though – I want to think back to what I was doing 8 years ago. At the time I was working with object-oriented databases, and the company I worked for had just implemented extensive unit testing across our technology stack. I say “extensive” because we had both the database and application code going through unit tests, integration tests and regression tests. Most of which was automated.

It was around this time that I was starting to do more things in SQL Server – namely SQL Server 2005. Funny how 7 years later I was still doing some things in SQL Server 2005 – but that is for another blog post…

It was when I came across to the SQL Server world that I realised that 3 things were different:

  1. Database build/code definitions were not in source control
  2. Unit testing was almost non-existent
  3. Automated testing or deployment was also not a common thing

Even now in 2018 I find that when speaking at conferences, if I ask these questions:

  1. Who here uses source control – 60% of hands go up
  2. Who here puts their database into source control – 50% of those 60% of hands go down…..
  3. Who here does automated testing across their databases – another 50% of those hands go down

I generally get ~30 people in my sessions, so working it through: 60% of 30 is 18 hands for source control, half of those (9) have their database in it, and half again (4-5) do automated testing. If at best 5 people out of 30 are doing this – we need to change this.

Yet…

For years people have been discussing getting our database code into source control, and unit testing has been discussed at length during the past 8 years.

So what does this mean for the next 8 years?

I’m on a personal crusade to bring automated testing and DevOps philosophies to your database…

In fact I am hoping that DevOps doesn’t exist in 8 years’ time – that we as an industry will have matured and called it “common sense”.

Let’s talk about automation – whilst people might not see the value in testing our most precious resource – data – they will hopefully see the light: the more we can automate, the better our jobs will be.

But won’t it kill the DBA off, I hear some of you ask.

No.

It won’t.

In 2026 the role of the DBA will have morphed – in fact all Data Professionals will have an appreciation and understanding of how to automate the delivery of value to the end user. I choose my words carefully – too often we as technical people think about 1s and 0s, but in fact it is how we deliver value to our clients that dictates our success. In terms of customer experience, we need to ensure we are delivering higher value than our competitors.

Many people are worried that automation will put them out of a job. This won’t happen, and in fact there will never be a shortage of work to do in a successful company. Rather, people are freed up from mindless drudge-work to focus on higher value activities.

This also has the benefit of improving quality, since we humans are at our most error-prone when performing mindless tasks.

By utilising the features in SQL Server Data Tools (SSDT) Version 2026 – which does both state-based AND migration-based deployments to databases automatically – we as DBAs can start focusing on activities that bring value to our clients. Databases have been touted to be self-tuning for decades but I simply don’t see it happening; we need DBAs that can tune queries, and more importantly understand how automation can make their lives better.

Data Science is big time in 2026 – the past 8 years have seen a massive jump in its usage. This is where Data Professionals have a massive part to play – cleaning up the underlying data – who knows, perhaps through automated cleansing jobs that tune themselves… automatically.

See where I’m going with this – the growth of data we will see over the next 8 years means we need to be more vigilant in the ways that we interact with it. Using automation and more testing means that data scientists can focus on what they’re good at – doing stuff with data – rather than spending (up to) 60% of their time trying to cleanse it.

I am not scared of automation putting me out of a job; I’m more scared that over the next 8 years people won’t embrace it enough to help with the delivery of value to the end user.

Yip.

 

Doing DevOps for your Database? You need to start here…

I’ve been doing that thing called DevOps for about 17 years – you know – before it had a name…

In fact it was when I first joined Jade Software that I realised that Ops and DEV had a common interest in making systems go. In 2002 I started working with our toolset team designing our in-house written toolsets to deploy JADE databases/systems automatically and reliably. Little did I know I would be repeating that methodology around .NET written systems in 2010 and then repeating it again with deploying SQL Server code in 2014…

The major breakthrough for the JADE applications and database was unit testing.

When we started developing in .NET (over a JADE database at the time) it made sense to do unit tests – in fact I don’t know of many good .NET developers who don’t do unit testing.

It was when I started speaking to audiences that I realised that not many people do unit tests for their databases. Why??

Well it appears I was talking to the wrong crowd…. sort of.

Code-first developers who design databases using Entity Framework are used to writing unit tests, so most of the time they are already testing code that will change the underlying database.

So that led me to ask the question: “Are DBAs doing Unit tests?”

No.

They are not.

Firstly – what is a unit test?

The primary goal of unit testing is to take the smallest piece of testable software in the application, isolate it from the remainder of the code, and determine whether it behaves exactly as you expect.

This testing is done as part of the development process, and a unit test will check that the code being tested meets a specification, while a library of unit tests together will check the functions expected of the application. By automating the process – which allows the library of unit tests to be run frequently and repeatedly – we can find bugs earlier, and in smaller units of code, which are far easier to debug.

Unit tests should be self-contained enough that you are isolated in what you are testing, so that you know whether you have a correctly performing test. Too often we think of databases as massive stores of data and generally we only test for performance. However – with unit tests we are looking at far smaller sets of data; we want tests that are fast, and we are testing the functionality/quality of the code rather than the execution speed.

The upshot is that once we embrace unit testing we can then start to utilise regression testing, which can allow us to refactor our database code just as easily (if not more confidently) as developers do for application code.

So if hiring a .NET developer who doesn’t do unit tests is unthinkable – why would we accept this as the norm for the people who are writing the stored procedures etc that touch/influence and ultimately own our most precious resource – our data…?

Because it is too hard?

I find tuning indexes and queries hard – writing a unit test to prove that my code won’t kill PROD seems way easier. Also if I find a bug in my code when I’m writing the code at 3pm – it’s way easier to fix it then than at 3am when an online banking system has crashed/burned or is now corrupt…. I’m sorry but saying unit testing is too hard is a cop out.

Because it is too slow?

Refer to the example above – it’s way easier to write a little unit test and prove that my change is going to work when it will only trash my DEV instance. Fixing it then is far quicker than when 1,000s of users are affected – because there are far fewer people calling my phone/emailing me when I fix it in my DEV instance.

Because it is too new?

Not at all – SQL Server Data Tools (SSDT) has provided database developers the ability to do unit testing since 2012. https://blogs.msdn.microsoft.com/ssdt/2012/12/07/getting-started-with-sql-server-database-unit-testing-in-ssdt/

In fact there is an old article at SQLServerCentral that is from 2009!!  http://www.sqlservercentral.com/articles/Editorial/68586/

Because it involves learning a new language?

You don’t have to – tSQLt and SQL Test by Redgate both allow unit tests to be written in T-SQL – which most DBAs thrive on. Even SSDT allows you to write unit tests in T-SQL.

I have used SSDT a fair bit, so if you are a database developer then I highly recommend you read “Creating and Running a SQL Server Unit Test”   https://msdn.microsoft.com/en-us/library/jj851212(v=vs.103).aspx

If you are a DBA then I highly recommend you look at tSQLt.org – the official website has lots of useful information. There is a slight learning curve – but after 15 minutes of reading and trying it out, it is very simple to use. It allows you to isolate your testing to a particular schema and makes use of fake tables – the equivalent of mocking – which allows us to take a copy of the table as it is and test against it.
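To give you a flavour, here is a minimal sketch of a tSQLt test – the table, stored procedure and test names are all made up for illustration, but NewTestClass, FakeTable, AssertEqualsTable and Run are the real framework calls:

-- Create a schema (test class) to hold related tests
EXEC tSQLt.NewTestClass 'CustomerTests';
GO

CREATE PROCEDURE CustomerTests.[test that AddCustomer inserts an Active customer]
AS
BEGIN
    -- Swap dbo.Customer for an empty fake so the test is isolated
    EXEC tSQLt.FakeTable 'dbo.Customer';

    -- Act: run the (hypothetical) code under test
    EXEC dbo.AddCustomer @Name = 'Test Customer';

    -- Assert: compare what we expected with what actually happened
    SELECT 'Test Customer' AS Name, 'Active' AS Status INTO #Expected;
    SELECT Name, Status INTO #Actual FROM dbo.Customer;

    EXEC tSQLt.AssertEqualsTable '#Expected', '#Actual';
END;
GO

-- Run every test in the class
EXEC tSQLt.Run 'CustomerTests';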

There is a great pluralsight course here:  http://pluralsight.com/training/Courses/TableOfContents/unit-testing-t-sql-tsqlt

David Green is the author of the above Pluralsight course and has written a fair bit about tSQLt – http://d-a-green.blogspot.co.uk/search/label/tSQLt

Greg Lucas has also written a lot about tSQLt http://datacentricity.net/tag/tsqlt/

His article on http://datacentricity.net/2011/12/advanced-database-unit-testing-with-tsqlt-testing-cross-database-tables  is particularly helpful.

Of course there are some other great utilities and I mentioned one earlier:

https://www.red-gate.com/products/sql-development/sql-test/index

That’s SQL Test by Redgate – an absolutely awesome tool for DBAs, mainly as it plugs straight into SQL Server Management Studio (SSMS).

It uses the tSQLt framework and incorporates SQLCop tests, which will help you enforce best practices for database development and run static analysis checks.

The best part is that if you are on a DevOps for databases journey, you can fold those tests into your Continuous Integration processes to really bring up the quality of your database code. You need to start looking at unit tests – today.
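As a concrete example of that folding-in: a CI build step only needs a couple of lines. This is a sketch using sqlcmd against a hypothetical build server and database – tSQLt raises an error when any test fails, and the -b switch turns that into a failing exit code for the build:

# Hypothetical CI build step - run the whole tSQLt suite on the build database
sqlcmd -S BUILDSQL01 -d BuildDB -b -Q "EXEC tSQLt.RunAll;"
if ($LASTEXITCODE -ne 0) { throw 'Database unit tests failed - stopping the build' }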

Yip.