Azure SQL Managed Instance – full-blown SQL Server in the cloud?

This blog post is one that I have had percolating in the background since around November 2018.

My good mate John Martin ( t ) was speaking at PASS Summit on Azure SQL Managed Instances and we had talked about use cases, some of the gotchas and things to consider when migrating your databases to it.

I have been discussing Managed Instances (MI) with more people recently and this blog post is basically a run down of what MI is and how you can migrate to it with some considerations.

The Past

Before we go into MI we should look at what offerings were already available in Azure before MI became generally available in 2018.

We already had Azure SQL Database (introduced in 2010) and also the ability to run up SQL Server on an Azure VM. Both of these offerings have their pros and cons:

Azure SQL Database:


Pros:

Server administration is handled by Microsoft

You could scale out as required

Backups were fully managed

(BTW you can configure a long-term backup retention policy to automatically retain backups in Azure blob storage for up to 10 years.)


Cons:

We don’t have SQL Server Agent

It’s running 24/7 – you can’t power it down to reduce cost

Cross-database queries are not native

No control of files and filegroups

Can’t use native backups for restore

No Service Broker

Backups are fully managed (for control freaks like myself this can be an annoyance)

Note: Azure SQL Database comes as singleton databases or in elastic pools

SQL Server running on an Azure VM:


Pros:

Good old familiar SQL Server

SQL Server Agent jobs!!

We can power down the VM if we do not require SQL Server 24/7


Cons:

We still have to administer a server

The costs associated with hosting VMs in Azure

The management overhead of hosting VMs in Azure

Introducing Managed Instances:


Pros:

HA is built in

Infrastructure is handled by Azure

Can backup/restore to Azure Blob Storage

Lift & shift migrations

SQL Server Agent

Cross Database Queries

Linked Server (I know, I know….)

But like everything there are some cons.


Commissioning the Managed Instance can take a fair while.

We can’t shut it down, so cost can be prohibitive if you have sized it wrong.

Azure SQL Database Managed Instance does not currently support long-term backup retention

Here is a good comparison of Managed Instance vs Azure SQL Database


Reference: “Azure Managed Instance your bridge to the cloud”, Joey D’Antoni, SQLSaturday Cambridge 2018

Requirements for Azure Managed Instances

Configuring network environment:

You need to configure the network environment where the Managed Instance will be created: an Azure VNet and a subnet where the instance will be placed.

Although the VNet/subnet can be configured automatically when the instance is created, the drawback is that it will be configured with default parameters that you cannot change later.

If you already have a VNet and subnet where you would like to deploy your Managed Instance, you will need to make sure that they satisfy the networking requirements.

Creating Managed Instance:

Once we have the network environment configured, the Managed Instance can be created. The easiest method is to use the Azure portal; however, you can also use PowerShell, PowerShell with an ARM template, or the Azure CLI.

My recommendation is to script it out as that adheres to my philosophy of using Infrastructure as Code to do pretty much anything where possible.
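As a hedged sketch of what that script could look like – assuming the Azure CLI, placeholder resource names, and a subnet that already satisfies the Managed Instance networking requirements:

```shell
# Sketch only: create a Managed Instance with the Azure CLI.
# All names below are placeholders; the subnet must already be
# configured per the Managed Instance networking requirements.
az sql mi create \
  --name my-managed-instance \
  --resource-group my-resource-group \
  --location westus2 \
  --admin-user miadmin \
  --admin-password 'AStrong!Passw0rdOfYourOwn' \
  --subnet /subscriptions/<sub-id>/resourceGroups/my-resource-group/providers/Microsoft.Network/virtualNetworks/my-vnet/subnets/mi-subnet
```

Expect this to run for quite a while – this is the commissioning time mentioned above.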

Be careful though…

Storage is vital to the performance of your database:

You need to size your underlying disk with throughput in mind:

disk sizing and throughput

So if we want at least 5,000 IOPS per disk then we need to size at P30 or combine smaller disks to achieve the necessary IOPS.
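As a back-of-the-envelope illustration of the combining option (using the published per-disk IOPS caps – treat the numbers as indicative), striping three P20 disks clears a 5,000 IOPS target:

```shell
# Illustrative arithmetic only: striping premium disks sums their IOPS caps
p20_iops=2300   # published cap for a single P20
disks=3
total=$((p20_iops * disks))
echo "Striped ${disks}x P20: ${total} IOPS"   # 6900 IOPS, above the 5000 target
```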

Provisioned capacity and performance

When you provision a premium storage disk, unlike standard storage, you are guaranteed the capacity, IOPS, and throughput of that disk. For example, if you create a P50 disk, Azure provisions 4,095-GB storage capacity, 7,500 IOPS, and 250-MB/s throughput for that disk. Your application can use all or part of the capacity and performance.

Disk size

Azure maps the disk size (rounded up) to the nearest premium storage disk option, as specified in the table above. For example, a disk size of 100 GB is classified as a P10 option. It can perform up to 500 IOPS, with up to 100-MB/s throughput. Similarly, a disk of size 400 GB is classified as a P20. It can perform up to 2,300 IOPS, with 150-MB/s throughput.
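That rounding-up behaviour can be sketched as a simple lookup – only a subset of the tiers is shown, with boundaries taken from the published tier sizes:

```shell
# Sketch: map a requested disk size (GB) to the premium tier it is billed as
tier_for_size() {
  if   [ "$1" -le 128 ];  then echo P10   # P10 = 128 GB
  elif [ "$1" -le 512 ];  then echo P20   # P20 = 512 GB
  elif [ "$1" -le 1024 ]; then echo P30   # P30 = 1,024 GB
  else                         echo P50   # P50 = 4,095 GB (P40 omitted here)
  fi
}
tier_for_size 100   # a 100 GB disk is billed as a P10
tier_for_size 400   # a 400 GB disk is billed as a P20
```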

Connecting to Managed Instance:

Essentially – Managed Instance is a private service placed on a private IP inside your VNet, so you cannot connect via public IPs.

You can connect to your Managed Instance in a variety of ways:

  • Create an Azure VM with SSMS and other apps installed that can be used to access your Managed Instance, in a subnet within the same VNet where your Managed Instance is placed. The VM cannot be in the same subnet as your Managed Instance.
  • Set up a point-to-site connection on your computer, which will enable you to “join” your computer to the VNet where the Managed Instance is placed and use the Managed Instance like any other SQL Server in your network.
  • Connect your local network using ExpressRoute or a site-to-site connection.

Validating your database before migration:

It is vital that you check for differences between your SQL Server instance and Managed Instance. You need to understand what features you are using and whether you need to update your existing instance in order to migrate.

A good method is to install the Data Migration Assistant, which will analyse the database on your SQL Server and alert you to any issues that could block the migration.

You also need to consider your authentication methods for your users and applications:

Authentication considerations for your applications when migrating to Azure SQL Managed Instances

Migrating databases:

There are several ways to move your database:

  • Native restore functionality that enables you to create a backup of your database, upload it to Azure blob storage and RESTORE the database from blob storage. This is probably the fastest approach for migration, but requires some downtime because your database cannot be used until you restore it on the Managed Instance. You can even roll your own log shipping to it to minimise the amount of downtime.
  • Data Migration Service is a service that can migrate your database with minimal downtime. It does require VNet connectivity to the source, via VPN or ExpressRoute to Azure.
  • Transactional replication – this also minimises the amount of downtime and you can use a push subscription model with the Managed Instance as the subscriber.
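For the native restore option, the T-SQL side (run here via sqlcmd) is pleasingly familiar. This is a hedged sketch with placeholder names, and it assumes a CREDENTIAL with a SAS token for the storage container has already been created on the Managed Instance:

```shell
# Sketch only: restore a native .bak from blob storage onto a Managed Instance.
# Server name, login, password and URLs are placeholders.
sqlcmd -S my-managed-instance.xxxxxx.database.windows.net -U miadmin -P 'YourPasswordHere' -Q "
RESTORE DATABASE [MyDatabase]
FROM URL = 'https://mystorage.blob.core.windows.net/backups/MyDatabase.bak';"
```

Note there is no WITH MOVE – you do not control file placement on a Managed Instance, as files are managed for you.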

You can migrate up to 100 databases to a single Managed Instance.

T-SQL Considerations:

There are some differences in T-SQL syntax and behaviour between Managed Instance and on-premises SQL Server.

It is highly recommended that you read up on those differences before you migrate.

So there you have it – Azure SQL Managed Instances are live – some might say that they are the future of SQL databases in Azure.

I personally think that, like anything, it has its place. For new cloud-based apps that require a SQL database in Azure I’d probably still use Azure SQL Database, but of course there is that good old saying:

It depends


Some of my goals for 2019

In late December 2018 I had a discussion with Amanda Martin the PASS Community Manager around things that I would be doing in 2019.  I thought I would share the questions we discussed, and my replies below.

What are your professional goals for 2019?

My professional goals for 2019 are based around what SQL Server 2019 will bring to the Data Platform. I am particularly looking forward to the changes around SQL Server on Linux – specifically the Red Hat Enterprise Linux based images.

See my blog post where I run up the Ubuntu images on CentOS:

Setting up CentOS to run docker for SQL Server

I have clients who are heavily into containers and this has driven me to start promoting this within the community – to help others learn about what containerized SQL Server is all about.

Is there a technology you want to learn or master?

As mentioned above SQL Server 2019 is going to be a great release in terms of what it can offer people to extend their data platform.

This in itself will be a technology that I will be mastering in 2019 and I look forward to being able to feed that knowledge back to our community.

A technology I am looking forward to learning and mastering in 2019 is Azure Kubernetes Service (AKS).

The reason for this is I have presented on how powerful the Azure platform can be. I am also educating my clients and community around containers so this is a great partnership of both technologies.

In 2018 an area that I worked in a fair bit was Availability Groups, and the fact that SQL Server 2019 CTP 2.2 allows for running an AG in docker containers with Kubernetes is exciting.

Using containers as part of a DevOps deployment pipeline for databases and applications is an area that will grow and being at the forefront of that is a driver for me in 2019.

Do you have a skill you want to upgrade?

I speak on continuous improvement – around deploying quality value every time you release software.

A skill I want to “upgrade” is my speaking: I want to spend time honing my speaking craft and continually improve how I deliver content to the community and industry.

Being able to deliver quality content that helps people learn is fundamental to why I get up in front of crowds and talk about how to #MakeStuffGo

In terms of technological skills I want to upgrade – I always want to be a better Data Professional and so this year will be spent reading blog posts, registering for some of the fantastic (and free!!) webinars that PASS run.

Our Data Platform is growing exponentially and so a fair chunk of my free time will be spent ensuring that the skills I have within it are relevant, current and transferable.

What are your PASS community goals for 2019?

My PASS community goals for 2019 are to be more involved and to extend my reach and influence within the community. I live in New Zealand, which is quite a remote country – a good thing, and sometimes a bad thing when it comes to travelling (anywhere).

I want to be able to reach the wider community and part of this will be doing more webcasts as opposed to writing. Running webinars is a great way to connect and share content with a far wider and more diverse audience.

Are you already involved with a PASS Local or Virtual Group? If so, do you want to get more involved in these speaking, organizing and coordination opportunities?

I am already a PASS Local User Group Leader in Christchurch, New Zealand. I run the SQL Server and Data Management User Group which has grown from 250 members to 745 members in 3 years under my leadership.

I have recently been involved in the “rebooted” DevOps Virtual Group with Rob Sewell (t | w), which has been a fantastic platform for us to get the DevOps message out to the world via the PASS VG platform. Rob is a very energetic guy and I think it’s gonna be brilliant working with a guy who is as bouncy as myself!!

It means that at least twice a month I am speaking, organizing and coordinating educational content to up-skill people – for free. That in itself is awesome and thanks PASS for all the support you give us UG/VG leaders.

Do you want to share more ideas and content with the PASS community through blogging or publishing video content?

I certainly do – 2018 was a proving-ground year for me. Over the past 3 years of being involved with PASS I’ve grown my contributions within the community, and the reason I’ve grown them is that it is rewarding to see the impact of what we all can do. In 2019 I will be looking at how I can do more with web-based sessions – so that a more diverse group of people can learn from both myself and others.

Collaboration is a key thing in how I work and also in my personal life and I am looking forward to seeing how I can work with others to increase our “touch points” within the industry to help people connect, learn and share.

I am also looking to collaborate with some people on writing a book to help people learn via another medium.

Do you want to get involved with or plan a SQLSaturday for your local PASS community?

I already organize a SQLSaturday – SQLSaturday South Island. This event has grown each year since I took it over in 2016 and it now boasts Australasia’s largest percentage of WIT speakers.

I want to further this, and a goal for 2019 is to make it an even more inclusive and diverse community-based event.

This is an exciting goal as I have never been one to shy away from goals or from encouraging people to be part of a community.

I am looking forward to collaborating with a whole range of people within both my local and extended network.

So there you have it – 4 days into the New Year and I’m excited about where 2019 will lead me – from a technological, personal and community perspective.

A mention has to go to PASS, as if it weren’t for the platform they provide then I wouldn’t be able to use such things as the Learning Center to learn about containers and SQL Server….

…or attend heaps of SQLSaturdays around the world to learn for free (and to give back to the community by speaking)…..

…. or attend PASS Summit each year in Seattle to both learn things and share my own knowledge.

So if you’re reading this and have not joined the PASS Community then register here:

It’s free!! And it will help you set and achieve some of the professional, technical and community goals you may have set yourself for 2019.


Adding a database running in a docker container to SQLTest

In my previous post I wrote about how easy it is to add the tSQLt unit test framework to your database if it is running in a docker container:

How to use tSQLt unit test framework with a SQL Server database in a docker container

But let’s say you have the SQLTest utility from Redgate (hopefully you’ve bought the SQL Toolbelt, as it really does make your life way simpler and more productive).

So as per usual – open up SQLTest and choose the database that you want to add the tSQLt framework (which is encapsulated in SQLTest) to:

4. Adding database to SQLTest
This is just the same as if the database was running in SQL Server on windows – right?

You will be presented with this screen – I generally take off the tick for SQL Cop and add those tests later.

5. Add Database to SQL TEST


Unfortunately we get an error:

6. Error


Basically we need to add tSQLt manually and edit some things to make it work with SQL Server running in a docker container:

1. First we need to enable clr and disable clr strict security:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'clr enabled', 1;
EXEC sp_configure 'clr strict security', 0;
RECONFIGURE;


2. We can then either grab the script that SQLTest uses or manually download the tSQLt framework from the tSQLt website.

We will use the tSQLt.Class.sql script and basically search for PERMISSION_SET = EXTERNAL_ACCESS and change it to PERMISSION_SET = SAFE.
We can now add our database to SQLTest as per normal (in fact, if we’ve done everything correctly it will just appear) – so hit refresh and your database should be there.



11. Tests Run
Let’s write and run some unit tests (and keep ourselves employed!!)

There is now no reason at all that you should not be writing/running unit tests for your SQL code.

You can even do it against databases running in a docker container using SQLTest.

I always say unit tests are the thing that keep me employed – as all my buggy code (and there’s a lot) gets caught on my laptop (or container) and NOT in PROD or even UAT systems.

So go visit the tSQLt website or the other great sites covering it, and in no time at all you will be writing your very own unit tests.


How to use tSQLt unit test framework with a SQL Server database in a docker container

This blog post describes how to add the tSQLt unit testing framework to a database running in a docker container. You may have already read my article on how easy it is to install it against a ‘normal’ SQL Server instance running on Windows…

How to install the tSQLt unit test framework to start unit testing your database code

..but what about SQL Server running in a docker container??


I have a SQL Server instance running in a docker container as described in my previous post:

Setting up CentOS to run docker for SQL Server

The main reason why I run SQL Server in a container is so that I can run up specific versions – all from a script and generally as part of an automated build within Continuous Integration processes.


The first thing we need to do is spin up our docker container which has our SQL Server instance:

1. Spinning up our SQL Server in container
Let’s spin up SQL Server and use the latest version of SQL Server 2017

Just to verify that it is running let’s run

docker ps

We can see that the container has an ID of 789f58fbef3f and a name of sql1.

I have also mapped the default port of 1433 to 59666.

2. container SQL Server is running

We can now connect to SQL Server on port 59666 and do some typical queries whenever we connect to an instance for the first time:

3. What version is running
Connecting to our SQL Server instance running in a container known as 789f58fbef3f

Note that in the screenshot above – the servername result is the name of our container.

Now we can create our database and SQL Authenticated users as normal.

Yes – you interpreted that sentence correctly – right now you cannot use Windows authenticated users against SQL Server running in a docker container on Linux. There is a way to get around this that I will be blogging about in late January 2019.

We can add our database to Source Control as per normal and apply the latest changes in our branch. I use the popular Redgate tooling for Source Control (and DevOps) activities

9. Adding stuff from Source Control
We can treat our database in the container like any other database (except for windows auth…)
10. Progress
Applying changes from Source Control to a database running in a container – easy as..!!

So all that is left to do is download the tSQLt framework (a whopping 86KB zip file) from the tSQLt website.

Inside the zip file are some files:

7. tSQLt download
tSQLt comes with an example database you can use to try things out!!

There is:

Example.sql, which is a script that allows you to create a database and run your first unit tests. It’s a great example and you can follow the documentation on the tSQLt website.


SetClrEnabled.sql, which is a script to enable CLR. This is partly why you should only run tSQLt on your DEV or selected TEST databases.

This is where I diverge slightly from a stock standard install:

The SetClrEnabled script does the following:

EXEC sp_configure 'clr enabled', 1;

Whereas for Linux-based SQL Server we need to run the following – as the framework will not install with clr strict security enabled – note we have to turn on ‘show advanced options’ first:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

EXEC sp_configure 'clr enabled', 1;
EXEC sp_configure 'clr strict security', 0;
RECONFIGURE;


tSQLt.Class.sql is the framework itself. Now before you run it you have to do the following to get it to run in a database running in a container:

In the script you need to change PERMISSION_SET = EXTERNAL_ACCESS to be PERMISSION_SET = SAFE
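If you are scripting this step, the edit is a one-liner with sed – demonstrated here against a sample line rather than the real file:

```shell
# Demonstrating the required edit on a sample line; in practice you would
# run the same sed expression with -i against your copy of tSQLt.Class.sql
echo "WITH PERMISSION_SET = EXTERNAL_ACCESS;" \
  | sed 's/PERMISSION_SET = EXTERNAL_ACCESS/PERMISSION_SET = SAFE/'
# prints: WITH PERMISSION_SET = SAFE;
```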

We now run the script and voila – we have tSQLt in our database and can now write or run all the unit tests we want.

8. Running tSQLt against container SQL Server
Hooray – we now have tSQLt running in a container

So there you have it – you can add tSQLt very easily (just 2 changes to make) to a database running in a docker container.

As mentioned, the reason I am so interested in running tests against databases in containers is this: I can spin up a container with a particular patch level of SQL Server 2017 or above, add my database from source control, apply any referential data from source control and apply tSQLt, which allows me to run a plethora of unit tests.

All within seconds and all consistently across multiple environment levels (QA, Integration, Functional test etc).

So go visit the tSQLt website or the other great sites covering it, and in no time at all you will be writing your very own unit tests.


How to install the tSQLt unit test framework to start unit testing your database code

This blog post describes how easy it is to install the tSQLt framework.

Don’t know much about tSQLt? If you are serious about dealing with data in SQL Server then you should be as serious about unit testing the changes you are making to your stored procedures, functions and the like.

In fact I wrote a blog post about unit testing here:

Doing DevOps for your Database? You need to start here…

So I am assuming that you have realised how important it is to unit test your data and code so here’s how to install tSQLt to your database.

First – go to the tSQLt website and download the framework.

Yip – it is just 86KB of unit testing goodness.

You will get a zip file with some files in it:

7. tSQLt download

There is:

Example.sql, which is a script that allows you to create a database and run your first unit tests. It’s a great example and you can follow the documentation on the tSQLt website.

SetClrEnabled.sql, which is a script to enable CLR. This is partly why you should only run tSQLt on your DEV or selected TEST databases.

tSQLt.class.sql, which is the framework itself.

We now run the SetClrEnabled & tSQLt.Class scripts and voila – we have tSQLt in our database and can now write or run all the unit tests we want.

I guess at this stage you might ask – why do I need to enable CLR? Well, let’s say you don’t enable it and just run tSQLt.Class.sql – you will get the following error:

You need CLR – but that’s OK, as you should only install tSQLt in your DEV or selected TEST environments.

So yeah – enable CLR and then when you run the tSQLt.Class.sql script – things will go much better:

This really is the simplest and easiest method of installing a unit test framework for SQL code.

So now you can start writing your own tests. The tSQLt framework adheres to the three A’s of unit testing:

Assemble

Act

Assert
It is very easy to create a template that you just use to write your unit tests.

The tSQLt website has some great examples and tutorials and honestly – the learning curve is very small – I’d say most Data Professionals can be up and running with unit tests within 1-2 hours.

So go visit the tSQLt website or the other great sites covering it, and in no time at all you will be writing your very own unit tests.

In my opinion tSQLt hits that sweet spot of being easy to use yet comprehensive enough to make a difference.


SQL Server services will not start automatically after server reboots

This blog post is about a situation where, after Windows patching, the SQL Server and SQL Server Agent services would not start automatically – but would start manually.

At a client site they had recently patched the servers – things went swimmingly – except the SQL Server services would not start automatically.

Whilst automatic – the services would not start

Manually starting the services by right clicking | choosing start worked….

It is SQL Server 2017 with CU12 applied.

We tried a few things:

Making the services Automatic (Delayed)

Increasing the service pipeline timeout from 30 seconds to 60 seconds

But nothing worked.

Looking at the event logs I (eventually) found this:

Event ID 7000:

The MSSQLSERVER service failed to start due to the following error:
The account name is invalid or does not exist, or the password is invalid for the account name specified.

Which is bizarre – as the service account had been used for months – but after each reboot the services had to be manually started. GPO and other things had been blamed but no one could actually find out why.

Just for a laugh – I decided to “rename” the service account.

From svcSQL@<domain> to <domain>\svcSQL

Rebooted and it worked.

Just to make sure it wasn’t a false positive I left the SQL Server Agent service how it was – and it did not start.

<domain>\<service account> works – but why…?

As to why this is the case – I’m not sure. I had always set up my SQL Server services with <domain>\<service account>, so I had never stumbled across this – until now.

But I don’t care too much – it means that the client can reboot their server and know SQL Server will come back…


Setting up CentOS to run docker for SQL Server

This blog post is about running SQL Server in a docker container on CentOS Linux distribution.

This was both an experiment and also for me to re-learn CentOS – years ago I had done a fair bit with CentOS – back when I was a true Operations Consultant. Up until last weekend I had been using Ubuntu – as it is really easy to use.

All SQL Server container images on docker are built on the Ubuntu image so I wondered – can I run it on a CentOS Linux server running docker?

BTW – the obvious answer is ‘yes, of course, because it is just a container’.

But I wanted to try it out and see what differences there were.

The first main difference for me so far – remembering to run sudo when I want to elevate my permissions. This is a good thing – I did many bad things in Ubuntu that I should have thought about first – that is why sudo is a good thing for people like me…!!

[Yip – you can run sudo -i but I also like to remind myself not to do bad things…]

So if you are thinking of running docker on CentOS, here is how to get going.

Install the latest version of Docker CE:

$ sudo yum install docker-ce
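Note that on a fresh CentOS box the Docker CE yum repository needs to be added before that install will find the package – a sketch based on Docker’s standard CentOS repo location:

```shell
# Add the Docker CE yum repository (standard location per Docker's docs)
sudo yum install -y yum-utils
sudo yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo
```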

If prompted to accept the GPG key, verify that the fingerprint matches

060A 61C5 1B55 8A7F 742B 77AA C52F EB6B 621E 9F35

and if so, accept it.

Initially start docker:

sudo systemctl start docker

To make it start every time the Linux server boots:

sudo systemctl enable docker

Download latest SQL server image:

sudo docker pull microsoft/mssql-server-linux:latest

Run up our container and assign a non-default port – in this case I am swapping out port 1433 for port 56669:

sudo docker run -e 'ACCEPT_EULA=Y' -e 'MSSQL_SA_PASSWORD=<YourStrong!Passw0rd>' -e 'MSSQL_PID=Developer' -p 56669:1433 --name SQLServer-Docker-DEV -d microsoft/mssql-server-linux:latest

Port mapping is specified via -p <host port>:<container port>.

Therefore in my example above -p 56669:1433 means that port 56669 on the host (CentOS VM) is mapped to port 1433 in the container. I am not a fan of using the default port 1433 – mainly as I like to stipulate different ports for different SQL Server instances (QA vs UAT vs PROD). This means that all connection strings I create from outside the container will need to include port 56669. How do I remember this number? That’s where configuration management comes in – another blog post I think…
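In other words, a client connection string targets the host side of the mapping – a tiny sketch, with a hypothetical host name:

```shell
# The client connects to <host>,<host port> - never to 1433 inside the container
host="mycentosvm.example.com"   # hypothetical host name
host_port=56669                 # the left-hand side of -p 56669:1433
echo "Server=${host},${host_port}"   # prints: Server=mycentosvm.example.com,56669
```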

I also want to change the password to something more secure (and less published on the internet) and this allows me to also show you how to access the SQL tools which are built into the image:

How to use SQL Server tools in the container

sudo docker exec -it SQLServer-Docker-DEV "bash"

Once inside the container, connect locally with sqlcmd. Note that sqlcmd is not in the path by default, so you have to specify the full path.

/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P '<YourStrong!Passw0rd>'

How to Change SA password in container

Once we have connected with sqlcmd we can then use our native T-SQL to change the password:

Using sqlcmd to change our sa password and then testing it out
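For reference, the change itself is a single ALTER LOGIN – a sketch, where the new password shown is a placeholder you should replace with your own:

```shell
# Sketch: change the sa password from inside the container using sqlcmd
/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P '<YourStrong!Passw0rd>' \
  -Q "ALTER LOGIN [sa] WITH PASSWORD = '<ANewStrongerPassw0rd>';"
```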

Now we can go to the Azure portal and allow access to this port:

Allowing network access to the SQL Server port from certain IP addresses in Azure portal

My preference is to tie it right down to the hosts that I know will access the SQL Server instance running in this Ubuntu container on this CentOS VM hosted in Azure.

Kinda reminds me of the dream layers of the movie ‘Inception’.

In the next blog post – let’s connect to this VM using SQL Server Management Studio on our PC.