T-SQL Tuesday #90 — Bringing Continuous Delivery to a ‘brownfield’ database system


This blog post is part of T-SQL Tuesday #90 – Shipping Database Changes.

I have decided to write about a client's SQL Server-based system that was having issues with the deployment of code changes. The code changes were being deployed by a third party contractor.

The client engaged the company I work for to see if we could help out, as we were already deploying to their other systems with zero issues in PROD.

The issues they were experiencing with the other contractor were:

  1. error-prone manual deploys to PROD
  2. no UAT or QA systems, so effectively deploying straight from DEV to PROD
  3. little testing, and what testing existed was manual and laborious
  4. no source control of database changes
  5. no central repository (or consistency) of configuration settings
  6. deploy outages that were 4 hours long

So rather than a greenfield project where we could do anything and everything with Continuous Integration and Continuous Delivery, we were going to backfill these practices against a running PROD system – carefully.

This is what is termed 'brownfield' – a running legacy system…

We worked with the 3rd party contractor, as they too were having issues – their DBAs were working from 3am until 7am and it wasn't a great time for anyone.

Note: we were not replacing the 3rd party contractor – we were helping everyone involved. The particular client was very important to us and we were happy to help them out across their other systems we did not actively manage (yet).

The first step was to introduce Source Control and Continuous Integration, so all code being developed was pushed up to Source Control. For ease of use, and to prove that this was a good thing, we started with the application code.

Builds were kicked off in TeamCity and a consistent artifact was created; we also introduced automated testing.

We then retrofitted consistent environments to the application lifecycle – so how did we do the database side?

We used the Data-tier Application (DAC) model – an entity that contains all of the database and instance objects used by an application. We generated a DACPAC from the PROD system (just the objects and schema definitions – NOT the data), stored it in Source Control and used sqlpackage to create five databases that were all standard (compared to PROD).

They were:

CI, QA, Integration, UAT, prePROD

CI – was there to test every build generated against.

QA — was there for nominated QA engineers to test functionality against

Integration – was there for Integration testing with certain backend/frontend systems

UAT – was there for User Acceptance Testing

prePROD – was there for a true copy of PROD

prePROD was a special case as this was a full replica of PROD but with scrambled data.
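In practice that came down to two sqlpackage.exe operations – extract a schema-only DACPAC from PROD, commit it to Source Control, then publish it to stand up each standard environment. Here is a minimal sketch (server names and paths are illustrative, not the client's):

# Extract the schema-only DACPAC from PROD (no data is included by default)
& sqlpackage.exe /Action:Extract `
    /SourceServerName:"PRODSQL01" `
    /SourceDatabaseName:"ClientDB" `
    /TargetFile:".\Database\ClientDB.dacpac"

# Publish the same DACPAC to create a standard copy – e.g. the CI database
& sqlpackage.exe /Action:Publish `
    /SourceFile:".\Database\ClientDB.dacpac" `
    /TargetServerName:"CISQL01" `
    /TargetDatabaseName:"ClientDB_CI"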

The application tier was standardised according to my OCD-like naming scheme – see my post here about why this was a good thing.

By making the 3rd party developers push all application & database code changes to source control, we then had a versioned state of what changes would be made.

This wasn't a "hit the GO button" scenario – we had to try things out manually and carefully, and there were things that went bump. BUT they went bump in CI and Integration and even prePROD – and by following Continuous Delivery principles and applying them to the database, we have never, in the past two years, gone bump in PROD.

Now we couldn't do all the things we wanted – this was not a greenfield project. So here is what we compromised on:

We had to use Invoke-Sqlcmd to deploy the changes against the databases.

SQLCMD Script

This was OK, because we were using a variablised script (which the 3rd party contractor had never done before, so we showed them how) and it was deployed consistently across all databases – any issues were caught early on.
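As a rough sketch of what one of those variablised deployments looked like (the server, database and variable names here are illustrative, not the client's):

# Deploy the change script, passing environment-specific values in as SQLCMD variables
$sqlcmdVariables = @(
    "Environment=UAT",
    "AppLogin=ClientApp_UAT"
)

Invoke-Sqlcmd -ServerInstance "UATSQL01" `
    -Database "ClientDB_UAT" `
    -InputFile ".\DBScript\DeployChanges.sql" `
    -Variable $sqlcmdVariables `
    -QueryTimeout 600

The deploy step only differed between environments by the variable values fed into it – the script itself never changed.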

Also our deployment process step:

process step

was standard across all environments in our database/application deployment lifecycle.

Here is what the standardised packages from the Continuous Integration build looked like:

Packages

The best part was that ALL database changes in the DBScript package were in the same version as the associated application changes. We also included any configuration changes in the repo – you know, Infrastructure as Code type stuff.

So let’s measure how the changes are against the initial issues:

  1. error-prone manual deploys to PROD
    The deploys were automated and any errors were caught early.
  2. no UAT or QA systems, so effectively deploying straight from DEV to PROD
    We had PROD-like environments that could give us feedback immediately and protect PROD from out-of-process changes.
  3. little testing, and what testing existed was manual and laborious
    We introduced automated tests in the Continuous Integration build steps and automated the deploy.
  4. no source control of database changes
    All database changes were pushed up to source control and were versioned.
  5. no central repository (or consistency) of configuration settings
    All configuration settings were stored in a central repository and deployed out as part of the Continuous Delivery processes. Everything was a variable and consistent across the whole application/database lifecycle.
  6. deploy outages that were 4 hours long
    Deploys were now around 5 minutes, as everything was consistent and proven, and we never rolled back once. We could have if we wanted to – but didn't need to.

The upshot of using Continuous Delivery for their ‘brownfield’ system was that database & application changes were consistent, reliable and automated.

The client was very happy, the 3rd party contractor was very happy too (as their time-in-lieu and angst were reduced) and the client approached us to manage more of their systems.

For more things like this – go read the other posts about this month's T-SQL Tuesday.

Yip.

 


Removing an App Service Plan in Azure

In my previous post I adhered to my O-OCD (Operational OCD) and standardised my App Service Plan name to fit in line with my Database, User and App Service naming standard.

Why do I standardise everything? Have a quick read here:

The 20 second rule (or why standards matter).

Here is what we have:

AppService_removal

Now that I have standardised, I want to get rid of the historically (and nonsensically) named "CientPortalAppServicePlan".

Now I could just click on it in the portal and hit “Delete”:

AppService_removal_2

But where is the scripting fun in that?

Also I may one day need to do this for MANY App Service Plans, so let’s do this via PowerShell:

Login-AzureRmAccount

$resourcegroupname = "AzureDEMO_RG"
$location = "Australia East"
$AppServicePlanName = "CientPortalAppServicePlan"

Remove-AzureRmAppServicePlan -Name $AppServicePlanName `
    -ResourceGroupName $resourcegroupname

AppServicePlan_removal

Answer "Yes" to the confirmation prompt and it is gone – I can now relax.
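And if I ever do need to clean up many plans at once, a pipeline will do it – a sketch only, where the name filter is hypothetical and each removal will still ask for confirmation:

# Remove every plan in the resource group that doesn't follow the naming standard
Get-AzureRmAppServicePlan -ResourceGroupName $resourcegroupname |
    Where-Object { $_.Name -notlike "AzureDEMO_*" } |
    ForEach-Object {
        Remove-AzureRmAppServicePlan -Name $_.Name -ResourceGroupName $resourcegroupname
    }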

This means that I now have a couple of Azure App Services and Azure SQL Databases – all with standardised names – which means I can do some awesome Continuous Integration and Continuous Delivery/Deployment stuff.

Yip.

Creating an App Service in Azure that connects to an Azure SQL Database.

Using the methodology listed in my previous blog post on creating an Azure SQL Database, we now have a Continuous Integration database (named CIAzureWebAppDEMO_DB).

Now that we have a database, it's time to create the web app. This will be an App Service in the Azure portal.

For more background information have a look at Microsoft’s documents on this area:

https://docs.microsoft.com/en-us/azure/app-service-web/app-service-web-app-azure-resource-manager-powershell
  1. We'll create an App Service Plan first (mainly as my old one's name was … old):
App_Service_reason
This App Service Plan has a historical (and nonsensical) name, so it has to go…

So using PowerShell:

$resourcegroupname = "AzureDEMO_RG"
$location = "Australia East"
$AppServicePlanName = "AzureDEMO_AppServicePlan"

New-AzureRmAppServicePlan -Name $AppServicePlanName `
    -Location $location `
    -ResourceGroupName $resourcegroupname `
    -Tier Free

Next we’ll create our App Service:

$resourcegroupname = "AzureDEMO_RG"
$location = "Australia East"
$AppServiceName = "CIAzureWebAppDEMO"
$AppServicePlanName = "AzureDEMO_AppServicePlan"

New-AzureRmWebApp -Name $AppServiceName `
    -AppServicePlan $AppServicePlanName `
    -ResourceGroupName $resourcegroupname `
    -Location $location

We can now configure the App Service's application settings using PowerShell or through the portal. All Azure Web Apps need configuration values, and database-backed applications also need their database connection string values configured.

Because this App Service connects to an Azure SQL Database, we need to assign it a connection string.

Portal

You can use the Azure portal to configure your newly created App Service:

Configure_AppService_DB_Cxn

  1. Open the Azure Management Portal via https://portal.azure.com
  2. Navigate to the Web App within the portal.
  3. Under “All settings” open up the “Application settings” pane
  4. Scroll down to the “Connection strings” section
  5. Configure as necessary.

PowerShell:

I prefer to use PowerShell as I can feed this into my Continuous Delivery pipeline as I build applications/databases on the fly.

To set the connection string we will use a PowerShell hash table:

$resourcegroupname = "AzureDEMO_RG"
$location = "Australia East"
$AppServiceName = "CIAzureWebAppDEMO"
$AppServicePlanName = "AzureDEMO_AppServicePlan"

# Create a hash table variable for the Connection Strings
$hash = @{}

# Add a Connection String to the hash table, using another hash table for the Connection String details
$hash["defaultConnection"] = @{ Type = "SqlAzure"; Value = "Data Source=tcp:webdbsrv69.database.windows.net,1433;Initial Catalog=CIAzureWebAppDEMO_DB;User Id=CIAzureWebAppDEMO@webdbsrv69.database.windows.net;Password=<redacted>;" }

# Save the Connection String to the Azure Web App
Set-AzureRmWebApp -ResourceGroupName $resourcegroupname -Name $AppServiceName -ConnectionStrings $hash
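A quick way to confirm the setting landed (a sketch) is to read the Web App's configuration back:

# Read the Web App back and inspect its connection strings
$webapp = Get-AzureRmWebApp -ResourceGroupName $resourcegroupname -Name $AppServiceName
$webapp.SiteConfig.ConnectionStrings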

So there we have it – an App Service created in a specific App Service Plan that can now connect to an Azure SQL Database. Using this methodology I could have configured anything/everything in the application settings.

Now let’s get rid of that badly named App Service Plan in my next post.

Yip.

Removing an Azure SQL Database using PowerShell

In this post we will remove some databases located in Azure.

This relates to my last post, where I am cleaning up some badly named databases and replacing them with standardised database names.

So we have this situation:

database_create_1

We want to get rid of the top two databases AzureWebAppFunctionalTestDB and AzureWebAppQADB.

So we’ll check we’re logged in:

PS > Get-AzureAccount

If not – then just log in:

PS > Login-AzureRmAccount

It is always a good idea to list the databases on your server:

PS > Get-AzureSqlDatabase -ServerName $servername

And now we'll simply remove the databases:

# Some variables for our resources (the two badly named databases we want to remove)
$resourcegroupname = "AzureDEMO_RG"
$databasename1 = "AzureWebAppQADB"
$databasename2 = "AzureWebAppFunctionalTestDB"

Remove-AzureSqlDatabase -ServerName $servername -DatabaseName $databasename1

Remove-AzureSqlDatabase -ServerName $servername -DatabaseName $databasename2

You will then be asked if you really want to remove the database:

database_remove

Answer appropriately and voilà – we have removed our Azure SQL Databases via PowerShell.
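To double-check they really are gone, list the databases on the server again – the two old names should no longer appear:

Get-AzureSqlDatabase -ServerName $servername | Select-Object Name, Edition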

 

Yip.

Creating an Azure SQL Database via PowerShell

This post is about using the brilliance of PowerShell to script the creation of databases in Azure.

Background:

Apart from the obvious answer of "why not?", the actual reason is one of standards – and how I did not adhere to my own standards that I've been preaching for the past 17 years.

For background read my post on standards:

The 20 second rule (or why standards matter).

And you will see how badly I strayed…

Exhibit A:

The WHY of this presentation 3

Yes… my application name and database name do not match my standard of "everything should be able to be tracked from web to database via a name".

So of course, being a guy who does Data Platform stuff, we'll create the database first, then the App Service, then we'll adjust our release in VSTS. These will of course be split over two or more blog posts.

Quick discussion on Tiers in Azure:

There are quite a few different pricing tiers available for a Microsoft Azure SQL Database.

This allows us to select the capacity that is relevant for our application without paying for more than we actually need. Switching between tiers is also very easy: if we have short periods of high performance demand, we can scale the Azure SQL Database up to meet them, and later, once the demand diminishes, scale it back down to a lower pricing tier, thus saving us costs.

For this database I’m going to use the Basic tier.

PowerShell Scripts:

So in PowerShell we'll log into our Azure subscription:

PS > Login-AzureRmAccount

Which will ask us to log in.

We then see our subscription details:

Environment : AzureCloud
Account : moosh69
TenantId : [some hexadecimal numbers here]
SubscriptionId : [some more hexadecimal numbers here]
SubscriptionName : [Name of your subscription]
CurrentStorageAccount :

OK, for fun let’s see what databases we have using PowerShell.

Firstly we’ll find out our database server:

PS > Get-AzureSqlDatabaseServer

Which will list our database server and we’ll create a variable $servername using the output of the above.
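For example (assuming there is only one database server in the subscription – a sketch):

# Grab the (single) server name into a variable for the commands that follow
$servername = (Get-AzureSqlDatabaseServer | Select-Object -First 1).ServerName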

We now run this command to list the databases in that server:

Get-AzureSqlDatabase -ServerName $servername

Name : AzureWebAppQADB
CollationName : SQL_Latin1_General_CP1_CI_AS
Edition : Free
MaxSizeGB : 0
MaxSizeBytes : 33554432
ServiceObjectiveName : Free
ServiceObjectiveAssignmentStateDescription :
CreationDate : 07/01/2017 8:18:54 AM
RecoveryPeriodStartDate : 27/04/2017 8:29:07 PM

Name : AzureWebAppFunctionalTestDB
CollationName : SQL_Latin1_General_CP1_CI_AS
Edition : Basic
MaxSizeGB : 2
MaxSizeBytes : 2147483648
ServiceObjectiveName : Basic
ServiceObjectiveAssignmentStateDescription :
CreationDate : 08/01/2017 5:57:42 PM
RecoveryPeriodStartDate : 26/04/2017 8:07:56 PM

Name : master
CollationName : SQL_Latin1_General_CP1_CI_AS
Edition : System
MaxSizeGB : 30
MaxSizeBytes : 32212254720
ServiceObjectiveName : System0
ServiceObjectiveAssignmentStateDescription :
CreationDate : 02/01/2017 10:09:58 PM
RecoveryPeriodStartDate :

We’re not too interested in the MASTER database.

So we’re going to create 2 new databases (and eventually point our app services at them).

Here is the very simple code:

# Some variables for our resources
$resourcegroupname = "AzureDEMO_RG"
$databasename1 = "QAAzureWebAppDEMO_DB"
$databasename2 = "FTAzureWebAppDEMO_DB"

New-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname `
    -ServerName $servername `
    -DatabaseName $databasename1 `
    -Edition "Basic"

New-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname `
    -ServerName $servername `
    -DatabaseName $databasename2 `
    -Edition "Basic"

And just like that we've created our new databases in Azure using PowerShell in about 5 seconds.
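And if the Basic tier ever proves too small, scaling up later is just one more cmdlet call – a sketch, using the variables above and an assumed Standard S0 target:

# Scale the QA database up to Standard S0 (scaling back down works the same way)
Set-AzureRmSqlDatabase -ResourceGroupName $resourcegroupname `
    -ServerName $servername `
    -DatabaseName $databasename1 `
    -Edition "Standard" `
    -RequestedServiceObjectiveName "S0"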

Yip.

Moving Azure resources between subscriptions – especially VSTS Team Services Account

For the past 6 months I've been paying for my own Azure subscription. My work has a plan, but for some reason I (and others) who had an MSDN Subscription (Infrastructure) could not access the 'free' credits. I use Visual Studio Team Services (VSTS) in a lot of my DEMOs and thus was paying quite a bit of my own money to design/create/test my DEMOs before presenting them (which also was costing me run time $$).

Until today.

I finally got added to my work’s “Pay-As-You-Go” subscription. Which meant I had to transfer ALL my Azure resources. And I mean ALL MY AZURE resources.

So I decided to use the portal and it really was as simple as going into my Resource Groups and clicking change subscription.

CHanging Resource Groups
Then I chose the new subscription, created a new resource group to move all the resources into, and ticked that I understood that tools and scripts would need to be updated.

CHanging Resource Groups_2

It took about 3 minutes and was very painless.

At this point I’d like to state I could have probably used PowerShell but I wanted to actually see if the portal would do what I needed.

It did.

Except that when I ran up my DEMOs in VSTS, they couldn't see any of my App Services. Which wasn't surprising, as I had actually clicked "I understand that tools and scripts associated with moved resources will not work until I update them to use new resource IDs".

Duh…

So I then spent most of the afternoon trying to move my VSTS Team Services resource. I got a heap of move failures.

Then I read:

https://social.msdn.microsoft.com/Forums/vstudio/en-US/7404fed9-f9cd-4d11-acae-a7726d7dbb15/move-visual-studio-team-services-to-another-subscription?forum=TFService

Which led me to:

https://blog.kloud.com.au/2014/01/06/how-to-link-existing-visual-studio-online-with-windows-azure/

I then used https://manage.windowsazure.com to unlink my VSTS account from my old 'Pay-As-You-Go' subscription and then link it to the new one.

CHanging Resource Groups_3

All that was needed now was to check in the Azure Portal that VSTS was on the new subscription (it was) and then to edit the Service Endpoint for each project that might use it in VSTS:

CHanging Resource Groups_4

What this means is I can now start creating a heap of Azure resources (mostly Data Platform stuff because… Data Platform).

So my next post is going to be about creating a heap of Azure resources.

Yip.

Installing a Visual Studio Team Services private build agent hosted On-Premises

This blog post is about a situation where I use Visual Studio Team Services (VSTS) to build/deploy my DEMOs. Those DEMOs are what I use to illustrate Continuous Integration & Continuous Delivery (important parts of that thing called DevOps).

I use my own personal VSTS account so that I am not showing anything commercially sensitive.

One thing with using a basic account is that you only get 240 build minutes per month. These builds are utilising the hosted build agents in Azure.

This works great – up until I have 4 DEMOs that are different and require testing before I present them. In May 2017 I had 4 DEMOs to conduct:

Code Camp Christchurch on May 13th, where I was speaking twice (a VSTS and Azure DEMO, and a SQL Server Data Tools DEMO), both of which would be using VSTS.

A webinar for the DBA Virtual Chapter on May 18th, which was on "DevOps and the DBA".

SQL Saturday Brisbane on May 27th – mostly the same DEMO as the DBA VC.

Whilst the DEMOs themselves would only use 12 minutes of build time, the preparation for them would push me over the limit. I also wanted to try quite a few new things related to SSDT, DACPACs and other SQL Azure related things.

However…..

You can use your own hosted (known as 'private') build agent and connect it up to VSTS – so long as it has access and the right tools installed on it…

That is another reason to use your own hosted agent — you may have specific software required for your build(s). In my case I would be doing a lot of builds relating only to SQL Server, Azure SQL Database and associated tools.

Oh yeah — VSTS allows one free private hosted agent.

Hooray!!

So I have created a dedicated build agent just for my SQL Server activities.

Here is a step-by-step guide to installing a private hosted build agent. For authentication I will be using a Personal Access Token (PAT) for the agent to authenticate to VSTS.

  1. Log into VSTS and from your home page, open your profile. Go to your security details.
    VSTS_Agent_0
  2. Configure the PAT – for the scope select Agent Pools (read, manage) and make sure all the other boxes are cleared.
    VSTS_Agent_0_1
  3. Save the token somewhere, as you will need it when installing/configuring the agent on the on-premises server.
  4. Log on to the machine using an account that has permissions to install software and can also access VSTS.
  5. In your web browser, sign in to VSTS and navigate to the Agent pools tab: https://{your_account}.visualstudio.com/_admin/_AgentPool
    VSTS_Agent
  6. Click Download agent.
  7. On the Get agent dialog box, click Windows.
  8. Click the Download button.
  9. On the server, extract the ZIP file into a directory and run config.cmd.
    VSTS_Agent_3
  10. Fill in the details as required – you will use the PAT token from steps 2/3 above (this can also be scripted – see the sketch after this list):
    VSTS_Agent_4
  11. Go back into VSTS and you will now have a new hosted agent:
    VSTS_Agent_5
  12. For the particular project, go into Settings | Agent Queues and choose the existing pool that contains the newly installed private agent.
    VSTS_Agent_6
  13. Next we want to associate our build steps with this queue – so edit your build definition and choose the Agent Queue from above:
    VSTS_Agent_7
  14. Now the REAL fun begins – let's queue a build!
    VSTS_Agent_8
  15. And of course it works the first time…
    VSTS_Agent_9
  16. If we now look in the generated artifact we will have our desired result – a DACPAC file:
    VSTS_Agent_10
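As an aside, the configuration in step 10 can also be scripted rather than answered interactively – config.cmd supports unattended flags, roughly like this sketch (the account URL, PAT and agent name are placeholders):

# Placeholders: replace the account URL, token and agent name with your own values
.\config.cmd --unattended `
    --url "https://youraccount.visualstudio.com" `
    --auth pat `
    --token "your-personal-access-token" `
    --pool default `
    --agent "SQLBuildAgent01" `
    --runAsService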

 

We can now use VSTS to deploy that DACPAC out to our Azure SQL Database (and also to on-premises SQL Server).

Which is the basis for another blog post…..

Yip.