Use Boot Diagnostics to see what your Azure VM is doing at boot time

This blog post is about how to diagnose what your Azure VM is doing while it is booting.

I have a DEMO VM hosted in Azure – it is where I have Visual Studio, the Redgate tools, and all my demo systems and presentations hosted for when I speak. That way when I go to a venue to speak I only need (at worst) internet connectivity – and I have a phone with fantastic internet if the venue doesn’t.

What I do is keep the VM up to date with Windows patches, and 2 days out from an event I make a point of making sure all outstanding patches are installed.

Hint: this might tell you where this post is headed.

So 2 days out from speaking in Spokane – DevOps & the DBA – I made sure to start up my VM and check things were good. The only complicating factor was that this was a day before I was due to give a session to TheDevOpsLab, so I thought – best to get this out of the way and practise my database unit test session that was going to be recorded.

So I went into the Azure Portal and down to Virtual Machines and clicked “start”:

start VM
Let’s start up the VM and get started

Normally this start-up process takes about 3–5 minutes.

However after 10 minutes I still could not connect. After 15 minutes I started to become worried. So I clicked on the Virtual Machine in the Azure Portal to see what was happening.

Things were happening alright:

The keen eye will note that is 2 hours’ worth of activity…

Yip – my VM was busy doing heaps of stuff for 2 hours, and the whole time I could NOT log onto it via RDP. That is when I discovered “Boot Diagnostics” in the Azure Portal for Virtual Machines. It allows us to see the console of the VM.

Simply click on your VM and choose “Boot Diagnostics”:

Boot Diagnostics
Let’s see what the VM is doing

Which gave me an insight to what my VM was spending so much time doing:

windows update
Ugh…  Windows Updates.

So I waited for 2 hours whilst my VM applied these patches (to be fair it was a very big update).

The good thing was I could monitor the progress via Boot Diagnostics.
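
If you prefer PowerShell to clicking around the portal, you can pull the same console screenshot and serial log down locally. Here is a rough sketch using the AzureRM module – the resource group, VM name and download path below are placeholders, so swap in your own:

# Sketch only - assumes you have already signed in with Login-AzureRmAccount
$resourceGroup = 'Demo-RG'   # placeholder
$vmName        = 'DemoVM'    # placeholder

# Download the latest console screenshot and serial log captured by Boot Diagnostics
Get-AzureRmVMBootDiagnosticsData -ResourceGroupName $resourceGroup `
                                 -Name $vmName `
                                 -Windows `
                                 -LocalPath 'C:\Temp\BootDiagnostics'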

BTW – in normal running this is the console of my VM:

Normal Console
You should consider visiting New Zealand

That is the view out my front gate. If you have never considered coming to New Zealand – hopefully the console of my VM hosted in Azure will help you decide.
Or consider submitting to SQL Saturday South Island:

http://www.sqlsaturday.com/712/EventHome.aspx

We’re still open until 26th April 2018 (I extended it today) and, to be honest, if it’s past that date and you are an international speaker – hit me up on Twitter (@theHybridDBA) if you want to speak. We’re relaxed here in New Zealand and love people who give their time to the community.

I’ll even give up my own speaking slot.

Yip.


Authentication issues with GitHub: An error occurred while sending the request ---> System.Net.WebException: The request was aborted: Could not create SSL/TLS secure channel.

This blog post is about an error that you may receive if you’re using git commands in PowerShell and authenticating against GitHub.

My GitHub account uses two-factor authentication, so I thought it might be that – however GitHub Desktop worked fine.

I was cloning a new project for a client:

git clone https://github.com/Client/Documentation.git

and I got a prompt for my username/password, but when I entered both I got an error. I knew my username and password were OK, and I knew 2FA was working too.

The error (taken from event logs) was:

System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> System.Net.WebException: The request was aborted: Could not create SSL/TLS secure channel.

So I looked at my Git for Windows version:

PS C:\Yip> git version
git version 2.14.2.windows.1

So I decided to upgrade – as I had been meaning to do this for a while.

So I downloaded the latest version for Windows, 2.16.2.
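
If you want a quick way to check other machines before touching anything, something like this works in PowerShell. The 2.16 cut-off is purely my assumption based on what happened here (2.14.2 failed, 2.16.2 worked) around the time GitHub dropped the older TLS versions – treat it as a rough guide rather than gospel:

# Rough sketch - the 2.16.0 threshold is an assumption, not an official cut-off
$raw     = (git version) -replace 'git version ', ''      # e.g. "2.14.2.windows.1"
$version = [version]($raw -replace '\.windows.*$', '')    # e.g. 2.14.2

if ($version -lt [version]'2.16.0') {
    Write-Warning "Git $raw may fail to negotiate TLS 1.2 with GitHub - consider upgrading."
}
else {
    Write-Output "Git $raw looks recent enough."
}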

I then ran the same command – which prompted me for my GitHub username/password. I entered them, was then asked for my two-factor code, put it in and hooray!! – it worked.

I successfully cloned the repo and can now do work for a client who has stuff stored in GitHub.

Yip.

Redgate SQL Test utility isn’t showing up in SSMS

I had recently installed the SQL Toolbelt from Redgate at a client site on a laptop they’d supplied me.

(Fantastic product that SQL Toolbelt BTW.)

Things were going swimmingly – in fact, thanks to SQL Compare, SQL Source Control and Team Foundation Server, I had implemented an automated Continuous Delivery pipeline for the client’s databases and applications.

The next thing I wanted to do was start implementing unit testing for both the applications and the database. DEV were going to do the application side (this client is a start-up, so it is understandable they had little to no unit tests) and I’d do the database side.

Except in SSMS I couldn’t find SQL Test…??

I knew to click on the icons to drop down other utilities, but it wasn’t there either:

Redgate SSMS Where is SQL TEST
Not here either…. what have I done wrong??

So as a Windows user I naturally looked in the Windows Apps list:

Redgate No SQLTEST in apps
Hmmm…. nothing here either

At this point I decided it had to be my machine – I did a Google search and looked on forums, and no one seemed to have experienced this.

So I uninstalled everything – SSMS and the Toolbelt.

Reinstalled everything.

And got this:

It was while clicking around like a madman that I found this:

Phew.

And of course now I can do this:

Redgate Now in my happy place
Let’s start unit testing!!


So if you have recently installed Redgate SQL Toolbelt and can’t find SQL Test – hopefully this blog post will help you.

By the way, I do think there was something wrong with the laptop the client gave me, as now when I right-click I get the ability to run tests:

Redgate SQL Test context menu
This was definitely not there before the uninstall/reinstall fiasco

So now I can start my unit tests – the good news is the DEV team have started theirs and are really getting behind it. I think they’ve got on with it to stop me talking about unit tests…!!

We now have 4 minutes of unit tests per checked-in build, but that is definitely something I’ll respond to with:

Yip.

Automation and cleaner code/data are the key to future success


This month’s T-SQL Tuesday #100 is hosted by Adam Machanic (B | T), who started T-SQL Tuesday 8 years ago and has invited the community to look forward 8 years to what will be T-SQL Tuesday #200…


Before I do that though – I want to think about what I was doing 8 years ago. At the time I was working with object-oriented databases, and the company I worked for had just implemented extensive unit testing across our technology stack. I say extensive because we had both the database and application code going through unit tests, integration tests and regression tests – most of which were automated.

It was around this time that I was starting to do more things in SQL Server, namely SQL Server 2005. Funny how 7 years later I was still doing some things in SQL Server 2005 – but that is for another blog post…

It was when I came across to the SQL Server world that I realised 3 things were different:

  1. Database build/code definitions were not in source control
  2. Unit testing was almost non-existent
  3. Automated testing or deployment was also not a common thing

Even now in 2018 I find that when I’m speaking at conferences and ask these questions:

  1. Who here uses source control? – 60% of hands go up
  2. Who here puts their database into source control? – 50% of those 60% of hands go down…
  3. Who here does automated testing across their databases? – another 50% of those hands go down

I generally get ~30 people in my sessions, so if at best 5 people out of 30 are doing this – we need to change that.

Yet…

For years people have been discussing getting our database code into source control, and unit testing has been discussed at length over the past 8 years.

So what does this mean for the next 8 years?

I’m on a personal crusade to bring automated testing and DevOps philosophies to your database…

In fact I am hoping that DevOps doesn’t exist in 8 years’ time – that we as an industry will have matured and simply call it “common sense”.

Let’s talk about automation – whilst people might not see the value in testing our most precious resource (data), they will hopefully see that the more we can automate, the better our jobs will be.

“But won’t it kill the DBA off?” I hear some of you ask.

No.

It won’t.

In 2026 the role of the DBA will have morphed; in fact all Data Professionals will have an appreciation and understanding of how to automate the delivery of value to the end user. I choose my words carefully – too often we as technical people think about 1s and 0s, but in fact it is how we deliver value to our clients that dictates our success. In terms of customer experience, we need to ensure we are delivering higher value than our competitors.

Many people are worried that automation will put them out of a job. This won’t happen – in fact there will never be a shortage of work to do in a successful company. Rather, people will be freed up from mindless drudge-work to focus on higher-value activities.

This also has the benefit of improving quality, since we humans are at our most error-prone when performing mindless tasks.

Utilising the features in SQL Server Data Tools (SSDT) version 2026 – which does both state-based AND migration-based deployments to databases automatically – means that we as DBAs can start focusing on activities that bring value to our clients. Databases have been touted as self-tuning for decades, but I simply don’t see it happening; we need DBAs that can tune queries and, more importantly, understand how automation can make their lives better.

Data Science is big time in 2026 – the past 8 years have seen a massive jump in its usage. This is where Data Professionals have a massive part to play – cleaning up the underlying data – who knows, maybe through automated cleansing jobs that tune themselves… automatically.

See where I’m going with this? The growth of data we will see over the next 8 years means we need to be more vigilant in the ways we interact with it. Using automation and more testing means that data scientists can focus on what they’re good at – doing stuff with data – rather than spending (up to) 60% of their time trying to cleanse it.

I am not scared of automation putting me out of a job; I’m more scared that over the next 8 years people won’t embrace it enough to help with the delivery of value to the end user.

Yip.


Doing DevOps for your Database? You need to start here…

I’ve been doing that thing called DevOps for about 17 years – you know, before it had a name…

In fact it was when I first joined Jade Software that I realised Ops and DEV had a common interest in making systems go. In 2002 I started working with our toolset team, designing our in-house toolsets to deploy JADE databases/systems automatically and reliably. Little did I know I would be repeating that methodology with .NET systems in 2010, and then again when deploying SQL Server code in 2014…

The major breakthrough for the JADE applications and database was unit testing.

When we started developing in .NET (over a JADE database at the time) it made sense to do unit tests – in fact I don’t know of many good .NET developers who don’t do unit testing.

It was when I started speaking to audiences that I realised that not many people do unit tests for their databases. Why??

Well it appears I was talking to the wrong crowd…. sort of.

Code-first developers who design databases using Entity Framework are used to writing unit tests, so most of the time they are already testing code that will change the underlying database.

So that led me to ask the question: “Are DBAs doing Unit tests?”

No.

They are not.

Firstly – what is a unit test?

The primary goal of unit testing is to take the smallest piece of testable software in the application, isolate it from the remainder of the code, and determine whether it behaves exactly as you expect.

This testing is done as part of the development process, and a unit test will check that the code being tested meets a specification, while a library of unit tests together will check the functions expected of the application. By automating the process, the library of unit tests can be run frequently and repeatedly, which allows us to find bugs earlier, and in smaller units of code, which are far easier to debug.

Unit tests should be self-contained enough that you are isolated in what you are testing, so that you know whether you have a correctly performing test. Too often we think of databases as massive stores of data and generally only test for performance. However, with unit tests we are looking at far smaller sets of data; we want tests that are fast, and we are testing the functionality/quality of the code rather than its execution speed.

The upshot is that once we embrace unit testing we can start to utilise regression testing, which allows us to refactor our database code just as easily as developers do for application code – if not more confidently.

So if hiring a .NET developer who doesn’t do unit tests is unthinkable – why would we accept this as the norm for the people who are writing the stored procedures etc that touch/influence and ultimately own our most precious resource – our data…?

Because it is too hard?

I find tuning indexes and queries hard – writing a unit test to prove that my code won’t kill PROD seems way easier. Also, if I find a bug in my code when I’m writing it at 3pm, it’s way easier to fix then than at 3am when an online banking system has crashed/burned or is now corrupt… I’m sorry, but saying unit testing is too hard is a cop-out.

Because it is too slow?

Refer to the example above – it is way easier to write a little unit test and prove that my change is going to work when it will only trash my DEV instance. Fixing it then is far quicker than when 1,000s of users are affected – because there are fewer people calling my phone/emailing me when I fix it in my DEV instance.

Because it is too new?

Not at all – SQL Server Data Tools (SSDT) has provided database developers the ability to do unit testing since 2012. https://blogs.msdn.microsoft.com/ssdt/2012/12/07/getting-started-with-sql-server-database-unit-testing-in-ssdt/

In fact there is an old article at SQLServerCentral from 2009!!  http://www.sqlservercentral.com/articles/Editorial/68586/

Because it involves learning a new language?

You don’t have to – tSQLt and SQL Test by Redgate both allow unit tests to be written in T-SQL, which most DBAs thrive on. Even SSDT allows you to write unit tests in T-SQL.

I have used SSDT a fair bit, so if you are a database developer then I highly recommend you read “Creating and Running a SQL Server Unit Test”   https://msdn.microsoft.com/en-us/library/jj851212(v=vs.103).aspx

If you are a DBA then I highly recommend you look at tSQLt.org – the official website has lots of useful information. There is a slight learning curve, but after 15 minutes of reading and trying it out it is very simple to use. It allows you to isolate your testing to a particular schema and makes use of fake tables – the equivalent of mocking – which allow us to take a copy of a table as it is and test against it.
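
To give you a feel for it, here is a minimal sketch of a tSQLt test, run from PowerShell with Invoke-Sqlcmd (from the SqlServer module). The database, table and function names (MyDevDatabase, dbo.OrderLine, dbo.GetOrderTotal) are made up for illustration – swap in your own objects:

# Sketch only - MyDevDatabase, dbo.OrderLine and dbo.GetOrderTotal are hypothetical
$createAndRunTests = @'
EXEC tSQLt.NewTestClass 'OrderTests';
GO
CREATE PROCEDURE OrderTests.[test GetOrderTotal sums the order lines]
AS
BEGIN
    -- FakeTable swaps in an empty copy of the table so the test is isolated
    -- from whatever data happens to be sitting in DEV
    EXEC tSQLt.FakeTable 'dbo.OrderLine';

    INSERT INTO dbo.OrderLine (OrderId, LineTotal) VALUES (1, 10.00), (1, 15.50);

    DECLARE @expected MONEY = 25.50;
    DECLARE @actual   MONEY = dbo.GetOrderTotal(1);

    EXEC tSQLt.AssertEquals @Expected = @expected, @Actual = @actual;
END;
GO
EXEC tSQLt.Run 'OrderTests';
'@

Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'MyDevDatabase' -Query $createAndRunTests

If the function returns the wrong total, the failure message shows the expected and actual values – exactly the fast feedback we are after.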

There is a great Pluralsight course here:  http://pluralsight.com/training/Courses/TableOfContents/unit-testing-t-sql-tsqlt

David Green is the author of the above Pluralsight course and has written a fair bit about tSQLt – http://d-a-green.blogspot.co.uk/search/label/tSQLt

Greg Lucas has also written a lot about tSQLt http://datacentricity.net/tag/tsqlt/

His article on http://datacentricity.net/2011/12/advanced-database-unit-testing-with-tsqlt-testing-cross-database-tables  is particularly helpful.

Of course there are some other great utilities and I mentioned one earlier:

https://www.red-gate.com/products/sql-development/sql-test/index

SQL Test by Redgate is an absolutely awesome tool for DBAs – mainly because it plugs straight into SQL Server Management Studio (SSMS).

It uses the tSQLt framework and incorporates SQLCop checks, which help you enforce best practices for database development and run static analysis tests.

The best part is that if you are on a DevOps for databases journey, you can fold those tests into your Continuous Integration process to really bring up the quality of your database code. You need to start looking at unit tests – today.
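
As a sketch of what that CI step can look like (the server and database names are placeholders, and I’m assuming your build agent has the SqlServer PowerShell module installed):

# CI sketch - 'BuildSQL01' and 'CI_Database' are placeholders for your build environment.
# tSQLt.RunAll raises an error if any test fails, which fails this build step with it;
# -Verbose surfaces tSQLt's printed summary in the build log.
Invoke-Sqlcmd -ServerInstance 'BuildSQL01' `
              -Database 'CI_Database' `
              -Query 'EXEC tSQLt.RunAll;' `
              -ErrorAction Stop -Verbose

tSQLt can also output the results as XML (via tSQLt.XmlResultFormatter) if you want your build server to display individual test results.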

Yip.


The Data Platform has expanded – so too should our approach to using it…

In recent years we as data professionals have moved from dealing with SQL Server databases – with SQL Server Reporting Services and SQL Server Analysis Services interacting with them, all on-premises – to a wide-ranging data platform.

In fact even the names of most SQL Server things (like my MVP award) have morphed into “Data Platform”.

The name allows for new technologies and processes to be folded into the ecosystem. The radical changes brought about by the Azure platform have recently been matched by the breadth of technological choice in how you interact with, manage and understand your data.

Let’s look at some key areas of what Microsoft have to offer on the Data Platform:

Database products

SQL Server 2017 – Lets you bring the industry-leading performance and security of SQL Server to the platform of your choice—use it on Windows, Linux, and Docker containers.

SQL Database – Built for developers, SQL Database is a relational database management system with enterprise-class availability, scalability, and security, and built-in intelligence capable of learning app patterns, that can be accessed from anywhere in the world.

Azure Database for MySQL – Quickly stand up a MySQL database and scale on the fly with this fully managed database service for app development and deployment that includes high-availability, security, and recovery at no extra cost.

Azure Database for PostgreSQL – Stand up a PostgreSQL database in minutes and scale on the fly—this fully managed database service for app development and deployment also gives you high-availability, security, and recovery at no extra cost.

SQL Data Warehouse – Scale compute and storage independently with this SQL-based, fully managed, petabyte-scale cloud data warehouse that’s highly elastic and enables you to set up in minutes and scale capacity in seconds.

Azure Cosmos DB – With a guarantee of single-digit-millisecond latencies at the 99th percentile anywhere in the world, this multimodel database service offers turnkey global distribution across any number of Azure regions by transparently scaling and replicating your data to wherever your users are.

Data and analytics products

SQL Server 2017 – With up to 1 million predictions per second using built-in Python and R integration, SQL Server 2017 delivers real-time intelligence as it brings the industry-leading performance and security of SQL Server to the platform of your choice.

HDInsight – A fully managed cloud Spark and Hadoop service, HDInsight provides open source analytic clusters for Spark, Hive, MapReduce, HBase, Storm, Kafka, and Microsoft R Server, backed by a 99.9% SLA.

Machine Learning – Easily build, deploy, and manage predictive analytics solutions with this fully managed cloud service and deploy your model into production as a web service in minutes that can be called from any device, anywhere.

Stream Analytics – Develop and run massively parallel real-time analytics on multiple streams of data with this analytics service that helps uncover real-time insights from devices, sensors, infrastructure, and applications.

Azure Bot Service – Accelerate bot development with this intelligent, serverless bot service that scales on demand, requires no server management or patching, and provides built-in templates.

Data Lake Analytics – Develop and run massively parallel data transformation and processing programs in U-SQL, R, Python, and Microsoft .NET over petabytes of data with this on-demand service that provides a simple, scalable way to analyze big data—in seconds.

Data Lake Store – Built to the open HDFS standard, this is a no-limits cloud data lake for your enterprise’s unstructured, semi-structured, and structured data that’s massively scalable and secured, and allows you to run massively parallel analytics.

Data Catalog – Spend less time looking for data and more time getting value from it with this fully managed cloud service that lets you register, enrich, discover, understand, and consume your enterprise data sources.

The current state of the Data Platform is exciting, innovative and vast.  For years my aim was to understand how best I could tune, manage and deploy on SQL Server. The good news is that with “recent” improvements to the SQL Server engine:

https://msdn.microsoft.com/en-us/library/aa226166(v=sql.70).aspx

we can now all focus on other aspects of the Data Platform…. (sorry but I had to put that in there).

With recent enhancements to the SQL Server engine and the maturity of running databases in Azure – it does mean our roles as data professionals are evolving.

Hard core DBAs are now finding themselves talking to Data Scientists on what is required for a stable, reliable, clean, tested, backed-up and secure data processing strategy.

The ability to deploy to the cloud calls for secure and efficient processes around those deployments and nowadays DBAs are also finding themselves involved in conversations around getting database code into source control, code being tested as part of continuous integration and changes deployed via continuous delivery processes.

Or, god forbid, being part of something called agile…!!

The data platform has expanded and grown, our approach in how we manage and deploy to it needs to grow as well.

The good thing is that Microsoft have put a massive amount of effort into https://docs.microsoft.com – I used to despair at MSDN and TechNet documentation, but I am loving, and inspired by, the quality of articles being put out on https://docs.microsoft.com

These days, if I’m working with a new feature or need to diagnose something, being able to quickly pull up these docs has been fantastic in helping me cope with the new world of an expansive data platform.

Yip.

If your Surface Book loses its keyboard and trackpad – try removing the KB4074588 update

My Surface Book (gen 1) recently did updates and then a day or so later the keyboard and trackpad mysteriously stopped working. Then I heard of 3 other people who experienced the same issue within 24 hours.

I could use the Surface Book in tablet mode – but I tired of that pretty quickly.

The weird thing was I could hit the Fn key and it would light up – but I could not use the keyboard at all.

Some of the people affected by this issue did system restores which worked. I tried many system restores – which gave me my keyboard and trackpad back BUT the Surface Book repeatedly bluescreened….

Yuck.

Surface Support was contacted – I’d like to say they were helpful, but I wouldn’t be writing this blog post if they had completely solved my issue. They sent me here:

https://www.microsoft.com/en-au/software-download/windows10

Which did nothing.

Awesome.

BTW if people in support think it’s acceptable to leave a person waiting 60 hours before re-contacting them — maybe support isn’t the job for you.

(I used to be Second Level Support Manager so I know how vital it is to update/engage with people who are experiencing issues with software….)

I stumbled across the fact that Windows update KB4074588 had been installed as part of those updates – so I removed it. My next move was going to be a factory reset anyway, so I had nothing to lose.
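
If you want to do the same from an elevated PowerShell prompt rather than hunting through Settings, something like this should do it (the KB number is the one from my case – check it is actually installed first):

# Check whether the offending update is present
Get-HotFix -Id KB4074588

# Remove it with the Windows Update Standalone Installer, then reboot when ready
wusa.exe /uninstall /kb:4074588 /norestart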

Restarted and boom – I have a keyboard and trackpad again, and 3 minutes later I wrote this to hopefully help others.

I emailed Surface Support to let them know too.

Yip.