Use Boot Diagnostics to see what your Azure VM is doing at boot time

This blog post is about how to diagnose what your Azure VM is doing while it is booting.

I have a DEMO VM hosted in Azure – it is where I have Visual Studio, the Redgate tools, as well as all my DEMO systems and presentations hosted for when I speak. That way, when I go to a venue to speak, I only need (at worst) internet connectivity – and I have a phone with fantastic internet if the venue doesn’t.

What I do is keep the VM up to date in terms of Windows patches, and I make a point, 2 days out from an event, of making sure all outstanding patches are installed.

Hint: this might tell you where this post is headed.

So 2 days out from speaking in Spokane (DevOps & the DBA) I made sure to start up my VM to check things were good. The only complicating factor was that this was a day before I was to give a session to TheDevOpsLab, so I thought: best to get this out of the way and practice my database unit test session that was going to be recorded.

So I went into the Azure Portal and down to Virtual Machines and clicked “start”:

start VM
Let’s start up the VM and get started

Normally this start-up process would take about 3-5 minutes.

However, after 10 minutes I still could not connect. After 15 minutes I started to become worried. So I clicked on the Virtual Machine in the Azure Portal to see what usage was happening.

Things were happening alright:

The keen eye will note that is 2 hours’ worth of activity…

Yip – my VM was busy doing heaps of stuff for 2 hours and the whole time I could NOT log onto it via RDP. Which is when I discovered “Boot Diagnostics” in the Azure Portal for Virtual Machines. It allows us to see the console of the VM.

Simply click on your VM and choose “Boot Diagnostics”:

Boot Diagnostics
Let’s see what the VM is doing

Which gave me an insight to what my VM was spending so much time doing:

windows update
Ugh…  Windows Updates.

So I waited for 2 hours whilst my VM applied these patches (to be fair it was a very big update).

The good thing was I could monitor the progress via Boot Diagnostics.
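As an aside, if you prefer the command line, the same serial console output can be pulled down with the Azure CLI (the resource group and VM names below are placeholders for your own – and note that Boot Diagnostics must already be enabled on the VM for the log to exist):

```shell
# Start the VM (the command returns when the start operation completes)
az vm start --resource-group MyResourceGroup --name MyDemoVM

# Fetch the serial console log that Boot Diagnostics captures,
# which shows what the VM is doing while it boots
az vm boot-diagnostics get-boot-log --resource-group MyResourceGroup --name MyDemoVM
```

Re-running the `get-boot-log` command is a handy way to watch patching progress without the Portal.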

BTW – in normal running this is the console of my VM:

Normal Console
You should consider visiting New Zealand

Which is the view out my front gate. If you have never considered coming to New Zealand – hopefully the console of my VM hosted in Azure will help you decide.
Or consider submitting to SQL Saturday South Island:

We’re still open until 26th April 2018 (I extended it today) and, to be honest, if it’s past that date and you are an international speaker – hit me up on Twitter – @theHybridDBA – if you want to speak. We’re relaxed here in New Zealand and love people who like to give their time to the community.

I’ll even give up my own speaking slot.


Authentication issues with GitHub: An error occurred while sending the request ---> System.Net.WebException: The request was aborted: Could not create SSL/TLS secure channel.

This blog post is about an error that you may receive if you’re using git commands in PowerShell and authenticating against GitHub.

My GitHub account uses 2 Factor Authentication so I thought it might be that – however I could use GitHub Desktop fine.

I was cloning a new project for a client:

git clone

and I got a prompt for my username/password, but when I entered both I got an error. I knew my username and password were OK and I knew 2FA was working too.

The error (taken from event logs) was:

System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> System.Net.WebException: The request was aborted: Could not create SSL/TLS secure channel.

So I looked at my Git for Windows version:

PS C:\Yip> git version
git version

So I decided to upgrade – as I had been meaning to do this for a while.

So I downloaded the latest version for Windows, 2.16.2.
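For reference, the commands involved are below. The `http.sslVersion` setting is a workaround I have seen suggested for this error rather than something I needed once I upgraded – forcing TLS 1.2 helps when an older Git build tries to negotiate an older protocol that GitHub no longer accepts (GitHub dropped TLS 1.0/1.1 support in early 2018):

```shell
# Check which Git version is installed
git --version

# If upgrading isn't immediately possible, force Git to use TLS 1.2
# for HTTPS remotes such as GitHub
git config --global http.sslVersion tlsv1.2
```

Upgrading Git for Windows remains the proper fix; the config setting is just a stop-gap.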

I then ran the same command – which prompted me for my GitHub username/password. I entered them, it asked me for my 2 Factor code, I put it in, and hooray!! — it worked.

I successfully cloned the repo and can now do work for a client who has stuff stored in GitHub.


Redgate SQL Test utility isn’t showing up in SSMS

I had recently installed the SQL Toolbelt from Redgate at a client site on a laptop they’d supplied me.

(Fantastic product that SQL Toolbelt BTW.)

Things were going swimmingly – in fact, thanks to SQL Compare, SQL Source Control and Team Foundation Server, I had implemented an automated Continuous Delivery pipeline for the client’s databases and applications.

The next thing I wanted to do was start implementing unit testing for both the applications and database. DEV were going to do the application side (this client is a start-up so it made sense that they had little to no unit tests) and I’d do the database side.

Except in SSMS I couldn’t find SQL Test…??

I knew to click on the icons to bring down the other utilities, but it wasn’t there either:

Redgate SSMS Where is SQL TEST
Not here either…. what have I done wrong??

So, as a Windows user, I naturally looked in the Windows Apps list:

Redgate No SQLTEST in apps
Hmmm…. nothing here either

At this point I decided it had to be my machine, as I did a Google search and looked on forums and no one seemed to have experienced this.

So I uninstalled everything – SSMS and the Toolbelt.

Reinstalled everything.

And got this:

It was while clicking around like a madman I found this:


And of course now I can do this:

Redgate Now in my happy place
Let’s start unit testing!!


So if you have recently installed Redgate SQL Toolbelt and can’t find SQL Test – hopefully this blog post will help you.

By the way I do think there was something wrong with the laptop the client gave me as now when I right click I get the ability to run tests:

Redgate SQL Test context menu
This was definitely not there before the uninstall/reinstall fiasco

So now I can start my unit tests – the good news the DEV team have started theirs and are really getting behind it. I think they’ve got on with it to stop me talking about unit tests…!!

We now have 4 minutes of unit tests per checked-in build, but that is definitely something I’ll respond to with:


Automation and cleaner code/data are the key to future success

Doing DevOps for your Database? You need to start here…

This month’s T-SQL Tuesday #100 is hosted by Adam Machanic (B | T) who started T-SQL Tuesday 8 years ago and has invited the community to look forward 8 years at what will be T-SQL Tuesday #200…


Before I do that though – I want to think about what I was doing 8 years ago. At the time I was working with object-oriented databases and the company I worked for had just implemented extensive unit testing across our technology stack. I use that word because we had both the database and application code going through unit tests, integration tests and regression tests. Most of it was automated.

It was around this time that I was starting to do more things in SQL Server, namely SQL Server 2005. Funny how 7 years later I was still doing some things in SQL Server 2005 – but that is for another blog post…

It was when I came across to the SQL Server world that I realised that 3 things were different:

  1. Database build/code definitions were not in source control
  2. Unit testing was almost non-existent
  3. Automated testing or deployment was also not a common thing

Even now in 2018 I find that when speaking at conferences and I ask the questions:

  1. Who here uses source control – 60% of hands go up
  2. Who here puts their database into source control – 50% of those 60% of hands go down…
  3. Who here does automated testing across their databases – another 50% of those hands go down

I generally get ~30 people in my sessions so if at best 5 people out of 30 are doing this – we need to change this.


Both getting our database code into source control and unit testing have been discussed at length during the past 8 years.

So what does this mean for the next 8 years?

I’m on a personal crusade to bring automated testing and DevOps philosophies to your database…

In fact I am hoping that DevOps doesn’t exist in 8 years’ time – that we as an industry will have matured and simply call it “common sense“.

Let’s talk about automation – as whilst people might not see the value in testing our most precious resource – data – they will hopefully see the light in that the more we can automate the better our jobs will be.

But won’t it kill the DBA off, I hear some of you ask?


It won’t.

In 2026 the role of the DBA will have morphed; in fact all Data Professionals will have an appreciation and understanding of how to automate the delivery of value to the end user. I choose my words carefully – too often we as technical people think about 1s and 0s, but in fact it is how we deliver value to our clients that dictates our success. In terms of customer experience, we need to ensure we are delivering higher value than our competitors.

Many people are worried that automation will put them out of a job. This won’t happen, and in fact there will never be a shortage of work to do in a successful company. Rather, people are freed up from mindless drudge-work to focus on higher value activities.

This also has the benefit of improving quality, since we humans are at our most error-prone when performing mindless tasks.

By utilising the features in SQL Server Data Tools (SSDT) Version 2026 – which does both state-based AND migration-based deployments to databases automatically – we as DBAs can start focusing on activities that bring value to our clients. Databases have been touted to be self-tuning for decades but I simply don’t see it happening, and we need DBAs who can tune queries and, more importantly, understand how automation can make their lives better.

Data Science is big time in 2026 – the past 8 years have seen a massive jump in its usage. This is where Data Professionals have a massive part to play – cleaning up the underlying data – who knows – through automated cleansing jobs that tune themselves… automatically.

See where I’m going with this — the growth of data we will see over the next 8 years means we need to be more vigilant in the ways that we interact with the data. Using automation and more testing means that the data scientists can focus on what they’re good at – doing stuff with data – rather than spending (up to) 60% of their time trying to cleanse it.

I am not scared of automation putting me out of a job; I’m more scared that over the next 8 years people won’t embrace it enough to help with the delivery of value to the end user.



Doing DevOps for your Database? You need to start here…

I’ve been doing that thing called DevOps for about 17 years – you know – before it had a name…

In fact it was when I first joined Jade Software that I realised that Ops and DEV had a common interest in making systems go. In 2002 I started working with our toolset team designing our in-house written toolsets to deploy JADE databases/systems automatically and reliably. Little did I know I would be repeating that methodology around .NET written systems in 2010 and then repeating it again with deploying SQL Server code in 2014…

The major breakthrough for the JADE applications and database was unit testing.

When we started developing in .NET (over a JADE database at the time) it made sense to do unit tests – in fact I don’t know of many good .NET developers who don’t do unit testing.

It was when I started speaking to audiences that I realised that not many people do unit tests for their databases. Why??

Well it appears I was talking to the wrong crowd…. sort of.

Code first developers who design databases using Entity Framework are used to writing unit tests so most times they are testing code that will change the underlying database.

So that led me to ask the question: “Are DBAs doing Unit tests?”


They are not.

Firstly – what is a unit test?

The primary goal of unit testing is to take the smallest piece of testable software in the application, isolate it from the remainder of the code, and determine whether it behaves exactly as you expect.

This testing is done as part of the development process, and a unit test will check that the code being tested meets a specification, while a library of unit tests together will check the functions expected of the application. Automating the process allows the library of unit tests to be run frequently and repeatedly, which lets us find bugs earlier, and in smaller units of code, which are far easier to debug.

Unit tests should be self-contained enough that you are isolated in what you are testing, so that you know whether you have a correctly performing test. Too often we think of databases as massive stores of data and generally we only test for performance. However – with unit tests we are looking at far smaller sets of data, we want tests that are fast, and we are testing the functionality/quality of the code rather than the execution speed.

The upshot is that once we embrace unit testing we can then start to utilise regression testing, which can allow us to refactor our database code just as easily (if not more confidently) as developers do for application code.

So if hiring a .NET developer who doesn’t do unit tests is unthinkable – why would we accept this as the norm for the people who are writing the stored procedures etc that touch/influence and ultimately own our most precious resource – our data…?

Because it is too hard?

I find tuning indexes and queries hard – writing a unit test to prove that my code won’t kill PROD seems way easier. Also if I find a bug in my code when I’m writing the code at 3pm – it’s way easier to fix it then than at 3am when an online banking system has crashed/burned or is now corrupt…. I’m sorry but saying unit testing is too hard is a cop out.

Because it is too slow?

Refer to the example above – it’s way easier to write a little unit test and prove that my change is going to work when it will only trash my DEV instance. Fixing it then is far quicker than when 1,000s of users are affected – because there are fewer people calling my phone/emailing me when I fix it in my DEV instance.

Because it is too new?

Not at all – SQL Server Data Tools (SSDT) has provided database developers the ability to do unit testing since 2012.

In fact there is an old article at SQLServerCentral that is from 2009!!

Because it involves learning a new language?

You don’t have to – tSQLt and SQL Test by Redgate both allow unit tests to be written in T-SQL – which most DBAs thrive on. Even SSDT allows you to write unit tests in T-SQL.

I have used SSDT a fair bit, so if you are a database developer then I highly recommend you read “Creating and Running a SQL Server Unit Test”

If you are a DBA then I highly recommend you look at tSQLt – the official website has lots of useful information. There is a slight learning curve – but after 15 minutes of reading and trying it out, it is very simple to use. It allows you to isolate your testing to a particular schema and makes use of fake tables – the equivalent of mocking – which allows us to take a copy of the table as it is and test against it.
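To make that concrete, here is a minimal sketch of a tSQLt test. The table, column and procedure names (`dbo.Customer`, `dbo.GetActiveCustomerCount`) are invented for illustration – substitute your own objects:

```sql
-- Create a test class (a schema that tSQLt treats as a container of tests)
EXEC tSQLt.NewTestClass 'CustomerTests';
GO

-- A test: fake the real table, insert known rows, run the code under
-- test, and assert on the result
CREATE PROCEDURE CustomerTests.[test GetActiveCustomerCount counts only active rows]
AS
BEGIN
    -- FakeTable swaps dbo.Customer for an empty, constraint-free copy
    EXEC tSQLt.FakeTable 'dbo.Customer';

    INSERT INTO dbo.Customer (CustomerId, IsActive)
    VALUES (1, 1), (2, 0), (3, 1);

    DECLARE @actual INT;
    EXEC @actual = dbo.GetActiveCustomerCount;   -- the code under test

    EXEC tSQLt.AssertEquals @Expected = 2, @Actual = @actual;
END;
GO

-- Run every test in the class
EXEC tSQLt.Run 'CustomerTests';
```

Because the faked table is isolated from real data and constraints, the test is fast and repeatable – exactly the properties described above.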

There is a great pluralsight course here:

David Green is the author of the above Pluralsight course and has written a fair bit about tSQLt –

Greg Lucas has also written a lot about tSQLt

His article on  is particularly helpful.

Of course there are some other great utilities and I mentioned one earlier:

SQLTest by Redgate – which is an absolutely awesome tool for DBAs – mainly as it plugs straight into SQL Server Management Studio (SSMS).

It uses the tSQLt framework and incorporates SQLCop tests, which will help you enforce best practices for database development and run static analysis checks.

Best part is if you are on a DevOps for Database journey then you can fold those tests into your Continuous Integration processes to really bring up the quality of your database code. You need to start looking at unit tests – today.






The Data Platform has expanded – so too should our approach in using it….

In recent years we as data professionals have moved from dealing with SQL Server databases with SQL Server Reporting Services and SQL Server Analysis Services interacting with them (all on-premises) to a wide-scale data platform.

In fact even the name of most SQL Server things (like my MVP award) has morphed into the name “Data Platform”.

The name allows for new technologies and processes to be folded into the ecosystem. The radical changes brought about by the Azure platform have recently been matched by the breadth of technological choice in how you interact, manage and understand your data.

Let’s look at some key areas of what Microsoft have to offer on the Data Platform:

Database products

SQL Server 2017 – Lets you bring the industry-leading performance and security of SQL Server to the platform of your choice—use it on Windows, Linux, and Docker containers.

SQL Database – Built for developers, SQL Database is a relational database management system with enterprise-class availability, scalability, and security, and built-in intelligence capable of learning app patterns, that can be accessed from anywhere in the world.

Azure Database for MySQL – Quickly stand up a MySQL database and scale on the fly with this fully managed database service for app development and deployment that includes high-availability, security, and recovery at no extra cost.

Azure Database for PostgreSQL – Stand up a PostgreSQL database in minutes and scale on the fly—this fully managed database service for app development and deployment also gives you high-availability, security, and recovery at no extra cost.

SQL Data Warehouse – Scale compute and storage independently with this SQL-based, fully managed, petabyte-scale cloud data warehouse that’s highly elastic and enables you to set up in minutes and scale capacity in seconds.

Azure Cosmos DB – With a guarantee of single-digit-millisecond latencies at the 99th percentile anywhere in the world, this multimodel database service offers turnkey global distribution across any number of Azure regions by transparently scaling and replicating your data to wherever your users are.

Data and analytics products

SQL Server 2017 – With up to 1 million predictions per second using built-in Python and R integration, SQL Server 2017 delivers real-time intelligence as it brings the industry-leading performance and security of SQL Server to the platform of your choice.

HDInsight – A fully managed cloud Spark and Hadoop service, HDInsight provides open source analytic clusters for Spark, Hive, MapReduce, HBase, Storm, Kafka, and Microsoft R Server backed by a 99.9% SLA.

Machine Learning – Easily build, deploy, and manage predictive analytics solutions with this fully managed cloud service and deploy your model into production as a web service in minutes that can be called from any device, anywhere.

Stream Analytics – Develop and run massively parallel real-time analytics on multiple streams of data with this analytics service that helps uncover real-time insights from devices, sensors, infrastructure, and applications.

Azure Bot Service – Accelerate bot development with this intelligent, serverless bot service that scales on demand, requires no server management or patching, and provides built-in templates.

Data Lake Analytics – Develop and run massively parallel data transformation and processing programs in U-SQL, R, Python, and Microsoft .NET over petabytes of data with this on-demand service that provides a simple, scalable way to analyze big data—in seconds.

Data Lake Store – Built to the open HDFS standard, this is a no-limits cloud data lake for your enterprise’s unstructured, semi-structured, and structured data that’s massively scalable and secured, and allows you to run massively parallel analytics.

Data Catalog – Spend less time looking for data and more time getting value from it with this fully managed cloud service that lets you register, enrich, discover, understand, and consume your enterprise data sources.

The current state of the Data Platform is exciting, innovative and vast.  For years my aim was to understand how best I could tune, manage and deploy on SQL Server. The good news is that with “recent” improvements to the SQL Server engine:

we can now all focus on other aspects of the Data Platform…. (sorry but I had to put that in there).

With recent enhancements to the SQL Server engine and the maturity of running databases in Azure – it does mean our roles as data professionals are evolving.

Hard core DBAs are now finding themselves talking to Data Scientists on what is required for a stable, reliable, clean, tested, backed-up and secure data processing strategy.

The ability to deploy to the cloud calls for secure and efficient processes around those deployments and nowadays DBAs are also finding themselves involved in conversations around getting database code into source control, code being tested as part of continuous integration and changes deployed via continuous delivery processes.

Or god forbid – being part of something called agile…!!

The data platform has expanded and grown, our approach in how we manage and deploy to it needs to grow as well.

The good thing is that Microsoft have put a massive amount of effort into their documentation – I used to despair with MSDN and TechNet documentation – but I am loving and inspired by the quality of articles being put out on

These days, if I’m interacting with a new feature or need to diagnose something, being able to quickly use these docs has been fantastic in helping me cope with the new world of an expansive Data Platform.


If your Surface Book loses its keyboard and trackpad – try removing the KB4074588 update

My Surface Book (gen 1) recently did updates and then a day or so later the keyboard and trackpad mysteriously stopped working. Then I heard of 3 other people who experienced the same issue within 24 hours.

I could use the Surface Book in tablet mode – but I tired of that pretty quickly.

The weird thing was I could hit the Fn key and it would light up – but I could not use the keyboard at all.

Some of the people affected by this issue did system restores which worked. I tried many system restores – which gave me my keyboard and trackpad back BUT the Surface Book repeatedly bluescreened….


Surface Support was contacted – I’d like to say they were helpful, but I wouldn’t be writing this blog post if they had completely solved my issue. They sent me here:

Which did nothing.


BTW if people in support think it’s acceptable to leave a person waiting 60 hours before re-contacting them — maybe support isn’t the job for you.

(I used to be Second Level Support Manager so I know how vital it is to update/engage with people who are experiencing issues with software….)

I stumbled across the fact that Windows update KB4074588 had been installed along with the other updates – so I removed it. My next move was going to be a factory reset — so I had nothing to lose.
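If you want to do the same from the command line rather than clicking through Settings, the update can be removed with `wusa` – run from an elevated Command Prompt, and as always with update removal, at your own risk:

```shell
REM Check (in PowerShell) whether the update is present:
REM   Get-HotFix -Id KB4074588
REM Then, from an elevated Command Prompt, remove it by KB number:
wusa /uninstall /kb:4074588
```

Windows will prompt for a restart once the removal completes.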

Restarted, and boom – I have a keyboard and trackpad, and 3 minutes later I wrote this post to hopefully help others.

I emailed Surface Support to let them know too.




Speaking at PASS Summit and why you need to think about submitting….

This post is about the honor and experience of speaking at PASS Summit not once (2016) but twice (2017).

I recently received an email from PASS HQ that asked past speakers to share our success stories – to help others consider submitting for PASS Summit as a speaker.

This is an easy one for me – as I loved speaking at PASS Summit a lot.
Both times I learnt so many different things that helped me grow not only as a speaker but as a data platform technologist.

This is the first thing that I want to pass onto others who are considering submitting.

You will learn a lot:

You learn a lot when you prepare a presentation. You want to be ready for questions. When selecting a topic I want to know as much about it as possible, to answer the questions attendees might have. Not just the basic questions but the more advanced ones that will help them implement/change the setup of whatever technology I am talking about.

The flow-on effect of this was that one particular area I was talking on (SQL Server on Linux) helped the company I was working at, as it changed our direction and usage of the product. Now that is definitely a win | win situation!!

As a speaker I learnt a lot about speaking to crowds of people who are engaged and also want to learn from you.  This helped me grow as a speaker as I spent more time on preparation – so that I could deliver the content really well at an event like PASS Summit.

Disclaimer: I personally think I have a way to go before I’m a really effective speaker – but I speak about Continuous Improvement with technology so happy to embrace this with my speaking craft.

Here is your chance to pay it forward:

For years I had been a consumer of content, whenever I had an issue there were people who had written resolutions which had helped me with just about every part of our technology stack. I had also attended free conferences like SQLSaturday and Code Camp and had learnt so much that helped me manage/deploy/tune SQL Server.

By standing up in front of people I was replaying all the kindness of those people who had given up their time to help me. My tag line has always been “I speak so that I can help at least one person in the crowd learn..”.

The great thing has been after both my PASS Summit sessions I’ve had people stay behind and ask questions – which is great as it means that people were engaged and got something out of my session.

You are now part of a group of people who really care:

My first ever speaking engagement was with my good friend Martin Catherall, for years I had seen him speak and he was good enough to put in a co-speaking session for us both at SQLSaturday Oregon in October 2015. It was brilliant as it allowed me to try my hand at speaking with my good mate next to me for support.

By being part of the speaker group I then met some of the most awesome caring people, who really care about the community.

Start small and achieve greatness:

So let’s say you want to start speaking and giving back to the community, a great place to start and practice for speaking at PASS Summit is to support your local user group.

For a couple of reasons:

  1. It allows you to become an expert of your material and to grow in confidence as a speaker. Speaking to a room of 20 people whom I knew was a very rewarding experience and allowed me to get feedback on my material before going large.
  2. I run a user group and am always on the lookout for grass roots speakers and will support them by offering a slot at my SQL Server User Group. Because one of the hardest parts of running a User Group is finding speakers.
    So you know — win | win.

After speaking at a local user group — submit to your local SQLSaturday. I also run one and for the past 3 years I have offered new speakers the chance to speak in front of a larger more disparate crowd than their local User Groups.

So go ahead — think of a topic, write an abstract and submit!!

We need speakers like you in the community and PASS Summit needs more speakers to submit — so please take the plunge.  If nothing else — you now have a subject that you can support your local user groups and community conferences with.

The ultimate is that you get picked for PASS Summit and in a year or two write about your own experiences to help incubate another person to make a positive difference in our vibrant community.


Why I am leaving a role/company I loved

This non-technical post is about why I am leaving a technical company after working there for 17+ years.

If you are thinking this will be a post that will resemble this:

DevOps is never about burning bridges

then I’m sorry but you will be sadly disappointed.

My reasons for leaving are about doing new things rather than hating on the old things…

I resigned from my position as Operations Manager at Jade Software on 22nd December 2017, it was the 6,311th day that I had worked there. It was also the 6,969th day of my IT career — it seemed the right kind of day to do something huge.

Some people would say — but you didn’t work 6,311 days there — you’re counting weekends too!! To which I’d reply that when you work for a company that is energetic about doing things — it is infectious to be thinking about work on the weekends or writing emails/planning future work/projects.

It’s funny looking back at my time there — I originally was only going to work 23 months but I found after a year that I loved working there.

If you look at the longevity of the people that work there — there are people who’ve worked there over 30 years. It is that kind of company where people who are passionate about technology and stuff — stay. And they’re good people too!!

These people could easily leave and get really good money elsewhere. But they don’t because we believed in what we were doing there, that a lot of other things at Jade outweighed more money.

I was very lucky during my time at Jade to be part of a team of guys who were passionate, brilliant and committed to what we were doing. We socialised together and shared a love of getting the work done and having a beer, and eating hot n spicy food.

I think there is a saying that goes “why do you go to work each day?” and the answer is “the people”. The reason I stayed so long at Jade was the people, and the fact that every 5-8 years the company reinvented itself and/or did a stepwise change. It was exciting to be part of that. The culture at Jade was one of striving for excellence and also having fun along the way.

In some respects I used to think of Jade as this beautiful woman who was like a fickle mistress… at times I would drop everything to do things for my job, and have to explain to those dear to me why I was doing such things. Because they were awesome, brilliant times, making things work that others struggled with.

In a sense, for those who loved this Jade woman – she consumed us, was at times a jealous lover, but rewarded us well. As all fickle mistresses should.

On the day I resigned I bought 6 bottles of the wine below which inspired the above sentence (Note: I did not drink all 6 bottles at once to come up with the sentence above)

Jade – the most enjoyable yet fickle of all mistresses….

So why leave?

One of the reasons I am leaving is so that there is a breath of fresh air within Operations. 16 years ago today (7/1/2002) I started as the Operations Team Leader – a newly created role – and since then I’ve led my team through new technologies, company-wide redundancies, the introduction of SQL Server, NT4 through to Windows 2016, the removal of everyone being on call, and even PowerShell.

My aim over those years was to manage as I’d want to be managed. That we were a team, which meant that my staff’s opinions mattered more than my own, and that I wanted to be told if I was wrong – but I fully expected an answer/solution to whatever I was doing wrong. My staff knew that at 3am they could call me if they were stuck, and if necessary I’d drive into work — because I expected the same. I didn’t like the word ‘manager’ as that just reminded me of David Brent-like paper shufflers. I wanted to lead my team and for them to actively participate in the direction we’d go.

As a team.

BTW – telling my team I was resigning was hard.

Really hard.

The other reason I am leaving comes back to the fact that I left home when I was 18 and went to university to study Chemical Engineering in another city.

Leaving my (small) home town of Napier was hard at the time, in fact the first year was hell. But worth it — as it made me the man I am today.

And therein lies the analogy I’m using for leaving Jade: I learnt so many wondrous, cool things whilst working there, and my talent was incubated by some of the most technically brilliant people I’ve met. I matured as a person both technically and socially, and now it is time to leave ‘home’ again. To leave the confines and security of a job I loved and go out into the real world again.


I want to try my hand at consulting (and contracting) — some exciting news soon…

I want to help companies achieve some of the awesome stuff we did at Jade around DevOPs and specifically with databases.

I want to continue to make a difference in the community and help people learn (and laugh).

I want to make a fair bit of money so I can (finally) upgrade my car.

This next part of my career will be exciting. I am a little nervous about what the first few years will be like, but I feel it is time to leave. That nervousness, BTW, is what I use to drive myself; I thrive on energy, whether it is good energy or not-so-good energy like stress. Around 2 hours before I speak I look like I’m going to throw up and am a mess. But that is my way of centering myself and getting ready to make people laugh and learn.

So for the past 2 weeks – each day I have confronted the nervousness that I feel and remembered how I felt on 11th September 2000 when I drove to my first day at Jade. I wrote a list of things I wanted to learn in the first 3 months…. because I was nervous I didn’t know enough. Thanks to some guys who would later become senior members of my team, I’d learnt those things within 2 weeks.

That is the special kind of place Jade was — where the right kind of people would help you out, would go out of their way and also make you feel like ‘family’.  If I ever employ enough staff to have a team again I want to emulate what I did and the culture we had at Jade.

I’ll be sad to leave but so glad I stayed.


Changing TFS to use HTTPS? — update your agent settings too….

This blog post is about Team Foundation Server (TFS) – specifically, the situation where you need to remember to update your TFS agent settings after switching to HTTPS.

I will assume that you already have TFS set up, are currently using HTTP, and want to make things a bit more secure with HTTPS. I am also assuming that you will be using port 443 for HTTPS traffic.

To update TFS to use HTTPS you need to do a couple of things:

  1. Have a legitimate certificate installed on the server that you can bind to
  2. Have an IP address on the server and have firewall access setup to that IP address on port 443
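Before touching IIS, it can be worth a quick sanity check that both prerequisites are actually in place. A hedged sketch in PowerShell – the host name below is a placeholder, not from the original post:

```powershell
# List certificates in the machine store so you can pick the one to bind
Get-ChildItem Cert:\LocalMachine\My |
    Select-Object Thumbprint, Subject, NotAfter

# From another machine, confirm the firewall allows traffic on port 443
Test-NetConnection -ComputerName tfs.example.com -Port 443
```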

So in IIS we will add our new binding to our Team Foundation Server website:

IIS Setup for new binding of HTTPS
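The same binding can be added from PowerShell rather than the IIS Manager GUI. A sketch, assuming the site is named "Team Foundation Server" (the default) and using a placeholder certificate thumbprint:

```powershell
Import-Module WebAdministration

# Add an HTTPS binding on port 443 to the TFS web site
New-WebBinding -Name "Team Foundation Server" -Protocol https -Port 443

# Attach the certificate (placeholder thumbprint) to the new binding
$cert = Get-Item "Cert:\LocalMachine\My\<YourCertThumbprint>"
New-Item "IIS:\SslBindings\0.0.0.0!443" -Value $cert
```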

We will now go into TFS Administration Console to change our public URL. The added HTTPS binding will have flowed through from IIS and you should now see it in the bindings.

HTTPS Setup TFS Admin
Adding our URL to TFS Admin Console



So now we have HTTPS working for our TFS instance. Users can connect to the new URL, and we can utilise URL rewriting to redirect anyone who forgets and uses HTTP.
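For the URL rewriting piece, here is a sketch of an IIS rewrite rule that permanently redirects plain-HTTP requests to HTTPS. This assumes the IIS URL Rewrite module is installed; the fragment goes in the site’s web.config:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Redirect anyone arriving over plain HTTP to the HTTPS URL -->
      <rule name="Redirect to HTTPS" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTPS}" pattern="off" />
        </conditions>
        <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```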

Except our first nightly builds failed…

HTTPS Agent failed
Automated Nightly Build Failure

Looking at the diagnostic logs on the agent we can see the following (note the timestamp is in UTC):

[2017-12-07 13:30:05Z ERR Program] System.Net.Http.HttpRequestException: An error occurred while sending the request. ---> System.Net.Http.WinHttpException: A security error occurred
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Net.Http.WinHttpHandler.<StartRequest>d__101.MoveNext()
   --- End of inner exception stack trace ---
   at Microsoft.VisualStudio.Services.Common.VssHttpRetryMessageHandler.<SendAsync>d__3.MoveNext()
--- End of stack trace from previous location where exception was thrown ---

The logs also showed that the agent was still trying to reach the old HTTP address, so it was a simple change to the agent settings to point it at the HTTPS address.

Browsing to where the agent is installed, we can now edit the .agent file:

Editing the .agent file

Within the .agent file we will change the following setting:

serverUrl: https://YourURL/tfs/
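The .agent file itself is JSON, so the change sits alongside the agent’s other registration settings. A hedged sketch of what the file might look like after the edit – the agentId, agentName, poolId and workFolder values below are placeholders:

```json
{
  "agentId": 1,
  "agentName": "BuildAgent01",
  "poolId": 1,
  "serverUrl": "https://YourURL/tfs/",
  "workFolder": "_work"
}
```

After saving, restart the agent (the agent’s Windows service, if it runs as one) so it picks up the new URL.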

Kick off a queued build and it works as intended.