KQL Series – understanding KQL queries (Part 2)

In Part 1 we talked about what a query is.

The most common query we will write is a tabular expression statement, which is what people usually have in mind when they talk about queries. This statement usually appears last in the statement list, and both its input and its output consist of tables or tabular data sets.

NOTE: Any two statements must be separated by a semicolon (;)

We use a data flow model for the tabular expression statement. A tabular expression statement is generally composed of tabular data sources such as the tables we will query, tabular data operators such as filters and projections, and optional rendering operators.

The composition is represented by the pipe character (|), giving the statement a very regular form that visually represents the flow of tabular data from left to right. Each operator accepts a tabular data set “from the pipe”, and other inputs including more tabular data sets from the body of the operator, then emits a tabular data set to the next operator that follows.

So our query will look something like this:

Source | Operator1 | Operator2 | RenderInstruction

  • Source – tabular data sources such as Azure Data Explorer tables
  • Operator – tabular data operators such as filters and projections
  • RenderInstruction – rendering operators or instructions
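The pipe-based flow will feel familiar if you have used a Unix shell, where each command consumes the previous command’s output. Here is a loose analogy in shell (not KQL itself – just the same left-to-right shape):

```shell
# Each stage consumes the previous stage's output, loosely mirroring:
#   Source | where ... | project ... | count
printf 'Europe,200\nAsia,500\nEurope,404\n' |  # source rows
  grep '^Europe,' |                            # like: where continent == 'Europe'
  cut -d, -f2 |                                # like: project StatusCode
  wc -l                                        # like: count
```

which counts the two rows whose continent is Europe – the same filter-then-count shape as a KQL query.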


In the following slightly more complex example, the join operator is used to combine records from two input data sets, one that is a filter on the Logs table, and another that is a filter on the Events table.

Logs
| where Timestamp > ago(1d)
| join (
    Events
    | where continent == 'Europe'
) on RequestId

So for now let’s use tabular expressions and look at how we construct information out of our data…

Or in other words, the workflow for constructing our query…



KQL Series – understanding KQL Queries (Part 1)

In my previous post I wrote about the KQL queries that we will write – we can also use some free samples that Microsoft has put up for us to try things out:


But what is a query? Kudos to Microsoft for putting some great definitions up on their site which I’ve used here.


At times you will see reference to Azure Data Explorer and Azure Monitor.
My first experience with KQL was with Azure Monitor – but I will talk about Azure Data Explorer later in this series. Back to queries…

A Kusto query is a read-only request to process data and return results. The request is stated in plain text, using a data-flow model that is easy to read, author, and automate. Kusto queries are made of one or more query statements.

A query consists of one or more query statements, delimited by a semicolon (;).

At least one of these query statements must be a tabular expression statement. The tabular expression statement generates one or more tabular results. Any two statements must be separated by a semicolon. When the query has more than one tabular expression statement, the query has a batch of tabular expression statements, and the tabular results generated by these statements are all returned by the query.

There are two types of query statements:

  • Statements that are primarily used by users (user query statements),
  • Statements that have been designed to support scenarios in which mid-tier applications take user queries and send a modified version of them to Kusto (application query statements).

Some query statements are useful in both scenarios.

There are three kinds of user query statements: a let statement, a set statement, and a tabular expression statement.

All query statements are separated by a ; (semicolon), and only affect the query at hand.

To be honest most of the time I write tabular expressions – let’s have a closer look at this…

KQL Series – what is Kusto Query Language?

This blog post will detail what KQL is all about…

KQL was developed to take advantage of the power of the cloud through clustering and compute. Using this capability, KQL is designed as a well-performing tool to help surface critical data quickly. This is a big part of why it works so well and outshines many other query languages like it. KQL was built for the cloud and to be used against large data sets.

Kusto Query Language is a powerful tool to explore your data and discover patterns, identify anomalies and outliers, create statistical modeling, and more. The query uses schema entities that are organized in a hierarchy similar to SQL’s: databases, tables, and columns.

The query we write in KQL is a read-only request to process data and return results. The request is stated in plain text, using a data-flow model that is easy to read, author, and automate. Kusto queries are made of one or more query statements.

Here is an example query:

StormEvents
| where StartTime between (datetime(2007-11-01) .. datetime(2007-12-01))
| where State == "FLORIDA"
| count

I will show you below how to run this code and also any code that I put in these blog posts – for FREE!!

Where did I first discover Kusto Query Language?
In Azure using Log Analytics – as I was collecting diagnostic data and needed to analyse it:

Where can you discover and try out Kusto Query Language?
Well Microsoft have some free demo sites you can use:

• Log Analytics – https://aka.ms/LADemo

• Application Insights – https://aka.ms/AIAnalyticsDemo

But it gets better….
Go here to run the code above and get the result of “28”


You don’t need to pay anything just have an account registered with Microsoft.

It’s that easy…


KQL Series – why is it called Kusto Query Language?

I use KQL on an hourly basis….

But for a query language – why call it Kusto..?

Here is a funny tidbit of information:

You have probably heard something that sounds like Kusto before…?
I know I had – when I was a young kid.


KQL is named after Jacques Cousteau. Even today, you can find evidence of this in the Azure Monitor Docs. If you go to the datatable operator page right now, you’ll still find a reference to him in an example that lists his date of birth, the date he entered the naval academy, when he published his first book entitled “The Silent World: A Story of Undersea Discovery and Adventure,” and the date when he passed.

So the puns about deep diving into data, exploring the depths of data to find and present information – basically we are going to be explorers in a sea of data…


Kusto Query Language – a series

This blog post is about a new query language that I have learnt and I really think you need to learn it too. Especially if you are doing ANYTHING in Azure.

Basically, about 3.5 years ago I was building some things in Azure and I realised that I needed to get diagnostics and metric information from all the data that I was capturing. I found that Log Analytics had everything I needed from an infrastructure side…

Then I worked with some awesome frontend developers who were using KQL and Application Insights to get telemetry information about their web services hosted in Azure.

I started learning KQL and realised that I needed to start sharing the brilliance of it.

I have felt so strongly about this that I’ve done a number of talks and even videos on it.

The idea behind it is pretty much the same:

Azure is the leading platform for many companies, and to manage your databases and other infrastructure you need to have insights into what is happening.
You need to be able to query the Azure platform, to not only understand your infrastructure but also to leverage monitoring and analytics to react to operational changes occurring in it.
This session will introduce you to the Kusto Query Language (KQL) which will allow you to query a variety of Azure resources.

At User Groups:


Videos: (excuse the hair….)


I knew I was onto a good thing when my close mate Steve Jones blogged about it here:

So I have decided to start a series of blog posts here that talk about the magic-ness of KQL.

Here we go……


Azure Platform Series – renaming your YAML pipeline in Azure DevOps

This blog post is about YAML pipelines in Azure DevOps.

I had a repo called InfrastructureAsCode for a client. I have been transitioning them to use YAML for their build pipelines in Azure DevOps – because now we can source control the build itself.

If you do not know what a YAML pipeline is in Azure DevOps check it out here:


You can try this for yourself by using the awesome AzureDevOpsLabs:


I had created a pipeline using YAML – which was called InfrastructureAsCode as the YAML file was in the root directory.

However I wanted to move it into a folder .\InfrastructureAsCode\pipelines\… and run the YAML file from there – as I would have a non-prod and PROD version of them (as the schedule was different for each).

That went well – but I didn’t like InfrastructureAsCode as my pipeline name. And it got worse when I added the PROD version of the YAML file – I now had InfrastructureAsCode (1).
That’s even uglier.

So I needed to rename the pipeline – but how do you do this with YAML??

Well – click your pipeline, edit it, and then up in the right hand corner are the 3 dots – click those and choose Triggers:

Click on the YAML tab, where you can edit the Name field and save:

And now you have renamed your pipeline!!


Azure Platform Series: cloudshell code – your (brilliant) online editor

This post is about code, an online editor you can use with cloudshell.

I live in the Azure platform all day (almost) every day. What this means is I need to be able to run scripts from just about any device – there are times where I will be scaling a Hyperscale database at 10:30pm from my phone….

So I use cloudshell a LOT in Azure. I wrote a post about setting up cloudshell here:
Azure Platform Series: How to setup Azure Cloud Shell

The best part of cloudshell is the online editor – which you can access within cloudshell by clicking on the {} in the toolbar:

Or you can access it by typing code . at the command line:

which brings up the online editor:

You can now browse the directory structure and click on a script that you might have written before. In the example below I have clicked on scalePREPROD.sh to edit it:

You can read more about the editor here:


For me – it is a great way to edit and store those scripts I use most in Azure.


Bash: ^^}: bad substitution – upgrade your Bash version (macOS)

This blog post is about a situation where you are capitalising an argument in bash and you get the following error:

bash: ${state^^}: bad substitution

In this example I was capitalising a variable called state to STATE. After some quick googling I found that the reason I was getting this error is not my bad coding – the ^^ case-modification expansion was only added in Bash 4.0, so it fails on anything older.
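A minimal reproduction of the two approaches – the ${var^^} expansion needs Bash 4.0 or later, while tr works everywhere, including the macOS default Bash 3.2:

```shell
#!/usr/bin/env bash
state="florida"

# Bash 4.0+ only: case-modification parameter expansion.
# On Bash 3.2 this is the line that triggers "bad substitution".
echo "${state^^}"

# Portable alternative that works on any Bash (and POSIX sh):
echo "$state" | tr '[:lower:]' '[:upper:]'
```

Both lines print FLORIDA on a modern Bash; on 3.2 only the tr version runs.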

When I checked the version I am using on my mac it was:

$ bash --version
GNU bash, version 3.2.57(1)-release (x86_64-apple-darwin20)
Copyright (C) 2007 Free Software Foundation, Inc.

and running the same command in Visual Studio Code showed the same – that I was running 3.2.57 (??!!??)

Which seemed quite old – that copyright of 2007 gives it away. I am running Big Sur – version 11.1 of macOS – and from some more googling it appears there is some history around Apple, Bash and GPL licensing and <yawn>…… here is a link: http://meta.ath0.com/2012/02/05/apples-great-gpl-purge/

And I realised that what many macOS users probably don’t know is that they are using an outdated version of the Bash shell.

How to upgrade Bash on macOS.

For macOS you should use Homebrew to install things:


$ brew install bash

To verify the installation, you can check that you now have two versions of Bash on your system:

$ which -a bash

The first one is the new version, and the second one is the old version:

$ /usr/local/bin/bash --version
GNU bash, version 5.1.4(1)-release (x86_64-apple-darwin20.2.0)
Copyright (C) 2020 Free Software Foundation, Inc. License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software; you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.

$ /bin/bash --version
GNU bash, version 3.2.57(1)-release (x86_64-apple-darwin20.2.0)
Copyright (C) 2007 Free Software Foundation, Inc.

Since the directory of the new version (/usr/local/bin) comes by default before the directory of the old version (/bin) in the PATH variable, the version used when you just type bash is the new one:

$ bash --version
GNU bash, version 5.1.4(1)-release (x86_64-apple-darwin20.2.0)
Copyright (C) 2020 Free Software Foundation, Inc.
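You can also confirm which copy wins without starting a new shell – these commands follow the same PATH lookup order:

```shell
#!/usr/bin/env bash
# Lists every bash on the PATH in lookup order; the first line is the
# copy that runs when you type plain `bash`.
which -a bash

# Prints just the winning path.
command -v bash
```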

We now need to register our new shell as an approved login shell – Unix systems restrict which shells may act as a login shell, and to use the newly installed Bash as the default it must be able to act as one. So we will add it to the /etc/shells file, editing the file as the root user:

$ sudo vim /etc/shells

and enter in your admin password.

You will now be in the vim editor:

Use the i key to go into INSERT mode, paste in /usr/local/bin/bash at the bottom, then hit the ESC key and type :wq

which will write and save our changes.

Set Default Shell

Because /bin/bash is still set as the default shell – you will still be using version 3.2.57. To change this to the new shell, execute the following command:

$ chsh -s /usr/local/bin/bash

The default shell for your user is now set to the new version of Bash. If you close and reopen the terminal window, you should now be using the new version already:

 $ echo "$BASH_VERSION"

The chsh command changes the default shell only for the user who executes it. If you want to change the default shell for another user, run the following as an admin, substituting that user’s username (you will be asked for your admin password):

$ sudo chsh -s /usr/local/bin/bash <username>

All done – right?
So I went back into Visual Studio Code and ran the original script – but still had the issue!!

Updating Visual Studio Code to use the new Bash version

I found you need to go to Preferences and change Settings:

In the search field type in bash

click on “Edit in settings.json” for macOS and set the terminal shell path to “/usr/local/bin/bash”
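For reference, the resulting entry in settings.json looked like this (this is the older terminal.integrated.shell.osx setting – newer VS Code versions configure the shell through terminal profiles instead):

```json
{
  "terminal.integrated.shell.osx": "/usr/local/bin/bash"
}
```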

Exit the terminal and open a new one – the new bash session will now use the upgraded version.


I ran my script and it worked.


Azure Platform Series “You have exceeded the maximum amount of scale changes within the past hour”

This blog post relates to a situation where you might be doing scale operations on your app services or VMs in Azure and get the following error after quite a few scale up and down operations.

The exact error I received was:

Failed to update configuration for 'Api'. {"Code":"Conflict","Message":"You have exceeded the maximum amount of scale changes within the past hour(56 changes and limit is 40). Please retry later.","Target":null,"Details":[{"Message":"You have exceeded the maximum amount of scale changes within the past hour(56 changes and limit is 40). Please retry later."},{"Code":"Conflict"},{"ErrorEntity":{"ExtendedCode":"03024","MessageTemplate":"You have exceeded the maximum amount of scale changes within the past hour({0} changes and limit is {1}). Please retry later.","Parameters":["56","40"],"Code":"Conflict","Message":"You have exceeded the maximum amount of scale changes within the past hour(56 changes and limit is 40). Please retry later."}}],"Innererror":null}

Where I was scaling my app service plan called Api and went past some arbitrary limit.

It is hard to find the exact limits for app scale operations documented within Azure, but from the error message itself (and a process of elimination) the limit is 40 scale changes per rolling hour.


Unfortunately there isn’t much we can do here – you just have to wait until your hour is up….
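Since the limit resets on a rolling hour, the only real mitigation is to retry with a wait. A minimal retry sketch – scale_command here is a hypothetical stand-in for the real az scale call, wired to succeed on the third attempt so the loop is demonstrable:

```shell
#!/usr/bin/env bash
attempt=0
max_attempts=3

scale_command() {
  # Hypothetical stand-in for the real scale call; here it "succeeds"
  # (returns 0) once two attempts have already been made.
  [ "$attempt" -ge 2 ]
}

until scale_command; do
  attempt=$((attempt + 1))
  if [ "$attempt" -ge "$max_attempts" ]; then
    echo "giving up after $max_attempts attempts"
    break
  fi
  sleep 1   # in practice, wait minutes rather than seconds
done
echo "attempts made: $attempt"
```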

Bash: “command not found” in IF statement

This blog post is about a situation where writing an IF THEN ELSE statement with variables fails with something like

./scalePREPROD.sh: line 26: [[1: command not found

This was in a script called scalePREPROD that scaled app services if they were not at a certain value – so I am checking to see IF they have a value of 15 and, if they do not, scaling up to 15.


You need spaces inside the [[ ]] brackets when variables are used – [[ is a shell keyword, so it must be separated from its operands by whitespace.

The code looked something like this:

for i in "${appServiceArray[@]}"; do
  appcapacity=<bash query to check app services>
  if [[$appcapacity -ne $scaleAPPWorkersUp]]; then
    <stuff to do>
    <echo something back that we're all good>
  fi
done

So in the example above – I wanted to see if for my array of 6 app services ${appServiceArray[@]} any were not equal to a value of $scaleAPPWorkersUp.

To fix it I just needed to put spaces around the operands, so:

[[$appcapacity -ne $scaleAPPWorkersUp]];

becomes:

[[ $appcapacity -ne $scaleAPPWorkersUp ]];

Pretty easy fix.
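Putting it together – a runnable sketch, with hypothetical values standing in for the real app service lookups:

```shell
#!/usr/bin/env bash
# Hypothetical values standing in for the real app service query results.
appcapacity=1
scaleAPPWorkersUp=15

# Without spaces, bash reads "[[1" as a single (nonexistent) command name,
# which is exactly the "[[1: command not found" error. With spaces, the
# [[ ]] keyword evaluates the comparison.
if [[ $appcapacity -ne $scaleAPPWorkersUp ]]; then
  echo "scaling up to $scaleAPPWorkersUp workers"
fi
```

This prints the scaling message because 1 is not equal to 15.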