KQL Series – KQL the next query language you need to learn – a video from Data Exposed

I love KQL so much I even made a video and if you compare it to my last blog post – yeah my hair has changed a bit…
And my beard.
And my clothing…

Check out Data Exposed here – it is an AWESOME site:

https://learn.microsoft.com/en-us/shows/data-exposed/

If you want to see a longer version of a session on KQL then do visit my last blog post here

KQL Series – some DevOps things: Provisioning using ARM templates (dedicated to Scott)

I was talking to an ex-client earlier this week and he saw the drafts of my blog posts around DevOps infra-as-code spinning up Azure Data Explorer and he said – but what about ARM templates…..

Now this is a trigger. I do not like ARM templates.
But…

Some things pay the mortgage and back when we worked together I spun up their whole platform using ARM (this was before bicep was a thing).
So here it is for you Scott… you lover of ARM (and a good red wine).

Here are the steps to provision an Azure Data Explorer cluster using an ARM template.

Prerequisites:

  • An Azure subscription
  • Basic knowledge of Azure Resource Manager templates
  • Being a lover of JSON (just kidding – I have to write this…)

Step 1: Create the ARM Template
The first step is to create the ARM template. The template defines the resources that will be provisioned in Azure. We’ll create a simple template that provisions an Azure Data Explorer cluster.

Here’s an example ARM template that provisions an Azure Data Explorer cluster:

{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Kusto/clusters",
      "apiVersion": "2021-04-01-preview",
      "name": "[variables('adxClusterName')]",
      "location": "[variables('location')]",
      "sku": {
        "name": "Standard_L16s",
        "tier": "Standard"
      },
      "properties": {
        "enableDiskEncryption": false,
        "dataRetentionTime": "365.00:00:00",
        "trustedExternalTenants": []
      }
    }
  ],
  "variables": {
    "adxClusterName": "adx-cluster",
    "location": "[resourceGroup().location]"
  }
}

This template provisions an Azure Data Explorer cluster with the name “adx-cluster” and the SKU “Standard_L16s” in the same location as the resource group.

Step 2: Deploy the ARM Template
To deploy the ARM template, we’ll use the Azure Portal. Navigate to the resource group where you want to provision the Azure Data Explorer cluster and click on “Deploy a custom template”.

Select the “Build your own template in the editor” option and paste the ARM template code into the editor. Click “Save” and then “Review + Create”.

Review the template parameters and click “Create” to deploy the Azure Data Explorer cluster.
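If you’d rather not click through the portal every time, the same template can be deployed from the command line. Here’s a minimal PowerShell sketch – it assumes the Az module is installed, you’ve already run Connect-AzAccount, and you’ve saved the template locally as adx-cluster.json (a file name I’ve made up for this example):

# Deploy the ARM template into an existing resource group
New-AzResourceGroupDeployment -ResourceGroupName "MyResourceGroup" -TemplateFile ".\adx-cluster.json"

The deployment output includes the provisioning state, which is handy if you wire this into a pipeline instead of clicking buttons.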

Step 3: Verify the Deployment
After the deployment is complete, you can verify that the Azure Data Explorer cluster was provisioned successfully. You can check the status of the cluster in the Azure Portal or use the Azure CLI to run the following command:

az kusto cluster show --name adx-cluster --resource-group MyResourceGroup

This command will return information about the Azure Data Explorer cluster, including the status and SKU.

So there you have it – yes ARM templates are “relatively” easy, but they can be horrid…
Use bicep as a first go-to, or my favourite – terraform.

Either way – just use infrastructure as code to do any provisioning – OK??!!

(Also Scott – when I am in Seattle next – you owe me a lot of red wine)

Yip.

KQL Series – Kusto Engine V3 – let’s make stuff go faster

In late March 2021, Azure Data Explorer Engine v3 was released. It is the latest version of the engine that powers Azure Data Explorer and is designed to provide a number of key features and benefits for data analytics, including:

  1. Scalability: The Azure Data Explorer Engine v3 is designed to be highly scalable, allowing you to process and analyze large amounts of data quickly and efficiently. The engine can be scaled up or down as needed to handle varying data volumes and workloads.
  2. Performance: The Azure Data Explorer Engine v3 is highly optimized for performance, allowing you to analyze data in near real-time. The engine can perform complex queries and aggregations on large data sets quickly and efficiently.
  3. Flexibility: The Azure Data Explorer Engine v3 is designed to be highly flexible, allowing you to work with a wide range of data types and formats. The engine can handle structured, semi-structured, and unstructured data, as well as data in a variety of formats, including CSV, JSON, and more.
  4. Reliability: The Azure Data Explorer Engine v3 is designed to be highly reliable, ensuring that your data is always available and secure. The engine is built on top of Azure, providing built-in redundancy and failover capabilities to ensure high availability.

Key features of the Azure Data Explorer Engine v3
The Azure Data Explorer Engine v3 provides a number of key features that make it a powerful and flexible platform for data analytics, including:

  1. Columnstore Indexes: The Azure Data Explorer Engine v3 uses columnstore indexes to store and query data. Columnstore indexes provide high compression rates and fast query performance, allowing you to quickly analyze large amounts of data.
  2. Time Series: The Azure Data Explorer Engine v3 includes built-in support for time series data, allowing you to analyze time-based data quickly and efficiently. The engine can handle large volumes of time-series data and provide real-time insights into trends and patterns.
  3. Query Language: The Azure Data Explorer Engine v3 supports a powerful query language called Kusto Query Language (KQL). KQL provides a simple and intuitive syntax for querying data, allowing you to quickly and easily analyze large data sets. But you know this from these blog posts right?
  4. Security: The Azure Data Explorer Engine v3 provides built-in security features, including role-based access control (RBAC) and encryption, to ensure that your data is always secure.

You can read more about V3 of the engine here:
https://learn.microsoft.com/en-us/azure/data-explorer/engine-v3

KQL Series – some DevOps things: Provisioning using PowerShell

If you have read any of my last few posts on provisioning Azure Data Explorer then you will probably be wondering…..

Will he write about PowerShell?

Ok, I will.

Step 1: Install and configure Azure PowerShell
To use PowerShell to provision Azure Data Explorer, you first need to install the Azure PowerShell module. You can install it using the following command:

Install-Module -Name Az -AllowClobber -Scope CurrentUser

Once the module is installed, you need to log in to Azure using the Connect-AzAccount command. This command will prompt you to enter your Azure credentials to authenticate with the Azure portal.
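For completeness, here’s roughly what that looks like – the subscription name is just a placeholder:

# Log in interactively, then point the session at the subscription you want to deploy into
Connect-AzAccount
Set-AzContext -Subscription "My Subscription Name"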

Step 2: Create an Azure Data Explorer cluster
To create an ADX cluster, you can use the New-AzKustoCluster cmdlet.

Here’s an example command that creates an ADX cluster named “myadxcluster” in the “East US” region with a “Standard_D13_v2” SKU and two nodes:

New-AzKustoCluster -ResourceGroupName myResourceGroup -Name myadxcluster -Location 'East US' -SkuName Standard_D13_v2 -SkuTier Standard -SkuCapacity 2

This command will create an ADX cluster with the specified name, location, SKU, and node capacity. You can customize these settings to fit your needs.

Step 3: Create an Azure Data Explorer database
After creating an ADX cluster, you can create a database within the cluster using the New-AzKustoDatabase cmdlet. Here’s an example command that creates a database named “myadxdatabase” within the “myadxcluster” cluster:

New-AzKustoDatabase -ResourceGroupName myResourceGroup -ClusterName myadxcluster -Name myadxdatabase -Kind ReadWrite -Location 'East US'

This command will create a new database with the specified name within the ADX cluster.
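If you want to double-check the database is there before moving on, Get-AzKustoDatabase (same Az.Kusto module) will list what’s in the cluster:

# List the databases in the cluster – myadxdatabase should show up here
Get-AzKustoDatabase -ResourceGroupName myResourceGroup -ClusterName myadxcluster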

Step 4: Configure data ingestion
Once you have created a database, you can configure data ingestion. A small confession here: as far as I know there is no Az PowerShell cmdlet for creating an ingestion mapping or doing a one-off file ingestion – those are Kusto management commands that you run against the database (from the Azure Data Explorer web UI or any Kusto client). The Az.Kusto cmdlet you will reach for is New-AzKustoDataConnection, and that is for continuous ingestion from Event Hubs, Event Grid or IoT Hub rather than one-off files.

Here’s an example of the management commands that create a table and do a one-time ingestion of the file that Microsoft has provided for learning KQL and also Azure Data Explorer ( https://kustosamples.blob.core.windows.net/samplefiles/StormEvents.csv ) – the schema below is the one used in the Microsoft quickstart:

.create table StormEvents (StartTime: datetime, EndTime: datetime, EpisodeId: int, EventId: int, State: string, EventType: string, InjuriesDirect: int, InjuriesIndirect: int, DeathsDirect: int, DeathsIndirect: int, DamageProperty: int, DamageCrops: int, Source: string, BeginLocation: string, EndLocation: string, BeginLat: real, BeginLon: real, EndLat: real, EndLon: real, EpisodeNarrative: string, EventNarrative: string, StormSummary: dynamic)

.ingest into table StormEvents ('https://kustosamples.blob.core.windows.net/samplefiles/StormEvents.csv') with (ignoreFirstRecord=true)

The first command creates the table, and the second does a one-time ingestion of the CSV straight from blob storage, skipping the header row.

Step 5: Verify your deployment.
Once you have completed the above steps, you can verify your Azure Data Explorer deployment by running queries and analyzing data in the ADX cluster. You can use tools like Azure Data Studio, which provides a graphical user interface for querying and analyzing data in ADX.
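You can also do a quick sanity check from PowerShell itself. A small sketch using the names from the steps above – the Select-Object is just to keep the output tidy, and the exact property names can vary a little between Az.Kusto versions:

# Check that the cluster is up and running
Get-AzKustoCluster -ResourceGroupName myResourceGroup -Name myadxcluster | Select-Object Name, State, Uri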

Using PowerShell is fairly easy – personally I prefer terraform – but that’s only because I am a fully cloud person on all the clouds…

But that could be another blog post

Yip.

KQL Series – some DevOps things: Provisioning using Azure CLI

I can’t really write about provisioning anything in Azure without mentioning Azure CLI.
My last two posts were about using terraform and bicep.

Here we will be using the Azure CLI.

Step 1: Install and configure the Azure CLI
To use the Azure CLI, you first need to install it on your local machine. The Azure CLI can be installed on Windows, macOS, or Linux, and detailed instructions can be found in the Azure documentation.

Once the Azure CLI is installed, you need to log in using the az login command. This command will prompt you to enter your Azure credentials to authenticate with the Azure portal.
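If you have more than one subscription (most of us do), it’s worth making sure the CLI is pointed at the right one before you create anything:

# Select the subscription to use – the name is just a placeholder
az account set --subscription "My Subscription Name"
az account show --output table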

Step 2: Create an Azure Data Explorer cluster
To create an ADX cluster, you can use the az kusto cluster create command. Here’s an example command that creates an ADX cluster named “myadxcluster” in the “East US” region with a “D13_v2” SKU and two nodes. Note that the az kusto commands come from the kusto extension, and depending on the version you have installed the SKU arguments may look slightly different – run az kusto cluster create --help to check:

az kusto cluster create --name myadxcluster --resource-group myResourceGroup --location eastus --sku D13_v2 --capacity 2

This command will create an ADX cluster with the specified name, location, SKU, and node capacity. You can customize these settings to fit your needs.

Step 3: Create an Azure Data Explorer database
After creating an ADX cluster, you can create a database within the cluster using the az kusto database create command.

Here’s an example command that creates a database named “myadxdatabase” within the “myadxcluster” cluster:

az kusto database create --cluster-name myadxcluster --name myadxdatabase --resource-group myResourceGroup

This command will create a new database with the specified name within the ADX cluster.

Step 4: Configure data ingestion
Once you have created a database, you can configure data ingestion using the Azure Data Explorer web UI or the Azure CLI. A quick honesty check though: as far as I know there is no az kusto command for one-off file ingestion – that is done with Kusto management commands (a one-time .ingest into command) or a tool like LightIngest. What the CLI is good for is setting up continuous ingestion via a data connection, using the az kusto data-connection commands from the kusto extension.

Here’s an example that creates an Event Hub data connection – treat it as a sketch and check az kusto data-connection event-hub create --help for the exact parameter names in your extension version. The table and ingestion mapping it refers to must already exist in the database (they are created with KQL management commands):

az kusto data-connection event-hub create --resource-group myResourceGroup --cluster-name myadxcluster --database-name myadxdatabase --data-connection-name mydataconnection --consumer-group '$Default' --event-hub-resource-id "<event-hub-resource-id>" --table-name mytable --data-format csv --mapping-rule-name mytablemapping

This command will create a data connection named “mydataconnection” that continuously ingests CSV messages from the Event Hub into the specified table, using the named ingestion mapping.

Step 5: Verify your deployment
Once you have completed the above steps, you can verify your Azure Data Explorer deployment by running queries and analyzing data in the ADX cluster. You can use tools like Azure Data Studio, which provides a graphical user interface for querying and analyzing data in ADX.
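A quick check from the CLI itself also works. This sketch just confirms the database exists on the cluster we created above – depending on which version of the kusto extension you have, the database parameter may be --name or --database-name, so check --help if it complains:

az kusto database show --cluster-name myadxcluster --name myadxdatabase --resource-group myResourceGroup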

Provisioning Azure Data Explorer using the Azure CLI is a pretty simple and straightforward process.

Yip.

KQL Series – some DevOps things: Provisioning using bicep

So in my last post I wrote about how to provision Azure Data Explorer using terraform.

In this post I will use bicep to provision Azure Data Explorer.

Steps:

Step 1: Set up your environment
Before you can begin using Bicep to provision Azure Data Explorer, you need to set up your environment. This involves installing the Azure CLI and Bicep. You’ll also need to create an Azure account and set up authentication.

Step 2: Define your infrastructure as code
Once you have your environment set up, you can begin defining your infrastructure as code using Bicep. This involves writing code that defines the resources you want to provision, such as Azure Data Explorer clusters, databases, and data ingestion rules.

Here’s an example of a Bicep file that provisions an Azure Data Explorer cluster and database:

param location string
param clusterName string
param capacity int

resource cluster 'Microsoft.Kusto/clusters@2022-02-01' = {
  name: clusterName
  location: location
  sku: {
    name: 'Standard_D13_v2'
    tier: 'Standard'
    capacity: capacity
  }
}

resource db 'Microsoft.Kusto/clusters/databases@2022-02-01' = {
  name: 'my-kusto-database'
  parent: cluster
  location: location
  kind: 'ReadWrite'
}

In this code, we declare three parameters: the location, the cluster name, and the cluster capacity. We then define an Azure Data Explorer cluster using the Microsoft.Kusto/clusters resource, specifying the name, location, and SKU (name, tier, and capacity). Finally, we define a database using the Microsoft.Kusto/clusters/databases resource, specifying the name, kind, location, and the parent cluster – the parent reference also gives Bicep the dependency, so no explicit dependsOn is needed.

Step 3: Deploy your infrastructure
Now that you have defined your infrastructure as code, you can deploy it using the Azure CLI. First, run the az login command to authenticate with Azure. Then, run the following commands to create a new resource group, build the Bicep file, and deploy the infrastructure:

az group create --name my-resource-group --location westus2
az deployment group create --resource-group my-resource-group --template-file main.bicep --parameters location=westus2 clusterName=my-kusto-cluster capacity=2

This will create a new resource group, build the Bicep file, and deploy the Azure Data Explorer cluster and database.

Step 4: Test and monitor your deployment
Once your infrastructure is deployed, you should test and monitor it to ensure it is working as expected. This may involve running queries on your Azure Data Explorer cluster, monitoring data ingestion rates, and analyzing performance metrics.
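One quick way to check everything landed is to list the databases on the new cluster from PowerShell – this sketch assumes the Az.Kusto module is installed and uses the names from the example above:

# List the databases on the newly deployed cluster – my-kusto-database should be there
Get-AzKustoDatabase -ResourceGroupName my-resource-group -ClusterName my-kusto-cluster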

Using Bicep to provision Azure Data Explorer offers many benefits, including faster deployment times, greater reliability, and improved scalability. By automating the provisioning process, you can focus on more important tasks, such as analyzing your data and gaining insights into your business.

Bicep is a powerful tool that can simplify the process of provisioning Azure Data Explorer. By following the steps outlined in this blog post, you can quickly and easily set up an Azure Data Explorer cluster.

Yip.

KQL Series – some DevOps things: Provisioning using terraform

If you’ve read my blogs before or seen me speak you’ll know that I love the DevOps.

Provisioning Azure Data Explorer can be a complex task, involving multiple steps and configurations. However, with the help of DevOps methodologies like infrastructure as code we can spin up an Azure Data Explorer cluster in no time.

I like to use terraform, an open-source infrastructure-as-code tool with which we can automate the entire provisioning process, making it faster, more reliable, and less error-prone.

In this blog post, I discuss how to use Terraform to provision Azure Data Explorer, step-by-step.

Step 1: Set up your environment
Before you can begin using Terraform to provision Azure Data Explorer, you need to set up your environment. This involves installing the Azure CLI, Terraform, and other necessary tools. You’ll also need to create an Azure account and set up authentication.

Step 2: Define your infrastructure as code
Once you have your environment set up, you can begin defining your infrastructure as code using Terraform. This involves writing code that defines the resources you want to provision, such as Azure Data Explorer clusters, databases, and data ingestion rules.

Step 3: Initialize your Terraform project
After defining your infrastructure as code, you need to initialize your Terraform project by running the ‘terraform init’ command. This will download any necessary plugins and modules and prepare your project for deployment.

Step 4: Deploy your infrastructure
Now that your Terraform project is initialized, you can deploy your infrastructure by running the ‘terraform apply’ command. This will provision all the resources defined in your code and configure them according to your specifications.

Step 5: Test and monitor your deployment
Once your infrastructure is deployed, you should test and monitor it to ensure it is working as expected. This may involve running queries on your Azure Data Explorer cluster, monitoring data ingestion rates, and analyzing performance metrics.

Here’s an example Terraform configuration that provisions a resource group, a cluster, and a database:

provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "example" {
  name     = "my-resource-group"
  location = "West US 2"
}

resource "azurerm_kusto_cluster" "example" {
  name                = "my-kusto-cluster"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name
  sku                 = "D13_v2"
  capacity            = 2
}

resource "azurerm_kusto_database" "example" {
  name                = "my-kusto-database"
  resource_group_name = azurerm_resource_group.example.name
  cluster_name        = azurerm_kusto_cluster.example.name
}

In this code, we first declare the Azure provider and define a resource group. We then define an Azure Data Explorer cluster using the azurerm_kusto_cluster resource, specifying the name, location, resource group, and a sku block with the SKU name and capacity. Finally, we define a database using the azurerm_kusto_database resource, specifying the name, resource group, location, and the name of the cluster it belongs to.

Once you have this code in a .tf file, you can use the Terraform CLI to initialize your project, authenticate with Azure, and deploy your infrastructure.
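The typical flow looks like this – run it from the folder containing the .tf file, and it assumes Terraform and the Azure CLI are installed and that you have already run az login:

terraform init          # downloads the azurerm provider
terraform plan -out=tfplan   # shows what will be created before you commit to it
terraform apply tfplan       # provisions the resource group, cluster, and database

When you’re done experimenting, terraform destroy will tear it all down again – Azure Data Explorer clusters are not something you want to leave running by accident.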

Using Terraform to provision Azure Data Explorer offers many benefits, including faster deployment times, greater reliability, and improved scalability. By automating the provisioning process, you can focus on more important tasks, such as analyzing your data and gaining insights into your business.

Terraform is a powerful tool that can simplify the process of provisioning Azure Data Explorer. By following the steps outlined in this blog post, you can quickly and easily set up a scalable and efficient data analytics solution that can handle even the largest data volumes.

Yip.

KQL Series – ingesting data into Azure Data Explorer from IoT Hub

This blog post is rare because I am dedicating it to my good mate Bryn Lewis, who spoke recently at a community event I ran – the #makeStuffGo conference (the first free community event in Christchurch since June 2019):
https://makestuffgo.moosh.co.nz/

Bryn did a session on IoT and it got me thinking – how could I ingest data into Azure Data Explorer from IoT Hub…?

So first of all, what is this IoT thing?

The Internet of Things (IoT) has revolutionized the way we interact with the world around us. From smart homes to industrial automation, IoT devices generate an enormous amount of data. To make sense of this data, we need powerful tools for analysis and visualization. Azure Data Explorer is a cloud-based analytics service that can help you store and analyze large volumes of diverse data in real-time. In this blog post, we’ll explore how to ingest data from IoT Hub into Azure Data Explorer.

What is IoT Hub?

Azure IoT Hub is a cloud-based service that enables you to connect, monitor, and manage IoT devices. IoT Hub provides a secure and scalable platform for IoT device management and data collection. It can handle millions of connected devices and billions of messages per day. IoT Hub supports a range of protocols, including MQTT, HTTPS, and AMQP, allowing you to connect a wide variety of devices.

Why Ingest Data from IoT Hub into Azure Data Explorer?

IoT Hub can collect and store data from millions of IoT devices, but analyzing this data in real-time can be challenging. Azure Data Explorer provides a powerful platform for real-time data analysis, allowing you to gain insights into your IoT data quickly. By ingesting data from IoT Hub into Azure Data Explorer, you can:

  • Analyze data in real-time: Azure Data Explorer can process data in real-time, allowing you to gain insights into your IoT data as it is generated.
  • Store data at scale: Azure Data Explorer is a cloud-based service that can store and analyze large volumes of data. You can store data from IoT Hub in Azure Data Explorer and analyze it at any scale.
  • Simplify data analysis: Azure Data Explorer provides a range of powerful analytical tools that can help you gain insights into your IoT data quickly. You can use these tools to identify patterns and anomalies in your data, detect trends over time, and more.

To ingest data from IoT Hub into Azure Data Explorer, we can follow these steps:

  1. Create an IoT Hub: We can create an IoT Hub in the Azure portal or using the Azure CLI. Once we’ve created it, we can connect our IoT devices to it.
  2. Configure IoT devices to send data to IoT Hub: You need to configure your IoT devices to send telemetry to IoT Hub. You can use any of the protocols supported by IoT Hub, including MQTT, HTTPS, and AMQP.
  3. Add a consumer group to the IoT Hub’s built-in endpoint: IoT Hub exposes an Event Hub-compatible built-in endpoint, and this is what Azure Data Explorer reads from. Add a dedicated consumer group for Azure Data Explorer to use.
  4. Create an Azure Data Explorer cluster: You can create an Azure Data Explorer cluster in the Azure portal or using the Azure CLI.
  5. Create a database and table in Azure Data Explorer: You can create a database and table using the Azure Data Explorer Web UI, Azure CLI, or Azure PowerShell. The table should have the same structure as the data you want to ingest from IoT Hub.
  6. Create an ingestion mapping: Using KQL management commands, create an ingestion mapping (for example a JSON mapping) on the table so that incoming messages land in the right columns.
  7. Create an IoT Hub data connection: In Azure Data Explorer, create a data connection of type IoT Hub that ties together the IoT Hub, the consumer group, the target table, and the mapping. You can do this in the Azure portal, the Web UI, or with Azure PowerShell.
  8. Start sending data: Once the data connection exists, Azure Data Explorer automatically ingests new messages from IoT Hub and stores them in the specified table.

Here is an example PowerShell script that adds a dedicated consumer group to the IoT Hub’s built-in Event Hub-compatible endpoint and then creates the IoT Hub data connection in Azure Data Explorer. Treat it as a sketch – the cmdlets are Add-AzIotHubEventHubConsumerGroup (Az.IotHub) and New-AzKustoDataConnection (Az.Kusto), and parameter names can vary a little between module versions, so check Get-Help if it complains:

# Set variables – replace these with your own values (they are just examples)
$resourceGroupName = "MyResourceGroup"
$dataExplorerClusterName = "MyDataExplorerCluster"
$databaseName = "MyDatabase"
$tableName = "MyTable"
$mappingName = "MyTableMapping"          # JSON ingestion mapping that already exists on the table
$iotHubName = "MyIotHub"
$consumerGroupName = "adx-consumer-group"
$dataConnectionName = "MyIotHubDataConnection"
$location = "East US"

# Add a dedicated consumer group to the IoT Hub's built-in Event Hub-compatible endpoint (Az.IotHub module)
Add-AzIotHubEventHubConsumerGroup `
  -ResourceGroupName $resourceGroupName `
  -Name $iotHubName `
  -EventHubConsumerGroupName $consumerGroupName

# Grab the IoT Hub resource id – the data connection needs it
$iotHubResourceId = (Get-AzIotHub -ResourceGroupName $resourceGroupName -Name $iotHubName).Id

# Create the IoT Hub data connection in Azure Data Explorer (Az.Kusto module)
# The shared access policy needs service connect permissions – iothubowner works for a demo
New-AzKustoDataConnection `
  -ResourceGroupName $resourceGroupName `
  -ClusterName $dataExplorerClusterName `
  -DatabaseName $databaseName `
  -DataConnectionName $dataConnectionName `
  -Location $location `
  -Kind "IotHub" `
  -IotHubResourceId $iotHubResourceId `
  -SharedAccessPolicyName "iothubowner" `
  -ConsumerGroup $consumerGroupName `
  -TableName $tableName `
  -MappingRuleName $mappingName `
  -DataFormat "JSON"

This script uses the Add-AzIotHubEventHubConsumerGroup cmdlet to create the consumer group on the IoT Hub and the New-AzKustoDataConnection cmdlet to create the IoT Hub data connection in Azure Data Explorer.

Make sure to replace the variable values with your own values before running the script, and make sure the target table and its ingestion mapping already exist in the database (they are created with KQL management commands) – the data connection simply wires the IoT Hub, consumer group, table, and mapping together.
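Once the script has run, you can check the data connection exists with Get-AzKustoDataConnection (also in the Az.Kusto module), using the same variables as above:

# List the data connections on the database – the new IoT Hub connection should show up here
Get-AzKustoDataConnection -ResourceGroupName $resourceGroupName -ClusterName $dataExplorerClusterName -DatabaseName $databaseName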

You can learn a whole heap more here:

https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-iot-hub

KQL Series – ingesting data via Event Grid into Azure Data Explorer

This blog post is about using event grid to ingest data into Azure Data Explorer and was a method I had to use with a client.

It was awesome because it forced me to write some C# code for an Azure function – so be nice and don’t judge the code. Your code is always better than mine….

To ingest data from Event Grid into Azure Data Explorer, we can follow these steps:

  1. Create an Event Grid subscription: You can create an Event Grid subscription to subscribe to the events we want to ingest into Azure Data Explorer. When an event is published to the Event Grid topic, Event Grid sends a notification to our subscription.
  2. Create an Azure Function: We can create an Azure Function that triggers when an event is published to the Event Grid topic. The function will receive the event data as input.
  3. Prepare the event data for ingestion: In the Azure Function, we can prepare the event data for ingestion into Azure Data Explorer. This may include parsing the event data and transforming it into a format that can be ingested by Azure Data Explorer.
  4. Ingest the event data into Azure Data Explorer: Using the Azure Data Explorer .NET SDK, we can ingest the event data into Azure Data Explorer. We can use the SDK to create a table and ingest the data into that table.

Here’s an example Azure Function that ingests Event Grid data into Azure Data Explorer:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Kusto.Data;
using Kusto.Data.Common;
using Kusto.Data.Net.Client;

// run.csx-style function – in a compiled project you would add the [FunctionName]
// and [EventGridTrigger] attributes to wire this up
public static async Task Run(EventGridEvent eventGridEvent, ILogger log)
{
    // Get the data from the event
    var eventData = (dynamic)eventGridEvent.Data;

    // Prepare the data for ingestion into Azure Data Explorer
    var timestamp = DateTime.UtcNow;
    var value = eventData.Value;

    // Connect to Azure Data Explorer using an Azure AD application (service principal)
    var databaseName = "<databasename>";
    var kustoConnectionStringBuilder = new KustoConnectionStringBuilder("https://<clustername>.<region>.kusto.windows.net", databaseName)
        .WithAadApplicationKeyAuthentication("<applicationId>", "<applicationKey>", "<tenantId>");

    // Management (control) commands go through the admin provider, not the query provider
    using (var kustoClient = KustoClientFactory.CreateCslAdminProvider(kustoConnectionStringBuilder))
    {
        // Create the table if it does not exist yet, then append the single record
        var createTable = ".create-merge table <tablename> (Timestamp: datetime, Value: real)";
        var ingest = $".set-or-append <tablename> <| print Timestamp = datetime({timestamp:o}), Value = toreal({value})";

        await kustoClient.ExecuteControlCommandAsync(databaseName, createTable, new ClientRequestProperties());
        await kustoClient.ExecuteControlCommandAsync(databaseName, ingest, new ClientRequestProperties());
    }

    log.LogInformation($"Data ingested into Azure Data Explorer: {timestamp} {value}");
}

Replace <clustername>, <region>, <databasename>, <applicationId>, <applicationKey>, <tenantId>, and <tablename> with your own values.

In my example I am using an Azure AD application key for authentication. Alternatively, you could use Azure AD user authentication or a managed identity for authentication.

You can find out more information here:

https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-event-grid?tabs=adx

Yip.