Today I needed to migrate an Azure Logic App from a test environment into our project’s Resource Group and connect it to Blob Storage using a Managed Identity. Since we already have pipelines in place, with a nice little Bicep script for all the infrastructure, I wanted the Logic App to be part of this deployment process as well.
Here are the things that needed solving:
- Deploying the Logic App with environment-specific parameters
- Making it easy to update the Logic App definition without having to touch the Bicep file
- Connecting the Logic App to Azure Blob Storage using a Managed Identity

Continue reading Deploying Azure Logic Apps + Managed Identity with Bicep
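A minimal Bicep sketch of such a deployment could look like the following (the resource name, parameter names, and loading the definition via `loadJsonContent` are assumptions for illustration, not the exact setup from the post):

```bicep
param environment string

// Hypothetical: the workflow definition lives in a separate JSON file,
// so it can be updated without touching this Bicep file.
var definition = loadJsonContent('logicapp-definition.json')

resource logicApp 'Microsoft.Logic/workflows@2019-05-01' = {
  name: 'logic-myapp-${environment}'
  location: resourceGroup().location
  identity: {
    type: 'SystemAssigned' // Managed Identity later granted access to Blob Storage
  }
  properties: {
    definition: definition
  }
}
```

The system-assigned identity then needs a role assignment (e.g. Storage Blob Data Contributor) on the target storage account so the Blob connector can authenticate without secrets.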
Recently I had the requirement of retrieving the newest entry in an Azure Table Storage table. While there’s of course the Timestamp column that could be used to identify the desired row, we would need to query the whole table to find the youngest date. This could be narrowed down, for example by knowing on which day a record was added and filtering accordingly, but we’d probably still get multiple rows to check manually for the newest one.
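A common workaround (a sketch of one possible approach, not necessarily the exact solution from the post) is to store a reverse-chronological RowKey, so that the newest entity always sorts first within its partition and can be fetched with a page size of 1:

```csharp
using System;

public static class ReverseRowKey
{
    // Entities in a partition are sorted ascending by RowKey, so a
    // "reversed ticks" key makes the newest entity the first query result.
    // "d19" pads to a fixed width so lexicographic order matches numeric order.
    public static string FromTimestamp(DateTimeOffset timestamp) =>
        (DateTimeOffset.MaxValue.UtcTicks - timestamp.UtcTicks).ToString("d19");
}
```

With Azure.Data.Tables this allows something like `tableClient.Query<TableEntity>(maxPerPage: 1).First()` to return the newest row without scanning the whole table.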
Continue reading Get newest entry in Azure Table Storage
In this post we will see how to efficiently delete all rows in an Azure Table Storage table using the new(ish) Azure.Data.Tables SDK. We’ll try to optimize for speed while also being mindful about memory usage.
Note: When deleting a lot of data from Azure Table Storage, usually the fastest way is to just drop the whole table. However, we cannot be sure when exactly we’re able to create a new table with the same name, since it can take up to a minute or even longer for Azure to actually get rid of the table.
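The core of such an approach can be sketched as follows (an assumption of how it might look, not the post’s exact code): query only the PartitionKey and RowKey columns to keep memory low, group the keys into transactions of at most 100 operations per partition, and submit each batch as delete actions via `TableClient.SubmitTransactionAsync`. The grouping step is pure logic:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class TableCleaner
{
    // A table transaction may contain at most 100 operations and must
    // stay within a single partition, so keys are grouped accordingly.
    // Each resulting batch would be turned into TableTransactionActions
    // of type Delete and passed to TableClient.SubmitTransactionAsync.
    public static List<List<(string Pk, string Rk)>> ToDeleteBatches(
        IEnumerable<(string Pk, string Rk)> keys) =>
        keys.GroupBy(k => k.Pk)
            .SelectMany(g => g.Chunk(100).Select(c => c.ToList()))
            .ToList();
}
```

Batches for different partitions are independent, so they can be submitted in parallel for additional speed.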
Continue reading Deleting all Rows from Azure Table Storage (as fast as possible)
When working with Azure Table Storage and Application Insights you might have noticed a lot of dependency errors, logging a 409 Conflict event every time the CreateIfNotExists() method is called to make sure a table is available.
This is because the Azure.Data.Tables SDK will simply try to create a new table and hide the error if the table already exists. While you don’t notice this in the code, by default Application Insights will catch it and clutter your logs with errors you probably don’t want.
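One way to suppress these is a custom telemetry processor that drops the expected 409 dependency calls before they reach the logs. A minimal sketch (the exact `ResultCode` match, and whether to additionally filter on the dependency `Type`, are assumptions worth verifying against your own telemetry):

```csharp
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

public class TableConflictFilter : ITelemetryProcessor
{
    private readonly ITelemetryProcessor _next;

    public TableConflictFilter(ITelemetryProcessor next) => _next = next;

    public void Process(ITelemetry item)
    {
        // Swallow the 409 Conflict produced by CreateIfNotExists();
        // everything else continues down the processor chain.
        if (item is DependencyTelemetry dependency && dependency.ResultCode == "409")
        {
            return;
        }

        _next.Process(item);
    }
}
```

In ASP.NET Core the processor can be registered with `services.AddApplicationInsightsTelemetryProcessor<TableConflictFilter>();`.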
Continue reading Application Insights & Azure Table Storage CreateIfNotExists Errors
Recently I wanted to migrate the documentation of my C# WordPress library WordPressPCL from the GitHub wiki to something more flexible. With version 2 of the library coming soon, the idea was to move all the docs into a static site hosted on GitHub Pages and managed in Markdown files. Luckily, with mkdocs it’s pretty straightforward to get this up and running quickly.
Here’s what I wanted:
- Docs as markdown files in main repository with readthedocs theme
- Automatic build of the static website using GitHub Actions
- Deployment of static website to GitHub Pages

Continue reading Documentation in GitHub Pages with mkdocs & readthedocs Theme
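A minimal mkdocs.yml for this kind of setup might look like the following (the site name and nav entries are placeholders, not the actual WordPressPCL configuration):

```yaml
# mkdocs.yml in the repository root
site_name: WordPressPCL Documentation   # placeholder title
docs_dir: docs                          # Markdown files live here
theme: readthedocs                      # built-in readthedocs theme
nav:
  - Home: index.md
  - Getting Started: getting-started.md # hypothetical pages
```

Running `mkdocs build` produces the static site in `site/`; from a GitHub Actions workflow, `mkdocs gh-deploy --force` can publish it to the `gh-pages` branch that GitHub Pages serves.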
When working with Azure Table Storage (ATS) in C# / .NET there are currently at least four NuGet packages offered by Microsoft for working with tables. It gets even more complicated, as there’s an Azure.Storage SDK (currently in version 12) that works with all Azure Storage Account related services, with the exception of Table Storage. Additionally, Microsoft mostly talks about the CosmosDB Table API, but all libraries also work with regular ATS, since the APIs are identical.
So what are our options and which one should I choose?
Continue reading Which Azure Table Storage .NET SDK should I use?
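For reference only (and not necessarily the post’s conclusion), basic usage of the newest of those packages, Azure.Data.Tables, looks roughly like this:

```csharp
using Azure.Data.Tables;

// Placeholder connection string; in practice this comes from configuration.
var connectionString = "UseDevelopmentStorage=true";

var tableClient = new TableClient(connectionString, "mytable");
tableClient.CreateIfNotExists();

// TableEntity is the SDK's dictionary-style entity type.
tableClient.AddEntity(new TableEntity("partition1", "row1")
{
    { "Product", "Marker" },
    { "Price", 5.0 }
});
```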
Using the Static website feature of an Azure Storage Account is a cheap and convenient way to quickly deploy an Angular app (or any kind of SPA for that matter). And since I’m creating such a pipeline on a fairly regular basis, I thought it might come in handy to share my default approach for doing that.
Continue reading Deploying Angular to Azure BLOB Storage using Azure DevOps pipelines
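The upload step of such a pipeline might be sketched like this (the service connection, storage account name, and output folder are placeholders, not the post’s exact values):

```yaml
steps:
  - script: npm ci && npx ng build --configuration="production"
    displayName: Build Angular app
  - task: AzureCLI@2
    displayName: Upload build output to the $web container
    inputs:
      azureSubscription: my-service-connection   # placeholder
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        az storage blob upload-batch \
          --account-name mystorageaccount \
          --destination '$web' \
          --source dist/my-app \
          --overwrite
```

The `$web` container is the special container Azure serves static websites from, which is why it is the upload destination.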
In many CI/CD scenarios it’s necessary to adjust the build, test or deployment process depending on which Git branch has triggered the pipeline. In our case we automatically build lots of Angular apps with their desired target environment, e.g.
ng build --configuration="production"
or
ng build --configuration="staging"
depending on where the artifact should be deployed afterwards. This can be achieved by adding some YAML that looks like this:
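One possible sketch of that YAML, using conditional variable insertion (the variable name and branch names are assumptions):

```yaml
variables:
  # Pick the Angular configuration based on the branch that triggered the run.
  ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
    buildConfiguration: production
  ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/staging') }}:
    buildConfiguration: staging

steps:
  - script: npx ng build --configuration="$(buildConfiguration)"
    displayName: Build Angular app
```

The `${{ if … }}` expressions are evaluated at template-compile time, so the variable is fixed before any step runs.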
Continue reading Branch Name as Variable in Azure DevOps Pipelines with YAML
One of the major reasons for utilizing an Azure IoT Edge Gateway is to connect devices to the internet that can’t directly establish a connection themselves. A very common case is Bluetooth beacons, which can provide sensor data through a Bluetooth Low Energy (BLE) connection but are not able to send this information directly to an Azure IoT Hub.
For this scenario a bridge or gateway is required to create an IoT message from the Bluetooth payload. This is called Protocol Translation and can be achieved by creating a custom IoT Edge Module. Interestingly enough Microsoft has still not managed to provide a sample or best practice on how to do this with the IoT Edge Runtime V2, so here’s how we did it in our latest IoT project.
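Heavily simplified, the core of such a custom module boils down to creating a ModuleClient from the Edge environment and forwarding translated messages upstream (a sketch: the BLE payload is stubbed out, and the output name is an assumption):

```csharp
using System.Text;
using Microsoft.Azure.Devices.Client;

// Inside a custom IoT Edge module:
var moduleClient = await ModuleClient.CreateFromEnvironmentAsync();
await moduleClient.OpenAsync();

// Hypothetical payload obtained from a BLE beacon scan.
byte[] blePayload = Encoding.UTF8.GetBytes("{\"temperature\": 21.5}");

// Protocol translation: wrap the BLE payload in an IoT message
// and route it upstream through the module output "output1".
await moduleClient.SendEventAsync("output1", new Message(blePayload));
```

The deployment manifest then routes `output1` to `$upstream` so the messages reach the IoT Hub.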
Continue reading Bluetooth Low Energy BLE devices with Azure IoT Edge
In a lot of IoT solutions, gateways play an important role. They can help connect downstream devices to the Cloud through Access Points, provide offline caching capabilities, or translate protocols that are not suitable for direct internet access, like Bluetooth. With Azure IoT Edge, Microsoft has a great product for these kinds of devices, which allows you to deploy custom Docker modules for different kinds of tasks, like evaluating data on the Edge before sending everything to a connected Azure IoT Hub.
Continue reading Azure IoT Edge Identity Translation: Getting Started