
Codit Blog

Posted on Thursday, March 23, 2017 3:15 PM

by Toon Vanhoutte

Within Logic Apps, there is a pay-per-usage pricing model. You get invoiced for every action that is executed, but also for every polling trigger (regardless of whether the trigger is skipped or fired). You can reduce the cost of polling triggers by scheduling Logic Apps to run only during business hours, if your scenario allows this of course.

This blog post describes the detailed steps required to schedule Logic Apps automatically.

Create Active Directory application

Before an Azure AD tenant will allow an application to access the resources it is securing, a service principal must be created in the given tenant. The service principal provides the basis for Azure AD to secure the application's access to resources owned by users from that tenant.  

Let's start with registering an application in Active Directory and a corresponding service principal.

  • Open the Active Directory blade.
  • Go to the Properties tab.
  • Select the Directory ID; you will need it later as the Tenant ID.

  • Go to the App Registrations tab.
  • Click Add.

  • Provide a meaningful name, select Web app / API as the application type and provide a dummy, but valid, URL. Click Create.
  • Copy the Application ID; you will need it later as the OAuth Client ID.
  • Go to the Keys tab.
  • Provide a Description and choose the Expiration setting. Click Save.
  • Copy the key Value; you will need it later as the OAuth Client Secret.

Assign the right permissions

Permissions can be configured at any level: subscription, resource group or resource. They are inherited by default at the lower levels, so specific rights on a resource group also apply to all resources underneath. Just navigate to the desired blade, which in our case is the Logic App itself.

  • Go to the Access Control (IAM) tab.
  • Click Add.

  • First select a role. Take Logic App Operator, as this allows you to read, enable and disable the Logic App.
  • In the Add Users blade, search for the AD application that you just created. Click Select.
  • Click OK.

Compare scheduling services

In this section, I briefly compare three Azure services that allow scheduling.

For the sake of this blog post, I prefer Azure Scheduler, as it allows me to meet the expectations without writing any code. If you need to schedule many Logic Apps, I'd rather look into creating a script that auto-discovers all Logic Apps inside my resource group and enables/disables them all in one go via Azure WebJobs or Azure Automation. If you want to go that way, this simple script can be a starting point:
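A sketch of such a script in Python, against the public Azure Resource Manager REST API. The api-version and URL shapes follow the Microsoft.Logic REST documentation; the helper names are my own, and token acquisition and error handling are kept minimal:

```python
# Sketch: auto-discover all Logic Apps in a resource group and enable or
# disable them in one go, using a service principal with the Logic App
# Operator role (see the previous section).
import requests

ARM = "https://management.azure.com"
API_VERSION = "2016-06-01"

def list_workflows_url(subscription_id, resource_group):
    """URL that lists all Logic Apps (workflows) in a resource group."""
    return (f"{ARM}/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Logic/workflows?api-version={API_VERSION}")

def workflow_action_url(subscription_id, resource_group, workflow, action):
    """URL for the 'enable' or 'disable' action on a single Logic App."""
    return (f"{ARM}/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Logic/workflows/{workflow}/{action}"
            f"?api-version={API_VERSION}")

def set_all_workflows(token, subscription_id, resource_group, action):
    """Enable or disable every Logic App in the given resource group."""
    headers = {"Authorization": f"Bearer {token}"}
    response = requests.get(list_workflows_url(subscription_id, resource_group),
                            headers=headers)
    response.raise_for_status()
    for workflow in response.json().get("value", []):
        requests.post(workflow_action_url(subscription_id, resource_group,
                                          workflow["name"], action),
                      headers=headers).raise_for_status()

# Usage (requires a valid bearer token for the service principal):
# set_all_workflows(token, "<subscription-id>", "<resource-group>", "disable")
```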

Azure Scheduler

  • Add Azure Scheduler to your resource group and click Create.
  • Provide 'logicapp1-enable' as the Job Name.

  • Click Configure Job Collection and select Create New.
  • Type 'logicappschedules' as the Name, choose the appropriate Pricing Tier, select an existing Resource Group and the desired Location. Click OK.
  • Click Configure Action Settings and provide the following information. The URL is derived from the Logic Apps Workflow Management API documentation.

    Action: Https
    Method: Post
    URL: {resourceGroupName}/providers/Microsoft.Logic/workflows/{workflowName}/
  • Click Configure Authentication Settings and select Active Directory OAuth as the Authentication Type. Provide the IDs that you collected in the previous steps: Tenant ID, Client ID and Client Secret. Click OK twice.
  • Click Configure Schedule and configure the schedule to run every weekday at 8:00. Click OK twice.
  • Click Create.
  • Ensure the Logic App is disabled. Then browse to the created scheduler job and click Run Now to give it a try.
  • Consult the History tab to see the outcome of the job execution.
  • Hopefully you now see a successful job history.
  • You can now repeat the previous steps to schedule the Logic App to be disabled every weekday at 18:00.

Check Audit Trail

  • Browse to the Activity Log of the Logic App.
  • You can see that I manually disabled the Logic App and that the Enable operation was executed by the created application identity. Nice visibility!

Feedback to the product team

The documentation can be improved on this subject. It took me a while to figure out the correct URL, as the documentation only provides a part of it. The authentication procedure was also not very clear to me, but thanks to very quick assistance from the product team on Twitter, I got this sample running in 5 minutes!

It would be nice to have a service window available on every Logic App, so you can schedule it to be automatically enabled/disabled. A user experience similar to Azure Scheduler or the CRON expressions from Azure WebJobs would be nice. Are you in favor of this idea? Vote here!


You can always rely on other Azure services, in case Logic Apps does not provide the functionality out-of-the-box. This was a very user-friendly experience of scheduling a Logic App without writing a single line of code!

Hope this post can save you some time and money!


Categories: Azure
written by: Toon Vanhoutte

Posted on Tuesday, March 14, 2017 4:12 PM

by Toon Vanhoutte

Postman is a great and popular tool to test Web APIs. However, a few steps are needed to get it authenticated against Microsoft's standard APIs, such as the Azure Service Management API. This blog post covers two ways to authenticate Postman quickly and easily.

In case you try to access the Azure Service Management API, without any specific authorization, you'll get the following exception: 'Authentication failed.  The 'Authorization' header is missing'.

There are two main ways to authenticate with Azure: using your own Microsoft account or using a Service Principal. Let's have a closer look!

Authenticate with Microsoft account

Use ARMClient

  • Download ARMClient over here.
  • Open Command Prompt or PowerShell.
  • Run the following command: ARMClient.exe login
  • Provide your credentials.
  • Copy the token to the clipboard, via this command:

  • In Postman, add an Authorization header to your HTTP request. As a value, provide 'Bearer', followed by a space and then the token from the clipboard. Send your request and it should work fine!
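For completeness, the same header can also be set from code. A minimal sketch using Python's requests library; the subscription-listing URL in the comment is just an illustrative Azure Resource Manager endpoint:

```python
import requests

def bearer_headers(token: str) -> dict:
    # 'Bearer', a single space, then the raw token - exactly the value
    # you paste into Postman's Authorization header.
    return {"Authorization": f"Bearer {token}"}

# Example call (requires a valid token from ARMClient):
# response = requests.get(
#     "https://management.azure.com/subscriptions?api-version=2016-06-01",
#     headers=bearer_headers(token))
# print(response.status_code)
```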

Use Fiddler

  • Download Fiddler over here.
  • Ensure you configure Fiddler to Decrypt HTTPS traffic.
  • Perform a request in the Azure portal and locate it in Fiddler.
  • Copy the bearer token from the HTTP security header.
  • In Postman, add an Authorization header to your HTTP request. As a value, provide the copied bearer token, including the 'Bearer' prefix. Send your request and you should get access!


Authenticate with Service Principal

Setup a Service Principal

Create an Active Directory application (Service Principal) that represents your Postman instance.

  • Open the Active Directory blade.
  • Go to the Properties tab.
  • Select the Directory ID; you will need it later as the Tenant ID.

  • Go to the App Registrations tab.
  • Click Add.

  • Provide a meaningful name, select Web app / API as the application type and provide a dummy, but valid, URL. Click Create.
  • Copy the Application ID; you will need it later as the Client ID.
  • Go to the Keys tab.
  • Provide a Description and choose the Expiration setting. Click Save.
  • Copy the key Value; you will need it later as the Client Secret.

Permissions can be configured at any level: subscription, resource group or resource. They are inherited by default at the lower levels, so specific rights on a resource group also apply to all resources underneath. Just navigate to the desired blade, which in our case is the Logic App itself.

  • Go to the Access Control (IAM) tab.


  • Give a specific access role to the service principal you just created.

Use ARMClient

  • Download ARMClient over here.
  • Open Command Prompt or PowerShell.
  • Run the following command, where the <placeholders> must be replaced by the values collected above:

  • Copy the token to the clipboard, via this command: 

  • In Postman, add an Authorization header to your HTTP request. As a value, provide 'Bearer', followed by a space and then the token from the clipboard. Send your request and you should be good to go!
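If you'd rather skip ARMClient altogether, the same token can be requested directly from the Azure AD (v1) token endpoint, using the Tenant ID, Client ID and Client Secret collected above. A sketch of the client-credentials request; the helper name is my own:

```python
import requests

def token_request(tenant_id, client_id, client_secret):
    """Build the URL and form payload for an AAD client-credentials token request."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    payload = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # The resource we want a token for: the Azure management API.
        "resource": "https://management.azure.com/",
    }
    return url, payload

# Example (requires real credentials):
# url, payload = token_request("<tenant-id>", "<client-id>", "<client-secret>")
# token = requests.post(url, data=payload).json()["access_token"]
```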


It's quite simple to authenticate Postman against the Azure APIs. You can include these authorization headers as presets, but keep in mind that bearer tokens have an expiration time.

Hope this was interesting! Let me know if you have easier ways to authenticate Postman against Azure APIs!


Categories: Azure
written by: Toon Vanhoutte

Posted on Thursday, March 9, 2017 8:16 AM

by Stijn Degrieck

You most probably know that Microsoft is the world’s largest contributor to the open source community on the popular GitHub platform, no? That’s right. When it comes to sharing code for open development and collaboration, it is leaving behind companies like Facebook, Google and Red Hat. All this is the result of a major strategic shift initiated by Steve Ballmer, and accelerated by Satya Nadella. One that will allow Microsoft to transform to a full-blown Software-as-a-Service company.

In a letter to all employees two years ago, Satya Nadella, who had just been appointed CEO, said: “Our strategy is to build best-in-class platforms and productivity services for a mobile-first, cloud-first world. Our platforms will harmonize the interests of end users, developers and IT better than any competing ecosystem or platform.”

Today, Microsoft is reporting impressive growth for its SaaS solutions. Revenue from its cloud platform, Azure, grew triple digits, with usage of key computing and database workloads more than doubling year-over-year. And embracing Apple and Android is paying off, making its software easily available on all operating systems. (In fact, that's often where you'll find the best Microsoft apps.) Office 365's enterprise user base is also growing quickly. At the end of last year, it was reported to be already twice as popular as Google's G Suite in organizations across Europe. It's a bold move for a company once considered an evil monopolist that perceived open source as an existential threat to its business. As one court order stated: they put up 'technical barriers', making it hard for the competition to work on the Windows operating system. Remember the 'browser wars'?

I’m happy to see Microsoft’s progress and its approach to open source. At Codit, we welcome the transition from a closed Microsoft-only stack to an open Azure platform. It’s the perfect foundation for co-creation with our customers. For instance on projects related to the Internet of Things.  

We have many customers exploring IoT. Usually they have lots of ideas, devices and sensors. But they have no resources, expertise or experience to connect these to the cloud and put their data to work. Enter the Nebulus™ IoT Gateway. You can use it to link any sensor or device to the Microsoft Azure cloud in a couple of minutes, allowing you to connect, capture and control data in real time.

I’m a big fan of co-creation. Most customers have a clear view on what they want. But they need help translating it into specific technology features and functions. That’s where we come in, helping you turn big ideas into new tangible services.

What’s your big idea? We’re listening.

- Stijn Degrieck, CEO Codit

Categories: Opinions
Tags: Azure
written by: Stijn Degrieck

Posted on Monday, March 6, 2017 2:18 PM

by Toon Vanhoutte

Lately, I was working on a proof of concept with Logic Apps. I created a Logic App that looped over all files in a specific folder and processed them. I was curious about the performance, so I copied 10,000 files into the folder. The Logic App kicked off and started processing them. After a while the Logic App ended up in a failed state. Let's have a look at what happened.

Apparently, I stumbled upon a limitation that is documented over here.

The documentation suggests using the query action to filter arrays. It offers the following capabilities:

You can indeed filter an array based on a specific query. However, I did not have an object available that contains the position of an item within the array. I wanted assurance that I would not hit the for-each limit, so this was not an option. If you know a way to get the position within the array, please let me know via the comments section below.

I continued my search within the Logic Apps Workflow Definition Language and found the @take function.  This is what the documentation states:

This did the trick. It takes the first 5000 items from an array. Luckily, you do not get an out-of-range exception if the incoming array contains fewer items!
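Applied to my scenario, the fix is a one-line change in the code view: feed the Foreach a @take over the original array instead of the array itself. A sketch of how this might look; the action and trigger names below are hypothetical:

```json
"For_each_file": {
    "type": "Foreach",
    "foreach": "@take(body('List_files_in_folder'), 5000)",
    "actions": {
        "Process_file": {
            "type": "Http",
            "inputs": {
                "method": "POST",
                "uri": "https://example.org/process",
                "body": "@item()"
            },
            "runAfter": {}
        }
    },
    "runAfter": {}
}
```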

It's always good practice to validate your design upfront against the Logic Apps limits.

Hope this helps!

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte

Posted on Friday, March 3, 2017 2:41 PM

by Toon Vanhoutte

A good architect and a great developer have one thing in common: they are lazy! They design the solution in such a way that they can reuse common components within their solution as much as possible. This applies to any technology, so it does for Logic Apps too. Logic Apps provides many ways to benefit from reusability.

This blog post focuses on consuming existing Logic Apps from within other Logic Apps, also known as nested workflows. It's a nice feature, but as usual: be aware of the caveats!


Let's take the following scenario as a base to start from. Logic App 1 is exposed as a web service. It consumes Logic App 2, which in its turn calls Logic App 3. Logic App 2 and 3 are considered to be reusable building blocks of the solution design. As an example, Logic App 3 puts a request message on a particular queue. Below you can find the outcome of a successful run that finishes within an acceptable timeframe for a web service.

Exception Scenario

If you stick to the above design, you'll discover unpleasant behavior in case you need to cope with failure. Building cloud-based solutions means dealing with failure in your design, even in this basic scenario. Let's simulate an exception in Logic App 3, by trying to put a message on a non-existing queue. As a result, Logic App 1 fails after 6 minutes of processing!

I expected a long delay and potentially a timeout, but those 6 minutes were a real surprise to me. The reason for this behavior is the default retry policy that is applied to Logic Apps. I consulted the documentation and that explains everything. Logic App 1 was fired once. Logic App 2 got retried 4 times, which results in 5 failed instances. The third workflow even got executed 25 (5x5) times.

The retry interval is specified in the ISO 8601 format. Its default value is 20 seconds, which is also the minimum value. The maximum value is 1 hour. The default retry count is 4, which is also the maximum retry count. If the retry policy definition is not specified, a fixed strategy is used with the default retry count and interval values. To disable the retry policy, set its type to None.

Optimize the retry policies

Time to override those default retry policies. For Logic App 1, I do not want any retry in case Logic App 2 fails. This is achieved by updating the code view:

In Logic App 2, I configure the retry policy to retry once after 20 seconds:
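Put together, the two overrides might look as follows in the code view. The action name and workflow resource ID are placeholders; the retryPolicy sits inside the action's inputs. Disabling retries on the call to Logic App 2:

```json
"Logic_App_2": {
    "type": "Workflow",
    "inputs": {
        "host": {
            "triggerName": "manual",
            "workflow": { "id": "/subscriptions/.../workflows/logic-app-2" }
        },
        "retryPolicy": { "type": "none" }
    },
    "runAfter": {}
}
```

And in Logic App 2, the call to Logic App 3 gets a single retry after 20 seconds:

```json
"retryPolicy": { "type": "fixed", "count": 1, "interval": "PT20S" }
```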

The result is acceptable from a timing perspective:

On the other hand, the exception message we receive is completely meaningless.  Check out this post to learn more about exception handling in such a situation.

Implement fire and forget

In the previous examples, we invoked the underlying Logic App in a synchronous way: call the Logic App and only continue once it has completed its processing. For those with a BizTalk background: this is comparable to the Call Orchestration shape. As Logic Apps gives you complete freedom on where to put the Response action in your workflow, you can also go for a fire-and-forget pattern, comparable to the Start Orchestration shape. This can be achieved by placing the Response action right after the Request trigger. This way, these reusable Logic Apps execute independently from their calling process.
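In the code view, the fire-and-forget variant boils down to putting the Response action directly after the Request trigger, before any real work happens. A minimal sketch, with a hypothetical action standing in for the actual processing:

```json
"triggers": {
    "manual": { "type": "Request", "kind": "Http" }
},
"actions": {
    "Response": {
        "type": "Response",
        "inputs": { "statusCode": 202 },
        "runAfter": {}
    },
    "Do_the_actual_work": {
        "type": "Http",
        "inputs": { "method": "POST", "uri": "https://example.org/work" },
        "runAfter": { "Response": [ "Succeeded" ] }
    }
}
```

Returning 202 (Accepted) immediately tells the caller the request was taken in, while the actual processing continues independently.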

This eventual consistency can have an impact on the way user applications are built, and it also requires good operational monitoring in case asynchronous processes fail. Note in the example below that the consuming application is not aware that Logic App 3 failed.

Update: Recently I discovered that it's even possible to leave out the Response action within the nested workflows. Just ensure you update the consuming Logic App action with the following expression: "operationOptions": "DisableAsyncPattern". This is even more fire-and-forget in style and will improve performance a little.

This solution reduces processing dependencies between the reusable Logic Apps. Unfortunately, the design is still not bullet-proof. Under a high load, throttling could occur in the underlying Logic Apps, which could result in time-outs when calling the nested workflows. A more convenient design is to put a Service Bus queue in between. On the other hand, this increases the complexity of development, maintenance and operations. It's important to assess this potential throttling issue within the context of your business case. Is it really worth the effort? It depends on so many factors…


As a final topic, I want to demonstrate that the nested workflows all share a common identifier. The parent workflow has a specific ID for its instance.

This ID appears in every involved Logic App run execution, in the form of a Correlation ID. This ID can be used to link / correlate the Logic App instances with each other.

This ID is handed over to the underlying workflow, via the x-ms-client-tracking-id HTTP Header.

Feedback to the product team

It's fantastic that you get full control over the retry policies. The minimal retry interval of 20 seconds seems quite long to me if you need to deal with a synchronous web service scenario. I also found a nice suggestion to include an exponential back-off retry mechanism. Implementing circuit breakers would also be nice to have!

The monitoring experience for retried instances could be improved. In the Azure portal, they just show up as individual instances. There's no easy way to find out that they are all related to each other. It would be a great feature if all runs with the same Correlation ID were grouped together by default. Like it? Vote here!


Logic Apps nested workflows are very powerful to reuse functionality in a user-friendly way. Think about the location of the Response action within the underlying Logic App, as this greatly impacts the runtime dependencies. Implement fire and forget if your business scenario allows it and consider a queuing system in case you need a scalable solution that must handle a high load.

Thanks for reading!

Categories: Azure
written by: Toon Vanhoutte