
Codit Blog

Posted on Monday, June 18, 2018 2:09 PM

Tom Kerkhove by Tom Kerkhove

Codit is growing, and so is Alfred. Nowadays, we automatically synchronize every Codit employee with our backend using Azure Logic Apps.

To do that, we've built an API app that is capable of querying the Azure AD directory with Azure AD Application authentication; it is now available on GitHub.

Just over a year ago, Codit welcomed Alfred, our personal butler that takes care of our visitors. This has been a side project that I've been working on with Pieter Vandenheede and Wouter Seye, and it has been a ton of fun - if you've missed that, read more about it in this article.

In the past year, we've focused on improving our foundation while starting to experiment with bots.

Here is a brief overview of what we did:

  • Automatically synchronize all employees based on the Codit AD - Fully exposed via an API and orchestrated via Azure Logic Apps.
  • Migration from the Codit Internal Azure subscription to dedicated Azure subscriptions - Allows us to manage all resources as we'd like.
  • Ability to determine how you want to be notified when a visitor arrives - Fully exposed via an API, ready to be consumed.
  • Building more robust release pipelines in VSTS - Fully automated releases with deployment gates, Azure resource creation (ARM), Application Insights annotations and more.

Automatically synchronize all employees based on the Codit AD

When we were building our synchronization workflow, we decided to use Azure Logic Apps as an orchestrator.

We are currently using the following Logic Apps:

  1. Company Debatcher - Triggers a new run per company by passing the company name, such as Codit Managed Services, and a list of users that can be blacklisted. This can be used to filter out users that are being used by applications or for testing purposes.
  2. Company Synchronizer - Retrieves all users from Azure AD for a specific company and filters out the blacklisted ones, if applicable. Once that is done, it triggers a new run for every employee in the result set.
  3. Employee Synchronizer - Imports and configures default notification configuration in the Santiago platform via our Management API
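The filtering step that the Company Synchronizer performs can be sketched outside of Logic Apps as well. Here is a minimal Python sketch; the user dictionary shape (`userPrincipalName`, `companyName`) and all names are illustrative assumptions, not the actual Alfred implementation:

```python
# Minimal sketch of the Company Synchronizer's filtering step, assuming
# each user is a dict with "userPrincipalName" and "companyName" keys.

def filter_employees(users, company, blacklist):
    """Keep only users of the given company that are not blacklisted."""
    blacklisted = {upn.lower() for upn in blacklist}
    return [
        user for user in users
        if user.get("companyName") == company
        and user.get("userPrincipalName", "").lower() not in blacklisted
    ]

users = [
    {"userPrincipalName": "tom@codit.eu", "companyName": "Codit Managed Services"},
    {"userPrincipalName": "svc-app@codit.eu", "companyName": "Codit Managed Services"},
    {"userPrincipalName": "jane@codit.eu", "companyName": "Codit Belgium"},
]

# Only tom@codit.eu remains: correct company and not blacklisted.
result = filter_employees(users, "Codit Managed Services", ["svc-app@codit.eu"])
```

Every user left in the result set then triggers a new run of the Employee Synchronizer.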

Here's a simplified overview:

While building this synchronization flow, we started off using the built-in Azure AD connector. However, it soon became clear that this was not the best fit for us, since it requires Global Administrator permissions.

We asked our Global Administrator if this would be OK, but we decided that using Azure AD Applications to act on behalf of our platform would be a better fit.

Unfortunately, this was not supported at the time of implementation, so we decided to build our own Active Directory "connector".

Introducing Active Directory connector for Logic Apps

It took us a long time, but as of today our Azure Active Directory connector with Azure AD Application support is available on GitHub and ready to be deployed in your subscription.


In order to use our connector, you'll have to create & configure your Azure AD application in Azure AD itself.

The Azure AD Application needs to have Read permissions for the Windows Azure Active Directory API.

Want to learn more? Read our docs.


Given our synchronization flow is centered around users, we only support the following operations for now:

  • Get a list of all users
  • Get a list of all users by company name
  • Get a specific user by user principal name
  • Send telemetry to Azure Application Insights
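To give an idea of what the "get users by company name" operation translates to, here is a sketch of the kind of request such a connector can issue against the classic Azure AD Graph API (graph.windows.net). The tenant name and api-version below are illustrative assumptions; only the URL construction is shown, not the authenticated call:

```python
from urllib.parse import quote

GRAPH_BASE = "https://graph.windows.net"

def users_by_company_url(tenant, company, api_version="1.6"):
    # $filter uses an OData 'eq' clause on companyName; the filter value
    # must be URL-encoded. Tenant and api-version are assumptions here.
    odata_filter = quote(f"companyName eq '{company}'")
    return (f"{GRAPH_BASE}/{tenant}/users"
            f"?api-version={api_version}&$filter={odata_filter}")

url = users_by_company_url("contoso.onmicrosoft.com", "Codit Managed Services")
```

The actual request would carry a bearer token obtained with the Azure AD Application's client credentials, which is exactly what avoids the Global Administrator requirement described above.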

However, this is certainly not the end. If you'd like to request a feature or help us extend this we recommend creating a new issue.


Azure Logic Apps allowed us to very quickly build a synchronization workflow that easily integrates with our API. However, the built-in AD connector that we tried required us to use Global Administrator permissions on our company Active Directory.

We did not want to do that, given the risk involved, and decided to build our own Active Directory connector with Azure AD Application support. This Active Directory connector is now open-source and available on GitHub.

This enabled us to easily deploy and integrate it into our workflow, query our Active Directory with limited capabilities, and still achieve our goal.
Although the component is very limited, we are open to suggestions and accept contributions, but prefer to discuss them in an issue first.

But this is only the beginning: we are working hard to open-source Alfred & Santiago as well, so that you can run your own version of our internal visitor system.

Thanks for reading,


Categories: Azure
written by: Tom Kerkhove

Posted on Wednesday, May 23, 2018 3:58 PM

Toon Vanhoutte by Toon Vanhoutte

What to do if a message is invalid? The default answer is usually to make corrections at the source system and resend the messages. But… What if the source system cannot resend those messages in the right format? What if the source is a very important customer of yours and we're dealing with just-in-time orders? In some scenarios, you need to keep the business running and solve the issue asap on your integration layer. This is where edit-resubmit functionality plays a key role: correct the invalid message and reinject it into your integration engine.

I'm aware that this is a debatable feature and some organizations do not allow it. My opinion is that it can be tolerated under certain circumstances, in order not to block the business. However, there must always be an action to solve the problem at its root; otherwise you'll end up with an employee who edits and resubmits messages as a full-time occupation!

There are some enhancements to the resubmit feature in the Logic Apps release pipeline, but the edit functionality has quite a low priority at the moment, which makes perfect sense in my opinion. In this blog post, I'll show you how you can set up an edit-resubmit process for invalid messages, using standard Logic Apps actions.


In this scenario, we're receiving orders from an FTP server. Each time a new order is placed on the FTP server, it must be validated as a first step. If the order is valid, it can be sent to the ERP system. Otherwise, there should be human intervention to modify the message and resubmit it into the order process.


First of all, we need a message store where business users can modify messages and mark them for resubmit. A SharePoint document library seems a perfect fit for this requirement: it's a well-known environment for business users, there's the opportunity to modify documents online and, by adding a custom column to the document library, we can mark which messages must be resubmitted. Logic Apps has first-class integration with SharePoint, so we're safe!

Second, we need a design that allows us to resubmit messages into the order process. This is done by splitting the process into three separate Logic Apps:

  1. Receive the order from FTP. Decouple the receive protocol, which allows others to be added in the future.
  2. Process the order, which means in our case to validate the order against its schema.
  3. Send the order to the ERP system. Again, decoupling protocol handling from message processing.

In case the message turns out to be invalid, it gets sent to the SharePoint document library. A business user modifies the message and sets the Resubmit column to Yes. Another Logic App is polling the document library, waiting for messages that have Resubmit set to Yes. If there is such a message, it gets received and deleted from the document library, and it's sent again to the Logic App that validates the message.



The Validate Process

The Validate Logic App looks like this. The message is received by the Request trigger. Then, a Validate JSON action performs the message validation. This could also be performed by the XML Validation action from the Integration Account. If the validation succeeds, the Send to ERP Logic App is invoked. If the validation fails, the message gets uploaded to the SharePoint document library via the Create File action. Afterwards, the Logic App run gets terminated with a Success status.
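Outside of Logic Apps, the Validate JSON step boils down to checking a payload against a schema. A minimal sketch of that check in Python - the order schema here is invented purely for illustration, not taken from the post:

```python
import json

# Hypothetical order schema: required top-level properties and their types.
ORDER_SCHEMA = {"orderId": str, "customer": str, "lines": list}

def validate_order(payload):
    """Return a list of validation errors; an empty list means the order is valid."""
    try:
        order = json.loads(payload)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    errors = []
    for prop, expected_type in ORDER_SCHEMA.items():
        if prop not in order:
            errors.append(f"missing property '{prop}'")
        elif not isinstance(order[prop], expected_type):
            errors.append(f"property '{prop}' has wrong type")
    return errors

valid = validate_order('{"orderId": "1", "customer": "Contoso", "lines": []}')
invalid = validate_order('{"orderId": 1}')
```

In the Logic App, an empty error list corresponds to the "validation succeeded" branch; a non-empty one corresponds to the branch that uploads the message to SharePoint.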


The Create File action looks like this:


It's configured with the following Run After setting:


The Document Library

The document library looks as follows. Invalid messages get uploaded here and can be modified by functional key users. By changing the Resubmit column to Yes, the message gets resubmitted!

The Resubmit Process

This process is pretty straightforward. A polling trigger fires when a file is created or modified in the document library. If the file has the Resubmit value set to Yes, its content is received and sent back to the Validate Logic App. After this succeeds, the file gets deleted from the document library.
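The decision logic of this resubmit trigger can be sketched as follows. The "Resubmit" column name matches the post; the shape of the file-properties dictionary and the callback names are assumptions, not the actual SharePoint connector payload:

```python
# Sketch of the Resubmit process: on each created/modified file, check the
# Resubmit column, send the content back to validation, then delete the file.

def should_resubmit(file_properties):
    return file_properties.get("Resubmit") == "Yes"

def process_changed_file(file_properties, fetch_content, send_to_validate, delete_file):
    """Returns True if the file was resubmitted and deleted, False otherwise."""
    if not should_resubmit(file_properties):
        return False
    content = fetch_content(file_properties["Id"])
    send_to_validate(content)           # HTTP POST to the Validate Logic App
    delete_file(file_properties["Id"])  # delete only after the resubmit succeeded
    return True

# Usage with stand-in callbacks:
deleted = []
handled = process_changed_file(
    {"Id": "42", "Resubmit": "Yes"},
    fetch_content=lambda file_id: b"<order/>",
    send_to_validate=lambda content: None,
    delete_file=deleted.append,
)
```

Note the ordering: the file is deleted only after the call to the Validate Logic App succeeds, so a failed resubmit leaves the document in place for another attempt.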



By combining SharePoint and its easy-to-use Logic App connectors, we can easily enrich our integrations with human intervention! This edit/resubmit scenario is just one use case. Think also about approval processes, or tasks that must be completed before an automated process should kick in…


Categories: Azure
written by: Toon Vanhoutte

Posted on Friday, May 18, 2018 1:59 PM

Tom Kerkhove by Tom Kerkhove

GDPR mandates that you make data available to users on their request. In this post I show you how you can use Azure Serverless to achieve this with very little effort.

GDPR is around the corner, mandating that every company serving European customers needs to comply with a lot of additional rules, such as being transparent about what data is being stored, how it is being processed, and more.

One of the most interesting requirements is that you must be able to request what data a company is storing about you and have it made available to you - all of it. A great example is how Google allows you to select the data you want and gives it to you; try it here.

Being inspired by this, I decided to build a similar flow running on Azure and show how easy it is to achieve this.

Consolidating user data with Azure Serverless

In this sample, I'm using a fictitious company called Themis Inc. which provides a web application where users can sign up, create a profile and do awesome things. That application is powered by a big data set of survey information, which is analyzed to see if the company can deliver targeted ads for specific users.

Unfortunately, this means that the company is storing Personal Identifiable Information (PII) for the user profile and the survey results for that user. Both of these datasets need to be consolidated and provided as a download to the user.

For the sake of this sample, we are actually using the StackExchange data set and the web app simply allows me to request all my stored information.

This is a perfect fit for Azure Serverless, where we will combine Azure Data Factory, the unsung serverless hero, with Azure Logic Apps, Azure Event Grid and Azure Data Lake Analytics.

How it all fits together

If we look at the consolidation process, it actually consists of three steps:

  1. Triggering the data consolidation and sending an email to the customer that we are working on it
  2. Consolidating, compressing and making the data available for download
  3. Sending an email to the customer with a link to the data

Here is an overview of how all the pieces fit together:

Azure Logic Apps is a great way to orchestrate steps that are part of your application. Because of this, we are using a Logic App that is in charge of handling new data consolidation requests that were requested by customers in the web app. It will trigger the Data Factory pipeline that is in charge of preparing all the data. After that, it will get basic profile information about the user by calling the Users API and send out an email that the process has started.

The core of this flow is being managed by an Azure Data Factory pipeline which is great to orchestrate one or more data operations that represent a business process. In our case, it will get all the user information from our Azure SQL DB and get all data, related to that specific user, in our big data set that is stored on Azure Data Lake Store. Both data sets are being moved to a container in Azure Blob Storage and compressed after which a new Azure Event Grid event is being published with a link to the data.

To consolidate all the user information from our big data set we are using U-SQL because it allows me to write a very small script and submit this, while Azure Data Lake Analytics runs and looks through your data. This is where Data Lake Analytics shines because you don't need to be a big data expert to use it, it does all the heavy lifting for you by determining how it needs to execute it, scale it, and so on.

Last but not least, a second Logic App is subscribing to our custom Event Grid topic and sends out emails to customers with a link to their data.

By using Azure Event Grid topics, we remove the responsibility of the pipeline to know who should act on its outcome and trigger it. It also makes our current architecture flexible by providing extension points that other processes can use to integrate with it, in case we need to make the process more complex.
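The custom event that the pipeline publishes can be sketched as follows. The `eventType`, `subject` convention and the download URL are illustrative assumptions for the fictitious Themis Inc.; the top-level fields follow Event Grid's custom event schema:

```python
import json
import uuid
from datetime import datetime, timezone

def build_data_ready_event(user_id, download_url):
    """Build a custom Event Grid event announcing that a user's data is ready.
    The eventType and subject naming are our own convention, not mandated."""
    return {
        "id": str(uuid.uuid4()),
        "eventType": "Themis.UserDataConsolidated",
        "subject": f"users/{user_id}",
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "dataVersion": "1.0",
        "data": {"userId": user_id, "downloadUrl": download_url},
    }

# An Event Grid topic accepts a JSON array of such events via an HTTP POST
# (authenticated with an aeg-sas-key header); here we only build the body.
body = json.dumps([build_data_ready_event(
    "123", "https://example.blob.core.windows.net/exports/123.zip")])
```

The second Logic App only needs to subscribe to this topic and read `data.downloadUrl` to compose the email, which is what keeps the pipeline decoupled from its consumers.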

This is not the end

Users can now download their stored data, great! But there is more...

Use an API Gateway

The URLs that are currently exposed by our Logic Apps & Data Factory pipelines are generated by Azure and are tightly coupled to those resources.

As the cloud is constantly changing, this can become a problem when you decide to use another service, or when somebody simply deletes a resource and you need to recreate it, after which it will have a new URL. Azure API Management is a great service for this: it basically shields the backend process from the consumer and acts as an API gateway. This means that if your backend changes, you don't need to update all your consumers - simply update the gateway instead.

Azure Data Factory pipelines can be triggered via HTTP calls but this has to be done via a REST API - Great! The downside is that it is secured via Azure AD which brings some overhead in certain scenarios. Using Azure API Management, you can shield this from your consumers by using an API key and leave the AD authentication up to the API gateway.

User Deletion

GDPR mandates that every platform needs to give users the capability to have all their data deleted on request. To achieve this, a similar approach can be used, or the current process can even be refactored to re-use certain components such as the Logic Apps.


Azure Serverless is a great way to focus on what we need to achieve and not worry about the underlying infrastructure. Another big benefit is that we only pay for what we use. Given that this flow will be used very sporadically, this is perfect, because we don't want to set up infrastructure that needs to be maintained and hosted if it will only be used once a month.

Azure Event Grid makes it easy to decouple our processes during this flow and provide more extension points where there is a need for this.

Personally, I am a fan of Azure Data Factory because it makes it so easy for me as a developer to automate data processes, and it comes with the complete package - code & visual editor, built-in monitoring, etc.

Last but not least, this is a wonderful example of how you can combine both Azure Logic Apps & Azure Data Factory to build automated workflows. While at first they can seem like competitors, they are actually a perfect match - one focuses on application orchestration while the other does data orchestration. You can read more about this here.

Want to see this in action? Attend my "Next Generation of Data Integration with Azure Data Factory" talk at Intelligent Cloud Conference on 29th of May.

In a later post, we will go more into detail on how we can use these components to build this automated flow. Curious to see the details already? Everything will be available on GitHub.

Thanks for reading,


Posted on Tuesday, April 10, 2018 2:48 PM

Sagar Sharma by Sagar Sharma

Calling on-premise hosted web services from Logic Apps is super easy now. Use the on-premise data gateway and a custom connector to achieve this integration.

In this article, I will show you how to connect from Logic Apps to on-premise hosted HTTP endpoints (which have no public access). We will do this by using the Logic Apps on-premise data gateway and a Logic Apps custom connector. This feature was recently made available by the Logic Apps product team. For those who have never used the on-premise data gateway before, please read my previous blog post “Installing and Configuring on-premise data gateway for Logic Apps”, which contains a detailed explanation of the Logic Apps on-premise data gateway.

Part 1: Deploying a webservice on a local machine

You are already familiar with this part:

  • Open visual studio. Create new project>Web>ASP.NET Web Application 

  • I am doing this in the most classical way: Empty template, no authentication and add>new item>Web Service (ASMX). You can do it in your preferred way: REST, MVC Web API or WCF Webservice etc.
  • Write some web method. Again, I am doing it in the easiest way, for example "HelloWorld" with one parameter:

  • Build the web application and deploy it to local IIS.
  • Browse the website and save the full WSDL. You will need this WSDL file in part 2.
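The post builds the service as a C# ASMX web service; purely for illustration, here is an equivalent minimal endpoint in Python's standard library - one HelloWorld method with a single name parameter, like the web method above:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class HelloWorldHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Mirror the ASMX sample: one web method, one 'name' parameter.
        query = parse_qs(urlparse(self.path).query)
        name = query.get("name", ["World"])[0]
        body = f"Hello {name}".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

def serve(port=0):
    """Return a server bound to localhost; call serve_forever() to run it."""
    return HTTPServer(("127.0.0.1", port), HelloWorldHandler)
```

Whatever the implementation, the only thing the custom connector cares about is the service's contract (the WSDL in the SOAP case), which is what you save in the last step above.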

Part 2: Creating a custom connector for the webservice

  • Log on to your Azure subscription where you have the on-premise data gateway registered. Create a resource of type “Logic Apps Custom Connector”.
  • Open the custom connector and click on edit. Choose the API Endpoint as SOAP and Call mode as SOAP to REST, and then browse to upload the WSDL file of your on-premise webservice.

 Please note that if you are trying to access a REST/Swagger/OpenAPI web service, you will need to choose REST as the API endpoint.

  • Don’t forget to select “Connect via on premise data gateway”

  • Click on continue to go to security tab from general tab

  • Again, click on continue as no authentication for this demo
  • In the definition tab, fill in a summary and description. Keep the default values for the rest of the configuration and click on “Update connector” at the top right of the screen.
  • The custom connector is ready to use now.

Part 3: Integrating with an on-premise webservice from Logic Apps

  • Create a new Logic App. Start with Recurrence trigger.
  • Add an action and search for your custom connector:
  • Choose your web method as the action. Then choose the on-premise gateway which you want to use to connect to your on-premise web service and click on create.
  • Enter the value for your name parameter; with that, your final Logic App should look like the following:
  • Click on save and then run it. Within a moment you should see the response from your on-premise web service, based on your input parameter.

Thanks for reading! I hope you've found this article useful. If you have any questions around this or looking for additional information, feel free to comment below.

Want to discover more? Check out these sites:

Posted on Monday, March 26, 2018 3:18 PM

Sagar Sharma by Sagar Sharma

If you want to connect to your on-premise data sources from Azure hosted Logic Apps, then you can use an on-premise data gateway. Let's see how to install and configure it.

Logic Apps is a new-generation integration platform available in Azure. Being a serverless technology, there is no upfront hardware or licensing cost, which leads to a faster time to market. Because of all these features, Logic Apps is picking up pace in the integration world.

Because Logic Apps are hosted in the cloud, it’s not straightforward to access data sources hosted on an on-premise network. To overcome that limitation, Microsoft introduced the “on-premise data gateway”.

The gateway acts as a bridge that provides quick data transfer and encryption between on-premises data sources and your Logic Apps. All traffic originates as secure outbound traffic from the gateway agent to Logic Apps, through Azure Service Bus Relay in the background.

Currently, the gateway supports connections to the following data sources hosted on-premises:

  • BizTalk Server 2016
  • PostgreSQL
  • DB2
  • SAP Application Server
  • File System
  • SharePoint
  • Informix
  • SQL Server
  • MQ
  • Teradata
  • Oracle Database
  • SAP Message Server


Part 1: How does the Logic App on-premise data gateway work?

  1. The gateway cloud service creates a query, along with the encrypted credentials for the data source, and sends the query to the queue for the gateway to process.
  2. The gateway cloud service analyzes the query and pushes the request to the Azure Service Bus.
  3. The on-premises data gateway polls the Azure Service Bus for pending requests.
  4. The gateway gets the query, decrypts the credentials, and connects to the data source with those credentials.
  5. The gateway sends the query to the data source for execution.
  6. The results are sent from the data source, back to the gateway, and then to the gateway cloud service. The gateway cloud service then uses the results.
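The steps above can be sketched as a toy simulation. An in-memory queue stands in for Azure Service Bus, and base64 stands in for real credential encryption - both purely for illustration of the flow, not of the actual protocol:

```python
import base64
from collections import deque

service_bus = deque()  # stands in for the Azure Service Bus queue/relay

def cloud_service_enqueue(query, credentials):
    # Steps 1-2: the gateway cloud service builds the request, with the
    # credentials "encrypted" (base64 here), and pushes it onto the bus.
    service_bus.append({
        "query": query,
        "credentials": base64.b64encode(credentials.encode()).decode(),
    })

def gateway_poll(execute):
    # Steps 3-6: the on-premises gateway polls the bus, decrypts the
    # credentials, runs the query against the data source, returns results.
    if not service_bus:
        return None
    request = service_bus.popleft()
    credentials = base64.b64decode(request["credentials"]).decode()
    return execute(request["query"], credentials)

cloud_service_enqueue("SELECT * FROM Orders", "sql-user:secret")
result = gateway_poll(lambda query, creds: f"ran '{query}' as {creds.split(':')[0]}")
```

The important property this models is the direction of traffic: the gateway always pulls pending requests outbound, which is why no inbound firewall ports are needed (see the considerations below).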

Part 2: How to install the on-premises data gateway?

To install the on-premise data gateway, follow the steps below:

  1. Download and run the gateway installer on a local computer. Link:

  2. Review and accept the terms of use and privacy statement. Specify the path on your local computer where you want to install the gateway.

  3. When prompted, sign in with your Azure work or school account, not a Microsoft account.

  4. Now register your installed gateway with the gateway cloud service. Choose "Register a new gateway on this computer". Provide a name for your gateway installation. Create a recovery key, then confirm your recovery key.

    In order to achieve high availability, you can also configure the gateway in cluster mode. For that select “Add to an existing gateway cluster”. 

    To change the default region for the gateway cloud service and Azure Service Bus used by your gateway installation, choose “Change Region”. For example, you might select the same region as your logic app, or select the region closest to your on-premises data source so you can reduce latency. Your gateway resource and logic app can have different locations.

  5. Click on Configure and your gateway installation should be ready. Now we need to register this on-premise installation in Azure. For that, log on to your Azure subscription. Make sure you use the Azure subscription which is associated with your work/school tenant. Create a new resource of type “On-premises data gateway”.

  6. Enter a name for your gateway. Choose a subscription and resource group. Make sure you choose the same location as you selected during the gateway installation. You should be able to see the name of your gateway installation after choosing the same location.

  7. Click on create and within a moment you will be able to use the on-premise gateway in your Logic App.

  8. You will be able to choose on-premise gateway installation to access on-premise hosted data sources in supported connectors.
    For example:
    • File System
    • SQL Server

Some important things to keep in mind

  • The on-premise data gateway is firewall friendly. There are no inbound connections to the gateway from the Logic Apps. The gateway always uses outbound connections.
  • Logic App on-premise data gateway also supports High availability via Cluster configuration. You can have more than one installation of gateway and configure them in cluster mode.
  • When you install the gateway on one machine, it can connect to all hosts within that network, so there is no need to install a gateway on each data-source machine - one per network is enough.
  • Install the on-premises data gateway only on a local computer. You can't install the gateway on a domain controller.
  • Don't install the gateway on a computer that turns off, goes to sleep, or doesn't connect to the Internet because the gateway can't run under those circumstances. Also, the gateway performance might suffer over a wireless network.
  • During installation, you must sign in with a work or school account that's managed by Azure Active Directory (Azure AD), not a Microsoft account.

You can find all official limitations around logic apps at

Configure a firewall or proxy

  • The gateway creates an outbound connection to Azure Service Bus Relay. To provide proxy information for your gateway, see Configure proxy settings.
  • To check whether your firewall, or proxy, might block connections, confirm whether your machine can actually connect to the internet and the Azure Service Bus. From a PowerShell prompt, run this command, substituting the host you want to test for the placeholder:

 Test-NetConnection -ComputerName <hostname> -Port 9350

    • This command only tests network connectivity and connectivity to the Azure Service Bus. So, the command doesn't have anything to do with the gateway or the gateway cloud service that encrypts and stores your credentials and gateway details.
    • Also, this command is only available on Windows Server 2012 R2 or later, and Windows 8.1 or later. On earlier OS versions, you can use Telnet to test connectivity. Learn more about Azure Service Bus and hybrid solutions.
    • If TcpTestSucceeded is not set to True, you might be blocked by a firewall. If you want to be comprehensive, substitute the ComputerName and Port values with the values listed under Configure ports in this article.
  • The firewall might also block connections that the Azure Service Bus makes to the Azure datacenters. If this scenario happens, approve (unblock) all the IP addresses for those datacenters in your region. For those IP addresses, get the Azure IP addresses list here.

Configure ports

  • The gateway creates an outbound connection to Azure Service Bus and communicates on outbound ports: TCP 443 (default), 5671, 5672, 9350 through 9354. The gateway doesn't require inbound ports.
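The same outbound-port check can be scripted for all of the ports listed above. A minimal sketch in Python - the host to probe is left as a parameter, since the concrete Service Bus hostnames depend on your region:

```python
import socket

def can_connect(host, port, timeout=3):
    """Return True if an outbound TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# The outbound ports the gateway uses, per the list above.
GATEWAY_PORTS = [443, 5671, 5672] + list(range(9350, 9355))

def report(host):
    """Map each gateway port to whether an outbound connection succeeded."""
    return {port: can_connect(host, port) for port in GATEWAY_PORTS}
```

Any port that reports False is a candidate for a firewall or proxy rule blocking the gateway's outbound traffic.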

Domain names and outbound ports (table fragment):

  • Advanced Message Queuing Protocol (AMQP)
  • 443, 9350-9354 - Listeners on Service Bus Relay over TCP (requires 443 for Access Control token acquisition)
  • Used to test internet connectivity when the gateway is unreachable by the Power BI service.

  • If you must approve IP addresses instead of domains, you can download and use the Microsoft Azure Datacenter IP ranges list. In some cases, the Azure Service Bus connections are made with IP addresses rather than fully qualified domain names.

Want to read more about on-premise data gateways?

Check out the sites below:

Thanks for reading!

P.S.: In the last couple of months, I have worked extensively with Logic Apps and the on-premise data gateway. So, feel free to contact me if you have any questions.

Categories: Azure
Tags: Azure, Logic Apps
written by: Sagar Sharma