
Codit Blog

Posted on Sunday, July 8, 2018 12:35 PM

by Tom Kerkhove

Renovate automatically checks whether there are new versions of your NuGet & Docker dependencies and helps keep them up to date via automatic pull requests.

Recently, The Changelog #289 introduced me to Rhys Arkins, the creator of Renovate.

Renovate saves you time and reduces risk in software projects by automating the tedious process of updating dependencies. This is fully automated and its behavior is fully customizable to fit your needs.

Renovate supports a variety of languages:

NuGet support was only recently added by my colleague Sam Neirinck - Thank you Sam!

Onboarding Renovate

Onboarding Renovate is super easy. It currently supports GitHub, GitLab and VSTS, with Bitbucket on the way. You can choose between using the GitHub App or running a self-hosted version via a Docker image available on Docker Hub.

Once you've added that to your Git repository it will send you an onboarding PR that will explain what it will do:

In case you'd like to tweak its behavior a bit, you can fully configure how it will contribute via renovate.json (docs).

Dependency Updates

Once configured, Renovate will periodically check your dependencies.

If it finds a new version it will automatically create a new pull request with the latest version:

When your dependency keeps track of release notes, it will also include the notes for all the versions you are behind. This information comes from the GitHub Releases that were published.

If you would like to check less often, have fewer PRs or only update minor versions, you can tweak this in the renovate.json file.
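As an illustrative sketch (these are real Renovate option names, but treat the exact values as an example rather than a recommended setup), a renovate.json that checks on a weekly schedule, limits the number of open PRs and skips major updates could look like this:

```json
{
  "extends": ["config:base"],
  "schedule": ["before 5am on monday"],
  "prConcurrentLimit": 5,
  "major": {
    "enabled": false
  }
}
```

The full list of options is described in the Renovate configuration docs.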

Streamlined Automation

It is highly recommended to only use Renovate if you have automated builds that are triggered when pull requests are created, verify the code changes and run all your tests. This will prevent you from merging in bad updates.

In this example, Renovate wants to upgrade Swashbuckle.AspNetCore, which introduces breaking changes given it's a major version increase, but Travis CI made me aware of this and I was able to fix it before merging.

This is another good example of why you should always run automated builds when a pull request is created.

It should verify as much as it can, ranging from building code and running automated tests to building Dockerfiles, scanning for security vulnerabilities, and more.
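As a minimal, hypothetical sketch of such a PR-triggered build (the solution and test project names are placeholders; the keys are standard Travis CI settings), a .travis.yml could look like:

```yaml
language: csharp
mono: none
dotnet: 2.1.300
script:
  # Fail the pull request if the solution no longer compiles...
  - dotnet build MySolution.sln
  # ...or if any automated test breaks after the dependency bump.
  - dotnet test tests/MyProject.Tests/MyProject.Tests.csproj
```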

Better safe than sorry and automate as much as possible.

A note to package maintainers

When I started using Renovate it became clear that package maintainers are an important factor and can improve the process with a few small tweaks.

The importance of GitHub releases

Using Renovate was also an eye-opener on the importance of having good GitHub releases. These really help consumers of products to get a good understanding of what changed and what to expect when updating. Having these in the update pull requests is really nice.

This was the main driver for automatically adding GitHub releases to my own projects and providing context on what changed. An example is v0.0.2-preview for Arcus.EventGrid.

Document your Nuspec well

In order to achieve automatic release notes with NuGet, a well-documented Nuspec is required which specifies the URL to the GitHub repository in the following sections:

  • Specify the PackageProjectUrl, as Swashbuckle.AspNetCore does - <PackageProjectUrl></PackageProjectUrl>
  • Specify the repository element, as YamlDotNet does - <repository type="git" url="" />
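For an SDK-style project, that metadata can be set in the .csproj, from which NuGet generates the .nuspec; the repository URL below is a placeholder:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <!-- Ends up as <projectUrl> in the generated .nuspec -->
    <PackageProjectUrl>https://github.com/your-org/your-repo</PackageProjectUrl>
    <!-- Ends up as <repository type="git" url="..." /> in the generated .nuspec -->
    <RepositoryType>git</RepositoryType>
    <RepositoryUrl>https://github.com/your-org/your-repo</RepositoryUrl>
  </PropertyGroup>
</Project>
```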


When I started using Renovate on Promitor I was really amazed by all the work that was done by Rhys and how streamlined Renovate is.

Renovate is really cool and certainly helps me keep my applications up to date by doing all the chores for me. It's free for open-source projects, so give it a spin!

A very big thank you to Rhys Arkins for building this tooling and also thank you Sam Neirinck for adding NuGet support! All the credit goes to them, not me.

Want to learn more about Renovate? Listen to the episode of The Changelog.

Thanks for reading,


Categories: Technology
written by: Tom Kerkhove

Posted on Monday, June 18, 2018 2:09 PM

by Tom Kerkhove

Codit is growing, and so is Alfred. Nowadays we automatically synchronize every Codit employee with our backend by using Azure Logic Apps.

In order to do that, we've built an API app which is capable of querying the Azure AD directory with Azure AD Application authentication; it is now available on GitHub.
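As a rough sketch of what such an app-only query looks like under the hood (this is not the actual connector code; the tenant, company name and credentials are placeholders, while the endpoints are the standard OAuth2 token endpoint and the Azure AD Graph API of that era):

```python
import json
import urllib.parse
import urllib.request

GRAPH_BASE = "https://graph.windows.net"
API_VERSION = "1.6"

def build_users_by_company_url(tenant, company_name):
    """Build the Azure AD Graph URL that filters users on companyName."""
    # Escape single quotes per OData string-literal rules.
    odata_filter = "companyName eq '{0}'".format(company_name.replace("'", "''"))
    query = urllib.parse.urlencode({"api-version": API_VERSION, "$filter": odata_filter})
    return "{0}/{1}/users?{2}".format(GRAPH_BASE, tenant, query)

def get_app_token(tenant, client_id, client_secret):
    """Fetch an app-only token via the OAuth2 client-credentials grant."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": GRAPH_BASE,
    }).encode()
    url = "https://login.microsoftonline.com/{0}/oauth2/token".format(tenant)
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as response:
        return json.load(response)["access_token"]
```

The returned token is then sent as an Authorization: Bearer header on the users request.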

Just over a year ago, Codit welcomed Alfred, our personal butler who takes care of our visitors. This has been a side project that I've been working on with Pieter Vandenheede and Wouter Seye, and it has been a ton of fun - if you've missed that, read more about it in this article.

In the past year, we've focused on improving our foundation while starting to experiment with bots.

Here is a brief overview of what we did:

  • Automatically synchronize all employees based on the Codit AD - Fully exposed via an API and orchestrated via Azure Logic Apps.
  • Migration from the Codit internal Azure subscription to dedicated Azure subscriptions - Allows us to manage all resources as we'd like.
  • Ability to determine how you want to be notified when a visitor arrives - Fully exposed via an API, ready to be consumed.
  • More robust release pipelines in VSTS - Fully automated releases with deployment gates, Azure resource creation (ARM), Application Insights annotations and more.

Automatically synchronize all employees based on the Codit AD

When we were building our synchronization workflow, we decided to use Azure Logic Apps as an orchestrator.

We are currently using the following Logic Apps:

  1. Company Debatcher - Triggers a new run per company by passing the company name, such as Codit Managed Services, and a list of users that can be blacklisted. This can be used to filter out accounts that are used by applications or for testing purposes.
  2. Company Synchronizer - Retrieves all users from Azure AD for a specific company and filters out the blacklisted ones, if applicable. Once that is done, it triggers a new run for every employee in the result set.
  3. Employee Synchronizer - Imports and configures the default notification configuration in the Santiago platform via our Management API.
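The blacklist filtering in step 2 boils down to something like the following sketch (the field name matches the Azure AD Graph user object; the function name is my own):

```python
def filter_blacklisted(users, blacklist):
    """Drop blacklisted accounts (application or test users) from the
    result set before fanning out to the Employee Synchronizer.
    The comparison on userPrincipalName is case-insensitive."""
    blocked = {upn.lower() for upn in blacklist}
    return [
        user for user in users
        if user["userPrincipalName"].lower() not in blocked
    ]
```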

Here's a simplistic overview:

While building this synchronization flow, we started off using the built-in Azure AD connector. However, it soon became clear that this was not the best fit for us since it requires Global Administrator permissions.

We asked our Global Administrator if this would be OK, but decided that using Azure AD Applications to act on behalf of our platform would be a better fit.

Unfortunately, this was not supported at the time of implementation, so we decided to build our own Active Directory "connector".

Introducing Active Directory connector for Logic Apps

It took us a long time, but as of today our Azure Active Directory connector with Azure AD Application support is available on GitHub and ready to be deployed on your subscription.


In order to use our connector, you'll have to create & configure an Azure AD application in Azure AD itself.

The Azure AD Application needs to have Read permissions for the Windows Azure Active Directory API.

Want to learn more? Read our docs.


Given our synchronization flow is centered around users, we only support the following operations for now:

  • Get a list of all users
  • Get a list of all users by company name
  • Get a specific user by user principal name
  • Send telemetry to Azure Application Insights

However, this is certainly not the end. If you'd like to request a feature or help us extend this we recommend creating a new issue.


Azure Logic Apps allowed us to very quickly build a synchronization workflow that easily integrates with our API. However, the built-in AD connector that we tried required us to use Global Administrator powers over our company Active Directory.

We did not want to do that given the risk involved, and decided to build our own Active Directory connector with Azure AD Application support. This Active Directory connector is now open-source and available on GitHub.

This enabled us to easily deploy and integrate it into our workflow and query our Active Directory with limited permissions while still achieving our goal.
Although the component is very limited, we are open to suggestions and accept contributions, but prefer to discuss them in an issue first.

But this is only the beginning: we are working hard to open-source Alfred & Santiago as well, so that you can run your own version of our internal visitor system.

Thanks for reading,


Categories: Azure
written by: Tom Kerkhove

Posted on Friday, June 15, 2018 10:52 AM

by Stijn Degrieck

Microsoft has bought GitHub. For no less than 7.5 billion dollars. This is undeniably a huge amount of money. But if you ask me, it’s a bargain. A smart investment with great return prospects. Because, with the takeover of the platform, Microsoft will also be gaining a major developer community.

This takeover has come at the right time for GitHub too. The company behind the platform has not always enjoyed positive news coverage. One of the co-founders was forced to leave a little while ago due to allegations of inappropriate behaviour. GitHub was also struggling to keep its head above water financially. It needed a capital injection via private investors or even an IPO, but both scenarios demand a lot of time and resources. It looked like selling was the only real option.

And there was no shortage of potential buyers. In addition to Microsoft, Google is believed to have been in the running. And Amazon has also shown interest in the past. But Nadella came out on top. The fact that Microsoft has the biggest user community on the platform will certainly have played a role.

But GitHub is actually technology agnostic. Any developer writing code using any technology can share and manage this on the platform as well as developing it further with the other community members. So it’s not an exclusive Microsoft club. And that’s exactly what makes it attractive. If Microsoft soon starts to run analyses of the code that users upload, this is guaranteed to provide them with good ideas for the new solutions that they have in the pipeline or plan to develop themselves. This kind of platform is a great way to keep a finger on the pulse of the developer community.

Of course there are also a lot of potential Microsoft developers on GitHub. If they succeed in convincing just a fraction of these people to work with their own Azure platform, the investment will soon pay for itself.

But all that requires Microsoft to shake off its old reputation as a money grabber. Although the company has not been in the news due to legal cases and damage claims for years now, many people have not forgotten that the guys from Redmond previously dominated the market a little too enthusiastically.

A number of GitHub users among them, naturally. They immediately made the switch to the competitor platform GitLab following the announcement. To the extent that the platform briefly went down, as it could not handle the influx. But all in all, the outflow has remained limited. It will be different if one of the other major contributors decides to leave in the coming period. I can imagine that over at Red Hat they are not amused that Microsoft is suddenly hosting a significant part of their source code...

But everything will work out. Under Nadella, Microsoft made a strategic U-turn and began to behave completely differently. With much more humility. And this was necessary, after they had missed a couple of crucial evolutions in the market. Windows Phone, anyone?

Over the past few years, Microsoft has already put a lot of effort into getting the developer community on its side. On GitHub, the company’s employees are already leading the lists of top contributors under the motto 'Open source, from Microsoft with love'. This was a strong signal in itself. Besides the fact that they have now completely absorbed the platform.

Of course Nadella knows that you can’t have a successful cloud platform without a development community. And to give it every chance of success, everyone must be welcome and you have to keep the platform as open as possible. This offers the best guarantee of commercial success. In fact: a closed environment is simply unsellable these days.

At Codit, we are big fans of the Azure platform. It’s the perfect co-creation platform for developing innovative solutions along with our customers. When it comes to the IoT, for example. But I’ve talked about that previously.

What do you think about this move by Microsoft? And what if Google takes on competitor GitLab in the near future? Will that be less problematic? How ‘open’ source is Google really? Surely it’s a matter of perception?


This article was first published on Data News in Dutch.

Categories: Opinions
written by: Stijn Degrieck

Posted on Monday, June 11, 2018 11:06 AM

by Toon Vanhoutte

Last week, Codit was represented at INTEGRATE 2018 with a big delegation! With more than 20 people from the different Microsoft product teams and 25 speakers (2 from Codit), this is the biggest event for staying up to date with the latest evolutions in the integration space. After this successful edition, it's time to look back at the most important announcements!

Azure Integration Services

The announcement of Azure Integration Services is probably one of the biggest evolutions that is going on. Azure Integration Services is the umbrella for Logic Apps, API Management, Service Bus and Event Grid. It's clear that this is more than a marketing effort. The final goal is to ensure seamless integration and collaboration between these separate services and to provide one uniform integration Platform as a Service (iPaaS).

This will be achieved by designing detailed reference architectures, by developing quick-start templates that solve common integration challenges and by defining best practices for CI/CD, disaster recovery and scalability.

Integration Service Environments

The roadmap of Azure Logic Apps was presented, including lots of new features. The most important one is definitely the introduction of an Integration Service Environment (ISE). This gives you the ability to run Logic Apps on dedicated and isolated compute power, including a way to connect to on-premises applications through VNET integration. An ISE will come with a fixed price per 50M actions. At the time of writing, there are no further details available on pricing, but it's obvious that dedicated compute comes at a higher price.

Next to that, there was the announcement that the Integration Account will get a consumption-based billing model, which is great news for many of our customers. Other features that will be released later this year are: availability of Logic Apps in the China cloud, better testing/mocking capabilities, support for OAuth2 on the HTTP trigger, and extended support for Managed Service Identity and Key Vault. And yes… the trigger for the SAP connector is now in private preview!

The future of BizTalk Server

Microsoft announced the third Feature Pack for BizTalk Server, with a main focus on compliance (GDPR), advanced scheduling and Office 365 adapters. It's clear that Microsoft is fully betting on Azure Integration Services and that more and more BizTalk functionality will move to the cloud. Next to that, there's also the trend that Logic Apps is being positioned more and more for on-premises integration. An ISE will already make this easier, and Logic Apps will also be made available on Azure Stack.

Our existing BizTalk Server customers don't need to be concerned. Microsoft is aware of its extended customer base that runs business critical processes on top of BizTalk Server. BizTalk Server will be maintained at least until there is a valid alternative available on-premises. Be aware that extended support for the current BizTalk version (2016) is still available until 2027.
Don't hesitate to get in touch, if you need more information!

written by: Toon Vanhoutte

Posted on Wednesday, June 6, 2018 5:24 PM

Third and final day of Integrate 2018! THE yearly conference for Microsoft integration. This is the day 3 recap of the sessions presented at Integrate 2018 with the views and opinions of the Codit staff.

Architecting Highly Available Cloud Integrations - Richard Seroter

Richard starts out by stating that creating a cloud integration is similar to sticking pieces together on a model plane; the strength and flexibility of the glue determine the resilience of the model.   

In a (cloud) integration system this glue would be the messaging and eventing infrastructure connecting the other system parts and allowing for recovery and retry.  

In general, one could say that all the capabilities for scaling and resilience are there, but it is up to you to take advantage of them by making choices in configuration and architecture.

He also states the fact that control is not yours in a lot of cases; Microsoft will determine if, when and where a condition is classified as a disaster, and you should be prepared to react when this happens.  

All in all, it comes down to knowing what components in your architecture have what capabilities regarding availability and scaling and to decide if and how you will make use of these capabilities.  

The primary takeaway from this session, however, is still the golden rule: design for disaster by being able to rebuild your environment fast in a different region, eliminating manual deployment and provisioning steps. In Azure, that would mean using ARM templates to the max.

Richard ended his talk with some golden tips:  

  • Always integrate with high available endpoints  
  • Clearly understand what services failover together  
  • Regularly perform chaos testing

Real-world Integration Scenarios for Dynamics 365 - Michael Stephenson

Michael Stephenson delivered the second session on the last day, explaining how Microsoft Flow can simplify DevOps tasks and how these flows can be extended to any user within the team. After giving a brief background about himself, he moved on to explain the "DevOps Wall" and daily tasks related to the role. 

He based his example on how "Azure AD B2B Setup" can be used for multi-region group solutions. He demonstrated a "major release user creation", where an Active Directory admin pushes a CSV to a PowerShell script to add users to the AD and email invite notifications.

Michael explains how non-technical teams or individuals might be reluctant to take on such tasks with the amount of responsibility required, especially if they lack the knowledge of what is exactly happening in the background. Michael continues to demonstrate how Microsoft Flow simplifies the process for non-technical people,  acting as a black box by hiding the individual processes and resources used by the Flow. He then displays the process on screen, broken into tasks to show how many steps are required to complete the action above.  

He then did a demo using Minecraft, simulating a situation where a user was unable to log in due to not having access. He proceeded to add the user and grant him access through Minecraft, which then triggered a Microsoft Flow to complete the process and allow access to the user.

Michael explained how crucial it is to be resourceful and keep things efficient when building tasks, which will ultimately cost money for the company. As a key point, Michael insisted on the importance of automating repetitive tasks and how Flow can help you achieve this efficiently, as a result, reducing work overhead and costs. 

Using VSTS to deploy to a BizTalk Server, what you need to know - Johan Hedberg

Johan Hedberg showcased the ALM capabilities that now come with the BizTalk 2016 feature packs.  

In a hands-on practical session, he started by creating and configuring a BizTalk application project using the BizTalk Deployment Task in VSTS CI. Later he did all the VSTS plumbing for obtaining a usable build agent, and he showed us how to set up the build and release pipelines, even including unit tests!

He also covered support for tokenized binding files and for pre- and post-processing steps in VSTS.

This session was excellent for everyone wishing to get started with CI/CD for BizTalk. 

Exposing BizTalk Server to the World - Wagner Silveira

In this session, Wagner Silveira talked about the different ways to expose BizTalk endpoints to the outside world using Azure technologies. Reasons why you would want this include consuming on-premises resources, offering a single gateway and extending cloud workflows.

Message exchange options like queues and topics, file exchange or e-mail are undoubtedly possible, but Wagner focused on exposing HTTP endpoints. 

For this,  he went over the following options: 

  • Azure Relay Services 
  • Logic Apps 
  • Azure Function Proxies 
  • Azure API Management 

Each of these come with different security and message format possibilities. 

To make an educated decision on which Azure services to use, you should identify your needs and know about the possibilities and limitations of each option. 

Then Wagner demoed exposing endpoints using Relay Services and API Management. 

Anatomy of an Enterprise Integration Architecture - Dan Toomey

Dan Toomey gave a more architecture oriented presentation for his session at Integrate 2018. 

Typical businesses don't have one monolithic application, but multiple, and sometimes even hundreds of applications to run. Although this decreases the cost of change, it increases the cost of integration. It's a challenge to find the sweet spot.  

These different applications can be split up into layers. At the bottom there is the Systems of Record layer (CRM, SQL, ...) for transaction processing and master data; this is the least frequently changing layer. One level up is the Systems of Differentiation layer, where processes unique to the business can be found. Topping it all off, you have the Systems of Innovation, where the fast-changing new applications and requirements live. These different layers have different properties concerning Business Processes, Data/Information, Change Control/Governance, Business Engagement, and Planning timelines.

From an integration perspective, the differences can be found in the Rate of Change, API Design, Change Control and Testing Regime. The different layers have different options on how to integrate them; you'll mostly use BizTalk and API Management in the Records layer, BizTalk and Logic Apps for Differentiation, and Flow/PowerApps/PowerBI for the Innovation layer. These all have different characteristics which should be considered.

Dan ended the session with some considerations you should keep in mind to integrate better. Consider how your applications will be integrated, make sure your Systems of Record layer is stable, limit customization in Systems of Record, consider using canonical data models, loosely couple inter-layer communications, and allow room for innovation. 

Unlock the power of hybrid integration with BizTalk Server and webhooks! - Toon Vanhoutte

Webhooks are a relatively new messaging pattern that may replace synchronous request-response and asynchronous polling techniques.

Using webhooks is not only faster, but it also allows for improved extensibility, and it requires no client-side state registration. 

In his Talk, Toon discussed the design considerations and implementation details of webhooks in combination with BizTalk Server and Azure Event Grid. 

In his demos, BizTalk Server publishes or consumes events that are sent or received through Azure Event Grid, and Azure Event Grid is responsible for managing the webhook registrations. 

For event publishing scenarios,  it is essential to implement retry policies, fallback mechanisms, proper security settings, and continuous endpoint validation. 

When consuming events, the focus should be on scalability, on the reliability of connections and on the fact that the order of incoming messages cannot be guaranteed. Toon pointed out that especially this last point is often forgotten.
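The endpoint validation mentioned above can be sketched as a minimal handler (not Toon's actual demo code): Event Grid first POSTs a SubscriptionValidationEvent, and the subscriber proves ownership of the endpoint by echoing the validation code back.

```python
import json

SUBSCRIPTION_VALIDATION = "Microsoft.EventGrid.SubscriptionValidationEvent"

def handle_event_grid_post(body, processed):
    """Handle the body of an Event Grid webhook POST.
    Validation events are answered by echoing the validationCode;
    all other events are appended to `processed` for business logic."""
    for event in json.loads(body):
        if event.get("eventType") == SUBSCRIPTION_VALIDATION:
            return {"validationResponse": event["data"]["validationCode"]}
        processed.append(event)
    return {}
```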

Refining BizTalk implementations - Mattias Lögdberg

In this session, Mattias gave a real-life presentation of how he modernized an existing architecture to leverage cloud components. 

The webshop was moved to the cloud, which brought some challenges in connecting it to on-premises systems like the ERP system.

Azure services that were introduced include Table Storage, API Management, Logic Apps, Azure Functions, and DocumentDB. Service Bus queues and topics, as well as DocumentDB, were used to throttle the load on the ERP system.

Monolithic applications were transformed into a set of loosely coupled components, and the complex framework was turned into microservices.

BizTalk is just as vital as before, but more tools are now being added to the toolbox alongside BizTalk to execute enterprise-grade integration.

Focused roundtable with Microsoft Pro Integration Team

The last part of the day was the roundtable.


Thank you for reading our blog post, feel free to comment or give us feedback in person.

This blogpost was prepared by:

Bart Cocquyt
Charles Storm
Danny Buysse
Jasper Defesche
Jef Cools
Jonathan Gurevich
Keith Grima
Matthijs den Haan
Michel Pauwels
Niels van der Kaap
Nils Gruson
Peter Brouwer
Ronald Lokers
Sjoerd van Lochem
Steef-Jan Wiggers
Toon Vanhoutte
Wouter Seye