Codit Blog

Posted on Monday, June 18, 2018 2:09 PM

by Tom Kerkhove

Codit is growing, and so is Alfred. Nowadays we automatically synchronize every Codit employee with our backend using Azure Logic Apps.

In order to do that, we've built an API app that can query the Azure AD directory using Azure AD Application authentication; it is now available on GitHub.

Just over a year ago, Codit welcomed Alfred, our personal butler who takes care of our visitors. This has been a side project that I've been working on with Pieter Vandenheede and Wouter Seye, and it has been a ton of fun - if you've missed that, read more about it in this article.

In the past year, we've focused on improving our foundation while starting to experiment with bots.

Here is a brief overview of what we did:

  • Automatically synchronize all employees based on the Codit AD - Fully exposed via an API and orchestrated via Azure Logic Apps.
  • Migration from the Codit Internal Azure subscription to dedicated Azure subscriptions - Allows us to manage all resources as we'd like.
  • Ability to determine how you want to be notified when a visitor arrives - Fully exposed via an API, ready to be consumed.
  • Building more robust release pipelines in VSTS - Fully automated releases with deployment gates, Azure resource creation (ARM), Application Insights annotations and more.

Automatically synchronize all employees based on the Codit AD

When we were building our synchronization workflow, we decided to use Azure Logic Apps as an orchestrator.

We are currently using the following Logic Apps:

  1. Company Debatcher - Triggers a new run per company by passing the company name, such as Codit Managed Services, and a list of users that can be blacklisted. This can be used to filter out users that are being used by applications or for testing purposes.
  2. Company Synchronizer - Retrieves all users from Azure AD for a specific company and filters out the blacklisted ones, if applicable. Once that is done, it triggers a new run for every employee in the result set.
  3. Employee Synchronizer - Imports the employee and configures the default notification configuration in the Santiago platform via our Management API.

Here's a simplistic overview:
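
In code terms, the flow boils down to roughly this (a minimal Python sketch, not the actual implementation, which is the three Logic Apps above; get_users_by_company and import_employee are hypothetical stand-ins for the Active Directory connector and our Management API):

```python
COMPANIES = {
    # company name -> blacklisted user principal names
    "Codit Managed Services": ["service.account@codit.eu"],  # hypothetical
}

def get_users_by_company(company_name):
    # Stand-in for the AD connector; in reality this queries Azure AD
    # using Azure AD Application authentication (see further below).
    return []

def import_employee(user):
    # Stand-in for the Management API call that imports the employee
    # and applies the default notification configuration.
    print("importing", user["userPrincipalName"])

def synchronize_employee(user):
    """Employee Synchronizer: import and configure one employee."""
    import_employee(user)

def synchronize_company(company_name, blacklist):
    """Company Synchronizer: fetch users, drop blacklisted ones, fan out."""
    for user in get_users_by_company(company_name):
        if user["userPrincipalName"] not in blacklist:
            synchronize_employee(user)

def debatch():
    """Company Debatcher: trigger one synchronization run per company."""
    for company, blacklist in COMPANIES.items():
        synchronize_company(company, blacklist)

debatch()
```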



While building this synchronization flow, we started off using the built-in Azure AD connector. However, it soon became clear that this was not the best fit for us, since it requires Global Administrator permissions.

We asked our Global Administrator if this would be OK, but we decided that using Azure AD Applications to act on behalf of our platform would be a better fit.

Unfortunately, this was not supported at the time of implementation, so we decided to build our own Active Directory "connector".

Introducing Active Directory connector for Logic Apps

It took us a long time, but as of today our Azure Active Directory connector with Azure AD Application support is available on GitHub, ready to be deployed to your subscription.

Authentication

In order to use our connector, you'll have to create and configure an Azure AD Application in Azure AD itself.

The Azure AD Application needs Read permissions for the Windows Azure Active Directory API.
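
As a rough illustration of what that authentication boils down to, here is a minimal Python sketch of the OAuth2 client-credentials handshake against Azure AD; the tenant, application ID and key are placeholders:

```python
import requests

TENANT = "yourtenant.onmicrosoft.com"  # placeholder tenant
CLIENT_ID = "<application-id>"         # placeholder Azure AD Application ID
CLIENT_SECRET = "<application-key>"    # placeholder client secret

def get_token():
    """Acquire an app-only token for the Azure AD Graph API
    via the OAuth2 client-credentials grant."""
    response = requests.post(
        f"https://login.microsoftonline.com/{TENANT}/oauth2/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "resource": "https://graph.windows.net/",
        },
    )
    response.raise_for_status()
    return response.json()["access_token"]
```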

Want to learn more? Read our docs.

Features

Given that our synchronization flow is centered around users, we only support the following operations for now (illustrated in the sketch after this list):

  • Get a list of all users
  • Get a list of all users by company name
  • Get a specific user by user principal name, e.g. foo.bar@codit.eu
  • Send telemetry to Azure Application Insights
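
For illustration, here is how those operations map onto the underlying Azure AD Graph REST calls (a rough Python sketch continuing the snippet above; the connector itself exposes its own routes, so treat these paths as an approximation):

```python
import requests  # TENANT and get_token come from the previous sketch

def graph_get(path, token, **params):
    """GET a resource from the Azure AD Graph API (graph.windows.net)."""
    response = requests.get(
        f"https://graph.windows.net/{TENANT}/{path}",
        headers={"Authorization": f"Bearer {token}"},
        params={"api-version": "1.6", **params},
    )
    response.raise_for_status()
    return response.json()

token = get_token()
all_users = graph_get("users", token)                      # all users
codit_users = graph_get("users", token,
                        **{"$filter": "companyName eq 'Codit'"})  # by company
one_user = graph_get("users/foo.bar@codit.eu", token)      # by principal name
```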

However, this is certainly not the end. If you'd like to request a feature or help us extend this, we recommend creating a new issue.

Conclusion

Azure Logic Apps allowed us to build a synchronization workflow very quickly, one that integrates easily with our API. However, the built-in AD connector that we tried required us to use Global Administrator permissions on our company Active Directory.

We did not want to do that, given the risk involved, so we decided to build our own Active Directory connector with Azure AD Application support. This Active Directory connector is now open source and available on GitHub.

This enabled us to easily deploy and integrate it into our workflow, querying our Active Directory with limited permissions while still achieving our goal.
Although the component is very limited, we are open to suggestions and accept contributions, but we prefer to discuss them in an issue first.

But this is only the beginning: we are working hard to open-source Alfred and Santiago as well, so that you can run your own version of our internal visitor system.

Thanks for reading,

Tom.

Categories: Azure
written by: Tom Kerkhove

Posted on Friday, June 15, 2018 10:52 AM

by Stijn Degrieck

Microsoft has bought GitHub. For no less than 7.5 billion dollars. This is undeniably a huge amount of money. But if you ask me, it’s a bargain. A smart investment with great return prospects. Because, with the takeover of the platform, Microsoft will also be gaining a major developer community.

This takeover has come at the right time for GitHub too. The company behind the platform has not always enjoyed positive news coverage. One of the co-founders was forced to leave a little while ago due to allegations of inappropriate behaviour. GitHub was also struggling to keep its head above water financially. It needed a capital injection via private investors or even an IPO, but both scenarios demand a lot of time and resources. It looked like selling was the only real option.

And there was no shortage of potential buyers. In addition to Microsoft, Google is believed to have been in the running. And Amazon has also shown interest in the past. But Nadella came out on top. The fact that Microsoft has the biggest user community on the platform will certainly have played a role.

But GitHub is actually technology agnostic. Any developer writing code using any technology can share and manage it on the platform, as well as develop it further with the other community members. So it’s not an exclusive Microsoft club. And that’s exactly what makes it attractive. If Microsoft soon starts to run analyses of the code that users upload, this is guaranteed to provide them with good ideas for the new solutions that they have in the pipeline or plan to develop themselves. This kind of platform is a great way to keep a finger on the pulse of the developer community.

Of course there are also a lot of potential Microsoft developers on GitHub. If they succeed in convincing just a fraction of these people to work with their own Azure platform, the investment will soon pay for itself.

But all that requires Microsoft to shake off its old reputation as a money grabber. Although the company has not been in the news due to legal cases and damage claims for years now, many people have not forgotten that the guys from Redmond previously dominated the market a little too enthusiastically.

A number of GitHub users among them, naturally. They immediately made the switch to the competitor platform GitLab following the announcement. To the extent that the platform briefly went down, as it could not handle the influx. But all in all, the outflow has remained limited. It will be different if one of the other major contributors decides to leave in the coming period. I can imagine that over at Red Hat they are not amused that Microsoft is suddenly hosting a significant part of their source code...

But everything will work out. Under Nadella, Microsoft made a strategic U-turn and began to behave completely differently. With much more humility. And this was necessary, after they had missed a couple of crucial evolutions in the market. Windows Phone, anyone?

Over the past few years, Microsoft has already put a lot of effort into getting the developer community on its side. On GitHub, the company’s employees already top the lists of contributors, under the motto 'Open source, from Microsoft with love'. That was a strong signal in itself, even before they absorbed the platform completely.

Of course Nadella knows that you can’t have a successful cloud platform without a development community. And to give it every chance of success, everyone must be welcome and you have to keep the platform as open as possible. This offers the best guarantee of commercial success. In fact: a closed environment is simply unsellable these days.

At Codit, we are big fans of the Azure platform. It’s the perfect co-creation platform for developing innovative solutions along with our customers. When it comes to the IoT, for example. But I’ve talked about that previously.

What do you think about this move by Microsoft? And what if Google takes on competitor GitLab in the near future? Will that be less problematic? How ‘open’ source is Google really? Surely it’s a matter of perception?

Stijn.

This article was first published on Data News in Dutch.

Categories: Opinions
written by: Stijn Degrieck

Posted on Monday, June 11, 2018 11:06 AM

by Toon Vanhoutte

Last week, Codit was represented at INTEGRATE 2018 with a big delegation! With more than 20 people from the different Microsoft product teams and 25 speakers (2 from Codit), this is the biggest event for staying up to date with the latest evolutions in the integration space. After this successful edition, it's time to look back at the most important announcements!

Azure Integration Services

The announcement of Azure Integration Services is probably one of the biggest evolutions that is going on. Azure Integration Services is the umbrella for Logic Apps, API Management, Service Bus and Event Grid. It's clear that this is more than a marketing effort. The final goal is to ensure seamless integration and collaboration between these separate services and to provide one uniform integration Platform as a Service (iPaaS).

This will be achieved by designing detailed reference architectures, by developing quick-start templates that solve common integration challenges, and by defining best practices for CI/CD, disaster recovery and scalability.

Integration Service Environments

The roadmap of Azure Logic Apps was presented, including lots of new features. The most important one is definitely the introduction of the Integration Service Environment (ISE). This gives you the ability to run Logic Apps on dedicated and isolated compute power, including a way to connect to on-premises applications through VNET integration. An ISE will come at a fixed price per 50M actions. At the time of writing, there are no further details available on pricing, but it's obvious that dedicated compute comes at a higher price.

Next to that, there was the announcement that the Integration Account will get a consumption-based billing model, which is great news for many of our customers. Other features that will be released later this year are: availability of Logic Apps in the China cloud, better testing/mocking capabilities, support for OAuth2 on the HTTP trigger, and extended support for Managed Service Identity and Key Vault. And yes… the trigger for the SAP connector is now in private preview!

The future of BizTalk Server

Microsoft announced the third Feature Pack for BizTalk Server, with a main focus on compliance (GDPR), advanced scheduling and Office 365 adapters. It's clear that Microsoft is fully betting on Azure Integration Services and that more and more BizTalk functionality will move to the cloud. Next to that, there's also the trend that Logic Apps is more and more positioned for on-premises integration: an ISE will already make this easier, and Logic Apps will also be made available on Azure Stack.

Our existing BizTalk Server customers don't need to be concerned. Microsoft is aware of its extensive customer base that runs business-critical processes on top of BizTalk Server. BizTalk Server will be maintained at least until a valid on-premises alternative is available. Be aware that extended support for the current BizTalk version (2016) runs until 2027.
 
Don't hesitate to get in touch, if you need more information!

written by: Toon Vanhoutte

Posted on Wednesday, June 6, 2018 5:24 PM

Third and final day of Integrate 2018! THE yearly conference for Microsoft integration. This is the day 3 recap of the sessions presented at Integrate 2018 with the views and opinions of the Codit staff.

Architecting Highly Available Cloud Integrations - Richard Seroter

Richard starts out by stating that creating a cloud integration is similar to sticking pieces together on a model plane; the strength and flexibility of the glue determine the resilience of the model.   

In a (cloud) integration system this glue would be the messaging and eventing infrastructure connecting the other system parts and allowing for recovery and retry.  

In general, one could say that all the capabilities for scaling and resilience are there, but it is up to you to take advantage of them by making choices in configuration and architecture.

He also notes that in many cases control is not yours: Microsoft determines if, when and where a condition is classified as a disaster, and you should be prepared to react when that happens.

All in all, it comes down to knowing what components in your architecture have what capabilities regarding availability and scaling and to decide if and how you will make use of these capabilities.  

The primary takeaway from this session, however, is still the golden rule: design for disaster by being able to rebuild your environment quickly in a different region, eliminating manual deployment and provisioning steps. In Azure, that means using ARM templates to the max.

Richard ended his talk with some golden tips:  

  • Always integrate with highly available endpoints
  • Clearly understand what services failover together  
  • Regularly perform chaos testing

Real-world Integration Scenarios for Dynamics 365 - Michael Stephenson

Michael Stephenson delivered the second session on the last day, explaining how Microsoft Flow can simplify DevOps tasks and how these flows can be extended to any user within the team. After giving a brief background about himself, he moved on to explain the "DevOps Wall" and daily tasks related to the role. 

He based his example on how "Azure AD B2B Setup" can be used for multi-region group solutions. He demonstrated a "major release user creation" scenario where an Active Directory admin pushes a CSV to a PowerShell script that adds the users to AD and emails invite notifications.
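
In rough Python terms, that scenario might look like the following. This is a sketch, not Michael's actual script (which used PowerShell); the CSV columns and the use of the Microsoft Graph B2B invitation endpoint are assumptions:

```python
import csv
import requests

GRAPH_TOKEN = "<token for https://graph.microsoft.com>"  # assumed acquired elsewhere

def invite_users(csv_path):
    """Read users from a CSV (columns: email, displayName) and send
    Azure AD B2B invitations through the Microsoft Graph /invitations API."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            response = requests.post(
                "https://graph.microsoft.com/v1.0/invitations",
                headers={"Authorization": f"Bearer {GRAPH_TOKEN}"},
                json={
                    "invitedUserEmailAddress": row["email"],
                    "invitedUserDisplayName": row["displayName"],
                    "inviteRedirectUrl": "https://myapps.microsoft.com",
                    "sendInvitationMessage": True,  # emails the invitation
                },
            )
            response.raise_for_status()

invite_users("users.csv")
```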

Michael explained how non-technical teams or individuals might be reluctant to take on such tasks, given the amount of responsibility involved, especially if they lack knowledge of what exactly is happening in the background. He went on to demonstrate how Microsoft Flow simplifies the process for non-technical people, acting as a black box that hides the individual processes and resources used by the Flow. He then displayed the process on screen, broken into tasks, to show how many steps are required to complete the action above.

He then did a demo using Minecraft, simulating a situation where a user was unable to log in due to not having access. He proceeded to add the user and grant him access through Minecraft, which then triggered a Microsoft Flow to complete the process and allow the user access.

Michael explained how crucial it is to be resourceful and keep things efficient when building tasks, since inefficiency ultimately costs the company money. As a key point, he insisted on the importance of automating repetitive tasks and how Flow can help you achieve this efficiently, reducing work overhead and costs.

Using VSTS to deploy to a BizTalk Server, what you need to know - Johan Hedberg

Johan Hedberg showcased the ALM capabilities that now come with the BizTalk 2016 feature packs.  

In a hands-on practical session, he started by creating and configuring a BizTalk application project using the BizTalk Deployment task in VSTS CI. Later he did all the VSTS plumbing to obtain a usable build agent, and he showed us how to set up the build and release pipelines, even including unit tests!

There is also support for tokenized binding files and for pre- and post-processing steps in VSTS.

This session was excellent for everyone wishing to get started with CI/CD for BizTalk. 

Exposing BizTalk Server to the World - Wagner Silveira

In this session, Wagner Silveira talked about the different ways to expose BizTalk endpoints to the outside world using Azure technologies. Reasons why you would want this include consuming on-premises resources, offering a single gateway and extending cloud workflows.

Message exchange options like queues and topics, file exchange or e-mail are undoubtedly possible, but Wagner focused on exposing HTTP endpoints. 

For this,  he went over the following options: 

  • Azure Relay Services 
  • Logic Apps 
  • Azure Function Proxies 
  • Azure API Management 

Each of these comes with different security and message format possibilities.

To make an educated decision on which Azure services to use, you should identify your needs and know about the possibilities and limitations of each option. 

Then Wagner demoed exposing endpoints using Relay Services and API Management. 

Anatomy of an Enterprise Integration Architecture - Dan Toomey

Dan Toomey gave a more architecture oriented presentation for his session at Integrate 2018. 

Typical businesses don't have one monolithic application, but multiple, and sometimes even hundreds of applications to run. Although this decreases the cost of change, it increases the cost of integration. It's a challenge to find the sweet spot.  

These different applications can be split up in layers. At the bottom there is the Systems of Record layer (CRM, SQL, ...) for transaction processing and master data; this is the layer that changes the least. One level up is the Systems of Differentiation layer, where processes unique to the business can be found. Topping it all off, you have the Systems of Innovation, where the fast-changing new applications and requirements live. These different layers have different properties concerning business processes, data/information, change control/governance, business engagement, and planning timelines.

From an integration perspective, the differences can be found in the rate of change, API design, change control and testing regime. The different layers have different options for integrating them: you'll mostly use BizTalk and API Management in the Systems of Record layer, BizTalk and Logic Apps for Differentiation, and Flow/PowerApps/Power BI for the Innovation layer. These all have different characteristics which should be considered.

Dan ended the session with some considerations you should keep in mind to integrate better. Consider how your applications will be integrated, make sure your Systems of Record layer is stable, limit customization in Systems of Record, consider using canonical data models, loosely couple inter-layer communications, and allow room for innovation. 

Unlock the power of hybrid integration with BizTalk Server and webhooks! - Toon Vanhoutte

Webhooks are a relatively new messaging pattern that may replace synchronous request-response and asynchronous polling techniques.

Using webhooks is not only faster, but it also allows for improved extensibility, and it requires no client-side state registration. 

In his talk, Toon discussed the design considerations and implementation details of webhooks in combination with BizTalk Server and Azure Event Grid.

In his demos, BizTalk Server publishes or consumes events that are sent or received through Azure Event Grid, and Azure Event Grid is responsible for managing the webhook registrations. 

For event publishing scenarios, it is essential to implement retry policies, fallback mechanisms, proper security settings, and continuous endpoint validation.

When consuming events, the focus should be on scalability, on the reliability of connections, and on the fact that the sequence of incoming messages cannot be guaranteed. Toon pointed out that this last point in particular is often forgotten.
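
For illustration, here is a minimal Python (Flask) sketch of a webhook subscriber that handles Event Grid's endpoint-validation handshake; the route and the processing logic are placeholders:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/events", methods=["POST"])
def handle_events():
    for event in request.get_json():  # Event Grid posts an array of events
        if event["eventType"] == "Microsoft.EventGrid.SubscriptionValidationEvent":
            # Event Grid validates new webhook registrations by sending a
            # code that the endpoint must echo back.
            code = event["data"]["validationCode"]
            return jsonify({"validationResponse": code})
        # Regular event: process it. Ordering is NOT guaranteed, so
        # handlers must tolerate out-of-sequence delivery.
        process(event)
    return "", 200

def process(event):
    print(event["eventType"], event["subject"])  # placeholder processing

if __name__ == "__main__":
    app.run(port=8080)
```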

Refining BizTalk implementations - Mattias Lögdberg

In this session, Mattias gave a real-life presentation of how he modernized an existing architecture to leverage cloud components. 

The webshop was moved to the cloud, which brought some challenges in connecting it to on-premises systems like the ERP system.

Azure services that were introduced include Table Storage, API Management, Logic Apps, Azure Functions, and DocumentDB. Service Bus queues and topics, as well as DocumentDB, were used to throttle the load on the ERP system.

Monolithic applications were transformed to a set of loosely coupled components, and the complex framework was turned into microservices. 

BizTalk is just as vital as before, but more tools are being added to the toolbox alongside it to execute enterprise-grade integration.

Focused roundtable with Microsoft Pro Integration Team

The last part of the day was the roundtable, where attendees put their questions directly to the Microsoft Pro Integration team.

Thank you for reading our blog post, feel free to comment or give us feedback in person.

This blogpost was prepared by:

Bart Cocquyt
Charles Storm
Danny Buysse
Jasper Defesche
Jef Cools
Jonathan Gurevich
Keith Grima
Matthijs den Haan
Michel Pauwels
Niels van der Kaap
Nils Gruson
Peter Brouwer
Ronald Lokers
Sjoerd van Lochem
Steef-Jan Wiggers
Toon Vanhoutte
Wouter Seye

Posted on Tuesday, June 5, 2018 2:55 PM

Second day of Integrate 2018! THE yearly conference for Microsoft integration. This is the day 2 recap of the sessions presented at Integrate 2018 with the views and opinions of the Codit staff.

Microsoft IT (CSE) use of Logic Apps for Enterprise Integration - Divya Swarnkar & Amit Kumar Dua

Divya opened by explaining how essential business integration is for the internal processes and overall functioning of Microsoft. The company is actively migrating its B2B partner integrations from on-premises to the cloud. They use interesting integration patterns and developed a specialized migration tool to move B2B partner agreements from BizTalk to Logic Apps.

Divya discussed two dynamic integration patterns for B2B and A2A integration scenarios, where API Management plays a central role in routing incoming messages through a sequence of Logic Apps that each have a specialized function. This approach not only facilitates versioning, exception handling and process monitoring, but also covers hybrid integration connectivity.

After some nice demos, they wrapped up the talk with some very interesting learnings.

API Management deep dive - Vladimir Vinogradsky & Miao Jiang

In this session, Vladimir Vinogradsky covered developer authentication, data plane security and deployment automation.

To implement developer authentication, you can use a combination of users, groups, products and subscriptions. Supported authentication methods are username/password, identity providers, delegated OAuth, Azure AD and Azure AD B2C.

Next, Vlad showed how to secure the API gateway using technologies like API keys, OAuth2, OpenID Connect, mutual certificates, IP filters, VNETs and network security groups. He focused quite a bit on the powerful JWT validation policy for advanced authorization.
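
The validate-jwt policy itself is configured in API Management's XML policy language; purely to illustrate the checks it performs, here is a rough Python equivalent (the JWKS URL and audience value are assumptions):

```python
import jwt                    # pip install PyJWT
from jwt import PyJWKClient

# Assumptions: tokens are issued by Azure AD; the audience value is a
# placeholder for your API's identifier.
JWKS_URL = "https://login.microsoftonline.com/common/discovery/keys"
AUDIENCE = "<your-api-identifier>"

def validate(token):
    """Reject the call unless the bearer token has a valid signature,
    audience and expiry (roughly what the validate-jwt policy checks)."""
    signing_key = PyJWKClient(JWKS_URL).get_signing_key_from_jwt(token)
    return jwt.decode(token, signing_key.key,
                      algorithms=["RS256"], audience=AUDIENCE)
```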

Finally, he talked about deployment automation. You can include ARM templates in your CI/CD pipelines to publish APIs and subresources to Azure. These ARM templates can be constructed in a development environment and then merged into a publisher repository with pull requests.

Logic Apps Deep Dive - Matt Farmer & Kevin Lam

Kevin Lam kicked off this deep dive session with some recaps of last year concerning the Logic Apps designer and internal workflow engine.

Then came the interesting stuff: Integration Service Environments will become available at the end of the year. This will spin up dedicated and isolated compute power in which to run your Logic Apps. The base model, which will have a fixed price, will allow you to run 50M actions per month and includes one standard Integration Account and one enterprise connector with unlimited connections. It will also allow VNET connectivity to on-premises systems. If 50M actions is too little for you, you can buy additional processing units of 50M actions each.

Afterwards, Matt Farmer took to the stage and talked about custom connectors. He showed us how to easily create your own custom connector for any SOAP/REST API by using a simple wizard. To finish off he enlightened us about the requirements for connector certification.

Logic Apps Patterns & Best Practices - Kevin Lam & Derek Li

Kevin Lam was back again after the break to talk about Logic Apps patterns and best practices. He walked us through common workflow and messaging patterns. In a demo, he showed how easy it is to implement a peek-lock messaging pattern with Logic Apps: it took only 15 seconds.

He talked about the try/catch possibilities for making robust enterprise integrations. He stressed that the default mode of Logic Apps is to run in parallel, but you can control the level of concurrency!
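
Logic Apps gives you peek-lock declaratively; for comparison, here is the same pattern hand-coded against Service Bus, as a sketch using the azure-servicebus Python SDK with a placeholder connection string and queue name:

```python
from azure.servicebus import ServiceBusClient  # pip install azure-servicebus

CONN_STR = "<service-bus-connection-string>"   # placeholder
QUEUE = "orders"                               # placeholder queue name

def handle(message):
    print(str(message))  # your processing logic goes here

# Peek-lock: each message is locked while being processed and only
# removed once completed; abandoning it on failure makes it visible again.
with ServiceBusClient.from_connection_string(CONN_STR) as client:
    with client.get_queue_receiver(queue_name=QUEUE) as receiver:
        for message in receiver:
            try:
                handle(message)
                receiver.complete_message(message)
            except Exception:
                receiver.abandon_message(message)
```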

Then Derek Li took over to show some best practices, focusing on variables and concurrency. Variables are global: something you should be aware of when using them inside a for-each, as they can give unexpected results. He showcased several possibilities for looping through an array using conditions, explained how you can use the filter-array action, and how splitting can be achieved by calling a child Logic App with a SplitOn trigger.
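
Derek's warning is easy to reproduce outside Logic Apps too; this small Python illustration shows why parallel iterations mutating one shared variable give unexpected results:

```python
from concurrent.futures import ThreadPoolExecutor

counter = 0  # one "global" variable shared by all iterations

def increment(_):
    global counter
    for _ in range(100_000):
        counter += 1  # read-modify-write is not atomic: updates can be lost

with ThreadPoolExecutor(max_workers=8) as pool:
    list(pool.map(increment, range(8)))

# Without synchronization, this often prints less than the expected 800000.
print(counter)
```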

Microsoft Integration Roadmap - Jon Fancey & Matt Farmer

Jon and Matt kicked off this session by giving their view on integration and a retrospective of the past year. They are proud to have reached leader status in Gartner's iPaaS quadrant.

Features that will be available soon in Logic Apps are:

  • Availability in China cloud 
  • Smart designer, including favorites and intelligent suggestions 
  • Integration Service Environments 
  • Testability and mocking 
  • Ability to run in Azure Stack 
  • Managed Service Identity for all Azure connectors 
  • OAuth for request triggers

Next to that, there's the great announcement of Azure Integration Services, where Logic Apps, API Management, Service Bus and Event Grid will be positioned under one umbrella. This is not a pure marketing effort: they're really looking to glue these services together even better by creating reference architectures, templates for common solution patterns, and guidance on CI/CD, disaster recovery and scalability.

Azure Integration Services will continue to on-board BizTalk functionality (business rules, HL7, RosettaNet…) and give it a cloud-native implementation. BizTalk Server is not positioned in this picture, but it will continue to play a crucial integration role for many customers.

What's there & what's coming in BizTalk360 & ServiceBus360 - Saravana Kumar

The first part of Saravana's session covered BizTalk360, which mainly provides operations, monitoring and analytics for BizTalk environments. A nice demo showed the value of BizTalk360, with customizable dashboards constructed from predefined widgets and all admin tools combined in one web interface. Monitoring of several Azure services has been added to the product, mainly to cover the management of Logic Apps for hybrid solutions, as more and more customers are exploring hybrid scenarios.

The second part focused on ServiceBus360, which will be renamed to Serverless360! The main goal of Serverless360 is to get rid of the silo offering in Azure and to approach monitoring through serverless composite applications. After Saravana concluded his demo, he went on to explain the security features available in Serverless360, such as governance for serverless apps and secure user access at the application level, along with nice features like edit/resubmit, auto-cleaning and more. He concluded his talk with the choice of hosting: SaaS or private hosting.

Serverless Messaging with Microsoft Azure - Steef-Jan Wiggers

Our very own Steef-Jan started his session with an introduction to serverless in general. He compared the Azure serverless messaging services with the four Daltons: they work together nicely and don't compete with each other. Here's an overview of the four Daltons:

  • Service Bus for enterprise messaging. "Business state transition"
  • Event Hubs for big data streams. "Flow data and telemetry in real-time"
  • Storage queues for simple tasks. "Coordinate simple tasks across compute"
  • Event Grid for reactive programming. "Reacting dynamically to the work around you"

He continued with impressive demos of interesting scenarios. He sent messages to Event Hubs using the .NET Kafka libraries. He showcased a tollbooth license plate recognition solution, leveraging Azure Event Grid integration with storage accounts. Besides that, he showed the power of the Cosmos DB change feed in a serverless home automation scenario. Finally, he showed messaging with Kubernetes.
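
Steef-Jan used the .NET Kafka libraries; the same idea in Python with kafka-python would look roughly like this (the namespace and connection string are placeholders; Event Hubs exposes a Kafka-compatible endpoint on port 9093):

```python
from kafka import KafkaProducer  # pip install kafka-python

# Event Hubs speaks the Kafka protocol; authenticate over SASL PLAIN
# with the literal username "$ConnectionString".
producer = KafkaProducer(
    bootstrap_servers="<namespace>.servicebus.windows.net:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="$ConnectionString",
    sasl_plain_password="<event-hubs-connection-string>",
)
producer.send("myeventhub", b"hello from kafka")  # topic = event hub name
producer.flush()
```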

To summarize, he gave the following architectural advice:

  • Choose the right messaging service(s)
  • Look at the type of workloads
  • Think about security and compliance
  • Consider whether there are cross-platform requirements
  • Take into account cost when architecting messaging solutions
  • Automate your devops process

What's there & what's coming in Atomic Scope & Document360 - Saravana Kumar

According to Saravana Kumar, Atomic Scope provides end-to-end visualization for business integration processes. Some instrumentation is needed to be able to use the tool, but you gain a significant amount of time compared to writing a custom monitoring solution yourself. You can track BizTalk and Logic Apps integrations and messages from a functional perspective. According to Bart Scheurweghs, the added value lies in message tracking, message archiving, business exceptions, easy user configuration and dashboards. During a live demo, he showed an actual customer scenario, including exceptions. The session ended with some feedback from the field, with good suggestions that will be placed on the roadmap of Atomic Scope.

BizTalk Server: Lessons from the Road - Sandro Pereira

Sandro Pereira was up next, with some ‘Lessons from the Road’. He started off by telling us to use the right tool for the right job, stressing in his usual sarcastic style that BizTalk or Logic Apps might not always be that tool.

He told us to ‘stop logging everything’: though BizTalk is GDPR-compliant in itself, when you log entire messages, your application may not be. Another point he mentioned was that we should be careful with feature packs, because they might break our code (he called out Microsoft for removing an assembly from the GAC instead of just adding a new version).

Cumulative updates, on the other hand, should always be installed. He finished his session by talking about patterns like singletons and sequence convoys, which in his opinion should be avoided, as they can produce issues like ever-running instances or even zombies. According to Sandro, you can use alternative designs to avoid these types of patterns. All in all, an entertaining session with some nice tips!

Using BizTalk Server as your Foundation to the Clouds - Stephen W Thomas

The last session of the day was by Stephen W. Thomas, who showed us some use cases for connecting BizTalk Server with the cloud. The session was split into two parts: the first one was about how and why to connect BizTalk to your Logic Apps.

One reason to include Logic Apps is when you want to use special connectors for social media monitoring, customer communications, cross-team communication and incident management. The second use case is batching: everybody has certainly done this before in BizTalk, but Logic Apps is a lot better at it, which he showed in a quick demo. You can do your inbound de-batching before sending the actual data to BizTalk. Inbound FTP/SFTP connections are another use case where Logic Apps can relieve BizTalk of the grunt work. And the last use case is using Logic Apps to replace your reverse proxy. In all these use cases, you use BizTalk just for the actual on-premises processing.

The second part of the session covered some friction points that clients might raise when they don't want to move to Logic Apps or Azure in general. For every friction point, Stephen gave some pointers on how to handle it. Some will say "we already have BizTalk, why not use that", or that their data is too sensitive, or that the infrastructure manager says no. Another objection is that the learning curve for Logic Apps is too high, or that Azure changes too frequently. The last friction point was the CEO/CTO saying no to the cloud.

Thank you for reading our blog post, feel free to comment or give us feedback in person.

This blogpost was prepared by:

Bart Cocquyt
Charles Storm
Danny Buysse
Jasper Defesche
Jef Cools
Jonathan Gurevich
Keith Grima
Matthijs den Haan
Michel Pauwels
Niels van der Kaap
Nils Gruson
Peter Brouwer
Ronald Lokers
Steef-Jan Wiggers
Toon Vanhoutte
Wouter Seye