
Codit Blog

Posted on Thursday, July 20, 2017 6:26 PM

by Toon Vanhoutte

Jon Fancey announced the out-of-the-box batching feature in Logic Apps at Integrate! Early this morning, I noticed by accident that this feature has already been released in West Europe. This blog contains a short preview of the batching functionality. There will definitely be a follow-up with more details and scenarios!

Batching requires two processes:

  • Batch ingestion: responsible for queueing messages into a specific batch
  • Batch release: responsible for dequeuing the messages from a specific batch when certain criteria are met (time, number of messages, external trigger…)

Batch Release

In Logic Apps, you must start with the batch release Logic App, as you will need to reference it from the batch ingestion workflow. This prevents you from sending messages into a batch that does not exist! This is how the batch release trigger looks:

You need to provide:

  • Batch Name: the name of your batch
  • Message Count: the number of messages required in the batch before it gets released

More release criteria will definitely be supported in the future.

Batch Ingestion

Now you can inject messages into the batch. To do so, I created a simple request / response Logic App that contains the Send messages to batch action. First you need to select the previously created Logic App that is responsible for the batch release.

Once you've done this, you can specify all required info.

You need to provide:

  • Batch Name: the name of the batch. This will be validated at runtime!
  • Message Content: the content of the message to be batched.
  • Partition Name: specify a "sub-batch" within the batch. In my scenario, all invoices for one particular customer will be batched together. If empty, the partition will be DEFAULT.
  • MessageId: a message identifier. If empty, a GUID will be generated.

The result

I've triggered the batch-ingest Logic App many times. This queues messages within the batch.

Each time 5 messages belonging to the same partition are available in the batch, the batch release Logic App gets triggered.

The output looks like this:

Conclusion

Very happy to see this has been added to the product, as batching is still a common requirement nowadays. I expected this to be part of the Integration Account, so it's cool to see there is no dependency on it. The batch release process does not use a polling trigger, which also saves you some additional costs.

I'll get in touch with the product group with some feedback, but this already looks very promising!

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte

Posted on Friday, June 16, 2017 5:00 PM

by Toon Vanhoutte

Lately, I was working at a customer that has invested heavily in BizTalk Server on premises during the last decade. They are considering migrating parts of their existing integrations towards Logic Apps, to leverage the smooth integration with modern SaaS applications. A very important aspect is the ability to reuse their existing schemas and maps as much as possible. BizTalk schemas and maps can easily be used within the Logic Apps Integration Account, but there is no support for extension objects at the moment. Let's have a look at how we tackled this problem.

Extension objects are used to consume external .NET libraries from within XSLT maps. This is often required to perform database lookups or complex functions during a transformation. Read more about extension objects in this excellent blog.
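To make the mechanism concrete, here is a minimal sketch of how an extension object is typically wired into an XSLT transformation in .NET. The namespace URI, class and method names below are placeholders for illustration, not the customer's actual code.

```csharp
using System.IO;
using System.Xml;
using System.Xml.Xsl;

public static class TransformExample
{
    public static string Transform(string inputXmlPath, string xsltPath)
    {
        var xslt = new XslCompiledTransform();
        xslt.Load(xsltPath);

        // Register the extension object under the namespace URI that the XSLT
        // declares, e.g. xmlns:ext="urn:my-extension" (placeholder URI).
        var arguments = new XsltArgumentList();
        arguments.AddExtensionObject("urn:my-extension", new LookupHelper());

        using (var reader = XmlReader.Create(inputXmlPath))
        using (var writer = new StringWriter())
        {
            xslt.Transform(reader, arguments, writer);
            return writer.ToString();
        }
    }
}

// Placeholder extension object: its public methods become callable
// from the XSLT as ext:GetCountryCode('Belgium').
public class LookupHelper
{
    public string GetCountryCode(string countryName)
    {
        // In real scenarios this is where a database lookup would happen.
        return countryName == "Belgium" ? "BE" : "XX";
    }
}
```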

Analysis

Requirements

We are facing two big challenges:

  1. We must execute the existing XSLTs with extension objects in Logic Apps
  2. On-premises Oracle and SQL databases must be accessed from within these maps

Analysis

It's clear that we should extend Logic Apps with non-standard functionality. This can be done by leveraging Azure Functions or Azure API Apps. Both allow custom coding, integrate seamlessly with Logic Apps and offer the following hybrid network options (when using App Service Plans):

  • Hybrid Connections: most applicable for lightweight integrations and development / demo purposes
  • VNET Integration: if you want to access a number of on-premises resources through your Site-to-Site VPN
  • App Service Environment: if you want to access a high number of on-premises resources via ExpressRoute

As the pricing models are quite similar, because we must use an App Service Plan in both cases, we chose Azure API Apps. The main reason was the existing WebAPI knowledge within the organization.

Design

A Site-to-Site VPN is used to connect to the on-premises SQL and Oracle databases. By using a standard App Service Plan, we can enable VNET integration on the custom Transform API App. Behind the scenes, this creates a Point-to-Site VPN between the API App and the VNET, as described here. The Transform API App can easily be consumed from the Logic App, while being secured with Active Directory authentication.

Solution

Implementation

The following steps were needed to build the solution. More details can be found in the referenced documentation.

  1. Create a VNET in Azure. (link)
  2. Setup a Site-to-Site VPN between the VNET and your on-premises network. (link)
  3. Develop an API App that executes XSLTs with their corresponding extension objects. (link)
  4. Provide Swagger documentation for the API App. (link)
  5. Deploy the API App. Expose the Swagger metadata and configure CORS policy. (link)
  6. Configure VNET Integration to add the API App to the VNET. (link)
  7. Add Active Directory authentication to the API App. (link)
  8. Consume the API App from within Logic Apps.

Transform API

The source code of the Transform API can be found here. It leverages Azure Blob Storage to retrieve the required files. The Transform API must be configured with the required app settings, which define the blob storage connection string and the containers to which the artefacts will be uploaded.

The Transform API offers one Transform operation, which requires 3 parameters (a sketch of the contract follows below the list):

  • InputXml: the byte[] that needs to be transformed
  • MapName: the blob name of the XSLT map to be executed
  • ExtensionObjectName: the blob name of the extension object to be used
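As an illustration only, a WebAPI contract for such an operation could look like the sketch below. The class name, route and implementation are assumptions; the actual source code is the one referenced above.

```csharp
using System.Text;
using System.Web.Http;

// Request contract for the Transform operation. The property names come from
// the parameter list above; the class and route names are assumptions.
public class TransformRequest
{
    public byte[] InputXml { get; set; }            // the XML payload to transform
    public string MapName { get; set; }             // blob name of the XSLT map
    public string ExtensionObjectName { get; set; } // blob name of the extension object definition
}

public class TransformController : ApiController
{
    [HttpPost, Route("api/transform")]
    public IHttpActionResult Transform(TransformRequest request)
    {
        // The real implementation downloads the map and extension object from blob
        // storage, registers the extension objects and executes the XSLT. Here the
        // input is simply echoed back to keep the sketch self-contained.
        string inputAsString = Encoding.UTF8.GetString(request.InputXml);
        return Ok(inputAsString);
    }
}
```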

Sample

You can run this sample to test the Transform API with custom extension objects.

Input XML

This is a sample message that can be provided as input for the Transform action.

Transformation XSLT

This XSLT must be uploaded to the right blob storage container and will be executed during the Transform action.

Extension Object XML

This extension object must be uploaded to the right blob storage container and will be used to load the required assemblies.

External Assembly

Create an assembly named TVH.Sample.dll that contains the class Common.cs. This class contains a simple method to generate a GUID. Upload this assembly to the right blob storage container, so it can be loaded at runtime.
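As a minimal sketch, Common.cs could look like the snippet below; the method name is an assumption, the class only needs a public method that returns a new GUID.

```csharp
using System;

namespace TVH.Sample
{
    // Compiled into TVH.Sample.dll and uploaded to blob storage,
    // so it can be loaded as an extension object at runtime.
    public class Common
    {
        // Returns a new GUID as a string; callable from the XSLT
        // via the registered extension object namespace.
        public string GenerateGuid()
        {
            return Guid.NewGuid().ToString();
        }
    }
}
```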

Output XML

Deploy the Transform API, using the instructions on GitHub. You can easily test it using the Request / Response actions:

As a response, you should get the following output XML, which contains the generated GUID.

Important remark: Do not forget to add security to your Transform API (Step 7), as it is accessible on the public internet by default!

Conclusion

Thanks to the Logic Apps extensibility through API Apps and their VNET integration capabilities, we were able to build this solution in a very short time span. The solution offers an easy way to migrate BizTalk maps as-is towards Logic Apps, which is a big time saver! Access to resources that remain on premises is also a big plus nowadays, as many organizations have a hybrid application landscape.

Hope to see this functionality out-of-the-box in the future, as part of the Integration Account!

Thanks for reading. Sharing is caring!
Toon

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte

Posted on Thursday, June 8, 2017 3:11 PM

by Toon Vanhoutte

This post contains 10 useful tips for designing enterprise integration solutions on top of Logic Apps. It's important to think upfront about reusability, reliability, security, error handling and maintenance.

Democratization of integration

Before we dive into the details, I want to provide some reasoning behind this post. With the rise of cloud technology, integration takes a more prominent role than ever before. In Microsoft's integration vision, democratization of integration is at the top of the list.

Microsoft aims to take integration out of its niche market and offer it as an intuitive and easy-to-use service to everyone. The so-called Citizen Integrators are now capable of creating lightweight integrations without the steep learning curve that, for example, BizTalk Server requires. Such integrations are typically point-to-point, user-centric and have some accepted level of fault tolerance.

As an Integration Expert, you must be aware of this. Enterprise integration faces completely different requirements than lightweight citizen integration: loose coupling is required, no message loss is accepted because the interfacing is mission critical, integrations must be optimized for operations personnel (monitoring and error handling), etc…

Keep this in mind when designing Logic App solutions for enterprise integration! Make sure you know your cloud and integration patterns. Ensure you understand the strengths and limits of Logic Apps. The advice below can give you a jump start in designing reliable interfaces within Logic Apps!

Design enterprise integration solutions

1. Decouple protocol and message processing

Once you have created a Logic App that receives a message via a specific transport protocol, it's extremely difficult to change the protocol afterwards. This is because the subsequent actions of your Logic App often have a hard dependency on your protocol trigger / action. The advice is to perform the protocol handling in one Logic App and hand the message over to another Logic App that performs the message processing. This decoupling allows you to change the receiving transport protocol in a flexible way, in case the requirements change or in case a certain protocol (e.g. SFTP) is not available in your DEV / TEST environment.

2. Establish reliable messaging

You must realize that every action you execute is performed by an underlying HTTP connection. By its nature, an HTTP request/response is not reliable: the service is not aware if the client disconnects during request processing. That's why receiving messages must always happen in two phases: first you mark the data as returned by the service; second you label the data as received by the client (in our case the Logic App). The Service Bus Peek-Lock pattern is a great example that provides such at-least-once reliability. Another example can be found here.
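To illustrate the two-phase principle, here's a minimal peek-lock sketch with the Service Bus SDK; the queue name and connection string setting are placeholders. The Logic Apps Service Bus connector performs the equivalent steps when you combine a peek-lock receive with a Complete action.

```csharp
using System;
using Microsoft.ServiceBus.Messaging;

class PeekLockExample
{
    static void Main()
    {
        string connectionString = Environment.GetEnvironmentVariable("SERVICEBUS_CONNECTIONSTRING");
        QueueClient client = QueueClient.CreateFromConnectionString(connectionString, "orders", ReceiveMode.PeekLock);

        // Phase 1: the message is handed to us, but stays locked on the queue.
        BrokeredMessage message = client.Receive(TimeSpan.FromSeconds(30));
        if (message != null)
        {
            try
            {
                string body = message.GetBody<string>(); // assumes a string-serialized body
                Console.WriteLine(body);

                // Phase 2: confirm successful processing, which removes the message from the queue.
                message.Complete();
            }
            catch (Exception)
            {
                // Release the lock so the message becomes visible again: at-least-once delivery.
                message.Abandon();
                throw;
            }
        }
    }
}
```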

3. Design for reuse

Real enterprise integration is composed of several common integration tasks such as: receive, decode, transform, debatch, batch, enrich, send, etc… In many cases, each task is performed by a combination of several Logic App actions. To avoid reconfiguring these tasks over and over again, you need to design the solution upfront to encourage reuse of these common integration tasks. You can for example use the Process Manager pattern that orchestrates the message processing by reusing nested Logic Apps or introduce the Routing Slip pattern to build integration on top of generic Logic Apps. Reuse can also be achieved on the deployment side, by having some kind of templated deployments of reusable integration tasks.

4. Secure your Logic Apps

From a security perspective, you need to take into account both role-based access control to your Logic App resources and runtime security considerations. RBAC can be configured in the Access Control (IAM) tab of your Logic App or on a Resource Group level. The runtime security really depends on the triggers and actions you're using. As an example: Request endpoints are secured via a Shared Access Signature that must be part of the URL, and IP restrictions can be applied. Azure API Management is the way to go if you want to govern API security centrally, on a larger scale. It's a good practice to assign the minimum required privileges (e.g. read only) to your Logic Apps.

5. Think about idempotence

Logic Apps can be considered composite services, built on top of several APIs. APIs leverage the HTTP protocol, which can cause data consistency issues due to its nature. As described in this blog, there are multiple ways the client and server can get misaligned about the processing state. In such situations, clients will mostly retry automatically, which could result in the same data being processed twice at server side. Idempotent service endpoints are required in such scenarios, to avoid duplicate data entries. Logic Apps connectors that provide Upsert functionality are very helpful in these cases.
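As a small illustration of the idea, assuming the target is a SQL table with a natural key, an upsert ensures that a retried request does not create a duplicate row; the table and column names below are made up for this sketch.

```csharp
using System.Data.SqlClient;

class UpsertExample
{
    // Idempotent write: processing the same customer message twice
    // leaves exactly one row with the latest values.
    public static void UpsertCustomer(string connectionString, string customerId, string name)
    {
        const string sql = @"
            MERGE dbo.Customer AS target
            USING (SELECT @Id AS Id, @Name AS Name) AS source
            ON target.Id = source.Id
            WHEN MATCHED THEN UPDATE SET target.Name = source.Name
            WHEN NOT MATCHED THEN INSERT (Id, Name) VALUES (source.Id, source.Name);";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@Id", customerId);
            command.Parameters.AddWithValue("@Name", name);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```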

6. Have a clear error handling strategy

With the rise of cloud technology, exception and error handling become even more important. You need to cope with failure when connecting to multiple on-premises systems and cloud services. With Logic Apps, retry policies are your first resort to build resilient integrations. You can configure a retry count and interval at every action, but there's no support for exponential retries or the circuit breaker pattern. In case the retry policy doesn't solve the issue, it's advised to return a clear error description within sync integrations and to ensure a resumable workflow within async integrations. Read here how you can design a good resume / resubmit strategy.

7. Ensure decent monitoring

Every IT solution benefits from good monitoring. It provides visibility and improves the operational experience for your support personnel. If you want to expose business properties within your monitoring, you can use Logic Apps custom outputs or tracked properties. These can be consumed via the Logic Apps Workflow Management API or via OMS Log Analytics. From an operational perspective, it's important to be aware that there is an out-of-the-box alerting mechanism that can send emails or trigger Logic Apps in case a run fails. Unfortunately, Logic Apps has no built-in support for Application Insights, but you can leverage extensibility (a custom API App or Azure Function) to achieve this. If your integration spans multiple Logic Apps, you must provide for correlation in your monitoring / tracing! Find more details about monitoring in Logic Apps here.

8. Use async wherever possible

Solid integrations are often characterized by asynchronous messaging. Unless the business requirements really demand request/response patterns, try to implement them asynchronously. It comes with the advantage that you introduce real decoupling, both from a design and runtime perspective. Introducing a queuing system (e.g. Azure Service Bus) in fire-and-forget integrations results in highly scalable solutions that can handle an enormous amount of messages. Retry policies in Logic Apps must have different settings depending on whether you're dealing with async or sync integration. Read more about it here.

9. Don't forget your integration patterns

Whereas BizTalk Server forces you to design and develop in specific integration patterns, Logic Apps is more intuitive and easier to use. This comes with a potential downside: you might forget about integration patterns, because they are not suggested by the service itself. As an integration expert, it's your duty to determine which integration patterns should be applied on your interfaces. Loose coupling is common in enterprise integration. You can for example introduce Azure Service Bus, which provides a Publish/Subscribe architecture. Its message size limitation can be worked around by leveraging the Claim Check pattern, with Azure Blob Storage. This is just one example of introducing enterprise integration patterns.
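As a rough sketch of the Claim Check idea, assuming a 'payloads' blob container and an 'orders' queue (both names made up for this example): the large payload is stored in Blob Storage and only a lightweight reference message travels over Service Bus.

```csharp
using System;
using Microsoft.ServiceBus.Messaging;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class ClaimCheckExample
{
    static void Main()
    {
        string storageConnection = Environment.GetEnvironmentVariable("STORAGE_CONNECTIONSTRING");
        string serviceBusConnection = Environment.GetEnvironmentVariable("SERVICEBUS_CONNECTIONSTRING");
        string largePayload = "<Order>...a payload larger than the Service Bus message size limit...</Order>";

        // 1. Store the large payload in blob storage.
        CloudStorageAccount account = CloudStorageAccount.Parse(storageConnection);
        CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("payloads");
        container.CreateIfNotExists();
        string blobName = Guid.NewGuid().ToString();
        container.GetBlockBlobReference(blobName).UploadText(largePayload);

        // 2. Send only a lightweight claim ticket over Service Bus.
        QueueClient client = QueueClient.CreateFromConnectionString(serviceBusConnection, "orders");
        BrokeredMessage claimTicket = new BrokeredMessage();
        claimTicket.Properties["BlobName"] = blobName;
        client.Send(claimTicket);
    }
}
```

The receiving side reads the BlobName property, downloads the payload from Blob Storage and deletes the blob once processing succeeds.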

10. Apply application lifecycle management (ALM)

The move to a PaaS architecture should be done carefully and must be governed well, as described here. Developers should not have full access to the production resources within the Azure portal, because the change of one small setting can have an enormous impact. Therefore, it's very important to set up ALM, to deploy your Logic App solutions throughout the DTAP-street. This ensures uniformity and avoids human deployment errors. Check this video to get a head start on continuous integration for Logic Apps and read this blog on how to use Azure Key Vault to retrieve passwords within ARM deployments. Consider ALM an important aspect of your disaster recovery strategy!

Conclusion

Yes, we can! Logic Apps really is a fit for enterprise integration, if you know what you're doing! Make sure you know your cloud and integration patterns. Ensure you understand the strengths and limits of Logic Apps. The Logic App framework is a truly amazing and stable platform that brings a whole range of new opportunities to organizations. The way you use it should depend on the type of integration you are facing!

Interested in more?  Definitely check out this session about building loosely coupled integrations with Logic Apps!

Any questions or doubts? Do not hesitate to get in touch!
Toon

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte

Posted on Thursday, April 27, 2017 5:22 PM

by Toon Vanhoutte

This blog post dives into the details of how you can achieve batching with Logic Apps. Batching is still a highly demanded feature for a middleware layer. It's mostly introduced to reduce the performance impact on the target system or for functional purposes. Let's have a closer look.

Scenario

For this blog post, I decided to try to batch the following XML message. As Logic Apps supports JSON natively, we can assume that a similar setup will work quite easily for JSON messages. Note that the XML snippet below contains an XML declaration, so pure string appending won't work. Namespaces are included as well.

Requirements

I came up with the following requirements for my batching solution:

  • External message store: in integration I like to avoid long-running workflow instances at all times. Therefore I prefer messages to be stored somewhere out of the process, waiting to be batched, instead of keeping them active in a singleton workflow instance (e.g. a BizTalk sequential convoy).

  • Message and metadata together: I want to avoid storing the message in one place and the metadata in another. Keep them together, to simplify development and maintenance.

  • Native Logic Apps integration: preferably I can leverage an Azure service that has native and smooth integration with Azure Logic Apps. It must ensure we can reliably assign messages to a specific batch and we must be able to remove them easily from the message store.

  • Multiple batch release triggers: I want to support multiple ways to decide when a batch can be released.
    > # Messages: send out batches containing X messages each
    > Time: send out a batch at a specific time of the day
    > External Trigger: release the batch when an external trigger is received

Solution

After some analysis, I was convinced that Azure Service Bus queues are a good fit:

  • External message store: the messages can be queued for a long time in an Azure Service Bus queue.

  • Message and metadata together: the message is placed together with its properties on the queue. Each batch configuration can have its own queue assigned.

  • Native Logic Apps integration: there is a Service Bus connector to receive multiple messages inside one Logic App instance. With the peek-lock pattern, you can reliably assign messages to a batch and remove them from the queue.

  • Multiple batch release triggers:
    > # Messages: In the Service Bus connector, you can choose how many messages you want to receive in one Logic App instance

    > Time: Service Bus has a great property, ScheduledEnqueueTimeUtc, which ensures that a message only becomes visible on the queue from a specific moment in time. This is a great way to schedule messages to be released at a specific time, without the need for an external scheduler.

    > External Trigger: The Logic App can easily be instantiated via the native HTTP Request trigger.

Implementation

Batching Store

The goal of this workflow is to put the message on a specific queue for batching purposes. This Logic App is very straightforward to implement. Add a Request trigger to receive the messages that need to be batched and use the Send Message Service Bus connector to send the message to a specific queue.

In case you want to release the batch only at a specific moment in time, you must provide a value for the ScheduledEnqueueTimeUtc property in the advanced settings.
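To make the semantics of that property concrete, here's a small sketch with the Service Bus SDK; the queue name and release time are arbitrary for this example. The Logic Apps connector simply sets the same property for you.

```csharp
using System;
using Microsoft.ServiceBus.Messaging;

class ScheduledMessageExample
{
    static void Main()
    {
        string connectionString = Environment.GetEnvironmentVariable("SERVICEBUS_CONNECTIONSTRING");
        QueueClient client = QueueClient.CreateFromConnectionString(connectionString, "batch-invoices");

        BrokeredMessage message = new BrokeredMessage("<Invoice>...</Invoice>");

        // The message is accepted immediately, but stays invisible on the queue
        // until 6 PM UTC today, so the batch release only picks it up from then on.
        message.ScheduledEnqueueTimeUtc = DateTime.UtcNow.Date.AddHours(18);

        client.Send(message);
    }
}
```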

Batching Release

This is the more complex part of the solution. The first challenge is to receive, for example, 3 messages in one Logic App instance. My first attempt failed, because there is apparently a difference in behaviour between the Service Bus receive trigger and action:

  • When one or more messages arrive in a queue: this trigger receives messages in a batch from a Service Bus queue, but it creates a specific Logic App instance for every message. This is not desired for our scenario, but can be very useful in high throughput scenarios.

  • Get messages from a queue: this action can receive multiple messages in batch from a Service Bus queue. This results in an array of Service Bus messages inside one Logic App instance. This is the result that we want for this batching exercise!

Let's use the peek-lock pattern to ensure reliability and receive 3 messages in one batch:

As a result, we get this JSON array back from the Service Bus connector:

The challenge is to parse this array, decode the base64 content in the ContentData and create a valid XML batch message from it. I tried several complex Logic App expressions, but soon realized that Azure Functions is better suited to take care of this complicated parsing. I created the following Azure Function, as a Generic Webhook C# type:
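As a rough, self-contained sketch of the parsing logic the function performs (the root element name and the exact shape of the connector output are assumptions), it boils down to this:

```csharp
using System;
using System.Text;
using System.Xml.Linq;
using Newtonsoft.Json.Linq;

public static class GetBatchMessageSketch
{
    // Takes the JSON array returned by the "Get messages from a queue" action,
    // decodes the base64 ContentData of every message and wraps the individual
    // XML documents in one batch root element.
    public static string BuildBatch(string serviceBusMessagesJson)
    {
        JArray messages = JArray.Parse(serviceBusMessagesJson);
        var batch = new XElement("Batch"); // root element name is an assumption

        foreach (JToken message in messages)
        {
            byte[] contentBytes = Convert.FromBase64String((string)message["ContentData"]);
            string xml = Encoding.UTF8.GetString(contentBytes);

            // Parsing and re-adding the root element drops the XML declaration,
            // so the individual documents can be nested inside the batch.
            batch.Add(XDocument.Parse(xml).Root);
        }

        return new XDocument(batch).ToString();
    }
}
```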

Let's consume this function now from within our Logic App.  There is seamless integration with Logic Apps, which is really great!


As an output of the GetBatchMessage Azure Function, I get the following XML :-)

Large Messages

This solution is very nice, but what about large messages? Recently, I wrote a Service Bus connector that uses the claim check pattern, which exchanges large payloads via Blob Storage. In this batching scenario we can also leverage this functionality. When I have open-sourced this project, I'll update this blog with a working example. Stay tuned for more!

Conclusion

This is a great and flexible way to perform batching within Logic Apps. It really demonstrates the power of the Better Together story with Azure Logic Apps, Service Bus and Functions. I'm sure this is not the only way to perform batching in Logic Apps, so do not hesitate to share your solution for this common integration challenge in the comments section below!

I hope this gave you some fresh insights in the capabilities of Azure Logic Apps!
Toon

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte

Posted on Monday, March 6, 2017 2:18 PM

by Toon Vanhoutte

Lately, I was working on a proof of concept with Logic Apps. I created a Logic App that looped over all files in a specific folder and processed them. I was curious about the performance, so I copied 10,000 files into the folder. The Logic App kicked off and started processing them. After a while the Logic App ended up in a failed state. Let's have a look at what happened.

Apparently, I stumbled upon a limitation that is documented over here.

The documentation suggests using the query action to filter arrays. It offers the following capabilities:

You can indeed filter an array, based on a specific query. However, I did not have an object available that contains the position of an item within the array. I wanted assurance that I would not hit the for-each limit, so this was not an option. If you know a way to get the position within the array, please let me know via the comments section below.

I continued my search within the Logic Apps Workflow Definition Language and found the @take function.  This is what the documentation states:

This did the trick. It takes the first 5000 items from an array, e.g. @take(<your array>, 5000). Luckily, you do not get an out-of-range exception if the incoming array contains fewer items!

It's always a good practice to validate your design upfront against the Logic Apps limits.

Hope this helps!
Toon

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte