
Codit Blog

Posted on Thursday, July 20, 2017 6:26 PM

by Toon Vanhoutte

At Integrate, Jon Fancey announced the out-of-the-box batching feature in Logic Apps! Early this morning, I noticed by accident that this feature has already been released in West Europe. This blog post contains a short preview of the batching functionality. There will definitely be a follow-up with more details and scenarios!

Batching involves two processes:

  • Batch ingestion: responsible for queueing messages into a specific batch
  • Batch release: responsible for dequeuing the messages from a specific batch once certain criteria are met (time, number of messages, external trigger…)

Batch Release

In Logic Apps, you must start with the batch release Logic App, as you will need to reference it from the batch ingestion workflow. This avoids sending messages into a batch that does not exist! This is what the batch release trigger looks like:

You need to provide:

  • Batch Name: the name of your batch
  • Message Count: specify the number of messages required in the batch to release it
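In the underlying workflow definition, this batch release trigger takes roughly the following shape. This is an illustrative sketch: the trigger and batch names are made up, and the exact schema may evolve while the feature is in preview.

```json
{
  "triggers": {
    "Batch_messages": {
      "type": "Batch",
      "inputs": {
        "mode": "Inherited",
        "configurations": {
          "InvoiceBatch": {
            "releaseCriteria": {
              "messageCount": 5
            }
          }
        }
      }
    }
  }
}
```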

More release criteria will definitely be supported in the future.

Batch Ingestion

Now you can inject messages into the batch. To do so, I created a simple request / response Logic App that contains the Send messages to batch action. First you need to specify the previously created Logic App that is responsible for the batch release.

Once you've done this, you can specify all required info.

You need to provide:

  • Batch Name: the name of the batch. This will be validated at runtime!
  • Message Content: the content of the message to be batched.
  • Partition Name: specify a "sub-batch" within the batch. In my scenario, all invoices for one particular customer will be batched together. If empty, the partition will be DEFAULT.
  • MessageId: a message identifier. If empty, a GUID will be generated.
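Behind the designer, the ingestion action boils down to something like the sketch below. The workflow id, batch name and partition value are placeholders, and the schema may differ slightly in your region.

```json
{
  "Send_to_batch": {
    "type": "SendToBatch",
    "inputs": {
      "batchName": "InvoiceBatch",
      "content": "@triggerBody()",
      "partitionName": "Customer_A",
      "messageId": "",
      "host": {
        "triggerName": "Batch_messages",
        "workflow": {
          "id": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Logic/workflows/batch-release"
        }
      }
    }
  }
}
```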

The result

I triggered the batch ingestion Logic App many times, which queued messages within the batch.

Each time 5 messages belonging to the same partition are available in the batch, the batch release Logic App fires.

The output looks like this:

Conclusion

Very happy to see this has been added to the product, as batching is still required nowadays. I thought this would have been part of the Integration Account; cool to see there is no dependency on that. The batch release process does not use a polling trigger, which also saves you some additional costs.

I'll get in touch with the product group for some feedback, but this looks already very promising!

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte

Posted on Monday, July 3, 2017 11:52 AM

by Toon Vanhoutte

Azure Service Bus is a very robust and powerful message broker. As with every Azure service, you need to be aware of its strengths and limitations. The most important limitation of Azure Service Bus is the message size: the Standard tier allows messages up to 256 kB, while the Premium tier allows messages up to 1 MB. A way to overcome this limit is implementing the claim check pattern. This blog post explains how you can use this pattern within Logic Apps to send/receive large messages to/from Azure Service Bus queues.

Claim Check Pattern

The claim check pattern is described here. The pattern aims to reduce the size of the message being exchanged, without sacrificing information content. In a nutshell, this is how it works:

  1. The sender uploads the payload to an external data store, to which the receiver also has access.
  2. The sender sends the receiver a message that includes a reference to the uploaded payload.
  3. The receiver downloads the payload, using the reference extracted from the exchanged message.

A real-life example of this pattern is the way WeTransfer is used to email large data.

  1. The sender uploads the large data to the WeTransfer data store
  2. An email, including a download link, is sent to the receiver
  3. The receiver clicks the download link and receives the large data

Claim Check API App

Logic Apps and Azure Service Bus work perfectly together. If we can overcome the message size limit of 256 kB, a whole range of new scenarios opens up. Azure Blob Storage can perfectly take on the role of external data store, and we can leverage its SAS tokens to give the receiver read access.

As a proof of concept, I created a custom API app that provides this functionality. You can view and download the code here. This page also includes instructions on how to deploy and configure this API App. Are you new to creating API Apps for Logic Apps? Definitely check out this post that explains how to create a custom polling trigger and how you can leverage the cool TRex library.

This is how the API App implements the claim check pattern:

  1. The sending Logic App uploads the payload to blob storage and assigns a read-only SAS policy 
  2. The sending Logic App sends a message to a service bus queue, containing the blob URI (including SAS token) in the 'claimcheck-uri' header.
  3. The receiving Logic App receives the message from the queue and retrieves the blob via the URI provided in the 'claimcheck-uri' header.
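Conceptually, the message travelling over the queue is tiny: only the claim check reference rides along as a property. A sketch of such a brokered message is shown below; the storage account, blob id and SAS token are invented placeholders.

```json
{
  "ContentType": "application/xml",
  "Properties": {
    "claimcheck-uri": "https://<storage-account>.blob.core.windows.net/claimchecks/<blob-id>?<sas-token>"
  },
  "Body": ""
}
```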

The custom API App contains several actions and triggers. The ones relevant for this post are Send Message to Queue and Receive Message from Queue.

Send Message to Queue

The user experience of this action is very similar to the default Service Bus action. However, under the hood, the claim check pattern is applied. The following parameters are available:

  • Content: Content of the message.
  • Content Type: Content type of the message content.
  • Queue Name: Name of the queue.
  • Properties: Message properties in JSON format (optional).
  • Scheduled Enqueue Time: UTC time in MM/dd/yyyy HH:mm:ss format (optional).

Receive Message from Queue

This polling trigger is used to receive messages from the queue. As long as messages are available in the queue, the trigger fires continuously, until the queue is empty.

As an output, this trigger provides the message content (retrieved from blob storage), the content type and the message properties. The lock token must be used to explicitly complete the message in the Logic App, as this is required to ensure at-least-once delivery.

The API App also provides a variant of this trigger that retrieves multiple messages from the queue in one batch.

Conclusion

API Apps are very powerful extension points to Logic Apps. In this scenario, it helped us to overcome the Service Bus message size limitation of 256kB. By implementing the claim check pattern with Azure Blob Storage, we are now capable of exchanging payloads up to 50 MB, which is the current Logic Apps message size limit!

Hope you enjoyed this one!
Toon

 

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte

Posted on Tuesday, June 20, 2017 11:24 PM

by Pieter Vandenheede

In this blog post I'll explain a real-world example of a Logic App used to provide the short links that promote the posts appearing on our blog. Ready for the journey as I walk you through?

Introduction

At Codit, I manage the blog. We have some very passionate people on board who like to invest their time to get to the bottom of things and - also very important - share it with the world!
That small part of my job means I get to review blog posts on a technical level before publishing. It's always good to have an extra pair of eyes read a post before it goes public, so this definitely pays off!

An even smaller part of publishing blog posts is making sure they get enough coverage. Sharing them on Twitter, LinkedIn or even Facebook is part of the job for our devoted marketing department! And analytics around these shares on social media definitely come in handy! For that specific reason we use Bitly to shorten our URLs.
Every time a blog post gets published, someone needs to add it manually to our Bitly account and send out an e-mail. This takes only a small amount of time, but as you can imagine, it accumulates quickly with the number of posts we've been generating lately!

Logic Apps to the rescue!

I was looking for an excuse to start playing with Logic Apps and they recently added Bitly as one of their Preview connectors, so I started digging!

First, let's try and list the requirements of our Logic App to-be:

Must-haves:

  • The Logic App should trigger automatically whenever a new blog post is published.
  • It should create a short link, specifically for usage on Twitter.
  • It also should create a short link, specifically for LinkedIn usage.
  • It should send out an e-mail with the short links.
  • I want the short URLs to appear in the Bitly dashboard, so we can track click-through-rate (CTR).
  • I want to spend a minimum of Azure consumption.

Nice-to-haves:

  • I want the Logic App to trigger immediately after publishing the blog post.
  • I want the e-mail to be sent out to me, the marketing department and the author of the post for (possibly) immediate usage on social media.
  • If I resubmit a Logic App run, I don't want new URLs (idempotency); I want to keep the ones already in the Bitly dashboard.
  • I want the e-mail to appear as if it was coming directly from me.

Logic App Trigger

I could easily fulfil one of the first requirements, since the Logic App RSS connector provides a very easy way to trigger a Logic App based on an RSS feed. Our Codit blog RSS feed seemed to do the trick perfectly!

Now it's all about timing the polling interval: if we poll every minute we get the e-mail faster, but we will spend more on Azure consumption since the Logic App gets triggered more often... I decided 30 minutes would probably be good enough.
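In the workflow definition, this boils down to the recurrence setting on the RSS trigger. A sketch is shown below; the connection reference follows the usual ApiConnection shape, and the feed URL is a placeholder.

```json
{
  "triggers": {
    "When_a_feed_item_is_published": {
      "type": "ApiConnection",
      "recurrence": {
        "frequency": "Minute",
        "interval": 30
      },
      "inputs": {
        "host": {
          "connection": {
            "name": "@parameters('$connections')['rss']['connectionId']"
          }
        },
        "method": "get",
        "path": "/OnNewFeed",
        "queries": {
          "feedUrl": "<blog-feed-url>"
        }
      }
    }
  }
}
```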

Now I needed to try and get the URL for any new posts that were published. Luckily, the links - Item output provides the perfect way of doing that. The Logic Apps designer conveniently detects this might be an array of links (in case two posts get published at once) and places this within a "For each" shape!

Now that I had the URL(s), all I needed to do was save the Logic App and wait until a blog post was published to test the Logic App. In the Logic App "Runs history" I was able to click through and see for myself that I got the links array nicely:

Seems there is only one item in the array for each blog post, which is perfect for our use-case!

Shortening the URL

For this part of the exercise I needed several things:

  • I actually need two URLs: one for Twitter and one for LinkedIn, so I need to call the Bitly connector twice!
  • Each link gets a little extra information in the query string called UTM codes. If you are unfamiliar with those, read up on UTM codes here. (In short: it adds extra visibility and tracking in Google Analytics).
    So I needed to concatenate the original URL with some static UTM string + one part which needed to be dynamic: the UTM campaign.

For that last part (the campaign): our CMS already cleans up the blog post title and uses it as the last part of the published URL! This seems ideal for our purposes here.

However, due to a lack of knowledge of Logic Apps syntax, I got a bit frustrated and - at first - created an Azure Function to do just that (extract the interesting part of the URL):

I wasn't pleased with this, but at least I was able to get things running...
It however meant I needed extra, unwanted, Azure resources:

  • Extra Azure storage account (to store the function in)
  • Azure App Service Plan to host the function in
  • An Azure function to do the trivial task of some string manipulation.

After some additional (but determined) trial and error late in the evening, I ended up doing the same in a Logic App Compose shape! Happy days!

Inputs: @split(item(), '/')[add(length(split(item(), '/')), -2)]

It takes the URL, splits it into an array based on the slash ('/'), and takes the part which is interesting for my use-case. See for yourself:
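In the code view, the Compose shape is nothing more than the snippet below (the shape name is mine). Since the published URL ends with a trailing slash, the second-to-last array element is the post slug, hence the -2 offset.

```json
{
  "Compose_UTM_campaign": {
    "type": "Compose",
    "inputs": "@split(item(), '/')[add(length(split(item(), '/')), -2)]"
  }
}
```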

Now I still needed to concatenate all pieces of string together. The concat() function seems to be able to do the trick, but an even easier solution is to just use another Compose shape:

Concatenation comes naturally to the Compose shape!
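A sketch of such a concatenating Compose shape is shown below; the UTM parameter values and shape names are illustrative, not our actual campaign settings.

```json
{
  "Compose_Twitter_URL": {
    "type": "Compose",
    "inputs": "@{item()}?utm_source=twitter&utm_medium=social&utm_campaign=@{outputs('Compose_UTM_campaign')}"
  }
}
```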

Then I still needed to create the short links by calling the Bitly connector:

Let's send out an e-mail

Sending out e-mail, using my Office365 account is actually the easiest thing ever:

Conclusion

My first practical Logic App seems to be a hit! And probably saves us about half an hour of work every week. A few hours of Logic App "R&D" will definitely pay off in the long run!

Here's the overview of my complete Logic App:

Some remarks

During development, I came across what appear to me to be some limitations:

  • The author of the blog post is not in the output of the RSS connector, which is a pity! This would have allowed me to use his/her e-mail address directly or, if it was his/her name, to look-up the e-mail address using the Office 365 users connector!
  • I'm missing some kind of expression shape in Logic Apps!
    Coming from BizTalk Server where expression shapes containing a limited form of C# code are very handy in a BizTalk orchestration, this is something that should be included one way or the other (without the Azure function implementation).
    A few lines of code in there is awesome for dirty work like string manipulation for example.
  • It took me a while to get my head around Logic Apps syntax.
    It's not really explained in the documentation when or when not to use @function() or @{function()}. It's not that hard at all once you get the hang of it. Unfortunately it took me a lot of save errors and even some run-time errors (not covered at design time) to get to that point. Might be just me however...
  • I cannot rename API connections in my Azure Resource Group. Some generic names like 'rss', 'bitly' and 'office-365' are used. I can set some connection properties so they appear nicely in the Logic App however.
  • We have Office365 Multi-Factor Authentication enabled at our company. I can authorize the Office365 API connection, but this will only last for 30 days. I might need to change to an account without multi-factor authentication if I don't want to re-authorize every 30 days...

Let me know what you think in the comments! Is this the way to go?
Any alternative versions I could use? Any feedback is more than welcome.

In a next blog post I will take some of our Logic Apps best practices to heart and optimize the Logic App.

Have a nice day!
Pieter

Categories: Azure
written by: Pieter Vandenheede

Posted on Friday, June 16, 2017 5:00 PM

by Toon Vanhoutte

Lately, I was working at a customer that has invested heavily in BizTalk Server on premises during the last decade. They are considering migrating parts of their existing integrations to Logic Apps, to leverage the smooth integration with modern SaaS applications. A very important aspect is the ability to reuse their existing schemas and maps as much as possible. BizTalk schemas and maps can easily be used within the Logic Apps Integration Account, but there is no support for extension objects at the moment. Let's have a look at how we tackled this problem.

Extension objects are used to consume external .NET libraries from within XSLT maps. This is often required to perform database lookups or complex functions during a transformation. Read more about extension objects in this excellent blog.

Analysis

Requirements

We are facing two big challenges:

  1. We must execute the existing XSLTs with extension objects in Logic Apps
  2. On-premises Oracle and SQL databases must be accessed from within these maps

Analysis

It's clear that we should extend Logic Apps with non-standard functionality. This can be done by leveraging Azure Functions or Azure API Apps. Both allow custom coding, integrate seamlessly with Logic Apps and offer the following hybrid network options (when using App Service Plans):

  • Hybrid Connections: most applicable for lightweight integrations and development / demo purposes
  • VNET Integration: if you want to access a number of on-premises resources through your Site-to-Site VPN
  • App Service Environment: if you want to access a high number of on-premises resources via ExpressRoute

As the pricing models are nearly identical, because both require an App Service Plan, we chose Azure API Apps. The main reason was the Web API knowledge already existing within the organization.

Design

A Site-to-Site VPN is used to connect to the on-premises SQL and Oracle databases. By using a standard App Service Plan, we can enable VNET Integration on the custom Transform API App. Behind the scenes, this creates a Point-to-Site VPN between the API App and the VNET, as described here. The Transform API App can be consumed easily from the Logic App, while being secured with Active Directory authentication.

Solution

Implementation

The following steps were needed to build the solution. More details can be found in the referenced documentation.

  1. Create a VNET in Azure. (link)
  2. Setup a Site-to-Site VPN between the VNET and your on-premises network. (link)
  3. Develop an API App that executes XSLT's with corresponding extension objects. (link)
  4. Provide Swagger documentation for the API App. (link)
  5. Deploy the API App. Expose the Swagger metadata and configure CORS policy. (link)
  6. Configure VNET Integration to add the API App to the VNET. (link)
  7. Add Active Directory authentication to the API App. (link)
  8. Consume the API App from within Logic Apps.

Transform API

The source code of the Transform API can be found here. It leverages Azure Blob Storage to retrieve the required files. The Transform API must be configured with app settings that define the blob storage connection string and the containers where the artefacts will be uploaded.

The Transform API offers one Transform operation, which requires three parameters:

  • InputXml: the byte[] that needs to be transformed
  • MapName: the blob name of the XSLT map to be executed
  • ExtensionObjectName: the blob name of the extension object to be used
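A sketch of a request body for the Transform operation is shown below. The blob names are examples, and the byte[] input serializes as a Base64 string in JSON (here a trivial XML document).

```json
{
  "InputXml": "PD94bWwgdmVyc2lvbj0iMS4wIj8+PFJvb3QgLz4=",
  "MapName": "Invoice_to_Order.xslt",
  "ExtensionObjectName": "ExtensionObjects.xml"
}
```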

Sample

You can run this sample to test the Transform API with custom extension objects.

Input XML

This is a sample input that can be provided as input for the Transform action.

Transformation XSLT

This XSLT must be uploaded to the right blob storage container and will be executed during the Transform action.

Extension Object XML

This extension object must be uploaded to the right blob storage container and will be used to load the required assemblies.

External Assembly

Create an assembly named TVH.Sample.dll that contains the class Common.cs. This class contains a simple method to generate a GUID. Upload this assembly to the right blob storage container, so it can be loaded at runtime.

Output XML

Deploy the Transform API, using the instructions on GitHub. You can easily test it using the Request / Response actions:

As a response, you should get the following output XML, that contains the generated GUID.

Important remark: Do not forget to add security to your Transform API (step 7), as it is accessible on the public internet by default!

Conclusion

Thanks to the Logic Apps extensibility through API Apps and their VNET integration capabilities, we were able to build this solution in a very short time span. The solution offers an easy way to migrate BizTalk maps as-is towards Logic Apps, which is a big time saver! Access to resources that remain on premises is also a big plus nowadays, as many organizations have a hybrid application landscape.

Hope to see this functionality out-of-the-box in the future, as part of the Integration Account!

Thanks for reading. Sharing is caring!
Toon

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte

Posted on Thursday, June 8, 2017 3:11 PM

by Toon Vanhoutte

This post contains 10 useful tips for designing enterprise integration solutions on top of Logic Apps. It's important to think upfront about reusability, reliability, security, error handling and maintenance.

Democratization of integration

Before we dive into the details, I want to provide some reasoning behind this post. With the rise of cloud technology, integration takes a more prominent role than ever before. In Microsoft's integration vision, democratization of integration is on top of the list.

Microsoft aims to take integration out of its niche market and offers it as an intuitive and easy-to-use service to everyone. The so-called Citizen Integrators are now capable of creating light-weight integrations without the steep learning curve that for example BizTalk Server requires. Such integrations are typically point-to-point, user-centric and have some accepted level of fault tolerance.

As an Integration Expert, you must be aware of this. Enterprise integration faces completely different requirements than lightweight citizen integration: loose coupling is required, no message loss is accepted because the interfaces are mission critical, integrations must be optimized for operations personnel (monitoring and error handling), etc…

Keep this in mind when designing Logic App solutions for enterprise integration! Make sure you know your cloud and integration patterns. Ensure you understand the strengths and limits of Logic Apps. The advice below can give you a jump start in designing reliable interfaces within Logic Apps!

Design enterprise integration solutions

1. Decouple protocol and message processing

Once you created a Logic App that receives a message via a specific transport protocol, it's extremely difficult to change the protocol afterwards. This is because the subsequent actions of your Logic App often have a hard dependency on your protocol trigger / action. The advice is to perform the protocol handling in one Logic App and hand over the message to another Logic App to perform the message processing. This decoupling will allow you to change the receiving transport protocol in a flexible way, in case the requirements change or in case a certain protocol (e.g. SFTP) is not available in your DEV / TEST environment.

2. Establish reliable messaging

You must realize that every action you execute is performed by an underlying HTTP connection. By its nature, an HTTP request/response is not reliable: the service is not aware if the client disconnects during request processing. That's why receiving messages must always happen in two phases: first you mark the data as returned by the service; second you label the data as received by the client (in our case the Logic App). The Service Bus Peek-Lock pattern is a great example that provides such at-least-once reliability. Another example can be found here.
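As a sketch, the two phases look roughly like this in a Logic App that uses the Service Bus connector's peek-lock mode. The connection reference, queue name and intermediate Process_the_message action (omitted) are placeholders, and the connector's exact operation paths may differ from this approximation.

```json
{
  "triggers": {
    "When_a_message_is_received_peek_lock": {
      "type": "ApiConnection",
      "recurrence": { "frequency": "Minute", "interval": 1 },
      "inputs": {
        "host": {
          "connection": { "name": "@parameters('$connections')['servicebus']['connectionId']" }
        },
        "method": "get",
        "path": "/@{encodeURIComponent('orders-queue')}/messages/head/peek"
      }
    }
  },
  "actions": {
    "Complete_the_message": {
      "type": "ApiConnection",
      "runAfter": { "Process_the_message": [ "Succeeded" ] },
      "inputs": {
        "host": {
          "connection": { "name": "@parameters('$connections')['servicebus']['connectionId']" }
        },
        "method": "delete",
        "path": "/@{encodeURIComponent('orders-queue')}/messages/complete",
        "queries": { "lockToken": "@triggerBody()?['LockToken']" }
      }
    }
  }
}
```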

3. Design for reuse

Real enterprise integration is composed of several common integration tasks such as: receive, decode, transform, debatch, batch, enrich, send, etc… In many cases, each task is performed by a combination of several Logic App actions. To avoid reconfiguring these tasks over and over again, you need to design the solution upfront to encourage reuse of these common integration tasks. You can for example use the Process Manager pattern that orchestrates the message processing by reusing nested Logic Apps or introduce the Routing Slip pattern to build integration on top of generic Logic Apps. Reuse can also be achieved on the deployment side, by having some kind of templated deployments of reusable integration tasks.
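Reusing a nested Logic App for a common integration task boils down to a Workflow action, as in the sketch below; the workflow id and names are placeholders.

```json
{
  "Invoke_transform_task": {
    "type": "Workflow",
    "inputs": {
      "host": {
        "triggerName": "manual",
        "workflow": {
          "id": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Logic/workflows/transform-task"
        }
      },
      "body": "@triggerBody()"
    }
  }
}
```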

4. Secure your Logic Apps

From a security perspective, you need to take into account both role-based access control (RBAC) to your Logic App resources and runtime security considerations. RBAC can be configured in the Access control (IAM) tab of your Logic App or at the Resource Group level. The runtime security really depends on the triggers and actions you're using. As an example: Request endpoints are secured via a Shared Access Signature that must be part of the URL, and IP restrictions can be applied. Azure API Management is the way to go if you want to govern API security centrally, on a larger scale. It's a good practice to assign the minimum required privileges (e.g. read-only) to your Logic Apps.
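IP restrictions, for instance, can be declared on the workflow resource itself in an ARM template; a sketch with an example address range:

```json
{
  "properties": {
    "accessControl": {
      "triggers": {
        "allowedCallerIpAddresses": [
          { "addressRange": "203.0.113.0/24" }
        ]
      }
    }
  }
}
```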

5. Think about idempotence

Logic Apps can be considered composite services, built on top of several APIs. APIs leverage the HTTP protocol, which can cause data consistency issues due to its nature. As described in this blog, there are multiple ways the client and server can get misaligned about the processing state. In such situations, clients will mostly retry automatically, which could result in the same data being processed twice on the server side. Idempotent service endpoints are required in such scenarios, to avoid duplicate data entries. Logic Apps connectors that provide Upsert functionality are very helpful in these cases.

6. Have a clear error handling strategy

With the rise of cloud technology, exception and error handling become even more important. You need to cope with failure when connecting to multiple on-premises systems and cloud services. With Logic Apps, retry policies are your first resort for building resilient integrations. You can configure a retry count and interval at every action; there's no support for exponential retries or the circuit breaker pattern. In case the retry policy doesn't solve the issue, it's advised to return a clear error description within sync integrations and to ensure a resumable workflow within async integrations. Read here how you can design a good resume / resubmit strategy.
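A retry policy is configured per action in the workflow definition; the sketch below shows a fixed policy of 4 retries every 30 seconds on an HTTP action (the URI is a placeholder).

```json
{
  "Call_backend_service": {
    "type": "Http",
    "inputs": {
      "method": "POST",
      "uri": "https://<backend-host>/api/orders",
      "body": "@triggerBody()",
      "retryPolicy": {
        "type": "fixed",
        "count": 4,
        "interval": "PT30S"
      }
    }
  }
}
```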

7. Ensure decent monitoring

Every IT solution benefits from good monitoring. It provides visibility and improves the operational experience for your support personnel. If you want to expose business properties within your monitoring, you can use Logic Apps custom outputs or tracked properties. These can be consumed via the Logic Apps Workflow Management API or via OMS Log Analytics. From an operational perspective, it's important to be aware that there is an out-of-the-box alerting mechanism that can send emails or trigger Logic Apps in case a run fails. Unfortunately, Logic Apps has no built-in support for Application Insights, but you can leverage its extensibility (a custom API App or Azure Function) to achieve this. If your integration spans multiple Logic Apps, you must provide for correlation in your monitoring / tracing! Find more details about monitoring in Logic Apps here.

8. Use async wherever possible

Solid integrations are often characterized by asynchronous messaging. Unless the business requirements really demand request/response patterns, try to implement them asynchronously. It comes with the advantage that you introduce real decoupling, both from a design and runtime perspective. Introducing a queuing system (e.g. Azure Service Bus) in fire-and-forget integrations results in highly scalable solutions that can handle an enormous amount of messages. Retry policies in Logic Apps must have different settings depending on whether you're dealing with async or sync integration. Read more about it here.

9. Don't forget your integration patterns

Whereas BizTalk Server forces you to design and develop in specific integration patterns, Logic Apps is more intuitive and easier to use. This could come with a potential downside: you forget about integration patterns, because they are not suggested by the service itself. As an integration expert, it's your duty to determine which integration patterns should be applied on your interfaces. Loose coupling is common in enterprise integration. You can for example introduce Azure Service Bus, which provides a Publish/Subscribe architecture. Its message size limitation can be worked around by leveraging the Claim Check pattern, with Azure Blob Storage. This is just one example of introducing enterprise integration patterns.

10. Apply application lifecycle management (ALM)

The move to a PaaS architecture should be done carefully and must be governed well, as described here. Developers should not have full access to the production resources within the Azure portal, because changing one small setting can have an enormous impact. Therefore, it's very important to set up ALM to deploy your Logic App solutions throughout the DTAP street. This ensures uniformity and avoids human deployment errors. Check this video to get a head start on continuous integration for Logic Apps, and read this blog on how to use Azure Key Vault to retrieve passwords within ARM deployments. Consider ALM an important aspect of your disaster recovery strategy!

Conclusion

Yes, we can! Logic Apps really is a fit for enterprise integration, if you know what you're doing! Make sure you know your cloud and integration patterns. Ensure you understand the strengths and limits of Logic Apps. The Logic App framework is a truly amazing and stable platform that brings a whole range of new opportunities to organizations. The way you use it, should be depending on the type of integration you are facing!

Interested in more?  Definitely check out this session about building loosely coupled integrations with Logic Apps!

Any questions or doubts? Do not hesitate to get in touch!
Toon

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte