
Codit Blog

Posted on Thursday, April 27, 2017 5:22 PM

Toon Vanhoutte by Toon Vanhoutte

This blog post dives into the details of how you can achieve batching with Logic Apps. Batching is still a highly demanded feature for a middleware layer. It's mostly introduced to reduce the performance impact on the target system or for functional purposes. Let's have a closer look.

Scenario

For this blog post, I decided to try to batch the following XML message.  As Logic Apps supports JSON natively, we can assume that a similar setup will work quite easily for JSON messages.  Note that the XML snippet below contains an XML declaration, so pure string appending won't work.  Namespaces are also included.

Requirements

I came up with the following requirements for my batching solution:

  • External message store: in integration I like to avoid long-running workflow instances at all times. Therefore I prefer messages to be stored somewhere out of the process, waiting to be batched, instead of keeping them active in a singleton workflow instance (e.g. a BizTalk sequential convoy).

  • Message and metadata together: I want to avoid storing the message in one place and its metadata in another. Keeping them together simplifies development and maintenance.

  • Native Logic Apps integration: preferably I can leverage an Azure service that has native and smooth integration with Azure Logic Apps. It must ensure we can reliably assign messages to a specific batch and easily remove them from the message store.

  • Multiple batch release triggers: I want to support multiple ways to decide when a batch can be released.
    > # Messages: send out batches containing X messages each
    > Time: send out a batch at a specific time of the day
    > External Trigger: release the batch when an external trigger is received

Solution

After some analysis, I was convinced that Azure Service Bus queues are a good fit:

  • External message store: the messages can be queued for a long time in an Azure Service Bus queue.

  • Message and metadata together: the message is placed together with its properties on the queue. Each batch configuration can have its own queue assigned.

  • Native Logic Apps integration: there is a Service Bus connector to receive multiple messages inside one Logic App instance. With the peek-lock pattern, you can reliably assign messages to a batch and remove them from the queue.

  • Multiple batch release triggers:
    > # Messages: in the Service Bus connector, you can choose how many messages you want to receive in one Logic App instance.

    > Time: Service Bus has a great property, ScheduledEnqueueTimeUtc, which ensures that a message only becomes visible on the queue from a specific moment in time. This is a great way to schedule messages to be released at a specific time, without the need for an external scheduler.

    > External Trigger: the Logic App can easily be instantiated via the native HTTP Request trigger.

 

Implementation

Batching Store

The goal of this workflow is to put the message on a specific queue for batching purposes.  This Logic App is very straightforward to implement. Add a Request trigger to receive the messages that need to be batched and use the Send Message Service Bus connector to send the message to a specific queue.

In case you want to release the batch only at a specific moment in time, you must provide a value for the ScheduledEnqueueTimeUtc property in the advanced settings.
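As an illustration, the body of such a Send Message action could look roughly like the snippet below in code view. This is a hedged sketch: the exact property layout depends on the connector version, and the date is just an example value.

```json
"body": {
    "ContentData": "@{base64(triggerBody())}",
    "ContentType": "application/xml",
    "ScheduledEnqueueTimeUtc": "2017-04-28T06:00:00Z"
}
```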

Batching Release

This is the more complex part of the solution. The first challenge is to receive, for example, 3 messages in one Logic App instance. My first attempt failed, because the Service Bus receive trigger and action apparently behave differently:

  • When one or more messages arrive in a queue: this trigger receives messages in a batch from a Service Bus queue, but it creates a specific Logic App instance for every message. This is not desired for our scenario, but it can be very useful in high throughput scenarios.

  • Get messages from a queue: this action can receive multiple messages in batch from a Service Bus queue. This results in an array of Service Bus messages, inside one Logic App instance. This is the result that we want for this batching exercise!

Let's use the peek-lock pattern to ensure reliability and receive 3 messages in one batch:

As a result, we get this JSON array back from the Service Bus connector:
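The array returned by the connector looks roughly like this; the values below are illustrative placeholders, not the actual output of my run.

```json
[
    {
        "ContentData": "<base64 encoded XML message>",
        "ContentType": "application/xml",
        "Properties": {},
        "LockToken": "3f6a1e2b-0000-0000-0000-000000000000",
        "MessageId": "f2d1a3c4-0000-0000-0000-000000000000"
    }
]
```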

The challenge is to parse this array, decode the base64 content in the ContentData and create a valid XML batch message from it.  I tried several complex Logic App expressions, but soon realized that Azure Functions is better suited to take care of this complicated parsing.  I created the following Azure Function, as a Generic Webhook C# type:
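The original function was shown as a screenshot. As a hedged reconstruction, a Generic Webhook C# function along these lines would do the job; the batch root element name and namespace are my own assumptions and should be adapted to your schema.

```csharp
#r "Newtonsoft.Json"

using System.Net;
using System.Net.Http;
using System.Text;
using System.Xml.Linq;
using Newtonsoft.Json.Linq;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // The Logic App posts the array returned by the Service Bus connector.
    string requestBody = await req.Content.ReadAsStringAsync();
    JArray messages = JArray.Parse(requestBody);

    // Batch root element; adjust the name and namespace to your own message type.
    XNamespace ns = "http://example.com/batching";
    var batch = new XElement(ns + "Batch");

    foreach (var message in messages)
    {
        // ContentData holds the base64 encoded XML payload.
        byte[] content = Convert.FromBase64String((string)message["ContentData"]);
        var document = XDocument.Parse(Encoding.UTF8.GetString(content));

        // Only take the root element, so the XML declarations of the
        // individual messages don't end up inside the batch.
        batch.Add(document.Root);
    }

    var response = req.CreateResponse(HttpStatusCode.OK);
    response.Content = new StringContent(batch.ToString(), Encoding.UTF8, "application/xml");
    return response;
}
```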

Let's now consume this function from within our Logic App.  There is seamless integration with Logic Apps, which is really great!


As an output of the GetBatchMessage Azure Function, I get the following XML :-)

Large Messages

This solution is very nice, but what about large messages? Recently, I wrote a Service Bus connector that uses the claim check pattern, which exchanges large payloads via Blob Storage. In this batching scenario we can also leverage this functionality. Once I have open sourced this project, I'll update this blog with a working example.  Stay tuned for more!

Conclusion

This is a great and flexible way to perform batching within Logic Apps. It really demonstrates the power of the Better Together story with Azure Logic Apps, Service Bus and Functions. I'm sure this is not the only way to perform batching in Logic Apps, so do not hesitate to share your solution for this common integration challenge in the comments section below!

I hope this gave you some fresh insights into the capabilities of Azure Logic Apps!
Toon

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte

Posted on Monday, March 6, 2017 2:18 PM

Toon Vanhoutte by Toon Vanhoutte

Lately, I was working on a proof of concept on Logic Apps. I created a Logic App that looped over all files in a specific folder and processed them. I was curious about the performance, so I copied 10,000 files into the folder. The Logic App kicked off and started processing them. After a while the Logic App ended up in a failed state. Let's have a look at what happened.

Apparently, I stumbled upon a limitation that is documented over here.

The documentation suggests using the query action to filter arrays.  It offers the following capabilities:

You can indeed filter an array, based on a specific query.  However, I did not have an object available that contains the position of an item within the array.  I wanted assurance that I would not hit the for each limit, so this was not an option.  If you know a way to get the position within the array, please let me know via the comments section below.

I continued my search within the Logic Apps Workflow Definition Language and found the @take function.  This is what the documentation states:

This did the trick. It takes the first 5000 items from an array. Luckily, you do not get an out-of-range exception if the incoming array contains fewer items!
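In my Logic App, that boiled down to an expression along these lines in the for each input. This is a hedged example: 'List_files_in_folder' is a placeholder for whatever action produces your array.

```json
"foreach": "@take(body('List_files_in_folder'), 5000)"
```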

It's always a good practice to validate your design upfront against the Logic Apps limits.

Hope this helps!
Toon

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte

Posted on Wednesday, May 27, 2015 4:14 PM

Tom Kerkhove by Tom Kerkhove

Microsoft's Yammer has been around for a while and people who are part of one or more networks will agree that Yammer can turn into Spammer.

In this blog post I demonstrate how you can automatically post to a Slack channel.

This blog post was also released on my personal blog.

For each new conversation and comment, Yammer will send you an email, resulting in mail floods. The easy fix would be to disable the notification emails, but then you risk missing out on interesting or important discussions.

At our current project we use Slack to communicate with each other and it's a really nice tool - nice and clean, just how I like it.

Slack Logo

So let's get rid of the notification emails and notify your team when someone starts a new conversation on Yammer! This is where Microsoft Azure App Services come in, more specifically Microsoft Azure API & Logic Apps.

With Azure Logic Apps I've created a flow where one API App listens on a Yammer group for new conversations, while another Slack API App notifies us in a channel when something pops up.

How does that look?!

When I create a new conversation in Yammer, e.g. "We're ready to go in production" - 
New Yammer Conversation

The Yammer API App in my Logic App will notice that there is a new conversation and will send a message to my team's Slack channel as the Project Announcements-bot. 

Slack Bot Response

Want it yourself? Here's how!

Before getting our hands dirty let's summarize what's on today's schedule.

We will start with provisioning our API apps that we will use from the Azure Marketplace. After that we will create a new Logic app that will describe the flow of our app.

Provisioning the API Apps

As of today you have two options for provisioning your API Apps - one is to provision them upfront, where you have more control over naming and such. The second is to provision them while you are designing your Logic App and let Azure take care of the naming. Be aware: Azure uses names like YammerConnector1431593752292, which don't really say where they're being used.

Since I always want my component names to be as self-describing as possible, we will provision the two API Apps up front:

  • A Yammer App that will trigger our Logic app when a new conversation is posted
  • A Slack App that will send a message to Slack as a Bot

Provisioning an API App is super simple: browse to the new Azure Portal > click New > select Web + Mobile > browse the Azure Marketplace > select the API Apps section > select the API App you want.

After you've selected your API App, you basically give it a name and assign the App Plan & Resource Group : 
Provision API App

Azure will start provisioning the API App for you in the background; while it's doing that, let's have a look at the Connector Info.

Before actually provisioning the App, you see that each API App or Connector gives you an overview of its capabilities in a Logic App. Here you can see that the Slack Connector will only be able to act in a Logic App. 
Slack Connector overview

Now when we look at the Yammer Connector Info, we see that it can act within a Logic App but also trigger it on a certain condition. 
Yammer Connector overview

Defining the flow in a Logic App

Before we can start defining our flow we need to create a new Logic App.

In the Azure Portal click New > Select Web + Mobile > Logic App. Give it a self-describing name and add it to the same App Plan as your provisioned API Apps.

Once it is configured, open it and click Triggers and Actions
Configure Logic App

We will define our flow by defining the sequence of connectors. You can find our provisioned connectors on the side, click on your Yammer connector to add it. 
Clean Logic App

After that, the default card will be replaced with your Yammer connector. As you can see we first need to authenticate with Yammer. Click Authorize
Basic Yammer Card

A pop-up will show to do the OAuth dance with Yammer. After you've logged in, you will need to grant access to your Logic App.

Read the statement carefully and click Allow if you agree.

Yammer Authentication

(In order to complete the following steps you need to allow access)

Now that you've allowed access to your Yammer account, it's interesting to know that the authentication token will be stored in the secure store of the Gateway (a Gateway is used by API Connectors to communicate with each other and with outbound services). This is because the gateway will handle all the authentication with Yammer for us.

Once that's done, you get an overview of all the triggers the Yammer connector has. Luckily, the only one available is the one we need; click New Message
Yammer Triggers

Configuring the trigger is fairly easy - we define the trigger frequency at which the connector will look for new messages. Next to that, we assign the Group Id of the Yammer group that we are interested in. The granularity of your trigger frequency depends on the hosting App Plan. In my example I'm using 1 minute, which requires me to use a Standard-tier App Plan.

You can find the group Id by browsing to your group and copying the Feed Id.

https://www.yammer.com/Your-Network/#/threads/inGroup?type=in_group&feedId=579250

Yammer Connector Configured

Click the checkmark to save your configuration.

Go back to the side bar and click on your Slack Connector to add it to the pane. Here we need to authenticate with our Slack account by clicking Authorize
Basic Slack Connector

Just like with Yammer, Azure will request access to your Slack account to post messages. 
Slack Authentication

Our last step is to configure the Slack connector.

What we will do is send the original message as a quote along with who posted it and a link to the conversation. In Slack that results in the following markup statement -

>>> _"Original-Message"_ by *User* _(Url)_

To achieve this we will use the @concat function to assign the Text value -
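As a hedged reconstruction of that expression (the original was shown as an image, and the exact output field names of the Yammer connector are my assumption here), it looked something like this:

```
@concat('>>> _"', triggers().outputs.body.MessageText, '"_ by *', triggers().outputs.body.SenderName, '* _(', triggers().outputs.body.WebUrl, ')_')
```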

This statement is retrieving some of the output values of the Yammer connector.

We will also configure to which Slack channel we want to send it. Optionally, you can assign a name to the Slack bot and give it an icon. Here I used the name of my Yammer group as the Slack bot name. 
Slack Connector Configured

Click the checkmark to save your configuration & save the flow of your logic app. 
Save Logic App

After a few seconds/minutes, depending on your trigger configuration, you will see that the Yammer connector picked up your new message and triggered your Logic App. 
Logic App - Run Overview

Now you should see a new message in your Slack channel!

Ship it!

That's it - we're done!

Your Yammer connector will now poll for new conversations in your Yammer group every cycle you've defined in its configuration. If there are new ones, your Logic App will start processing them and you will be notified in Slack!

Wrapping up

As you can see, you can very easily use Azure API & Logic Apps to create small IFTTT-like flows. You can even build more full-blown integration scenarios by using the more advanced BizTalk API Apps!

If you want, you can even expand this demo and add support for multiple Yammer groups. To do so you'll need to open the Code View and copy additional triggers into the JSON file (thank you Sam Vanhoutte for the tip on how to create multiple triggers).

Keep in mind that the name of the Slack bot that is posting is currently hardcoded; unfortunately the Yammer app doesn't expose the name of the group, so this is something you'll have to work around.

Can't get enough of this? You can build your own API App or read Sam Vanhoutte's initial thoughts on Azure App Services!

Thanks for reading,

Tom Kerkhove

Categories: Azure
Tags: Logic Apps
written by: Tom Kerkhove

Posted on Tuesday, March 24, 2015 6:01 PM

Sam Vanhoutte by Sam Vanhoutte

In this post, I'm sharing some of my thoughts about the fresh Azure App Service, which was announced by Scott Guthrie and Bill Staples.

Today, Scott Guthrie and Bill Staples announced an interesting new Azure service: Azure App Service.  Actually, it's a set of services, combined under one umbrella, allowing customers to build rich, business-oriented applications.  Azure App Service is now the new home for:

  • Azure Web Apps (previously called Azure Websites)
  • Azure Mobile Apps (previously Mobile Services)
  • Azure Logic Apps (the new 'workflow' style apps)
  • Azure API Apps (previously announced as Microservices)

It speaks for itself that Logic Apps and API Apps will be the most important ones for integration people.  The Azure Microservices were first announced in public at the Integrate 2014 event and it's clear that integration is at the core of App Services, which should make us happy. 

Codit has been part of the private preview program 

Codit has been actively involved in private preview programs and we want to congratulate the various teams on the excellent job they have done.  They have really been listening to the feedback and made incredible improvements over the past months.  While everyone knows there is still a lot to do, it seems they are ready to take more feedback, as everything is public now.  

My personal advice would be to look at it with an open mind, knowing that a lot of things will be totally different from what we've been doing over the past 10-15 years (with BizTalk).  I'm sure a lot of things will (have to) happen in order to make mission-critical, loosely coupled integration solutions run on App Services.  But I am confident they will happen.

Is this different from what was said at Integrate2014?

As Integrate 2014 was solely focused on BizTalk Services, the other things (such as Web Sites and Media apps) were not mentioned.  But most of what we saw and heard back then has now made it to the public preview. 

  • Azure Microservices are now called API Apps and are really just web APIs in your favorite language, enriched with Swagger metadata and version control.  These API Apps can be published to a gallery (a public gallery might come later on) or directly to a resource group in your subscription.
  • The workflows (they used to be called Flow Apps) are now called Logic Apps.  These will allow us to combine various API Apps from the gallery in order to instrument and orchestrate logical applications (or workflows).

Important concepts

I tried to list the most important concepts below.

All of the components are built on top of Azure Websites.  This way, they can benefit from the existing out of the box capabilities there:

  • Hybrid connectivity: Hybrid Connections or Azure Virtual Networking.  Both of these are available for any API app you want to write.  And the choice is up to the user of your API app!
  • Auto-scaling: do you want to scale your specific API app automatically?  That's perfectly possible now.  If you have a transformation service deployed and the end of month message load needs to be handled, all should be fine!
  • New pricing model (more pay per use, compared to BizTalk Services)
  • And many more: speed of deployment, the new portal, etc.

API Apps really form the core of this platform.  They are RESTful APIs, with Swagger metadata that is used to model and link the workflows (you can flow values from one API App to another in Logic Apps).

API Apps can be published to the 'nuget-based' gallery, or directly to a resource group in your subscription.  Once publishing to the public gallery becomes possible over time, other users will be able to leverage your API App in their own applications and Logic Apps, by provisioning an instance of that package into their own subscription.  That means that all the cost and deployment hassle is for the user of your API App.

Where I hope for improvements

As I mentioned, this is a first version of a very new service.  A lot of people have been working on this and the service will still be shaped over the coming months.  It seems the teams are taking feedback seriously and that's definitely a positive thing.  This is the feedback I posted on uservoice.  If you agree, don't hesitate to go and vote for these ideas!

  • Please think about ALM.  Doing everything in the portal (including rules, mapping, etc.) is nice, but for real enterprise scenarios, we need version and source control. I would also really love to see a Visual Studio designer experience for more complex workflows. The portal is nice for self-service and easy workflows, but it takes some time and is limited in nature, compared to the pro-dev experience in Visual Studio.
    Vote here
  • Separate configuration from flow or business logic.
    If we now have a look at the JSON file that makes up a Logic App, we can see that throughout the entire file, references are being added to the actual runtime deployment of the various API Apps. We also see values for the various connectors in the JSON structure. It would really help (when deploying one flow to various staging slots) to separate configuration or runtime values from the actual flow logic. 
    Vote here
  • Management
    Right now it is extremely difficult to see the various "triggers" and to stop/start them.  With BizTalk, we have receive locations that we can all see in one view and stop/start (the same goes for send ports).  Now all of that is encapsulated in the Logic App and it would really be a good thing to provide more "management views".  As an example, we have customers with more than 1000 receive endpoints.  I want to get them in an easy-to-handle, searchable overview.
    Vote here
  • The usability in the browser has improved a lot, but I still believe it would make sense to make the cards or shapes smaller (or collapsible).  This way, we'll get a better overview of the end-to-end flow, which will be needed in the typically complex workflows we build today (including exception handling, etc.) 
    Vote here

More posts will definitely follow in the coming weeks, so please stay tuned!

Categories: Azure
Tags: Logic Apps
written by: Sam Vanhoutte

Posted on Wednesday, May 24, 2017 12:00 PM

Toon Vanhoutte by Toon Vanhoutte

This blog post covers several ways to optimize the monitoring experience of your Logic Apps. Logic Apps provide two ways to add functional data to your workflow runs: tracked properties and outputs. This functional data can be used to improve operational experience, to find a specific run based on business data and to serve as a base for reporting. Both options are discussed in depth and compared.

Introduction

Tracked Properties

As the documentation states: Tracked properties can be added onto actions in the workflow definition to track inputs or outputs in diagnostics data. This can be useful if you wish to track data like an "order ID" in your telemetry.

Tracked properties can be added in code view to your Logic Apps actions in this way:
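A hedged example of what this could look like in code view; the action, the tracked property names and the trigger body fields below are illustrative and based on a sample order message, not an official template.

```json
"Compose": {
    "type": "Compose",
    "inputs": "@triggerBody()",
    "runAfter": {},
    "trackedProperties": {
        "OrderId": "@{triggerBody()?['OrderId']}",
        "Customer": "@{triggerBody()?['Customer']}"
    }
}
```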

Outputs

As the documentation states: Outputs specify information that can be returned from a workflow run. For example, if you have a specific status or value that you want to track for each run, you can include that data in the run outputs. The data appears in the Management REST API for that run, and in the management UI for that run in the Azure portal. You can also flow these outputs to other external systems like PowerBI for creating dashboards. Outputs are not used to respond to incoming requests on the Service REST API.

Outputs can be added in code view to your Logic Apps workflow in this way. Note that you can also specify their data type:
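Again as a hedged sketch, a workflow-level outputs section could look like this; the output names match the sample order message used later in this post, but are otherwise my own choice.

```json
"outputs": {
    "Reference": {
        "type": "String",
        "value": "@{triggerBody()?['OrderId']}"
    },
    "Quantity": {
        "type": "Integer",
        "value": "@triggerBody()?['Quantity']"
    }
}
```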

Runtime Behaviour

Do runtime exceptions influence tracking?

What happens if my Logic App fails? Are the business properties still tracked or are they only available in a happy scenario? After some testing, I can conclude the following:

  • Tracked properties are only tracked if the appropriate action is executed.
  • Outputs are only tracked if the whole Logic App completes successfully.

When using tracked properties, my advice is to assign them to the first action (not supported on triggers) of your Logic App, so they are certainly tracked.

Do tracking exceptions influence runtime?

Another important aspect of tracking / monitoring is that it should never influence the runtime behaviour. It's unacceptable for a Logic App run to fail because a specific data field to be tracked is missing. Let's have a look at what happens in case we try to track a property that does not exist, e.g.: "Reference": "@{triggerBody()['xxx']}"

In both cases the Logic App ends in a failed state, however there are some differences:

Tracked Properties

  • The action that is configured with tracked properties fails.
  • There is a TrackedPropertiesEvaluationFailed exception.

Outputs

  • All actions completed successfully, despite the fact that the Logic App run failed.
  • In the Run Details, an exception is shown for the specific output. Depending on the scenario, I've encountered two exceptions:

    >  The template language expression 'triggerBody()['xxx']' cannot be evaluated because property 'xxx' doesn't exist, available properties are 'OrderId, Customer, Product, Quantity'.

    > The provided value for the workflow parameter 'Reference' at line '1' and column '57' is not valid.

Below you can find a screen capture of the workflow details.

This is not the desired behaviour. Luckily, we can easily avoid this. Follow this advice and you'll be fine:

  • For strings: Use the question mark operator to reference potential null properties of an object without a runtime error. In case of a null reference, an empty string will be tracked. Example: "Reference": "@{triggerBody()?['xxx']}"

  • For integers: Use the int function to convert the property to an integer, in combination with the coalesce function which returns the first non-null object in the arguments passed. Example: "Quantity": "@int(coalesce(triggerBody()?['xxx'], 0))"

Azure Portal

The first place where issues are investigated is the Azure Portal, so it would be very helpful to have these business properties displayed over there. Tracked properties are nowhere to be found in the Azure Portal, but fortunately the outputs are displayed nicely in the Run Details section. There is, however, no option to search for a specific Logic App run based on these outputs.

Operations Management Suite

In this post, I won't cover the Logic Apps integration with Operations Management Suite (OMS) in depth, as there are already good resources available on the web. If you are new to this topic, be sure to check out this blog post and this webcast.

Important take-aways on Logic Apps integration with OMS:

  • Only tracked properties are available in OMS; there's no trace of outputs within OMS.
  • OMS can be used to easily search for a specific run, based on business properties
    > The status field is the status of the particular action, not of the complete run
    > The resource runId can be used to find the run details back in the Azure Portal

  • OMS can be used to easily create reports on the tracked business properties
  • Think about data retention. The free pricing tier gives you data retention of 7 days.

Management API

Another way to search for particular Logic Apps runs, is by using the Azure Service Management API. This might come in handy if you want to develop your own dashboard on top of Logic Apps.
The documentation clearly describes the operations available to retrieve historical Logic Apps runs in a programmatic way. This can be done easily with the Microsoft.Azure.Management.Logic library. Outputs are easier to access than tracked properties, as they reside at the WorkflowRun level. The following code snippet can serve as a starting point:
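A hedged starting point with the Microsoft.Azure.Management.Logic SDK could look like the sketch below; credential setup is omitted and the exact model property names may differ slightly between SDK versions.

```csharp
using System;
using Microsoft.Azure.Management.Logic;
using Microsoft.Rest;

public static class LogicAppRunReader
{
    public static void PrintRunOutputs(ServiceClientCredentials credentials, string subscriptionId,
        string resourceGroup, string workflowName)
    {
        var client = new LogicManagementClient(credentials) { SubscriptionId = subscriptionId };

        // Retrieve the run history of the workflow (a paged result).
        var runs = client.WorkflowRuns.List(resourceGroup, workflowName);

        foreach (var run in runs)
        {
            Console.WriteLine($"{run.Name} - {run.Status}");

            // Outputs reside at the WorkflowRun level, so they are easy to read out.
            if (run.Outputs != null)
            {
                foreach (var output in run.Outputs)
                {
                    Console.WriteLine($"  {output.Key}: {output.Value.Value}");
                }
            }
        }
    }
}
```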

Unfortunately, the filter operations that are supported are limited to the ones available in the Azure Portal:

This means you can only find specific Logic Apps runs - based on outputs or tracked properties - if you navigate through all of them, which is not feasible from a performance perspective.

Conclusion

Comparison

Below you can find a comparison table of the investigated features. My current advice is to use both of them, as you'll get the most in return. 

Feature           | Tracked Properties  | Outputs
Tracking Level    | Action              | Logic App Run
Runtime behaviour | If action executes  | If Logic App succeeds
Azure Portal      | No integration      | Visible in run details
OMS               | Full integration    | No integration
Management API    | No search           | No search

Feedback to Product Team

Monitoring is one of the areas that needs some improvements to provide everything we need for operating and monitoring an integration environment. Here are my feature requests, in order of priority:

  • Ensure outputs are also monitored in case a workflow run fails. Vote here.
  • Provide OMS integration for outputs. Vote here.
  • Extend the search filter in the Management API. Vote here.
  • Provide a unique URL to directly redirect to a specific Logic App run.

Hope this post clarifies the monitoring capabilities of Logic Apps. 

Feel free to vote for the mentioned feature requests!
Toon

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte