For this blog post, I decided to try to batch the following XML message. As Logic Apps supports JSON natively, we can assume that a similar setup will work quite easily for JSON messages. Note that the XML snippet below contains an XML declaration, so naive string concatenation won't work, and it also includes namespaces.
I came up with the following requirements for my batching solution:
- External message store: in integration I like to avoid long-running workflow instances at all times. Therefore I prefer messages to be stored out of process while they wait to be batched, instead of keeping them active in a singleton workflow instance (e.g. a BizTalk sequential convoy).
- Message and metadata together: I want to avoid storing the message in one place and its metadata in another. Keeping them together simplifies development and maintenance.
- Native Logic Apps integration: preferably I can leverage an Azure service that has native, smooth integration with Azure Logic Apps. It must allow us to reliably assign messages to a specific batch and to easily remove them from the message store.
- Multiple batch release triggers: I want to support multiple ways to decide when a batch can be released.
> # Messages: send out batches containing X messages each
> Time: send out a batch at a specific time of the day
> External Trigger: release the batch when an external trigger is received
After some analysis, I was convinced that Azure Service Bus queues are a good fit:
- External message store: the messages can be queued for a long time in an Azure Service Bus queue.
- Message and metadata together: the message is placed together with its properties on the queue. Each batch configuration can have its own queue assigned.
- Native Logic Apps integration: there is a Service Bus connector to receive multiple messages inside one Logic App instance. With the peek-lock pattern, you can reliably assign messages to a batch and remove them from the queue.
- Multiple batch release triggers:
> # Messages: In the Service Bus connector, you can choose how many messages you want to receive in one Logic App instance
> Time: Service Bus has a great property, ScheduledEnqueueTimeUtc, which ensures that a message only becomes visible on the queue from a specific moment in time. This is a great way to schedule messages to be released at a specific time, without the need for an external scheduler.
> External Trigger: The Logic App can be easily instantiated via the native HTTP Request trigger
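The time-based release hinges on computing the ScheduledEnqueueTimeUtc value up front. A minimal sketch of that computation (the helper name and the 17:00 UTC release slot are assumptions for illustration; in a real sender this timestamp would be set as a property on the Service Bus message):

```python
from datetime import datetime, timedelta, timezone

def next_release_time(now: datetime, release_hour: int = 17) -> datetime:
    """Compute the next occurrence of release_hour:00 UTC.

    The resulting timestamp would be assigned to the message's
    ScheduledEnqueueTimeUtc property, so the message stays invisible
    on the queue until that moment.
    """
    candidate = now.replace(hour=release_hour, minute=0, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(days=1)  # today's slot has already passed
    return candidate

# A message sent at 18:30 UTC is scheduled for 17:00 UTC the next day.
sent = datetime(2017, 6, 1, 18, 30, tzinfo=timezone.utc)
print(next_release_time(sent))  # 2017-06-02 17:00:00+00:00
```

Because the queue itself holds back the message, no external scheduler component is needed.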
The goal of this workflow is to put the message on a specific queue for batching purpose. This Logic App is very straightforward to implement. Add a Request trigger to receive the messages that need to be batched and use the Send Message Service Bus connector to send the message to a specific queue.
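The Service Bus connector carries the message body base64-encoded in a ContentData property. A small sketch of preparing such a payload before handing it to the Send Message action (the XML sample below is illustrative, not the exact message from this post):

```python
import base64
import json

# Illustrative XML message with a declaration and a namespace, similar
# in shape to the messages being batched in this post.
xml_message = (
    '<?xml version="1.0" encoding="utf-8"?>'
    '<ns0:Order xmlns:ns0="http://example.org/orders"><Id>1</Id></ns0:Order>'
)

# The connector expects the body as a base64 string in ContentData.
payload = {
    "ContentData": base64.b64encode(xml_message.encode("utf-8")).decode("ascii"),
    "ContentType": "application/xml",
}
print(json.dumps(payload, indent=2))
```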
This is the more complex part of the solution. The first challenge is to receive, for example, 3 messages in one Logic App instance. My first attempt failed, because the Service Bus receive trigger and the receive action behave differently:
- When one or more messages arrive in a queue: this trigger receives messages in a batch from a Service Bus queue, but it creates a separate Logic App instance for every message. This is not desired for our scenario, but it can be very useful in high-throughput scenarios.
- Get messages from a queue: this action can receive multiple messages in batch from a Service Bus queue. This results in an array of Service Bus messages, inside one Logic App instance. This is the result that we want for this batching exercise!
As a result, we get this JSON array back from the Service Bus connector:
The challenge is to parse this array, decode the base64 content in the ContentData property and create a valid XML batch message from it. I tried several complex Logic App expressions, but soon realized that Azure Functions is better suited to take care of this complicated parsing. I created the following Azure Function, as a Generic Webhook C# type:
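The gist of the parsing logic can be sketched as follows, here in Python with an illustrative payload shape (the real connector output carries more properties per message, and the batch root element name is an assumption):

```python
import base64
import re

# Illustrative stand-in for the array returned by the "Get messages from
# a queue" action: each entry carries its body base64-encoded in ContentData.
def encode(xml: str) -> str:
    return base64.b64encode(xml.encode("utf-8")).decode("ascii")

messages = [
    {"ContentData": encode('<?xml version="1.0"?>'
                           '<ns0:Order xmlns:ns0="http://example.org/orders"><Id>1</Id></ns0:Order>')},
    {"ContentData": encode('<?xml version="1.0"?>'
                           '<ns0:Order xmlns:ns0="http://example.org/orders"><Id>2</Id></ns0:Order>')},
]

def build_batch(msgs) -> str:
    """Decode every ContentData value, strip the per-message XML
    declaration (a document may contain only one), and wrap the
    results in a single batch root element."""
    parts = []
    for m in msgs:
        xml = base64.b64decode(m["ContentData"]).decode("utf-8")
        xml = re.sub(r"^<\?xml[^>]*\?>\s*", "", xml)  # drop the declaration
        parts.append(xml)
    return ('<?xml version="1.0" encoding="utf-8"?>'
            "<Batch>" + "".join(parts) + "</Batch>")

print(build_batch(messages))
```

Because each message keeps its own namespace declarations, the batch remains valid XML even though the individual declarations are stripped.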
Let's consume this function now from within our Logic App. There is seamless integration with Logic Apps, which is really great!
This solution is very nice, but what about large messages? Recently, I wrote a Service Bus connector that uses the claim check pattern, which exchanges large payloads via Blob Storage. In this batching scenario, we can also leverage this functionality. When I have open sourced this project, I'll update this blog with a working example. Stay tuned for more!
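The claim check pattern itself can be sketched generically: the large payload goes to an external store (Blob Storage in the connector mentioned above; an in-memory dict stands in for it here), and only a small claim ticket travels through the queue. All names below are hypothetical:

```python
import uuid

# Hypothetical in-memory stand-in for Blob Storage.
blob_store: dict[str, bytes] = {}

def check_in(payload: bytes) -> dict:
    """Store the large payload externally and return a small claim-check
    message that can safely travel through the Service Bus queue."""
    ticket = str(uuid.uuid4())
    blob_store[ticket] = payload
    return {"claim": ticket}

def check_out(message: dict) -> bytes:
    """Resolve the claim ticket back to the original payload."""
    return blob_store.pop(message["claim"])

# A payload far above the Service Bus message size limit round-trips fine.
large_xml = b"<Order>" + b"x" * 1_000_000 + b"</Order>"
ticket_message = check_in(large_xml)
assert check_out(ticket_message) == large_xml
```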
This is a great and flexible way to perform batching within Logic Apps. It really demonstrates the power of the Better Together story with Azure Logic Apps, Service Bus and Functions. I'm sure this is not the only way to perform batching in Logic Apps, so do not hesitate to share your solution for this common integration challenge in the comments section below!
I hope this gave you some fresh insights into the capabilities of Azure Logic Apps!