
Codit Blog

Posted on Thursday, May 4, 2017 1:40 PM

by Toon Vanhoutte

Recently, the product team released the first feature pack for BizTalk Server 2016. With it, Microsoft aims to bring more agility to the release model of BizTalk Server. The feature pack contains a lot of interesting features, of which the brand new Management & Operational API is an important one. Let's have a look at what's included!

The documentation of the Management API can be found here. In short: almost everything you can access in the BizTalk Administration Console is now available in the BizTalk Management API. The API is very well documented with Swagger, so it's pretty much self-explanatory.

What is included?

A complete list of available operations can be found here.

Deployment

There are new opportunities on the deployment side. Here are some ideas that popped into my mind:

  • Dynamically create ports. Some messaging solutions are very generic. Adding new parties is sometimes just a matter of creating a new set of receive and send ports. This can now be done through this Management API, so you don't need to do the plumbing yourself anymore.

  • Update tracking settings. We all know it's quite difficult to keep your tracking settings consistent across all applications and binding files. The REST API can now be leveraged to change the tracking settings on the fly to their desired state.
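To make the port-creation idea concrete, here is a minimal Python sketch that builds a receive-port payload and posts it to the Management API. The base address, route and property names are assumptions for illustration (check the Swagger documentation for the real contract), and authentication is left out:

```python
import json
import urllib.request

# Hypothetical base address of the BizTalk Management API; adjust to your setup.
BASE_URL = "http://biztalk-server/BizTalkManagementService"

def build_receive_port(name, description=""):
    """Build a minimal receive-port payload; the property names are assumptions."""
    return {"name": name, "description": description, "isTwoWay": False}

def create_receive_port(port):
    """POST the payload to the (assumed) ReceivePorts operation."""
    request = urllib.request.Request(
        BASE_URL + "/ReceivePorts",
        data=json.dumps(port).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # Requires network access and Windows credentials in a real environment.
    with urllib.request.urlopen(request) as response:
        return response.status
```

Onboarding a new party then becomes a matter of calling `create_receive_port(build_receive_port("Rp_NewParty_Orders"))` from your own tooling, instead of clicking through the Administration Console.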

Runtime

The runtime side might also benefit from this new functionality. Some scenarios:

  • Start and stop processes on demand. In situations where the business wants control over when certain processes are active, you can start/stop receive/send ports on demand. Just a small UI on top of the Management API, including the appropriate security measures, and you're good to go!

  • Maintenance windows. BizTalk sits in the middle of your application landscape. Deployments on backend applications can have a serious impact on running integrations. That's why stopping certain ports during maintenance windows is a good approach. This can now be easily automated or controlled by non-BizTalk experts.

Monitoring

Most new opportunities reside on the monitoring side. A couple of potential use cases:

  • Simplified and short-lived BAM. It's possible to create some simple reports with basic statistics of your BizTalk environment. You can leverage the Management API or the Operational OData Service. You can easily visualize the number of messages per port and, for example, the number of suspended instances. All of this is built on top of the data in your MessageBox and DTA databases, so there's no long-term reporting out-of-the-box.

  • Troubleshooting. There are very easy-to-use operations available to get a list of service instances with a specific status. That way, you can easily create a dashboard that gives an overview of all instances that require intervention. Suspended instances can be resumed and terminated through the Management API, without the need to access your BizTalk Server.
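As an illustration of such a troubleshooting dashboard, the Python sketch below fetches suspended instances and groups them per application. The query URL and the record shape are assumptions, not the documented contract; the grouping logic is the part that matters:

```python
import json
import urllib.request

BASE_URL = "http://biztalk-server/BizTalkManagementService"  # placeholder host

def fetch_suspended():
    """GET service instances with a suspended status (assumed query syntax)."""
    url = BASE_URL + "/OperationalData/Instances?status=SuspendedNotResumable"
    # Requires network access and credentials in a real environment.
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read())

def suspended_summary(instances):
    """Group instance records by application for a simple dashboard view.

    The 'application' field is an assumed record shape for illustration.
    """
    summary = {}
    for instance in instances:
        application = instance.get("application", "unknown")
        summary[application] = summary.get(application, 0) + 1
    return summary
```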

This is an example of the basic Power BI reports that are shipped with this feature pack.

What is not included?

This brand new BizTalk Management API is quite complete; I'm very excited about the result! As always, I looked at it with a critical mindset and tried to identify missing elements that would add even more value. Here are some aspects that are currently not exposed by the API, but would be handy in future releases:

  • Host Instances: it would be great to have the opportunity to also check the state of the host instances and to even start / stop / restart them. Currently, only a GET operation on the hosts is available.

  • Tracked Context Properties: I'm quite fond of these, as they enable you to search for particular message events, based on functional search criteria (e.g. OrderId, Domain…). Would be a nice addition to this API!

  • Real deployment: at first I thought that the new deployment feature was built on top of this API, but that was wrong. The API exposes functionality to create and manage ports, but no real option to update / deploy a schema, pipeline, orchestration or map. It could be nice to have, but on the other hand, we have a new deployment feature that we need to take advantage of!

  • Business Activity Monitoring: I really like the idea of the Operational OData Service, which smoothly integrates with Power BI. It would be great to have a similar and generic approach for BAM, so we can easily consume the business data without creating custom dashboards. The old BAM portal is really not an option anymore nowadays. You can vote here.

Conclusion!

Very happy to see more commitment from Microsoft towards BizTalk Server. This emphasises their "better together" integration vision on BizTalk Server and Logic Apps! Check out the BizTalk User Voice page if you want to influence the BizTalk roadmap!

The exposure of BizTalk as a REST API opens up a new range of great opportunities. Don't forget to apply the required security measures when exposing this API! By introducing this API, the need for auditing all activity becomes even more important!

Thanks BizTalk for this great addition! Thank you for reading!

Cheers,
Toon

Categories: BizTalk
written by: Toon Vanhoutte

Posted on Wednesday, May 3, 2017 5:06 PM

by Toon Vanhoutte

Recently, the product team released the first feature pack for BizTalk Server 2016. With it, Microsoft aims to bring more agility to the release model of BizTalk Server. The feature pack contains a lot of new and interesting features, of which the new scheduling capabilities are an important one. Let's have a look at what's included!

The documentation of this scheduling feature can be found on MSDN.

What is included?

Support for time zones

The times provided within the schedule tab of receive locations are now accompanied by a time zone. This ensures your solution no longer depends on the local computer settings. There's also a checkbox to automatically adjust for daylight saving time.

This is a small, but handy addition to the product! It avoids unpleasant surprises when rolling out your BizTalk solutions throughout multiple environments or even multiple customers!

Service window recurrence

The configuration of service windows is now a lot more advanced. You have multiple recurrence options available:

  • Daily: used to run the receive location every x number of days
  • Weekly: used to run the receive location on specific days of the week
  • Monthly: used to run the receive location on specific dates or specific days of the month

Until now, I didn't use the service window that much. These new capabilities enable some new scenarios. As an example, this comes in handy to schedule the release of batch messages at a specific time of day, which is often required in EDI scenarios!
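Conceptually, a weekly service window is nothing more than a predicate that decides whether a receive location is active at a given moment. The hypothetical Python sketch below illustrates that idea; it is not BizTalk's actual evaluation logic:

```python
from datetime import datetime

def in_weekly_window(now, days, start_hour, end_hour):
    """Return True when 'now' falls inside a weekly service window.

    'days' is a set of weekday numbers (Monday=0); the window runs from
    start_hour (inclusive) to end_hour (exclusive) on each of those days.
    """
    return now.weekday() in days and start_hour <= now.hour < end_hour

# A receive location active on weekdays between 08:00 and 18:00:
print(in_weekly_window(datetime(2017, 5, 3, 9, 0), {0, 1, 2, 3, 4}, 8, 18))
# prints: True
```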

What is not included?

This is not a replacement for the BizTalk Scheduled Task Adapter, which is a great community adapter! There is a fundamental difference between an advanced service window configuration and the Scheduled Task Adapter. A service window configures the time during which a receive location is active, whereas the Scheduled Task Adapter executes a pre-defined task on the configured recurrence cadence.

For the following scenarios, we still need the Scheduled Task Adapter:

  • Send a specific message every x seconds / minutes.
  • Trigger a process every x seconds / minutes.
  • Poll a REST endpoint every x seconds / minutes. Read more about it here.

Conclusion!

Very happy to see more commitment from Microsoft towards BizTalk Server. This emphasises their "better together" integration vision on BizTalk Server and Logic Apps! Check out the BizTalk User Voice page if you want to influence the BizTalk roadmap!

These new scheduling capabilities are a nice addition to BizTalk's toolbelt! In future feature packs, I hope to see similar capabilities as the Scheduled Task Adapter. Many customers are still reluctant to use community adapters, so a supported adapter would be very nice! You can vote here!

Thanks for reading!
Toon

Categories: BizTalk
written by: Toon Vanhoutte

Posted on Wednesday, May 3, 2017 10:55 AM

by Stijn Degrieck

"One in ten IT specialists in Belgium is a cheap Indian," some media recently wrote. They work for minimum wages, ensuring unfair competition, and do not make a fair contribution to our welfare state, since they are not covered by Belgian social security. It was the socialist trade union BBTK who rang the bell. "Belgian employees are losing their jobs and the government is missing out on 26 million euros each year," they complained. Employers in the Belgian IT sector deliberately abuse the employment status of their Indian programmers to find people on the cheap. Ouch, that hurts.

I do not doubt the figures from the BBTK. I honestly don’t know how many Indian IT specialists are currently in our country. I do however know how many IT people we need. It’s in the thousands. This is what Agoria is hearing from its members. And that permanent shortage is almost always the topic of discussion when I speak to colleagues in the sector. It is difficult for us all to fill vacancies, in spite of great wages, benefits and a huge investment in time and resources to create the most pleasant and dynamic work environment. Believe me: we go out of our way to do that. Recently, we even had an info stand at Facts, a quirky fair in Flanders Expo dedicated to comic and gaming fans, Trekkies, Star Wars fans and who knows what else. You can look it up on Facts.be, if you dare. But that’s another story. I think this job loss is not so huge. Those Indians are not taking our jobs away, we need them to fill in the gaps. It’s a good thing they exist! Because no IT specialist willing to work is out of a job for long here. In many companies, you don’t even need an official diploma anymore. A good self-taught person is better than an open vacancy.

What I refuse to believe, however, is that ‘employers’ purposefully try to save on social security by employing a low-cost workforce. That generalization is too easy. For your information, at Codit we are talking about one in 130. And that one Indian colleague is paid according to Belgian standards. The same way we reward our employees in France, The Netherlands, the United Kingdom, Switzerland and Portugal according to local conditions, regardless of their nationality. Does our Indian colleague earn a lot? By Indian standards, certainly. By Swiss standards, perhaps not.

I’m afraid the BBTK is barking up the wrong tree. People being taxed in their country of origin as a result of a trade agreement is a policy issue. Individual employers and their employees have little to do with it. And in any case, it is primarily rearguard action in a globalized economy. The vast majority of those cheap Indian IT employees work in ‘Belgian IT’, but not for a ‘Belgian company’. There are a lot of international players in our market. They are indeed trying to acquire work as cheaply as possible in order to stay competitive. I could call this distortion of competition. The thing is that we at Codit (and many other Belgian IT companies) look beyond the local market.

I invite the trade union to expand their field of view as well. Let's do something about that shortage, because it is putting a brake on the growth of Belgian IT companies. In our knowledge economy, we need to invest in talent. And that should not be limited to young people. We have a lot of people ‘on the bench’ whose skills no longer match the needs of our companies. Perhaps the trade union can help warm them to a career switch? Imagine meeting our ambitions and our country having lots of internationally relevant and innovative IT companies. That would be much more beneficial to our welfare state than fighting over breadcrumbs.

Stijn Degrieck is CEO of Codit, a fast growing and internationally active IT company headquartered in Ghent.

Note: This opinion was first published via De Standaard on 2 May 2017 (in Dutch). 

Categories: Opinions
written by: Stijn Degrieck

Posted on Tuesday, May 2, 2017 12:40 PM

by Toon Vanhoutte

Recently, the product team released the first feature pack for BizTalk Server 2016. With it, Microsoft aims to bring more agility to the release model of BizTalk Server. The feature pack contains a lot of new and interesting features, of which the automated deployment from VSTS is probably the most important one. This blog post contains a detailed walk-through.

Introduction

I've created this walkthrough mainly because I had difficulty fully understanding how it works. The documentation does not seem 100% complete and some blog posts I've read created some confusion for me. This is a high-level overview of how it works:

  1. The developer must configure which assemblies and bindings should be part of the BizTalk application. Also, the order of deployment must be specified. This is done in the new BizTalk Application Project.

  2. The developer must check-in the BizTalk projects, including the configured BizTalk Application Project. Also, the required binding files must be added to the chosen source control system.

  3. A build is triggered (automatically or manually). A local build agent compiles the code. By building the BizTalk Application Project, a deployment package (.zip) is automatically generated with all required assemblies and bindings. This deployment package (.zip) is published to the drop folder.

  4. After the build completes, the release can be triggered (automatically or manually). A local deploy agent, installed on the BizTalk server, takes the deployment package (.zip) from the build's drop folder and performs the deployment, based on the configuration done in step 1. Placeholders in the binding files are replaced by VSTS environment variables.

Some advice:

  • Make a clear distinction between build and release pipelines!
  • Do not create and check-in the deployment package (.zip) yourself!

You can follow the steps below to set up full continuous deployment of BizTalk applications. Make sure you check the prerequisites documented over here.

Create a build agent

As VSTS does not support building BizTalk projects out-of-the-box, we need to create a local build agent that performs the job.

Create Personal Access Token

For the build agent to authenticate, a Personal Access Token is required.

  • Browse to your VSTS home page. In my case this is https://toonvanhoutte.visualstudio.com

  • Click on the profile icon and select Security.

 

  • Select Personal access tokens and click Add

 

  • Provide a meaningful name, expiration time and select the appropriate account. Ensure you allow access to Agent Pools (read, manage).

 

  • Click Create Token

 

  • Ensure you copy the generated access token, as we will need this later.


Install local build agent

The build agent should be installed on the server that has Visual Studio, the BizTalk Project Build Component and BizTalk Developer Tools installed.

  • Select the Settings icon and choose Agent queues.

  • Select the Default agent queue. As an alternative, you could also create a new queue.

  • Click on Download agent

  • Click Download. Note that the required PowerShell scripts to install the agent are provided.

  • Open PowerShell as administrator on the build server.
    Run the following command to unzip and launch the installation:
    mkdir agent ; cd agent
    Add-Type -AssemblyName System.IO.Compression.FileSystem ; [System.IO.Compression.ZipFile]::ExtractToDirectory("$HOME\Downloads\vsts-agent-win7-x64-2.115.0.zip", "$PWD")

  • Execute this command to launch the configuration:
    .\config.cmd

  • Provide the requested information:
    > Server URL: https://toonvanhoutte.visualstudio.com
    > Authentication: PAT
    > PAT: The personal access token copied in the previous step

 

  • Press enter for default pool
  • Press enter for default name
  • Press enter for default work folder
  • Provide Y to run as a service
  • Provide user
  • Provide password

  • Double check that the local build service is created and running.

  • If everything went fine, you should see the build agent online!

Create a build definition

Let's now create and configure the required build definition.

  • In the Builds tab, click on New to create a new build definition.

  • Select Visual Studio to start with a pre-configured build definition. Click Next to continue.

  • Select your Team Project as the source, enable continuous integration, select the Default agent queue and click Create.

  • Delete the following build steps, so the build pipeline looks like this:
    > NuGet Installer
    > Visual Studio Test
    > Publish Symbols

  • Configure the Visual Studio Build step. Select the BizTalk solution that contains all required artifacts. Make sure Visual Studio 2015 is picked and verify that MSBuild architecture is set to MSBuild x86.

  • The other build steps can remain as-is. Click Save.

  • Provide a clear name for the build definition and click OK.

  • Queue a new build.

  • Confirm with OK.

  • Hopefully your build finishes successfully. If it fails, investigate and solve the issues.

Configure BizTalk Application

In this chapter, we need to create and configure the definition of our BizTalk application. The BizTalk Server 2016 Feature Pack 1 introduces a new BizTalk project type: BizTalk Server Application Project. Let's have a look at how we can use this to kick off an automated deployment.

  • On your solution, click Add, Add New Project.
  • Ensure you select .NET Framework 4.6.1 and you are in the BizTalk Projects tab. Choose BizTalk Server Application Project and provide a descriptive name.

  • Add references to all projects that need to be included in this BizTalk application and click OK.

  • Add all required binding files to the project. Make sure that every binding has Copy to Output Directory set to Copy Always. This way, the bindings will be included in the generated deployment package (.zip).

  • In case you want to replace environment-specific settings in your binding file, such as connection strings and passwords, you must add placeholders with the $(placeholder) notation.
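The substitution the deploy agent performs on these placeholders can be pictured as a simple token replacement. The Python sketch below is only an illustration of the concept, not the agent's actual implementation:

```python
import re

def apply_placeholders(binding_xml, variables):
    """Replace every $(Name) token in the binding file with its value.

    Unknown tokens are left untouched; whether the real agent behaves the
    same way is an assumption made for this illustration.
    """
    def substitute(match):
        return variables.get(match.group(1), match.group(0))
    return re.sub(r"\$\((\w+)\)", substitute, binding_xml)

binding = '<Host Name="BizTalkHost_$(Environment)" />'
print(apply_placeholders(binding, {"Environment": "DEV"}))
# prints: <Host Name="BizTalkHost_DEV" />
```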

  • Open the BizTalkServerInventory.json file and configure the following items:
    > Name and path of all assemblies that must be deployed in the BizTalk application
    > Name and path of all binding files that must be imported into the BizTalk application
    > The deployment sequence of assemblies to be deployed and bindings to be imported.
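To give an idea of its shape, a minimal BizTalkServerInventory.json could look like the sketch below. The property names and paths are reconstructed for illustration; verify them against a file generated by the project template:

```json
{
  "BizTalkAssemblies": [
    {
      "Name": "Codit.Demo.Schemas.dll",
      "Path": "..\\Codit.Demo.Schemas\\bin\\Release\\Codit.Demo.Schemas.dll"
    }
  ],
  "BindingsFiles": [
    { "Name": "Bindings.xml", "Path": "Bindings.xml" }
  ],
  "DeploymentSequence": [
    "Codit.Demo.Schemas.dll",
    "Bindings.xml"
  ]
}
```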

  • Right click on the BizTalk Application Project and choose Properties. Here you can specify the desired version of the BizTalk Application. Be aware that this version differs depending on whether you're building in debug or release mode. Click OK to save the changes.

 

  • Build the application project locally and fix any errors that occur. If the build succeeds, you should see a deployment package (.zip) in the bin folder. This package will be used to deploy the BizTalk application.

  • Check in the new BizTalk Application Project. This should automatically trigger a new build. Verify that the deployment package (.zip) is also available in the drop folder of the build. This can be done by navigating to the Artifacts tab and clicking on Explore.

  • You should see the deployment package (.zip) in the bin folder of the BizTalk Application Project.

Create a release definition

We've created a successful build that generates the required deployment package (.zip). Now it's time to configure a release pipeline that takes this deployment package as an input and deploys it automatically on our BizTalk Server.

  • Navigate to the Releases tab and click Create release definition.

  • Select Empty to start with an empty release definition and click Next to continue.

  • Choose Build as the source for the release, as the build output contains the deployment package (.zip). Make sure you select the correct build definition. If you want to set up continuous deployment, make sure you check that option. Click Create to continue.

  • Change the name of the Release to a more meaningful name.

  • Change the name of the Environment to a more meaningful name.

  • Click on the "…" icon and choose Configure variables.

  • Add an environment variable named Environment. This ensures that every occurrence of $(Environment) in your binding file will be replaced with the configured value (DEV). Click OK to confirm.

  • Click Add Tasks to add a new task. In the Deploy tab, click Add next to the BizTalk Server Application Deployment task. Click Close to continue.

  • Provide the Application Name in the task properties.

  • For the Deployment package path, navigate to the deployment package (.zip) that is in the drop folder of the linked build artefact. Click OK to confirm.

  • In the Advanced Options, specify the applications to reference, if any.

  • Select Run on agent and select the previously created agent queue to perform the deployment. In a real-world scenario, you will need a deployment agent per environment.

  • Save the release definition and provide a comment to confirm.

Test continuous deployment

  • Now trigger a release by selecting Create Release.

  • Keep the default settings and click Create.

  • In the release logs, you can see all details. The BizTalk deployment task has very good log statements, so in case of an issue you can easily pinpoint the problem. Hopefully you encounter a successful deployment!

  • On the BizTalk Server, you'll notice that the BizTalk application has been created and started. Notice that the application version is applied and the application references are created!

 


If you enabled the continuous integration and deployment options, there will now be an automated deployment each time you check a change into source control. Continuous deployment has been set up!

Wrap-up

Hope you've enjoyed this detailed, but basic walkthrough. For real scenarios, I highly encourage you to extend this continuous integration approach with:

  • Automated unit testing and optional integration testing
  • Versioning of the assembly files
  • Including the version dynamically in the build and release names

Cheers,
Toon

Categories: BizTalk
written by: Toon Vanhoutte

Posted on Thursday, April 27, 2017 5:22 PM

by Toon Vanhoutte

This blog post dives into the details of how you can achieve batching with Logic Apps. Batching is still a highly demanded feature for a middleware layer. It's mostly introduced to reduce the performance impact on the target system or for functional purposes. Let's have a closer look.

Scenario

For this blog post, I decided to try to batch the following XML message. As Logic Apps supports JSON natively, we can assume that a similar setup will work quite easily for JSON messages. Note that the XML snippet below contains an XML declaration, so pure string appending won't work. Namespaces are also included.

Requirements

I came up with the following requirements for my batching solution:

  • External message store: in integration, I like to avoid long-running workflow instances at all times. Therefore I prefer messages to be stored somewhere out-of-process, waiting to be batched, instead of keeping them active in a singleton workflow instance (e.g. a BizTalk sequential convoy).

  • Message and metadata together: I want to avoid storing the message in one place and the metadata in another. Keep them together, to simplify development and maintenance.

  • Native Logic Apps integration: preferably I can leverage an Azure service that has native and smooth integration with Azure Logic Apps. It must ensure we can reliably assign messages to a specific batch and easily remove them from the message store.

  • Multiple batch release triggers: I want to support multiple ways to decide when a batch can be released.
    > # Messages: send out batches of X messages each
    > Time: send out a batch at a specific time of the day
    > External Trigger: release the batch when an external trigger is received

Solution

After some analysis, I was convinced that Azure Service Bus queues are a good fit:

  • External message store: the messages can be queued for a long time in an Azure Service Bus queue.

  • Message and metadata together: the message is placed together with its properties on the queue. Each batch configuration can have its own queue assigned.

  • Native Logic Apps integration: there is a Service Bus connector to receive multiple messages inside one Logic App instance. With the peek-lock pattern, you can reliably assign messages to a batch and remove them from the queue.

  • Multiple batch release triggers:
    > # Messages: In the Service Bus connector, you can choose how many messages you want to receive in one Logic App instance

    > Time: Service Bus has a great property, ScheduledEnqueueTimeUtc, which ensures that a message only becomes visible on the queue from a specific moment in time. This is a great way to schedule messages to be released at a specific time, without the need for an external scheduler.

    > External Trigger: the Logic App can easily be instantiated via the native HTTP Request trigger

 

Implementation

Batching Store

The goal of this workflow is to put the message on a specific queue for batching purposes. This Logic App is very straightforward to implement. Add a Request trigger to receive the messages that need to be batched and use the Send Message Service Bus connector to send the message to a specific queue.

In case you want to release the batch only at a specific moment in time, you must provide a value for the ScheduledEnqueueTimeUtc property in the advanced settings.
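Computing that ScheduledEnqueueTimeUtc value boils down to finding the next occurrence of the configured release time. A small Python sketch of that calculation (the actual Service Bus send is left out, as it requires a real namespace and credentials):

```python
from datetime import datetime, timedelta

def next_release_time(now, hour):
    """Return the next occurrence of hour:00 UTC strictly after 'now'.

    This value would be assigned to ScheduledEnqueueTimeUtc, so the message
    only becomes visible on the queue at batch-release time.
    """
    candidate = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(days=1)
    return candidate

# A message queued at 14:30, with a daily release slot at 22:00 UTC:
print(next_release_time(datetime(2017, 4, 27, 14, 30), 22))
# prints: 2017-04-27 22:00:00
```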

Batching Release

This is the more complex part of the solution. The first challenge is to receive, for example, 3 messages in one Logic App instance. My first attempt failed, because there is apparently different behaviour between the Service Bus receive trigger and action:

  • When one or more messages arrive in a queue: this trigger receives messages in a batch from a Service Bus queue, but it creates a separate Logic App instance for every message. This is not desired for our scenario, but can be very useful in high-throughput scenarios.

  • Get messages from a queue: this action can receive multiple messages in a batch from a Service Bus queue. This results in an array of Service Bus messages inside one Logic App instance. This is the result that we want for this batching exercise!

Let's use the peek-lock pattern to ensure reliability and receive 3 messages in one batch:

As a result, we get this JSON array back from the Service Bus connector:

The challenge is to parse this array, decode the base64 content in the ContentData property and create a valid XML batch message from it. I tried several complex Logic App expressions, but soon realized that Azure Functions is better suited to take care of this complicated parsing. I created the following Azure Function, as a Generic Webhook C# type:
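The original C# function isn't reproduced here, but the transformation it performs is easy to describe: decode each base64 ContentData value, strip the XML declaration, and wrap the documents in a single batch root. A Python sketch of that logic, with an assumed root element and namespace:

```python
import base64
import re

def build_batch(service_bus_messages, root="Orders", ns="http://example.com/batch"):
    """Combine base64-encoded Service Bus messages into one XML batch.

    'service_bus_messages' is the JSON array returned by the connector; only
    the ContentData field is used here. The root element name and namespace
    are assumptions for illustration.
    """
    documents = []
    for message in service_bus_messages:
        xml = base64.b64decode(message["ContentData"]).decode("utf-8")
        # Strip the XML declaration, since it may only appear once, at the top.
        xml = re.sub(r"^<\?xml[^>]*\?>\s*", "", xml)
        documents.append(xml)
    body = "".join(documents)
    return '<?xml version="1.0"?><ns0:%s xmlns:ns0="%s">%s</ns0:%s>' % (root, ns, body, root)
```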

Let's now consume this function from within our Logic App. There is seamless integration with Logic Apps, which is really great!


As an output of the GetBatchMessage Azure Function, I get the following XML :-)

Large Messages

This solution is very nice, but what about large messages? Recently, I wrote a Service Bus connector that uses the claim check pattern, which exchanges large payloads via Blob Storage. In this batching scenario, we can also leverage this functionality. Once I have open sourced this project, I'll update this blog with a working example. Stay tuned for more!

Conclusion

This is a great and flexible way to perform batching within Logic Apps. It really demonstrates the power of the Better Together story with Azure Logic Apps, Service Bus and Functions. I'm sure this is not the only way to perform batching in Logic Apps, so do not hesitate to share your solution for this common integration challenge in the comments section below!

I hope this gave you some fresh insights in the capabilities of Azure Logic Apps!
Toon

Categories: Azure
Tags: Logic Apps
written by: Toon Vanhoutte