Codit Blog

Posted on Wednesday, July 22, 2015 1:36 PM

Tom Kerkhove by Tom Kerkhove

Security is more important than ever: not a day goes by without a company being hacked or a breach being detected in some third-party plugin.

We - as developers & IT Pros - are responsible for building hardened applications and for storing sensitive data as securely as if it were our own.

In this blog post I'll talk about Azure Key Vault and how it can help you store keys and secrets such as connection strings in the cloud.

Security is and will always be very important. In the past years we've seen how Snowden revealed the activities of the NSA, how a big company like Sony can be hacked, how governments spy on each other, and so on. On top of that, new technologies and concepts like the Internet of Things introduce new concerns and problems to tackle.

These events create more awareness around privacy, security & data ownership, while end users are still using passwords like '123456' according to CNet - good luck with that.

The applications that we, as developers/IT Pros, build are responsible for protecting those users' information as much as required, whatever it takes. Alas, building secure applications is not easy and requires planning & implementation from the start - it's not something you just add at the end of development. Unfortunately, some applications still have to deal with threats such as SQL injection, as Troy Hunt mentions on DotNetRocks, or even store passwords as plain text. Luckily we have Have I Been Pwned to notify us of these kinds of breaches.

Have I Been Pwned?

There are additional aspects we need to secure in our solution, e.g. where will we store the configuration values - in our web.config? How about our API keys & connection strings? And while considering where to store them, how do we protect them from humans such as operators? Can we shield the information from them? When you need to add support for encryption or signing, there is the additional burden of storing those keys.

It would be easy if all these sensitive secrets were stored in one central, secure place.
This is just the start; hopefully these are questions you've asked yourself before.

This is exactly where Azure Key Vault comes in and helps us with some of these concerns - let's have a look at how!

Introducing Azure Key Vault

Azure Key Vault is a service that enables you to store & manage cryptographic keys and secrets in one central, secure vault. All the sensitive data is stored on physical hardware security modules (HSMs) - FIPS 140-2 Level 2 certified - inside the datacenter, where the data is encrypted either by VMs or directly on the HSM; more on this later.

A vault owner can create a Key Vault, gaining full access & control over the vault. In a later release the vault owner will also have an audit trail available to see who accessed their secrets & keys. They are in full control of the key lifecycle as well: they can roll a new version of a key, back it up, etc.

A vault consumer can perform actions on the assets inside the Key Vault once the vault owner grants him/her access, depending on the permissions granted - we will discuss this in more detail in a second.

This enables us to give our customers full control over their sensitive data - they can decide what their key lifecycle looks like and who has access to it. Based on the audit logs they can see what the consumers are doing and whether they are still trustworthy.

On the other hand, developers are now no longer responsible for storing sensitive data such as API tokens, certificates & encryption keys. Operators will also no longer be able to see sensitive data in the database, web.config, etc.

Feature Overview

Let's dig a little deeper into the features it provides and their constraints.

Before we do so, it's important to know that all keys & secrets are versioned, allowing you to retrieve the latest version or stick to a specific one. A new version is created when you, for example, change the value of a secret.

Secrets

A secret is a sequence of bytes, limited to 10 kB, to which you can assign any value - this can be a certificate, a string or whatever you want.

The consumer can save or read back values based on the name of the secret, provided they have the required permissions. It is basically a key-value store that encrypts your data and stores it in the HSM.
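
As a minimal sketch with the Azure PowerShell Key Vault cmdlets (the vault and secret names below are just examples, and the property used to read the plain-text value may differ between cmdlet versions):

# Store (or roll a new version of) a secret - the value has to be a SecureString
$value = ConvertTo-SecureString 'Server=tcp:mydb.database.windows.net;...' -AsPlainText -Force
Set-AzureKeyVaultSecret -VaultName 'Codito' -Name 'Orders-ConnectionString' -SecretValue $value

# Read the latest version back (requires the 'get' permission on secrets)
$secret = Get-AzureKeyVaultSecret -VaultName 'Codito' -Name 'Orders-ConnectionString'
$secret.SecretValueText   # the plain-text value that is handed back to the consumer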

It's important to know that consumers receive the value of the secret as plain text. This means they can do anything with these values without the vault owner knowing what they are doing - the trust boundary ends once the data is sent back and the audit log has been updated.

On the other hand, if the type of data you are sharing allows rolling new versions, the consumer will have to come back every x minutes/hours/days to fetch the latest value. You make them dependent on the vault, which reduces the chance of losing control. This is something you need to consider as well: how are they storing your secrets? In a cache? A database? How are those secured? Rolling secrets is a good practice you should consider.

Keys

A key is a cryptographic RSA 2048-bit key that consumers can use for typical key operations such as encrypt, decrypt, sign, verify, etc. Key Vault handles all these operations for the consumers, because they can't read back the key value.

All keys are encrypted and stored in physical HSMs but come in two flavors:

  • Software keys use Azure VMs to handle operations on the keys. They are pretty cheap but less secure. These keys are typically used for dev/test scenarios.
  • HSM keys perform key operations directly on the HSM and are thus more secure. However, these keys are more expensive and require a Premium-tier vault.
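
As a minimal sketch, creating one of each flavor with the Key Vault cmdlets could look like this (the names are examples; the HSM variant assumes a Premium-tier vault):

# Software key - operations are handled on Azure VMs, cheaper, fine for dev/test
Add-AzureKeyVaultKey -VaultName 'Codito' -Name 'Signing-Dev' -Destination 'Software'

# HSM key - operations are performed inside the HSM itself, requires the Premium tier
Add-AzureKeyVaultKey -VaultName 'Codito' -Name 'Signing-Prod' -Destination 'HSM'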

A key has a higher latency than a secret; if you need to use the key frequently, it is recommended to store it as a secret instead.

Audit Logs (Coming soon)

In the near future Azure Key Vault will also provide audit logs of who accessed your vault and how often. These logs allow you to act on what is happening, e.g. revoking access for someone who doesn't need it anymore or is behaving suspiciously.

Bring-Your-Own-Key (BYOK)

Key Vault also allows you to transfer keys from your on-premises HSM up to Azure or back to your datacenter by using a secure HSM-to-HSM transfer. As an example, you can create keys on-premises and once your application goes into production in Azure you can transfer and use that key in Key Vault.

BYOK Flow

© Microsoft

If you want to know more about bring-your-own-key, I recommend this article.

Authentication

Azure Key Vault leverages enterprise-grade authentication & authorization by integrating with Azure Active Directory, where you grant a person or application in your directory access to the vault with a specific set of permissions. However, be aware that these permissions are granted at the vault level.

Here is a nice overview of how the authentication process works -

Key Vault Authentication Flow

© Microsoft

When you provision a Key Vault you need to change the Access Control Lists (ACL), this can be done with a simple PowerShell script.

Set-AzureKeyVaultAccessPolicy -VaultName 'Codito' -ServicePrincipalName $azureAdClientId -PermissionsToKeys encrypt -PermissionsToSecrets get

The consumer can then authenticate with Azure Active Directory by using their account id & secret or their account id & certificate. They then pass the granted token to Key Vault along with the operation they want to perform.
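
As a rough sketch of that flow using plain REST calls (the tenant id, client id/secret and vault & secret names are placeholders; 2015-06-01 is the API version available at general availability):

# 1. Authenticate against Azure AD and request a token for the Key Vault resource
$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" `
    -Body @{
        grant_type    = 'client_credentials'
        client_id     = $azureAdClientId
        client_secret = $azureAdClientSecret
        resource      = 'https://vault.azure.net'
    }

# 2. Hand the granted token to Key Vault together with the operation you want to perform
$secret = Invoke-RestMethod `
    -Uri 'https://codito.vault.azure.net/secrets/Orders-ConnectionString?api-version=2015-06-01' `
    -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" }
$secret.value   # the plain-text secret value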

If you want to revoke access or simply restrict a consumer, you can run the same script with fewer or no permissions.

This means that you can reuse your existing Azure Active Directory - in fact, having one is a requirement in order to use Key Vault.

Scenarios

Let's have a look at some of the scenarios where you could use Azure Key Vault. As I mentioned before, it's not a silver bullet, but it helps you store sensitive data as securely as possible.

Internal vault

The first scenario is a simple one - some applications have to use or communicate with third-party systems or parties.

Here are two examples :

  • Your application needs a connection string to know where its database is located and how it should authenticate with it.
  • An external service, such as Twilio, where you need to identify yourself with a token or password in order to gain access.

Where do you store these things? In a database? Nope - you'd need a connection string to reach that database in the first place. A common location is the web.config or app.config, however this is insecure: an operator can steal this data and sell it so other people can send text messages in your name.

You could use Azure Key Vault as an internal vault holding this data for you. When you need to authenticate with Twilio, you ask your vault for your API token and use it. Ideally you would cache it and let it expire after x minutes: get it, cache it, you get the picture.
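
A minimal caching sketch, assuming a PowerShell consumer and a hypothetical secret named 'Twilio-ApiToken' in a vault called 'Codito':

$script:cachedToken = $null
$script:cacheExpiry = [DateTime]::MinValue

function Get-TwilioApiToken {
    # Serve the cached value as long as it hasn't expired
    if ($script:cachedToken -and [DateTime]::UtcNow -lt $script:cacheExpiry) {
        return $script:cachedToken
    }

    # Otherwise ask the vault for the latest version of the secret and cache it again
    $secret = Get-AzureKeyVaultSecret -VaultName 'Codito' -Name 'Twilio-ApiToken'
    $script:cachedToken = $secret.SecretValueText
    $script:cacheExpiry = [DateTime]::UtcNow.AddMinutes(15)
    return $script:cachedToken
}

This way a rolled secret is picked up within 15 minutes, without hitting the vault on every single call.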

Sharing sensitive data with a third party

Another scenario is where a third party grants you access to their assets, in this example a database.

As mentioned in the previous section, there are a lot of ways to store a connection string, but this means that the third party needs to trust you with that information and has no clue how you store it. Here we are just storing it in the app settings in plain text.
Basic Scenario without Key Vault

© Microsoft

However, the customer could give you the same information by creating & sharing it as part of their Key Vault. This makes certain that the data is stored in a secure manner and they have an audit trail of how you interact with the service. If they don't like what you are doing, they can still revoke your access.
Sharing sensitive data with third party scenario

© Microsoft

An important thing to note is that when you, as the consumer, get the value of the secret, you get it as plain text and the customer has to trust you with it. You can still save it in a file, cache it, or whatever. On the other hand, the customer is more confident about how the secret data is stored and they have full control over it. If it were a rollable key, they could implement an automatic roll system, as we will see in a minute.

Multi-tenancy scenario

Key Vault can be used in a multi-tenancy scenario as well, where we combine the first two scenarios to build a trustworthy relationship with the customer. They can share their sensitive data, here an Azure Storage key, by allowing us to retrieve it from their Azure Key Vault. We, as a service, store the Azure Active Directory credentials for consuming their vault in an internal vault.

Multi-tenancy scenario

I'll walk you through how it could work:

  1. The customer provisions a new Azure Key Vault
  2. They create an Azure Active Directory entity for us and set the ACLs on the Key Vault
  3. Codito signs up for our service, giving us the AD id & secret and the names of the secrets we need
  4. We store that authentication data in our internal vault
  5. The service stores the names of the customer's secrets in a datastore, here an Azure SQL Database
  6. Our service authenticates with Azure AD to gain an access token
  7. We request the values of the secrets by passing the access token

Automatically roll keys

Last but not least, there is the scenario where you want to automatically roll your keys without breaking your running applications. Dushyant Gill wrote a very nice article on how you can automatically roll your Azure Storage key without breaking any applications.

Storing the vault authentication secrets

While these are only some of the possible scenarios, they share a common issue - how do you store the authentication data for your Key Vault?

Well, that's a hard one. As mentioned before, you have two types of authentication - with a password key or with a certificate. Personally, using a certificate seems like the way to go: it's easier to store securely than a password key and easier to shield from people, since they have to know where to look.

Although this does not get rid of the exposure entirely, it limits it and stores most of the data in a more secure way.

Integration with Azure services

You can use Azure Key Vault to store your keys and use them in other Azure services.

  • SQL Server Encryption in Azure VM (Preview) - When using SQL Server Enterprise you can use Azure Key Vault, through the SQL Server Connector, as an extensible key management provider. This allows you to use a key from Key Vault for Transparent Data Encryption (TDE), Column Level Encryption (CLE) & backup encryption. This is a feature you can use on-premises as well. More information here.

  • Azure Storage client side encryption (Preview) - You can now encrypt data before uploading to Azure Storage or decrypt while downloading. The SDK allows you to use keys from your existing Key Vault so you can manage them as you want. More information here.

  • VM Encryption with CloudLink - CloudLink allows you to encrypt and decrypt VMs while using Key Vault as a key repository. More information here

And there is even more, a full list can be found here.

Vault Management & Tooling

Management of your vault, such as provisioning a new one or setting the ACLs, can be done with PowerShell scripts or with the Azure CLI for Linux & Mac. Here is a PowerShell script that outlines some of the Key Vault cmdlets you can use.
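
A minimal sketch along those lines, assuming the 2015 Azure PowerShell cmdlets (the resource group, vault name and region are examples; with the 0.9.x module you may first need to switch to Resource Manager mode):

# Switch-AzureMode AzureResourceManager   # only needed with the 0.9.x module

# Provision a resource group and a vault in it
New-AzureResourceGroup -Name 'Codito-Security' -Location 'West Europe'
New-AzureKeyVault -VaultName 'Codito' -ResourceGroupName 'Codito-Security' -Location 'West Europe'

# Grant a consumer a minimal set of permissions on the vault
Set-AzureKeyVaultAccessPolicy -VaultName 'Codito' -ServicePrincipalName $azureAdClientId `
    -PermissionsToKeys encrypt -PermissionsToSecrets get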

If you go to the portal you can provision a new Key Vault by clicking New > Management > Key Vault > oh wait, it's not ready yet!

Portal - Provision a Azure Key Vault

Fair enough - in the end it's a secondary service focused on enterprises, and scripting this kind of thing is a good practice anyway.

From a consumer's perspective you can use the REST API, the .NET libraries on NuGet or the preview SDK for Node.js, with more in the works.

Vault Isolation

A Key Vault is dedicated to one specific region, so you will not be able to consult its data from within a different region. All the secrets and keys are stored in physical HSMs in that specific region; the data never leaves that geographic region.

Certain countries have laws demanding that data never leaves the region, and the same goes for certain compliance requirements. When you deploy your application across regions, this means you will have a Key Vault per region with the same structure of keys and secrets. The keys will all be different, but a secret can contain the same value across regions, such as a Twilio API key.

Thinking about disaster recovery

This limitation can cause some headaches when you are planning for disaster recovery. If your deployments in one region go down you still want to offer an alternative to your customers.

A possibility to cope with this is to set up manual synchronization for the secrets that are not region-specific. As an example, if we have a Twilio API key and an Azure Storage account key in our vault, we would only want to synchronize the API key, so that we only have to update one "master" vault.
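
A minimal sketch of such a synchronization, assuming one "master" vault and per-region vaults (all names are examples):

$regionVaults  = 'Codito-WestEurope', 'Codito-NorthEurope'
$sharedSecrets = 'Twilio-ApiToken'   # region-specific secrets are deliberately left alone

foreach ($secretName in $sharedSecrets) {
    # Read the value from the master vault and push it to every regional vault
    $master = Get-AzureKeyVaultSecret -VaultName 'Codito-Master' -Name $secretName
    foreach ($vault in $regionVaults) {
        Set-AzureKeyVaultSecret -VaultName $vault -Name $secretName -SecretValue $master.SecretValue
    }
}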

Vault Replication

Unfortunately, if you are heavily using keys there is no option for DR.

If you are limited to one region this will not be applicable for your scenario.

Thinking about pricing

So who pays for everything? It's pretty simple.

If you're the owner of the vault, then you pay for everything, while vault consumers don't have to pay for anything.

This means that if you have a chatty consumer, the cost of the vault increases without you having control over it. Luckily the prices are defined per 10,000 operations and are really low.

At the time of writing, you will be charged €0.0224 per 10,000 operations on a software key or secret while for HSM keys you also have to pay €0.7447 for each key and version of a key in your vault.

If you want the full details, here's a complete pricing overview.

Azure Key Vault is now generally available!

As of the 24th of June, Key Vault is generally available, meaning that you can use it in production environments, backed by a 99.9% SLA and Azure support plans.

You can read the announcement here.

Conclusion

Azure Key Vault has a lot to offer and helps developers store sensitive data as securely as possible, while the data owner has full control and proof of who is using their data.

However, Azure Key Vault is not a silver bullet - it was only built for secrets & keys - but it helps us a lot. In my opinion, every new project running in Azure should use Key Vault for optimal security around this kind of sensitive data, and set up automatic rerolling of authentication keys where possible.

There was also an interesting session at Ignite I recommend if you want to know more about Key Vault.

The question is not if you will be hacked, but when.

Thanks for reading,

Tom.

Categories: Azure Security
written by: Tom Kerkhove

Posted on Wednesday, July 8, 2015 3:16 PM

Glenn Colpaert by Glenn Colpaert

In this blogpost I will demonstrate how easy it is to expose SAP functionality to the cloud with the new Azure App Services. I will use a combination of API Apps to create a Logic App that exposes BAPI functionality to an external website.

In March of this year Microsoft announced the new Azure App Services, which brings together the functionality of Azure Websites, Azure Mobile Services and Azure BizTalk Services into a single development experience.

App Service has everything you need to build apps that target both web and mobile clients from a single app back-end. Using API Apps, you can connect your app to dozens of popular services like Office 365 and salesforce.com in minutes, and integrate your own APIs so they can be used within any app. Also a number of BizTalk capabilities are available as API Apps to be used inside complex integration scenarios. Finally with Logic Apps, you can automate business processes using a simple no-code experience.

I'll also use a client application that directly calls our provisioned Logic App through HTTP. Our Logic App will then call our on-premises SAP system over a Hybrid Connection and return the result to the client application.

Provisioning the API Apps

As Tom already explained in his blogpost, we currently have two options to provision API Apps. One is to provision them upfront, where you have full control over the naming; the other is to provision them while designing your Logic App, leaving Azure in control of the naming.

In this blogpost we will provision the API Apps upfront, so we have full control over all the naming and settings of our API apps.

HTTP Listener

The first API App we will provision is the HTTP Listener. The HTTP Listener allows you to open an endpoint that acts as an HTTP server and listens to incoming HTTP requests.

Open the Azure Marketplace and select the HTTP Listener API App and click on Create.

In the Configuration window provide an applicable name for your API App. Another important setting for our setup can be found inside the package settings. There we need to configure our HTTP Listener not to send a response back automatically. That way we are in full control of what response we send back to our calling clients.

Once our HTTP Listener is provisioned we can apply additional configuration to it. We can specify the level of security (None or Basic). We can choose if we want to automatically update our HTTP Listener to the newest version and we can also specify the Access Level.

To simplify this demo, I've chosen Security "None" and have set the Access level to "Public (anonymous)".

This rounds up the configuration of our HTTP Listener.

SAP Connector

The second API App we will provision is the SAP Connector. The SAP Connector lets you connect to an SAP server and invoke RFCs, BAPIs and tRFCs. It also allows you to send IDOCs to SAP Server.

Open the Azure Marketplace and select the SAP Connector API App and click on Create.

In the Configuration window, provide an applicable name for your API App. Click on package settings to configure the SAP-specific configuration values; once you're done filling in the necessary values, click on Create to provision the SAP Connector.

  • Server name - Enter the SAP Server name
  • User name - Enter a valid user name to connect to the SAP server.
  • Password - Enter a valid password to connect to the SAP server.
  • System number - Enter the system number of the SAP Application server.
  • Language - Enter the logon language.
  • Service bus connection string - This should be a valid Service Bus Namespace connection string.
  • RFCs - Enter the RFCs in SAP that are allowed to be called by the connector.
  • TRFCs - Enter the TRFCs in SAP that are allowed to be called by the connector.
  • BAPI - Enter the BAPIs in SAP that are allowed to be called by the connector.
  • IDOCs - Enter the IDOCs in SAP that can be sent by the connector.

Once our SAP Connector is provisioned, we need to install and configure the Hybrid Connection, as it will initially appear as Setup Incomplete.

Open the Hybrid Connection blade and click on Download and Configure to install the On-Premise Hybrid Connection Manager. This is a simple next-next finish installer, so this is very straightforward.

During the installation you will be prompted for the Relay Listen Connection String, copy this string from the Hybrid Connection Blade (Primary Configuration String).

Once the installation is finished and after a refresh (F5) of the portal, you will see the Hybrid Connection listed as connected.

This rounds up the configuration of our SAP Connector.

Creating the Logic App

In the second part of this blogpost we will use our provisioned API Apps to create a Logic App.

Logic Apps allow developers to design workflows that are activated with a trigger and execute a series of steps (API Apps).

Open the Azure Marketplace and select the Logic App and click on Create.

In the Configuration window provide an applicable name for your Logic App and click on Create to provision the Logic App.

Once our Logic App is provisioned we need to configure the necessary Triggers and Actions. You can start this configuration by clicking Triggers and Actions in your Logic App configuration.

Link the API Apps you created in the first part of this blog together as shown below. Click on Save to apply the changes to your Logic App.

This rounds up the configuration of our Logic App.

Testing your Logic App

You can test this Logic App by browsing to your HTTP Listener and opening up the host blade. Copy the URL value as shown below. You can now use this URL to submit HTTP requests to your Logic App. In my demo I executed an HTTP GET request from inside a website.

Be aware: before sending HTTP requests to the copied URL, change the HTTP prefix to HTTPS.
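
As a minimal sketch of such a test call from PowerShell (the URL is a placeholder for the value you copied from the host blade, with the prefix changed to HTTPS):

# Placeholder for the URL copied from the HTTP Listener's host blade
$logicAppUri = 'https://<your-http-listener>.azurewebsites.net/'

# Fire a GET request at the Logic App and inspect the result it returns from SAP
$response = Invoke-RestMethod -Method Get -Uri $logicAppUri
$response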

You can see the result of your testing inside your Logic App by navigating to the Operations Blade of your Logic App. 

Conclusion

The new Azure App Services are a big step forward for cloud integration if you compare them with the previous Azure BizTalk Services. However, they are still missing some heavy, enterprise-focused integration patterns like convoys, long-running processes and large message handling... Some improvements can also be made in separating configuration values from the runtime logic.

However, the future looks bright, and with this demo I showed you how easy it already is to expose SAP functionality to the cloud with Azure App Services.

Cheers,

Glenn Colpaert

Categories: App Services SAP Azure
written by: Glenn Colpaert

Posted on Tuesday, June 23, 2015 12:31 PM

Massimo Crippa by Massimo Crippa

It's crazy to see that the Power BI APIs are documented and managed on Apiary and not on Azure API management, isn't it?
Let’s see how to configure APIM so you can try out all of the Power BI APIs without writing a single line of code using the Microsoft Service.

Moving to Azure API Management is more than setting up the documentation and the interactive console with a different look and feel. It gives us the possibility to take advantage of capabilities like throttling, usage analytics, caching and many more.

Here is the four-step procedure I followed for this exercise:

  • Virtualization layer definition
  • Configure the Authorization Service (Azure Active Directory)
  • Configure Azure API Management to register the Authorization Service
  • Dev portal customization (optional)

Power BI API calls are made on behalf of an authenticated user by sending to the resource service an authorization token acquired through Azure AD.
The diagram below shows the OAuth 2.0 Authorization Code Grant flow.

Virtualization layer

The first thing to do is to create the API façade on Azure API Management, defining the set of operations that will be proxied to the Power BI web service (https://api.powerbi.com/beta/myorg).

Since Power BI APIs don't expose a swagger metadata endpoint, I manually created the API, added the operations, descriptions, parameters and representations (you can find the documentation here).

Then I defined my "Microsoft PowerBI APIs" product, making it visible to the Guest group and enabling the Subscription Required option, and I added the API to the product. With this configuration the API is visible to everyone, so you can freely access the documentation, while a subscription key is required to try out the API using the built-in console.

The Power BI APIs require an authentication token, so if we try to call the underlying service at this point of the procedure we receive a 403 Forbidden response.

Authorization Service

In order to provide a secure sign-in and authenticate our service calls to the Power BI APIs, we need to register our APIM built-in console in Azure AD as a web application. To complete this step you first need to sign up for the Power BI service and have an Azure AD with at least one organizational user.

Here you can find the registration procedure.

The main parameters to be set up are:

  • APP ID URI : Unique web application identifier. (e.g. https://coditapi.portal.azure-api.net/)
  • Reply URL : This is the redirect_uri for the auth code. The value configured in this field will be provided by API Management's "Register Authorization Server" procedure (next section).
  • Add the delegated permissions that will be included as claims in the authorization token.

Once the AD application is created you will get the ClientId and the ClientSecret values.

Note that the AD web application has been created in the Codit Office 365 Azure AD, so our setup is only valid for our corporate Power BI tenant. The Apiary setup is different: I imagine the Apiary web application is enabled by default in every Office 365 AD.

 

Register the Authorization Service

In this step we register the Authorization Service in Azure API Management and then we set up our Power BI façade API to use the registered Authorization Service.

Here you can find the step-by-step procedure.

The main parameters to be set up are:

  • Authorization endpoint URL and Token Endpoint URL. 
  • The Resource we want to access on behalf of the authenticated user. 
  • ClientId and ClientSecret: specify the values we got from Azure AD.

Authorization endpoint URL and Token endpoint URL: go to Azure AD, select the application section and click the Endpoint button to access the endpoint details.

The Resource service (Power BI) parameter must be specified as a body parameter using the application/x-www-form-urlencoded format. The value of the resource parameter is https://analysis.windows.net/powerbi/api.
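
As a minimal sketch of that token request (the tenant, client id/secret, authorization code and reply URL are placeholders; Invoke-RestMethod sends a hashtable body as application/x-www-form-urlencoded):

$token = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" `
    -Body @{
        grant_type    = 'authorization_code'
        client_id     = $clientId
        client_secret = $clientSecret
        code          = $authorizationCode
        redirect_uri  = $replyUrl
        resource      = 'https://analysis.windows.net/powerbi/api'
    }
$token.access_token   # the bearer token to send along with the Power BI calls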

Common errors with a wrong resource parameter are:

  • An error has occurred while authorizing access via PowerBI: invalid_resource AADSTS50001: Resource identifier is not provided. Trace ID: {Guid} Correlation ID: {Guid} Timestamp: {TimeStamp}
  • An error has occurred while authorizing access via PowerBI: invalid_request AADSTS90002: No service namespace named '{resourcename}' was found in the data store. Trace ID: {Guid} Correlation ID: {Guid} Timestamp: {timestamp}
  • An error has occurred while authorizing access via {APIM Authorization Server Name}

The "Register Authorization Server" procedure generates a redirect_uri that must be used to update the "Reply URL" value in the AD Web Application we set up in the previous step. If not, you'll get this error at the first tryout :

  • AADSTS50011: The reply address '{redirect_uri}' does not match the reply addresses configured for the application: {Guid}.

The last step is to configure our Power BI façade API to use the OAuth 2.0 authorization mechanism. Open the API, click on the Security tab, check the OAuth 2.0 box and select the registered Authorization Server.

Try it out

The API Management built-in console allows us to quickly test the API. 

Choose an API operation and then select "Authorization Code" from the authorization dropdown to access the sign-in user agent provided by Azure Active Directory. Once you have signed in, the HTTP request will be updated with the bearer token (the token value is not readable in the UI).

Now you can specify the desired values for the additional parameters, and click Send to test the API. 

If you are interested in getting access to the bearer token, use Fiddler to intercept the Azure AD reply and then jwt.io to analyze the token content with all its properties and claims.
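
Once you have the token, you can also call the façade outside the built-in console. A minimal sketch, assuming a 'datasets' operation was defined on the APIM API (the gateway URL, operation path and subscription key are placeholders):

$response = Invoke-RestMethod -Method Get `
    -Uri 'https://coditapi.azure-api.net/powerbi/datasets' `
    -Headers @{
        'Ocp-Apim-Subscription-Key' = $subscriptionKey
        'Authorization'             = "Bearer $bearerToken"
    }
$response   # the datasets visible to the signed-in user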

Developer Portal

As the Developer portal is a completely customizable CMS, you can set the look and feel following your branding strategy and add all the content you need to help drive API adoption.

In this case, I created a custom page (https://coditapi.portal.azure-api.net/PowerBI) dedicated to the Power BI APIs to collect some resources to facilitate the API intake like MSDN references and code samples.  

Conclusions

As you can see, you can very easily use Azure API Management to connect to Power BI APIs on behalf of an authenticated user.

Moving to API Management is not only like coming back home; it also gives us the possibility to take advantage of APIM capabilities like usage analytics, to get insights into the health and usage levels of the API and identify key trends that impact the business.

Cheers!

Massimo Crippa

Categories: Azure API Management
written by: Massimo Crippa

Posted on Tuesday, June 16, 2015 10:30 AM

Tom Kerkhove by Tom Kerkhove

Brecht Vancauwenberghe by Brecht Vancauwenberghe

On June 11th, the second edition of ITPROceed was organised. In this blog post you will find a recap of the sessions that we followed that day.

ITPROceed took place at Utopolis in Mechelen. A nice venue! Parking in front of the building and a working wifi connection immediately set the pace for a professional conference. There were 20 sessions in total (5 tracks).

Keynote: Microsoft Platform State-of-the-union: Today and Beyond By Scott Klein

Scott Klein kicked off the day with the keynote and gave us a nice state-of-the-union.
He explained the many advantages of using the cloud and compared it with the old days.

A big advantage is the faster delivery of features, where in the past it could take 3 to 5 years to deliver a feature through service packs or new versions of software.

Why Microsoft puts Cloud first:

  • Speed
  • Agility
  • Proven
  • Feedback

Next to that, Scott showed us several new features: SQL Server 2016, Azure SQL Data Warehouse, PolyBase, Data Lake, Windows 10, ...

The world is changing for IT professionals; this means lots of changes, but also a lot of opportunities!

Are you rolling your environments lights-out and hands-free? by Nick Trogh

Nick gave us a lot of demos and showed us how we could spin up new application environments in a reliable and repeatable process. In this session we looked into tools such as Docker, Chef and Puppet and how you can leverage them in your day-to-day activities.

Demystifying PowerBI

Speaker - Koen Verbeeck

Koen gave us a short BI history lesson, after which he illustrated what each tool is capable of, when you should use it and where you can use it. To wrap up, he showed the audience what Power BI v2 (Preview) looks like and how easy it is to use. A great introduction session to (Power) BI!

If you're interested in Power BI v2, you can start here.

Data Platform & Internet of Things: Optimal Azure Database Development by Karel Coenye

In this session Karel told us about several techniques to optimize databases in Azure, to get the most out of them while reducing the cost.
With Azure SQL databases you need to think outside the box and optimise with the following principle:

Cross premise connectivity with Microsoft Azure & Windows Server by Rasmus Hald

Running everything in the cloud in 2015 is very optimistic; often several systems are still running on-premises and have not yet been migrated to the cloud. Network connectivity between the cloud and on-premises is necessary!

Within Codit we already have experience with Azure networking, it was very nice to follow the session to get more tips and tricks from the field.

Rasmus covered four topics:

1. How Windows Server & Microsoft Azure can be used to extend your existing datacenter.
2. How to use existing 3rd party firewalls to connect to Azure.
3. The new Azure ExpressRoute offering.
4. Windows Azure Pack.

Big Data & Data Lake

 Speaker - Scott Klein

SQL Scott was back for more, this time on BIG data! He kicked off with an introduction on how much data was processed to calculate the space trip that brought Neil Armstrong to the moon, and how the amounts of data have evolved since.

Bigger amounts of data mean we need to be able to store them as efficiently as possible, but also be able to work with that data. In some cases plain old databases are not good enough anymore - that's where Hadoop comes in.

The Hadoop ecosystem allows us to store big amounts of data across data nodes, where we can use technologies such as Pig, Hive, Sqoop and others to process those big amounts of data.

As we start storing more and more data, Microsoft started delivering Hadoop clusters in Azure called HDInsight, which is based on the Hortonworks Data Platform. If you want to know more about HDInsight, have a look here.

Processing big data obviously requires the data itself; during his talk Scott also covered Azure Data Lake, which was announced at //BUILD/ this year. There are several scenarios where you can benefit from Azure Data Lake - it's built to store your data in its raw format without any limitation, whether on size, type or whatsoever.

In the slide below you can see how you can use Azure Data Lake in an IoT scenario.

Just to clarify - Data Lake is not something Microsoft invented; it's a well-known concept in the data space that is more or less the opposite of data warehousing. If you want to learn more about Data Lake or its relation to data warehousing, read Martin Fowler's take on it.

Scott wrapped up his session with a demo on how you can spin up an Azure HDInsight cluster. After that he used that cluster to run a Hive query on big files stored in an Azure Storage account as blobs.

Great introduction session to big data on Azure.

Securing sensitive data with Azure Key Vault by Tom Kerkhove

Speaker - Tom Kerkhove

In the closing session Tom introduced us to the concepts of Microsoft Azure Key Vault, which allows us to securely store keys, credentials and other secrets in the cloud.

 Why you should use Azure Key Vault:

  • Store sensitive data in hardware security modules (HSM)
  • Gives back control to the customer
    • Full control over lifecycle and audit logs
    • Management of keys
  • Removes responsibility from developers
    • Secure storage for passwords, keys and certificates
    • Protect production data

Posted on Monday, June 8, 2015 8:31 AM

Brecht Vancauwenberghe by Brecht Vancauwenberghe

Filiep Maes by Filiep Maes

Jonathan Maes by Jonathan Maes

Toon Vanhoutte by Toon Vanhoutte

Pieter Vandenheede by Pieter Vandenheede

On June 4th, BTUG.be organized the first "Integration Day". At Codit we were happy to be present and in this blog post you will find a recap of the sessions that were brought that day.

The BTUG Integration Day took place in the Moonbeat studio in Mechelen. A nice venue for about 35 eager participants of the event. There were 9 sessions in total, and the day kicked off with a nice keynote session from Jan Tielens.

Keynote

Speaker: Jan Tielens

Jan started off with a small recap of where integration comes from and where it is headed to. From monolithic designs to API apps, web apps and logic apps. He proceeded with a demo on provisioning API apps and logic apps and how to retrieve tweets using a certain #hashtag by using a recurrence app and the new Logic Apps.

The demo didn't go exactly as planned due to the network acting up, but it involved retrieving tweets from Twitter and sending them to a local Raspberry Pi printer. Later that day it turned out to work just fine:

Jan continued his keynote talking about the capabilities of API apps and the Swagger integration and the concept of hybrid integration: integration between different cloud services or the combination cloud and on-premises.

 

BizTalk Server Deep Dive Tips & Tricks for Developers and Admins

Speaker: Sandro Pereira

 

After the keynote, Sandro took the stage to have a session on BizTalk tips and tricks for both BizTalk administrators and developers. 

The first part was focused on BizTalk administration.


The most important tips:

  • Configure the default BizTalk backup job, to include custom databases
  • Take advantage of PowerShell to automate the tracking configuration settings
  • Automatically clean up custom message archiving folders

The second part was more developer oriented. Interesting tricks:

  • Use default or custom pipeline components in a generic way
  • Request/Response patterns are supported in a messaging only scenario
  • Via a registry change, the Business Rules Engine supports consuming static methods

It's good to see that most of the tips and tricks are already part of the Codit BizTalk Best Practices, our technical 'bible' to ensure quality within BizTalk integration projects!

 

Demystifying Logic Apps

Speaker: Ashwin Prabhu

Ashwin started by giving an overview of the history of Logic Apps; even though it hasn't been around for a long time, it has some interesting key points.

Microsoft first announced Windows Azure in 2008; in 2011 Service Bus was introduced, and this was the first integration (baby) step.

In 2013 BizTalk Services was announced, but after some architectural changes it was reworked to fit in the new ecosystem (App Services). The main reason for this is that Microsoft wants to provide a democratic ecosystem in which we - as developers - can (re)use functionality from each other (e.g. mapping functionality).
These different building blocks (Logic Apps, API Apps, Web Apps, Mobile Apps) give us an easy way to use different functionality without a steep learning curve.

During the demo, Ashwin created a Logic App with two different connectors, a SQL connector and a File connector: SQL Server was queried and some data was picked up and sent to the file connector.

What can we expect for Logic Apps in the future?

  • Integration patterns (convoys, long-running processes, auto delivery)
  • Large messaging handling
  • Azure services on premise.
  • Build-in designer for Visual Studio.
  • Bug fixes - it's important that you provide your feedback, Microsoft is ready to listen! (Tip: if you are using connectors at this moment and you don't want to be bothered with updates, you can disable auto-update in the portal.)
  • Better error messages

During the question round, Ashwin was asked whether Logic Apps were created to take over from BizTalk Server. BizTalk Server on-premises is here to stay, but things are moving! For example, a start-up may be better served with cloud services so they can focus on their functionality instead of infrastructure.

Microsoft's purpose is to provide an alternative in the cloud, but both worlds can exist next to each other.

 

5 Advanced techniques in BizTalk360 to Solve Operational Problems

Speaker: Saravana Kumar

Just before lunch Saravana took the lead and presented how BizTalk 360 can help you solve daily operational problems.

BizTalk 360 has 50 features focused on operations & monitoring.

Saravana's session was hands-on, containing 5 interesting demos.

1.  Visualize your message flow

  • However complex they are, with zero coding change you can visualize the BizTalk flows.
  • Admin console is difficult to understand, very technical.

2. Automate your operations

  • A lot of time is lost daily on monotonous tasks.
  • Data monitoring / MessageBox monitoring (In our opinion the BizTalk flows should handle these tasks as much as possible leaving no suspended messages/manual intervention).

3. Import/Export, Auto correct monitoring configuration

  • Import/Export: moving monitoring configuration from Test to Production.
  • Autocorrect: receive location down, gets automatically started by BizTalk 360.

4. Manage any internet connected BizTalk environment remotely

  • In a secure way
  • No complex VPN connection necessary
  • Handy for operations teams that need to be able to connect 24/7 to the BizTalk environment: BTS 360 is using Service bus Relay

5. Understand Throttling

  • Understanding throttling is a complex process and requires a BizTalk expert to understand the problem.
  • BizTalk 360 can be used to easily understand what the real problem is on the environment.

Next to BizTalk 360 there are different monitoring tools on the market (Codit Integration Dashboard, System Center Operations Manager, AIMS, BizTalk Health Monitor), each having their own advantages.

 

BAM Experiences in Large Scale Deployments

Speaker: Marius W Storsten

AIMS Innovation has - up until now - used BAM as a core component of their monitoring solution for BizTalk. Marius shared AIMS' experiences of using BAM in a monitoring setup: how it works, the good & the bad, performance, bugs, tips, tricks and challenges.

Marius tried to make it an interactive session, which is very nice, but I don't think he counted on a Belgian audience :)
Luckily some Dutch guys were quicker to answer.

It is AIMS' experience that the choice for BAM has not been the best one, and Marius illustrated this by referencing some of their experiences and discoveries around BAM. One of them is a dependency between bttdeploy.exe and the Tracking Profile Editor (TPE), meaning that bttdeploy.exe depends on TPE and not the other way around.

Marius concluded with some recommendations on using BAM:

There is also a small, but nice blog post up on their website about this as well: http://www.aimsinnovation.com/news/aims-blog/why-we-went-bam-less

 

Governance and API Management for BizTalk Server - what's the value and how to implement it?

Speaker: Andrew Slivker

In a world that consists of services and APIs that are business-critical for companies, we need governance and management of these.

What is governance about?

  • Manage services
  • Share metadata (SOAP, Swagger, ... )
  • Document services
  • Publish & register services
  • ... 

The management of the services consists of security (authentication & authorization), monitoring, SLAs, etc.

Sentinet manages SOA and API services and applications deployed on-premises, in the cloud, or in hybrid environments. To give us the possibility to govern & manage our services, Sentinet uses the gateway concept - publish internal services to partners, support multiple protocols, add monitoring, ... all that without changing the internal functionality of your services.

During the demo, Andrew showcased the Nevatech stack and the Sentinet solution. An internal net.tcp service hosted by BizTalk was consumed by clients through a virtual service hosted by Sentinet, via both SOAP and REST, without any development.

 

JSON and REST in BizTalk

Speaker: Steef Jan Wiggers

 

Steef Jan brought us a session about JSON and REST.
In a new hybrid world, integration is key and will become more important than ever. There are several systems, like SAP, that are not going to the cloud in the near future. BizTalk Server 2013 R2 offers capabilities to fulfill the demand for hybrid integration solutions using its JSON encoder/decoder.

The session was mostly based on demos, in which we also connected with the API of last.fm.

You can find this demo on the technet wikis as well: http://social.technet.microsoft.com/wiki/contents/articles/29719.biztalk-server-2013-r2-json-support-and-integration-with-cloud-api-s.aspx

 

Azure Event Hubs, StreamAnalytics and Power BI

Speaker: Sam Vanhoutte

In his IoT demo, Sam showed how to use Azure Event Hubs, Stream Analytics and Power BI.

There are a lot of similarities between BizTalk integration and IoT; it is all about connecting data.

A typical IoT event stream looks like:

  • Data generators (sensors)
  • Field gateways: Used as bridge between the cloud and the data generators
  • Event hubs: Used to collect data on a large scale
  • Stream Analytics: digest the data
  • Data analytics: Power BI

Event Hubs is a highly scalable publish-subscribe event ingestor that can intake millions of events per second, so that you can process and analyze the massive amounts of data produced by your connected devices and applications. In his demo, Sam showed how to set up an event hub and how it works using throughput units.

After collecting data you can use Stream Analytics for real-time analytics. Stream Analytics provides out-of-the-box integration with Event Hubs to ingest millions of events per second, and it is based on SQL syntax. Sam gave a demo of how Stream Analytics works.

Power BI is about visualizing data for the end user instead of presenting tables; a (free) Power BI dashboard is available. Currently, it has limited capabilities:

  • Data collection
  • Support for Power Maps
  • Pin reports, relationship between different views

Sam ended with an overall demo about traffic control: it generated speed information, sent the data to the event hub, used Stream Analytics to sort the data and finally showed the information in Power BI.

 

Conclusion

We had a blast at the Integration Day and hope to be present again next year! A big thank you to the BTUG.be organization and to the speakers and sponsors of this event. We (as Codit) are proud to be a part of it!