
Codit Blog

Posted on Thursday, September 10, 2015 8:18 PM

by Sam Vanhoutte

On September 10 and 11, Codit was present at the first Cortana Analytics Workshop in Redmond, WA. This blog post is a report for the community on the content we collected and our impressions of the Cortana Analytics offering. "From data to decisions to action" is the tagline for this event.

Once again, beautiful weather welcomed us in Redmond when we arrived for the first Cortana Analytics workshop.  People from all over the world joined this event, which was highly anticipated in the Microsoft data community.  

As Codit is betting big on new scenarios such as SaaS and hybrid integration, API management and IoT, we understand that the real value for our customers will be gained through (big) data intelligence.  Next to a lot of new faces, there were also quite a few integration companies attending the workshop, such as our partner Axon Olympus, our Colombian friends from IT Sinergy and Chris Kabat from SPR.  

I will start with some impressions and opinions from our side.  After that, you can find more information on some of the sessions we attended.

Key takeaways

At first sight, the Cortana Analytics suite combines a lot of the existing data services on Azure.  

  • Information management: Azure Data Catalog, Azure Event Hubs, Azure Data Factory
  • Big data stores: Azure Data Lake, Azure SQL Data Warehouse
  • Machine Learning and Analytics: Azure ML, Azure Stream Analytics, Azure HDInsight
  • Interaction: PowerBI (dashboarding), Cortana (personal assistant), Perceptual Intelligence (face, vision, speech and text)

So far, no specific pricing information has been announced.  The only statement in this regard was "one simple subscription model".

There are a lot of choices for implementing big data solutions on Azure, and new services get added frequently.  It will become very important to make the right choices for the right solution.  

  • Will we store our data in blob storage or in HDFS (Data Lake)?
  • Will we leverage Spark, Storm or the simpler Stream Analytics?
  • How will we query data?

Keynote session

The keynote session, in a packed McKinley room, was both entertaining and informative.  The event was kicked off by Corporate Vice President Joseph Sirosh, who positioned the Big Data and Analytics offering of Microsoft and Cortana Analytics.  The suite should give people the answers to typical questions such as 'What happened?', 'Why did it happen?', 'What will happen?' and 'What should I do?'.  To answer those questions, the Cortana Analytics suite provides the tools to access data, analyze it and take decisions.

Every conference has a diagram that returns in every single session.  For Cortana Analytics it is the following one, showing all services that are part of the platform. (apologies for the phone picture quality)

The top differentiators for Cortana Analytics Suite

  • Agility (Volume, Variety, Velocity)
  • Platform (storage, compute, real time ingest, connectivity to all your data, information management, big data environments, advanced analytics, visualization, IDE's)
  • Assistive intelligence (Cortana!)
  • Apps (Vertical toolkits, hosted API's, eco system)
  • Features: Elasticity, Fully managed, Economic, Open Source (R & Python)
  • Facilitators
  • Secure, compliant, auditable
  • One simple subscription model
  • Future proof, backed by research from MSR & Bing

Cortana Analytics should be Agile, Simple and beautiful.

A firm statement was that "if you can cook by following a recipe, you should be able to use Cortana Analytics".  While I don't think that means we can go and fire all of our data scientists, I really believe that the technology to perform data analytics becomes more easily available and understandable for traditional developers and architects.

This is achieved through the following:

  • Fully managed cloud services
  • A fully integrated platform
  • Very simple to purchase
  • Productize, simplify and democratize
  • Partner eco system


A lot of the services were demonstrated using some interesting and well-known examples.  Especially the demo and its underlying architecture were very interesting: that application was only possible through the tremendous scalability of Azure and the intelligent combination of the right services on the Azure platform.  

Demystifying Cortana Analytics

Jason Wilcox, Director of Engineering, was up next.  During a long intro on data analytics, he described the 'process' for data analytics as follows:

  1. Find out what data sets are available
  2. Gain access to the data
  3. Shape the data
  4. Run first experiment
  5. Repeat steps 1, 2, 3 and 4 until you get it right
  6. Find the insight
  7. Operationalize & take action

In his talk, Jason explained the following things about Cortana Analytics:

  • It is a set of fully managed services (true PaaS!)
  • It works with all types of data (structured and unstructured) at any scale.  Azure Data Lake is a native HDFS (Hadoop File System) for the cloud.  It is integrated with HDInsight, Hortonworks and Cloudera (with more services to come).  It is accessible from all HDFS-compatible projects, such as Spark, Storm, Flume, Sqoop, Kafka, R, etc.  And it is fully built on open standards!
  • Operationalize the data through Azure Data Catalog (publish data assets) which will be integrated in Cortana Analytics
  • Cortana Analytics is open to embrace and extend, allowing customers to use best-of-breed tools.
  • It's an end-to-end solution from data ingestion to presentation.


Real-time data processing: how do I choose the right solution?

Two Europeans from the Data Insight Global Practice (Simon Lidberg, SE and Benjamin Wright-Jones, UK) gave an overview of the three major streaming analytics services available on Azure: Azure Stream Analytics, Apache Storm and Apache Spark.  

Azure Stream Analytics

Azure Stream Analytics is a well-known service for us; we've been using it for more than a year now and have published blog posts and talks about it.  Simon gave a high-level overview of the service.

  • Fully managed service: No hardware deployment
  • Scalable: Dynamically scalable
  • Easy development: SQL Language
  • Built-in monitoring: View system performance through Azure portal

The typical Twitter sentiment demo was shown afterwards.  In my opinion, Azure Stream Analytics is indeed extremely easy to get started and to build quick win scenarios on Azure for telemetry ingestion, alerting and out of the box reporting.  
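Stream Analytics queries are written in a SQL dialect with windowing extensions. As a rough illustration of what such a query computes (this is not the service's API, and all names are made up), here is a toy Python sketch of a tumbling-window aggregation, grouping events into fixed, non-overlapping time windows:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Count events per (window, device), where windows are fixed,
    non-overlapping intervals of `window_seconds` - the same result
    shape a GROUP BY over a tumbling window produces."""
    counts = defaultdict(int)
    for timestamp, device in events:
        # Snap the timestamp down to the start of its window.
        window_start = timestamp - (timestamp % window_seconds)
        counts[(window_start, device)] += 1
    return dict(counts)

# Events as (seconds-since-start, device id); two 60-second windows.
events = [(0, "a"), (3, "a"), (4, "b"), (61, "a")]
print(tumbling_window_counts(events, 60))
```

The real service does this continuously over Event Hubs input, but the declarative query describes exactly this kind of grouping.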

Storm on HDInsight

Storm is a streaming framework available on HDInsight.  A quick overview of HDInsight was given, positioning things like Map/Reduce (batch), Pig (script), Hive (SQL), HBase (NoSQL) and Storm (streaming).

This is Apache Storm

  • Real time processing
  • Open Source
  • Visual Studio integration (C# and Java)
  • Available on HDInsight

Spark on HDInsight

Spark is extremely fast (3x faster than Hadoop in 2014).  Spark also unifies and combines Batch processing, Real Time processing, Stream Analytics, Machine Learning and Interactive SQL. 

Spark is integrated very well with the Azure platform.  There is an out of the box PowerBI connector and there is also support for direct integration with Azure Event Hubs.  

The differentiators for Spark were described as follows:

  • Enterprise Ready (fully managed service)
  • Streaming Capabilities (first class connector for Event Hubs)
  • Data Insight 
  • Flexibility and choice

Spark vs Storm comparison

Spark differs in a number of ways:

  • Workload: Spark implements a method for batching incoming updates vs individual events (Storm)
  • Latency: Seconds latency (Spark) vs Sub-second latency (Storm)
  • Fault tolerance: Exactly once (Spark) vs At least once (Storm)
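The workload difference above can be illustrated with a toy sketch (plain Python, nothing Spark- or Storm-specific): the same stream handled one event at a time versus in small batches:

```python
def process_per_event(stream, handle):
    """Storm-style: hand each event to the handler individually
    (lowest latency, one handler call per event)."""
    for event in stream:
        handle([event])

def process_micro_batches(stream, handle, batch_size):
    """Spark Streaming-style: buffer events into small batches before
    handing them off (seconds of latency, fewer handler calls)."""
    batch = []
    for event in stream:
        batch.append(event)
        if len(batch) == batch_size:
            handle(batch)
            batch = []
    if batch:
        handle(batch)  # flush the last, possibly partial, batch
```

Micro-batching trades latency for throughput; per-event processing minimizes latency at the cost of more per-event overhead.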

When to use what?

The following table compared the three technologies.  My advice would be to opt for Stream Analytics for quick wins and straightforward scenarios.  For more complex and specialized scenarios, Storm might be a better solution.  "It depends" would be the typical answer to the above question.

Below is a good comparison table, where the '*' typically means "with some limitations".

[Comparison table: the three services were compared on multi-tenancy, deployment model, deployment complexity, open-source support, language support (.NET, Java and Python for Storm; SparkSQL, Scala, Python, Java and more for Spark) and Power BI integration (native for both Stream Analytics and Spark).]
Overview of the Cortana Analytics Big Data stack (pt2)

In this session, four technologies were demonstrated and presented by four different speakers.  A very good session to get insight into the broad ecosystem of HDInsight-related services.

We were shown Hadoop (Hive for querying), Storm (for streaming), HBase (NoSQL) and the new Big Data applications that will become available on the new Azure portal soon.

A nice demo leveraging Hadoop HBase is the Tweet Sentiment demo.

Real-World Data Collection for Cortana Analytics

This was a very interesting session on a real-world IoT project (soil moisture).  It's always good to hear people talk from the field: people who have had issues and solved them, people who have the true experience.  This was one of those sessions.  It's hard to give a good summary, so I decided to just write down some of the findings that I took away.

  • You need to get the data right!
  • Data scientists should know about the meaning of the data and sit down with the right functional people.
  • An IoT solution should be built in such a way that it supports changing sensors, data types and versions over time.

Data Warehousing in Cortana Analytics

In this session, we got a good overview of the highly scalable offering of Azure SQL Data Warehousing, by Matt Usher. The characteristics of Azure SQL DW are:

  • You can start small, but scale huge
  • Designed for on-demand scale (<1 minute for resizing!)
  • Massive parallel processing
  • Petabyte scale
  • Combine relational and non-relational data (it is Polybase with HDInsight!)
  • It is integrated with AzureML, Power BI and Azure Data Factory
  • There is SQL Server compatibility (UDF's, Table partitioning, Collations, Indices and Columnstore support)


Closing keynote: the new Power BI

James Philips, CVP Microsoft

Power BI is obviously the flagship visualization tool, and it gets a lot of attention.  While there are still shortcomings for a lot of scenarios, it's an awesome tool that allows you to build reports very fast.  In this session, we got an overview of the new enhancements of Power BI and some insights into what's coming next.

While most of these features were known, it was good to get an overview and recap of them:

  • Power BI content packs
  • Custom Power BI visuals
  • Natural language query
  • On-premises data connectivity

Some things I did not know yet

  • There is R support in Power BI Desktop (plotting graphs and generating word clouds)
  • When you add devToolsEnabled=true to the URL, custom dev tools become available in Power BI
  • Cortana can be integrated with Power BI.  


This was it for the first day.  You can expect another blog post on Day 2 and more on Machine Learning from my colleague Filip.

Cheers! Sam


Posted on Friday, August 7, 2015 2:25 PM

by Sam Neirinck

by Tom Kerkhove

IoT is everywhere nowadays and it's here to stay - Almost everything is getting connected to the internet, even cows!

As of this year, Codit allows you to do more with your data and build your own Internet-of-Things (IoT) solutions to optimize your business flows, fix things before they break, predict costs and the like. We at Codit have invested a lot in IoT; some of it is already available on our blog and there is more to come.

However, we are not alone.

July 29th was the start of a new chapter for Microsoft - The release of Windows 10. This (last) version of Windows has a special SKU for IoT devices. Next to that Microsoft already offers a lot of services on Microsoft Azure that allow you to build out your IoT solutions in a scalable manner.

Let's have a closer look at some of the things Microsoft invested in and what they offer.

Windows 10 on Devices

On March 18, 2015 Microsoft officially announced Windows 10 IoT, its SKU for a wide range of intelligent, connected IoT devices. It targets small devices like gateways up to powerful industry devices like robotics and specialty medical devices.

Windows 10 IoT is free for Makers and commercial device builders. Additional good news is that Microsoft partnered with the Raspberry Pi Foundation, Intel and Qualcomm to ensure their operating system works great on a range of developer boards:

  • Raspberry Pi 2
  • MinnowBoard MAX
  • Qualcomm DragonBoard 410c

Along with these physical devices, it's also possible to interact with an Arduino via a Windows Remote Arduino library, or a Windows Virtual Shield for Arduino. Want more info? Look at the Getting Started page.

Windows 10 Insider Preview

Similar to the Insider Preview for Windows 10 desktop, the IoT SKU also had an Insider program. However, since the IoT SKU was released in parallel with the desktop, you can start experimenting today with the RTM bits, if you have a compatible board. Download the latest ISO on Microsoft IoT's download page.

The developer story

One of the big promises of Windows 10 is its Universal Windows Platform. This allows you to write your code once and have the same binary deployed across all sorts of devices, ranging from your small IoT device up to the Xbox or Surface Hub.

One Windows Platform

How does this work in practice? Specific devices have specific functionalities. An Xbox has a controller, perhaps a Kinect. A phone might have hardware buttons. An IoT device might not have a display at all, but quite a lot of other sensors.

For this reason, extension SDKs exist. They provide APIs that allow you to interact with the device-specific features. For IoT devices there's currently the Windows IoT Extension SDK, which you can reference if you're building a Universal Windows App. Once it's referenced, you can use these APIs to talk to the hardware. If you plan to run your app on devices other than IoT, you'll need to do runtime checks to see whether the device supports the IoT extensions, otherwise an exception will be thrown. You do this via the ApiInformation.IsTypePresent method.

So this allows you to develop for IoT devices using the same APIs (networking, storage, etc.) as desktop apps (non-Win32 apps), which should feel very familiar and reduce your time to market.

Powering your Internet-of-Things with Microsoft Azure

Everyone who's interested in Microsoft Azure knows that new features now ship almost every week, and keeping track has almost become a job of its own. The new, open Microsoft is also reflected in Azure, where several open-source technologies are supported and contributed to.

If we look at the past year from an IoT perspective, we've seen a ton of new services that are now generally available, in public preview, or announced & in private preview:

  • Event Ingestion with Azure Event Hubs or use Azure IoT Hub with integrated device management
  • Near-real time processing with Azure Stream Analytics or Apache Storm on Azure HDInsight
  • Build your custom logic on Azure Service Fabric, a hyper-scale infrastructure
  • Use Apache HBase on Azure HDInsight or Azure Storage for long-term storage with Azure Data Lake on the horizon
  • Move your data around with Azure Data Factory
  • Run your Hadoop workloads in Azure VMs or managed by Azure HDInsight running on Linux or Windows or use Apache Spark instead, also running on Azure HDInsight
  • Train & score sample data against your trained models with Azure Machine Learning

Microsoft is using these services heavily internally as well, which means that some of them are battle-tested and production-ready. Some examples are Bing, Halo and Cortana; the latter performs 500 million evaluations per second using Azure Service Fabric.

Azure IoT Suite & Cortana Analytics Suite

Microsoft announced two "suites" that bundle services to help you in certain scenarios - Azure IoT Suite & Cortana Analytics Suite. Neither suite is available yet; both are scheduled for this fall.

Azure IoT Suite

When you want to connect devices to the cloud in a secure way you have to do a lot of plumbing, i.e. provision & manage devices, revoke devices, support different protocols and the like. Azure IoT Suite introduces a new service called Azure IoT Hub that acts as a gateway and does all these things out of the box. If you want to know more about IoT Hub, you can watch this introduction session from //BUILD/. There will be support for Device-to-Cloud as well as Cloud-to-Device communication, with open-source device agents to communicate with.

Rockwell Automation is already using this.

Cortana Analytics Suite

Cortana Analytics Suite is more data- & analytics-focused: it provides a fully managed, subscription-based analytics suite. It will leverage tight integration between Azure big data services, Azure Machine Learning, Cortana & PowerBI. All the services that will be included are listed here.

North American Eagle is one of the reference cases that are using Cortana Analytics Suite to calculate where they can optimize in an attempt to break a world record.

They've blogged some additional customer references here.

Lead by example

Microsoft has also been doing a good job at giving examples of IoT scenarios and how you can develop your own IoT solution or what you should not do!

  • Clemens Vasters introduced Service Assisted Communication in a white paper which is a security pattern where he talks about possible security flaws and how a cloud gateway can help you. He also discussed this more deeply at //BUILD/ this year in his Azure IoT Security talk.
  • Connect the dots is an open-source project by Microsoft Open Technologies that shows you several IoT scenarios and how you can build them. More information can be found here.
  • IoT Journey by Microsoft Patterns & Practices is also an open-source project where they are building a real-world solution with guidance on what options they have and why they prefer one over the other. You can follow them on GitHub or watch them on Channel 9.
  • Connected Cows is a real-world scenario where they have literally connected cows to the internet to predict when a cow is ready to mate. Watch how Joseph Sirosh walks through the whole scenario during Strata + Hadoop 2015.


A lot has changed in the past year, and with the Windows 10 (IoT) release, the newly released Azure services and the suites on the way, I think we can say: Microsoft is ready, how about you?

Want to stay up to date? Feel free to subscribe to the Microsoft IoT blog!

Thanks for reading,

Sam & Tom

Categories: IoT

Posted on Wednesday, August 5, 2015 2:35 PM

by Massimo Crippa

The Nevatech team has just released a new version of their SOA and API management solution, Sentinet 4.5. This release comes with improvements, some enhancements and the much-awaited OAuth 2.0 support. You can find all the details in the release notes, but here I want to have a look at the main new features in this summer release.

Notes before the Upgrade

Sentinet 4.5 is compiled with .NET 4.5.1, which is now the minimum requirement starting from this version. Sentinet 4.5 is backwards compatible with the previous version, except for minor .NET 4.5-driven incompatibilities with the WIF framework (which is part of .NET starting from .NET 4.5).

The incompatibilities affect only custom extensibility components developed for Sentinet that use WIF. Specifically, if you developed custom Access Rules you will have to recompile your components' code because the interface signature changed (even though functionally everything stays the same).
For custom Access Rules, the IMessageEvaluator interface changed part of its Evaluate method signature.

//previous versions: 
bool Evaluate(IClaimsPrincipal principal, ref System.ServiceModel.Channels.Message message);
//from version 4.5 onwards 
bool Evaluate(ClaimsPrincipal principal, ref System.ServiceModel.Channels.Message message);

All other customizations (e.g. WCF extensibility, handlers, routing, Management API) are fully compatible; no adaptations are required.

OAuth 2.0

Version 4.5 introduces OAuth support only for the passive authorization mode, which means without any user-agent interaction to get the authorization code/token.

The Sentinet OAuth support can be set up on two sides:

  • Inbound (service side): used on the service side to validate the access token provided in the request message. Two types of validation are supported: JWT (validated by checking X509 certificates, RSA and HMAC signatures) or Reference (validated by calling the token validation endpoint).
  • Outbound (client side): used on the consumer side to retrieve an access token from the OAuth 2.0 server and insert it into the request message to access the backend service with the delegated authorization. Two authentication flows are supported: ResourceOwner (passing end-user and client credentials) and ClientCredentials (passing only the client credentials).
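Sentinet configures these flows for you, but the underlying OAuth 2.0 token requests are plain form posts defined by RFC 6749. As a hedged illustration (the parameter values are made up, and this is not Sentinet's API), here is what the request bodies for the two supported flows look like:

```python
from urllib.parse import urlencode

def client_credentials_body(client_id, client_secret, scope=None):
    """Token request body for the ClientCredentials flow
    (RFC 6749, section 4.4): only the client authenticates."""
    params = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }
    if scope:
        params["scope"] = scope
    return urlencode(params)

def resource_owner_body(client_id, client_secret, username, password):
    """Token request body for the ResourceOwner (password) flow
    (RFC 6749, section 4.3): end-user credentials travel along."""
    return urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "client_secret": client_secret,
        "username": username,
        "password": password,
    })

print(client_credentials_body("my-app", "s3cret"))
```

Either body is POSTed to the token endpoint; the JWT or reference token in the response is what the inbound side later validates.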

Depending on whether you need to validate the token coming from the consumer application or acquire a new token, the OAuth integration is done by configuring the new built-in WebOAuthSecurity binding (a full blog post will follow).


Process Pipeline enhancements

Sentinet 4.5 adds a set of new built-in message processing components that significantly enhance the product's capability to implement runtime message processing and message transformations. HTTP Header, HTTP Status, Query Parameter, SOAP Header, Message Body and Context Parameter components have been added to easily access and modify all the message parts.

The most interesting one is the Context Parameter component and the way we can combine it with the other components to enable different use cases. With the context property component we can:

  • Access any information contained in the message headers, content, URI and message metadata.
  • Extract the value using regular expressions, templates or XPath.
  • Use the context property value from other components in the processing pipeline.

For example, we can use a context property component to read a value from the incoming message content and assign it to a property that can be used by a decision shape in the pipeline.
The context property values are accessible from the other components using the "<{ContextPropertyName}>" token. In the example below, ContextProperty2 is used to control the response message flow (as part of the "if" clause) and ContextProperty1 to set the content of the response message.
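As a toy illustration of the token mechanism (this is not Sentinet code, just a sketch of the substitution behavior described above):

```python
import re

# Matches <{PropertyName}> tokens, tolerating whitespace inside the braces.
TOKEN = re.compile(r"<\{\s*(\w+)\s*\}>")

def expand(template, context):
    """Replace <{PropertyName}> tokens with values from the pipeline
    context; unknown tokens are left untouched."""
    return TOKEN.sub(
        lambda m: str(context.get(m.group(1), m.group(0))), template
    )

ctx = {"ContextProperty1": "approved"}
print(expand('{"status": "<{ContextProperty1}>"}', ctx))
```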

Node Activity Logs

Select a node and click on the Activity Logs tab to access the new view. This view helps Sentinet users understand what's going on with the Sentinet node and enables quick troubleshooting without connecting to the machine where the node is hosted.

If one of the services hosted in the node cannot be activated because of connection problems or firewall restrictions, a warning node status is reported in the summary tab.
The Activity Logs view enables us to identify the root cause and which service version is affected by the error, and to understand whether it is a general problem or a problem limited to a specific node instance.


No configuration data will be lost during the upgrade to Sentinet 4.5. All the Sentinet physical and virtual services and other associated objects will be available immediately after the upgrade.

Enjoy this new release, cheers.


Categories: API Management Sentinet
written by: Massimo Crippa

Posted on Wednesday, July 22, 2015 1:36 PM

by Tom Kerkhove

Security is more important than ever; no day goes by without a company being hacked or a breach being detected in some third-party plugin.

We - as developers & IT Pros - are responsible for building hardened applications and securely store sensitive data as if it were our own.

In this blog post I'll talk about Azure Key Vault and how it can help you store keys and secrets such as connection strings in the cloud.

Security is and will always be very important. In the past years we've seen how Snowden revealed the activities of the NSA, how a big company as Sony can be hacked, how governments spy on each other, etc. Next to that we also have the new technologies and concepts like the Internet-of-Things, these also introduce new concerns and problems to tackle.

These events create more awareness concerning privacy, security & data ownership, while end users are still using passwords like '123456' according to CNet - good luck with that.

The applications that we, as developers/IT Pros, build are responsible for protecting those users' information as much as required. Alas, building secure applications is not easy and requires planning & implementation from the start - it's not something you just add at the end of development. Unfortunately, some applications still have to deal with threats such as SQL injection, as Troy Hunt mentions on DotNetRocks, or even store passwords as plain text; luckily we have Have-I-Been-Pwned to notify us of these kinds of breaches.

Have I Been Pwned?

There are additional aspects we need to secure in our solution, e.g. where will we store the configuration values - in our web.config? How about our API keys & connection strings? While considering where to store them, how do we protect them from humans such as operators? Can we shield the information from them? When you need to add support for encryption or signing, there is the additional burden of storing those keys.

It would be easy if all these sensitive secrets were stored in one central secure place.
This is just the start; hopefully these are questions you've asked yourself before.

This is exactly where Azure Key Vault comes in and helps us with some of these concerns, let's have a look how!

Introducing Azure Key Vault

Azure Key Vault is a service that enables you to store & manage cryptographic keys and secrets in one central secure vault. All the sensitive data is stored on physical hardware security modules (HSM) - FIPS 140-2 Level 2 certified - inside the datacenter where the data will be encrypted by VMs or directly on the HSM, more on this later.

A vault owner can create a Key Vault gaining full access & control over the vault. In a later release the vault owner will have an audit trail available to see who accessed their secrets & keys. They are now in full control of the key lifecycles as well, they can roll a new version of the key, back it up, etc.

A vault consumer can perform actions on the assets inside the Key Vault when the vault owner grants him/her access and depending on the permissions granted, we will discuss this more in a sec.

This enables us to give our customers full control over their sensitive data - they can decide what their key lifecycle looks like and who has access to it. Based on the audit logs, they know what the consumers are doing and whether they are still trustworthy.

On the other hand, developers are now no longer responsible for storing sensitive data such as API tokens, certificates & encryption keys. Operators will also no longer be able to see sensitive data in the database, web.config, etc.

Feature Overview

Let's dig a little deeper into the features it provides and their constraints.

Before we do so, it's important to know that all keys & secrets are versioned, allowing you to retrieve the latest or stick to a specific version. A new version is created when you, for example, change the value of a secret.


Secrets

A secret is a sequence of bytes limited to 10 kB to which you can assign any value; this can be a certificate, a string or whatever you want.

The consumer can save or read back values based on the name of the secret, if they have the required permissions. It basically is a Key-Value store that encrypts your data and stores it in the HSM.

It's important to know that consumers will receive the value of the secret as plain text. This means that they can do anything with these values without the vault owner knowing what they are doing; the trust boundary ends when the data is sent back and the audit log has been updated.

On the other hand, if the type of data you are sharing allows rolling out new versions, the consumer will have to come back every x minutes/hours/days to fetch the latest value. You make them dependent on the vault, but reduce the chance of losing control. This is something you need to consider as well: how are they storing your secrets? In a cache? A database? How are those secured? Rolling secrets is a good practice you should consider.


Keys

A key is a cryptographic RSA 2048-bit key that consumers can use for typical key operations such as encrypt, decrypt, sign and verify. Key Vault handles all these operations for the consumers, because they can't read back the value of the key.

All keys are encrypted and stored in physical HSMs but come in two flavors:

  • Software Keys are using Azure VMs to handle operations on the keys. They are pretty cheap but less secure. These keys are typically used for dev/test scenarios.
  • HSM Keys are performing key operations directly on the HSM and thus more secure. However, these keys are more expensive and require you to use a Premium-tier vault.

A key has a higher latency than a secret; if you need to use a key frequently, it is recommended to store it as a secret.

Audit Logs (Coming soon)

In the near future Azure Key Vault will also provide audit logs of who accessed your vault and how often. These logs allow you to act on what is happening, e.g. revoking access for someone who doesn't need it anymore or is acting suspiciously.

Bring-Your-Own-Key (BYOK)

Key Vault also allows you to transfer keys from your on-premises HSM up to Azure or back to your datacenter by using a secure HSM-to-HSM transfer. As an example, you can create keys on-premises and once your application goes into production in Azure you can transfer and use that key in Key Vault.


© Microsoft

If you want to know more about bring-your-own-key, I recommend this article


Authentication & Authorization

Azure Key Vault leverages enterprise-grade authentication & authorization by integrating with Azure Active Directory, where you grant a person or application in your directory access to the vault with a specific set of permissions. However, be aware that these permissions are granted at the vault level.

Here is a nice overview of how the authentication process works:

Key Vault Authentication Flow

© Microsoft

When you provision a Key Vault you need to change the Access Control Lists (ACL), this can be done with a simple PowerShell script.

Set-AzureKeyVaultAccessPolicy -VaultName 'Codito' -ServicePrincipalName $azureAdClientId -PermissionsToKeys encrypt -PermissionsToSecrets get

The consumer can then authenticate with Azure Active Directory by using an account id & secret or an account id & certificate. You then take the granted token and pass it to Key Vault along with the operation you want to perform.

If you want to revoke access or simply restrict a consumer, you can run the same script with fewer or no permissions.

This means that you can reuse your existing Active Directory; unfortunately, it is also a requirement in order to use Key Vault.


Scenarios

Let's have a look at some of the scenarios where you should use Azure Key Vault. As I mentioned before, it's not a silver bullet, but it helps you store sensitive data as securely as possible.

Internal vault

The first scenario is a simple one - some applications have to use or communicate with third-party systems or services.

Here are two examples :

  • Your application needs a connection string to know where the database is located and how it should authenticate with it.
  • An external service, such as Twilio, requires you to identify yourself with a token or password in order to gain access.

Where do you store these things? In a database? No, because without the connection string you don't know where it is. A common location is the web.config or app.config, but this is insecure: an operator can steal this data and sell it, so other people can send text messages in your name.

You could use Azure Key Vault as an internal vault containing this data for you. When you need to authenticate with Twilio, you ask your vault for your API token and use it. Ideally you would cache it and let it expire after x minutes, then get it again, cache it, and so on; you get the picture.
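That fetch-cache-expire pattern can be sketched as follows; this is a minimal illustration in which `fetch_secret` is a hypothetical stand-in for the actual Key Vault call:

```python
import time

class SecretCache:
    """Caches secrets fetched from a vault and refreshes them after a TTL."""

    def __init__(self, fetch_secret, ttl_seconds=300):
        self._fetch = fetch_secret  # callable that reads the secret from the vault
        self._ttl = ttl_seconds
        self._cache = {}            # name -> (value, expiry timestamp)

    def get(self, name):
        value, expiry = self._cache.get(name, (None, 0))
        if time.time() >= expiry:   # missing or expired: fetch it again
            value = self._fetch(name)
            self._cache[name] = (value, time.time() + self._ttl)
        return value

# Usage sketch: the lambda stands in for a real Key Vault lookup.
cache = SecretCache(lambda name: "secret-value-for-" + name, ttl_seconds=60)
token = cache.get("TwilioApiToken")
```

This way a revoked or rolled secret is picked up automatically once the cached copy expires.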

Sharing sensitive data with a third party

Another scenario is where a third party grants you access to their assets, in this example a database.

As mentioned in the previous section, there are a lot of ways to store a connection string, but each of them means the third party needs to trust you with that information while they have no clue how you store it. Here we are just storing it in the app settings in plain text.
Basic Scenario without Key Vault

© Microsoft

However, the customer could give you the same information by creating & sharing it as part of their Key Vault. This ensures that the data is stored in a secure manner, and they have an audit trail of how you interact with the service. If they don't like what you are doing, they can still revoke your access.
Sharing sensitive data with third party scenario

© Microsoft

An important thing to note is that when you, as the consumer, get the value of the secret, you get it as plain text, and the customer has to trust you with it; you can still save it to a file or cache it. On the other hand, the customer is more confident about how the secret is stored and has full control over it. If it were a rollable key, they could implement an automatic roll system, as we will see in a minute.

Multi-tenancy scenario

Key Vault can also be used in a multi-tenancy scenario, where we use the first two scenarios to build a trustworthy relationship with the customer. They share their sensitive data, here an Azure Storage key, by allowing us to retrieve it from their Azure Key Vault. We, as a service, store the Azure Active Directory credentials for consuming their vault in an internal vault.

Multi-Tenancy scenario

I'll walk you through the process of how it could work :

  1. The customer provisions a new Azure Key Vault
  2. They create an Azure Active Directory entity for us and set the ACLs on the Key Vault
  3. The customer signs up for our service, giving us the AD ID & secret and the names of the secrets we need
  4. We store the authentication data in our internal vault
  5. The service stores the names of the customer's secrets in a datastore, here Azure SQL Database
  6. Our service authenticates with Azure AD to obtain an access token
  7. We request the values of the secrets by passing the access token
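Steps 6 and 7 of this flow boil down to something like the sketch below. Both `acquire_token` and `fetch_secret` are hypothetical stand-ins for the real Azure AD and Key Vault REST calls; the point is only to show how the stored pieces fit together:

```python
# Sketch of steps 6-7: authenticate with the customer's AD entity kept in our
# internal vault, then fetch the customer's secrets by name.

def read_customer_secrets(auth, secret_names, acquire_token, fetch_secret):
    # auth: (client_id, client_secret) stored in our internal vault (step 4)
    # secret_names: the names we stored in our datastore (step 5)
    token = acquire_token(*auth)                                       # step 6
    return {name: fetch_secret(token, name) for name in secret_names}  # step 7
```

The service never persists the secret values themselves, only the credentials and names needed to fetch them on demand.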

Automatically roll keys

Last but not least, there is the scenario where you want to automatically re-roll your keys without breaking your running applications. Dushyant Gill wrote a very nice article on how you can automatically roll your Azure Storage key without breaking any applications.

Storing the vault authentication secrets

While these are only some of the possible scenarios, they share a common issue: how do you store the authentication data for your Key Vault?

Well, that's a hard one. As mentioned before, there are two types of authentication: with a password key or with a certificate. Personally, using a certificate seems like the way to go: it is easier to store securely than a password key, and easier to shield from people, since they have to know where to look.

Although this does not get rid of the exposure entirely, it limits it and stores most of the data in a more secure way.

Integration with Azure services

You can use Azure Key Vault to store your keys and use them in other Azure services.

  • SQL Server Encryption in Azure VM (Preview) - When using SQL Server Enterprise, you can register Azure Key Vault through the SQL Server Connector as an extensible key management (EKM) provider. This allows you to use a key from Key Vault for Transparent Data Encryption (TDE), Column-Level Encryption (CLE) & backup encryption. This feature can be used on-premises as well. More information here.

  • Azure Storage client side encryption (Preview) - You can now encrypt data before uploading to Azure Storage or decrypt while downloading. The SDK allows you to use keys from your existing Key Vault so you can manage them as you want. More information here.

  • VM Encryption with CloudLink - CloudLink allows you to encrypt and decrypt VMs while using Key Vault as a key repository. More information here

And there is even more; a full list can be found here.

Vault Management & Tooling

Management of your vault, such as provisioning a new one or setting the ACLs, can be done with PowerShell scripts or with the Azure CLI for Linux & Mac. Here is a PowerShell script that outlines some of the Key Vault cmdlets you can use.

If you go to the portal you can provision a new Key Vault by clicking New > Management > Key Vault > oh wait, it's not ready yet!

Portal - Provision an Azure Key Vault

Fair enough; in the end it's a secondary service focused on enterprises, and scripting such a thing is a good practice anyway.

From a consumer's perspective you can use the REST API, the .NET libraries on NuGet or the preview SDK for Node.js, with more in the works.

Vault Isolation

A Key Vault is dedicated to one specific region, and thus you will not be able to consult its data from a different region. All the secrets and keys are stored in physical HSMs in that specific region; the data will never leave that geographic region.

Certain countries have laws demanding that data never leave the region, and the same goes for certain compliance requirements. When you deploy your application across regions, this means you will have a Key Vault per region with the same structure of keys and secrets. The keys will all be different, but a secret may contain the same value across regions, such as a Twilio API key.

Thinking about disaster recovery

This limitation can cause some headaches when you are planning for disaster recovery. If your deployments in one region go down you still want to offer an alternative to your customers.

One way to cope with this is to set up manual synchronization for the secrets that are not region-specific. As an example, if we have a Twilio API key and an Azure Storage account key in our vault, we would only want to synchronize the API key, so we only have to update one "master" vault.
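Such a synchronization job could look roughly like this. `DictVault` is a stand-in for a real per-region Key Vault client; in practice the get/set calls would go through the Key Vault REST API of each region:

```python
# Sketch: replicate non-region-specific secrets from a "master" vault to the
# per-region replicas. Region-specific secrets (e.g. a Storage key) are left out.

SHARED_SECRETS = ["TwilioApiKey"]   # region-independent secrets to replicate

class DictVault:
    """Dict-backed stand-in for a per-region Key Vault client."""
    def __init__(self):
        self._secrets = {}
    def get_secret(self, name):
        return self._secrets[name]
    def set_secret(self, name, value):
        self._secrets[name] = value

def synchronize(master, replicas):
    for name in SHARED_SECRETS:
        value = master.get_secret(name)
        for replica in replicas:
            replica.set_secret(name, value)
```

With this in place, rolling the Twilio key only requires updating the master vault and re-running the job.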

Vault Replication

Unfortunately, if you are making heavy use of keys, there is no option for DR.

If you are limited to one region this will not be applicable for your scenario.

Thinking about pricing

So who pays for everything? It's pretty simple.

If you're the owner of the vault, then you pay for everything, while vault consumers don't have to pay anything.

This means that if you have a chatty consumer, the cost of the vault increases without you having control over it. Luckily the prices are defined per 10,000 operations and are really low.

At the time of writing, you will be charged €0.0224 per 10,000 operations on a software key or secret while for HSM keys you also have to pay €0.7447 for each key and version of a key in your vault.
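As a quick back-of-the-envelope check of those numbers (assuming, as the wording above suggests, that operations on HSM keys are billed at the same per-10,000 rate, plus the per-key fee):

```python
# Rough monthly cost estimate based on the prices quoted above (EUR).
OP_PRICE = 0.0224       # per 10,000 operations on a software key or secret
HSM_KEY_PRICE = 0.7447  # per HSM key (and per key version) in the vault

def monthly_cost(software_ops, hsm_keys, hsm_ops=0):
    return (software_ops + hsm_ops) / 10_000 * OP_PRICE + hsm_keys * HSM_KEY_PRICE

# 1,000,000 secret operations plus one HSM key handling 50,000 operations:
print(round(monthly_cost(1_000_000, 1, 50_000), 4))  # -> 3.0967
```

So even a fairly chatty consumer only adds a few euros per month.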

If you want a complete overview, you can find one here.

Azure Key Vault is now generally available!

As of the 24th of June, Key Vault is generally available, meaning that you can use it in production environments and that it is backed by a 99.9% SLA and the Azure Support Plan.

You can read the announcement here.


Azure Key Vault has a lot to offer and helps developers store sensitive data as securely as possible, while the data owner has full control and proof of who is using their data.

However, Azure Key Vault is not a silver bullet; it was built only for secrets & keys, but it helps us a lot. In my opinion, every new project running in Azure should use Key Vault for optimal security around this kind of sensitive data, and should set up automatic re-rolling of authentication keys where possible.

There was also an interesting session at Ignite that I recommend if you want to know more about Key Vault.

The question is not if you will be hacked, but when.

Thanks for reading,


Categories: Azure Security
written by: Tom Kerkhove

Posted on Wednesday, July 8, 2015 3:16 PM

Glenn Colpaert by Glenn Colpaert

In this blogpost I will demonstrate how easy it is to expose SAP functionality to the cloud with the new Azure App Services. I will use a combination of API Apps to create a Logic App that exposes BAPI functionality to an external website.

In March of this year Microsoft announced the new Azure App Services, which brings together the functionality of Azure Websites, Azure Mobile Services and Azure BizTalk Services into a single development experience.

App Service has everything you need to build apps that target both web and mobile clients from a single app back-end. Using API Apps, you can connect your app to dozens of popular services like Office 365 in minutes, and integrate your own APIs so they can be used within any app. A number of BizTalk capabilities are also available as API Apps to be used inside complex integration scenarios. Finally, with Logic Apps you can automate business processes using a simple no-code experience.


In this demo, a client application will directly call our provisioned Logic App over HTTP. The Logic App will then call our on-premises SAP system over a Hybrid Connection and return the result to the client application.

Provisioning the API Apps

As Tom already explained in his blogpost, we currently have two options to provision API Apps: provision them upfront, where you have full control over the naming, or provision them while designing your Logic App, leaving the naming to Azure.

In this blogpost we will provision the API Apps upfront, so we have full control over all the naming and settings of our API Apps.

HTTP Listener

The first API App we will provision is the HTTP Listener. The HTTP Listener allows you to open an endpoint that acts as an HTTP server and listens for incoming HTTP requests.

Open the Azure Marketplace and select the HTTP Listener API App and click on Create.

In the Configuration window, provide an applicable name for your API App. Another important setting for our setup can be found in the package settings: there we need to configure our HTTP Listener not to send a response back automatically, so that we are in full control of the response we send back to calling clients.

Once our HTTP Listener is provisioned we can apply additional configuration: we can specify the level of security (None or Basic), choose whether to automatically update our HTTP Listener to the newest version, and specify the access level.

To simplify this demo, I've chosen Security "None" and have set the Access level to "Public (anonymous)".

This rounds up the configuration of our HTTP Listener.

SAP Connector

The second API App we will provision is the SAP Connector. The SAP Connector lets you connect to an SAP server and invoke RFCs, BAPIs and tRFCs. It also allows you to send IDOCs to the SAP server.

Open the Azure Marketplace and select the SAP Connector API App and click on Create.

In the Configuration window, provide an applicable name for your API App. Click on package settings to configure the SAP-specific configuration values; once you're done filling them in, click on Create to provision the SAP Connector.

  • Server name - Enter the SAP Server name
  • User name - Enter a valid user name to connect to the SAP server.
  • Password - Enter a valid password to connect to the SAP server.
  • System number - Enter the system number of the SAP Application server.
  • Language - Enter the logon language.
  • Service bus connection string - This should be a valid Service Bus Namespace connection string.
  • RFCs - Enter the RFCs in SAP that are allowed to be called by the connector.
  • TRFCs - Enter the TRFCs in SAP that are allowed to be called by the connector.
  • BAPI - Enter the BAPIs in SAP that are allowed to be called by the connector.
  • IDOCs - Enter the IDOCs in SAP that can be sent by the connector.

Once our SAP Connector is provisioned, we need to install and configure the Hybrid Connection, as it will initially appear as Setup Incomplete.

Open the Hybrid Connection blade and click on Download and Configure to install the on-premises Hybrid Connection Manager. This is a simple next-next-finish installer, so it is very straightforward.

During the installation you will be prompted for the Relay Listen Connection String, copy this string from the Hybrid Connection Blade (Primary Configuration String).

Once the installation is finished and after a refresh (F5) of the portal, you will see the Hybrid Connection listed as connected.

This rounds up the configuration of our SAP Connector.

Creating the Logic App

In the second part of this blogpost we will use our provisioned API Apps to create a Logic App.

Logic Apps allow developers to design workflows that are activated with a trigger and execute a series of steps (API Apps).

Open the Azure Marketplace and select the Logic App and click on Create.

In the Configuration window provide an applicable name for your Logic App and click on Create to provision the Logic App.

Once our Logic App is provisioned we need to configure the necessary Triggers and Actions. You can start this configuration by clicking Triggers and Actions in your Logic App configuration.

Link the API Apps you created in the first part of this blog together as shown below. Click on Save to apply the changes to your Logic App.

This rounds up the configuration of our Logic App.

Testing your Logic App

You can test this Logic App by browsing to your HTTP Listener and opening up the host blade. Copy the URL value as shown below. You can now use this URL to submit HTTP requests to your Logic App. In my demo I have executed an HTTP GET request from inside a website.

Be aware: before sending HTTP requests to the copied URL, change the HTTP prefix to HTTPS.
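That prefix change is easy to forget, so it can be worth enforcing in code before calling the endpoint. A tiny sketch (the URL below is a made-up example, not the real listener address):

```python
def ensure_https(url):
    """Rewrite an http:// Logic App URL to https://, which the listener requires."""
    if url.lower().startswith("http://"):
        return "https://" + url[len("http://"):]
    return url

# Example with a hypothetical listener URL:
print(ensure_https("http://mylistener.azurewebsites.net/trigger"))
# -> https://mylistener.azurewebsites.net/trigger
```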

You can see the result of your testing inside your Logic App by navigating to the Operations Blade of your Logic App. 


The new Azure App Service is a big step forward for cloud integration compared with the previous Azure BizTalk Services. However, it is still missing some heavy, enterprise-focused integration patterns like convoys, long-running processes and large message handling. Some improvements could also be made in separating configuration values from the runtime logic.

However, the future looks bright, and with this demo I showed you how easy it already is to expose SAP functionality to the cloud with Azure App Services.


Glenn Colpaert

Categories: App Services SAP Azure
written by: Glenn Colpaert