
Codit Blog

Posted on Friday, May 5, 2017 4:56 PM

by Massimo Crippa

The Analytics module in Azure API Management provides insights into the health and usage levels of your APIs, helping you identify the key trends that impact the business. Analytics also provides a number of filtering and sorting options to better understand who is using what. But what if I want more? For example, how about drill-down reports or mobile access?

I am a big fan of Power BI, so let's combine the power of Azure Functions with the simplicity of the APIM REST API to flow the analytics data into Power BI.

The picture below displays my scenario: an Azure Function connects and combines APIs from different Azure services (AAD, APIM, Storage) to create a smooth and lightweight integration.

It's a serverless architecture, which means we don't have to worry about the infrastructure and can focus on the business logic instead, with rapid iterations and a faster time to market.

The APIM analytics (aggregated data) can be read by calling the report REST API. This information can then be written to Azure Tables and automatically synchronized with Power BI. 



The Azure function:

  1. Is triggered via HTTP POST. It accepts a body parameter with the report name (byApi, byGeo, byOperation, byProduct, bySubscription, byUser) and the day to export.

  2. Calls the AAD token endpoint using the resource owner password flow to get the access token to authorize the ARM call.

  3. Calls the APIM report REST API, filtered on the day to export (e.g. $filter=timestamp%20ge%20datetime'2017-02-15T00:00:00'%20and%20timestamp%20le%20datetime'2017-02-16T00:00:00').

  4. Iterates through the JTokens in the response body to build an IEnumerable<DynamicTableEntity> collection that is passed to CloudTable.ExecuteBatch to persist the data in Azure Storage (a condensed sketch follows the list).
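
For illustration, here is a condensed C# sketch of that flow. The placeholder values ({tenant}, {sid}, the storage connection string) and the table layout are my assumptions; the original function code is not reproduced in this post.

```csharp
// Hypothetical sketch: export one APIM report for one day to Azure Table storage.
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;
using Newtonsoft.Json.Linq;

public static class ApimReportExporter
{
    public static async Task ExportAsync(string reportName, DateTime day)
    {
        using (var http = new HttpClient())
        {
            // 1. Get an AAD access token (resource owner password flow).
            var tokenJson = await (await http.PostAsync(
                "https://login.microsoftonline.com/{tenant}/oauth2/token",
                new FormUrlEncodedContent(new Dictionary<string, string>
                {
                    ["grant_type"] = "password",
                    ["resource"] = "https://management.azure.com/",
                    ["client_id"] = "{clientId}",
                    ["username"] = "{username}",
                    ["password"] = "{password}"
                }))).Content.ReadAsStringAsync();
            var token = (string)JObject.Parse(tokenJson)["access_token"];

            // 2. Call the APIM report REST API, filtered on the day to export.
            http.DefaultRequestHeaders.Add("Authorization", "Bearer " + token);
            string from = day.ToString("yyyy-MM-dd"), to = day.AddDays(1).ToString("yyyy-MM-dd");
            var report = JObject.Parse(await http.GetStringAsync(
                "https://management.azure.com/subscriptions/{sid}/resourceGroups/{rg}" +
                "/providers/Microsoft.ApiManagement/service/{tid}/reports/" + reportName +
                "?api-version=2017-03-01" +
                $"&$filter=timestamp%20ge%20datetime'{from}T00:00:00'%20and%20timestamp%20le%20datetime'{to}T00:00:00'"));

            // 3. Map every row (JToken) to a DynamicTableEntity and persist the batch.
            //    Note: a table batch holds at most 100 entities, so real code must chunk.
            var table = CloudStorageAccount.Parse("{storageConnectionString}")
                .CreateCloudTableClient().GetTableReference(reportName.ToLowerInvariant());
            await table.CreateIfNotExistsAsync();
            var batch = new TableBatchOperation();
            foreach (JObject row in report["value"])
            {
                var entity = new DynamicTableEntity(from, Guid.NewGuid().ToString());
                foreach (var prop in row.Properties())
                    entity.Properties[prop.Name] = new EntityProperty(prop.Value.ToString());
                batch.Add(TableOperation.InsertOrReplace(entity));
            }
            await table.ExecuteBatchAsync(batch);
        }
    }
}
```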

Because I am using a second function to extract and load (to Azure Storage) additional APIM tables (e.g. apis, products, users, etc.), I found this article on reusing code in different Azure Functions very useful.

I created a logic app to trigger the functions multiple times, one per report to be exported. The code can support any new aggregation or additional fields added in the future without any modification.

Power BI

Using Power BI Desktop I put together some visualizations and pushed them to the Power BI service. The report dataset is synced with the Azure tables once per day, which is configurable. Below you can see screens from my mobile phone (left) and the desktop experience (right).


Even though the same result can be achieved using other Azure services like WebJobs or Data Factory, Azure Functions provides multiple benefits: a simple programming model, the abstraction of servers, and the possibility to use a simple editor to build, test and monitor the code without leaving your browser. That's a perfect fit for quick development cycles and faster adaptation, isn't it?



Categories: API Management
written by: Massimo Crippa

Posted on Thursday, February 9, 2017 4:00 PM

by Massimo Crippa

In Azure API Management, groups are used to manage the visibility of products to developers, so developers can view and consume the APIs that are contained in the groups to which they belong.

Suppose that we have a custom group for developers affiliated with a specific business partner, and we want to allow those developers (who signed up with different identity providers) to access only the partner's relevant products.

Let's combine Logic Apps, Azure API Management and ARM together to automate the user group association.

In short: no matter which identity provider (AAD, Google, Twitter, etc.) is used to sign up, when the user belongs to the company domain, he or she should be added to the "Codit Dev Team" custom group.

The basic idea here is to use a Logic App as a batch process to get the list of registered users and then call a child Logic App to assign each developer to the proper custom group that manages the product visibility.

Logic Apps and Azure API Management

There are three ways to invoke an API Management endpoint from a Logic App:

  • API Management connector. The connector is pretty straightforward. You first select an APIM tenant, the API and then the operation to be called. Finally, the available headers and parameters are automatically displayed. The APIM connector by default shows only the APIM tenants created in the same subscription where the Logic App was created. 
  • Http + Swagger connector. This connector provides a similar user experience as the APIM connector. The shape of the API with the parameters are automatically integrated in the designer.
  • Http connector. It requires you to specify the HTTP verb, URL, headers and body to perform an HTTP call. Simple as that!

In this exercise, the services to be integrated are located in different Azure subscriptions, therefore I used only the Http and Http+Swagger connectors.

Manage 'em all

With the "Every API should be a managed API" mantra in mind, and with the final goal of having more information about which APIs are called and how they perform, we created a facade API for every HTTP call.

Here is the list of managed APIs:

  • Every call to the Azure Resource Manager (get users, get groups by user, add user to group)
  • Get the token to authorize the ARM call
  • Call the child Logic App

And here are the Logic App workflows that have been created.

Some other benefits we got from the virtualization: 

  • Use of a single authorization model between Logic App and APIM by providing an API Key via the "Ocp-Apim-Subscription-Key" header.
  • Balancing complexity and simplicity. The ARM authentication is delegated to the API Management layer.
  • Apply a consistent resource naming convention. 

Azure API Management Policies

The policy engine is where the core power of Azure API Management lies. Let's go through the policies that have been configured for this exercise. 

Get the bearer token

A token API with a GET operation is used by the ARM facade API to get the bearer token that authorizes the call to the Azure Resource Manager endpoint. The policy associated with the "get-token" operation changes the HTTP request method and sets the body of the request to be sent to the AAD token endpoint, using the password flow.
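
The original policy isn't shown here, but a minimal sketch could look like this; the named values ({{clientId}}, {{armUser}}, {{armPassword}}) are placeholders of my own:

```xml
<inbound>
    <base />
    <!-- turn the incoming GET into a POST towards the AAD token endpoint -->
    <set-method>POST</set-method>
    <set-header name="Content-Type" exists-action="override">
        <value>application/x-www-form-urlencoded</value>
    </set-header>
    <!-- resource owner password flow -->
    <set-body>@("grant_type=password&amp;resource=https://management.azure.com/&amp;client_id={{clientId}}&amp;username={{armUser}}&amp;password={{armPassword}}")</set-body>
</inbound>
```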

Call the ARM

This is the call to the ARM endpoint (get users, get groups by user, add user to group). The "send-request" policy is used to perform a call to the private token API and to store the response in the bearerToken context variable.

The "set-header" policy in combination with a policy expression is used to extract the token and to add it as a header to the request sent to the ARM endpoint.
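
Sketched below; the URL of the private token API is a placeholder:

```xml
<inbound>
    <base />
    <!-- call the private token API and store the whole response in bearerToken -->
    <send-request mode="new" response-variable-name="bearerToken" timeout="20" ignore-error="false">
        <set-url>https://myapim.azure-api.net/internal/token</set-url>
        <set-method>GET</set-method>
    </send-request>
    <!-- extract the access token and add it to the request forwarded to ARM -->
    <set-header name="Authorization" exists-action="override">
        <value>@("Bearer " + (string)((IResponse)context.Variables["bearerToken"]).Body.As<JObject>()["access_token"])</value>
    </set-header>
</inbound>
```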

This policy can be improved by adding policy expressions that store and retrieve the token from the cache. Here is an example.
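
A sketch of that improvement with cache-lookup-value / cache-store-value; the cache key and the duration (just below a typical ~60 minute token lifetime) are arbitrary choices:

```xml
<inbound>
    <base />
    <!-- try the cache first -->
    <cache-lookup-value key="arm-access-token" variable-name="accessToken" />
    <choose>
        <when condition="@(!context.Variables.ContainsKey("accessToken"))">
            <!-- cache miss: fetch a fresh token via the private token API -->
            <send-request mode="new" response-variable-name="bearerToken" timeout="20" ignore-error="false">
                <set-url>https://myapim.azure-api.net/internal/token</set-url>
                <set-method>GET</set-method>
            </send-request>
            <set-variable name="accessToken" value="@((string)((IResponse)context.Variables["bearerToken"]).Body.As<JObject>()["access_token"])" />
            <cache-store-value key="arm-access-token" value="@((string)context.Variables["accessToken"])" duration="3300" />
        </when>
    </choose>
    <set-header name="Authorization" exists-action="override">
        <value>@("Bearer " + (string)context.Variables["accessToken"])</value>
    </set-header>
</inbound>
```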

Logic Apps facade API

The Logic Apps workflows that expose an HTTP trigger can only be called with the POST verb, passing the parameters in the body of the request.

The child workflow that takes care of assigning a user to a specific group has been virtualized via Azure API Management, to change the URL segments to the friendlier /users/{uid}/groups/{groupname} form and to change the request method to PUT.
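
A sketch of the inbound policy on that PUT operation; the body property names are assumptions and must match whatever the child workflow's trigger schema expects:

```xml
<inbound>
    <base />
    <!-- Logic App HTTP triggers accept POST only -->
    <set-method>POST</set-method>
    <!-- move the URL template parameters into the request body -->
    <set-body>@{
        return new JObject(
            new JProperty("uid", context.Request.MatchedParameters["uid"]),
            new JProperty("groupname", context.Request.MatchedParameters["groupname"])
        ).ToString();
    }</set-body>
</inbound>
```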


Thanks to this simple Logic App and some APIM power, I can now be sure that every new colleague who signs up to our developer portal is automatically associated with the internal developer team, so that he/she can get access to a broader set of APIs.

A similar result can be achieved using the Azure B2B/B2C integration in combination with AAD security groups but, at the time of writing, the APIM integration with AAD B2C has not been completed yet.

Another benefit of managed APIs is the visibility you gain into the exposed assets and their performance: discover how an API is used, get information about the consumers, and spot the trends that have the biggest impact on the business.



Categories: API Management
Tags: Azure
written by: Massimo Crippa

Posted on Thursday, December 1, 2016 2:05 PM

by Massimo Crippa

Don’t dump your internal data model on your clients. Work outside-in: design your API with the clients in mind. Build your server-side API once and then tailor the API to different clients (Backend-For-Frontends pattern).

The nature of the mobile experience is often different from the desktop experience: different screen sizes and different functionalities. We normally display less data, and it’s good practice to perform fewer calls to avoid draining the battery.
A common way to accommodate more than one type of device and UI is to add more functionality over time to a single compound API for multiple clients. At the end of the day, this can result in a complex API that is hard to maintain.

The BFF pattern offers a possible solution to this problem: have a dedicated backend API for every type of client. The BFF pattern is growing in popularity, especially its implementation within API management gateways.

In this post, we will see how to leverage the power and the flexibility of the Azure API Management policy engine to reduce the complexity of one of the downstream APIs and therefore make it more suitable for mobile clients.

Excel as data service

On August 3rd, Microsoft announced the general availability of the Microsoft Excel REST API for Office 365. This API opens new opportunities for developers to create new digital experiences using Excel as a backend service.

Carpe diem! Don’t miss the chance: let’s use Excel as if it were one of the downstream services that power my brand new mobile application. To use Excel as a data service, I first created a new Excel file in my Office 365 drive and added a table inside the worksheet to define the area where the data will be stored.

To write a single line to the Excel workbook we must:

  • refer to the workbook (specifying the user id and the workbook id)
  • create a session in order to get the workbook-session-id value.
  • post/get the data, adding the “workbook-session-id” as an HTTP header.

And what about the structure of the data to be sent? What about the response? The picture below shows the request/response example to GET the rows from the Excel table.
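
In case the picture doesn't render here, the exchange looks roughly like this; the identifiers are placeholders, the response shape follows the Excel REST API documentation, and my three table columns are assumptions:

```http
GET /v1.0/users/{userId}/drive/items/{workbookId}/workbook/tables/{tableName}/rows
Host: graph.microsoft.com
Authorization: Bearer {token}
workbook-session-id: {sessionId}

HTTP/1.1 200 OK
Content-Type: application/json

{
  "value": [
    { "index": 0, "values": [["Massimo", "2016-11-15T10:31:00Z", "Hello!"]] }
  ]
}
```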

BFF (aka “experience APIs”)

The goal of this exercise is to create an API dedicated to the mobile experience: remove the complexity in the URL and HTTP headers, offer simpler inbound/outbound data contracts and hide the details of the downstream service.

Here is where API Management comes into the picture, allowing the API publisher to change the behavior of the API through configuration policies, so that developers can iterate quickly on the client apps and innovation can happen at a faster pace.

An API has been added to the APIM gateway and three operations have been configured: init (to create a session), send message (save a message to the Excel workbook) and get messages (list all the sent messages).

Simplify the URL

The first step is to create the BFF mobile API, then add the rewrite-uri policy to expose a simpler URI on the gateway.
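
For instance (the path values are placeholders stored as named values):

```xml
<!-- expose a short gateway URL and map it to the verbose Excel REST API path -->
<rewrite-uri template="/users/{{userId}}/drive/items/{{workbookId}}/workbook/tables/{{tableName}}/rows" copy-unmatched-params="false" />
```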

Remove HTTP header complexity

In this step we want to avoid injecting the "workbook-session-id" header all of the time. The main idea is to create an init operation that calls "createSession" on the Excel REST API, reads the "id" value from the response and stores it as the workbook-session-id in the gateway cache.

To achieve that, let's use a combination of policies associated with the init operation (a sketch follows the list).

  • set-body to specify that the data needs to be persisted to the Excel workbook
  • set-variable to read the "id" from the response and store it in the workbook-session-id variable
  • cache-store-value to store the workbook-session-id in the cache, using the JWT token as the cache key
  • set-body to return a custom 200 response

On the outbound, in case of a valid response, the session identifier is read via context.Response.Body.
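
Putting the pieces together, a sketch of the init policy; the cache duration and the custom response body are my own choices:

```xml
<inbound>
    <base />
    <!-- ask Excel to persist the changes made within this session -->
    <set-body>{ "persistChanges": true }</set-body>
</inbound>
<outbound>
    <base />
    <!-- read the session id from the createSession response -->
    <set-variable name="workbook-session-id" value="@((string)context.Response.Body.As<JObject>(preserveContent: true)["id"])" />
    <!-- cache it per caller, keyed on the JWT -->
    <cache-store-value key="@(context.Request.Headers.GetValueOrDefault("Authorization",""))" value="@((string)context.Variables["workbook-session-id"])" duration="300" />
    <!-- return a lean custom 200 response to the mobile client -->
    <set-body>{ "result": "session initialized" }</set-body>
</outbound>
```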

The policy associated with the get messages operation retrieves the workbook-session-id from the cache, adds it to the session header and forwards the request to the downstream service.
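
Sketched:

```xml
<inbound>
    <base />
    <!-- fetch the cached session id for this caller -->
    <cache-lookup-value key="@(context.Request.Headers.GetValueOrDefault("Authorization",""))" variable-name="workbook-session-id" />
    <!-- forward it to the Excel REST API -->
    <set-header name="workbook-session-id" exists-action="override">
        <value>@((string)context.Variables["workbook-session-id"])</value>
    </set-header>
</inbound>
```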

Simplify the data contract (message transformation)

The goal of this step is to have a data contract tailored to the client: simpler and more compact in size.

The contract to send a message has been reduced to the minimum: a string. In the inbound policy the message is enriched with the name of the sender (taken from the JWT token) and a timestamp. The set-body policy is used to create the JSON object to be forwarded to the underlying API.

On the outbound channel, the result set of the get messages operation is filtered to reduce the data transferred over the wire and mapped to a simpler JSON structure (see the sketch below).
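
A sketch of both transformations; the table columns (sender, timestamp, text) and the JWT claim name are assumptions:

```xml
<inbound>
    <base />
    <!-- enrich the plain-string message with the sender (from the JWT) and a timestamp -->
    <set-body>@{
        var message = context.Request.Body.As<string>(preserveContent: true);
        var jwt = context.Request.Headers.GetValueOrDefault("Authorization","").Replace("Bearer ","").AsJwt();
        var sender = (jwt != null && jwt.Claims.ContainsKey("name")) ? jwt.Claims["name"][0] : "unknown";
        return new JObject(
            new JProperty("values", new JArray(new JArray(sender, DateTime.UtcNow.ToString("o"), message)))
        ).ToString();
    }</set-body>
</inbound>
<outbound>
    <base />
    <!-- flatten the Excel rows into a compact message list -->
    <set-body>@{
        var messages = new JArray();
        foreach (var row in context.Response.Body.As<JObject>()["value"])
        {
            var v = row["values"][0];
            messages.Add(new JObject(
                new JProperty("sender", (string)v[0]),
                new JProperty("sent", (string)v[1]),
                new JProperty("text", (string)v[2])));
        }
        return messages.ToString();
    }</set-body>
</outbound>
```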

Hide the backend details

As a final step, some HTTP headers are deleted (with a product-scope policy) to hide the details of the downstream service.
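
For example (the exact header names depend on the downstream service, so these are illustrative):

```xml
<outbound>
    <base />
    <!-- strip headers that leak backend implementation details -->
    <set-header name="x-ms-request-id" exists-action="delete" />
    <set-header name="OData-Version" exists-action="delete" />
</outbound>
```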

In Action


The BFF supports transformative design and moves the underlying system into a better, less-coupled state, giving the dev teams the autonomy to iterate quickly on the client apps and deliver new digital experiences faster.

The tight coupling between the client and the API is therefore moved to the API Management layer, where we can benefit from capabilities like aggregation, transformation and the possibility to change the behavior of the API through configuration.



Categories: API Management
written by: Massimo Crippa

Posted on Friday, April 13, 2018 12:25 PM

by Tom Kerkhove

Azure API Management released a new version that changes the OpenAPI interpretation. This article dives into the potential impact on the consumer experience of your APIs.

Providing clean and well-documented APIs is a must. This allows your consumers to know what capabilities you provide, what they are for and what to expect.

This is where the OpenAPI specification, aka Swagger, comes in: it defines how APIs should be described across the industry, regardless of the technology underneath.

Recently, the Azure API Management team started releasing a new version of the product with some new features and some important changes in how they interpret the OpenAPI specification while importing/exporting them.

Before we dive into the changes to the OpenAPI interpretation, I'd like to highlight that they've also added the capability to display the id of a specific operation. In the past you still had to use the old Publisher portal for this, but now you can find it via API > Operation > Frontend.

Next to that, as of last Sunday, the old Publisher portal should be fully gone now, except for the analytics part.

OpenAPI Interpretation

The latest version also changes the way OpenAPI specifications are interpreted: they are now fully based on the operation as defined by the OpenAPI spec.

Here are the changes in a nutshell:

  • Id of the operation - Operation id is based on operation.operationId; otherwise one is generated, similar to get-foo
  • Name of the operation - Display name is based on operation.summary, otherwise it will use operation.operationId. If that is not specified, it will generate a name similar to Get - /foo
  • Description of the operation - Description is based on operation.description

I like this change because it makes sense. However, it can be a breaking change for your API documentation, depending on how you achieved it in the past.

The reason for this is that, before this change was rolled out, the interpretation was different:

  • Id of the operation was a generated id
  • Name of the operation was based on operation.operationId
  • Description of the operation was based on operation.description, falling back on operation.summary

How I did it in the past

For all the projects I work on I use Swashbuckle, because it's very easy to set up and use, and it ties into the standard XML documentation.

Here is an example of the documentation I provide for the health endpoint of Sello, which I use for demos.
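
The original snippet isn't embedded in this text, but with Swashbuckle it would have been along these lines (route and messages are illustrative):

```csharp
using System.Net;
using System.Web.Http;
using Swashbuckle.Swagger.Annotations;

public class HealthController : ApiController
{
    /// <summary>Gets the current health status of the API</summary>
    /// <response code="200">API is healthy</response>
    /// <response code="503">API is unhealthy or in degraded state</response>
    [HttpGet]
    [Route("health")]
    [SwaggerResponse(HttpStatusCode.OK, "API is healthy")]
    [SwaggerResponse(HttpStatusCode.ServiceUnavailable, "API is unhealthy or in degraded state")]
    public IHttpActionResult Get()
    {
        // plain OK for demo purposes; a real check would probe dependencies
        return Ok();
    }
}
```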

As you can see, everything is right there: via the operation I specify what the operation is called and give a brief summary of what it does and what my consumers can expect as responses.

The OpenAPI specification that is generated will look like this:
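
Roughly like this (a hand-written fragment; by default Swashbuckle generates the operationId from the controller and action names):

```json
{
  "paths": {
    "/health": {
      "get": {
        "tags": [ "Health" ],
        "operationId": "Health_Get",
        "summary": "Gets the current health status of the API",
        "responses": {
          "200": { "description": "API is healthy" },
          "503": { "description": "API is unhealthy or in degraded state" }
        }
      }
    }
  }
}
```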

Once this was imported into Azure API Management, the developer experience was similar to this:

However, this approach is no longer what I'd like to offer to my consumers because if you import it after the new version it looks like this:

How I'm doing it today

Aligning with the latest interpretation was fairly easy, to be honest: instead of providing a description of what the operation does via summary, I started using remarks instead.

Next to that, I'm now using summary to give the operation a friendly name, and I assigned a better operationId via SwaggerOperation.

This is how it looks in code:
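
A sketch of the updated action; the friendly name and operation id are illustrative:

```csharp
using System.Net;
using System.Web.Http;
using Swashbuckle.Swagger.Annotations;

public class HealthController : ApiController
{
    /// <summary>Get Health</summary>
    /// <remarks>Provides an indication about the health of the API</remarks>
    [HttpGet]
    [Route("health")]
    [SwaggerOperation("Health_Get")]
    [SwaggerResponse(HttpStatusCode.OK, "API is healthy")]
    [SwaggerResponse(HttpStatusCode.ServiceUnavailable, "API is unhealthy or in degraded state")]
    public IHttpActionResult Get()
    {
        return Ok();
    }
}
```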

The new OpenAPI specification is compatible with the recent changes and will look like this:
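
Again a hand-written fragment, showing where each attribute ends up:

```json
{
  "paths": {
    "/health": {
      "get": {
        "tags": [ "Health" ],
        "operationId": "Health_Get",
        "summary": "Get Health",
        "description": "Provides an indication about the health of the API",
        "responses": {
          "200": { "description": "API is healthy" },
          "503": { "description": "API is unhealthy or in degraded state" }
        }
      }
    }
  }
}
```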

Once this is imported the developer experience is maintained and looks similar to this:

When you go to the details of the new operation in the Azure portal, you will see that all our information is successfully imported:


Azure API Management rolled out a change to the OpenAPI interpretation to provide more flexibility so you can define the operation id to use and align with the general specification.

This change is great, but it might have an impact on your current API documentation, similar to what I've experienced. With the above changes, you are good to go and your consumers will not even notice it.

Thanks for reading,


Categories: API Management, Azure
written by: Tom Kerkhove

Posted on Tuesday, January 16, 2018 2:55 PM

by Massimo Crippa

Azure API Management Versions and Revisions went GA. It's time to choose a version scheme and migrate a flat list of APIs into a versionset.

On January 11 the Azure API Management Versions and Revisions feature went GA. Thanks to this feature it’s now easier to manage the life cycle of your APIs and change + test your gateway configuration without runtime impact.

For more details, check the announcement via this link.

Before the introduction of this feature, it wasn’t possible to explicitly define the version strategy for an API. Therefore, every new version was approached as a new API. From the user interface point of view, that approach resulted in a single flat list of assets. If we now use a “versionset”, we can specify how the API versions are managed (path, query, header) and group them together.

In this blog post we will see how to migrate from the flat structure (image below) to the grouped view using the ARM REST API. 

All the requests to the ARM REST API must contain the Authorization header with a bearer token to secure the request. The target base path is the following: https://management.azure.com/subscriptions/{sid}/resourceGroups/{rg}/providers/Microsoft.ApiManagement/service/{tid}/


The procedure is pretty straightforward: 

  • Create a versionset
  • Update the API to join the new versionset
  • Repeat the update procedure for each API

Create a versionset 

First, create a versionset to group together the different versions of an API. The versionset defines the API name and the version scheme that will be applied to all the versions that belong to the versionset. In this example I chose the "path" version scheme. A sketch of the request follows the list below.

  • The HTTP method to create a versionset is PUT 
  • The operation's path is: api-version-sets/{versionSetId}?api-version={apiVersionId}
  • The displayName value is the API name that will be rendered in the publisher portal, developer portal and in the swagger file produced by the gateway.
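
A sketch of the request; the versionset id and display name are examples (the "path" scheme maps to the "Segment" versioning scheme in the ARM contract):

```http
PUT https://management.azure.com/subscriptions/{sid}/resourceGroups/{rg}/providers/Microsoft.ApiManagement/service/{tid}/api-version-sets/b2b-apis?api-version=2017-03-01
Authorization: Bearer {token}
Content-Type: application/json

{
  "properties": {
    "displayName": "B2B",
    "versioningScheme": "Segment"
  }
}
```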

If the call succeeds, you get a 201 Created with the JSON representation of the created resource. Please note that the versionset will be displayed in the Azure Portal only when the first API is added to it.

Link the API to the VersionSet

Let's modify the B2B APIs to join the versionset created with the previous call. To achieve that, we need to update the APIs and add two new fields:

  • The "apiVersion" field with the API version value (e.g. v1)
  • The "apiVersionSetId" field with the pointer to the versionset we created with the previous step.

Because the API version number will be added by the API gateway, it's necessary to update the "path" field to remove the version from the base path. The image below compares the JSON representation of the API with the changes to be patched.

  • The HTTP method is PATCH 
  • The operation's path is: /apis/{apiId}?api-version={apiVersionId} 
  • This is a partial update, so PATCH is the method to be used. Do not use the PUT method or you will lose all the API's operations. A sketch of the request follows.
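
The api id below is an example, and the apiVersionSetId is the (shortened) resource id of the versionset created above:

```http
PATCH https://management.azure.com/subscriptions/{sid}/resourceGroups/{rg}/providers/Microsoft.ApiManagement/service/{tid}/apis/b2b-v1?api-version=2017-03-01
Authorization: Bearer {token}
Content-Type: application/json

{
  "properties": {
    "apiVersion": "v1",
    "apiVersionSetId": "/api-version-sets/b2b-apis",
    "path": "b2b"
  }
}
```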


The HTTP 204 No Content success status code indicates that the request has succeeded. Just refresh the Azure Portal to see the B2B API 1.0 added to the B2B versionset.


Perform the PATCH call for the second B2B API and repeat the same procedure for the B2C APIs to get to the final result.



With the latest service update it's also possible to add one or more release notes to an API version. All those change logs are shown on the developer portal in the "releaseChanges" page (docs/services/{apiId}/releaseChanges).

Using the Azure portal it's only possible to create a change log entry when marking a revision as “current”, so let's use the REST API to load the change log.

  • The HTTP method is PUT
  • The operation's path is: /apis/{apiId}/releases/{releaseId}?api-version={apiVersionId} 
  • The release identifier must be specified in the request body ("id" field)
  • The link to the API to which the release belongs should be added in the properties ("apiId" field); a sketch of the request follows
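
The release id and notes below are examples:

```http
PUT https://management.azure.com/subscriptions/{sid}/resourceGroups/{rg}/providers/Microsoft.ApiManagement/service/{tid}/apis/b2b-v1/releases/rel-2018-01?api-version=2017-03-01
Authorization: Bearer {token}
Content-Type: application/json

{
  "id": "rel-2018-01",
  "properties": {
    "apiId": "/apis/b2b-v1",
    "notes": "Initial release of the B2B API v1"
  }
}
```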

As a result you get a 201 Created and the release note is displayed in the developer portal.


Versions and revisions is one of Azure API Management’s most awaited features. Thanks to the REST API you can quickly migrate your current configuration and get the most out of it.

Thanks for reading and happy API Management everyone!



Categories: API Management, Azure
written by: Massimo Crippa