
Codit Blog

Posted on Friday, May 5, 2017 4:56 PM

by Massimo Crippa

The Analytics module in Azure API Management provides insights into the health and usage levels of your APIs, helping to identify key trends that impact the business. Analytics also provides a number of filtering and sorting options to better understand who is using what. But what if I want more? For example, how about drill-down reports or mobile access?

I am a big fan of Power BI, so let's combine the power of Azure Functions with the simplicity of the APIM REST APIs to flow the analytics data into Power BI.

The picture below displays my scenario: an Azure function connects and combines APIs from different Azure services (AAD, APIM, Storage) to create a smooth and lightweight integration.

It's a serverless architecture, which means we don't have to worry about the infrastructure and can focus on the business logic, enabling rapid iterations and a faster time to market.

The APIM analytics (aggregated data) can be read by calling the report REST API. This information can then be written to Azure Tables and automatically synchronized with Power BI. 


Function

The Azure function:

  1. Is triggered via HTTP POST. It accepts a body parameter with the report name (byApi, byGeo, byOperation, byProduct, bySubscription, byUser) and the day to export.

  2. Calls the AAD token endpoint using the resource owner password flow to get the access token to authorize the ARM call.

  3. Calls the APIM REST API (https://management.azure.com/subscriptions/9124e1d1-c144-1ec2-7cb2-bef226961d93/resourceGroups/rg-apim/providers/Microsoft.ApiManagement/service/apim-codit-dev/reports/bySubscription?api-version=2016-07-07&$filter=timestamp%20ge%20datetime'2017-02-15T00:00:00'%20and%20timestamp%20le%20datetime'2017-02-16T00:00:00').

  4. Iterates through the JTokens in the response body to build an IEnumerable<DynamicTableEntity> collection that is passed to CloudTable.ExecuteBatch to persist the data in Azure Table storage (a sketch follows below).
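
The core of the function looks roughly like the sketch below. This is a minimal sketch, not the original code: the entity conversion and batch write follow the steps above, while the subscription, storage and AAD details are illustrative placeholders.

```csharp
// A minimal sketch, not the original code. Assumes the WindowsAzure.Storage and
// Newtonsoft.Json packages that were available to Azure Functions at the time.
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;
using Newtonsoft.Json.Linq;

public static class ApimReportExporter
{
    // In the real function these would come from app settings.
    const string ArmBaseUrl = "https://management.azure.com/subscriptions/<sub-id>" +
        "/resourceGroups/rg-apim/providers/Microsoft.ApiManagement/service/apim-codit-dev";
    const string StorageConnectionString = "<storage-connection-string>";

    public static async Task ExportAsync(string reportName, DateTime day)
    {
        // Step 2: get a bearer token to authorize the ARM call.
        string token = await GetAadTokenAsync();

        // Step 3: call the APIM reports REST API for the requested day.
        string filter = $"timestamp ge datetime'{day:yyyy-MM-dd}T00:00:00' and " +
                        $"timestamp le datetime'{day.AddDays(1):yyyy-MM-dd}T00:00:00'";
        string url = $"{ArmBaseUrl}/reports/{reportName}" +
                     $"?api-version=2016-07-07&$filter={Uri.EscapeDataString(filter)}";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", token);
            var report = JObject.Parse(await client.GetStringAsync(url));

            // Step 4: one DynamicTableEntity per report row. Every JSON property
            // becomes a table column, so new aggregations or fields need no code change.
            var batch = new TableBatchOperation();
            foreach (JObject row in report["value"])
            {
                var entity = new DynamicTableEntity(reportName, Guid.NewGuid().ToString("N"));
                foreach (JProperty prop in row.Properties())
                    entity.Properties[prop.Name] =
                        EntityProperty.CreateEntityPropertyFromObject(((JValue)prop.Value).Value);
                batch.InsertOrReplace(entity);
            }

            var table = CloudStorageAccount.Parse(StorageConnectionString)
                                           .CreateCloudTableClient()
                                           .GetTableReference("apimreports");
            await table.CreateIfNotExistsAsync();
            await table.ExecuteBatchAsync(batch); // batches cap at 100 entities; chunking omitted
        }
    }

    // Step 2 in detail: AAD resource owner password flow.
    static async Task<string> GetAadTokenAsync()
    {
        using (var client = new HttpClient())
        {
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["grant_type"] = "password",
                ["resource"] = "https://management.azure.com/",
                ["client_id"] = "<client-id>",
                ["username"] = "<service-account>",
                ["password"] = "<password>"
            });
            var response = await client.PostAsync(
                "https://login.microsoftonline.com/<tenant-id>/oauth2/token", form);
            var json = JObject.Parse(await response.Content.ReadAsStringAsync());
            return (string)json["access_token"];
        }
    }
}
```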

Because I am using a second function to extract additional APIM entities (e.g. apis, products, users, etc.) and load them to Azure storage, I found this article on reusing code across different Azure Functions very useful.

I created a logic app to trigger the functions multiple times, one per report to be exported. The code can support any new aggregation or additional fields added in the future without any modification.

Power BI

Using Power BI Desktop I put together some visualizations and pushed them to the Power BI service. The report dataset is synced with the Azure tables once per day (the schedule is configurable). Below you can see screenshots from my mobile phone (left) and the desktop experience (right).

Conclusion

Even though the same result can be achieved using other Azure services like WebJobs or Data Factory, Azure Functions provides multiple benefits: a simple programming model, the abstraction of servers, and the possibility to build, test and monitor the code in a simple editor without leaving your browser. That's a perfect fit for quick development cycles and faster adaptation that gains business advantages, isn't it?

Cheers,

Massimo

Categories: API Management
written by: Massimo Crippa

Posted on Thursday, February 9, 2017 4:00 PM

by Massimo Crippa

In Azure API Management, groups are used to manage the visibility of products to developers: developers can view and consume the APIs contained in the products visible to the groups they belong to.

Suppose that we have a custom group for developers affiliated with a specific business partner and we want to allow those developers (who signed up with different identity providers) to access only the partner's relevant products.

Let's combine Logic Apps, Azure API Management and ARM together to automate the user group association.

In short: no matter which identity provider (AAD, Google, Twitter, etc.) is used to sign up, when a user belongs to the @codit.eu domain they should be added to the "Codit Dev Team" custom group.

The basic idea here is to use a Logic App as a batch process to get the list of registered users and then call a child Logic App to assign each developer to the proper custom group that manages the product visibility.

Logic Apps and Azure API Management

There are three ways to invoke an API Management endpoint from a Logic App:

  • API Management connector. The connector is pretty straightforward. You first select an APIM tenant, the API and then the operation to be called. Finally, the available headers and parameters are automatically displayed. The APIM connector by default shows only the APIM tenants created in the same subscription where the Logic App was created. 
  • Http + Swagger connector. This connector provides a similar user experience as the APIM connector. The shape of the API with the parameters are automatically integrated in the designer.
  • Http connector. It requires you to specify the HTTP verb, URL, headers and body to perform an HTTP call. Simple as that!

In this exercise, all the services to be integrated are located in different Azure subscriptions, therefore I used only the Http and Http + Swagger connectors.

Manage 'em all

With the "Every API should be a managed API" mantra in mind, and with the final goal of having more information about which API is called and how it performs, we created a facade API for every HTTP call.

Here is the list of managed APIs:

  • Every call to the Azure Resource Manager (get users, get groups by user, add user to group)
  • Get the token to authorize the ARM call
  • Call the child Logic App

And here are the Logic App workflows that were created.

Some other benefits we got from the virtualization: 

  • Use of a single authorization model between Logic App and APIM by providing an API Key via the "Ocp-Apim-Subscription-Key" header.
  • Balancing complexity and simplicity. The ARM authentication is delegated to the API Management layer.
  • Apply a consistent resource naming convention. 

Azure API Management Policies

The policy engine is where the core power of Azure API Management lies. Let's go through the policies that have been configured for this exercise. 

Get the bearer token

A token API with a GET operation is used by the ARM facade API to get the bearer token to authorize the call to the Azure Resource Manager endpoint. The policy associated with the "get-token" operation changes the HTTP request method and sets the body of the request to be sent to the AAD token endpoint using the password flow.
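
A minimal sketch of such a policy, assuming the operation's backend points at the AAD token endpoint; the credentials below are placeholders stored as APIM properties, not the original values:

```xml
<inbound>
    <base />
    <!-- the facade operation is exposed as GET, but AAD's token endpoint wants a POST -->
    <set-method>POST</set-method>
    <set-header name="Content-Type" exists-action="override">
        <value>application/x-www-form-urlencoded</value>
    </set-header>
    <!-- resource owner password flow -->
    <set-body>grant_type=password&amp;resource=https://management.azure.com/&amp;client_id={{client-id}}&amp;username={{arm-user}}&amp;password={{arm-password}}</set-body>
</inbound>
```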

Call the ARM

This is the call to the ARM endpoint (get users, get groups by user, add user to group). The "send-request" policy is used to perform a call to the private token API and to store the response in the bearerToken context variable.

The "set-header" policy, in combination with a policy expression, is used to extract the token and add it as a header to the request sent to the ARM endpoint.
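
Put together, the two policies follow the pattern documented for the send-request policy (the token API URL below is illustrative):

```xml
<inbound>
    <base />
    <!-- call the internal token API and keep the whole response in a context variable -->
    <send-request mode="new" response-variable-name="bearerToken" timeout="20" ignore-error="false">
        <set-url>https://apim-codit-dev.azure-api.net/internal/token</set-url>
        <set-method>GET</set-method>
    </send-request>
    <!-- extract the access_token and authorize the call to the ARM endpoint -->
    <set-header name="Authorization" exists-action="override">
        <value>@("Bearer " + (String)((IResponse)context.Variables["bearerToken"]).Body.As<JObject>()["access_token"])</value>
    </set-header>
</inbound>
```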

This policy can be improved with policy expressions that store and retrieve the token from the cache. Here is an example.
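
A sketch of the cached variant: look the token up first, and fetch and store it only on a cache miss. The cache duration is an assumption, chosen to stay below the typical AAD token lifetime:

```xml
<inbound>
    <base />
    <cache-lookup-value key="arm-bearer-token" variable-name="token" />
    <choose>
        <!-- cache miss: fetch a fresh token and store it -->
        <when condition="@(!context.Variables.ContainsKey("token"))">
            <send-request mode="new" response-variable-name="tokenResponse" timeout="20" ignore-error="false">
                <set-url>https://apim-codit-dev.azure-api.net/internal/token</set-url>
                <set-method>GET</set-method>
            </send-request>
            <set-variable name="token" value="@((String)((IResponse)context.Variables["tokenResponse"]).Body.As<JObject>()["access_token"])" />
            <cache-store-value key="arm-bearer-token" value="@((String)context.Variables["token"])" duration="3000" />
        </when>
    </choose>
    <set-header name="Authorization" exists-action="override">
        <value>@("Bearer " + (String)context.Variables["token"])</value>
    </set-header>
</inbound>
```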

Logic Apps facade API

Logic Apps workflows that expose an HTTP trigger can only be called using the POST verb, passing the parameters in the body of the request.

The child workflow that takes care of assigning a user to a specific group has been virtualized via Azure API Management to expose cleaner URL segments, such as https://api.codit.eu/managedarm/users/{uid}/groups/{groupname}, and to change the request method to PUT.
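
A sketch of that facade policy; the callback-URL property and the body contract are assumptions about the child workflow:

```xml
<inbound>
    <base />
    <!-- the client calls PUT /users/{uid}/groups/{groupname}; the HTTP trigger only accepts POST -->
    <set-method>POST</set-method>
    <!-- move the URL template parameters into the body the child workflow expects -->
    <set-body>@{
        return new JObject(
            new JProperty("uid", context.Request.MatchedParameters["uid"]),
            new JProperty("groupname", context.Request.MatchedParameters["groupname"])
        ).ToString();
    }</set-body>
    <!-- the Logic App callback URL is kept in a property; in practice its signed
         query parameters may also need to be appended via a rewrite-uri policy -->
    <set-backend-service base-url="{{child-logicapp-callback-url}}" />
</inbound>
```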

Conclusion

Thanks to this simple Logic App and some APIM power, I can now be sure that every new colleague who signs up to our developer portal is automatically associated with the internal developer team, so that they can get access to a broader set of APIs.

A similar result can be achieved using the Azure B2B/B2C integration in combination with the AAD security groups but, at the time of writing, the APIM integration with AAD B2C has not been completed yet.

Another benefit of managed APIs is increased visibility into the exposed assets and their performance: discover how an API is used, learn about its consumers, and spot the trends that most impact the business.

Cheers

Massimo

Categories: API Management
Tags: Azure
written by: Massimo Crippa

Posted on Thursday, December 1, 2016 2:05 PM

by Massimo Crippa

Don’t dump your internal data model on your clients. Work outside-in: design your API with the clients in mind. Build your server-side API once and then tailor it to different clients (the Backends-for-Frontends pattern).

The nature of the mobile experience is often different from the desktop experience: different screen sizes and different functionalities. We normally display less data, and it's good practice to perform fewer calls to avoid killing the battery.
A common way to accommodate more than one type of device and UI is to add more functionality over time to a single compound API for multiple clients. At the end of the day this can result in a complex API that is hard to maintain.

The BFF pattern offers a possible solution to this problem: having a dedicated backend API for every type of client. The pattern is growing in popularity, especially its implementation within API management gateways.

In this post, we will see how to leverage the power and flexibility of the Azure API Management policy engine to reduce the complexity of one of the downstream APIs and therefore make it more suitable for mobile clients.

Excel as data service

On August 3rd, Microsoft announced the general availability of the Microsoft Excel REST API for Office 365. This API opens new opportunities for developers to create new digital experiences using Excel as a backend service.

Carpe diem! Let's not miss the chance, and use Excel as if it were one of the downstream services that power my brand new mobile application. To use Excel as a data service, I first created a new Excel file in my Office 365 drive and then created a table inside the worksheet to define the area where the data will be stored.

To write a single line to the Excel workbook we must:

  • refer to the workbook (specifying the user id and the workbook id)
  • create a session in order to get the workbook-session-id value
  • post/get the data, adding the "workbook-session-id" HTTP header (see the sketch below)
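
Stripped down, the interaction looks roughly like this sketch against the Microsoft Graph endpoints; the user id, workbook id and the "Messages" table name are placeholders:

```http
POST https://graph.microsoft.com/v1.0/users/{user-id}/drive/items/{workbook-id}/workbook/createSession
Authorization: Bearer {aad-token}
Content-Type: application/json

{ "persistChanges": true }

HTTP/1.1 201 Created
{ "id": "{workbook-session-id}", "persistChanges": true }

POST https://graph.microsoft.com/v1.0/users/{user-id}/drive/items/{workbook-id}/workbook/tables/Messages/rows/add
Authorization: Bearer {aad-token}
workbook-session-id: {workbook-session-id}
Content-Type: application/json

{ "values": [["Massimo", "Hello world!", "2016-12-01T14:05:00"]] }
```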

And what about the structure of the data to be sent? What about the response? The picture below shows a request/response example to GET the rows from the Excel table.

BFF (aka “experience APIs”)

The goal of this exercise is to create an API dedicated to the mobile experience: remove the complexity from the URL and HTTP headers, have simpler inbound/outbound data contracts, and hide the details of the downstream service.

Here is where API Management comes into the picture, allowing the API publisher to change the behavior of the API through configuration policies, so that developers can iterate quickly on the client apps and innovation can happen at a faster pace.

An API has been added to the APIM gateway and three operations have been configured: init (to create a session), send message (save a message to the Excel workbook) and get messages (list all the sent messages).

Simplify the URL

The first step is to create the BFF mobile API, then add the rewrite-uri policy to expose a simpler URI on the gateway.
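
For example, a clean gateway URL can be mapped to the much longer Graph path; the ids below are named values used as placeholders:

```xml
<inbound>
    <base />
    <!-- the client calls a short gateway URL; the downstream wants the full workbook path -->
    <rewrite-uri template="/users/{{excel-user-id}}/drive/items/{{workbook-id}}/workbook/tables/Messages/rows" copy-unmatched-params="false" />
</inbound>
```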

Remove HTTP header complexity

In this step we want to avoid injecting the "workbook-session-id" header all of the time. The main idea is to create an init operation that calls "createSession" on the Excel REST API, reads the "id" value from the response and stores it as the workbook-session-id in the gateway cache.

To achieve that, let's use a combination of policies associated with the init operation:

  • set-body to specify that the data needs to be persisted to the Excel workbook
  • set-variable to read the "id" from the response and store it in the workbook-session-id variable
  • cache-store-value to store the workbook-session-id in the cache, using the JWT token as the cache key
  • set-body to return a custom 200 response

On the outbound channel, in case of a valid response, the session identifier is read from context.Response.Body.
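
A sketch of the init operation's policy pair, combining the policies above; the Graph path, the cache key built from the Authorization header (carrying the JWT), and the duration are assumptions:

```xml
<inbound>
    <base />
    <rewrite-uri template="/users/{{excel-user-id}}/drive/items/{{workbook-id}}/workbook/createSession" />
    <!-- ask Excel to persist the changes made within this session -->
    <set-body>{ "persistChanges": true }</set-body>
</inbound>
<outbound>
    <base />
    <choose>
        <when condition="@(context.Response.StatusCode == 201)">
            <!-- read the session identifier from the createSession response -->
            <set-variable name="sessionId" value="@((string)context.Response.Body.As<JObject>(preserveContent: true)["id"])" />
            <!-- one cached session per caller, keyed on the caller's JWT -->
            <cache-store-value key="@("excel-session-" + context.Request.Headers.GetValueOrDefault("Authorization",""))" value="@((string)context.Variables["sessionId"])" duration="300" />
            <!-- hide the Excel payload and return a custom response to the mobile client -->
            <set-body>{ "result": "session created" }</set-body>
        </when>
    </choose>
</outbound>
```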

The policy associated with the get messages operation retrieves the workbook-session-id from the cache, adds it to the session header and forwards the request to the downstream service.
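
A sketch of that retrieval, under the same assumptions as above:

```xml
<inbound>
    <base />
    <!-- reuse the session created by the init operation -->
    <cache-lookup-value key="@("excel-session-" + context.Request.Headers.GetValueOrDefault("Authorization",""))" variable-name="sessionId" />
    <set-header name="workbook-session-id" exists-action="override">
        <value>@((string)context.Variables["sessionId"])</value>
    </set-header>
    <rewrite-uri template="/users/{{excel-user-id}}/drive/items/{{workbook-id}}/workbook/tables/Messages/rows" />
</inbound>
```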

Simplify the data contract (message transformation)

The goal of this step is to have a data contract tailored to the client: simpler and more compact in terms of size.

The contract to send a message has been reduced to the minimum: a string. In the inbound policy the message is enriched with the name of the sender (taken from the JWT token) and a timestamp. The set-body policy is used to create the JSON object to be forwarded to the underlying API.
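
A sketch of the inbound enrichment; the "name" claim and the three-column layout of the Excel table are assumptions:

```xml
<inbound>
    <base />
    <set-body>@{
        // sender name from the caller's JWT, plus a server-side timestamp
        var jwt = context.Request.Headers.GetValueOrDefault("Authorization","").Split(' ').Last().AsJwt();
        var sender = jwt?.Claims.GetValueOrDefault("name", "unknown");
        var text = context.Request.Body.As<string>();
        return new JObject(
            new JProperty("values", new JArray(new JArray(sender, text, DateTime.UtcNow.ToString("s"))))
        ).ToString();
    }</set-body>
</inbound>
```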

On the outbound channel, the result set of the get messages operation is filtered to reduce the data transferred over the wire and mapped to a simpler JSON structure.
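
And a sketch of the outbound mapping from the Excel rows payload to a compact list of messages, under the same column-layout assumption:

```xml
<outbound>
    <base />
    <set-body>@{
        var messages = new JArray();
        foreach (var row in context.Response.Body.As<JObject>()["value"])
        {
            var cells = row["values"][0]; // each row carries a single array of cell values
            messages.Add(new JObject(
                new JProperty("sender", cells[0]),
                new JProperty("text", cells[1]),
                new JProperty("sent", cells[2])));
        }
        return messages.ToString();
    }</set-body>
</outbound>
```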

Hide the backend details

As a final step, some HTTP headers are deleted (with a product-scope policy) to hide the details of the downstream service.
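
For instance (the header names below are typical of Graph responses and are given as examples):

```xml
<outbound>
    <base />
    <!-- strip headers that reveal the downstream implementation -->
    <set-header name="x-ms-ags-diagnostic" exists-action="delete" />
    <set-header name="request-id" exists-action="delete" />
    <set-header name="client-request-id" exists-action="delete" />
</outbound>
```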

In Action

Conclusion

The BFF supports transformative design and moves the underlying system into a better, less-coupled state giving the dev teams the autonomy to iterate quickly on the client apps and deliver new digital experiences faster.

The tight coupling between the client and the API is therefore moved to the API Management layer, where we can benefit from capabilities like aggregation, transformation and the possibility to change the behavior of the API by configuration.

Cheers

Massimo

Categories: API Management
written by: Massimo Crippa

Posted on Tuesday, April 14, 2015 4:30 PM

by Massimo Crippa

The Azure API Management portal allows API publishers to set policies that change the behavior of the underlying API by configuration.
Policies act like a pipeline that executes a set of conditions or rules in sequence. Policies contain configurable rules for authentication, validation, quota and IP-level restriction, caching and more.
The Microsoft product team is constantly adding features to the service; recently, conditional policies have been added.

Policies overview

Policies can be applied at different scopes (global, product, API, and operation); some are allowed only on the inbound channel, others only on the outbound.

The table below summarizes the policies available as of today, with their scope and applicability.

The policies defined at the different scopes are pulled from the higher level into the lowest through the "<base/>" element and flattened into a single policy definition that executes the steps in sequence.

Here are the policies I set up for this blog post and the effective result per scope.
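
As an illustration (not the exact policies from the screenshots), a product-scope and an API-scope definition compose like this:

```xml
<!-- product scope -->
<inbound>
    <base />  <!-- global-scope inbound policies execute here -->
    <rate-limit calls="100" renewal-period="60" />
</inbound>

<!-- API scope -->
<inbound>
    <base />  <!-- global + product inbound policies execute here -->
    <set-header name="x-scope" exists-action="override">
        <value>api</value>
    </set-header>
</inbound>
```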

Conditional policies

On April 10th conditional policies were rolled out:

  • Policy expressions. C#-based expressions that can be used as values in policy attributes and elements. 
  • Control flow policy. An if-then-else construct to conditionally apply policies based on the evaluation of logical expressions.
  • Set variable policy. Use this policy to store a computed value for later reuse within the scope of a single request.
  • Set backend service policy. To override the backend service URL.

The conditional policies bring flexibility to the API Management policy engine, enabling the definition of more complex rules, like evaluating headers in the inbound pipeline, saving them in the context and then, based on those properties, taking decisions on the outbound channel.

In the very basic example below I combined these four new policies in a single policy definition to route the request message to different endpoints depending on the millisecond of the current time.
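
A sketch of that definition; the backend URLs are placeholders:

```xml
<inbound>
    <base />
    <!-- policy expression + set-variable: store whether the current millisecond is even -->
    <set-variable name="isEven" value="@(DateTime.Now.Millisecond % 2 == 0)" />
    <!-- control flow + set-backend-service: route on the stored value -->
    <choose>
        <when condition="@((bool)context.Variables["isEven"])">
            <set-backend-service base-url="https://backend-a.example.com/api" />
        </when>
        <otherwise>
            <set-backend-service base-url="https://backend-b.example.com/api" />
        </otherwise>
    </choose>
</inbound>
```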

Test and analyze the trace

To test the policy definition I used the API Management built-in console and added the "ocp-apim-trace: true" header to the operation call to enable the API inspector.

The trace location is returned in the ocp-apim-trace-location response header, which can be used to download the JSON data and inspect the execution.

Here I downloaded the JSON using Fiddler and then used an online JSON visualizer to easily inspect the trace, check the policy execution and read the debug messages.


With Azure API Management policies you gain the ability to protect, transform and modify the underlying behavior of the virtualized backend service.

The April service update and the conditional policies brought more flexibility and power to the policy engine.

Cheers

Massimo


Categories: Azure, API Management
written by: Massimo Crippa