
Codit Blog

Posted on Thursday, October 12, 2017 11:35 PM

by Toon Vanhoutte

After my first blog in this series about Azure Function Proxies, I received several questions related to API management. People were curious how to position Azure Function Proxies compared to Azure API Management. It should be clear that Azure Function Proxies offers some very limited API management functionality, but it comes nowhere near the capabilities of Azure API Management! Comparing the feature sets of these two Azure services doesn't make sense, as Azure API Management supersedes Azure Function Proxies on all levels. Don't forget why Azure Function Proxies was introduced: to unify several separate functions into one API, not to provide full-blown API Management. Let's just touch upon the functionalities that they have more or less in common!

Common Functionalities


Azure Function Proxies has limited transformation capabilities on three levels: rewriting the URI, modifying the HTTP headers and changing the HTTP body. The options for transformations are very basic and focused on just creating a unified API. Azure API Management, on the other hand, has an impressive range of transformation capabilities.

These are the main transformation policies:

Next to these policies, you can write policy expressions that inject .NET C# code into your processing pipeline, to make it even more intelligent.
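As an illustration (not taken from the original post), a policy expression is C# code inside an `@(...)` block within a policy definition, for example to stamp the caller's IP address on a header:

```xml
<inbound>
    <base />
    <!-- Policy expression: the C# inside @(...) is evaluated at runtime -->
    <set-header name="x-caller-ip" exists-action="override">
        <value>@(context.Request.IpAddress)</value>
    </set-header>
</inbound>
```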


Azure Function Proxies supports any kind of backend security that can be accomplished through static keys / tokens in the URL or HTTP headers. Frontend-facing, Azure Function Proxies offers out-of-the-box authentication enforcement by several providers: Azure Active Directory, Facebook, Google, Twitter & Microsoft. Azure API Management has many options to secure the frontend and backend API, going from IP restrictions to inbound throttling, from client certificates to full OAuth2 support.

These are the main access restriction policies:

  • Check HTTP header - Enforces existence and/or value of an HTTP header.
  • Limit call rate by subscription - Prevents API usage spikes by limiting call rate, on a per subscription basis.
  • Limit call rate by key - Prevents API usage spikes by limiting call rate, on a per key basis.
  • Restrict caller IPs - Filters (allows/denies) calls from specific IP addresses and/or address ranges.
  • Set usage quota by subscription - Allows you to enforce a renewable or lifetime call volume and/or bandwidth quota, on a per subscription basis.
  • Set usage quota by key - Allows you to enforce a renewable or lifetime call volume and/or bandwidth quota, on a per key basis.
  • Validate JWT - Enforces existence and validity of a JWT extracted from either a specified HTTP Header or a specified query parameter.
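For instance, two of these policies could be combined in an inbound section roughly as follows (the tenant URL and the limits are placeholders, not values from the original post):

```xml
<inbound>
    <!-- Reject calls that lack a valid JWT in the Authorization header -->
    <validate-jwt header-name="Authorization" failed-validation-httpcode="401">
        <openid-config url="https://login.microsoftonline.com/contoso.onmicrosoft.com/.well-known/openid-configuration" />
    </validate-jwt>
    <!-- Throttle each subscription to 20 calls per 60 seconds -->
    <rate-limit calls="20" renewal-period="60" />
</inbound>
```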

These are the main authentication policies:

Hybrid Connectivity

Azure Function Proxies can leverage the App Service networking capabilities, if they are deployed within an App Service Plan. This gives three powerful hybrid network integration options: hybrid connections, VNET integration or App Service Environment. Azure API Management, premium tier, allows your API proxy to be part of a Virtual Network. This provides access to all resources within the VNET, which can be extended to on-premises through a Site-to-Site VPN or ExpressRoute. On this level, both services offer quite similar functionality.


The scope of Azure Function Proxies is really at the application level. It creates one single uniform API that typically consists of multiple heterogeneous backend operations. Azure API Management has more of an organizational reach and typically governs (large parts of) the APIs available within an organization. The diagram below illustrates how they can be combined. The much broader scope of API Management also results in a much richer feature set: e.g. the publisher portal to manage APIs, the developer portal with samples for quick starts, advanced security options, the enormous range of runtime policies, a great versioning experience, etc…

Use cases

These are some use cases where Azure Function Proxies has already proven very beneficial:

  • Create a single API that consists of multiple Azure Functions and / or Logic Apps
  • Create a pass-through proxy to access on-premises APIs, without any coding
  • Generate a nicer URL for AS2 endpoints that are hosted in Azure Logic Apps
  • Generate a simple URL for Logic Apps endpoints, that works better for QR codes
  • Add explicit versioning in the URL of Azure Functions and / or Logic Apps


Azure Function Proxies really adds value in the modern world of APIs that often consist of multiple heterogeneous (micro-)service operations. It offers very basic runtime API management capabilities that reside at the application level.


Categories: Azure
Tags: Functions
written by: Toon Vanhoutte

Posted on Wednesday, October 4, 2017 1:52 PM

by Toon Vanhoutte

By creating a uniform API on top of several heterogeneous service operations, we also simplify the security model for the API consumer.

After the configuration we've done in part 1, we've hidden the complexity of maintaining 4 SAS tokens and 1 function code client-side. Be aware that, at the moment, the Azure Function Proxy is not secured by default. In some cases this might be the desired behaviour; in other scenarios we would like to restrict access to the API. Let's have a look at how we can achieve the latter!

Enforce Authentication

You can leverage the default App Service authentication feature, which forces clients to authenticate against one of these providers: Azure Active Directory, Facebook, Google, Twitter & Microsoft. This can be done without any code changes. Note that this only covers authentication; when authorization is required, some minimal code changes are needed.

Suggestions for product team

  • Common security measures like IP restrictions and configurable rate limits to protect against DoS attacks would be great. There is already a feature request on UserVoice.

  • Leveraging the standard Azure Function keys or host keys would also be a simple way to authorize the API endpoint. You can easily set up rotating keys to improve security. Apparently this is on the radar, but no ETA has been defined yet!


Categories: Azure
Tags: Functions
written by: Toon Vanhoutte

Posted on Monday, October 2, 2017 10:33 AM

by Toon Vanhoutte

Connecting cloud services to on-premises APIs can be quite challenging. Depending on your setup, there are multiple options available. The most enterprise-grade options reside at the network level: ExpressRoute and Site-to-Site VPN. Another option is leveraging Azure App Service Hybrid Connections, which gives you a very simple way to connect to on-premises resources at the TCP level, in a firewall-friendly and highly available manner. This blog will demonstrate how you can consume an on-premises API via Azure Function Proxies, without any coding at all.

You can find the documentation of Azure App Service Hybrid Connections over here.


Follow these instructions to set up hybrid connectivity through hybrid connections:

  • In order to take advantage of hybrid connections, you must ensure that you create the Azure Function Proxy within an Azure App Service plan.


  • Navigate to Platform features and click on Networking


Consumption plans do not support networking features, as they are instantiated at runtime.

  • Click on Configure your hybrid connection endpoints


  • Download the Hybrid Connection Manager

  • Start the installation by accepting the License Agreement.

  • The installation doesn't take long.

  • Click Finish to complete the installation.

  • Open the Hybrid Connection Manager UI desktop app.

  • At this moment, you should not see any hybrid connection. Ignore the 'mylaptop' connection in the screen capture below, as this is still a legacy BizTalk hybrid connection.

  • Back in the Azure Function Networking blade, click Add hybrid connection.

  • Choose Create new hybrid connection.

  • Give the connection a name, and provide the endpoint host name and port. As Hybrid Connections leverages Service Bus Relay technology underneath, you need to provide a Service Bus namespace.

  • In the local Hybrid Connection Manager, choose Configure another Hybrid Connection.

  • Select the previously created hybrid connection and click Save.

  • If all goes well, you should see the Connected status for the hybrid connection.

  • The Azure Portal should also display a Connected status.

  • You can now configure the Azure Function Proxy with an external / public URL that points to the local backend URL, which is now available through the Hybrid Connection Manager.
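To make that last step concrete, a hypothetical proxy definition in proxies.json could look like this (the host name and port are whatever you configured in the hybrid connection, not values from the original post):

```json
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "OnPremisesApi": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "api/localdata"
      },
      "backendUri": "http://onprem-server:8080/api/data"
    }
  }
}
```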

Now you can access your local API from the external world! Be aware that currently no security measures are applied. This will be covered in the next part of this blog post series.

High Availability

In case you need high availability, you can install another Hybrid Connection Manager that connects to the same hybrid connection. The documentation states the following:

Each HCM can support multiple hybrid connections. Also, any given hybrid connection can be supported by multiple HCMs. The default behaviour is to round robin traffic across the configured HCMs for any given endpoint. If you want high availability on your hybrid connections from your network, simply instantiate multiple HCMs on separate machines.

Enjoy this easy hybrid connectivity!



Categories: Azure
Tags: Functions
written by: Toon Vanhoutte

Posted on Wednesday, September 27, 2017 11:49 AM

by Toon Vanhoutte

When using the Microsoft Azure stack, an important aspect is to use the right service for the job. In this way, you might end up with an API that is composed of several Azure services such as Blob Storage, Logic Apps and Azure Functions. In order to publish such an API in a unified way, you can leverage Azure Functions Proxies.


In this fictitious scenario, I'm building a Pizza Delivery API. It's composed of the following operations:

  • GET Products is still in development, so I want to return a mocked response.
  • GET Product returns the product information from Blob Storage.
  • POST Order invokes a Logic App that creates the order in an async fashion.
  • GET Order calls an Azure Function that retrieves the specific order information.
  • PUT Order is responsible for asynchronously updating an order via Logic Apps.
  • DELETE Order removes the order from within a Logic App.


In this blog, I'll show you how to build this API, without writing a single line of code.



This section just demonstrates the basic principles to start creating your first Azure Function Proxy.

  • Azure Function Proxies is still in preview. Therefore, you need to explicitly enable it within the Settings tab.


  • Now, you can create a new proxy, by using the "+" sign. You need to specify a name, a route template, the allowed HTTP methods and a backend URI. Click Create to save your changes.


  • When the proxy is created, you immediately get the proxy URL on which you can invoke the Azure Function Proxy. Via the Advanced editor link, you can define more options in a JSON format.


  • The advanced editor is a JSON editor of the proxies.json file, which gives you more configuration options.


After this introduction, we can start the implementation of the Pizza Delivery API.


This operation won't have a backend service, because it's still in development. We will mock a response, so the API can already be used for development purposes.

  • Create a proxy GetProductsViaMocking. Provide the following details:
    Route template: api/products
    Allowed HTTP methods: GET
    Backend URL: none, we will use mocking

  • In the Advanced editor, specify the following responseOverrides. This sets the Content-Type to application/json and the response body to a fixed list of pizzas.

  • If you test this GET operation, you'll receive the mocked list of three pizzas!
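For reference, the mocked proxy could look roughly like this in the Advanced editor (the pizza list is illustrative, and note the escaped quotes in response.body, which is exactly the mocking annoyance mentioned later in this post):

```json
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "GetProductsViaMocking": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "api/products"
      },
      "responseOverrides": {
        "response.statusCode": "200",
        "response.headers.Content-Type": "application/json",
        "response.body": "{ \"pizzas\": [ \"Margherita\", \"Hawaii\", \"Calzone\" ] }"
      }
    }
  }
}
```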

Blob Storage

The GET Product operation will return a JSON file from Blob Storage in a dynamic fashion. Blob Storage is an easy and cheap way to return static or less frequently changing content.

  • Create a blob container, named "pizza". Set its access level to private, so it's not publicly accessible by default. For every available pizza, add a JSON file that holds the product information.


  • Create an Access Policy on container level. The policy provides read access, without any expiration. You can easily do this via the Azure Storage Explorer.


  • Create a Shared Access Signature from this policy, via the Azure Storage Explorer. This gives you a URL (containing the SAS token), to access the blobs from the "pizza" container.

  • To keep the SAS token manageable, I prefer to add the SAS query string to the Application Settings of our Function App. Create an Application Setting named BLOB_SAS_QUERYSTRING.

  • Create a proxy GetProductViaBlobStorage. Provide the following details:
    Route template: api/product/{productId}
    Allowed HTTP methods: GET
    Backend URL:{productId}.json?%BLOB_SAS_QUERYSTRING%

Note that the route template parameter 'productId' is reused in the Backend URL to get to the correct JSON file.
Note that application settings can be retrieved via this %BLOB_SAS_QUERYSTRING% syntax.

  • In the Advanced editor, specify the following responseOverrides. This sets the Content-Type to application/json.

  • If you test this GET operation, you get the product description of the corresponding product id.
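Putting the steps above together, the proxy definition could look something like this (the storage account name is a placeholder for your own):

```json
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "GetProductViaBlobStorage": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "api/product/{productId}"
      },
      "backendUri": "https://yourstorageaccount.blob.core.windows.net/pizza/{productId}.json?%BLOB_SAS_QUERYSTRING%",
      "responseOverrides": {
        "response.headers.Content-Type": "application/json"
      }
    }
  }
}
```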

Logic Apps

The asynchronous order operations will be tackled by a Logic App that puts the commands on a Service Bus queue. Further downstream processing is not covered in this post.

  • Create a Logic App as shown below. Note that the Request trigger contains an {operation} parameter in the relative path. The request command is sent to an orderqueue, with the {operation} as a message property.


  • Get the request URL from the Logic App. This also contains quite a long query string, which includes the API version and the SAS token. Similar to the previous step, insert the query string into an Application Setting named LOGICAPPS_SAS_QUERYSTRING.


  • Create a proxy PostOrderViaLogicApps. Provide the following details:
    Route template: api/order
    Allowed HTTP methods: POST
    Backend URL:

Note that the create operation is passed via the Backend URL.

  • If you test this POST operation, you'll notice that the message is accepted.


  • In the Logic Apps Run History, you should see that a new Logic App instance got fired.


  • The message is now placed on the Service Bus queue. The operation has been set to create.


  • Also create a proxy for the PUT and DELETE operation, in a similar way.
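A sketch of what this proxy could look like (the Logic Apps invoke URL shape is illustrative; the region and workflow ID are placeholders, with 'create' filling the {operation} parameter):

```json
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "PostOrderViaLogicApps": {
      "matchCondition": {
        "methods": [ "POST" ],
        "route": "api/order"
      },
      "backendUri": "https://prod-01.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke/create?%LOGICAPPS_SAS_QUERYSTRING%"
    }
  }
}
```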


Azure Functions

  • I've created an Azure Function that just returns a dummy order. The returned order ID is taken from the request.

  • Deploy this Azure Function in a Function App and retrieve the URL. The URL contains a query string with the required access code. Add this query string to the Application Settings, named FUNCTIONS_CODE_QUERYSTRING.


  • Create a proxy GetOrderViaAzureFunctions. Provide the following details:
    Route template: api/order/{orderId}
    Allowed HTTP methods: GET

Note that the route template parameter 'orderId' is reused in the Backend URL to pass it on to the Azure Function.

  • If you test this GET operation, you'll notice that a dummy order is returned, with the requested Id.
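Assembled into a proxy definition, this could look roughly as follows (the Function App host, function name and the way the orderId is passed on are assumptions, and FUNCTIONS_CODE_QUERYSTRING is assumed to hold the code=... query string):

```json
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "GetOrderViaAzureFunctions": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "api/order/{orderId}"
      },
      "backendUri": "https://yourfunctionapp.azurewebsites.net/api/GetOrder?orderId={orderId}&%FUNCTIONS_CODE_QUERYSTRING%"
    }
  }
}
```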


Feedback to the product team

After playing around with Azure Function Proxies, I have the following suggestions for the product team. They are listed in a prioritized order (most wanted feature requests first):

  • The mocking experience is not ideal when working with JSON objects, as you need to constantly escape quote characters. It would be great if this could be made more user-friendly. As an alternative, you can reach out to blob storage, but this additional step decreases developer productivity. 
  • It would be nice if there were a way to remove all or some HTTP headers from the response. Each Azure service comes with its own set of HTTP response headers and I don't always want them to be returned to the client application. This would be great from both a security and consistency perspective.
  • Accessing parts of the request or response body would open up a lot of new opportunities. As an example, I've tried to put the message immediately on an Azure Storage Queue, without any code in between. However, this was not feasible, because I needed to wrap the original message content inside this XML template:


  • Currently there is no way to perform different transformations, based on - for example - the backend response status code. This could become handy in certain scenarios.


Azure Function Proxies can easily be used to create an API that consists of multiple HTTP-based (micro)services. Practically every service can be accessed from within this lightweight application proxy, as long as the authentication is performed through the URL (e.g. SAS token, API key). This saves the client application from having to maintain a different URL per operation it wants to invoke. The service is very easy to use and requires no coding at all. A great addition to our integration toolbelt!


Categories: Azure
Tags: Functions
written by: Toon Vanhoutte

Posted on Thursday, July 7, 2016 11:58 AM

by Luis Delgado

Discover how to unit test your Node.js Azure Functions to increase code quality and productivity, using these code samples.

Writing unit and integration tests for Azure Functions is super critical to the development experience, since their execution relies on context variables that are beyond your control and supplied by the runtime. Furthermore, there is currently no local development or debugging experience available for Azure Functions. Therefore, testing whether your functions behave properly, in the context of their runtime, is extremely important to catch defects and increase your productivity.

Because Node.js is dynamically typed, I want to share a quick trick on how to mimic the Azure Functions runtime context in order to test your functions. I did not find any documentation from Microsoft related to unit testing Node.js Azure Functions, so feel free to comment on the approach I propose here.

As an example, we are going to make a function that posts an observation every minute to Azure IoT Hub:


Now we want to write a unit/integration test for this function.


The function getContextObject simply returns an object that mimics the context object expected by the Azure Functions runtime. The test will simply import your function from index.js, create the mock-up context object and feed it to your function for execution. Finally, within your test, you can override the context.done() function to do the assertions you need and call done().

Is this the proper way to test Azure Functions on Node.js? I will let the Functions Product Group comment on that :). However, this method works for me.

The other alternative you have is to create your internal functions in other files that you can test separately in the traditional way you would test JS code, and import those files in your index.js file. The problem I see with that approach is that, if your internal functions call the context object, your tests will probably fail.

Comments, feedback or suggestions? Submit an issue to the repository or write them below.

Categories: Azure
written by: Luis Delgado