
Codit Blog

Posted on Tuesday, June 23, 2015 12:31 PM

by Massimo Crippa

It's crazy to see that the Power BI APIs are documented and managed on Apiary and not on Azure API Management, isn't it?
Let's see how to configure APIM so you can try out all of the Power BI APIs against the Microsoft service without writing a single line of code.

Moving to Azure API Management is more than setting up the documentation and the interactive console with a different look and feel. It gives us the possibility to take advantage of capabilities like throttling, usage analytics, caching and many more.

Here is the four-step procedure I followed for this exercise:

  • Virtualization layer definition
  • Configure the Authorization Service (Azure Active Directory)
  • Configure Azure API Management to register the Authorization Service
  • Dev portal customization (optional)

Power BI API calls are made on behalf of an authenticated user by sending an authorization token, acquired through Azure AD, to the resource service.
The diagram below shows the OAuth 2.0 Authorization Code Grant flow.
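As a companion to the diagram, here is a minimal sketch (not part of the APIM configuration) of the first leg of that flow: building the Azure AD authorize URL the user is redirected to. The client id and reply URL are placeholders for the values of your own AD web application, which we create in the next sections.

```python
# Minimal sketch of the first leg of the Authorization Code Grant:
# redirecting the user to the Azure AD (v1) authorize endpoint.
# client_id and redirect_uri are placeholders for your own AD web application.
from urllib.parse import urlencode

authorize_endpoint = "https://login.windows.net/common/oauth2/authorize"

params = {
    "response_type": "code",
    "client_id": "<your-ad-application-client-id>",            # placeholder
    "redirect_uri": "<reply-url-of-your-ad-web-application>",  # placeholder
    "resource": "https://analysis.windows.net/powerbi/api",    # Power BI resource identifier
}

# The user signs in here and Azure AD sends an authorization code back to the reply URL.
print(f"{authorize_endpoint}?{urlencode(params)}")
```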

Virtualization layer

The first thing to do is to create the API façade on Azure API Management, defining the set of operations that will be proxied to the Power BI web service (https://api.powerbi.com/beta/myorg).

Since the Power BI APIs don't expose a Swagger metadata endpoint, I manually created the API and added the operations, descriptions, parameters and representations (you can find the documentation here).

Then I defined my “Microsoft PowerBI APIs” product, made it visible to the Guests group, enabled the Subscription Required option and added the API to the product. With this configuration the API is visible to everyone, so anyone can freely access the documentation, while a subscription key is still required to try out the API using the built-in console.

The Power BI APIs require an authentication token, so if we try to call the underlying service at this point of the procedure we receive a 403 Forbidden response.
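To illustrate (a sketch only, assuming the Python requests library; the host, path and key below are placeholders for your own APIM service and imported operations), a call through the façade with just a subscription key would look like this:

```python
# Hypothetical call through the APIM facade with a subscription key but no AD token.
import requests

response = requests.get(
    "https://<your-apim>.azure-api.net/powerbi/datasets",              # placeholder operation URL
    headers={"Ocp-Apim-Subscription-Key": "<your-subscription-key>"},  # APIM product key
)

# Without a Bearer token for the Power BI resource the backend refuses the call.
print(response.status_code)  # 403 Forbidden
```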

Authorization Service

In order to provide a secure sign-in and authenticate our service calls to the Power BI APIs, we need to register our APIM built-in console in Azure AD as a web application. To complete this step you first need to sign up for the Power BI service and have an Azure AD tenant with at least one organizational user.

Here you can find the registration procedure.

The main parameters to be set up are:

  • APP ID URI: unique web application identifier (e.g. https://coditapi.portal.azure-api.net/).
  • Reply URL: this is the redirect_uri for the auth code. The value configured in this field will be provided by the API Management "Register Authorization Server" procedure (next section).
  • Add the delegated permissions that will be added as claims in the authorization token.

Once the AD application is created you will get the ClientId and the ClientSecret values.

Note that the AD web application has been created in Codit's Office 365 Azure AD, so our setup is only valid for our corporate Power BI tenant. The Apiary setup is different: I imagine the Apiary web application is enabled by default in every Office 365 AD.

 

Register the Authorization Service

In this step we register the Authorization Service in Azure API Management and then set up our Power BI façade API to use the registered Authorization Service.

Here you can find the step-by-step procedure.

The main parameters to be set up are:

  • Authorization endpoint URL and Token endpoint URL.
  • The Resource we want to access on behalf of the authenticated user.
  • ClientId and ClientSecret: specify the values we got from Azure AD.

Authorization endpoint URL and Token endpoint URL: go to Azure AD, select the application section and click the Endpoint button to access the endpoint details.

The Resource service (Power BI) parameter must be specified as a body parameter using the application/x-www-form-urlencoded format. The value of the resource parameter is https://analysis.windows.net/powerbi/api.
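For reference, this is roughly what the token request looks like with the resource parameter in the form-encoded body (a sketch, assuming the Python requests library; tenant, authorization code, client id and secret are placeholders):

```python
# Sketch of the Azure AD (v1) token request performed on behalf of the console.
# The resource parameter travels in the application/x-www-form-urlencoded body.
import requests

token_endpoint = "https://login.windows.net/<your-tenant>/oauth2/token"  # placeholder tenant

payload = {
    "grant_type": "authorization_code",
    "code": "<authorization-code-returned-to-the-reply-url>",  # placeholder
    "redirect_uri": "<reply-url-registered-in-azure-ad>",       # placeholder
    "client_id": "<client-id>",                                  # placeholder
    "client_secret": "<client-secret>",                          # placeholder
    "resource": "https://analysis.windows.net/powerbi/api",      # the Power BI resource
}

# requests encodes a dict passed via data= as application/x-www-form-urlencoded.
token_response = requests.post(token_endpoint, data=payload).json()
print(token_response.get("access_token", token_response))
```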

Common errors caused by a wrong resource parameter are:

  • An error has occurred while authorizing access via PowerBI: invalid_resource AADSTS50001: Resource identifier is not provided. Trace ID: {Guid} Correlation ID: {Guid} Timestamp: {TimeStamp}
  • An error has occurred while authorizing access via PowerBI: invalid_request AADSTS90002: No service namespace named '{resourcename}' was found in the data store. Trace ID: {Guid} Correlation ID: {Guid} Timestamp: {timestamp}
  • An error has occurred while authorizing access via {APIM Authorization Server Name}

The "Register Authorization Server" procedure generates a redirect_uri that must be used to update the "Reply URL" value in the AD Web Application we set up in the previous step. If not, you'll get this error at the first tryout :

  • AADSTS50011: The reply address '{redirect_uri}' does not match the reply addresses configured for the application: {Guid}.

The last step is to configure our Power BI façade API to use the OAuth 2.0 authorization mechanism. Open the API, click on the Security tab, check the OAuth 2.0 box and select the registered Authorization Server.

Try it out

The API Management built-in console allows us to quickly test the API. 

Choose an API operation and then select "Authorization Code" from the authorization dropdown to access the sign-in user agent provided by Azure Active Directory. Once you have signed in, the HTTP request will be updated with the bearer token (the token value is not readable in the UI).

Now you can specify the desired values for the additional parameters, and click Send to test the API. 

If you are interested in the bearer token itself, use Fiddler to intercept the Azure AD reply and then jwt.io to analyze the token content with all its properties and claims.
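If you prefer a script over jwt.io, a bearer token is just a JWT, so you can decode its payload yourself. A small sketch follows; it only inspects the claims and does not validate the signature, and the token value is a placeholder for whatever you captured with Fiddler.

```python
# Decode the claims of a captured bearer token (inspection only, no signature check).
import base64
import json

token = "<paste-the-bearer-token-captured-with-fiddler-here>"  # placeholder

payload_segment = token.split(".")[1]                 # JWT layout: header.payload.signature
payload_segment += "=" * (-len(payload_segment) % 4)  # restore base64url padding
claims = json.loads(base64.urlsafe_b64decode(payload_segment))

print(json.dumps(claims, indent=2))  # audience, tenant, user and delegated permission claims
```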

Developer Portal

As the Developer portal is a completely customizable CMS, you can set the look and feel to follow your branding strategy and add all the content you need to help drive API adoption.

In this case, I created a custom page (https://coditapi.portal.azure-api.net/PowerBI) dedicated to the Power BI APIs that collects resources, like MSDN references and code samples, to facilitate the API intake.

Conclusions

As you can see, you can very easily use Azure API Management to connect to Power BI APIs on behalf of an authenticated user.

Moving to API Management is not only like coming back home; it also gives us the possibility to take advantage of APIM capabilities like usage analytics, to get insight into the health and usage levels of the API and identify key trends that impact the business.

Cheers!

Massimo Crippa

Categories: Azure
written by: Massimo Crippa

Posted on Tuesday, June 16, 2015 10:30 AM

by Tom Kerkhove

by Brecht Vancauwenberghe

On June 11th, the second edition of ITPROceed was organised. In this blog post you will find a recap of the sessions that we followed that day.

ITPROceed took place at Utopolis in Mechelen. A nice venue! Parking in front of the building and a working wifi connection immediately set the pace for a professional conference. There were 20 sessions in total (5 tracks).

Keynote: Microsoft Platform State-of-the-union: Today and Beyond By Scott Klein

Scott Klein kicked off the day with the keynote and gave us a nice state-of-the-union.
Scott explained to us once more the many advantages of using the cloud and compared them with the old days.

A big advantage is the faster delivery of features; in the past it could take 3 to 5 years to deliver a feature through service packs or new versions of software.

Why Microsoft puts Cloud first:

  • Speed
  • Agility
  • Proven
  • Feedback

Next to that, Scott showed us several new features: SQL Server 2016, Azure SQL Data Warehouse, PolyBase, Data Lake, Windows 10, ...

The world is changing for IT professionals; this means lots of changes but also a lot of opportunities!

Are you rolling your environments lights-out and hands-free? by Nick Trogh

Nick gave us a lot of demos and showed us how we can spin up new application environments in a reliable and repeatable process. In this session we looked into tools such as Docker, Chef and Puppet and how you can leverage them in your day-to-day activities.

Demystifying PowerBI

Speaker - Koen Verbeeck

Koen gave us a short BI history lesson, after which he illustrated what each tool is capable of, when you should use it and where you can use it. To wrap up he showed the audience what Power BI v2 (Preview) looks like and how easy it is to use. A great introduction session to (Power) BI!

If you're interested in Power BI v2, you can start here

Data Platform & Internet of Things: Optimal Azure Database Development by Karel Coenye

In this session Karel told us about several techniques to optimize databases in Azure, to get the most out of them while reducing the cost.
With Azure SQL databases you need to think outside the box and optimise accordingly.

Cross premise connectivity with Microsoft Azure & Windows Server by Rasmus Hald

Running everything in the cloud in 2015 is very optimistic; several systems are often still running on-premises and have not yet been migrated to the cloud. Network connectivity between the cloud and on-premises is therefore necessary!

Within Codit we already have experience with Azure networking, so it was very nice to follow the session and pick up more tips and tricks from the field.

Rasmus covered four topics:

1. How Windows Server & Microsoft Azure can be used to extend your existing datacenter.
2. How to use existing 3rd party firewalls to connect to Azure.
3. The new Azure ExpressRoute offering.
4. Windows Azure Pack.

Big Data & Data Lake

 Speaker - Scott Klein

SQL Scott was back for more, this time on BIG data! He kicked off with an introduction on how much data was processed to calculate Neil Armstrong's trip to the moon and how the amounts of data have evolved since.

Bigger amounts of data mean we need to be able to store them as well and as efficiently as possible, but also be able to work with that data. In some cases plain old databases are not good enough anymore - that's where Hadoop comes in.

The Hadoop ecosystem allows us to store big amounts of data across data nodes, and we can use technologies such as Pig, Hive, Sqoop and others to process that data.

As we start storing more and more data, Microsoft started delivering Hadoop clusters in Azure, called HDInsight, which is based on the Hortonworks Data Platform. If you want to know more about HDInsight, have a look here.

Processing big data obviously requires the big data itself. During his talk Scott also spoke about Azure Data Lake, which was announced at //BUILD/ this year. There are several scenarios where you can benefit from Azure Data Lake - it's built to store your data in its raw format without any limitation, whether in size, type or anything else.

In the slide below you can see how you can use Azure Data Lake in an IoT scenario.

Just to clarify - Data Lake is not something Microsoft invented; it's a well-known concept in the data space that is more or less the opposite of data warehousing. If you want to learn more about Data Lake or its relation to data warehousing, read Martin Fowler's view on it.

Scott wrapped up his session with a demo on how you can spin up an Azure HDInsight cluster. After that he used the cluster to run a Hive query on big files stored in an Azure Storage account as blobs.

Great introduction session to big data on Azure.

Securing sensitive data with Azure Key Vault by Tom Kerkhove

Speaker - Tom Kerkhove

In the closing session Tom introduced us to the concepts of Microsoft Azure Key Vault, which allows us to securely store keys, credentials and other secrets in the cloud.

 Why you should use Azure Key Vault:

  • Store sensitive data in hardware security modules (HSM)
  • Gives back control to the customer
    • Full control over lifecycle and audit logs
    • Management of keys
  • Removes responsibility from developers
    • Secure storage for passwords, keys and certificates
    • Protect production data

Posted on Monday, June 8, 2015 8:31 AM

by Brecht Vancauwenberghe

by Filiep Maes

by Jonathan Maes

by Toon Vanhoutte

by Pieter Vandenheede

On June 4th, BTUG.be organized the first "Integration Day". At Codit we were happy to be present and in this blog post you will find a recap of the sessions that were brought that day.

The BTUG Integration Day took place in the Moonbeat studio in Mechelen. A nice venue for about 35 eager participants of the event. There were 9 sessions in total, and the day kicked off with a nice keynote session from Jan Tielens.

Keynote

Speaker: Jan Tielens

Jan started off with a small recap of where integration comes from and where it is headed: from monolithic designs to API apps, web apps and Logic Apps. He proceeded with a demo on provisioning API apps and Logic Apps, showing how to retrieve tweets with a certain #hashtag by using a recurrence trigger and the new Logic Apps.

The demo didn't go exactly as planned due to the network acting up, but it involved retrieving tweets from Twitter and sending them to a local Raspberry Pi printer. Later that day it seemed to work just fine:

Jan continued his keynote talking about the capabilities of API apps, the Swagger integration and the concept of hybrid integration: integration between different cloud services or the combination of cloud and on-premises.

 

BizTalk Server Deep Dive Tips & Tricks for Developers and Admins

Speaker: Sandro Pereira

 

After the keynote, Sandro took the stage to have a session on BizTalk tips and tricks for both BizTalk administrators and developers. 

The first part was focused on BizTalk administration.


The most important tips:

  • Configure the default BizTalk backup job, to include custom databases
  • Take advantage of PowerShell to automate the tracking configuration settings
  • Automatically clean up custom message archiving folders

The second part was more developer oriented. Interesting tricks:

  • Use default or custom pipeline components in a generic way
  • Request/Response patterns are supported in a messaging only scenario
  • Via a registry change, the Business Rules Engine supports consuming static methods

It's good to see that most of the tips and tricks are already part of the Codit BizTalk Best Practices, our technical 'bible' to ensure quality within BizTalk integration projects!

 

Demystifying Logic Apps

Speaker: Ashwin Prabhu

Ashwin started by giving an overview of the history of Logic Apps; even though they haven't been around for long, there are some interesting key points.

Microsoft first announced Windows Azure in 2008; in 2011 Service Bus was introduced, which was the first integration (baby) step.

In 2013 BizTalk Services were announced, but after some architectural changes this was reworked so it would fit in the new eco-system (App Services). The main reason for this is that Microsoft wants to provide us with a democratic eco-system so we - as developers - can (re)use functionality from each other (e.g. mapping functionality).
These different building blocks (Logic Apps, API apps, Web apps, Mobile apps) give us an easy way to use different functionality without a steep learning curve.

During the demo, Ashwin created a Logic App with two different connectors, a SQL connector and a File connector: SQL Server was queried and some data was picked up and sent to the File connector.

What can we expect for Logic Apps in the future?

  • Integration patterns (convoys, long-running processes, auto delivery)
  • Large message handling
  • Azure services on-premises
  • Built-in designer for Visual Studio
  • Bug fixes - it's important that you provide your feedback, Microsoft is ready to listen! (Tip: if you are using connectors at the moment and don't want to be bothered with updates, you can disable auto-update in the portal.)
  • Better error messages

During the question round, Ashwin was asked whether Logic Apps were created to take over from BizTalk Server. BizTalk Server on-premises is here to stay, but things are moving! For example, a start-up may be better served with cloud services so it can focus on its functionality instead of infrastructure.

Microsoft's purpose is to provide an alternative in the cloud, but both worlds can exist next to each other.

 

5 Advanced techniques in BizTalk360 to Solve Operational Problems

Speaker: Saravana Kumar

Just before lunch Saravana took the lead and presented how BizTalk 360 can help you solve daily operational problems.

BizTalk 360 has 50 features focused on operations & monitoring.

Saravana's session was hands-on, containing 5 different, interesting demos.

1.  Visualize your message flow

  • However complex they are, you can visualize the BizTalk flows with zero code changes.
  • The admin console is difficult to understand and very technical.

2. Automate your operations

  • A lot of time is lost daily on monotonous tasks.
  • Data monitoring / MessageBox monitoring (In our opinion the BizTalk flows should handle these tasks as much as possible leaving no suspended messages/manual intervention).

3. Import/Export, Auto correct monitoring configuration

  • Import/Export: moving monitoring configuration from Test to Production.
  • Autocorrect: receive location down, gets automatically started by BizTalk 360.

4. Manage any internet connected BizTalk environment remotely

  • In a secure way
  • No complex VPN connection necessary
  • Handy for operations teams that need to be able to connect 24/7 to the BizTalk environment: BizTalk 360 uses Service Bus Relay

5. Understand Throttling

  • Understanding throttling is a complex process and requires a BizTalk expert to understand the problem.
  • BizTalk 360 can be used to easily understand what the real problem is on the environment.

Next to BizTalk 360 there are different monitoring tools on the market (Codit Integration Dashboard, System Center Operation Manager, AIMS, BizTalk Health Monitor) each having their advantages.

 

BAM Experiences in Large Scale Deployments

Speaker: Marius W Storsten

AIMS Innovation has - up until now - used BAM as a core component of their BizTalk monitoring solution. Marius shared AIMS' experiences of using BAM in a monitoring setup: how it works, the good & bad, performance, bugs, tips, tricks and challenges.

Marius tried to make it an interactive session, which is very nice, but I don't think he counted on a Belgian audience :)
Luckily some Dutch guys were quicker to answer.

It is AIMS' experience that the choice for BAM has not been the best, and Marius showed us this by referencing some of their experiences and discoveries around BAM. One of them was a dependency between bttdeploy.exe and the Tracking Profile Editor (TPE), meaning that bttdeploy.exe depended on TPE and not the other way around.

Marius concluded with some recommendations on using BAM.

There is also a small but nice blog post about this on their website: http://www.aimsinnovation.com/news/aims-blog/why-we-went-bam-less

 

Governance and API Management for BizTalk Server - what’s the value and how to implement it?

Speaker: Andrew Slivker

In a world made up of services and APIs that are business-critical for companies, we need governance and management of them.

What is governance about?

  • Manage services
  • Share metadata (SOAP, Swagger, ... )
  • Document services
  • Publish & register services
  • ... 

The management of these services consists of security (authentication & authorization), monitoring, SLAs, etc.

Sentinet manages SOA and API services and applications deployed on-premises, in the cloud, or in hybrid environments. To give us the possibility to govern & manage our services, Sentinet uses the gateway concept: publish internal services to partners, offer multiple protocols, add monitoring, ... all without changing the internal functionality of your services.

During the demo, Andrew showcased the Nevatech stack and the Sentinet solution: an internal net.tcp service hosted by BizTalk was consumed by clients through a virtual service hosted by Sentinet, via both SOAP and REST, without any development.

 

JSON and REST in BizTalk

Speaker: Steef Jan Wiggers

 

Steef Jan brought us a session about JSON and REST.
In a new hybrid world, integration is key and will become more important than ever. There are several systems, like SAP, that are not going to the cloud in the near future. BizTalk Server 2013 R2 offers capabilities to fulfil the demand for a hybrid type of integration solution, using the JSON encoder/decoder.

The session was mostly demo-based; we also connected with the last.fm API.

You can find this demo on the TechNet wiki as well: http://social.technet.microsoft.com/wiki/contents/articles/29719.biztalk-server-2013-r2-json-support-and-integration-with-cloud-api-s.aspx

 

Azure Event Hubs, StreamAnalytics and Power BI

Speaker: Sam Vanhoutte

In his IoT demo, Sam showed how to use Azure Event Hubs, Stream Analytics and Power BI.

There are a lot of similarities between BizTalk integration and IoT: it is all about connecting data.

A typical IoT event stream looks like:

  • Data generators (sensors)
  • Field gateways: Used as bridge between the cloud and the data generators
  • Event hubs: Used to collect data on a large scale
  • Stream Analytics: digest the data
  • Data analytics: Power BI

Event Hubs is a highly scalable publish-subscribe event ingestor that can take in millions of events per second, so you can process and analyze the massive amounts of data produced by your connected devices and applications. In his demo, Sam showed how to set up an event hub and how it works using throughput units.

After collecting data you can use Stream Analytics for real-time analytics. Stream Analytics provides out-of-the-box integration with Event Hubs to ingest millions of events per second and is based on SQL syntax. Sam gave a demo of how Stream Analytics works.

Power BI is about visualizing data for the end user instead of using tables; a (free) Power BI dashboard is available. Currently it has limited capabilities:

  • Data collection
  • Support for Power Maps
  • Pin reports, relationship between different views

Sam ended with an overall demo about traffic control: it generates speed information, sends the data to the event hub, uses Stream Analytics to sort the data and finally shows the information in Power BI.

 

Conclusion

We had a blast at the Integration Day and hope to be present again next year! A big thank you to the BTUG.be organisation and the speakers and sponsors of this event. We (as Codit) are proud to be a part of it!

 

Posted on Wednesday, May 27, 2015 4:14 PM

by Tom Kerkhove

Microsoft's Yammer has been around for a while and people who are part of one or more networks will agree that Yammer can turn into Spammer.

In this blog post I demonstrate how you can automatically post to a Slack channel.

This blog post was also released on my personal blog.

For each new conversation & comment, Yammer will send you an email resulting in mail floods. The easy fix would be to disable the notification email but then you risk the chance to miss out on interesting/important discussions.

On our current project we use Slack to communicate with each other and it's a really nice tool - nice & clean, just how I like it.

Slack Logo

So let's get rid of the notification emails and notify your team when someone starts a new conversation on Yammer! This is where Microsoft Azure App Services come in, more specifically Microsoft Azure API & Logic Apps.

With Azure Logic Apps I've created a flow with one API app listening on a Yammer group for new conversations, while another Slack API app notifies us in a channel when something pops up.

How does that look?!

When I create a new conversation in Yammer i.e. "We're ready to go in production" - 
New Yammer Conversation

The Yammer API App in my Logic App will notice that there is a new conversation and will send a message to my team's Slack channel as the Project Announcements-bot. 

Slack Bot Response

Want it yourself? Here's how!

Before getting our hands dirty let's summarize what's on today's schedule.

We will start by provisioning the API apps we will use from the Azure Marketplace. After that we will create a new Logic app that describes the flow of our app.

Provisioning the API Apps

As of today you have two options for provisioning your API apps - one is to provision them upfront, where you have more control over naming and such; the second is to provision them while you are designing your Logic app and let Azure take care of the naming. Be aware: Azure uses names like YammerConnector1431593752292 that don't really say where they're being used.

Since I always want to give my components names that are as self-describing as possible, we will provision two API apps up front:

  • A Yammer App that will trigger our Logic app when a new conversation is posted
  • A Slack App that will send a message to Slack as a Bot

Provisioning an API App is super simple : Browse to the new Azure Portal > Click New > Select Web + Mobile > Browse the Azure Marketplace > Select the API Apps section > Select the API App you want.

After you've selected your API app you basically give it a name and assign the App Plan & Resource Group:
Provision API App

Azure will start provisioning the API app for you in the background; while that is happening, let's have a look at the connector info.

Before actually provisioning the app you can see that each API app or connector gives you an overview of its capabilities in a Logic App. Here you can see that the Slack connector will only be able to act in a Logic App.
Slack Connector overview

Now when we look at the Yammer connector info we see that it can act within a Logic App but also trigger it on a certain condition.
Yammer Connector overview

Defining the flow in a Logic App

Before we can start defining our flow we need to create a new Logic App.

In the Azure Portal click New > Select Web + Mobile > Logic App. Give it a self-describing name and add it to the same App Plan as your provisioned API Apps.

Once it is configured, open it and click Triggers and Actions
Configure Logic App

We will define our flow by defining the sequence of connectors. You can find our provisioned connectors on the side; click on your Yammer connector to add it.
Clean Logic App

After that, the default card will be replaced with your Yammer connector. As you can see we first need to authenticate with Yammer. Click Authorize
Basic Yammer Card

A pop-up will appear to do the OAuth dance with Yammer. After you've logged in you will need to grant access to your Logic App.

Read the statement carefully and click Allow if you agree.

Yammer Authentication

(In order to complete the following steps you need to allow access)

Now that you've allowed access to your Yammer account, it's interesting to know that the authentication token will be stored in the secure store of the gateway (a gateway is used by API connectors to communicate with each other and with outbound services). This is because the gateway will handle all the authentication with Yammer for us.

Once that's done you get an overview of all the triggers the Yammer connector has. Luckily the only one available is the one we need: click New Message.
Yammer Triggers

Configuring the trigger is fairly easy - we define the frequency at which the connector will look for new messages. Next to that we assign the Group Id of the Yammer group we are interested in. The granularity of your trigger frequency depends on the hosting App Plan. In my example I'm using 1 minute, which requires me to use a Standard-tier App Plan.

You can find the group Id by browsing to your group and copying the Feed Id.

https://www.yammer.com/Your-Network/#/threads/inGroup?type=in_group&feedId=579250

Yammer Connector Configured

Click the checkmark to save your configuration.

Go back to the side bar and click on your Slack connector to add it to the pane. Here we need to authenticate with Slack by clicking Authorize.
Basic Slack Connector

Just like with Yammer, Azure will request access to your Slack account to post messages. 
Slack Authentication

Our last step is to configure the Slack connector.

What we will do is send the original message as a quote along with who posted it and a link to the conversation. In Slack that results in the following markup statement -

>>> _"Original-Message"_ by *User* _(Url)_

To achieve this we will use the @concat function to assign the Text value -

This statement is retrieving some of the output values of the Yammer connector.
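Purely as an illustration of the string being built (this is not the Logic App itself), the sketch below composes the same Slack markup in Python from hypothetical Yammer output fields and posts it to a hypothetical Slack incoming webhook; in the Logic App the @concat expression does this with the Yammer connector's output values and the Slack connector does the posting.

```python
# Illustration only: compose the same quote-style message the @concat expression builds.
# Field names and webhook URL are hypothetical placeholders.
import requests

yammer_output = {  # hypothetical trigger output values
    "text": "We're ready to go in production",
    "sender": "Tom Kerkhove",
    "url": "https://www.yammer.com/<your-network>/#/Threads/show?threadId=<id>",
}

slack_text = '>>> _"{text}"_ by *{sender}* _({url})_'.format(**yammer_output)

requests.post(
    "https://hooks.slack.com/services/<your-webhook-id>",  # hypothetical incoming webhook
    json={"text": slack_text, "username": "Project Announcements", "icon_emoji": ":mega:"},
)
```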

We will also configure which Slack channel to send it to. Optionally you can assign a name to the Slack bot and give it an icon. Here I used the name of my Yammer group as the Slack bot name.
Slack Connector Configured

Click the checkmark to save your configuration & save the flow of your logic app. 
Save Logic App

After a few seconds/minutes, depending on your trigger configuration, you will see that the Yammer connector picked up your new message and triggered your Logic App. 
Logic App - Run Overview

Now you should see a new message in your Slack channel!

Ship it!

That's it - we're done!

Your Yammer connector will now poll for new conversations in your Yammer group every cycle you've defined in its configuration. If there are new ones, your Logic App will start processing them and you will be notified in Slack!

Wrapping up

As you can see, you can very easily use Azure API & Logic Apps to create small IFTTT-like flows. You can even build more full-blown integration scenarios by using the more advanced BizTalk API apps!

If you want, you can even expand this demo and add support for multiple Yammer groups. To do so you'll need to open the Code View and copy additional triggers into the JSON file (thank you, Sam Vanhoutte, for the tip on how to create multiple triggers).

Keep in mind that the name of the Slack bot that is posting is currently hardcoded; unfortunately the Yammer app doesn't expose the name of the group, so this is something you'll have to work around.

Can't get enough of this? You can build your own API app or read Sam Vanhoutte's initial thoughts on Azure App Services!

Thanks for reading,

Tom Kerkhove

Categories: Azure App Services
written by: Tom Kerkhove

Posted on Wednesday, May 13, 2015 7:28 PM

by Maxim Braekman

by Sam Neirinck

by Tom Kerkhove

The second edition of Techorama, hosted at Utopolis Mechelen, provided a large range of interesting sessions covering all kinds of topics. Read more about some of the sessions from the second day in this post.

Just as promised in yesterday’s post of day 1, we are back with an overview of some of the sessions from the second day of Techorama.

Internet of things, success or failure by Stefan Daugaard Poulsen

Twitter: @cyberzeddk

One of the sessions to start off the second day of Techorama was one about the Internet of Things, presented by Stefan. He made it clear from the very beginning that this was not going to be a technical session about writing the code that runs on devices, nor about the electronics themselves, since Stefan, to put it in his own words, knows jack about it.

Companies are continuously attempting to invent new devices for all kinds of purposes, but are all of these devices actually useful? It’s not all about inventing shiny new devices that look good, but they should take some aspects into account:

  • Does it solve a problem? Can it be used to actually make life easier or provide useful information?
  • Is it consumer-friendly? In other words, can it be shipped without a user manual and without raising questions?
  • Does it repeat history? There is no use in re-creating devices that clearly failed in the past.

Of course, one could ask a whole bunch of other questions before starting development or creating a Kickstarter project. But the questions above are vital in order to build a device which might turn into a success.

Although the Internet of Things is becoming widely popular and lots of companies are jumping onto the IoT-train, there are quite some challenges:

  • Privacy: what happens with the data that is being collected by the device?
  • Security: since most devices will be connected to a network, they must not become the culprit of security leaks.
  • Data processing: all of the sensors generate a huge load of data, which needs to be processed in an orderly way.
  • Data storage: all of the data that is being processed needs to be stored in a correct way. Do you actually need all of the data? How long do you need to keep it?
  • Futuristic thinking: the devices should be an enhancement of the current world, but with some limitations. It is not always possible to change how everything currently works without expensive modifications.
  • Battery life: there is no use in creating a device that needs to be charged every couple of hours.

Overall, people and companies should think before creating the next new thing, as it needs to be useful, non-intrusive, reliable and enhancing.

Embellishing APIs with Code Analyzers by Justin Rusbatch

Twitter: @jrusbatch

Visual Studio 2015 ships with the long-awaited Roslyn compiler platform. I can’t remember when exactly Microsoft started talking about Compiler as a Service, but it’s been a couple of years at least. However, it was worth the wait!

As is more and more common within Microsoft, the development of this platform is all open on Github. This means the compiler is no longer a black box, but a fully featured set of APIs which can be used to analyze code, among many other things.

Justin did a small demo on how easy it was to create and debug an Analyzer using Visual Studio 2015 and the VS2015 SDK. It was a simple demo analyzer which would indicate that class names must be in upper case (I suggest not to use this in your actual code).

A code analyzer looks like this in Visual Studio: 

I can already think of quite a few use cases for code analyzers. If we think about BizTalk development alone, one can imagine several rules to create just for pipeline components.

  • The pipeline component class must have a GuidAttribute and ComponentCategoryAttribute. (this prevents a few minutes of wondering why your pipeline component doesn’t show up in the Toolbox).
  • Do in-depth code analysis to see if the Load & Save methods are implemented correctly.
  • Create warnings for working with non-streaming classes.

Additionally, each integration project has business-specific rules and coding/naming guidelines. Perhaps your guidelines require you to do a LogStartMethod() & LogEndMethod() in each and every method. Now you can create an analyzer which can verify this, and optionally break your build. This way you can ensure that all your guidelines are enforced, and you have great Visual Studio tooling as an additional benefit. You can even create quick fixes so it’s just a matter of clicking the light bulb and the log statements are inserted without you typing a thing.

All in all, it’s something I will definitely look into in the coming weeks.

Teamwork - Playing Well With Others by Mike Wood

Twitter: @mikewo

Those who've read yesterday's post already know I'm a fan of Mike as a speaker, but today's session was really, really inspiring!

The focus of the talk was how you as an individual can work well with others - the first step is to stop thinking about yourself as an individual and instead think about the team. Together you are one and believe in one big picture - your main objective. You need to get rid of your ego and work together as a team to achieve your goal and get across the hurdles that are holding you back from achieving it.

Here are some other interesting tips he gave us :

  • When communicating within the team, do it wisely - don't start pointing fingers at each other when things fail, but work together to fix it as soon as possible. Talk in the "we" form when it's positive, otherwise talk in the "I" form, e.g. I broke the build because of this or that. Avoid the "lottery" effect where only one person has knowledge about a certain topic; losing him or her means losing a lot of knowledge.

  • Great work can be rewarded with incentives, but do it the right way - reward the team instead of one individual. As an example, don't reward the salesman who sold the most; reward the team when they've reached a certain target. This will boost their team spirit instead of creating internal competition.

  • Understand failure and accept it - nobody is perfect & everybody makes mistakes, so accept this; mistakes are inevitable, but make sure you learn from them.

  • Leadership - not everyone wants to be a leader, so don't push people into it. A true leader knows how his team members work & feel so he can take that into account. Provide them with guidance & the vision you are striving towards. Delegation is also key to success, but don't assign them tasks you would not want to do yourself.

  • Invest in your team members - have trust in them and let them research things they're interested in.

These are just some of the examples Mike gave us that can really contribute to thinking as a team, working as a team & shipping great solutions as a team.

I'd like to end this blog post with a quote Mike mentioned during his talk.

"We will accomplish what we do together. We share our successes & we never let anyone of us fail alone."
- USC Covenant
 

 

This rounds up our two-day adventure at Techorama. First of all we want to thank everybody for reading our two blog posts, and of course a big thank you to the organisation of Techorama for creating such an amazing event!

Thanks for reading,

Maxim, Sam & Tom