
Microsoft Flow: Expose Public APIs via Custom Connector

I love fitness trackers. I was excited to get the new Fitbit Alta HR, but some of my eagerness waned when I remembered why I hadn't used a Fitbit since my old Fitbit One -- it didn't integrate with any of the other applications I used to track my information. Fitbit does support some integration with partner apps, just not with any of the apps I happen to prefer and use.

I'm an Android user. I've started to look at Google Fit as a hub for all of my fitness data, but I'm not completely sold on that yet. I've been using Nike+ Run Club since I jumped on the Nike bandwagon years ago and purchased the Nike+ iPod Sport Kit (back when I carried an iPod). I was so excited to have a sensor nestled in my shoe and a coach speaking to me as I was running! Fast-forward to now, and although my phone is the main sensor for everything, I still prefer to use the Nike+ applications to track my runs and training. Nike+ synchronizes data to Google Fit, but not to Fitbit. In this scenario, Fitbit is an island; it doesn't integrate with Google Fit or Nike+.

Here's the thing -- all of these fitness services provide APIs to interact with their data. All I need to do is write something to make them all talk to each other. I want to achieve a few things; my solution needs to be:

  • Easy. I can be pretty lazy when it comes to writing and maintaining software on my own time.
  • Flexible and Extensible. I'm not sure exactly what makes the most sense -- what data I plan to move from one system to the other, how frequently, etc. I know if I can write something that keeps my options open, I'll be more likely to continue using it.
  • Available. Years ago, I hosted a handwritten integrator application on my home computer that synchronized my Nike+ runs with Fitbit. It was unreliable because my home setup wasn't meant to be "always on." I need something that's hosted externally and can run reliably throughout the day to keep my data fresh.

Microsoft Flow, Azure Logic Apps, or Azure Functions?

Really, I'm looking for an easily configurable integration space. I immediately thought of a few Microsoft technologies -- Microsoft Flow, Azure Logic Apps, and Azure Functions. I'm a developer, so I tend to start with code. For this solution, I'd really like the convenience of having an integration workspace that I can easily add/remove and configure components to move data between my systems. I have a good feeling that these technologies can all achieve my goals, and I'll probably end up with a few different flavors as I move forward. Here's a great article describing some of the differences between the technologies.

The main goal is to get data into and out of Fitbit, so I'll start by building out a custom connector that can be used in both Flow and Logic Apps. That should provide me with the most flexibility until I have a better idea if more advanced customization is required. In this case since I just need access to Fitbit, I don't think I'll need any additional configuration. I can defer the decision between technologies until after I get a chance to see how the custom connector works for me.

"Basically, connectors are web APIs that use REST for pluggable interfaces, Swagger metadata format for documentation, and JSON as their data exchange format." (Microsoft's documentation on custom connectors for Logic Apps)
So will this work for me? Fitbit provides a REST web API, an OpenAPI (Swagger) specification, and uses JSON. Should be easy to get started.
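To make that concrete, here's a minimal sketch of what a Swagger 2.0 (OpenAPI) specification for a Fitbit-style operation looks like. This fragment is illustrative only -- it is not copied from Fitbit's actual specification, though the devices path matches Fitbit's documented endpoint as I understand it:

```json
{
  "swagger": "2.0",
  "info": { "title": "Fitbit Web API", "version": "1" },
  "host": "api.fitbit.com",
  "schemes": [ "https" ],
  "paths": {
    "/1/user/-/devices.json": {
      "get": {
        "operationId": "GetDevices",
        "summary": "Returns a list of the user's registered devices.",
        "produces": [ "application/json" ],
        "responses": { "200": { "description": "Success" } }
      }
    }
  }
}
```

Everything a custom connector needs -- host, paths, operations, response formats -- comes from metadata like this.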

Building a Microsoft Flow or Logic App Connector

Creating a custom connector is nearly identical for Microsoft Flow and Azure Logic Apps, with the same setup and requirements; I'll be creating one for Microsoft Flow. While the custom connector should be easy to build by importing the OpenAPI specification, a few requirements of the Fitbit API must be met. First, I'll need to register an application with Fitbit. Second, I'll need to make sure the connector uses OAuth 2.0 to authorize against the Fitbit API. With this in place, I can expose the operations on the connector as actions that can be added to a Flow or Logic App, which should make it easy to build out different integration scenarios with Fitbit.

Registering my application

Registration with Fitbit was very easy. As a registered Fitbit user, I can use my standard (free) account to register and maintain my application. Register new applications here by filling out the form:

Note that I'm creating this application for Personal use, so both the authorization code and implicit grant OAuth 2.0 flows are available for authorization. I also included a Callback URL, but don't worry if you don't have one yet -- you can edit it later, after we have the correct one from the Flow application. Finally, make sure you request Read and Write access, since this connector could be used in both scenarios.
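For reference, the authorization code grant starts by sending the user to Fitbit's authorization endpoint. Here's a minimal Python sketch of how that URL is assembled (the endpoint and parameter names follow Fitbit's OAuth 2.0 documentation as I understand it; the client ID, callback, and scopes are placeholders):

```python
from urllib.parse import urlencode

# Placeholder values from the Fitbit application registration.
CLIENT_ID = "YOUR_CLIENT_ID"
CALLBACK_URL = "https://example.com/callback"

def build_authorize_url(client_id, callback_url, scopes):
    """Build the Fitbit OAuth 2.0 authorization-code URL."""
    params = {
        "response_type": "code",    # authorization code grant
        "client_id": client_id,
        "redirect_uri": callback_url,
        "scope": " ".join(scopes),  # e.g. activity, heartrate, sleep
    }
    return "https://www.fitbit.com/oauth2/authorize?" + urlencode(params)

url = build_authorize_url(CLIENT_ID, CALLBACK_URL, ["activity", "heartrate"])
print(url)
```

The custom connector performs this dance for you; the sketch just shows what's happening behind the Flow UI when you create a connection.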

After registration, you'll see some important details:

Keep these details handy, because you'll need them when we create the connector. The Client ID and Secret will be used by the Flow connector to authenticate, and the URI details will be needed so that the identity provider (Fitbit) can be set up correctly.

Creating the custom connector

Because Fitbit exposes an OpenAPI specification, this step should be pretty simple. The specification provides the connector with everything it needs to build out actions based on the defined operations. Several validation errors prevented me from importing directly from the URL, so I downloaded the specification locally and edited it until it validated successfully. This doesn't change the Fitbit Web API or what it exposes -- the updates were just to the metadata around operation naming. Let's see it work!

First, go to Flow. Click on the settings gear to select that you want to go to custom connectors. Once there, choose to Import an OpenAPI file to create the custom connector. Select the file to upload and continue.

The first part of the setup handles General settings. You can add a custom icon and color and describe the connector. Most importantly, it defines the Host for the API. This information is populated from the OpenAPI specification, but you can edit and adjust it here.

The next section of the setup, Security, is where the information from the Fitbit application registration is used. Again, most of it is defined in the OpenAPI specification, but you'll need to add your Client ID and Secret here. Note that the Callback URL is generated only after the connector is saved, so we'll have to come back for that value -- we'll need it to update the Fitbit application registration as well.
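In Swagger 2.0 terms, this Security step corresponds to an OAuth 2.0 security definition along these lines (the definition name and scope descriptions here are illustrative; the Client ID and Secret themselves are entered in the Flow UI, not stored in the specification):

```json
{
  "securityDefinitions": {
    "fitbit_auth": {
      "type": "oauth2",
      "flow": "accessCode",
      "authorizationUrl": "https://www.fitbit.com/oauth2/authorize",
      "tokenUrl": "https://api.fitbit.com/oauth2/token",
      "scopes": {
        "activity": "Read and write activity data",
        "heartrate": "Read heart rate data"
      }
    }
  }
}
```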

The OpenAPI specification has 97 operations, and they all load easily. There is one issue, though: the specification did not validate because the operation IDs were not unique across all operations. Removing the operation IDs still brings in the operations, but each one then needs a unique Operation ID added. I could have done this in the specification file and reimported, but it's very easy to just add a unique name to each action through the Flow UI.

After adding a unique Operation ID to every connector action, you should be clear of all errors and good to save.
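If you'd rather fix the specification file before importing, a small script can make the operation IDs unique for you. A sketch, assuming a Swagger 2.0 spec loaded as JSON (the demo paths and file name are placeholders, not Fitbit's real ones):

```python
import json

def ensure_unique_operation_ids(spec):
    """Give every operation in a Swagger 2.0 spec a unique operationId."""
    seen = set()
    http_methods = {"get", "put", "post", "delete", "options", "head", "patch"}
    for path, operations in spec.get("paths", {}).items():
        for method, operation in operations.items():
            if method not in http_methods:
                continue  # skip "parameters" and other non-operation keys
            op_id = operation.get("operationId") or f"{method}_{path}"
            candidate = op_id
            n = 1
            while candidate in seen:  # append a suffix to disambiguate
                n += 1
                candidate = f"{op_id}_{n}"
            operation["operationId"] = candidate
            seen.add(candidate)
    return spec

# Demo with a duplicate operationId across two paths:
demo = {"paths": {
    "/1/user/-/activities.json": {"get": {"operationId": "GetActivities"}},
    "/1/user/-/activities/goals.json": {"get": {"operationId": "GetActivities"}},
}}
fixed = ensure_unique_operation_ids(demo)

# To fix a downloaded spec (file name is a placeholder):
# with open("fitbit-swagger.json") as f:
#     spec = ensure_unique_operation_ids(json.load(f))
```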

The next and final step to ensure your connector is ready is to Test. First, you'll need to create a New Connection. Click the button, and it will use the security settings to go to Fitbit to authorize. But there's an issue: the redirect_url is invalid. Remember from the Security step that the Redirect URL doesn't get generated until after save? Go back to the Security step, copy the Redirect URL, and use it to update the Fitbit application registration's Callback URL field.

After updating the Fitbit application registration settings with the correct Callback URL, test again. This time, you should see that everything is wired correctly. I'm already logged into Fitbit, so I'm getting asked to allow this registered application (Flow Connector) access to my account's Fitbit information. If I click Allow, the connection information (the bearer token returned from Fitbit) will be saved so that I can test the actions.

Select any action to test. In this example I tested the GetDevices action. As you can see, Fitbit responded successfully with the payload showing my Alta HR and Aria devices.
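The devices payload is a JSON array describing each registered device. As an illustration of how that response can be consumed downstream, here's a sketch in Python -- the field names follow my reading of Fitbit's Get Devices documentation, and the values are made up:

```python
import json

# Illustrative payload shaped like Fitbit's Get Devices response
# (field names per the Fitbit Web API docs; values are made up).
sample_response = """[
  {"deviceVersion": "Alta HR", "type": "TRACKER", "battery": "High"},
  {"deviceVersion": "Aria", "type": "SCALE", "battery": "Medium"}
]"""

devices = json.loads(sample_response)
names = [d["deviceVersion"] for d in devices]
print(names)
```

In a Flow, these fields show up as dynamic content you can reference in later steps, so there's no parsing code to write yourself.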

Save your custom connector, and you're all set to start using it. It should now be visible in your custom connectors list.

Summary

That's all there is! It's that easy to integrate an API into your Flows or Logic Apps via a custom connector. At its simplest, you can just import an OpenAPI specification and you're set. In this example we dealt with the added complexity of requiring OAuth 2.0 to access the API: we registered with the identity provider, set up the security, updated the callback, and tested our imported operations.

With this in place, I should now be able to easily get data into and out of Fitbit. Applying the same approach to other public APIs enables me to create true flows between the systems to keep that data in sync. I should finally be able to have all the data where I want without having to resort to local polling services or other similar approaches.

