Month: December 2015

M2Mqtt and Amazon AWS IoT

A few weeks ago I wrote about the new M2Mqtt feature: TLS client authentication support!

One of the M2Mqtt community friends, Nick Payne, has already used this new feature to connect to the Amazon IoT platform. He wrote an article showing how to use the M2Mqtt client to connect to AWS IoT!

The article is hosted on the official M2Mqtt web site and you can read it here. It’s a well-written walkthrough of all the steps needed to set up AWS IoT, with source code for connecting to it using my MQTT client.

A must-read!

The “hybrid” Internet of Things

The title of this blog post could sound strange to you … what’s the “hybrid” Internet of Things? Why speak about a hybrid nature of IoT? As we’ll see, it’s not something new: we can consider it either a new way to approach already running solutions or an old way to see new solutions in the IoT space … it’s up to you to choose the interpretation you like. In the Internet of Things nothing is new; today is simply the right time for connecting “objects” on a large scale.

So … what’s the “hybrid” Internet of Things?

The hybrid cloud

For several years, cloud computing has been the foundation of many enterprise architectures, and companies have decided to move all their data, computation, processing and so on from “on premise” infrastructures to the cloud itself.

The cloud seems to offer infinite storage space and compute scaling without any concerns for the company, which can configure all these features to adjust automatically. The result could be less time spent on handling “on premise” infrastructures and less (?) money to invest.


Even if all the available cloud platforms (Microsoft Azure, Amazon AWS, IBM Bluemix, Google App Engine, …) have certifications for data storage and related protection, there is one big concern for companies investing in a distributed architecture … the security and privacy of data.

Data is the money of the new century, and for this reason many companies today prefer to store sensitive data on their private servers while relying on the scaling and computational features of public servers in the cloud: the hybrid cloud was born in order to connect these two infrastructures.

In the hybrid architecture, the company protects sensitive data in house and leverages the computation of the public cloud, exchanging only non-sensitive data, using encrypted connections of course.

One of the most important players in the hybrid cloud is Red Hat, which builds its offering on all the main open source projects.

The idea of Internet of Things

When Kevin Ashton coined the term “Internet of Things” in 1999, he wanted to describe the connection between the physical world and the Internet, speaking about “things” (not only people) on the public network.


At that time, the hardware was too expensive, too big, not powerful enough and poorly connected. Nowadays we have very tiny devices with plenty of CPU power and memory, as well as every kind of connectivity, all at a very low cost. These conditions favored the birth of the “maker” movement: a lot of non-professional people can now develop on embedded devices in a very simple way, with the programming language they love. Not only the “old” but powerful C/C++, but many other high-level languages like C#, Java, Python and of course JavaScript.

In this scenario, the main idea is to connect all these things to the Internet directly, sending data to the cloud for processing and receiving commands to control the devices from the cloud itself. Connectivity is the main required feature; it is provided by the embedded devices themselves if they have enough resources, or through a more powerful node called a “field gateway”.

A direct connection to the Internet means a huge number of devices connected to the big network, but it brings a lot of problems too. Of course, it’s possible for all the devices that are TCP/IP capable (via Ethernet or Wi-Fi), but not for the very large number of “old” devices with legacy connections (RS232, RS485, …), devices with PAN (Personal Area Network) support (BLE, ZigBee, Z-Wave, …) or industrial protocols (OPC-UA, Modbus, CAN, …). In these scenarios, the field gateway becomes the bridge between the devices and the cloud.

Moving IoT into the “fog”

As we can see, in most cases there is a need to add another node at the edge of an IoT solution to bring real-world data and control to the cloud. There are too many problems to rely on a public server infrastructure alone, on both the device side and the cloud side.

Moving part of the intelligence to the edge of an IoT solution, close to the “T” side, is an old practice in industrial environments, but it has huge value today considering the growing number of connected devices. This approach has a well-defined name today: fog computing.


The fog solves a lot of the concerns about cloud computing. Let’s try to summarize them!

The protocol Babel tower

The first role of a field gateway is protocol translation. This need is always there: it’s true for devices that are already TCP/IP capable and can connect to the Internet, but it’s also true for legacy and constrained devices that need a bridge to the cloud. Devices may be able to connect directly to the cloud from a network capabilities point of view, yet not speak the same “language”, that is the same protocol. Today’s Babel tower is home to a lot of protocols, and the online IoT platforms can’t speak all of them. The first need is a local translation from the device protocol to the cloud protocol, whether the device side is based on TCP/IP (see MQTT, AMQP, HTTP, …) or on a personal area network (see BLE, for example).
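A minimal sketch of this translation step in Python: a gateway function turns a binary frame from a legacy device into a JSON payload for a cloud protocol such as MQTT. The frame layout (a single signed 16-bit big-endian temperature register, in tenths of a degree) and the topic scheme are assumptions for illustration only, not any real device’s format.

```python
import json
import struct

def translate_frame(device_id, raw_frame):
    """Translate a legacy binary frame into (topic, JSON payload).

    Assumed frame layout: one signed 16-bit big-endian register holding
    the temperature in tenths of a degree. Topic scheme is illustrative.
    """
    (raw_temp,) = struct.unpack(">h", raw_frame)
    payload = {"device": device_id, "temperature": raw_temp / 10.0}
    topic = "gateway/%s/telemetry" % device_id
    return topic, json.dumps(payload)
```

The gateway would then publish the returned payload on the returned topic with whatever cloud protocol client it uses.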

Reducing cloud workload

Using an IoT field gateway, part of the work is done in the fog at the edge of our complex solution, which means reducing the cloud workload. Data centers are huge, with powerful public servers, but their resources aren’t unlimited; millions, billions or trillions of connected devices are a problem for the cloud, and reducing this number could be a very good solution. Thanks to a central node at the edge, we can establish only a few connections while sending data that represents many devices in the local network. Sometimes not all the produced data is needed for processing in the cloud, so pre-processing is executed at the edge to analyze, filter, reduce and elaborate the data to send.
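The pre-processing idea can be sketched in a few lines: instead of forwarding every raw sample, the gateway sends one aggregate message, optionally keeping only values whose magnitude exceeds a threshold. The function name and summary fields are illustrative, not part of any real platform.

```python
def summarize(readings, threshold=None):
    """Reduce a batch of raw sensor readings to one summary message.

    A sketch of edge pre-processing: filter, then aggregate. Returns
    None when nothing is worth sending to the cloud.
    """
    if threshold is not None:
        readings = [r for r in readings if abs(r) > threshold]
    if not readings:
        return None  # nothing to send this period
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }
```

One summary per batch replaces dozens of raw messages, which is exactly the workload reduction described above.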

Real time reaction

A lot of IoT solutions need near real-time reaction. The connection to the cloud introduces latency: to control a device, the data is sent to the cloud through the “big net”, the server processes it and replies with a command to start an action on the device itself. There is a non-negligible round trip related to the connection, and to the server workload too, because the server is serving not just one device but a huge number of them.

Offline handling

Without an Internet connection, an IoT solution which is entirely based on an online platform can’t work! Of course, all cloud providers offer an SLA (Service Level Agreement) very close to 100%, but we still have to consider that very low percentage of failure. With fog computing we have a local node, and we have to deal with the local network connection only. The same is true when the cloud platform is available but the connection isn’t reliable (which is like having the platform offline) or the bandwidth is low. Thanks to the central node at the edge, we are able to handle a lot of offline scenarios: we can store data locally when the connection isn’t available.
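A store-and-forward buffer is the classic way a gateway handles this. The sketch below keeps readings in memory while the cloud link is down and drains them, in order, once a publish succeeds; a real gateway would persist the backlog to disk, and the `send` callable is a placeholder for the actual cloud publish.

```python
from collections import deque

class StoreAndForward:
    """Buffer readings locally while the cloud link is down (sketch)."""

    def __init__(self, maxlen=1000):
        self.backlog = deque(maxlen=maxlen)  # oldest readings dropped when full

    def publish(self, send, reading):
        try:
            self.flush(send)          # drain any backlog first, in order
            send(reading)
        except ConnectionError:
            self.backlog.append(reading)  # link down: store locally

    def flush(self, send):
        while self.backlog:
            send(self.backlog[0])
            self.backlog.popleft()    # drop only after a successful send
```

When the connection comes back, the next publish delivers the whole backlog before the new reading, so no data is lost up to the buffer size.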

Security and privacy

In a lot of scenarios, protecting data is a must. Even if all cloud connections are encrypted using the SSL/TLS protocol, many companies building IoT solutions prefer to keep data on their private servers to protect themselves and their customers (who are the data owners). Using a field gateway, we can filter data to avoid sending all of it to the cloud; we can hold sensitive data in our local network and send only the non-sensitive data to the public servers.
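At the gateway this filtering can be as simple as splitting each reading into a part kept on the local private server and a part forwarded to the public cloud. Which fields count as sensitive is, of course, an assumption of this sketch.

```python
SENSITIVE_FIELDS = {"patient_name", "location"}  # illustrative field names

def split_payload(reading):
    """Split a reading: sensitive fields stay local, the rest goes to
    the cloud (a sketch of gateway-side privacy filtering)."""
    local_only = {k: v for k, v in reading.items() if k in SENSITIVE_FIELDS}
    to_cloud = {k: v for k, v in reading.items() if k not in SENSITIVE_FIELDS}
    return local_only, to_cloud
```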

Reducing price

IoT cloud platforms aren’t free: they have a cost related to the number of connected devices and the number of messages exchanged per hour/day. Thanks to the field gateway we can connect more local devices using only a single connection (for the gateway itself), and thanks to pre-processing we can filter and reduce the amount of data to send: it means reducing cost. This is even more important when the traffic isn’t “free” but has a higher cost, for example when using a GSM connection.
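A back-of-the-envelope calculation shows why the message count matters. The pricing model below (a flat price per million messages) and all the numbers are purely illustrative, not any real platform’s price list.

```python
def monthly_message_cost(devices, msgs_per_device_per_day,
                         price_per_million, days=30):
    """Illustrative per-message cost model (not a real price list)."""
    messages = devices * msgs_per_device_per_day * days
    return messages / 1_000_000 * price_per_million

# 1000 devices each sending one message per minute, directly to the cloud,
# versus one gateway sending the same rate of aggregated messages.
direct = monthly_message_cost(1000, 1440, 1.0)
via_gateway = monthly_message_cost(1, 1440, 1.0)
```

With these assumed numbers, the direct architecture pays for 43.2 million messages a month while the gateway pays for 43,200: three orders of magnitude fewer billable messages.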

The “fog” computing is around us

If we think about today’s IoT market, almost 100% of the solutions are “fog” solutions and don’t rely on a pure cloud architecture. Think about connected cars: all the sensors speak a specific protocol on a CAN bus, and a single central gateway is the only node with an Internet connection, sending the gathered data to the cloud.


The same goes for industrial environments (think about a car production line) where all devices use industrial protocols like OPC-UA and need near real-time reaction (think about the robots in the production line).


The smart home is another example, made of BLE, ZigBee and Z-Wave based devices connected to the Internet through a central router. In general, smart grid solutions like smart cities are based on an architecture made of many local networks connected to the cloud through a single point. A last simple example is the wearables market: all wearable devices are very constrained and aren’t TCP/IP capable, so we need a gateway to send their data to the cloud; in most cases this gateway is our smartphone!


All around the world, big companies speak about their great online IoT platforms (you can read about two of them in this article). Of course we need them, because building an “on premise” IoT solution is almost impossible. It’s also true that we can’t rely on a “pure” cloud architecture. We saw that the solutions already in the field aren’t so pure; they are “fog” solutions, and this could be considered the only reliable approach for future implementations.

The future is not “pure” … it’s “hybrid” … and for this reason I like to speak about the “hybrid” Internet of Things!

TLS client authentication for M2Mqtt !

Today, a new M2Mqtt version was released!

It has “only one” great improvement … TLS client authentication support!

Now you can use a client certificate to authenticate the client itself during the SSL/TLS handshake with the broker. A big IoT platform like Amazon AWS IoT requires it, and for this reason we now support this kind of authentication!

You can connect your MQTT-based IoT device to AWS IoT thanks to the new TLS client authentication feature.
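The concept behind the feature can be shown with the Python standard library (M2Mqtt itself is a .NET library, so this is a sketch of the mechanism, not its API): the client verifies the broker’s certificate and, when a certificate and key are given, presents its own during the handshake. The file paths are placeholders.

```python
import ssl

def make_mutual_tls_context(ca_cert=None, client_cert=None, client_key=None):
    """TLS context for mutual authentication (sketch).

    With no arguments it only verifies the server against the system CA
    store; passing client_cert/client_key (e.g. the files issued by an
    mTLS broker) makes the client authenticate itself too.
    """
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=ca_cert)
    if client_cert is not None:
        ctx.load_cert_chain(certfile=client_cert, keyfile=client_key)
    return ctx
```

With such a context, wrapping the broker socket via `ctx.wrap_socket(sock, server_hostname=host)` completes the mutual handshake before any MQTT packet is exchanged.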

The release is available as source code under the Eclipse Paho project on GitHub, and as binaries on NuGet.