
Internet of Things : reactive and asynchronous with Vert.x !

I have to admit … before joining Red Hat I didn’t know about the Eclipse Vert.x project, but it took me only a few days to fall in love with it !

For the other developers who don’t know what Vert.x is, the best definition is …

… a toolkit to build distributed and reactive systems on top of the JVM using an asynchronous non blocking development model

The first big thing is that Vert.x lets you develop a reactive system, which means :

  • Responsive : the system responds in an acceptable time;
  • Elastic : the system can scale up and scale down;
  • Resilient : the system is designed to handle failures gracefully;
  • Asynchronous : the interaction with the system is achieved using asynchronous messages;

The other big thing is the asynchronous non-blocking development model, which doesn’t mean multi-threading : thanks to non-blocking I/O (i.e. for handling network, file system, …) and a callback system, it’s possible to handle a huge number of events per second using a single thread (aka “event loop”).

You can find a lot of material on the official web site to better understand what Vert.x is and all its main features; it’s not my objective to explain it in this very short article, which is mostly … you guessed it … messaging and IoT oriented  🙂

In my opinion, all the above features make Vert.x a great toolkit for building Internet of Things applications where being reactive and asynchronous is a “must” in order to handle millions of connections from devices and all the messages ingested from them.

Vert.x and the Internet of Things

Since Vert.x is a toolkit, and so made of different components, which of the ones it provides are useful for IoT ?

Starting from the Vert.x Core component, there is support for both versions of the HTTP protocol (1.1 and 2.0) in order to develop an HTTP server which can expose a RESTful API to the devices. Today, a lot of web and mobile developers prefer to use this protocol for building their IoT solution, leveraging the deep knowledge they have about the HTTP protocol.

Regarding more IoT-oriented protocols, there is the Vert.x MQTT server component. It doesn’t provide a full broker, but exposes an API that a developer can use to handle incoming connections and messages from remote MQTT clients and then build the business logic on top of it, for example developing a real broker or executing protocol translation (i.e. to/from plain TCP, to/from the Vert.x Event Bus, to/from HTTP, to/from AMQP and so on). The API raises all events related to the connection request from a remote MQTT client and to all subsequent incoming messages; at the same time, it provides the way to reply to the remote endpoint. The developer doesn’t need to know how MQTT works on the wire in terms of encoding/decoding messages.

Related to the AMQP 1.0 protocol, there are the Vert.x Proton and the AMQP bridge components. The first one provides a thin wrapper around the Apache Qpid Proton engine and can be used for interacting with AMQP based messaging systems as a client (sender and receiver), or even for developing a server. The second one provides a bridge between the protocol and the Vert.x Event Bus, mostly used for communication between deployed Vert.x verticles. Thanks to this bridge, verticles can interact with AMQP components in a simple way.

Last but not least, there is the Vert.x Kafka client component, which provides access to Apache Kafka for sending and consuming messages from topics and the related partitions. A lot of IoT scenarios leverage Apache Kafka in order to have an ingestion system capable of handling millions of messages per second.

Conclusion

The current Vert.x code base provides quite interesting components for developing IoT solutions : some are already available in the current 3.3.3 version (see Vert.x Proton and the AMQP bridge) and others will arrive with the upcoming 3.4.0 version (see the MQTT server and the Kafka client). Of course, you don’t need to wait for their official release : even if they are still under development, you can already adopt these components and provide your feedback to the community.

This ecosystem will grow in the future and Vert.x will be a leading actor in the world of IoT applications based on a microservices architecture !

M2Mqtt and GnatMQ are dead ? Long life to them !

In the last few months a lot of people have asked me whether the M2Mqtt and GnatMQ projects are dead, because there have been no commits for a long time.

I can say … no ! They are absolutely not dead !

I have been very busy in the last few months, starting my new exciting Red Hat career in the messaging and IoT team, but I want to continue to support the projects as lead and main committer. Of course, I’ll still be busy with my new job, but my commitment is to continue to improve the libraries, fixing bugs and adding new features. Just remember that I’ll do it in my spare time, so I may not be so quick to reply to your suggestions; of course, your help with useful pull requests will be great !

As you know, today the M2Mqtt client library is under the official Paho GitHub repo and I have finally deleted the repo on my personal GitHub account.

GnatMQ, based on M2Mqtt, will continue to live where it is now, here.

I’ll do my best to check the issues and the pull requests provided by the community; I’m happy to know that my projects have a lot of followers and that there are a lot of professional solutions using them out there !

As always, I invite you to share your experience using my libraries with me, so I can mention your solution in the “case studies” section of the official web site.

Thanks !

M2Mqtt : another move … now under the official Eclipse Paho GitHub repo !

As you know, it has been about 2 years since my M2Mqtt library joined the Eclipse Paho project as the official and supported C#, .Net (desktop, compact and micro) and WinRT client for the MQTT protocol.

Its source code was first released on CodePlex and then moved to GitHub under my personal account last year, but today it’s officially part of the Eclipse Paho GitHub account.

It means that in the next few days I’m going to delete it from my personal GitHub account and you will find all updates and fixes in its new “home”. Of course, you have to use this new repository to open new issues.

Its brother, the GnatMQ broker, will stay on my personal GitHub account.

I’ll remain the principal committer and contributor but, as an open source project, its real power will come from the community !

M2Mqtt and Amazon AWS IoT

A few weeks ago I wrote about the new M2Mqtt feature : TLS client authentication support !

One of the M2Mqtt community friends, Nick Payne, has already used this new feature to connect to the Amazon AWS IoT platform. He wrote an article for us to show how to use the M2Mqtt client to connect to AWS IoT !

The article is hosted on the official M2Mqtt web site and you can read it here. It’s a well-written walkthrough of all the steps needed to set up AWS IoT, along with the source code for connecting to it using my MQTT client.

A must read !

TLS client authentication for M2Mqtt !

Today, the new M2Mqtt 4.3.0.0 version was released !

It has “only one” great improvement … the TLS client authentication support !

Now, you can use a client certificate to authenticate the client itself during the SSL/TLS handshake with the broker. A big IoT platform like Amazon AWS IoT uses it and, for this reason, we now support this kind of authentication !

You can connect your MQTT-based IoT device to AWS IoT thanks to the new TLS client authentication feature.
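
Just to give an idea, here is a minimal sketch of a client using this feature; it assumes the 4.3.0.0 constructor overload that takes the CA certificate, the client certificate and the TLS protocol version, and the endpoint and file names are placeholders you have to replace with your own AWS IoT values.

using System;
using System.Text;
using System.Security.Cryptography.X509Certificates;
using uPLibrary.Networking.M2Mqtt;

class Program
{
    static void Main(string[] args)
    {
        // CA certificate used to authenticate the broker (server authentication)
        X509Certificate caCert = X509Certificate.CreateFromCertFile("root-ca.crt");
        // client certificate (with its private key) used for the TLS client authentication
        X509Certificate clientCert = new X509Certificate2("device-certificate.pfx", "<pfx password>");

        // placeholder endpoint : replace it with your AWS IoT endpoint
        MqttClient client = new MqttClient("<your endpoint>.iot.us-east-1.amazonaws.com",
            8883, true, caCert, clientCert, MqttSslProtocols.TLSv1_2);

        client.Connect(Guid.NewGuid().ToString());
        client.Publish("my/topic", Encoding.UTF8.GetBytes("hello from M2Mqtt"));
        client.Disconnect();
    }
}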

The release is available as source code under the Eclipse Paho project on GitHub and as binaries on NuGet.

WPC 2015 Milan : Azure IoT Hub and IoT Suite

Organized by Overnet, in collaboration with Microsoft, WPC is the most important Italian conference focused on Microsoft technologies. This year it will take place over two full-immersion days, on December 1st and 2nd, with 70 sessions across 8 tracks.

I’m honoured to be part of the speakers team this year as a Microsoft MVP on Windows Embedded and IoT; on December 2nd, I’ll have a session about Microsoft Azure IoT Hub with an overview of the new Azure cloud gateway and the related Azure IoT Suite.

For sure, the conference will be great for content and for networking with experts on Microsoft technologies. Don’t forget the “Ask The Expert” corner with a “bunch” of Microsoft MVPs ready to answer your questions.

All information and details about registration and the conference are on the official web site.

AMQP isn’t so scary … if you know how to start !

One of the first-class citizens in the “IoT protocols city” is the AMQP (Advanced Message Queuing Protocol) protocol, which is very popular even if it seems that there are few good resources for starting to understand how it works under the covers (after a bit of searching on Google and Bing). Of course, for every protocol the most complete resource is the official specification, but most people consider it very difficult to read (if they don’t need to implement it).

To simplify your understanding, I’d like to wrap up some resources that are useful for starting to study the AMQP protocol.

A good starting point is an old but very useful article by Kelly Sommers (aka “kellabyte” on Twitter) who explains how the AMQP 1.0 specification is completely different from the old AMQP 0.9.1 specification : Clarifying AMQP.

For a high-level overview, I found the articles posted by Chuck Rolke (from Red Hat) on his blog very interesting :

To dig into the protocol, covering message framing and the type system, you can read my articles :

The great thing is that all the examples related to the above blog posts use the AMQP .Net Lite library (an AMQP C# implementation) from Microsoft as the client.

A few articles … but very useful ones.
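
To give just a taste of the AMQP .Net Lite library used in those examples, here is a minimal sender sketch; the address, credentials and queue name are placeholders, and a real application should of course handle errors and use the asynchronous API.

using Amqp;

class Program
{
    static void Main(string[] args)
    {
        // placeholder address : replace host, user, password and port with your AMQP endpoint
        Address address = new Address("amqps://user:password@myhost.example.com:5671");

        Connection connection = new Connection(address);
        Session session = new Session(connection);

        // attach a sender link to a node named "myqueue" (placeholder)
        SenderLink sender = new SenderLink(session, "sender-link", "myqueue");
        sender.Send(new Message("Hello AMQP !"));

        sender.Close();
        session.Close();
        connection.Close();
    }
}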

Of course, I read the official specification because, in my humble opinion, it’s always the best resource and I’d like to understand how things work under the covers (in my case, it’s all related to Azure Service Bus, used in the IoT space, which supports the AMQP protocol).

Last but not least, there is a great discussion on Channel9 about AMQP 1.0 and its general availability in Microsoft Azure Service Bus (it was 2013) between Clemens Vasters and David Ingham, whom we can consider the AMQP gurus without any doubt.

If you have other resources to enrich the above list, please don’t hesitate to contact me !

[Update October 2nd 2015]

A very nice and interesting article titled AMQP as network protocol by Ted Ross (from Red Hat), with a brief overview of the protocol and its multiplexing, flow control and delivery features.

Clemens Vasters (from the Microsoft Azure Messaging team) is a guru on the AMQP protocol and today he published the great presentation Introduction to AMQP 1.0 with an in-depth analysis of the architecture and all the related features. It’s a must-read for understanding how the protocol works under the hood on the wire.

[Update October 5th 2015]

After his great slide deck about the AMQP 1.0 protocol, today Clemens Vasters published a great video series on his Subscribe! blog on Channel9. The video series has the following 6 parts :

If you prefer, the videos are also available as a playlist on YouTube here.

What can I say ? No words … it’s Clemens’ stuff 😉

[Update January 18th 2016]

Today my “AMQP Essentials” cheat sheet was published on the DZone web site as a refcard. It’s a lightweight introduction to the AMQP protocol, with all the main features in a single short PDF file (6 pages).

GnatMQ and SSL/TLS support : make it up and running !

After the last blog post on the new GnatMQ release with SSL/TLS support, it’s now time to show how to use this new feature.

Asymmetric encryption and SSL/TLS : a brief introduction

Speaking about SSL/TLS security means speaking a lot about asymmetric encryption (even if symmetric encryption is used as well) and X.509 certificates. Regarding asymmetric encryption, we know that it’s based on public and private keys, compared to symmetric encryption which uses only one shared key. Simplifying how asymmetric encryption works, we can say that the message sender needs the public key of the receiver in order to encrypt data. The receiver is the only one to have its private key, so it’s the only one able to decrypt a message that was encrypted with the related public key. It means that the public key can be distributed publicly but the private key must be kept secret. Asymmetric encryption can be used not only to encrypt data but to sign it too. In this case, the sender computes a hash of the message to send and signs it using its private key; the receiver can use the corresponding sender public key to verify the signature (the hash) and in this way it can be sure that the message was sent by the right sender.

Of course, this is only a simple and brief introduction to how asymmetric encryption works and I suggest you dig into it further; you can find a lot of content online.
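
Just to make the two usages above concrete, here is a minimal C# sketch using the .NET RSACryptoServiceProvider : encryption with the receiver public key and signing with the sender private key; the key pairs are generated on the fly only for the example.

using System;
using System.Security.Cryptography;
using System.Text;

class Program
{
    static void Main(string[] args)
    {
        byte[] data = Encoding.UTF8.GetBytes("message to protect");

        using (RSACryptoServiceProvider receiver = new RSACryptoServiceProvider(2048))
        using (RSACryptoServiceProvider sender = new RSACryptoServiceProvider(2048))
        using (RSACryptoServiceProvider receiverPublic = new RSACryptoServiceProvider())
        using (RSACryptoServiceProvider senderPublic = new RSACryptoServiceProvider())
        {
            // encryption : the sender uses the receiver public key ...
            receiverPublic.ImportParameters(receiver.ExportParameters(false));
            byte[] encrypted = receiverPublic.Encrypt(data, false);

            // ... and only the receiver (private key owner) can decrypt
            byte[] decrypted = receiver.Decrypt(encrypted, false);

            // signature : the sender hashes the message and signs it with its private key ...
            byte[] signature = sender.SignData(data, "SHA256");

            // ... and anyone with the sender public key can verify that signature
            senderPublic.ImportParameters(sender.ExportParameters(false));
            bool valid = senderPublic.VerifyData(data, "SHA256", signature);

            Console.WriteLine("decrypted : " + Encoding.UTF8.GetString(decrypted));
            Console.WriteLine("signature valid : " + valid);
        }
    }
}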

How is asymmetric encryption related to SSL/TLS protocol ?

SSL/TLS is a protocol on top of TCP/IP that is used for server authentication (to know that we are really speaking with the server we want to speak with), client authentication (a check on the server side that the client is allowed to connect to the server itself) and content encryption (guaranteeing confidentiality of the transmitted data). SSL/TLS is based on asymmetric encryption in the following way :

  • the server has a private key for encrypting messages during the first connection phase, named the SSL/TLS handshake;
  • the server has a related public key, distributed publicly to all clients that want to connect to it;

I said that the server private key is used only during the SSL/TLS handshake and not for encrypting all the messages exchanged during the connection. It’s true : during this phase, client and server exchange some information to generate a symmetric key that they will use during the communication to encrypt all the exchanged data. Why this step from asymmetric to symmetric encryption ?

Asymmetric encryption is more complex than symmetric encryption : it needs more CPU and a bigger memory footprint, even if it’s more secure. Symmetric encryption is less complex and doesn’t need huge resources for processing. This is the reason why the client and server accept the heavier load of asymmetric encryption only during the handshake, in order to generate a symmetric key, and then use that symmetric key for encrypting all the traffic, so that the communication is much faster.

Now, the big question is : how can the client be sure it’s talking with the server it would like to talk to ? Here the X.509 certificate standard and the Certification Authority concept come in, and of course the second possible usage of asymmetric encryption : the signature.

The server public key must be signed by a trusted third party authority so that the client can trust this public key and check that it’s talking with the right server. A Certification Authority (CA) is a trusted third party that distributes X.509 certificates : you can think of a certificate as a server public key with a signature from the CA (plus some other information). It means that we can buy a server certificate from a CA directly, or we can ask the CA to sign our already generated public key. Of course, the CA uses its own private key to compute the hash/signature of that public key and obtain a valid X.509 certificate.

How is the client able to check the server certificate validity ? Of course, the client needs the CA certificate (its public key) and can use it to verify the signature on the certificate received from the server.
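
As a rough idea of what this check looks like in .NET, here is a minimal sketch using the X509Chain class; the file names refer to the certificates we are going to create below, and a real validation should also inspect the built chain to confirm that its root really is our CA.

using System;
using System.Security.Cryptography.X509Certificates;

class Program
{
    static void Main(string[] args)
    {
        // the CA certificate (trusted by the client) and the server certificate to verify
        X509Certificate2 caCert = new X509Certificate2("MyCARoot.cer");
        X509Certificate2 serverCert = new X509Certificate2("GnatMQ.cer");

        X509Chain chain = new X509Chain();
        // make our CA certificate available for building the chain ...
        chain.ChainPolicy.ExtraStore.Add(caCert);
        // ... and accept a root that is not in the machine trusted store (it's self-signed)
        chain.ChainPolicy.VerificationFlags = X509VerificationFlags.AllowUnknownCertificateAuthority;
        chain.ChainPolicy.RevocationMode = X509RevocationMode.NoCheck;

        bool built = chain.Build(serverCert);

        // a real check should also verify that the chain root thumbprint matches our CA
        Console.WriteLine("server certificate chains correctly : " + built);
    }
}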

Now you could ask … who signs the CA certificate ? In this case the CA is considered a “root”, with no other trusted third party above it; for this reason the CA certificate is self-signed, which means it’s signed by the CA itself … you just need to trust it ! However, you need to know that there are also CA certificates that are not self-signed but signed by other CAs; in this way it’s possible to have CAs that distribute X.509 certificates to companies but can’t be considered “root”, because their certificates are signed by other “root” CAs.

After this high-level introduction to SSL/TLS (you should dig into it to understand it better), it’s time to “play” with certificates and prepare our GnatMQ broker with the new security feature.

Self-signed CA certificate creation

First of all we need a CA certificate to generate and sign the server certificate. We could buy the server certificate from a trusted third party Certification Authority such as GoDaddy, Symantec (Verisign), Comodo or GlobalSign but, just for testing, we can first generate a self-signed CA certificate and then use it to sign our server certificate. With our CA certificate we are able to impersonate a trusted Certification Authority, so that we can sign any other certificate we’d like to use for our servers.

Remember that the certificates story starts from generating a public and private key pair for asymmetric cryptography. A server public key is signed with a CA private key to obtain a so-called “certificate”. In this way the server public key can be distributed publicly in a certificate fashion. In the case of a self-signed CA certificate, the CA private key is used to sign the same CA public key to obtain the self-signed CA certificate itself.

To execute all the available operations on X.509 certificates (create, sign, convert and so on) there are two main tools : OpenSSL and Makecert.

During this article we’ll use Makecert, which comes with the Visual Studio 2015 installation (it’s available with previous versions as well).

Of course, the certificate generation operation will generate both the private key and the public key; the latter will be signed to become a standard X.509 certificate to distribute publicly.

Let’s start with the “Developer Command Prompt for VS2015” and check that the Makecert usage is the following :

makecert [options] outputCertificateFile

with a lot of “options” we can specify and the output certificate file.

Let’s execute the following command :

makecert -n "CN=MyCARoot" -r -a sha512 -len 4096 -cy authority -sv MyCARoot.pvk MyCARoot.cer

where the options have the following meaning :
  • -n : specifies the so-called Common Name; it’s the subject certificate name conforming to the X.500 standard
  • -r : asks for the creation of a self-signed certificate
  • -a : specifies the algorithm used to sign (in this case self-sign) the certificate
  • -len : specifies the generated private key length, in bits
  • -cy : specifies the certificate type. Valid values are “end” for end-entity and “authority” for Certification Authority
  • -sv : specifies the name of the generated subject’s private key file

It’s important to note that the -n parameter can receive more options, like Country Name (C), Organization Name (O), Locality Name (L) and so on; in this case a more complex value could be : “CN=MyCARoot,C=Italy,O=MyCACompany,L=Naples”.

There are a lot of other options you can specify like the expiration date and so on. For more information, you can see the official documentation for Makecert tool.

After executing this command you are prompted to insert the “passphrase” (a password) to protect the private key.

The first time you need to create the password for the private key and the second time you are prompted to insert the previous one to access the private key and (self-)sign the certificate.

Now we have the MyCARoot.cer file and the MyCARoot.pvk file. The former is the self-signed CA certificate in the DER (binary) format and the latter is the private key used to sign the generated public key to obtain the above certificate; it will also be used to sign any other server certificate.

Create the server certificate

The next step is to create the server certificate and sign it with the above CA certificate (using the related private key, of course) :

makecert -n "CN=DESKTOP-862CII2" -iv MyCARoot.pvk -ic MyCARoot.cer -pe -a sha512 -len 4096 -sky exchange -sv GnatMQ.pvk GnatMQ.cer

where the options have the following meaning :
  • -n : specifies the Common Name of the server. It must match the domain name or the PC name (as in this example)
  • -iv : specifies the issuer’s .pvk private key file (so the CA private key file)
  • -ic : specifies the issuer’s certificate file (so the self-signed CA certificate)
  • -pe : marks the generated private key as exportable (we’ll see what that means)
  • -a : specifies the algorithm used to sign the certificate
  • -len : specifies the generated private key length, in bits
  • -sky : specifies the subject’s key type. The “exchange” value means the key is used for key encryption and key exchange (what happens during the SSL/TLS handshake)
  • -sv : specifies the name of the generated subject’s private key file

As for the CA certificate generation, you are prompted to insert the passphrase to protect the server private key. In this case, you are prompted to insert the passphrase for the CA private key too (the issuer), because it is necessary to access the CA private key for signing the server public key and generating the certificate.

In order to use the server certificate inside the broker we need to export it in the PFX format (the predecessor of the current PKCS#12). PKCS#12 evolved from the personal information exchange (PFX) standard and is used to exchange public and private objects in a single file. It means that the single PFX file contains both the public and private key of the server. In this way, it has all the data needed to support an SSL/TLS connection : it can deploy its certificate and encrypt data using the private key. This is the reason why we used the “-pe” option in the Makecert command, to mark the private key as exportable into the PFX file.

To do this we can use the pvk2pfx tool in the following way :

pvk2pfx -pvk GnatMQ.pvk -spc GnatMQ.cer -pfx GnatMQ.pfx -po <password>

where the options have the following meaning :

  • -pvk : specifies the name of a .pvk file (broker private key)
  • -spc : specifies the name and extension of the Software Publisher Certificate (SPC) file that contains the certificate (broker certificate)
  • -pfx : specifies the name of the output .pfx file
  • -po : password for the .pfx file

Setup GnatMQ security

Now we are able to use the server certificate to create a GnatMQ instance with SSL/TLS support. You can get the latest version as a NuGet package and reference it inside your project. You can use the MqttBroker class instance inside a simple console application or a worker role on Azure; it’s up to you how to use it. For a simple example, the console application is the right choice.

First of all you need to add the generated GnatMQ.pfx file as a resource inside your project to make it available in your source code.

The source code is so simple …

class Program
{
    static void Main(string[] args)
    {
        X509Certificate2 serverCert = new X509Certificate2(Resource1.GnatMQ, "<pwd>");

        // create and start broker
        MqttBroker broker = new MqttBroker(serverCert, MqttSslProtocols.TLSv1_0);
        broker.Start();

        Console.ReadLine();

        broker.Stop();
    }
}

In order to use the PFX file you can instantiate the X509Certificate2 class, providing it the raw bytes and the password (the one you specified during the PFX file creation). The MqttBroker class has a simple constructor for SSL/TLS support with only two parameters :

  • the server certificate;
  • the SSL/TLS protocol version;

In this way the broker is up and running and can accept only SSL/TLS connections !

Another way to instantiate the class is to use the constructor with other parameters like the RemoteCertificateValidationCallback and LocalCertificateSelectionCallback that you can use to customize the certificate validation and selection process.
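
For example, a minimal sketch could look like the following one (assuming the GnatMQ constructor overload that accepts these two callbacks); here the validation callback simply accepts clients whose certificate passes the standard validation and the selection callback always returns our server certificate.

using System;
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;
using uPLibrary.Networking.M2Mqtt;

class Program
{
    static void Main(string[] args)
    {
        X509Certificate2 serverCert = new X509Certificate2(Resource1.GnatMQ, "<pwd>");

        // custom validation : accept only clients whose certificate passes the standard validation
        RemoteCertificateValidationCallback validationCallback =
            (sender, certificate, chain, sslPolicyErrors) => sslPolicyErrors == SslPolicyErrors.None;

        // custom selection : always present our server certificate
        LocalCertificateSelectionCallback selectionCallback =
            (sender, targetHost, localCertificates, remoteCertificate, acceptableIssuers) => serverCert;

        MqttBroker broker = new MqttBroker(serverCert, MqttSslProtocols.TLSv1_0,
            validationCallback, selectionCallback);
        broker.Start();

        Console.ReadLine();

        broker.Stop();
    }
}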

Remember that any client you use to connect to the broker needs the self-signed CA certificate; your connection will be protected and encrypted after the server authentication.
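
As a minimal sketch on the client side (assuming the M2Mqtt constructor overload that takes the broker host, the secure port, the secure flag and the CA certificate), the connection could look like this; the host name must match the CN used in the server certificate.

using System;
using System.Text;
using System.Security.Cryptography.X509Certificates;
using uPLibrary.Networking.M2Mqtt;

class Program
{
    static void Main(string[] args)
    {
        // the client only needs the self-signed CA certificate to authenticate the broker
        X509Certificate caCert = X509Certificate.CreateFromCertFile("MyCARoot.cer");

        // the host name must match the CN used for the server certificate
        MqttClient client = new MqttClient("DESKTOP-862CII2", 8883, true, caCert);
        client.Connect(Guid.NewGuid().ToString());

        client.Publish("/test/topic", Encoding.UTF8.GetBytes("hello over TLS"));
    }
}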

One last thing to know : the SSL/TLS feature is supported by GnatMQ only on the .Net Framework platform and not on WinRT (Windows 8.1 and Windows 10), due to the sandboxed execution environment that doesn’t accept incoming SSL/TLS connections (it supports SSL/TLS only as a client).

Windows IoT Core and M2Mqtt … a simple marriage !

“Hello, Windows IoT Core” as Steve Teixeira wrote a few days ago !

The last stable release of the new IoT (embedded ?) OS for makers (and professionals ?) is out but, of course, I couldn’t wait for it to start using the OS, so I have already played with it in a lot of demos and session code.

Those who follow me know that, other than interacting with the real world using sensors from a Raspberry Pi 2, my great love is sharing this data and sending it to the cloud ! I already spoke and coded about using the Pi 2 to send data to the awesome Event Hubs service (in the Microsoft Azure Service Bus family) and you can find my material on GitHub and SlideShare related to my “Telemetry with Windows 10 IoT : from Raspberry Pi2 to Event Hubs” session.

Of course, those who follow me also know that I have been developing and updating the M2Mqtt library for about 2 years, as part of the Eclipse Paho project (since October 2014), as the main MQTT client implementation for all .Net platforms (full, compact and micro) and WinRT (Windows 8.1 and Windows Phone 8.1) but … what about Windows 10 and Windows IoT Core ?

Don’t worry guys ! The last stable release (4.2.0.0) in its WinRT version works great on the new awesome Microsoft OS, from the PC to the Raspberry Pi 2, thanks to the UWP (Universal Windows Platform) programming model. You can use the latest NuGet package for sending telemetry data from your embedded system, your PC or your smartphone, writing only one application !

In the next few lines, I’ll describe how to use this package to send a simple temperature value from my home. I only want to show how simple it is to use the library and to give you the right tools to create your MQTT-based IoT solution for a Windows IoT Core system !

The hardware : Raspberry Pi2 and TMP102

Of course, it’s possible to develop an Internet of Things solution without an embedded system. Yes, of course ! I’m not crazy, but I don’t want to discuss that scenario (IoT is huge, you might think) in this article, which is focused on using embedded hardware like the Raspberry Pi 2 with the latest release of Windows IoT Core. Online you can find a lot of articles and official documentation on how to set up your board, so I consider this step already done for you !

Regarding the telemetry scenario … what is the easiest data to send … a temperature value of course ! 🙂

For this, I chose a temperature sensor I already had … the TMP102 from Texas Instruments, on the related breakout board from SparkFun. The connections to the Pi 2 are very simple because the sensor supports the I2C protocol; referring to the Pi 2 pinout we have (Pi 2 pin on the left, TMP102 pin on the right) :

  • PIN 1 (3.3V PWR) –> V+ (power)
  • PIN 3 (I2C1 SDA) –> SDA (I2C data)
  • PIN 5 (I2C1 SCL) –> SCL (I2C clock)
  • PIN 6 (GND) –> GND (ground)
  • PIN 9 (GND) –> ADD0 (for I2C address selection)

You can leave the ALT pin of the TMP102 not connected to the Pi 2, because it’s used for alert notifications from the sensor when the temperature is higher or lower than a specific threshold. If you choose to use it, you need to connect it to a GPIO pin on the Pi 2 to receive an interrupt (a GPIO ValueChanged event) on alert.
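
If you want to use it, a possible sketch (inside the UWP application code we’ll write below) could be the following one; the GPIO pin number 17 is just a hypothetical choice and the edge to check depends on how the alert polarity is configured on the sensor.

using Windows.Devices.Gpio;

private GpioPin alertPin;

private void InitAlertPin()
{
    // hypothetical wiring : TMP102 ALT pin connected to GPIO 17 on the Pi 2
    GpioController gpio = GpioController.GetDefault();
    this.alertPin = gpio.OpenPin(17);
    this.alertPin.SetDriveMode(GpioPinDriveMode.Input);
    this.alertPin.ValueChanged += this.AlertPin_ValueChanged;
}

private void AlertPin_ValueChanged(GpioPin sender, GpioPinValueChangedEventArgs args)
{
    // the edge to check depends on the alert polarity configured on the sensor
    if (args.Edge == GpioPinEdge.FallingEdge)
    {
        // the temperature crossed the configured threshold
    }
}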

Create UWP app and add the M2Mqtt package

Using Visual Studio 2015, it’s very simple to create a new UWP application from “File –> New Project –> Windows –> Universal”.

To develop on the Raspberry Pi 2 and use all the IoT features like GPIO pins, I2C and SPI devices, you need to add the Windows IoT Extensions for the UWP under references, selecting “Select Universal –> Windows –> Extensions”.

Now you are able to interact with all your Pi 2 pins and the sensors you connect to it, but what about the M2Mqtt library for sending data to the cloud using the MQTT protocol ? As I said, the library is available as a NuGet package, with no need to recompile it from source code (it’s open source ! you can do it if you want !). Adding the package to your project is very simple : under “References -> Manage NuGet Packages”, search for “M2Mqtt” and click on “Install”.

At the end of the installation you can find the package under the “Solution Explorer” window.

Time to code !

Now it’s time to code our application, reading from the temperature sensor and publishing data to an MQTT broker. The first step is to interact with the TMP102 sensor to get the temperature value. To do this, I already wrote the driver, which you can find on GitHub here and include in your project.

To keep things as simple as possible, I wrote my code in the MainPage constructor : it instantiates the TMP102 class and the MqttClient class from the M2Mqtt library and launches a task that reads the temperature and sends it to the broker every 5 seconds. Of course, you should write better code (for example using a Background IoT Task instead of a UWP app) … but this sample is useful to show you how to integrate M2Mqtt in your project and how simple it is to use.

public sealed partial class MainPage : Page
{
    private TMP102 tmp102;
    private MqttClient client;

    public MainPage()
    {
         this.InitializeComponent();

         this.tmp102 = new TMP102();
         this.client = new MqttClient("test.mosquitto.org");

         this.client.Connect(Guid.NewGuid().ToString());

         Task.Run(async () =>
         {
              bool isOpened = await this.tmp102.OpenAsync();

              while (true)
              {
                   float temperature = this.tmp102.Temperature();
                   string json = "{ temp : " + temperature + " }";

                   this.client.Publish("/pi2mqtt/temp", Encoding.UTF8.GetBytes(json));

                   await Task.Delay(5000);
              }
         });
    }
}

To simplify the solution I decided to use the publicly available Mosquitto broker at this address : test.mosquitto.org

The above example is very simple. First you have to create the TMP102 and MqttClient instances (for the latter you need to specify the broker host name). The second step is to connect to the MQTT broker and start the task for reading and sending data. Inside the task, every 5 seconds, the temperature value is read from the TMP102 instance and published to the broker on the “/pi2mqtt/temp” topic in JSON format (as the payload of the MQTT message).

Now that your project is sending values to the broker, the simplest way to see that you are publishing data is to use another MQTT client, for example the mosquitto client, and subscribe to the same topic to receive the data. After installing mosquitto, you can start the mosquitto subscriber client in the following way :

mosquitto_sub -h test.mosquitto.org -t /pi2mqtt/temp -q 0 -d

Now … you should see data received from your Raspberry Pi 2 !

Conclusion

Of course, this isn’t an article that digs deep into Windows IoT Core, the MQTT protocol, the M2Mqtt library and so on. I only wanted to show that you have an “out of the box” way (the M2Mqtt library) to integrate your embedded system into a bigger MQTT solution, and that you can develop it using Windows IoT Core as the OS or, in general, your Windows 10 system !

M2Mqtt on MQTT encyclopedia and me on DotNetPodcast with “IoT end to end”

Today, I’d like to highlight some stuff …

First, my guest article on the M2Mqtt project in the “MQTT Client Library Encyclopedia” blog post series hosted on the HiveMQ web site (thanks to the team for asking me to write this entry). It’s great for this library to be part of this “encyclopedia” that will host all the main MQTT client implementations, in different languages and technologies, over the next weeks. M2Mqtt represents the Microsoft-based one, developed in C# and available for all .Net platforms (full, compact and micro) and WinRT (Windows 8.1, Windows Phone 8.1 and Windows 10).

Second, as always I want to thank the DotNetPodcast team for recording a new podcast, titled “IoT end to end”, on building an Internet of Things solution from devices to the cloud using all the related Microsoft Azure technologies. I described all the main steps that build an IoT scenario, speaking about devices, gateways (in the field and in the cloud), protocols, data ingestion with Azure Event Hubs, real-time analysis with Azure Stream Analytics, data storage and presentation to the user. Of course, I spoke a little about the future with the IoT Suite and the new IoT Hub service. I’m sorry for all my non-Italian friends … but the podcast is in Italian. If you like, you can find all my podcasts (only two for now) on the dedicated page of my blog.