Satyendra Kumar
2 min read · Jul 8, 2020


Kafka Stream API and Machine learning

In this story, I want to show how data from Arduino/Particle sensors (https://particle.io/) can be streamed over the network using the Kafka Streams API, stored in an Elasticsearch instance, and later used by an ML program to predict machine maintenance.

There are six basic elements in this end-to-end integration:

  1. Data sources (Arduino or Particle sensors)
  2. Particle cloud (cloud platform for sensor integration)
  3. Kafka Streams API
  4. Kafka broker server
  5. Elasticsearch (data storage)
  6. Recurrent neural network with TensorFlow to predict machine maintenance
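To make elements 2–5 concrete, here is a minimal sketch of the producer side: one sensor reading becomes a JSON message and is published to a Kafka topic. The topic name `machine-sensors`, the `kafka-python` client, and the payload field names are my own assumptions for illustration, not something specified in this story.

```python
import json
import time

def make_reading(device_id, temperature_c, vibration_g):
    """Build the JSON-serializable message that one sensor reading
    becomes as it flows from the Particle cloud toward Kafka
    (field names are assumed for this sketch)."""
    return {
        "device_id": device_id,
        "temperature_c": temperature_c,
        "vibration_g": vibration_g,
        "ts": int(time.time() * 1000),  # event time in epoch milliseconds
    }

def publish(producer, topic, reading):
    """Send one reading to Kafka, keyed by device ID so that each
    machine's readings stay ordered within a single partition."""
    producer.send(
        topic,
        key=reading["device_id"].encode(),
        value=json.dumps(reading).encode(),
    )

if __name__ == "__main__":
    # Requires a running broker and `pip install kafka-python`.
    from kafka import KafkaProducer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    publish(producer, "machine-sensors", make_reading("photon-42", 71.5, 0.03))
    producer.flush()
```

Keying by device ID is the usual choice here, because per-partition ordering is what lets a downstream consumer treat each machine's readings as a time series.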

To depict these basic elements, I have attached the diagram below:

In the diagram, the systems and processes are marked 1–7, and the descriptions below explain each one.

Step 1: These are manufacturing sensors, emitting thousands of machine-status readings every few milliseconds. The sensors connect to the internet and stream their data to the Particle cloud. The Particle cloud has event hooks attached to the sensor data feed; if a sensor stops sending data for a particular period of time, the cloud sends a notification that there is an issue with that sensor.

Step 2: The Particle

Satyendra Kumar

Sr. Enterprise Architect | Digital Transformation Strategist | AI/ML. Passionate about new ideas & innovations, product management and scalability of solutions.