Industrial Internet of Things (Siemens & Elastic Stack)

Documentation about my experience connecting a PLC to the cloud (OT/IT).

The architecture specifies a five-step process:

  • Step 1 Data Collection: Security-related information is collected from the probes, including system-level and application-level probes, as specified in the architecture.
  • Step 2 Routing: Security information is routed to data stores and data analytics engines at either the data collection or the security intelligence layer of the architecture.
  • Step 3 Analyze: Security information is analyzed towards identifying patterns based on rules or other data-driven mechanisms (e.g., classification).
  • Step 4 Control & Actuate: Upon the identification of a specific behavior or event (e.g., fulfilment of rules or classification of information) the probes are reconfigured in order to adapt the data collection.
  • Step 5 Visualize: The entire intelligent and adaptive data collection process can be monitored and controlled in a visual fashion, based on proper dashboards.
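
As a rough illustration of how the five steps fit together, the following is a minimal Python sketch of the adaptive loop. The Probe class, the rule and the in-memory datastore are placeholders invented for the example; in the real architecture the datastore is Elasticsearch and the visualisation is a Kibana dashboard.

# Minimal sketch of the five-step adaptive collection loop (placeholder objects only)
class Probe:
    """Hypothetical probe with a read() method and an adjustable sample rate."""
    def __init__(self):
        self.sample_rate = 1.0
    def read(self):
        return 42.0  # placeholder reading

def adaptive_loop(probes, datastore, rule):
    readings = [p.read() for p in probes]          # Step 1 - Data Collection
    datastore.extend(readings)                     # Step 2 - Routing to the datastore
    matches = [r for r in datastore if rule(r)]    # Step 3 - Analyze with a simple rule
    if matches:                                    # Step 4 - Control & Actuate
        for p in probes:
            p.sample_rate *= 2                     # e.g. sample faster while the event lasts
    print(len(matches), "readings matched")        # Step 5 - Visualize (a dashboard in practice)

adaptive_loop([Probe()], [], lambda r: r > 40)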

Communication protocols

Every vendor uses its own specific protocol:

There are also open protocols:

IT/OT Architecture

The Siemens IOT2040 module links the industrial plant to the cloud. For security reasons it only sends data to the cloud and leaves control of the industrial processes to the internal PLC.

Elasticsearch is a non-relational database that stores the data sent from the industrial plant.

Kibana or Power BI is used to generate reports; some examples of generated dashboards are:

Why the Elastic Stack for an IoT Analytics Platform

While the specific requirements of a platform vary between organisations, a platform that serves real-time operational use cases must address these seven key criteria:

  • Raw Data Processing
  • Real-Time Aggregation
  • Auto-Scaling Datastore
  • Data Lifecycle Management
  • Real-Time Alerting
  • Self-Service Visualisation
  • Application Monitoring

The Elastic Stack comprises a suite of open-source products that enable users to take data from anywhere and search, analyse and visualise it in real time.

Example use case for monitoring temperature anomalies: https://blog.codecentric.de/en/2019/10/apache-plc4x-elasticsearch-iiot-monitoring-anomaly-detection/

Software

Node-Red

Deploying in Windows

  • Download and install nodejs from: https://nodejs.org/en/
  • Check version using this command: node -v
  • Install node-red: npm install -g --unsafe-perm node-red
  • Launch the Node-RED server by executing: node-red (the flow editor is then available in a browser at http://127.0.0.1:1880)

If you have nvm:

  • nvm install [VERSION-NODE]
  • nvm use [VERSION-NODE]

Upgrading npm:

  • npm install npm@latest -g
  • npm install --global --production windows-build-tools

Versions:

  • Node-RED version: 1.1.3
  • Node.js version: 12.18.2
  • npm version: 6.14.5

Removing from Windows

  • Run this command: npm cache clean --force
  • Uninstall from Programs & Features with the uninstaller.
  • Reboot (or you probably can get away with killing all node-related processes from Task Manager). Look for these folders and remove them (and their contents) if any still exist. Depending on the version you installed, UAC settings, and CPU architecture, these may or may not exist:
    • C:\Program Files (x86)\Nodejs
    • C:\Program Files\Nodejs
    • C:\Users{User}\AppData\Roaming\npm (or %appdata%\npm)
    • C:\Users{User}\AppData\Roaming\npm-cache (or %appdata%\npm-cache)
    • C:\Users{User}.npmrc (and possibly check for that without the . prefix too)
    • C:\Users{User}\AppData\Local\Temp\npm-*
    • Check your %PATH% environment variable to ensure no references to Nodejs or npm exist.
  • If it's still not uninstalled, type where node at the command prompt and you'll see where it resides -- delete that (and probably the parent directory) too.
  • Reboot, for good measure.

Deploying in Simatic-IoT2040

Write the OS image to your SD card following these instructions: https://github.com/hoat23/IndustrialInternetOfThings/blob/master/InstallingMicroSD.md

Deploying in Docker

We create a file 'docker-compose.yml' with this content:

version: '3'
services:
  mosquitto:
    image: eclipse-mosquitto
    ports:
      - 1883:1883
      - 9001:9001

  nodered:
    image: cpswan/node-red
    ports:
      - 1880:1880
For a video explanation, see: https://www.youtube.com/watch?v=KJXU0PL1oNM

Deploying in IBM-Cloud.

  1. Create an account at https://cloud.ibm.com/login
  2. Log in to IBM-Cloud.
  3. In the IBM-Cloud console, search for "node-red app" and click on the button.
  4. Wait for it to load.
  5. Enter the required data and click on "Create".
  6. Click on "Deploy your app".
  7. Configure the API key.
  8. Click on the server name that was defined previously.
  9. Wait for the deployment to finish, then click on "View console".
  10. Launch the application by clicking on "Visit app URL".
  11. Click on "Next". Create a username and password, as below:
  12. Click on "Next" and then on "Finish". Wait for the settings to be applied.
  13. Click on "Go to Node-RED flow editor".
  14. Enter the username and password created in step 11 and click on "Login".
  15. Now you can use Node-RED.

IBM Documentation

Broker MQTT

When do you need an MQTT broker?

  • When you have two or more subscribers controlling multiple devices.
  • When your bandwidth is low or the communication link is unreliable.
  • It is not needed when you are only reading data from devices (monitoring only).

Online Broker-MQTT

A list of different online brokers: https://mntolia.com/10-free-public-private-mqtt-brokers-for-testing-prototyping/

Configuring a broker in shiftr.io

We use an online broker like "shiftr.io" for fast deployment. To configure it, just follow these steps:

  1. Create an account in shiftr.io using your email.
  2. Log in with your account in shiftr.io.
  3. Create a new namespace.
  4. Fill in the blank fields.
  5. Create a token by clicking on "Namespace settings".
  6. Wait for it to load, then click on "Add Token".
  7. Type the token username and password.
  8. You can now write data to the MQTT broker using a configuration similar to mqtt://mykeyusername:mysecretpassword@broker.shiftr.io:
     • Server: broker.shiftr.io
     • User: mykeyusername
     • Password: mysecretpassword
     • Port: 1883
     • Protocol: mqtt
     A minimal Python publisher using this configuration is sketched below.

If you want to add certificates, just follow these steps: https://gist.github.com/hoat23/f71d081d06c3667f61106784f0c4ea8e.
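
To test the broker outside Node-RED, a small Python publisher can be used. This is only a sketch assuming the paho-mqtt package (1.x-style API) and the placeholder credentials from step 8; replace the username, password and topic with your own values.

# pip install paho-mqtt  (1.x-style client constructor)
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="hoat23")
client.username_pw_set("mykeyusername", "mysecretpassword")  # token created in shiftr.io
client.connect("broker.shiftr.io", 1883)                     # plain MQTT on port 1883

# Publish a test value; the topic name is only an example
client.publish("plant/temperature", payload="23.5", qos=0, retain=False)
client.disconnect()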

Hardware

Simatic S7-1200

Some Details

  • Work memory (RAM): 100 kB
  • Load memory (ROM): 4 MB
  • Retentive memory: 10 kB
  • Integrated local I/O: 14 digital inputs / 10 digital outputs
  • Analog I/O: 2 inputs / 2 outputs
  • Process image memory: 1024 bytes
  • Bit memory (flags) area: 8192 bytes
  • Expansion slots for signal modules: 8
  • Expansion slots for communication modules: 3
  • High-speed counters (HSC): 6
  • Pulse generators: 4
  • PROFINET (Ethernet) ports: 2

Simatic IOT2000

Some Details

  • A MicroSD card with a minimum of 2 GB is needed for the operating system.
  • Connection for the power supply (24 V).
  • COM interfaces (RS232/422/485)
  • Ethernet interface 10/100 Mbps.
  • USB type Micro-B.
  • USB type A.

Hardware and software required

  • Engineering station: hardware and operating system requirements (for additional information, see the Readme on the TIA Portal installation DVDs)
  • SIMATIC STEP 7 Professional software in TIA Portal V15 or higher
  • Software for writing the example image on the SD card, e.g. Win32 Disk Imager
  • Software for SSH access, e.g. PuTTY, MobaXterm.
  • Software for SFTP/SCP file transfer, e.g. WinSCP, MobaXterm.
  • SIMATIC IOT2000 controller, e.g. IOT2040 with MicroSD Card and IO-Shield https://support.industry.siemens.com/cs/document/109741799/imagen-ejemplo-para-la-sd-card-de-un-simatic-iot2020-iot2040?dti=0&lc=es-AR (Yocto Linux Operating System).
  • Ethernet connection between the engineering station and controller
  • SIMATIC IOT2000EDU Software Controller executable on IOT2020 and IOT2040

Connect to PLC-Siemens

How to configure data-blocks of PLC S7-1200 correctly

In order to read the data from the PLC correctly, follow these steps:

  1. Enable PUT/GET communication.
  2. Configure the IP address of the PLC.
  3. Add a new data block.
  4. Write the name of the data block (example: IOT) and set the block number manually (in this case 10).
  5. Define the variables, like this:
  6. Open the properties of this data block:
  7. Disable "Optimized block access" (external tools such as snap7 need the absolute addresses of the variables).
  8. Compile the data block by clicking on the "Compile" button.
  9. Wait for the compilation to finish.
  10. Now you can see the memory address of every variable defined in the data block.
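
For example, with optimized access disabled, if the first variable in the block is a REAL it gets the absolute address DB10.DBD0 (bytes 0-3) and a BOOL defined right after it becomes DB10.DBX4.0. These byte offsets (illustrative here; your layout may differ) are exactly what the snap7 code below reads and writes.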

Python - PLC S7-1200

Installing Snap7 in Windows

Install snap7 library: pip install python-snap7

Download snap7 from https://sourceforge.net/projects/snap7/files/

Search the snap7 folder for the snap7.dll and snap7.lib files, then copy snap7.dll and snap7.lib into the "C:/PythonXX/site-packages/snap7" directory:

Installing Snap7 in SimaticIOT2040

Coming soon.

Python Code

import struct
import logging

import snap7  # pip install python-snap7
from snap7.common import Snap7Library, load_library
from snap7.util import get_bool, set_real

logging.basicConfig(level=logging.INFO)

# Only needed if the snap7 binary is in a non-default location
Snap7Library(lib_location='C:/snap7/snap7.dll')
load_library()  # returns the loaded DLL, e.g. <WinDLL 'C:\snap7\snap7.dll', ...>

plc = snap7.client.Client()
plc.connect("10.112.115.10", 0, 1)  # IP address, rack, slot

# --- Read DB ---
# Data block DB10, start at byte 0, read 8 bytes (layout as defined in the data block above)
db = plc.db_read(10, 0, 8)

# An S7 REAL is a 4-byte big-endian float, so 8 bytes hold two of them
reals = [f for (f,) in struct.iter_unpack(">f", db[:8])]
print("2 x Real vars:", reals)

# Individual bits are decoded with get_bool(buffer, byte_index, bit_index);
# which bit is meaningful depends on the data-block layout
print("Bool DB10.DBX0.0:", get_bool(db, 0, 0))

# --- Write DB ---
# Write a REAL value back to DB10 starting at byte 4
buffer = bytearray(4)
set_real(buffer, 0, 48.0)
plc.db_write(10, 4, buffer)

plc.disconnect()
plc.destroy()

Snap7 Documentation:

Node-Red

Connecting Node-Red with Broker-MQTT

A common IoT architecture connects a local host to a broker, using Node-RED as an intermediary.

Send data to Broker-MQTT

  1. Search for the MQTT nodes; they look like this:
  2. Click on "mqtt out" (the publishing node), drag and drop it into the workspace, then double-click on the node to set the configuration:
  3. Fill in the configuration using the broker settings created previously. Write server "broker.shiftr.io", port "1883", and client ID "hoat23".
  4. Fill in the security configuration, in this case user "mykeyusername" and password "mysecretpassword".
  5. Set the node name to "shiftr-io-hoat23" and click on "Update". Then set the Topic, QoS and Retain fields similar to this image:
  6. Click on "Done". If the configuration is right you will see "connected" and a green rectangle on the MQTT node.

Receiving data from Broker-MQTT

Similar to before, just follow these steps:

  1. Search for the MQTT nodes; they look like this:
  2. Click on "mqtt in" (the subscribing node), drag and drop it into the workspace, then double-click on the node to set the configuration:
  3. Fill in the configuration using the broker settings created previously. Write server "broker.shiftr.io", port "1883", and client ID "IBM-Cloud".
  4. Fill in the security configuration, in this case user "mykeyusername" and password "mysecretpassword".
  5. Set the node name to "shiftr-io-hoat23" and click on "Update". Then set the Topic, QoS and Retain fields similar to this image:
  6. Click on "Done". If the configuration is right you will see "connected" and a green rectangle on the MQTT node.

Finally, in the shiftr.io broker we can see the two modules connected, as in the image below:

"Hoat23" is a localhost, this recollecting data and "IBM-Cloud" receiving the data.

Connecting Node-Red with PLC S7-1200

Installing S7 nodes

  1. Go to "Manage palete".
2. Search by "node-red-contrib-s7" and click on "Install".
3. Click on install again.
4. Wait by installing. If don't have error, now can use the s7 nodes

Read data from S7-1200

Coming soon.

Write data on S7-1200

Coming soon.

Connecting Node-Red with ElasticSearch

Write data on ElasticSearch

The data received from the MQTT server is sent to Elasticsearch for analytics and visualization in Kibana. The configured nodes and flow are shown below:

  1. Convert the payload to JSON format.
  2. Add the header:
  3. POST the data to Elasticsearch:
  4. Add a debug node to print the response from Elasticsearch.
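
The same flow can be reproduced from Python to test the Elasticsearch side without Node-RED. This is a sketch only: the Elasticsearch URL (localhost:9200), the index name (iiot-data) and the document fields are assumptions; adjust them to your deployment.

# pip install requests
import json
import requests

# Example document; the field names are placeholders
doc = {"device": "plc-s7-1200", "temperature": 23.5, "timestamp": "2020-08-01T10:00:00Z"}

# Steps 1-2: serialize to JSON and set the Content-Type header
headers = {"Content-Type": "application/json"}

# Step 3: POST the document to Elasticsearch (index name 'iiot-data' is an example)
resp = requests.post("http://localhost:9200/iiot-data/_doc",
                     data=json.dumps(doc), headers=headers)

# Step 4: print the response, like the debug node in the flow
print(resp.status_code, resp.json())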

Read data from ElasticSearch

Coming soon.

More Information
