Sifting the most value out of 'big data'
The main task of analytics is to extract maximum value from the available data. Decision making is closely coupled to analytics, which makes the system dynamic, self-learning and policy driven.
Each sensor in a 2Connect™ system can generate thousands of time-stamped data points per day; multi-sensors even more. Additional information can be imported from external systems. This is what some refer to as 'big data'. Powerful analytics is used to derive value from this data and boil it down to the critical information needed to make automated, policy-based decisions and to visualise critical information for users.
To achieve this, 2Connect™ analytics employs pattern recognition, predictive modelling, anomaly detection and various forms of artificial intelligence on real-time data streams.
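As a rough illustration of anomaly detection on a real-time stream, the sketch below flags readings that deviate sharply from a rolling baseline. This is a minimal example using a simple rolling z-score; it is not the actual 2Connect™ algorithm, and all names are hypothetical.

```python
from collections import deque
from statistics import mean, stdev

def make_anomaly_detector(window=50, threshold=3.0):
    """Return a checker that flags readings more than `threshold`
    standard deviations away from a rolling window of recent values."""
    history = deque(maxlen=window)

    def check(value):
        is_anomaly = False
        if len(history) >= 10:  # wait for a minimal baseline first
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                is_anomaly = True
        history.append(value)
        return is_anomaly

    return check

detect = make_anomaly_detector()
readings = [21.0, 21.2, 20.9, 21.1] * 5 + [35.0]  # a sudden spike at the end
flags = [detect(r) for r in readings]
```

In a real deployment the same idea would run continuously on each sensor stream, with thresholds tuned per sensor type.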
In traditional decision systems, e.g. those based on PLC or IFTTT (if-this-then-that) programming, the decision logic, its inputs and its outputs are entirely defined in advance by human programmers. This makes for a static system. Once deployed, the program is restricted to what has been programmed, regardless of changes in the environment, historical outcomes and newly gained knowledge. Each policy change requires reprogramming, debugging, testing and so on.
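A static if-this-then-that rule of the kind described above can be sketched as follows. The threshold and action names are illustrative only; the point is that the policy is frozen into the code at programming time.

```python
# A static rule: thresholds and actions are fixed when the program is written.
# Changing the policy (e.g. a new temperature limit) means editing,
# re-testing and redeploying the code.
TEMP_LIMIT = 25.0  # hard-coded by the programmer

def static_rule(temperature):
    """If temperature exceeds the fixed limit, then start cooling."""
    if temperature > TEMP_LIMIT:
        return "start_cooling"
    return "no_action"
```

However well chosen, `TEMP_LIMIT` cannot adapt to seasonal drift or new knowledge without a redeployment.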
In 2Connect™ it is different. 2Connect™ supports dynamic decision making, taking into account changing conditions, history and new findings by the analytics modules. Each decision is thus influenced by three sets of factors: current conditions, historical outcomes and new findings from the analytics modules.
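The three sets of factors can be sketched in a small decision function: current conditions arrive as the live reading, history as recent readings, and analytics findings as a policy object that can be adjusted at runtime without redeploying code. This is an assumed, simplified structure, not the actual 2Connect™ decision engine.

```python
def dynamic_decision(current_temp, history, policy):
    """Combine current conditions, historical readings and a policy
    that analytics modules can update at runtime."""
    baseline = sum(history) / len(history) if history else current_temp
    # The effective limit shifts as analytics learns a better offset.
    limit = policy["base_limit"] + policy["learned_offset"]
    if current_temp > limit or current_temp > baseline + policy["max_rise"]:
        return "start_cooling"
    return "no_action"

# Analytics has learned that cooling should start 1.5 degrees earlier.
policy = {"base_limit": 25.0, "learned_offset": -1.5, "max_rise": 3.0}
```

The same reading can yield different decisions over time, because the policy and the historical baseline both evolve with the data.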
Physically, decision making can be distributed so that it takes place in the gateway. Distributed decisions are often faster, and resilient to network and power disturbances.
When the right climate is really critical
When cost and service level are really important