Should Historians be History?

In our conversations with manufacturing companies about implementing Industry 4.0 and IoT solutions, we often hear that the organization has already invested in IoT technologies and has been using them for some time. When we dig a little deeper, we realize that what is described as an IoT implementation is really a deployment of data historians for OT analysis.

Historians aggregate OT data across multiple lines, processes, and pieces of equipment in a plant. They compress and store years of data, and they use collectors to gather data reliably in the face of network outages. Enterprise historians aggregate data across multiple plant historians.

Increasingly, OT data is consumed by a host of other players within an organization, including developers, business analysts, data scientists, and product managers who drive and support the business. However, data historians were never designed for use with a range of external systems. Many of them were built from the ground up on expensive legacy hardware, where resource constraints and a lack of standards meant that functionality focused only on the localized, immediate requirements of the OT infrastructure and the local process. The result is that these systems are not easily extended, whether for localized analytics and visualization, for sharing data across local systems, or for easily and securely exchanging data with modern backend systems for further analytics and visualization.

Outside of the OT domain, the rest of a manufacturing organization's data is stored in standard databases and data warehouses. Data historians focused on capturing largely structured data in time-series formats; today's data is a vast superset of what these legacy systems captured. Modern time-series databases cover traditional time-series capabilities, but they are also designed and optimized to ingest unstructured and streaming data sources, including binary large objects and JSON documents, and to adhere to the latest standards in IoT connectivity.
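To make the contrast concrete, here is a minimal sketch of the kind of semi-structured ingestion a modern time-series store expects. The payload shape, device ID, and field names are illustrative assumptions, not any particular vendor's format:

```python
import json
from datetime import datetime

# Hypothetical JSON payload as an IoT gateway might publish it;
# all field names here are assumptions for illustration.
payload = json.dumps({
    "device_id": "press-07",
    "timestamp": "2024-05-01T12:00:00Z",
    "readings": {"temperature_c": 71.4, "vibration_mm_s": 2.9},
})

def to_timeseries_rows(raw: str) -> list[dict]:
    """Flatten one JSON payload into one row per metric, tagged with source and time."""
    doc = json.loads(raw)
    # Normalize the trailing "Z" so fromisoformat parses it as UTC.
    ts = datetime.fromisoformat(doc["timestamp"].replace("Z", "+00:00"))
    return [
        {"device": doc["device_id"], "metric": name, "value": value, "time": ts}
        for name, value in doc["readings"].items()
    ]

rows = to_timeseries_rows(payload)
```

A classic historian ingests flat tag/value/timestamp triples; the point of the sketch is that a JSON document arrives with arbitrary nested fields, and the database (or a thin shim like this) flattens it without a schema being fixed up front.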

Data from historians comes without context unless the end user does the heavy lifting of binding metadata via data mapping or manual annotation. End users must therefore have a clear vision of how the data will be used from the outset of the implementation, which makes scalability a challenge. Conversely, IoT environments offer a more streamlined and scalable approach: IoT technologies let organizations collect far more data today and analyze it in a variety of new, potentially unexpected ways in the future. This provides greater flexibility for data analysis across different users, roles, applications, and purposes over time.
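The "heavy lifting" above usually means maintaining a mapping table by hand. A minimal sketch of what that binding looks like, with hypothetical tag names and metadata fields chosen purely for illustration:

```python
# Hypothetical mapping table: a historian exposes opaque controller tag names,
# so the end user must bind plant/line/asset context to each one by hand.
TAG_CONTEXT = {
    "PLC1.AI.0042": {"plant": "Springfield", "line": "3", "asset": "mixer", "unit": "degC"},
    "PLC1.AI.0043": {"plant": "Springfield", "line": "3", "asset": "mixer", "unit": "mm/s"},
}

def contextualize(tag: str, value: float) -> dict:
    """Attach the manually maintained metadata to a raw historian reading."""
    meta = TAG_CONTEXT.get(tag, {})
    return {"tag": tag, "value": value, **meta}

reading = contextualize("PLC1.AI.0042", 71.4)
```

Every new sensor requires a new entry in this table before its data is usable, which is exactly why the approach scales poorly. In a typical IoT setup the device publishes its own identity and units alongside each reading, so no such table needs to be curated.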

While the cost of enterprise historians is typically tied to the number of data sources, with a large up-front price tag for a perpetual license, most IoT solutions are built on a data-volume, subscription-based model that spreads recognized costs out over time. As organizations become more distributed and sensors continue to become more affordable, data volumes will keep increasing, which makes a pricing model based on volume rather than the number of sources more cost-efficient. Furthermore, a subscription model lets organizations correlate the operational expense with the value received. Businesses can start small and incrementally add value as needed, paying for that value only when it is used. With historians, end users must gamble to some degree, betting on their future success and the value they will derive while paying for everything up front.
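The two cost models can be compared with back-of-the-envelope arithmetic. All figures below are assumptions for illustration, not vendor pricing:

```python
# Illustrative-only cost comparison; every number here is an assumption.
def historian_cost(upfront_license: float, annual_support_rate: float, years: int) -> float:
    """Perpetual license: large up-front fee plus yearly support as a fraction of it."""
    return upfront_license * (1 + annual_support_rate * years)

def iot_cost(gb_per_month: float, price_per_gb: float, months: int) -> float:
    """Subscription: pay per unit of data volume as it is actually ingested."""
    return gb_per_month * price_per_gb * months

# Assumed figures: a $250k license with 18% annual support, vs. 500 GB/month at $2/GB.
five_year_historian = historian_cost(250_000, 0.18, 5)
five_year_iot = iot_cost(gb_per_month=500, price_per_gb=2.0, months=60)
```

The structural point is independent of the exact numbers: the historian's cost is committed on day one regardless of how much value materializes, while the subscription cost tracks actual usage and can start near zero.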
Historians have been around for a while and have served their purpose. But with the need for a single, real-time, plant-wide view of manufacturing operations, real-time decision making within and across plants, and AI/ML models, historians no longer tick all the technology and scaling boxes. Enter IoT and cloud computing.