NAPCON / 17 Sep 2018
The vocabulary used to explain and describe process industry processes is unique and can sometimes feel a little complicated with its own terms and abbreviations. To help your buying process, we have compiled a NAPCON dictionary to explain some of the most common terms used in automation and process technology. We hope you enjoy it, and above all, find it useful!
An abnormal situation is an occurrence that disrupts the normal condition or operation of a plant. It causes uncertainty that delays identifying, and responding to, the root cause.
An application of DRTO technology (see definition of DRTO) that aims, for example, at profit maximization, more economical operation, or minimized energy consumption of the process being controlled.
Advanced Process Control
Advanced Process Control (APC) refers to a broad range of process control and optimization tools, delivering resource savings and measurable improvements in plant and business performance.
An analyzer conducts chemical analysis on samples or sample streams. Such samples consist of some type of matter such as solid, liquid, or gas.
Asset Management System
An asset management system monitors and maintains things of value to an entity or group. It is applicable both to tangible assets, such as buildings, and to intangible assets, such as human capital, intellectual property, goodwill and/or financial assets.
ASTM D 5800
ASTM International has published over 12 000 standards, used globally across industries to improve performance in manufacturing and materials, products and processes, systems and services. An ASTM designation number identifies a unique version of an ASTM standard: D stands for miscellaneous materials, and 5800 is a sequentially assigned number.
Big Data Analytics
Big data analytics is the use of advanced analytic techniques against large, diverse data. Big data refers to the massive amounts of data that are difficult to analyze and handle using common database management tools.
Big Data Handling
Handling big data is often relatively complicated due to its high volume and velocity. Big data originates from sensors, devices and networks. The origin of the data needs to be carefully considered when selecting the tools and models to handle big data sets.
Big Data Processing
Data can be processed in many ways, and there are multiple big data processing frameworks available.
Big Data Systems
Big data systems consist of platforms designed for data processing and analytics applications.
Controller refers to a device that has the task to control a measured variable either to a specific level or between specific limits. A controller can be a basic level PLC controller or a higher level APC controller.
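To make the idea of "controlling a measured variable to a specific level" concrete, here is a minimal sketch of a proportional controller driving a simplified process toward a setpoint. The function name, gain value and process response are illustrative assumptions, not taken from any particular PLC or APC product.

```python
# Minimal proportional (P) controller sketch: the control output is
# proportional to the error between setpoint and measurement.

def p_control(setpoint: float, measurement: float, kp: float = 0.5) -> float:
    """Return a control output proportional to the setpoint error."""
    error = setpoint - measurement
    return kp * error

# Crude stand-in for a process: the level moves a fraction of the
# control action each step, so it gradually settles at the setpoint.
level = 0.0
for _ in range(50):
    u = p_control(setpoint=10.0, measurement=level)
    level += u * 0.5
```

In a real plant the controller would typically also include integral and derivative terms (PID) to remove steady-state error and damp oscillations.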
A database system provides a highly organized collection of data, along with appropriate tools and applications that facilitate processing of, and access to, that data.
Discrete Event Simulation
A discrete-event simulation (DES) models the operation of a system as a discrete sequence of events in time. Each event occurs at a particular instant in time and marks a change of state in the system.
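The "discrete sequence of events in time" can be sketched with a priority queue that always pops the next event in chronological order. The event names below are illustrative (a toy queueing example), not part of any specific DES tool.

```python
import heapq

# Minimal discrete-event simulation skeleton: events are (time, name)
# pairs; the heap pops them in time order, and each pop marks a change
# of state in the simulated system.

def run(events):
    """Process events in chronological order, returning the visit order."""
    heap = list(events)
    heapq.heapify(heap)
    log = []
    while heap:
        time, name = heapq.heappop(heap)
        log.append((time, name))  # state change would be applied here
    return log

trace = run([(5.0, "depart"), (1.0, "arrive"), (3.0, "start_service")])
```

A full DES engine would also let event handlers schedule new future events onto the same queue.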
Discrete Rate Simulation
In the field of simulation, a discrete-rate simulation (DRS) models the behavior of mixed discrete and continuous systems. This methodology is used to simulate linear continuous systems, hybrid continuous and discrete-event systems, and any other system that involves the rate-based movement or flow of material from one location to another.
DCS – Distributed Control System
A distributed control system (DCS) is an integrated system that collects data from field measurements and contains several controllers and their control logic. A DCS is essentially the operators' user interface to their process.
Optimization using a variant of MPC with an economic objective function, so that MPC's good properties are inherited: fast execution, robustness and, above all, independence from the process reaching a steady state.
Dynamic optimization refers to the real-time process of minimizing the costs and maximizing the benefits of some objective function over a period of time.
Dynamic simulation is the use of a computer program to model the time varying behavior of a system. The systems are typically described by ordinary differential equations or partial differential equations.
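As a small illustration of simulating a system "described by ordinary differential equations", here is an explicit Euler integration of a tank-level model, dh/dt = (q_in − k·h) / A. All parameter values and names are made up for the sketch.

```python
# Hedged sketch: explicit Euler integration of a first-order ODE
# describing a tank level that fills toward a steady state of q_in / k.

def simulate(h0=0.0, q_in=2.0, k=0.5, A=1.0, dt=0.1, steps=200):
    """Integrate dh/dt = (q_in - k*h) / A and return the level history."""
    h = h0
    history = [h]
    for _ in range(steps):
        h += dt * (q_in - k * h) / A
        history.append(h)
    return history

levels = simulate()  # settles near the steady-state level q_in / k = 4.0
```

Production simulators use far more sophisticated solvers (implicit, variable-step), but the principle of stepping model equations forward in time is the same.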
FT-NIR is ideal for rapid raw material identification and is also a powerful analysis tool capable of accurate multi-component quantitative analysis. FT-NIR provides a useful replacement for arduous chemistry tests and chromatographic methods.
Gamification means using game-based mechanics, aesthetics, and game-thinking to create a more engaging learning environment. A gamified simulation imitates the operations of real-world processes.
Gamification training is a virtual medium through which various types of skills, such as business awareness and management skills, can be acquired. Gamification is the process of applying gaming designs and concepts to learning or training scenarios in order to make them more engaging and entertaining for the learner.
Hazard situation is any source of potential damage, harm or adverse health effects on something or someone. General examples of workplace hazards include any substance, material, process, practice, etc. that has the ability to cause harm.
High Fidelity Modelling
High-fidelity modeling tools provide customers with a cost-effective means of predicting real-world system performance without the need for costly hardware prototyping and data-collection campaigns. Such tools model, for example, sensor motion, sensor-platform interactions, and errors in the receive chains.
IIoT stands for the Industrial Internet of Things. The IIoT is part of a larger concept known as the Internet of Things (IoT). The IoT is a network of intelligent computers, devices, and objects that collect and share huge amounts of data.
An analyzer is a person or device that analyses given data. In-line analysis is done with automatic sampling: a sensor can be placed in a process vessel or in a stream of flowing material to conduct the analysis.
The Industrial Internet refers to the integration of complex physical machinery with networked sensors and software. The industrial Internet draws together fields such as machine learning, big data, the Internet of things and machine-to-machine communication to ingest data from machines, analyze it and use it to adjust operations.
Information automation refers to automatic information-handling of communication, computation processing and storage of information. Information is usually handled by an Automated Information System (AIS) that is an assembly of computer hardware, software, firmware, or any combination of these.
Instrument calibration is one of the primary processes used to maintain instrument accuracy. Calibration is the process of configuring an instrument to provide a result for a sample within an acceptable range.
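A common form of configuring an instrument to give results in an acceptable range is a two-point linear calibration against known reference standards. The function name and the reference values below are illustrative assumptions.

```python
# Sketch of a two-point linear calibration: fit a gain and offset that
# map raw instrument readings onto the known reference values, then use
# the fitted line to convert any subsequent raw reading.

def two_point_calibration(raw_lo, raw_hi, ref_lo, ref_hi):
    """Return a function converting raw readings to calibrated values."""
    gain = (ref_hi - ref_lo) / (raw_hi - raw_lo)
    offset = ref_lo - gain * raw_lo
    return lambda raw: gain * raw + offset

# e.g. a 0.2 mA zero reading and a 4.1 mA full-scale reading are mapped
# onto a 0-100 % engineering range (made-up figures).
calibrate = two_point_calibration(raw_lo=0.2, raw_hi=4.1, ref_lo=0.0, ref_hi=100.0)
value = calibrate(2.15)
```

Instruments with nonlinear responses need more calibration points and a curve fit, but the two-point case covers many transmitters in practice.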
Instrument management means keeping the devices in good working order. It covers scheduling of calibration, maintenance and validation procedures.
IT automation means linking disparate systems and software so that they become self-acting and self-regulating. It is key to faster deployments and continuous delivery.
IT Process Automation
IT Process Automation (ITPA) is the automation of an IT task through the orchestration and integration of tools, people and process through a single workflow. ITPA can offer an organization a greater level of infrastructure efficiency.
Logistics optimization means examining and updating a company's transportation costs. For many companies, it is among the biggest opportunities to attain significant reductions in operational costs.
Logistics simulation enables companies to virtually model and evaluate different aspects of their logistics network. Modeling technologies simplify planning and cost estimation.
A logistics system covers the planning, production and storage of goods, as well as the flow of services and information, between the point of origin and the point of consumption.
Maintainability is defined as the probability of performing a successful repair action within a given time. In other words, maintainability measures the ease and speed with which a system can be restored to operational status after a failure occurs.
Model Predictive Controller
Model Predictive Control (MPC) is an optimal-control based method to select control inputs by minimizing an objective function. The objective function is defined in terms of both present and predicted system variables and is evaluated using an explicit model to predict future process outputs.
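The objective function described above can be written out in generic textbook notation (the symbols below are standard MPC notation, not tied to any particular product):

```latex
\min_{\Delta u_t, \dots, \Delta u_{t+N-1}}
\sum_{k=1}^{N} \left\| \hat{y}_{t+k} - r_{t+k} \right\|_Q^2
+ \sum_{k=0}^{N-1} \left\| \Delta u_{t+k} \right\|_R^2
```

where the predicted outputs ŷ come from the explicit process model, r are the setpoints, Δu are the control moves, N is the prediction horizon, and Q and R are weighting matrices that trade off tracking accuracy against control effort.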
Near-infrared refers to a region of the infrared spectrum of light used for Near-infrared spectroscopy. NIR spectroscopy is the study of the interaction between a sample (e.g. cereals, seeds, oils, finished products) and infrared light that has been dispersed into individual wavelengths, usually by a prism. Fourier transform near-infrared (FT-NIR) spectrometers are used to identify and characterize chemicals and compounds in a test sample. These devices are based on the characteristic absorption or transmission spectrum of chemical bonds, which can be used to identify compounds in the same way a fingerprint can be used to identify an individual.
Near Infrared (NIR) analysis is a spectroscopic technique that utilizes the naturally occurring electromagnetic spectrum. The NIR region is the area of the spectrum defined by wavelengths between 700 nm and 2500 nm.
NIR calibration is a quantitative and qualitative method that provides a more cost-effective and rapid way to measure quality conditions, which are needed to adjust storage management in advance to preserve product quality and market value.
The NOACK Volatility Test, otherwise known as ASTM D-5800, determines the evaporation loss of lubricants in high-temperature service. The more motor oils vaporize, the thicker and heavier they become, contributing to poor circulation, reduced fuel economy and increased oil consumption, wear and emissions.
An octane analyzer conducts routine analysis of gasoline octane ratings, allowing users to obtain lab-accurate octane measurements.
Oil analysis (OA) is the laboratory analysis of a lubricant’s properties, suspended contaminants, and wear debris. OA is performed during routine, predictive maintenance to provide meaningful and accurate information on lubricant and machine condition.
An oil optimizer is an automatic device for production control that enhances engine efficiency. As in optimization generally, the most common goal is minimizing costs while maximizing efficiency.
Oil Production Optimization
Oil production optimization is essentially production control where you minimize, maximize or target production. The most common goal in optimization is minimizing cost while maximizing efficiency.
Online analysis refers to a collection of research techniques used to describe and make inferences about online material through systematic coding and interpretation. Online content analysis is a form of content analysis for analysis of Internet-based communication.
An analyzer is a person or device that analyses given data. Analyzers that are connected to a process, and conduct automatic sampling, can be called online analyzers.
Online calibration is orchestrated by a single process running on an online host computer. Periodic calibration of the measurement sub-systems is one of the most important tasks during data taking.
OPC is a software interface standard that allows Windows programs to communicate with industrial hardware devices. OPC is implemented in server/client pairs.
OPC DA stands for OPC Data Access. It is an OPC Foundation specification that defines how real-time data can be transferred between a data source and a data sink (for example: a PLC and an HMI), without either of them having to know each other’s native protocol.
The OPC server is a software program that converts the hardware communication protocol used by a PLC into the OPC protocol. The OPC client software is any program that needs to connect to the hardware, such as an HMI.
OPC Unified Architecture
OPC Unified Architecture (OPC UA) is a machine-to-machine communication protocol for industrial automation developed by the OPC Foundation.
Operator Training Simulator
An Operator Training Simulator (OTS) is a computer-based training system that uses a dynamic simulation model of an industrial process, usually integrated with an emulator of the process plant’s Distributed Control System (DCS).
Petrochemicals are chemical products derived from petroleum.
Plant Simulation is a computer application developed for modeling, simulating, analyzing, visualizing and optimizing production systems and processes, the flow of materials and logistic operations.
PLC – programmable logic controller
A programmable logic controller (PLC) is a controller unit with its own measurement inputs and the needed control outputs. PLCs are commonly used in the food & beverage industry and on assembly lines.
Process control is continuous control in which the physical system is represented through continuous variables. It uses industrial control systems to achieve a production level that could not be reached purely by manual human control.
Process Control is the active changing of the process based on the results of process monitoring. Once the process monitoring tools have detected an out-of-control situation, the person responsible for the process makes a change to bring the process back into control.
Process Information Management System (PIMS)
PIMS are software solutions that collect real-time data from manufacturing processes into a database. By producing comprehensive reports based on the collected data, they give decision makers clear, real-time information about process performance.
Process Information System
An information processing system is an electrical, mechanical or biological system which takes information in one form and processes it into another, for example into statistics, by an algorithmic process.
Process IT Management System
A process IT management system refers to the administration of the information technology systems in an enterprise data center. An effective systems management plan facilitates the delivery of IT as a service and allows the organization’s employees to respond to changing business requirements in an agile manner.
Process IT System
A process IT System is often a combination of machines, people, and processes that consists of a set of inputs producing a defined set of outputs. The inputs and outputs can be interpreted as data or other information, depending on the interpreter’s relation to the system.
Process management consists of activities of planning and monitoring the performance of a business process. It includes the application of knowledge, skills, tools, techniques and systems to define, visualize, measure, control, report and improve processes.
Process optimization is the action of making changes or modifications to a process, in order to make the process function better in terms of the desired end-results, e.g. utility costs.
Process Quality Control
Quality control is a process by which entities review the quality of all factors involved in production.
Process simulation is used for the design, development, analysis, and optimization of technical processes. It is a model-based representation of chemical, physical, biological, and other technical processes and unit operations in software.
Production optimization is the action of making changes or modifications to a production process to make it function better in terms of the desired end-result. In a typical plant there are hundreds or even thousands of control loops to adjust. The most common goal in optimization is minimizing cost while maximizing efficiency.
QA/QC is the combination of Quality Assurance, the process or set of processes used to measure and assure the quality of a product, and Quality Control, the process of ensuring products and services meet consumer expectations.
Quality Management System
A quality management system (QMS) is a collection of business processes focused on consistently meeting customer requirements and enhancing their satisfaction. It is aligned with an organization’s purpose and strategic direction.
Quality Management Tools
Quality management tools are a designation given to a fixed set of graphical techniques identified as being most helpful in troubleshooting issues related to quality. These tools contain Pareto Principle, Scatter Plots, Control Charts, Flow Charts, Cause and Effect, Fishbone or Ishikawa Diagram, Histogram or Bar Graph, Check Lists and Check Sheets.
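Of the tools listed above, control charts lend themselves to a short numeric sketch: the center line sits at the sample mean and the control limits at plus/minus three standard deviations. The sample values below are made up for illustration.

```python
import statistics

# Illustrative control-chart limits for a set of quality measurements:
# center line at the sample mean, control limits at +/- 3 sigma.
samples = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7]

center = statistics.mean(samples)
sigma = statistics.stdev(samples)
ucl = center + 3 * sigma   # upper control limit
lcl = center - 3 * sigma   # lower control limit

# Any sample outside the limits would signal an out-of-control process.
out_of_control = [x for x in samples if not lcl <= x <= ucl]
```

Real control charts (e.g. X-bar/R charts) estimate sigma from subgroup ranges rather than a single pooled standard deviation, but the limit-setting idea is the same.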
Reliability, Availability and Maintainability (RAM) analysis is a well-known method of estimating the production availability of a system by assessing failure modes, frequencies and consequences, all the while paying attention to the effect on production.
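A core quantity in RAM analysis is steady-state availability, computed from mean time between failures (MTBF) and mean time to repair (MTTR). The figures below are illustrative, not from any real asset.

```python
# Sketch of steady-state availability: the fraction of time a system is
# expected to be operational, given its failure and repair statistics.

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Availability = uptime / (uptime + downtime)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# e.g. failing once per ~990 h of operation and taking 10 h to repair
a = availability(mtbf_hours=990.0, mttr_hours=10.0)
```

A full RAM study would combine such per-component figures through the system's reliability block diagram to estimate overall production availability.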
Real Time Optimization
Real-time optimization (RTO) is a category of closed-loop process control that aims at optimizing process performance in real time. Unlike traditional process controllers, RTO applications are normally built upon model-based optimization systems and are usually large scale. RTO helps systems increase performance and efficiency.
Safe design is the integration of hazard identification, risk assessment and control methods early in the design process to eliminate or minimize risks to health and safety throughout the construction and life of the structure being designed.
A supervisory control and data acquisition (SCADA) system is the equivalent of a DCS where the underlying building blocks are separate PLC controllers. The SCADA system only gathers information from the PLCs, which hold the control algorithms, whereas a DCS also contains the control algorithms itself.
Simulation is the imitation of the operation of a real-world process or system over time. Simulation is used in many contexts, such as simulation of technology for performance optimization, safety engineering, testing, training, education, and video games.
A training simulation is a virtual medium through which various types of skills can be acquired. Training simulations can be used in a variety of genres; however, they are most commonly used in corporate situations to improve business awareness and management skills.
SIS – safety instrumented system
A Safety Instrumented System (SIS) is similar to a DCS in that it receives data from field measurements and performs control actions when needed. A SIS overrides the DCS controllers whenever a logic within the SIS is activated, because the SIS is used solely to guard the process and the environment against hazardous events.
Spectrometer calibration means checking the instrument against a known reference, for example a peak in flame spectrometry or the wavelength of a certain absorption.
Spectrometry is the determination of the structure or quantity of substances by measuring their capacity to absorb light of various wavelengths. It is also called spectrophotometry.
Supply Chain Management System
Supply chain management systems are integrated partnerships among all links in the flow of goods and services to the customer.
Systems simulation is a set of techniques that use computers to imitate the operation of various real-world tasks or processes.
Throughput is the amount of material passing through a system or process. Throughput maximization thus means maximizing that quantity.
Time Series Data
A time series is a series of data points indexed in time order. Most commonly, a time series is a sequence taken at successive equally spaced points in time. Thus it is a sequence of discrete-time data.
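A sequence of equally spaced points lends itself to simple windowed operations; below is a moving average over a short, made-up series of hourly readings.

```python
# Minimal example of equally spaced time-series data: smooth the series
# by averaging each run of `window` consecutive points.

def moving_average(series, window):
    """Return the mean of every `window`-long run of consecutive points."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

temps = [20.0, 21.0, 23.0, 22.0, 24.0]  # e.g. hourly readings (made up)
smoothed = moving_average(temps, window=3)
```

The equal spacing is what makes index arithmetic like this valid; irregularly sampled series need explicit timestamps instead.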
Time Series Data Base
Time Series Data Base (TSDB) is a software system that is optimized for handling time series data, arrays of numbers indexed by time (a datetime or a datetime range). In some fields these time series are called profiles, curves, or traces.
A training environment is a workplace or educational setting designed to assist individuals in gaining work-related skills or competencies. When a student or employee is placed in a training environment, they are provided with instruction and guidance toward learning how to perform specific tasks.
Waste Reduction System
A waste reduction system provides an efficient and low-maintenance way to reduce waste in high-volume operations.