WO1991010961A1 - Process control using neural network - Google Patents

Process control using neural network

Info

Publication number
WO1991010961A1
Authority
WO
WIPO (PCT)
Prior art keywords
values
variables
value
neural network
stage
Prior art date
Application number
PCT/US1991/000141
Other languages
French (fr)
Inventor
S. Keith Grayson
John B. Rudd
Original Assignee
Automation Technology, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Automation Technology, Inc. filed Critical Automation Technology, Inc.
Priority to CA002073331A priority Critical patent/CA2073331A1/en
Publication of WO1991010961A1 publication Critical patent/WO1991010961A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 13/00 Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
    • G05B 13/02 Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
    • G05B 13/0265 Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric the criterion being a learning criterion
    • G05B 13/027 Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric the criterion being a learning criterion using neural networks only
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/10 Interfaces, programming languages or software development kits, e.g. for simulating neural networks
    • G06N 3/105 Shells for specifying net layout

Definitions

  • This invention relates to new and useful improvements to the control of complex multi-variable continuous manufacturing processes. More specifically, the invention relates to the use of techniques to develop, implement and use a neural network to dynamically monitor and adjust a manufacturing process to attain a product that, for example, meets optimum quality and production requirements.
  • Process optimization contemplates a wide variety of situations where there is a need to control variables that are not directly or instantaneously controllable. In these situations, the only mechanism for control is the manipulation of process variables (PVs) that indirectly or belatedly affect the variables, but not in a way that can be easily determined mathematically.
  • a human process expert can empirically derive an algorithm to optimize the indirectly controlled variables; however, as the number of PVs that influence the indirectly controlled variables increases, the complexity of the problem grows by the square of the increase. Since this condition quickly becomes unmanageable, the PVs with less influence are ignored in the solution. Although each of these PVs exhibits low influence when considered alone, the cumulative effect of multiple PVs can have significant influence. Their omission from the optimization algorithm can greatly reduce its accuracy and usability.
  • Fig. 1 illustrates the typical problem.
  • the physical process block represents any complex multi- variable continuous process where several process variables are controlled to produce optimized outputs.
  • the individual PVs are controlled by independent proportional-integral-derivative (PID) controllers in a conventional closed loop strategy.
  • Process variables that cannot be controlled directly or immediately are labeled PV'1, PV'2.
  • the setpoint(s) (SP) of each of these PID loops is determined by the controller which may simply be a process expert (foreman, process engineer, etc.) to produce the desired outputs from the physical process for the current conditions. If the outputs deviate from the optimum, the process expert decides which setpoint(s) should be adjusted and the amount of the adjustments.
  • Neural networks or neural nets are, as the name implies, networks of neurons (elements) that interconnect in a unique way.
  • the networks comprise input neurons that receive signals or information from outside the network, output neurons that transmit signals or information outside the network and at least one hidden layer of neurons that receive and pass along information to other neurons.
  • the potential advantage of neural nets is that, unlike classification logic, they can work with less than perfect input information. See "Neural Networks, Part 1", IEEE Expert, Winter 1987, and "Neural Networks, Part 2", IEEE Expert, Spring 1988. The application of neural networks to commercial applications has not been reported. Two theoretical articles found by applicants relate to very simple or simulated use of neural networks for control.
  • a control system for a multi-variable process wherein certain process variables may only be controlled indirectly by control of a plurality of directly controllable process variables.
  • the system comprises fast-acting controllable devices for changing controllable process variables, a neural network for predicting the value of at least one indirectly controlled variable, and computer means for storing and executing rules that determine the input values for the neural network and the application of the output values of the neural network as set points for the fast-acting devices.
  • the control system is so constructed and arranged that after the neural network has been trained to predict the state of the indirectly controlled variables, a rule associated with an output neuron changes at least one set point value to cause the predicted value of the indirectly controlled variable to approach the desired value of the indirectly controlled variable.
  • the fast-acting devices, for example, motors connected to valves or power supplies controlling electrical heating currents, establish the value of the variable at a set point value applied to said fast-acting device.
  • the fast-acting devices include active elements, for example, motors with feedback controllers that compare the values of the directly controllable variables with the set point values and generate error signals which when applied to the active elements drive the active elements to diminish the error signals.
  • the feedback controllers are PID controllers.
  • the trainable neural networks have a plurality of input neurons for having input values applied thereto and at least one output neuron for outputting an output value.
  • the neural network may be implemented as an integrated circuit defining the neural network including circuitry for implementing a teaching algorithm, for example, a back propagation teaching algorithm, or as a computer program defining the neural network and a teaching algorithm.
  • a computer memory stores an input value for input neurons and at least one computer executable rule associated with an input neuron to establish the input value.
  • the memory also stores an output value for output neurons and at least one computer executable rule for an output neuron.
  • the computer memory stores values and rules associated with each input neuron and values and rules associated with each output neuron in a computer database.
  • a computer having a computer memory maintains a process description database defining the state of the multi-variable process including the instantaneous values of the process variables.
  • Data input means for example, direct I/O connection to process variable measurement means or a terminal keyboard or a connection to a host computer, enables entering the desired value of at least one indirectly controlled variable into the process database.
  • Hardware and associated computer tasks are provided for continuously updating the process database to reflect the values of the directly controllable variables.
  • computer input ports and a computer task for polling the ports or fielding interrupts from the ports transfers process variable values from the ports to the process database.
  • a computer is programmed for executing the rules associated with each input neuron to establish the value of the input neuron and for executing the rules associated with the output neurons for establishing the set point values to be applied to the fast-acting means.
  • the computer executes the rules with an evaluator task which loads the rules and loads the values, if any, required to evaluate the rules from the process database and/or the input neurons and/or the output neurons and evaluates the rules one at a time.
  • Circuitry and a computer task are provided for applying the set point values to the fast-acting devices.
  • the circuitry comprises computer output ports and a computer task for transferring the set point values to the output ports.
  • a computer task for training the neural network to predict a value corresponding to the value of at least one indirectly controllable variable at an output neuron from the values at input neurons corresponding to directly controllable variables.
  • the task for training the neural network comprises presenting a plurality of training sets of input neuron and output neuron values captured from the process while it is in operation to the neural network during a training period in which a back propagation algorithm adjusts the interconnections between neurons.
  • at least one input neuron value and at least one output neuron value are captured at different times.
  • values of variables are stored at spaced time intervals to maintain a process history, for example, in a first- in-first-out buffer.
  • the process history is part of the process database and the rules associated with at least one input neuron take as a parameter a value from the process history.
  • the rules associated with input neurons may comprise averaging, filtering, combining and/or switching rules.
  • the rules associated with output neurons may comprise limit checking, weighting, scaling and/or ratioing rules.
  • the rules associated with at least one output neuron comprise rules for selecting which set points should be modified in order to move the predicted value to the desired value.
  • a computer task stores the values of controllable variables in a first-in-first-out buffer immediately following a settling period in which the rules associated with the input and output neurons are continuously updated to move the predicted value to the desired value.
  • a process history is maintained by the first-in- first-out buffer.
  • the rules associated with at least one input neuron take as a parameter a value from the process history.
  • training comprises presenting to the neural network a plurality of training sets of input neuron and output neuron values captured from the process at different times.
  • Fig. 1 illustrates the arrangement of a process control system for a complex multi-variable process
  • Fig. 2 illustrates the overall structure of the neural optimization controller as a block diagram
  • Fig. 3 is a flow chart of the control structure showing overall program segments;
  • Fig. 4 is a flow chart of program segment MVARINIT;
  • Fig. 5 is a flow chart of program segment DAC
  • Fig. 6 is a flow chart of program segment PREPROC
  • Fig. 7 is a flow chart of program segment NETSUBMT; Fig. 8 is a flow chart of program segment POSTPROC; Fig. 9 is a flow chart of program segment DOUT; and Fig. 10 is a flow chart of the training program;
  • Figs. 11 and 11A are schematics illustrating the equipment for a continuous bleaching process with process inputs numbered and process variables lettered.
  • the multi-variable processing system as shown in Fig. 1 is provided with the unique control system illustrated in block diagram form in Fig. 2.
  • a database is first defined and customized for each specific process application. This is done through three computer software implemented functions requiring operator interaction through the console monitor and keyboard; namely, configuration, neural network training, and monitoring of the system.
  • the user enters all of the information necessary to define the physical inputs and outputs, the input processing rules and the output processing rules, the training database and protocol to communicate with all foreign devices.
  • the database configuration tools, data acquisition, rules and rule processing, training networks and monitoring may be distributed among separate systems, for example, process control, neural network controller system, and host computer, with communication software provided for transferring data between systems as required.
  • the tasks reside in and are part of a stand-alone multi-tasking operating system.
  • the user is able to train the neural network.
  • Data points are specified that are to be included in the database, screens are created for the operator to view while the system is running, and this information is compiled.
  • through the train neural network function, the data is added to the training set, where current data is edited. Once the data is in the desired form, the system is directed to use the data to develop a trained network.
  • the trained neural network is then used in the run time system.
  • the run time system provides for automatically collecting data to update the training set and to re-train the network.
  • This invention is based upon the application of neural network technology which is a tool that attempts to classify patterns according to other patterns that have been "learned" previously and gives the most reasonable answers based upon the variety of learned patterns. It is not guaranteed to always be correct and its results must be evaluated in terms of the percentage of answers that match those an expert might give.
  • the technology is similar to the biological neural functions that it simulates and is significantly different from other conventional software systems.
  • the basic building block of the neural network technology is the simulated neuron which processes a number of inputs to produce an output.
  • Inputs and outputs are usually numeric values between 0 and 1, where values close to 1 represent positive stimulation, and 0 represents either negative or no stimulation.
  • Inputs to the neuron may come from the "outside world” or from other neurons, and outputs may go to other neurons or to the "outside world”.
  • the process used by the neuron to process its inputs is usually a summation of inputs followed by a linear function applied to the sum called an output function.
  • Independent neurons are of little value by themselves; however, when interconnected in a network and "trained", the resulting network becomes a useful tool.
  • Neurons, also called nodes, in the first layer receive their inputs from the outside and are called input nodes; neurons in the output layer send their outputs to the outside and are called output nodes.
  • Each of the input nodes indirectly passes its output to each of the output nodes for final processing.
  • As the values are passed, they are weighted to represent connection strengths. To positively reinforce a connection, the weight is raised, and to negatively reinforce or inhibit a connection the weight is lowered. The determination of the weighting factors takes place in the training process.
  • Training is the process where inputs with a known result are supplied to the network and the errors between the predicted results and the known results are used to adjust the weights. This process is repeated until the error is reduced to an acceptable tolerance.
  • the network should be capable of reproducing correct output patterns when presented with input patterns similar to those learned.
  • a two-layer network is not capable of learning complex patterns, for example, the exclusive OR pattern.
  • by adding additional layers and using a training technique referred to as back propagation, this deficiency is overcome.
  • the nodes in a layer between the input and output nodes are called hidden nodes.
  • the theory behind back propagation learning is referred to as the generalized Delta Rule and is detailed in Rumelhart and McClelland, "Parallel Distributed Processing", Vol. 1, pp. 444-459.
  • FIG. 2 there is shown a block diagram of the system according to this invention for controlling a physical multi-variable manufacturing process.
  • the entire system has been implemented using a combination of hardware and software to accomplish the functions represented by the blocks labeled "Configuration", "Training", “Monitoring” and "Run Time” System.
  • the configuration function allows the user to define the specific process application.
  • the training function allows the user to automatically generate neural networks based upon the configuration information and train the network based upon the training data collected.
  • the monitoring function allows the user to view process data and neural network outputs.
  • the run time function uses a previously trained neural network to implement real time control of the process based upon the user configuration.
  • the descriptions are based on an embodiment utilizing Ashton-Tate's dBASE IV programming language and the NeuroShell neural network emulation program.
  • the configuration function allows the user to build databases that define the physical process points, the network configuration and the I/O driver protocols.
  • Physical process points are defined by assigning a tagname, description, unit of measurement and range of values for each point to be used by the system. These points include physical inputs, calculated values, manually entered values and physical outputs.
  • the definition of each point is stored in the database named POINTDEF.
  • the entries for an exemplary process definition database are set forth in Table I.
  • Table 1 corresponds to the pulp bleaching process shown schematically in Figs. 11 and 11A.
  • the tagnames in the table correspond with the inputs and outputs labeled on Fig. 11.
  • the entries in the column headed "Value” contain values for the variable identified by tagname in the left-most column. Variable values, of course, may be changed during processing.
  • the entry in the column headed "Mode" indicates the type of input or output as follows: "AI" means analog input, "MAN" means manual input, "CAL" means manually or automatically calculated value and "NO" means neural net output.
  • the neural network is defined by assigning a tagname, description, units of measurement and range of values for each input and output neuron required. Processing rules are also defined for each neuron. Processing rules for input neurons are executed prior to submitting the input values to the trained network and are therefore referred to as "pre-processing" rules. Processing rules associated with output neurons are executed after obtaining the output values from the trained network and are referred to as "post-processing" rules.
  • the definition of each neuron and its associated processing rules are stored in the database named NETDEF.
  • the entries in an exemplary neural network database are set forth in Table II with the exception of the rules.
  • the I/O driver protocols are configured by defining the type of process control equipment to be interfaced with the address or tagname of the I/O points in the process control system and the communication parameters required.
  • the process control equipment may be Programmable Logic Controllers (PLC) , Distributed Control Systems (DCS) or individual analog controllers.
  • the training function allows the user to generate and train neural networks for his specific application. A new network is generated after the user completes the configuration process described above. Information from the NETDEF.DBF database is used to automatically build the files required by the NeuroShell program to define the neural network.
  • the "Define Training Set" function allows the user to define the names of four training sets and the conditions required for adding training data to each set. The training data is evaluated for inclusion in a training set when actual operating data is provided to the neural network. If there is a significant error between the operating data and the predicted values, the neural input values associated with that prediction and the operating data are added to the appropriate training set.
  • the monitoring function allows the user to view process data and neural network outputs.
  • the browse function is used to view data in the point definition and network definition databases.
  • the monitoring function would typically be implemented as an integral part of the process control system. The integration of this function into the process operator's interface is highly desirable and will be a standard feature of future embodiments.
  • the control function uses a previously trained neural network to implement real-time control of the process. This functionality is implemented in the dBASE program CONTROL.PRG. The overall structure of CONTROL.PRG is shown in Fig. 3.
  • The program comprises six program segments that are described as follows (an illustrative sketch of the overall control cycle appears at the end of this section):
  • MVARINIT is the variable initialization procedure for CONTROL.
  • the primary function of MVARINIT is to define variable structures and initialize variable values.
  • a public memory variable is defined for each point record in the point definition database and for each neuron record in the network definition database. These memory variables are used to hold the point values for the other procedures of CONTROL.
  • DAC is the data acquisition procedure for CONTROL.
  • the primary function of DAC is to acquire data from manual entry points in the point definition database and read field inputs by calling the appropriate I/O driver.
  • PREPROC is the rule based neural input pre-processing procedure for CONTROL.
  • the procedure executes the pre-processing rules for each input neuron configured in the network definition database.
  • the rules can be any valid dBASE IV command.
  • NETSUBMT is the neural network interface procedure for CONTROL.
  • the procedure builds the classification file required to submit data to a trained NeuroShell network.
  • NETSUBMT calls NeuroShell to execute the network.
  • NeuroShell places the resulting neuron output values in the classification file and returns to NETSUBMT.
  • NETSUBMT then reads the neural outputs from the classification file and moves the results to the output variables.
  • POSTPROC is the rule based neural output post-processing procedure for CONTROL.
  • the procedure executes the post-processing rules for each output neuron configured in the network definition database.
  • the rules can be any valid dBASE IV command.
  • DOUT is the data distribution procedure for CONTROL.
  • the primary function of DOUT is to write data to the network definition database and the point definition database and write field outputs by calling the appropriate I/O driver.
  • the TRAIN program is used to intelligently select data sets from the process data, in an on-line manner, and store the data in training sets to be used for training and updating networks.
  • the predicted values are compared to actual values. If the difference exceeds, say, 2%, then additional training sets are gathered to retrain the neural network.
  • Each set of neural input values and actual values (a training set) is tested to determine if it is a valid training set.
  • a training set may be invalid because of a short or open in a lead wire from an analog input or for other reasons that put an input value outside of a logical range.
  • Referring to Figs. 11 and 11A, there is shown in schematic form a continuous process used in the paper-making industry controlled by a neural network according to the teachings of this invention. Shown is the initial portion of a bleach line. It should be understood that this is exemplary of the processes which may be controlled by a neural network according to this invention. This invention can be applied to numerous other processes.
  • the process variable inputs to the controller are labeled with letters and the process target value outputs from the controller are labeled with numerals.
  • the tagnames for process variables appear in circles, squares and hexagons.
  • the properties or process parameters that correspond to the tagnames are set forth in Table I.
  • the main objective of the bleaching process is to increase the brightness (whiteness) of a pulp by the removal and modification of certain constituents of the unbleached pulp.
  • the process involves using chemicals to oxidize and/or replace the major color component (lignin) in the pulp suspension.
  • the measure of lignin content in the finally processed pulp is known as the Kappa-number (CEK) .
  • This measure is taken after the stock suspension passes through two process stages (the chlorination or C-Stage and the extraction or E/O-Stage) .
  • the measured CEK (BI115B) cannot be directly controlled.
  • An optical sensor (BI115) is used to measure the change in color (brightness) at an early stage of the process.
  • the brightness sensor, being located approximately 1 1/2 to 2 minutes into the reaction, measures a variable the value of which is used by the neural net along with many other variable values to generate a projection of the final brightness. Because this projection is only an indirect measurement of the desired value, the projection is affected by variables such as temperature, stock consistency, flow rate, chemical concentration and so forth.
  • the chemical reaction is a two-part reaction consisting of oxidation and substitution.
  • a portion of the chlorinated lignin is water soluble and is washed out of the pulp. However, a portion is not water soluble and must be modified to be removed. This takes place approximately one-third to one-half the way through the chlorination reaction.
  • the proportionate amount of caustic (NaOH) is added to render the final chlorinated lignin water soluble.
  • other chemicals are also added to enhance the brightening process at different points or stages in the process. Some examples of these chemicals are oxygen, peroxide and hypochlorite.
  • the location of the addition point and the amount of auxiliary chemical added, along with other conditions about the addition point, have an effect on the final CEK number. Interrelationships of the many variables are difficult to determine and in some cases are unknown.
  • the process time constant (90 to 120 minutes) is long compared to the time to adjust the controlled variables (1 to 120 seconds) .
  • the process variable inputs (A...KK) are measured by various transducers and applied to the controller (shown as circles) .
  • Manual entries of process variables are also applied to the controller (shown as squares) .
  • calculated variables are applied to the controller (shown as hexagons) .
  • the variable which the process seeks to control (CEK) is determined in the laboratory based on samples taken at the output of the E/O washer stage.
  • the outputs from the controller (1...8) are applied to provide the operator a read-out or to make corrections to the system.
  • Output 1 is used along with the C-Stage brightness measurement BI115 to calculate a compensated CEK number which is predictive of the CEK lab entry value (BI115B) .
  • the purpose of the control system is to adjust controlled process variables so that the predicted and tested CEK values approach a desired value. (If the predicted and measured CEK values are not reasonably close, then the neural net has not been adequately trained or requires retraining.)
  • Output 1 adjusts the set points for two PID controllers in a first stage of the process wherein Cl2 and ClO2 are added to the raw pulp.
  • Output 3 adjusts the set point of a PID controller for the addition of caustic (NaOH) to the washer following the C-Stage.
  • Output 4 adjusts the set point of a PID controller for addition of O2 to the E/O-Stage. If the predicted CEK (predicted by brightness measure BI115 and the other variable values applied to the neural net) is not at the desired level, the controlled variables just mentioned can be adjusted to move the predicted value to the desired value.
  • a mechanism to present input data to the neural network which data is derived from a variety of sources including: a) analog and discrete inputs which are closely coupled to the neural network by direct I/O allowing the input of actual process signals with and without signal conditioning; and b) interrogation of information from process control systems and host computers which require some sort of intelligent communication protocol.
  • a method to process input and output signals including: a) pre-processing or input processing which includes such features as averaging, filtering, combining and signal switching of signals prior to presentation to the neural network; b) the output or post-processing of neural network outputs including the application of rule-based knowledge in addition to limit checks, comparisons, weighting, scaling and ratioing.
  • a system for final control including: a) a mechanism for processing the outputs residing on the same system where the inputs were received or; b) a system in which inputs may be received from one or multiple systems and outputs may be sent to other systems or to a host system for data logging and archiving.
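The control cycle outlined above (MVARINIT followed by repeated passes through DAC, PREPROC, NETSUBMT, POSTPROC and DOUT) can be pictured with the following Python outline. This is an illustrative sketch only, not the patent's dBASE IV CONTROL.PRG; the function bodies are placeholders that mark where each program segment's work would go.

# Illustrative outline of the CONTROL program cycle described above. This is not
# the patent's dBASE IV CONTROL.PRG; the function bodies are placeholders.
import time

def mvarinit(db):
    """Define and initialize a memory variable for each point and neuron record."""
    db.update({"FI104": 0.0, "NI104": 0.0, "NO101": 0.0, "FI104SP": 0.0})  # example points only

def dac(db):
    """Acquire manual entries and read field inputs via the appropriate I/O driver."""

def preproc(db):
    """Execute the pre-processing rules to establish each input-neuron value."""

def netsubmt(db):
    """Build the classification file, run the trained network, read back the outputs."""

def postproc(db):
    """Execute the post-processing rules to turn network outputs into set points."""

def dout(db):
    """Write results to the databases and send set points out via the I/O driver."""

def control(cycles, period_seconds):
    db = {}
    mvarinit(db)
    for _ in range(cycles):
        for segment in (dac, preproc, netsubmt, postproc, dout):
            segment(db)
        time.sleep(period_seconds)

control(cycles=2, period_seconds=0.1)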

Abstract

A control system (figure 1) and method for a continuous process in which a trained neural network (figure 2) predicts the value of an indirectly controlled process variable and the values of directly controlled process variables are changed to cause the predicted value to approach a desired value.

Description

PROCESS CONTROL USING NEURAL NETWORK
This invention relates to new and useful improvements to the control of complex multi-variable continuous manufacturing processes. More specifically, the invention relates to the use of techniques to develop, implement and use a neural network to dynamically monitor and adjust a manufacturing process to attain a product that, for example, meets optimum quality and production requirements. Process optimization contemplates a wide variety of situations where there is a need to control variables that are not directly or instantaneously controllable. In these situations, the only mechanism for control is the manipulation of process variables (PVs) that indirectly or belatedly affect the variables, but not in a way that can be easily determined mathematically. A human process expert can empirically derive an algorithm to optimize the indirectly controlled variables; however, as the number of PVs that influence the indirectly controlled variables increases, the complexity of the problem grows by the square of the increase. Since this condition quickly becomes unmanageable, the PVs with less influence are ignored in the solution. Although each of these PVs exhibits low influence when considered alone, the cumulative effect of multiple PVs can have significant influence. Their omission from the optimization algorithm can greatly reduce its accuracy and usability.
Fig. 1 illustrates the typical problem. The physical process block represents any complex multi-variable continuous process where several process variables are controlled to produce optimized outputs. The individual PVs are controlled by independent proportional-integral-derivative (PID) controllers in a conventional closed loop strategy. Process variables that cannot be controlled directly or immediately are labeled PV'1, PV'2. The setpoint(s) (SP) of each of these PID loops is determined by the controller which may simply be a process expert (foreman, process engineer, etc.) to produce the desired outputs from the physical process for the current conditions. If the outputs deviate from the optimum, the process expert decides which setpoint(s) should be adjusted and the amount of the adjustments. In many processes, there may be a significant lag time between corrective action taken and the results of that action. This lag time is not caused by the response time of the individual PID loops, but is a result of the natural time constant of the physical process. PID loops normally respond rapidly when compared to the total process time constant. There are problems with trying to optimize this type of process because of the process time constants and shifts in the steady state domain of the process. The corrections to these problems are not easily determined mathematically. It requires the experience of a process expert to derive the corrective actions.
Efforts to overcome and develop optimum solutions to the above problems have varied. In the simplest case, multi-variable linear regression has been used to develop correlation equations. In more complex applications, actual mathematical models have been developed to approximate the process so that results are projected and changes to the SPs are made without having to wait for the process to respond with its normal time cycle. In the most recent efforts, the experiences of the process experts have been reduced to a set of rules that are implemented in a computer and/or microprocessor to change the SPs. A rule based expert system "quantifies" human intelligence, building a database of rules, derived from one or multiple experts, to govern what to do when predefined measurable situations arise. This prior art has shortcomings: 1) known mathematical relationships between the process variables being considered may not exist; 2) assumptions must be made to reduce the relationships between the process variables to a domain that can be mathematically modeled; and 3) expert rules cannot be derived because experts do not exist or multiple experts do exist and their opinions differ.
Neural networks or neural nets are, as the name implies, networks of neurons (elements) that interconnect in a unique way. Typically, the networks comprise input neurons that receive signals or information from outside the network, output neurons that transmit signals or information outside the network and at least one hidden layer of neurons that receive and pass along information to other neurons. The potential advantage of neural nets is that, unlike classification logic, they can work with less than perfect input information. See "Neural Networks, Part 1", IEEE Expert, Winter 1987, and "Neural Networks, Part 2", IEEE Expert, Spring 1988. The application of neural networks to commercial applications has not been reported. Two theoretical articles found by applicants relate to very simple or simulated use of neural networks for control. "An Associative Memory Based Learning Control Scheme With PI-Controller For SISO-Nonlinear Processes", Ersu and Wienand, IFAC Microcomputer Application in Process Control (1986) and "A Symbolic-Neural Method for Solving Control Problems", Suddarth, Sutton and Holden, IEEE International Conference on Neural Networks, Vol. 1 (1988).
Recently, software for emulating neural networks on personal-size computers has become commercially available. One such program sold under the trademark "NeuroShell" is described in the accompanying user manual "NeuroShell Neural Network Shell Program", Ward Systems Group, Inc., Frederick, MD 21701 (1988). This manual is incorporated herein by reference.
It is an advantage, according to this invention, to use the existing neural network technology, either hardware or software, in a user configurable system for process optimization.
Briefly, according to this invention, there is provided a control system for a multi-variable process wherein certain process variables may only be controlled indirectly by control of a plurality of directly controllable process variables. The system comprises fast-acting controllable devices for changing controllable process variables, a neural network for predicting the value of at least one indirectly controlled variable, and computer means for storing and executing rules that determine the input values for the neural network and the application of the output values of the neural network as set points for the fast-acting devices. The control system is so constructed and arranged that after the neural network has been trained to predict the state of the indirectly controlled variables, a rule associated with an output neuron changes at least one set point value to cause the predicted value of the indirectly controlled variable to approach the desired value of the indirectly controlled variable.
The fast-acting devices, for example, motors connected to valves or power supplies controlling electrical heating currents, establish the value of the variable at a set point value applied to said fast-acting device. Preferably, the fast-acting devices include active elements, for example, motors with feedback controllers that compare the values of the directly controllable variables with the set point values and generate error signals which when applied to the active elements drive the active elements to diminish the error signals. Most preferably, the feedback controllers are PID controllers.
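By way of illustration only (the patent does not give controller equations), a discrete-time PID feedback controller of the kind referred to above can be sketched as follows; the gains, sample time and example values are arbitrary assumptions.

# Illustrative sketch of a discrete-time PID feedback controller. Gains, sample
# time and the example flow values are assumptions, not values from the patent.
class PIDController:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, set_point, measured_value):
        """Return the actuator command that drives the measured value toward the set point."""
        error = set_point - measured_value
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: a flow loop whose set point would later be supplied by an output-neuron rule.
loop = PIDController(kp=2.0, ki=0.5, kd=0.1, dt=1.0)
valve_command = loop.update(set_point=540.0, measured_value=512.0)
print(valve_command)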
The trainable neural networks have a plurality of input neurons for having input values applied thereto and at least one output neuron for outputting an output value. The neural network may be implemented as an integrated circuit defining the neural network including circuitry for implementing a teaching algorithm, for example, a back propagation teaching algorithm, or as a computer program defining the neural network and a teaching algorithm.
A computer memory stores an input value for input neurons and at least one computer executable rule associated with an input neuron to establish the input value. The memory also stores an output value for output neurons and at least one computer executable rule for an output neuron. Preferably, the computer memory stores values and rules associated with each input neuron and values and rules associated with each output neuron in a computer database.
A computer having a computer memory maintains a process description database defining the state of the multi-variable process including the instantaneous values of the process variables. Data input means, for example, direct I/O connection to process variable measurement means or a terminal keyboard or a connection to a host computer, enables entering the desired value of at least one indirectly controlled variable into the process database. Hardware and associated computer tasks are provided for continuously updating the process database to reflect the values of the directly controllable variables. Preferably, computer input ports and a computer task for polling the ports or fielding interrupts from the ports transfers process variable values from the ports to the process database. A computer is programmed for executing the rules associated with each input neuron to establish the value of the input neuron and for executing the rules associated with the output neurons for establishing the set point values to be applied to the fast-acting means. Preferably, the computer executes the rules with an evaluator task which loads the rules and loads the values, if any, required to evaluate the rules from the process database and/or the input neurons and/or the output neurons and evaluates the rules one at a time.
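The evaluator task described above can be pictured with the following Python sketch. It is an illustrative stand-in, not the patent's dBASE IV implementation: rules are represented as small callables keyed by neuron tagname and are evaluated one at a time against a dictionary standing in for the process database (the two example rules loosely mirror those shown later in Tables III and VII).

# Minimal sketch of an evaluator task: each rule is loaded and evaluated in turn,
# and the result is written back to the process database. The dictionaries and
# rule bodies are illustrative assumptions.
process_db = {"FI106": 562.3, "FI104": 538.0, "NI106": 0.0, "NI104": 540.0}

input_rules = {
    "NI106": lambda db: db["FI106"],                                        # pass flow straight through
    "NI104": lambda db: db["FI104"] if db["FI104"] < 580 else db["NI104"],  # range check, else hold last value
}

def evaluate_rules(rules, db):
    """Evaluate the rules one at a time, updating the process database."""
    for neuron, rule in rules.items():
        db[neuron] = rule(db)

evaluate_rules(input_rules, process_db)
print(process_db["NI106"], process_db["NI104"])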
Circuitry and a computer task are provided for applying the set point values to the fast-acting devices. Preferably, the circuitry comprises computer output ports and a computer task for transferring the set point values to the output ports.
A computer task is provided for training the neural network to predict a value corresponding to the value of at least one indirectly controllable variable at an output neuron from the values at input neurons corresponding to directly controllable variables. The task for training the neural network comprises presenting a plurality of training sets of input neuron and output neuron values captured from the process while it is in operation to the neural network during a training period in which a back propagation algorithm adjusts the interconnections between neurons. According to one embodiment, at least one input neuron value and at least one output neuron value are captured at different times. According to a preferred embodiment, values of variables are stored at spaced time intervals to maintain a process history, for example, in a first-in-first-out buffer. Most preferably, the process history is part of the process database and the rules associated with at least one input neuron take as a parameter a value from the process history. The rules associated with input neurons may comprise averaging, filtering, combining and/or switching rules. The rules associated with output neurons may comprise limit checking, weighting, scaling and/or ratioing rules. The rules associated with at least one output neuron comprise rules for selecting which set points should be modified in order to move the predicted value to the desired value.
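A minimal Python sketch of the process-history idea follows; the buffer depth, sampling interval and capture delay are arbitrary assumptions, not values from the patent.

# Illustrative sketch of the process history: directly controllable variable
# values are pushed into a fixed-length first-in-first-out buffer at spaced
# intervals, and a training example pairs inputs captured earlier with an
# indirectly controlled value measured later. Depth and delay are assumptions.
from collections import deque

HISTORY_DEPTH = 120          # e.g. 120 one-minute samples
DELAY_SAMPLES = 90           # assumed dead time between inputs and the measured result

history = deque(maxlen=HISTORY_DEPTH)   # first-in-first-out process history
training_set = []

def record_sample(input_values):
    """Store the current directly controllable variable values in the FIFO buffer."""
    history.append(list(input_values))

def capture_training_example(measured_output):
    """Pair a newly measured indirectly controlled value with the inputs in effect DELAY_SAMPLES ago."""
    if len(history) > DELAY_SAMPLES:
        delayed_inputs = history[-1 - DELAY_SAMPLES]
        training_set.append((delayed_inputs, measured_output))

for minute in range(100):
    record_sample([600.0 + minute, 3216.0])
capture_training_example(measured_output=12.6)
print(len(training_set))   # 1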
According to a preferred embodiment, a computer task stores the values of controllable variables in a first-in-first-out buffer immediately following a settling period in which the rules associated with the input and output neurons are continuously updated to move the predicted value to the desired value. In this way, a process history is maintained by the first-in-first-out buffer. The rules associated with at least one input neuron take as a parameter a value from the process history. This embodiment is useful where the actual value of the indirectly controlled process variable cannot be determined until some period after it has been predicted, for example, where the indirectly controllable variable is only determinable at a later time downstream from the fast-acting devices which control variables which may be used to alter the value of the indirectly controllable variable. In this embodiment, training comprises presenting to the neural network a plurality of training sets of input neuron and output neuron values captured from the process at different times. Further features and other objects and advantages will become clear to those of ordinary skill in the art from the following description made with reference to the drawings in which:
Fig. 1 illustrates the arrangement of a process control system for a complex multi-variable process;
Fig. 2 illustrates the overall structure of the neural optimization controller as a block diagram;
Fig. 3 is a flow chart of the control structure showing overall program segments;
Fig. 4 is a flow chart of program segment MVARINIT;
Fig. 5 is a flow chart of program segment DAC;
Fig. 6 is a flow chart of program segment PREPROC;
Fig. 7 is a flow chart of program segment NETSUBMT;
Fig. 8 is a flow chart of program segment POSTPROC;
Fig. 9 is a flow chart of program segment DOUT; and
Fig. 10 is a flow chart of the training program.
Figs. 11 and 11A are schematics illustrating the equipment for a continuous bleaching process with process inputs numbered and process variables lettered.
According to this invention, the multi-variable processing system as shown in Fig. 1 is provided with the unique control system illustrated in block diagram form in Fig. 2. Referring to Fig. 2, a database is first defined and customized for each specific process application. This is done through three computer software implemented functions requiring operator interaction through the console monitor and keyboard; namely, configuration, neural network training, and monitoring of the system. During configuration, the user enters all of the information necessary to define the physical inputs and outputs, the input processing rules and the output processing rules, the training database and protocol to communicate with all foreign devices. The database configuration tools, data acquisition, rules and rule processing, training networks and monitoring may be distributed among separate systems, for example, process control, neural network controller system, and host computer, with communication software provided for transferring data between systems as required. In an alternative embodiment, the tasks reside in and are part of a stand-alone multi-tasking operating system.
Through the training function, the user is able to train the neural network. Data points are specified that are to be included in the database, screens are created for the operator to view while the system is running, and this information is compiled. With the train neural network function, the data is added to the training set where current data is edited. Once the data is in the desired form, the system is directed to use the data to develop a trained network.
The trained neural network is then used in the run time system. The run time system provides for automatically collecting data to update the training set and to re-train the network.
This invention is based upon the application of neural network technology which is a tool that attempts to classify patterns according to other patterns that have been "learned" previously and gives the most reasonable answers based upon the variety of learned patterns. It is not guaranteed to always be correct and its results must be evaluated in terms of the percentage of answers that match those an expert might give. In this regard, the technology is similar to the biological neural functions that it simulates and is significantly different from other conventional software systems.
The basic building block of the neural network technology is the simulated neuron which processes a number of inputs to produce an output. Inputs and outputs are usually numeric values between 0 and 1, where values close to 1 represent positive stimulation, and 0 represents either negative or no stimulation. Inputs to the neuron may come from the "outside world" or from other neurons, and outputs may go to other neurons or to the "outside world". The process used by the neuron to process its inputs is usually a summation of inputs followed by a linear function applied to the sum called an output function.
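A minimal sketch of such a simulated neuron follows: a weighted sum of the inputs is passed through an output function that maps the result into the 0-to-1 range. The sigmoid output function and the example weights are illustrative assumptions, not details taken from the patent.

# Minimal sketch of a simulated neuron: a weighted sum of the inputs followed by
# an output function mapping the sum into the 0-to-1 range. The sigmoid and the
# example weights are illustrative assumptions.
import math

def neuron_output(inputs, weights, bias=0.0):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))   # output function

# Example: three stimulations near 1 drive the output toward 1.
print(neuron_output([0.9, 0.8, 0.95], [1.2, 0.7, 1.0]))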
Independent neurons are of little value by themselves; however, when interconnected in a network and "trained", the resulting network becomes a useful tool. Neurons, also called nodes, in the first layer receive their inputs from the outside and are called input nodes, and neurons in the output layer send their outputs to the outside and are called output nodes. Each of the input nodes indirectly passes its output to each of the output nodes for final processing. As the values are passed, they are weighted to represent connection strengths. To positively reinforce a connection, the weight is raised, and to negatively reinforce or inhibit a connection the weight is lowered. The determination of the weighting factors takes place in the training process.
Training is the process where inputs with a known result are supplied to the network and the errors between the predicted results and the known results are used to adjust the weights. This process is repeated until the error is reduced to an acceptable tolerance. Upon completion of training, the network should be capable of reproducing correct output patterns when presented with input patterns similar to those learned. A two-layer network is not capable of learning complex patterns, for example, the exclusive OR pattern. However, by adding additional layers and using a training technique referred to as back propagation, this deficiency is overcome. The nodes in a layer between the input and output nodes are called hidden nodes. The theory behind back propagation learning is referred to as the generalized Delta Rule and is detailed in Rumelhart and McClelland, "Parallel Distributed Processing", Vol. 1, pp. 444-459.
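For illustration, the following self-contained Python sketch trains a network with one hidden layer on the exclusive-OR pattern mentioned above using back propagation; the layer size, learning rate, epoch count and bias handling are assumptions, not details taken from the patent or from NeuroShell.

# Illustrative back-propagation sketch: one hidden layer learning the exclusive-OR
# pattern that a network with no hidden layer cannot represent. Layer size,
# learning rate and epoch count are arbitrary assumptions.
import math, random

random.seed(1)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

PATTERNS = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]   # (inputs, target)

H = 4                                     # hidden nodes
LR = 0.5                                  # learning rate
w_ih = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]   # 2 inputs + bias
w_ho = [random.uniform(-1, 1) for _ in range(H + 1)]                   # H hidden + bias

def forward(x):
    xb = list(x) + [1.0]                                   # inputs plus bias term
    h = [sigmoid(sum(w * v for w, v in zip(w_ih[j], xb))) for j in range(H)]
    hb = h + [1.0]                                         # hidden outputs plus bias term
    y = sigmoid(sum(w * v for w, v in zip(w_ho, hb)))
    return xb, hb, y

for _ in range(10000):                                     # training epochs
    for x, t in PATTERNS:
        xb, hb, y = forward(x)
        dy = (t - y) * y * (1 - y)                         # output error term (generalized delta rule)
        for j in range(H):
            dh = dy * w_ho[j] * hb[j] * (1 - hb[j])        # error propagated back to hidden node j
            for i in range(3):
                w_ih[j][i] += LR * dh * xb[i]
        for j in range(H + 1):
            w_ho[j] += LR * dy * hb[j]

print([round(forward(x)[2], 2) for x, _ in PATTERNS])      # typically approaches [0, 1, 1, 0]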
The operational details of the actual neural network software program evaluated and used in the present described embodiments can be found in the referenced operating manual "NeuroShell Neural Network Shell Program."
Referring again to Fig. 2, there is shown a block diagram of the system according to this invention for controlling a physical multi-variable manufacturing process. The entire system has been implemented using a combination of hardware and software to accomplish the functions represented by the blocks labeled "Configuration", "Training", "Monitoring" and "Run Time" System. The configuration function allows the user to define the specific process application. The training function allows the user to automatically generate neural networks based upon the configuration information and train the network based upon the training data collected. The monitoring function allows the user to view process data and neural network outputs. The run time function uses a previously trained neural network to implement real time control of the process based upon the user configuration. Each of these functions is described in further detail below. The descriptions are based on an embodiment utilizing Ashton-Tate's dBASE IV programming language and the NeuroShell neural network emulation program. The configuration function allows the user to build databases that define the physical process points, the network configuration and the I/O driver protocols. Physical process points are defined by assigning a tagname, description, unit of measurement and range of values for each point to be used by the system. These points include physical inputs, calculated values, manually entered values and physical outputs. The definition of each point is stored in the database named POINTDEF. The entries for an exemplary process definition database are set forth in Table I.
TABLE I

TAGNAME  DESCRIPT
MI101    Production Rate
FI102    Stock Flow to Bleach Pit
CI103A   Stock Consistency
FI104    Injection Water Flow
FI105    ClO2 Total Flow
FI106    Clean Shower Flow
TI107    Clean Shower Temperature
FI108    NaOH Shower Flow
TI109    NaOH Shower Temperature
TI110    Vat Temperature
LI111    Vat Level
ZI112    Vat Dilution Valve Pos.
CI113    Predicted Vat Consist.
CI114    Predicted Mat Consist.
CI103    Sel Stock Consistency
CI103B   Stock Consistency
GRADE    Pine/Hardwood
BI115    C-Stage Brightness
RI116    C-Stage Residual
FI117    Chlorine Flow
MI117    Percent Applied Chlorine
FI118    Caustic Flow
MI118    Percent Applied Caustic
FR117    Cl2/ClO2 Substitution
DI105    ClO2 Concentration
DI118    NaOH Concentration
TI119    Chlorine Gas Temp.
PI120    Chlorine Gas Pressure
TI121    Stock Temp. Entering BP
MI122    Mass Rate to Cl2 Washer
FI123    Flow Out of Cl2 Washer
CI124    Stk Consist. Cl2 Wash Ot
MI125    Mass Rate from Cl2 Wash
TI126    Oxygen Gas Temperature
PI127    Oxygen Gas Pressure
FI128    E-Stage Shower Flow
TI129    E-Stage Shower Temp.
TI130    E-Stage Vat Temperature
LI131    E-Stage Vat Level
ZI132    E-Stage Dil. Valve Pos.
CI133    E-Stage Vat Consistency
CI134    E-Stage Mat Consistency
LI135    E-Stage Tower Level
MI136    Mass Rate to E-Washer
BI115B   CEK Lab Entry Value
BI115A   Compensated CEK No.
AI133    E-Stage Entry pH
FI134    Oxygen Flow
MI134    Percent Applied Oxygen
(The Value, Units and Mode columns of Table I appear only as figures in the original publication.)
The process defined in Table I corresponds to the pulp bleaching process shown schematically in Figs. 11 and 11A. The tagnames in the table correspond with the inputs and outputs labeled on Fig. 11. The entries in the column headed "Value" contain values for the variable identified by tagname in the left-most column. Variable values, of course, may be changed during processing. The entry in the column headed "Mode" indicates the type of input or output as follows: "AI" means analog input, "MAN" means manual input, "CAL" means manually or automatically calculated value and "NO" means neural net output.
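For illustration, a POINTDEF record of the kind just described might be pictured as follows; the Python dataclass and the example mode assignments are assumptions, not the patent's dBASE IV schema.

# Illustrative sketch of a point-definition record like those in POINTDEF.
# Field names mirror Table I; the dataclass and the example mode assignments
# are assumptions, not the patent's dBASE IV schema.
from dataclasses import dataclass

@dataclass
class PointDef:
    tagname: str       # e.g. "FI104"
    descript: str      # e.g. "Injection Water Flow"
    value: float       # current value, updated while the process runs
    units: str         # e.g. "GPM"
    mode: str          # "AI" analog input, "MAN" manual, "CAL" calculated, "NO" neural net output

pointdef = {
    "FI104": PointDef("FI104", "Injection Water Flow", 540.0, "GPM", "AI"),
    "MI101": PointDef("MI101", "Production Rate", 600.0, "TONS/DAY", "CAL"),
    "BI115B": PointDef("BI115B", "CEK Lab Entry Value", 0.0, "", "MAN"),
}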
The neural network is defined by assigning a tagname, description, units of measurement and range of values for each input and output neuron required. Processing rules are also defined for each neuron. Processing rules for input neurons are executed prior to submitting the input values to the trained network and are therefore referred to as "pre-processing" rules. Processing rules associated with output neurons are executed after obtaining the output values from the trained network and are referred to as "post-processing" rules. The definition of each neuron and its associated processing rules are stored in the database named NETDEF. The entries in an exemplary neural network database are set forth in Table II with the exception of the rules.
TABLE II
NEURON    TAGNAME  DESCRIPT                  VALUE      UNITS
INPUT01   NI101    Production Rate           600.0000   TONS/DAY
INPUT02   NI102    Stock Flow to Bleach Pit  3216.0000  GPM
INPUT03   NI103    Stock Consistency         2.8780     %
INPUT04   NI104    Injection Water Flow      540.0000   GPM
INPUT05   NI105    ClO2 Total Flow           62.0000    GPM
INPUT06   NI106    Clean Shower Flow         560.0000   GPM
INPUT07   NI107    Clean Shower Temperature  152.0000   Degrees
INPUT08   NI108    NaOH Shower Flow          120.0000   GPM
INPUT09   NI109    NaOH Shower Temperature   150.0000   Degrees
INPUT10   NI110    Vat Temperature           142.0000   Degrees
INPUT11   NI111    Vat Level                 73.0000    % Full
INPUT12   NI112    Vat Dilution Valve Posit. 34.0000    % Open
INPUT13   NI113    Pine/Hardwood             0.0000     PINE/HDW
INPUT14   NI114    Chlorine Flow             2300.0000  lb/hr
INPUT15   NI115    Caustic Flow              65.0000    GPM
OUTPUT01  NO101    Predicted Mat Consist.    13.2683    %
OUTPUT02  NO102    Vat Consistency           1.4852     %
OUTPUT03  NO103    Equivalent Cl2 Set Point  6.0000     % on stk
OUTPUT04  NO104    Applied Caustic Target    3.4000     % on stk
Exemplary rules for input neurons are set forth in Tables III-VIII.
TABLE III
Neuron INPUT06 Tagname NI106 Description Clean Shower Flow
Units GPM Value 560.0000 Min 300.0000 Max 700.0000
Processing Instructions
NI106 = FI106
TABLE IV
Neuron INPUT01 Tagname NI101 Description Production Rate Units TONS/DAY Value 600.0000 Min 400.0000 Max 1000.0000
Processing Instructions
MI101 = 0.06 * NI102 * NI103             ** Calc Mass f(flw,con)
NI101 = NI101 + 0.40 * (MI101 - NI101)   ** Filter MI101
TABLE V
Neuron INPUT02 Tagname NI102 Description Stock Flow to Bleach Pit
Units GPM Value 3216.0000 Min 2000.0000 Max 5000.0000
Processing Instructions
IF FI102>600 ** If FI102 is greater than 600 GPM
NI102 = FI102 ** use FI102 as input to neuron
ELSE ** Else...
NI102 = NI102 ** use last value of NI102
ENDIF
TABLE VI
Neuron INPUT03 Tagname NI103 Description Stock Consistency Units % Value 2.8780 Min 1.5000 Max 4.5000
Processing Instructions
IF CI103A>2.0 .AND. CI103A<5.0     ** If CI103A is in range...
CI103 = CI103A                     ** select CI103A
ELSE                               ** Else...
IF CI103B>2.0 .AND. CI103B<5.0     ** If CI103B is in range...
CI103 = CI103B                     ** select CI103B
ELSE                               ** Else...
CI103 = CI103                      ** use last value of CI103
ENDIF
ENDIF
NI103 = CI103                      ** Set neuron value to CI103
&SAVE CI103                        ** Save current CI103 to history
TABLE VII
Neuron INPUT04 Tagname NI104 Description Injection Water Flow Units GPM Value 540.0000 Min 400.0000 Max 600.0000
Processing Instructions
IF FI104<580 ** If FI104 is less than 580 GPM...
NI104 = FI104 ** use FI104 as input to neuron
ELSE ** Else...
NI104 = NI104 ** use last value of NI104
ENDIF
TABLE VIII
Neuron INPUT05 Tagname NI105 Description ClO2 Total Flow
Units GPM Value 62.0000 Min 30.0000 Max 175.0000
Processing Instructions
SELECT &HISTORY FI105    ** Select history file for FI105
&BACK 20                 ** Move back 20 one-minute samples
NI105 = HIST_VALUE       ** Use historical value for neural input
&SAVE FI105 ** Save current value of FI105 in history
Exemplary rules for output neurons are set forth in Tables IX-XI.
TABLE IX
Neuron OUTPUT01 Tagname NO101 Description Predicted Mat Consist. Units % Value 13.2683 Min 10.0000 Max 15.0000
Processing Instructions
IF NO101 > 1.02 * CI114 THEN                  ** If 2% above target...
IF FI104 < 450 THEN                           ** And FI104 less than 450
FI104SP = FI104SP + 0.10 * (NO101 - CI114)    ** Adjust FI104 setpoint
ELSE                                          ** Else...
FI102SP = FI102SP - 0.05 * (NO101 - CI114)    ** Adjust FI102 setpoint
ENDIF
ENDIF
IF NO101 < 0.98 * CI114 THEN                  ** If 2% below target...
IF FI104 < 325 THEN                           ** And FI104 less than 325
FI104SP = FI104SP + 0.10 * (NO101 - CI114)    ** Adjust FI104 setpoint
ELSE                                          ** Else...
FI102SP = FI102SP - 0.05 * (NO101 - CI114)    ** Adjust FI102 setpoint
ENDIF
ENDIF
TABLE X
Neuron OUTPUT02 Tagname NO102 Description Vat Consistency
Units % Value 1.4852 Min 1.2000 Max 2.2000
Processing Instructions
CI113 = NO102
TABLE XI
Neuron OUTPUT03 Tagname NO103 Description Equivalent Cl2 Set Point
Units % on stk Value 6.0000 Min 0.0000 Max 10.0000
Processing Instructions
MI117 = NO103 ** Load % Applied Target
FI117SP = MI117 * (1.0 - FR117) * MI101               ** Calc. Cl2 setpoint
FI105SP = (MI117 * FR117 * MI101) / (1.5803 * DI105)  ** Calc. ClO2 setpoint
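For illustration only, the neuron records and processing rules of Tables II through XI could be represented and evaluated along the following lines. This is a minimal Python sketch rather than the dBASE IV command files of the described embodiment; the record fields mirror the NETDEF entries, while the rule functions, the point-value dictionary and the network callable are hypothetical stand-ins.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Neuron:
    """One NETDEF-style record: neuron name, tagname, units, range and a rule."""
    name: str                                  # e.g. "INPUT06"
    tagname: str                               # e.g. "NI106"
    units: str
    lo: float
    hi: float
    rule: Callable[[Dict[str, float]], None]   # pre- or post-processing rule

def rule_input06(pts: Dict[str, float]) -> None:
    # Table III: NI106 = FI106
    pts["NI106"] = pts["FI106"]

def rule_input04(pts: Dict[str, float]) -> None:
    # Table VII: hold the last value unless FI104 is below 580 GPM
    if pts["FI104"] < 580:
        pts["NI104"] = pts["FI104"]

def run_network(points: Dict[str, float],
                inputs: List[Neuron],
                outputs: List[Neuron],
                net: Callable[[List[float]], List[float]]) -> None:
    """Run pre-processing rules, submit to the trained net, then post-process."""
    for n in inputs:                           # pre-processing
        n.rule(points)
    out_values = net([points[n.tagname] for n in inputs])
    for n, value in zip(outputs, out_values):
        points[n.tagname] = min(max(value, n.lo), n.hi)   # clamp to configured range
        n.rule(points)                         # post-processing (e.g. setpoint moves)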
The I/O driver protocols are configured by defining the type of process control equipment to be interfaced, the address or tagname of the I/O points in the process control system, and the communication parameters required. The process control equipment may be Programmable Logic Controllers (PLC), Distributed Control Systems (DCS) or individual analog controllers.
The training function allows the user to generate and train neural networks for his specific application. A new network is generated after the user completes the configuration process described above. Information from the NETDEF.DBF database is used to automatically build the files required by the NeuroShell program to define the neural network. The "Define Training Set" function allows the user to define the names of four training sets and the conditions required for adding training data to each set. The training data is evaluated for inclusion in a training set when actual operating data is provided to the neural network. If there is a significant error between the operating data and the predicted values, the neural input values associated with that prediction and the operating data are added to the appropriate training set.
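By way of example only, the training-set definitions and the inclusion test could take the following form. This is a Python sketch, not the dBASE IV embodiment; the relative-error threshold, the condition callables and the data structures are assumptions made for the illustration.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class TrainingSet:
    """A named training set and the condition under which samples are added to it."""
    name: str
    condition: Callable[[Dict[str, float]], bool]     # e.g. "pine furnish only"
    samples: List[Dict[str, float]] = field(default_factory=list)

def maybe_add_sample(sets: List[TrainingSet],
                     neural_inputs: Dict[str, float],
                     predicted: float,
                     actual: float,
                     rel_error: float = 0.02) -> None:
    """Store the inputs and the actual value when the prediction misses by more than rel_error."""
    if actual == 0 or abs(predicted - actual) / abs(actual) <= rel_error:
        return                                 # prediction close enough; nothing to learn
    sample = dict(neural_inputs, actual=actual)
    for ts in sets:
        if ts.condition(neural_inputs):
            ts.samples.append(sample)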
The monitoring function allows the user to view process data and neural network outputs. In the dBASE IV embodiment, the browse function is used to view data in the point definition and network definition databases. The monitoring function would typically be implemented as an integral part of the process control system. The integration of this function into the process operator's interface is highly desirable and will be a standard feature of future embodiments.
The control function uses a previously trained neural network to implement real-time control of the process. This functionality is implemented in the dBASE IV embodiment by a dBASE IV command file, the program "CONTROL.PRG". The overall structure of CONTROL.PRG is shown in Fig. 3. The program comprises six program segments, described as follows:
Referring to Fig. 4, MVARINIT is the variable initialization procedure for CONTROL. The primary function of MVARINIT is to define variable structures and initialize variable values. A public memory variable is defined for each point record in the point definition database and for each neuron record in the network definition database. These memory variables are used to hold the point values for the other procedures of CONTROL.
Referring to Fig. 5, DAC is the data acquisition procedure for CONTROL. The primary function of DAC is to acquire data from manual entry points in the point definition database and read field inputs by calling the appropriate I/O driver.
Referring to Fig. 6, PREPROC is the rule based neural input pre-processing procedure for CONTROL. The procedure executes the pre-processing rules for each input neuron configured in the network definition database. The rules can be any valid dBASE IV command.
Referring to Fig. 7, NETSUBMT is the neural network interface procedure for CONTROL. The procedure builds the classification file required to submit data to a trained NeuroShell network. After building the file, NETSUBMT calls NeuroShell to execute the network. NeuroShell places the resulting neuron output values in the classification file and returns to NETSUBMT. NETSUBMT then reads the neural outputs from the classification file and moves the results to the output variables.
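A schematic Python equivalent of this submit-and-read cycle might look as follows; the comma-separated file layout and the external command line are assumptions made for the illustration and are not NeuroShell's actual interface.

import subprocess
from typing import Dict, List

def netsubmt(points: Dict[str, float],
             input_tags: List[str],
             output_tags: List[str],
             clasfile: str = "network.cls") -> None:
    """Write inputs to a classification file, run the external network, read back outputs."""
    with open(clasfile, "w") as f:                        # build the classification file
        f.write(",".join(str(points[t]) for t in input_tags) + "\n")
    subprocess.run(["neuroshell", "classify", clasfile], check=True)   # hypothetical call
    with open(clasfile) as f:                             # assume outputs land on the last line
        last_line = f.read().strip().splitlines()[-1]
    values = [float(v) for v in last_line.split(",")]
    for tag, value in zip(output_tags, values[-len(output_tags):]):
        points[tag] = value                               # move results to the output variables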
Referring to Fig. 8, POSTPROC is the rule based neural output post-processing procedure for CONTROL. The procedure executes the post-processing rules for each output neuron configured in the network definition database. The rules can be any valid dBASE IV command.
Referring to Fig. 9, DOUT is the data distribution procedure for CONTROL. The primary function of DOUT is to write data to the network definition database and the point definition database and write field outputs by calling the appropriate I/O driver.
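Taken together, the six segments form a cyclic acquire-preprocess-predict-postprocess-distribute loop. The following Python sketch shows one way the cycle could be chained; the procedure arguments stand in for the segments described above, and the one-minute scan period is an assumption for the illustration.

import time
from typing import Callable, Dict

def control_loop(mvarinit: Callable[[], Dict[str, float]],
                 dac: Callable[[Dict[str, float]], None],
                 preproc: Callable[[Dict[str, float]], None],
                 netsubmt: Callable[[Dict[str, float]], None],
                 postproc: Callable[[Dict[str, float]], None],
                 dout: Callable[[Dict[str, float]], None],
                 period_s: float = 60.0) -> None:
    """Illustrative CONTROL cycle: MVARINIT once, then DAC -> PREPROC -> NETSUBMT -> POSTPROC -> DOUT."""
    points = mvarinit()          # define and initialize the point variables
    while True:
        dac(points)              # read manual entries and field inputs
        preproc(points)          # run pre-processing rules on the input neurons
        netsubmt(points)         # submit inputs to the trained net and read its outputs
        postproc(points)         # run post-processing rules (e.g. setpoint adjustments)
        dout(points)             # write results to the databases and field outputs
        time.sleep(period_s)     # wait for the next scan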
Referring to Fig. 10, the TRAIN program is used to intelligently select data sets from the process data, in an on-line manner, and store the data in training sets to be used for training and updating networks. The predicted values (neural outputs) are compared to actual values. If the difference exceeds, say, 2%, then additional training sets are gathered to retrain the neural network. Each set of neural input values and actual values (a training set) is tested to determine whether it is a valid training set. A training set may be invalid because of a short or open in a lead wire from an analog input, or for other reasons that put an input value outside of a logical range.
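The validity test can be as simple as checking every neural input against the Min/Max range configured for its neuron before the set is stored. A Python sketch under that assumption:

from typing import Dict, Tuple

def is_valid_training_set(neural_inputs: Dict[str, float],
                          limits: Dict[str, Tuple[float, float]]) -> bool:
    """Reject sets containing an input outside its configured logical range,
    e.g. from a shorted or open analog-input lead."""
    for tag, value in neural_inputs.items():
        lo, hi = limits[tag]
        if not lo <= value <= hi:
            return False
    return True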
Referring now to Figs. 11 and 11A, there is shown in schematic form a continuous process used in the paper-making industry controlled by a neural network according to the teachings of this invention. Shown is the initial portion of a bleach line. It should be understood that this is exemplary of the processes which may be controlled by a neural network according to this invention; the invention can be applied to numerous other processes. The process variable inputs to the controller are labeled with letters and the process target value outputs from the controller are labeled with numerals. The tagnames for process variables appear in circles, squares and hexagons. The properties or process parameters that correspond to the tagnames are set forth in Table I. The main objective of the bleaching process is to increase the brightness (whiteness) of a pulp by the removal and modification of certain constituents of the unbleached pulp. The process involves using chemicals to oxidize and/or replace the major color component (lignin) in the pulp suspension. The measure of lignin content in the finally processed pulp is known as the Kappa number (CEK). This measure is taken after the stock suspension passes through two process stages (the chlorination or C-Stage and the extraction or E/O-Stage). The measured CEK (BI115B) cannot be directly controlled. An optical sensor (BI115) is used to measure the change in color (brightness) at an early stage of the process. The chemical reaction which bleaches the pulp takes 90 to 120 minutes to go to completion. The brightness sensor, located approximately 1-1/2 to 2 minutes into the reaction, measures a variable whose value is used by the neural net, along with many other variable values, to generate a projection of the final brightness. Because this projection is only an indirect measurement of the desired value, the projection is affected by variables such as temperature, stock consistency, flow rate, chemical concentration and so forth.
The chemical reaction is a two-part reaction consisting of oxidation and substitution. A portion of the chlorinated lignin is water soluble and is washed out of the pulp. However, a portion is not water soluble and must be modified to be removed. This takes place approximately one-third to one-half of the way through the chlorination reaction. Based upon the amount of chlorine added, the proportionate amount of caustic (NaOH) is added to render the final chlorinated lignin water soluble. In some instances, other chemicals are also added to enhance the brightening process at different points or stages in the process. Some examples of these chemicals are oxygen, peroxide and hypochlorite. The location of the addition point and the amount of auxiliary chemical added, along with other conditions about the addition point, have an effect on the final CEK number. Interrelationships of the many variables are difficult to determine and in some cases are unknown. The process time constant (90 to 120 minutes) is long compared to the time to adjust the controlled variables (1 to 120 seconds).
Referring still to Figs. 11 and 11A, the process variable inputs (A...KK) are measured by various transducers and applied to the controller (shown as circles). Manual entries of process variables are also applied to the controller (shown as squares). Finally, calculated variables are applied to the controller (shown as hexagons). The variable which the process seeks to control (CEK) is determined in the laboratory based on samples taken at the output of the E/O washer stage.
The outputs from the controller (1...8) are applied to provide the operator a read-out or to make corrections to the system. Output 1 is used along with the C-Stage brightness measurement BI115 to calculate a compensated CEK number which is predictive of the CEK lab entry value (BI115B). The purpose of the control system is to adjust controlled process variables so that the predicted and tested CEK values approach a desired value. (If the predicted and measured CEK values are not reasonably close, then the neural net has not been adequately trained or requires retraining.)
Output 1 adjusts the set points for two PID controllers in a first stage of the process wherein Cl2 and ClO2 are added to the raw pulp. Output 3 adjusts the set point of a PID controller for the addition of caustic (NaOH) to the washer following the C-Stage. Output 4 adjusts the set point of a PID controller for the addition of O2 to the E/O-Stage. If the predicted CEK (predicted by brightness measure BI115 and the other variable values applied to the neural net) is not at the desired level, the controlled variables just mentioned can be adjusted to move the predicted value to the desired value.
The above-described system provides the following features and functions:
A mechanism to present input data to the neural network, which data is derived from a variety of sources including: a) analog and discrete inputs which are closely coupled to the neural network by direct I/O, allowing the input of actual process signals with and without signal conditioning; and b) interrogation of information from process control systems and host computers which require some sort of intelligent communication protocol.
A method to process input and output signals including: a) pre-processing or input processing, which includes such features as averaging, filtering, combining and switching of signals prior to presentation to the neural network; and b) output or post-processing of neural network outputs, including the application of rule-based knowledge in addition to limit checks, comparisons, weighting, scaling and ratioing.
A system for final control including: a) a mechanism for processing the outputs residing on the same system where the inputs were received; or b) a system in which inputs may be received from one or multiple systems and outputs may be sent to other systems or to a host system for data logging and archiving.
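Several of these signal-conditioning operations, such as the exponential filter and value-hold switching already used in the pre-processing rules above, reduce to short helpers. The following Python one-liners are illustrative only and are not part of the described embodiment.

def exp_filter(previous: float, measurement: float, gain: float = 0.4) -> float:
    """First-order filter, as in the rule NI101 = NI101 + 0.40 * (MI101 - NI101)."""
    return previous + gain * (measurement - previous)

def hold_if_out_of_range(previous: float, measurement: float,
                         lo: float, hi: float) -> float:
    """Signal switching: keep the last good value when the new reading is implausible."""
    return measurement if lo <= measurement <= hi else previous

def clamp(value: float, lo: float, hi: float) -> float:
    """Limit check applied to a value before it is sent on to a controller."""
    return min(max(value, lo), hi)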
Having thus described our invention with the detail and particularity required by the Patent Laws, what is claimed and desired to be protected by Letters Patent is set forth in the following claims.

Claims

WE CLAIM:
1. A control system for a continuous physical process characterized by directly controlled and indirectly controlled process variables, said variables having values that define the state of the process wherein the values of directly controlled process variables are measured and wherein the value of at least one indirectly controlled process variable is controlled indirectly by control of the values of a plurality of directly controlled process variables comprising: control means responsive to set point values for establishing the value of the directly controlled variables at said set point values applied to said control means, means for implementing a trainable neural network having a plurality of input neurons for having input values applied thereto and at least one output neuron for providing an output value, means for training the neural network to provide a predicted value for the indirectly controlled variable at an output neuron, said predicted value corresponding to the input values of the neural network, means for measuring the values of directly controlled process variables, means for establishing and continuously updating a computer database to store the values of measured process variables, computer means for establishing the input values at the input neurons of the neural network based upon the values of process variables stored in the database, computer means for establishing set point values to be applied to control means based upon values at an output neuron, said control system so constructed and arranged that said computer means for establishing set point values, after the neural network has been trained to predict the value of said indirectly controlled variable, changes at least one set point value to cause the predicted value of the indirectly controlled variable to approach a desired value.
2. The control system according to claim 1 wherein the continuous process is characterized by an input stage, intermediate stages and an output stage, said process takes a starting material or workpieces at said input stage, processes said material or workpieces at the intermediate stages and delivers a finished material or workpieces at said output stage and wherein a period of time between the material or a workpiece at the input stage and the same material or workpiece at the output stage is long compared to a period of time for the control means to establish directly controlled variables at set points.
3. The control system according to claim 2 wherein the indirectly controlled variable is measured at or near the output stage and other process variables are measured at or near all stages.
4. The control system according to claim 3 wherein control means are associated with a plurality of distinct process stages.
5. The control system according to claims 1, 2, 3 or 4 wherein said means for establishing and continuously updating the computer database stores values of a plurality of measured variables at spaced time intervals in a data structure to maintain a process history.
6. The control system according to claim 5 wherein the adjustment of each control means at a given process stage is based upon a set point value established for that control means by presenting to the neural network inputs based upon the history of the measured process variables associated with prior process stages and based upon the current values of measured process variables at succeeding stages.
7. A method for controlling a multiple stage continuous physical process characterized by an input stage, intermediate stages, and an output stage, said process taking a starting material or workpieces at said input stage, processing said material or workpieces at intermediate stages and delivering a finished material or workpiece at said output stage, and further characterized by directly controlled and indirectly controlled process variables, and wherein the value of at least one process variable is indirectly controlled by controlling the values of a plurality of directly controlled variables comprising the steps of: a) measuring the values of a plurality of process variables associated with more than one process stage including said controlled variables, b) submitting the values of a selected set of measured process variables to a controller comprising a trained neural network to generate a predicted value of an indirectly controlled process variable, c) adjusting the values of the directly controlled variables to cause the predicted value of the indirectly controlled variable to approach a desired value wherein this step for adjusting the values of directly controlled variables at a given process stage is based upon a step for submitting the values of a selected set of measured process variables to the neural network inputs, said set comprising the values of measured process variables for the material or workpieces when at process stages through which they have already passed.
8. The method for controlling a continuous process according to claim 7 comprising the additional step of d) at spaced intervals measuring the value of the indirectly controlled variable and comparing the measured value to the predicted value to test the validity of the predicted values generated by the neural network.
9. The method for controlling a multiple stage continuous process according to claim 8 comprising the additional steps of: e) continuously, at spaced time intervals while the process is proceeding, saving training data sets comprising values of measured variables used to predict a given value of an indirectly controlled variable and the value of the indirectly controlled variable actually measured and f) using the training sets to retrain the neural network.
10. The method according to claim 7 wherein, in the continuous multiple stage process, the time interval between when the material or a workpiece is taken at the input stage and when the material or workpiece is delivered at the output stage is long compared to the time interval required for adjusting directly controlled variables.
PCT/US1991/000141 1990-01-08 1991-01-08 Process control using neural network WO1991010961A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA002073331A CA2073331A1 (en) 1990-01-08 1991-01-08 Process control using neural network

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US462,503 1983-01-31
US07/462,503 US5111531A (en) 1990-01-08 1990-01-08 Process control using neural network

Publications (1)

Publication Number Publication Date
WO1991010961A1 true WO1991010961A1 (en) 1991-07-25

Family

ID=23836669

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1991/000141 WO1991010961A1 (en) 1990-01-08 1991-01-08 Process control using neural network

Country Status (5)

Country Link
US (1) US5111531A (en)
EP (1) EP0510112A4 (en)
AU (1) AU7249891A (en)
CA (1) CA2073331A1 (en)
WO (1) WO1991010961A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1995006289A1 (en) * 1993-08-26 1995-03-02 Associative Measurement Pty. Ltd. Interpretive measurement instrument
EP0534221B1 (en) * 1991-09-24 1996-12-11 Siemens Aktiengesellschaft Control parameter improvement method for industrial installations
US5751571A (en) * 1993-07-05 1998-05-12 Siemens Aktiengesellschaft Process and apparatus for determining optimum values for manipulated variables of a technical system
EP0495044B1 (en) * 1990-08-03 1998-09-23 E.I. Du Pont De Nemours And Company Computer neural network process measurement and control system and method
AU728376B2 (en) * 1993-08-26 2001-01-11 Associative Measurement Pty Ltd Interpretive measurement instrument

Families Citing this family (183)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5588091A (en) * 1989-05-17 1996-12-24 Environmental Research Institute Of Michigan Dynamically stable associative learning neural network system
CA2031765C (en) * 1989-12-08 1996-02-20 Masahide Nomura Method and system for performing control conforming with characteristics of controlled system
US5222196A (en) * 1990-02-20 1993-06-22 International Business Machines Corporation Neural network shell for application programs
KR920002268A (en) * 1990-07-17 1992-02-28 유끼노리 가까즈 Intelligent Processing Equipment
US5212765A (en) * 1990-08-03 1993-05-18 E. I. Du Pont De Nemours & Co., Inc. On-line training neural network system for process control
US5224203A (en) * 1990-08-03 1993-06-29 E. I. Du Pont De Nemours & Co., Inc. On-line process control neural network using data pointers
EP0480654B1 (en) * 1990-10-10 1998-03-04 Honeywell Inc. Process system identification
US5285377A (en) * 1990-10-30 1994-02-08 Fujitsu Limited Control apparatus structuring system
JP3329806B2 (en) * 1990-11-09 2002-09-30 株式会社日立製作所 Neural network construction device
EP0496570B1 (en) * 1991-01-22 1998-06-03 Honeywell Inc. Two-level system identifier apparatus with optimization
JP3026630B2 (en) * 1991-04-19 2000-03-27 株式会社リコー Electrophotographic process control equipment
US5485545A (en) * 1991-06-20 1996-01-16 Mitsubishi Denki Kabushiki Kaisha Control method using neural networks and a voltage/reactive-power controller for a power system using the control method
JPH05165842A (en) * 1991-12-13 1993-07-02 Toyota Central Res & Dev Lab Inc Estimating device for process time
EP0617806B1 (en) * 1991-12-18 1998-05-20 Honeywell Inc. A closed loop neural network automatic tuner
US5361326A (en) * 1991-12-31 1994-11-01 International Business Machines Corporation Enhanced interface for a neural network engine
US5282131A (en) * 1992-01-21 1994-01-25 Brown And Root Industrial Services, Inc. Control system for controlling a pulp washing system using a neural network controller
US5396415A (en) * 1992-01-31 1995-03-07 Honeywell Inc. Neruo-pid controller
US5353207A (en) * 1992-06-10 1994-10-04 Pavilion Technologies, Inc. Residual activation neural network
US5488561A (en) * 1992-08-19 1996-01-30 Continental Controls, Inc. Multivariable process control method and apparatus
US5581661A (en) * 1992-08-31 1996-12-03 Wang; Shay-Ping T. Artificial neuron using adder circuit and method of using same
US5477444A (en) * 1992-09-14 1995-12-19 Bhat; Naveen V. Control system using an adaptive neural network for target and path optimization for a multivariable, nonlinear process
US5469528A (en) * 1992-09-16 1995-11-21 Syseca, Inc. Neural load disturbance analyzer
CA2081519C (en) * 1992-10-27 2000-09-05 The University Of Toronto Parametric control device
US6243696B1 (en) * 1992-11-24 2001-06-05 Pavilion Technologies, Inc. Automated method for building a model
US5486996A (en) * 1993-01-22 1996-01-23 Honeywell Inc. Parameterized neurocontrollers
US5825646A (en) 1993-03-02 1998-10-20 Pavilion Technologies, Inc. Method and apparatus for determining the sensitivity of inputs to a neural network on output parameters
WO1994020887A2 (en) * 1993-03-02 1994-09-15 Pavilion Technologies, Inc. Method and apparatus for analyzing a neural network within desired operating parameter constraints
JPH06301406A (en) * 1993-04-14 1994-10-28 Toshiba Corp Hierarchical model predictive control system
CA2118885C (en) * 1993-04-29 2005-05-24 Conrad K. Teran Process control system
US5680784A (en) * 1994-03-11 1997-10-28 Kawasaki Steel Corporation Method of controlling form of strip in rolling mill
US5586221A (en) * 1994-07-01 1996-12-17 Syracuse University Predictive control of rolling mills using neural network gauge estimation
US5566065A (en) * 1994-11-01 1996-10-15 The Foxboro Company Method and apparatus for controlling multivariable nonlinear processes
US5704011A (en) * 1994-11-01 1997-12-30 The Foxboro Company Method and apparatus for providing multivariable nonlinear control
US5570282A (en) * 1994-11-01 1996-10-29 The Foxboro Company Multivariable nonlinear process controller
US5615323A (en) 1994-11-04 1997-03-25 Concord Communications, Inc. Displaying resource performance and utilization information
DE4440859C2 (en) * 1994-11-15 1998-08-06 Alexander Kaske Method and device for controlling an autonomously exploring robot
US6314414B1 (en) * 1998-10-06 2001-11-06 Pavilion Technologies, Inc. Method for training and/or testing a neural network with missing and/or incomplete data
US6879971B1 (en) * 1995-12-22 2005-04-12 Pavilion Technologies, Inc. Automated method for building a model
US5746511A (en) * 1996-01-03 1998-05-05 Rosemount Inc. Temperature transmitter with on-line calibration using johnson noise
US5841669A (en) * 1996-01-26 1998-11-24 Howmet Research Corporation Solidification control including pattern recognition
US7949495B2 (en) 1996-03-28 2011-05-24 Rosemount, Inc. Process variable transmitter with diagnostics
US7254518B2 (en) * 1996-03-28 2007-08-07 Rosemount Inc. Pressure transmitter with diagnostics
US7630861B2 (en) 1996-03-28 2009-12-08 Rosemount Inc. Dedicated process diagnostic device
US6654697B1 (en) 1996-03-28 2003-11-25 Rosemount Inc. Flow measurement with diagnostics
US8290721B2 (en) 1996-03-28 2012-10-16 Rosemount Inc. Flow measurement diagnostics
US6017143A (en) * 1996-03-28 2000-01-25 Rosemount Inc. Device in a process system for detecting events
US7085610B2 (en) 1996-03-28 2006-08-01 Fisher-Rosemount Systems, Inc. Root cause diagnostics
US6907383B2 (en) 1996-03-28 2005-06-14 Rosemount Inc. Flow diagnostic system
US6539267B1 (en) 1996-03-28 2003-03-25 Rosemount Inc. Device in a process system for determining statistical parameter
US7623932B2 (en) 1996-03-28 2009-11-24 Fisher-Rosemount Systems, Inc. Rule set for root cause diagnostics
US6110214A (en) * 1996-05-03 2000-08-29 Aspen Technology, Inc. Analyzer for modeling and optimizing maintenance operations
US5877954A (en) * 1996-05-03 1999-03-02 Aspen Technology, Inc. Hybrid linear-neural network process control
US5809490A (en) * 1996-05-03 1998-09-15 Aspen Technology Inc. Apparatus and method for selecting a working data set for model development
US5933345A (en) * 1996-05-06 1999-08-03 Pavilion Technologies, Inc. Method and apparatus for dynamic and steady state modeling over a desired path between two end points
US7610108B2 (en) * 1996-05-06 2009-10-27 Rockwell Automation Technologies, Inc. Method and apparatus for attenuating error in dynamic and steady-state processes for prediction, control, and optimization
US7418301B2 (en) * 1996-05-06 2008-08-26 Pavilion Technologies, Inc. Method and apparatus for approximating gains in dynamic and steady-state processes for prediction, control, and optimization
US7058617B1 (en) * 1996-05-06 2006-06-06 Pavilion Technologies, Inc. Method and apparatus for training a system model with gain constraints
US6438430B1 (en) * 1996-05-06 2002-08-20 Pavilion Technologies, Inc. Kiln thermal and combustion control
US6493596B1 (en) * 1996-05-06 2002-12-10 Pavilion Technologies, Inc. Method and apparatus for controlling a non-linear mill
US8311673B2 (en) * 1996-05-06 2012-11-13 Rockwell Automation Technologies, Inc. Method and apparatus for minimizing error in dynamic and steady-state processes for prediction, control, and optimization
US7149590B2 (en) 1996-05-06 2006-12-12 Pavilion Technologies, Inc. Kiln control and upset recovery using a model predictive control in series with forward chaining
US5847952A (en) * 1996-06-28 1998-12-08 Honeywell Inc. Nonlinear-approximator-based automatic tuner
US5946673A (en) * 1996-07-12 1999-08-31 Francone; Frank D. Computer implemented machine learning and control system
US6246972B1 (en) 1996-08-23 2001-06-12 Aspen Technology, Inc. Analyzer for modeling and optimizing maintenance operations
US6363289B1 (en) 1996-09-23 2002-03-26 Pavilion Technologies, Inc. Residual activation neural network
US6449574B1 (en) 1996-11-07 2002-09-10 Micro Motion, Inc. Resistance based process control device diagnostics
US6754601B1 (en) 1996-11-07 2004-06-22 Rosemount Inc. Diagnostics for resistive elements of process devices
US6601005B1 (en) 1996-11-07 2003-07-29 Rosemount Inc. Process device diagnostics using process variable sensor signal
US5956663A (en) * 1996-11-07 1999-09-21 Rosemount, Inc. Signal processing technique which separates signal components in a sensor for sensor diagnostics
US6434504B1 (en) 1996-11-07 2002-08-13 Rosemount Inc. Resistance based process control device diagnostics
US5828567A (en) * 1996-11-07 1998-10-27 Rosemount Inc. Diagnostics for resistance based transmitter
US6519546B1 (en) 1996-11-07 2003-02-11 Rosemount Inc. Auto correcting temperature transmitter with resistance based sensor
DE69714606T9 (en) * 1996-12-31 2004-09-09 Rosemount Inc., Eden Prairie DEVICE FOR CHECKING A CONTROL SIGNAL COMING FROM A PLANT IN A PROCESS CONTROL
CA2230882C (en) * 1997-03-14 2004-08-17 Dubai Aluminium Company Limited Intelligent control of aluminium reduction cells using predictive and pattern recognition techniques
CA2306767C (en) 1997-10-13 2007-05-01 Rosemount Inc. Communication technique for field devices in industrial processes
US7308322B1 (en) * 1998-09-29 2007-12-11 Rockwell Automation Technologies, Inc. Motorized system integrated control and diagnostics using vibration, pressure, temperature, speed, and/or current analysis
US7539549B1 (en) * 1999-09-28 2009-05-26 Rockwell Automation Technologies, Inc. Motorized system integrated control and diagnostics using vibration, pressure, temperature, speed, and/or current analysis
US20010025232A1 (en) * 1998-10-02 2001-09-27 Klimasauskas Casimir C. Hybrid linear-neural network process control
US6415272B1 (en) * 1998-10-22 2002-07-02 Yamaha Hatsudoki Kabushiki Kaisha System for intelligent control based on soft computing
US6615149B1 (en) 1998-12-10 2003-09-02 Rosemount Inc. Spectral diagnostics in a magnetic flow meter
US6611775B1 (en) 1998-12-10 2003-08-26 Rosemount Inc. Electrode leakage diagnostics in a magnetic flow meter
US6985781B2 (en) * 1999-01-12 2006-01-10 Pavilion Technologies, Inc. Residual activation neural network
US6633782B1 (en) 1999-02-22 2003-10-14 Fisher-Rosemount Systems, Inc. Diagnostic expert in a process control system
US8044793B2 (en) 2001-03-01 2011-10-25 Fisher-Rosemount Systems, Inc. Integrated device alerts in a process control system
US7346404B2 (en) * 2001-03-01 2008-03-18 Fisher-Rosemount Systems, Inc. Data sharing in a process plant
US7206646B2 (en) * 1999-02-22 2007-04-17 Fisher-Rosemount Systems, Inc. Method and apparatus for performing a function in a plant using process performance monitoring with process equipment monitoring and control
US7562135B2 (en) * 2000-05-23 2009-07-14 Fisher-Rosemount Systems, Inc. Enhanced fieldbus device alerts in a process control system
US6298454B1 (en) 1999-02-22 2001-10-02 Fisher-Rosemount Systems, Inc. Diagnostics in a process control system
US6516348B1 (en) 1999-05-21 2003-02-04 Macfarlane Druce Ian Craig Rattray Collecting and predicting capacity information for composite network resource formed by combining ports of an access server and/or links of wide arear network
US6356191B1 (en) 1999-06-17 2002-03-12 Rosemount Inc. Error compensation for a process fluid temperature transmitter
US7010459B2 (en) 1999-06-25 2006-03-07 Rosemount Inc. Process device diagnostics using process variable sensor signal
DK1247268T4 (en) 1999-07-01 2009-11-16 Rosemount Inc Self-validating two-wire low power temperature transmitter
US6505517B1 (en) 1999-07-23 2003-01-14 Rosemount Inc. High accuracy signal processing for magnetic flowmeter
US6701274B1 (en) 1999-08-27 2004-03-02 Rosemount Inc. Prediction of error magnitude in a pressure transmitter
US6556145B1 (en) 1999-09-24 2003-04-29 Rosemount Inc. Two-wire fluid temperature transmitter with thermocouple diagnostics
US6408227B1 (en) * 1999-09-29 2002-06-18 The University Of Iowa Research Foundation System and method for controlling effluents in treatment systems
BR9906022A (en) * 1999-12-30 2001-09-25 Opp Petroquimica S A Process for the controlled production of polyethylene and its copolymers
AU4733601A (en) * 2000-03-10 2001-09-24 Cyrano Sciences Inc Control for an industrial process using one or more multidimensional variables
JP3676660B2 (en) * 2000-08-28 2005-07-27 本田技研工業株式会社 Engine power generator
US6735484B1 (en) 2000-09-20 2004-05-11 Fargo Electronics, Inc. Printer with a process diagnostics system for detecting events
EP1220063B1 (en) * 2000-12-27 2005-03-09 STMicroelectronics S.r.l. Non-integer order dynamic systems
US8073967B2 (en) 2002-04-15 2011-12-06 Fisher-Rosemount Systems, Inc. Web services-based communications for use with process control systems
JP4160399B2 (en) 2001-03-01 2008-10-01 フィッシャー−ローズマウント システムズ, インコーポレイテッド Creating and displaying indicators in the process plant
US7720727B2 (en) 2001-03-01 2010-05-18 Fisher-Rosemount Systems, Inc. Economic calculations in process control system
US6970003B2 (en) 2001-03-05 2005-11-29 Rosemount Inc. Electronics board life prediction of microprocessor-based transmitters
US6629059B2 (en) 2001-05-14 2003-09-30 Fisher-Rosemount Systems, Inc. Hand held diagnostic and communication device with automatic bus detection
US6772036B2 (en) 2001-08-30 2004-08-03 Fisher-Rosemount Systems, Inc. Control system using process model
US6701236B2 (en) 2001-10-19 2004-03-02 Yamaha Hatsudoki Kabushiki Kaisha Intelligent mechatronic control suspension system based on soft computing
US7444310B2 (en) * 2002-04-19 2008-10-28 Computer Associates Think, Inc. Automatic model maintenance through local nets
US7483868B2 (en) * 2002-04-19 2009-01-27 Computer Associates Think, Inc. Automatic neural-net model generation and maintenance
US7777743B2 (en) * 2002-04-19 2010-08-17 Computer Associates Think, Inc. Viewing multi-dimensional data through hierarchical visualization
CA2481432A1 (en) 2002-04-19 2003-10-30 Ronald Cass Processing mixed numeric and/or non-numeric data
US7313279B2 (en) * 2003-07-08 2007-12-25 Computer Associates Think, Inc. Hierarchical determination of feature relevancy
EP1636738A2 (en) * 2003-05-23 2006-03-22 Computer Associates Think, Inc. Adaptive learning enhancement to auotmated model maintenance
US7242989B2 (en) * 2003-05-30 2007-07-10 Fisher-Rosemount Systems, Inc. Apparatus and method for batch property estimation
CN1853098B (en) 2003-07-18 2010-12-08 罗斯蒙德公司 Acoustic flowmeter and method for monitoring health degree of fixed equipment in industrial process
US7402635B2 (en) * 2003-07-22 2008-07-22 Fina Technology, Inc. Process for preparing polyethylene
WO2005013019A2 (en) * 2003-07-25 2005-02-10 Yamaha Motor Co., Ltd Soft computing optimizer of intelligent control system structures
US7018800B2 (en) 2003-08-07 2006-03-28 Rosemount Inc. Process device with quiescent current diagnostics
US7627441B2 (en) 2003-09-30 2009-12-01 Rosemount Inc. Process device with vibration based diagnostics
US7523667B2 (en) 2003-12-23 2009-04-28 Rosemount Inc. Diagnostics of impulse piping in an industrial process
CA2496661C (en) * 2004-02-19 2009-05-19 Oz Optics Ltd. Light source control system
US7251638B2 (en) 2004-03-03 2007-07-31 Yamaha Hatsudoki Kabushiki Kaisha Intelligent robust control system for motorcycle using soft computing optimizer
US7678210B1 (en) * 2004-03-08 2010-03-16 The United States Of America As Represented By The Secretary Of The Navy Injection loading of highly filled explosive suspensions
TWI231481B (en) * 2004-03-11 2005-04-21 Quanta Comp Inc Electronic apparatus
US6920799B1 (en) 2004-04-15 2005-07-26 Rosemount Inc. Magnetic flow meter with reference electrode
US7046180B2 (en) 2004-04-21 2006-05-16 Rosemount Inc. Analog-to-digital converter with range error detection
US20060218108A1 (en) * 2005-03-24 2006-09-28 Sergey Panfilov System for soft computing simulation
US20060224547A1 (en) * 2005-03-24 2006-10-05 Ulyanov Sergey V Efficient simulation system of quantum algorithm gates on classical computer based on fast algorithm
US9201420B2 (en) 2005-04-08 2015-12-01 Rosemount, Inc. Method and apparatus for performing a function in a process plant using monitoring data with criticality evaluation data
US8005647B2 (en) 2005-04-08 2011-08-23 Rosemount, Inc. Method and apparatus for monitoring and performing corrective measures in a process plant using monitoring data with corrective measures data
US8112565B2 (en) * 2005-06-08 2012-02-07 Fisher-Rosemount Systems, Inc. Multi-protocol field device interface with automatic bus detection
US20060293817A1 (en) * 2005-06-23 2006-12-28 Takahide Hagiwara Intelligent electronically-controlled suspension system based on soft computing optimizer
US7272531B2 (en) 2005-09-20 2007-09-18 Fisher-Rosemount Systems, Inc. Aggregation of asset use indices within a process plant
US20070068225A1 (en) 2005-09-29 2007-03-29 Brown Gregory C Leak detector for process valve
US7644051B1 (en) * 2006-07-28 2010-01-05 Hewlett-Packard Development Company, L.P. Management of data centers using a model
US7496414B2 (en) * 2006-09-13 2009-02-24 Rockwell Automation Technologies, Inc. Dynamic controller utilizing a hybrid model
US7953501B2 (en) 2006-09-25 2011-05-31 Fisher-Rosemount Systems, Inc. Industrial process control loop monitor
ES2293845B2 (en) * 2006-09-25 2008-12-16 Universidad Politecnica De Madrid SYSTEM OF MONITORING AND CONTROL OF PROCESSES OF SURFACE THERMAL TREATMENT OF MATERIALS WITH LASER THROUGH AN ADAPTIVE NEURONAL CONTROL BY MEANS OF REFERENCE MODEL.
US8788070B2 (en) * 2006-09-26 2014-07-22 Rosemount Inc. Automatic field device service adviser
JP2010505121A (en) 2006-09-29 2010-02-18 ローズマウント インコーポレイテッド Magnetic flow meter with verification
US7321846B1 (en) 2006-10-05 2008-01-22 Rosemount Inc. Two-wire process control loop diagnostics
US8032235B2 (en) * 2007-06-28 2011-10-04 Rockwell Automation Technologies, Inc. Model predictive control system and method for reduction of steady state error
US8898036B2 (en) 2007-08-06 2014-11-25 Rosemount Inc. Process variable transmitter with acceleration sensor
US8301676B2 (en) 2007-08-23 2012-10-30 Fisher-Rosemount Systems, Inc. Field device with capability of calculating digital filter coefficients
US7702401B2 (en) 2007-09-05 2010-04-20 Fisher-Rosemount Systems, Inc. System for preserving and displaying process control data associated with an abnormal situation
US7590511B2 (en) 2007-09-25 2009-09-15 Rosemount Inc. Field device for digital process control loop diagnostics
US8055479B2 (en) 2007-10-10 2011-11-08 Fisher-Rosemount Systems, Inc. Simplified algorithm for abnormal situation prevention in load following applications including plugged line diagnostics in a dynamic process
US8868221B1 (en) * 2008-08-22 2014-10-21 Marvell International Ltd. Adaptive neural net feed forward system and method for adaptive control of mechanical systems
US7921734B2 (en) * 2009-05-12 2011-04-12 Rosemount Inc. System to detect poor process ground connections
US8676721B2 (en) * 2009-09-18 2014-03-18 Apo Offshore, Inc. Method, system and apparatus for intelligent management of oil and gas platform surface equipment
US8457767B2 (en) 2010-12-31 2013-06-04 Brad Radl System and method for real-time industrial process modeling
CN102620378B (en) * 2011-01-27 2014-01-15 国际商业机器公司 Method and system for data center energy saving controlling
US9207670B2 (en) 2011-03-21 2015-12-08 Rosemount Inc. Degrading sensor detection implemented within a transmitter
CN102189118B (en) * 2011-04-02 2013-05-08 上海大学 Method for correcting shape model online based on fixed-length sampling
US9927788B2 (en) 2011-05-19 2018-03-27 Fisher-Rosemount Systems, Inc. Software lockout coordination between a process control system and an asset management system
US8521670B2 (en) 2011-05-25 2013-08-27 HGST Netherlands B.V. Artificial neural network application for magnetic core width prediction and modeling for magnetic disk drive manufacture
US9052240B2 (en) 2012-06-29 2015-06-09 Rosemount Inc. Industrial process temperature transmitter with sensor stress diagnostics
US9207129B2 (en) 2012-09-27 2015-12-08 Rosemount Inc. Process variable transmitter with EMF detection and correction
US9602122B2 (en) 2012-09-28 2017-03-21 Rosemount Inc. Process variable measurement noise diagnostic
US9558220B2 (en) 2013-03-04 2017-01-31 Fisher-Rosemount Systems, Inc. Big data in process control systems
US9397836B2 (en) 2014-08-11 2016-07-19 Fisher-Rosemount Systems, Inc. Securing devices to process control systems
US10649424B2 (en) 2013-03-04 2020-05-12 Fisher-Rosemount Systems, Inc. Distributed industrial performance monitoring and analytics
US10386827B2 (en) 2013-03-04 2019-08-20 Fisher-Rosemount Systems, Inc. Distributed industrial performance monitoring and analytics platform
US9804588B2 (en) 2014-03-14 2017-10-31 Fisher-Rosemount Systems, Inc. Determining associations and alignments of process elements and measurements in a process
US9665088B2 (en) 2014-01-31 2017-05-30 Fisher-Rosemount Systems, Inc. Managing big data in process control systems
US10866952B2 (en) 2013-03-04 2020-12-15 Fisher-Rosemount Systems, Inc. Source-independent queries in distributed industrial system
US10649449B2 (en) 2013-03-04 2020-05-12 Fisher-Rosemount Systems, Inc. Distributed industrial performance monitoring and analytics
US9823626B2 (en) 2014-10-06 2017-11-21 Fisher-Rosemount Systems, Inc. Regional big data in process control systems
US10282676B2 (en) 2014-10-06 2019-05-07 Fisher-Rosemount Systems, Inc. Automatic signal processing-based learning in a process plant
US10223327B2 (en) 2013-03-14 2019-03-05 Fisher-Rosemount Systems, Inc. Collecting and delivering data to a big data machine in a process control system
US10909137B2 (en) 2014-10-06 2021-02-02 Fisher-Rosemount Systems, Inc. Streaming data for analytics in process control systems
US10678225B2 (en) 2013-03-04 2020-06-09 Fisher-Rosemount Systems, Inc. Data analytic services for distributed industrial performance monitoring
US11573672B2 (en) 2013-03-15 2023-02-07 Fisher-Rosemount Systems, Inc. Method for initiating or resuming a mobile control session in a process plant
CN107885494B (en) 2013-03-15 2021-09-10 费希尔-罗斯蒙特系统公司 Method and computer system for analyzing process control data
US10168691B2 (en) 2014-10-06 2019-01-01 Fisher-Rosemount Systems, Inc. Data pipeline for process control system analytics
US10503483B2 (en) 2016-02-12 2019-12-10 Fisher-Rosemount Systems, Inc. Rule builder in a process control network
US10365640B2 (en) * 2017-04-11 2019-07-30 International Business Machines Corporation Controlling multi-stage manufacturing process based on internet of things (IoT) sensors and cognitive rule induction
US11216437B2 (en) 2017-08-14 2022-01-04 Sisense Ltd. System and method for representing query elements in an artificial neural network
US11256985B2 (en) 2017-08-14 2022-02-22 Sisense Ltd. System and method for generating training sets for neural networks
WO2019035862A1 (en) * 2017-08-14 2019-02-21 Sisense Ltd. System and method for increasing accuracy of approximating query results using neural networks
US11110667B2 (en) * 2019-04-10 2021-09-07 The Boeing Company Fabrication optimization for composite parts
WO2020237011A1 (en) * 2019-05-23 2020-11-26 Cognizant Technology Solutions U.S. Corporation Quantifying the predictive uncertainty of neural networks via residual estimation with i/o kernel

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4193115A (en) * 1977-12-15 1980-03-11 The United States Of America As Represented By The Secretary Of Commerce Method and apparatus for implementation of the CMAC mapping algorithm
US4777960A (en) * 1986-08-18 1988-10-18 Massachusetts Institute Of Technology Method and apparatus for the assessment of autonomic response by broad-band excitation
DE3811086A1 (en) * 1987-04-03 1988-10-20 Hitachi Ltd PID CONTROL SYSTEM
US4893255A (en) * 1988-05-31 1990-01-09 Analog Intelligence Corp. Spike transmission for neural networks
US4912653A (en) * 1988-12-14 1990-03-27 Gte Laboratories Incorporated Trainable neural network

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4858147A (en) * 1987-06-15 1989-08-15 Unisys Corporation Special purpose neurocomputer system for solving optimization problems
US4941122A (en) * 1989-01-12 1990-07-10 Recognition Equipment Incorp. Neural network image processing system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
An Associative Memory Based Learning Control Scheme with PI-Controller for SISO-Nonlinear Processes; IFAC Microcomputer Application in Process Control; ERSU et al.; 1986; pages 99-105; See entire document. *
See also references of EP0510112A4 *
Use of Neural Nets for Dynamic Modeling and Control of Chemical Process Systems; Proc. of the 1989 American Control Conference; BHAT et al.; 1989; pages 1342-1347; See entire document. *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0495044B1 (en) * 1990-08-03 1998-09-23 E.I. Du Pont De Nemours And Company Computer neural network process measurement and control system and method
EP0534221B1 (en) * 1991-09-24 1996-12-11 Siemens Aktiengesellschaft Control parameter improvement method for industrial installations
US5751571A (en) * 1993-07-05 1998-05-12 Siemens Aktiengesellschaft Process and apparatus for determining optimum values for manipulated variables of a technical system
EP0707719B1 (en) * 1993-07-05 2001-03-14 Siemens Aktiengesellschaft Process for determining the optimum values of the correcting variables of a technical system
WO1995006289A1 (en) * 1993-08-26 1995-03-02 Associative Measurement Pty. Ltd. Interpretive measurement instrument
AU728376B2 (en) * 1993-08-26 2001-01-11 Associative Measurement Pty Ltd Interpretive measurement instrument

Also Published As

Publication number Publication date
AU7249891A (en) 1991-08-05
EP0510112A1 (en) 1992-10-28
US5111531A (en) 1992-05-05
CA2073331A1 (en) 1991-07-09
EP0510112A4 (en) 1994-03-09

Similar Documents

Publication Publication Date Title
US5111531A (en) Process control using neural network
JP4542323B2 (en) Integrated model predictive control and optimization in process control systems
JP4413563B2 (en) Integrated model predictive control and optimization in process control systems
EP1021752B1 (en) Model-free adaptive process control
US5282131A (en) Control system for controlling a pulp washing system using a neural network controller
Jang et al. Neuro-fuzzy modeling and control
Funabashi et al. Fuzzy and neural hybrid expert systems: synergetic AI
WO2019176496A1 (en) Control device, control system, control method, and control program
Holmblad et al. The FLS application of fuzzy logic
US6272391B1 (en) Self organizing industrial control system importing neighbor constraint ranges
Krause et al. A neuro-fuzzy adaptive control strategy for refuse incineration plants
Ament et al. A process oriented approach to automated quality control
Belarbi et al. Fuzzy neural networks for estimation and fuzzy controller design: simulation study for a pulp batch digester
Wu et al. Water level control by fuzzy logic and neural networks
D’Errico Fuzzy control systems with application to machining processes
Ward et al. Intelligent control of machines and processes
Montes et al. Interpretable Fuzzy Models from Data and Adaptive Fuzzy Control: A New Approach
Batur et al. Model based fuzzy control
Paiva et al. Quality prediction in pulp bleaching: Application of a neuro-fuzzy system
Azzini et al. Modeling turning points in financial markets with soft computing techniques
Xiao et al. Automatically-Grown Optimal Neural Controller
Qian et al. Modelling of a woodchip refiner using artificial neural network
Akec et al. Parameter and rule learning for fuzzy logic control systems using genetic algorithms
Prada et al. HITO-a tool for hierarchical control
Bettenhausen et al. BioX++-extended learning control of biotechnological processes

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU BR CA FI JP KR NO SU

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IT LU NL SE

WWE Wipo information: entry into national phase

Ref document number: 1991904312

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2073331

Country of ref document: CA

WWP Wipo information: published in national office

Ref document number: 1991904312

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1991904312

Country of ref document: EP