Publication number: CA2066159 A1
Publication type: Application
Application number: CA 2066159
PCT number: PCT/US1991/005260
Publication date: 20 Feb 1992
Filing date: 25 Jul 1991
Priority date: 3 Aug 1990
Also published as: CA2066159C, DE69125809D1, DE69125809T2, EP0495046A1, EP0495046B1, US5212765, US5408586, US5640493, US5826249, WO1992002867A1
Inventors: Richard D. Skeirik
Applicant: Richard D. Skeirik, E.I. Du Pont De Nemours And Company, Pavilion Technologies, Inc.
On-line training neural network for process control
CA 2066159 A1
Abstract
An on-line training neural network system and method for process control trains by retrieving training sets from the stream of process data. The neural network detects the availability of new training data and constructs a training set by retrieving the corresponding input data. The neural network is trained using the training set. Over time, many training sets are presented to the neural network.
When multiple presentations are needed to effectively train, a buffer of training sets is filled and updated as new training data becomes available. The size of the buffer is selected in accordance with the training needs of the neural network. Once the buffer is full, a new training set bumps the oldest training set off the top of the buffer stack. The training sets in the buffer stack can be presented one or more times each time a new training set is constructed. A historical database of timestamped data can be used to construct training sets when training input data has a time delay from sample time to availability for the neural network. The timestamps of the training input data are used to select the appropriate timestamp at which input data is retrieved for use in the training set. Using the historical database, the neural network can be trained retrospectively by searching the historical database and constructing training sets based on past data.
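The buffering and timestamp-matching scheme described in the abstract can be sketched in a few lines of Python. This is a minimal illustration under assumptions, not the patent's implementation: the HistoricalDatabase and OnlineTrainer classes, and the model object with a train_on() method, are hypothetical names introduced here.

from collections import deque


class HistoricalDatabase:
    """Toy historical database of timestamped process data (illustrative only)."""

    def __init__(self):
        self._records = {}  # timestamp -> dict of measured process values

    def store(self, timestamp, values):
        self._records[timestamp] = values

    def retrieve_at_or_before(self, timestamp):
        """Return the record whose timestamp is closest at or before `timestamp`."""
        candidates = [t for t in self._records if t <= timestamp]
        if not candidates:
            return None
        return self._records[max(candidates)]


class OnlineTrainer:
    """Keeps a bounded FIFO buffer of training sets and retrains on each arrival."""

    def __init__(self, model, history, buffer_size=20, presentations=3):
        self.model = model                       # assumed to expose train_on(inputs, target)
        self.history = history                   # HistoricalDatabase instance
        self.buffer = deque(maxlen=buffer_size)  # a full buffer drops its oldest entry on append
        self.presentations = presentations       # passes over the buffer per new training set

    def on_new_training_data(self, sample_timestamp, target_value):
        """Called when a delayed measurement (e.g. a lab analysis) becomes available.

        The training set pairs the new target with the process input data that
        was in effect at the sample timestamp, retrieved from the historical
        database rather than from the current (later) process state.
        """
        inputs = self.history.retrieve_at_or_before(sample_timestamp)
        if inputs is None:
            return  # no matching input data; skip this training set
        self.buffer.append((inputs, target_value))
        self._train_on_buffer()

    def _train_on_buffer(self):
        """Present every buffered training set to the network one or more times."""
        for _ in range(self.presentations):
            for inputs, target in self.buffer:
                self.model.train_on(inputs, target)

Here deque(maxlen=...) provides the first-in, first-out behaviour in which a new training set bumps the oldest set out of a full buffer, and the presentations parameter corresponds to presenting the buffered sets one or more times whenever a new training set is constructed.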
Classifications
International Classification: G05B13/02
Cooperative Classification: Y10S706/906, G05B13/027
European Classification: G05B13/02C1
Legal Events
Date          Code   Event                 Description
24 Jul 1998   EEER   Examination request
25 Jul 2011   MKLA   Lapsed
2 Dec 2012    MKEC   Expiry (correction)   Effective date: 20121202