|Publication number||US7143044 B2|
|Application number||US 09/751,504|
|Publication date||28 Nov 2006|
|Filing date||29 Dec 2000|
|Priority date||29 Dec 2000|
|Also published as||US20020084902|
|Inventors||Wlodek W. Zadrozny, Dimitri Kanevsky|
|Original Assignee||International Business Machines Corporation|
This invention relates to a system and method of interpreting gestures and sounds produced by individuals in general, and, in particular, to a system and method for recording and processing gestures and sounds produced by individuals, comparing the processed recordings to a database of interpreted gestures and sounds, and providing caretakers of the individuals with interpretations of the gestures and sounds.
The interpretation of nonverbal communication, i.e. gestures and sounds, is crucial for the caretaking and treatment of nonverbal individuals. Young children who have not yet learned to speak use behaviors such as movements, gestures and sounds to express their needs and current state of emotions and well being. Nonverbal communication is often the only means of expression for individuals with hearing loss, autism and other disorders and illnesses. Even for verbal individuals, nonverbal communication, if interpreted properly, can be used to recognize, diagnose and track treatment of medical disorders.
From time immemorial, parents, as an example, have been faced with the challenge of determining the meaning behind gestures and sounds produced by their nonverbal infants and toddlers. New, inexperienced parents resort to trial and error to determine what need the child is expressing. Experienced parents and care givers often encounter nonverbal communicative behaviors with which they are not familiar. Child and parent are apt to experience frustration upon erroneous interpretations of the child's nonverbal communications. The child's frustration masks the actual need being expressed. The parent's frustration is sensed by the child and will interfere with making a proper interpretation of the child's communications. The frustrations can bring on undesirable behaviors on the part of the parent or child. In time, the inexperienced parent will usually learn to recognize patterns intuitively and correctly interpret the nonverbal communications with which they have become familiar. In most cases the frustration could have been avoided by providing the inexperienced parent with the knowledge accumulated by experienced caretakers. However, there is no effective means for a parent to discover caretakers who have experience with the particular behavior the parent is having trouble interpreting. Therefore, there is a need for a system and method to provide caretakers of individuals with access to knowledge of nonverbal behavior accumulated by experienced caretakers for interpreting nonverbal communicative behavior of nonverbal individuals.
However, there are instances in which a nonverbal communication is not contained within the knowledge already accumulated. The gesture, sound or movement recognized may be particular to the individual child or may indicate a condition which does not fall within the norm. It is probable that the interpretation of the nonverbal communication, once discovered, will be of aid to a caretaker at some point in the future. Therefore, there is a need for a system and method for tracking previously unrecognized behaviors of individuals until an interpretation is discovered, and adding the newly interpreted behavior to the already accumulated knowledge.
There are instances in which an individual's nonverbal communication is related to current situational conditions such as an event or an object proximate to the individual. It is difficult for a caretaker of the individual to account for the variable events or objects which might be affecting the individual under their care, making it difficult for the caretaker to associate the individual's behavior with an event or object. Therefore, there is a need for a system and method of associating events and proximate objects to the behaviors of an individual.
Once a behavior occurs, it is difficult for care givers and caretakers to compare it with other behavioral occurrences and to associate it with situational conditions. Each behavior is a composite of many factors. Tracking each factor of a behavior is confusing and time consuming. A caretaker may not be capable of tracking proximate events or objects for the purposes of association with the behavioral occurrences, particularly when the association is not obvious. In addition, the process of observing for the purpose of recognizing repetitive behaviors takes care givers and caretakers away from their other duties and responsibilities. Furthermore, it is difficult for a parent to be objective about their own child's behavior and to see it realistically. Therefore, there is the need for a system and method for recognizing a repetitive behavior and studying it through observing it repetitively and analyzing it.
It may be impossible for a parent or care giver to observe or study a child's behavior from the same place where the child is located. Therefore, there is a need for a system and method to provide the ability to observe the behavior of an individual from a location remote from the individual.
There are situations in which a caretaker cannot guard the individual under their care from every danger in the environment. For example, the caretaker may be caring for more than one individual, may be distracted while performing one of their duties, may be unfamiliar with dangers in the environment, may be caring for an individual who tends to sneak and seek out trouble, etc. Therefore, there is a need for a system and method of alerting a caretaker when an individual under their care is exposed to potential danger.
In summary, there is a need for a system and method for providing a caretaker of an individual with the ability to identify nonverbal behaviors of an individual under their care by comparing and matching the behavior with a collection of previously recorded nonverbal communicative behaviors, for associating nonverbal behaviors to situational conditions, for recognizing patterns of behavior and associating situational conditions to the behavioral patterns, and for alerting a caretaker when a nonverbal behavior poses a danger.
It is an aspect of this invention to provide a system and a method for recognizing nonverbal behaviors performed by an individual, comparing the recognized behavior to a collection of known behaviors and learning from the known behaviors what may be causing the recognized behavior and what can be done for the individual.
Another aspect of this invention is to provide a system and a method for determining when a relationship exists between events and behaviors.
Another aspect of this invention is to provide a system and a method for detecting the occurrence of a dangerous event, and detecting behaviors indicating that an individual is in serious trouble due to the occurrence of a dangerous event and alerting a caretaker of the individual appropriately.
Another aspect of this invention is to provide a system and a method for providing caretakers of individuals with access to information collected relative to known behaviors and their causes, and for providing the means for caretakers to communicate with one another and with experts in appropriate fields.
To accomplish the above and other aspects, a method is proposed, which, in the preferred embodiment, comprises the steps of observing and recording a behavior of an individual; assigning a code to the observed behavior; comparing the observed behavior code with a database of behavior codes associated with known behaviors; retrieving an explanation associated with the known behavior codes which match the observed behavior code; correlating related events to the observed behavior; providing access to information stored about various behaviors, events and their relationships; and providing means for caretakers of individuals to communicate with one another and with appropriate experts. The method may further include the steps of sending an alarm upon the occurrence of an event which poses a danger to the individual; and sending an alarm upon recognition of behaviors of the individual which indicate that the individual is in danger.
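The steps above can be illustrated with a minimal sketch. This is not the patent's implementation; the function name, the code format, and the in-memory dictionary standing in for the behavior code database are all illustrative assumptions.

```python
# Illustrative in-memory stand-in for the behavior code database.
# The code format "G01-S02" and all entries are hypothetical.
BEHAVIOR_DB = {
    "G01-S02": {"explanation": "hunger cry with fist-to-mouth gesture",
                "related_events": ["last feeding more than 3 hours ago"]},
}

def analyze(observed_code: str) -> dict:
    """Compare an observed behavior code against the known-behavior database.

    On a match, return the stored explanation and related events.
    On no match, add a new record so the behavior can be tracked until
    an interpretation is discovered (as the method proposes).
    """
    match = BEHAVIOR_DB.get(observed_code)
    if match is not None:
        return {"status": "match found", **match}
    BEHAVIOR_DB[observed_code] = {"explanation": None, "related_events": []}
    return {"status": "no match; new record created"}
```

A caretaker-facing service would wrap this lookup with the recording, event-correlation, and alarm steps described above.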
In addition, a system is proposed, which, in the preferred embodiment, comprises a recording means for recording a behavior performed by an individual, the recording means being in communication with a user interface; and an Analysis Provider in communication with the user interface, the Analysis Provider having a database of behavior codes associated with known behaviors, and associated with each behavior code is an explanation and event data regarding events related with each known behavior. The Analysis Provider may further have a) a Behavior Code Module for recognizing a recorded behavior and associating a behavior code with the recognized behavior, and b) a Behavior Code Matching Module for finding behavior codes associated with known behaviors which match the recognized behavior code, and retrieving information associated with the known behavior code from the database. Furthermore, the Analysis Provider may include a Situational Analyzer for identifying and tracking the occurrence of events which can be correlated to the recognized behavior, for recognizing repeated behavior patterns of the individual behaviors, and for recognizing when an event occurs which poses a danger to the individual.
In the following description, the terms “server” and “database” are used in a generic functional sense. The term “server” should be understood within the client/server architectural model—the client requests a service, the server provides a service. The term “database” can be understood in its most broad definition as a data structure stored on one or more pieces of hardware. A “database” and “server” may be depicted as one entity labeled a “database” which is inclusive of the “server”, yet it is to be understood that the “database” functions to store data and the “server” functions to provide access to the “database”. Indeed, any of the servers or databases could be implemented using a distributed network system, where the functional elements of a server or database are not only distributed among nodes, but will often migrate from node to node. On the opposite end of the spectrum, all of the servers and databases discussed below could be resident on one computer. How much of each server or database is implemented in software, firmware, or hardware is also open to many variations, as is well known in the art.
Furthermore, the terms “network” and “computer” are used in the most general sense. A “computer” is any computing means, from a single microprocessor or micro controller to a computer system distributed over multiple processing nodes. A “network” includes any means that connects computers. Thus, although the preferred embodiment uses the Internet, the nodes could connect through any other type of Intranet or network such as a central server with individual point-to-point connections, a LAN, etc.
The term “caretaker” refers to one who takes care of or gives care to another individual such as a parent, a baby sitter, a teacher, a care giver, a doctor, a therapist, a nurse, etc. The term “care giver” refers in particular to those involved in the temporary care of an individual for providing a specialized service on a patient/client basis.
The terms “nonverbal behavior”, “nonverbal communication”, and “nonverbal communicative behavior” refer to movements, gestures and sounds (including gestures accompanied by words) produced by an individual intentionally or unintentionally and which express a need, desire, or current state of emotional, mental or physical well being of the individual.
Other terms in the text are also to be understood in a generic functional sense, as would be known by one skilled in the art.
The procedure according to the preferred embodiment of the present invention is depicted in
Data gathered at the first stage is passed to the next stages. Although the second and third stages will be described in the preferred embodiment as being managed by one entity providing a service over the Internet, each stage could be handled by a separate entity. In other words, in the preferred embodiment, a single service provider is managing the second and third stages for subscribers. However, these functions could be delegated to a service provider or company who provides the service of one or more of behavior observation, behavior coding or situational analysis. Furthermore, each or all of these services could be offered to a client by a single company or a combination of companies utilizing conventional methods other than the Internet for delivering information between the client and the company.
The recording means 112 can comprise a device or manual means for recording a representation of an individual's behavior. The representation can be in any format such as textual, pictorial, graphic, digital or oral description or combination thereof. Devices employed as recording means can be an analogue or digital camera, video camera, thermal imaging system, ultrasound imaging system or any system which records a representation of the behavior of the individual. The device can be switched on and off manually or by automatic means such as the user interface 116 or a motion detector. The device can store recordings in files or submit them to the user interface 116 as they are being recorded.
The user interface 116 is any means for providing communication between the user and the Analysis Provider 118. The communication can be in any format such as written, verbal or electronic. The communication can be man to man, man to machine or machine to machine. The user interface 116 can further comprise carrier mail, a telephone, a wireless telephone, or a computer such as a personal or hand held computer. In the preferred embodiment, the user interface 116 and/or the recording means 112 comprise computers and store the recordings in files. The files can be reviewed and segments from the files can be selected for transmission to the Analysis Provider 118. Furthermore, in the preferred embodiment, the user interface and/or the recording means are capable of selectively allowing recordings to be transmitted to the Analysis Provider 118 as they are being recorded. In the preferred embodiment the user interface 116 communicates with the Analysis Provider 118 via the Internet 120. However, communication can be provided by an Intranet connection or any known type of network. Alternatively, information being transferred between computers comprising the user interface 116 and the AP 118 is stored on a conventional storage device and transported between the user interface 116 and the Analysis Provider 118.
In the preferred embodiment, the user interface 116 and the recording means are in communication. This communication can be provided by cable connections or by wireless connections.
Alternatively, the behavior can be observed directly by the Analysis Provider. This eliminates the need to form a representation of the behavior and provide it to the Analysis Provider.
The Analysis Provider can be a system comprised of personnel and/or devices for receiving recordings, recognizing behaviors and events, associating codes to the behaviors and events, and analyzing the events and behaviors. In the preferred embodiment the Analysis Provider 118 is a computer having or having access to a database of behavior codes.
The Analysis Provider 118 provides a service to a user 110. The user 110 can be a single individual or a plurality of individuals. The user 110 is generally a care giver or an assistant to a care giver of the individual whose behavior is being analyzed. The service provided by the Analysis Provider 118 can be provided gratis or for a fee. The fee can be variable according to user requests, such as the extent of analysis and reporting requested by the user 110. The Analysis Provider 118 can require that a user register with the service prior to using the service.
In the preferred embodiment the service provided by the Analysis Provider 118 includes receiving a representation of a behavior from a user, associating a code with the received behavior representation, comparing the received behavior code with a collection of stored behavior codes, returning an explanation associated with the stored behavior codes which match the received behavior code to the user.
A general outline of the basic steps performed by the Analysis Provider 118 in the preferred embodiment of the inventive system is shown in
In step 210 a behavior code is associated with each behavior represented in the compressed file. The Analysis Provider 118 recognizes gestures and uttered sounds which make up a behavior. Upon recognition, a code for each recognized gesture and sound is provided. When a gesture or sound is not recognized, a new code can be provided. The codes for the gestures and sounds are used to form the behavior code. In another embodiment, only gestures are recognized and assigned codes for forming the behavior code.
An example of a recognition system for identifying individuals according to their gestures and text independent speech is U.S. patent application Ser. No. 09/079,754 having the same assignee as the present application, entitled “APPARATUS AND METHODS FOR USER RECOGNITION EMPLOYING BEHAVIORAL PASSWORDS”, the disclosure of which is incorporated herein by reference. In one embodiment of such a recognition system, behaviors comprised of a series of gestures are recognized. Each gesture is recognized by clustering similar frames of a compressed digital file of images into a cluster, and grouping the clusters into strokes. A stroke is a series of frames which represent an individual performing a particular gesture. The strokes are classified and labeled into gesture symbols which are then interpreted to form a gesture unit.
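The clustering of similar frames into candidate strokes can be sketched as follows. This is a hedged simplification of the referenced recognition system, assuming frames are feature vectors and using a plain Euclidean distance threshold; the function names and the threshold value are illustrative, not from the cited application.

```python
def frame_distance(a, b):
    """Euclidean distance between two frame feature vectors (assumed representation)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def cluster_frames(frames, threshold=1.0):
    """Group consecutive similar frames into clusters.

    Each cluster is a run of frames within `threshold` of its predecessor;
    in the referenced system such clusters would be grouped into strokes
    and classified into gesture symbols.
    """
    clusters = []
    for frame in frames:
        if clusters and frame_distance(clusters[-1][-1], frame) <= threshold:
            clusters[-1].append(frame)  # same stroke candidate
        else:
            clusters.append([frame])    # start a new cluster
    return clusters
```

The subsequent classification of clusters into labeled gesture symbols would require trained models and is omitted here.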
The Analysis Provider 118 assigns a code to each gesture unit. The codes assigned to the series of gesture units which make up the behavior are assigned to the behavior code for that behavior. Similarly, the sounds of the behavior are broken up into sound units which are associated with a code and assigned to the behavior code. Thus, the behavior code is comprised of at least one of a gesture component and a sound component, each component being comprised of at least one unit code.
The gesture units are further broken up into property units such as which body part was moved, what type of movement was performed, the individual's body position and the duration of the gesture unit. Each property unit is assigned a code. The property codes assigned to the property units comprising a gesture unit are concatenated to form the gesture unit code. The sound unit will likewise be broken up into property units such as sound pitch, sound volume, sound quality, and the duration of the sound unit, and may include units appropriate for automating speech recognition (as known in the art). The property unit codes assigned to the property units comprising a sound unit are concatenated to form the sound unit code.
The behavior can be broken down into gesture units and sound units in various ways, as for example according to predetermined time intervals or according to the recognition of the end of a gesture or sound. The codes assigned to the gesture and sound units are integrated into one behavior code comprising a string of integrated gesture unit and sound unit codes. Alternatively the gesture component and the sound component of the behavior code can be maintained as separate components associated with one behavior, each component comprised of a string of corresponding unit codes.
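The concatenation of property unit codes into unit codes, and their integration into one behavior code, can be sketched as below. The two-letter property codes, the zero-padded duration, and the "G:"/"S:" component markers are all hypothetical formats chosen for illustration; the patent does not fix a concrete encoding.

```python
def gesture_unit_code(body_part, movement, position, duration):
    """Concatenate the property unit codes of one gesture unit.

    Properties mirror those named in the text: body part moved, type of
    movement, body position, and duration (here padded to two digits).
    """
    return f"{body_part}{movement}{position}{duration:02d}"

def sound_unit_code(pitch, volume, quality, duration):
    """Concatenate the property unit codes of one sound unit (pitch, volume, quality, duration)."""
    return f"{pitch}{volume}{quality}{duration:02d}"

def behavior_code(gesture_units, sound_units):
    """Integrate gesture and sound unit codes into a single behavior code string."""
    return "G:" + "-".join(gesture_units) + "|S:" + "-".join(sound_units)
```

Under this format, an arm-wave gesture with a loud high-pitched cry might encode as `behavior_code([gesture_unit_code("AR", "WV", "UP", 3)], [sound_unit_code("HI", "LD", "CR", 3)])`, i.e. `"G:ARWVUP03|S:HILDCR03"`. Keeping the gesture and sound components separated by the `|` delimiter corresponds to the alternative of maintaining them as separate components associated with one behavior.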
The behavior code can have a hierarchal structure such that specific gesture or sound unit codes are assigned priority levels. There are a variety of criteria according to which priority levels are assigned to behavior unit codes. Examples of criteria include chronological order, frequency of occurrence within the behavior code database, degree of dominance, etc. The degree of dominance refers to the degree to which a behavior unit is associated with the causes of the behavior. An expert, such as a behavioral psychologist, would determine the degree of dominance of a behavior unit. For example, an expert might attribute a higher degree to a behavioral unit representing head banging than to a behavioral unit representing toe wiggling.
Similarly, the property unit codes are assigned selectable priority levels. The priority levels are assigned according to the degree to which a property unit distinguishes one behavior from another for the purposes of finding a match when searching the behavior codes database.
The priority levels assigned to behavior unit codes and property unit codes are selectable and can be determined by operators of the system and experts in the behavioral or medical fields. The selections can be made manually or automatically. The selections can be dynamic, such that the selections change as other factors change, such as new data being added to the behavior code database. The priority is selected in accordance with data stored in the individual and/or general behavior codes databases.
In step 215 a behavior code database containing a collection of stored behavior codes is searched for behavior codes which match the assigned behavior code. The database is comprised of multiple records. Each record corresponds to one behavior code. Each record has multiple fields containing data pertinent to the record.
In the preferred embodiment the search will account for priority levels within the structure of the behavior codes. Behavior unit codes having a higher priority will be searched for prior to those of lower priority. Likewise, property unit codes having a higher priority will be searched for prior to those of lower priority.
In the preferred embodiment the criteria are specified for the degree of similarity desired for determining a match between the assigned behavior code and behavior codes within the database. The criteria for match determination can be specified by operators of the system or by the user.
Upon determination of at least one match at step 218, step 220 is then executed. At step 220 data is retrieved from each record having a matching behavior code. Included in the data retrieved from the matching records is an explanation of the behavior represented by the matching behavior code. The explanation preferably is a textual segment entered previously by a user of the system or by an expert consultant. At step 230 a “match found” message comprising data retrieved from the matching records is transmitted to the user.
In the instance where a match is not found, step 240 is executed. At step 240, with the user's consent, the assigned behavior code is added to the database of behavior codes together with appropriate data to form a new record. The explanation field of the data record can be left blank at this time and filled in at a later date if desired.
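The priority-weighted search with a user-specified degree of similarity can be sketched as below. This is a simplification assuming behavior codes are sequences of unit codes aligned position by position; the weighting scheme and the 0.8 default threshold are illustrative assumptions, not the patent's specification.

```python
def similarity(code_a, code_b, priorities):
    """Weighted fraction of matching unit codes.

    Higher-priority units contribute more weight, so a mismatch on a
    high-priority unit lowers the score more than one on a low-priority unit.
    """
    total = sum(priorities)
    score = sum(p for a, b, p in zip(code_a, code_b, priorities) if a == b)
    return score / total

def search(observed, database, priorities, min_similarity=0.8):
    """Return records whose similarity meets the criteria, best match first."""
    scored = [(rec, similarity(observed, rec["units"], priorities))
              for rec in database]
    matches = [(rec, s) for rec, s in scored if s >= min_similarity]
    matches.sort(key=lambda m: m[1], reverse=True)
    return matches
```

An empty result would correspond to the no-match branch of step 240, where the observed code is added to the database as a new record.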
In the preferred embodiment, users of the service provided by the inventive system first subscribe to the service by registering with the Analysis Provider 118. The user 110 provides the Analysis Provider 118 with personal information which is entered into a subscriber database. An identification (subscriber ID) code is assigned and stored by the system for each user registered as a subscriber. Subscriber data relating to financial information is kept confidential. The subscriber can have the option to allow or prevent access by other subscribers to information such as name, e-mail address, phone number and occupation. In other embodiments the service provided by the inventive system could be available to the public. Public users and subscribers would then have various levels of access to the information provided by the service.
The behavior code database can refer to a general behavior code database for storing behavior codes associated with behaviors performed by a variety of individuals, and/or to an individual behavior code database for storing behavior codes associated with behaviors performed by the individual. The individual database is preferably stored by the AP 118, or by the user interface 116. The means to search and update the database is also preferably resident at either the AP 118 or the user interface 116.
The behavior code database is structured in a variety of ways.
Preferably, the recordings of the behavior are stored and indexed according to the behavior ID. By storing the recordings, they can be referred to for viewing or further analysis, and can be deleted when no longer needed. The recordings are stored in any format such as the format in which they were received or as a compressed file.
In the preferred embodiment, the lists are linked databases having keys stored in fields of the records of the behavior code database. Each list contains multiple records. For example, records in the list for explanations, list for possible causes or list for recommendations may contain multiple records each having fields for a textual entry, a subscriber ID for the subscriber who entered it, and a date of entry. The list of similar behaviors may contain multiple records each having a field for a behavior ID. A user viewing a record containing a behavior ID or subscriber ID uses the ID to look up the behavior or subscriber in their respective databases in order to retrieve more information.
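The record structure described above might be modeled as follows. This is a sketch under the assumption that the linked lists can be represented as typed collections; the field names are taken from the lists named in the text, while the class names and date format are illustrative.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ListEntry:
    """One entry in a linked list (explanation, possible cause, or recommendation):
    a textual entry, the subscriber ID of who entered it, and a date of entry."""
    text: str
    subscriber_id: str
    date: str

@dataclass
class BehaviorRecord:
    """One record of the behavior code database, keyed by behavior ID."""
    behavior_id: str
    behavior_code: str
    explanations: List[ListEntry] = field(default_factory=list)
    possible_causes: List[ListEntry] = field(default_factory=list)
    recommendations: List[ListEntry] = field(default_factory=list)
    similar_behavior_ids: List[str] = field(default_factory=list)
```

A viewer encountering a `subscriber_id` or an entry in `similar_behavior_ids` would use that ID to look up the subscriber or behavior in its respective database, as the text describes.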
Access to the Analysis Provider 118 is provided by the user interface 116. In the preferred version a user 110 uses a web browser program, such as Netscape Navigator™, to provide access to the Internet 120 for accessing the Analysis Provider 118. The user 110 exchanges information with the AP 118 via Web pages. The information being transmitted to and from the Analysis Provider 118 is protected for confidentiality by employing encryption and decryption techniques such as SSL (Secure Sockets Layer) if desired.
Prior to a user 110 submitting recordings of behaviors performed by an individual, the user 110 must provide the Analysis Provider 118 with information about the individual 114. The information is stored by the Analysis Provider 118 or by the user interface in an individual personal data file or database. The information may need to be updated periodically. The Analysis Provider 118 prompts or requires the user 110 to update the information at specific time intervals, in particular for individuals 114 whose condition is in flux, such as a growing child or an individual with an unstable medical history. Information which must be entered in the individual personal data file for registering an individual 114 may include name, identification number assigned by the system, weight, height and dimensions of specific body parts. The measurements may be entered manually or entered automatically by a measuring device. The AP 118 uses the physical measurements of the individual 114 to scale the compressed file of the image for accurate association of a behavior code. Other information to be entered may be related to allergies, disabilities, present medical condition, a picture, medical history, typical demeanor (appetite, mood, energy level), current demeanor, etc.
After the user enters or confirms updated information about the individual, the user specifies criteria for the analysis of the images to be submitted. The AP then presents the user with questions to be answered, which may depend on answers provided previously. The user is prompted to supply answers in various formats such as by filling in a blank field or by selecting from multiple choices. In the preferred embodiment, the user is presented with a User Analysis Request Web page, such as the page 500 shown in
At section 510 the user directs the analysis to extract from the images submitted: a) a specific behavior, b) all behaviors or c) behaviors related to situational events. At section 515 the user chooses to further extract situational events or behaviors within a selected chunk of time before and after each occurrence of a specified behavior. At section 520 the user describes the behavior being extracted. At field 525 the user enters the behavior code for the specific behavior or behaviors of the behavior pattern being extracted. The code is typically entered by keying the code into field 525 or by selecting the code from a list of codes and descriptions 535. The description for any code entered is displayed in field 530 in order for the user to confirm that it is the code intended. At field 540 the user selects a series of gesture and sound units separately which comprise a behavior. The gesture unit is broken down, by example, into body part, movement, body orientation and duration of gesture. The user selects “Yes” or “No” for inclusion of a sound unit. The sound unit, by example, is broken into sound description, intensity, pitch and duration of sound. The user then indicates if a gesture or sound unit is simultaneous with the previous gesture or sound unit entered. The code and description which best matches the combination of gesture units selected will pop up in fields 525 and 530 respectively. The user then enters a series of behavior codes which would constitute a behavior pattern. The user selects “End of Behavior” to indicate when a series of behavior units has been entered which constitute the behavior. The user then selects “Enter Behavior” to enter the entire series of behavior units.
The user specifies criteria for the degree of accuracy required in matching the behavior being extracted to behaviors stored in the database. In this example, at field 550, the user selects a percentage of accuracy, however any method could be implemented. A minimum limit to the degree of accuracy can be required. At field 560 the user specifies the maximum number of returned matches desired. The user can store his current selections as a default for future submissions at field 565, and save the current selections as a file at field 570. The user then requests that the behaviors extracted be analyzed for determining if a pattern exists at 575. The user next requests situational analysis of the behaviors and events extracted at field 580 for determining if there is a relationship between the behaviors and events proximate to the behavior or objects proximate to the individual. Finally, once the user has entered the necessary information, the user submits the criteria for analysis entered at 590.
Once the criteria for analysis have been submitted, the file containing the images selected to be analyzed must be submitted, although the order could be reversed.
The user enters events by selecting an event displayed in the view window 620, selecting “add event” and selecting the appropriate event label from window 640. The time of occurrence will appear in the time window 642. The user specifies the end of the event by selecting “end event”. Selecting a “danger” event will allow the user to enter an event which poses a danger from window 644. The user is able to identify objects in the recording by using cursor 645 to point at or select the object displayed in window 620, and selecting the appropriate object label from window 650. Selecting a “danger” object allows the user to enter an object which poses a danger from window 655.
The user saves the selections made by selecting “save” and specifying a file name at field 660. When the necessary information has been entered, the user submits the selected segments to the AP for analysis by selecting “submit selected segments” field 670. Alternatively, the user can choose to submit a live stream of images, as they are being produced by selecting “submit live stream of images” field 675. The filename from which the images are retrieved must be specified in field 680. When submitting a live stream of data, the user has the option of selecting the option to “alert for danger” in field 685. This will cause a screen to be presented to the user in which the user must specify the nature of any dangers which might exist or occur in the vicinity of the individual. The user ends the stream of images by selecting “end stream” in field 690 with the filename specified in field 680.
Now, the modules used by the AP 118 to analyze behaviors represented in recordings submitted by the user 110, via the user interface 116, will be described in detail with reference to
The Control Module 710 supervises the passage of control from one module to the next. The File Manager 715 is responsible for supplying each module with the proper file or segment of a file to be processed. The File Manager accesses databases of submitted recordings and compressed, digitized formatted recordings. Recording databases are provided for storing the recordings at the desired stages of processing.
The request data entered by the user 110 via web screens at the user interface 116 is sent to the Control Module 710. The Control Module supervises the flow of control in order to fulfill the user's requests. Control is first sent to the Digitize and Sample Module 725. The recording data file is sent by the file manager to the Digitize and Sample Module 725.
The Digitize and Sample Module 725 converts any analogue data to digital data, and compresses the recording data by sampling the frames of the image data and sound data and saving the sampled frames in a compressed image file. Each sampled frame is time stamped. The Digitize and Sample Module 725 can remove segments of image data in which there is no activity, i.e., segments in which successive frames are identical. Removal of inactivity segments can be done either prior to or after sampling.
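The sampling and inactivity-removal steps above can be sketched as follows. This is a minimal illustration, not the patented implementation; the `Frame` structure, the sampling interval, and byte-for-byte frame comparison are all assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    timestamp: float   # seconds from the start of the recording (time stamp)
    image: bytes       # digitized frame data

def sample_and_compress(frames: List[Frame], interval: int = 5) -> List[Frame]:
    """Keep every Nth frame (sampling), then drop frames identical to the
    last kept frame (removal of inactivity segments, done post sampling)."""
    sampled = frames[::interval]
    kept: List[Frame] = []
    for frame in sampled:
        if kept and frame.image == kept[-1].image:
            continue  # no activity relative to the previous kept frame
        kept.append(frame)
    return kept
```

Doing the inactivity removal after sampling, as here, touches fewer frames; doing it before sampling would instead preserve short bursts of activity that the sampling step might skip.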
Next, the compressed image file and control are sent to the Scaling Module 730, in which data pertaining to the measurements of the individual, from the individual's personal data file in the individual personal data database 731, is accessed to provide a scaling factor. The Scaling Module can either provide the scaling factor or actually scale the images. The scaled and compressed image file and control are sent to the Noise Elimination Module 735 for eliminating frames or portions of frames in which non-meaningful activity exists. The Noise Elimination Module 735 can consult a library of non-meaningful activities, or determine that a detected activity is non-meaningful. For example, devices in an infant's room or in a hospital room may be in motion or produce noises which are not related to the behavior of the individual. Examples of such devices include a respirator, a ticking clock, a rotating mobile in perpetual motion, etc.
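As one way a scaling factor could be derived from the individual's measurements, consider a pixels-to-centimeters factor computed from the individual's recorded height. The function names and the choice of height as the reference measurement are hypothetical; the specification leaves the derivation open.

```python
def scaling_factor(height_in_image_px: float, actual_height_cm: float) -> float:
    """Derive a pixels-to-centimeters factor from the individual's actual
    height (from the personal data file) and the height measured in the image."""
    return actual_height_cm / height_in_image_px

def scale_point(point, factor):
    """Apply the scaling factor to an (x, y) image coordinate."""
    x, y = point
    return (x * factor, y * factor)
```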
The control unit and file manager will pass control and send the scaled compressed recording file to a) the Behavior Coder (BC) Module 740 when behaviors in the recording file are being analyzed, or to b) the Situational Analyzer 745 when events in the recording file are being analyzed.
When the scaled compressed recording file is received by the Behavior Coder Module 740, the compressed recording file is examined for recognizing gestures and sounds which make up a behavior. The gestures can be further grouped into gesture units, and the sounds further grouped into sound units. Each gesture and sound unit is assigned a code. The gesture unit code can comprise multiple parts such as body part moved, type of movement, position of body and duration of movement. The sound unit code can comprise multiple parts such as sound description, sound intensity and duration of sound. A Behavior Begin/End Detector Module 750 is consulted by the Behavior Coder Module 740 for determining the end of one behavior and the beginning of the next. The gesture and sound unit codes are combined to form a recognized behavior code for each behavior recognized. The combining of the gesture and sound units to form a code for a behavior can be done in a variety of ways. For example, a Synchronizer Module (not shown) can be used for synchronizing the gestures and sound units to best model the actual behavior. In a different method, the gesture units may be stored separately from the sound units. Also, the gesture and sound units may be stored consecutively according to chronological order, with an additional gesture or sound unit code part for indicating when a unit code is simultaneous with the previous unit code.
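One way the multi-part gesture and sound unit codes described above could be represented is sketched below. The specification does not prescribe a concrete encoding; the field names and the use of a flag for simultaneity (the "additional gesture or sound unit code part" mentioned above) are illustrative.

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class GestureUnit:
    body_part: str              # body part moved, e.g. "arm"
    movement: str               # type of movement, e.g. "wave"
    position: str               # position of body, e.g. "sitting"
    duration: float             # duration of movement, in seconds
    simultaneous: bool = False  # True when concurrent with the previous unit

@dataclass
class SoundUnit:
    description: str            # sound description, e.g. "whimper"
    intensity: str              # sound intensity, e.g. "loud"
    duration: float             # duration of sound, in seconds
    simultaneous: bool = False

@dataclass
class BehaviorCode:
    # gesture and sound units stored consecutively in chronological order,
    # with the simultaneous flag marking units that overlap in time
    units: List[Union[GestureUnit, SoundUnit]] = field(default_factory=list)
```

A behavior such as an arm wave accompanied by a whimper would then be one `BehaviorCode` holding a `GestureUnit` followed by a `SoundUnit` with its `simultaneous` flag set.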
At this point, the Control Module 710 will pass control to the Behavior Code Matching Module 755 when each recognized behavior code provided by the BC Module 740 is to be compared to existing behavior codes for finding a match. On the other hand control will be passed to the Behavior Series Matching Module 760 when a series of behaviors is to be compared to existing series of behaviors for finding a match.
The Behavior Code Matching Module 755 determines if the user requested that a specific behavior be searched for. In this case, the recognized behavior code is compared to the specific requested behavior code to determine if a match exists. The results are sent to the Report Generator 770. Otherwise, the Behavior Code Matching Module 755 searches a Behavior Codes (BC) Database 765 to find a stored behavior code which matches the recognized behavior code.
When searching the Behavior Code Data Base 765 for a match, an algorithm is used for prioritizing the comparison of gesture or sound units, or their parts. Many different algorithms could be used. For example, a first high priority class may be formed of all behavior codes in the Behavior Code Database having a first gesture unit with a body part moved which matches the body part moved of the first gesture unit in the recognized Behavior Code. A subclass can be formed from the first class by matching first gesture units having the same movement. The subclass can be further narrowed down by matching first gesture units with a body position and then narrowed further by matching gesture units with a duration within a specified threshold. The comparison can proceed by comparing gesture units in their chronological order. Subsequent subclasses can be formed until matches have been found. In a different algorithm, certain types of body parts and movements would have priority and are compared first, regardless of their chronological order. Duration of a movement or sound may or may not be compared until late in the search or not at all.
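The class-and-subclass narrowing in the first example algorithm above can be sketched as successive filters over the database. The dictionary field names and the duration threshold value are assumptions made for illustration.

```python
DURATION_THRESHOLD = 0.5  # seconds; an assumed tolerance

def find_candidate_matches(target_unit: dict, database: list) -> list:
    """Narrow the Behavior Code Database by comparing parts of the first
    gesture unit in priority order: body part moved, then movement, then
    body position, then duration within a threshold. Each stored behavior
    code is a list of unit dicts in chronological order."""
    # First high-priority class: same body part moved in the first gesture unit
    cls = [code for code in database
           if code[0]["body_part"] == target_unit["body_part"]]
    # Subclass: first gesture units having the same movement
    cls = [code for code in cls if code[0]["movement"] == target_unit["movement"]]
    # Narrowed further: matching body position
    cls = [code for code in cls if code[0]["position"] == target_unit["position"]]
    # Narrowed further: duration within the specified threshold
    cls = [code for code in cls
           if abs(code[0]["duration"] - target_unit["duration"]) <= DURATION_THRESHOLD]
    return cls
```

Under the alternative algorithm mentioned above, the filters would simply be reordered so that high-priority body parts and movements are compared first, with duration compared late or not at all.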
Just as the recognized behavior codes can be determined by using a module to synchronize the gesture and sound units or by considering them separately, the process of searching for a match can also involve comparing gesture and sound units in a synchronized fashion or separately from one another.
Determination of a match is made according to criteria which are selectable. The criteria are selected by the operators of the system, by the system software, by the user, or by any combination thereof. For example, the user may have specified that the criteria for a match be a perfect match, such that all of the components of all of the behavior units match. In another example, the user may have specified that the criteria for a match be less than perfect, such as 80%. The degree to which codes match one another can be calculated in various ways. For example, the match between a recognized behavior code and a stored behavior code can be assigned a value. The match value can be assigned a beginning value of 100%, which is then decreased by a specified amount for each part of a behavior unit which does not match, or for matching gesture and sound units which occur in a different order. The match value can be decreased by an even greater amount when an entire behavior unit in a stored code does not correspond to a behavior unit in the recognized code.
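A minimal sketch of such a match-value calculation follows. The specification leaves the specific deductions open, so the penalty amounts here are assumptions, and order-mismatch penalties are omitted for brevity.

```python
def match_value(recognized: list, stored: list,
                part_penalty: float = 5.0,
                missing_unit_penalty: float = 20.0) -> float:
    """Start at 100% and deduct a specified amount for each non-matching
    part of a behavior unit; deduct a greater amount when an entire unit
    in the stored code has no counterpart in the recognized code. Codes
    are lists of unit dicts (part name -> value)."""
    value = 100.0
    for i, stored_unit in enumerate(stored):
        if i >= len(recognized):
            value -= missing_unit_penalty  # whole unit has no counterpart
            continue
        for part, expected in stored_unit.items():
            if recognized[i].get(part) != expected:
                value -= part_penalty      # one part does not match
    return max(value, 0.0)
```

The resulting percentage can then be compared against the user-selected accuracy criterion (e.g. 80%) to decide whether a match exists.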
When control is transferred to the Behavior Series Matching Module 760, it is determined whether the user is searching for a specified behavior series. If so, the string of recognized behavior codes is compared to the specified behavior series, and the results are sent to the Report Generator 770 or to the SA Module 745 for further analysis.
Preferably, the Behavior Code Database comprises a general behavior code database and an individual behavior code database. The general behavior code database stores codes associated with behaviors of multiple individuals. The individual behavior code database stores codes associated with behaviors of the individual being observed. The individual behavior database is used, for example, for detecting patterns of the same behavior by the individual. The individual behavior database is stored at the user interface or at the AP, together with computer program code for executing searches and updating data related to the individual database.
The Report Generator 770 receives results from the various modules and generates messages and reports which are sent to the user via the Web Interface Server 705. The user specifies the type of report he wishes to receive, including how detailed it should be, upon registering with the system or submitting a request. The report can be interactive, including the option to proceed with the analysis. For example, the user may be prompted to change the criteria for a match via a user Web screen and then repeat the search. This would be appropriate when no match was found or too many matches were found. The Report Generator 770 may query the user as to whether or not an unmatched behavior submitted by the user can be entered into the BC Database as a new behavior. The Report Generator 770 may further query the user as to whether or not information such as the User's subscriber ID and the individual's ID can be entered into the general behavior code database.
Control is passed to the SA Module 745 for analysis of behaviors and behavior patterns in relationship to events. The SA Module 745 recognizes events and correlates events to behaviors or behavior patterns. An event refers to a) movement of an object, which is distinct from the individual; b) movement of the individual relative to the object; c) the presence of an object; d) an occurrence in the schedule of the individual; e) an expected occurrence in the schedule of the individual; or f) a detected change in the condition of the individual; where any of the above may influence the behavior of the individual. Individual event data such as the individual's schedule and condition are entered into the system either manually by a user, or automatically by a device such as a monitor. Individual event data such as objects present in the images are entered into the system either manually by a user, or automatically by the system via recognition methods. An example of an automatic object recognition system for identifying, monitoring, tracking and/or locating objects is U.S. patent application Ser. No. 09/239,845, having the same assignee as the present invention, entitled "A VIRTUAL MAP SYSTEM AND METHOD FOR TRACKING OBJECTS", the disclosure of which is incorporated herein by reference.
With reference to
In the inventive system, the SA Module 745 can be provided as an optional unit. In the preferred embodiment, in which the SA Module 745 is provided, a user 110 has the option to deactivate it. The SA Module 745 can be activated by user request or by the system's internal instructions.
When activated, control is passed to the SA module 745 by the control module 710, along with specific instructions. The compressed images to be processed are provided by the File Manager. The compressed recordings are provided as a steady stream or as a finite segment.
The modules of the SA Module 745 will now be discussed in detail. An Event Identifier/Tracker (EI/T) Module 815 receives the compressed recordings for detecting and identifying the occurrence of an event. Methods for detection of the occurrence of an event can include object recognition, tracking objects, tracking the individual for location relative to objects, tracking the time of an occurrence or expected occurrence in the individual's schedule and tracking the time of a change in the individual's condition. The Event Identifier/Tracker Module 815 compares the recognized object or event to records stored in the Individual's Event Database 805 and the General Event Library 810. Events which were recognized, but could not be identified, are transmitted to a Labeling Module 820 for labeling the event, creating a record and entering it into the appropriate Event Database or Library. The event can be transmitted to the Labeling Module 820 in many forms such as images, compressed images or codes. The Labeling Module 820 can be an automatic intelligent system or a person for recognizing and identifying the event.
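The two-stage lookup with a fallback to the Labeling Module can be sketched as below. Representing the databases as dictionaries keyed by an event signature is an assumption; the specification allows events to be passed in several forms (images, compressed images, or codes).

```python
def identify_event(signature, individual_event_db: dict,
                   general_event_library: dict, labeling_queue: list):
    """Look a detected event up first in the Individual's Event Database,
    then in the General Event Library; events which cannot be identified
    are queued for the Labeling Module."""
    record = individual_event_db.get(signature)
    if record is None:
        record = general_event_library.get(signature)
    if record is not None:
        return record
    labeling_queue.append(signature)  # transmit to the Labeling Module
    return None
```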
Multiple occurrences of a behavior within a selected time interval surrounding an event may indicate a cause and effect relationship. Likewise, the occurrence of an event within a selected time interval surrounding the performance of a behavior may indicate a correlation between the event and the behavior. An Event/Behavior Correlator Module (E/BC) Module 825 recognizes when a correlation between an event and a behavior exists and updates the related event data list field 335 of the behavior record stored in the BC Database 765 for events correlated to the behavior. Event data stored in field 335 includes the code associated with the correlated event, the number of recorded occurrences of each event and the time of the occurrence relative to the behavior.
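The windowed correlation just described can be sketched as follows. Representing events and behaviors as (code, timestamp) tuples and using a 30-second interval are assumptions; the specification only calls for a selected time interval.

```python
def correlate_events_and_behaviors(events, behaviors, window: float = 30.0):
    """Pair each event with each behavior whose time of occurrence falls
    within the selected interval (in seconds) of the event. Inputs are
    sequences of (code, timestamp) tuples."""
    pairs = []
    for event_code, event_time in events:
        for behavior_code, behavior_time in behaviors:
            if abs(behavior_time - event_time) <= window:
                # record the code pair and the behavior's time relative
                # to the event, as stored in field 335
                pairs.append((event_code, behavior_code,
                              behavior_time - event_time))
    return pairs
```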
The SA 745 may be activated for the case in which a user has requested notification of occurrences of a specified event. Each event recognized is compared by an Event Matching Module 830 to the specific requested event, in order to find a match. When a match is found, control will return to the Behavior Coder Module 740 for examining a specified time interval preceding and/or following each occurrence of a matching event for recognizing behaviors within that time interval. The Behavior Matching Module 755 will match the recognized behaviors to stored behaviors in the Behavior Code Database 765. The recognition and identification of a behavior within the specified time interval indicates that there may be a cause and effect relationship between the specific event and the matched behavior code. Control returns to the Event/Behavior Correlation (E/BC) Module 825 of the SA 745 for updating the related event data list field 335 of the matching behavior code. For the case in which the specific requested event was not recognized in the images provided, control is returned to the Report Generator 770 for returning a message to the user interface 116.
The SA 745 may further be activated for the case in which the system or the user has requested identification of all recognized events occurring within a segment of the compressed images provided. The user may have requested that the system find a correlation between a specific behavior or behavior pattern and any events which have occurred within a specific time interval prior to or following the specified behavior. A system request is generated when the system recognizes that the related event field 335, primary explanation field 320, or related explanation field 330 for a certain behavior code is still blank. The system tries to generate an explanation for the behavior associated with the behavior code by examining the associated compressed images of each occurrence of the behavior stored in the behavior code database. A specified time interval prior to and/or following each instance of the behavior is searched for the occurrence of any events. The system correlates any related events to the behavior.
The E/BC Module 825 updates the BC Database 765 by recording each event identified in the related events field 335 of the behavior code associated with the behavior being analyzed. Alternatively, the E/BC Module 825 uses intelligent processing to enter only the events which are likely to be related. The E/BC Module 825 looks for a pattern of events and makes a determination as to which events are most likely related to the behavior. The results are then sent to the Report Generator 770.
A Behavior Pattern Recognizing (BPR) Module 830 examines the Individual BC Database for patterns of behavior for the purpose of grouping together behaviors which typically occur proximate to one another. Upon determining that a group of behaviors is related, the BPR Module 830 examines the related event field 335 of each of the behaviors within the group. The BPR Module 830 determines if there are common related events amongst the group, indicating that an event may be the cause of a number of related behaviors. The results are then sent to the Report Generator 770.
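The common-related-events check can be sketched as a set intersection over the group's field 335 lists. Representing each behavior record as a dictionary with a `related_events` list is an assumption.

```python
def common_related_events(behavior_group: list) -> set:
    """Given a group of related behavior records, each carrying a
    related-events list (as in field 335), return the events common to
    every behavior in the group -- candidate causes for the whole group."""
    if not behavior_group:
        return set()
    common = set(behavior_group[0]["related_events"])
    for record in behavior_group[1:]:
        common &= set(record["related_events"])
    return common
```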
The SA 745 further functions to recognize when an event occurs which presents a potential danger to the individual. Examples of dangerous events include exposure to a severe allergen, ingestion of a choking hazard, contact with a sharp object or a poisonous substance, approach toward an electrical or fire hazard, a serious medical condition sensed by a monitor, etc. The event records in the Individual Event Database and the General Event Library include a field for indicating which events pose a danger. Upon detection of a dangerous event, the results are sent to the Report Generator for transmitting an alert to the user or caretaker of the individual. Furthermore, control is passed to the Behavior Coder Module for determining whether a behavior has occurred which indicates that the individual is in trouble. Trouble behaviors searched for could include typical choking gestures or sounds, inactivity, seizure, screaming, severe crying, etc.
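The danger check can be sketched as below: test the event record's danger field, alert the caretaker, and scan recent behaviors for trouble behaviors. The field names, the alert callback, and the concrete trouble-behavior set are illustrative assumptions.

```python
# An assumed set of trouble behaviors to search for near a dangerous event
TROUBLE_BEHAVIORS = {"choking gesture", "inactivity", "seizure",
                     "screaming", "severe crying"}

def check_dangerous_event(event_record: dict, recent_behaviors: list,
                          alert) -> list:
    """If the event record's danger field is set, transmit an alert via
    the supplied callback and return any trouble behaviors observed in
    the interval surrounding the event."""
    if not event_record.get("danger"):
        return []
    alert("Dangerous event detected: " + event_record["label"])
    return [b for b in recent_behaviors if b in TROUBLE_BEHAVIORS]
```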
Subscribers can access the subscriber, behavior code and image data bases to look up information. Subscribers may be allowed to add information to behavior records such as adding to the lists for possible causes, explanations and recommendations. Various levels of access to information can be assigned to subscribers and public users.
In the preferred embodiment the system provides the user interface with the software necessary for allowing the user to exchange information with the AP via the Web screens. Via the Web screens the user can browse and search the general databases for specific information. The User Query Module 720 handles the user requests, accesses the appropriate database, retrieves the information requested and provides the information to the Web Interface Server for returning the requested information to the user interface. In addition, users of the system can communicate with other users and with experts, interactively and in group formats such as chat rooms.
While the invention has been described with respect to the preferred embodiment, it should be understood that the invention is not limited to this embodiment, but, on the contrary, the invention is intended to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5526465 *||30 Sep 1991||11 Jun 1996||Ensigma Limited||Methods and apparatus for verifying the originator of a sequence of operations|
|US5596994 *||2 May 1994||28 Jan 1997||Bro; William L.||Automated and interactive behavioral and medical guidance system|
|US5722418 *||30 Sep 1994||3 Mar 1998||Bro; L. William||Method for mediating social and behavioral processes in medicine and business through an interactive telecommunications guidance system|
|US5913310 *||29 Oct 1997||22 Jun 1999||Health Hero Network, Inc.||Method for diagnosis and treatment of psychological and emotional disorders using a microprocessor-based video game|
|US5970448 *||23 Jul 1993||19 Oct 1999||Kurzweil Applied Intelligence, Inc.||Historical database storing relationships of successively spoken words|
|US5983190 *||19 May 1997||9 Nov 1999||Microsoft Corporation||Client server animation system for managing interactive user interface characters|
|US6006175 *||6 Feb 1996||21 Dec 1999||The Regents Of The University Of California||Methods and apparatus for non-acoustic speech characterization and recognition|
|US6048209 *||26 May 1998||11 Apr 2000||Bailey; William V.||Doll simulating adaptive infant behavior|
|US6186145 *||21 Jun 1999||13 Feb 2001||Health Hero Network, Inc.||Method for diagnosis and treatment of psychological and emotional conditions using a microprocessor-based virtual reality simulator|
|US6292830 *||7 Aug 1998||18 Sep 2001||Iterations Llc||System for optimizing interaction among agents acting on multiple levels|
|US6334778 *||17 Mar 1999||1 Jan 2002||Health Hero Network, Inc.||Remote psychological diagnosis and monitoring system|
|US6404438 *||21 Dec 1999||11 Jun 2002||Electronic Arts, Inc.||Behavioral learning for a visual representation in a communication environment|
|US6421453 *||15 May 1998||16 Jul 2002||International Business Machines Corporation||Apparatus and methods for user recognition employing behavioral passwords|
|US6792418 *||29 Mar 2000||14 Sep 2004||International Business Machines Corporation||File or database manager systems based on a fractal hierarchical index structure|
|US7076430 *||17 Jun 2002||11 Jul 2006||At&T Corp.||System and method of providing conversational visual prosody for talking heads|
|1||U.S. Appl. No. 09/079,754, "Apparatus and Methods for User Recognition Employing Behavioral Passwords".|
|2||U.S. Appl. No. 09/239,845, "A Virtual Map System and Method for Tracking Objects".|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US8442835 *||17 Jun 2010||14 May 2013||At&T Intellectual Property I, L.P.||Methods, systems, and products for measuring health|
|US8600759 *||12 Apr 2013||3 Dec 2013||At&T Intellectual Property I, L.P.||Methods, systems, and products for measuring health|
|US8638911 *||24 Jul 2009||28 Jan 2014||Avaya Inc.||Classification of voice messages based on analysis of the content of the message and user-provisioned tagging rules|
|US8666768||27 Jul 2010||4 Mar 2014||At&T Intellectual Property I, L. P.||Methods, systems, and products for measuring health|
|US8712760 *||29 Dec 2010||29 Apr 2014||Industrial Technology Research Institute||Method and mobile device for awareness of language ability|
|US8744847||25 Apr 2008||3 Jun 2014||Lena Foundation||System and method for expressive language assessment|
|US8938390 *||27 Feb 2009||20 Jan 2015||Lena Foundation||System and method for expressive language and developmental disorder assessment|
|US9240188||23 Jan 2009||19 Jan 2016||Lena Foundation||System and method for expressive language, developmental disorder, and emotion assessment|
|US9355651||29 Apr 2014||31 May 2016||Lena Foundation||System and method for expressive language, developmental disorder, and emotion assessment|
|US9700207||13 Feb 2014||11 Jul 2017||At&T Intellectual Property I, L.P.||Methods, systems, and products for measuring health|
|US9734542||30 Oct 2013||15 Aug 2017||At&T Intellectual Property I, L.P.||Methods, systems, and products for measuring health|
|US9799348||15 Jan 2016||24 Oct 2017||Lena Foundation||Systems and methods for an automatic language characteristic recognition system|
|US20040116782 *||17 Dec 2002||17 Jun 2004||International Business Machines Corporation||Behavior based life support with abstract behavior patterns|
|US20090208913 *||27 Feb 2009||20 Aug 2009||Infoture, Inc.||System and method for expressive language, developmental disorder, and emotion assessment|
|US20100088084 *||3 Oct 2008||8 Apr 2010||International Business Machines Corporation||Method for translating a non-verbal communication within a virtual world environment|
|US20110021178 *||24 Jul 2009||27 Jan 2011||Avaya Inc.||Classification of voice messages based on analysis of the content of the message and user-provisioned tagging rules|
|US20110313774 *||17 Jun 2010||22 Dec 2011||Lusheng Ji||Methods, Systems, and Products for Measuring Health|
|US20120053929 *||29 Dec 2010||1 Mar 2012||Industrial Technology Research Institute||Method and mobile device for awareness of language ability|
|U.S. Classification||704/275, 704/270, 704/277|
|International Classification||A61B5/00, A61B5/11, G10L21/00, A61B5/16|
|Cooperative Classification||A61B5/7232, A61B5/16, A61B5/1123, A61B5/1128, A61B5/411, G06K9/00973, A61B5/0002, G06K9/00335|
|European Classification||A61B5/11T, A61B5/41B, G06K9/00Y, G06K9/00G|
|29 Dec 2000||AS||Assignment|
Owner name: IBM CORPORATION, NEW YORK
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZADROZNY, WLODEK W.;KANEVSKY, DIMITRI;REEL/FRAME:011426/0801;SIGNING DATES FROM 20001222 TO 20001226
|6 Mar 2009||AS||Assignment|
Owner name: NUANCE COMMUNICATIONS, INC., MASSACHUSETTS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTERNATIONAL BUSINESS MACHINES CORPORATION;REEL/FRAME:022354/0566
Effective date: 20081231
|28 May 2010||FPAY||Fee payment|
Year of fee payment: 4
|30 Apr 2014||FPAY||Fee payment|
Year of fee payment: 8