WO2012068422A1 - Computer user interface - Google Patents

Computer user interface

Info

Publication number: WO2012068422A1
Authority: WIPO (PCT)
Prior art keywords: user, food, user interface, tool, functional
Application number: PCT/US2011/061286
Other languages: French (fr)
Inventor: John H. Hnatio
Original Assignee: Projectioneering, LLC
Application filed by: Projectioneering, LLC
Publication of: WO2012068422A1 (en)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas


Abstract

Systems, methods and computer readable media for manipulating data provided by a user via a graphical user interface. Data provided by a user via the graphical user interface can be automatically transcribed to or otherwise caused to populate in real time data fields associated with tool applications available to the user. The user can access multiple applications via the graphical user interface and can enter and view data previously entered for each application or application subset.

Description

[0004] Embodiments include a computerized system having a graphical user interface that comprises: four functional areas, a first functional area being reserved for display of a plurality of tool icons for a number of different software applications, the icons being manipulable via the user input device to activate the respective different software applications, the second functional area being reserved for user navigation and data input by the user for an activated tool application currently in use, the third functional area being reserved for displaying, in order of selection, screen shot captures for all screens viewed by the user from a start point up to a point at which the third functional area is activated, and the fourth functional area being reserved for depicting to the user whether issues associated with a type of event being analyzed have been addressed; a data file area reserved for displaying icons associated with data files; a feed area to receive dragged and dropped icons from the data file area, wherein the system is operative such that the data file associated with the data icon dragged and dropped to the feed area is scanned for key words and semantic context to pre-populate data fields for any tool applications associated with the tool icons in the first functional area; and a hidden screen landmark area located generally at a center portion of the graphical user interface to seamlessly transition among the four functions associated with the functional areas.
[0005] Embodiments also include a computerized method, comprising: electronically generating a user interface, the electronically generated user interface having a plurality of functional zones, a file zone to show icons associated with files to be automatically analyzed, a feed zone to receive icons fed from the file zone, and an invisible zone operative to allow the user to transition among the functional zones; and electronically providing the user interface for manipulation by a user to access the functional zones, the file zone, the feed zone, and the invisible zone.
[0006] In embodiments, a computerized control system for manipulating data provided by a user via a graphical user interface comprises: a processor having an information processing unit and a non-transitory computer readable medium; a user input device; a non-volatile storage unit; and a graphical user interface. The computer readable medium of the processor stores instructions that, when executed by the processor, cause the processor to perform operations including: allowing the user to provide data via the graphical user interface using the user input device; automatically transcribing in real time data provided by the user to data fields associated with tool applications of the system; and storing all data provided by the user in the non-volatile storage unit.

BRIEF DESCRIPTION OF THE DRAWINGS
[0007] Embodiments will hereinafter be described in detail with reference to the accompanying drawings, wherein like reference numerals represent like elements. The accompanying drawings have not necessarily been drawn to scale. Any values or dimensions illustrated in the accompanying graphs and figures are for illustration purposes only and may not represent actual or preferred values or dimensions. Where applicable, some features may not be illustrated to assist in the description of underlying features.
[0008] FIG. 1 is a block diagram of a system according to embodiments of the invention.
[0009] FIG. 2 shows an exemplary embodiment of a user interface.
[0010] FIGS. 3A-3D show working examples of another exemplary embodiment of a user interface.
[0011] FIGS. 4A and 4B show another working example of the exemplary embodiment of the user interface in FIGS. 3A-3D.
[0012] FIGS. 5A and 5B show another working example of the exemplary embodiment of the user interface in FIGS. 3A-3D.
[0013] FIGS. 6A-6G show portions of the working example from FIGS. 5A and 5B.
[0014] FIGS. 7A and 7B show yet another working example of the exemplary embodiment of the user interface in FIGS. 3A-3D.
[0015] FIG. 8 is a flow chart for a method according to embodiments.
DETAILED DESCRIPTION
[0016] The description set forth below in connection with the appended drawings is intended as a description of various embodiments of the invention and is not intended to represent the only embodiments in which the invention may be practiced. The description includes specific details for the purpose of providing an understanding of the invention. However, it will be apparent to those skilled in the art that the disclosed subject matter may be practiced without these specific details. In some instances, well-known structures and components may be shown in block diagram form in order to avoid obscuring the concepts of the invention.
[0017] While embodiments may be described in connection with various specific application examples, it will be appreciated that the methods, systems and computer readable media disclosed herein are applicable to many types of subjects, areas of interest, facilities, organizations, processes, scenarios, and the like. For example, user interfaces and systems, methods, and computer program products or computer readable media thereof are not limited to food protection or safety, and can be applied to school-related issues; building-related issues, such as maintenance and safety; biotechnology production-related issues; transportation-related issues; military-related issues (e.g., facilities); and issues relating to other sensitive facilities where security may be a concern, such as hospitals, airports, businesses, financial institutions and the like.
[0018] In embodiments, the user interfaces and methods, systems and computer readable media or computer program products can be used or implemented with the systems, methods, etc. of any of the following U.S. patent applications: U.S. Patent Application No. 11/808,580; U.S. Patent Application No. 12/948,597 (and corresponding PCT Application No. PCT/US11/61127); and U.S. Patent Application No. 12/948,588 (and corresponding PCT Application No. PCT/US11/61129). The entire content of each of the foregoing applications is hereby incorporated by reference into the present application, and copies of the disclosures for PCT Application No. PCT/US11/61127 and PCT Application No. PCT/US11/61129 are attached hereto as Appendices I and II, respectively.
[0019] Generally speaking, embodiments of user interfaces and systems, methods, and computer program products or computer readable media thereof according to the present invention can provide a driving metaphor such that a user feels he or she is steering or navigating through applications and/or data with a guidance system (e.g., similar to a GPS device in a vehicle). Thus, in using interfaces according to embodiments of the invention, a user may be able to navigate information and data available such that data or information is brought to him or her without necessarily having to look or search for such information. The interface can also provide a user experience that is more suggestive of a video game experience as opposed to the look and feel of conventional application software user interfaces (e.g., Windows, MacOS, etc.). The interface can also provide a way for a user to approach a data processing task from any desired starting point, without requiring or forcing the user into a linear sequence of actions, for example.
[0020] As shown in FIG. 1, an embodiment includes a computer or computerized system 202 having a processor 204 and a computer readable medium 206. The computer readable medium 206 has stored thereon software instructions that, when executed, cause the processor 204 to generate a user interface (200 of FIG. 2 or 300 of FIGS. 3A-7B, for example), transmit the user interface to a display 208, and respond to user interaction with the interface via input or input/output device 210. The input device 210 may comprise one or more of a mouse, a joystick, a touchscreen, and a keyboard. Although not explicitly shown, a non-volatile memory may be provided within the processor 204 or external to the processor 204, either making up a portion of the computer readable medium 206 or separate from the computer readable medium 206. The non-volatile memory may be used to store data provided by a user or otherwise. The non-volatile memory also may serve as a data repository from which data can be retrieved automatically or in response to a user input.
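As but one non-limiting illustration of this arrangement, the following Python sketch models the relationship among the processor 204, the medium 206, the display 208, the input device 210, and the non-volatile memory; all class, method, and field names in the sketch are illustrative assumptions rather than elements of the disclosure.

    from dataclasses import dataclass, field

    @dataclass
    class NonVolatileStore:
        """Stands in for the non-volatile memory that retains user data."""
        records: dict = field(default_factory=dict)

        def save(self, key, value):
            self.records[key] = value

        def load(self, key):
            return self.records.get(key)

    @dataclass
    class System:
        """Models processor 204 executing instructions from medium 206."""
        store: NonVolatileStore = field(default_factory=NonVolatileStore)

        def generate_interface(self):
            # Build the user interface model (200 of FIG. 2).
            return {"sections": ["chalk board", "comic book", "toy box", "time out"]}

        def render(self, interface):
            # Stand-in for transmitting the interface to display 208.
            print("displaying sections:", interface["sections"])

        def handle_input(self, event):
            # Respond to user interaction arriving from input device 210.
            self.store.save("last_event", event)

    system = System()
    system.render(system.generate_interface())
    system.handle_input(("drag", "Food Safety TQ"))
    print(system.store.load("last_event"))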
[0021] User interfaces according to embodiments can include a plurality of sections or zones. Optionally, four or more sections or zones may be implemented. Sections or zones can include two or more functional zones or areas, a file zone or area to show icons associated with files to be automatically analyzed, for instance, a feed zone or area to receive icons fed from the file zone/area, for example, and an invisible zone or area operative to allow the user to transition among the functional zones.
[0022] FIG. 2 shows an exemplary user interface 200 in accordance with the present disclosure, which includes the following functional sections: a data entry section (e.g., chalk board) 102; a comic book section (e.g., graphical data displays or screen captures) 104; an application icon section (e.g., toy or tool box) 106; and a warnings, errors and/or alerts section (e.g., time out) 108. Interface 200 also can include a document input section (e.g., dumpster or feeder 110) that can be coupled to an automated data processing system for inputting documents (e.g., optical character recognition, or other textual or graphical recognition system) and scanning for keywords and/or semantic context in order to pre-populate the applications in the applications section with input data.
[0023] As indicated above, the user interface 200 includes functional sections in four quadrants: chalk board 102, comic book 104, toy box 106, and time out 108. Interface 200 also includes a feeder dumpster 110 and a sweet spot or hotspot 112, which may not be visible to the user; i.e., the sweet spot or hotspot 112 may be invisible to or hidden from the user. Optionally, the sweet spot or hotspot 112 may become temporarily visible or unhidden when the user drags an application over or to the sweet spot or hotspot 112 (e.g., rolls over the spot). Optionally, the sweet spot or hotspot 112 may be located at or around the center of the user interface 200 or a user screen upon which the interface 200 is displayed. Note that while the example in FIG. 2 shows four quadrants, it will be appreciated that a fewer or greater number of screen functional sections can be used depending on a contemplated embodiment.
[0024] In operation, a computer (or processor) 204, shown in FIG. 1, can execute software instructions retrieved from a computer readable medium 206 to generate and respond to the user interface 200. The computer can receive user inputs from one or more I/O devices 210, for example, by a traditional input device such as a joystick, mouse, keyboard, trackball or the like. In addition to traditional input devices, the computer can receive user input from I/O devices 210 such as a touch screen or a wireless pointing device (e.g., an input device similar to the game controller for the Wii video game console). Also, the computer can receive user input from one or more systems designed to sense and/or track human movements and/or voice (e.g., input devices similar to the Kinect for the Xbox 360 video game console) that can include a motion sensor adapted to sense the motion of a human body, a skeletal tracking system, a facial recognition system and/or a voice recognition system.
[0025] The user interface 200 can provide for seamless switching between applications by using a click, drag and drop technique, for instance. For example, to activate an application, the user can select and drag the desired application icon 107 (e.g., Food Safety TQ shown in FIG. 2) from the toy box 106 to the sweet spot 112. The user may drop or release the application icon 107 over the sweet spot 112, or, alternatively, the user may drag the application icon 107 through the sweet spot 112 to one of the three other quadrants 102, 104, 108. As indicated above, the sweet spot 112 can be a hidden or visible landmark and located anywhere on the screen, for example, in the center of the screen. The user can learn the location of the sweet spot 112 through initial setup, tutorial and/or demonstration of the system, or through repeated use of the user interface 200.
[0026] Once an application icon 107 has been dragged and released over the sweet spot 112 (or dragged to the sweet spot and held there), the user can then position the onscreen pointer (or cursor) over the chalk board 102, comic book 104, or time out 108 quadrants, or alternatively drag the application icon 107 to one of the aforementioned three quadrants. As the pointer/cursor or icon 107 is positioned over one of these three remaining quadrants, a full screen view of that quadrant can be displayed. For example, if the user positions the cursor or pointer (or icon 107) over the chalk board 102 quadrant, which can represent the metaphor of writing on a chalk board, the user can see data that has been input for the active application, for example, and can delete existing data, modify existing data, and/or enter new data.
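As but one non-limiting sketch of this sweet spot mechanic, the following Python fragment performs a hit test against a hidden central landmark and then maps the pointer position to a quadrant; the screen geometry, the radius, the quadrant assignments, and the function names are all illustrative assumptions rather than elements of the disclosure.

    # Minimal sketch: hidden sweet spot hit-testing and quadrant switching.
    SCREEN_W, SCREEN_H = 1280, 800
    SWEET_SPOT = (SCREEN_W / 2, SCREEN_H / 2)   # hidden landmark at screen center
    SWEET_RADIUS = 40

    def over_sweet_spot(x, y):
        """True when the pointer is within the (invisible) sweet spot 112."""
        cx, cy = SWEET_SPOT
        return (x - cx) ** 2 + (y - cy) ** 2 <= SWEET_RADIUS ** 2

    def quadrant_at(x, y):
        """Map a pointer position to one of the four functional quadrants."""
        if over_sweet_spot(x, y):
            return "sweet spot"
        left, top = x < SCREEN_W / 2, y < SCREEN_H / 2
        return {
            (True, True): "chalk board",     # 102: data entry
            (False, True): "comic book",     # 104: screen captures
            (True, False): "toy box",        # 106: application icons
            (False, False): "time out",      # 108: warnings and alerts
        }[(left, top)]

    def on_drag(icon, x, y, active):
        """Activate an app over the sweet spot, then expand the hovered quadrant."""
        if over_sweet_spot(x, y):
            active["app"] = icon                      # icon released or held over sweet spot
        elif active.get("app"):
            active["fullscreen"] = quadrant_at(x, y)  # full screen view of that quadrant
        return active

    state = {}
    state = on_drag("Food Safety TQ", 640, 400, state)   # drop on sweet spot
    state = on_drag("Food Safety TQ", 200, 100, state)   # drag into chalk board
    print(state)  # {'app': 'Food Safety TQ', 'fullscreen': 'chalk board'}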
[0027] If the user positions the cursor or pointer (or icon 107) over the comic book 104 quadrant, which can represent the metaphor of graphical images, the user can see, for example, a display of screen shots and/or graphical data visualizations that have been generated and captured thus far in the course of using the active application.
[0028] If the user positions the cursor or pointer over the time out 108 quadrant, which can represent the metaphor of a time out (as used to correct children, or in sports activities, when some reflection is needed on a current situation), the time out 108 quadrant can indicate a warning of incomplete data or other errors. The user interface 200 can make the items in the time out 108 quadrant active links that can be selected in order to take the user to the portion of the application where further attention or action may be needed.
[0029] Generally speaking, the toy box 106 section can include icons for some or all applicable applications available to the user at that time. The time out 108 section can display issues ranging in severity from warning to serious or critical, for instance. The severity of the issue can be displayed via color (e.g., yellow, orange, red), size of font, typeface of font, position within quadrant, or the like. Furthermore, a checklist of actions that need to be and have been taken can also be displayed.
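One non-limiting way to realize such severity cues is a lookup from severity level to display styling, as in the Python sketch below; the three-level scale and the particular styles shown are assumptions for illustration only.

    # Illustrative mapping of time out 108 issue severity to display styling.
    SEVERITY_STYLE = {
        "warning":  {"color": "yellow", "font_size": 12},
        "serious":  {"color": "orange", "font_size": 14},
        "critical": {"color": "red",    "font_size": 16, "typeface": "bold"},
    }

    def style_issue(issue_text, severity):
        """Attach the display styling for this severity to an issue item."""
        style = SEVERITY_STYLE[severity]
        return {"text": issue_text, **style}

    print(style_issue("Supplier audit incomplete", "critical"))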
[0030] The feeder or dumpster 110 can be a user interface element that performs automated data processing on files or documents dragged and dropped onto the feeder or dumpster 110. The automated processing can include optical character recognition, or another textual or graphical recognition system, adapted to scan for keywords and/or semantic context in order to pre-populate data fields associated with one or more of the applications in the toy box 106 with input data.
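As but one non-limiting sketch of this pre-population step (setting the character recognition itself aside), the Python fragment below scans extracted document text for keywords and writes matches into tool application data fields; the keyword table, application names, and field names are illustrative assumptions.

    # Sketch of the feeder/dumpster 110 pipeline: a dropped document's text is
    # scanned for keywords, and matches pre-populate tool application fields.
    KEYWORD_FIELDS = {
        "salmonella": ("Food Safety TQ", "agent"),
        "recall":     ("FREE", "event_type"),
        "supplier":   ("Food Defense TQ", "supplier_name"),
    }

    def scan_document(text, tool_fields):
        """Pre-populate data fields of toy box 106 applications from keywords."""
        for word in text.lower().split():
            word = word.strip(".,;:")
            if word in KEYWORD_FIELDS:
                app, fieldname = KEYWORD_FIELDS[word]
                tool_fields.setdefault(app, {})[fieldname] = word
        return tool_fields

    fields = scan_document("Supplier shipment tested positive for Salmonella.", {})
    print(fields)  # pre-populated fields for Food Defense TQ and Food Safety TQ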
[0031] The user interface 200 can also include an ability to retain a record of previously entered data in order to automatically transcribe data entered in one application into like fields of other applications in the toy box 106, thus reducing or eliminating repetitive data entry by the user.
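A minimal sketch of such cross-application transcription follows; the application and field names are hypothetical, and matching here is by identical field name only.

    # Sketch of automatic transcription: values entered in one application are
    # copied into like-named fields of the other toy box applications, so the
    # user need not re-type shared data.
    def transcribe(entered, source_app, all_apps):
        """Copy each entered field into every other app that has the same field."""
        for fieldname, value in entered.items():
            for app, fields in all_apps.items():
                if app != source_app and fieldname in fields:
                    fields[fieldname] = value
        all_apps[source_app].update(entered)
        return all_apps

    apps = {
        "Food Safety TQ":  {"facility": None, "product_type": None},
        "Food Defense TQ": {"facility": None, "shift_count": None},
    }
    apps = transcribe({"facility": "Plant 7"}, "Food Safety TQ", apps)
    print(apps["Food Defense TQ"]["facility"])  # Plant 7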
[0032] The user interface 200 can be configured to show all sections at once along with sweet spot 112. Alternatively, the user interface 200 can be configured to show one or more selected applications.

[0033] As applications are dragged and dropped over the sweet spot 112, relevant data can be automatically retrieved from one or more databases associated with the active application. For example, applications in a risk management system (as described below in Appendix I) such as assessment, simulation, projection, prevention and response/mitigation could, when activated, retrieve the relevant data from a metadata database (as described below in Appendix II). The user interface 200 can be adapted to facilitate the retrieval of relevant data when applications are activated.
[0034] The user interface 200 can be adapted for display on a desktop or laptop computer display 208. Also, the user interface 200 can be adapted for display on a mobile or wireless device such as a Blackberry, iPhone, smartphone, cell phone, feature phone, netbook and/or the like.
[0035] FIGS. 3A-3D show working examples of another exemplary embodiment of a user interface 300.
[0036] Interface 300 can include a plurality of functional areas. Interface 300 in particular includes four functional areas: a first functional area 306, a second functional area 302, a third functional area 304, and a fourth functional area 308. The first functional area 306 can be reserved for display of a plurality of tool icons 307 for a number of different software applications. The icons 307 can be manipulated by direct manipulation, for instance, based on inputs from the user input device 210 as described herein to activate the respective different software applications. The second functional area 302 may be reserved for user navigation and data input by the user for an activated tool application currently in use. The third functional area 304 may be reserved for displaying, in order of selection, for instance, screen shot captures for all screens viewed by the user from a start point up to a point at which the third functional area 304 is activated. The fourth functional area 308 may be reserved for depicting to the user whether issues associated with a type of event being analyzed have been addressed. Interface 300 also may include a data file area 314 reserved for displaying icons 315 associated with data files. A feed area 310 also may be provided to receive icons 315 dragged and dropped from the data file area. System 202 can be operative such that the data file associated with the data icon 315 dragged and dropped to the feed area is scanned for key words and semantic context to pre-populate data fields for any tool applications associated with the tool icons in the first functional area. Optionally, the data from the data file may be tagged. FIGS. 3A through 3D show multiple data icons 315 being moved or dragged to feeder 310 from data file area 314. Interface 300 also may include a hidden or invisible landmark area 312 (shown in the figures for purposes of illustration). FIGS. 3A through 3D show the hidden area 312 located in the center of the four functional areas, but the location of this area 312 is not so limited, and it could optionally be in the middle of the entire interface 300 display.
[0037] Note that functional area 306 is called a tool box in FIGS. 3A through 3D, and this area includes a plurality of icons 307 associated with tool applications. As but one non-limiting example, the icons 307 may represent the following applications or products: Poison TQ, Food Mapper TQ, Food Defense TQ, Food Safety TQ, Food Event Analysis and Simulation Tool (FEAST), and Food Response Emergency Evaluation Tool (FREE).
[0038] For Poison TQ, system 202 can allow a user to search a meta database located in the cloud for an adulterant agent and/or symptoms exhibited by a person affected by an agent. For Food Mapper TQ, system 202 can provide for identification of the words 'shall' and 'should' in legal documents concerning legal requirements for a particular food product type or method of food production from thirteen different federal agencies. For Food Defense TQ, system 202 can provide fillable self-evaluation forms to be completed in order to calculate a food defense risk and a threat quotient. For Food Safety TQ, the system 202 can provide fillable self-evaluation forms to be completed in order to calculate a food safety risk and the threat quotient. For the Food Event Analysis and Simulation Tool, the system 202 can run vulnerability scenarios to determine viability of an event occurring for a particular location and/or product type and to analyze critical nodes to (1) deter human action; (2) detect an event; (3) prevent an event; (4) assist in response; and/or (5) mitigate consequences to determine an acceptable or best outcome. For the Food Response Emergency Evaluation Tool, the system 202 can provide responses to specific emergencies for the particular location and/or product type, including directing personnel to perform certain actions in response to an emergency, contacting an appropriate emergency responder, and/or providing an electronic record of all actions taken by personnel or emergency responders.
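As but one non-limiting sketch, the five critical nodes of the FEAST analysis could be treated as a scored checklist, as in the Python fragment below; the 0-1 scoring scale and the equal weighting are assumptions for illustration and are not taken from the disclosure.

    # Illustrative scoring of FEAST's five critical-node dimensions for a
    # vulnerability scenario. Weights and the scoring rule are assumptions.
    NODES = ("deter human action", "detect event", "prevent event",
             "assist in response", "mitigate consequences")

    def scenario_outcome(node_scores):
        """Average 0-1 scores across the five nodes; higher is a better outcome."""
        missing = [n for n in NODES if n not in node_scores]
        if missing:
            raise ValueError(f"unscored nodes: {missing}")
        return sum(node_scores[n] for n in NODES) / len(NODES)

    scores = {"deter human action": 0.8, "detect event": 0.5, "prevent event": 0.6,
              "assist in response": 0.9, "mitigate consequences": 0.7}
    print(f"scenario outcome: {scenario_outcome(scores):.2f}")  # 0.70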
[0039] As examples only, the Poison TQ can be used for identifying accidental and intentional attacks on a product or person and for assisting in identifying an agent and the ways said agent was used in the past to contaminate a product; the Food Mapper TQ can be used to identify all federal and/or state requirements concerning a plant's production, for instance, and can be used by management and/or auditing bodies to determine compliance with federal and/or state requirements; the Food Defense TQ can identify low-performing suppliers to focus limited resources on potential problem facilities and/or processes; the Food Safety TQ also can be used to identify low-performing suppliers to focus limited resources on potential problem facilities and/or processes; the Food Event Analysis and Simulation Tool (FEAST) can be used by a plant, for example, to determine its vulnerabilities and how to prevent and/or respond so as to reduce the impact of those vulnerabilities, as well as by a consultant and/or auditing agent to run a realistic event at a plant to determine the effectiveness of a food defense/safety plan, or by a government agency to run realistic scenarios to practice response(s) to an issue; and the Food Response Emergency Evaluation Tool (FREE) can be used during emergencies in order to assist with decisions to aid in controlling the emergency.
[0040] FIGS. 4A and 4B show another working example of the exemplary embodiment of the user interface in FIGS. 3A-3D, whereby a user has caused an application icon 307 to be moved into or otherwise appear in the chalk board section 302 by way of the hidden area 312 (as described above with respect to interface 200). FIG. 4B is an example of a data box that can appear in the user interface as a result of moving a tool application icon 307 into the chalk board area 302. Generally speaking, area 302 provides for user navigation and data input by the user for an activated tool application currently in use. FIG. 4A, for instance, shows the application currently in use as a poison-related application, such as Poison TQ. FIG. 4B shows a plurality of text boxes 303. A user may review the data in the boxes and accept or modify the data. Optionally, the user may simply respond to questions asked by the text boxes, for instance by selecting or checking boxes or circles or actually typing in text.
[0041] FIGS. 5A and 5B show another working example of the exemplary embodiment of the user interface in FIGS. 3A-3D, whereby a user has caused an application icon 307 to be moved into or otherwise appear in the comic book section 304 by way of the hidden area 312 (as described above with respect to interface 200). FIG. 5B shows, in one section, all of the images, such as screen shot captures 305, of particular screens or pages visited, viewed, or manipulated by the user from a start point to an end point, such as when the comic book section 304 is activated by the user. Optionally, the images or captures 305 may be displayed in order of occurrence, so a reviewing party may analyze steps or actions taken by a particular user in a particular situation or in response to a certain event, for instance. A time period spent on each page also may be displayed. Alternatively, the images or captures may be arranged or grouped in some other fashion, such as by a certain threat, condition or event. FIGS. 6A-6G show individual screen captures 305 according to an order of occurrence (i.e., the order in which the user visited or accessed the data).
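One non-limiting way to support such a replay is to record each visited screen with a timestamp, as in the Python sketch below; the class and method names are hypothetical, and a real implementation would store image captures rather than screen names.

    # Sketch of the comic book 304 history: each screen the user visits is
    # recorded with a timestamp so captures can be replayed in order of
    # occurrence, including the time spent per page.
    import time

    class CaptureHistory:
        def __init__(self):
            self.captures = []          # (screen_name, entered_at) in visit order

        def visit(self, screen_name):
            self.captures.append((screen_name, time.monotonic()))

        def replay(self):
            """Yield captures in order of occurrence with time spent on each."""
            for i, (name, entered) in enumerate(self.captures):
                left = (self.captures[i + 1][1] if i + 1 < len(self.captures)
                        else time.monotonic())
                yield name, left - entered

    history = CaptureHistory()
    for screen in ("Poison TQ: agent search", "Poison TQ: symptoms", "results"):
        history.visit(screen)
    for name, seconds in history.replay():
        print(f"{name}: {seconds:.3f}s on page")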
[0042] FIGS. 7A and 7B show yet another working example of the exemplary embodiment of the user interface in FIGS. 3A-3D, whereby a user has caused an application icon 307 to be moved into or otherwise appear in the time out section 308 by way of the hidden area 312 (as described above with respect to interface 200). Generally speaking, area 308 provides for depicting whether the critical issues, for instance, associated with the type of event being analyzed have been addressed. This may be done by flagging critical data fields by generic category of events in advance, for instance. FIG. 7B, for instance, shows the application currently in use as a poison-related application, such as Poison TQ, and a check has been added to the box entitled "threat" to indicate that one, some, or all issues relating to a particular threat have been addressed.
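A minimal sketch of this flag-in-advance approach follows; the event categories and field names are illustrative assumptions.

    # Sketch of time out 308 completeness checking: critical data fields are
    # flagged in advance per generic event category, and an issue is reported
    # addressed only when every flagged field has a value.
    CRITICAL_FIELDS = {
        "poison event": ("threat", "agent", "affected_product"),
        "recall event": ("lot_numbers", "distribution_scope"),
    }

    def unaddressed(event_category, data):
        """Return the flagged fields that still lack data for this event type."""
        return [f for f in CRITICAL_FIELDS[event_category] if not data.get(f)]

    entered = {"threat": "confirmed", "agent": "unknown compound"}
    print(unaddressed("poison event", entered))  # ['affected_product']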
[0043] FIG. 8 is a flow chart for a method 800 according to embodiments.
[0044] Method 800 can include generating a user interface according to embodiments as described and shown herein (804), and electronically providing the generated user interface to a user (806). Generally speaking, a processor can cause the user interface to be displayed on a screen, for instance, and the user interface is provided to the user in such a manner that the user can manipulate the interface to access zones or areas of the interface, for example by using one or more input devices as described herein.
[0045] The user may provide data to the system via the graphical user interface using one or more input devices as described herein (808). The provided data can be automatically manipulated by the system as described herein, for instance by automatically transcribing, in real time, data provided by the user into data fields associated with tool applications of the system, tagging the data, and/or performing semantic or keyword processing and analysis (810). Some or all of the data provided may be stored in a non-volatile storage memory as described earlier, for instance (812). The data previously provided by a user and stored may be output in response to a user request or input from the input device with respect to the user interface (814). For instance, the data can be displayed in response to activation of the various functional sections or tool applications as described herein.
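A minimal sketch of steps 808-814 follows, with a naive keyword pass standing in for the semantic/keyword processing and SQLite standing in for the non-volatile storage unit; the keyword table, schema, and database name are assumptions made for this sketch only.

```python
import re
import sqlite3

# Hypothetical keyword table mapping tool-application data fields to cues.
FIELD_KEYWORDS = {"agent": ["poison", "toxin"], "facility": ["plant", "warehouse"]}

def tag_and_transcribe(text: str) -> dict:
    """Naive keyword pass standing in for semantic/keyword processing (810)."""
    fields = {}
    for field_name, keywords in FIELD_KEYWORDS.items():
        for kw in keywords:
            match = re.search(rf"{kw}\W+(\w+)", text, re.IGNORECASE)
            if match:
                fields[field_name] = match.group(1)
    return fields

def store(db: sqlite3.Connection, fields: dict) -> None:
    """Persist transcribed fields (812) so later requests can output them (814)."""
    db.execute("CREATE TABLE IF NOT EXISTS fields (name TEXT, value TEXT)")
    db.executemany("INSERT INTO fields VALUES (?, ?)", fields.items())
    db.commit()

# Example: user input (808) is transcribed and stored in one pass.
db = sqlite3.connect(":memory:")
store(db, tag_and_transcribe("The toxin cyanide was found at the plant kitchen"))
```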
[0046] The method 800 can also include user operation of interfaces as described herein and as shown in the drawings.
[0047] It will be appreciated that the modules, processes, systems, and sections described above can be implemented in hardware, hardware programmed by software, software instructions stored on a nontransitory computer readable medium, or a combination of the above. A user interface system can be implemented, for example, using a processor configured to execute a sequence of programmed instructions stored on a nontransitory computer readable medium. For example, the processor can include, but is not limited to, a personal computer or workstation or other such computing system that includes a processor, microprocessor, or microcontroller device, or that is comprised of control logic including integrated circuits such as, for example, an Application Specific Integrated Circuit (ASIC). The instructions can be compiled from source code instructions provided in accordance with a programming language such as Java, C++, C#.net or the like. The instructions can also comprise code and data objects provided in accordance with, for example, the Visual Basic™ language, or another structured or object-oriented programming language. The sequence of programmed instructions and data associated therewith can be stored in a nontransitory computer-readable medium such as a computer memory or storage device, which may be any suitable memory apparatus, such as, but not limited to, ROM, PROM, EEPROM, RAM, flash memory, a disk drive, and the like.
[0048] Furthermore, the modules, processes, systems, and sections can be implemented as a single processor or as a distributed processor. Further, it should be appreciated that the steps mentioned above may be performed on a single or distributed processor (single and/or multi-core). Also, the processes, modules, and sub-modules described in the various figures of and for embodiments above may be distributed across multiple computers or systems or may be co-located in a single processor or system. Exemplary structural embodiment alternatives suitable for implementing the modules, sections, systems, means, or processes described herein are provided below.
[0049] The modules, processors or systems described above can be implemented as a programmed general purpose computer, an electronic device programmed with microcode, a hard-wired analog logic circuit, software stored on a computer-readable medium or signal, an optical computing device, a networked system of electronic and/or optical devices, a special purpose computing device, an integrated circuit device, a semiconductor chip, and a software module or object stored on a computer-readable medium or signal, for example.
[0050] Embodiments of the method and system (or their sub-components or modules), may be implemented on a general-purpose computer, a special-purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element, an ASIC or other integrated circuit, a digital signal processor, a hardwired electronic or logic circuit such as a discrete element circuit, a programmed logic circuit such as a PLD, PLA, FPGA, PAL, or the like. In general, any processor capable of implementing the functions or steps described herein can be used to implement embodiments of the method, system, or a computer program product (software program stored on a nontransitory computer readable medium).
[0051] Furthermore, embodiments of the disclosed method, system, and computer program product may be readily implemented, fully or partially, in software using, for example, object or object-oriented software development environments that provide portable source code that can be used on a variety of computer platforms.
Alternatively, embodiments of the disclosed method, system, and computer program product can be implemented partially or fully in hardware using, for example, standard logic circuits or a VLSI design. Other hardware or software can be used to implement embodiments depending on the speed and/or efficiency requirements of the systems, the particular function, and/or particular software or hardware system, microprocessor, or microcomputer being utilized. Embodiments of the method, system, and computer program product can be implemented in hardware and/or software using any known or later developed systems or structures, devices and/or software by those of ordinary skill in the applicable art from the function description provided herein and with a general basic knowledge of the user interface and/or computer programming arts.
[0052] Moreover, embodiments of the disclosed method, system, and computer program product can be implemented in software executed on a programmed general purpose computer, a special purpose computer, a microprocessor, or the like.
[0053] It is, therefore, apparent that there is provided, in accordance with the various embodiments disclosed herein, computer systems, methods and software for a human-computer user interface.

[0054] While the invention has been described in conjunction with a number of embodiments, it is evident that many alternatives, modifications and variations would be or are apparent to those of ordinary skill in the applicable arts. Accordingly, Applicant intends to embrace all such alternatives, modifications, equivalents and variations that are within the spirit and scope of the invention.

Claims

What is claimed is:
1. A computerized system implementing a human-computer graphical user interface, comprising:
a processor;
a computer readable medium operatively coupled to the processor, the computer readable medium having stored thereon a plurality of executable instructions that, when executed by the processor, implement a graphical user interface;
a display device operatively coupled to the processor to display the graphical user interface; and
a user input device operatively coupled to the processor, the user input device being operative to output signals to the processor based on user input to manipulate the graphical user interface;
wherein the graphical user interface has split-screen capability and includes:
four or more separate functional areas, including a first functional area, a second functional area, a third functional area, and a fourth functional area,
the first functional area being reserved for display of a plurality of tool icons for a number of different software applications operative by direct manipulation based on inputs from the user input device to activate the respective different software applications,
the second functional area being reserved for user navigation and data input by the user for an activated tool application currently in use,
the third functional area being reserved for displaying, in order of selection, screen shot captures for all screens viewed by the user from a start point up to a point at which the third functional area is activated, and
the fourth functional area being reserved for depicting to the user whether issues associated with a type of event being analyzed have been addressed;
a data file area reserved for displaying icons associated with data files;
a feed area to receive dragged and dropped icons from the data file area, wherein the system is operative such that the data file associated with the icon dragged and dropped to the feed area is scanned for key words and semantic context to pre-populate data fields for any tool applications associated with the tool icons in the first functional area; and
a hidden screen landmark area located generally at a center portion of the graphical user interface to seamlessly transition among the four functions associated with the functional areas.
2. The system according to Claim 1, further comprising at least one other user input device.
3. The system according to either Claim 1 or Claim 2, wherein each of the user input device and the at least one other user input device is selected from the group comprised of a mouse, a joystick, a touchscreen, and a keyboard.
4. The system according to any of Claims 1 through 3, wherein the graphical user interface is a food protection interface.
5. The system according to any of Claims 1 through 4, wherein the graphical user interface is one or more of a three-dimensional graphical user interface and a zooming graphical user interface.
6. The system according to any of Claims 1 through 5,
wherein the tool applications include one or more of the following applications: a poison application, a food mapper application, a food defense application, a food safety application, a food event analysis and simulation tool application, and a food response emergency evaluation tool,
wherein the poison application is operative to allow a user to search a meta database located in the cloud for an adulterant agent and/or symptoms exhibited by an affected person to an agent,
wherein the food mapper application provides for identification of the words 'shall' and 'should' in legal documents concerning legal requirements for a particular food product type or method of food production from thirteen different federal agencies,
wherein the food defense application provides fillable self-evaluation forms to be completed in order to calculate a food defense risk and a threat quotient,
wherein the food safety application provides fillable self-evaluation forms to be completed in order to calculate a food safety risk and the threat quotient,
wherein the food event analysis and simulation tool application is operative to run vulnerability scenarios to determine viability of an event occurring for a particular location and/or product type and to analyze critical nodes to (1) deter human action; (2) detect event; (3) prevent event; (4) assist in response; and/or (5) mitigate consequences to determine an acceptable or best outcome, and
wherein the food response emergency evaluation tool is operative to provide responses to specific emergencies for the particular location and/or product type, including directing personnel to perform certain actions in response to an emergency, contacting an appropriate emergency responder, and/or providing an electronic record of all actions taken by personnel or emergency responders.
7. The system according to any of Claims 1 through 6, wherein the fourth functional area of the graphical user interface shows a flag or indicator for each data field to depict that the issue associated with the type of event being analyzed has been addressed.
8. The system according to any of Claims 1 through 7, wherein the system is operative to provide for user selection from the first functional area a tool application icon to activate the corresponding tool application by dragging it to the hidden screen landmark area, then dragging the selected tool application icon to one of the second, third, or fourth functional areas, thereby displaying a full screen view of the selected functional area.
9. The system according to any of Claims 1 through 8, wherein the system is operative to store all data previously entered and to automatically transcribe said data into like data fields across multiple ones of the tool applications.
10. A computerized method, comprising:
electronically generating a user interface, the electronically generated user interface having a plurality of functional zones, a file zone to show icons associated with files to be automatically analyzed, a feed zone to receive icons fed from the file zone, and an invisible zone operative to allow the user to transition among the functional zones; and
electronically providing the user interface for manipulation by a user to access the functional zones, the file zone, the feed zone, and the invisible zone.
11. The method according to Claim 10, wherein said electronically generating is performed by a processor, and said electronically providing is performed by a display.
12. The method according to Claim 11, wherein the display is one of a computer screen or a screen of a mobile device.
13. The method according to any of Claims 10 through 12, wherein each file fed to the feed zone is automatically analyzed in the sense that it is tagged.
14. The method according to any of Claims 10 through 13, wherein each file fed to the feed zone is automatically analyzed in order to populate one or more applications associated with one of the functional zones of the plurality.
15. The method according to any of Claims 10 through 14, wherein the user interface is one or more of a graphical interface, a command-line interface, a three- dimensional interface, and a zooming user interface.
16. The method according to any of Claims 10 through 15, wherein, when the user moves an application icon over the invisible zone, the invisible zone becomes visible temporarily.
17. The method according to any of Claims 10 through 16, wherein, when a signal from a user input device indicates input from the user corresponding to selection of or movement into the invisible zone, the invisible zone becomes temporarily visible to the user.
18. The method according to any of Claims 10 through 17, wherein the plurality of functional zones is four, including a first functional zone to display a plurality of tool icons operative to activate respective tool applications, a second functional zone for user navigation and data input for an activated tool application, a third functional zone to display automatically taken screen shots for all screens from a predetermined point up to a point at which the third functional zone is accessed, and a fourth functional zone to display whether certain issues have been addressed.
19. A computerized control system for manipulating data provided by a user via a graphical user interface, the computerized control system comprising:
a processor having an information processing unit and a non-transitory computer readable medium;
a user input device;
a non-volatile storage unit; and
a graphical user interface,
wherein the computer readable medium of the processor stores instructions that, when executed by the processor, cause the processor to perform operations including: allowing the user to provide data via the graphical user interface using the user input device;
automatically transcribing in real time data provided by the user to data fields associated with tool applications of the system; and
storing all data provided by the user in the non-volatile storage unit.
20. The system according to Claim 19, further comprising at least one other user input device.
21. The system according to either Claim 19 or Claim 20, wherein each of the user input device and the at least one other user input device is selected from the group comprised of a mouse, a joystick, a touchscreen, and a keyboard.
22. The system according to any of Claims 19 through 21, wherein the graphical user interface is a food protection interface as shown and disclosed herein.
23. The system according to any of Claims 19 through 22, wherein the graphical user interface is one or more of a three-dimensional graphical user interface and a zooming graphical user interface.
24. The system according to any of Claims 19 through 23,
wherein the tool applications include one or more of the following applications: a poison application, a food mapper application, a food defense application, a food safety application, a food event analysis and simulation tool application, and a food response emergency evaluation tool.
25. The system according to Claim 24,
wherein the poison application is operative to allow the user to search a meta database located in the cloud for an adulterant agent and/or symptoms exhibited by an affected person to an agent,
wherein the food mapper application provides for identification of the words 'shall' and 'should' in legal documents concerning legal requirements for a particular food product type or method of food production from a plurality of different federal or state agencies,
wherein the food defense application provides fillable self-evaluation forms to be completed in order to calculate a food defense risk and a threat quotient,
wherein the food safety application provides fillable self-evaluation forms to be completed in order to calculate a food safety risk and the threat quotient,
wherein the food event analysis and simulation tool application is operative to run vulnerability scenarios to determine viability of an event occurring for a particular location and/or product type and to analyze critical nodes to (1) deter human action; (2) detect event; (3) prevent event; (4) assist in response; and/or (5) mitigate consequences to determine an acceptable or best outcome, and
wherein the food response emergency evaluation tool is operative to provide responses to specific emergencies for the particular location and/or product type, including directing personnel to perform certain actions in response to an emergency, contacting an appropriate emergency responder, and/or providing an electronic record of all actions taken by personnel or emergency responders.
26. The system according to any of Claims 19 through 25, wherein one of the functional areas of the graphical user interface shows a flag or indicator for each data field to depict successful completion of a task.
27. The system according to any of Claims 19 through 26, wherein the system is operative to provide for user selection from one of the functional areas a tool application icon from a plurality of tool application icons to be used in conjunction with another functional area of the plurality of functional areas.
28. The system according to Claim 27, wherein the system is operative to use the selected tool application in conjunction with the another functional area of the plurality of functional areas based on the user moving the selected tool application icon to a hidden landmark area of the graphical user interface, and then moving the selected tool application icon to the another functional area different from the one functional area.
29. The system according to Claim 28, wherein the moving the selected tool application icon to the another functional area different from the one functional area results in full or half screen display of the selected functional area.
30. The system according to any of Claims 19 through 29, wherein said automatically transcribing in real time data provided by the user to data fields associated with tool applications of the system includes tagging said data.
31. The system according to any of Claims 19 through 30, wherein said automatically transcribing in real time data provided by the user to data fields associated with tool applications of the system includes keyword and/or semantic processing of said data.
32. The system according to any of Claims 19 through 31, wherein, when the user moves any tool application icon over the hidden area, the hidden area becomes visible temporarily.
33. The system according to any of Claims 19 through 32, further comprising outputting stored data previously provided by a user.
PCT/US2011/061286 2010-11-17 2011-11-17 Computer user interface WO2012068422A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US41479310P 2010-11-17 2010-11-17
US61/414,793 2010-11-17

Publications (1)

Publication Number Publication Date
WO2012068422A1 2012-05-24

Family

ID=46084420

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2011/061286 WO2012068422A1 (en) 2010-11-17 2011-11-17 Computer user interface

Country Status (1)

Country Link
WO (1) WO2012068422A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070288865A1 (en) * 2002-02-20 2007-12-13 Hitachi, Ltd. Information processing apparatus for project management and its computer software
US20050147054A1 (en) * 2003-10-23 2005-07-07 Loo Rose P. Navigational bar
US20060010395A1 (en) * 2004-07-09 2006-01-12 Antti Aaltonen Cute user interface
US20070101297A1 (en) * 2005-10-27 2007-05-03 Scott Forstall Multiple dashboards
US20090293043A1 (en) * 2008-05-23 2009-11-26 Microsoft Corporation Development environment integration with version history tools

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102799348A (en) * 2012-06-13 2012-11-28 北京小米科技有限责任公司 Method and device for icon edition on terminal adopting touch screen
CN103077668A (en) * 2013-01-05 2013-05-01 北京农业信息技术研究中心 Virtual interaction display system and method for agricultural products


Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 11842056; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 11842056; Country of ref document: EP; Kind code of ref document: A1)