US20070152987A1 - Control of a unit provided with a processor - Google Patents
- Publication number: US20070152987A1 (application Ser. No. 11/715,347)
- Authority: United States (US)
- Prior art keywords
- command
- processor
- address
- commands
- sensor device
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0317—Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
- G06F3/0321—Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface by optically sensing the absolute position with respect to a regularly patterned surface forming a passive digitiser, e.g. pen optically detecting position indicative tags printed on a paper sheet
Definitions
- the present invention relates to methods for controlling a unit provided with a processor, and to a device, a computer program product and a product kit for the same purpose.
- Intelligent homes are dwellings, where one or more electronic devices can be controlled or monitored from units located outside the house. Examples of devices that can be controlled or monitored remotely are heating devices, for example in holiday cottages, fire alarms, flood alarms and burglar alarms. Other examples of devices that could be controlled remotely or monitored are computers, lighting, irrigation systems, TV, video, music centers, refrigerators, freezers, cookers, microwave ovens or washing machines.
- a basic requirement is that the device can be provided with or can be connected to a processor which can receive, process and pass on information. The processor should then be capable of being connected to some form of communication network, such as the telephone network or the Internet.
- the remote control unit is typically a telephone, mobile telephone or computer terminal, which communicates with the processor via the communication network.
- the remote control unit is a telephone or mobile telephone which communicates directly with the processor by means of predefined key commands or codes.
- the remote control unit is a computer or mobile telephone which communicates with the processor via a computer network, such as the Internet.
- a problem with using a telephone as a remote-control terminal is that the user interface is based on pressing keys on a numeric keypad, which can lead to problems, for example in learning or remembering the codes that are to be used. Communicating by pressing keys on a numeric keypad is thus not particularly user-friendly.
- a mobile telephone usually has limited data-entry facilities on account of its small format. In addition, little or no feedback is given in response to what has been keyed into the mobile telephone.
- the object of the present invention is to solve the above problem completely or partially.
- a method for controlling a unit provided with a processor comprises receiving at least one graphical notation in the form of positions representing a sensor device's movement across a base that is provided with a position-coding pattern, while the graphical notation was made.
- the method further comprises identifying, based on the at least one graphical notation, at least one command for the unit provided with a processor, and receiving an address to the unit provided with a processor.
- the method comprises controlling the unit provided with a processor by sending the at least one command to the address.
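The three steps above (receive recorded positions, identify commands, send them to the address) can be illustrated with a minimal Python sketch. This is not the patent's implementation; the callback names (`recognizer`, `transport`) and data shapes are invented for the example.

```python
def identify_commands(strokes, recognizer):
    """Map each recorded stroke (a list of (x, y) coordinate pairs
    decoded from the position-coding pattern) to a command via a
    recognizer callback (e.g. handwriting or symbol recognition)."""
    return [recognizer(stroke) for stroke in strokes]

def control_unit(strokes, address, recognizer, transport):
    """Identify commands from the strokes and send each one to the
    address of the unit provided with a processor, using the given
    transport callable. Returns the commands that were sent."""
    sent = []
    for cmd in identify_commands(strokes, recognizer):
        transport(address, cmd)
        sent.append(cmd)
    return sent
```

In a real sensor device the transport would be, for example, a Bluetooth or network send; here it is left as a parameter so the control flow stands on its own.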
- a graphical notation may be any writing and/or drawing, which is made on a base, using a sensor device, that records positions based on the position-coding pattern provided on the base.
- the graphical notation may be a single, continuous stroke, or a group of such strokes. Each stroke may be represented as a sequence of coordinate pairs coded by the position-coding pattern on the base.
- the user input may be digitized without any additional operation on the part of the user, such as scanning the base or digitizing it in some other way.
- such graphical notations may be used for drawing up a command structure or hierarchy on a base, such as a paper.
- the command structure may comprise commands for controlling a unit provided with a processor. This provides a rapid, simple and easy-to-understand way for the user to control the unit provided with a processor. In addition, by noting down the commands, the user automatically obtains an easy-to-understand copy of what was entered into the unit.
- An additional advantage of the present invention is that the user is not limited to one base which is specific to a particular type of command.
- a unit provided with a processor can be a device which contains any form of processor, for example a microprocessor.
- Examples of such units are computers, modern household appliances (dishwashers, microwave ovens, cookers/stoves, audio/video players, etc.), industrial machinery and other computer-controlled applications such as central heating installations, air-conditioning installations, telephone systems and monitoring/alarm systems.
- a base can be a device on which information can be noted down, usually a sheet of paper, a drawing board or similar medium which is provided with a position-coding pattern which makes possible electronic recording of what is noted down on the base.
- a command can be words, symbols, sub-addresses in computer networks, program names, command names, file names or storage addresses which represent particular operations, functions, operators, parameters or arguments, and which can be used individually or in combination for controlling a unit provided with a processor.
- An address of the unit provided with a processor can be an address for communication with the unit provided with a processor. It can be a computer network address, such as a standard IP address, but other forms of address are possible. It can also be an address of a unit via which the unit provided with a processor communicates, for example a proxy-server or a unit for short-range communication such as Bluetooth®. Thus it is also possible to utilize the invention when the user is in the vicinity of the unit provided with a processor without having to connect, for example, via a computer network. An address can also be expressed as a short name, which is associated with the address. For example, the word “home” can be an indication of a particular computer network address of a computer which is located in the user's home.
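The short-name mechanism described above can be sketched as a simple lookup table mapping labels such as "home" to concrete addresses. The table entries and addresses below are invented examples, not values from the patent.

```python
# Hypothetical address book; in the patent's terms this would live in
# a memory in the sensor device.
ADDRESS_BOOK = {
    "home": "192.168.1.10",      # computer in the user's home (example)
    "my computer": "10.0.0.5",   # invented example address
}

def resolve_address(label):
    """Return the address associated with a short name. If the label
    is not a known short name, treat it as an address itself (e.g. a
    literally noted IP address)."""
    return ADDRESS_BOOK.get(label.lower(), label)
```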
- Making graphical notations may, but does not necessarily, mean that a mark is left on the base. This may provide the advantage that the base will constitute a copy of what was entered into the unit provided with a processor. With the reuse of previously entered commands it can, however, be an advantage if no mark is left on the base, in order to make possible repeated use.
- the at least one graphical notation may at least partly comprise handwritten characters.
- the at least one graphical notation may be at least partly converted into a character coded format for identifying the at least one command. This provides a very flexible way of entering commands, since there is no need for predefining or associating commands with user-friendly symbols.
- identifying the at least one command may comprise identifying at least one graphical symbol from the at least one graphical notation, the graphical symbol representing the at least one command.
- This variant may be combined with the above described character coded variant, by e.g. providing predefined symbols for frequently used commands or addresses, while less frequently used commands or addresses are to be noted as handwritten characters for conversion to character coded form.
- identifying the at least one command may comprise detecting a command indicator based on the at least one graphical notation. Optionally, identifying the at least one command may comprise identifying a subarea of the position-coding pattern, the subarea being essentially encircled by the command indicator. The subarea may be associated with the command.
- a command indicator may be a graphically noted indication, which is recognized by the sensor device as an instruction to identify a command.
- the instruction may be a symbol and in one embodiment of the invention, the symbol may encircle the command, and thus constitute a frame or any other drawn shape, which wholly or partially encircles the command.
- the frame may have a specific appearance, which is recognized by the sensor device and thus interpreted as the indication that a command is being entered.
- the sensor device may more easily be able to interpret a graphical notation as a command.
- a subarea is an area of the position-coding pattern on the base, which area is delimited by e.g. the command indicator or by some other relation to the command, such as a certain area encircling the graphical notation. For example, a halo-like area surrounding the command may be defined as soon as a command is recognized.
- the subarea may, via the position-coding pattern, be associated with the command, such that a command may be identified based on a recording of any pair of coordinates that falls within the subarea associated with that command.
- the command indicator may have a dual function: defining the subarea and indicating that a command is being entered.
- sending the at least one command to the address may be effected in response to a recording of a pair of coordinates within the subarea. This enables “reuse” of a previously noted and recorded command.
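As a sketch of the subarea mechanism, each recorded command can be associated with a bounding box of pattern coordinates; a later recording of any coordinate pair inside that box identifies the stored command and can re-trigger it. The rectangular shape and all names below are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Subarea:
    """A rectangular subarea of the position-coding pattern,
    associated with a previously recorded command."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    command: str

    def contains(self, x, y):
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def lookup_command(subareas, x, y):
    """Return the command whose subarea contains the recorded
    coordinate pair (x, y), or None if no subarea matches."""
    for area in subareas:
        if area.contains(x, y):
            return area.command
    return None
```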
- the address may be received in different manners.
- receiving the address comprises receiving the address from a memory in the sensor device.
- the address to the unit provided with a processor may be preprogrammed and stored in a memory in the sensor device.
- receiving the address may comprise identifying the address based on the at least one graphical notation.
- the address may be noted graphically on the base, recorded by the sensor device and optionally associated with one or more commands.
- an association between the address and the at least one command may be identified based on the at least one graphical notation.
- the association may indicate the relationship between the commands or the command and the address.
- receiving at least one graphical notation comprises receiving at least three separable graphical notations representing the address, the at least one command and the at least one association between the address and the at least one command.
- the graphical notations representing the address, the command and the association may be clearly distinguishable from each other.
- the at least one association may be identified based on a separable graphical notation connecting the graphical notations representing the at least one command and the address, respectively.
- the at least one association may be identified as a graphical notation having essentially the shape of a line extending between the graphical notations representing the at least one command and the address.
- associations may also connect two commands, or a command and an address. The associations provide a simple and intuitive way of relating commands, addresses, etc. to each other.
- an electronic representation of a command hierarchy which comprises at least two commands and at least one association, each of the at least two commands and the at least one association being graphically represented on the base by nodes and arcs, respectively, and each of the at least two commands being associated with a respective subarea of the position-coding pattern.
- a command hierarchy may be any hierarchy of commands for one or more units provided with a processor.
- a command hierarchy typically has a root, which may be a command or an address to a unit provided with a processor. Further, the command hierarchy may have, but does not necessarily need to have, a plurality of branches, which may also be referred to as nodes, each branch or node in turn having a number of sub-branches, such that a tree-like structure is formed.
- the command hierarchy may be large, thus comprising a large number of command levels and/or a large number of commands on each level.
- the nodes may be connected by lines, such that each node has one superior node, but may have more than one subordinate node.
- the lines connecting the nodes may be interpreted as the associations referred to above, while the nodes may be addresses or commands for controlling a unit provided with a processor. Naturally, one address may in turn have a number of subaddresses to, e.g. different units provided with processors.
- the inventive method may include forming at least one command string based on the command hierarchy, and controlling the unit provided with a processor by sending the command string to the address.
- a command string is an instruction for a unit provided with a processor, which instruction is built up by more than one command and optionally by an address.
- a command string may include more than one command.
- identifying the at least one command may comprise receiving a pair of coordinates from one of the subareas, the pair of coordinates representing a chosen command and identifying an electronic representation of the chosen command, wherein the command string is formed based on the chosen command and at least one hierarchically superior command.
- the command hierarchy according to this embodiment may be represented on a base provided with a position-coding pattern.
- the electronic representation of the command hierarchy may be at least partly provided through identifying the command hierarchy based on the at least one graphical notation. "At least partly" means that an existing command hierarchy, which is either preprinted or, e.g., graphically noted by the user or someone else, may be expanded by the user adding further commands or addresses.
- providing the electronic representation of the command hierarchy may at least partly comprise electronically receiving the electronic representation of the command hierarchy.
- the base, on which the graphical notation is made, may be provided with a graphical representation of at least a part of the command hierarchy.
- a base having a preprinted command structure may be provided together with an electronic version of the command structure, which is to be stored in the memory of the sensor device for future use.
- providing the electronic representation of the command hierarchy may also comprise identifying, based on the at least one graphical notation, at least one further command and at least one further association, and storing, in a memory of the sensor device, the at least one command and the at least one further command, based on the at least one further association.
- the predefined command structure may be expanded by the user adding e.g. further commands or by the user adding e.g. an address.
- the command hierarchy may also be used by recording pairs of coordinates within predefined subareas, which are associated with commands in the command hierarchy. Based on such recordings, commands may be sent to the unit provided with a processor, as described above.
- the electronic representation of the command hierarchy may be stored in a tree data structure in the memory of the sensor device.
- a tree data structure may be any data structure for representing a hierarchy or a tree structure. Numerous such data structures are known.
- command strings are formed in response to an indication of a command, such as a recording of a pair of coordinates within a subarea that is associated with a command.
- the command string may be built up starting with the selected command and adding each hierarchically superior command until the root is reached. Building the command string may also comprise adding separating characters (such as slashes or backslashes), arranging the commands in a suitable order and adding any necessary parameters or switches.
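A minimal sketch of such a tree data structure, and of building a command string by walking from the selected node up to the root, could look as follows. The node names and the "/" separator are invented for illustration; the patent does not prescribe a particular separator or data structure.

```python
class CommandNode:
    """One node of the command hierarchy: a command or an address,
    linked to its hierarchically superior node (parent)."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)

def command_string(node, separator="/"):
    """Build the command string from the root down to the selected
    node by collecting each hierarchically superior command."""
    parts = []
    while node is not None:
        parts.append(node.name)
        node = node.parent
    return separator.join(reversed(parts))

# Example hierarchy rooted at an address (names are invented):
# root = CommandNode("my computer")
# play = CommandNode("play music", root)
# track = CommandNode("track 3", play)
# command_string(track) -> "my computer/play music/track 3"
```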
- the electronic representation of the command hierarchy may be stored in the form of at least one command string which is formable based on the electronic representation of the command hierarchy.
- a plurality, or all, of the command strings that may be formed based on the command hierarchy may be stored in the form of more or less complete command strings.
- a computer program product for controlling a unit provided with a processor.
- the computer program product comprises instructions for a sensor device, which, when executed, causes the sensor device to perform the above described method.
- a sensor device for controlling a unit provided with a processor.
- the sensor device comprises a signal processor for receiving positions representing the sensor device's movement across a base that is provided with a position-coding pattern.
- the signal processor is arranged for receiving at least one graphical notation in the form of the positions, identifying, based on the at least one graphical notation, at least one command for the unit provided with a processor, receiving an address to the unit provided with a processor, and controlling the unit provided with a processor by sending the at least one command to the address.
- the signal processor of the sensor device may be arranged to perform the method described above. The method may be implemented by means of special-purpose circuitry, by programmable microprocessors or by a combination thereof.
- a method for controlling a unit provided with a processor comprises using a sensor device for recording positions representing the sensor device's movement across a base provided with a position-coding pattern, while a graphical notation is made on the base, noting graphically on the base, using the sensor device, at least one command to the unit provided with a processor, and sending the command to the unit provided with a processor for controlling this unit.
- This method provides a user-friendly and intuitive way of controlling a unit provided with a processor.
- the method can, for example, be used when a user wants to enter a command string into, for example, a computer or other unit provided with a processor connected to the sensor device, e.g. by wireless means.
- the method enables a unit provided with a processor to be controlled without using a preexisting user interface, such as a preprinted base provided with command options. Instead, the user may use any base that is provided with a position-coding pattern, on which he or she makes graphical notations, which correspond to the desirable commands and sends the commands to the unit provided with a processor.
- a product kit for controlling a unit provided with a processor comprises a control base, provided with a position-coding pattern, on which control base a command hierarchy, comprising at least two commands, is graphically represented, and on which control base each of the at least two commands is associated with a respective subarea of the position-coding pattern, and a computer program product comprising an electronic representation of the command hierarchy, whereby the at least two commands are identifiable based on a recording of a pair of coordinates within its respective associated subarea.
- a product kit provides a way of integrating e.g. a household appliance into a remote control system for an intelligent home.
- the electronic representation of the command hierarchy may be stored in the memory of the sensor device, and possibly completed by the user adding an address to which the commands are to be sent, thereby completing the installation of the appliance in the intelligent home.
- the user may then either use the control base provided for selecting predefined commands, or draw up the command hierarchy on an arbitrary base and send a command string to the unit provided with a processor.
- FIG. 1 schematically shows a system in which the present invention can be used.
- FIG. 2 shows schematically a base with graphically noted commands and associations according to a first application of an embodiment of the present invention.
- FIG. 3 shows schematically a base with several graphically noted commands and associations according to a second application of an embodiment of the present invention.
- FIG. 4 shows schematically a base with several graphically noted commands and associations according to a third application of an embodiment of the present invention.
- FIG. 5 shows schematically a sensor device for use in connection with the present invention.
- FIGS. 6-12 are flow charts, which schematically illustrate methods for controlling a unit provided with a processor according to the invention.
- In the following, the general principles of the invention will be described with reference to FIG. 1 . Thereafter, a number of exemplifying applications and alternative embodiments of the invention will be discussed with reference to FIGS. 2-7 .
- the present invention is based on the general idea of controlling a unit provided with a processor by means of commands which are written on a position-coded base and which thereafter are sent to the unit provided with a processor.
- FIG. 1 more specifically shows a base 1 in the form of a sheet of paper, on which commands can be written; a sensor device 2 , using which the commands can be written on the base 1 , recorded in electronic form and sent to a unit 3 provided with a processor, which in FIG. 1 is exemplified by a computer.
- the sensor device 2 can communicate with the computer 3 in various ways.
- One alternative is for the sensor device 2 to communicate directly with the computer 3 , for example via a cable, an infrared link or a short-range radio link, such as according to the Bluetooth standard. This is illustrated in FIG. 1 by a broken line 4 .
- a second alternative is for the sensor device 2 to communicate with the computer via a local or global computer network 5 , such as the Internet.
- the sensor device 2 can be connected to the computer network 5 by means of a computer 6 which is permanently connected to the computer network, and with which the sensor device can communicate in, for example, any one of the ways mentioned above for communication with the computer 3 . This is also shown by a broken line 4 .
- the sensor device can be connected to the computer network 5 by wireless means via a radio access point 7 which is reached, for example, via a mobile telephone 8 , a hand-held computer 9 , such as a PDA (Personal Digital Assistant), or a portable computer 10 .
- these units communicate with the radio access point 7 via each other, for example by the portable computer 10 or the hand-held computer 9 utilizing a modem in the mobile telephone 8 as a link to the radio access point 7 , which can be a radio access point in some known system such as GSM, CDMA, GPRS or some other type of mobile communication network.
- the sensor device 2 can itself have means of communication, making possible direct connection to the radio access point 7 .
- the base 1 is provided with a position-coding pattern P.
- the position-coding pattern P is shown only schematically in FIG. 1 as a surface provided with dots. This position-coding pattern is used to record in electronic form what is written on the base.
- Various types of position-coding pattern which can be used for this purpose are known.
- In U.S. Pat. No. 5,477,012, for example, a position-coding pattern is shown where each position is coded by a unique symbol.
- the position-coding pattern can be read off by a pen which detects the position code optically, decodes it and generates a pair of coordinates that describes the movement of the pen across the surface.
- the position-coding patterns in WO 00/73983 and WO 01/26032 can be detected optically by a pen that decodes the dots and generates a pair of coordinates for each set of, for example, 6×6 dots. If the position-coding pattern is read off while the pen is writing on it, a sequence of coordinate pairs is obtained that describes the movement of the pen across the position-coding pattern, and thus constitutes an electronic representation of what was written on the sheet of paper.
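The recording described above can be sketched as grouping the decoded coordinate pairs into strokes between pen-down and pen-up events. The event representation below is an assumption for illustration and is not taken from the cited patents.

```python
def record_strokes(events):
    """Group decoded coordinate pairs into strokes.

    events: iterable of (kind, x, y) tuples, where kind is one of
    "down" (pen touches the base), "move" (pen moves while writing)
    or "up" (pen is lifted). Returns a list of strokes, each stroke
    being the sequence of (x, y) pairs recorded while the pen was
    on the base.
    """
    strokes, current = [], None
    for kind, x, y in events:
        if kind == "down":
            current = [(x, y)]
        elif kind == "move" and current is not None:
            current.append((x, y))
        elif kind == "up" and current is not None:
            current.append((x, y))
            strokes.append(current)
            current = None
    return strokes
```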
- the base 1 is provided with a position-coding pattern of the type described in WO 01/26032.
- other types of position-coding patterns may have an equivalent function.
- the sensor device 2 can then also be of the type described in WO 01/26032. An example of the construction of such a device is described in the following with reference to FIG. 5 .
- The sensor device comprises a casing 11 which has approximately the same shape as a pen. At one end of the casing there is an opening 12 . This end is intended to abut against, or to be held a short distance from, the surface on which the position determination is to be carried out.
- the casing contains principally an optics part, an electronic circuitry part and a power supply.
- the optics part comprises at least one light-emitting diode 13 for illuminating the surface which is to be imaged and a light-sensitive area sensor 14 , for example a CCD or CMOS sensor, for recording a two-dimensional image.
- the device can also contain an optical system, such as a mirror and/or lens system (not shown).
- the light-emitting diode can be an infrared light-emitting diode and the sensor can be sensitive to infrared light.
- the power supply for the device is obtained from a battery 15 , which is mounted in a separate compartment in the casing. It is also possible to obtain the power supply via a cable from an external power source (not shown).
- the electronic circuitry part contains a signal processor 16 which comprises a processor with primary memory and program memory.
- the processor is programmed to read images from the sensor, to detect the position-coding pattern in the images and to decode this into positions in the form of pairs of coordinates, and to process the information thus recorded in electronic form in the way described in greater detail below for controlling the unit 3 provided with a processor.
- the device also comprises a pen point 17 , with the aid of which ordinary pigment-based writing can be written on the surface on which the position determination is to be carried out.
- the pen point 17 can be extendable and retractable so that the user can control whether or not it is to be used. In certain applications the device does not need to have a pen point at all.
- the pigment-based writing is suitably of a type that is transparent to infrared light, while the marks of the position-coding pattern suitably absorb infrared light.
- In this way, the detection of the pattern can be carried out without the above-mentioned writing interfering with the pattern.
- the device may also comprise buttons 18 , by means of which the device can be activated and controlled. It also has a transceiver 19 for wireless transmission, for example using infrared light, radio waves or ultrasound, of information to and from the device.
- the device can also comprise a display 20 for displaying positions or recorded information.
- the device can be divided between different physical casings, a first casing containing components which are required for recording images of the position-coding pattern and for transmitting these to components which are contained in a second casing and which carry out the position determination on the basis of the recorded image or images.
- the sensor device 2 communicates with other units 8 , 9 , 10 or 6 , by wireless means in a way known to those skilled in the art.
- the communication between the units 8 , 9 , 10 and 6 respectively, the radio access point 7 , the computer network 5 and the unit 3 provided with a processor also takes place in a way known to those skilled in the art.
- the function of the sensor, and the application of the position-coding pattern on the base are also methods known to those skilled in the art.
- a first application of an embodiment of the present invention will now be described with reference to FIG. 2 .
- On a base 1 , which is provided with the position-coding pattern, a number of phrases or symbols are noted which constitute commands 22 , 23 or addresses 21 .
- a command can be a word, but can also be a symbol, provided that the sensor device is pre-programmed to recognize and identify a symbol as a command. Such pre-programming can be carried out by some form of learning, whereby a command is associated with a symbol, as will be described in more detail below.
- One of the noted phrases 21 constitutes an indication of a computer network address to which the sensor device connects, e.g. the computer network address where the unit 3 provided with a processor is located.
- the computer network address can also be some other type of address for computer communication, such as a Bluetooth® address.
- Commands are noted on the base in a tree structure, in such a way that an address or an indication of the address constitutes the root and in such a way that commands constitute nodes in the tree structure.
- a node is associated with another node by a line being drawn between them.
- the commands 22 , 23 can thus constitute conditions, operators, parameters or subordinate commands, as shown in FIG. 2 . It is recognized that the tree structure with commands can vary in extent, from a single chain of commands to a large tree with many commands and subordinate commands.
- a command frame 24 may be noted around each command.
- the command frame 24 may work in such a way that it delimits a part of the base 1 which is to be associated with the command, and also in such a way that it is recognized by the sensor device 2 and understood as an indication that a command is being entered, thus working as a command indicator. Between the commands 22 , 23 or the command frames 24 , lines may be drawn which indicate connections between the commands.
- the shape of the command frames in FIGS. 2-4 constitutes only one example of how such frames can be designed. Other shapes are possible and different shapes can represent different types of commands or addresses.
- On the base 1 there is also a “send” box 26 which indicates to the sensor device that the commands are to be sent to the unit provided with a processor.
- the “send” box 26 can be either a pre-printed box comprising a specific, predefined part of the position-coding pattern which codes for the “send” function, or alternatively the “send” box 26 can be noted by the user on the base 1 and provided with a particular symbol or a particular command word which represents the “send” function.
- the “send” box can be omitted, by for example the sensor device connecting directly to the computer network address when it identifies a command box or an indication of a computer network address.
- the user has noted the address “my computer” 21 with a sensor device 2 , which indicates to the sensor device 2 the computer network address of the user's own computer.
- the command frame 24 is noted, which defines the subarea of the position-coding pattern which after the input will be associated with the address by the sensor device.
- the sensor device reads off the position-coding pattern and forms an electronic representation of the graphical image that the address 21 and the command frame 24 constitute.
- the graphical image is then interpreted using OCR or ICR, so that an electronic form of the command “my computer” is obtained and is stored in a memory in the sensor device.
- the address is stored together with an indication that the noted “my computer” is an address and not just a text string.
- The address derived by the sensor device, for example 197.57.3.982, can be stored.
- A representation of the surface on the base 1 which is enclosed by the command frame 24 is also stored in the sensor device and associated with the address "my computer", and thereby with the corresponding IP address, here exemplified by 197.57.3.982.
- a command string is formed in the sensor device from the stored commands 22 , 23 and associations 25 , the first component of the command string consisting of the address that constitutes a root in the tree-like command structure, that is the address “my computer”.
- Next associations 25 are followed, until the last command “hard disk” 23 is reached, whereupon the command string is built up gradually and finally assumes, for example, the form
- Command strings may be formed and stored in different ways. According to one alternative, each command string that may be formed from a given command structure or hierarchy may be stored. According to this alternative, new command strings are added as new commands or parameters are added to the command structure.
- a tree structure that has been noted graphically and registered may be represented in any appropriate way in the sensor device, such as by means of any data structure for representing tree structures.
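The first alternative above, storing every command string that a given command structure yields, might be sketched as follows; the nested-dict tree encoding and all labels are merely one hypothetical choice, not part of the original disclosure:

```python
def all_command_strings(tree, prefix=""):
    """Enumerate one command string per root-to-leaf path in the tree.

    `tree` maps each label to a dict of its sub-commands ({} for leaves).
    """
    strings = []
    for label, subtree in tree.items():
        path = f"{prefix}/{label}" if prefix else label
        if subtree:
            strings.extend(all_command_strings(subtree, path))
        else:
            strings.append(path)
    return strings

# Hypothetical hierarchy in the spirit of FIG. 3: an address as root,
# units as nodes, parameters as leaves.
hierarchy = {"home": {"heating": {"temperature": {"20": {}}},
                      "lighting": {"off": {}}}}
```

Adding a new unit or parameter to the structure then simply adds new root-to-leaf paths, matching the description of new command strings being added as the command structure grows.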
- When a certain command is selected, e.g. by registering a pair of coordinates within the command frame on the base, the corresponding command string is formed from the marked command or parameter and all other commands or parameters up to the root.
- the command string is sent to the computer network address that is indicated by the address “my computer”, whereupon the unit 3 provided with a processor and connected to the computer network address executes the command and its hard disk is formatted.
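The formation of a command string by following the associations from a selected command back to the root address might be sketched as follows; the node structure and labels are illustrative assumptions, not a prescribed implementation:

```python
class Node:
    """One noted command or address in the tree-like command structure."""
    def __init__(self, label, parent=None):
        self.label = label        # e.g. "my computer", "hard disk", "format"
        self.parent = parent      # the node reached by the association line
        self.children = []
        if parent is not None:
            parent.children.append(self)

def command_string(node, sep="/"):
    """Walk from the selected node back to the root and join the labels."""
    parts = []
    while node is not None:
        parts.append(node.label)
        node = node.parent
    return sep.join(reversed(parts))

root = Node("my computer")              # the address constitutes the root
disk = Node("hard disk", parent=root)   # subordinate command
fmt = Node("format", parent=disk)       # leaf command
```

Selecting the leaf here would yield "my computer/hard disk/format", in the same spirit as the "home/heating/temperature/20" string discussed later.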
- the unit 3 provided with a processor then sends an acknowledgement to the sensor device 2 that the command has been carried out.
- the acknowledgement can be presented to the user in the form of a sound, light or vibration signal in the sensor device or by being displayed on some other unit in the vicinity of the user, such as a mobile telephone or hand-held computer.
- In FIG. 3, an address 30 and a number of commands 31, 32, 33, 34, 35, 36, surrounded by command frames 24 and connected by associations 25, have been noted in a similar way to that described above with reference to FIG. 2.
- the commands have been noted and linked together into a tree-like structure, where an address “home” 30 indicates a computer network address, e.g. of a unit which is situated in the user's intelligent home, which unit is arranged to control one or more units provided with a processor in the user's home.
- Alternatively, each unit provided with a processor in the user's home can have a network address and can be connected directly to the computer network, without any master unit as in the example.
- a number of units provided with a processor are connected to the computer network address, which units are addressed by the commands 31 , 32 , 33 , 34 .
- Other types of units are possible and are to be regarded as covered by the invention.
- Each unit can then be provided with a number of commands or parameters for its control.
- In FIG. 3 the tree structure is drawn out in full only for the temperature control of a heating system, but it is recognized that the structure can be extended as new units are added, or as existing units are provided with new functions. It is also recognized that the tree structure shown in FIG. 3 is drawn mechanically and that for the purpose of illustration it shows a larger part of the tree structure than is necessary for the example.
- the tree structure or command hierarchy may be hand-drawn, and only that part of the structure, that is those commands, which are to be used are drawn, as in FIG. 2 .
- predefined and printed or machine drawn command structures, or parts thereof are conceivable. These may e.g. be provided by the manufacturer of a certain household appliance. Such predefined command hierarchies may be expandable by e.g. allowing the user to add further parameters or commands. It is also possible that the manufacturer of a household appliance provides a complete, predefined command hierarchy containing all the necessary commands or parameters for an appliance or a group of appliances.
- The command hierarchy may be provided in the form of a control base.
- a corresponding electronic version of the command hierarchy may be provided in the form of software for installation in the sensor device.
- the software provides the necessary instructions for the sensor device to associate the subareas of the position-coding pattern with the proper command.
- The user may install the appliance by filling in its address with the sensor device at the proper position in the command hierarchy, thus making the appliance controllable by means of the sensor device.
- Once the command hierarchy has been installed in the sensor device, the user may use it and add commands according to what has been described above.
- the appliance manufacturer may provide its customer with a product kit comprising the control base and software for installation of the command hierarchy in the sensor device.
- a command 35 which indicates the temperature parameter and a value command 36 for this are marked and connected by means of an association line 25 .
- The command string which is generated for setting the temperature of the heating installation is home/heating/temperature/20.
- FIG. 4 shows a third application of an embodiment of the present invention.
- the input is carried out to programs in the user's computer 3 ( FIG. 1 ).
- an address 40 and a number of commands 41 , 42 , 43 , 44 have been drawn on a base 1 and connected by associations 25 and marked with command frames 24 .
- a “send” box 26 is also arranged on the base 1 .
- a sketch 45 has also been noted on the base and recorded in electronic form.
- the command structure in FIG. 4 comprises an address “my computer” 40 which indicates the address of the user's computer, and a graphics program 41 , a word processing program 42 and a spreadsheet program 43 which are installed in the user's computer 3 .
- the command “import” 44 has been noted and recorded in electronic form using the sensor device.
- the sketch 45 is connected by a line 47 to the command “import” 44 .
- the sketch 45 is stored in the sensor device in the form of graphical data, below called “image data”. More specifically, the sketch is stored as a graphics file, e.g. a vector graphics file. This can be in a standard storage format such as .wmf (Windows® Meta File) or in a storage format specific to the sensor device.
- the file is transferred to the unit provided with a processor before the command is executed or in association with the command being executed.
- the unit 3 provided with a processor receives the command string and the image and causes the graphics program to import the image.
- The importing into the graphics program can be carried out using the program's existing capability of being invoked with a command string comprising a file name.
- In the case of a storage format specific to the sensor device, it is necessary for the program that receives the image data to have been provided with functionality for handling the storage format of the sensor device.
- a base 1 on which commands are noted can be used repeatedly by the commands being stored in a memory in the sensor device when the noting is carried out. This may mean that the user can indicate to the sensor device which command is to be carried out by pointing at a written-down command with the sensor device so that the sensor device can read off the position-coding pattern corresponding to the command.
- the sensor device may identify this command in its memory and send it to the address with which the command was associated when the noting on the base was carried out. This may be done by simply recording a pair of coordinates within the proper subarea of the position-coding pattern, without marking any “send” box.
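Such reuse of a stored command might be sketched as a simple lookup of a recorded coordinate pair against stored subareas; the rectangular boxes and the second address are invented for illustration (197.57.3.982 is the example address used above):

```python
# Each entry: the bounding box of a command frame's subarea of the
# position-coding pattern, the associated command string, and the address
# it was associated with when the command was noted. All boxes are invented.
subareas = [
    ((100.0, 200.0, 180.0, 220.0), "my computer/hard disk/format", "197.57.3.982"),
    ((100.0, 240.0, 190.0, 260.0), "home/heating/temperature/20", "192.0.2.17"),
]

def lookup(x, y):
    """Return (command, address) for a coordinate pair inside a stored subarea."""
    for (x0, y0, x1, y1), command, address in subareas:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return command, address
    return None   # the pair did not fall within any stored command frame

hit = lookup(120.0, 210.0)   # pointing inside the first command frame
```

A single recorded coordinate pair inside a frame thus suffices to retrieve and resend a previously noted command, without marking any "send" box.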
- further commands or parameters defining a previously noted and recorded command may be added by being noted graphically on the base and associated with the command in question.
- A dynamic command structure is thus obtained, which can be enlarged as new units are added or as new commands are introduced. Commands that have been added to the command hierarchy in this manner will thus constitute parts of the command hierarchy as described above.
- the methods described here may be implemented as a computer program product, as shown in FIGS. 6 and 7 .
- the computer program product comprises a computer program which is stored in the program memory of the sensor device and is executed in its processor.
- the method can be implemented completely or partially in the form of a product-specific circuit, such as for example an ASIC, an FPGA or in the form of digital or analogue circuits or in any suitable combination of these.
- FIGS. 6-12 show different embodiments of the invention, which may be used separately or in combination.
- FIG. 6 is a schematic flow chart for a method according to the invention.
- the method may be implemented in e.g. a computer program product.
- a graphical notation is received in step 50 in the sensor device 2 .
- a command for the unit 3 provided with a processor is identified in step 51 based on the graphical notation.
- the sensor device also receives in step 52 an address to the unit 3 provided with a processor.
- In step 53, the command is sent to the address.
- Step 53 may be initiated in different manners, by e.g. the sensor device 2 detecting a “send”-box on the base, by receiving a send command or any other indication such as a button being depressed etc.
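The four steps of FIG. 6 might be sketched, with invented class and helper names standing in for the actual sensor-device internals, roughly as:

```python
class SensorDevice:
    """Toy stand-in for the sensor device 2; names are hypothetical."""
    def __init__(self, notation, address):
        self._notation = notation
        self._address = address

    def receive_notation(self):   # step 50: the recorded graphical notation
        return self._notation

    def receive_address(self):    # step 52: here taken from the device memory
        return self._address

def identify_command(notation):   # step 51: trivial stand-in for ICR/HWR
    return notation.strip().lower()

def send(address, command):       # step 53: stand-in for network transmission
    return f"{address} <- {command}"

device = SensorDevice(notation=" Format ", address="197.57.3.982")
result = send(device.receive_address(), identify_command(device.receive_notation()))
```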
- FIG. 7 is a schematic flow chart for a method according to an embodiment of the invention.
- The step 51 for identifying the command for the unit 3 provided with a processor comprises a substep 51 a of at least partly converting the graphical notation into a machine-readable character format by e.g. ICR (intelligent character recognition) or HWR (handwriting recognition). Based on the output from step 51 a, the command may then be identified in step 51 b.
- a command that is written in plain text may be identified, either directly from the actual combination of machine readable characters, or by retrieving a related command from a database on the basis of the character combination.
- the sensor device may be arranged to identify an address from such a character combination (step 52 ).
- FIG. 8 is a schematic flow chart for a method according to another embodiment of the invention.
- the step 51 of identifying the command comprises a first substep 51 c of identifying a graphical symbol, which may be predefined, and thus recognized, as being equivalent to a certain command.
- The corresponding command is then identified; for example, a cross mark ("X") could be interpreted as a "delete" command.
- the sensor device may be arranged to identify an address from such a graphical symbol (step 52 ).
- FIGS. 7 and 8 may be combined, by the sensor device being capable of first determining whether a command or address is recorded in plain text or not, and then identifying the command or address based on either a character combination or a graphical symbol.
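Such a combined identification might be sketched as follows, with a trivial stand-in for the ICR/HWR engine and an invented symbol table; neither is prescribed by the description:

```python
SYMBOL_COMMANDS = {"X": "delete"}   # e.g. a cross mark means "delete"

def identify(notation, recognize_text):
    """`recognize_text` stands in for an ICR/HWR engine; returns None on failure."""
    text = recognize_text(notation)
    if text is not None:
        return text                          # command recorded in plain text
    return SYMBOL_COMMANDS.get(notation)     # fall back to predefined symbols

def toy_recognizer(notation):
    # Pretend the engine only "reads" lower-case plain-text words.
    return notation if notation.islower() else None
```

The same two-stage dispatch could equally serve for addresses, as noted for step 52 above.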
- FIG. 9 is a schematic flow chart for a method according to yet another embodiment of the invention.
- the step 51 of identifying the command further comprises the substep 51 d of detecting a command indicator, such as the command indicators 24 of FIGS. 2-4 .
- a subarea of the position-coding pattern may be identified in step 51 e.
- the subarea may be associated with the command in step 51 f, such that a recording of a pair of coordinates within the subarea will be interpreted by the sensor device as being equal to a recording of the associated command.
- FIG. 10 is a schematic flow chart for a method according to a further embodiment of the invention.
- the step 52 a of receiving the address comprises receiving the address from a memory 55 , which may be incorporated in the sensor device 2 .
- FIG. 11 is a schematic flow chart for a method according to another embodiment of the invention.
- The command (in step 51), the address (in step 52 b) and the association (in step 54) are all identified based on the graphical notation received in step 50.
- FIG. 12 illustrates an embodiment of a method according to the invention, in which a command hierarchy in step 57 has been provided in a memory 56 of the sensor device 2 .
- The command hierarchy may be provided by storing graphical notations that have been made on earlier occasions. It may also be provided by downloading it into the memory 56 from e.g. an external memory medium, e.g. via any one of the communication paths discussed above in relation to FIG. 1.
- commands may be e.g. associated with subareas of the position-coding pattern.
- the step 51 of identifying the command may comprise a substep 51 g of receiving a pair of coordinates from a subarea that is associated with the command.
- In substep 51 h, the command associated with the subarea is identified.
- FIG. 12 also illustrates that based on the command hierarchy, command strings may be formed.
- the command strings may be formed while commands are being entered through graphical notations or in response to a command being identified through e.g. a graphical symbol or through a pair of coordinates within a predefined subarea.
- One alternative is to use a special purpose form provided with a position-coding pattern, where the user, in e.g. dedicated boxes, writes a short name, an address or a command that is to be made identifiable for the sensor device.
- Another option is to perform the preprogramming by means of a unit that communicates with the sensor device, such as a computer, a PDA etc.
- a unit may communicate via e.g. a computer network or via short range communication such as Bluetooth® or IrDA.
- the graphical notation may also comprise e.g. data in the form of e.g. figures, text, sketches or graphics, which is recorded by the sensor device 2 and associated with commands or addresses as described above with reference to FIG. 4 .
- The sensor device may be programmed to evaluate the graphical notations as they are received, based on their contents. For example, a plain text command, a graphical symbol or a command indicator may trigger the above described method of identifying commands. All other, non-recognized data may be treated in any standard fashion, e.g. stored as strokes, i.e. sequences of coordinate pairs, in the memory of the sensor device 2.
- The position-coding pattern does not necessarily need to be optically detectable. It could, for example, be readable by magnetic, capacitive, inductive, chemical or acoustic means. However, this would require a different type of sensor to be used.
- the position-coding pattern in WO 01/26032 can code coordinates of a very large number of unique positions or points. It can be considered as though all these points together make up an imaginary surface which is considerably larger than any single base. This imaginary surface can be divided into different areas which are reserved for different applications. An area can, for example, be reserved for controlling units provided with a processor. Information defining such areas and functions connected thereto can be stored in the pen and utilized for controlling the function of the pen.
- Another alternative is to allow commands to be noted within practically any part of the imaginary surface.
- almost any surface which is provided with a position-coding pattern can be used for entering commands and for the associated control of a unit provided with a processor.
- measures may possibly need to be taken to prevent interference, as certain areas can previously have been reserved for certain functions.
- the address to which commands and data are sent can be identified by, for example, looking up addresses in a database external to the sensor device. Both commands and address could be sent to an external unit for interpretation and further processing. This interpretation can be carried out in the sensor device, in the unit provided with a processor, or in some other external unit, possibly dedicated to the purpose.
- the address to which the command string is to be sent may be determined in advance, for example, by the sensor device being pre-programmed with such information. According to this embodiment, it is not necessary to note an address graphically. At least one command can be noted, but command structures as described above can also be noted and stored according to this embodiment. Alternatively, the address to which the command string is to be sent may be associated with a specific area of the position-coding pattern, and thus with a specific base on which that area of the position-coding pattern is arranged.
- each command string represents a conceivable combination of commands.
- the form in which the commands and the address are sent can also vary: it is, for example, possible to send raw data in the form of the images which the sensor device takes of the base. It is also possible to send some form of processed, for example compressed, image data, a series of coordinates which has been derived from the images and which represents the movement of the sensor device across the base, or commands or address in character-coded format. Other ways of sending commands, addresses or data are not excluded.
- The sending of address, commands and data to the unit provided with a processor can be initiated by the use of the "send" box, but can also be initiated in response to an indication being written down on the base. For example, the drawing of a frame around a command can initiate transmission. It is also possible to initiate transmission as soon as the sensor device has recorded a complete command string, or when a partial area within a frame is marked which surrounds an already written-down command. Further alternatives for initiating sending comprise, but are not limited to, voice control, depressing a button on the pen, etc.
- entry of addresses and commands can be marked by, for example, the user pressing a button on the sensor device. This can precede the entry of a command, but the button can also be held depressed during the whole or part of the entry procedure.
Abstract
The invention relates to a method for controlling a unit provided with a processor. The method comprises receiving at least one graphical notation in the form of positions representing a sensor device's movement across a base that is provided with a position-coding pattern, while the graphical notation was made, identifying, based on the at least one graphical notation, at least one command for the unit provided with a processor, receiving an address to the unit provided with a processor, and controlling the unit provided with a processor by sending the at least one command to the address. The invention further comprises a computer program product and a device for implementing the method, a method for controlling a unit provided with a processor using a sensor device and a product kit for controlling a unit provided with a processor.
Description
- This application is a Divisional of co-pending application Ser. No. 10/178,734, filed on Jun. 25, 2002, for which priority is claimed under 35 U.S.C. §120, and which claims priority under 35 U.S.C. §119(a) to Swedish Application No. 0102236-7, filed in Sweden on Jun. 25, 2001, and under 35 U.S.C. §119(e) to Provisional Application No. 60/301,446, filed Jun. 29, 2001, the entire contents of all of which are incorporated by reference.
- The present invention relates to methods for controlling a unit provided with a processor, and to a device, a computer program product and a product kit for the same purpose.
- Concurrently with the development of information technology, new opportunities for realization of so-called “intelligent homes” are arising. “Intelligent homes” are dwellings, where one or more electronic devices can be controlled or monitored from units located outside the house. Examples of devices that can be controlled or monitored remotely are heating devices, for example in holiday cottages, fire alarms, flood alarms and burglar alarms. Other examples of devices that could be controlled remotely or monitored are computers, lighting, irrigation systems, TV, video, music centers, refrigerators, freezers, cookers, microwave ovens or washing machines. A basic requirement is that the device can be provided with or can be connected to a processor which can receive, process and pass on information. The processor should then be capable of being connected to some form of communication network, such as the telephone network or the Internet.
- For the intelligent home to be controllable from a remote location, there is a need for a remote control unit. The remote control unit according to existing systems is typically a telephone, mobile telephone or computer terminal, which communicates with the processor via the communication network. In its simplest form, the remote control unit is a telephone or mobile telephone which communicates directly with the processor by means of predefined key commands or codes. In a more advanced form, the remote control unit is a computer or mobile telephone which communicates with the processor via a computer network, such as the Internet. A problem with using a telephone as a remote control terminal is that the user interface is based on pressing keys on a numeric keypad, which can lead to problems, for example in learning or remembering the codes that are to be used. Communicating by pressing keys on a numeric keypad is thus not particularly user-friendly. In addition, a mobile telephone usually has limited data entry facilities on account of its small format. In addition, no or limited feedback is given in response to what has been keyed into the mobile telephone.
- The use of a computer with Internet connection is a solution which works providing such a computer is available. In addition, this requires a specific user interface to be developed and adapted for the respective unit which is to be controlled remotely. In addition, people who are not used to computers perceive solutions which require the use of a computer as complicated and difficult to use.
- Another problem is that as the system is developed with more devices that can be controlled remotely, demands are made concerning a uniform standard for their control. For example, it would be time-consuming and thus unsatisfactory to need to look up different Internet home pages in order to control the central heating and the microwave oven.
- There is thus a need for a method for user-friendly and flexible remote control of units provided with a processor, which method utilizes a portable device.
- The object of the present invention is to solve the above problem completely or partially.
- Methods for fulfilling the object are described in the independent claims.
- According to a first aspect of the invention, there is provided a method for controlling a unit provided with a processor. The method comprises receiving at least one graphical notation in the form of positions representing a sensor device's movement across a base that is provided with a position-coding pattern, while the graphical notation was made. The method further comprises identifying, based on the at least one graphical notation, at least one command for the unit provided with a processor, and receiving an address to the unit provided with a processor. Finally, the method comprises controlling the unit provided with a processor by sending the at least one command to the address.
- A graphical notation may be any writing and/or drawing, which is made on a base, using a sensor device, that records positions based on the position-coding pattern provided on the base. The graphical notation may be a single, continuous stroke, or a group of such strokes. Each stroke may be represented as a sequence of coordinate pairs coded by the position-coding pattern on the base. Thus, the user input may be digitized without any additional operation on the part of the user, such as scanning the base or digitizing it in some other way.
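One conceivable representation of such strokes, assuming nothing beyond what is described above, is a simple sequence of coordinate pairs per stroke; all type and method names are illustrative:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Stroke:
    # One continuous pen-down movement: a sequence of decoded coordinate pairs.
    points: List[Tuple[float, float]] = field(default_factory=list)

    def add(self, x: float, y: float) -> None:
        self.points.append((x, y))

@dataclass
class Notation:
    # A graphical notation: a single stroke or a group of strokes.
    strokes: List[Stroke] = field(default_factory=list)

stroke = Stroke()
stroke.add(10.5, 20.0)   # coordinates coded by the pattern under the pen tip
stroke.add(11.0, 20.4)
notation = Notation(strokes=[stroke])
```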
- According to the present invention, such graphical notations may be used for drawing up a command structure or hierarchy on a base, such as a paper. The command structure may comprise commands for controlling a unit provided with a processor. This provides a rapid, simple and easy to understand way for the user to control the unit provided with a processor. In addition, by noting down the commands, the user obtains automatically an easy to understand copy of what was entered into the unit.
- An additional advantage of the present invention is that the user is not limited to one base which is specific to a particular type of command.
- A unit provided with a processor can be a device which contains any form of processor, for example a microprocessor. Examples of such devices are computers, modern household appliances (dishwashers, microwave ovens, cookers/stoves, audio/video players, etc), industrial machinery and other computer-controlled applications such as central heating installations, air conditioning installations, telephone systems and monitoring/alarm systems.
- A base can be a device on which information can be noted down, usually a sheet of paper, a drawing board or similar medium which is provided with a position-coding pattern which makes possible electronic recording of what is noted down on the base.
- A command can be such words, symbols, sub-addresses in computer networks, program names, command names, file names, storage addresses or symbols which represent particular operations, functions, operators, parameters and arguments that can be used individually or in combination for controlling a unit provided with a processor.
- An address of the unit provided with a processor can be an address for communication with the unit provided with a processor. It can be a computer network address, such as a standard IP address, but other forms of address are possible. It can also be an address of a unit via which the unit provided with a processor communicates, for example a proxy-server or a unit for short-range communication such as Bluetooth®. Thus it is also possible to utilize the invention when the user is in the vicinity of the unit provided with a processor without having to connect, for example, via a computer network. An address can also be expressed as a short name, which is associated with the address. For example, the word “home” can be an indication of a particular computer network address of a computer which is located in the user's home.
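A short-name table of this kind might be sketched as a simple mapping; both addresses below are invented placeholders:

```python
short_names = {
    "home": "192.0.2.17",         # computer in the user's home (placeholder)
    "my computer": "192.0.2.42",  # the user's own computer (placeholder)
}

def resolve(name):
    # Fall back to treating the notation itself as the address, e.g. when
    # the user wrote the address out in full instead of using a short name.
    return short_names.get(name, name)
```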
- Making graphical notations may, but does not necessarily, mean that a mark is left on the base. This may provide the advantage that the base will constitute a copy of what was entered into the unit provided with a processor. With the reuse of previously entered commands it can, however, be an advantage if no mark is left on the base, in order to make possible repeated use.
- According to one embodiment of the invention, the at least one graphical notation may at least partly comprise handwritten characters. Thus, the at least one graphical notation may be at least partly converted into a character coded format for identifying the at least one command. This provides a very flexible way of entering commands, since there is no need for predefining or associating commands with user-friendly symbols.
- Alternatively, identifying the at least one command may comprise identifying at least one graphical symbol from the at least one graphical notation, the graphical symbol representing the at least one command. This way, user friendly symbols, which speed up the making of the graphical notation, may be provided. This variant may be combined with the above described character coded variant, by e.g. providing predefined symbols for frequently used commands or addresses, while less frequently used commands or addresses are to be noted as handwritten characters for conversion to character coded form.
- According to another embodiment of the invention, identifying the at least one command may comprise detecting a command indicator, based on the at least one graphical notation. Optionally, identifying the at least one command may comprise identifying a subarea of the position-coding pattern, the subarea being essentially encircled by the command indicator. The subarea may be associated with the command.
- A command indicator may be a graphically noted indication, which is recognized by the sensor device as an instruction to identify a command. The instruction may be a symbol and in one embodiment of the invention, the symbol may encircle the command, and thus constitute a frame or any other drawn shape, which wholly or partially encircles the command. The frame may have a specific appearance, which is recognized by the sensor device and thus interpreted as the indication that a command is being entered.
- By indicating that the command is being entered, the sensor device may more easily be able to interpret a graphical notation as a command.
- A subarea is an area of the position-coding pattern on the base, which area is delimited by e.g. the command indicator or by some other relation to the command, such as a certain area encircling the graphical notation. For example, a halo-like area surrounding the command may be defined as soon as a command is recognized. The subarea may, via the position-coding pattern, be associated with the command, such that a command may be identified based on a recording of any pair of coordinates that falls within the subarea associated with that command.
- By allowing the command indicator to define the subarea that is associated with the command, and indicate that a command has been entered, the command indicator may have a dual function: defining the subarea and indicating that a command is being entered.
- Thus, according to one embodiment of the invention, sending the at least one command to the address may be effected in response to a recording of a pair of coordinates within the subarea. This enables “reuse” of a previously noted and recorded command.
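As a minimal illustration of this mechanism, the following sketch selects a command based on a recorded pair of coordinates. All names, the rectangular subarea geometry and the coordinate values are assumptions made for illustration; the description prescribes no particular implementation.

```python
# Illustrative sketch: each noted command is associated with a rectangular
# subarea of the position-coding pattern, stored as (x_min, y_min, x_max, y_max).
# A recorded pair of coordinates selects the command whose subarea contains it.

def command_for_coordinates(subareas, x, y):
    """subareas: dict mapping a command string to its bounding rectangle.
    Returns the matching command, or None if the pair of coordinates
    falls outside every registered subarea."""
    for command, (x_min, y_min, x_max, y_max) in subareas.items():
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return command
    return None

# Hypothetical registry built when the commands were noted and recorded.
subareas = {
    "format": (0.0, 0.0, 10.0, 5.0),
    "hard disk": (0.0, 10.0, 10.0, 15.0),
}
```

Pointing the sensor device inside the subarea noted for "format" would then yield that command again, which is the "reuse" referred to above.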
- According to the invention, the address may be received in different manners. One alternative is that receiving the address comprises receiving the address from a memory in the sensor device. Thus, the address to the unit provided with a processor may be preprogrammed and stored in a memory in the sensor device. According to another alternative, receiving the address may comprise identifying the address based on the at least one graphical notation. Thus, the address may be noted graphically on the base, recorded by the sensor device and optionally associated with one or more commands.
- According to an embodiment of the invention, an association between the address and the at least one command may be identified based on the at least one graphical notation. The association may indicate the relationship between the command or commands and the address.
- According to one embodiment of the invention, receiving at least one graphical notation comprises receiving at least three separable graphical notations representing the address, the at least one command and the at least one association between the address and the at least one command. Thus, the graphical notations representing the address, the command and the association may be clearly distinguishable from each other.
- The at least one association may be identified based on a separable graphical notation connecting the graphical notations representing the at least one command and the address, respectively. For example, the at least one association may be identified as a graphical notation having essentially the shape of a line extending between the graphical notations representing the at least one command and the address. Naturally, associations may also connect two commands, or a command and an address. The associations provide a simple and intuitive way of relating commands, addresses, etc. to each other.
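The identification of such a connecting line can be sketched as follows. Treating a recorded stroke as an association when its first and last coordinate pairs fall inside two different subareas is an assumption of this sketch, as are the rectangular subareas; the description only requires a notation extending between the two graphical notations.

```python
# Illustrative sketch: a recorded stroke (a sequence of coordinate pairs) is
# taken to be an association if its first and last pairs of coordinates fall
# inside two different subareas, each belonging to a noted command or address.

def point_in_rect(point, rect):
    x, y = point
    x_min, y_min, x_max, y_max = rect
    return x_min <= x <= x_max and y_min <= y <= y_max

def find_association(stroke, subareas):
    """stroke: recorded (x, y) pairs; subareas: dict name -> rectangle.
    Returns the pair of notations the stroke connects, or None."""
    start = next((n for n, r in subareas.items() if point_in_rect(stroke[0], r)), None)
    end = next((n for n, r in subareas.items() if point_in_rect(stroke[-1], r)), None)
    if start and end and start != end:
        return (start, end)
    return None
```

A stroke starting inside the frame of an address and ending inside the frame of a command would thus be recognized as an association between the two.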
- According to one embodiment of the invention, there is provided, in a memory in the sensor device, an electronic representation of a command hierarchy which comprises at least two commands and at least one association, each of the at least two commands and the at least one association being graphically represented on the base by nodes and arcs, respectively, and each of the at least two commands being associated with a respective subarea of the position-coding pattern.
- A command hierarchy, or a command structure, may be any hierarchy of commands for one or more units provided with a processor. A command hierarchy typically has a root, which may be a command or an address to a unit provided with a processor. Further, the command hierarchy may have, but does not necessarily need to have, a plurality of branches, which may also be referred to as nodes, each branch or node, in turn, having a number of sub branches, such that a tree-like structure is formed. The command hierarchy may be large, thus comprising a large number of command levels and/or a large number of commands on each level. The nodes may be connected by lines, such that each node has one superior node, but may have more than one subordinate node. The lines connecting the nodes may be interpreted as the associations referred to above, while the nodes may be addresses or commands for controlling a unit provided with a processor. Naturally, one address may in turn have a number of subaddresses to, e.g. different units provided with processors.
- The inventive method may include forming at least one command string based on the command hierarchy, and controlling the unit provided with a processor by sending the command string to the address. A command string is an instruction for a unit provided with a processor, which instruction is built up from more than one command and optionally from an address.
- Furthermore, identifying the at least one command may comprise receiving a pair of coordinates from one of the subareas, the pair of coordinates representing a chosen command, and identifying an electronic representation of the chosen command, wherein the command string is formed based on the chosen command and at least one hierarchically superior command.
- The command hierarchy according to this embodiment may be represented on a base provided with a position-coding pattern. The electronic representation of the command hierarchy may be at least partly provided through identifying the command hierarchy based on the at least one graphical notation. "At least partly" means that an existing command hierarchy, which is either preprinted or e.g. graphically noted by the user or someone else, may be expanded by the user adding further commands or addresses.
- Alternatively, providing the electronic representation of the command hierarchy may at least partly comprise electronically receiving the electronic representation of the command hierarchy. The base, on which the graphical notation is made, may be provided with a graphical representation of at least a part of the command hierarchy. Thus, a base having a preprinted command structure may be provided together with an electronic version of the command structure, which is to be stored in the memory of the sensor device for future use.
- According to the invention, providing the electronic representation of the command hierarchy may also comprise identifying, based on the at least one graphical notation, at least one further command and at least one further association, and storing, in a memory of the sensor device, the at least one command and the at least one further command, based on the at least one further association. Thus, the predefined command structure may be expanded by the user adding e.g. further commands or an address. The command hierarchy may also be used by recording pairs of coordinates within predefined subareas, which are associated with commands in the command hierarchy. Based on such recordings, commands may be sent to the unit provided with a processor, as described above.
- The electronic representation of the command hierarchy may be stored in a tree data structure in the memory of the sensor device. A tree data structure may be any data structure for representing a hierarchy or a tree structure. Numerous such data structures are known. According to this alternative, command strings are formed in response to an indication of a command, such as a recording of a pair of coordinates within a subarea that is associated with a command. The command string may be built up starting with the selected command and adding each hierarchically superior command, until the root is reached. Building the command string may also comprise adding separating characters, such as “\” etc., arranging the commands in a suitable order and adding the necessary parameters or switches.
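As a minimal illustration of such a tree data structure, the following sketch stores, for each command node, a reference to its hierarchically superior node, and forms the command string by walking up to the root. The class, the node names and the "/" separator are assumptions for illustration; the separator matches the example command strings given later in the description.

```python
# Illustrative sketch of the tree data structure described above: selecting a
# node forms the command string from that node and each hierarchically
# superior command, until the root (the address) is reached.

class CommandNode:
    def __init__(self, name, parent=None):
        self.name = name        # electronic representation of the command
        self.parent = parent    # hierarchically superior node; None at the root

def build_command_string(node, separator="/"):
    """Walk from the selected node up to the root, then join root-first."""
    parts = []
    while node is not None:
        parts.append(node.name)
        node = node.parent
    return separator.join(reversed(parts))

# Hypothetical hierarchy: selecting "hard disk" yields the full command string.
root = CommandNode("my computer")
selected = CommandNode("hard disk", CommandNode("format", root))
```

Adding further parameters or separating characters, as mentioned above, would be a straightforward extension of the joining step.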
- Alternatively, the electronic representation of the command hierarchy may be stored in the form of at least one command string which is formable based on the electronic representation of the command hierarchy. Thus, a plurality, or all, of the command strings that may be formed based on the command hierarchy may be stored in the form of more or less complete command strings. When a command is indicated by e.g. a recording of a pair of coordinates within a subarea, the command string comprising that command is retrieved and sent to the unit provided with a processor.
- According to a second aspect of the invention, there is provided a computer program product for controlling a unit provided with a processor. The computer program product comprises instructions for a sensor device which, when executed, cause the sensor device to perform the above-described method.
- According to a third aspect of the invention, there is provided a sensor device for controlling a unit provided with a processor. The sensor device comprises a signal processor for receiving positions representing the sensor device's movement across a base that is provided with a position-coding pattern. The signal processor is arranged for receiving at least one graphical notation in the form of the positions, identifying, based on the at least one graphical notation, at least one command for the unit provided with a processor, receiving an address to the unit provided with a processor, and controlling the unit provided with a processor by sending the at least one command to the address. The signal processor of the sensor device may be arranged to perform the method described above. The method may be implemented by means of special-purpose circuitry, by programmable microprocessors or by a combination thereof.
- According to a fourth aspect of the invention, there is provided a method for controlling a unit provided with a processor. The method comprises using a sensor device for recording positions representing the sensor device's movement across a base provided with a position-coding pattern, while a graphical notation is made on the base, noting graphically on the base, using the sensor device, at least one command to the unit provided with a processor, and sending the command to the unit provided with a processor for controlling this unit. This method provides a user-friendly and intuitive way of controlling a unit provided with a processor.
- The method can, for example, be used when a user wants to enter a command string into, for example, a computer or other unit provided with a processor connected to the sensor device, e.g. by wireless means. The method enables a unit provided with a processor to be controlled without using a preexisting user interface, such as a preprinted base provided with command options. Instead, the user may use any base that is provided with a position-coding pattern, make graphical notations on it that correspond to the desired commands, and send the commands to the unit provided with a processor.
- According to a fifth aspect of the invention, there is provided a product kit for controlling a unit provided with a processor. The product kit comprises a control base, provided with a position-coding pattern, on which control base a command hierarchy, comprising at least two commands, is graphically represented, and on which control base each of the at least two commands is associated with a respective subarea of the position-coding pattern, and a computer program product comprising an electronic representation of the command hierarchy, whereby each of the at least two commands is identifiable based on a recording of a pair of coordinates within its respective associated subarea. Such a product kit provides a way of integrating e.g. a household appliance into a remote control system for an intelligent home.
- The electronic representation of the command hierarchy may be stored in the memory of the sensor device, and possibly completed by the user adding an address to which the commands are to be sent, thereby completing the installation of the appliance in the intelligent home. The user may then either use the control base provided for selecting predefined commands, or draw up the command hierarchy on an arbitrary base and send a command string to the unit provided with a processor.
- The invention will be described in greater detail in the following with reference to the attached schematic drawings which, for the purposes of exemplification, show embodiments of the invention according to its different aspects.
- FIG. 1 schematically shows a system in which the present invention can be used.
- FIG. 2 shows schematically a base with graphically noted commands and associations according to a first application of an embodiment of the present invention.
- FIG. 3 shows schematically a base with several graphically noted commands and associations according to a second application of an embodiment of the present invention.
- FIG. 4 shows schematically a base with several graphically noted commands and associations according to a third application of an embodiment of the present invention.
- FIG. 5 shows schematically a sensor device for use in connection with the present invention.
- FIGS. 6-12 are flow charts, which schematically illustrate methods for controlling a unit provided with a processor according to the invention.
- By way of introduction, the general principles of the invention will be described with reference to
FIG. 1. Thereafter, a number of exemplifying applications and alternative embodiments of the invention will be discussed with reference to FIGS. 2-7.
- The present invention is based on the general idea of controlling a unit provided with a processor by means of commands which are written on a position-coded base and which thereafter are sent to the unit provided with a processor. This is illustrated schematically in FIG. 1, which more specifically shows a base 1 in the form of a sheet of paper, on which commands can be written, a sensor device 2 using which the commands can be written on the base 1, recorded in electronic form and sent to a unit 3 provided with a processor, which in FIG. 1 is exemplified by a computer.
- The
sensor device 2 can communicate with the computer 3 in various ways. One alternative is for the sensor device 2 to communicate directly with the computer 3, for example via a cable, an infrared link or a short-range radio link, such as according to the Bluetooth standard. This is illustrated in FIG. 1 by a broken line 4. A second alternative is for the sensor device 2 to communicate with the computer via a local or global computer network 5, such as the Internet. The sensor device 2 can be connected to the computer network 5 by means of a computer 6 which is permanently connected to the computer network, and with which the sensor device can communicate in, for example, any one of the ways mentioned above for communication with the computer 3. This is also shown by a broken line 4. Alternatively, the sensor device can be connected to the computer network 5 by wireless means via a radio access point 7 which is reached, for example, via a mobile telephone 8, a hand-held computer 9, such as a PDA (Personal Digital Assistant), or a portable computer 10. Optionally, these units communicate with the radio access point 7 via each other, for example by the portable computer 10 or the hand-held computer 9 utilizing a modem in the mobile telephone 8 as a link to the radio access point 7, which can be a radio access point in some known system such as GSM, CDMA, GPRS or some other type of mobile communication network.
- As an additional alternative, the sensor device 2 can itself have means of communication, making possible direct connection to the radio access point 7.
- As mentioned above, the base 1 is provided with a position-coding pattern P. The position-coding pattern P is shown only schematically in FIG. 1 as a surface provided with dots. This position-coding pattern is used to record in electronic form what is written on the base. Various types of position-coding pattern which can be used for this purpose are known. In U.S. Pat. No. 5,477,012, for example, a position-coding pattern is shown where each position is coded by a unique symbol. The position-coding pattern can be read off by a pen which detects the position code optically, decodes it and generates a pair of coordinates that describes the movement of the pen across the surface. In WO 00/73983 and WO 01/26032, which are hereby incorporated by this reference, and both of which are assigned to the Applicant of the present application, another position-coding pattern is described in which each position is coded by means of a plurality of symbols of a simpler type and where each symbol contributes to the coding of more than one position. In WO 00/73983, dots of different sizes are used to code ones and zeros in the position-coding pattern, which is binary. In WO 01/26032, four different displacements of a dot from a nominal position are used to code four different pairs of bits in the position-coding pattern. A certain number of dots, for example 6*6 dots, codes a unique position. The position can be calculated from the bit values corresponding to the dots.
- The position-coding patterns in WO 00/73983 and WO 01/26032 can be detected optically by a pen that decodes the dots and generates a pair of coordinates for each set of, for example, 6*6 dots. If the position-coding pattern is read off while the pen is writing on the position-coding pattern, a sequence of pairs of coordinates is obtained that describes the movement of the pen across the position-coding pattern and thus constitutes an electronic representation of what was written on the sheet of paper.
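The displacement coding just summarized can be illustrated by a deliberately simplified toy decoder. The direction-to-bits mapping and the split of the 72 bits of a 6*6 window into an x and a y value are invented for illustration only; the actual coding described in WO 01/26032 is considerably more elaborate and is not reproduced here.

```python
# Toy sketch: each dot's displacement from its nominal position codes one of
# four pairs of bits, and a 6*6 window of dots codes one position. The
# direction-to-bits table and the x/y split are illustrative assumptions.

DISPLACEMENT_BITS = {"right": (0, 0), "up": (0, 1), "left": (1, 0), "down": (1, 1)}

def decode_position(window):
    """window: 36 displacement names, a 6*6 window read row by row.
    Returns a toy (x, y) pair: the first 18 dots give the x bits, the rest y."""
    if len(window) != 36:
        raise ValueError("a position is coded by exactly 6*6 dots in this sketch")
    bits = [b for d in window for b in DISPLACEMENT_BITS[d]]
    x = int("".join(map(str, bits[:36])), 2)
    y = int("".join(map(str, bits[36:])), 2)
    return x, y
```

Reading such windows continuously while the pen moves would yield the sequence of coordinate pairs referred to above.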
- In the following, it is assumed that the base 1 is provided with a position-coding pattern of the type described in WO 01/26032. However, it should be noted that other types of position-coding patterns, for the purposes of the invention, may have an equivalent function.
- The sensor device 2 can then also be of the type described in WO 01/26032. An example of the construction of such a device is described in the following with reference to FIG. 5.
- It comprises a casing 11 which has approximately the same shape as a pen. At the end of the casing there is an opening 12. The end is intended to abut against or to be held a short distance from the surface on which the position determination is to be carried out.
- The optics part comprises at least one light-emitting
diode 13 for illuminating the surface which is to be imaged and a light-sensitive area sensor 14, for example a CCD or CMOS sensor, for recording a two-dimensional image. Optionally, the device can also contain an optical system, such as a mirror and/or lens system (not shown). The light-emitting diode can be an infrared light-emitting diode and the sensor can be sensitive to infrared light. - The power supply for the device is obtained from a
battery 15, which is mounted in a separate compartment in the casing. It is also possible to obtain the power supply via a cable from an external power source (not shown). - The electronic circuitry part contains a signal-
processor 16 which comprises a processor with primary memory and program memory. The processor is programmed to read images from the sensor, to detect the position-coding pattern in the images and to decode this into positions in the form of pairs of coordinates, and to process the information thus recorded in electronic form in the way described in greater detail below for controlling theunit 3 provided with a processor. - In this embodiment, the device also comprises a
pen point 17, with the aid of which ordinary pigment-based writing can be written on the surface on which the position determination is to be carried out. Thepen point 17 can be extendable and retractable so that the user can control whether or not it is to be used. In certain applications the device does not need to have a pen point at all. - The pigment-based writing is suitably of a type that is transparent to infrared light and the marks suitably absorb infrared light. By using a light-emitting diode which emits infrared light and a sensor which is sensitive to infrared light, the detection of the pattern is carried out without the above-mentioned writing interfering with the pattern.
- The device may also comprise
buttons 18, by means of which the device can be activated and controlled. It also has atransceiver 19 for wireless transmission, for example using infrared light, radio waves or ultrasound, of information to and from the device. The device can also comprise adisplay 20 for displaying positions or recorded information. - The device can be divided between different physical casings, a first casing containing components which are required for recording images of the position-coding pattern and for transmitting these to components which are contained in a second casing and which carry out the position determination on the basis of the recorded image or images.
- According to one embodiment, the
sensor device 2 communicates withother units 8, 9, 10 or 6, by wireless means in a way known to those skilled in the art. The communication between theunits 8, 9, 10 and 6 respectively, theradio access point 7, thecomputer network 5 and theunit 3 provided with a processor also takes place in a way known to those skilled in the art. The function of the sensor, and the application of the position-coding pattern on the base are also methods known to those skilled in the art. - A first application of an embodiment of the present invention will now be described with reference to
FIG. 2 . On abase 1, which is provided with the position-coding pattern, is noted a number of phrases or symbols which constitute commands 22, 23 or addresses 21. A command can be a word, but can also be a symbol, provided that the sensor device is pre-programmed to recognize and identify a symbol as a command. Such pre-programming can be carried out by some form of learning, whereby a command is associated with a symbol, as will be described in more detail below. - One of the
noted phrases 21 constitutes an indication of a computer network address to which the sensor device connects, e.g. the computer network address where theunit 3 provided with a processor is located. The computer network address can also be some other type of address for computer communication, such as a Bluetooth® address. - Commands are noted on the base in a tree structure, in such a way that an address or an indication of the address constitutes the root and in such a way that commands constitute nodes in the tree structure. A node is associated with another node by a line being drawn between them.
- The
commands FIG. 2 . It is recognized that the tree structure with commands can vary in extent, from a single chain of commands to a large tree with many commands and subordinate commands. - A
command frame 24 may be noted around each command. Thecommand frame 24 may work in such a way that it delimits a part of thebase 1 which is to be associated with the command and also in such a way that it is recognized by thesensor device 2 and understood as an indication that a command is being entered and thus work as a command indicator. Between thecommands - The shape of the command frames in
FIGS. 2-4 constitutes only one example of how such frames can be designed. Other shapes are possible and different shapes can represent different types of commands or addresses. On thebase 1 there is also a “send”box 26 which indicates to the sensor device that the commands are to be sent to the unit provided with a processor. The “send”box 26 can be either a pre-printed box comprising a specific, predefined part of the position-coding pattern which codes for the “send” function, or alternatively the “send”box 26 can be noted by the user on thebase 1 and provided with a particular symbol or a particular command word which represents the “send” function. As another alternative, the “send” box can be omitted, by for example the sensor device connecting directly to the computer network address when it identifies a command box or an indication of a computer network address. - As shown in
FIG. 2 , the user has noted the address “my computer” 21 with asensor device 2, which indicates to thesensor device 2 the computer network address of the user's own computer. Around the address thecommand frame 24 is noted, which defines the subarea of the position-coding pattern which after the input will be associated with the address by the sensor device. At the same time as the noting is carried out, the sensor device reads off the position-coding pattern and forms an electronic representation of the graphical image that theaddress 21 and thecommand frame 24 constitute. The graphical image is then interpreted using OCR or ICR, so that an electronic form of the command “my computer” is obtained and is stored in a memory in the sensor device. Optionally, the address is stored together with an indication that the noted “my computer” is an address and not just a text string. As an alternative, the address derived by the sensor device, for example 197.57.3.982, can be stored. A representation of the surface on thebase 1 which is enclosed bycommand frame 24 is also stored in the sensor device and associated with the address “my computer” as an association with an IP address, here exemplified by 197.57.3.982. - When the user notes the command “format” 22 and the second command “hard disk” 23 and the command frames 24, the same procedure is repeated: the noted commands are recorded in electronic form, interpreted and stored together with the indication that they constitute commands and with the subareas of the
base 1 with which they are associated. When the user then notesassociations 25 between thecommands commands - A command string is formed in the sensor device from the stored commands 22, 23 and
associations 25, the first component of the command string consisting of the address that constitutes a root in the tree-like command structure, that is the address “my computer”.Next associations 25 are followed, until the last command “hard disk” 23 is reached, whereupon the command string is built up gradually and finally assumes, for example, the form - my computer/format/hard disk
- or alternatively
- 197.57.3.982/format/hard disk
- Command strings may be formed and stored in different ways. According to one alternative, each command string that may be formed from a given command structure or hierarchy may be stored. According to this alternative, new command strings are added as new commands or parameters are added to the command structure.
- According to another alternative, a tree structure that has been noted graphically and registered may be represented in any appropriate way in the sensor device, such as by means of any data structure for representing tree structures. When a certain command is selected, by e.g. registering a pair of coordinates within the command frame on the base, the corresponding command string is formed from the marked command or parameter and all other commands or parameters through the root.
- When the user marks the “send”
box 26, the command string is sent to the computer network address that is indicated by the address “my computer”, whereupon theunit 3 provided with a processor and connected to the computer network address executes the command and its hard disk is formatted. - In one embodiment of the present invention, the
unit 3 provided with a processor then sends an acknowledgement to thesensor device 2 that the command has been carried out. The acknowledgement can be presented to the user in the form of a sound, light or vibration signal in the sensor device or by being displayed on some other unit in the vicinity of the user, such as a mobile telephone or hand-held computer. - A second application of an embodiment of the present invention will now be described with reference to
FIG. 3 . On abase 1 provided with a position-coding pattern, anaddress 30 and a number ofcommands command frames 24 and connected byassociations 25, have been noted in a similar way to that described above with reference toFIG. 2 . The commands have been noted and linked together into a tree-like structure, where an address “home” 30 indicates a computer network address, e.g. of a unit which is situated in the user's intelligent home, which unit is arranged to control one or more units provided with a processor in the user's home. Alternatively, each unit provided with a processor in the user's home can have a network address and can be connected directly to the computer network, without any master unit as in the example. - A number of units provided with a processor are connected to the computer network address, which units are addressed by the
commands FIG. 3 , the tree structure is drawn out in full only for the temperature control of a heating system, but it is recognized that the structure can be extended as new units are added, or as existing units are provided with new functions. It is also recognized that the tree structure shown inFIG. 3 is drawn mechanically and that for the purpose of illustration it shows a larger part of the tree structure than what is necessary for the example. In an actual application the tree structure or command hierarchy may be hand-drawn, and only that part of the structure, that is those commands, which are to be used are drawn, as inFIG. 2 . Alternatively, predefined and printed or machine drawn command structures, or parts thereof, are conceivable. These may e.g. be provided by the manufacturer of a certain household appliance. Such predefined command hierarchies may be expandable by e.g. allowing the user to add further parameters or commands. It is also possible that the manufacturer of a household appliance provides a complete, predefined command hierarchy containing all the necessary commands or parameters for an appliance or a group of appliances. The command hierarchy may be provided in the form of a control base, e.g. a sheet or a paper, provided with the position-coding pattern, on which the command hierarchy is preprinted, but where the address has not been filled out. A corresponding electronic version of the command hierarchy may be provided in the form of software for installation in the sensor device. The software provides the necessary instructions for the sensor device to associate the subareas of the position-coding pattern with the proper command. - The user may install the appliance by filling in his address with the sensor device, at the proper position in the command hierarchy, thus making the appliance controllable by means of the sensor device. 
Once the command hierarchy has been installed in the sensor device, the user may use it and add commands according to what has been described above.
- Thus, the appliance manufacturer may provide its customer with a product kit comprising the control base and software for installation of the command hierarchy in the sensor device.
- A command 35 which indicates the temperature parameter and a value command 36 for this are marked and connected by means of an association line 25. The command string which is generated for setting the temperature of the heating installation is
- home/heating/temperature/20,
- whereupon the heating installation in the user's home sets the temperature to 20 degrees, in response to the user sending the command string to the computer network address, which is indicated by the address "home", by marking the node containing the temperature parameter and then the "send" box 26. It is recognized that similar command structures can be constructed for all the other units.
FIG. 4 shows a third application of an embodiment of the present invention. According to this application, the input is carried out to programs in the user's computer 3 (FIG. 1 ). Also in this case, anaddress 40 and a number ofcommands base 1 and connected byassociations 25 and marked with command frames 24. A “send”box 26 is also arranged on thebase 1. Asketch 45 has also been noted on the base and recorded in electronic form. The command structure inFIG. 4 comprises an address “my computer” 40 which indicates the address of the user's computer, and agraphics program 41, aword processing program 42 and aspreadsheet program 43 which are installed in the user'scomputer 3. - In this example, the input of data in the form of a sketch to the
graphics program 41 is described, but it is recognized that input to theword processing program 42 and thespreadsheet program 43 can be carried out analogously. Practically all information that can be noted on a base can also be transferred in this way. Other examples are calendar information, memos, database entries, etc. - As a subordinate command to the graphics program, the command “import” 44 has been noted and recorded in electronic form using the sensor device. The
sketch 45 is connected by aline 47 to the command “import” 44. Thesketch 45 is stored in the sensor device in the form of graphical data, below called “image data”. More specifically, the sketch is stored as a graphics file, e.g. a vector graphics file. This can be in a standard storage format such as .wmf (Windows® Meta File) or in a storage format specific to the sensor device. The file is transferred to the unit provided with a processor before the command is executed or in association with the command being executed. - In response to the “send” box being marked, the following command string is sent to the
unit 3 provided with a processor: - my computer/graphics program/import/image data.
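One way to picture the transmission just described is a small packaging routine: a minimal sketch, assuming a hypothetical framing format (the patent does not prescribe how the command string and the image payload travel together), with all function and variable names illustrative.

```python
# Hypothetical sketch: join the address and command path into a command
# string (as in "my computer/graphics program/import/image data") and
# frame it with the image payload. The header format is an assumption,
# not the patent's actual protocol.

def build_import_message(address, commands, image_bytes):
    """Build a command string from the address and command path, then
    prepend it, plus the payload length, to the graphics payload."""
    command_string = "/".join([address] + commands)
    header = f"{command_string}\n{len(image_bytes)}\n".encode("ascii")
    return header + image_bytes

msg = build_import_message(
    "my computer",
    ["graphics program", "import", "image data"],
    b"\x00\x01\x02",  # stand-in for the stored vector-graphics file
)
```

The receiving unit would split on the first two newlines to recover the command string and the payload length before handing the image data to the graphics program.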
- The unit 3 provided with a processor receives the command string and the image and causes the graphics program to import the image. When a standard storage format is used, the importing to the graphics program can be carried out by the program's existing capability of being executed by indicating a command string comprising a file name. However, if a storage format specific to the sensor device is used, it is necessary for the program that receives the image data to have been provided with functionality for handling the storage format of the sensor device.
- It is recognized that additional commands to the graphics program can be noted on the base 1 and used for more precise control of the input of the sketch 45. As an alternative, the area that is to constitute image data can be marked in order to delimit it from other areas that are not required to be transmitted to the unit 3 provided with a processor. Such a mark 46 is shown in FIG. 4.
- A
base 1 on which commands are noted can be used repeatedly by the commands being stored in a memory in the sensor device when the noting is carried out. This may mean that the user can indicate to the sensor device which command is to be carried out by pointing at a written-down command with the sensor device, so that the sensor device can read off the position-coding pattern corresponding to the command. The sensor device may identify this command in its memory and send it to the address with which the command was associated when the noting on the base was carried out. This may be done by simply recording a pair of coordinates within the proper subarea of the position-coding pattern, without marking any “send” box.
- According to one embodiment of the invention, further commands or parameters defining a previously noted and recorded command may be added by being noted graphically on the base and associated with the command in question. In this way, a dynamic command structure is obtained, which can be enlarged as new units are added or as new commands are introduced. Commands that have been added to the command hierarchy in this manner will thus constitute parts of the command hierarchy as described above.
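The stored lookup described above can be sketched as a table mapping subareas of the position-coding pattern to the noted command and its associated address. This is a minimal sketch under stated assumptions: the rectangle representation of a subarea and all names are illustrative, not the device's actual data structures.

```python
# Illustrative sketch: the sensor device keeps, for each noted command,
# the subarea of the position-coding pattern it was written in, the
# command itself, and the address it was associated with at noting time.
# Pointing at the command later yields a coordinate pair; this lookup
# then replaces marking a "send" box. Rectangles and names are assumed.

stored_commands = [
    # (x_min, y_min, x_max, y_max), command, associated address
    ((100, 100, 150, 110), "heating/temperature/20", "home"),
    ((100, 120, 150, 130), "lighting/on", "home"),
]

def command_at(x, y):
    """Return (address, command) for a recorded coordinate pair, or
    None if the pair falls outside every stored subarea."""
    for (x0, y0, x1, y1), command, address in stored_commands:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return address, command
    return None
```

A single recorded coordinate pair inside a stored subarea is thus enough to both identify the command and determine where to send it.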
- The methods described here may be implemented as a computer program product, as shown in FIGS. 6 and 7. The computer program product comprises a computer program which is stored in the program memory of the sensor device and is executed in its processor. As an alternative, the method can be implemented completely or partially in the form of a product-specific circuit, such as an ASIC or an FPGA, in the form of digital or analogue circuits, or in any suitable combination of these.
- The following description is directed to the inventive method based on FIGS. 6-12, which show different embodiments of the invention, which may be used separately or in combination.
-
FIG. 6 is a schematic flow chart for a method according to an embodiment of the invention. The method may be implemented in e.g. a computer program product. A graphical notation is received in step 50 in the sensor device 2. A command for the unit 3 provided with a processor is identified in step 51 based on the graphical notation. The sensor device also receives, in step 52, an address to the unit 3 provided with a processor. In step 53, the command is sent to the address. Step 53 may be initiated in different manners, e.g. by the sensor device 2 detecting a “send” box on the base, by receiving a send command, or by any other indication such as a button being depressed.
-
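The four steps of FIG. 6 can be condensed into one end-to-end sketch. This is an assumption-laden illustration: the notation is simulated as already-recognized text, the transport is stubbed out, and every name is hypothetical.

```python
# Minimal sketch of the FIG. 6 flow: receive a graphical notation (here
# simulated as recognized text), identify the command (step 51), combine
# it with the received address (step 52), and send it (step 53). The
# identification and transport are deliberately simplified stand-ins.

def control_unit(notation_text, address, transport):
    """Steps 50-53 of FIG. 6 as one function: identify the command from
    the notation and send it to the addressed unit via `transport`."""
    command = notation_text.strip().lower()   # step 51 (simplified)
    message = f"{address}/{command}"          # address + command string
    transport(message)                        # step 53: send to address
    return message

sent = []
control_unit("Import", "my computer", sent.append)
```

In the device, `transport` would be one of the communication paths discussed in relation to FIG. 1 rather than a list append.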
FIG. 7 is a schematic flow chart for a method according to an embodiment of the invention. In FIG. 7, the step 51 of identifying the command for the unit 3 provided with a processor comprises a substep 51a of at least partly converting the graphical notation into a machine-readable character format by e.g. ICR (intelligent character recognition) or HWR (handwriting recognition). Based on the output from step 51a, the command may then be identified in step 51b. Thus, a command that is written in plain text may be identified, either directly from the actual combination of machine-readable characters, or by retrieving a related command from a database on the basis of the character combination. Likewise, the sensor device may be arranged to identify an address from such a character combination (step 52).
-
FIG. 8 is a schematic flow chart for a method according to another embodiment of the invention. In FIG. 8, the step 51 of identifying the command comprises a first substep 51c of identifying a graphical symbol, which may be predefined, and thus recognized, as being equivalent to a certain command. In a second substep 51b, the corresponding command is identified. For example, a cross mark (“X”) could be interpreted as a “delete” command. Likewise, the sensor device may be arranged to identify an address from such a graphical symbol (step 52).
- Evidently, the embodiments of
FIGS. 7 and 8 may be combined, by the sensor device being capable of first determining whether a command or address is recorded in plain text or not, and then identifying the command or address based on either a character combination or a graphical symbol.
- Further commands may be added in additional steps (not shown) corresponding to those described in FIGS. 6-8.
-
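The combined identification of FIGS. 7 and 8 amounts to a two-way dispatch: try the character-recognition result first, then fall back to a predefined symbol table. The sketch below assumes an already-recognized string in place of a real ICR/HWR step, and both the command vocabulary and the symbol table are illustrative.

```python
# Sketch of the combined identification: first treat the notation as
# plain text (the output of an ICR/HWR substep, simulated here as a
# recognized string), then fall back to a table of predefined graphical
# symbols. Vocabulary and symbol meanings are illustrative assumptions.

KNOWN_COMMANDS = {"import", "delete", "send"}
SYMBOL_COMMANDS = {"X": "delete"}  # e.g. a cross mark means "delete"

def identify_command(recognized_text=None, symbol=None):
    """Identify a command from recognized characters or from a
    predefined graphical symbol; return None if neither matches."""
    if recognized_text and recognized_text.lower() in KNOWN_COMMANDS:
        return recognized_text.lower()
    if symbol in SYMBOL_COMMANDS:
        return SYMBOL_COMMANDS[symbol]
    return None
```

A database lookup of related commands, as mentioned for step 51b, could replace the simple set membership test without changing the dispatch structure.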
FIG. 9 is a schematic flow chart for a method according to yet another embodiment of the invention. In FIG. 9, the step 51 of identifying the command further comprises the substep 51d of detecting a command indicator, such as the command indicators 24 of FIGS. 2-4. In connection with the detection of the command indicator, a subarea of the position-coding pattern may be identified in step 51e. The subarea may be associated with the command in step 51f, such that a recording of a pair of coordinates within the subarea will be interpreted by the sensor device as being equal to a recording of the associated command.
-
FIG. 10 is a schematic flow chart for a method according to a further embodiment of the invention. In FIG. 10, the step 52a of receiving the address comprises receiving the address from a memory 55, which may be incorporated in the sensor device 2.
-
FIG. 11 is a schematic flow chart for a method according to another embodiment of the invention. In FIG. 11, the command (in step 51), the address (in step 52b) and the association (in step 54) are all identified based on the graphical notation 50.
-
FIG. 12 illustrates an embodiment of a method according to the invention, in which a command hierarchy has been provided in step 57 in a memory 56 of the sensor device 2. The command hierarchy may be provided by storing graphical notations that have been made on earlier occasions. It may also be provided by downloading into the memory 56 from e.g. an external memory medium, e.g. via any one of the communication paths discussed above in relation to FIG. 1. In the memory 56, commands may e.g. be associated with subareas of the position-coding pattern.
- In
FIG. 12, the step 51 of identifying the command may comprise a substep 51g of receiving a pair of coordinates from a subarea that is associated with the command. Thus, in substep 51h, the command associated with the subarea is identified.
- FIG. 12 also illustrates that, based on the command hierarchy, command strings may be formed. The command strings may be formed while commands are being entered through graphical notations, or in response to a command being identified through e.g. a graphical symbol or through a pair of coordinates within a predefined subarea.
- Different procedures for preprogramming or predefining addresses and/or commands are conceivable.
- One alternative is to use a special purpose form provided with a position-coding pattern, where the user, in e.g. dedicated boxes, writes a short name, an address or a command that is to be made identifiable for the sensor device.
- Another option is to perform the preprogramming by means of a unit that communicates with the sensor device, such as a computer, a PDA etc. Such a unit may communicate via e.g. a computer network or via short range communication such as Bluetooth® or IrDA.
- From the above description, it should be apparent to the person skilled in the art that different embodiments of the method according to the invention may be combined and that the steps may be performed in a different order.
- The graphical notation may also comprise data in the form of e.g. figures, text, sketches or graphics, which is recorded by the sensor device 2 and associated with commands or addresses as described above with reference to FIG. 4.
- The sensor device may be programmed to evaluate the graphical notations as they are received, based on their contents. For example, a plain text command, a graphical symbol or a command indicator may trigger the above-described method of identifying commands. All other, non-recognized data may be treated in any standard fashion; for example, it may be stored as strokes, i.e. sequences of coordinate pairs, in the memory of the sensor device 2.
- The invention can also be varied in other ways within the scope of the appended claims.
- For example, many different types of position-coding pattern are conceivable, in addition to those shown herein. The position-coding pattern does not necessarily need to be optically detectable. It could, for example, be readable by magnetic, capacitive, inductive, chemical or acoustic means. However, this would require a different type of sensor to be used.
- The position-coding pattern in WO 01/26032 can code coordinates of a very large number of unique positions or points. It can be considered as though all these points together make up an imaginary surface which is considerably larger than any single base. This imaginary surface can be divided into different areas which are reserved for different applications. An area can, for example, be reserved for controlling units provided with a processor. Information defining such areas and functions connected thereto can be stored in the pen and utilized for controlling the function of the pen.
- Another alternative is to allow commands to be noted within practically any part of the imaginary surface. In this way almost any surface which is provided with a position-coding pattern can be used for entering commands and for the associated control of a unit provided with a processor. In this embodiment measures may possibly need to be taken to prevent interference, as certain areas can previously have been reserved for certain functions.
- The address to which commands and data are sent can be identified by, for example, looking up addresses in a database external to the sensor device. Both commands and addresses could be sent to an external unit for interpretation and further processing. This interpretation can be carried out in the sensor device, in the unit provided with a processor, or in some other external unit, possibly dedicated to the purpose.
- It is further possible to determine in advance the address to which the command string is to be sent, for example, by the sensor device being pre-programmed with such information. According to this embodiment, it is not necessary to note an address graphically. At least one command can be noted, but command structures as described above can also be noted and stored according to this embodiment. Alternatively, the address to which the command string is to be sent may be associated with a specific area of the position-coding pattern, and thus with a specific base on which that area of the position-coding pattern is arranged.
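The last alternative above, associating the destination address with a specific area of the position-coding pattern, can be sketched as a lookup from coordinate ranges of the imaginary surface to pre-programmed addresses. The area boundaries, the one-dimensional simplification, and the addresses are all illustrative assumptions.

```python
# Sketch of resolving the destination address from the area of the
# position-coding pattern a notation was made in, for the embodiment
# where the address is pre-programmed rather than noted graphically.
# Half-open coordinate ranges stand in for reserved pattern areas.

AREA_ADDRESSES = [
    ((0, 10_000), "home"),            # e.g. a base sold with the heating system
    ((10_000, 20_000), "my computer"),
]

def address_for(x):
    """Pre-programmed address for a coordinate on the imaginary
    surface, or None if the coordinate lies in no reserved area."""
    for (lo, hi), address in AREA_ADDRESSES:
        if lo <= x < hi:
            return address
    return None
```

With such a table in the pen, any command string formed from notations on a given base is routed to that base's pre-programmed address without an address ever being written down.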
- As an alternative to storing the commands as a dynamic structure, it is possible to store a plurality of command strings where each command string represents a conceivable combination of commands.
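The pre-expansion just mentioned can be illustrated by enumerating every conceivable command string from a hierarchy. The tree representation is an assumption; the example hierarchy echoes the home/heating/temperature example from earlier in the description.

```python
# Illustrative sketch: instead of keeping the dynamic command structure,
# pre-expand it into a plurality of command strings, one per conceivable
# combination of commands (here: one per leaf of an assumed tree).

hierarchy = {
    "home": {
        "heating": {"temperature": {"20": {}}},
        "lighting": {"on": {}, "off": {}},
    }
}

def expand(tree, prefix=""):
    """Depth-first expansion of a command tree into full command
    strings, one per leaf."""
    strings = []
    for name, sub in tree.items():
        path = f"{prefix}/{name}" if prefix else name
        if sub:
            strings.extend(expand(sub, path))
        else:
            strings.append(path)
    return strings
```

The trade-off is the usual one: the flat list is simpler to match against but must be regenerated whenever a new unit or command is added, whereas the dynamic structure grows in place.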
- The form in which the commands and the address are sent can also vary: it is, for example, possible to send raw data in the form of the images which the sensor device takes of the base. It is also possible to send some form of processed, for example compressed, image data, a series of coordinates which has been derived from the images and which represents the movement of the sensor device across the base, or commands or address in character-coded format. Other ways of sending commands, addresses or data are not excluded.
- It is also possible to permit association of commands and/or data that are on different bases. This could, for example, be carried out by an address being noted on a first base and a command being noted on a second base; the address and the command are then linked by the bases being arranged next to each other, after which a line is drawn between the address and the command. The discontinuity in the position-coding pattern which then probably arises can be handled by the sensor device, e.g. by creating an association between the two areas of the position-coding pattern on the respective sides of the discontinuity, as described in WO 01/75781, which is hereby incorporated by reference.
- The sending of address, commands and data to the unit provided with a processor can be initiated by the use of the “send” button, but can also be initiated in response to an indication being written down on the base. For example, the drawing of a frame around a command can initiate transmission. It is also possible to initiate transmission as soon as the sensor device has recorded a complete command string, or when a partial area is marked within a frame that surrounds an already written-down command. Further alternatives for initiating sending comprise, but are not limited to, voice control, depressing a button on the pen, etc.
- In addition, entry of addresses and commands can be marked by, for example, the user pressing a button on the sensor device. This can precede the entry of a command, but the button can also be held depressed during the whole or part of the entry procedure.
- Other variations and combinations are also possible within the scope of the appended claims.
Claims (1)
1. A method for controlling a unit provided with a processor, characterized by:
receiving at least one graphical notation in the form of positions representing a sensor device's movement across a base that is provided with a position-coding pattern, while the graphical notation was made,
identifying, based on said at least one graphical notation, at least one command for said unit provided with a processor,
receiving an address to the unit provided with a processor, and
controlling the unit provided with a processor by sending said at least one command to the address.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/715,347 US20070152987A1 (en) | 2001-06-25 | 2007-03-08 | Control of a unit provided with a processor |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE0102236-7 | 2001-06-25 | ||
SE0102236A SE0102236L (en) | 2001-06-25 | 2001-06-25 | Control of a processor equipped unit |
US30144601P | 2001-06-29 | 2001-06-29 | |
US10/178,734 US7202861B2 (en) | 2001-06-25 | 2002-06-25 | Control of a unit provided with a processor |
US11/715,347 US20070152987A1 (en) | 2001-06-25 | 2007-03-08 | Control of a unit provided with a processor |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/178,734 Division US7202861B2 (en) | 2001-06-25 | 2002-06-25 | Control of a unit provided with a processor |
Publications (1)
Publication Number | Publication Date |
---|---|
US20070152987A1 true US20070152987A1 (en) | 2007-07-05 |
Family
ID=27354718
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/178,734 Expired - Lifetime US7202861B2 (en) | 2001-06-25 | 2002-06-25 | Control of a unit provided with a processor |
US11/715,347 Abandoned US20070152987A1 (en) | 2001-06-25 | 2007-03-08 | Control of a unit provided with a processor |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/178,734 Expired - Lifetime US7202861B2 (en) | 2001-06-25 | 2002-06-25 | Control of a unit provided with a processor |
Country Status (1)
Country | Link |
---|---|
US (2) | US7202861B2 (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5477012A (en) * | 1992-04-03 | 1995-12-19 | Sekendur; Oral F. | Optical position determination |
US5652412A (en) * | 1994-07-11 | 1997-07-29 | Sia Technology Corp. | Pen and paper information recording system |
US5661506A (en) * | 1994-11-10 | 1997-08-26 | Sia Technology Corporation | Pen and paper information recording system using an imaging pen |
US5852434A (en) * | 1992-04-03 | 1998-12-22 | Sekendur; Oral F. | Absolute optical position determination |
US6076734A (en) * | 1997-10-07 | 2000-06-20 | Interval Research Corporation | Methods and systems for providing human/computer interfaces |
US6161134A (en) * | 1998-10-30 | 2000-12-12 | 3Com Corporation | Method, apparatus and communications system for companion information and network appliances |
US6570104B1 (en) * | 1999-05-28 | 2003-05-27 | Anoto Ab | Position determination |
US6593908B1 (en) * | 2000-02-16 | 2003-07-15 | Telefonaktiebolaget Lm Ericsson (Publ) | Method and system for using an electronic reading device on non-paper devices |
US6628847B1 (en) * | 1998-02-27 | 2003-09-30 | Carnegie Mellon University | Method and apparatus for recognition of writing, for remote communication, and for user defined input templates |
US6756998B1 (en) * | 2000-10-19 | 2004-06-29 | Destiny Networks, Inc. | User interface and method for home automation system |
US7162222B2 (en) * | 1999-12-01 | 2007-01-09 | Silverbrook Research Pty Ltd | Method and system for telephone control using sensor with identifier |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU5263300A (en) | 1999-05-28 | 2000-12-18 | Anoto Ab | Position determination |
WO2001016691A1 (en) | 1999-08-30 | 2001-03-08 | Anoto Ab | Notepad |
SE517445C2 (en) | 1999-10-01 | 2002-06-04 | Anoto Ab | Position determination on a surface provided with a position coding pattern |
AU4060701A (en) | 2000-02-16 | 2001-08-27 | Telefonaktiebolaget Lm Ericsson (Publ) | Method and system for configuring and unlocking an electronic reading device |
WO2001075781A1 (en) | 2000-04-05 | 2001-10-11 | Anoto Ab | Method and system for information association |
Also Published As
Publication number | Publication date |
---|---|
US7202861B2 (en) | 2007-04-10 |
US20030014615A1 (en) | 2003-01-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |