US20070270074A1 - Robot Toy - Google Patents


Info

Publication number: US20070270074A1
Authority: US (United States)
Prior art keywords: light, emotion, robot toy, light source, recited
Legal status: Abandoned (assumed; Google has not performed a legal analysis)
Application number: US 11/775,133
Inventors: Yuichi Aochi, Wakana Yamana, Eiji Takuma, Yoshiyuki Endo, Fujio Nobata
Current assignee: Sega Toys Co., Ltd.
Original assignee: Sega Toys Co., Ltd.
Application filed by Sega Toys Co., Ltd.
Assigned to Sega Toys Co., Ltd.; assignors: Fujio Nobata, Yuichi Aochi, Yoshiyuki Endo, Eiji Takuma, Wakana Yamana

Classifications

    • G06N 3/008: Artificial life, i.e. computing arrangements simulating life, based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
    • A63H 11/00: Self-movable toy figures
    • A63H 33/22: Optical, colour, or shadow toys
    • A63H 2200/00: Computerized interactive toys, e.g. dolls

Definitions

  • LED display part 21 is formed by disposing seven LEDs 21A through 21G on a substrate 66 in a specific positional relationship. The positional relationship of the seven LEDs 21A through 21G and their emission colors are described later.
  • Member 36 for preventing light from escaping is formed from, for instance, a black resin or rubber material that does not transmit light, and is attached tightly to substrate 66 of LED display part 21.
  • A pressing part 24A and a contact part 24B of nose switch 24 are anchored to the bottom end of this member 36 for preventing light from escaping.
  • A total of seven holes 36A through 36G, corresponding to LEDs 21A through 21G of LED display part 21, are made in member 36 for preventing light from escaping; when member 36 is attached to substrate 66 of LED display part 21, the corresponding LEDs 21A through 21G are exposed to filter cover 37 via the respective holes 36A through 36G.
  • The thickness of member 36 for preventing light from escaping equals the size of the space between substrate 66 of LED display part 21 and filter cover 37, so the light emitted from each LED 21A through 21G can be shined in the direction of filter cover 37 without mixing with the light emitted from the other LEDs.
  • Filter cover 37 is formed from a semitransparent resin material, and second half case 31 is formed from a transparent resin material.
  • Filter cover 37 is white. The light emitted from LEDs 21A through 21G, which can be seen from the outside through this filter cover 37, does not mix and glows faintly inside the white cover.
  • A function for expressing emotion is loaded in control part 10 (FIG. 2) of this robot toy 1. In essence, the emotions of robot toy 1 are generated based on the type of appeal from the user, and LEDs 21A through 21G of LED display part 21 flash in an emission pattern that corresponds to the type and degree of the emotion, in such a way that robot toy 1 expresses that emotion.
  • FIG. 11 shows the positional layout of LEDs 21A through 21G of LED display part 21 used for the expression of emotion by robot toy 1. One LED 21A is disposed at a specific position on the center line in the axial direction of head part 3, and the remaining six LEDs 21B through 21G are disposed, at equal distances from LED 21A and at equal intervals from one another, in a concentric circle around LED 21A. In essence, the six peripheral LEDs 21B through 21G are disposed in a positional relationship such that each is at an apex of a regular hexagon centered on LED 21A (a coordinate sketch of this layout follows).
  • An LED capable of simultaneously or separately emitting the three colors of green, red, and blue light is used as central LED 21A, and LEDs capable of simultaneously or separately emitting the two colors of green and red light are used as peripheral LEDs 21B through 21G. Consequently, peripheral LEDs 21B through 21G can emit orange light by simultaneously emitting the two colors of green and red light.
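  • As an illustration of the hexagonal layout described above, the following minimal Python sketch (not from the patent; the function name and radius are invented for illustration) computes six peripheral positions around a central LED at the origin:

        import math

        def peripheral_positions(radius: float = 1.0):
            """Centers of six peripheral LEDs placed at the apexes of a
            regular hexagon around a central LED at the origin."""
            return [(radius * math.cos(math.radians(60 * k)),
                     radius * math.sin(math.radians(60 * k)))
                    for k in range(6)]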
  • The emotions of robot toy 1 are defined by two parameters (hereafter referred to as emotion parameters) that represent a “stimulation level” and an “affection level.” These two parameters are stored in memory 10B and take values within a range of “−8” to “8.”
  • When the emotion parameters of the “stimulation level” and the “affection level” are both 0 or within the positive range, the emotion corresponds to “joy” and “great fondness”; when the value of the “affection level” parameter is 0 or positive but the value of the “stimulation level” parameter is negative, the emotion corresponds to “normal” or “calm.”
  • Values of the “stimulation level” and “affection level” parameters that are both within the negative range correspond to the emotions of “loneliness” and “depression,” while a “stimulation level” that is 0 or positive combined with an “affection level” that is negative corresponds to the emotions of “anger” and “dislike.”
  • The degree of an emotion is expressed by the magnitude of the emotion parameters: the larger the absolute value of a parameter, the greater the degree of the emotion.
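  • This quadrant scheme can be summarized in code. The following is a hedged sketch (function names and the degree formula are invented for illustration; the patent does not specify how degree is computed from the two values):

        def classify_emotion(stimulation: int, affection: int) -> str:
            """Map the two emotion parameters (each in -8..8) to one of the
            four emotion groups described in the text."""
            if stimulation >= 0 and affection >= 0:
                return "joy / great fondness"
            if stimulation < 0 and affection >= 0:
                return "normal / calm"
            if stimulation < 0 and affection < 0:
                return "depression / loneliness"
            return "anger / dislike"  # stimulation >= 0, affection < 0

        def degree(stimulation: int, affection: int) -> int:
            """One plausible reading: the degree grows with the magnitude
            of the parameters."""
            return max(abs(stimulation), abs(affection))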
  • Control part 10 increases or decreases the values of the “stimulation level” and “affection level” emotion parameters within the range of “−8” to “8,” based on operation detection signals S3A through S3C applied from tail switch 7, head switch 23, and nose switch 24, when the user appeals to the toy by “hitting down” or “pressing” tail switch 7, head switch 23, or nose switch 24, or when there has been no appeal for a specific time.
  • Whether the value of the “stimulation level” or “affection level” parameter is increased or decreased in response to a given appeal from the user is predetermined. For example, if the user presses nose switch 24, the values of both emotion parameters are increased by one; and when the user presses head switch 23, the value of the “stimulation level” parameter is reduced by one, while the value of the “affection level” parameter is increased by one.
  • Consequently, the user can change the emotions of robot toy 1 to “joy” and “great fondness,” and increase the degree of these emotions, by pressing nose switch 24, and can change the emotion of robot toy 1 to “normal” or “calm,” and increase the degree of this emotion, by pressing head switch 23.
  • When the user swings tail switch 7 back and forth, control part 10 increases the value of the “stimulation level” parameter by one and decreases the value of the “affection level” parameter by one, each within the range of “−8” to “8.” Consequently, by swinging tail switch 7 of robot toy 1, the user can change the emotions of robot toy 1 to “anger” and “dislike,” and increase the degree of these emotions.
  • Furthermore, control part 10 reduces the values of both emotion parameters by one when there has been no appeal from the user, such as “hitting down” or “pressing” tail switch 7, head switch 23, or nose switch 24, for a specific time (for instance, 30 seconds). In this case, robot toy 1 drifts toward the emotions of “depression” and “loneliness,” and the degree of these emotions increases.
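  • A sketch of these update rules follows. The per-event deltas mirror the prose above and are illustrative only; the patent's flow charts (procedures RT2 through RT4, described later) implement the same kind of saturating increments, though the switch-to-procedure assignment given there differs slightly:

        def clamp(value: int) -> int:
            """Keep an emotion parameter within the stated range of -8..8."""
            return max(-8, min(8, value))

        # (delta stimulation, delta affection) per input event (illustrative).
        UPDATE_RULES = {
            "nose_switch":  (+1, +1),  # toward "joy" / "great fondness"
            "head_switch":  (-1, +1),  # toward "normal" / "calm"
            "tail_switch":  (+1, -1),  # toward "anger" / "dislike"
            "idle_timeout": (-1, -1),  # toward "depression" / "loneliness"
        }

        def update(stimulation: int, affection: int, event: str):
            """Apply one event's deltas, saturating at the range limits."""
            ds, da = UPDATE_RULES[event]
            return clamp(stimulation + ds), clamp(affection + da)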
  • Control part 10 then responds to the emotion of robot toy 1, and the degree thereof, as determined from the values of the two emotion parameters after such a change, by causing LEDs 21A through 21G of LED display part 21 to flash with an emission pattern corresponding to that emotion and its degree.
  • In essence, control part 10 reads the values of the “stimulation level” and “affection level” emotion parameters from memory 10B and differentiates them to determine whether the emotions are “joy” and “great affection” (the values of both parameters are 0 or within the positive range); “normal” and “calm” (the value of the “affection level” parameter is 0 or positive, and the value of the “stimulation level” parameter is negative); “depression” and “loneliness” (the values of both parameters are negative); or “anger” and “dislike” (the value of the “stimulation level” parameter is 0 or positive and the value of the “affection level” parameter is negative).
  • Control part 10 further differentiates between the values of the “stimulation level” and “affection level” parameters to determine the degree of the emotion of robot toy 1 at that time, and controls LED display part 21 based on these determination results in such a way that LEDs 21A through 21G flash with an emission pattern corresponding to that degree.
  • Each combination of values of the “stimulation level” and “affection level” parameters is answered with a pre-established emission pattern, and a program that defines, for each emission pattern, the timing by which LEDs 21A through 21G of LED display part 21 flash is prestored in memory 10B (this program is referred to as the LED drive program hereafter). Control part 10 determines the emotion of robot toy 1 and the degree thereof as described above, and then causes LEDs 21A through 21G to flash with the corresponding emission pattern in accordance with this LED drive program.
  • The emission patterns of LEDs 21A through 21G of LED display part 21 are based on a pattern wherein peripheral LEDs 21B through 21G are repeatedly turned on and off in succession, each after its adjacent LED, such that only the peripheral LEDs flash, one at a time, clockwise around central LED 21A. This emission pattern is referred to as the basic emission pattern hereafter.
  • The basic emission pattern is used for expressing the emotions other than “depression” and “loneliness”: for the emotions of “joy” and “great fondness,” peripheral LEDs 21B through 21G respond by emitting orange light; for the emotions of “normal” and “calm,” by emitting green light; and for the emotions of “anger” and “dislike,” by emitting green light.
  • When the emotions are “depression” and “loneliness,” peripheral LEDs 21B through 21G do not emit light and only central LED 21A flashes blue.
  • The emission patterns are also set in such a way that, for the emotions of “joy” and “great fondness,” “normal” and “calm,” and “anger” and “dislike,” the lights rotate faster as the degree of the emotion increases; in essence, the flashing cycle of peripheral LEDs 21B through 21G becomes shorter when the degree of the emotion is high. For the emotions of “depression” and “loneliness,” the flashing cycle of central LED 21A likewise becomes faster as the degree of the emotion increases.
  • The emotion of robot toy 1 can therefore be visually recognized from the outside based on the emission colors of LEDs 21A through 21G at a given time, and the degree of this emotion can be visually recognized from the outside based on the flashing speed of LEDs 21A through 21G at that time. A sketch of the basic pattern follows.
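  • The basic emission pattern and its degree-dependent speed might be sketched as follows (illustrative only; set_led is a hypothetical stand-in for the real LED driver, and the rotation count is arbitrary):

        import itertools
        import time

        # Peripheral LEDs 21B..21G, arranged clockwise around central LED 21A.
        PERIPHERAL = ["21B", "21C", "21D", "21E", "21F", "21G"]

        def set_led(led_id: str, color: str) -> None:
            print(f"{led_id} -> {color}")  # stand-in for the hardware driver

        def basic_emission_pattern(color: str, cycle_s: float, rotations: int = 3):
            """Flash the peripheral LEDs one at a time, clockwise; one full
            trip around the circle takes cycle_s seconds, so a shorter cycle
            reads as a stronger emotion."""
            step = cycle_s / len(PERIPHERAL)
            sequence = itertools.islice(itertools.cycle(PERIPHERAL),
                                        rotations * len(PERIPHERAL))
            for led in sequence:
                set_led(led, color)
                time.sleep(step)
                set_led(led, "off")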
  • Control part 10 executes the processing for generating the emotions of robot toy 1 and the processing for expressing the generated emotions (processing for controlling LED display part 21) based on the LED drive program, in accordance with procedure RT1 for the generation and expression of emotions shown in FIG. 15.
  • In essence, control part 10 starts procedure RT1 when the power source of robot toy 1 is turned on, and in step SP1 sets the values of the “affection level” and “stimulation level” emotion parameters representing the emotions of robot toy 1 at an initial value of “0.”
  • Control part 10 then proceeds to step SP3 and determines whether or not head switch 23 (FIG. 2), nose switch 24 (FIG. 2), or tail switch 7 (FIG. 2) has been pressed or moved.
  • When control part 10 obtains a negative result in step SP3, it proceeds to step SP4 and reads the count value of an internal timer, which is not illustrated.
  • In step SP5, based on the count value read in step SP4, the control part determines whether or not a specific time (for instance, 30 seconds) has elapsed since the “affection level” and “stimulation level” emotion parameters were set at their initial values or since head switch 23, nose switch 24, or tail switch 7 was last pressed or moved.
  • When control part 10 obtains a negative result in step SP5, it returns to step SP3 and repeats the loop of steps SP3-SP4-SP5 until an affirmative result is obtained at step SP3 or step SP5 (this outer loop is sketched below).
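  • The outer loop of procedure RT1 might look as follows. This is a sketch assuming the update function from the earlier example; read_switch_event and express are hypothetical stand-ins for the switch inputs and for steps SP13 onward:

        import time

        IDLE_TIMEOUT_S = 30.0  # the "specific time" given as an example

        def procedure_rt1(read_switch_event, express):
            """Sketch of procedure RT1 (FIG. 15): poll the switches; on
            input, update the parameters (SP6-SP8); after 30 s without an
            appeal, decay both parameters (SP9-SP12); then express the
            result on the LEDs (SP13 onward)."""
            stimulation, affection = 0, 0          # step SP1: initial values
            last_activity = time.monotonic()
            while True:
                event = read_switch_event()        # step SP3
                if event is not None:
                    stimulation, affection = update(stimulation, affection, event)
                    last_activity = time.monotonic()
                elif time.monotonic() - last_activity >= IDLE_TIMEOUT_S:
                    # steps SP4-SP5: idle timeout detected
                    stimulation, affection = update(stimulation, affection,
                                                    "idle_timeout")
                    last_activity = time.monotonic()
                else:
                    continue                       # loop SP3-SP4-SP5
                express(stimulation, affection)    # step SP13 onward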
  • When control part 10 obtains an affirmative result at step SP3 of procedure RT1, it proceeds to step SP6 and changes the values of the “affection level” and “stimulation level” emotion parameters in accordance with first procedure RT2 for changing the emotion parameters shown in FIG. 16.
  • When control part 10 proceeds to step SP6, first procedure RT2 for changing the emotion parameters starts at step SP20, and in step SP21 the control part determines whether or not the values of the “affection level” and “stimulation level” emotion parameters have reached the maximum value (“8” in the present embodiment).
  • When control part 10 obtains a negative result in step SP21, it proceeds to step SP22 and increases by one the values of both the “affection level” and the “stimulation level.” Control part 10 then finishes first procedure RT2 for changing the emotion parameters and proceeds to step SP13 of procedure RT1 for generating and expressing emotions (FIG. 15).
  • When control part 10 determines in step SP21 that the value of the “affection level” parameter is at the maximum, it proceeds to step SP23 and increases by one only the value of the “stimulation level” parameter. Control part 10 then proceeds to step SP25, completes first procedure RT2, and proceeds to step SP13 of procedure RT1.
  • When control part 10 determines in step SP21 that the value of the “stimulation level” parameter is at the maximum, it proceeds to step SP24 and increases by one only the value of the “affection level” parameter. Control part 10 then proceeds to step SP25, completes first procedure RT2, and proceeds to step SP13 of procedure RT1.
  • When control part 10 obtains an affirmative result at step SP3 of procedure RT1 as a result of the user pressing nose switch 24 of robot toy 1, it proceeds to step SP7 and changes the values of the “affection level” and “stimulation level” emotion parameters in accordance with second procedure RT3 for changing the emotion parameters shown in FIG. 17.
  • When control part 10 proceeds to step SP7, it starts second procedure RT3 at step SP30, and in step SP31 determines whether or not the value of the “affection level” parameter is at the maximum and whether or not the value of the “stimulation level” parameter is at the minimum (“−8” in the present embodiment).
  • When control part 10 obtains a negative result in step SP31, it proceeds to step SP32 and increases by one the value of the “affection level” parameter and decreases by one the value of the “stimulation level” parameter. Control part 10 then proceeds to step SP35, completes second procedure RT3, and proceeds to step SP13 of procedure RT1 (FIG. 15).
  • When control part 10 determines in step SP31 that the value of the “affection level” parameter is at the maximum, it proceeds to step SP33 and decreases by one only the value of the “stimulation level” parameter. Control part 10 then proceeds to step SP35, completes second procedure RT3, and proceeds to step SP13 of procedure RT1.
  • When control part 10 determines in step SP31 that the value of the “stimulation level” parameter is at the minimum, it proceeds to step SP34 and increases by one only the value of the “affection level” parameter. Control part 10 then proceeds to step SP35, completes second procedure RT3, and proceeds to step SP13 of procedure RT1.
  • When control part 10 obtains an affirmative result in step SP3 of procedure RT1 as a result of the user moving tail switch 7 of robot toy 1, it proceeds to step SP8 and changes the values of the “affection level” and “stimulation level” emotion parameters in accordance with third procedure RT4 for changing the emotion parameters shown in FIG. 18.
  • When control part 10 proceeds to step SP8, it begins third procedure RT4 in step SP40, and in step SP41 determines whether the value of the “affection level” parameter is at the minimum (“−8” in the present embodiment) and whether the value of the “stimulation level” parameter is at the maximum.
  • When control part 10 obtains a negative result in step SP41, it proceeds to step SP42 and decreases by one the value of the “affection level” parameter and increases by one the value of the “stimulation level” parameter. Control part 10 then proceeds to step SP45, completes third procedure RT4, and proceeds to step SP13 of procedure RT1 (FIG. 15).
  • When control part 10 determines in step SP41 that the value of the “affection level” parameter is at the minimum, it proceeds to step SP43 and increases by one only the value of the “stimulation level” parameter. Control part 10 then proceeds to step SP45, completes third procedure RT4, and proceeds to step SP13 of procedure RT1.
  • When control part 10 determines in step SP41 that the value of the “stimulation level” parameter is at the maximum, it proceeds to step SP44 and decreases by one only the value of the “affection level” parameter. Control part 10 then proceeds to step SP45, completes third procedure RT4, and proceeds to step SP13 of procedure RT1 (a worked example of this saturating behavior follows).
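  • A worked example of the saturating behavior in procedures RT2 through RT4, using the update sketch from earlier (the values are hypothetical):

        stimulation, affection = 8, 7     # "stimulation level" at its maximum
        stimulation, affection = update(stimulation, affection, "tail_switch")
        print(stimulation, affection)     # -> 8 6: only the "affection level"
                                          #    changes, as in step SP44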
  • When control part 10 obtains an affirmative result in step SP5 of procedure RT1 because a specific amount of time has passed since the “affection level” and “stimulation level” emotion parameters were set at their initial values, or since head switch 23, nose switch 24, or tail switch 7 was last pressed or moved, it proceeds to step SP9 and determines whether the values of the two emotion parameters are at the minimum.
  • When control part 10 obtains a negative result in step SP9, it proceeds to step SP10, decreases by one the values of both the “affection level” and “stimulation level” parameters, and then proceeds to step SP13.
  • When control part 10 determines in step SP9 that the value of the “affection level” parameter is at the minimum, it proceeds to step SP11, decreases by one only the value of the “stimulation level” parameter, and then proceeds to step SP13.
  • When control part 10 determines in step SP9 that the value of the “stimulation level” parameter is at the minimum, it proceeds to step SP12, decreases by one only the value of the “affection level” parameter, and then proceeds to step SP13.
  • Control part 10 then expresses the emotion of robot toy 1, and the degree thereof, by the emission pattern of LEDs 21A through 21G of LED display part 21, based on the values of the “affection level” and “stimulation level” emotion parameters updated in steps SP3 through SP12 in accordance with the presence and details of an appeal from the user.
  • In essence, when control part 10 proceeds to step SP13, it reads the changed values of the “affection level” and “stimulation level” emotion parameters from memory 10B (FIG. 2) and evaluates them. When the two parameters are both 0 or positive, control part 10 proceeds to step SP14 and expresses the emotions of robot toy 1 (“joy” and “great affection”), and the degree thereof, by the emission pattern of LEDs 21A through 21G in accordance with first procedure RT5 for emotion expression in FIG. 19.
  • When control part 10 proceeds to step SP14, first procedure RT5 for expressing emotions starts in step SP50, and in step SP51 the control part determines the value of each emotion parameter of the “stimulation level” and “affection level.”
  • When these values are within a specific range, control part 10 proceeds to step SP52 and causes peripheral LEDs 21B through 21G of LED display part 21 to flash orange in succession such that one cycle of orange light takes one second; when they are outside this range, the control part proceeds to step SP53 and causes peripheral LEDs 21B through 21G to flash orange in succession such that one cycle takes 0.5 second.
  • When control part 10 completes the processing in step SP52 or step SP53, it proceeds to step SP54, completes first procedure RT5 for expressing emotions, and returns to step SP3 of procedure RT1.
  • Control part 10 proceeds to step SP15 when it determines in step SP13 of procedure RT1 that the “stimulation level” parameter is 0 or positive and the “affection level” parameter is negative, and expresses the emotions of robot toy 1 (“anger” and “dislike”), and the degree thereof, by the emission pattern of LEDs 21A through 21G of LED display part 21 in accordance with second procedure RT6 for emotion expression in FIG. 20.
  • When control part 10 proceeds to step SP15, second procedure RT6 for expressing emotions starts, and the control part determines the value of each emotion parameter of the “stimulation level” and “affection level.”
  • When these values are within a specific range, control part 10 proceeds to step SP62 and causes peripheral LEDs 21B through 21G of LED display part 21 to flash orange in succession such that one cycle of orange light takes one second; when they are outside this range, the control part proceeds to step SP63 and causes peripheral LEDs 21B through 21G to flash orange in succession such that one cycle takes 0.5 second.
  • When control part 10 completes the processing in step SP62 or step SP63, it proceeds to step SP64, completes second procedure RT6 for expressing emotions, and returns to step SP3 of procedure RT1.
  • Control part 10 proceeds to step SP16 when it determines in step SP13 of procedure RT1 that the “stimulation level” parameter is negative and the “affection level” parameter is 0 or positive, and expresses the emotions of robot toy 1 (“normal” and “calm”), and the degree thereof, by the emission pattern of LEDs 21A through 21G of LED display part 21 in accordance with third procedure RT7 for emotion expression in FIG. 21.
  • When control part 10 proceeds to step SP16, third procedure RT7 for expressing emotions starts in step SP70, and in step SP71 the control part determines the value of each emotion parameter of the “stimulation level” and “affection level.”
  • When the value of the “affection level” parameter is within a range of 1 to 4 and the value of the “stimulation level” parameter is within a range of −1 to −4, control part 10 proceeds to step SP72 and causes peripheral LEDs 21B through 21G of LED display part 21 to flash green in succession such that one cycle of green light takes one second; when the values are outside these ranges, the control part proceeds to step SP73 and causes peripheral LEDs 21B through 21G to flash green in succession such that one cycle takes 0.5 second.
  • When control part 10 completes the processing in step SP72 or step SP73, it proceeds to step SP74, completes third procedure RT7 for expressing emotions, and returns to step SP3 of procedure RT1.
  • When control part 10 determines in step SP13 that the values of the “stimulation level” and “affection level” parameters are both negative, it proceeds to step SP17 and determines the value of each emotion parameter.
  • When these values are within a specific range, control part 10 proceeds to step SP18 and causes only central LED 21A of LED display part 21 to flash blue at a cycle of one second; when the value of either parameter is outside this range, the control part proceeds to step SP19 and causes only central LED 21A to flash blue at a cycle of 0.5 second. When control part 10 completes step SP18 or step SP19, it returns to step SP3.
  • In this way, control part 10 causes LEDs 21A through 21G of LED display part 21 to flash with an emission pattern that corresponds to the emotion of robot toy 1 and the degree thereof.
  • In essence, of LEDs 21A through 21G of LED display part 21, only peripheral LEDs 21B through 21G flash, one at a time, in the clockwise direction to express the emotions of “joy” and “great fondness,” “normal” and “calm,” and “anger” and “dislike,” while only central LED 21A flashes blue to express the emotions of “depression” and “loneliness” (a dispatch sketch follows).
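  • Pulling the pieces together, the dispatch of step SP13 and procedures RT5 through RT7 might be sketched as follows. This is illustrative only: it reuses degree, basic_emission_pattern, and set_led from the earlier sketches; the degree threshold is an assumption; and for “anger” and “dislike” it follows the orange of procedure RT6 even though the basic-pattern passage above mentions green:

        import time

        def flash_central(color: str, cycle_s: float, flashes: int = 3) -> None:
            """Flash only central LED 21A (used for depression/loneliness)."""
            for _ in range(flashes):
                set_led("21A", color)
                time.sleep(cycle_s / 2)
                set_led("21A", "off")
                time.sleep(cycle_s / 2)

        def express(stimulation: int, affection: int) -> None:
            strong = degree(stimulation, affection) > 4   # assumed threshold
            cycle_s = 0.5 if strong else 1.0              # 0.5 s vs 1 s cycle
            if stimulation < 0 and affection < 0:
                flash_central("blue", cycle_s)            # steps SP17-SP19
            elif stimulation >= 0 and affection >= 0:
                basic_emission_pattern("orange", cycle_s) # RT5: joy/fondness
            elif stimulation < 0:
                basic_emission_pattern("green", cycle_s)  # RT7: normal/calm
            else:
                basic_emission_pattern("orange", cycle_s) # RT6: anger/dislike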
  • In the embodiment described above, an LED capable of simultaneously or separately emitting the three colors of green, red, and blue light was used as central LED 21A (first light source), and LEDs capable of simultaneously or separately emitting the two colors of green and red light were used for peripheral LEDs 21B through 21G (second light sources); however, central LED 21A and peripheral LEDs 21B through 21G can emit other types and numbers of emission colors.
  • Moreover, in the embodiment described above, the six LEDs 21B through 21G are disposed at an equal distance from central LED 21A and at equal intervals from one another, but LEDs 21A through 21G can be disposed in other ways: for instance, three LEDs 70B through 70D are disposed around a central LED 70A in FIG. 22(A-1) and (A-2), four LEDs 71B through 71E around a central LED 71A in FIG. 22(B-1) and (B-2), and five LEDs 72B through 72F around a central LED 72A in FIG. 22(C-1) and (C-2).
  • In such cases, LEDs 21A through 21G can be caused to flash with emission patterns that match emotions to the emission colors of LEDs 21A through 21G, using as many emotions as there are emission colors.
  • Furthermore, in the embodiment described above, either only central LED 21A of LED display part 21 flashes, or LEDs 21A through 21G flash with an emission pattern wherein only peripheral LEDs 21B through 21G flash clockwise around central LED 21A; however, other emission patterns can be used.
  • Moreover, in the embodiment described above, the values of the “stimulation level” and “affection level” emotion parameters were changed within a range of −8 to 8 to simplify the discussion, but the values are not limited to this range and various other ranges can be selected.
  • Similarly, the details of procedure RT1 for generating and expressing emotions can be changed as needed. For instance, the control part can determine whether the emotion parameters fall within any of the ranges 0 to 3, 4 to 7, 8 through B, or C through F (in hexadecimal) in step SP13 in combination with step SP14 (or steps SP15 through SP17), such that LEDs 21A through 21G emit light in colors and emission patterns matched to these ranges (a table-driven sketch of this idea follows this list).
  • In short, a variety of means can be used as long as LEDs 21A through 21G are caused to emit light with emission patterns that correspond to combinations of the values of the emotion parameters.
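  • As one illustration of such a generalization, a table-driven variant of step SP13 could bucket each parameter into ranges (written here in hexadecimal, following the 0 to 3 / 4 to 7 / 8 through B / C through F example above) and look up a color and cycle per bucket pair. All names and table values are invented for illustration, and this variant assumes a 0 to F parameter encoding rather than the −8 to 8 range of the embodiment:

        BUCKETS = [(0x0, 0x3), (0x4, 0x7), (0x8, 0xB), (0xC, 0xF)]

        PATTERN_TABLE = {
            # (stimulation bucket, affection bucket) -> (color, cycle seconds)
            (0, 0): ("green", 1.0),
            (1, 1): ("orange", 1.0),
            (2, 2): ("orange", 0.5),
            (3, 3): ("red", 0.5),
        }

        def bucket(value: int) -> int:
            """Return the index of the range containing value (0x0..0xF)."""
            for index, (low, high) in enumerate(BUCKETS):
                if low <= value <= high:
                    return index
            raise ValueError(f"{value:#x} is outside 0x0..0xF")

        def pattern_for(stimulation: int, affection: int):
            """Fall back to a calm green pattern for unlisted combinations."""
            return PATTERN_TABLE.get((bucket(stimulation), bucket(affection)),
                                     ("green", 1.0))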

Abstract

A robot toy comprises a first light source emitting light of at least two colors simultaneously or individually, five or more second light sources arranged around the first light source, a memory for storing at least two types of emotion parameters constituting emotion, and a control means for increasing/decreasing the value of emotion parameters based on an external operation input. The control means controls the first and second light sources to emit light in an emission pattern corresponding to a combination of the values of emotion parameters, thus realizing an inexpensive robot toy capable of effectively representing emotion.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a continuation of International Application PCT/JP2006/300617, filed on Jan. 18, 2006 and pending at the time of filing of this continuation application, and claims priority from Japanese Patent Application JP 2005-010771, filed on Jan. 18, 2005, the contents of which are herein wholly incorporated by reference.

    FIELD OF THE INVENTION
  • The present invention relates to a robot toy, and is particularly useful as an inexpensive robot toy.
    BACKGROUND OF THE INVENTION
  • Robot toys that automatically generate emotions based on an appeal from a user, such as the clapping of hands or patting, or on the surrounding environment, and that express these generated emotions through the emission patterns of light-emitting elements and with sounds such as music, have recently been proposed and marketed.
  • For instance, JP 3277500 discloses LEDs (light-emitting diodes) having a shape and emission color corresponding to an emotion, disposed in the head part of a robot toy and covered by a semi-transparent cover in such a way that they are visible from the outside only when they are emitting light; these LEDs are flashed in response to the emotion of the robot toy in order to express the emotions of “joy” and “anger.” JP 3277500 also discloses a robot toy that expresses emotion wherein, in place of these above-described LEDs, multiple light-emitting elements having a shape and an emission color corresponding to the emotions of “joy” and “anger” are disposed in matrix form and are flashed selectively.
  • However, when the emotions of the robot toy are expressed by the light emission pattern of light-emitting elements, more diverse emission patterns are possible when multiple light-emitting elements are disposed in matrix form and selectively flashed, as described above, than when LEDs having a shape and emission color corresponding to particular emotions such as “joy” and “anger” are used.
  • Nevertheless, it is difficult to inexpensively produce a robot toy wherein light-emitting elements are disposed in matrix form, and skill is required to effectively express the emotions of a robot toy by the light emission pattern of light-emitting elements at a limited cost. Moreover, if the emotions of a robot toy can be effectively expressed by such an emission pattern, user interest in the robot toy and its commercial value as a “toy” would be improved.
    SUMMARY OF THE INVENTION
  • The present invention is created in light of such problems, and an object thereof is to provide a robot toy that, although it is inexpensive in construction, is capable of effectively expressing emotions.
  • In order to solve the above-mentioned problems, a robot toy of the present invention comprises a first light source capable of emitting, simultaneously or separately, light of at least two colors; five or more second light sources disposed around the first light source; a memory for storing at least two emotion parameters that define emotions; and control means for increasing and decreasing the value of the emotion parameters based on operation input from the outside, wherein the control means cause the first and second light sources to emit light based on an emission pattern corresponding to the combination of the values of the emotion parameters.
  • By means of the present invention it is possible to express diverse emotions with few light sources and to effectively express emotions with an inexpensive construction.
    BRIEF DESCRIPTION OF THE DRAWINGS
  • For the purpose of facilitating an understanding of the invention, the accompanying drawings and description illustrate a preferred embodiment thereof, from which the invention, its structure, construction and operation, and many related advantages may be readily understood and appreciated.
  • FIG. 1 is an oblique view showing the appearance of a robot toy of an embodiment of the present invention.
  • FIG. 2 is a block diagram showing the internal structure of a robot toy of an embodiment of the present invention.
  • FIG. 3 is a partial oblique view showing the specific structure of the head part.
  • FIG. 4 is an oblique view showing the structure of the drive mechanism part.
  • FIG. 5 is an oblique view for describing the pendulum gear.
  • FIG. 6 is a simplified plan view for describing the cam of the first cam gear.
  • FIG. 7 is an oblique view showing the structure of the rising and falling member.
  • FIG. 8 is an oblique view showing the structure of the ear drive member.
  • FIG. 9 is a simplified drawing for describing the movement of the head.
  • FIG. 10 is an oblique view showing the emission status of the LEDs as seen from the outside.
  • FIG. 11 is a plan view showing the positional relationship between each LED in the LED display part.
  • FIG. 12 is a schematic drawing for describing the emotions of the robot toy.
  • FIG. 13 is a chart for describing the emotions of the robot toy.
  • FIG. 14 is a schematic drawing for describing the basic light emission patterns.
  • FIG. 15 is a flow chart showing the order of processing for generating and expressing emotions.
  • FIG. 16 is a flow chart showing the order of processing for modification of the first emotion parameter.
  • FIG. 17 is a flow chart showing the order of processing for modification of the second emotion parameter.
  • FIG. 18 is a flow chart showing the order of processing for modification of the third emotion parameter.
  • FIG. 19 is a flow chart showing the order of processing for expressing the first emotion.
  • FIG. 20 is a flow chart showing the order of processing for expressing the second emotion.
  • FIG. 21 is a flow chart showing the order of processing for expressing the third emotion.
  • FIG. 22 is a simplified drawing for describing another embodiment of the present invention.
    DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • In the drawings, 1 is the robot toy, 3 is the head part, 7 is the tail switch, 10 is the control part, 21 is the LED display part, 21A through 21G, 70A through 70D, 71A through 71E, 72A through 72F, and 73A through 73G are LEDs; 23 is the head switch, and 24 is the nose switch.
  • Embodiments of the present invention will now be described while referring to each of the drawings.
    (1) Structure of the Robot Toy
  • FIG. 1 shows a robot toy 1 relating to the present embodiment. Robot toy 1 has an overall appearance and shape imitating a sitting animal, such as a dog. In essence, head part 3 is linked, in such a way that it can freely turn, to a shaft extending from the frontal end of a torso part 2, and ear parts 4A and 4B are attached, in such a way that they can freely move, to the left and right top ends of head part 3. Moreover, paddle-shaped front leg parts 5A and 5B having rounded sides are linked, in such a way that they can be freely moved manually, to the frontal end parts of the left and right side surfaces of torso part 2, and hind leg parts 6A and 6B, which protrude less than the front leg parts, are formed as a single unit with torso part 2 at the rear end parts of the left and right side surfaces of torso part 2. Furthermore, a joystick-shaped tail switch 7 is attached, in such a way that it can freely move, near the rear end part of the top surface of torso part 2.
  • Overall, torso part 2 is a right-angled parallelepiped shape having rounded sides and, as shown in FIG. 2, houses on the inside a control part 10 responsible for motion control of the entire robot toy 1, a motor drive part 11, a substrate 13 carrying a sound amplifier part 12 and other devices, a microphone 14, and a speaker 15. Moreover, an earphone jack 16 is disposed on one hind leg part 6B of torso part 2.
  • Head part 3 is a right-angled parallelepiped shape having rounded sides that is flatter than torso part 2, and houses on the inside a light sensor 20, an LED display part 21 that has multiple LEDs, other devices, and a drive mechanism part 34 (FIG. 3), described later, for driving head part 3 and ear parts 4A and 4B with a motor 22 as the power source. A head switch 23, which is a press switch, is disposed at the top end part of head part 3, and a nose switch 24, which is a press switch, is disposed at the position of the nose on head part 3. Overall, robot toy 1 has front leg parts 5A and 5B that are longer than hind leg parts 6A and 6B; therefore, one surface of head part 3 is inclined with respect to the mounting surface and faces up, and the display surface of LED display part 21 is disposed on this surface so that the display can be easily seen by the user.
  • Moreover, microphone 14 of torso part 2 is a directional microphone for ambient sounds and transmits the obtained sound signals S1 to control part 10. Moreover, light sensor 20 of head part 3 detects ambient brightness and transmits the detection results to control part 10 as brightness detection signals S2. Furthermore, tail switch 7, head switch 23, and nose switch 24 detect physical appeals from the user, such as “hitting down” or “pressing,” and transmit the detection results to control part 10 as operation detection signals S3A through S3C. It should be noted that sound signals S4 from an outside musical toy are also applied to control part 10 via earphone jack 16.
  • Control part 10 is a microcomputer, providing an information processor, that comprises a CPU (central processing unit) 10A, a memory 10B, an analog-to-digital conversion circuit 10C, and the like, and it recognizes ambient conditions and appeals from the user based on the operation detection signals S3A through S3C of tail switch 7, head switch 23, and nose switch 24.
  • Control part 10 causes robot toy 1 to move, in such a way that head part 3 is tilted and ear parts 4A and 4B are opened and closed, by transmitting motor drive signals S4 to motor drive part 11, based on these recognition results and the program pre-stored in memory 10B, to actuate motor 22. Moreover, control part 10 outputs sound and music from speaker 15 by applying specific sound signals S5 to speaker 15 via sound amplifier part 12 as necessary, and flashes the LEDs of LED display part 21 in a specific light emission pattern by applying specific drive signals S6 to LED display part 21.
  • It should be noted that the specific structure of head part 3 is shown in FIG. 3. As is clear from FIG. 3, head part 3 of robot toy 1 is formed by a drive mechanism part 34, a cover 35 of this drive mechanism part 34, LED display part 21, a member 36 for preventing light from escaping, and a filter cover 37, layered in succession from the back to the front in the direction shown by arrow a, inside a case comprising a first half case 30 that forms the external shape of the back face of head part 3, a second half case 31 that forms the external shape of the front face of head part 3, and first and second U-shaped case side members 32 that form the left and right side faces of head part 3.
  • As is clear from FIG. 4, a worm gear that is not illustrated is attached to the output axle of motor 22 of drive mechanism part 34, and this worm gear engages with a gear 42 via a gear 40 and a gear 41 formed coaxially as one unit with this gear 40. Moreover, as shown in FIG. 5, a movable member 45, wherein weights 44 are attached to one end, is attached, in such a way that the member can freely turn, to an axle 43 to which gear 42 is also attached; and a pendulum gear 46 is attached, such that it can freely turn, to the other end of this movable member 45 and engages with a gear 47 formed coaxially and as one unit with gear 42. Moreover, first and second linking gears 48 and 49 are disposed at the left and right of pendulum gear 46 such that when movable member 45 turns around axle 43, it can engage with pendulum gear 46.
  • Thus, when motor 22 of drive mechanism part 34 is driven by normal rotation, this rotational force is transmitted to gear 47 via the row of gears from the worm gear attached to the output axle of motor 22 up to gear 42. Based on this rotational force, gear 47 turns movable member 45 as one unit with pendulum gear 46 in the clockwise direction in FIG. 4, thereby enabling pendulum gear 46 to engage with first linking gear 48. When motor 22 is driven by reverse rotation, based on this rotational force, gear 47 turns movable member 45 in the counter-clockwise direction in FIG. 4, thereby enabling pendulum gear 46 to engage with second linking gear 49.
  • In this case, first linking gear 48 engages with a first cam gear 50, and a cam 50A of a specific shape is formed, as shown in FIG. 6, in the bottom surface of this first cam gear 50. Moreover, this cam 50A fits into an engagement hole 51A of a rising and falling member 51, shown in FIG. 7, which is disposed so that it can freely move up and down, as shown by arrow b in FIG. 4, in such a way that rising and falling member 51 rises and falls with cam 50A when first cam gear 50 rotates.
  • As is clear from FIG. 7, first and second shaft bodies 53A and 53B are set at the top end of rising and falling member 51 such that they are positioned symmetrically to the right and left of a shaft body 52 (FIG. 4) set in first half case 30, and as shown in FIG. 4, an ear drive member 54 is attached to these first and second shaft bodies 53A and 53B.
  • Ear drive member 54 is formed by linking, via a barrel part 61, the base parts of tweezer-shaped first and second spring parts 60A and 60B, formed of an elastic material, as one unit in such a way that it can freely bend. Moreover, ear drive member 54 is attached to rising and falling member 51 in such a way that the corresponding first and second shaft bodies 53A and 53B of rising and falling member 51 fit into holes 62AX and 62BX of first and second engagement parts 62A and 62B disposed near the base parts of these first and second spring parts 60A and 60B, and barrel part 61 engages with shaft body 52.
  • As a result, drive mechanism part 34 operates in such a way that when rising and falling member 51 rises and falls, first and second shaft bodies 53A and 53B of rising and falling member 51 move up and down relative to shaft body 52, and first and second spring parts 60A and 60B open and close as one unit with first and second engagement parts 62A and 62B of ear drive member 54.
  • Furthermore, the bottom end parts of the corresponding ear parts 4A and 4B fit into first and second spring parts 60A and 60B of ear drive member 54, and ear part 4A and ear part 4B are supported, near the bottom end, to the left and right, respectively, of the top end of first half case 30, and ear parts 4A and 4B open and close with the opening and closing motion of first and second spring parts 60A and 60B of ear drive member 54.
  • When motor 22 of drive mechanism part 34 is driven by normal rotation, pendulum gear 46 (FIG. 5) engages with first linking gear 48; the rotational force of motor 22 is transmitted to first cam gear 50; cam 50A of first cam gear 50 causes rising and falling member 51 to rise and fall based on this rotational force; and as a result, ear parts 4A and 4B open and close with the opening and closing of first and second spring parts 60A and 60B of ear drive member 54.
  • In contrast to this, second linking gear 49 engages with a second cam gear 63 and a cam 63A of a specific shape is formed as shown in FIG. 9(A) in the bottom surface of this second cam gear 63. Moreover, two parallel arm parts 65A and 65B of a two-pronged member 65 anchored to a shaft body 64, which is in turn anchored to torso part 2, extend to the bottom of second cam gear 63. Moreover, cam 63A of second cam gear 63 fits in between these two arm parts 65A and 65B of this two-pronged member 65.
  • When motor 22 of drive mechanism part 34 is driven by reverse rotation, pendulum gear 46 (FIG. 5) engages with second linking gear 49; the rotational force of motor 22 is transmitted to second cam gear 63; cam 63A of second cam gear 63 acts on arm parts 65A and 65B of two-pronged member 65, as shown in FIGS. 9(B) and (C), based on this rotational force; and in response, head part 3 swings to the left and right around shaft body 64. It should be noted that, as long as the two directions of rotation are opposite, the normal and reverse rotation of motor 22 are not limited to clockwise and counterclockwise motion.
  • On the other hand, as shown in FIG. 3, LED display part 21 is formed by disposing seven LEDs 21A through 21G on a substrate 66 at a specific positional relationship. The positional relationship of the seven LEDs 21A through 21G and the emission colors thereof are described later.
  • Member 36 for preventing light from escaping is formed from, for instance, a black resin or rubber material that will not transmit light, and is attached tightly to substrate 66 of LED display part 21. A pressing part 24A and a contact part 24B of nose switch 24 are anchored to the bottom end of this member 36 for preventing light from escaping.
  • Moreover, a total of seven holes 36A through 36G corresponding to LEDs 21A through 21G of LED display part 21 are made in this member 36 for preventing light from escaping, and when member 36 is attached to substrate 66 of LED display part 21, each LED 21A through 21G of LED display part 21 is exposed toward filter cover 37 via the respective hole 36A through 36G.
  • Moreover, the thickness of member 36 for preventing light from escaping is the same as the size of the space between substrate 66 of LED display part 21 and filter cover 37; the light emitted from each LED 21A through 21G of LED display part 21 can therefore be shined in the direction of filter cover 37 without mixing with the light emitted from the other LEDs 21A through 21G.
  • Furthermore, filter cover 37 is formed from a semitransparent resin material, and second half case 31 is formed from a transparent resin material. As a result, when LEDs 21A through 21G of LED display part 21 are off, these LEDs 21A through 21G cannot be recognized (seen) from the outside, and when LEDs 21A through 21G are on, they can be recognized (the light emitted from LEDs 21A through 21G can be seen) from the outside. Moreover, filter cover 37 diffuses the light such that the entire shape formed by the corresponding hole 36A through 36G in member 36 for preventing light from escaping appears to glow, rather than appearing as the pin-point brightness of only the individual LED 21A through 21G that has been turned on.
  • In this case, filter cover 37 is white. The light emitted from LEDs 21A through 21G that can be seen from the outside through this filter cover 37 does not mix, and it glows faintly inside the white cover.
  • (2) Expression of Emotions by Robot Toy
  • (2-1) Expression of Emotions by Robot Toy
  • Next, the expression of emotions by robot toy 1 will be described. A function for expressing emotion is loaded in control part 10 (FIG. 2) of robot toy 1. By means of this function, the emotions of robot toy 1 are generated based on the type of appeal from the user, and each of LEDs 21A through 21G of LED display part 21 flashes in an emission pattern that corresponds to the type and degree of the emotion, in such a way that robot toy 1 expresses that emotion.
  • FIG. 11 shows the positional layout of each LED 21A through 21G of LED display part 21 used for expression of emotion by robot toy 1. As is clear from FIG. 11, one LED 21A is disposed at a specific position on the center line in the axial direction of head part 3 in LED display part 21, and the remaining six LEDs 21B through 21G are disposed at equal distances from LED 21A and at equal intervals from one another, in a concentric circle with LED 21A at the center. In short, the six peripheral LEDs 21B through 21G are disposed such that each is at an apex of a regular hexagon centered on LED 21A.
  • An LED capable of simultaneously or separately emitting the three colors green, red, and blue is used as central LED 21A. Moreover, LEDs capable of simultaneously or separately emitting the two colors green and red are used as the peripheral LEDs 21B through 21G. Consequently, peripheral LEDs 21B through 21G can emit orange light by simultaneously emitting green and red light.
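  • For illustration only, the layout described above can be sketched as follows, assuming a unit distance from central LED 21A and an arbitrary starting angle (the function name and coordinate frame are illustrative assumptions, not part of the disclosure):

```python
import math

def peripheral_led_positions(r=1.0, n=6):
    """Place n peripheral LEDs at the vertices of a regular polygon
    (a regular hexagon for n=6) centered on the central LED at the origin."""
    return [(r * math.cos(2 * math.pi * k / n),
             r * math.sin(2 * math.pi * k / n))
            for k in range(n)]

# For n=6, the spacing between adjacent peripheral LEDs equals their
# distance from the center, i.e., the LEDs sit at the apexes of a
# regular hexagon, as described in the text.
print(peripheral_led_positions())
```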
  • On the other hand, as shown in FIG. 12, the emotions of robot toy 1 are defined by two parameters (hereafter referred to as emotion parameters) that represent a “stimulation level” and an “affection level”. These two parameters are stored in memory 10B and are values within a range of “−8” to “8.”
  • Moreover, when the values of the emotion parameters of the “stimulation level” and the “affection level” are both 0 or within the positive range, they correspond to emotions of “joy” and “great fondness”; and when the value of the emotion parameter of the “affection level” is 0 or positive but the value of the “stimulation level” is within the negative range, the emotion corresponds to “normal” or “calm.”
  • Values of the emotion parameters for the “stimulation level” and “affection level” that are both within the negative range correspond to the emotions of “loneliness” and “depression,” while values of the emotion parameter of the “stimulation level” that are 0 or positive and values of the emotion parameter of the “affection level” that are negative correspond to the emotions of “anger” and “dislike.”
  • Furthermore, the degree of an emotion (the intensity of an emotion) such as “joy” is expressed by the magnitude of the emotion parameters of the “stimulation level” and “affection level”: the higher the absolute value of an emotion parameter, the greater the degree of the emotion.
  • Consequently, for instance, when the values of the emotion parameters of the “stimulation level” and the “affection level” at a certain time are both “8,” the values of these emotion parameters are both highly positive, and the emotions of robot toy 1 are “joy” and “great fondness.” Both of these emotion parameters are at their maximum values; therefore, the degree of the emotions of “joy” and “great fondness” is also at a maximum.
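  • As a minimal sketch of the quadrant rules above, assuming each parameter lies in −8..8 (the function name and the use of the larger absolute value as the degree are illustrative assumptions):

```python
def classify_emotion(stimulation, affection):
    """Map the two emotion parameters to an emotion label and a degree,
    following the quadrant rules described in the text."""
    if stimulation >= 0 and affection >= 0:
        emotion = "joy / great fondness"
    elif affection >= 0:                 # stimulation is negative
        emotion = "normal / calm"
    elif stimulation < 0:                # both parameters negative
        emotion = "loneliness / depression"
    else:                                # stimulation >= 0, affection < 0
        emotion = "anger / dislike"
    degree = max(abs(stimulation), abs(affection))  # assumed degree measure
    return emotion, degree

print(classify_emotion(8, 8))  # ('joy / great fondness', 8)
```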
  • In order to change the emotions, control part 10 increases or decreases the values of the emotion parameters of the “stimulation level” and the “affection level” within a range of “−8” to “8,” based on operation detection signals S3A through S3C applied from tail switch 7, head switch 23, and nose switch 24, when the user appeals to the toy by “hitting down” or “pressing” tail switch 7, head switch 23, or nose switch 24, or when there has been no appeal for a specific time.
  • In this case, whether the value of each emotion parameter of the “stimulation level” and the “affection level” is increased or decreased is predetermined in accordance with the appeal from the user. For example, if the user presses nose switch 24, the values of the emotion parameters of the “stimulation level” and “affection level” are both increased by one; and when the user presses head switch 23, the value of the emotion parameter of the “stimulation level” is reduced by one, while the value of the emotion parameter of the “affection level” is increased by one.
  • Consequently, the user can change the emotions of robot toy 1 to emotions of “joy” and “great fondness”, and increase the degree of these emotions of “joy” and “great fondness,” by pressing nose switch 24 of robot toy 1, and the user can change the emotions of robot toy 1 to “normal” or “calm,” and increase the degree of this emotion of “normal” or “calm” by pressing head switch 23.
  • Moreover, when the user swings tail switch 7 back and forth, control part 10 increases by one the value of the emotion parameter of the “stimulation level” and decreases by one the value of the emotion parameter of the “affection level,” within a range of “−8” to “8.” Consequently, by swinging tail switch 7 of robot toy 1, the user can change the emotions of robot toy 1 to “anger” and “dislike,” and increase the degree of these emotions.
  • Furthermore, control part 10 reduces the values of the emotion parameters of the “stimulation level” and the “affection level” by one each when there has been no appeal from the user, such as “hitting down” or “pressing” tail switch 7, head switch 23, or nose switch 24, for a specific time (for instance, 30 seconds). At this time, the emotions of robot toy 1 change autonomously toward “depression” and “loneliness,” and the degree of these emotions increases.
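  • The appeal-to-parameter mapping described above can be summarized in the following sketch; the deltas follow the text (nose switch: both +1; head switch: stimulation −1, affection +1; tail switch: stimulation +1, affection −1; no appeal: both −1), and the names and clamping helper are illustrative assumptions:

```python
RANGE_MIN, RANGE_MAX = -8, 8

# (delta to "stimulation level", delta to "affection level") per appeal
APPEAL_DELTAS = {
    "nose_switch": (+1, +1),   # user presses nose switch 24
    "head_switch": (-1, +1),   # user presses head switch 23
    "tail_switch": (+1, -1),   # user swings tail switch 7
    "no_appeal":   (-1, -1),   # no appeal for a specific time (e.g., 30 s)
}

def clamp(value):
    """Keep an emotion parameter within the stated -8..8 range."""
    return max(RANGE_MIN, min(RANGE_MAX, value))

def apply_appeal(stimulation, affection, appeal):
    """Update both parameters for one appeal; clamping at the limits is
    equivalent to the boundary branches of FIGS. 16 through 18."""
    d_stim, d_aff = APPEAL_DELTAS[appeal]
    return clamp(stimulation + d_stim), clamp(affection + d_aff)
```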
  • When the values of the emotion parameters of the “stimulation level” and “affection level” change in this way, control part 10 responds to the emotion of robot toy 1, and the degree thereof, as determined from the values of the two emotion parameters after the change, by causing LEDs 21A through 21G of LED display part 21 to flash with an emission pattern corresponding to that emotion and its degree.
  • Actually, when the values of the emotion parameters of the “stimulation level” and “affection level” have been changed as described above, control part 10 reads the value of each emotion parameter of the “stimulation level” and “affection level” at this time from memory 10B and differentiates the values of the two read emotion parameters to determine whether they represent emotions of “joy” and “great fondness” (the values of the emotion parameters of the “stimulation level” and “affection level” are both 0 or within the positive range); emotions of “normal” and “calm” (the value of the emotion parameter of the “affection level” is 0 or positive, and the value of the emotion parameter of the “stimulation level” is negative); emotions of “depression” and “loneliness” (the values of the emotion parameters of the “stimulation level” and “affection level” are both negative); or emotions of “anger” and “dislike” (the value of the emotion parameter of the “stimulation level” is 0 or positive and the value of the emotion parameter of the “affection level” is negative).
  • Control part 10 further differentiates between the values of the emotion parameters of the “stimulation level” and the “affection level” to determine the degree of the emotion of robot toy 1 at that time, and controls LED display part 21 based on these determination results in such a way that LEDs 21A through 21G of LED display part 21 flash with an emission pattern corresponding to the emotion of robot toy 1, and the degree thereof, at that time.
  • An example of the above-mentioned means is shown in FIG. 13. Each LED 21A through 21G of LED display part 21 responds to a specific combination of the emotion parameters of the “stimulation level” and the “affection level” with a pre-established emission pattern, and a program that defines, for each of these emission patterns, the timing by which LEDs 21A through 21G of LED display part 21 flash is prestored in memory 10B (this program is referred to as the LED drive program hereafter).
  • Control part 10 determines the emotion of robot toy 1 and the degree thereof as described above, and then causes LEDs 21A through 21G of LED display part 21 to flash with the emission pattern corresponding to the emotion of robot toy 1 and the degree thereof at that time in accordance with this LED drive program.
  • As is clear from FIG. 13, the emission patterns of LEDs 21A through 21G of this LED display part 21 are based on an emission pattern wherein LEDs 21B through 21G are each turned on and off in succession, one after the adjacent one, such that only peripheral LEDs 21B through 21G flash, one at a time, clockwise around central LED 21A. This emission pattern is referred to as the basic emission pattern hereafter.
  • This basic emission pattern is used as the emission pattern for expressing emotions other than “depression” and “loneliness.” When the emotions are “joy” and “great fondness” (the values of the emotion parameters of the “stimulation level” and “affection level” are both 0 or positive), LEDs 21B through 21G emit orange light; when the emotions are “normal” and “calm” (the value of the emotion parameter of the “stimulation level” is negative, and the value of the emotion parameter of the “affection level” is 0 or positive), LEDs 21B through 21G emit green light; and when the emotions are “anger” and “dislike” (the value of the emotion parameter of the “stimulation level” is 0 or positive, and the value of the emotion parameter of the “affection level” is negative), LEDs 21B through 21G emit red light. When the emotions of the robot toy are “depression” and “loneliness” (the values of the emotion parameters of the “stimulation level” and the “affection level” are both negative), the peripheral LEDs 21B through 21G do not emit light and only central LED 21A flashes blue.
  • Furthermore, the emission patterns of LEDs 21A through 21G of LED display part 21 are set in such a way that when the emotions are “joy” and “great fondness,” “normal” and “calm,” or “anger” and “dislike,” the light rotates faster as the degree of the emotion increases; in essence, the flashing cycle of the peripheral LEDs 21B through 21G becomes faster when the degree of the emotion is high. When the emotions are “depression” and “loneliness,” the emission patterns are set in such a way that the flashing cycle of central LED 21A becomes faster as the degree of the emotion increases.
  • The emotion of robot toy 1 can thus be visually recognized from the outside based on the emission colors of LEDs 21A through 21G of LED display part 21 at that time, and the degree of this emotion can be visually recognized from the outside based on the flashing speed of LEDs 21A through 21G at that time.
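  • A simplified sketch of the basic emission pattern follows (one peripheral LED lit at a time, rotating in a fixed direction, with the color and cycle chosen beforehand from the emotion and its degree); set_led_color is an assumed hardware callback, not part of the disclosure:

```python
import itertools
import time

def run_basic_pattern(set_led_color, color, cycle_seconds, n_peripheral=6):
    """Flash the peripheral LEDs one at a time, in a fixed rotational
    order, so that one full revolution takes cycle_seconds. Runs until
    interrupted; set_led_color(index, color_or_None) drives one LED."""
    step = cycle_seconds / n_peripheral
    for i in itertools.cycle(range(n_peripheral)):
        set_led_color(i, color)   # turn the current LED on
        time.sleep(step)
        set_led_color(i, None)    # turn it off before advancing
```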
  • (2-2) Procedure for Generating and Processing Emotion
  • Control part 10 executes the process for generating the emotions of robot toy 1 and the processing for expressing the generated emotions (processing for controlling LED display part 21), based on the LED drive program, in accordance with procedure RT1 for the generation and expression of emotions shown in FIG. 15.
  • In essence, control part 10 starts procedure RT1 for generating and expressing emotions at step SP1 when the power source of robot toy 1 is turned on, and in step SP2, the values of the emotion parameters of both the “affection level” and the “stimulation level” representing the emotions of robot toy 1 are set at an initial value of “0.”
  • Control part 10 then proceeds to step SP3 and determines whether or not head switch 23 (FIG. 2), nose switch 24 (FIG. 2), or tail switch 7 (FIG. 2) has been pressed or moved. When control part 10 obtains results to the contrary in step SP3, it proceeds to step SP4 and reads the count value of an internal timer, which is not illustrated. In step SP5, based on the count value read in step SP4, the control part determines whether or not a specific time (for instance, 30 seconds) has lapsed since the values of the emotion parameters of the “affection level” and “stimulation level” were set at the initial values in step SP2, or since head switch 23, nose switch 24, or tail switch 7 was last pressed or moved.
  • When control part 10 obtains results to the contrary in step SP5, it returns to step SP3 and then repeats steps SP3-SP4-SP5-SP3 until an affirmative result is obtained at step SP3 or step SP5.
  • Moreover, once affirmation is obtained at step SP3 by the user pressing head switch 23 of robot toy 1, control part 10 proceeds to step SP6 and changes the value of each emotion parameter of the “affection level” and “stimulation level” in accordance with first procedure RT2 for changing the emotion parameters shown in FIG. 16.
  • In essence, when control part 10 proceeds to step SP6, first procedure RT2 for changing the emotion parameters starts at step SP20, and then in step SP21, the control part determines whether or not the value of either emotion parameter of the “affection level” or the “stimulation level” has reached the maximum value (“8” in the present embodiment).
  • When control part 10 obtains results to the contrary in step SP21, control part 10 proceeds to step SP22 and increases by one the values of the emotion parameters of both the “affection level” and the “stimulation level.” Control part 10 then proceeds to step SP25, completes first procedure RT2 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions (FIG. 15).
  • When control part 10 determines that the value of the emotion parameter of the “affection level” is at a maximum in step SP21, it proceeds to step SP23 and increases by one only the value of the emotion parameter of the “stimulation level.” Moreover, control part 10 then proceeds to step SP25, completes first procedure RT2 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions.
  • Moreover, when control part 10 determines that the value of the emotion parameter of the “stimulation level” in step SP21 is at a maximum, it proceeds to step SP24 and increases by one only the value of the emotion parameter of the “affection level.” Control part 10 then proceeds to step SP25, completes first procedure RT2 for changing the emotion parameters, and then proceeds to step SP13 of procedure RT1 for generating and expressing emotions.
  • On the other hand, when control part 10 obtains affirmation at step SP3 of procedure RT1 for generating and expressing emotions as a result of the user pressing nose switch 24 of robot toy 1, the control part proceeds to step SP7 and changes the values of the emotion parameters of the “affection level” and “stimulation level” in accordance with second procedure RT3 for changing the emotion parameters shown in FIG. 17.
  • In essence, when control part 10 proceeds to step SP7, it starts second procedure RT3 for changing the emotion parameters at step SP30, and in step SP31 determines whether or not the value of the emotion parameter of the “affection level” is at a maximum or the value of the emotion parameter of the “stimulation level” is at a minimum (“−8” in the present embodiment).
  • When control part 10 obtains results to the contrary in step SP31, it proceeds to step SP32, and increases by one the value of the emotion parameter of the “affection level” and decreases by one the value of the emotion parameter of the “stimulation level.” Control part 10 then proceeds to step SP35, completes second procedure RT3 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions (FIG. 15).
  • On the other hand, when control part 10 determines in step SP31 that the value of the emotion parameter of the “affection level” is at a maximum, it proceeds to step SP33 and decreases by one only the value of the emotion parameter of the “stimulation level.” Control part 10 then proceeds to step SP35, completes second procedure RT3 for changing the emotion parameters, and then proceeds to step SP13 of procedure RT1 for generating and expressing emotions.
  • When control part 10 determines in step SP31 that the value of the emotion parameter of the “stimulation level” is at a minimum, it proceeds to step SP34 and increases by one only the value of the emotion parameter of the “affection level.” Control part 10 then proceeds to step SP35, completes second procedure RT3 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions.
  • On the other hand, when control part 10 obtains affirmation in step SP3 of procedure RT1 for generating and expressing emotions as a result of the user moving tail switch 7 of robot toy 1, it proceeds to step SP8 and changes the values of the emotion parameters of the “affection level” and “stimulation level” in accordance with third procedure RT4 for changing the emotion parameters shown in FIG. 18.
  • In essence, when control part 10 proceeds to step SP8, it begins third procedure RT4 for changing the emotion parameters in step SP40, and in step SP41 it determines whether the value of the emotion parameter of the “affection level” is at a minimum (“−8” in the present embodiment) or the value of the emotion parameter of the “stimulation level” is at a maximum.
  • When control part 10 obtains results to the contrary in step SP41, it proceeds to step SP42 and decreases by one the value of the emotion parameter of the “affection level” and increases by one the value of the emotion parameter of the “stimulation level.” Moreover, control part 10 then proceeds to step SP45, completes third procedure RT4 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions (FIG. 15).
  • When control part 10 determines that the value of the emotion parameter of the “affection level” is at a minimum in step SP41, it proceeds to step SP43 and increases by one only the value of the emotion parameter of the “stimulation level.” Moreover, control part 10 then proceeds to step SP45, completes third procedure RT4 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions.
  • Moreover, when control part 10 determines that the value of the emotion parameter of the “stimulation level” in step SP41 is at a maximum, it proceeds to step SP44 and decreases by one only the value of the emotion parameter of the “affection level.” Control part 10 then proceeds to step SP45, completes third procedure RT4 for changing the emotion parameters, and then proceeds to step SP13 of procedure RT1 for generating and expressing emotions.
  • On the other hand, when control part 10 obtains affirmation in step SP5 of procedure RT1 for generating and expressing emotions, as a result of a specific amount of time having passed since the emotion parameters of the “affection level” and “stimulation level” were set at the initial values or since head switch 23, nose switch 24, or tail switch 7 was last pressed or moved, it proceeds to step SP9 and determines whether the value of either emotion parameter of the “affection level” or the “stimulation level” is at a minimum.
  • When control part 10 obtains results to the contrary in step SP9, it proceeds to step SP10 and decreases by one the values of each emotion parameter of the “affection level” and “stimulation level,” and then proceeds to step SP13.
  • In contrast to this, when control part 10 determines in step SP9 that the value of the emotion parameter of the “affection level” is at a minimum, it proceeds to step SP11, decreases by one only the value of the emotion parameter of the “stimulation level”, and then proceeds to step SP13.
  • When control part 10 determines in step SP9 that the value of the emotion parameter of the “stimulation level” is at a minimum, it proceeds to step SP12, decreases by one only the value of the emotion parameter of the “affection level,” and then proceeds to step SP13.
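  • Steps SP3 through SP12 thus amount to an event loop: poll the switches and an idle timer, update the two parameters, and hand the result to the expression processing. A compact sketch follows, reusing apply_appeal from the earlier sketch (the three callbacks are assumed interfaces, not from the disclosure; a sketch of express appears at the end of this subsection):

```python
def generation_loop(read_switch_event, seconds_since_last_appeal, express,
                    stimulation=0, affection=0, idle_limit=30.0):
    """Sketch of steps SP3 through SP12 of procedure RT1.
    read_switch_event returns "head_switch", "nose_switch",
    "tail_switch", or None; seconds_since_last_appeal reads the
    internal timer; express performs steps SP13 through SP19."""
    while True:
        event = read_switch_event()                      # step SP3
        if event is not None:
            stimulation, affection = apply_appeal(stimulation,
                                                  affection, event)
        elif seconds_since_last_appeal() >= idle_limit:  # steps SP4-SP5
            stimulation, affection = apply_appeal(stimulation,
                                                  affection, "no_appeal")
        else:
            continue                                     # keep polling
        express(stimulation, affection)                  # steps SP13-SP19
```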
  • In steps SP13 through SP19, control part 10 expresses the emotions of robot toy 1, and the degree thereof, by the emission pattern of LEDs 21A through 21G of LED display part 21, based on the values of the emotion parameters of the “affection level” and “stimulation level” that were updated in steps SP3 through SP12 in accordance with the presence and details of an appeal from the user.
  • In essence, when control part 10 proceeds to step SP13, it reads the values of the changed emotion parameters of the “affection level” and “stimulation level” from memory 10B (FIG. 2) and determines the values of these emotion parameters. Control part 10 proceeds to step SP14 when the values of the emotion parameters of the “stimulation level” and “affection level” are both 0 or positive, and expresses the emotions (“joy” and “great fondness”) of robot toy 1, and the degree thereof, by the emission pattern of LEDs 21A through 21G of LED display part 21 in accordance with first procedure RT5 for emotion expression in FIG. 19.
  • Actually, when control part 10 proceeds to step SP14, first procedure RT5 for expressing emotions is started in step SP50, and then in step SP51, the control part determines the value of each emotion parameter of the “stimulation level” and “affection level.” When the values of the emotion parameters of the “stimulation level” and “affection level” are both within a range of 0 to 4, control part 10 proceeds to step SP52 and causes each peripheral LED 21B through 21G of LED display part 21 to flash orange in succession such that one cycle of orange light is one second, while if the value of either emotion parameter is outside this range, the control part proceeds to step SP53 and causes each peripheral LED 21B through 21G of LED display part 21 to flash orange in succession such that one cycle of orange light is 0.5 second.
  • Moreover, when control part 10 completes the processing in step SP52 or step SP53, it proceeds to step SP54, completes the first procedure RT5 for expressing emotions, and then returns to step SP3 of procedure RT1 for generating and expressing emotions.
  • Control part 10 proceeds to step SP15 when the emotion parameter of the “stimulation level” is 0 or positive and the emotion parameter of the “affection level” is negative as determined in step SP13 of procedure RT1 for generating and expressing emotions, and expresses the emotions (“anger” and “dislike”) of robot toy 1, and the degree thereof, by the emission pattern of LEDs 21A through 21G of LED display part 21 in accordance with second procedure RT6 for emotion expression in FIG. 20.
  • In essence, when control part 10 proceeds to step SP15, second procedure RT6 for expressing emotions is started in step SP60, and then in step SP61, the control part determines the value of each emotion parameter of the “stimulation level” and “affection level.” When the value of the emotion parameter of the “stimulation level” is within a range of 0 to 4 and the value of the emotion parameter of the “affection level” is within a range of −1 to −4, control part 10 proceeds to step SP62 and causes each peripheral LED 21B through 21G of LED display part 21 to flash red in succession such that one cycle of red light is one second, while if the value of either emotion parameter is outside these ranges, the control part proceeds to step SP63 and causes each peripheral LED 21B through 21G of LED display part 21 to flash red in succession such that one cycle of red light is 0.5 second.
  • Moreover, when control part 10 completes the processing in step SP62 or step SP63, it proceeds to step SP64, completes the second procedure RT6 for expressing emotions, and then returns to step SP3 of procedure RT1 for generating and expressing emotions.
  • Control part 10 proceeds to step SP16 when the emotion parameter of the “stimulation level” is negative and the emotion parameter of the “affection level” is 0 or positive as determined in step SP13 of procedure RT1 for generating and expressing emotions, and expresses the emotions (“normal” and “calm”) of robot toy 1, and the degree thereof, by the emission pattern of LEDs 21A through 21G of LED display part 21 in accordance with third procedure RT7 for emotion expression in FIG. 21.
  • In essence, when control part 10 proceeds to step SP16, third procedure RT7 for expressing emotions is started in step SP70, and then in step SP71, the control part determines the value of each emotion parameter of the “stimulation level” and “affection level.” When the value of the emotion parameter of the “stimulation level” is within a range of −1 to −4 and the value of the emotion parameter of the “affection level” is within a range of 0 to 4, control part 10 proceeds to step SP72 and causes each peripheral LED 21B through 21G of LED display part 21 to flash green in succession such that one cycle of green light is one second, while if the value of either emotion parameter is outside these ranges, the control part proceeds to step SP73 and causes each peripheral LED 21B through 21G of LED display part 21 to flash green in succession such that one cycle of green light is 0.5 second.
  • Moreover, when control part 10 completes the processing in step SP72 or step SP73, it proceeds to step SP74, completes the third procedure RT7 for expressing emotions, and then returns to step SP3 of procedure RT1 for generating and expressing emotions.
  • On the other hand, when the values of the emotion parameters of the “stimulation level” and “affection level” are both negative as determined in step SP13 of procedure RT1 for generating and expressing emotions, control part 10 proceeds to step SP17 and determines the value of each emotion parameter of the “stimulation level” and the “affection level.”
  • When the values of the emotion parameters of the “stimulation level” and “affection level” are both within a range of −1 through −4, control part 10 proceeds to step SP18 and causes only central LED 21A of LED display part 21 to flash blue at a cycle of once per second, while if the value of either emotion parameter of the “stimulation level” or the “affection level” is outside these ranges, the control part proceeds to step SP19 and causes only central LED 21A of LED display part 21 to flash blue at a cycle of once per 0.5 second. Moreover, when control part 10 completes step SP18 or step SP19, it returns to step SP3.
  • Thus, control part 10 causes LEDs 21A through 21G of LED display part 21 to flash with an emission pattern that corresponds to the emotions of robot toy 1, and the degree thereof.
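  • Steps SP13 through SP19 therefore select a pattern, a color, and a flashing cycle from the updated parameters. The following sketch combines the four branches under the color assignments described above; run_pattern stands in for the LED driver and is an assumed interface:

```python
def express(stimulation, affection, run_pattern=print):
    """Sketch of steps SP13 through SP19: pick the emission pattern from
    the emotion quadrant, and halve the cycle when the degree is high
    (either parameter outside the -4..4 band)."""
    high_degree = abs(stimulation) > 4 or abs(affection) > 4
    cycle = 0.5 if high_degree else 1.0
    if stimulation < 0 and affection < 0:
        run_pattern(("central", "blue", cycle))     # depression / loneliness
    elif stimulation >= 0 and affection >= 0:
        run_pattern(("rotating", "orange", cycle))  # joy / great fondness
    elif stimulation < 0:
        run_pattern(("rotating", "green", cycle))   # normal / calm
    else:
        run_pattern(("rotating", "red", cycle))     # anger / dislike
```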
  • (3) Effects
  • As previously described, by means of robot toy 1 of the present embodiment, of LEDs 21A through 21G of LED display part 21, only peripheral LEDs 21B through 21G flash, one at a time, in the clockwise direction in order to express the emotions of “joy” and “great fondness,” “normal” and “calm,” and “anger” and “dislike,” while only central LED 21A flashes blue in order to express the emotions of “depression” and “loneliness.”
  • By means of this expression method, it is possible, for instance, to express the emotions of robot toy 1 with more diverse emission patterns than the simple flashing of an LED of a pre-determined shape, while at the same time emotions can be expressed effectively by a robot toy 1 of inexpensive construction, because LED display part 21 can be formed from fewer LEDs than when multiple LEDs are arranged in matrix form.
  • (4) Other Embodiments
  • By means of the above-mentioned embodiment, an LED capable of simultaneously or separately emitting the three colors of light green, red, and blue was used as central LED 21A (first light source) of LED display part 21, and LEDs capable of simultaneously or separately emitting the two colors of light green and red were used for the other peripheral LEDs 21B through 21G (second light sources). However, central LED 21A and peripheral LEDs 21B through 21G can each emit other types and numbers of emission colors.
  • Moreover, by means of the above-mentioned embodiment, the six LEDs 21B through 21G are disposed such that they are at an equal distance from central LED 21A and at equal intervals from one another, but the LEDs can be disposed in other ways. For instance, three LEDs 70B through 70D are disposed around a central LED 70A in FIG. 22 (A-1) and (A-2), four LEDs 71B through 71E are disposed around a central LED 71A in FIG. 22 (B-1) and (B-2), and five LEDs 72B through 72F are disposed around a central LED 72A in FIG. 22 (C-1) and (C-2). Moreover, it is possible to dispose six LEDs 73B through 73G around a central LED 73A as in FIG. 22(D), or eight LEDs around the central LED as in FIG. 22(E). It is also possible to use more than eight LEDs. The same effect as in the above-mentioned embodiment can be realized in each case.
  • Furthermore, by means of the above-described embodiment, there were two types of emotion parameters, a “stimulation level” and an “affection level,” but it is also possible to use other numbers and types of emotion parameters. For instance, as many emotions can be defined as there are emission colors of LEDs 21A through 21G, and LEDs 21A through 21G can be caused to flash with an emission pattern that matches each emotion with an emission color.
  • By means of the above-described embodiment, only central LED 21A of LED display part 21 flashes, or LEDs 21A through 21G flash with an emission pattern wherein only the peripheral LEDs 21B through 21G flash clockwise around central LED 21A. However, other emission patterns can be used. For instance, it is possible to control the lights in such a way that peripheral LEDs 21B through 21G are turned on and off in succession with respect to adjacent LEDs such that only peripheral LEDs 21B through 21G flash, one at a time, counterclockwise around central LED 21A, or such that only peripheral LEDs 21B through 21G flash, several at a time, clockwise or counterclockwise around central LED 21A.
  • Furthermore, by means of the above-described embodiment, the values of the emotion parameters of the “stimulation level” and the “affection level” were changed within a range of −8 to 8 to simplify the discussion, but the values are not limited to this range and various ranges can be selected. In addition, the details of procedure RT1 for generating and expressing emotions can be changed as needed. For instance, in actual computer processing, the numbers within a range of −8 to 8 can be represented as numbers within the range of 0 to 15 (0-F in hexadecimal); therefore, rather than determining whether the above-described emotion parameters are positive or negative at step SP13 of procedure RT1 for generating and expressing emotions, the control part can determine whether the emotion parameters fall within any of 0 to 3, 4 to 7, 8 to B, or C to F, using step SP13 in combination with step SP14 (or steps SP15 through SP17), such that LEDs 21A through 21G are caused to emit light in colors and emission patterns matched to these ranges. In short, a variety of means can be used as long as LEDs 21A through 21G are caused to emit light with emission patterns that correspond to combinations of the values of the emotion parameters.
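  • The hexadecimal remapping mentioned above could look like the following sketch (the offset encoding is an assumption; note that the −8..8 range has 17 values, so one endpoint is clamped when stored in 4 bits):

```python
def to_nibble(value):
    """Store an emotion parameter (-8..8) as a 4-bit value (0..15)."""
    return max(0, min(15, value + 8))

def bucket(nibble):
    """Bucket a 4-bit value into the four ranges named in the text."""
    return ("0-3", "4-7", "8-B", "C-F")[nibble // 4]

print(bucket(to_nibble(5)))  # 'C-F': a strongly positive parameter
```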

Claims (20)

1. A robot toy that artificially expresses an emotion by the flashing of light sources, said robot toy comprising:
a head part of the robot toy having a frontal surface;
a group of light sources disposed at the frontal surface of the head part of the robot toy and comprising one first light source and at least five second light sources disposed at virtually equal intervals around the first light source as the center;
a cover that covers the light sources and is formed of a semi-transparent material through which the light emitted from a light source can be recognized from the outside when the light source has been turned on; and
an information processor for controlling the flashing of the light sources, wherein the information processor executes control in response to input from the outside and realizes multiple light emission patterns, comprising a light emission pattern whereby the five or more second light sources turn on and off in succession in a clockwise or counterclockwise fashion.
2. A robot toy as recited in claim 1, comprising a memory for storing at least two types of emotion parameters that define emotions, wherein the information processor increases or decreases the values of the emotion parameters based on operation input from the outside and controls the emission of light from the first and/or second light sources by a light emission pattern corresponding to the increase or decrease in the emotion parameters.
3. A robot toy as recited in claim 2, wherein the information processor causes the first and/or second light sources to flash in response to the degree of the emotion as determined by the value of the emotion parameter at a cycle that becomes faster with any increase in this degree.
4. A robot toy as recited in claim 2, wherein the information processor causes the first and/or second light source to flash, by light of two colors being emitted simultaneously or individually, in a color corresponding to the type of emotion as determined by the combination of the values of the emotion parameters.
5. A robot toy as recited in claim 1, wherein the first light source is capable of emitting more colors of light than can be emitted by the second light sources individually.
6. A robot toy as recited in claim 2, wherein the information processor causes the first light source to emit light in a specific color that cannot be emitted by the second light sources when the type of emotion as determined by the combination of the values of the emotion parameters corresponds to a pre-established specific type.
7. A robot toy as recited in claim 2, comprising two or more operation input switches, wherein the information processor increases the values of the emotion parameters in accordance with the operation input of each of the operation input switches, and decreases the values of the emotion parameters when there has been no operation input for a specific time.
8. A robot toy comprising:
a torso part;
a head part disposed at the front of the torso part such that the head part can move in relation to the torso part;
a pair of front leg parts disposed at the front of the torso part;
a frontal surface part comprising:
a virtually flat, semi-transparent cover designed such that light can be emitted through the cover from beneath while the cover hides the area beneath the cover,
multiple light source parts disposed on the inside of said frontal surface part, comprising one first light source part and at least five second light source parts disposed annularly at virtually equal intervals surrounding the first light source part, and disposed in such a way that the flashing lights do not mix with one another;
a switch that functions in response to operation by a user; and
an information processor for controlling the turning on and off of the multiple light source parts in response to signals from the switch, wherein the information processor is designed such that the turning on and off of the light source parts is controlled by light emission patterns comprising a light emission pattern for the successive turning on and off of the multiple second light source parts in a clockwise or counterclockwise fashion.
9. The robot toy recited in claim 8, wherein the first light source part comprises at least two light sources having different emission colors, and the light emission patterns comprise a light-emission pattern wherein light is emitted by one of the light sources of the first light source part and a light-emission pattern wherein at least two light sources simultaneously emit light.
10. The robot toy recited in claim 8, wherein the frontal surface part of the head part is formed in an oblong right-angled parallelepiped shape having rounded sides.
11. The robot toy recited in claim 8, wherein the switches are disposed below the position of the multiple light source parts on the top face part and the frontal surface part of the head part.
12. The robot toy recited in claim 8, further comprising a tail switch disposed at the back of the torso part.
13. The robot toy recited in claim 8, wherein the first light source part comprises three light sources having different emission colors, and each of the second light source parts comprises two light sources having different emission colors.
14. A musical toy in the shape of an animal, comprising:
a torso part;
a head part disposed at the front part of the torso part such that the head can move in relation to the torso part;
a pair of front legs disposed at the front of the torso part; and
a pair of hind legs disposed at the back of the torso part, and having an appearance and shape that imitate a sitting animal, wherein the head part comprises a frontal surface part, which is an oblong right-angled parallelepiped shape having rounded sides and is flat and wider than the torso part, and further houses a sound amplifier part and a speaker; and a display part is disposed at the frontal surface part, and this display part displays in accordance with the music output from the speaker.
15. The robot toy as recited in claim 14, comprising a memory for storing at least two types of emotion parameters that define emotions, wherein the information processor increases or decreases the values of the emotion parameters based on operation input from the outside and controls the emission of light from the first and/or second light sources by a light emission pattern corresponding to the increase or decrease in the emotion parameters.
16. The musical toy as recited in claim 14, comprising an information processor and a first and/or second light source.
17. The robot toy as recited in claim 16, wherein the information processor causes the first and/or second light sources to flash in response to the degree of the emotion as determined by the value of the emotion parameter at a cycle that becomes faster with any increase in this degree.
18. The robot toy as recited in claim 16, wherein the information processor causes the first and/or second light source to flash in a color corresponding to the type of emotion as determined by the combination of the values of the emotion parameters by light of two colors being simultaneously or individually emitted.
19. The robot toy as recited in claim 16, wherein the first light source is capable of emitting more colors of light than can be emitted by the second light sources individually.
20. The robot toy as recited in claim 16, wherein the information processor causes the first light source to emit light in a specific color that cannot be emitted by the second light sources when the type of emotion as determined by the combination of the values of the emotion parameters corresponds to a pre-established specific type.
US11/775,133 2005-01-18 2007-07-09 Robot Toy Abandoned US20070270074A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2005010771A JP2006198017A (en) 2005-01-18 2005-01-18 Robot toy
JP2005-010771 2005-01-18
PCT/JP2006/300617 WO2006077868A1 (en) 2005-01-18 2006-01-18 Robot toy

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/300617 Continuation WO2006077868A1 (en) 2005-01-18 2006-01-18 Robot toy

Publications (1)

Publication Number Publication Date
US20070270074A1 true US20070270074A1 (en) 2007-11-22

Family

ID=36692257

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/775,133 Abandoned US20070270074A1 (en) 2005-01-18 2007-07-09 Robot Toy

Country Status (3)

Country Link
US (1) US20070270074A1 (en)
JP (1) JP2006198017A (en)
WO (1) WO2006077868A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8292689B2 (en) 2006-10-02 2012-10-23 Mattel, Inc. Electronic playset
US8062089B2 (en) 2006-10-02 2011-11-22 Mattel, Inc. Electronic playset
CN105676740A (en) * 2016-03-01 2016-06-15 深圳前海勇艺达机器人有限公司 Method and apparatus enabling robot to have illumination function
WO2020009098A1 (en) * 2018-07-02 2020-01-09 Groove X株式会社 Robot


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3277500B2 (en) * 1999-05-10 2002-04-22 ソニー株式会社 Robot device
JP2002154081A (en) * 2000-11-16 2002-05-28 Nec Access Technica Ltd Robot, its facial expression method and detecting method for step difference and lifting-up state
JP2003060745A (en) * 2001-08-22 2003-02-28 Sony Corp Device and method for transmitting information and monitoring device
JP2003071765A (en) * 2001-09-04 2003-03-12 Sony Corp Robot device and input method therefor

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4556932A (en) * 1983-03-28 1985-12-03 Lehrer Bradley D Lighted novelty item
US4654659A (en) * 1984-02-07 1987-03-31 Tomy Kogyo Co., Inc Single channel remote controlled toy having multiple outputs
US5141464A (en) * 1991-01-23 1992-08-25 Mattel, Inc. Touch responsive animated toy figure
US5402702A (en) * 1992-07-14 1995-04-04 Jalco Co., Ltd. Trigger circuit unit for operating light emitting members such as leds or motors for use in personal ornament or toy in synchronization with music
US5461188A (en) * 1994-03-07 1995-10-24 Drago; Marcello S. Synthesized music, sound and light system
US5668333A (en) * 1996-06-05 1997-09-16 Hasbro, Inc. Musical rainbow toy
US6337552B1 (en) * 1999-01-20 2002-01-08 Sony Corporation Robot apparatus
US6458011B1 (en) * 1999-05-10 2002-10-01 Sony Corporation Robot device
US6534943B1 (en) * 1999-10-25 2003-03-18 Sony Corporation Robot device and learning method of robot device
USD457203S1 (en) * 1999-11-02 2002-05-14 Sega Toys, Ltd. Robotic dog
USD448433S1 (en) * 1999-11-02 2001-09-25 Sega Toys Ltd. Robotic dog
US7063591B2 (en) * 1999-12-29 2006-06-20 Sony Corporation Edit device, edit method, and recorded medium
US7040951B2 (en) * 2000-02-04 2006-05-09 Hornsby James R Amusement device
US6672934B2 (en) * 2000-02-04 2004-01-06 Trendmasters, Inc. Amusement device
US6462498B1 (en) * 2000-05-09 2002-10-08 Andrew J. Filo Self-stabilizing walking apparatus that is capable of being reprogrammed or puppeteered
US6682390B2 (en) * 2000-07-04 2004-01-27 Tomy Company, Ltd. Interactive toy, reaction behavior pattern generating device, and reaction behavior pattern generating method
US7099742B2 (en) * 2000-10-20 2006-08-29 Sony Corporation Device for controlling robot behavior and method for controlling it
US7024280B2 (en) * 2001-06-14 2006-04-04 Sharper Image Corporation Robot capable of detecting an edge
US6594551B2 (en) * 2001-06-14 2003-07-15 Sharper Image Corporation Robot for expressing moods
US7363108B2 (en) * 2003-02-05 2008-04-22 Sony Corporation Robot and control method for controlling robot expressions
US7374482B2 (en) * 2003-08-12 2008-05-20 Ghaly Nabil N Interactive slot machine

Cited By (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070087652A1 (en) * 2005-10-05 2007-04-19 Wen-Bin Hsu Pet-like toy combined with MP3 player
US20090098792A1 (en) * 2007-10-12 2009-04-16 Hong Fu Jin Precision Industry (Shenzhen) Co.,Ltd. Electronic toy capable of emotion displaying using an emotion unit
US20090104844A1 (en) * 2007-10-19 2009-04-23 Hon Hai Precision Industry Co., Ltd. Electronic dinosaur toys
US7988522B2 (en) * 2007-10-19 2011-08-02 Hon Hai Precision Industry Co., Ltd. Electronic dinosaur toy
US20100305448A1 (en) * 2009-05-26 2010-12-02 Anne Cecile Dagonneau Apparatus and method for indicating ultrasound probe orientation and activation status
US8515092B2 (en) 2009-12-18 2013-08-20 Mattel, Inc. Interactive toy for audio output
US20110151746A1 (en) * 2009-12-18 2011-06-23 Austin Rucker Interactive toy for audio output
US9705760B2 (en) 2011-03-23 2017-07-11 Linkedin Corporation Measuring affinity levels via passive and active interactions
US8943157B2 (en) 2011-03-23 2015-01-27 Linkedin Corporation Coasting module to remove user from logical group
US9413706B2 (en) 2011-03-23 2016-08-09 Linkedin Corporation Pinning users to user groups
US8868739B2 (en) 2011-03-23 2014-10-21 Linkedin Corporation Filtering recorded interactions by age
US8880609B2 (en) 2011-03-23 2014-11-04 Linkedin Corporation Handling multiple users joining groups simultaneously
US9691108B2 (en) 2011-03-23 2017-06-27 Linkedin Corporation Determining logical groups without using personal information
US8892653B2 (en) 2011-03-23 2014-11-18 Linkedin Corporation Pushing tuning parameters for logical group scoring
US8930459B2 (en) 2011-03-23 2015-01-06 Linkedin Corporation Elastic logical groups
US8935332B2 (en) 2011-03-23 2015-01-13 Linkedin Corporation Adding user to logical group or creating a new group based on scoring of groups
US8943138B2 (en) * 2011-03-23 2015-01-27 Linkedin Corporation Altering logical groups based on loneliness
US8943137B2 (en) 2011-03-23 2015-01-27 Linkedin Corporation Forming logical group for user based on environmental information from user device
US9325652B2 (en) 2011-03-23 2016-04-26 Linkedin Corporation User device group formation
US8954506B2 (en) 2011-03-23 2015-02-10 Linkedin Corporation Forming content distribution group based on prior communications
US20150302082A1 (en) * 2011-03-23 2015-10-22 Linkedin Corporation Determining membership in a group based on loneliness score
US8959153B2 (en) 2011-03-23 2015-02-17 Linkedin Corporation Determining logical groups based on both passive and active activities of user
US8965990B2 (en) 2011-03-23 2015-02-24 Linkedin Corporation Reranking of groups when content is uploaded
US8972501B2 (en) 2011-03-23 2015-03-03 Linkedin Corporation Adding user to logical group based on content
US9071509B2 (en) 2011-03-23 2015-06-30 Linkedin Corporation User interface for displaying user affinity graphically
US9094289B2 (en) 2011-03-23 2015-07-28 Linkedin Corporation Determining logical groups without using personal information
US9536270B2 (en) 2011-03-23 2017-01-03 Linkedin Corporation Reranking of groups when content is uploaded
US9413705B2 (en) * 2011-03-23 2016-08-09 Linkedin Corporation Determining membership in a group based on loneliness score
US20120320077A1 (en) * 2011-06-17 2012-12-20 Microsoft Corporation Communicating status and expression
US9656392B2 (en) * 2011-09-20 2017-05-23 Disney Enterprises, Inc. System for controlling robotic characters to enhance photographic results
US20130073087A1 (en) * 2011-09-20 2013-03-21 Disney Enterprises, Inc. System for controlling robotic characters to enhance photographic results
US9306998B2 (en) 2011-09-21 2016-04-05 Linkedin Corporation User interface for simultaneous display of video stream of different angles of same event from different users
US9154536B2 (en) 2011-09-21 2015-10-06 Linkedin Corporation Automatic delivery of content
US9497240B2 (en) 2011-09-21 2016-11-15 Linkedin Corporation Reassigning streaming content to distribution servers
US9131028B2 (en) 2011-09-21 2015-09-08 Linkedin Corporation Initiating content capture invitations based on location of interest
US9654535B2 (en) 2011-09-21 2017-05-16 Linkedin Corporation Broadcasting video based on user preference and gesture
US9654534B2 (en) 2011-09-21 2017-05-16 Linkedin Corporation Video broadcast invitations based on gesture
US8886807B2 (en) 2011-09-21 2014-11-11 Linkedin Corporation Reassigning streaming content to distribution servers
US9774647B2 (en) 2011-09-21 2017-09-26 Linkedin Corporation Live video broadcast user interface
US10433767B2 (en) * 2011-12-21 2019-10-08 Koninklijke Philips N.V. Peel and stick CPR assistance device
US20150045697A1 (en) * 2011-12-21 2015-02-12 Koninklijke Philips N.V. Peel and stick CPR assistance device
CN102980103A (en) * 2012-11-27 2013-03-20 华南理工大学 Machine vision LED (light emitting diode) illumination source
WO2016018881A1 (en) * 2014-07-28 2016-02-04 Gonchar Sergei Sparkly children's products
US10068424B2 (en) 2016-05-13 2018-09-04 Universal Entertainment Corporation Attendant device and gaming machine
US10192399B2 (en) * 2016-05-13 2019-01-29 Universal Entertainment Corporation Operation device and dealer-alternate device
US10275982B2 (en) 2016-05-13 2019-04-30 Universal Entertainment Corporation Attendant device, gaming machine, and dealer-alternate device
US10290181B2 (en) 2016-05-13 2019-05-14 Universal Entertainment Corporation Attendant device and gaming machine
EP3456487A3 (en) * 2017-06-23 2019-04-17 Casio Computer Co., Ltd. Robot, method of controlling the same, and program
CN109108961A (en) * 2017-06-23 2019-01-01 卡西欧计算机株式会社 Robot, the control method of robot and storage medium
US11000952B2 (en) 2017-06-23 2021-05-11 Casio Computer Co., Ltd. More endearing robot, method of controlling the same, and non-transitory recording medium
US10307911B2 (en) * 2017-08-30 2019-06-04 Panasonic Intellectual Property Management Co., Ltd. Robot
USD919688S1 (en) * 2017-10-31 2021-05-18 Sony Corporation Ear of robot
US11123873B2 (en) * 2018-08-07 2021-09-21 Circulus Inc. Method and server for controlling interaction robot
KR20220077938A (en) * 2020-11-05 2022-06-10 (주)로보티즈 Companion robot
KR102415997B1 (en) 2020-11-05 2022-07-05 (주)로보티즈 Companion robot

Also Published As

Publication number Publication date
WO2006077868A1 (en) 2006-07-27
JP2006198017A (en) 2006-08-03

Similar Documents

Publication Title
US20070270074A1 (en) Robot Toy
US7442107B1 (en) Electronic toy, control method thereof, and storage medium
US6544098B1 (en) Interactive toy
US6558225B1 (en) Electronic figurines
US7507139B1 (en) Electromechanical toy
JP2006198017A5 (en)
US7428994B1 (en) Toy adapting to color of surroundings
CN108697938A (en) Robot with variable role
WO2000068879A1 (en) Robot device, its control method, and recorded medium
NO324232B1 (en) Remote controlled toy
US20090098792A1 (en) Electronic toy capable of emotion displaying using an emotion unit
WO2024067701A1 (en) Toy
JP3277500B2 (en) Robot device
WO2024067700A1 (en) Toy
US20150093958A1 (en) System for Controlled Distribution of Light in Toy Characters
JP2012045186A (en) Action toy
US7695341B1 (en) Electromechanical toy
KR20080075268A (en) Robot and plaything that move and play in response to external stimuli
JP4757979B2 (en) Electronic toy
JP2002346064A (en) Game machine equipped with illumination device
JP2002066155A (en) Emotion-expressing toy
US8029330B2 (en) Doll with two conductor tethered remote control
JP2000202171A (en) Sound sensitive toy
JP3121098U (en) Accessory toy
CN210302376U (en) Toy structure with communication and touch functions

Legal Events

Date Code Title Description
AS Assignment

Owner name: SEGA TOYS CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AOCHI, YUICHI;YAMANA, WAKANA;TAKUMA, EIJI;AND OTHERS;REEL/FRAME:019583/0627;SIGNING DATES FROM 20070711 TO 20070717

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION