WO2003021924A1 - A method of operating a communication system - Google Patents

A method of operating a communication system Download PDF

Info

Publication number
WO2003021924A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
image
expression
reference value
communication system
Prior art date
Application number
PCT/GB2002/004010
Other languages
French (fr)
Inventor
Stephen Philip Rowe
Peter Saddington
Original Assignee
Roke Manor Research Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GB0120841A external-priority patent/GB0120841D0/en
Priority claimed from GB0122374A external-priority patent/GB0122374D0/en
Application filed by Roke Manor Research Limited filed Critical Roke Manor Research Limited
Publication of WO2003021924A1 publication Critical patent/WO2003021924A1/en

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)
  • Telephonic Communication Services (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

A communication system comprises a first communication device and a second communication device. Each device comprises an input; a display and a processor for controlling operation of the device. In use, an expression is transmitted from the first user to the second user and a reference value is allocated to the expression by the first device. The reference value is transmitted to the second device and the second device displays an image under control of the second processor dependent upon the reference value received by the second device.

Description

A METHOD OF OPERATING A COMMUNICATION SYSTEM
This invention relates to a method of operating a communication system, for example a mobile telephone. Current generations of mobile phones are versatile devices capable of exchanging data such as short text messages or speech data. The short message service (SMS) facility provided by the GSM cellular system has proved popular, in particular with the younger end of the market. In some parts of the world, downloading of logos and ring tones to phones has also developed into big business. Increasingly, these devices are required to carry out more complex operations.
Some current models are able to send one of a small number of predetermined images included in their text messages, e.g. Nokia 8210. These images are used to enhance communication between the two users. The most modern handsets support the Multimedia Messaging Service (MMS). This allows images and sound to be transferred in a similar way to SMS, but not during a call. The next generation of mobile telephones will be capable of sending video and data at faster speeds. However, the changeover to 3rd generation mobile telephone technology will not be instantaneous and users of older generation mobile telephones should not be ignored.
In accordance with a first aspect of the present invention, a communication system comprises a first communication device and a second communication device; wherein each device comprises an input; a display and a processor for controlling operation of the device; wherein, in use, an expression is transmitted from the first user to the second user; wherein a reference value is allocated to the expression by the first device; wherein the reference value is transmitted to the second device; and wherein the second device displays an image under control of the second processor dependent upon the reference value received by the second device.
In accordance with a second aspect of the present invention, a method of operating a communication system comprises allocating a reference value to an expression produced by a first user, transmitting the expression and the reference value to a second user and causing an image to be displayed to the second user; wherein the image to be displayed is chosen from one of a plurality of stored images in response to the reference value transmitted. The present invention enables new functionality to be applied to existing communication systems, such as mobile telephones, whereby animated conversation is made possible in which each user is able to display an image representing e.g. emotion information determined by the other user during a call. Although any stored image can be displayed, preferably, the or each stored image comprises a face representing an emotional state and a plurality of faces may be stored to represent different levels of each emotional state.
Two users are able to send e.g. happy faces, sad faces or jubilant faces to each other during a call and the method is flexible enough to allow the faces to be of the users' favourite celebrities, such as footballers or pop stars, or personal caricatures.
Preferably, the user can alter the level of an emotional state being displayed by depressing a key on the phone.
The invention is applicable to any system which implements the 3rd Generation Partnership Project (3GPP) specification, but preferably, the communication system comprises a mobile phone and the expression is a spoken expression.
In one example, the reference value is allocated by pressing numbered keys on the telephone keypad of the transmitting telephone during a call, so that a picture is displayed on the receiving telephone - each of the numbered keys being assigned to a separate emotion, such as happiness or anger. The first user may choose to control the image displayed to the second user in this way, but preferably, the spoken expression is analysed by a processor in the first user's communication system and a reference value automatically allocated by comparing the tone of the spoken expression with the tone of stored expressions.
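The manual key-press allocation described above can be sketched as follows. This is a hypothetical illustration only (the function and dictionary names are not from the patent); the emotion names follow the Figure 2 example, and the reference value is taken to be the key number itself:

```python
# Hypothetical sketch: each numbered key on the sending handset is
# assigned to one emotion, and pressing it yields the reference value
# that is transmitted to the receiving phone.

KEY_TO_EMOTION = {
    1: "boredom",
    2: "sadness",
    3: "anger",
    4: "surprise",
    5: "excitement",
    6: "happiness",
}

def reference_value_for_key(key: int) -> int:
    """Return the reference value allocated to a key press (here,
    simply the key number), or raise if the key has no emotion."""
    if key not in KEY_TO_EMOTION:
        raise ValueError(f"key {key} has no emotion assigned")
    return key
```

The receiving handset would use the transmitted value to look up the matching stored image.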
An example of a communication system and a method of operating the same according to the present invention will now be described with reference to the accompanying drawings in which:
Figure 1 is a block diagram illustrating a communication system adapted to operate in accordance with the method of the present invention;
Figure 2 illustrates an example of how the keypad and specific emotional states are mapped for use in the invention of Fig. 1;
Figure 3 illustrates different levels for the emotional states of Fig. 2; and Figure 4 illustrates, in more detail, three different levels of the emotional state of sadness.
Figure 1 illustrates a communication system comprising a pair of mobile phones adapted for operating the method of the present invention. The features of each phone are the same. A first mobile phone 1 communicates via a GSM interface 3 with a second mobile phone 2. A first user speaks into a microphone 4 in his phone 1 linked to a GSM messaging system 5 via a coder-decoder (codec) 6 which is responsible for coding the speech for transmission and decoding it on reception. The messaging system 5 transmits the spoken expression to the second mobile phone 2 and an animated conversation software program 7 stored on the phone, also causes the messaging system to transmit a signal to instruct the phone 2 of a second user to display an appropriate image on a display 12 of the second user. Optionally, a lip syncing facility 8 may be provided which will operate in the second user's phone to animate the lips of the image based on the speech received. Thus, the images which the software program 7 causes to be displayed on the second user's phone relate more directly to the spoken expression used by the first user.
Each phone 1, 2 also has a keypad 9 and a selector 10 which allows manual or automatic operation to be selected. For manual operation each key will cause a specific image representing the first user to be displayed on the second user's phone. If a webcam image is to be provided there is an input 11 to receive a signal from the webcam. The webcam may be used to provide a direct representation of the user who is speaking or to generate a caricature. Pre-generated caricatures can be stored and appear on the receiving phone when a particular person's phone number is recognised as calling the receiving phone.
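The caller-recognition behaviour described above (a pre-generated caricature appearing when a known number calls) can be sketched as a simple lookup; the table contents and default name here are invented for illustration:

```python
# Hypothetical sketch: stored character sets keyed by caller phone
# number, with a fallback set for unrecognised callers.

CHARACTER_SETS = {
    "+441234567890": "caricature_alice",   # example entry, not from the patent
}

def character_set_for_caller(number: str, default: str = "default_set") -> str:
    """Select the character set to display for an incoming call."""
    return CHARACTER_SETS.get(number, default)
```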
Operation between two users talking on a mobile telephone requires a hands-free kit or a separate display. In manual mode, for example, UserA says something that UserB finds surprising. To reflect this, UserB expresses the emotion verbally, but at the same time presses a numbered key on his handset which has been allocated to the emotional state - surprise. Instantly, a picture of a surprised face is displayed on UserA's display. UserA then asks why UserB is surprised while, at the same time, UserA presses the key corresponding to the emotional state - puzzlement on his handset. The entire conversation can continue in a similar fashion.
In automatic mode, the system may use voice processing software so that the spoken expression is analysed and a suitable image is displayed to reflect the expression of the user. Alternatively, the software may be adapted to produce in a caricature a particular facial response to vowel sounds, consonants and diphthongs and display this. This enhances the experience by providing a primitive form of lip-syncing.
The system must have at least one character set in memory 13. Assume that the user can choose to display emotions using either a male or female face - two sexes. Assume also that there are seven possible emotions to convey and each emotion has three levels. The system must then store 42 images. Each image is a bitmap of 50x101 bits, which is rounded up to 632 octets. Therefore 26,544 octets are needed to store the complete character set for a male and a female. The system may allow the character sets to be interchangeable so that personal caricatures, pop stars or other celebrity faces can be displayed. The system may also allow pictures from the webcam to be used so that the real user is reflected. The character sets can be downloaded to the device from a PC and the webcam picture will be converted to a suitable format before downloading via the input 11. If several character sets are stored, then callerID information can be used to automatically display the correctly assigned character. The number of character sets that can be stored will be device dependent.
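The storage arithmetic above can be reproduced directly from the stated assumptions (two sexes, seven emotions, three levels, 50x101-bit bitmaps rounded up to whole octets):

```python
# Reproducing the description's storage calculation.
import math

sexes, emotions, levels = 2, 7, 3
images = sexes * emotions * levels                 # 42 images

bits_per_image = 50 * 101                          # 5,050 bits per bitmap
octets_per_image = math.ceil(bits_per_image / 8)   # rounds up to 632 octets

total_octets = images * octets_per_image           # 26,544 octets in total
```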
Figure 2 shows an example of how six different emotions, boredom, sadness, anger, surprise, excitement and happiness could be represented, and how they are mapped to the numbered keys on the telephone keypad 9. For manual mode, the user has the option of selecting the same key up to three times during the call to display a more intense level of the chosen emotional state. A fourth press will display the first emotion again. In automatic mode, one key can be defined to change the level, although the actual emotional state will have been derived from the user's voice tone by the voice processing software. An example of the emotions and the range of levels is given in Fig. 3, ranging from "slightly", through "fairly" to "very". Figure 4 shows for one emotional state - sadness - how the character changes to become increasingly sad as the level is stepped up. The key "0" may be used to toggle the sex of the character displayed.
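The repeated-press level cycling can be sketched as a simple modulo over the three levels of Fig. 3; the function name is illustrative, not from the patent:

```python
# Hypothetical sketch: consecutive presses of the same emotion key step
# through the three levels, and a fourth press wraps back to the first.

LEVELS = ["slightly", "fairly", "very"]

def level_after_presses(presses: int) -> str:
    """Level shown after `presses` consecutive presses of one key."""
    return LEVELS[(presses - 1) % len(LEVELS)]
```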
The exchange of information between the two phone handsets 1,2 is realised using the 3GPP User-to-User Signalling (UUS) supplementary service. An animated conversation task running on the first mobile phone 1 would use the UUS supplementary service to transfer a small amount of information between itself and the second mobile phone 2. For a basic example of the system, there are 42 faces to choose from. When sending information between the phones 1, 2 only 6 bits of information need be transferred. However, there are a sufficient number of octets available in the User-User message to allow more information to be transferred. This means that the extended capability indicating facial response to vowels, diphthongs and consonants can easily be accommodated.
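The point that 42 faces need only 6 bits can be illustrated by packing the face index into a single User-User Information octet. The field layout below is an assumption for illustration, not taken from the 3GPP specifications:

```python
# Hypothetical sketch: a face index in 0..41 fits in the low 6 bits of
# one octet of User-User Information.

def encode_face_index(index: int) -> bytes:
    """Pack a face index into a single octet."""
    if not 0 <= index < 42:
        raise ValueError("face index must be in 0..41")
    return bytes([index & 0x3F])

def decode_face_index(payload: bytes) -> int:
    """Recover the face index from the first octet of the payload."""
    return payload[0] & 0x3F
```

The two unused bits of the octet, and any further octets, would be free to carry the extended vowel/diphthong/consonant information mentioned above.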
A summary of the UUS supplementary service is given below. Further information can be found in documents 3GPP TS 22.087 v 4.0.0 (2001-2003), 3GPP TS 23.087 v.4.0.0 (2001-2003) and 3GPP TS 24.087 v.4.0.0 (2001-2003) available from ETSI, Mobile Competence Centre, 650 route des Lucioles, 06921 Sophia-Antipolis Cedex, Valbonne, France. This supplementary service allows a mobile subscriber to the service to send and receive a limited amount of information to and from another Public Land Mobile Network (PLMN) or Integrated Services Digital Network (ISDN) subscriber over the signalling channel during a call to the other subscriber.
The differences between the three services are as follows:
i. UUS Service 1 allows the transfer of UUI embedded within the Call Control messages;
ii. UUS Service 2 allows the transfer of UUI after reception of address complete (dialled) and receiving the answer indication;
iii. UUS Service 3 allows the transfer of UUI during an active call.
The services may be used with other networks (PLMN/ISDN). However, these networks may support a maximum User-User Information element length of 35 octets. The standard specifies a maximum user-user data length of 130 octets.
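The two length limits quoted above can be captured in a small check that an application might perform before sending; this helper is an illustrative assumption, not part of the UUS specification:

```python
# Hypothetical sketch of a payload-length check against the two limits
# quoted above: 35 octets when interworking with other PLMN/ISDN
# networks, 130 octets as the standard maximum.

MAX_UUI_OCTETS_INTERWORKING = 35
MAX_UUI_OCTETS_STANDARD = 130

def fits_in_uui(payload: bytes, interworking: bool) -> bool:
    """Return True if the payload fits the applicable UUI length limit."""
    limit = MAX_UUI_OCTETS_INTERWORKING if interworking else MAX_UUI_OCTETS_STANDARD
    return len(payload) <= limit
```

A single-octet face index fits comfortably under either limit.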
Table 1: GSM 04.08: USER INFORMATION message content
There are three possible scenarios for communication. The first is a call from a first user whose phone is capable of animated conversations (animated conversation capable - ACC) to a user whose phone is not capable of an animated conversation (animated conversation incapable - ACI). The second is from an ACI user to an ACC user and the third is between two ACC users.
In the first case UserA wishes to establish a call using his ACC mobile telephone to UserB with his ACI mobile telephone. UserA dials UserB;
A call is established (both parties talk);
UserA presses an emotion key on his handset for the first time;
MSA sends Message:
"Facility(Invoke=UserUserService (UUS3, UUS not required))" MSB does not reply;
MSA receives the message "Facility (Return Error)" with content "requested facility not implemented";
UserA sees a message on the display that the receiving device (UserB) is an ACI device and UserA's display does not show any animated conversation content. Successive key presses have no effect and the call continues normally, including normal termination.
In the second case UserA wishes to establish a call using his ACI mobile telephone to UserB with his ACC mobile telephone.
UserA dials UserB; Call established (both parties talk);
UserB presses an emotion key on his handset for the first time; MSB sends Message:
"Facility(Invoke=UserUserService (UUS3, UUS not required))"
MSA does not reply;
MSB receives the message "Facility (Return Error)" with content "requested facility not implemented";
UserB receives a message stating that the receiving device is ACI. UserB's display does not show any animated conversation content and successive key presses have no effect. The call continues normally, including normal termination.
In the third scenario UserA wishes to establish a call using his ACC mobile telephone to UserB with his ACC mobile telephone.
UserA dials UserB;
UserB's display may show an image of UserA which has been stored as a character set in memory;
The call is established (both parties talk); UserA presses an emotion key on his handset for the first time;
MSA sends Message;
"Facility(Invoke=UserUserService (UUS3, UUS not required))";
MSB replies "Facility(Return Result)";
MSA receives the message "Facility (Return Result)"; UserA's display shows a face with neutral expression;
UserB's display shows a face with an expression corresponding to the key pressed by UserA.
Further key presses by either party display the corresponding image on the opposing handset. The call continues in a similar fashion, including normal termination. In the third scenario, operating in automatic mode, the display will be of a face having an emotional state which has been determined by the voice processing software from the spoken expressions of the user and will change as the user's voice changes. Alternatively, a small add-on component, similar to a web-cam, could be attached to the phone, which would be used in hands-free mode, to recognise the emotions expressed on the user's face automatically, rather than analysing the voice tones, and then to send the information as before. The add-on component houses the necessary processing capability and is connected via an I/O connector through the input 11.
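The capability check implied by the three call scenarios can be sketched as a single decision on the reply to the first UUS3 activation attempt; the function and the reply strings are simplified from the message exchanges above:

```python
# Simplified sketch: the first emotion key press triggers a
# "Facility(Invoke=UserUserService(UUS3, UUS not required))" message,
# and the peer's reply determines whether animated content is shown.
#
#   "Return Result" -> peer is ACC, enable animated conversation
#   "Return Error"  -> peer is ACI ("requested facility not
#                      implemented"), fall back to a plain call

def animated_conversation_enabled(peer_reply: str) -> bool:
    """Decide whether to show animated content based on the UUS3 reply."""
    return peer_reply == "Return Result"
```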
The same mechanism can be used for the real time transfer of data, such as sending an address during a call, or a telephone number, a diary appointment, serial numbers, business card, small amounts of encrypted data etc. and also for interactive games - such as chess, checkers, hangman, etc., by modifying the image displayed dependent upon the keys being depressed on both handsets. For example, the cartoon characters can be made to fight each other if a suitable application is downloaded onto the phone. This can work in automatic mode or a combination of automatic and manual mode. The size and scope of the service depends on the resources of the mobile telephone device. One particular model may only have enough free memory to store a single character set, whereas another may have enough free resources to store several character sets as well as additional processing for lip-syncing.
The invention is not limited to use with mobile phones, but is intended for systems that implement at least the GSM specification. The system requires at least a simple display, a keypad, a processor to run the software and means to store the cartoon images. The images may be distributed and sold via the Internet or the communications carrier of the user. The PC application allows the user to generate his own set of cartoons from a standard webcam.

Claims

1. A communication system, the system comprising a first communication device and a second communication device; wherein each device comprises an input; a display and a processor for controlling operation of the device; wherein, in use, an expression is transmitted from the first user to the second user; wherein a reference value is allocated to the expression by the first device; wherein the reference value is transmitted to the second device; and wherein the second device displays an image under control of the second processor dependent upon the reference value received by the second device.
2. A communication system according to claim 1, wherein the expression is a spoken expression.
3. A system according to claim 1 or claim 2, wherein the image is chosen from a plurality of images stored on the device.
4. A system according to any preceding claim, wherein the first and second communication devices comprise mobile phones.
5. A method of operating a communication system, the method comprising allocating a reference value to an expression produced by a first user, transmitting the expression and the reference value to a second user and causing an image to be displayed to the second user; wherein the image to be displayed is chosen from one of a plurality of stored images in response to the reference value transmitted.
6. A method according to claim 5, wherein the method further comprises obtaining an image of the first user from a webcam, allocating a reference value to that image, transmitting the image to a second user, storing the image and subsequently displaying the image to the second user when the allocated reference value is transmitted.
7. A method according to claim 5 or claim 6, wherein the expression is a spoken expression.
8. A method according to claim 7, wherein the spoken expression is analysed by a processor in the first user's communication system and a reference value automatically allocated by comparing the tone of the spoken expression with the tone of stored expressions.
9. A method according to any of claims 5 to 8, wherein the stored images are stored on the phone of the second user.
10. A method according to any of claims 5 to 9, wherein the stored images comprise one of a male and a female face representing an emotional state.
11. A method according to any of claims 5 to 10, wherein a plurality of faces are stored representing different levels of each emotional state.
12. A method according to any of claims 5 to 11, wherein the image is one of a caricature, a celebrity and a historical figure.
13. A method according to any of claims 5 to 12, wherein the communication system comprises a mobile phone.
14. A method according to any of claims 5 to 13, wherein the user can alter the level of an emotional state being displayed by depressing a key on the telephone.
PCT/GB2002/004010 2001-08-29 2002-08-27 A method of operating a communication system WO2003021924A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB0120841A GB0120841D0 (en) 2001-08-29 2001-08-29 Display graphic
GB0120841.2 2001-08-29
GB0122374.2 2001-09-17
GB0122374A GB0122374D0 (en) 2001-09-17 2001-09-17 Display graphic

Publications (1)

Publication Number Publication Date
WO2003021924A1 true WO2003021924A1 (en) 2003-03-13

Family

ID=26246483

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2002/004010 WO2003021924A1 (en) 2001-08-29 2002-08-27 A method of operating a communication system

Country Status (1)

Country Link
WO (1) WO2003021924A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2853100A1 (en) * 2003-03-28 2004-10-01 France Telecom Interlocutor emotional state transmission device for e.g. telemedicine, has sensors obtaining sensory information e.g. heart beat, of interlocutor, and coding circuit transforming information to emotional state code
EP1480425A1 (en) * 2003-05-20 2004-11-24 NTT DoCoMo, Inc. Portable terminal and program for generating an avatar based on voice analysis
EP1507391A2 (en) * 2003-08-14 2005-02-16 Nec Corporation A portable telephone including an animation function and a method of controlling the same
WO2005067272A1 (en) * 2004-01-09 2005-07-21 Siemens Aktiengesellschaft Communication terminal with avatar code transmission
EP1560406A1 (en) * 2004-01-30 2005-08-03 NTT DoCoMo, Inc. Method for controlling the behavior of an avatar displayed on a portable communication terminal
EP1619909A1 (en) * 2004-07-20 2006-01-25 Pantech&Curitel Communications, Inc. Method and apparatus for transmitting and outputting data in voice communication
WO2006028223A1 (en) 2004-09-10 2006-03-16 Matsushita Electric Industrial Co., Ltd. Information processing terminal
WO2006033809A1 (en) * 2004-09-15 2006-03-30 Motorola, Inc. Communication device with term analysis capability, associated function triggering and related method
WO2006033807A1 (en) * 2004-09-20 2006-03-30 Motorola, Inc. Communication device with image transmission operation and method thereof
JP2006106711A (en) * 2004-09-10 2006-04-20 Matsushita Electric Ind Co Ltd Information processing terminal
JP2007104275A (en) * 2005-10-04 2007-04-19 Nec Corp Freely rotatable mobile communication apparatus, and freely rotatable mobile phone
WO2007141052A1 (en) * 2006-06-09 2007-12-13 Sony Ericsson Mobile Communications Ab Methods, electronic devices, and computer program products for setting a feature of an electronic device based on at least one user characteristic
JP2017068856A (en) * 2016-11-10 2017-04-06 カシオ計算機株式会社 Terminal device, motion state determination method, and program
EP3174052A4 (en) * 2014-07-22 2017-05-31 ZTE Corporation Method and device for realizing voice message visualization service

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0604964A2 (en) * 1992-12-28 1994-07-06 Casio Computer Co., Ltd. Radio communication devices
JPH09138767A (en) * 1995-11-14 1997-05-27 Fujitsu Ten Ltd Communication equipment for feeling expression
US5907604A (en) * 1997-03-25 1999-05-25 Sony Corporation Image icon associated with caller ID
JP2000172589A (en) * 1998-12-09 2000-06-23 Nippon Telegr & Teleph Corp <Ntt> Emotion image communication method and device, and recording medium recording emotion image communication program
WO2000062279A1 (en) * 1999-04-12 2000-10-19 Amir Liberman Apparatus and methods for detecting emotions in the human voice
WO2000070848A1 (en) * 1999-05-14 2000-11-23 Freie Erfindungskünstler GmbH Method for transmitting symbols and/or information from a sender to a recipient

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0604964A2 (en) * 1992-12-28 1994-07-06 Casio Computer Co., Ltd. Radio communication devices
JPH09138767A (en) * 1995-11-14 1997-05-27 Fujitsu Ten Ltd Communication equipment for feeling expression
US5907604A (en) * 1997-03-25 1999-05-25 Sony Corporation Image icon associated with caller ID
JP2000172589A (en) * 1998-12-09 2000-06-23 Nippon Telegr & Teleph Corp <Ntt> Emotion image communication method and device, and recording medium recording emotion image communication program
WO2000062279A1 (en) * 1999-04-12 2000-10-19 Amir Liberman Apparatus and methods for detecting emotions in the human voice
WO2000070848A1 (en) * 1999-05-14 2000-11-23 Freie Erfindungskünstler GmbH Method for transmitting symbols and/or information from a sender to a recipient

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 1997, no. 09 30 September 1997 (1997-09-30) *
PATENT ABSTRACTS OF JAPAN vol. 2000, no. 09 13 October 2000 (2000-10-13) *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2853100A1 (en) * 2003-03-28 2004-10-01 France Telecom Interlocutor emotional state transmission device for e.g. telemedicine, has sensors obtaining sensory information e.g. heart beat, of interlocutor, and coding circuit transforming information to emotional state code
CN1328909C (en) * 2003-05-20 2007-07-25 株式会社Ntt都科摩 Portable terminal, image communication program
EP1480425A1 (en) * 2003-05-20 2004-11-24 NTT DoCoMo, Inc. Portable terminal and program for generating an avatar based on voice analysis
US7486969B2 (en) 2003-05-20 2009-02-03 Ntt Docomo, Inc. Transmitting portable terminal
EP1507391A2 (en) * 2003-08-14 2005-02-16 Nec Corporation A portable telephone including an animation function and a method of controlling the same
EP1507391A3 (en) * 2003-08-14 2005-03-30 Nec Corporation A portable telephone including an animation function and a method of controlling the same
WO2005067272A1 (en) * 2004-01-09 2005-07-21 Siemens Aktiengesellschaft Communication terminal with avatar code transmission
CN100420297C (en) * 2004-01-30 2008-09-17 株式会社Ntt都科摩 Mobile communication terminal and program
EP1560406A1 (en) * 2004-01-30 2005-08-03 NTT DoCoMo, Inc. Method for controlling the behavior of an avatar displayed on a portable communication terminal
EP1619909A1 (en) * 2004-07-20 2006-01-25 Pantech&Curitel Communications, Inc. Method and apparatus for transmitting and outputting data in voice communication
WO2006028223A1 (en) 2004-09-10 2006-03-16 Matsushita Electric Industrial Co., Ltd. Information processing terminal
JP2006106711A (en) * 2004-09-10 2006-04-20 Matsushita Electric Ind Co Ltd Information processing terminal
US7788104B2 (en) 2004-09-10 2010-08-31 Panasonic Corporation Information processing terminal for notification of emotion
WO2006033809A1 (en) * 2004-09-15 2006-03-30 Motorola, Inc. Communication device with term analysis capability, associated function triggering and related method
WO2006033807A1 (en) * 2004-09-20 2006-03-30 Motorola, Inc. Communication device with image transmission operation and method thereof
JP2007104275A (en) * 2005-10-04 2007-04-19 Nec Corp Freely rotatable mobile communication apparatus, and freely rotatable mobile phone
WO2007141052A1 (en) * 2006-06-09 2007-12-13 Sony Ericsson Mobile Communications Ab Methods, electronic devices, and computer program products for setting a feature of an electronic device based on at least one user characteristic
EP3174052A4 (en) * 2014-07-22 2017-05-31 ZTE Corporation Method and device for realizing voice message visualization service
JP2017068856A (en) * 2016-11-10 2017-04-06 カシオ計算機株式会社 Terminal device, motion state determination method, and program

Similar Documents

Publication Publication Date Title
KR100617183B1 (en) System and method for multiplexing media information over a network using reduced communications resources and prior knowledge/experience of a called or calling party
CN100546322C (en) Chat and tele-conferencing system with the translation of Text To Speech and speech-to-text
JP4489121B2 (en) Method for providing news information using 3D character in mobile communication network and news information providing server
US20070161412A1 (en) Augmentation of ringtones and other signals
WO2003021924A1 (en) A method of operating a communication system
WO2008079505A2 (en) Method and apparatus for hybrid audio-visual communication
JP2004537231A (en) How to change avatar-like graphical data in mobile communication terminals
KR20040044261A (en) Method for remote control the AVATAR by using mobile phone
CN102420897B (en) Mobile phone communication information transmitting method and device
US20070243898A1 (en) Multi-handset cordless voice over IP telephony system
EP2247083A2 (en) Method for communicating pre-recorded audible messages during a call and associated mobile device
US20050049879A1 (en) Communication device capable of interworking between voice communications and text communications
KR100941598B1 (en) telephone communication system and method for providing users with telephone communication service comprising emotional contents effect
CN100466767C (en) Method for realizing user&#39;s signature and mobile terminal
KR100574458B1 (en) Background music transmitting method in a wireless communication terminal
KR100570079B1 (en) Video call method and system using avatar
WO2003005717A1 (en) Visual mode mobile phone and visual mode telephoning method using same
CN202907188U (en) Information transmitter for mobile phone communication
KR100720074B1 (en) Providing Method of video chatting using FLASH
JP2002374366A (en) Communications equipment and communication system
KR100775561B1 (en) Method for expressing a feeling and state of caller in mobile node of communication state
WO2007099558A2 (en) A method and system for message-based multi-user conference through wireless communication devices
TWI255639B (en) VoIP phone system capable of showing the images from phone caller&#39;s end
KR100639252B1 (en) Method for operating virtual image communication in mobile communication terminal
KR20050095747A (en) Method and system of video phone calling inserting animation

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BY BZ CA CH CN CO CR CU CZ DE DM DZ EC EE ES FI GB GD GE GH HR HU ID IL IN IS JP KE KG KP KR LC LK LR LS LT LU LV MA MD MG MN MW MX MZ NO NZ OM PH PL PT RU SD SE SG SI SK SL TJ TM TN TR TZ UA UG US UZ VC VN YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZM ZW AM AZ BY KG KZ RU TJ TM AT BE BG CH CY CZ DK EE ES FI FR GB GR IE IT LU MC PT SE SK TR BF BJ CF CG CI GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP