EP0664660A2 - Audio signal reproducing apparatus - Google Patents

Audio signal reproducing apparatus Download PDF

Info

Publication number
EP0664660A2
Authority
EP
European Patent Office
Prior art keywords
transfer characteristics
audio signal
signal
head
listener
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP95104929A
Other languages
German (de)
French (fr)
Other versions
EP0664660B1 (en)
EP0664660A3 (en)
Inventor
Kiyofumi C/O Sony Corporation Inanaga
Hiroyuki C/O Sony Corporation Sogawa
Yasuhiro C/O Sony Corporation Iida
Susumu C/O Sony Corporation Yabe
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2008520A external-priority patent/JP2893780B2/en
Priority claimed from JP2008514A external-priority patent/JP2751512B2/en
Application filed by Sony Corp filed Critical Sony Corp
Publication of EP0664660A2 publication Critical patent/EP0664660A2/en
Publication of EP0664660A3 publication Critical patent/EP0664660A3/xx
Application granted granted Critical
Publication of EP0664660B1 publication Critical patent/EP0664660B1/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Links

Images

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04R - LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 5/00 - Stereophonic arrangements
    • H04R 5/033 - Headphones for stereophonic communication
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04S - STEREOPHONIC SYSTEMS
    • H04S 1/00 - Two-channel systems
    • H04S 1/002 - Non-adaptive circuits, e.g. manually adjustable or static, for enhancing the sound image or the spatial distribution
    • H04S 1/005 - For headphones

Abstract

In an audio signal reproducing apparatus, information on the transfer characteristics from virtual sound sources to both ears of a listener for each given rotational angle depending upon the movement of the head M of the listener is stored in storing means 62. The rotational angular position of the head of the listener is detected by detecting means 45L, 45R and 53 at a resolution higher than that of the transfer characteristics information stored in said storing means. The information on at least two transfer characteristics in the vicinity of the rotational angular position represented by the detection outputs from the detecting means is read from the storing means. The information on the transfer characteristics in the rotational angular position of the head represented by the detection output is determined by interpolation in interpolation operation means 61. Based upon the transfer characteristics information determined by the interpolation operation means, left and right channel audio signals are processed by audio signal processing means 63 for achieving a proper binaural reproduction relative to the virtual sound sources.

Description

  • The present invention relates to an audio signal binaural reproducing apparatus for reproducing audio signals by means of headphones.
  • A binaural reproducing method has heretofore been known as an approach for providing better direction sensation of sound image or outside head localization sensation when audio signals are reproduced by headphones fitted to the head of a listener so that a pair of headphones are located in the vicinity of both ears.
  • An audio reproducing system adopting this binaural system preliminarily applies a given signal processing to the audio signals reproduced by headphones as is described in, for example, specification of Japanese Patent Publication Sho 53-283.
  • The direction sensation of a sound image, the outside head localization sensation and the like depend upon the differences in volume, time and phase of the sounds heard by the left and right ears.
  • The signal processing aims at reproducing, in the audio output of the headphones, acoustic effects equivalent to those caused by the differences in distance between the sound sources, that is, speaker systems, and the left and right ears of the listener, and by reflections and diffractions in the vicinity of the head of the listener, when audio reproduction is performed, for example, by speaker systems remote from the listener. Such a signal processing is performed by convolution-integrating the left and right ear audio signals with impulse responses corresponding to the above-mentioned acoustic effects.
  • When audio reproduction is performed by speaker systems remote from the listener, the absolute position of the sound image is not changed even if the listener moves or turns his or her head, so the relative direction and position of the sound image that the listener senses are changed. In contrast to this, when audio reproduction is performed by a binaural method using headphones, the headphones turn together with the listener's head if the listener turns his or her head, so the relative direction and position of the sound image which the listener senses are not changed.
  • If binaural reproducing is performed by using headphones in such a manner, a sound image is created in the head of a listener due to differences in displacement of the sound image relative to a change in direction of the listener's head. Therefore, it is difficult to locate the sound image in front of the listener. Furthermore, the front sound image has a tendency to lift up.
  • Accordingly, an audio signal reproducing system which detects a change in the direction of the listener's head and changes the modes of the signal processing based upon a result of the detection for providing a good front localization sensation in headphones has heretofore been proposed, as is disclosed in Japanese Unexamined Patent Publication No. Sho 42-227 and Japanese Examined Patent Publication No. 54-19242. In such an audio signal reproducing system, a direction detecting device such as a gyrocompass or a magnetic needle is provided on the head of the listener. A level adjusting circuit, a delay circuit and the like for processing the audio signals are controlled based upon a result of detection from the direction detecting device so that a sound image sensation which is similar to that of audio reproduction using speaker systems remote from the listener is obtained.
  • In the prior art binaural reproducing system in which headphones are provided with a direction detecting device comprising a gyrocompass, an excellent sound image can be obtained by controlling the content of the signal processing which is applied to the audio signals depending upon changes in direction of the listener's head.
  • In order to control the content of the signal processing applied to the audio signals depending upon a change in direction of the listener's head, it is necessary to preliminarily measure the impulse responses, that is, the transfer characteristics corresponding to the acoustic effects given to the audio signals of the left and right ears, for each predetermined rotational angle, and to store a great amount of information on the transfer characteristics. The information is read from the storing means depending upon the change in direction of the head, and the audio signals are then subjected to the necessary convolution-integration processing in real time.
  • The present invention was made under such circumstances.
  • It is an object of the present invention to provide an audio signal reproducing apparatus having a simple structure using storing means having a low storing capacity which is capable of performing a binaural reproduction for providing a very natural localization of a sound image in which the positions of virtual sound sources are not changed by headphones even if a listener moves by reducing the amount of information on transfer characteristics from virtual sound sources necessary for binaural reproduction of audio signals with the headphones to both ears of the listener.
  • An audio signal reproducing apparatus of the present invention comprises an audio signal reproducing apparatus, comprising:
       means for storing transfer characteristics information representative of the transfer characteristics from virtual sound sources to both ears of a listener for each predetermined rotational angle corresponding to the movement of a head of the listener;
       means for detecting the rotational angular position corresponding to the movement of the head of the listener;
       interpolation operation means which reads from said storing means information on at least two transfer characteristics in the vicinity of the rotational angular position of the head represented by a detection output of said detecting means for interpolation-processing the read transfer characteristics information in the rotational angular position of the head represented by the detection output of said detecting means; and
       audio signal processing means for processing left and right channel audio signals with the transfer characteristics information determined by said interpolation operation means, whereby the audio signals which have been processed by said audio signal processing means are reproduced as sounds by a headphone set.
  • The present invention will be further described hereinafter with reference to the following description of exemplary embodiments and the accompanying Figures, in which:-
    • Fig. 1 is a block diagram schematically showing the structure of an audio signal reproducing apparatus described for reference purposes;
    • Fig. 2 is a time chart schematically showing signals supplied to an operation unit of apparatus of Fig. 1;
    • Fig. 3 is a schematic diagram illustrating the distance and the angle calculated by the operation unit of the apparatus of Fig. 1;
    • Fig. 4 is a view for explaining the information on the transfer characteristics stored in a storing circuit of the operation unit in the apparatus of Fig. 1;
    • Fig. 5 is a plan view showing the relative positional relation between virtual sound sources and a listener for explaining the operation of binaural reproducing performed by the apparatus of Fig. 1; and
    • Fig. 6 is a block diagram schematically showing the structure of the audio signal reproducing apparatus of the present invention.
  • An audio signal reproducing apparatus described for reference comprises a headphone set 10 which is fitted over the head M of a listener P and in which a pair of headphones 2L and 2R are supported by a head band 1 so that they are located in the vicinity of the left and right ears of the listener P, respectively, as shown in Fig. 1.
  • Two sliders 4L and 4R, from which support arms 3L and 3R respectively project, are slidably mounted on the head band 1 of the headphone set 10. A pair of signal detectors 5L and 5R which detect a position detection reference signal emitted from a reference signal source 11 are provided at the tip ends of the support arms 3L and 3R, respectively. That is, the pair of signal detectors 5L and 5R are provided on the tip ends of the support arms 3L and 3R which project from the sliders 4L and 4R slidably mounted on the head band 1, so that they are supported in positions remote from the head band 1 and the pair of headphones 2L and 2R, that is, the main body of the headphone set.
  • In the present apparatus the reference signal source 11 comprises an ultrasonic signal source 12 and an ultrasonic speaker 13 for generating an ultrasonic signal from the ultrasonic signal source 12 as a reference signal. Each of the pair of signal detectors 5L and 5R which receive the reference signal comprises an ultrasonic microphone.
  • An ultrasonic wave generated from the ultrasonic speaker 13, that is, the position detection reference signal, is a burst wave in which an ultrasonic wave having a given level is intermittently generated for a given period of time as shown at A in Fig. 2, or an ultrasonic wave the phase of which can be detected, such as a so-called level modulated wave, the level of which changes in a given cycle.
  • The pair of signal detectors 5L and 5R provided on the headphone set 10 detect the ultrasonic position detection reference signal generated from the ultrasonic speaker 13 and generate respective detection signals shown at B and C in Fig. 2, each having a time lag depending upon the relative positional relation between the listener P and the ultrasonic speaker 13.
  • Since the pair of signal detectors 5L and 5R are mounted on the tip ends of the support arms 3L and 3R, which project from the sliders 4L and 4R slidably mounted on the head band 1, they are supported in positions remote from the main body of the headphone set 10, that is, the head band 1 and the pair of headphones 2L and 2R fitted on the head M of the listener P. They can therefore detect the ultrasonic wave generated from the ultrasonic speaker 13, that is, the position detection reference signal, stably and accurately without being shadowed by the head M even if the listener P moves or rotates his head. The pair of signal detectors 5L and 5R can be adjusted to positions optimal for detecting the position detection reference signal by sliding the sliders 4L and 4R along the head band 1. For example, the optimal positions of the headphones 2L and 2R, which are fitted on the head M of the listener P by the head band 1 so that they correspond to the vicinity of the left and right ears, depend on the shape and size of the head M of the listener P, that is, differ among individuals. Accordingly, the positions of the pair of signal detectors 5L and 5R can be adjusted so that they correspond to the headphones 2L and 2R, respectively.
  • Each detection signal obtained by these signal detectors 5L and 5R is applied to an operation unit 14.
  • The operation unit 14 comprises first and second edge detecting circuits 15 and 16, to which the detection signals from the signal detectors 5L and 5R for detecting the position detection reference signal are supplied, respectively, and a third edge detecting circuit 17 to which the ultrasonic signal from the ultrasonic signal source 12, that is, the position detection reference signal, is applied.
  • The first and second edge detecting circuits 15 and 16 detect rise-up edges of the detection signals generated from the signal detectors 5L and 5R, respectively, and output pulse signals corresponding to the rise-up edges as shown at D and E of Fig. 2. The pulse signals generated by the first and second edge detecting circuits 15 and 16 are supplied to a distance calculating circuit 18 and a circuit 19 for detecting the time difference between both ears. The third edge detecting circuit 17 detects the rise-up edge of the ultrasonic signal from the ultrasonic signal source 12 and outputs a pulse signal corresponding to the rise-up edge as shown at F in Fig. 2. The pulse signal obtained by the third edge detecting circuit 17 is supplied to the distance calculating circuit 18.
  • The distance calculating circuit 18 detects the time difference t₁ between pulse signals obtained by the third and first edge detecting circuits 17 and 15 which is represented as ΔT₁ in Fig. 2 and the time difference t₂ between pulse signals obtained by the third and second edge detecting circuits 17 and 16 which is represented as ΔT₂ in Fig. 2 and then calculates the distance ℓ₀ between the ultrasonic speaker 13 and the center of the head M of the listener P represented as ℓ₀ in Fig. 3 based upon the time differences t₁, t₂ and the sound velocity V.
  • The sound velocity V may be preliminarily preset as a constant in the distance calculating circuit 18, or alternatively may be changed with changes in atmospheric temperature, humidity, atmospheric pressure and the like. On calculating the distance ℓ₀, compensation may be conducted for the positional relation between the signal detectors 5L and 5R and the center of the head M, and for the shape and size of the head M.
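The distance calculation can be illustrated by the following Python sketch. It is not part of the patent text; taking the head-centre distance as the mean of the two one-way propagation paths, and the nominal sound velocity value, are assumptions made purely for illustration.

```python
SOUND_VELOCITY = 343.0  # m/s near 20 degrees C; may be corrected for
                        # temperature, humidity and pressure as noted above


def head_distance(t1: float, t2: float, v: float = SOUND_VELOCITY) -> float:
    """Approximate distance from the reference speaker to the head centre.

    t1 and t2 are the propagation delays (seconds) between the emitted
    reference burst and its arrival at the left and right signal detectors.
    Taking the mean of the two one-way paths as the head-centre distance is
    an assumption of this sketch.
    """
    return v * (t1 + t2) / 2.0


# Example: delays of 3.0 ms and 3.2 ms give a distance of about 1.06 m.
print(head_distance(3.0e-3, 3.2e-3))
```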
  • Signals representative of the distance ℓ₀, time differences t₁ and t₂ are fed to an angle calculating circuit 20.
  • The circuit 19 for detecting the time difference between both ears detects the time difference t₃ between the pulse signals generated by the first and second edge detecting circuits 15 and 16, represented as ΔT₃ in Fig. 2. A signal representative of the time difference t₃ is fed to the angle calculating circuit 20.
  • The angle calculating circuit 20 calculates an angle ϑ₀ representative of the direction of the head M, indicated by an arrow in Fig. 3, by using the time differences t₁, t₂ and t₃, the distance ℓ₀, the sound velocity V and the radius r of the head M. The angle ϑ₀ can be determined, for example, by equation (1) as follows: ϑ₀ ≒ sin⁻¹{V²(t₁+t₂)t₃/4rℓ₀}
    Then, the rotation angle ϑ of the head M relative to a desired position of a virtual sound source is calculated from the information on the angle ϑ₀ and the distance ℓ₀ representative of the relative positional relationship between a reference position and the listener P, assuming the position of the ultrasonic speaker 13 to be the reference position of the virtual sound source.
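Equation (1) can likewise be sketched in Python. The default head radius, the inline recomputation of ℓ₀ from t₁ and t₂, and the clipping of the arcsine argument are assumptions of the sketch, not part of the patent text.

```python
import math


def head_angle(t1: float, t2: float, t3: float,
               r: float = 0.09, v: float = 343.0) -> float:
    """Head direction angle theta_0 (radians) according to equation (1).

    t3 is the interaural time difference between the two detection pulses
    and r an assumed head radius in metres. l0 is recomputed from t1 and t2
    as in the previous sketch. The arcsine argument is clipped to [-1, 1]
    to keep the sketch robust against measurement noise.
    """
    l0 = v * (t1 + t2) / 2.0
    s = (v * v * (t1 + t2) * t3) / (4.0 * r * l0)
    return math.asin(max(-1.0, min(1.0, s)))
```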
  • Information on the rotation angle of the head of the listener obtained by the angle calculating circuit 20 is provided to a control circuit 21.
  • In this audio signal reproducing apparatus, the operation unit 14 includes a storing circuit 22 in which information on the transfer characteristics from the virtual sound source to both ears of the listener in the first quadrant of the rotational angular position of the head of the listener is stored, for example, the transfer characteristics for each of the angles ϑ₁₁ to ϑ₁n in the first quadrant.
  • Based upon the current angular position calculated by the angle calculating circuit 20, the control circuit 21 reads the transfer characteristics information from the storing circuit 22 as follows. If the current angular position is in the first quadrant in Fig. 4, it reads the information corresponding to the current angle among ϑ₁₁ to ϑ₁n. If the current angular position is in the second quadrant, it reads the transfer characteristics information for the angles ϑ₁₁ to ϑ₁n in the first quadrant which correspond to the current angles ϑ₂₁ to ϑ₂n. If the current angular position is in the third quadrant, it reads the first quadrant information corresponding to the current angles ϑ₃₁ to ϑ₃n, and if it is in the fourth quadrant, the first quadrant information corresponding to the current angles ϑ₄₁ to ϑ₄n. The control circuit 21 supplies the read transfer characteristics information to an audio signal processing circuit 23 together with a signal representative of the quadrant in which the current angular position is located.
  • Since the head of the listener is substantially spherical and rotationally symmetric, the transfer characteristics from the virtual sound sources to both ears of the listener can be treated as symmetrical in each quadrant.
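One way to picture this reuse of the first quadrant data is the following Python sketch, which maps an arbitrary head angle onto the stored first quadrant range. The folding rule (mirror reflection for the second and fourth quadrants) and the returned quadrant number are illustrative assumptions; the patent only states that the stored first quadrant data are reused for the other quadrants.

```python
def fold_to_first_quadrant(theta_deg: float) -> tuple:
    """Map an arbitrary head angle onto the first quadrant (0..90 degrees).

    Returns the equivalent first-quadrant angle and the quadrant number
    (1..4). The quadrant number is what the control circuit would use to
    decide whether the left/right channels are swapped (quadrants 2 and 4)
    and whether the rear low-pass filters are inserted (quadrants 3 and 4).
    """
    theta = theta_deg % 360.0
    quadrant = int(theta // 90.0) + 1
    within = theta % 90.0
    if quadrant in (2, 4):      # mirrored quadrants: reflect the angle
        within = 90.0 - within
    return within, quadrant


# Example: 110 degrees folds to 70 degrees in quadrant 2.
print(fold_to_first_quadrant(110.0))
```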
  • Alternatively, in the control circuit 21, two transfer characteristics in the vicinity of the rotational angular position of the head represented by the angular position information may be read from the storing circuit 22, and the information on the transfer characteristics in the current head rotational angular position may be obtained by, for example, a linear interpolation processing, as described later with reference to Fig. 6.
  • Left and right channel audio signals SL and SR which are outputted from an audio signal source 24 are supplied to the audio signal processing circuit 23.
  • The audio signal source 24 is an apparatus for outputting given left and right channel audio signals SL and SR, such as recording disc playback apparatus or radio communication receivers and the like.
  • The audio signal processing circuit 23 performs a signal processing which provides the left and right channel audio signals SL and SR fed from the audio signal source 24 with given transfer characteristics from the virtual sound sources to both ears of the listener. The audio signal processing circuit 23 comprises first to sixth switches 25L, 25R, 26L, 26R, 27L and 27R for switching the signal lines and first to fourth signal processing units 28a, 28b, 28c and 28d.
  • The first to sixth switches 25L, 25R, 26L, 26R, 27L and 27R are controlled for switching in response to a control signal from the control circuit 21 representative of the quadrant to which the current angular position belongs.
  • The first and second switches 25L and 25R switch the inputs of the left and right channel audio signals SL and SR fed from the audio signal source 24. When the current angular position is in the first or third quadrant, they supply the right channel audio signal SR to the first and second signal processing units 28a and 28b and the left channel audio signal SL to the third and fourth signal processing units 28c and 28d. When the current angular position is in the second or fourth quadrant, they supply the left channel audio signal SL to the first and second signal processing units 28a and 28b and the right channel audio signal SR to the third and fourth signal processing units 28c and 28d.
  • The third and fourth switches 26L and 26R switch the outputs of the left and right channel audio signals EL and ER outputted from the audio signal processing circuit 23. When the current angular position is in the first or third quadrant, they select as the right channel audio signal ER the output signal of the first adder 29R, which adds the output signals of the first and third signal processing units 28a and 28c, and select as the left channel audio signal EL the output signal of the second adder 29L, which adds the output signals of the second and fourth signal processing units 28b and 28d. When the current angular position is in the second or fourth quadrant, they select as the right channel audio signal ER the output signal of the second adder 29L and as the left channel audio signal EL the output signal of the first adder 29R.
  • The fifth and sixth switches 27L and 27R switch the filters for the left and right channel audio signals EL and ER outputted from the audio signal processing circuit 23. They output the left and right channel audio signals EL and ER unfiltered when the current angular position is in the first or second quadrant, and output the audio signals EL and ER from which high frequency components have been removed by low pass filters 30L and 30R when the current angular position is in the third or fourth quadrant. This switching and routing is sketched below.
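A compact Python sketch of this switch logic follows. The `process` callback stands in for the four signal processing units and two adders described next, and the toy one-pole `lowpass` merely marks where filters 30L and 30R would act; both are illustrative assumptions, not the patent's circuits.

```python
import numpy as np


def lowpass(x: np.ndarray, alpha: float = 0.2) -> np.ndarray:
    """Toy one-pole low-pass standing in for filters 30L/30R (assumption)."""
    y = np.empty_like(x, dtype=float)
    acc = 0.0
    for i, v in enumerate(x):
        acc += alpha * (v - acc)
        y[i] = acc
    return y


def route(sl: np.ndarray, sr: np.ndarray, quadrant: int, process):
    """Switch logic of switches 25, 26 and 27 for a given head quadrant.

    process(a, b) performs the first-quadrant signal processing and returns
    (el, er). Inputs and outputs are swapped in the mirrored quadrants 2 and
    4; the rear quadrants 3 and 4 are low-pass filtered.
    """
    if quadrant in (2, 4):          # switches 25L/25R: swap the inputs
        sl, sr = sr, sl
    el, er = process(sl, sr)        # units 28a-28d and adders 29L/29R
    if quadrant in (2, 4):          # switches 26L/26R: swap the outputs
        el, er = er, el
    if quadrant in (3, 4):          # switches 27L/27R: rear localization
        el, er = lowpass(el), lowpass(er)
    return el, er
```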
  • In each of the signal processing units 28a, 28b, 28c and 28d, an impulse response is preset based upon the transfer characteristics information supplied from the control circuit 21. Each impulse response represents the transfer characteristics from one of a pair of left and right channel speakers, which serve as virtual sound sources facing the listener, to one ear of the listener for the reproduced left and right channel audio signals SL and SR.
  • In other words, the first signal processing unit 28a presets the impulse response {hRR(t, ϑ)} representative of the transfer characteristics of the sound reproduced from the right channel audio signal SR to the right ear. The second signal processing unit 28b presets the impulse response {hRL(t, ϑ)} representative of the transfer characteristics of the sound reproduced from the right channel audio signal SR to the left ear. The third signal processing unit 28c presets the impulse response {hLR(t, ϑ)} representative of the transfer characteristics of the sound reproduced from the left channel audio signal SL to the right ear. The fourth signal processing unit 28d presets the impulse response {hLL(t, ϑ)} representative of the transfer characteristics of the sound reproduced from the left channel audio signal SL to the left ear.
  • When the current angular position of the head of the listener is in the first quadrant, the right channel audio signal SR is fed to the first and second signal processing units 28a and 28b. In the first signal processing unit 28a, the right channel audio signal SR is subjected to a signal processing of convolution-integration with the impulse response {hRR(t, ϑ)}. In the second signal processing unit 28b, the right channel audio signal SR is subjected to a signal processing of convolution-integration with the impulse response {hRL(t, ϑ)}.
  • The left channel audio signal SL is fed to the third and fourth signal processing units 28c and 28d. In the third signal processing unit 28c, the left channel audio signal SL is subjected to a signal processing of convolution-integration with the impulse response {hLR(t, ϑ)}. In the fourth signal processing unit 28d, the left channel audio signal SL is subjected to a signal processing of convolution-integration with the impulse response {hLL(t, ϑ)}.
  • The output signals from the first and third signal processing units 28a and 28c are applied to the right channel adder 29R and are added together therein. The output signal of the right channel adder 29R is fed as the right channel audio signal ER via the right channel amplifier 31R to the right channel headphone 2R and reproduced as a sound. The output signals from the second and fourth signal processing units 28b and 28d are applied to the left channel adder 29L and are added together therein. The output signal of the left channel adder 29L is fed as the left channel audio signal EL via the left channel amplifier 31L to the left channel headphone 2L and reproduced as a sound.
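The four convolution-integrations and the two adders can be illustrated by the following Python sketch. Whole-signal convolution with numpy is used only for clarity; a real-time frame-based implementation, and the particular function names, are assumptions outside the patent text.

```python
import numpy as np


def binaural_process(sl: np.ndarray, sr: np.ndarray,
                     h_ll: np.ndarray, h_lr: np.ndarray,
                     h_rl: np.ndarray, h_rr: np.ndarray):
    """Convolution units 28a-28d followed by the adders 29R and 29L.

    h_rr: right source to right ear, h_rl: right source to left ear,
    h_lr: left source to right ear, h_ll: left source to left ear.
    sl and sr are assumed to be of equal length, as are the four impulse
    responses, so the two convolutions of each sum line up sample for
    sample. Returns the processed left and right channel signals (EL, ER).
    """
    er = np.convolve(sr, h_rr) + np.convolve(sl, h_lr)   # adder 29R
    el = np.convolve(sr, h_rl) + np.convolve(sl, h_ll)   # adder 29L
    return el, er
```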
  • When the current angular position of the head of the listener is in the second quadrant, the left and right channels of the inputs and outputs are exchanged with each other and a processing which is similar to that of the foregoing first quadrant is performed. Accordingly, a front localization sensation is provided. When the current angular position of the head of the listener is in the third or fourth quadrant, a processing which is similar to that of the first or second quadrant is performed, and audio signals EL and ER from which high frequency components have been removed by the low pass filters 30L and 30R are outputted. Accordingly, a rear localization sensation can be provided.
  • In the audio signal reproducing apparatus described above, the information on the transfer characteristics in the rotational angular positions corresponding to the movement of the head of the listener calculated by the angle calculating circuit 20 is formed based upon the information on the transfer characteristics of the first quadrant stored in the storing circuit 22. By performing in the audio signal processing circuit 23, based upon these transfer characteristics data, a signal processing of the left and right channel audio signals SL and SR which responds in real time to changes in the transfer characteristics in association with the movement of the listener P and the rotation of the head M, a good outside head localization sensation and front localization sensation are obtained in which the virtual sound sources are not moved, similarly to the case in which an audio signal is reproduced by a pair of speaker systems SL and SR which face the listener P and are remote from the listener and from each other, as is shown in Figs. 5A, 5B and 5C, in which the relative positional relations between the virtual sound sources and the listener P are illustrated.
  • Fig. 5B shows that the listener P has approached the pair of speaker systems SL and SR, that is, the virtual sound sources, from the position of Fig. 5A. Fig. 5C shows that the listener P has rotated his head M towards the right speaker device SR. By performing a signal processing which can respond in real time to changes in the transfer characteristics in association with the movement of the listener and the rotation of the head M as mentioned above, a good outside head localization and front localization sensation in which no virtual sound source is moved can be obtained, so that a binaural reproduction which can respond to any of the conditions of Figs. 5A, 5B and 5C can be performed.
  • Since it suffices for this audio signal reproducing apparatus to store in the storing means transfer characteristics information representative of the transfer characteristics from the virtual sound sources to the listener for the first quadrant of the rotational angular position of the head of the listener, the amount of transfer characteristics information to be stored in the storing means is small and storing means having a low storing capacity can be used. The audio signal processing means forms the transfer characteristics information in the rotational angular position represented by a detection output from the detecting means for detecting the rotational angular position depending upon the movement of the head of the listener in accordance with the transfer characteristics information of the first quadrant stored in the storing means, and processes the left and right channel audio signals for supplying the processed audio signals to the headphone set. Accordingly, a proper binaural reproduction can be performed for providing a very natural sound image localization sensation in which the positions of the virtual sound sources are not moved even if the listener moves.
  • An embodiment of an audio signal reproducing apparatus of the present invention will now be described in detail with reference to the drawings.
  • The audio signal reproducing apparatus of the present invention shown in Fig. 6 comprises a headphone set 40 which is fitted over the head M of a listener P and in which a pair of headphones 42L and 42R are supported by a head band 41 so that they are located in the vicinity of the left and right ears of the listener P, similarly to the apparatus shown in Fig. 1.
  • Two sliders 44L and 44R, from which support arms 43L and 43R respectively project, are slidably mounted on the head band 41 of the headphone set 40. A pair of signal detectors 45L and 45R which detect a position detection reference signal emitted from a reference signal source 51 are provided at the tip ends of the support arms 43L and 43R, respectively. That is, the pair of signal detectors 45L and 45R are provided on the tip ends of the support arms 43L and 43R which project from the sliders 44L and 44R slidably mounted on the head band 41, so that they are supported in positions remote from the head band 41 and the pair of headphones 42L and 42R, that is, the main body of the headphone set.
  • Also in this present embodiment, the reference signal source 51 comprises an ultrasonic signal source 52 and an ultrasonic speaker 53 for generating an ultrasonic signal from the ultrasonic signal source 52 as a reference signal. Each of the pair of signal detectors 45L and 45R which receives the reference signal comprises an ultrasonic microphone.
  • An ultrasonic wave generated from the ultrasonic speaker 53, that is, the position detection reference signal, is a burst wave in which an ultrasonic wave having a given level is intermittently generated for a given period of time, as in the apparatus of Fig. 1, or an ultrasonic wave the phase of which can be detected, such as a so-called level modulated wave, the level of which changes in a given cycle.
  • The pair of signal detectors 45L and 45R provided on the headphone set 40 detect the ultrasonic position detection reference signal generated from the ultrasonic speaker 53 and generate respective detection signals, each having a time lag depending upon the relative positional relation between the listener P and the ultrasonic speaker 53.
  • Each detection signal obtained by these signal detectors 45L and 45R is applied to an operation unit 54.
  • The operation unit 54 comprises first and second edge detecting circuits 55 and 56, to which the detection signals from the signal detectors 45L and 45R for detecting the position detection reference signal are supplied, respectively, and a third edge detecting circuit 57 to which the ultrasonic signal from the ultrasonic signal source 52, that is, the position detection reference signal, is applied.
  • The first and second edge detecting circuits 55 and 56 detect rise-up edges of the detection signals generated from the signal detectors 45L and 45R, respectively, and output pulse signals corresponding to the rise-up edges. The pulse signals generated by the first and second edge detecting circuits 55 and 56 are supplied to a distance calculating circuit 58 and a circuit 59 for detecting the time difference between both ears. The third edge detecting circuit 57 detects the rise-up edge of the ultrasonic signal from the ultrasonic signal source 52 and outputs a pulse signal corresponding to the rise-up edge. The pulse signal obtained by the third edge detecting circuit 57 is supplied to the distance calculating circuit 58.
  • The distance calculating circuit 58 detects the time difference t₁ between pulse signals obtained by the third and first edge detecting circuits 57 and 55 and the time difference t₂ between pulse signals obtained by the third and second edge detecting circuits 57 and 56 and then calculates the distance ℓ₀ between the ultrasonic speaker 53 and the center of the head M of the listener based upon the time differences t₁, t₂ and the sound velocity V.
  • Signals representative of the distance ℓ₀, time differences t₁ and t₂ are fed to an angle calculating circuit 60.
  • The circuit 59 for detecting the time difference between both ears detects the time difference t₃ between the pulse signals generated by the first and second edge detecting circuits 55 and 56. A signal representative of the time difference t₃ is fed to the angle calculating circuit 60.
  • The angle calculating circuit 60 calculates an angle ϑ₀ representative of the direction of the head M by using the time differences t₁, t₂, t₃, the distance ℓ₀, the sound velocity V and the radius r of the head M similarly to the angle calculating circuit 20 in the first embodiment.
  • Information on the rotational angular position of the head of the listener obtained by the angle calculating circuit 60 is provided to an interpolation operation and processing circuit 61.
  • In the audio signal reproducing apparatus of the present embodiment, the operation unit 54 includes a storing circuit 62 in which transfer characteristics information representative of the transfer characteristics from the virtual sound sources to both ears of the listener is stored for each predetermined angle, the angular step of which is larger than that of the angular positional information of the listener calculated by the angle calculating circuit 60.
  • The interpolation operation and processing circuit 61 reads the information on two transfer characteristics in the vicinity of the rotational angular position of the head represented by the current angular positional information calculated by the angle calculating circuit 60 and determines the transfer characteristics in the current rotational angular position of the head by, for example, a linear interpolation processing.
  • The interpolation operation and processing circuit 61 may also read the information on more than two transfer characteristics in the vicinity of the current rotational angular position of the head represented by the angular positional information and perform a secondary interpolation processing, rather than the linear interpolation processing.
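The linear interpolation between the two stored transfer characteristics can be pictured with the following Python sketch. The table layout, the sample-by-sample blending of time-domain impulse responses and the clamping at the table edges are assumptions made for illustration; the patent does not fix these details.

```python
import numpy as np


def interpolate_hrir(theta: float, stored_angles: np.ndarray,
                     stored_hrirs: np.ndarray) -> np.ndarray:
    """Linearly interpolate a coarsely tabulated impulse response.

    stored_angles is a sorted 1-D array of the coarse angles at which the
    transfer characteristics were measured, and stored_hrirs[i] is the
    impulse response stored for stored_angles[i]. The detected angle theta,
    which has a finer resolution than the table, is bracketed by its two
    neighbours and the two responses are blended sample by sample.
    """
    i = int(np.searchsorted(stored_angles, theta))
    i = max(1, min(i, len(stored_angles) - 1))   # clamp at the table edges
    a0, a1 = stored_angles[i - 1], stored_angles[i]
    w = (theta - a0) / (a1 - a0)
    return (1.0 - w) * stored_hrirs[i - 1] + w * stored_hrirs[i]


# Example: a table every 10 degrees, queried at 23.5 degrees.
angles = np.arange(0.0, 91.0, 10.0)
hrirs = np.random.randn(len(angles), 128)        # placeholder responses
h = interpolate_hrir(23.5, angles, hrirs)
```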
  • The information on the transfer characteristics in the current rotational angular position obtained by the interpolation operation and processing circuit 61 is supplied to an audio signal processing circuit 63.
  • The audio signal processing circuit 63 is also supplied with left and right channel audio signals SL and SR outputted from an audio signal source 64.
  • The audio signal source 64 is a device for outputting predetermined left and right channel audio signals SL and SR and may include, for example, various recording disc playback devices, recording and playback devices, wireless receivers and the like.
  • The audio signal processing circuit 63 performs a signal processing which provides the left and right channel audio signals SL and SR fed from the audio signal source 64 with given transfer characteristics from the virtual sound sources to both ears of the listener. The audio signal processing circuit 63 comprises first through fourth signal processing units 65a, 65b, 65c and 65d, to which the transfer characteristics information in the current rotational angular position of the head obtained by the interpolation operation and processing circuit 61 is supplied. In each of the signal processing units 65a, 65b, 65c and 65d, an impulse response representative of the transfer characteristics from one of a pair of left and right channel speakers, which serve as virtual sound sources facing the listener, to one ear of the listener for the reproduced left and right channel audio signals SL and SR is preset based upon this transfer characteristics information.
  • In other words, the first signal processing unit 65a presets the impulse response {hRR(t, ϑ)} representative of the transfer characteristics of the sound reproduced from the right channel audio signal SR to the right ear. The second signal processing unit 65b presets the impulse response {hRL(t, ϑ)} representative of the transfer characteristics of the sound reproduced from the right channel audio signal SR to the left ear. The third signal processing unit 65c presets the impulse response {hLR(t, ϑ)} representative of the transfer characteristics of the sound reproduced from the left channel audio signal SL to the right ear. The fourth signal processing unit 65d presets the impulse response {hLL(t, ϑ)} representative of the transfer characteristics of the sound reproduced from the left channel audio signal SL to the left ear. In the audio signal processing circuit 63, the right channel audio signal SR is fed to the first and second signal processing units 65a and 65b. In the first signal processing unit 65a, the right channel audio signal SR is subjected to a signal processing of convolution-integration with the impulse response {hRR(t, ϑ)}. In the second signal processing unit 65b, the right channel audio signal SR is subjected to a signal processing of convolution-integration with the impulse response {hRL(t, ϑ)}.
  • The left channel audio signal SL is fed to the third and fourth signal processing units 65c and 65d. In the third signal processing unit 65c, the left channel audio signal SL is subjected to a signal processing of convolution-integration with the impulse response {hLR(t, ϑ)}. In the fourth signal processing unit 65d, the left channel audio signal SL is subjected to a signal processing of convolution-integration with the impulse response {hLL(t, ϑ)}.
  • The output signals from the first and third signal processing units 65a and 65c are applied to the right channel adder 66R and are added together therein. The output signal of the right channel adder 66R is fed as the right channel audio signal ER via the right channel amplifier 68R to the right channel headphone 42R of the headphone set 40 and reproduced as a sound. The output signals from the second and fourth signal processing units 65b and 65d are applied to the left channel adder 66L and are added together therein. The output signal of the left channel adder 66L is fed as the left channel audio signal EL via the left channel amplifier 68L to the left channel headphone 42L of the headphone set 40 and reproduced as a sound.
  • In the thus formed audio signal reproducing apparatus, the information on two transfer characteristics in the vicinity of the rotational angular position represented by the current angular positional information is read from the storing circuit 62 based upon the current angular positional information calculated by the angle calculating circuit 60. The transfer characteristics information in the current rotational angular position is determined by a linear interpolation processing in the interpolation operation and processing circuit 61. By performing in the audio signal processing circuit 63, based upon these transfer characteristics data, a signal processing which responds in real time to changes in the transfer characteristics in association with the movement of the listener and the rotation of the head M, a good outside head localization sensation and front localization sensation are obtained in which the virtual sound sources are not moved, similarly to the case in which an audio signal is reproduced by a pair of speaker systems which face the listener and are remote from the listener and from each other.
  • As mentioned above, in the audio signal reproducing apparatus of the present invention, information on at least two transfer characteristics in the vicinity of the rotational angular position of the head represented by the detection output from the detecting means, which detects the rotational angular position of the head of the listener at a resolution higher than that of the transfer characteristics information stored in the storing means, is read from the storing means. The transfer characteristics information in the rotational angular position of the head represented by the detection output is determined by interpolation in the interpolation operation means. Accordingly, the amount of transfer characteristics information stored in the storing means can be reduced. The audio signal processing means processes the left and right channel audio signals based upon the transfer characteristics information determined by the interpolation operation means. The processed audio signals are supplied to the headphones, so that a proper binaural reproduction can be achieved, providing a very natural sound image localization sensation in which the positions of the virtual sound sources do not move even if the listener moves.

Claims (5)

  1. An audio signal reproducing apparatus, comprising:
       means (62) for storing transfer characteristics information representative of the transfer characteristics from virtual sound sources to both ears of a listener for each predetermined rotational angle corresponding to the movement of a head of the listener;
       means for detecting (45L, R, 51, 55-59) the rotational angular position corresponding to the movement of the head of the listener;
       interpolation operation means (61) which reads from said storing means information on at least two transfer characteristics in the vicinity of the rotational angular position of the head represented by a detection output of said detecting means for interpolation-processing the read transfer characteristics information in the rotational angular position of the head represented by the detection output of said detecting means; and
       audio signal processing means (63) for processing left and right channel audio signals with the transfer characteristics information determined by said interpolation operation means, whereby the audio signals which have been processed by said audio signal processing means are reproduced as sounds by a headphone set (40).
  2. An apparatus according to claim 1, in which said detecting means includes:
       a pair of signal detecting elements (45L, R) for detecting a reference signal transmitted from a reference signal source (53);
       means (58) for calculating the distance between said reference signal source and the head of the listener from the phase difference between the detection output signals from said pair of the signal detecting elements and said reference signal; and
       means (59) for detecting the time difference between the detection output signals from said pair of signal detecting elements, whereby the rotational angular position of the head of the listener is calculated by using information on the distance obtained from said distance calculating means and on the time difference obtained from said time difference detecting means.
  3. An apparatus according to claim 1 or 2, in which said audio signal processing means includes:
       a first signal processor (65a) for applying to the right channel input audio signal (SR) a convolution-integration of the impulse response corresponding to transfer characteristics of a right channel reproduced audio signal of an input audio signal to the right ear;
       a second signal processor (65b) for applying to the right channel input audio signal (SR) a convolution-integration of the impulse response corresponding to the transfer characteristics of the right channel reproduced audio signal to the left ear;
       a third signal processor (65c) for applying to the left channel input audio signal (SL) a convolution-integration of the impulse response corresponding to the transfer characteristics of the left channel reproduced audio signal of the input audio signal to the right ear;
       a fourth signal processor (65d) for applying to the left channel input audio signal (SL) a convolution-integration of the impulse response corresponding to the transfer characteristics of the left channel reproduced audio signal to the left ear;
       a first means for adding (66R) the output of said first signal processing unit with the output of the third signal processing unit;
       a second means for adding (66L) the output of said second signal processing unit with the output of the fourth signal processing unit, whereby the outputs (ER, EL) of the first and second adding means are supplied to the right and left channel headphones of said headphone set, respectively.
  4. An apparatus according to claim 1, 2 or 3 in which said interpolation operation means (61) reads from said storing means (62) the information on two transfer characteristics in the vicinity of the rotational angular position of the head represented by the detection output from said detecting means for linear interpolation operating the information on the transfer characteristics in the rotational angular position of the head represented by the detection output from said detecting means.
  5. An apparatus according to claim 1, 2 or 3, in which said interpolation operation means reads from said storing means the information on more than two transfer characteristics in the vicinity of the rotational angular position of the head represented by the detection output from said detecting means for secondary interpolation operating the information on the transfer characteristics in the rotational angular position of the head represented by the detection output from said detecting means.
EP95104929A 1990-01-19 1991-01-18 Audio signal reproducing apparatus Expired - Lifetime EP0664660B1 (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
JP852090 1990-01-19
JP8514/90 1990-01-19
JP2008520A JP2893780B2 (en) 1990-01-19 1990-01-19 Sound signal reproduction device
JP2008514A JP2751512B2 (en) 1990-01-19 1990-01-19 Sound signal reproduction device
JP851490 1990-01-19
JP8520/90 1990-01-19
EP91902738A EP0464217B1 (en) 1990-01-19 1991-01-18 Apparatus for reproducing acoustic signals

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
EP91902738.3 Division 1991-01-18
EP91902738A Division EP0464217B1 (en) 1990-01-19 1991-01-18 Apparatus for reproducing acoustic signals

Publications (3)

Publication Number Publication Date
EP0664660A2 true EP0664660A2 (en) 1995-07-26
EP0664660A3 EP0664660A3 (en) 1995-08-09
EP0664660B1 EP0664660B1 (en) 2000-09-27

Family

ID=26343043

Family Applications (2)

Application Number Title Priority Date Filing Date
EP95104929A Expired - Lifetime EP0664660B1 (en) 1990-01-19 1991-01-18 Audio signal reproducing apparatus
EP91902738A Expired - Lifetime EP0464217B1 (en) 1990-01-19 1991-01-18 Apparatus for reproducing acoustic signals

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP91902738A Expired - Lifetime EP0464217B1 (en) 1990-01-19 1991-01-18 Apparatus for reproducing acoustic signals

Country Status (5)

Country Link
EP (2) EP0664660B1 (en)
KR (1) KR920702175A (en)
CA (1) CA2048686C (en)
DE (2) DE69120150T2 (en)
WO (1) WO1991011080A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0762803A2 (en) * 1995-08-31 1997-03-12 Sony Corporation Headphone device
FR2744871A1 (en) * 1996-02-13 1997-08-14 Sextant Avionique SOUND SPATIALIZATION SYSTEM, AND PERSONALIZATION METHOD FOR IMPLEMENTING SAME
EP2005793A2 (en) * 2006-04-04 2008-12-24 Aalborg Universitet Binaural technology method with position tracking

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0795698A (en) * 1993-09-21 1995-04-07 Sony Corp Audio reproducing device
DE69632889T2 (en) * 1995-05-22 2005-07-21 Victor Company of Japan, Ltd., Yokohama Player with headphones
EP2288178B1 (en) 2009-08-17 2012-06-06 Nxp B.V. A device for and a method of processing audio data
US9706304B1 (en) * 2016-03-29 2017-07-11 Lenovo (Singapore) Pte. Ltd. Systems and methods to control audio output for a particular ear of a user

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58116900A (en) * 1982-11-15 1983-07-12 Sony Corp Stereophonic reproducing device
WO1989003632A1 (en) * 1987-10-15 1989-04-20 Cooper Duane H Head diffraction compensated stereo system
JPH01121000A (en) * 1987-11-05 1989-05-12 Sony Corp Audio reproducing device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5419242B2 (en) 1973-06-22 1979-07-13
JPS5165901A (en) * 1974-12-05 1976-06-08 Sony Corp
US4076677A (en) 1976-06-23 1978-02-28 Desoto, Inc. Aqueous copolymer dispersions and method of producing the same
JPS54109401A (en) * 1978-02-16 1979-08-28 Victor Co Of Japan Ltd Signal converter
JP3155592B2 (en) * 1991-12-11 2001-04-09 武藤工業株式会社 Method and apparatus for correcting progressive dimensions

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS58116900A (en) * 1982-11-15 1983-07-12 Sony Corp Stereophonic reproducing device
WO1989003632A1 (en) * 1987-10-15 1989-04-20 Cooper Duane H Head diffraction compensated stereo system
JPH01121000A (en) * 1987-11-05 1989-05-12 Sony Corp Audio reproducing device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 13 no. 364 (E-805) ,14 August 1989 & JP-A-01 121000 (SONY CORP) 12 May 1989, *
PATENT ABSTRACTS OF JAPAN vol. 7 no. 226 (E-202) ,7 October 1983 & JP-A-58 116900 (SONY KK) 12 July 1983, *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0762803A2 (en) * 1995-08-31 1997-03-12 Sony Corporation Headphone device
EP0762803A3 (en) * 1995-08-31 2006-07-26 Sony Corporation Headphone device
FR2744871A1 (en) * 1996-02-13 1997-08-14 Sextant Avionique SOUND SPATIALIZATION SYSTEM, AND PERSONALIZATION METHOD FOR IMPLEMENTING SAME
EP0790753A1 (en) * 1996-02-13 1997-08-20 Sextant Avionique System for sound spatial effect and method therefor
US5987142A (en) * 1996-02-13 1999-11-16 Sextant Avionique System of sound spatialization and method personalization for the implementation thereof
EP2005793A2 (en) * 2006-04-04 2008-12-24 Aalborg Universitet Binaural technology method with position tracking

Also Published As

Publication number Publication date
DE69132430D1 (en) 2000-11-02
DE69132430T2 (en) 2001-04-05
EP0464217A4 (en) 1992-06-24
DE69120150T2 (en) 1996-12-12
CA2048686C (en) 2001-01-02
EP0464217B1 (en) 1996-06-12
EP0464217A1 (en) 1992-01-08
KR920702175A (en) 1992-08-12
EP0664660B1 (en) 2000-09-27
EP0664660A3 (en) 1995-08-09
CA2048686A1 (en) 1991-07-20
WO1991011080A1 (en) 1991-07-25
DE69120150D1 (en) 1996-07-18

Similar Documents

Publication Publication Date Title
US5495534A (en) Audio signal reproducing apparatus
JP2964514B2 (en) Sound signal reproduction device
KR100435217B1 (en) Headphone
KR100225546B1 (en) Apparatus for reproducing acoustic signals
EP0674467B1 (en) Audio reproducing device
JP3687099B2 (en) Video signal and audio signal playback device
EP0699012B1 (en) Sound image enhancement apparatus
US5526429A (en) Headphone apparatus having means for detecting gyration of user's head
EP0977464A3 (en) Audio signal processing circuit
EP0664660B1 (en) Audio signal reproducing apparatus
TW200513134A (en) Method and arrangement for locating aural events such that they have a constant spatial direction using headphones
US7917236B1 (en) Virtual sound source device and acoustic device comprising the same
EP1161119B1 (en) Method for localizing sound image
JP2893780B2 (en) Sound signal reproduction device
JP2893779B2 (en) Headphone equipment
JPH1155799A (en) Audio reproducing device
JPH03296400A (en) Audio signal reproducing device
JP2751514B2 (en) Sound signal reproduction device
JP3111455B2 (en) Sound signal reproduction device
JP2874236B2 (en) Sound signal reproduction system
JPH04273800A (en) Sound image localization device
JPH03214896A (en) Acoustic signal reproducing device
JPS5819920Y2 (en) Sound reproduction device using headphone
JPH03214895A (en) Acoustic signal reproducing device
JPH03254163A (en) Sound apparatus for vehicle

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AC Divisional application: reference to earlier application

Ref document number: 464217

Country of ref document: EP

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): DE FR GB NL

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): DE FR GB NL

17P Request for examination filed

Effective date: 19960115

17Q First examination report despatched

Effective date: 19990308

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

17Q First examination report despatched

Effective date: 19990308

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AC Divisional application: reference to earlier application

Ref document number: 464217

Country of ref document: EP

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB NL

REF Corresponds to:

Ref document number: 69132430

Country of ref document: DE

Date of ref document: 20001102

ET Fr: translation filed
PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed
REG Reference to a national code

Ref country code: GB

Ref legal event code: IF02

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20100208

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20100113

Year of fee payment: 20

Ref country code: DE

Payment date: 20100114

Year of fee payment: 20

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20100101

Year of fee payment: 20

REG Reference to a national code

Ref country code: NL

Ref legal event code: V4

Effective date: 20110118

REG Reference to a national code

Ref country code: GB

Ref legal event code: PE20

Expiry date: 20110117

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20110118

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20110117

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20110118