WO2003005339A1 - Systems and method for producing music as a function of physiological parameters - Google Patents


Info

Publication number
WO2003005339A1
WO2003005339A1 (PCT/EP2002/007348)
Authority
WO
WIPO (PCT)
Prior art keywords
musical
music
rhythm
physiological
parameter
Application number
PCT/EP2002/007348
Other languages
French (fr)
Inventor
Paul-Louis Meunier
René-Louis BARON
Original Assignee
Thomson Multimedia
Application filed by Thomson Multimedia filed Critical Thomson Multimedia
Publication of WO2003005339A1 publication Critical patent/WO2003005339A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00: Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00: Training appliances or apparatus for special sports
    • A63B69/0028: Training appliances or apparatus for special sports for running, jogging or speed-walking
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60: Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/63: Querying
    • G06F16/635: Filtering based on additional data, e.g. user or group profiles
    • G06F16/636: Filtering based on additional data, e.g. user or group profiles by using biological or physiological data
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H: ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00: Details of electrophonic musical instruments
    • G10H1/0008: Associated control or indicating means
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H: ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00: Details of electrophonic musical instruments
    • G10H1/02: Means for controlling the tone frequencies, e.g. attack or decay; Means for producing special musical effects, e.g. vibratos or glissandos
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00: Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0075: Means for generating exercise programs or schemes, e.g. computerized virtual trainer, e.g. using expert databases
    • A63B2024/0078: Exercise efforts programmed as a function of time
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00: Measuring of physical parameters relating to sporting activity
    • A63B2220/17: Counting, e.g. counting periodical movements, revolutions or cycles, or including further data processing to determine distances or speed

Definitions

  • the present invention relates to the production of music as a function of physiological parameters.
  • Systems are known which make prerecorded musical rhythms vary as a function of heartbeat. Such systems can be used to enable one's cardiac activity to be controlled when practicing sporting exercise.
  • Patent US-4 883 067 discloses the generation of musical signals as a function of the electroencephalogram (EEG) of a person, by means of electrical signals produced and converted into music.
  • This music comprises at least one voice which follows the contour of the EEG in real time, preferably by modulating the frequency of a tone or of a chord with the EEG signal. Additional voices are advantageously added, such as for example a voice using a modulation of timbre to indicate the relative frequency of occurrence of a particular feature of the EEG signal (col. 5, l. 5-11).
  • the method disclosed in this document is especially adapted to relaxation techniques or to therapeutic procedures.
  • Patent application EP-0 966 919 relates to a method of influencing the body, in particular in order to treat sleep problems. Biosignals such as an EEG, an electrocardiogram or the like (page 2, lines 49-52) are recorded in the course of a reference period.
  • This period is then divided into time intervals and, for each one of them, a frequency spectrum is calculated.
  • Music is then generated by allocating to each time interval a sound, the duration, tone and volume of which are determined by a parameter K extracted from the frequency spectrum.
  • This parameter K is equal to the ratio of spectral powers obtained over two separate frequency ranges.
  • Patent application EP-A-0 301 790 discloses techniques for detecting electrical signals generated in a human body, called "biopotentials", and the transformation of such signals into a MIDI code.
  • Video games, musical or relaxation therapy (col. 4, l. 41 to col. 5, l. 2) among others are included in the many applications of these methods. It is more specifically indicated that the body may react to the musical notes generated, by modifying brainwaves or heartbeats (col. 20, l. 6-9).
  • The frequency of a detected signal can be transformed into a note pitch (col. 8, l. 35-43), a program number or a harmonic generation (col. 11, l. 16-19), and a change in intensity or frequency can be transformed into a change in musical parameters (col. 8, l. 35-43).
  • One example described in this document (col. 20, l. 24-28) consists in modifying a MIDI code coming from another machine as a function of properties of measured biopotentials. It is thus mentioned by way of illustration that it is possible to increase the note pitches or to modify the rhythms when the heartbeats accelerate in an electrocardiogram signal (col. 20, l. 42-49).
  • the present invention relates to a system for producing music as a function of physiological parameters of a user, especially enabling heartbeats of the user to be controlled reliably during repetitive sporting efforts.
  • The system of the invention spares the user from having to assess the frequency of a musical rhythm himself in order to know whether or not he has to moderate his efforts. It also enables him to avoid receiving confused information, which would risk causing an error rather than helping him to measure the intensity of his activities.
  • the invention relates to a system for producing music as a function of physiological parameters having physiological rhythms, which makes it possible to obtain information which is reliable and which can be used directly on these physiological rhythms.
  • the invention is applicable not only to the control of repetitive sporting efforts (jogging, bodybuilding, gymnastics, etc.), but also to various other situations such as:
  • the invention also relates to a method of producing music as a function of physiological parameters and to a computer program, corresponding to the system of the invention.
  • the subject of the invention is a system for producing music as a function of physiological parameters, comprising:
  • the unit for producing music is such that the physiological rhythm is essentially represented in the music by the other musical parameters.
  • The term "music produced" refers to any music allowing variations of musical parameters. This music may especially consist of prerecorded music (for example, different recordings are allocated respectively to different ranges of values of the physiological rhythm), of music generated automatically from musical parameters determined as a function of the physiological rhythm, or of music picked up in real time via a network (for example on the air), in which case channels broadcasting various types of music are associated with various ranges of values of the physiological rhythm.
  • The term "physiological rhythm" refers to any regular cadence likely to be extracted from the physiological parameters, for example a dominant frequency identified by Fourier transformation.
  • The term "musical rhythm" refers to the conventional concept of rhythm in the musical field, which is applicable to events whose succession in time is discernible to the ear, and not to a vibration frequency creating a particular sound impression (note pitch) for a listener. Such a rhythm can be expressed as the succession and the relationship between audible values of particular durations.
  • the musical rhythm may especially consist of a tempo, that is to say of the rate of execution of a work.
  • the extraction of one or more physiological rhythms from one or more physiological parameters is obtained by any method, and may be based on a filtering technique and/or on a frequency transformation. In practice, it involves a dominant frequency (frequency peak) in the spectrum of the physiological parameter, representative of a periodic repetition. Similarly, the musical rhythm can be identified by a dominant frequency in the spectrum of the music produced (taking into account the restrictions above on the definition of the musical rhythm). According to variant embodiments, the system takes into account several physiological rhythms and/or several musical rhythms, it being possible for each of the physiological parameters to have one or more physiological rhythms.
  • the other musical parameters "represent" the physiological rhythm. They are therefore such that significant variations in the latter are perceptible in the music through these musical parameters.
  • The term "significant variations" refers to variations providing a benefit with respect to the desired result, for example unambiguously indicating that the cardiac activity is located either within an acceptable region, or warrants rest, or is dangerous.
  • This representation may optionally relate only to certain modifications of the physiological rhythm, for example to indicate exclusively the passage from a lower range to an upper range of values, by means of a binary selection between two values of the musical parameter in question.
  • this characteristic of "representation" of the physiological rhythm by a musical parameter is expressed as follows. It relates to the physiological parameter in a given measurement field of the physiological rhythm. In this measurement field, at least one value of the musical parameter in question is allocated to each value of the physiological rhythm.
  • This allocation may be deterministic, that is to say that since other data required independently of the physiological parameter are fixed (for example specific characteristics of the person on which the measurements are made), a single value of the musical parameter is allocated to the value of the physiological rhythm.
  • the allocation may also be nondeterministic, that is to say that several values of the musical parameter, which could include one or more ranges of values, are allocated to the same value of the physiological rhythm.
  • Such a nondeterministic allocation may be random, in which case the value effectively taken by the musical parameter from the various possible values for the value of the physiological rhythm is unpredictable. It may also be pseudo-random, when the value selected for the musical parameter is determined by an automatically selected parameter.
  • This selected parameter is, for example, derived from the physiological parameter, or from any mathematical law (such as loop selection of the various possible values of the musical parameter), or else from the present time.
  • the two types of allocation, deterministic and nondeterministic, may also be combined for the same musical parameter: in this case, the value of the musical parameter is one-to-one for some ranges of the measurement field, and indeterminate for others.
  • image of an interval of the measurement field refers to the set of possible values of the musical parameter in question (the other data required being fixed) for values of the physiological rhythm included within this interval.
  • Each image generally comprises at least one isolated value and/or at least one interval of values of the musical parameter, hereinafter called parametric value and parametric interval, respectively.
  • The musical parameter "represents" the physiological rhythm if the measurement field is divided into at least two intervals, hereinafter called rhythmic intervals, such that the respective images of these rhythmic intervals are separated in pairs by intervals of nonzero length.
  • Curve 71 corresponds to a random or pseudo-random distribution over two successive rhythmic intervals IR1 and IR2 forming a measurement field DM on one axis 60 of physiological rhythm RP.
  • The images of the rhythmic intervals IR1 and IR2 respectively consist of upper IP1 and lower IP2 parametric intervals on one axis 61 of musical parameter PM, separated by an intermediate interval 74.
  • the discrimination may be effective even if the length of the intermediate interval 74 is short.
  • The uncertainties that would exist if the interval 74 were reduced to a point are avoided: the parametric intervals IP1 and IP2 would then be adjacent at this point and, even if they were separate, there would be a risk of the values over the rhythmic intervals IR1 and IR2 tending toward the limiting value of this point until being unrecognizable by the listener.
  • the representativity of the musical parameter PM would then not be guaranteed.
  • Curve 72 defines a deterministic distribution of a musical parameter PM (axis 62) as a function of the physiological rhythm RP over the measurement field DM (axis 60).
  • The latter consists of three successive rhythmic intervals IR1, IR2 and IR3, having three parametric values VP1, VP2 and VP3 respectively (steps 75, 76 and 77, respectively) for images. Since these images are separated by intervals of nonzero length, the parameter PM represents the physiological rhythm RP. More specifically, it indicates whether this physiological rhythm RP has low (IR2), mean (IR3) or high (IR1) values.
  • Curve 73 corresponds to a nondeterministic distribution of a musical parameter PM (axis 63) over two successive rhythmic intervals IR1 and IR2 of the measurement field DM (axis 60).
  • the images of these intervals IR1 and IR2 consist of two parametric intervals IP1 and IP2, respectively, overlapping in a common region 78. Unlike the previous cases, the intervals IP1 and IP2 are therefore not separated by an interval, but have an intersection, the region 78.
  • the musical parameter PM is consequently unable to represent the physiological rhythm RP. This is because a user identifying values of the musical parameter PM would be unable to know whether the physiological rhythm RP were very high or very low.
  • The second point of this proposal is that the physiological rhythm is "essentially" represented by the other musical parameters. Recognition of the physiological rhythm is therefore not carried out fundamentally by the musical rhythm. Any effect of the physiological rhythm on the musical rhythm is incidental and serves at most a secondary purpose. When this effect is not completely removed, it may on the other hand allow a margin for maneuver in the production of music. It is therefore used either for artistic purposes, or to support the identification carried out by means of other musical parameters (for example, on crossing a threshold of the physiological rhythm, musical instruments are changed at a constant cadence, thereby giving more rhythmic strength to a drum accompaniment).
  • the music production system of the invention contrasts with the known techniques highlighting variations of the physiological rhythm exclusively by means of varying the musical rhythm.
  • the physiological rhythm is perceived by means of other parameters, which may well be more discriminating to listen to, such as especially the nature of the music style or of the melody, instrumentation, scale, note pitch and/or the tonality.
  • the system of the invention also contrasts with known methods in which a variation of musical parameters, not restricted to the musical rhythm, is caused by modifications of the physiological parameter in question. This is because, in this case, the music production unit is such that these other musical parameters are representative of the physiological rhythm, and are not dependent on it only in a complex and unusable way.
  • the physiological rhythm is represented in a significant way in the music produced through musical parameters other than the musical rhythm.
  • the music production unit is such that this dependent contribution is minor in the musical rhythm.
  • This "contribution" is defined as the maximum amplitude of variation of the musical rhythm when the physiological rhythm sweeps the entire measurement field, all other specific characteristics of the physiological parameter in question moreover being the same.
  • This measurement field may cover the entire positive part of the real axis, or be more restricted for physiological reasons (for example, heartbeats are necessarily included within a physically feasible range of values).
  • This contribution is “minor” in the sense that the ratio of the contribution to the mean value of the musical rhythm over the measurement field is less than 50%.
  • the contribution of the physiological rhythm may in particular be zero, the musical rhythm then being completely independent of the physiological rhythm.
  • The user chooses a background musical rhythm, while the actual musical rhythm is given by a curve which increases as a function of the physiological rhythm over the measurement field, having a mean value equal to the background musical rhythm and an amplitude equal to 30% of this value over this interval. Because of this limitation, the effect of the physiological rhythm on the musical rhythm is restricted. Thus, it is possible to avoid excessive disturbance of the latter by the former and, as a result, to enable a listener to be more receptive to the really meaningful variations in the music. This arrangement may also avoid drifts in rhythm which are disagreeable to listen to. It also makes it possible to select a given musical rhythm which is completely independent of the physiological rhythm and may therefore carry other information. In particular, this musical rhythm is advantageously employed to indicate the desired frequency of sporting efforts.
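By way of illustration only, the 30% example above can be made concrete with the following sketch; the tempo curve, field bounds and variable names are assumptions introduced for the example and are not taken from the patent. The contribution is the peak-to-peak variation of the musical rhythm over the measurement field, and it is "minor" when its ratio to the mean rhythm stays below 50%.

```python
def tempo(rp, rp_min=60.0, rp_max=180.0, background_bpm=100.0, amplitude=0.30):
    """Hypothetical increasing tempo curve over the measurement field
    [rp_min, rp_max]: its mean equals the background rhythm and its total
    swing equals `amplitude` times that mean (here 30%)."""
    x = (rp - rp_min) / (rp_max - rp_min)          # 0 at rp_min, 1 at rp_max
    return background_bpm * (1.0 + amplitude * (x - 0.5))

field = [60 + i for i in range(121)]               # measurement field, in bpm
values = [tempo(rp) for rp in field]
contribution = max(values) - min(values)           # peak-to-peak variation
mean_rhythm = sum(values) / len(values)
print(round(contribution, 1), round(100 * contribution / mean_rhythm, 1))
# 30.0 bpm and 30.0% of the mean rhythm -> "minor" in the sense used above
```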
  • the contribution dependent on the physiological rhythm over this field is less than 20% of this mean, and preferably less than 10%.
  • the percentage of this contribution is calculated as indicated above.
  • the contribution of the musical rhythm dependent on the physiological rhythm is zero, such that the musical rhythm is independent of the physiological rhythm.
  • The musical rhythm is substantially constant over time. This constancy holds apart from any intervention by a user, who remains able to modify the musical rhythm.
  • the music production system then preferably comprises a unit for controlling the musical rhythm by a user, allowing this user to choose the musical rhythm which is substantially constant over time.
  • This embodiment may prove to be particularly useful for rhythmic sporting efforts. This is because it is then desirable to dissociate the cadence of effort, for example from walking, weightlifting or pedaling, from that of heartbeat.
  • the user has both this reference frequency and musical indications concerning his physical abilities.
  • the choice of the musical rhythm especially makes it possible to provide a pleasant rhythm, for example for a motor vehicle driver, or a lively rhythm, for example for music produced in a discotheque.
  • Alternatively, the musical rhythm can be varied over time. This allows variations of rhythms in the music produced, which makes it possible, for example, to choose various pieces without thinking about rhythms, or to introduce pleasant diversity, thus avoiding monotony.
  • the other musical parameters are advantageously chosen from styles of music (advantageously including at least one of the following harmonic families: western, eastern, far-eastern, blues, jazz, etc.), melodies, instrumentations (including, separately or in any combination: a choice of instruments or sounds, setting or "mixing" instruments relative to the parameters of overall volume, reverberation, echoes, panning, envelope, clarity of sound, etc.), scales, note pitches, tonalities, harmonic chords and any combination of these parameters.
  • a variation in sound intensity is also possible, but is secondarily rather than primarily desirable, because of the uncertainties in distinction by the listener and the uncomfortable consequences which could result from excessive sound variations.
  • As regards the physiological parameters, they are advantageously chosen from heartbeats, an electroencephalogram, blood pressure, ocular pupil size, overall body movements, muscular tension (in particular quasi-static movement, such as a bodybuilding effort) and any combination of these parameters.
  • In a first form of deterministic relationship between the physiological rhythm and the other musical parameters, the latter depend on the physiological rhythm according to a dependence relationship which is continuous. A user may thus progressively notice a change in the music heard as a function of the physiological rhythm, which allows a subtle distinction. Furthermore, the transitions are thus pleasant to listen to.
  • In a second form of deterministic relationship between the physiological rhythm and the other musical parameters, the latter depend on the physiological rhythm according to a dependence relationship which is in steps.
  • This embodiment may allow the user to hear clearly significant transitions between several ranges of values of the physiological rhythm (for example, passing from an acceptable range to a dangerous range for sporting efforts).
  • the music production system advantageously comprises a unit for selecting a relationship of dependence of the other musical parameters on the physiological rhythm, from user information and reference dependence relationships stored in a database. It is thus possible to match the change in musical parameters to the specific characteristics of the user, such as his age, his sex, his height and his weight, etc. To do this, it is possible to resort to graphs giving appropriate curves.
  • the unit for producing music comprises a generator of music, this generator having at least one input parameter consisting of the musical parameter or musical parameters.
  • the latter comprises:
  • the means of constructing at least one series of notes is such that, for each moment, each note whose pitch belongs exclusively to the second family is surrounded exclusively by notes from the first family.
  • the succession of note pitches has both great richness, since the number of successions that can thus be generated is several thousand, and harmonic coherence, since the polyphony generated is regulated by restrictions.
  • the means of defining two families of note pitches is suitable for defining, for each musical moment, the first family as a set of note pitches belonging to a chord duplicated octave by octave.
  • the means of defining two families of note pitches is suitable for defining the second family of note pitches for which it comprises at least the note pitches of a scale which are not in the first family of note pitches.
  • the means of constructing at least one succession of notes having at least two notes is adapted so that each musical phrase is defined as a set of notes, the starting times of which are not separated from each other, in pairs, by more than a predetermined duration.
  • a musical phrase consists, for example, of notes whose starts are not separated by more than three sixteenth notes.
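As a purely illustrative sketch, the phrase rule can be read as grouping notes whose successive starts are no more than the predetermined duration apart; this reading of "in pairs", as well as the function name and the example data, are assumptions made for the illustration.

```python
def split_into_phrases(note_starts, max_gap=3):
    """Group note start times (in sixteenth-note units) into musical phrases:
    a new phrase begins whenever the gap to the previous note start exceeds
    max_gap (three sixteenth notes by default)."""
    phrases, current = [], []
    for start in sorted(note_starts):
        if current and start - current[-1] > max_gap:
            phrases.append(current)
            current = []
        current.append(start)
    if current:
        phrases.append(current)
    return phrases

# Hypothetical note starts, counted in sixteenth notes from the beginning.
print(split_into_phrases([0, 2, 3, 6, 14, 15, 17, 28]))
# [[0, 2, 3, 6], [14, 15, 17], [28]]
```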
  • the invention also relates to a method of producing music as a function of physiological parameters, according to which music having at least one musical rhythm and at least one other musical parameter is produced, these other musical parameters depending on at least one physiological rhythm of at least one physiological parameter.
  • the physiological rhythm is essentially represented in the music by the other musical parameters.
  • the music production method is preferably implemented by means of a music production system according to any one of the embodiments of the invention.
  • The physiological rhythm of the physiological parameter used to produce the music is measured on a user.
  • the music produced is then advantageously sent in real time to the user.
  • the user can then adapt his behavior as a function of the information received, in a conscious way (for example in the case of warning of excessive effort) or in an involuntary way (for example relaxation).
  • recorded data is used to produce music.
  • beneficial information is obtained which can be directly interpreted by the listener.
  • the subject of the invention is also a computer program. According to the invention, the latter comprises functionalities capable of producing any one of the embodiments of a music production system according to the invention and/or of implementing any one of the forms of a music production method according to the invention when this program is executed on a computer.
  • The term "computer program" refers to any physical embodiment of a computer program, which may include not only storage media (cassettes, disks, etc.) but also signals (electrical, optical, etc.).
  • figure 1A shows the variations of a first musical parameter representing a physiological rhythm in music produced, as a function of this physiological rhythm;
  • figure 1B shows the variations of a second musical parameter representing a physiological rhythm in music produced, as a function of this physiological rhythm;
  • figure 1C shows the variations of a third musical parameter, not representing a physiological rhythm in music produced, as a function of this physiological rhythm;
  • figure 2 is an outline diagram of a musical production assembly comprising a music production system according to the invention;
  • figure 3 details a music production unit of the music production system of figure 2;
  • figure 4 shows continuous parametrized curves giving a first musical parameter used in the music production system of figures 2 and 3, as a function of a pressure rhythm;
  • figure 5 shows a stepped curve giving a second musical parameter used in the music production system of figures 2 and 3, as a function of the pressure rhythm;
  • figure 6 shows a flow diagram for automatic music generation, implemented for generating music by means of the music production unit of figures 2 and 3;
  • figure 7 shows, in the form of a block diagram, a music generator which can be used in the music production unit of figures 2 and 3 and allowing automatic music generation according to figure 6.
  • An assembly for producing music as a function of physiological parameters PP comprises (figure 2) a music production system 1 especially including a unit 11 for receiving these parameters PP and a unit 12 for producing music M as a function of these parameters PP.
  • the music production assembly also comprises a detector 2 for detecting a physiological quantity on a user and a filter 3, making it possible to extract the desired physiological parameters PP from the detected quantity and to transmit them to the reception unit 11 of the music production system 1.
  • The measured physiological quantity is blood pressure; variations of this pressure with time, in particular a pressure rhythm RP, are extracted.
  • the music production assembly is also provided with headphones 4 receiving the music M supplied by the music production system 1 and broadcasting it to a user, who is preferably the one on which the measurements are carried out.
  • this user has, advantageously in real time, feedback in musical form of his blood pressure variations, that is to say of his heartbeat.
  • the music production system 1 also has a unit 14 for selecting dependence relationships RD between the measured pressure rhythm RP and musical parameters PM, which serve to produce the music M.
  • this selection only relates to some of the musical parameters PM, each of the other musical parameters already having a fixed dependence relationship.
  • the selection unit 14 is provided in order to determine, for each musical parameter PM involved, the appropriate dependence relationship RD, from user information IU (sex, age, frequency of sporting activities practiced, etc.), for example entered beforehand by the user himself.
  • the dependence relationships RD themselves are provided to be stored in a database 5, which can be accessed by the music production unit 12.
  • the selection unit 14 especially has the function of recording an identifier IRD for the selected dependence relationship RD in the database 5, so that the music production unit 12 can directly access the latter among the various dependence relationships RD available in the database 5.
  • the music production system 1 further comprises a rhythm control unit 13, intended to specify a musical rhythm RM to be adopted for the music M produced, to the music production unit 12.
  • This control unit 13 acts as a function of user commands C.
  • this rhythm RM is imposed on a background music M0 having a rhythm which is substantially constant over time.
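The assembly just described (detector 2, filter 3, reception unit 11, production unit 12, rhythm control unit 13, selection unit 14, database 5, headphones 4) can be pictured as a simple pipeline. The sketch below is only a structural illustration under that reading: the class and method names are invented, and the two placeholder functions merely stand in schematically for dependence relationships such as C1A and C2 discussed further on.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class MusicProductionSystem:
    """Schematic counterpart of system 1: unit 11 receives the pressure
    rhythm RP, unit 14 has selected dependence relationships RD (by their
    identifiers IRD) in database 5, and unit 12 produces the musical
    parameters PM at a tempo fixed by the rhythm control unit 13."""
    database: Dict[str, Callable[[float], float]]       # database 5 (curves RD)
    selected: Dict[str, str]                            # parameter name -> IRD
    tempo_bpm: float = 100.0                            # set via control unit 13

    def produce(self, pressure_rhythm: float) -> Dict[str, float]:
        """Unit 12: compute the musical parameters PM and add the fixed tempo."""
        parameters = {name: self.database[ird](pressure_rhythm)
                      for name, ird in self.selected.items()}
        parameters["tempo_bpm"] = self.tempo_bpm        # independent of RP
        return parameters

# Placeholder dependence relationships RD stored in the database.
database = {"C1A": lambda rp: max(0.0, 1.0 - rp / 200.0),                     # decreasing
            "C2": lambda rp: 1.0 if rp < 90 else (2.0 if rp < 130 else 3.0)}  # stepped
system = MusicProductionSystem(database, {"PM1": "C1A", "PM2": "C2"}, tempo_bpm=110)
print(system.produce(pressure_rhythm=120.0))   # {'PM1': 0.4, 'PM2': 2.0, 'tempo_bpm': 110}
```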
  • the music production unit 12 will now be detailed (figure 3).
  • the latter comprises a module 21 for calculating musical parameters PM from dependence relationships RD recorded in the database 5, some of these dependence relationships RD being selected by means of corresponding identifiers IRD.
  • A first of the musical parameters, PM1 (axis 31), is given by a curve C1 which decreases as a function of the pressure rhythm RP (axis 30), this curve C1 being determined from user information IU.
  • Various possible curves C1A, C1B, C1C and C1D are distinguished, the choice of the correct curve C1 being carried out by the selection unit 14 and being recorded in the database 5 in the form of the identifier IRD associated with this parameter PM1.
  • a second of the musical parameters, PM2 (axis 32), is defined independently of all other information, as a function of the pressure rhythm RP (figure 5).
  • This parameter PM2 is given by a curve C2 comprising successive steps P1-P5, such that the parameter PM2 has constant values over respective ranges of values of the pressure rhythm RP.
  • the two parameters PM1 and PM2 are used jointly to generate the background music M0, possibly in combination with yet other musical parameters PM.
  • the music M produced thus varies in two different ways, both by indicating minor modifications of the pressure rhythm RP and by emphasizing significant changes of this rhythm.
  • For example, the parameter PM1 corresponds to note pitches, while the parameter PM2 involves instruments employed to play these notes.
  • In a variant, the parameter PM1 governs a continuous passage from one type of instrument to another (such as passing from a string instrument to a wind instrument), while the parameter PM2 defines the presence and the intensity of percussion instruments.
  • the music production unit 12 also comprises a music generator 22, capable of producing the background music M0, with a rhythm which is substantially constant over time, from musical parameters PM supplied by the computing module 21.
  • Figure 6 shows schematically a flowchart for automatic music generation used for the generator 22.
  • Musical moments are first defined during an operation 42, with reference to a musical piece comprising bars, each bar comprising beats and each beat comprising note placements. The operation 42 consists in allocating a bar number to the musical piece, a beat number to each bar and, to each beat, a note placement number or a minimum note duration.
  • Each musical moment is defined therein in such a way that at least four notes are capable of being played for its duration.
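A minimal data-structure sketch of operation 42 follows; the default numbers of bars, beats and placements, and the choice of a half-bar as the size of a musical moment, are assumptions made for the example (consistent with the allocation of a scale and a chord to each half bar described below).

```python
from dataclasses import dataclass

@dataclass
class Piece:
    """Operation 42 as data: a piece is given a number of bars, each bar a
    number of beats, each beat a number of note placements (equivalently,
    a minimum note duration)."""
    bars: int = 8
    beats_per_bar: int = 4
    placements_per_beat: int = 4        # sixteenth-note grid

def musical_moments(piece, placements_per_moment=8):
    """Cut the piece into musical moments; with 8 sixteenth-note placements
    per moment (half of a 4/4 bar), at least four notes can be played in each."""
    assert placements_per_moment >= 4, "a moment must allow at least four notes"
    total = piece.bars * piece.beats_per_bar * piece.placements_per_beat
    return [(start, min(start + placements_per_moment, total))
            for start in range(0, total, placements_per_moment)]

print(len(musical_moments(Piece())))    # 16 half-bar moments in an 8-bar piece
```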
  • two families of note pitches are defined for each musical moment, the second family of note pitches having at least one note pitch which is not in the first family.
  • a scale and a chord are allocated to each half bar of the musical piece, the first family comprising the note pitches of this chord, duplicated octave by octave, and the second family comprising at least the note pitches of the scale which are not in the first family. It is observed that different musical moments or consecutive musical moments may have the same families of note pitches.
  • At least one succession of notes having at least two notes is constructed with, for each moment, each note whose pitch belongs exclusively to the second family being exclusively surrounded by notes from the first family.
  • a succession of notes is defined as a set of notes, the starting times of which are not separated from each other, in pairs, by more than a predetermined duration.
  • a succession of notes does not have two consecutive note pitches which are exclusively in the second family of note pitches.
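The surround rule can be illustrated with a small generator: a pitch that belongs exclusively to the second family is never placed first or last in the succession and is always followed by a pitch from the first family, so no two consecutive pitches are exclusively in the second family. This is only a sketch for a single musical moment; the chord, scale, octave range, length and seed are invented for the example.

```python
import random

def build_succession(chord, scale, length=8, seed=0):
    """Construct a succession of note pitches obeying the surround rule.

    first family  = pitches of the chord, duplicated octave by octave
    second family = pitches of the scale that are not in the first family"""
    first = sorted({p + 12 * octave for p in chord for octave in range(-2, 3)})
    second = [p for p in scale if p not in set(first)]
    rng = random.Random(seed)

    notes = [rng.choice(first)]                   # open on the chord
    for i in range(1, length):
        if notes[-1] in second or i == length - 1:
            notes.append(rng.choice(first))       # surround / close on the chord
        else:
            notes.append(rng.choice(first + second))
    return notes

# C major chord (C, E, G) against the C major scale, MIDI-style pitch numbers.
print(build_succession(chord=[60, 64, 67], scale=[60, 62, 64, 65, 67, 69, 71]))
```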
  • FIG. 7 shows, in the form of a block diagram, one embodiment of the generator 22.
  • the generator 22 comprises, connected to each other by at least one signal line 55, a generator 52 generating families of note pitches, a musical moment generator 54, a musical phrase generator 56 and an output port 58.
  • The output port 58 is connected to an external signal line 59 connected to the headphones 4 via the rhythm adaptation module 23.
  • the signal line 59 is a line capable of transporting a message or information.
  • the musical moment generator 54 defines musical moments such that, during each musical moment, four notes are capable of being played.
  • the musical moment generator 54 defines a musical piece by a bar number contained therein and, for each bar, a beat number and, for each beat, a placement number for a possible note start or a minimum note duration.
  • the generator 52 for generating families of note pitches defines two families of note pitches, for each musical moment.
  • the generator 52 defines the two families of note pitches in such a way that the second family of note pitches has at least one note pitch which is not in the first family of note pitches.
  • a scale and a chord are allocated to each half bar of the musical piece, the first family comprising the note pitches of this chord, duplicated octave by octave, and the second family comprising at least the note pitches of the scale which are not in the first family. It is observed that various musical moments or consecutive musical moments may have the same families of note pitches.
  • the musical phrase generator 56 generates at least one succession of notes having at least two notes, each succession being constructed in such a way that, for each moment, each note whose pitch belongs exclusively to the second family is surrounded exclusively by notes of the first family.
  • a succession of notes is defined as a set of notes whose starting times are not separated from each other, in pairs, by more than a predetermined duration.
  • Owing to the families of note pitches generated by the generator 52 for each half bar, a succession of notes does not have two consecutive note pitches which are exclusively in the second family of note pitches.
  • the output port 58 transmits, via the external signal line 59, a signal representative of the note pitches of each succession.

Abstract

The present invention relates to a system (1) and a method of producing music as a function of physiological parameters (PP), and to a corresponding computer program. The system comprises a unit (11) for receiving at least one physiological parameter having at least one physiological rhythm, and a unit (12) for producing music having at least one musical rhythm (RM) and at least one other musical parameter dependent on the physiological rhythm. The physiological rhythm is essentially represented in the music by the other musical parameters. Preferably, although the musical rhythm comprises a contribution dependent on the physiological rhythm, this contribution is minor in the musical rhythm. Applications to vigilance during sporting activities or motor vehicle driving, and to the creation of music by dynamic or quasi-static movement.

Description

SYSTEMS AND METHOD FOR PRODUCING MUSIC AS A FUNCTION OF
PHYSIOLOGICAL PARAMETERS
The present invention relates to the production of music as a function of physiological parameters.
The practice of regular sporting activities, such as, for example, jogging or bodybuilding, is extremely widespread throughout the world, and makes it possible to keep in shape. However, people who practice these activities sometimes go too far in their efforts, for example out of a desire to push their limits or as a result of fatigue, and are thus exposed to cardiac accidents.
Systems are known which make prerecorded musical rhythms vary as a function of heartbeat. Such systems can be used to enable one's cardiac activity to be controlled when practicing sporting exercise.
However, it is difficult for users to adapt the intensity of their efforts as a function of the tempo of the music heard. This is because the heartbeat rhythm does not necessarily coincide with the tempo of the music broadcast, which risks causing some confusion. Furthermore, the user does not in general have an acceptable range of cardiac rhythms immediately in mind. He may well realize that the tempo is high or low, but without being able to clearly determine at which moment he should moderate his efforts or to what extent he can allow himself to intensify them. Moreover, music accelerated by the cardiac rhythm risks being quite uncomfortable to listen to.
By way of illustration, let us consider the cardiac rhythm of an immobile person, at rest for more than twenty minutes. His heart beats at a frequency of 70 beats per minute, which corresponds to a slow musical tempo (slow, adagio, etc.). If this person suddenly makes a strenuous effort (for example a 60 meter run), his cardiac rhythm can go from 70 to 140 beats per minute in a few seconds, which in musical terms corresponds to a fast tempo and is not necessarily adapted to the execution of music written for a tempo of 70 to the quarter note.
Finally, the use of such systems does not allow variations of rhythms other than those controlled by the heartbeats; as a result, it requires background music having a very regular cadence, the speed of which can be adjusted, which reduces the options available for the broadcast music.
Patent US-4 883 067 discloses the generation of musical signals as a function of the electroencephalogram (EEG) of a person, by means of electrical signals produced and converted into music. This music comprises at least one voice which follows the contour of the EEG in real time, preferably by modulating the frequency of a tone or of a chord with the EEG signal. Additional voices are advantageously added, such as for example a voice using a modulation of timbre to indicate the relative frequency of occurrence of a particular feature of the EEG signal (col. 5, l. 5-11). The method disclosed in this document is especially adapted to relaxation techniques or to therapeutic procedures.
An adaptation of this technique to the control of cardiac activity is possible, and it would have the advantage of relating to original music, generated as a function of the physiological characteristics of the users. However, it would lead to the same inaccuracies and discomfort as those mentioned above. In particular, the cardiac rhythm variations would then result in similar variations of tone or chord modulations, and in possible corresponding variations of timbre modulations, which would be difficult to exploit for a user desiring to adapt his efforts to his abilities at the time.
Patent application EP-0 966 919 relates to a method of influencing the body, in particular in order to treat sleep problems. Biosignals such as an EEG, an electrocardiogram or the like (page 2, lines 49-52) are recorded in the course of a reference period. This period is then divided into time intervals and, for each one of them, a frequency spectrum is calculated. Music is then generated by allocating to each time interval a sound, the duration, tone and volume of which are determined by a parameter K extracted from the frequency spectrum. This parameter K is equal to the ratio of spectral powers obtained over two separate frequency ranges.
This technique could possibly be adapted in order to control sporting effort. However, apart from the fact that the results would be deferred (time intervals large enough to encompass several periods, Fourier transform and calculations to be carried out on each of them, necessity of having enough time intervals to then compose music taking account of the duration of each sound, etc.), the musical parameters chosen (tone, volume, duration) would change in a complex way with the rhythm of the heartbeat, thus providing information which is virtually unusable. In particular, their values would be strongly affected by the measurement duration and time intervals, and by the overall frequency content of the signal over the frequency ranges in question, chosen in order to calculate the parameter K. Furthermore, according to the example illustrated (cf. figure 2 of the prior art), there would be a risk of the execution rate varying in an appreciable and very irregular way with the cardiac rhythm, which could disturb the regularity of the user's sporting effort.
Patent application EP-A-0 301 790 discloses techniques for detecting electrical signals generated in a human body, called "biopotentials", and the transformation of such signals into a MIDI code. Video games, musical or relaxation therapy (col. 4, l. 41 to col. 5, l. 2) among others are included in the many applications of these methods. It is more specifically indicated that the body may react to the musical notes generated, by modifying brainwaves or heartbeats (col. 20, l. 6-9). Furthermore, the frequency of a detected signal can be transformed into a note pitch (col. 8, l. 35-43), a program number or a harmonic generation (col. 11, l. 16-19), and a change in intensity or frequency can be transformed into a change in musical parameters (col. 8, l. 35-43). One example described in this document (col. 20, l. 24-28) consists in modifying a MIDI code coming from another machine as a function of properties of measured biopotentials. It is thus mentioned by way of illustration that it is possible to increase the note pitches or to modify the rhythms when the heartbeats accelerate in an electrocardiogram signal (col. 20, l. 42-49).
The teaching of this prior art, adapted to the production of music and taking account of physiological phenomena such as heartbeat, may make it possible for a user to be receptive to certain physiological modifications and to react to them. However, the techniques described are not really suitable for reliable control of cardiac activity. This is because the user only has vague indications about the rhythm of his heartbeat, associated with musical parameters such as the rhythm or the note pitch, these indications furthermore being, in some cases, dependent on the incoming music.
Moreover, international application WO-00/17850 (or corresponding European patent EP-B-1116213, in the process of being granted) describes a method of automatic music generation, and especially discloses the generation of music as a function of physiological quantities of the body of a user, such as for example those obtained by a tensiometer or a pulse sensor (page 22, lines 1-12). In order to do this, it is shown that parameters representative of physical quantities are made to correspond with parameters for music generation. However, this prior art does not give further explanation on the nature of these parameters and their correspondences.
Thus, it emerges from the prior art that the only proven and recognized technique allowing effective control of physical activities as a function of physiological reactions, by means of musical information, is based on varying the musical rhythm as a function of a physiological rhythm such as that of the heartbeat. The effect of the physiological rhythm on musical parameters other than the musical rhythm (note pitch, for example) is used for esthetic purposes or overall to produce emotional impressions in the user, which could encourage automatic adaptation reactions. The musical rhythm, whether or not it is combined with other effects, in any case remains the only decisive factor to control the efforts. Now, the unfortunate consequences of this method (confusion with the sporting rhythms, difficult fine discrimination, acoustic discomfort) have been seen above.
The present invention relates to a system for producing music as a function of physiological parameters of a user, especially enabling heartbeats of the user to be controlled reliably during repetitive sporting efforts.
In particular, the system of the invention spares the user from having to assess the frequency of a musical rhythm himself in order to know whether or not he has to moderate his efforts. It also enables him to avoid receiving confused information, which would risk causing an error rather than helping him to measure the intensity of his activities.
More generally, the invention relates to a system for producing music as a function of physiological parameters having physiological rhythms, which makes it possible to obtain information which is reliable and which can be used directly on these physiological rhythms. The invention is applicable not only to the control of repetitive sporting efforts (jogging, bodybuilding, gymnastics, etc.), but also to various other situations such as:
- checking the vigilance of a vehicle driver;
- highlighting mental manifestations, such as emotional or psychosomatic disorders, for therapeutic, prophylactic or diagnostic purposes;
- highlighting physical disorders for medical purposes;
- self-control by means of feedback from the music produced;
- calming techniques, especially for treating sleep disorders;
- the esthetic creation of music from physiological quantities.
The invention also relates to a method of producing music as a function of physiological parameters and to a computer program, corresponding to the system of the invention.
For this purpose, the subject of the invention is a system for producing music as a function of physiological parameters, comprising:
- a unit for receiving at least one physiological parameter having at least one physiological rhythm;
- and a unit for producing music having at least one musical rhythm and at least one other musical parameter; these other musical parameters depend on the physiological rhythm.
According to the invention, the unit for producing music is such that the physiological rhythm is essentially represented in the music by the other musical parameters.
The term "music produced" refers to any music allowing variations of musical parameters. This music may especially consist of:
- prerecorded music; for example, different recordings are allocated respectively to different ranges of values of the physiological rhythm;
- music generated automatically, in particular from other musical parameters, these musical parameters being determined as a function of the physiological rhythm;
- or music picked up in real time via a network (for example on the air); by way of illustration, channels broadcasting various types of music (classical music, reggae, rock, etc.) are associated with various ranges of values of the physiological rhythm.
The term "physiological rhythm" refers to any regular cadence likely to be extracted from the physiological parameters, for example a dominant frequency identified by Fourier transformation. On the other hand, the term "musical rhythm" refers to the conventional concept of rhythm in the musical field, which is applicable to events whose succession in time is discernable to the ear, and not in particular to a vibration frequency creating a particular sound impression (note pitch) to a listener. Such a rhythm can be expressed as the succession and the relationship between audible values of particular durations. The musical rhythm may especially consist of a tempo, that is to say of the rate of execution of a work.
The extraction of one or more physiological rhythms from one or more physiological parameters is obtained by any method, and may be based on a filtering technique and/or on a frequency transformation. In practice, it involves a dominant frequency (frequency peak) in the spectrum of the physiological parameter, representative of a periodic repetition. Similarly, the musical rhythm can be identified by a dominant frequency in the spectrum of the music produced (taking into account the restrictions above on the definition of the musical rhythm). According to variant embodiments, the system takes into account several physiological rhythms and/or several musical rhythms, it being possible for each of the physiological parameters to have one or more physiological rhythms.
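Purely by way of illustration (the patent does not prescribe any particular algorithm), a dominant frequency can be extracted from a sampled physiological signal with a discrete Fourier transform, as in the sketch below; the function name, sampling rate and frequency band are assumptions chosen for the example.

```python
import numpy as np

def dominant_frequency(signal, sample_rate_hz, f_min=0.5, f_max=4.0):
    """Return the dominant frequency (Hz) of a sampled physiological signal:
    the highest spectral peak between f_min and f_max (roughly 30 to 240
    beats per minute) is taken as the physiological rhythm."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()                 # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / sample_rate_hz)
    band = (freqs >= f_min) & (freqs <= f_max)      # keep plausible rhythms only
    return freqs[band][np.argmax(spectrum[band])]

# Example: a noisy 1.5 Hz pulse-like signal (90 beats per minute) sampled at 50 Hz.
t = np.arange(0, 30, 1.0 / 50)
pulse = np.sin(2 * np.pi * 1.5 * t) + 0.1 * np.random.randn(t.size)
print(round(dominant_frequency(pulse, 50) * 60))    # about 90 beats per minute
```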
The proposal according to which the physiological rhythm is essentially represented in the music by the other musical parameters is structured around two points. Firstly, the other musical parameters "represent" the physiological rhythm. They are therefore such that significant variations in the latter are perceptible in the music through these musical parameters. The term "significant variations" refers to variations providing a benefit with respect to the desired result, for example unambiguously indicating that the cardiac activity is located either within an acceptable region, or warrants rest, or is dangerous. This representation may optionally relate only to certain modifications of the physiological rhythm, for example to indicate exclusively the passage from a lower range to an upper range of values, by means of a binary selection between two values of the musical parameter in question.
From a formal point of view, this characteristic of "representation" of the physiological rhythm by a musical parameter is expressed as follows. It relates to the physiological parameter in a given measurement field of the physiological rhythm. In this measurement field, at least one value of the musical parameter in question is allocated to each value of the physiological rhythm.
This allocation may be deterministic, that is to say that since other data required independently of the physiological parameter are fixed (for example specific characteristics of the person on which the measurements are made), a single value of the musical parameter is allocated to the value of the physiological rhythm. The allocation may also be nondeterministic, that is to say that several values of the musical parameter, which could include one or more ranges of values, are allocated to the same value of the physiological rhythm. Such a nondeterministic allocation may be random, in which case the value effectively taken by the musical parameter from the various possible values for the value of the physiological rhythm is unpredictable. It may also be pseudo-random, when the value selected for the musical parameter is determined by an automatically selected parameter. This selected parameter is, for example, derived from the physiological parameter, or from any mathematical law (such as loop selection of the various possible values of the musical parameter), or else from the present time. The two types of allocation, deterministic and nondeterministic, may also be combined for the same musical parameter: in this case, the value of the musical parameter is one-to-one for some ranges of the measurement field, and indeterminate for others.
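As a purely illustrative sketch of these two kinds of allocation (all names, ranges and pitch numbers are invented for the example), one musical parameter below is a deterministic function of the rhythm while another is drawn pseudo-randomly from a range allocated to that rhythm.

```python
import random

def allocate_parameters(rhythm_bpm):
    """Allocate musical parameter values to a physiological rhythm (in bpm).

    The instrument family is a deterministic function of the rhythm; the
    note pitch is a pseudo-random choice within a range allocated to that
    rhythm (the draw is seeded by the rhythm value, so it is reproducible
    rather than truly random)."""
    # Deterministic allocation: a single value per range of the measurement field.
    if rhythm_bpm < 100:
        instrument, pitch_range = "strings", (60, 71)      # MIDI-style pitches
    elif rhythm_bpm < 140:
        instrument, pitch_range = "woodwind", (48, 59)
    else:
        instrument, pitch_range = "percussion", (36, 47)

    # Nondeterministic (pseudo-random) allocation: several possible values
    # for the same value of the physiological rhythm.
    pitch = random.Random(int(rhythm_bpm)).randint(*pitch_range)
    return instrument, pitch

print(allocate_parameters(72))     # e.g. ('strings', <a pitch between 60 and 71>)
print(allocate_parameters(150))    # e.g. ('percussion', <a pitch between 36 and 47>)
```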
The term "image" of an interval of the measurement field refers to the set of possible values of the musical parameter in question (the other data required being fixed) for values of the physiological rhythm included within this interval. Each image generally comprises at least one isolated value and/or at least one interval of values of the musical parameter, hereinafter called parametric value and parametric interval, respectively. It is then said that the musical parameter "represents" the physiological rhythm if the measurement field is divided into at least two intervals, hereinafter called rhythmic intervals, such that the respective images of these rhythmic intervals are separated in pairs by intervals of nonzero length. This proposal states that it is possible for the rhythmic interval involved to be unambiguously identified from any one of these images of the musical parameter.
This concept is illustrated by three curves 71, 72 and 73 (represented by figures 1A, 1B and 1C, respectively). Curve 71 corresponds to a random or pseudo-random distribution over two successive rhythmic intervals IR1 and IR2 forming a measurement field DM on one axis 60 of physiological rhythm RP. The images of the rhythmic intervals IR1 and IR2 respectively consist of upper IP1 and lower IP2 parametric intervals on one axis 61 of musical parameter PM, separated by an intermediate interval 74. This means that, for any value of the physiological rhythm RP located in the rhythmic interval IR1, the musical parameter PM is located in the parametric interval IP1, without it being possible to give its value a priori, and similarly for the intervals IR2 and IP2. Since the intervals IP1 and IP2 are separated by the intermediate interval 74, the musical parameter PM represents the physiological rhythm RP. More specifically, it is capable of indicating whether this physiological rhythm RP has lower (IR1) or upper (IR2) values.
The discrimination may be effective even if the length of the intermediate interval 74 is short. On the other hand, the uncertainties that would exist if the interval 74 were reduced to a point are avoided: the parametric intervals IP1 and IP2 would then be adjacent at this point and, even if they were separate, there would be a risk of the values over the rhythmic intervals IR1 and IR2 tending toward the limiting value of this point until being unrecognizable by the listener. The representativity of the musical parameter PM would then not be guaranteed.
Curve 72 (figure 1B) defines a deterministic distribution of a musical parameter PM (axis 62) as a function of the physiological rhythm RP over the measurement field DM (axis 60). The latter consists of three successive rhythmic intervals IR1, IR2 and IR3, having three parametric values VP1, VP2 and VP3 respectively (steps 75, 76 and 77, respectively) for images. Since these images are separated by intervals of nonzero length, the parameter PM represents the physiological rhythm RP. More specifically, it indicates whether this physiological rhythm RP has low (IR2), mean (IR3) or high (IR1) values. Curve 73 (figure 1C) corresponds to a nondeterministic distribution of a musical parameter PM (axis 63) over two successive rhythmic intervals IR1 and IR2 of the measurement field DM (axis 60). The images of these intervals IR1 and IR2 consist of two parametric intervals IP1 and IP2, respectively, overlapping in a common region 78. Unlike the previous cases, the intervals IP1 and IP2 are therefore not separated by an interval, but have an intersection, the region 78. The musical parameter PM is consequently unable to represent the physiological rhythm RP. This is because a user identifying values of the musical parameter PM would be unable to know whether the physiological rhythm RP were very high or very low.
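This separation criterion lends itself to a simple numerical check: sample the image of each rhythmic interval and verify that the images are pairwise separated by gaps of nonzero length. The sketch below is only an illustration of that test; the interval bounds and the two allocation functions are invented for the example.

```python
def image(allocation, interval, samples=200):
    """Sampled image of a rhythmic interval under a given allocation function."""
    lo, hi = interval
    return [allocation(lo + (hi - lo) * k / (samples - 1)) for k in range(samples)]

def represents(allocation, rhythmic_intervals):
    """True if the images of the rhythmic intervals are pairwise separated by
    gaps of nonzero length, i.e. the musical parameter represents the rhythm."""
    images = [image(allocation, itv) for itv in rhythmic_intervals]
    for i in range(len(images)):
        for j in range(i + 1, len(images)):
            a, b = images[i], images[j]
            if not (max(a) < min(b) or max(b) < min(a)):
                return False
    return True

# Stepped, deterministic allocation in the spirit of curve 72: representative.
stepped = lambda rp: 1.0 if rp < 100 else (2.0 if rp < 140 else 3.0)
print(represents(stepped, [(60, 99), (100, 139), (140, 200)]))    # True

# Overlapping images in the spirit of curve 73: not representative.
overlapping = lambda rp: rp % 50
print(represents(overlapping, [(60, 110), (110, 160)]))           # False
```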
The second point of this statement, which is qualitative, is that the physiological rhythm is "essentially" represented by the other musical parameters. Recognition of the physiological rhythm is therefore not carried out fundamentally by means of the musical rhythm. Any effect of the physiological rhythm on the musical rhythm is incidental and serves at most a secondary purpose. When this effect is not completely removed, it may on the other hand provide a margin for maneuver in the production of the music. It is then used either for artistic purposes or to support the identification carried out by means of the other musical parameters (for example, on crossing a threshold of the physiological rhythm, the musical instruments are changed while the cadence remains constant, thereby giving more rhythmic strength to a drum accompaniment).
In practice, the presence of these two features can be verified as follows: several significant values of the physiological rhythm are selected and, all other specific characteristics of the physiological parameter in question being otherwise equal, music corresponding respectively to these significant values is produced by means of the system of the invention. Next, a temporal transformation of the music obtained is carried out, so as to give it an identical mean musical rhythm. This makes it possible to overcome variations of musical rhythm due to the variations in the physiological rhythm. Simply listening to the music transformed in this way must then enable an informed listener to distinguish, from the music, the desired significant aspects of the physiological rhythm.
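As a minimal sketch of the temporal transformation described above, assuming each piece of music produced is available as a list of timed note events together with its mean musical rhythm (an assumption for illustration only), rescaling the event times to a common mean rhythm removes the tempo differences, so that any remaining audible differences come from the other musical parameters:

```python
def normalize_mean_rhythm(events, mean_bpm, target_bpm=120.0):
    """Time-stretch a piece so that its mean musical rhythm becomes target_bpm.
    events: list of (onset_seconds, duration_seconds, pitch) tuples."""
    factor = mean_bpm / target_bpm  # factor > 1 slows a piece that was faster than the target
    return [(onset * factor, duration * factor, pitch)
            for onset, duration, pitch in events]

# Pieces produced at two significant physiological rhythms (say 60 and 160 beats/min)
# are each normalized to the same mean musical rhythm before being compared by ear.
```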
The music production system of the invention contrasts with the known techniques, which highlight variations of the physiological rhythm exclusively by varying the musical rhythm. With the invention, the physiological rhythm is perceived by means of other parameters, which may well be more discriminating to listen to, such as in particular the nature of the music style or of the melody, the instrumentation, the scale, the note pitches and/or the tonality.
The system of the invention also contrasts with known methods in which a variation of musical parameters, not restricted to the musical rhythm, is caused by modifications of the physiological parameter in question. This is because, in the present case, the music production unit is such that these other musical parameters are representative of the physiological rhythm, and do not depend on it merely in a complex and unusable way.
The technique of document EP-0 966 919 leads, for example, to complete uncertainty as to whether a frequency peak (corresponding to the physiological rhythm in question) is located in the interval Δ or α of figure 1, since the musical parameters depend on the ratio of the spectral densities in the intervals Θ and β. Thus, a low value of the parameter K, which determines the musical parameters, could for example correspond both to a very slow rhythm (interval Δ) and to a very fast rhythm (interval β). The musical parameters thus defined therefore cannot represent the physiological rhythm. Similarly, in application EP-A-0 301 790, no provision is made for parameters such as note pitches to allow reliable information concerning a physiological rhythm to be obtained, whether these pitches are produced from scratch or by altering the background music. In this way, a person skilled in the art wishing to provide effective control could only follow the general trend consisting in using the musical rhythm as an indicator.
Thus, in an unexpected way with respect to the known techniques, the physiological rhythm is represented in a significant way in the music produced through musical parameters other than the musical rhythm.
Preferably, when the musical rhythm comprises a contribution dependent on the physiological rhythm, the music production unit is such that this dependent contribution is minor within the musical rhythm.
This "contribution" is defined as the maximum amplitude of variation of the musical rhythm when the physiological rhythm sweeps the entire measurement field, all other specific characteristics of the physiological parameter in question moreover being the same. This measurement field may cover the entire positive part of the real axis, or be more restricted for physiological reasons (for example, heartbeats are necessarily included within a physically feasible range of values).
This contribution is "minor" in the sense that the ratio of the contribution to the mean value of the musical rhythm over the measurement field is less than 50%. The contribution of the physiological rhythm may in particular be zero, the musical rhythm then being completely independent of the physiological rhythm. By way of example, the user chooses a background musical rhythm, while the actual musical rhythm is given by a curve which increases as a function of the physiological rhythm over the measurement field, having a mean value equal to the background musical rhythm and an amplitude equal to 30% of this value, over this interval. Because of the existence of this limitation, the effect of the physiological rhythm on the musical rhythm is restricted. Thus, is it possible to avoid excessive disturbance of the latter by the former and, as a result, to enable a listener to be more receptive to the really meaningful variations in the music. This arrangement may also avoid drifts in rhythm which are disagreeable to listen to. It also makes it possible to select a given musical rhythm which is completely independent of the physiological rhythm and may therefore carry other information. In particular, this musical rhythm is advantageously employed to indicate the desired frequency of sporting efforts.
Advantageously, when the musical rhythm has a mean over a field for measuring the physiological rhythm, the contribution dependent on the physiological rhythm over this field is less than 20% of this mean, and preferably less than 10%. The percentage of this contribution is calculated as indicated above.
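The following sketch (illustrative only; the background rhythm of 100 beats per minute and the 30% amplitude echo the example above, while the measurement field and the function names are assumptions) shows how the contribution and its ratio to the mean could be evaluated numerically:

```python
def rhythm_contribution(musical_rhythm, field, steps=1000):
    """Return (contribution, mean, ratio): contribution is the maximum amplitude
    of variation of the musical rhythm over the measurement field, ratio is
    contribution / mean (to stay below 0.5, preferably below 0.2 or 0.1)."""
    lo, hi = field
    values = [musical_rhythm(lo + (hi - lo) * i / steps) for i in range(steps + 1)]
    contribution = max(values) - min(values)
    mean = sum(values) / len(values)
    return contribution, mean, contribution / mean

# Example modeled on the description: the rhythm increases with RP around a
# user-chosen background rhythm of 100 beats/min, with an amplitude of 30% of it.
background = 100.0
field = (50.0, 180.0)  # hypothetical measurement field (heartbeats per minute)
law = lambda rp: background + 0.3 * background * ((rp - field[0]) / (field[1] - field[0]) - 0.5)

c, m, r = rhythm_contribution(law, field)
print(round(c, 1), round(m, 1), round(r, 2))  # 30.0 100.0 0.3 -> minor (ratio < 0.5)
```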
In a particularly advantageous embodiment, the contribution of the musical rhythm dependent on the physiological rhythm is zero, such that the musical rhythm is independent of the physiological rhythm.
According to a first form of this embodiment, the musical rhythm is substantially constant over time. This constancy applies apart from any intervention by a user, who may modify the musical rhythm.
The music production system then preferably comprises a unit for controlling the musical rhythm by a user, allowing this user to choose the musical rhythm which is substantially constant over time. This embodiment may prove particularly useful for rhythmic sporting efforts. This is because it is then desirable to dissociate the cadence of the effort, for example of walking, weightlifting or pedaling, from that of the heartbeat. With the present embodiment, the user has both this reference frequency and musical indications concerning his physical abilities. In other circumstances, the choice of the musical rhythm makes it possible in particular to provide a pleasant rhythm, for example for a motor vehicle driver, or a lively rhythm, for example for music produced in a discotheque.
According to a second form of the embodiment with zero contribution from the physiological rhythm in the musical rhythm, the latter can be varied over time. Thus, one is entirely free to vary rhythms in the music produced, which makes it possible, for example, to choose various pieces without thinking about rhythms, or to introduce pleasant diversity, thus avoiding monotony.
The other musical parameters are advantageously chosen from styles of music (advantageously including at least one of the following harmonic families: western, eastern, far-eastern, blues, jazz, etc.), melodies, instrumentations (including, separately or in any combination: a choice of instruments or sounds, and the setting or "mixing" of instruments with respect to the parameters of overall volume, reverberation, echoes, panning, envelope, clarity of sound, etc.), scales, note pitches, tonalities, harmonic chords and any combination of these parameters. A variation in sound intensity is also possible, but is desirable as a secondary rather than a primary parameter, because of the uncertainty of discrimination by the listener and the discomfort which could result from excessive variations in sound level.
As for the physiological parameters, they are advantageously chosen from heartbeats, an electroencephalogram, blood pressure, ocular pupil size, overall body movements, muscular tension (in particular quasi-static movement, such as a bodybuilding effort) and any combination of these parameters. Among beneficial applications of such measurements, it is especially possible to distinguish the use:
- of heartbeats for checking sporting efforts;
- of ocular pupil sizes for checking the vigilance of motor vehicle drivers;
- of overall body movements (arms, legs, etc.) for musical animations in a discotheque; and
- of an electroencephalogram for relaxation techniques.
In a first form of deterministic relationship between the physiological rhythm and the other musical parameters, the latter depend on the physiological rhythm according to a dependence relationship which is continuous. A user may thus progressively notice a change in the music heard as a function of the physiological rhythm, which allows a subtle distinction. Furthermore, the transitions are thus pleasant to listen to.
In a second form of deterministic relationship between the physiological rhythm and the other musical parameters, the latter depend on the physiological rhythm according to a dependence relationship which is in steps. This embodiment may allow the user to hear clearly significant transitions between several ranges of values of the physiological rhythm (for example, passing from an acceptable range to a dangerous range for sporting efforts).
These two forms are advantageously combined, so as to provide continuity of listening (a rather quantitative perception) while highlighting significant transitions (a rather qualitative perception).
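As an illustration of combining the two forms, the sketch below (hypothetical thresholds and parameter meanings) pairs a continuous dependence, giving a rather quantitative perception, with a stepped dependence marking significant transitions such as the passage into a dangerous effort range:

```python
def continuous_parameter(rp, lo=50.0, hi=180.0):
    """Continuous dependence: e.g. a pitch-register index rising smoothly with RP."""
    rp = min(max(rp, lo), hi)
    return (rp - lo) / (hi - lo)  # 0.0 .. 1.0

def stepped_parameter(rp, thresholds=(100.0, 140.0, 170.0)):
    """Stepped dependence: e.g. an instrumentation index that changes only
    when RP crosses a significant threshold."""
    return sum(1 for t in thresholds if rp >= t)  # 0, 1, 2 or 3

def musical_parameters(rp):
    return {"register": continuous_parameter(rp), "instrumentation": stepped_parameter(rp)}

print(musical_parameters(95))   # {'register': ~0.35, 'instrumentation': 0}
print(musical_parameters(150))  # {'register': ~0.77, 'instrumentation': 2}
```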
The music production system advantageously comprises a unit for selecting a relationship of dependence of the other musical parameters on the physiological rhythm, from user information and reference dependence relationships stored in a database. It is thus possible to match the change in musical parameters to the specific characteristics of the user, such as his age, his sex, his height and his weight, etc. To do this, it is possible to resort to graphs giving appropriate curves.
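A minimal sketch of such a selection unit, assuming the database simply maps identifiers to reference curves and that the selection rules depend on age and training frequency (the identifiers C1A to C1D echo figure 4, but every rule and value here is an illustrative assumption):

```python
# Hypothetical database of reference dependence relationships (identifier -> curve).
REFERENCE_CURVES = {
    "C1A": lambda rp: max(0.0, 1.0 - rp / 150.0),  # steepest decrease
    "C1B": lambda rp: max(0.0, 1.0 - rp / 180.0),
    "C1C": lambda rp: max(0.0, 1.0 - rp / 200.0),
    "C1D": lambda rp: max(0.0, 1.0 - rp / 220.0),  # gentlest decrease
}

def select_dependence(user_info):
    """Pick a reference curve identifier from user information (age, sex,
    frequency of sporting activities, ...). The rules are purely illustrative."""
    age = user_info.get("age", 40)
    trained = user_info.get("sporting_frequency", 0) >= 3  # sessions per week
    if age < 30:
        return "C1D" if trained else "C1C"
    return "C1B" if trained else "C1A"

ird = select_dependence({"age": 25, "sex": "F", "sporting_frequency": 4})
curve = REFERENCE_CURVES[ird]  # the music production unit then uses this curve
```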
Preferably, the unit for producing music comprises a generator of music, this generator having at least one input parameter consisting of the musical parameter or musical parameters. Thus a complete and subtle recognition of the various other musical parameters is made possible, while producing original and varied music.
In a preferred embodiment of the music generator, the latter comprises:
- a means of defining musical moments during which at least four notes are respectively capable of being played,
- a means of defining two families of note pitches, for each musical moment, the second family of note pitches having at least one note pitch which is not in the first family of note pitches,
- a means of constructing at least one succession of notes having at least two notes, each succession of notes being called a musical phrase, a succession in which two successive notes are necessarily chosen from at least the first two of the three following types, independently of the order of the two notes:
- two notes whose pitches belong to the first family,
- one note whose pitch belongs to the first family and one note whose pitch belongs to the second family,
- two notes whose pitches belong exclusively to the second family, the pitch of one of the two notes corresponding to a sixth degree of a diatonic scale and the pitch of the other note corresponding to a seventh degree of said diatonic scale, and these two notes being exclusively in the immediate vicinity of notes whose pitches belong to the first family, and
- a means of outputting a signal representative of each note pitch of each succession.
This music generator advantageously complies with that disclosed in document WO-00/17850 (or in the corresponding European patent EP-B-1116213, in the process of being granted). Thus, preferably, the means of constructing at least one succession of notes is such that, for each moment, each note whose pitch belongs exclusively to the second family is surrounded exclusively by notes from the first family.
By virtue of these arrangements, the succession of note pitches has both great richness, since the number of successions that can thus be generated is several thousand, and harmonic coherence, since the polyphony generated is regulated by restrictions.
According to particular features, the means of defining two families of note pitches is suitable for defining, for each musical moment, the first family as a set of note pitches belonging to a chord duplicated octave by octave.
According to other particular features, the means of defining two families of note pitches is suitable for defining the second family of note pitches such that it comprises at least the note pitches of a scale which are not in the first family of note pitches.
By virtue of these arrangements, the definition of the families is easy and the alternation of notes from the two families is harmonious.
According to other particular features, the means of constructing at least one succession of notes having at least two notes is adapted so that each musical phrase is defined as a set of notes, the starting times of which are not separated from each other, in pairs, by more than a predetermined duration.
By virtue of these arrangements, a musical phrase consists, for example, of notes whose starts are not separated by more than three sixteenth notes.
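By way of illustration, the sketch below groups note start times into musical phrases using this criterion, with the predetermined duration set to three sixteenth notes; the greedy grouping strategy and the time unit (beats, with a quarter note equal to one beat) are assumptions:

```python
SIXTEENTH = 0.25          # one sixteenth note, in beats (quarter note = 1 beat)
MAX_SPAN = 3 * SIXTEENTH  # predetermined duration: three sixteenth notes

def split_into_phrases(onsets, max_span=MAX_SPAN):
    """Group note start times (in beats) into musical phrases: within a phrase,
    any two starts differ by at most max_span."""
    phrases, current = [], []
    for t in sorted(onsets):
        if current and t - current[0] > max_span:
            phrases.append(current)
            current = []
        current.append(t)
    if current:
        phrases.append(current)
    return phrases

print(split_into_phrases([0.0, 0.25, 0.5, 2.0, 2.25, 4.0]))
# [[0.0, 0.25, 0.5], [2.0, 2.25], [4.0]]
```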
The invention also relates to a method of producing music as a function of physiological parameters, according to which music having at least one musical rhythm and at least one other musical parameter is produced, these other musical parameters depending on at least one physiological rhythm of at least one physiological parameter.
According to the invention, the physiological rhythm is essentially represented in the music by the other musical parameters.
The music production method is preferably implemented by means of a music production system according to any one of the embodiments of the invention.
Preferably, the physiological rhythm of the physiological parameter, used to produce the music, is measured on a user. The music produced is then advantageously sent in real time to the user. The user can then adapt his behavior as a function of the information received, in a conscious way (for example in the case of warning of excessive effort) or in an involuntary way (for example relaxation).
In another embodiment, recorded data is used to produce music. Thus, beneficial information is obtained which can be directly interpreted by the listener.

The subject of the invention is also a computer program. According to the invention, the latter comprises functionalities capable of producing any one of the embodiments of a music production system according to the invention and/or of implementing any one of the forms of a music production method according to the invention when this program is executed on a computer.
The term "computer program" refers to any physical embodiment of a computer program, which may include not only storage media (cassettes, disks, etc.) but also (electrical, optical, etc.) signals.
The invention will be better understood and illustrated by means of the following exemplary and implementational embodiments which are in no way limiting, with reference to the appended figures in which:
- figure 1A shows the variations of a first musical parameter representing a physiological rhythm in music produced as a function of this physiological rhythm;
- figure 1B shows the variations of a second musical parameter representing a physiological rhythm in music produced as a function of this physiological rhythm;
- figure 1C shows the variations of a third musical parameter, not representing a physiological rhythm in music produced as a function of this physiological rhythm;
- figure 2 is an outline diagram of a musical production assembly comprising a music production system according to the invention;
- figure 3 details a music production unit of the music production system of figure 2;
- figure 4 shows continuous parametrized curves giving a first musical parameter used in the music production system of figures 2 and 3, as a function of a pressure rhythm;
- figure 5 shows a stepped curve giving a second musical parameter used in the music production system of figures 2 and 3, as a function of the pressure rhythm;
- figure 6 shows a flow diagram for generating automatic music, implemented for generating music by means of the music production unit of figures 2 and 3;
- figure 7 shows, in the form of a block diagram, a music generator which can be used in the music production unit of figures 2 and 3 and allowing automatic music generation according to figure 6.
An assembly for producing music as a function of physiological parameters PP comprises (figure 2) a music production system 1 especially including a unit 11 for receiving these parameters PP and a unit 12 for producing music M as a function of these parameters PP. The music production assembly also comprises a detector 2 for detecting a physiological quantity on a user and a filter 3, making it possible to extract the desired physiological parameters PP from the detected quantity and to transmit them to the reception unit 11 of the music production system 1. By way of illustration, the measured physiological quantity is blood pressure, and variations of this pressure with time, in particular a pressure rhythm RP, are extracted.
The music production assembly is also provided with headphones 4 receiving the music M supplied by the music production system 1 and broadcasting it to a user, who is preferably the one on whom the measurements are carried out. Thus, this user has, advantageously in real time, feedback in musical form on his blood pressure variations, that is to say on his heartbeat.
The music production system 1 also has a unit 14 for selecting dependence relationships RD between the measured pressure rhythm RP and musical parameters PM, which serve to produce the music M. In the example illustrated, this selection only relates to some of the musical parameters PM, each of the other musical parameters already having a fixed dependence relationship. The selection unit 14 is provided in order to determine, for each musical parameter PM involved, the appropriate dependence relationship RD, from user information IU (sex, age, frequency of sporting activities practiced, etc.), for example entered beforehand by the user himself.
The dependence relationships RD themselves are provided to be stored in a database 5, which can be accessed by the music production unit 12. The selection unit 14 especially has the function of recording an identifier IRD for the selected dependence relationship RD in the database 5, so that the music production unit 12 can directly access the latter among the various dependence relationships RD available in the database 5.
The music production system 1 further comprises a rhythm control unit 13, intended to specify, to the music production unit 12, a musical rhythm RM to be adopted for the music M produced. This control unit 13 acts as a function of user commands C. In the example illustrated, this rhythm RM is imposed on a background music M0 having a rhythm which is substantially constant over time.
The music production unit 12 will now be detailed (figure 3). The latter comprises a module 21 for calculating musical parameters PM from dependence relationships RD recorded in the database 5, some of these dependence relationships RD being selected by means of corresponding identifiers IRD.
By way of illustration (figure 4), one of the musical parameters PM1 (axis 31) is given by a curve C1 which decreases as a function of the pressure rhythm RP (axis 30), this curve C1 being determined from user information IU. Thus, various possible curves C1A, C1B, C1C and C1D are distinguished, the choice of the correct curve C1 being carried out by the selection unit 14 and being recorded in the database 5 in the form of the identifier IRD associated with this parameter PM1. A second of the musical parameters, PM2 (axis 32), is defined independently of all other information, as a function of the pressure rhythm RP (figure 5). This parameter PM2 is given by a curve C2 comprising successive steps P1-P5, such that the parameter PM2 has constant values over respective ranges of values of the pressure rhythm RP. The two parameters PM1 and PM2 are used jointly to generate the background music M0, possibly in combination with yet other musical parameters PM. The music M produced thus varies in two different ways, both by indicating minor modifications of the pressure rhythm RP and by emphasizing significant changes of this rhythm. For example, the parameter PM1 corresponds to note pitches, while the parameter PM2 involves instruments employed to play these notes. In another example, the parameter PM1 represents a continuous passage from one type of instrument to another (such as passing from a string instrument to a wind instrument), while the parameter PM2 defines the presence and the intensity of percussion instruments.
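A sketch of how the two parameters of the second example could be consumed jointly, with PM1 as a continuous crossfade from a string instrument to a wind instrument and PM2 as the stepped presence and intensity of percussion; the numeric ranges and thresholds are illustrative assumptions rather than values from the application:

```python
def pm1_crossfade(rp, lo=50.0, hi=180.0):
    """PM1: continuous passage from a string instrument to a wind instrument.
    Returns (string_level, wind_level), each between 0 and 1."""
    x = min(max((rp - lo) / (hi - lo), 0.0), 1.0)
    return 1.0 - x, x

def pm2_percussion(rp):
    """PM2: stepped presence and intensity of percussion instruments."""
    if rp < 100.0:
        return 0.0   # no percussion
    if rp < 140.0:
        return 0.5   # light percussion
    return 1.0       # full percussion

rp = 120.0           # current pressure rhythm (hypothetical value)
strings, winds = pm1_crossfade(rp)
percussion = pm2_percussion(rp)
```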
The music production unit 12 also comprises a music generator 22, capable of producing the background music M0, with a rhythm which is substantially constant over time, from musical parameters PM supplied by the computing module 21.
Finally, it has a module 23 for adapting the rhythm of the music M produced, as a function of the musical rhythm RM communicated by the rhythm control unit 13, by rhythmic adaptation of the background music M0 coming from the music generator 22.

One advantageous embodiment of the music generator 22 will now be explained, with reference to figures 6 and 7. Practical details are outlined in international application WO-00/17850 (or in the corresponding European patent EP-B-1116213).
Figure 6 shows schematically a flowchart for automatic music generation used for the generator 22.
After the start 41, musical moments are defined during an operation 42. For example, a musical piece comprising bars, each bar comprising beats, each beat comprising note placements, is defined. In this example, the operation 42 consists in allocating a bar number to the musical piece, a beat number to each bar and, to each beat, a note placement number or a minimum note duration. Each musical moment is defined in such a way that at least four notes are capable of being played for its duration.
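A minimal sketch of operation 42, under the assumption that a musical moment is one half bar and that each beat offers four note placements (both choices are illustrative, not prescribed by the application):

```python
def define_musical_moments(n_bars=8, beats_per_bar=4, placements_per_beat=4):
    """Operation 42 (sketch): allocate bars, beats and note placements, and
    group them into musical moments of one half bar each, so that at least
    four notes can be played during every moment."""
    moments = []
    for bar in range(n_bars):
        for half in range(2):
            first_beat = half * beats_per_bar // 2
            beats = range(first_beat, first_beat + beats_per_bar // 2)
            placements = [(bar, beat, p)
                          for beat in beats
                          for p in range(placements_per_beat)]
            moments.append({"bar": bar, "half": half, "placements": placements})
    return moments

moments = define_musical_moments()
print(len(moments), len(moments[0]["placements"]))  # 16 moments, 8 placements each
```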
Next, during an operation 44, two families of note pitches are defined for each musical moment, the second family of note pitches having at least one note pitch which is not in the first family. For example, a scale and a chord are allocated to each half bar of the musical piece, the first family comprising the note pitches of this chord, duplicated octave by octave, and the second family comprising at least the note pitches of the scale which are not in the first family. It is observed that different musical moments or consecutive musical moments may have the same families of note pitches.
Then, during an operation 46, at least one succession of notes having at least two notes is constructed with, for each moment, each note whose pitch belongs exclusively to the second family being exclusively surrounded by notes from the first family. For example, a succession of notes is defined as a set of notes, the starting times of which are not separated from each other, in pairs, by more than a predetermined duration. Thus, in the example outlined with operation 44, for each half bar, a succession of notes does not have two consecutive note pitches which are exclusively in the second family of note pitches.
During an operation 48, a signal representative of the note pitches of each succession is emitted. The music generation then stops at operation 50.
Figure 7 shows, in the form of a block diagram, one embodiment of the generator 22. In this embodiment, the generator 22 comprises, connected to each other by at least one signal line 55, a generator 52 generating families of note pitches, a musical moment generator 54, a musical phrase generator 56 and an output port 58. The output port 58 is connected to an external signal line 59, itself connected to the headphones 4 via the rhythm adaptation module 23.
The signal line 59 is a line capable of transporting a message or information. For example, it is an electrical or optical conductor of known type. The musical moment generator 54 defines musical moments such that, during each musical moment, four notes are capable of being played. For example, the musical moment generator 54 defines a musical piece by a bar number contained therein and, for each bar, a beat number and, for each beat, a placement number for a possible note start or a minimum note duration.
The generator 52 for generating families of note pitches defines two families of note pitches, for each musical moment. The generator 52 defines the two families of note pitches in such a way that the second family of note pitches has at least one note pitch which is not in the first family of note pitches. For example, a scale and a chord are allocated to each half bar of the musical piece, the first family comprising the note pitches of this chord, duplicated octave by octave, and the second family comprising at least the note pitches of the scale which are not in the first family. It is observed that various musical moments or consecutive musical moments may have the same families of note pitches.
The musical phrase generator 56 generates at least one succession of notes having at least two notes, each succession being constructed in such a way that, for each moment, each note whose pitch belongs exclusively to the second family is surrounded exclusively by notes of the first family. For example, a succession of notes is defined as a set of notes whose starting times are not separated from each other, in pairs, by more than a predetermined duration. Thus, in the example outlined with the generator 52 generating families of note pitches, for each half bar, a succession of notes does not have two consecutive note pitches which are exclusively in the second family of note pitches.
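Finally, a compact sketch of how the family generator 52 and the phrase generator 56 could cooperate for one musical moment, following the description above: the first family is a chord duplicated octave by octave, the second family gathers the remaining scale pitches, and the succession is built so that any pitch taken exclusively from the second family is immediately surrounded by first-family pitches. The C major chord and scale, the MIDI pitch range and the random choice strategy are assumptions for illustration; this is not the generator of WO-00/17850 itself.

```python
import random

def build_families(chord_classes, scale_classes, low=48, high=84):
    """Family 1: pitches of the chord duplicated octave by octave over the range.
    Family 2: pitches of the scale that are not in family 1 (MIDI note numbers)."""
    family1 = sorted(p for p in range(low, high + 1) if p % 12 in chord_classes)
    family2 = sorted(p for p in range(low, high + 1)
                     if p % 12 in scale_classes and p % 12 not in chord_classes)
    return family1, family2

def build_succession(family1, family2, length=8, seed=None):
    """Construct a succession in which every pitch taken exclusively from
    family 2 is surrounded exclusively by family-1 pitches (so no two
    consecutive family-2 pitches, and none at the start or the end)."""
    rng = random.Random(seed)
    succession, previous_was_f2 = [], False
    for i in range(length):
        allow_f2 = 0 < i < length - 1 and not previous_was_f2
        if allow_f2 and rng.random() < 0.4:
            succession.append(rng.choice(family2))
            previous_was_f2 = True
        else:
            succession.append(rng.choice(family1))
            previous_was_f2 = False
    return succession

# One musical moment: C major chord (C, E, G) over the C major scale.
f1, f2 = build_families(chord_classes={0, 4, 7}, scale_classes={0, 2, 4, 5, 7, 9, 11})
print(build_succession(f1, f2, seed=1))
```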
The output port 58 transmits, via the external signal line 59, a signal representative of the note pitches of each succession.

Claims

1. A system (1) for producing music as a function of physiological parameters (PP), comprising:
- a unit (11) for receiving at least one physiological parameter (PP) having at least one physiological rhythm (RP),
- and a unit (12) for producing music (M) having at least one musical rhythm (RM) and at least one other musical parameter (PM), said other musical parameter depending on said physiological rhythm (RP),
characterized in that said unit (12) for producing music (M) is such that said physiological rhythm (RP) is essentially represented in said music by said other musical parameter (PM).
2. The music production system (1) as claimed in claim 1, characterized in that, since said musical rhythm (RM) comprises a contribution dependent on said physiological rhythm (RP), said unit (12) for producing music (M) is such that said dependent contribution is minor in said musical rhythm (RM).
3. The music production system (1) as claimed in claim 2, characterized in that, since said musical rhythm (RM) has a mean over a field for measuring said physiological rhythm, said contribution dependent on said physiological rhythm (RP) over said field is less than 20% of said mean, and preferably less than 10%.
4. The music production system (1) as claimed in claim 3, characterized in that said contribution dependent on said physiological rhythm (RP) is zero, such that said musical rhythm (RM) is independent of said physiological rhythm (RP).
5. The music production system (1) as claimed in claim 4, characterized in that said musical rhythm (RM) is substantially constant over time.
6. The music production system (1) as claimed in claim 5, characterized in that said system (1) for producing music (M) comprises a unit (13) for controlling the musical rhythm (RM) by a user, allowing said user to choose said musical rhythm (RM) which is substantially constant over time.
7. The music production system (1) as claimed in any one of the preceding claims, characterized in that said other musical parameters (PM) are chosen from styles of music, melodies, instrumentations, scales, note pitches, tonalities, harmonic chords and any combination of said parameters.
8. The music production system (1) as claimed in any one of the preceding claims, characterized in that said physiological parameters (PP) are chosen from heartbeats, an electroencephalogram, blood pressure, ocular pupil size, overall body movements, muscular tension and any combination of said parameters.
9. The music production system (1) as claimed in any one of the preceding claims, characterized in that said other musical parameter (PM1) depends on said physiological rhythm (RP) according to a dependence relationship (RD) which is continuous (C1A-C1D).
10. The music production system (1) as claimed in any one of the preceding claims, characterized in that said other musical parameter (PM2) depends on said physiological rhythm (RP) according to a dependence relationship (RD) which is in steps (C2).
11. The music production system (1) as claimed in any one of the preceding claims, characterized in that it comprises a unit (14) for selecting a relationship (RD) of dependence of said other musical parameter (PM) on said physiological rhythm (RP), from user information (IU) and reference dependence relationships stored in a database (BD).
12. The music production system (1) as claimed in any one of the preceding claims, characterized in that said unit (12) for producing music (M) comprises a generator (22) of music (MO), said generator (22) having at least one input parameter consisting of said musical parameter (PM).
13. The music production system (1) as claimed in claim 12, characterized in that said music generator (22) comprises:
- a means (54) of defining musical moments during which at least four notes are respectively capable of being played,
- a means (52) of defining two families of note pitches, for each musical moment, the second family of note pitches having at least one note pitch which is not in the first family of note pitches,
- a means (56) of constructing at least one succession of notes having at least two notes, each succession of notes being called a musical phrase, a succession in which two successive notes are necessarily chosen from at least the first two of the three following types, independently of the order of the two notes:
- two notes whose pitches belong to the first family,
- one note whose pitch belongs to the first family and one note whose pitch belongs to the second family,
- two notes whose pitches belong exclusively to the second family, the pitch of one of the two notes corresponding to a sixth degree of a diatonic scale and the pitch of the other note corresponding to a seventh degree of said diatonic scale, and said two notes being exclusively in the immediate vicinity of notes whose pitches belong to the first family, and
- a means (58) of outputting a signal representative of each note pitch of each said succession.
14. A method of producing music as a function of physiological parameters (PP), according to which music (M) having at least one musical rhythm (RM) and at least one other musical parameter (PM) is produced, said other musical parameter depending on at least one physiological rhythm (RP) of at least one physiological parameter (PP),
characterized in that said physiological rhythm (RP) is essentially represented in said music by said other musical parameter (PM),
said music production method preferably being implemented by means of a music production system (1) according to any one of claims 1 to 13.
15. The music production method as claimed in claim 14, characterized in that said physiological rhythm (RP) of said physiological parameter (PP), used to produce said music (M), is measured on a user.
16. The music production method as claimed in claim 15, characterized in that the music (M) produced is sent in real time to the user.
17. A computer program, characterized in that it comprises functionalities capable of producing a music production system (1) as claimed in any one of claims 1 to 13 and/or of implementing a music production method as claimed in any one of claims 14 to 16 when said program is executed on a computer.
PCT/EP2002/007348 2001-07-03 2002-07-03 Systems and method for producing music as a function of physiological parameters WO2003005339A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR01/08805 2001-07-03
FR0108805A FR2827069A1 (en) 2001-07-03 2001-07-03 DEVICES AND METHOD FOR PRODUCING MUSIC BASED ON PHYSIOLOGICAL PARAMETERS

Publications (1)

Publication Number Publication Date
WO2003005339A1 true WO2003005339A1 (en) 2003-01-16

Family

ID=8865064

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2002/007348 WO2003005339A1 (en) 2001-07-03 2002-07-03 Systems and method for producing music as a function of physiological parameters

Country Status (2)

Country Link
FR (1) FR2827069A1 (en)
WO (1) WO2003005339A1 (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7177672B2 (en) * 2002-12-16 2007-02-13 Polar Electro Oy Coding heart rate information
ES2277572A1 (en) * 2006-09-22 2007-07-01 Universitat Autonoma De Barcelona Method for musicalizing a phenomenon being monitored
JP2007522862A (en) * 2004-02-19 2007-08-16 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Voice interval training device
JP2008535532A (en) * 2005-02-14 2008-09-04 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Electronic device and method for reproducing human sensing signals
EP1977796A1 (en) * 2006-01-24 2008-10-08 Sony Corporation Audio reproducing device, audio reproducing method, and audio reproducing program
US7544880B2 (en) 2003-11-20 2009-06-09 Sony Corporation Playback mode control device and playback mode control method
EP2113194A1 (en) * 2008-05-02 2009-11-04 Paracelsus Klinik Lustmühle AG Device and method for acoustic and visual representation of prepared physiological data and use of prepared data
WO2011054423A1 (en) * 2009-11-06 2011-05-12 Sony Ericsson Mobile Communications Ab Method for setting up a list of audio files
EP3109595A1 (en) * 2004-12-17 2016-12-28 NIKE Innovate C.V. Multi-sensor monitoring of athletic performance where the playbackspeed of a media content depends upon gps data
GB2551807A (en) * 2016-06-30 2018-01-03 Lifescore Ltd Apparatus and methods to generate music
CN109522441A (en) * 2019-01-09 2019-03-26 昆山快乐岛运动电子科技有限公司 Music screening technique
US10335489B2 (en) 2012-01-09 2019-07-02 Adocia Injectable solution at pH 7 comprising at least one basal insulin the pi of which is between 5.8 and 8.5 and a substituted co-polyamino acid
CN110124177A (en) * 2019-05-24 2019-08-16 江西应用技术职业学院 A kind of psychological venting device
US11633460B2 (en) 2017-12-07 2023-04-25 Adocia Injectable solution at pH 7 comprising at least one basal insulin wherein the pI is comprised from 5.8 to 8.5 and a co-polyamino acid bearing carboxylate charges and hydrophobic radicals
US11883496B2 (en) 2017-12-07 2024-01-30 Adocia Injectable pH 7 solution comprising at least one basal insulin having a pI from 5.8 to 8.5 and a co-polyamino acid bearing carboxylate charges and hydrophobic radicals

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4627324A (en) * 1984-06-19 1986-12-09 Helge Zwosta Method and instrument for generating acoustic and/or visual effects by human body actions
EP0301790A2 (en) * 1987-07-24 1989-02-01 BioControl Systems, Inc. Biopotential digital controller for music and video applications
FR2785077A1 (en) * 1998-09-24 2000-04-28 Rene Louis Baron Automatic music generation method and device, comprises defining musical moments from note inputs and note pitch libraries
WO2000033731A1 (en) * 1998-12-10 2000-06-15 Andrew Junker Brain-body actuated system

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7177672B2 (en) * 2002-12-16 2007-02-13 Polar Electro Oy Coding heart rate information
US7544880B2 (en) 2003-11-20 2009-06-09 Sony Corporation Playback mode control device and playback mode control method
JP2007522862A (en) * 2004-02-19 2007-08-16 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Voice interval training device
EP3109595A1 (en) * 2004-12-17 2016-12-28 NIKE Innovate C.V. Multi-sensor monitoring of athletic performance where the playbackspeed of a media content depends upon gps data
US10668324B2 (en) 2004-12-17 2020-06-02 Nike, Inc. Multi-sensor monitoring of athletic performance
US10328309B2 (en) 2004-12-17 2019-06-25 Nike, Inc. Multi-sensor monitoring of athletic performance
US10022589B2 (en) 2004-12-17 2018-07-17 Nike, Inc. Multi-sensor monitoring of athletic performance
US9937381B2 (en) 2004-12-17 2018-04-10 Nike, Inc. Multi-sensor monitoring of athletic performance
US11590392B2 (en) 2004-12-17 2023-02-28 Nike, Inc. Multi-sensor monitoring of athletic performance
US11071889B2 (en) 2004-12-17 2021-07-27 Nike, Inc. Multi-sensor monitoring of athletic performance
US9833660B2 (en) 2004-12-17 2017-12-05 Nike, Inc. Multi-sensor monitoring of athletic performance
US11465032B2 (en) 2005-02-14 2022-10-11 Koninklijke Philips N.V. Electronic device and method for reproducing a human perceptual signal
JP2008535532A (en) * 2005-02-14 2008-09-04 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Electronic device and method for reproducing human sensing signals
US10741218B2 (en) 2005-02-14 2020-08-11 Koninklijke Philips N.V. Electronic device and method for reproducing a human perceptual signal
US8212136B2 (en) 2006-01-24 2012-07-03 Sony Corporation Exercise audio reproducing device, exercise audio reproducing method, and exercise audio reproducing program
EP1977796A4 (en) * 2006-01-24 2010-01-20 Sony Corp Audio reproducing device, audio reproducing method, and audio reproducing program
EP1977796A1 (en) * 2006-01-24 2008-10-08 Sony Corporation Audio reproducing device, audio reproducing method, and audio reproducing program
ES2277572A1 (en) * 2006-09-22 2007-07-01 Universitat Autonoma De Barcelona Method for musicalizing a phenomenon being monitored
WO2008034930A1 (en) * 2006-09-22 2008-03-27 Universitat Autonoma De Barcelona Method for musicalizing a phenomenon being monitored
EP2113194A1 (en) * 2008-05-02 2009-11-04 Paracelsus Klinik Lustmühle AG Device and method for acoustic and visual representation of prepared physiological data and use of prepared data
WO2009133484A1 (en) * 2008-05-02 2009-11-05 Paracelsus Klinik Lustmühle Ag Device and method for the acoustic and visual representation of processed physiological data and use of the processed data
WO2011054423A1 (en) * 2009-11-06 2011-05-12 Sony Ericsson Mobile Communications Ab Method for setting up a list of audio files
US10335489B2 (en) 2012-01-09 2019-07-02 Adocia Injectable solution at pH 7 comprising at least one basal insulin the pi of which is between 5.8 and 8.5 and a substituted co-polyamino acid
US10839780B2 (en) 2016-06-30 2020-11-17 Lifescore Limited Apparatus and methods for cellular compositions
GB2551807A (en) * 2016-06-30 2018-01-03 Lifescore Ltd Apparatus and methods to generate music
GB2551807B (en) * 2016-06-30 2022-07-13 Lifescore Ltd Apparatus and methods to generate music
US11633460B2 (en) 2017-12-07 2023-04-25 Adocia Injectable solution at pH 7 comprising at least one basal insulin wherein the pI is comprised from 5.8 to 8.5 and a co-polyamino acid bearing carboxylate charges and hydrophobic radicals
US11883496B2 (en) 2017-12-07 2024-01-30 Adocia Injectable pH 7 solution comprising at least one basal insulin having a pI from 5.8 to 8.5 and a co-polyamino acid bearing carboxylate charges and hydrophobic radicals
CN109522441A (en) * 2019-01-09 2019-03-26 昆山快乐岛运动电子科技有限公司 Music screening technique
CN110124177A (en) * 2019-05-24 2019-08-16 江西应用技术职业学院 A kind of psychological venting device

Also Published As

Publication number Publication date
FR2827069A1 (en) 2003-01-10

Similar Documents

Publication Publication Date Title
US4883067A (en) Method and apparatus for translating the EEG into music to induce and control various psychological and physiological states and to control a musical instrument
US5529498A (en) Method and apparatus for measuring and enhancing neuro-motor coordination
Harrer et al. Music, emotion and autonomic function
Hodges Bodily responses to music
Fraisse Rhythm and tempo
US5076281A (en) Device and method for effecting rhythmic body activity
Fraisse Time and rhythm perception
Fitch et al. Perception and production of syncopated rhythms
Unwin et al. The effects of group singing on mood
EP0872255B1 (en) Relax guiding device and biofeedback guiding device
WO2003005339A1 (en) Systems and method for producing music as a function of physiological parameters
Tervaniemi et al. Melodic multi-feature paradigm reveals auditory profiles in music-sound encoding
Dromey et al. The effects of emotional expression on vibrato
US10636400B2 (en) Method for producing and streaming music generated from biofeedback
Leydon et al. The role of auditory feedback in sustaining vocal vibrato
US20220086560A1 (en) Binaural signal composing apparatus
Hébert et al. Detection of metric structure in auditory figural patterns
Janata et al. Spectral analysis of the EEG as a tool for evaluating expectancy violations of musical contexts
Cazden Sensory theories of musical consonance
JP3868326B2 (en) Sleep introduction device and psychophysiological effect transfer device
EP0432152B1 (en) Apparatus for translating the eeg into music
Janata Cognitive neuroscience of music
JP2735592B2 (en) A device that converts brain waves into music
Tulilaulu et al. Sleep musicalization: Automatic music composition from sleep measurements
AU636287B2 (en) Apparatus for translating the eeg into music

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP