EP1784049A1 - Method and system for sound reproduction, and computer program product - Google Patents
Method and system for sound reproduction, and computer program product
- Publication number
- EP1784049A1 (application EP05024347A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sound reproduction
- mobile terminal
- audio content
- mapping
- leading mobile
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000000034 method Methods 0.000 title claims abstract description 23
- 238000013507 mapping Methods 0.000 claims abstract description 28
- 230000003287 optical effect Effects 0.000 claims description 3
- 239000004020 conductor Substances 0.000 description 25
- 239000000872 buffer Substances 0.000 description 3
- 238000004891 communication Methods 0.000 description 2
- 238000001228 spectrum Methods 0.000 description 2
- 230000001360 synchronised effect Effects 0.000 description 2
- 230000001934 delay Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000011664 signaling Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04S—STEREOPHONIC SYSTEMS
- H04S7/00—Indicating arrangements; Control arrangements, e.g. balance control
- H04S7/30—Control circuits for electronic adaptation of the sound field
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2205/00—Details of stereophonic arrangements covered by H04R5/00 but not provided for in any of its subgroups
- H04R2205/024—Positioning of loudspeaker enclosures for spatial sound reproduction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2420/00—Details of connection covered by H04R, not provided for in its groups
- H04R2420/05—Detection of connection of loudspeakers or headphones to amplifiers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2420/00—Details of connection covered by H04R, not provided for in its groups
- H04R2420/07—Applications of wireless loudspeakers or wireless microphones
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04S—STEREOPHONIC SYSTEMS
- H04S3/00—Systems employing more than two channels, e.g. quadraphonic
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04S—STEREOPHONIC SYSTEMS
- H04S7/00—Indicating arrangements; Control arrangements, e.g. balance control
- H04S7/30—Control circuits for electronic adaptation of the sound field
- H04S7/301—Automatic calibration of stereophonic sound system, e.g. with test microphone
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04S—STEREOPHONIC SYSTEMS
- H04S7/00—Indicating arrangements; Control arrangements, e.g. balance control
- H04S7/30—Control circuits for electronic adaptation of the sound field
- H04S7/308—Electronic adaptation dependent on speaker or headphone connection
Definitions
- the invention relates to methods, program products and systems for sound reproduction.
- not only stereo televisions can be used to reproduce sound - especially music - but also mobile terminals, such as mobile telephones, portable computers, or portable music players that can receive audio content, can be used for this purpose.
- a method for sound reproduction comprises the steps of: a) receiving or generating a mapping from at least two instruments, from at least two frequency ranges, from at least two directional channels, or any combination thereof, to at least two sound reproduction devices or to at least one sound reproduction device and a leading mobile terminal; b) receiving audio content; c) using said mapping on the audio content to pass sound information describing an instrument, a frequency range, or a directional channel from said audio content to a corresponding sound reproduction device; and d) at a leading mobile terminal, synchronizing playback on at least one of the sound reproduction devices over a wireless local connection. In this way the audio content can be output via the sound reproduction devices, giving the impression that the audio content is reproduced via a surround speaker set or by a small orchestra (see the mapping sketch after this list).
- when step d) is performed at the leading mobile terminal by transmitting a synchronization signal to at least one of the sound reproduction devices, the user control over the sound reproduction may be improved.
- when the synchronization signal is an optical signal, a gimmick in the form of a light show may be obtained, at least if the optical signal is at least partly within the visible light spectrum. Furthermore, the proper functioning of the sound reproduction device may be checked in a less complex manner, even if one or more of the sound reproduction devices or the mobile terminal has been muted.
- when the mapping is automatically adapted responsive to a change in the relative position of the sound reproduction devices with respect to each other or to the leading mobile terminal, or responsive to a change in the availability of a sound reproduction device, the user experience may be improved further; if one of the sound reproduction devices is taken away, the missing instrument, frequency range, or directional channel can instead be output by at least one of the other sound reproduction devices.
- when the mapping is used on the audio content in a network server (see Figure 4), the required computing power at the leading mobile terminal can be reduced.
- with a system for sound reproduction that comprises not only a leading mobile terminal comprising a program product adapted to carry out the method according to the invention when executed in a processing unit, but also at least one sound reproduction device comprising means adapted to receive, from the leading mobile terminal, sound information describing at least one instrument, at least one frequency range, or at least one directional channel, and to receive synchronization information from the leading mobile terminal over wireless local connection means, the user experience may be improved. Especially if at least one of the further sound reproduction devices is a mobile terminal too, the versatility of the mobile terminals can be improved, possibly with a gimmick effect.
- Figure 1 illustrates the idea of a mobile orchestra. Sound reproduction devices 12, 13, 14, of which there may be arbitrarily many, form the mobile orchestra.
- the leading mobile terminal that conducts the orchestra may only direct the performance of the mobile orchestra or also be a part of the mobile orchestra. In the following, the leading mobile terminal is referred to as conductor 11.
- if the sound reproduction devices 12, 13, 14 are mobile terminals too, there may be graphics on the display of each mobile terminal illustrating the face of a musician, the face of a musician robot, or an instrument. The same applies to the conductor 11, but preferably a picture of a conductor is shown instead of a musician.
- Figure 1 shows the conductor 11 from behind, illustrating the principle that if the leading mobile terminal has a display on both sides of its housing, the graphics on the display on the back side of the housing may be different from that on the display on the front side, preferably showing the same object as in the other display but from behind.
- the same principle can be used to implement graphics on the display or displays of the sound reproduction devices, especially if they are mobile terminals.
- the looks of the musicians may be changed automatically responsive to the music style.
- jazz or blues musicians may have certain characteristics, such as clothing, showing a different style or ethnic background than that of musicians playing some other class of music, for example.
- Sets of images may be mapped to a given music style or performer, which may be recognized automatically through a genre, artist, or record identifier stored with the audio content.
- Figure 2 shows the conductor 11 together with the sound reproduction devices 12, 13, 14, 25 that form the mobile orchestra.
- the conductor 11 receives or generates a mapping 30 from at least two instruments, from at least two frequency ranges, or from at least two directional channels to at least two sound reproduction devices 12, 13, 14, 25.
- the mapping 30 may further comprise information to map at least one instrument, at least one frequency range, or at least one directional channel to the conductor 11.
- the conductor 11 receives audio content 31. Then the conductor 11 uses the mapping 30 on the audio content 31 to pass sound information describing an instrument, a frequency range, or a directional channel from the audio content 31 to a corresponding sound reproduction device 12, 13, 14, 25 or to itself 11. Then the conductor 11 preferably synchronizes playback on the sound reproduction devices 12, 13, 14, 25 over a wireless local connection, possibly with the playback locally on the conductor 11.
- the individual tracks can be transferred from the conductor 11 to each of the sound reproduction devices 12, 13, 14, 25 via a data cable, IrDA, or Bluetooth.
- the audio content preferably comprises control signals for synchronization. If the conductor 11 detects a control signal (Fig. 3: a synchronization mark), it signals this to the sound reproduction devices 12, 13, 14, 25, preferably by lighting up or flashing a light source, such as an LED flash.
- the light emitted by the light source is preferably at least partially within the visible light spectrum in order to have a visual effect.
- the control signals can be placed in the audio content at constant time intervals, e.g. every 20 milliseconds. If the audio content is constant-bitrate audio content, the use of synchronization marks may not be necessary, since then the number of buffers reproduced can be synchronized with an internal timer at the respective sound reproduction device or in the conductor 11.
- the sound reproduction devices 12, 13, 14, 25 detect the signaling, e.g. via their light sensors LS, and responsive to the detection they may discard the rest of the stream buffer and start playing the next buffer, which begins with the synchronization signal (see the buffer-synchronization sketch after this list).
- in this way, the extent of quality degradation, such as jitter, as observed by human listeners can be minimized.
- the mobile orchestra may give better stereo or surround sound than a single sound reproduction device, since the distance between the sound reproduction devices 12, 13, 14, 25 and the conductor 11 can be larger than that between normal wired speakers.
- the cables between the handset and speakers are not necessary but may be replaced by wireless communication.
- Figure 3 illustrates how the mapping may be used on audio content.
- the mapping 30 is used on the audio content, such as that of an audio file 31, to pass sound information describing at least two instruments, frequency ranges, or directional channels to the corresponding sound reproduction devices.
- the mapping 30 is a mapping from four directional channels to four sound reproduction devices 12, 13, 14, 25 and from one frequency range to the conductor 11.
- the directional channels and the sound reproduction devices assigned are L-FRONT (sound reproduction device 12), L-REAR (sound reproduction device 13), R-REAR (sound reproduction device 14), R-FRONT (sound reproduction device 25).
- the frequency range BASS is assigned to the conductor 11.
- the mapping 30 may be automatically adapted responsive to a change in the relative position of the sound reproduction devices 12, 13, 14, 25 from each other or from the leading mobile terminal 11.
- the mapping 30 may be automatically adapted responsive to a change in the availability of a sound reproduction device 12, 13, 14, 25. If one sound reproduction device disappears, for example because of an empty battery or because its user takes it along, the mapping 30 may be modified by mapping that part of the audio content to another sound reproduction device or to the conductor instead of the disappeared (or disappearing) sound reproduction device (see the re-mapping sketch after this list).
- Figure 4 shows a system architecture with a network server 400 for using the mapping on audio content, such as an audio file 31.
- the mapping 30 is in the network server 400, which uses it on the audio file and passes the resulting processed audio file 33, preferably over the Internet, to the conductor 11.
- the conductor 11 passes the processed audio file 33 as a whole or only partially to the sound reproduction devices 12, 13, 14, 25.
- the audio file 31 may alternatively be converted directly at the conductor 11 from a music file into the desired number of partial audio files to be passed to the sound reproduction devices (see the channel-splitting sketch after this list).
- the audio content may already provide partial audio files, i.e. tracks.
- the mapping 30 preferably comprises mapping from each of the tracks to at least one sound reproduction device 12, 13, 14, 25 or to the conductor 11.
- the audio content, especially the audio file 31, may be in the form of a MIDI file containing information on the sound to be reproduced by different instruments.
- the mapping 30 preferably comprises mapping from each instrument to at least one sound reproduction device (or to the conductor).
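As a concrete illustration of the mapping described above, the following minimal Python sketch routes per-part audio content to the devices assigned to it. It is only a sketch: the class, function, and device names are invented for this example and are not taken from the patent.

```python
# Minimal sketch (all names are hypothetical): a mapping from instruments,
# frequency ranges, or directional channels to sound reproduction devices,
# applied to audio content that is available as separate parts.
from dataclasses import dataclass, field

@dataclass
class Mapping:
    # e.g. {"L-FRONT": "device-12", "BASS": "conductor-11"}
    assignments: dict[str, str] = field(default_factory=dict)

    def device_for(self, part: str) -> str | None:
        """Return the device assigned to an instrument, frequency range or channel."""
        return self.assignments.get(part)

def route_content(mapping: Mapping, audio_parts: dict[str, bytes]) -> dict[str, list[bytes]]:
    """Group the audio parts by the device that is to reproduce them."""
    per_device: dict[str, list[bytes]] = {}
    for part_name, samples in audio_parts.items():
        device = mapping.device_for(part_name)
        if device is not None:
            per_device.setdefault(device, []).append(samples)
    return per_device

# Example corresponding to Figure 3: four directional channels and a bass range.
mapping = Mapping({
    "L-FRONT": "device-12", "L-REAR": "device-13",
    "R-REAR": "device-14", "R-FRONT": "device-25",
    "BASS": "conductor-11",
})
parts = {name: b"..." for name in ("L-FRONT", "L-REAR", "R-REAR", "R-FRONT", "BASS")}
print(sorted(route_content(mapping, parts)))  # devices that receive sound information
```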
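The handling of synchronization marks and buffers on the receiving side could, under the stated assumptions, look roughly like the buffer-synchronization sketch below; the class and method names are hypothetical, and a real device would of course drive actual audio output.

```python
# Minimal sketch (hypothetical names, not the patent's code): a receiving
# device keeps a queue of short audio buffers; when its light sensor reports
# the conductor's synchronization flash, it discards the rest of the buffer
# currently being played and jumps to the next one, which begins at the
# synchronization mark. For constant-bitrate content, counting reproduced
# buffers against an internal timer could replace the flashes entirely.
from collections import deque

BUFFER_MS = 20  # control signals placed every 20 milliseconds, as in the description

class ReceivingDevice:
    def __init__(self) -> None:
        self.buffers: deque[bytes] = deque()
        self.offset_in_current = 0  # playback position inside the buffer being played

    def enqueue(self, buf: bytes) -> None:
        """Store a buffer received in advance over the wireless local connection."""
        self.buffers.append(buf)

    def on_sync_flash(self) -> None:
        """Light sensor detected the conductor's flash: resynchronize playback."""
        if self.buffers:
            self.buffers.popleft()      # discard the rest of the current buffer
        self.offset_in_current = 0      # the next buffer starts at the sync mark

device = ReceivingDevice()
for chunk in (b"a" * 160, b"b" * 160, b"c" * 160):
    device.enqueue(chunk)
device.on_sync_flash()
print(len(device.buffers))  # -> 2: playback continues from the next buffer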
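Re-mapping when a sound reproduction device disappears, as described above, can be sketched as a simple re-assignment of its parts to the remaining devices or to the conductor. The function and identifiers below are an assumed illustration, not the patented algorithm.

```python
# Minimal sketch (hypothetical function and identifiers): when a device
# disappears, every part assigned to it is re-mapped to one of the remaining
# devices, or to the conductor if nothing else is left.
def reassign(assignments: dict[str, str], lost_device: str,
             conductor: str = "conductor-11") -> dict[str, str]:
    """Return a new mapping with the lost device's parts moved elsewhere."""
    remaining = sorted({d for d in assignments.values() if d != lost_device})
    fallbacks = remaining or [conductor]
    adapted = {}
    for i, (part, device) in enumerate(sorted(assignments.items())):
        # Spread orphaned parts round-robin over the devices that are still available.
        adapted[part] = fallbacks[i % len(fallbacks)] if device == lost_device else device
    return adapted

mapping = {"L-FRONT": "device-12", "L-REAR": "device-13",
           "R-REAR": "device-14", "R-FRONT": "device-25", "BASS": "conductor-11"}
print(reassign(mapping, "device-13"))  # L-REAR is now reproduced by another device
```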
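Converting a music file at the conductor into partial audio files, one per directional channel, might look like the channel-splitting sketch below for a 16-bit multi-channel WAV file. The choice of file format and the output file names are assumptions made for this example.

```python
# Minimal sketch (an assumption, not the patent's implementation): split a
# 16-bit multi-channel WAV file into one mono "partial audio file" per
# directional channel. The output file names are made up for this example.
import struct
import wave

def split_channels(path: str) -> list[str]:
    """De-interleave every channel of a WAV file into its own mono file."""
    with wave.open(path, "rb") as src:
        n_channels = src.getnchannels()
        sample_width = src.getsampwidth()
        frame_rate = src.getframerate()
        frames = src.readframes(src.getnframes())
    if sample_width != 2:
        raise ValueError("this sketch only handles 16-bit samples")

    samples = struct.unpack("<%dh" % (len(frames) // 2), frames)
    partial_files = []
    for channel in range(n_channels):
        mono = samples[channel::n_channels]          # every n-th sample belongs to this channel
        out_path = f"partial_channel_{channel}.wav"  # hypothetical output name
        with wave.open(out_path, "wb") as dst:
            dst.setnchannels(1)
            dst.setsampwidth(2)
            dst.setframerate(frame_rate)
            dst.writeframes(struct.pack("<%dh" % len(mono), *mono))
        partial_files.append(out_path)
    return partial_files

# Usage (hypothetical file): split_channels("surround_mix.wav")
```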
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Acoustics & Sound (AREA)
- Signal Processing (AREA)
- Stereophonic System (AREA)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP05024347A EP1784049A1 (fr) | 2005-11-08 | 2005-11-08 | Method and system for sound reproduction, and computer program product |
| PCT/EP2006/010704 WO2007054285A1 (fr) | 2005-11-08 | 2006-11-08 | Method and system for sound reproduction and program product |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP05024347A EP1784049A1 (fr) | 2005-11-08 | 2005-11-08 | Method and system for sound reproduction, and computer program product |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP1784049A1 (fr) | 2007-05-09 |
Family
ID=36659810
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP05024347A EP1784049A1 (fr) (Withdrawn) | 2005-11-08 | 2005-11-08 | Method and system for sound reproduction, and computer program product |
Country Status (2)
| Country | Link |
|---|---|
| EP (1) | EP1784049A1 (fr) |
| WO (1) | WO2007054285A1 (fr) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2009144537A1 (fr) * | 2008-05-27 | 2009-12-03 | Sony Ericsson Mobile Communications Ab | Apparatus and method for time synchronization of wireless audio data streams |
| WO2012098191A1 (fr) * | 2011-01-19 | 2012-07-26 | Devialet | Audio processing device |
| EP2747441A1 (fr) * | 2012-12-18 | 2014-06-25 | Huawei Technologies Co., Ltd. | Apparatus and method for playback control for multiple terminals |
| EP2804397A1 (fr) * | 2013-05-15 | 2014-11-19 | Giga-Byte Technology Co., Ltd. | Multi-channel audio loudspeaker system |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102009031995A1 (de) * | 2009-07-06 | 2011-01-13 | Neutrik Aktiengesellschaft | Method for the wireless real-time transmission of at least one audio signal |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2000076272A1 (fr) * | 1998-12-03 | 2000-12-14 | Audiologic, Incorporated | Digital wireless loudspeaker system |
| WO2004023841A1 (fr) * | 2002-09-09 | 2004-03-18 | Koninklijke Philips Electronics N.V. | Smart loudspeakers |
| US20040159219A1 (en) * | 2003-02-07 | 2004-08-19 | Nokia Corporation | Method and apparatus for combining processing power of MIDI-enabled mobile stations to increase polyphony |
-
2005
- 2005-11-08 EP EP05024347A patent/EP1784049A1/fr not_active Withdrawn
-
2006
- 2006-11-08 WO PCT/EP2006/010704 patent/WO2007054285A1/fr not_active Ceased
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2009144537A1 (fr) * | 2008-05-27 | 2009-12-03 | Sony Ericsson Mobile Communications Ab | Apparatus and method for time synchronization of wireless audio data streams |
| WO2012098191A1 (fr) * | 2011-01-19 | 2012-07-26 | Devialet | Audio processing device |
| CN103329570A (zh) * | 2011-01-19 | 2013-09-25 | Devialet | Audio processing device |
| US10187723B2 (en) | 2011-01-19 | 2019-01-22 | Devialet | Audio processing device |
| EP2747441A1 (fr) * | 2012-12-18 | 2014-06-25 | Huawei Technologies Co., Ltd. | Apparatus and method for playback control for multiple terminals |
| US9705944B2 (en) | 2012-12-18 | 2017-07-11 | Huawei Technologies Co., Ltd. | Multi-terminal synchronous play control method and apparatus |
| EP2804397A1 (fr) * | 2013-05-15 | 2014-11-19 | Giga-Byte Technology Co., Ltd. | Multi-channel audio loudspeaker system |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2007054285A1 (fr) | 2007-05-18 |
Similar Documents
| Publication | Title |
|---|---|
| CN110692252B (zh) | Audiovisual collaboration method with latency management for wide-area broadcast |
| US7756595B2 (en) | Method and apparatus for producing and distributing live performance |
| US8838835B2 (en) | Session terminal apparatus and network session system |
| Carôt et al. | Network music performance-problems, approaches and perspectives |
| US11399249B2 (en) | Reproduction system and reproduction method |
| EP2743917B1 (fr) | Information system, information reproducing apparatus, information generating method, and storage medium |
| US20220386062A1 (en) | Stereophonic audio rearrangement based on decomposed tracks |
| EP1784049A1 (fr) | Method and system for sound reproduction, and computer program product |
| WO2018095022A1 (fr) | Microphone system |
| Konstantas et al. | The distributed musical rehearsal environment |
| US20240129669A1 (en) | Distribution system, sound outputting method, and non-transitory computer-readable recording medium |
| US6525253B1 (en) | Transmission of musical tone information |
| US10863259B2 (en) | Headphone set |
| JP4422656B2 (ja) | Remote multi-point ensemble system using a network |
| JP5256682B2 (ja) | Information processing apparatus, information processing method, and program |
| JP6834398B2 (ja) | Sound processing device, sound processing method, and program |
| CN114946194A (zh) | Wireless MIDI headphones |
| JP2003085068A (ja) | Live information providing server, information communication terminal, live information providing system, and live information providing method |
| Braasch et al. | Mixing console design considerations for telematic music applications |
| KR20050083389A (ko) | Multi-channel karaoke apparatus and method |
| WO2025229876A1 (fr) | Information processing device, information processing method, and program |
| JP6819236B2 (ja) | Sound processing device, sound processing method, and program |
| JP2024001600A (ja) | Playback device, playback method, and playback program |
| KR20250087151A (ko) | Music production system for virtual ensemble |
| KR20250130352A (ko) | Audio playback system and method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | AK | Designated contracting states | Kind code of ref document: A1. Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR |
| | AX | Request for extension of the european patent | Extension state: AL BA HR MK YU |
| | AKX | Designation fees paid | |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| | 18D | Application deemed to be withdrawn | Effective date: 20071110 |
| | REG | Reference to a national code | Ref country code: DE. Ref legal event code: 8566 |