
WO2011052960A2 - Système de commande de transmission de données en temps réel et procédé de commande associé - Google Patents

Système de commande de transmission de données en temps réel et procédé de commande associé

Info

Publication number
WO2011052960A2
WO2011052960A2 (PCT/KR2010/007361)
Authority
WO
WIPO (PCT)
Prior art keywords
video signal
control
terminal
encoded
communication
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2010/007361
Other languages
English (en)
Korean (ko)
Other versions
WO2011052960A3 (fr)
Inventor
김대원
한상훈
신성문
진선영
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
INVIGENT Inc
Original Assignee
INVIGENT Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by INVIGENT Inc filed Critical INVIGENT Inc
Publication of WO2011052960A2 publication Critical patent/WO2011052960A2/fr
Publication of WO2011052960A3 publication Critical patent/WO2011052960A3/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W28/00Network traffic management; Network resource management
    • H04W28/02Traffic management, e.g. flow control or congestion control
    • H04W28/0231Traffic management, e.g. flow control or congestion control based on communication conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L47/00Traffic control in data switching networks
    • H04L47/10Flow control; Congestion control
    • H04L47/24Traffic characterised by specific attributes, e.g. priority or QoS
    • H04L47/2416Real-time traffic
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L47/00Traffic control in data switching networks
    • H04L47/10Flow control; Congestion control
    • H04L47/38Flow control; Congestion control by adapting coding or compression rate
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W28/00Network traffic management; Network resource management
    • H04W28/02Traffic management, e.g. flow control or congestion control
    • H04W28/0278Traffic management, e.g. flow control or congestion control using buffer status reports
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W28/00Network traffic management; Network resource management
    • H04W28/02Traffic management, e.g. flow control or congestion control
    • H04W28/06Optimizing the usage of the radio link, e.g. header compression, information sizing, discarding information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W28/00Network traffic management; Network resource management
    • H04W28/02Traffic management, e.g. flow control or congestion control
    • H04W28/10Flow control between communication endpoints
    • H04W28/14Flow control between communication endpoints using intermediate storage
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W28/00Network traffic management; Network resource management
    • H04W28/16Central resource management; Negotiation of resources or communication parameters, e.g. negotiating bandwidth or QoS [Quality of Service]
    • H04W28/18Negotiating wireless communication parameters
    • H04W28/22Negotiating communication rate

Definitions

  • the present invention relates to a real-time data transmission control system and a control method thereof, and more particularly, to a system and a control method for controlling the data size per second of a multimedia stream transmitted according to the state of a variable communication network.
  • the multimedia stream is a form of data capable of reproducing video and / or audio in real time.
  • An object of the present invention is to control the size of the data to be transmitted so that, when data is transmitted in real time over a variable communication network, it can be transmitted without delay.
  • An object of the present invention is to ensure the quality of data to be transmitted even if the size of the data is small.
  • Another object of the present invention is to allow the user to remotely control the data transmission of the terminal.
  • The terminal of the present invention includes: an interface unit for receiving a video signal from an external image acquisition device; an encoding unit for encoding the received video signal based on a plurality of encoding parameters; a communication unit for transmitting the encoded video signal to an external device through a communication network; and a control unit for checking the communication environment of the communication network and performing at least one of a first control operation and a second control operation so that the data size of the encoded and transmitted video signal is adjusted in real time according to the identified communication environment. The first control operation controls the encoding unit by controlling at least one of the plurality of encoding parameters, and the second control operation controls the communication unit by controlling the number of frames constituting the encoded and transmitted video signal so that only some of the frames constituting the encoded video signal are transmitted to the external device.
  • The video signal receiving apparatus of the present invention comprises: a communication unit for receiving, through a communication network, the encoded video signal from a terminal that encodes a video signal based on a plurality of encoding parameters; and a controller for checking the communication environment of the communication network, generating at least one of a first control signal and a second control signal so that the data size of the encoded and transmitted video signal is adjusted by the terminal according to the identified communication environment, and transmitting the generated control signal to the terminal through the communication unit.
  • The first control signal controls at least one of the plurality of encoding parameters, and the second control signal controls the number of frames constituting the encoded and transmitted video signal so that the terminal transmits only some of the frames constituting the encoded and transmitted video signal.
  • The video signal communication system of the present invention includes a terminal and a video signal receiving device, wherein the terminal includes an interface unit for receiving a video signal from an external video acquisition device;
  • an encoding unit for encoding the received video signal based on a plurality of encoding parameters;
  • a first communication unit for transmitting the encoded video signal to the video signal receiving apparatus through a communication network; and a first control unit for checking the communication environment of the communication network and performing at least one of a first control operation and a second control operation so that the data size of the encoded and transmitted video signal is adjusted in real time according to the confirmed communication environment.
  • The video signal receiving apparatus comprises: a second communication unit for receiving the encoded and transmitted video signal from the first communication unit; and a second control unit. The first control operation controls the encoding unit by controlling at least one of the plurality of encoding parameters, and the second control operation controls the first communication unit by controlling the number of frames constituting the encoded and transmitted video signal so that only a part of the frames is transmitted to the video signal receiving apparatus.
  • The present invention provides the effect that data can be transmitted without delay even when the communication environment of the communication network deteriorates.
  • The present invention provides the effect of preserving image quality even if the size of the transmitted data is reduced.
  • The present invention provides the effect that the user can remotely set the quality of the transmitted data.
  • FIG. 1 illustrates a system associated with one embodiment of the present invention.
  • FIG. 2 illustrates a system associated with one embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a terminal and a connection device of the present invention.
  • FIGS. 4 and 5 are side views illustrating the terminal of the present invention.
  • FIG. 6 is a block diagram of a terminal related to an embodiment of the present invention.
  • FIG. 7 is a block diagram of a server related to an embodiment of the present invention.
  • FIG. 8 is a block diagram illustrating a user terminal associated with one embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating a video signal transmission method according to an embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a video signal transmission control method according to an embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a video signal transmission control method according to another embodiment of the present invention.
  • FIG. 12 is a view for explaining a video signal transmission control method of the present invention.
  • FIG. 13 is a view for explaining a video signal transmission control method of the present invention.
  • FIG. 1 illustrates a system associated with one embodiment of the present invention.
  • the live broadcasting system 1 may include an image acquisition device 10, a terminal 100, a server 200, and a user terminal 300.
  • The image acquisition device 10 may be a device used to acquire general video, such as a camcorder or a camera.
  • the image acquisition apparatus 10 may not only acquire a video, but also obtain audio information.
  • the server 200 may be configured as a general server capable of receiving, storing, and transmitting data.
  • the data may include both video and audio signals.
  • The terminal 100 and the server 200 are connected by a communication network.
  • The network may be, for example, a wired or wireless communication network including Ethernet, WiMAX (IEEE 802.16), WiBro (IEEE 802.16e, also referred to as 'mobile WiMAX'), WiFi (IEEE 802.11), and high-speed packet access standards that extend WCDMA, such as HSDPA (High Speed Downlink Packet Access), EVDO Rev. A, and HSUPA (High Speed Uplink Packet Access).
  • HSDPA High Speed Downlink Packet Access
  • The server 200 and the user terminal 300 may also be connected to the communication network described above.
  • the user terminal 300 may receive, store, transmit and play data from the server 200. In addition, the user terminal 300 may receive data directly from the terminal 100, not from the server 200.
  • the user terminal 300 may include, for example, a PC, a PDA, a mobile phone, a PMP, a navigation device, a TV, a DTV, a broadcast station, and an IPTV.
  • the user terminal 300 may be connected to another user terminal 300.
  • Each configuration of the live broadcast system 1 may be connected as follows.
  • the image acquisition apparatus 10 may obtain an image and / or an audio signal and transmit the image and / or audio signal to the terminal 100.
  • the terminal 100 may encode video and audio signals.
  • the terminal 100 may transmit the encoded video and / or audio signal to the server 200 through a communication network.
  • the server 200 may generate a control signal for controlling the operation of the terminal 100 as necessary.
  • the server 200 may transmit the transmitted video and / or audio signal to the user terminal 300 through a communication network.
  • the server 200 may store the transmitted video and / or audio signal.
  • the terminal 100, the server 200, and the user terminal 300 may include a modem for data communication through the communication network.
  • the modem may be built in the terminal 100, the server 200, and the user terminal 300, or may be connected by a connection terminal provided in each terminal.
  • the user terminal 300 may decode and play the transmitted video and / or audio signal.
  • a control signal for controlling the operation of the terminal 100 may be generated as necessary.
  • the control signal generated by the user terminal 300 may be transmitted by the server 200 to the corresponding terminal 100, and may be transmitted directly from the user terminal 300 to the terminal 100.
  • FIG. 2 illustrates a system associated with one embodiment of the present invention.
  • The live broadcasting system 1 may include a plurality of image acquisition apparatuses 10, terminals 100, servers 200, and user terminals 300. Since a plurality of image acquisition apparatuses 10 and terminals 100 are included, a plurality of live broadcasts may be provided to the server 200 simultaneously. By providing a plurality of servers 200, the reception, storage, and transmission functions can be distributed.
  • The components shown in FIGS. 1 and 2 are not all essential, so a multimedia transmission system having more or fewer components may be implemented.
  • FIG. 3 is a diagram illustrating a terminal and a connection device of the present invention.
  • various external devices may be connected to the terminal 100.
  • For example, the image acquisition device 10, the Internet, a modem, a laptop, and an external battery may be connected to the terminal 100.
  • FIGS. 4 and 5 are side views illustrating a terminal 100 according to an embodiment of the present invention.
  • A display unit 104, an interface unit 130, and an input unit 170 provided on the terminal 100 are illustrated.
  • the display unit 104 displays various information about the terminal 100.
  • The interface unit 130 may include a USB port 102, a charging adapter connection unit 110, an SD card connection unit 118, an Internet connection unit 120, an A/V input unit 112, and an A/V output unit 114.
  • The USB port 102 supports a connection to an external device through a USB interface; for example, it is used to connect a WiFi modem, a WiBro modem, or a GPS device.
  • the charging adapter connection unit 110 is an example of the power supply unit 160.
  • the charging adapter connection unit 110 is connected to the charging adapter to supply power to the terminal 100.
  • the SD card connection unit 118 is a slot to which an external SD memory can be connected.
  • The Internet connection unit 120 is a slot for connecting an RJ45 Ethernet cable.
  • The AV input unit 112 is a terminal for connecting an AV (audio/video) input cable.
  • the AV output unit 114 is a terminal for connecting an AV output cable.
  • the input unit 170 provides a function for setting the terminal 100 and includes a key input unit.
  • the key input unit includes a reset unit 106 for turning the terminal 100 off and on, and a switch 108 as a button for controlling the terminal 100.
  • the switch 108 may be configured, for example, of a wheel type.
  • FIG. 6 is a block diagram of a terminal 100 according to an embodiment of the present invention.
  • the terminal 100 may include a display unit 104, an interface unit 130, an encoding unit 140, a communication unit 150, a power supply unit 160, an input unit 170, and a controller 190.
  • The display unit 104 may display various types of information related to the operation of the terminal. For example, the display unit 104 may display encoding parameters such as the bit rate, the number of frames per second, the resolution, and the length of the image set, as well as other terminal states such as whether audio is compressed, the audio bit rate, the sample rate, and the network connection state.
  • The interface unit 130 receives a video signal, consisting of one or both of video and audio, from the image acquisition apparatus 10 through S-Video or RCA terminals, and supports the NTSC (National Television System Committee) and PAL (Phase Alternating Line) standards, among others.
  • NTSC National Television System (s) Committee
  • PAL Phase-Alternating Line, Phase Alternation by Line, or Phase Alternation Line.
  • Audio is received via the RCA jack, with analog (stereo) audio using two RCAs and analog (mono) audio using one RCA.
  • an additional optical cable may be used.
  • The video may be received by a composite method using one RCA cable, a component method using three RCA cables, S-Video (also called the S terminal or S-VHS), HDMI, SDI, and the like.
  • the encoder 140 converts the analog data received from the image acquisition apparatus 10 into a digital signal and compresses the converted digital data.
  • an image may use an H.264 (Advanced Video Coding) codec
  • voice may be compressed using an AAC (Advanced Audio Coding) codec.
  • H.264 Advanced Video Coding
  • AAC Advanced Audio Coding
  • Parameters used when encoding may include the bit rate associated with the image, the number of frames per second (fps), the resolution, and the length of a group of pictures (GOP). They may also include the sample rate, bit rate, and channels associated with the audio. These parameters are examples only, and other parameter values may be added.
  • the bit rate may be defined as the number of bits processed for each specific time unit (seconds).
  • the number of frames per second may be defined as the number of screens displayed for one second.
  • The resolution may be defined as the number of pixels used to represent one frame, and it is usually expressed as the number of pixels across the width multiplied by the number of pixels across the height.
  • The length of the image set may be expressed as the interval between one I frame, which is an anchor frame, and the next I frame.
  • the parameters may be set in advance, by the controller 190 of the terminal 100 according to the environment of the communication network, or by remote control in the server 200 or the user terminal 300.
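  • As an illustration only (not part of the original disclosure), the encoding parameters above could be grouped into a single structure such as the following Python sketch; the field names are assumptions, and the value ranges are taken from the examples given later in this description.

```python
from dataclasses import dataclass

@dataclass
class EncodingParams:
    """Encoding parameters controlled by the terminal (hypothetical field names)."""
    bitrate_kbps: int = 1000   # e.g. adjustable from 64 to 10000 kbps in 50 kbps steps
    fps: int = 30              # frames per second, e.g. 10, 15, 25 or 30
    width: int = 640           # resolution, e.g. 320x240 up to 720x480
    height: int = 480
    gop_length: int = 30       # image-set length: interval between successive I frames, e.g. 15 to 600

    def validate(self) -> None:
        assert 64 <= self.bitrate_kbps <= 10000, "bit rate outside the example range"
        assert self.fps in (10, 15, 25, 30), "fps outside the example set"
        assert 15 <= self.gop_length <= 600, "GOP length outside the example range"
```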
  • the communicator 150 may include a memory 152, a wired communication module 154, and a wireless communication module 156.
  • the memory 152 may be, for example, a buffer in which encoded data is temporarily stored before being transmitted to the server 200.
  • the memory 152 is not limited to being located in the communication unit 150.
  • The memory 152 may instead be an independent component of the terminal 100, separate from the communication unit 150.
  • the wired communication module 154 and the wireless communication module 156 may have a function of transmitting encoded data to the outside of the terminal 100 and receiving a signal from the outside.
  • The encoded data may be transmitted to the server 200 using the Real Time Streaming Protocol (RTSP, RFC 2326) together with the Real-time Transport Protocol (RTP), the Real Time Messaging Protocol (RTMP), or a proprietary protocol.
  • RTSP Real Time Streaming Protocol
  • RTP Real-time Transport Protocol
  • RTMP Real Time Messaging Protocol
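  • As a hedged illustration of the RTP transport mentioned above (this is not the protocol stack of the disclosure), the sketch below packs an encoded frame into minimal RTP packets; the header layout follows RFC 3550, while the payload type, SSRC, and packet size are arbitrary assumptions.

```python
import struct

def rtp_packet(payload: bytes, seq: int, timestamp: int,
               payload_type: int = 96, ssrc: int = 0x12345678) -> bytes:
    """Build a minimal RTP packet: the 12-byte fixed header of RFC 3550, no CSRC, no extension."""
    byte0 = 2 << 6                       # version=2, padding=0, extension=0, CSRC count=0
    byte1 = payload_type & 0x7F          # marker bit left at 0
    header = struct.pack("!BBHII", byte0, byte1, seq & 0xFFFF,
                         timestamp & 0xFFFFFFFF, ssrc)
    return header + payload

def packetize(frame: bytes, seq: int, timestamp: int, max_payload: int = 1400):
    """Split one encoded frame into RTP packets no larger than a typical MTU payload."""
    for offset in range(0, len(frame), max_payload):
        yield rtp_packet(frame[offset:offset + max_payload], seq, timestamp)
        seq += 1
```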
  • the power supply unit 160 supplies power to the terminal 100.
  • the terminal 100 may itself have a built-in battery.
  • The input unit 170 may be used to configure the parameters and other settings of the terminal 100.
  • Based on the input setting values, the controller 190 described below may generate control operations for controlling each component.
  • The controller 190 controls the overall operation of all components of the terminal 100, including the interface unit 130, the encoding unit 140, the communication unit 150, the display unit 104, the power supply unit 160, and the input unit 170.
  • the encoding unit 140 and the communication unit 150 may be controlled according to the communication environment of the communication network.
  • FIG. 7 is a block diagram of a server 200 related to an embodiment of the present invention.
  • the server 200 may include a communication unit 210, a control unit 250, and a storage unit 260.
  • the communication unit 210 may include a wired communication module 212 and a wireless communication module 214.
  • The communication unit 210 performs reception and transmission functions; that is, it receives from the terminal 100 data encoded based on a plurality of parameters.
  • the communication unit 210 may transmit a control signal to the terminal 100 and may transmit the multimedia stream to the user terminal 300.
  • the multimedia stream here refers to encoded video and / or audio signals.
  • The controller 250 may control all components of the server 200 and may cause a control signal to be transmitted to the terminal 100. In this case, the terminal 100 can be accessed from anywhere if a public IP is assigned to it; however, in an environment such as EVDO or WiFi where only a private IP is available, it may be convenient to connect through a virtual private network (VPN).
  • VPN virtual private network
  • the controller 250 may also convert the received data into various protocols.
  • the received data may be converted into RTSP and RTP protocols for an IPTV service, and converted into RTSP and HTTP protocols for a mobile service and transmitted to the user terminal 300 through the communication unit 210.
  • the storage unit 260 may store the multimedia stream transmitted from the terminal 100 in the form of a media file.
  • the stored multimedia stream can be directly played online through a VOD server without a separate format conversion.
  • the played multimedia stream may be stored in a format that can be played in a general media player.
  • the server 200 may further include an input unit 240 and an output unit 230 to control the terminal 100 and / or the server 200.
  • the components of the server 200 described above are not essential components in this case, may be omitted as necessary, and another component may be added.
  • FIG. 8 is a block diagram illustrating a user terminal associated with one embodiment of the present invention.
  • the user terminal 300 may include a communication unit 310, a decoding unit 320, an output unit 330, an input unit 340, a control unit 350, and a storage unit 360.
  • the user terminal 300 may be connected to the server 200 through a communication network.
  • the communication unit 310 may access the server 200 and receive a multimedia stream through the RTSP / RTP protocol.
  • a control signal generated by the controller 350 may be transmitted to the terminal 100 through the server 200.
  • the control unit 250 of the server 200 may perform a function of finding and transmitting a corresponding terminal 100.
  • the decoder 320 may receive the multimedia stream from the communication unit 310, decode the received data through a codec filter, and convert the received data into data that can be processed by a renderer (not shown).
  • the output unit 330 may include a playback unit.
  • The playback unit may play the video and audio of the decoded multimedia stream through a renderer. At this time, synchronization is performed so that the video and audio do not drift apart and no delay occurs. Through this, the multimedia stream can be reproduced optimally even over a deteriorated communication network.
  • The output unit 330 may transmit the reproduced multimedia stream to the outside of the workstation through a DVI-D port, and the transmitted multimedia stream can be broadcast through a scan converter (SD-SDI, HD-SDI).
  • SD-SDI scan converter
  • the input unit 340 may control the user terminal 300 and generate a control signal necessary for controlling the terminal 100.
  • the storage unit 360 may store the transmitted multimedia stream.
  • the operating method of the live broadcasting system according to an embodiment of the present invention may be implemented in the live broadcasting system according to the embodiment of the present invention described above, or may be implemented by a system having a different configuration. However, for convenience of description, in describing the operation method, reference is made to the configuration shown in FIGS. 1 to 8.
  • the video signal receiver or external device described below may include the server 200 and the user terminal 300.
  • FIG. 9 is a flowchart illustrating a video signal transmission method according to an embodiment of the present invention.
  • the interface unit 130 of the terminal 100 receives a video signal from the image acquisition apparatus 10 (S810), and the encoding unit 140 encodes the received video signal.
  • the communication unit 150 transmits the converted multimedia stream to an external device.
  • FIG. 10 is a flowchart illustrating a video signal transmission control method according to an embodiment of the present invention.
  • the communication environment of the communication network between the terminal 100 and the external devices 200 and 300 is variable. Therefore, in order to transmit / receive a multimedia stream without interruption, it is necessary to control the data size per second according to the state of the communication network.
  • the embodiment of FIG. 10 further includes a control method in the flowchart of FIG. 9.
  • the communication environment of the communication network between the terminal 100 and the external devices 200 and 300 is checked (S910).
  • The communication environment may be estimated from the data size of the multimedia stream waiting in the memory 152 of the terminal 100 or from the signal strength of the communication network, and the estimated result may be monitored in real time.
  • The multimedia stream is temporarily stored in the memory 152 according to the transmission state, and how good the communication environment is can be determined from the data size of the stored multimedia stream or from the amount of free space in the memory 152.
  • The signal strength may also be determined from the round trip time or the Received Signal Strength Indication (RSSI).
  • RSSI Received Signal Strength Indication
  • the communication environment may be confirmed by the display unit 104 of the terminal and the output units 230 and 330 of the external devices 200 and 300.
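  • To make the check of step S910 concrete, the following sketch (an assumption, not the patented implementation) scores the communication environment from the buffer backlog in the memory 152 and a round-trip-time measurement; the thresholds and the three-level scale are illustrative only.

```python
from enum import Enum

class LinkState(Enum):
    GOOD = 0
    FAIR = 1
    POOR = 2

def check_environment(backlog_bytes: int, buffer_capacity: int, rtt_ms: float) -> LinkState:
    """Estimate the link state from buffer occupancy (memory 152) and round trip time."""
    occupancy = backlog_bytes / buffer_capacity
    if occupancy < 0.25 and rtt_ms < 100:
        return LinkState.GOOD
    if occupancy < 0.75 and rtt_ms < 300:
        return LinkState.FAIR
    return LinkState.POOR
```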
  • the next process is to set the parameters in real time (S920). This is to set the parameters to a predetermined value corresponding to the communication environment. That is, the bitrate, resolution, length of the image set, and frames per second optimized for each communication environment are prepared in advance.
  • the bitrate and resolution may adjust the data size of the multimedia stream per second, and the length of the image set and the number of frames per second may provide a function for protecting image quality even if the data size per second is reduced.
  • The controller 190 performs a first control operation for controlling the size of data required to play one second of the multimedia stream.
  • the first control operation can be, for example, reducing or increasing the bit rate and resolution.
  • the controller 190 may control the encoder 140 to set the parameter value.
  • The controller 190 may control the encoding unit 140 to adjust the length of the image set and the number of frames per second.
  • the parameter values may be automatically generated by the controller 190 or directly input through the input unit 170.
  • For example, the bit rate can be set from 64 kbps to 10000 kbps and adjusted in 50 kbps steps; the resolution can be selected from 320x240, 320x480, 352x240, 352x288, 352x480, 426x240, 480x320, 640x240, 640x480, 720x240, and 720x480; the number of frames per second can be set to 10, 15, 25, or 30 fps; and the length of the image set can be set, for example, from 15 to 600.
  • Each of the parameters may be sequentially controlled according to various combination orders. For example, when the communication state deteriorates, the parameter values may be controlled in order of controlling the bit rate first and then controlling the resolution. Conversely, it is also possible to control the resolution first and then control the bit rate.
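  • The following sketch shows one way step S920 could map each communication environment to a pre-prepared set of parameter values, reusing the EncodingParams and LinkState sketches above; the concrete profile values and the encoder.reconfigure() call are assumptions, not the disclosed implementation.

```python
# Pre-prepared parameter profiles, one per link state (values are illustrative only).
PROFILES = {
    LinkState.GOOD: EncodingParams(bitrate_kbps=4000, fps=30, width=720, height=480, gop_length=30),
    LinkState.FAIR: EncodingParams(bitrate_kbps=1500, fps=25, width=640, height=480, gop_length=60),
    LinkState.POOR: EncodingParams(bitrate_kbps=400,  fps=15, width=320, height=240, gop_length=120),
}

def set_parameters(state: LinkState, encoder) -> EncodingParams:
    """First control operation: push the profile for the current link state to the encoder."""
    params = PROFILES[state]
    params.validate()
    encoder.reconfigure(params)   # hypothetical encoder API
    return params
```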
  • the image signal received by the image acquisition apparatus 10 may be encoded by the changed parameter value (S930).
  • the next process is to perform frame dropping (S940).
  • Frame dropping is for responding to a worsened communication environment by not transmitting P frames or B frames of a multimedia stream waiting to be transmitted.
  • B frames may be dropped first, and then P frames may be dropped.
  • I frames can also be dropped.
  • the drop order between B, P, and I frames is not limited to the above-described order, and may be based on another example.
  • a concept of frame dropping will be described in detail with reference to FIG. 12.
  • In the example of FIG. 12, the length of the image set is 10.
  • When frame dropping is performed, all P frames after the dropped P frame are also dropped, since they reference the dropped frame and could not be decoded anyway, thereby reducing the size of data required to play one second of the multimedia stream. Therefore, even if the communication state deteriorates, the multimedia stream can be transmitted without delay.
  • Since image quality may deteriorate when the data size is reduced by frame dropping, the length of the image set and the number of frames per second may be controlled to compensate for this.
  • the controller 190 controls the communication unit 150 through a second control operation for performing frame dropping.
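  • As a hedged sketch of the second control operation (step S940), the filter below keeps only part of the queued frames, dropping them in the B-then-P order described above; the frame-type labels and the drop-level argument are assumptions made for illustration.

```python
def drop_frames(queue, level: int):
    """Second control operation: transmit only part of the queued frames.

    level 0 -> send everything, 1 -> drop B frames, 2 -> drop B and P frames
    (I frames could also be dropped in an even worse state, as the description notes).
    """
    keep = []
    p_broken = False   # once a P frame is dropped, later P frames in the GOP cannot be decoded
    for frame in queue:                      # each frame: {"type": "I" | "P" | "B", "data": bytes}
        ftype = frame["type"]
        if ftype == "I":
            p_broken = False
            keep.append(frame)
        elif ftype == "P":
            if level < 2 and not p_broken:
                keep.append(frame)
            else:
                p_broken = True
        elif ftype == "B":
            if level < 1 and not p_broken:
                keep.append(frame)
    return keep
```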
  • the encoded multimedia stream may be transmitted to the external devices 200 and 300 (S950).
  • the parameter setting step S920 and the frame dropping S940 may be selectively performed.
  • Alternatively, the second control operation may be performed first, followed by the first control operation. A detailed description thereof will be given with reference to FIG. 13.
  • FIG. 13 shows an example of parameter values and frame dropping set according to the confirmed communication state.
  • The case in which the communication state is indicated as the highest (best) level may be regarded as the case where the maximum data size can be transmitted.
  • In this case, the second control operation for dropping frames is not necessary.
  • The intermediate steps may be omitted and the process may proceed immediately to the transmitting step (S950).
  • When the communication state deteriorates, the bit rate and the resolution among the parameter values may be lowered.
  • In FIG. 13, a2 and b2 are smaller than a1 and b1, respectively.
  • Alternatively, the length of the image set and the number of frames per second may be reduced from c1 and d1 to c2 and d2 without changing the bit rate and resolution; in this case, image quality may be improved.
  • The above process describes the case in which the communication environment deteriorates progressively from top to bottom.
  • the reverse process may be performed in real time to correspond to a changing communication environment.
  • an audio signal may also be controlled at the time of encoding.
  • Control of the sample rate, the bit rate, and the channel configuration (stereo or mono) is possible.
  • FIG. 11 is a flowchart illustrating a video signal transmission control method according to another embodiment of the present invention.
  • While the embodiment of FIG. 10 prepares predetermined parameter values and degrees of frame dropping for each communication environment, the embodiment of FIG. 11 differs in that it compares the data size per second of the transmitted multimedia stream with the communication environment in real time and varies the parameter values and the degree of frame dropping accordingly.
  • Specifically, a step of calculating the transmission speed (S1020) and a step of comparing the transmission speed with the data size per second of the encoded video signal (S1030) are additionally performed.
  • the controller 190 checks the communication environment and calculates a transmission speed of the communication network (S1020).
  • the controller 190 compares the calculated transmission rate with the data size per second of the encoded video signal (S1030).
  • the data size per second of the encoded video signal refers to the size of data required to reproduce one second of the multimedia stream.
  • According to the comparison result, the control unit 190 may control the encoding unit 140 and the communication unit 150 as follows.
  • Parameter value setting and frame dropping may each be performed separately, or may be combined according to the transmission speed of the communication network.
  • When the calculated transmission speed is greater than the data size per second, the control unit 190 may control the encoding unit 140 to increase the bit rate and resolution, and the number of frames dropped by the communication unit 150 may be reduced.
  • When the transmission speed and the data size per second are comparable, the controller 190 controls the encoder 140 to maintain the set parameter values and controls the communication unit 150 to maintain the current degree of frame dropping.
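  • A minimal sketch of the comparison-driven control of FIG. 11 (steps S1020 and S1030) might look like the following, again reusing the EncodingParams sketch above; the 50 kbps step size, the 10% margin, and the drop-level bounds are assumptions chosen only to illustrate the idea.

```python
def adjust(params: EncodingParams, drop_level: int,
           tx_rate_kbps: float, stream_rate_kbps: float):
    """Compare the measured transmission speed with the stream's data size per second
    and adjust the encoding parameters (first control) and frame dropping (second control)."""
    if tx_rate_kbps > stream_rate_kbps * 1.1:      # headroom available: raise quality
        params.bitrate_kbps = min(params.bitrate_kbps + 50, 10000)
        drop_level = max(drop_level - 1, 0)
    elif tx_rate_kbps < stream_rate_kbps * 0.9:    # congestion: reduce the load
        params.bitrate_kbps = max(params.bitrate_kbps - 50, 64)
        drop_level = min(drop_level + 1, 2)
    # otherwise keep the current parameter values and drop level
    return params, drop_level
```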
  • FIG. 14 illustrates a terminal 100 and a server 200 that control the terminal 100 through the server 200.
  • The communication unit 150 of the terminal transmits the encoded multimedia stream, the state information displayed on the display unit 104 of the terminal, and information on the terminal's communication environment (S1310), and the communication unit 210 of the server receives them (S1312).
  • the next process is to determine the communication environment (S1314).
  • the communication environment may be confirmed through information on the communication environment of the transmitted terminal.
  • An IP address assigned to each terminal 100 may be input to the input unit 240 of the server to access information about a communication environment of the corresponding terminal 100.
  • the user generates a first control signal and a second control signal through the input unit 240 of the server.
  • the first control signal is a signal capable of controlling a plurality of parameter values used when encoding analog data received by the encoding unit 140 of the terminal.
  • the second control signal is a signal capable of controlling frame dropping in the communication unit 150 of the terminal.
  • the user may generate the control signal as described above to correspond to the case where the communication environment is deteriorated and the case where the communication environment is improved. Only the first control signal may be generated or only the second control signal may be generated.
  • the generated control signal is transmitted to the communication unit 150 of the terminal through the communication unit 210 of the server (S1316).
  • the controller 250 of the server controls the communication unit 210 of the server so that the first and / or second control signals can be transmitted to a corresponding terminal through an IP address among a plurality of terminals.
  • The control unit 190 of the terminal controls the encoding unit 140 and the communication unit 150 based on the received control signal (S1318, S1320).
  • The encoding unit 140 performs the first control operation of controlling the plurality of parameter values used in encoding based on the first control signal, and the communication unit 150 performs the second control operation of controlling the number of dropped frames based on the second control signal.
  • Steps S1310 to S1320 may be repeated in real time.
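  • To illustrate the remote-control exchange of FIG. 14 (the message format below is hypothetical and not the actual protocol of the disclosure), the server could send a simple JSON control message, and the terminal-side handler could apply it as follows, reusing the EncodingParams sketch above.

```python
import json

# Example control message the server (or user terminal) might send to terminal 100.
control_message = json.dumps({
    "first_control":  {"bitrate_kbps": 800, "fps": 15},   # encoding parameters to change
    "second_control": {"drop_level": 1},                   # degree of frame dropping
})

def handle_control(message: str, params: EncodingParams, encoder, drop_level: int) -> int:
    """Terminal-side handler applying the first and/or second control signals (steps S1318, S1320)."""
    msg = json.loads(message)
    if "first_control" in msg:
        for key, value in msg["first_control"].items():
            setattr(params, key, value)
        encoder.reconfigure(params)        # hypothetical encoder API, as above
    if "second_control" in msg:
        drop_level = msg["second_control"].get("drop_level", drop_level)
    return drop_level
```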
  • FIG. 15 illustrates a terminal 100 and a user terminal 300 that control the terminal 100 through the user terminal 300.
  • The difference from FIG. 14 is that the encoded video signal is transmitted from the terminal 100 directly to the user terminal 300. Therefore, although the acting subject was the server 200 in FIG. 14, in the embodiment of FIG. 15 the subject is changed to the user terminal 300, and each step for implementing remote control is otherwise the same. Hereinafter, only the steps that differ will be described.
  • the encoded video signal and the terminal communication environment information are received through the communication unit 310 of the user terminal (S1332).
  • The output unit 330 reproduces the received video signal (S1334). The communication environment may therefore be determined from the received communication environment information and the reproduced image signal. The other steps are the same as those of the embodiment described with reference to FIG. 14.
  • FIG. 16 illustrates a terminal 100, a server 200, and a user terminal 300, which controls the terminal 100 through the user terminal 300.
  • the basic configuration is the same as the embodiment in FIG. 14 and FIG. 15.
  • the encoded video signal and information on the terminal communication environment may be transmitted from the terminal 100 to the user terminal 300 through the server 200.
  • the user terminal 300 may output the received video signal and generate a control signal based on information on the terminal communication environment.
  • the generated control signal is transmitted to the server 200.
  • the controller 250 of the server may transmit the received control signal to the terminal 100.
  • The controller of the terminal 100 controls the encoding parameter values and the number of dropped frames based on the received control signal. Thereafter, the process can be repeated from the first step (S1350) in real time.
  • FIG. 17 illustrates a terminal 100, a server 200, and a user terminal 300, which controls the terminal 100 through the server 200.
  • The basic configuration is the same as that of the embodiments in FIG. 14, FIG. 15, and FIG. 16.
  • the video signal encoded from the terminal 100 and terminal communication environment information may be transmitted to the server 200.
  • the received video signal may be transmitted to the user terminal 300.
  • the server 200 may generate a control signal and transmit the control signal to the terminal 100.
  • The controller 190 of the terminal may control the encoding parameter values and the number of dropped frames based on the received control signal. Thereafter, the process can be repeated from the first step (S1370).
  • FIGS. 18, 19, and 20 illustrate a communication environment display unit and an input unit of a terminal.
  • FIGS. 18 to 20 are only examples of information that can be checked and displayed through the display unit 104 of the terminal, the output unit 230 of the server, and the output unit 330 of the user terminal.
  • The upper table, labeled video, provides information such as whether to compress the video (image selection), the resolution (frame size), the frames per second (frame rate), the bit rate, the length of the picture set (GOP size), and the video timing, and the user can directly set these values.
  • The lower table, labeled audio, provides information on audio compression selection, bit rate, channel, and sample rate, and these can likewise be configured by the user.
  • information about a network such as a network search order, an IP address, and the like is provided.
  • information about a terminal such as terminal ID information, a memory, a battery, and an operating system, is provided.
  • FIGS. 18 to 20 are only examples, and any other information necessary for implementing an embodiment of the present invention may be added.
  • As described above, the present invention provides a real-time data transmission control system and a control method capable of transmitting data without delay even when the communication environment of the communication network deteriorates.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Quality & Reliability (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The present invention relates to a real-time data transmission control system and a control method thereof, and more particularly to a system and control method for controlling the data size per second of a multimedia stream transmitted according to the state of a variable communication network. The system comprises: an interface unit which receives an image signal from an external image acquisition device; an encoding unit which encodes the received image signal based on a plurality of encoding parameters; a communication unit which transmits the encoded image signal to an external device through the communication network; and a control unit which checks the communication environment of the communication network and performs at least one of a first control operation and a second control operation such that the data size of the transmitted encoded image signal can be adjusted in real time according to the confirmed communication environment. The first control operation controls the encoding unit by controlling at least one of the plurality of encoding parameters, and the second control operation controls the communication unit by controlling the number of frames constituting the transmitted encoded image signal, such that some of the frames constituting the encoded image signal can be transmitted to the external device.
PCT/KR2010/007361 2009-10-28 2010-10-26 Système de commande de transmission de données en temps réel et procédé de commande associé Ceased WO2011052960A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2009-0103009 2009-10-28
KR1020090103009A KR100968266B1 (ko) 2009-10-28 2009-10-28 실시간 데이터 전송 제어 시스템 및 그 제어 방법

Publications (2)

Publication Number Publication Date
WO2011052960A2 true WO2011052960A2 (fr) 2011-05-05
WO2011052960A3 WO2011052960A3 (fr) 2011-09-01

Family

ID=42645236

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2010/007361 Ceased WO2011052960A2 (fr) 2009-10-28 2010-10-26 Système de commande de transmission de données en temps réel et procédé de commande associé

Country Status (2)

Country Link
KR (1) KR100968266B1 (fr)
WO (1) WO2011052960A2 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015119337A1 (fr) * 2014-02-06 2015-08-13 엘지전자 주식회사 Procédé de réalisation d'un service d'affichage par wi-fi et dispositif à cet effet

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101782997B1 (ko) * 2011-05-17 2017-09-28 삼성전자주식회사 디지털 촬영 장치 및 이의 전력 제어 방법
KR101339835B1 (ko) 2011-12-13 2013-12-11 엠앤서비스 주식회사 이동 단말의 실시간 화면 재생기, 그 시스템 및 화면 재생 방법
KR101827427B1 (ko) * 2016-06-20 2018-02-08 한국항공대학교산학협력단 영상 신호 송신 장치 및 그 영상 신호 송신 방법
KR101795958B1 (ko) * 2016-08-09 2017-12-01 연세대학교 산학협력단 실시간 네트워크 카메라에서의 적응적 영상 제공 방법, 장치 및 사용자 단말기
KR20220130394A (ko) 2021-03-18 2022-09-27 삼성전자주식회사 복수의 미디어 스트림을 전송하는 전자 장치 및 그 방법

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6256423B1 (en) * 1998-09-18 2001-07-03 Sarnoff Corporation Intra-frame quantizer selection for video compression
JP4465827B2 (ja) 2000-07-13 2010-05-26 ソニー株式会社 映像信号記録再生装置および方法、並びに記録媒体
JP2006279926A (ja) 2005-03-01 2006-10-12 Omron Corp 無線通信装置、無線通信システム、無線通信方法、プログラム、およびそのプログラムを記録した記録媒体
KR100817055B1 (ko) 2006-08-23 2008-03-26 삼성전자주식회사 회귀경로를 이용한 영상처리 시스템 장치 및 영상처리 방법
KR100827108B1 (ko) * 2006-10-25 2008-05-02 삼성전자주식회사 영상 서비스 제공 시스템 및 이를 위한 방법 및 장치
KR101172430B1 (ko) * 2007-08-17 2012-08-08 삼성전자주식회사 비트율 제어 방법 및 그 장치
KR100962673B1 (ko) * 2008-01-12 2010-06-11 (주) 이노티브잉크코리아 영상처리시스템, 영상처리방법 및 영상전달방법
KR20080113325A (ko) * 2008-11-18 2008-12-30 이동수 무선 다중접속에 의한 실시간 동영상 전송장치, 무선 다중접속에 의한 실시간 동영상 수신장치, 무선 다중접속에의한 실시간 동영상 송수신장치 및 무선 다중접속에 의한 실시간 동영상 송수신 방법

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015119337A1 (fr) * 2014-02-06 2015-08-13 엘지전자 주식회사 Procédé de réalisation d'un service d'affichage par wi-fi et dispositif à cet effet
AU2014382177B2 (en) * 2014-02-06 2017-03-16 Lg Electronics Inc. Method for performing Wi-Fi display service and device for same

Also Published As

Publication number Publication date
WO2011052960A3 (fr) 2011-09-01
KR100968266B1 (ko) 2010-07-06

Similar Documents

Publication Publication Date Title
KR100975311B1 (ko) 요청시 i-화상 삽입
US9955212B2 (en) Media content shifting
US20020170067A1 (en) Method and apparatus for broadcasting streaming video
US20140281999A1 (en) Display device, two-way communication system and display information using method
US20110304739A1 (en) Camera system, video selection apparatus and video selection method
WO2011052960A2 (fr) Système de commande de transmission de données en temps réel et procédé de commande associé
US10200633B2 (en) Camera system, video selection apparatus and video selection method
JP2005176219A (ja) 制御装置及び動画再生装置
CN101072341A (zh) 发送装置及方法、接收装置及方法、发送和接收系统
WO2012128563A2 (fr) Dispositif et procédé d'émission/réception de contenu radiodiffusé lié grâce à un réseau hétérogène
JP2009502073A (ja) 携帯電話からテレビジョン生放送するためのシステム
KR20080086262A (ko) 디지털 콘텐츠 공유를 위한 방법 및 장치, 그리고 디지털콘텐츠 공유 시스템
CA2303326C (fr) Recepteur et emetteur-recepteur
WO2016056804A1 (fr) Appareil de traitement de contenu et procédé de traitement de contenu associé
WO2012144795A2 (fr) Appareil de diffusion d'émission enregistrée par enregistrement programmé, et procédé de commande de l'appareil
CN101141585B (zh) 接收设备
US20110321106A1 (en) Transmitter, receiver, communication equipment, communication system, transmission method and reception method
JP2010258489A (ja) 映像表示装置、受信装置、送受信システム、及び映像表示方法
JP2008193732A (ja) 監視映像送信方法
US8872971B2 (en) Video display apparatus, video processing method, and video display system
KR20210097550A (ko) 5g 이동 통신망을 이용한 휴대용 무선 방송 중계장치 및 방법
US12401840B2 (en) Information processing device and information processing method
WO2015046854A1 (fr) Appareil d'affichage d'image, serveur, procédé de mise en œuvre l'appareil d'affichage d'image et procédé de mise en œuvre du serveur
JP6468663B2 (ja) 映像コンテンツ配信装置
WO2020166759A1 (fr) Procédé et appareil pour lire une vidéo en fonction d'un temps de lecture de vidéo demandé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10827044

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10827044

Country of ref document: EP

Kind code of ref document: A2