
WO2017158622A2 - Method for managing image data at electronic device - Google Patents

Method for managing image data at electronic device

Info

Publication number
WO2017158622A2
WO2017158622A2 (PCT/IN2017/050092)
Authority
WO
WIPO (PCT)
Prior art keywords
blocks
image frames
blocks corresponding
rate
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IN2017/050092
Other languages
French (fr)
Other versions
WO2017158622A3 (en)
Inventor
Anurag Mittal
Raj Gupta
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Indian Institute of Technology Madras
Original Assignee
Indian Institute of Technology Madras
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Indian Institute of Technology Madras filed Critical Indian Institute of Technology Madras
Publication of WO2017158622A2 publication Critical patent/WO2017158622A2/en
Publication of WO2017158622A3 publication Critical patent/WO2017158622A3/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • H: ELECTRICITY; H04: ELECTRIC COMMUNICATION TECHNIQUE; H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION; H04N 19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/50: using predictive coding; H04N 19/503: involving temporal prediction; H04N 19/507: using conditional replenishment
    • H04N 19/10: using adaptive coding; H04N 19/102: characterised by the element, parameter or selection affected or controlled by the adaptive coding; H04N 19/132: sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
    • H04N 19/134: characterised by the element, parameter or criterion affecting or controlling the adaptive coding; H04N 19/136: incoming video signal characteristics or properties; H04N 19/137: motion inside a coding unit, e.g. average field, frame or block difference
    • H04N 19/167: position within a video image, e.g. region of interest [ROI]
    • H04N 19/169: characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding; H04N 19/17: the unit being an image region, e.g. an object; H04N 19/176: the region being a block, e.g. a macroblock

Definitions

  • FIG. 1 illustrates various units of an electronic device for managing image data, according to the embodiments as disclosed herein;
  • FIG. 2 illustrates various units of a coding unit included in the electronic device, according to the embodiments as disclosed herein;
  • FIG. 3 is a flow diagram illustrating a method for controlling a rate of transmission of each of blocks of image frames, according to an embodiment as disclosed herein;
  • FIG. 4 is a flow diagram illustrating a method for transmitting one or more portions of the image frames corresponding to a differential motion data over a network, according to an embodiment as disclosed herein;
  • FIG. 5 is a flow diagram illustrating various operations performed to detect a rate of motion of one or more objects in the detected set of blocks, according to an embodiment as disclosed herein;
  • FIG. 6 illustrates an example in which image data is processed in a video surveillance environment, according to an embodiment as disclosed herein;
  • FIG. 7 is a flow diagram illustrating various operations performed to transmit the blocks of the image frames corresponding to the differential motion data over a network, according to an embodiment as disclosed herein;
  • FIGS. 8a and 8b are flow diagrams illustrating various operations performed to transmit the set of blocks over the network, according to an embodiment as disclosed herein;
  • FIG. 9 illustrates a computing environment implementing the method for managing the image data, according to an embodiment as disclosed herein.
  • circuits may, for example, be embodied in one or more semiconductor chips, or on substrate supports such as printed circuit boards and the like.
  • circuits constituting a block may be implemented by dedicated hardware, or by a processor (e.g., one or more programmed microprocessors and associated circuitry), or by a combination of dedicated hardware to perform some functions of the block and a processor to perform other functions of the block.
  • a processor e.g., one or more programmed microprocessors and associated circuitry
  • Each block of the embodiments may be physically separated into two or more interacting and discrete blocks without departing from the scope of the invention.
  • the blocks of the embodiments may be physically combined into more complex blocks without departing from the scope of the invention
  • the embodiments herein provide a method for managing image data at an electronic device.
  • the method includes receiving, by a coding unit, a plurality of image frames. Further, the method includes detecting, by the coding unit, a rate of motion of at least one object in each block of the image frames. Further, the method includes controlling, by the coding unit, a rate of transmission of each of the blocks of the image frames based on the rate of motion of the at least one object.
  • the proposed method can be used to detect important blocks of the object from the set of blocks corresponding to a foreground portion. Further, the proposed method can be used to process the important blocks corresponding to the foreground portion in each of the image frames. Further, the proposed method can be used to transmit the processed important blocks over a network. This avoids repeatedly sending high resolution images over the network and thereby reduces bandwidth usage.
  • the proposed method can be used to reduce the distortion in pixel values in the block so as to improve the quality of the foreground portion and the background portion.
  • Referring now to FIGS. 1 through 9, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
  • FIG. 1 illustrates various units of an electronic device 100 for managing image data, according to the embodiments as disclosed herein.
  • the electronic device 100 can be, for example but not limited to, a digital camera, a mobile telephone, a smartphone, a Personal Digital Assistant (PDA), a media player, a gaming device, a web camera, a video camera, a computer, a laptop, or the like.
  • the image data can be, for example but not limited to, a picture, a video, multimedia content or the like.
  • the electronic device 100 includes a communication unit 102, a coding unit 104, a processor 106 and a memory 108.
  • the coding unit 104 is in communication with the memory 108 and the processor 106.
  • the coding unit 104 is configured to receive a plurality of image frames. After receiving the plurality of image frames, the coding unit 104 is configured to segment each of the image frames into a plurality of blocks. In an example, the coding unit 104 divides each of the image frames into the plurality of blocks using a tiling pattern, as sketched below. Each of the image frames includes a plurality of full-sized, interior blocks.
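  • As an illustration of the tiling step above, the following sketch splits a colour frame into fixed-size blocks with numpy. The 16x16 block size, the edge padding, and the dictionary keyed by block index are assumptions chosen for illustration; the patent does not prescribe them.

```python
import numpy as np

def tile_into_blocks(frame, block_size=16):
    """Split an H x W x C frame into a grid of fixed-size blocks.

    Edge blocks are padded so that every block is full-sized; the patent only
    states that each frame contains full-sized interior blocks, so this
    padding strategy is an assumption.
    """
    h, w = frame.shape[:2]
    pad_h = (-h) % block_size
    pad_w = (-w) % block_size
    padded = np.pad(frame, ((0, pad_h), (0, pad_w), (0, 0)), mode="edge")
    blocks = {}
    for by in range(0, padded.shape[0], block_size):
        for bx in range(0, padded.shape[1], block_size):
            # Key each block by its (row, column) index in the tiling grid.
            blocks[(by // block_size, bx // block_size)] = \
                padded[by:by + block_size, bx:bx + block_size]
    return blocks
```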
  • the coding unit 104 is configured to detect the set of blocks from the plurality of blocks that includes a frequent motion of one or more objects in each of the image frames.
  • the object corresponds to a specific portion in the blocks or region of interest in the blocks.
  • the objects of the image data are likely to be a recognizable item of interest to the user.
  • the recognizable item may include, for example, a person's face, a person's body, a car, a truck, a cat, a dog or the like.
  • the coding unit 104 is configured to determine the rate of motion of one or more objects in the detected set of blocks.
  • the coding unit 104 is configured to determine the rate of motion of one or more objects in the detected set of blocks using a blob detection technique, a color change adjustment scheme, a geometric variation scheme or the like.
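  • The per-block motion rate described above can be approximated with plain frame differencing, as in the sketch below. The patent mentions blob detection and colour-change schemes; the thresholds and the per-block counter used here are illustrative assumptions, not the patent's method.

```python
import numpy as np

def update_motion_rate(prev_frame, curr_frame, motion_counts,
                       block_size=16, diff_thresh=25, block_fraction=0.05):
    """Accumulate, per block, how often significant pixel change is observed.

    `motion_counts` maps a (row, col) block index to the number of frames in
    which that block showed motion.  All numeric thresholds are assumptions.
    """
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = (diff.max(axis=-1) if diff.ndim == 3 else diff) > diff_thresh
    h, w = changed.shape
    for by in range(0, h - h % block_size, block_size):
        for bx in range(0, w - w % block_size, block_size):
            cell = changed[by:by + block_size, bx:bx + block_size]
            # A block "moves" in this frame if enough of its pixels changed.
            if cell.mean() > block_fraction:
                key = (by // block_size, bx // block_size)
                motion_counts[key] = motion_counts.get(key, 0) + 1
    return motion_counts
```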
  • the coding unit 104 is configured to control the rate of transmission of each of the blocks of the image frames.
  • the coding unit 104 is configured to control the rate of transmission by determining a differential motion data of the object among the image frames based on the rate of motion of the at least one object.
  • the differential motion data corresponds to the movement of the object or position changes of the object. The movement of the object or position changes of the object are determined by a scheme (e.g., an object tracking scheme or the like).
  • the coding unit 104 is configured to transmit the blocks of the image frames over a network (not shown).
  • the network can be a cellular network.
  • the coding unit 104 is configured to determine the set of blocks corresponding to a background portion and the set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data. Further, the coding unit 104 is configured to process at least one of the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion.
  • the quality level refers to a number of image processing parameters including resolution, frame rate, bit rate, and image compression quality.
  • the coding unit 104 is configured to transmit the blocks of the image frames over the network.
  • the compression of the image data may include any appropriate compression scheme, such as applying a scheme that changes the effective amount of the image data in terms of number of bits per pixel.
  • the compression scheme may include, for example, a predetermined compression scheme for a specific file format.
  • a lowest quality JPEG compression scheme may have a quality value (or Q value) of two, a low quality JPEG compression may have a Q value of seven, a medium quality JPEG compression may have a Q value of twenty, an average quality JPEG compression may have a Q value of fifty, and a full quality JPEG compression may have a Q value of one hundred.
  • the coding unit 104 is configured to process the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion by cropping a boundary of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion.
  • lowering the quality of the image data can be achieved in several ways, such as lowering the resolution of the image data or using a higher compression rate in the compression scheme.
  • the quality management may apply a low or moderate amount of compression to one region of the image and a higher amount of compression to the rest of the image.
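  • A minimal sketch of such quality-differentiated encoding, assuming OpenCV's JPEG encoder is used: foreground blocks are written at a high JPEG quality and background blocks at a much lower one. The specific quality constants are assumptions chosen only to illustrate the idea, not values taken from the patent.

```python
import cv2

FOREGROUND_Q = 90   # assumption: near-full quality for foreground blocks
BACKGROUND_Q = 20   # assumption: "medium/low" quality for background blocks

def encode_block(block, is_foreground):
    """JPEG-encode one block, spending more bits on foreground than background."""
    q = FOREGROUND_Q if is_foreground else BACKGROUND_Q
    ok, buf = cv2.imencode(".jpg", block, [int(cv2.IMWRITE_JPEG_QUALITY), q])
    if not ok:
        raise RuntimeError("JPEG encoding failed for block")
    return buf.tobytes()
```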
  • if the image data is originally compressed using a block-based approach such as JPEG and a cropped head image is compressed again before transmission, then the compression artifacts can be reduced if the blocks used for re-compression are the same as the original ones. For this, the important objects can be cropped along the original block boundaries themselves, which reduces these artifacts.
  • if JPEG compression is employed for the source image and for re-compression, then the windows of the extracted images are cropped at image boundaries that are multiples of 16, since JPEG operates on blocks of size 16x16.
  • in this way, every block can be ensured to contain only foreground or only background. This is important because it preserves the fidelity of the foreground objects during a reconstruction process and helps in identifying which pixels are background and which are foreground during reconstruction.
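  • The block-boundary alignment described above can be expressed as a small helper that snaps a crop rectangle outward to the compression grid. The grid size of 16 follows the JPEG example above; the clamping to the frame size and the function name are added assumptions.

```python
def align_crop_to_blocks(x, y, w, h, frame_w, frame_h, grid=16):
    """Expand a crop rectangle so its edges fall on the compression grid."""
    x0 = (x // grid) * grid                              # snap left edge down
    y0 = (y // grid) * grid                              # snap top edge down
    x1 = min(frame_w, ((x + w + grid - 1) // grid) * grid)  # snap right edge up
    y1 = min(frame_h, ((y + h + grid - 1) // grid) * grid)  # snap bottom edge up
    return x0, y0, x1 - x0, y1 - y0

# Example: a 30x40 face at (5, 9) becomes a grid-aligned 48x64 crop at (0, 0).
print(align_crop_to_blocks(5, 9, 30, 40, frame_w=1920, frame_h=1080))
```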
  • the boundaries of the set of blocks corresponding to the background portion and of the set of blocks corresponding to the foreground portion are aligned with the boundaries used by the compression technique.
  • the boundary of the important blocks is aligned with the boundary used by the compression technique.
  • the coding unit 104 is configured to detect the important block of the object of the set of blocks corresponding to the foreground portion based on a plurality of parameters.
  • the coding unit 104 is configured to process the important blocks corresponding to the foreground portion in each of the image frames. Further, the coding unit 104 is configured to transmit the blocks corresponding to the differential motion data over the network.
  • the coding unit 104 determines changes in each of the blocks of the image frames due to noise. As a result, the foreground image including the noise may be sent with very low frequency. In an embodiment, by using the object detection techniques, the noise and small changes in each of the blocks of the image frames need not be sent at all.
  • the parameters include at least one of an availability of the object in each of the image frames, information already transmitted corresponding to the object, a rate of motion of the object in each of the image frames, resolution, and an output from the object detection technique.
  • the coding unit 104 is configured to transmit different parts of the important blocks separately.
  • Each block may have one or more foreground objects, specified by their start x and y co-ordinates and their width and height within the image data. This might reduce the amount of data to be transmitted.
  • the items may be prioritized based on size and/or location within the image data.
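  • One possible way to package per-block foreground objects with their start coordinates and sizes, and to prioritise them by size and location, is sketched below. The data structures and the centre-weighted priority score are assumptions for illustration, not structures taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ForegroundObject:
    x: int          # start x coordinate within the frame
    y: int          # start y coordinate within the frame
    width: int
    height: int
    data: bytes     # encoded pixels for just this region

@dataclass
class BlockPayload:
    block_index: Tuple[int, int]
    objects: List[ForegroundObject] = field(default_factory=list)

def prioritise(objects, frame_w, frame_h):
    """Order objects for transmission: larger objects nearer the image centre
    first (one possible reading of prioritising by size and location)."""
    cx, cy = frame_w / 2, frame_h / 2
    def score(o):
        area = o.width * o.height
        dist = abs(o.x + o.width / 2 - cx) + abs(o.y + o.height / 2 - cy)
        return (-area, dist)
    return sorted(objects, key=score)
```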
  • in an example, a visitor sits in a reception area.
  • the reception area includes a set of immovable items (a flower pot, a set of tables and monitors).
  • the immovable items do not change their positions.
  • the immovable items are considered the background portion.
  • the visitor sitting in the reception area may change his position frequently, which can be considered the foreground portion.
  • the change in the visitor's position constitutes the differential motion data.
  • the coding unit 104 transmits the blocks of the foreground portion corresponding to the differential motion data over the network. This avoids repeated transmission of high resolution images over the network and reduces the bandwidth usage over the network.
  • the communication unit 102 is configured for communicating internally between internal units and with external devices via one or more networks.
  • the memory 108 may include one or more computer-readable storage media.
  • the memory 108 may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories.
  • the memory 108 may, in some examples, be considered a non-transitory storage medium.
  • the term “non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term “non-transitory” should not be interpreted to mean that the memory 108 is non-movable.
  • FIG. 1 shows exemplary units of the electronic device 100, but it is to be understood that other embodiments are not limited thereto. In other embodiments, the electronic device 100 may include a smaller or larger number of units. Further, the labels or names of the units are used only for illustrative purposes and do not limit the scope of the invention. One or more units can be combined together to perform the same or a substantially similar function to manage the image data at the electronic device 100.
  • FIG. 2 illustrates various units of the coding unit 104 included in the electronic device 100, according to the embodiments as disclosed herein.
  • the coding unit 104 includes a frame processing unit 104a, a segmentation unit 104b, a compressor 104c, an object detector 104d, an image analyzer 104e, and a block estimator 104f.
  • the frame processing unit 104a is configured to receive the plurality of image frames.
  • the segmentation unit 104b is configured to segment each of the image frames into the plurality of blocks.
  • the segmentation unit 104b segments each of the image frames into the plurality of blocks using a pattern analyzing scheme.
  • the object detector 104d is configured to detect the set of blocks from the plurality of blocks that include the frequent motion of the object in each of the image frames.
  • the block estimator 104f is configured to determine the rate of motion of the object in the detected set of blocks. In an example, the block estimator 104f determines the rate of motion of the object in the detected set of blocks using the blob detection technique, the color change adjustment scheme, or the like.
  • the block estimator 104f is configured to control the rate of transmission of each of the blocks of the image frames.
  • the image analyzer 104e is configured to control the rate of transmission by determining the differential motion data of the object among the image frames based on the rate of motion of the object.
  • the block estimator 104f is configured to transmit the blocks of the image frames over the network.
  • the block estimator 104f frequently transmits each of the blocks of the image frames in which motion of the object occurs frequently, i.e., the detected set of blocks; one possible scheduling is sketched below.
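  • A simple scheduler illustrating the rate control described above: blocks whose observed motion rate is high are selected for transmission every frame, while low-motion blocks are sent only occasionally. The thresholds and intervals are illustrative assumptions, not values given in the patent.

```python
def blocks_to_send(frame_index, motion_counts, frames_seen,
                   fg_interval=1, bg_interval=100, motion_fraction=0.2):
    """Pick which blocks to transmit for the current frame.

    `motion_counts` is the per-block motion counter built up earlier; the
    motion rate is the fraction of observed frames in which a block moved.
    """
    selected = []
    for key, count in motion_counts.items():
        rate = count / max(frames_seen, 1)
        interval = fg_interval if rate >= motion_fraction else bg_interval
        if frame_index % interval == 0:
            selected.append(key)
    return selected
```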
  • the block estimator 104f is configured to determine the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames based on the differential motion data.
  • the block estimator 104f is configured to processes the set of blocks corresponding to the background portion to have the quality lower the set of blocks corresponding to the foreground portion.
  • the block estimator 104f is further configured to transmit the blocks of the image frames over the network.
  • the image analyzer 104e is configured to process the set of blocks corresponding to the background portion to have the quality lower the set of blocks corresponding to the foreground portion by cropping the boundary of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion.
  • the block estimator 104f is configured to detect the importance block of the object of the set of blocks corresponding to the foregoing portion based on the plurality of parameters.
  • the block estimator 104f is configured to process the important blocks corresponding to the foreground portion in each of the image frames.
  • the block estimator 104f is further configured to transmit the blocks corresponding to the differential motion data over the network
  • the object detector 104d determines which objects of the set of blocks have already been sent. In an embodiment, the object detector 104d determines which objects of the set of blocks need not be sent at all. In an embodiment, the object detector 104d determines which objects of the set of blocks are sent using a very low number of bits over the network.
  • the object detector 104d utilizes different coding schemes depending on the amount of motion that the foreground objects undergo.
  • a difference from the previous image after motion compensation can be computed and sent (similar to traditional compression schemes such as MPEG4/H.264/H.265 schemes used for image data).
  • if the motion is large (e.g., a passenger moving away from the lounge by suddenly getting up with his luggage), a difference image from the background can be sent.
  • the scheme that gives the least number of bits can be used adaptively.
  • Detection techniques can be used in the foreground region to determine the important parts of the image, which are then sent with a frequency and number of bits depending on their importance. For example, at a first instance, the data needs to be transmitted at 15 fps, and at another instance, the data needs to be transmitted at 1 fps. Also, objects that have been sent before may be sent at even greater intervals. The importance of different parts of the image may be determined by the object detection techniques. For instance, faces of humans, number plates of vehicles etc. are the most important forensic information and can be sent in higher resolution and at high frequency, while other parts of the humans or vehicles do not need so many bits to transfer and can be sent infrequently and/or in a lower number of bits.
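  • The adaptive choice between a difference from the previous frame and a difference from the background can be sketched as below: encode both residuals and keep whichever produces fewer bits. Using JPEG on absolute-difference residuals is a simplification for illustration only and is not the codec or residual formulation the patent assumes.

```python
import cv2

def encode_foreground_region(region, prev_region, background_region, quality=80):
    """Encode a foreground region as whichever residual compresses smaller:
    difference from the previous frame (small motion) or difference from the
    background model (large motion).  Inputs must be same-shape uint8 arrays.
    """
    params = [int(cv2.IMWRITE_JPEG_QUALITY), quality]
    candidates = {}
    for name, ref in (("diff_prev", prev_region), ("diff_bg", background_region)):
        residual = cv2.absdiff(region, ref)
        ok, buf = cv2.imencode(".jpg", residual, params)
        if ok:
            candidates[name] = buf.tobytes()
    if not candidates:
        raise RuntimeError("encoding failed for both residuals")
    # Adaptively keep the scheme that produced the fewest bits.
    name = min(candidates, key=lambda k: len(candidates[k]))
    return name, candidates[name]
```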
  • FIG. 2 shows exemplary units of the coding unit 104 but it is to be understood that other embodiments are not limited thereon.
  • the coding unit 104 may include a smaller or larger number of units.
  • the labels or names of the units are used only for illustrative purposes and do not limit the scope of the invention.
  • One or more units can be combined together to perform same or substantially similar function to process the image data in the coding unit 104.
  • FIG. 3 is a flow diagram 300 illustrating a method for controlling the rate of transmission of each of blocks of the image frames, according to an embodiment as disclosed herein.
  • the method includes receiving the plurality of image frames. In an embodiment, the method allows the frame processing unit 104a to receive the plurality of image frames.
  • the method includes detecting the rate of motion of the object in each block of the image frames. In an embodiment, the method allows the object detector 104d to detect the rate of motion of the object in each block of the image frames. Operation 304 is explained in conjunction with FIG. 5.
  • the method includes controlling the rate of transmission of each of the blocks of the image frames. In an embodiment, the method allows the block estimator 104f to control the rate of transmission of each of the blocks of the image frames. Operation 306 is explained in conjunction with FIG. 7.
  • the images of important objects which are clearly visible from a viewpoint and do not change much over a period of time need not be transmitted frequently.
  • the people in an area may not change their positions very often. So, the proposed method does not repeatedly send the high resolution frontal images of their faces over the network. This results in reducing the bandwidth usage.
  • the proposed method can be used to reduce the number of bits transmitted across the network, as compression of an "all zero" block and an "all foreground" block is easier for schemes such as the JPEG scheme, which utilizes spatial smoothness of the image data.
  • otherwise, the compression scheme needs to use considerable information to model a change from the background portion to the foreground portion within the block.
  • the method can be used to reduce distortion in the pixel values in the block and improve the quality of the foreground portion and the background portion.
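  • A quick, informal way to observe the effect described above with OpenCV: JPEG-encode an all-zero block and a block mixing flat background with a noisy foreground patch, and compare the encoded sizes. Exact byte counts will vary by encoder build; the snippet only demonstrates the direction of the difference.

```python
import cv2
import numpy as np

all_zero = np.zeros((16, 16, 3), dtype=np.uint8)
mixed = all_zero.copy()
# Insert a synthetic "foreground" patch of random pixels into the block.
mixed[4:12, 4:12] = np.random.randint(0, 256, (8, 8, 3), dtype=np.uint8)

for name, blk in (("all-zero block", all_zero), ("mixed block", mixed)):
    ok, buf = cv2.imencode(".jpg", blk, [int(cv2.IMWRITE_JPEG_QUALITY), 50])
    print(name, len(buf), "bytes")   # the mixed block needs noticeably more bytes
```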
  • FIG. 4 is a flow diagram 400 illustrating a method for transmitting one or more portions of the images frames corresponding to the differential motion data over the network, according to an embodiment as disclosed herein.
  • the method includes receiving the plurality of image frames.
  • the method allows the frame processing unit 104a to receive the plurality of image frames.
  • the method includes determining the differential motion data of the foreground object among the plurality of image frames.
  • the method allows the block estimator 104f to determine the differential motion data of the foreground object among the plurality of image frames.
  • the method includes transmitting the portion of the image frames corresponding to the differential motion data frequently compared to the remaining image frames in the plurality of image frames.
  • the method allows the coding unit 104 to transmit the portions of the image frames corresponding to the differential motion data frequently compared to the remaining image frames in the plurality of image frames.
  • the differential motion data corresponds to the movement of the object or position changes of the object.
  • the method includes controlling the rate of transmission over the network.
  • the method allows the coding unit 104 to control the rate of transmission over the network.
  • the remaining portions of the image frames are visible from the viewpoint and do not change frequently over the period of time.
  • in an example, people sit in a reception area.
  • the reception area includes a flower pot and a set of chairs.
  • the flower pot and the set of chairs do not change their positions, so they can be considered the background portion.
  • the people sitting in the reception area may change their positions, so they can be considered the foreground portion.
  • the change in the people's positions constitutes the differential motion data.
  • the block estimator 104f transmits the blocks of the foreground portion corresponding to the differential motion data over the network by using an object tracking scheme.
  • the flower pot and the set of chairs do not change their positions frequently over a period of time. This avoids repeatedly transmitting high resolution images over the network and reduces the bandwidth usage over the network.
  • FIG. 5 is a flow diagram 500 illustrating various operations performed to detect the rate of motion of the objects in the set of blocks, according to an embodiment as disclosed herein.
  • the method includes segmenting each of the image frames into the plurality of blocks. In an embodiment, the method allows the segmentation unit 104b to segment each of the image frames into the plurality of blocks.
  • the method includes detecting the set of blocks from the plurality of blocks that includes the frequent motion of the object in each of the image frames.
  • the method allows the block estimator 104f to detect the set of blocks from the plurality of blocks that include the frequent motion of the object in each of the image frames.
  • the method includes detecting the rate of motion of the object in the detected set of blocks. In an embodiment, the method allows the block estimator 104f to detect the rate of motion of the object in the detected set of blocks.
  • FIG. 6 illustrates an example in which the image data is processed in a video surveillance environment, according to an embodiment as disclosed herein.
  • the scene captured by the electronic device 100 is as shown in the notation "A".
  • the frame processing unit 104a receives the image frames.
  • the image frames includes the foreground portion and the background portion.
  • the segmentation unit 104b determines the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames.
  • the segmentation unit 104b divides the background portion and the foreground portion as shown in the notations "b and c".
  • the object detector 104d detects the movable items and non-movable items in the foreground portion.
  • the movable items are people as shown in FIG. 6.
  • the non-movable items are steps and wooden materials.
  • based on detecting the movable items, the image analyzer 104e detects the important blocks (e.g., faces) of the movable items as shown in the notations "d and e". The image analyzer 104e processes the important blocks corresponding to the movable items.
  • FIG. 7 is a flow diagram 700 illustrating various operations performed to transmit the blocks of the image frames corresponding to the differential motion data over the network, according to an embodiment as disclosed herein.
  • the method includes determining the differential motion data of the object among the image frames based on the rate of motion of the object. In an embodiment, the method allows the block estimator 104f to determine the differential motion data of the object among the image frames based on the rate of motion of the object.
  • the method includes transmitting the blocks of the image frames corresponding to the differential motion data over the network. In an embodiment, the method allows the coding unit 104 to transmit the blocks of the image frames corresponding to the differential motion data over the network. Operation 704 is explained in conjunction with FIG. 8a and FIG. 8b.
  • the coding unit 104 transmits background data very rarely as the differential motion data does not occur in the background portion.
  • the coding unit 104 transmits the foreground portion frequently, as the differential motion data frequently occurs in the foreground portion.
  • FIGS. 8a and 8b are flow diagrams 800a and 800b illustrating various operations performed to transmit the set of blocks over the network, according to an embodiment as disclosed herein.
  • the method includes determining the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames based on the differential motion data.
  • the method allows the block estimator 104f to determine the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames based on the differential motion data.
  • the method includes processing the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion.
  • the method allows the compressor 104c to process the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion.
  • the method includes transmitting the processed set of blocks over the network.
  • the method allows the block estimator 104f to transmit the processed set of blocks over the network.
  • the method includes determining the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames based on the differential motion data.
  • the method allows the object detector 104d to determine the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames based on the differential motion data.
  • the method includes detecting the important block of the object of the set of blocks corresponding to the foreground portion based on the plurality of parameters.
  • the method allows the object detector 104d to detect the important block of the object of the set of blocks corresponding to the foreground portion based on the plurality of parameters.
  • the faces are one of the most important forensic data (i.e., important block) that is needed by the security agency.
  • the faces may be extracted using various techniques (e.g., machine learning techniques such as Histogram of Oriented Gradients with Support Vector Machines (HOG-SVM) and Neural Networks).
  • the portion of the faces is estimated using a background subtracted blob scheme.
  • the faces corresponding to the foreground portion are processed in each of the image frames.
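  • As a concrete stand-in for the HOG-SVM face extraction mentioned above, the sketch below uses dlib's frontal face detector, which is a HOG + linear SVM model. The patent does not name a library, so this choice and the function name are assumptions; in the described pipeline the returned rectangles would then be aligned to the compression grid and encoded at high quality and high frequency.

```python
import cv2
import dlib

# dlib's detector is one readily available HOG + linear SVM face model.
face_detector = dlib.get_frontal_face_detector()

def detect_important_face_regions(frame_bgr):
    """Return (x, y, width, height) rectangles for detected faces in a BGR frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    rects = face_detector(gray, 1)  # 1 = upsample the image once before detecting
    return [(r.left(), r.top(), r.width(), r.height()) for r in rects]
```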
  • the method includes processing the important blocks corresponding to the foreground portion in each of the image frames. In an embodiment, the method allows the image analyzer 104e to process the important blocks corresponding to the foreground portion in each of the image frames. At 808b, the method includes transmitting the processed important blocks over the network. In an embodiment, the method allows the block estimator 104f to transmit the processed important blocks over the network.
  • one of the most important information required by the security agency is a number plate of a vehicle. This information can be extracted again using several techniques such as machine learning techniques.
  • the important block is the number plate.
  • the block estimator 104f only transmits the processed number plate over the network.
  • FIG. 9 illustrates a computing environment 902 implementing the method for managing the image data, according to an embodiment as disclosed herein.
  • the computing environment 902 comprises at least one processing unit 908 that is equipped with a control unit 904, an Arithmetic Logic Unit (ALU) 906, a memory 910, a storage unit 912, a plurality of networking devices 916 and a plurality of input/output (I/O) devices 914.
  • the processing unit 908 is responsible for processing the instructions of the technique.
  • the processing unit 908 receives commands from the control unit 904 in order to perform its processing. Further, any logical and arithmetic operations involved in the execution of the instructions are computed with the help of the ALU 906.
  • the overall computing environment 902 can be composed of multiple homogeneous or heterogeneous cores, multiple CPUs of different kinds, special media and other accelerators.
  • the processing unit 908 is responsible for processing the instructions of the technique. Further, the processing units may be located on a single chip or over multiple chips.
  • the technique comprising of instructions and codes required for the implementation are stored in either the memory unit 910 or the storage 912 or both. At the time of execution, the instructions may be fetched from the corresponding memory 910 or storage 912, and executed by the processing unit 908.
  • networking devices 916 or external I/O devices 914 may be connected to the computing environment 902 to support the implementation through the networking unit and the I/O device unit.
  • The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the elements shown in FIGS. 1 to 9.
  • the elements shown in the FIGS. 1 to 9 include blocks, elements, actions, acts, steps, or the like which can be at least one of a hardware device, or a combination of hardware device and software module.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

Embodiments herein disclose a method for managing image data at an electronic device. The method includes receiving, by a coding unit, a plurality of image frames. Further, the method includes detecting, by the coding unit, a rate of motion of at least one object in each block of the image frames. Further, the method includes controlling, by the coding unit, a rate of transmission of each of the blocks of the image frames based on the rate of motion of the at least one object.

Description

"Method for managing image data at electronic device"
FIELD OF INVENTION
[0001] The present application relates to image data processing, and more particularly to a method for managing image data at an electronic device. The present application is based on, and claims priority from, an Indian Application Number 201641009144 filed on 16th March, 2016, the disclosure of which is hereby incorporated by reference herein.
BACKGROUND OF INVENTION
[0001] Video cameras are widely used by security agencies for surveillance and security. The security agency utilizes the camera feed for post event analysis. The transmission format typically used for the camera feed is H.264/H.265 over an Internet Protocol (IP) network. The transmission format was developed for real-time video streaming for many applications (e.g., video chat, live event streaming, internet video display or the like). The transmission format is adopted as it is for the video surveillance application. However, when used as it is in the video surveillance application, the transmission process encounters various problems.
[0002] Among the existing methods, one method is to transmit all the data at high resolution. This increases the transmission bandwidth usage. Further, the data may be transmitted over channels without sufficient bandwidth, as may be the case on a cellular network or in a remote location monitoring application. This results in degraded performance in the video surveillance application. In another method, the data is transmitted after compression at a lower resolution. This results in loss of important forensic data.
[0003] The conventional systems and methods are effective to a degree during transmission of the data, but they involve both advantages and disadvantages in terms of bandwidth usage, memory, power, loss of information due to a channel, cost, level of accuracy, ability to support multiple environments, reliability of communication or the like. Thus, there remains a need for a robust method for managing the data at an electronic device.
OBJECT OF INVENTION
[0004] The principal object of the embodiments herein is to provide a method for managing image data at an electronic device.
[0005] Another object of the embodiments herein is to receive a plurality of image frames.
[0006] Another object of the embodiments herein is to segment each of the image frames into a plurality of blocks.
[0007] Another object of the embodiments herein is to detect a set of blocks from a plurality of blocks that includes a frequent motion of one or more object(s) in each of the image frames.
[0008] Another object of the embodiments herein is to detect a rate of motion of the object in the detected set of blocks.
[0009] Another object of the embodiments herein is to determine a differential motion data of the object among the image frames based on the rate of motion of the object.
[0010] Another object of the embodiments herein is to transmit a portion of the image frames corresponding to the differential motion data frequently compared to the remaining image frames in the plurality of image frames.
[0011] Another object of the embodiments herein is to determine a set of blocks corresponding to a background portion and a set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data.
[0012] Another object of the embodiments herein is to crop a boundary of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion.
[0013] Another object of the embodiments herein is to process at least one of the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion.
[0014] Another object of the embodiments herein is to detect an important block of the at least one object of the set of blocks corresponding to the foreground portion based on a plurality of parameters.
[0015] Another object of the embodiments herein is to process the important blocks corresponding to the foreground portion in each of the image frames.
[0016] Another object of the embodiments herein is to transmit the processed important blocks over a network.
SUMMARY
[0017] Embodiments herein disclose a method for managing image data at an electronic device. The method includes receiving, by a coding unit, a plurality of image frames. Further, the method includes detecting, by the coding unit, a rate of motion of at least one object in each block of the image frames. Further, the method includes controlling, by the coding unit, a rate of transmission of each of the blocks of the image frames based on the rate of motion of the at least one object.
[0018] In an embodiment, controlling the rate of transmission of each of the blocks of the image frames based on the rate of motion of the at least one object includes determining a differential motion data of the at least one object among the image frames based on the rate of motion of the at least one object, and transmitting the blocks of the image frames corresponding to the differential motion data over a network.
[0019] In an embodiment, the method includes determining a set of blocks corresponding to a background portion and a set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data. Further, the method includes processing at least one of the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion. Further, the method includes transmitting the processed set of blocks over the network.
[0020] In an embodiment, the coding unit is configured to process at least one of the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion by cropping at least one boundary of the set of blocks corresponding to the background portion and of the set of blocks corresponding to the foreground portion.
[0021] In an embodiment, the boundaries of the set of blocks corresponding to the background portion and of the set of blocks corresponding to the foreground portion are aligned with the boundaries used by a compression technique.
[0022] In an embodiment, the method includes determining a set of blocks corresponding to the background portion and a set of blocks corresponding to the foreground portion in each of the image frames based on the differential motion data. Further, the method includes detecting an important block of the at least one object of the set of blocks corresponding to the foreground portion based on a plurality of parameters. Further, the method includes processing the important blocks corresponding to the foreground portion in each of the image frames. Further, the method includes transmitting the processed important blocks over the network.
[0023] In an embodiment, the parameters include at least one of an availability of the at least one object in each of the image frames, information already transmitted corresponding to the at least one object, a rate of motion of the at least one object in each of the image frames, resolution, and an output from an object detection technique.
[0024] In an embodiment, the coding unit is configured to process the important blocks corresponding to the foreground portion in each of the image frames by cropping a boundary of the important blocks corresponding to the foreground portion.
[0025] In an embodiment, the boundary of the important blocks is aligned with the boundary used by the compression technique.
[0026] In an embodiment, the rate of motion of the at least one object in each block of each of the image frames is detected by segmenting each of the image frames into a plurality of blocks and detecting a set of blocks, from the plurality of blocks, that includes frequent motion of the at least one object in each of the image frames.
[0027] Embodiments herein disclose a method for managing image data at an electronic device. The method includes receiving, by a coding unit, a plurality of image frames. Further, the method includes determining, by the coding unit, differential motion data of at least one foreground object among the plurality of image frames. Further, the method includes controlling a rate of transmission over a network by transmitting a portion of the image frames corresponding to the differential motion data more frequently than the remaining image frames in the plurality of image frames.
[0028] In an embodiment, the remaining portions of the image frames are visible from a viewpoint and do not change frequently over a period of time.
[0029] Embodiments herein disclose an electronic device for managing image data. The electronic device includes a coding unit in communication with a memory and a processor. The coding unit is configured to receive a plurality of image frames. Further, the coding unit is configured to detect a rate of motion of at least one object in each block of the image frames. Further, the coding unit is configured to control a rate of transmission of each of the blocks of the image frames based on the rate of motion of the at least one object.
[0030] Embodiments herein disclose an electronic device for managing image data. The electronic device includes a coding unit in communication with a memory and a processor. The coding unit is configured to receive a plurality of image frames. Further, the coding unit is configured to determine differential motion data of at least one foreground object among the plurality of image frames. Further, the coding unit is configured to control a rate of transmission over a network by transmitting a portion of the image frames corresponding to the differential motion data more frequently than the remaining image frames in the plurality of image frames.
[0031] Accordingly, the embodiments herein provide a computer program product including a computer executable program code recorded on a computer readable non-transitory storage medium. The computer executable program code, when executed, causes actions including receiving, by a coding unit, a plurality of image frames; detecting, by the coding unit, a rate of motion of at least one object in each block of the image frames; and controlling, by the coding unit, a rate of transmission of each of the blocks of the image frames based on the rate of motion of the at least one object.
[0032] Accordingly, the embodiments herein provide a computer program product including a computer executable program code recorded on a computer readable non-transitory storage medium. The computer executable program code, when executed, causes actions including receiving, by a coding unit, a plurality of image frames; determining, by the coding unit, differential motion data of at least one foreground object among the plurality of image frames; and controlling a rate of transmission over a network by transmitting a portion of the image frames corresponding to the differential motion data more frequently than the remaining image frames in the plurality of image frames.
[0033] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF FIGURES
[0034] This invention is illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[0035] FIG. 1 illustrates various units of an electronic device for managing image data, according to the embodiments as disclosed herein;
[0036] FIG. 2 illustrates various units of a coding unit included in the electronic device, according to the embodiments as disclosed herein;
[0037] FIG. 3 is a flow diagram illustrating a method for controlling a rate of transmission of each of blocks of image frames, according to an embodiment as disclosed herein;
[0038] FIG. 4 is a flow diagram illustrating a method for transmitting one or more portions of the image frames corresponding to a differential motion data over a network, according to an embodiment as disclosed herein;
[0039] FIG. 5 is a flow diagram illustrating various operations performed to detect a rate of motion of one or more objects in the detected set of blocks, according to an embodiment as disclosed herein;
[0040] FIG. 6 illustrates an example in which an image data is processed in a video surveillance environment, according to an embodiment as disclosed herein;
[0041] FIG. 7 is a flow diagram illustrating various operations performed to transmit the blocks of the image frames corresponding to the differential motion data over a network, according to an embodiment as disclosed herein;
[0042] FIGS. 8a and 8b are flow diagrams illustrating various operations performed to transmit the set of blocks over the network, according to an embodiment as disclosed herein; and
[0043] FIG. 9 illustrates a computing environment implementing the method for managing the image data, according to an embodiment as disclosed herein.
DETAILED DESCRIPTION OF INVENTION
[0044] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well- known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. Also, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The term "or" as used herein, refers to a nonexclusive or, unless otherwise indicated. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein can be practiced and to further enable those skilled in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0045] As is traditional in the field, embodiments may be described and illustrated in terms of blocks which carry out a described function or functions. These blocks, which may be referred to herein as units or modules or the like, are physically implemented by analog or digital circuits such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive electronic components, active electronic components, optical components, hardwired circuits, or the like, and may optionally be driven by firmware and/or software. The circuits may, for example, be embodied in one or more semiconductor chips, or on substrate supports such as printed circuit boards and the like. The circuits constituting a block may be implemented by dedicated hardware, or by a processor (e.g., one or more programmed microprocessors and associated circuitry), or by a combination of dedicated hardware to perform some functions of the block and a processor to perform other functions of the block. Each block of the embodiments may be physically separated into two or more interacting and discrete blocks without departing from the scope of the invention. Likewise, the blocks of the embodiments may be physically combined into more complex blocks without departing from the scope of the invention
[0046] The embodiments herein provide a method for managing image data at an electronic device. The method includes receiving, by a coding unit, a plurality of image frames. Further, the method includes detecting, by the coding unit, a rate of motion of at least one object in each block of the image frames. Further, the method includes controlling, by the coding unit, a rate of transmission of each of the blocks of the image frames based on the rate of motion of the at least one object.
[0047] Unlike conventional systems and methods, the proposed method can be used to detect important blocks of the object from the set of blocks corresponding to a foreground portion. Further, the proposed method can be used to process the important blocks corresponding to the foreground portion in each of the image frames. Further, the proposed method can be used to transmit the processed important blocks over a network. This avoids repeatedly sending high-resolution images over the network, which reduces bandwidth usage.
[0048] The proposed method can be used to reduce the distortion in pixel values in the block so as to improve the quality of the foreground portion and the background portion.
[0049] Referring now to the drawings, and more particularly to FIGS. 1 through 9, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[0050] FIG. 1 illustrates various units of an electronic device 100 for managing image data, according to the embodiments as disclosed herein. The electronic device 100 can be, for example but not limited to, a digital camera, a mobile telephone, a smartphone, a Personal Digital Assistant (PDA), a media player, a gaming device, a web camera, a video camera, a computer, a laptop, or the like. The image data can be, for example but not limited to, a picture, a video, multimedia content, or the like. In an embodiment, the electronic device 100 includes a communication unit 102, a coding unit 104, a processor 106 and a memory 108. The coding unit 104 is in communication with the memory 108 and the processor 106.
[0051] Further, the coding unit 104 is configured to receive a plurality of image frames. After receiving the plurality of image frames, the coding unit 104 is configured to segment each of the image frames into a plurality of blocks. In an example, the coding unit 104 divides each of the image frames into the plurality of blocks using a tiling pattern. Each of the image frames includes a plurality of full-sized, interior blocks.
[0052] After segmenting each of the image frames into the plurality of blocks, the coding unit 104 is configured to detect the set of blocks from the plurality of blocks that includes frequent motion of one or more objects in each of the image frames. In an embodiment, the object corresponds to a specific portion in the blocks or a region of interest in the blocks.
[0053] In an embodiment, the objects of the image data are likely to be a recognizable item of interest to the user. The recognizable item may include, for example, a person's face, a person's body, a car, a truck, a cat, a dog or the like.
[0054] Based on detecting the frequent motion of the one or more objects in the image frames, the coding unit 104 is configured to determine the rate of motion of the one or more objects in the detected set of blocks. In an example, the coding unit 104 is configured to determine the rate of motion of the one or more objects in the detected set of blocks using a blob detection technique, a color change adjustment scheme, a geometric variation scheme, or the like.
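As an illustration only, and not the claimed implementation, the per-block rate of motion can be approximated by tiling consecutive grayscale frames, flagging the tiles whose inter-frame difference exceeds a threshold, and averaging those flags over a short history of frames; the block size, threshold and history length below are assumptions made solely for this sketch.
import numpy as np

BLOCK = 16          # illustrative tile size in pixels
DIFF_THRESH = 12.0  # mean absolute difference above which a tile is treated as moving

def block_motion_map(prev_gray, curr_gray):
    # Returns a (rows, cols) boolean array: True where the tile changed between frames.
    h, w = curr_gray.shape
    rows, cols = h // BLOCK, w // BLOCK
    diff = np.abs(curr_gray.astype(np.int16) - prev_gray.astype(np.int16))
    moved = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            tile = diff[r * BLOCK:(r + 1) * BLOCK, c * BLOCK:(c + 1) * BLOCK]
            moved[r, c] = tile.mean() > DIFF_THRESH
    return moved

def motion_rate(history):
    # history: list of boolean maps from block_motion_map for recent frames.
    # The per-block mean is the fraction of recent frames in which each block moved.
    return np.mean(np.stack(history).astype(np.float32), axis=0)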
[0055] In an embodiment, based on the determination of the rate of motion of the one or more objects in the detected set of blocks, the coding unit 104 is configured to control the rate of transmission of each of the blocks of the image frames.
[0056] In an embodiment, the coding unit 104 is configured to control the rate of transmission by determining differential motion data of the object among the image frames based on the rate of motion of the at least one object. In an embodiment, the differential motion data corresponds to the movement of the object or position changes of the object. The movement of the object or position changes of the object are determined by a scheme (e.g., an object tracking scheme or the like).
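Continuing the same sketch, one possible policy (an assumption for illustration, not a prescribed mapping) translates each block's motion rate into a transmission interval, so blocks with frequent motion are sent every frame while near-static blocks are refreshed only occasionally.
import numpy as np

def transmission_interval(rate):
    # rate: per-block motion rate in [0, 1]; returns frames between transmissions.
    intervals = np.full(rate.shape, 300, dtype=int)  # near-static blocks: every 300 frames
    intervals[rate > 0.05] = 30                      # occasional motion: every 30 frames
    intervals[rate > 0.5] = 1                        # frequent motion: every frame
    return intervals

def blocks_to_send(frame_idx, intervals):
    # Boolean mask of blocks scheduled for transmission at this frame index.
    return (frame_idx % intervals) == 0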
[0057] In an embodiment, based on determination of the differential motion data of the object among the image frames, the coding unit 104 is configured to transmit the blocks of the image frames over a network (not shown). In an embodiment, the network can be a cellular network.
[0058] In an embodiment, the coding unit 104 is configured to determine the set of blocks corresponding to a background portion and the set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data. Further, the coding unit 104 is configured to process at least one of the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion. In an embodiment, the quality level refers to a number of image processing parameters including resolution, frame rate, bit rate, and image compression quality. Further, the coding unit 104 is configured to transmit the blocks of the image frames over the network.
[0059] In an embodiment, the compression of the image data may include any appropriate compression scheme, such as applying a scheme that changes the effective amount of the image data in terms of the number of bits per pixel. The compression scheme includes, for example, a predetermined compression scheme for a specific file format.
[0060] In an example, a lowest quality JPEG compression scheme may have a quality value (or Q value) of two, a low quality JPEG compression may have a Q value of seven, a medium quality JPEG compression may have a Q value of twenty, an average quality JPEG compression may have a Q value of fifty, and a full quality JPEG compression may have a Q value of one hundred.
[0061] In an embodiment, the coding unit 104 is configured to process the set of blocks corresponding to the background portion to have the quality lower than that of the set of blocks corresponding to the foreground portion by cropping a boundary of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion.
[0062] In an example, lowering the quality of the image data is achieved through several schemes, such as lowering the resolution of the image data or using a higher compression rate in the compression scheme. In another example, the quality management may apply a low or moderate amount of compression to one region of the image and a higher amount of compression to the rest of the image.
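A minimal sketch of such region-dependent quality, assuming OpenCV is available, is shown below; the crop coordinates, file name and quality values are placeholders chosen for the example and map only loosely onto the illustrative Q values given above.
import cv2

def encode_jpeg(img, quality):
    # Encode an image (or a crop of it) as JPEG at the given quality (0-100).
    ok, buf = cv2.imencode(".jpg", img, [cv2.IMWRITE_JPEG_QUALITY, quality])
    if not ok:
        raise RuntimeError("JPEG encoding failed")
    return buf.tobytes()

frame = cv2.imread("frame.png")          # hypothetical captured frame
face_crop = frame[64:192, 128:256]       # hypothetical foreground region
fg_bytes = encode_jpeg(face_crop, 90)    # foreground sent at high quality
bg_bytes = encode_jpeg(frame, 20)        # background sent at low quality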
[0063] In an example, if the image data is originally compressed using a block-based approach such as JPEG and a cropped head image is compressed again before transmission, the compression artifacts can be reduced if the blocks used for re-compression are the same as the original ones. For this, the cropping of the important objects can be done at the original block boundaries themselves, which will reduce these artifacts. In an example, if JPEG compression is employed for the source image and for re-compression, then the windows of the extracted images are cropped at image boundaries that are multiples of 16, since JPEG operates on blocks of size 16x16.
[0064] In an example, when a block-based compression scheme such as JPEG is used, every block can be ensured to be only a foreground block or only a background block. This is important because it preserves the fidelity of the foreground objects during a reconstruction process and helps in identifying which pixels are background and which are foreground during reconstruction.
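The alignment of crop windows with the compression grid can be sketched as follows: a detection rectangle is snapped outward to multiples of 16 so that the re-compression blocks land on the original JPEG block grid (the helper function and its arguments are illustrative, not part of the disclosure).
def align_crop(x, y, w, h, step=16):
    # Expand (x, y, w, h) outward to the smallest enclosing step-aligned rectangle.
    x0 = (x // step) * step
    y0 = (y // step) * step
    x1 = -(-(x + w) // step) * step   # ceiling to the next multiple of step
    y1 = -(-(y + h) // step) * step
    return x0, y0, x1 - x0, y1 - y0

# A detected face at (37, 53) of size 90x110 becomes the aligned window (32, 48, 96, 128).
print(align_crop(37, 53, 90, 110))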
[0065] In an embodiment, the boundary of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion are aligned with the boundary used by the compression technique.
[0066] In an embodiment, the boundary of the important blocks is aligned with the boundary used by the compression technique.
[0067] In an embodiment, the coding unit 104 is configured to detect the important block of the object from the set of blocks corresponding to the foreground portion based on a plurality of parameters. The coding unit 104 is configured to process the important blocks corresponding to the foreground portion in each of the image frames. Further, the coding unit 104 is configured to transmit the blocks corresponding to the differential motion data over the network.
[0068] By using an object detection scheme, the coding unit 104 determines which changes in each of the blocks of the image frames are due to noise. As a result, the foreground image including the noise may be sent with very low frequency.
[0069] In an embodiment, the noise and small changes in each of the blocks of the image frames need not be sent when the object detection techniques are used.
[0070] In an embodiment, the parameters include at least one of an availability of the object in each of the image frames, information already transmitted corresponding to the object, a rate of motion of the object in each of the image frames, resolution, and an output from the object detection technique.
[0071] In an embodiment, the coding unit 104 is configured to transmit different parts of the important blocks separately. Each block has one or more foreground objects with start x and y coordinates and the width and height of the image data. This might reduce the amount of data to be transmitted.
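One way to carry this per-object metadata is sketched below; the binary header layout (x, y, width, height, payload length) is an assumption made for the example and is not a format defined in this disclosure.
import struct

HEADER = "!HHHHI"  # x, y, width, height (16-bit each) and JPEG payload length (32-bit)

def pack_object(x, y, w, h, jpeg_bytes):
    # Serialize one foreground object: its position, size and compressed crop.
    return struct.pack(HEADER, x, y, w, h, len(jpeg_bytes)) + jpeg_bytes

def unpack_object(payload):
    # Recover the position, size and compressed crop from a serialized object.
    size = struct.calcsize(HEADER)
    x, y, w, h, n = struct.unpack(HEADER, payload[:size])
    return (x, y, w, h), payload[size:size + n]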
[0072] In an embodiment, if the objects of the image data contain a relatively large number of recognized items (e.g., two or more items), the items may be prioritized based on size and/or location within the image data.
[0073] In an example, a visitor sits in a reception area. The reception area includes a set of immovable items (a flower pot, a set of tables, and monitors). The immovable items do not change their positions and are considered the background portion. The visitor sitting in the reception area may change his position frequently, which is considered the foreground portion. The change of position is treated as the differential motion data. The coding unit 104 transmits the blocks of the foreground portion corresponding to the differential motion data over the network. This avoids repeated transmission of high-resolution images over the network, which reduces the bandwidth usage over the network.
[0074] The communication unit 102 is configured for communicating internally between internal units and with external devices via one or more networks. The memory 108 may include one or more computer-readable storage media. The memory 108 may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. In addition, the memory 108 may, in some examples, be considered a non-transitory storage medium. The term "non-transitory" may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term "non-transitory" should not be interpreted to mean that the memory 108 is non-movable.
[0075] Although FIG. 1 shows exemplary units of the electronic device 100, it is to be understood that other embodiments are not limited thereto. In other embodiments, the electronic device 100 may include fewer or more units. Further, the labels or names of the units are used only for illustrative purposes and do not limit the scope of the invention. One or more units can be combined together to perform the same or a substantially similar function to manage the image data at the electronic device 100.
[0076] FIG. 2 illustrates various units of the coding unit 104 included in the electronic device 100, according to the embodiments as disclosed herein. In an embodiment, the coding unit 104 includes a frame processing unit 104a, a segmentation unit 104b, a compressor 104c, an object detector 104d, an image analyzer 104e, and a block estimator 104f. The frame processing unit 104a is configured to receive the plurality of image frames. After receiving the plurality of image frames, the segmentation unit 104b is configured to segment each of the image frames into the plurality of blocks. In an example, the segmentation unit 104b segments each of the image frames into the plurality of blocks using a pattern analyzing scheme.
[0077] After segmenting each of the image frames into the plurality of blocks, the object detector 104d is configured to detect the set of blocks from the plurality of blocks that includes the frequent motion of the object in each of the image frames.
[0078] Based on detecting the frequent motion of the object in the image frames, the block estimator 104f is configured to determine the rate of motion of the object in the detected set of blocks. In an example, the block estimator 104f determines the rate of motion of the object in the detected set of blocks using the blob detection technique, the color change adjustment scheme, or the like.
[0079] In an embodiment, based on determination of the rate of motion of the object in the detected set of blocks, the block estimator 104f is configured to control the rate of transmission of each of the blocks of the image frames.
[0080] In an embodiment, the image analyzer 104e is configured to control the rate of transmission by determining the differential motion data of the object among the image frames based on the rate of motion of the object.
[0081] In an embodiment, based on determination of the differential motion data of the object among the image frames, the block estimator 104f is configured to transmit the blocks of the image frames over the network.
[0082] In an example, the block estimator 104f frequently transmits each of the blocks of the image frames, if the rate of motion of the object occurs frequently in the detected set of blocks.
[0083] In an embodiment, the block estimator 104f is configured to determine the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames based on the differential motion data. The block estimator 104f is configured to process the set of blocks corresponding to the background portion to have the quality lower than that of the set of blocks corresponding to the foreground portion. The block estimator 104f is further configured to transmit the blocks of the image frames over the network.
[0084] In an embodiment, the image analyzer 104e is configured to process the set of blocks corresponding to the background portion to have the quality lower than that of the set of blocks corresponding to the foreground portion by cropping the boundary of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion.
[0085] In an embodiment, the block estimator 104f is configured to detect the important block of the object from the set of blocks corresponding to the foreground portion based on the plurality of parameters. The block estimator 104f is configured to process the important blocks corresponding to the foreground portion in each of the image frames. The block estimator 104f is further configured to transmit the blocks corresponding to the differential motion data over the network.
[0086] In an embodiment, the object detector 104d determines which objects of the set of blocks have already been sent. In an embodiment, the object detector 104d determines which objects of the set of blocks need not be sent at all. In an embodiment, the object detector 104d determines which objects of the set of blocks are sent using a very low number of bits over the network.
[0087] In an embodiment, the object detector 104d utilizes different coding schemes depending on the amount of motion that the foreground objects undergo. In an example, if the movement of the object is not large (e.g., the movement of a passenger's hand while reading a newspaper in a lounge), then a difference from the previous image after motion compensation can be computed and sent (similar to traditional compression schemes such as the MPEG4/H.264/H.265 schemes used for image data). On the other hand, if the motion is large (e.g., a passenger suddenly getting up with his luggage and moving away from the lounge), a difference image from the background can be sent. The scheme that gives the least number of bits can be used adaptively. Detection techniques can be used in the foreground region to determine the important parts of the image, which are sent with a frequency and a number of bits depending on their importance. For example, at a first instance the data needs to be transmitted at 15 fps, and at another instance the data needs to be transmitted at 1 fps. Also, objects that have been sent before may be sent at even greater intervals. The importance of different parts of the image may be determined by the object detection techniques. For instance, faces of humans, number plates of vehicles, etc. are the most important forensic information and can be sent in higher resolution and at high frequency, while other parts of the humans or vehicles do not need as many bits to transfer and can be sent infrequently and/or in a lower number of bits.
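The adaptive choice between the two residuals can be sketched as follows, with motion compensation omitted for brevity and OpenCV assumed: both candidate residuals are encoded and the smaller one is kept, mirroring the "least number of bits" rule described above.
import cv2

def encode_residual(residual, quality=50):
    # Encode a residual image as JPEG and return its byte size as a cost proxy.
    ok, buf = cv2.imencode(".jpg", residual, [cv2.IMWRITE_JPEG_QUALITY, quality])
    return buf.tobytes() if ok else b""

def choose_residual(curr, prev, background):
    # Small motion: the difference from the previous frame is usually cheaper.
    temporal = encode_residual(cv2.absdiff(curr, prev))
    # Large motion: the difference from the background model is usually cheaper.
    spatial = encode_residual(cv2.absdiff(curr, background))
    if len(temporal) <= len(spatial):
        return "previous-frame", temporal
    return "background", spatial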
[0088] Although FIG. 2 shows exemplary units of the coding unit 104, it is to be understood that other embodiments are not limited thereto. In other embodiments, the coding unit 104 may include fewer or more units. Further, the labels or names of the units are used only for illustrative purposes and do not limit the scope of the invention. One or more units can be combined together to perform the same or a substantially similar function to process the image data in the coding unit 104.
[0089] FIG. 3 is a flow diagram 300 illustrating a method for controlling the rate of transmission of each of the blocks of the image frames, according to an embodiment as disclosed herein. At 302, the method includes receiving the plurality of image frames. In an embodiment, the method allows the frame processing unit 104a to receive the plurality of image frames. At 304, the method includes detecting the rate of motion of the object in each block of the image frames. In an embodiment, the method allows the object detector 104d to detect the rate of motion of the object in each block of the image frames. The operation at 304 is explained in conjunction with FIG. 5. At 306, the method includes controlling the rate of transmission of each of the blocks of the image frames. In an embodiment, the method allows the block estimator 104f to control the rate of transmission of each of the blocks of the image frames. The operation at 306 is explained in conjunction with FIG. 7.
[0090] In an example, the images of important objects which are clearly visible from a viewpoint and do not change much over a period of time need not be transmitted frequently. In another example, in a sitting lounge, the people in an area may not change their positions very often, so the proposed method does not repeatedly send the high-resolution frontal images of their faces over the network. This reduces the bandwidth usage.
[0091] The proposed method can be used to reduce the number of bits transmitted across the network, as compression of an "all zero" block and an "all foreground" block is easier for schemes such as JPEG, which utilize the spatial smoothness of the image data. After subtracting the background portion, if a block contains portions of both the foreground and the background (typically occurring along the boundaries of the foreground portions), the compression scheme needs to use considerable information to model the change from the background portion to the foreground portion within the block.
[0092] The method can be used to reduce distortion in the pixel values in the block and improve the quality of the foreground portion and the background portion.
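A simple sketch of keeping every compression block purely background or purely foreground, so the encoder sees either an "all zero" block or an "all foreground" block, is given below; the 16x16 block size and the majority-vote labelling rule are assumptions made for the example.
import numpy as np

def purify_blocks(frame, fg_mask, block=16):
    # fg_mask: nonzero where a pixel belongs to the foreground.
    # Blocks that are mostly background are zeroed so the encoder sees "all zero" blocks.
    out = frame.copy()
    h, w = fg_mask.shape[:2]
    for y in range(0, h - h % block, block):
        for x in range(0, w - w % block, block):
            tile = fg_mask[y:y + block, x:x + block]
            if np.count_nonzero(tile) < tile.size // 2:
                out[y:y + block, x:x + block] = 0
    return out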
[0093] The various actions, acts, blocks, steps, or the like in the flow diagram 300 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some of the actions, acts, blocks, steps, or the like may be omitted, added, modified, skipped, or the like without departing from the scope of the invention.
[0094] FIG. 4 is a flow diagram 400 illustrating a method for transmitting one or more portions of the image frames corresponding to the differential motion data over the network, according to an embodiment as disclosed herein. At 402, the method includes receiving the plurality of image frames. In an embodiment, the method allows the frame processing unit 104a to receive the plurality of image frames. At 404, the method includes determining the differential motion data of the foreground object among the plurality of image frames. In an embodiment, the method allows the block estimator 104f to determine the differential motion data of the foreground object among the plurality of image frames. At 406, the method includes transmitting the portion of the image frames corresponding to the differential motion data more frequently than the remaining image frames in the plurality of image frames. In an embodiment, the method allows the coding unit 104 to transmit the portions of the image frames corresponding to the differential motion data more frequently than the remaining image frames in the plurality of image frames.
[0095] In an embodiment, the differential motion data corresponds to the movement of the object or position changes of the object.
[0096] At 408, the method includes controlling the rate of transmission over the network. In an embodiment, the method allows the coding unit 104 to control the rate of transmission over the network.
[0097] In an embodiment, the remaining portions of the image frames are visible from the viewpoint and do not change frequently over the period of time.
[0098] In an example, people sit in a reception area. The reception area includes a flower pot and a set of chairs. The flower pot and the set of chairs do not change their positions, and can be considered the background portion. The people sitting in the reception area may change their positions, and can be considered the foreground portion. The change of position of the people is treated as the differential motion data. The block estimator 104f transmits the blocks of the foreground portion corresponding to the differential motion data over the network by using an object tracking scheme. The flower pot and the set of chairs do not change their positions frequently over a period of time. This avoids repeatedly transmitting high-resolution images over the network, which reduces the bandwidth usage over the network.
[0099] The various actions, acts, blocks, steps, or the like in the flow diagram 400 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some of the actions, acts, blocks, steps, or the like may be omitted, added, modified, skipped, or the like without departing from the scope of the invention.
[00100] FIG. 5 is a flow diagram 500 illustrating various operations performed to detect the rate of motion of the objects in the set of blocks, according to an embodiment as disclosed herein. At 502, the method includes segmenting each of the image frames into the plurality of blocks. In an embodiment, the method allows the segmentation unit 104b to segment each of the image frames into the plurality of blocks. At 504, the method includes detecting the set of blocks from the plurality of blocks that includes the frequent motion of the object in each of the image frames. In an embodiment, the method allows the block estimator 104f to detect the set of blocks from the plurality of blocks that includes the frequent motion of the object in each of the image frames. At 506, the method includes detecting the rate of motion of the object in the detected set of blocks. In an embodiment, the method allows the block estimator 104f to detect the rate of motion of the object in the detected set of blocks.
[00101] The various actions, acts, blocks, steps, or the like in the flow diagram 500 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some of the actions, acts, blocks, steps, or the like may be omitted, added, modified, skipped, or the like without departing from the scope of the invention.
[00102] FIG. 6 illustrates an example in which the image data is processed in a video surveillance environment, according to an embodiment as disclosed herein. The scene captured by the electronic device 100 is as shown in the notation "A". The frame processing unit 104a receives the image frames. The image frames include the foreground portion and the background portion. The segmentation unit 104b determines the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames. The segmentation unit 104b divides the background portion and the foreground portion as shown in the notations "b and c". Further, the object detector 104d detects the movable items and non-movable items in the foreground portion. The movable items are people, as shown in FIG. 6. The non-movable items are the steps and wooden materials. Based on detecting the movable items, the image analyzer 104e detects the important block (e.g., faces) of the movable items as shown in the notations "d and e". The image analyzer 104e processes the important blocks corresponding to the movable items.
[00103] FIG. 7 is a flow diagram 700 illustrating various operations performed to transmit the blocks of the image frames corresponding to the differential motion data over the network, according to an embodiment as disclosed herein. At 702, the method includes determining the differential motion data of the object among the image frames based on the rate of motion of the object. In an embodiment, the method allows the block estimator 104f to determine the differential motion data of the object among the image frames based on the rate of motion of the object. At 704, the method includes transmitting the blocks of the image frames corresponding to the differential motion data over the network. In an embodiment, the method allows the coding unit 104 to transmit the blocks of the image frames corresponding to the differential motion data over the network. The operation at 704 is explained in conjunction with FIG. 8a and FIG. 8b.
[00104] In an example, the coding unit 104 transmits background data very rarely as the differential motion data does not occur in the background portion.
[00105] In another example, the coding unit 104 transmits foreground data frequently, as the differential motion data frequently occurs in the foreground portion.
[00106] The various actions, acts, blocks, steps, or the like in the flow diagram 700 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some of the actions, acts, blocks, steps, or the like may be omitted, added, modified, skipped, or the like without departing from the scope of the invention.
[00107] FIGS. 8a and 8b are flow diagrams 800a and 800b illustrating various operations performed to transmit the set of blocks over the network, according to an embodiment as disclosed herein. As shown in the FIG. 8a, at 802a, the method includes determining the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames based on the differential motion data. In an embodiment, the method allows the block estimator 104f to determine the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames based on the differential motion data.
[00108] At 804a, the method includes processing the set of blocks corresponding to the background portion to have the quality lower than that of the set of blocks corresponding to the foreground portion. In an embodiment, the method allows the compressor 104c to process the set of blocks corresponding to the background portion to have the quality lower than that of the set of blocks corresponding to the foreground portion.
[00109] At 806a, the method includes transmitting the processed set of blocks over the network. In an embodiment, the method allows the block estimator 104f to transmit the processed set of blocks over the network.
[00110] As shown in FIG. 8b, at 802b, the method includes determining the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames based on the differential motion data. In an embodiment, the method allows the object detector 104d to determine the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion in each of the image frames based on the differential motion data.
[00111] At 804b, the method includes detecting the important block of the object from the set of blocks corresponding to the foreground portion based on the plurality of parameters. In an embodiment, the method allows the object detector 104d to detect the important block of the object from the set of blocks corresponding to the foreground portion based on the plurality of parameters.
[00112] In an example, faces are one of the most important pieces of forensic data (i.e., important blocks) needed by a security agency. The faces may be extracted using various techniques (e.g., machine learning techniques such as Histogram of Oriented Gradients with Support Vector Machines (HOG-SVM) and neural networks). Further, the portion of the faces is estimated using a background-subtracted blob scheme. Further, the faces corresponding to the foreground portion are processed in each of the image frames.
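As an illustrative stand-in for the HOG-SVM approach mentioned above, the sketch below uses dlib's HOG-plus-linear-SVM frontal face detector (assuming dlib and OpenCV are installed) to extract face crops, which can then be aligned to the compression grid and re-compressed as described earlier.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()  # HOG features with a linear SVM classifier

def face_crops(frame_bgr):
    # Return the detected face regions of a BGR frame as a list of image crops.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    crops = []
    for rect in detector(gray):
        x0, y0 = max(rect.left(), 0), max(rect.top(), 0)
        crops.append(frame_bgr[y0:rect.bottom(), x0:rect.right()])
    return crops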
[00113] At 806b, the method includes processing the important blocks corresponding to the foreground portion in each of the image frames. In an embodiment, the method allows the image analyzer 104e to process the important blocks corresponding to the foreground portion in each of the image frames. At 808b, the method includes transmitting the processed important blocks over the network. In an embodiment, the method allows the block estimator 104f to transmit the processed important blocks over the network.
[00114] In another example, in a traffic scenario, one of the most important pieces of information required by the security agency is the number plate of a vehicle. This information can again be extracted using several techniques, such as machine learning techniques. Here, the important block is the number plate, and the block estimator 104f transmits only the processed number plate over the network.
[00115] The various actions, acts, blocks, steps, or the like in the flow diagrams 800a and 800b may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some of the actions, acts, blocks, steps, or the like may be omitted, added, modified, skipped, or the like without departing from the scope of the invention.
[00116] FIG. 9 illustrates a computing environment 902 implementing the method for managing the image data, according to an embodiment as disclosed herein. As depicted in FIG. 9, the computing environment 902 comprises at least one processing unit 908 that is equipped with a control unit 904, an Arithmetic Logic Unit (ALU) 906, a memory 910, a storage unit 912, a plurality of networking devices 916 and a plurality of input/output (I/O) devices 914. The processing unit 908 is responsible for processing the instructions of the technique. The processing unit 908 receives commands from the control unit 904 in order to perform its processing. Further, any logical and arithmetic operations involved in the execution of the instructions are computed with the help of the ALU 906.
[00117] The overall computing environment 902 can be composed of multiple homogeneous or heterogeneous cores, multiple CPUs of different kinds, special media and other accelerators. Further, the processing units may be located on a single chip or over multiple chips.
[00118] The technique comprising of instructions and codes required for the implementation are stored in either the memory unit 910 or the storage 912 or both. At the time of execution, the instructions may be fetched from the corresponding memory 910 or storage 912, and executed by the processing unit 908.
[00119] In the case of hardware implementations, various networking devices 916 or external I/O devices 914 may be connected to the computing environment 902 to support the implementation through the networking unit and the I/O device unit.
[00120] The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the elements. The elements shown in the FIGS. 1 to 9 include blocks, elements, actions, acts, steps, or the like which can be at least one of a hardware device, or a combination of hardware device and software module.
[00121] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.


STATEMENT OF CLAIMS
We claim:
1. A method for managing image data at an electronic device, the method comprising:
receiving, by a coding unit, a plurality of image frames;
detecting, by the coding unit, a rate of motion of at least one object in each block of the image frames; and
controlling, by the coding unit, a rate of transmission of each of the blocks of the image frames based on the rate of motion of the at least one object.
2. The method of claim 1, wherein controlling the rate of transmission of each of the blocks of the image frames based on the rate of motion of the at least one object comprises:
determining a differential motion data of the at least one object among the image frames based on the rate of motion of the at least one object; and
transmitting the blocks of the image frames corresponding to the differential motion data over a network.
3. The method of claim 2, wherein transmitting the blocks of the image frames corresponding to the differential motion data over the network comprises:
determining a set of blocks corresponding to a background portion and a set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data;
processing at least one of the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion; and
transmitting the processed set of blocks over the network.
4. The method of claim 3, wherein the processing is performed by cropping a boundary of at least one of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion.
5. The method of claim 4, wherein the boundary of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion are aligned with a boundary used by a compression technique.
6. The method of claim 2, wherein transmitting the blocks corresponding to the differential motion data over the network comprises:
determining a set of blocks corresponding to a background portion and a set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data;
detecting an important block of the at least one object from the set of blocks corresponding to the foreground portion based on a plurality of parameters;
processing the important blocks corresponding to the foreground portion in each of the image frames; and
transmitting the processed important blocks over the network.
7. The method of claim 6, wherein the parameters comprise at least one of an availability of the at least one object in each of the image frames, information already transmitted corresponding to the at least one object, a rate of motion of the at least one object in each of the image frames, resolution, and an output from an object detection technique.
8. The method of claim 6, wherein the processing is performed by cropping a boundary of the important blocks corresponding to the foreground portion, wherein the boundary of the important blocks is aligned with a boundary used by a compression technique.
9. The method of claim 1, wherein detecting the rate of motion of the at least one object in each block of each of the image frames comprises:
segmenting each of the image frames into a plurality of blocks;
detecting a set of blocks from the plurality of blocks that includes frequent motion of the at least one object in each of the image frames; and
detecting the rate of motion of the at least one object in the detected set of blocks.
10. A method for managing image data at an electronic device, the method comprising:
receiving, by a coding unit, a plurality of image frames;
determining, by the coding unit, a differential motion data of at least one object in the plurality of image frames; and
controlling, by the coding unit, a rate of transmission by transmitting at least one portion of at least one image frame corresponding to the differential motion data more frequently than the remaining image frames in the plurality of image frames.
11. The method of claim 10, wherein the remaining portions of the image frames are visible from a viewpoint and do not change frequently over a period of time.
12. The method of claim 10, wherein transmitting the at least one portion of the at least one image frame corresponding to the differential motion data comprises:
determining a set of blocks corresponding to a background portion and a set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data;
processing the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion; and
transmitting the processed set of blocks over a network.
13. The method of claim 12, wherein the processing is performed by cropping a boundary of at least one of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion.
14. The method of claim 13, wherein the boundary of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion are aligned with a boundary used by a compression technique.
15. The method of claim 10, wherein transmitting the at least one portion of the at least one image frame corresponding to the differential motion data comprises:
determining a set of blocks corresponding to a background portion and a set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data;
detecting at least one important block of the at least one object from the set of blocks corresponding to the foreground portion based on a plurality of parameters;
processing the important blocks corresponding to the foreground portion in each of the image frames; and
transmitting the processed important blocks over a network.
16. The method of claim 15, wherein the parameters comprise at least one of an availability of the at least one object in each of the image frames, information already transmitted corresponding to the at least one object, a rate of motion of the at least one object in each of the image frames, resolution, and an output from an object detection technique.
17. The method of claim 15, wherein the processing is performed by cropping a boundary of the important blocks corresponding to the foreground portion, wherein the boundary of the important blocks is aligned with a boundary used by a compression technique.
18. An electronic device for managing image data, the electronic device comprising:
a memory;
a processor; and
a coding unit, in communication with the memory and the processor, configured to:
receive a plurality of image frames;
detect a rate of motion of at least one object in each block of the image frames; and
control a rate of transmission of each of the blocks of the image frames based on the rate of motion of the at least one object.
19. The electronic device of claim 18, wherein, to control the rate of transmission of each of the blocks of the image frames based on the rate of motion of the at least one object, the coding unit is configured to:
determine a differential motion data of the at least one object among the image frames based on the rate of motion of the at least one object, and
transmit the blocks of the image frames corresponding to the differential motion data over a network.
20. The electronic device of claim 19, wherein, to transmit the blocks of the image frames corresponding to the differential motion data over the network, the coding unit is configured to:
determine a set of blocks corresponding to a background portion and a set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data,
process at least one of the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion, and
transmit the processed set of blocks over the network.
21. The electronic device of claim 20, wherein the coding unit is configured to process at least one of the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion by cropping a boundary of at least one of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion.
22. The electronic device of claim 21, wherein the boundary of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion are aligned with a boundary used by a compression technique.
23. The electronic device of claim 19, wherein, to transmit the blocks corresponding to the differential motion data over the network, the coding unit is configured to:
determine a set of blocks corresponding to a background portion and a set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data,
detect an important block of the at least one object from the set of blocks corresponding to the foreground portion based on a plurality of parameters,
process the important blocks corresponding to the foreground portion in each of the image frames, and
transmit the processed important blocks over the network.
24. The electronic device of claim 23, wherein the parameters comprise at least one of an availability of the at least one object in each of the image frames, information already transmitted corresponding to the at least one object, a rate of motion of the at least one object in each of the image frames, resolution, and an output from an object detection technique.
25. The electronic device of claim 23, wherein the coding unit is configured to process the important blocks corresponding to the foreground portion in each of the image frames by cropping a boundary of the important blocks corresponding to the foreground portion, wherein the boundary of the important blocks is aligned with a boundary used by a compression technique.
26. The electronic device of claim 18, wherein, to detect the rate of motion of the at least one object in each of the blocks of each of the image frames, the coding unit is configured to:
segment each of the image frames into a plurality of blocks,
detect a set of blocks from the plurality of blocks that includes frequent motion of the at least one object in each of the image frames, and
detect the rate of motion of the at least one object in the detected set of blocks.
27. An electronic device for managing image data, the electronic device comprising:
a memory;
a processor; and
a coding unit, in communication with the memory and the processor, configured to:
receive a plurality of image frames,
determine a differential motion data of at least one object in the plurality of image frames, and
control a rate of transmission by transmitting at least one portion of at least one image frame corresponding to the differential motion data more frequently than the remaining image frames in the plurality of image frames.
28. The electronic device of claim 27, wherein the remaining portions of the image frames are visible from a viewpoint and do not change frequently over a period of time.
29. The electronic device of claim 27, wherein, to transmit the at least one portion of the at least one image frame corresponding to the differential motion data, the coding unit is configured to:
determine a set of blocks corresponding to a background portion and a set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data;
process the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion; and
transmit the processed set of blocks over a network.
30. The electronic device of claim 29, wherein the coding unit is configured to process the set of blocks corresponding to the background portion to have a quality lower than that of the set of blocks corresponding to the foreground portion by cropping a boundary of at least one of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion.
31. The electronic device of claim 30, wherein the boundary of the set of blocks corresponding to the background portion and the set of blocks corresponding to the foreground portion are aligned with a boundary used by a compression technique.
32. The electronic device of claim 29, wherein, to transmit the at least one portion of the at least one image frame corresponding to the differential motion data, the coding unit is configured to:
determine a set of blocks corresponding to a background portion and a set of blocks corresponding to a foreground portion in each of the image frames based on the differential motion data,
detect at least one important block of the at least one object from the set of blocks corresponding to the foreground portion based on a plurality of parameters,
process the important blocks corresponding to the foreground portion in each of the image frames, and
transmit the processed important blocks over a network.
33. The electronic device of claim 32, wherein the parameters comprise at least one of an availability of the at least one object in each of the image frames, information already transmitted corresponding to the at least one object, a rate of motion of the at least one object in each of the image frames, resolution, and an output from an object detection technique.
34. The electronic device of claim 32, wherein the coding unit is configured to process the important blocks corresponding to the foreground portion in each of the image frames by cropping a boundary of the important blocks corresponding to the foreground portion, wherein the boundary of the important blocks is aligned with a boundary used by a compression technique.
PCT/IN2017/050092 2016-03-16 2017-03-15 Method for managing image data at electronic device Ceased WO2017158622A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201641009144 2016-03-16
IN201641009144 2016-03-16

Publications (2)

Publication Number Publication Date
WO2017158622A2 true WO2017158622A2 (en) 2017-09-21
WO2017158622A3 WO2017158622A3 (en) 2018-08-09

Family

ID=59850788

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2017/050092 Ceased WO2017158622A2 (en) 2016-03-16 2017-03-15 Method for managing image data at electronic device

Country Status (1)

Country Link
WO (1) WO2017158622A2 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100292138B1 (en) * 1993-07-12 2002-06-20 이데이 노부유끼 Transmitter and Receiver for Digital Video Signal
US6356664B1 (en) * 1999-02-24 2002-03-12 International Business Machines Corporation Selective reduction of video data using variable sampling rates based on importance within the image
US9049447B2 (en) * 2010-12-30 2015-06-02 Pelco, Inc. Video coding
US9159137B2 (en) * 2013-10-14 2015-10-13 National Taipei University Of Technology Probabilistic neural network based moving object detection method and an apparatus using the same

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11487288B2 (en) 2017-03-23 2022-11-01 Tesla, Inc. Data synthesis for autonomous control systems
US12020476B2 (en) 2017-03-23 2024-06-25 Tesla, Inc. Data synthesis for autonomous control systems
US11403069B2 (en) 2017-07-24 2022-08-02 Tesla, Inc. Accelerated mathematical engine
US11409692B2 (en) 2017-07-24 2022-08-09 Tesla, Inc. Vector computational unit
US11893393B2 (en) 2017-07-24 2024-02-06 Tesla, Inc. Computational array microprocessor system with hardware arbiter managing memory requests
US11681649B2 (en) 2017-07-24 2023-06-20 Tesla, Inc. Computational array microprocessor system using non-consecutive data formatting
US12216610B2 (en) 2017-07-24 2025-02-04 Tesla, Inc. Computational array microprocessor system using non-consecutive data formatting
US12086097B2 (en) 2017-07-24 2024-09-10 Tesla, Inc. Vector computational unit
US12307350B2 (en) 2018-01-04 2025-05-20 Tesla, Inc. Systems and methods for hardware-based pooling
US12455739B2 (en) 2018-02-01 2025-10-28 Tesla, Inc. Instruction set architecture for a vector computational unit
US11561791B2 (en) 2018-02-01 2023-01-24 Tesla, Inc. Vector computational unit receiving data elements in parallel from a last row of a computational array
US11797304B2 (en) 2018-02-01 2023-10-24 Tesla, Inc. Instruction set architecture for a vector computational unit
US11734562B2 (en) 2018-06-20 2023-08-22 Tesla, Inc. Data pipeline and deep learning system for autonomous driving
US11841434B2 (en) 2018-07-20 2023-12-12 Tesla, Inc. Annotation cross-labeling for autonomous control systems
US12079723B2 (en) 2018-07-26 2024-09-03 Tesla, Inc. Optimizing neural network structures for embedded systems
US11636333B2 (en) 2018-07-26 2023-04-25 Tesla, Inc. Optimizing neural network structures for embedded systems
US11983630B2 (en) 2018-09-03 2024-05-14 Tesla, Inc. Neural networks for embedded devices
US12346816B2 (en) 2018-09-03 2025-07-01 Tesla, Inc. Neural networks for embedded devices
US11562231B2 (en) 2018-09-03 2023-01-24 Tesla, Inc. Neural networks for embedded devices
US11893774B2 (en) 2018-10-11 2024-02-06 Tesla, Inc. Systems and methods for training machine models with augmented data
US11665108B2 (en) 2018-10-25 2023-05-30 Tesla, Inc. QoS manager for system on a chip communications
US11816585B2 (en) 2018-12-03 2023-11-14 Tesla, Inc. Machine learning models operating at different frequencies for autonomous vehicles
US12367405B2 (en) 2018-12-03 2025-07-22 Tesla, Inc. Machine learning models operating at different frequencies for autonomous vehicles
US11908171B2 (en) 2018-12-04 2024-02-20 Tesla, Inc. Enhanced object detection for autonomous vehicles based on field view
US11537811B2 (en) 2018-12-04 2022-12-27 Tesla, Inc. Enhanced object detection for autonomous vehicles based on field view
US12198396B2 (en) 2018-12-04 2025-01-14 Tesla, Inc. Enhanced object detection for autonomous vehicles based on field view
US11610117B2 (en) 2018-12-27 2023-03-21 Tesla, Inc. System and method for adapting a neural network model on a hardware platform
US12136030B2 (en) 2018-12-27 2024-11-05 Tesla, Inc. System and method for adapting a neural network model on a hardware platform
US12014553B2 (en) 2019-02-01 2024-06-18 Tesla, Inc. Predicting three-dimensional features for autonomous driving
US11748620B2 (en) 2019-02-01 2023-09-05 Tesla, Inc. Generating ground truth for machine learning from time series elements
US12223428B2 (en) 2019-02-01 2025-02-11 Tesla, Inc. Generating ground truth for machine learning from time series elements
US11567514B2 (en) 2019-02-11 2023-01-31 Tesla, Inc. Autonomous and user controlled vehicle summon to a target
US12164310B2 (en) 2019-02-11 2024-12-10 Tesla, Inc. Autonomous and user controlled vehicle summon to a target
US12236689B2 (en) 2019-02-19 2025-02-25 Tesla, Inc. Estimating object properties using visual image data
US11790664B2 (en) 2019-02-19 2023-10-17 Tesla, Inc. Estimating object properties using visual image data
US12462575B2 (en) 2021-08-19 2025-11-04 Tesla, Inc. Vision-based machine learning model for autonomous driving with adjustable virtual camera
US12118755B2 (en) 2021-12-07 2024-10-15 International Business Machines Corporation Stochastic compression of raster data

Also Published As

Publication number Publication date
WO2017158622A3 (en) 2018-08-09

Similar Documents

Publication Publication Date Title
WO2017158622A2 (en) Method for managing image data at electronic device
US10924761B2 (en) Encoding a privacy masked image into an encoded image frame
US10893283B2 (en) Real-time adaptive video denoiser with moving object detection
JP6169276B2 (en) Temporal noise reduction method and related apparatus for noise images
US9936208B1 (en) Adaptive power and quality control for video encoders on mobile devices
JP4567733B2 (en) Method and apparatus for motion vector processing
JP5662023B2 (en) Method and apparatus for detecting banding artifacts in digital video content, and program storage device having application program
TWI613910B (en) Method and encoder for video encoding of a sequence of frames
KR100721543B1 (en) Image Processing Method and System for Removing Noise Using Statistical Information
CN105472205B (en) Real-time video noise reduction method and device in encoding process
US20130279598A1 (en) Method and Apparatus For Video Compression of Stationary Scenes
CN101262559A (en) A method and device for eliminating sequential image noise
JPWO2018061976A1 (en) Image processing device
US20140363045A1 (en) Precipitation removal for vision-based parking management systems
US10049436B1 (en) Adaptive denoising for real-time video on mobile devices
CN115082326A (en) Processing method for deblurring video, edge computing equipment and central processor
US10405003B2 (en) Image compression based on semantic relevance
CN112929701B (en) Video coding method, device, equipment and medium
JP2021013145A (en) Video transmission device, video transmission method
US11716475B2 (en) Image processing device and method of pre-processing images of a video stream before encoding
CN112911299A (en) Video code rate control method and device, electronic equipment and storage medium
JP2019149721A (en) Moving image coding apparatus, control method of the same, and program
EP4641496A1 (en) Power-efficient video conferencing
Zhou et al. Efficient adaptive MRF-MAP error concealment of video sequences
KR20250010427A (en) Video data processing technology for reducing transmission bandwidth

Legal Events

Date Code Title Description
NENP Non-entry into the national phase in:

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17766002

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 17766002

Country of ref document: EP

Kind code of ref document: A2