
WO2025071563A1 - Systems and methods for rolling band detection and mitigation - Google Patents

Systems and methods for rolling band detection and mitigation

Info

Publication number
WO2025071563A1
WO2025071563A1 (PCT/US2023/034065)
Authority
WO
WIPO (PCT)
Prior art keywords
computer
video
determining
implemented method
band
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2023/034065
Other languages
English (en)
Inventor
Qi MAO
Gazi Yamin IQBAL
Hung-Hsin Wu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC filed Critical Google LLC
Priority to PCT/US2023/034065 priority Critical patent/WO2025071563A1/fr
Publication of WO2025071563A1 publication Critical patent/WO2025071563A1/fr
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/745 Detection of flicker frequency or suppression of flicker wherein the flicker is caused by illumination, e.g. due to fluorescent tube illumination or pulsed LED illumination
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation

Definitions

  • Many modern computing devices include image capture devices, such as still and/or video cameras.
  • the image capture devices can capture images, such as images that include people, animals, landscapes, and/or objects.
  • the image capture devices can include an image sensor.
  • the image sensor includes a plurality of light-sensing pixels that measure an intensity of light incident thereon and thereby collectively capture an image of an environment.
  • the intensity of light may have a high dynamic range based on the environment.
  • This application generally relates to detecting and correcting rolling bands in live capture videos.
  • a rolling band may occur when recording a video on a smartphone. It is generally caused by a mismatch between the frame rate of the video and the frequency of the ambient lighting that is being used to capture a scene. This can result in horizontal lines or bands that appear to be moving across the video.
  • Some existing technologies are configured to detect a rolling band based on row sums of pixel intensities, and then increase the exposure time of the camera to remove and/or reduce the banding. However, this can result in a reduction of the dynamic range (e.g., in indoor bright scenes), and users are likely to see bright areas of the image as over-exposed.
  • detection of banding based on the row sums alone may be unreliable. For example, changes in row sums may be caused by a moving object.
  • a frequency of the banding may be compared to a frequency of the ambient light to eliminate a false detection of banding, such as the one caused by motion.
  • Techniques that address a rolling band based on flickering are unable to remove the rolling band from all areas of the image, and may also lead to a degradation of image quality. Accordingly, there is a technical problem that involves detection, and reduction and/or removal, of rolling bands in videos.
  • a computer-implemented method includes receiving, from an image sensor of an image capturing device, aggregated light intensity data based on respective pixel arrays for two consecutive frames of a video. The method also includes determining a pattern of light intensities for the two consecutive frames based on the aggregated light intensity data. The method further includes receiving, from an ambient light sensor, ambient data associated with the two consecutive frames. The method also includes determining, based on the ambient data, a flicker measurement indicating an amount of flicker in the video. The method additionally includes correlating the pattern of light intensities and the amount of flicker to determine a presence or an absence of a rolling band in the video.
  • a system may include one or more processors.
  • the system may also include data storage, where the data storage has stored thereon computer-executable instructions that, when executed by the one or more processors, cause the system to carry out operations.
  • the operations may include receiving, from an image sensor of an image capturing device, aggregated light intensity data based on respective pixel arrays for two consecutive frames of a video.
  • the operations may also include determining a pattern of light intensities for the two consecutive frames based on the aggregated light intensity data.
  • the operations may additionally include receiving, from an ambient light sensor, ambient data associated with the two consecutive frames.
  • the operations may also include determining, based on the ambient data, a flicker measurement indicating an amount of flicker in the video.
  • the operations may further include correlating the pattern of light intensities and the amount of flicker to determine a presence or an absence of a rolling band in the video.
  • a computing device includes one or more processors and data storage that has stored thereon computer-executable instructions that, when executed by the one or more processors, cause the computing device to carry out operations.
  • the operations may include receiving, from an image sensor of an image capturing device, aggregated light intensity data based on respective pixel arrays for two consecutive frames of a video.
  • the operations may also include determining a pattern of light intensities for the two consecutive frames based on the aggregated light intensity data.
  • the operations may additionally include receiving, from an ambient light sensor, ambient data associated with the two consecutive frames.
  • the operations may also include determining, based on the ambient data, a flicker measurement indicating an amount of flicker in the video.
  • the operations may further include correlating the pattern of light intensities and the amount of flicker to determine a presence or an absence of a rolling band in the video.
  • an article of manufacture may include a non-transitory computer-readable medium having stored thereon program instructions that, upon execution by one or more processors of a computing device, cause the computing device to carry out operations.
  • the operations may include receiving, from an image sensor of an image capturing device, aggregated light intensity data based on respective pixel arrays for two consecutive frames of a video.
  • the operations may also include determining a pattern of light intensities for the two consecutive frames based on the aggregated light intensity data.
  • the operations may additionally include receiving, from an ambient light sensor, ambient data associated with the two consecutive frames.
  • the operations may also include determining, based on the ambient data, a flicker measurement indicating an amount of flicker in the video.
  • the operations may further include correlating the pattern of light intensities and the amount of flicker to determine a presence or an absence of a rolling band in the video.
  • Figure 1A is an example illustration of a presence of a rolling band, in accordance with example embodiments.
  • Figure 1B is an example illustration of an absence of a rolling band, in accordance with example embodiments.
  • Figure 2 is an example overview of a processing pipeline for rolling band detection and correction, in accordance with example embodiments.
  • Figure 3 is a block diagram of an example computing device, in accordance with example embodiments.
  • Figure 4 is a flowchart of a method, in accordance with example embodiments.
  • Figure 5 is another flowchart of a method, in accordance with example embodiments.
  • Example methods, devices, and systems are described herein. It should be understood that the words “example” and “exemplary” are used herein to mean “serving as an example, instance, or illustration.” Any embodiment or feature described herein as being an “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments or features. Other embodiments can be utilized, and other changes can be made, without departing from the scope of the subject matter presented herein.
  • Some existing approaches to detecting a rolling band may involve determining row sums of pixel intensities in successive frames of a video. For example, row-wise changes in pixel intensities can be used to detect whether or not there is banding.
  • Some other approaches to detecting a rolling band may involve using data from an ambient light sensor (ALS) of a camera. For example, data from the ALS can be used to determine a frequency of ambient light.
  • detection of banding based on row sums alone, or on ALS data alone, may be unreliable.
  • changes in row sums, and/or changes in ALS data may be caused by motion (e.g., motion of an object in the scene being captured, movement of the camera, etc.).
  • existing approaches to correction of the banding generally involve increasing the exposure time of the camera. However, this may be unnecessary in some situations, and/or may result in a reduction of the dynamic range in other situations.
  • an effective approach to detection of a rolling band is based on a combination of an analysis of row sums and an analysis of the data from an ALS. For example, a first frequency measurement of the ambient light and a second frequency measurement based on the pixel sums may be compared to detect banding, an extent of the detected banding, and whether correction of the banding is needed to resolve a visible banding. For example, row-wise changes in pixel intensities can be used to detect whether or not there is banding. In some instances, a sinusoidal curve can indicate banding, whereas a substantially linear function can indicate an absence of banding. Also, for example, a frequency of the banding, if present, can be determined from the sinusoidal curve.
  • a visibility score can be assigned to the banding to determine an extent to which banding is visible.
  • data from the ALS can be used to determine a frequency of flickering in the ambient light, and a confidence level of such measurement.
  • the frequency of the flickering and the frequency of the banding may be compared to determine an amount of banding, and whether correction is needed to resolve a visible banding. Factors such as the visibility score and the confidence level of the flicker measurement inform this determination.
  • in some embodiments, it may be determined whether a rolling band is significant and/or visible enough to be corrected.
  • techniques for correction (e.g., removal and/or reduction) of the rolling band may then be applied.
  • the rolling band can be removed by increasing the exposure time of the camera.
  • this may lead to a loss of the dynamic range (e.g., low brightness regions are captured whereas high brightness regions are visible as white regions without any detail).
  • machine learning algorithms may be applied for object detection, segmentation, and/or masking to identify objects and/or regions that include image information that needs to be preserved. For example, a percentage of the image occupied by a bright region (e.g., a sunlit sky), a type of textured detail in an object, etc. may be determined.
  • image capture devices, such as cameras, may be integrated with wireless computing devices (e.g., mobile devices, such as mobile phones), tablet computers, laptop computers, video game interfaces, home automation devices, and automobiles and other types of vehicles.
  • the physical components of a camera may include one or more apertures through which light enters, one or more recording surfaces for capturing the images represented by the light, and lenses positioned in front of each aperture to focus at least part of the image on the recording surface(s).
  • the apertures may be of a fixed size or may be adjustable.
  • the recording surface may be a photographic film.
  • the recording surface may include an electronic image sensor (e.g., a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor) to transfer and/or store captured images in a data storage unit (e.g., memory).
  • One or more shutters may be coupled to, or positioned near, the lenses or the recording surfaces. Each shutter may either be in a closed position, in which it blocks light from reaching the recording surface, or an open position, in which light is allowed to reach the recording surface.
  • the position of each shutter may be controlled by a shutter button. For instance, a shutter may be in the closed position by default. When the shutter button is triggered (e.g., pressed), the shutter may change from the closed position to the open position for a period of time, known as the shutter cycle. During the shutter cycle, an image may be captured on the recording surface. At the end of the shutter cycle, the shutter may change back to the closed position.
  • the shuttering process may be electronic.
  • the sensor may be reset to remove any residual signal in its photodiodes. While the electronic shutter remains open, the photodiodes may accumulate charge. When or after the shutter closes, these charges may be transferred to longer-term data storage. Combinations of mechanical and electronic shuttering may also be possible.
  • a shutter may be activated and/or controlled by something other than a shutter button.
  • the shutter may be activated by a softkey, a timer, or some other trigger.
  • the term “capture” may refer to any mechanical and/or electronic shuttering process that results in one or more images being recorded, regardless of how the shuttering process is triggered or controlled.
  • the exposure of a captured image may be determined by a combination of the size of the aperture, the brightness of the light entering the aperture, and the length of the shutter cycle (also referred to as the shutter length, the exposure length, or the exposure time). Additionally, a digital and/or analog gain (e.g., based on an ISO setting) may be applied to the image, thereby influencing the exposure.
  • the term “exposure length,” “exposure time,” or “exposure time interval” may refer to the shutter length multiplied by the gain for a particular aperture size. Thus, these terms may be used somewhat interchangeably, and should be interpreted as possibly being a shutter length, an exposure time, and/or any other metric that controls the amount of signal response that results from light reaching the recording surface.
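The combined metric described above can be sketched as follows; the function name and units are illustrative assumptions, not from the application.

```python
def effective_exposure(shutter_length: float, gain: float) -> float:
    """Illustrative exposure metric: shutter length multiplied by gain.

    Units are arbitrary; a real pipeline may also fold in aperture size.
    """
    return shutter_length * gain

# Halving the shutter length while doubling the gain leaves the metric unchanged,
# which is why the terms can be used somewhat interchangeably.
print(effective_exposure(8, 2) == effective_exposure(4, 4))  # → True
```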
  • a camera may capture one or more still images each time image capture is triggered.
  • a camera may capture a video image by continuously capturing images at a particular rate (e.g., 24 frames per second) as long as image capture remains triggered (e.g., while the shutter button is held down).
  • Some cameras when operating in a mode to capture a still image, may open the shutter when the camera device or application is activated, and the shutter may remain in this position until the camera device or application is deactivated. While the shutter is open, the camera device or application may capture and display a representation of a scene on a viewfinder (sometimes referred to as displaying a “preview frame”). When image capture is triggered, one or more distinct payload images of the current scene may be captured.
  • Cameras, including digital and analog cameras, may include software to control one or more camera functions and/or settings, such as aperture size, exposure time, gain, and so on. Additionally, some cameras may include software that digitally processes images during or after image capture. While the description above refers to cameras in general, it may be particularly relevant to digital cameras. Digital cameras may be standalone devices (e.g., a DSLR camera) or may be integrated with other devices.
  • Either or both of a front-facing camera and a rear-facing camera may include or be associated with an ambient light sensor (ALS) that may continuously or from time to time determine the ambient brightness of a scene that the camera can capture.
  • the ALS can be used to adjust the display brightness of a screen associated with the camera (e.g., a viewfinder). When the determined ambient brightness is high, the brightness level of the screen may be increased to make the screen easier to view. When the determined ambient brightness is low, the brightness level of the screen may be decreased, also to make the screen easier to view as well as to potentially save power.
  • the ambient light sensor’s input may be used to determine an exposure time of an associated camera, or to help in this determination.
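As an illustration of the display-brightness adjustment described above, an ALS reading can be mapped to a screen brightness level. The lux breakpoints and log-scale interpolation below are assumptions, not taken from any particular device.

```python
import math

def screen_brightness(ambient_lux: float,
                      min_level: float = 0.1,
                      max_level: float = 1.0) -> float:
    """Map an ALS reading to a display brightness level in [min_level, max_level]."""
    low, high = 10.0, 10_000.0  # assumed dim-indoor and direct-sunlight breakpoints
    if ambient_lux <= low:
        return min_level
    if ambient_lux >= high:
        return max_level
    # interpolate on a log scale, since perceived brightness is roughly logarithmic
    t = (math.log10(ambient_lux) - math.log10(low)) / (math.log10(high) - math.log10(low))
    return min_level + t * (max_level - min_level)
```

A dim room thus yields the minimum level (easier viewing, lower power), while bright ambient light yields the maximum level.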
  • FIG. 1 A is an example illustration of a presence of a rolling band, in accordance with example embodiments.
  • a first computing device 105 is shown to display an image (e.g., a video) of a cat.
  • when a second computing device 110 is used to capture a video of the image displayed on the first computing device 105, one or more rolling bands in the form of horizontal bands 115 may appear on a display of the second computing device 110.
  • Figure 1B is an example illustration of an absence of a rolling band, in accordance with example embodiments.
  • in Figure 1B, the first computing device 105 is shown to display the image (e.g., a video) of the cat, without any perceptible rolling bands.
  • Figure 2 is an example overview of a processing pipeline for rolling band detection and correction, in accordance with example embodiments.
  • the operations may involve displaying, by a graphical user interface of a computing device, a preview video of a scene being captured.
  • the computing device may be a mobile device.
  • the graphical user interface may be an interface that displays a captured video (e.g., a pre-recorded video stored on the computing device).
  • the graphical user interface may be a live-view interface that displays a live-view preview of a video.
  • a "live-view preview" of a video should be understood to be a sequence of images (e.g., video) comprising consecutive frames that are generated and displayed based on an image data stream from an image sensor of an image-capture device.
  • image data may be generated by a camera's image sensor (or a portion, subset, or sampling of the pixels on the image sensor).
  • This image data is representative of the field-of-view (FOV) of the camera, and thus indicative of the sequence of images that will be captured if the user taps the camera's shutter button or initiates image capture in some other manner.
  • a camera device or other computing device may generate and display a live-view preview image based on the image data stream from the image sensor.
  • the live-view preview image can be a real-time image feed (e.g., video), such that the user is informed of the camera's FOV in real-time.
  • Block 205 involves receiving, from an image sensor of an image capturing device, a sensor row sum input.
  • the sensor row sum input may include aggregated light intensity data for two consecutive frames of a video.
  • the aggregated light intensity data can be any type of statistical data (e.g., a mean, median, statistical distribution, etc.) that indicates row-wise light intensities.
  • the aggregated light intensity data may include respective row-sums of pixel intensities in the two consecutive frames.
  • a sensor row sum is a technique used to measure an amount of light that is received by each row of pixels on an image sensor.
  • the image sensor may include a plurality of light-sensing pixels that measure an intensity of light incident thereon and thereby collectively capture an image of an environment.
  • the light-sensing pixels may be arranged in arrays, and may be grouped by position into pixel sensor groups.
  • an array of pixel sensors may be grouped into pixel sensor groups, where each pixel sensor group includes an array of pixel sensors.
  • Such pixel sensor groups may be further grouped to define regions of interest (ROIs).
  • Image processing circuitry groups may be configured to each receive pixel information from a corresponding pixel sensor group and further configured to perform image processing operations on the pixel information to provide processed pixel information during operation of the image sensor.
  • the input is typically either a single-channel image or a three-channel image consisting of red, green, and blue (RGB) color channels.
  • Such color images are commonly captured with an array of light sensors covered with different color filters, called a color filter array, which are generally arranged as a repeated mosaic.
  • color filters can be, for example, a Bayer filter, a modified Bayer filter such as RGBE where a green filter is modified to an “emerald” filter, a red-yellow-yellow-blue (RYYB) filter, a cyan-yellow-yellow-magenta (CYYM) filter, a cyan-yellow-green-magenta (CYGM) filter, various modifications of the Bayer filter (e.g., RGBW where a green filter is modified to a “white” filter, a Quad Bayer filter (comprising 4x blue, 4x red, and 8x green filters), RYYB Quad Bayer (comprising 4x blue, 4x red, and 8x yellow filters), nonacell (comprising 9x blue, 9x red, and 18x green filters), an RCCC filter (a monochrome sensor with a red channel to detect traffic lights, stop signs, etc.), and an RCCB filter (where the green pixels are clear)), and others.
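As an illustration of the repeated-mosaic arrangement described above, the standard Bayer pattern can be tiled across a sensor and its filter counts checked. The string codes and helper name below are illustrative, not taken from the application.

```python
import numpy as np

# One 2x2 tile of the standard Bayer filter: one red, two green, one blue.
BAYER_TILE = np.array([["R", "G"],
                       ["G", "B"]])

def bayer_cfa(rows: int, cols: int) -> np.ndarray:
    """Tile the 2x2 Bayer pattern into a rows x cols color filter array."""
    reps = (-(-rows // 2), -(-cols // 2))  # ceiling division for odd sizes
    return np.tile(BAYER_TILE, reps)[:rows, :cols]

cfa = bayer_cfa(4, 4)
# A 4x4 region contains 4 red, 8 green, and 4 blue filters (the 1:2:1 ratio).
print((cfa == "G").sum())  # → 8
```

The same tiling approach extends to the Quad Bayer or nonacell layouts mentioned above by swapping in a larger base tile.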
  • Some embodiments involve determining a pattern of light intensities for the two consecutive frames based on the aggregated light intensity data.
  • the pattern of light intensities may be based on a difference between the respective row-wise sums of light intensities in the respective pixel arrays.
  • a first row-wise delta difference may be determined based on a difference between a first sum of pixel intensities in a first row of a first frame, and a second sum of pixel intensities in a first row of a second frame (e.g., successive to the first frame).
  • a second row-wise delta difference may be determined based on a difference between a third sum of pixel intensities in a second row of the first frame, and a fourth sum of pixel intensities in a second row of the second frame (e.g., successive to the first frame). Continuing thus, row-wise delta differences may be determined for each row of the pixel array.
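The row-wise delta computation described above might be sketched as follows; the function name and the use of NumPy are assumptions, not taken from the application.

```python
import numpy as np

def row_delta_differences(first_frame: np.ndarray, second_frame: np.ndarray) -> np.ndarray:
    """Row-wise delta differences between two consecutive frames.

    Each frame is an HxW array of pixel intensities; the result holds one
    delta per row, as in the description above.
    """
    first_row_sums = first_frame.sum(axis=1)    # one sum per row, first frame
    second_row_sums = second_frame.sum(axis=1)  # one sum per row, second frame
    return second_row_sums - first_row_sums

# A uniform brightness change between frames yields identical deltas for every
# row: a flat signal, i.e., no banding.
frame_a = np.arange(16.0).reshape(4, 4)
frame_b = frame_a + 2.0
print(row_delta_differences(frame_a, frame_b))  # → [8. 8. 8. 8.]
```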
  • determining the pattern of light intensities may involve applying a fast Fourier transform (FFT) to such row-wise delta differences over successive frames.
  • FFT fast Fourier transform
  • This allows the aggregated light intensity data to be converted to a signal in a frequency domain for analysis.
  • when the delta differences are substantially similar (e.g., the signal is substantially linear or flat), this is indicative of an absence of a rolling band.
  • when the delta differences resolve into a curve (e.g., the signal is substantially sinusoidal), this may be indicative of a presence of a rolling band.
  • Block 210 involves receiving sensor input from an ALS. For example, ambient data associated with the two consecutive frames may be received.
  • Block 215 involves detection of a rolling band. As described herein, such detection may be based on a combination of the row sum input and the ALS sensor input.
  • detection of a rolling band may involve analyzing the row sum data.
  • a rolling band can be identified through signal processing by comparing row sums in the two consecutive frames.
  • a pattern of light intensities for the two consecutive frames may be determined based on a difference between the respective row-wise sums of light intensities in the respective pixel arrays.
  • the pattern of light intensities may be used to determine a band frequency.
  • an FFT may be applied to the delta differences between the respective row-wise sums of light intensities in the respective pixel arrays to obtain a signal in a frequency domain. Within the frequency domain, a frequency for the signal may be obtained to determine the band frequency, and a peak value may be used to determine a band amplitude.
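A minimal sketch of this frequency-domain step, assuming the row-wise deltas form a uniformly sampled signal and that the sensor's row readout rate (here called `rows_per_second`, an illustrative name) is known:

```python
import numpy as np

def band_frequency_and_amplitude(deltas: np.ndarray, rows_per_second: float):
    """Estimate the band frequency and amplitude from row-wise delta differences.

    A flat delta signal leaves a negligible peak (no band); a sinusoidal
    signal produces a pronounced peak whose bin gives the band frequency.
    """
    deltas = deltas - deltas.mean()            # remove the DC component
    spectrum = np.abs(np.fft.rfft(deltas))     # one-sided magnitude spectrum
    freqs = np.fft.rfftfreq(len(deltas), d=1.0 / rows_per_second)
    peak = spectrum[1:].argmax() + 1           # skip the zero-frequency bin
    return freqs[peak], spectrum[peak]

# A synthetic 100 Hz sinusoid in the deltas is recovered as a 100 Hz band.
t = np.arange(1000) / 10_000.0                 # 1000 rows read out at 10 kHz
freq, amp = band_frequency_and_amplitude(np.sin(2 * np.pi * 100.0 * t), 10_000.0)
print(round(freq))  # → 100
```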
  • a flicker light source may be detected using the ALS sensor data.
  • the ALS generally runs continuously, sampling at a high frequency and detecting ambient light sources.
  • a flicker measurement indicating an amount of flicker in the video may be determined based on the ALS data.
  • although row sum data or ALS data may be used independently to detect a rolling band, such an approach may not be reliable.
  • movement of the image capture device and/or movement of an object in the FOV may trigger a false positive, i.e., a detection of a rolling band when there is no rolling band.
  • ALS data and/or row sum data may be indicative of motion, and not of a rolling band. Applying a band correction in such instances may degrade video quality.
  • a more reliable band detection mechanism is based on correlating the pattern of light intensities with an amount of flicker to determine a presence or an absence of a rolling band in the video.
  • block 230 involves visible rolling band detection, and block 235 involves determining whether a rolling band is detected.
  • the flicker measurement may be associated with a confidence level indicating a reliability of the flicker measurement.
  • the confidence level may be a normalized value in a range from 0 to 1, where 0 indicates a very low confidence, and 1 indicates a very high confidence.
  • a confidence threshold for the flicker measurement may be determined. Generally, the confidence threshold indicates a lowest acceptable confidence level for the flicker measurement, and may depend on factors such as the image capturing device, the ALS sensor, the ambient light, and so forth.
  • a flicker frequency may be determined from the amount of flicker.
  • the ALS data may indicate that the flicker frequency is at 100 Hertz (Hz), with a confidence level of 0.8.
  • the correlating of the pattern of light intensities and the amount of flicker involves determining whether the band frequency is within a range of the flicker frequency. For example, in the event that the flicker frequency is determined to be at 100 Hz, with a confidence level of 0.8 (with a confidence threshold at 0.7), and the band frequency is close to 100 Hz, then the band frequency may be determined to be within a range of the flicker frequency. In such a situation, it may be determined that a rolling band is detected at block 235.
  • the band frequency may be determined to be not within a range of the flicker frequency. For example, in the event that the flicker frequency is determined to be at 100 Hz, with a confidence level of 0.72 (with a confidence threshold at 0.7), and the band frequency is close to 500 Hz, then the band frequency may be determined to be not within a range of the flicker frequency. This may correspond to a false detection of a rolling band where the row sum data indicates motion instead of a rolling band. In such a situation, it may be determined that a rolling band is not detected at block 235.
  • the flicker frequency is determined to be at 100 Hz, with a confidence level of 0.6 (with a confidence threshold at 0.7), and the band frequency is close to 100 Hz, then the measurements may be deemed to be unreliable and it may be determined that a rolling band is not detected at block 235.
  • Some embodiments may involve, based on a determination that the confidence level of the flicker measurement exceeds the confidence threshold, determining the band amplitude from the pattern of light intensities, and determining whether the band amplitude exceeds a visibility threshold.
  • the visibility threshold indicates whether the rolling band is visible in the video. In the event the band amplitude does not exceed the visibility threshold, then there is likely no perceptible banding, and it may be determined that a rolling band is not detected at block 235.
  • the correlating of the pattern of light intensities and the amount of flicker involves determining whether the band frequency is within a range of the flicker frequency, where the range may be a larger range of values. For example, in the event that the flicker frequency is determined to be at 100 Hz, with a confidence level of 0.8 (with a confidence threshold at 0.7), and the band frequency is determined to range between 50 Hz and 150 Hz, then, based on a detection of perceptible banding, the band frequency may be determined to be within a range of the flicker frequency. In such a situation, it may be determined that a rolling band is detected at block 235.
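The decision logic in the examples above might be sketched as follows. The 0.7 confidence threshold matches the worked examples; the frequency tolerance and visibility threshold defaults are illustrative assumptions, not values from the application.

```python
def rolling_band_detected(band_freq_hz: float, band_amplitude: float,
                          flicker_freq_hz: float, flicker_confidence: float,
                          confidence_threshold: float = 0.7,
                          visibility_threshold: float = 1.0,
                          freq_tolerance_hz: float = 25.0) -> bool:
    """Correlate the banding pattern with the ALS flicker measurement."""
    if flicker_confidence < confidence_threshold:
        return False  # flicker measurement deemed unreliable
    if band_amplitude < visibility_threshold:
        return False  # banding, if any, is not visible enough to correct
    # the band frequency must fall within a range of the flicker frequency
    return abs(band_freq_hz - flicker_freq_hz) <= freq_tolerance_hz

# The three scenarios described above:
print(rolling_band_detected(100, 5.0, 100, 0.8))   # → True (band detected)
print(rolling_band_detected(500, 5.0, 100, 0.72))  # → False (motion, not a band)
print(rolling_band_detected(100, 5.0, 100, 0.6))   # → False (low confidence)
```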
  • Block 240 involves rolling band correction. Some embodiments involve, responsive to a determining of the presence of the rolling band in the video, applying a band removal strategy to potentially correct the presence of the rolling band in the video. In some embodiments, the applying of the band removal strategy involves determining whether to increase an exposure time for the image capturing device. Generally, an adjustment of the exposure time may be applied on a per frame basis during image capture (e.g., in near real-time).
  • An HDR (high dynamic range) display for an image can extend a range of user experience when viewing the image. For example, an image of a person in the dark may have a high composition of dark colors with a low luminance value. Likewise, an image of a sunlit sky may have a high composition of bright colors with a high luminance value. In some aspects, a ratio of respective luminance values may have a high dynamic range, for example, 1:1,000,000.
  • the camera parameters may be adjusted to widen the exposure range to capture more detail in a scene (e.g., capturing details in both bright and dark areas). Increasing the exposure time can over-expose the brighter areas.
  • Block 245 involves sky detection.
  • the “sky detection” is used herein as an example to illustrate how a rolling band correction may be applied to avoid an undesirable over-exposure of the brighter areas of a video. For example, a percentage of the sky in a current frame may be determined. In the event the percentage of the sky exceeds a threshold value, a rolling band correction may be applied at block 250 by increasing the exposure time. For example, although increasing the exposure time may add brightness to the sky regions, given that a large portion of the image is covered by the sky regions, the increased brightness may not degrade video quality.
  • a rolling band correction may not be applied at block 250, and the exposure time may be maintained (e.g., not increased). For example, although increasing the exposure time may add brightness to the sky regions, given that a smaller portion of the image is covered by the sky regions, the increased brightness may degrade video quality. Accordingly, it is desirable not to increase the exposure time.
  • high brightness regions e.g., sky
  • the high brightness regions generally correspond to light intensities that exceed a brightness threshold.
  • it may be determined whether an amount of the identified high brightness regions in the two consecutive frames exceeds a size threshold (e.g., whether a percentage of the sky exceeds a threshold value).
  • a rolling band correction may be applied at block 250 by increasing the exposure time. For example, although increasing the exposure time may add brightness to the identified high brightness regions, given that a large portion of the image is covered by the identified high brightness regions, the increased brightness may not degrade video quality.
  • a rolling band correction may not be applied at block 250, and the exposure time may be maintained. For example, although increasing the exposure time may add brightness to the identified high brightness regions, given that a smaller portion of the image is covered by the identified high brightness regions, the increased brightness may degrade video quality. Accordingly, it is desirable not to increase the exposure time.
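The size-threshold decision described in the surrounding passages might be sketched as follows. All names and threshold values (`BRIGHTNESS_THRESHOLD`, `SIZE_THRESHOLD`) are hypothetical placeholders, not values from the application:

```python
BRIGHTNESS_THRESHOLD = 230  # assumed 8-bit cutoff for "high brightness"
SIZE_THRESHOLD = 0.5        # assumed fraction of the frame


def should_increase_exposure(frame):
    """Allow an exposure increase only when high-brightness regions
    (e.g., sky) cover enough of the frame that the added brightness is
    unlikely to degrade video quality."""
    pixels = [p for row in frame for p in row]
    bright = sum(1 for p in pixels if p > BRIGHTNESS_THRESHOLD)
    return bright / len(pixels) > SIZE_THRESHOLD


sky_frame = [[255] * 4] * 3 + [[40] * 4]     # 75% of pixels are bright sky
ground_frame = [[255] * 4] + [[40] * 4] * 3  # only 25% bright
print(should_increase_exposure(sky_frame), should_increase_exposure(ground_frame))
# True False
```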
  • the determining of whether to increase the exposure time for the image capturing device is based on a determining of whether the increase of the exposure time reduces a dynamic range of the video.
  • a rolling band correction may not be applied at block 250, and the exposure time may be maintained (e.g., not increased).
  • a rolling band correction may be applied at block 250, and the exposure time may be increased.
  • the video may include objects of interest, regions of interest, and so forth.
  • one or more trained machine learning algorithms e.g., an object detection algorithm, a face detection algorithm, a segmentation algorithm, and so forth
  • a user-approved facial recognition algorithm may be applied to identify one or more individuals in the image as likely objects of interest.
  • a user may enable access to a history of user preferences, and objects and/or individuals may be identified as being of high interest to the user.
  • the determining of whether to increase the exposure time for the image capturing device may be based on a determining of whether the increase of the exposure time has a negative impact on the visibility and/or image characteristics of the objects of interest, regions of interest, and so forth. For example, increasing the exposure time may cause some of the objects and/or regions of interest to become bright enough to degrade their image quality. Similarly, decreasing the exposure time may cause some of the objects and/or regions of interest to become dark enough to degrade their image quality.
  • Some embodiments involve determining one or more characteristics of the object and/or region of interest.
  • the one or more characteristics may include textures such as jeans, leather, cotton, any kind of fabric, stone, brick, rough surfaces, smooth surfaces, rust, paint, tissue fabric, mouse pads, tin foil, ice cubes, foam, and bubbles.
  • the determining of whether the increase of the exposure time reduces the dynamic range may be based on the one or more characteristics of the region of interest.
  • the exposure time may be adjusted to maintain a high quality capture of the textures described above. For example, increasing the exposure time may cause some of the textures to become bright enough to lose their distinctive appearance. Similarly, decreasing the exposure time may cause some of the textures to become dark enough to lose their distinctive appearance.
  • Figure 3 is a block diagram of an example computing device 300, in accordance with example embodiments.
  • computing device 300 shown in Figure 3 can be configured to perform at least one function described herein, including methods 400 and/or 500.
  • Computing device 300 may include a user interface module 301, a network communications module 302, one or more processors 303, data storage 304, rolling band module 312, one or more cameras 318, one or more sensors 320, and power system 322, all of which may be linked together via a system bus, network, or other connection mechanism 305.
  • User interface module 301 can be operable to send data to and/or receive data from external user input/output devices.
  • user interface module 301 can be configured to send and/or receive data to and/or from user input devices such as a touch screen, a computer mouse, a keyboard, a keypad, a touch pad, a trackball, a joystick, a voice recognition module, and/or other similar devices.
  • User interface module 301 can also be configured to provide output to user display devices, such as one or more cathode ray tubes (CRT), liquid crystal displays, light emitting diodes (LEDs), displays using digital light processing (DLP) technology, printers, light bulbs, and/or other similar devices, either now known or later developed.
  • User interface module 301 can also be configured to generate audible outputs, with devices such as a speaker, speaker jack, audio output port, audio output device, earphones, and/or other similar devices.
  • User interface module 301 can further be configured with one or more haptic devices that can generate haptic outputs, such as vibrations and/or other outputs detectable by touch and/or physical contact with computing device 300.
  • Network communications module 302 can include one or more devices that provide one or more wireless interfaces 307 and/or one or more wireline interfaces 308 that are configurable to communicate via a network.
  • Wireless interface(s) 307 can include one or more wireless transmitters, receivers, and/or transceivers, such as a BluetoothTM transceiver, a Zigbee® transceiver, a Wi-FiTM transceiver, a WiMAXTM transceiver, an LTETM transceiver, and/or other type of wireless transceiver configurable to communicate via a wireless network.
  • Wireline interface(s) 308 can include one or more wireline transmitters, receivers, and/or transceivers, such as an Ethernet transceiver, a Universal Serial Bus (USB) transceiver, or similar transceiver configurable to communicate via a twisted pair wire, a coaxial cable, a fiberoptic link, or a similar physical connection to a wireline network.
  • USB Universal Serial Bus
  • network communications module 302 can be configured to provide reliable, secured, and/or authenticated communications.
  • information for facilitating reliable communications e.g., guaranteed message delivery
  • a message header and/or footer e.g., packet/message sequencing information, encapsulation headers and/or footers, size/time information, and transmission verification information such as cyclic redundancy check (CRC) and/or parity check values.
  • CRC cyclic redundancy check
  • Communications can be made secure (e.g., be encoded or encrypted) and/or decrypted/decoded using one or more cryptographic protocols and/or algorithms, such as, but not limited to, Data Encryption Standard (DES), Advanced Encryption Standard (AES), a Rivest-Shamir-Adleman (RSA) algorithm, a Diffie-Hellman algorithm, a secure sockets protocol such as Secure Sockets Layer (SSL) or Transport Layer Security (TLS), and/or Digital Signature Algorithm (DSA).
  • DES Data Encryption Standard
  • AES Advanced Encryption Standard
  • RSA Rivest-Shamir-Adleman
  • Diffie-Hellman algorithm
  • SSL Secure Sockets Layer
  • TLS Transport Layer Security
  • DSA Digital Signature Algorithm
  • Other cryptographic protocols and/or algorithms can be used as well or in addition to those listed herein to secure (and then decrypt/decode) communications.
  • One or more processors 303 can include one or more general purpose processors (e.g., central processing unit (CPU), etc.), and/or one or more special purpose processors (e.g., digital signal processors, tensor processing units (TPUs), graphics processing units (GPUs), application specific integrated circuits, etc.).
  • processors 303 can be configured to execute computer-readable instructions 306 that are contained in data storage 304 and/or other instructions as described herein.
  • Data storage 304 can include one or more non-transitory computer-readable storage media that can be read and/or accessed by at least one of one or more processors 303.
  • the one or more computer-readable storage media can include volatile and/or non-volatile storage components, such as optical, magnetic, organic or other memory or disc storage, which can be integrated in whole or in part with at least one of one or more processors 303.
  • data storage 304 can be implemented using a single physical device (e.g., one optical, magnetic, organic or other memory or disc storage unit), while in other examples, data storage 304 can be implemented using two or more physical devices.
  • Data storage 304 can include computer-readable instructions 306 and perhaps additional data.
  • data storage 304 can include storage required to perform at least part of the herein-described methods, scenarios, and techniques and/or at least part of the functionality of the herein-described devices and networks.
  • computer-readable instructions 306 can include instructions that, when executed by processor(s) 303, enable computing device 300 to provide for some or all of the functionality described herein.
  • data storage 304 may store captured images, personalized correction factors, and so forth.
  • computer-readable instructions 306 can include instructions that, when executed by processor(s) 303, enable computing device 300 to carry out operations.
  • the operations may include receiving, from an image sensor of an image capturing device, aggregated light intensity data based on respective pixel arrays for two consecutive frames of a video.
  • the operations may also include determining a pattern of light intensities for the two consecutive frames based on the aggregated light intensity data.
  • the operations may additionally include receiving, from an ambient light sensor, ambient data associated with the two consecutive frames.
  • the operations may also include determining, based on the ambient data, a flicker measurement indicating an amount of flicker in the video.
  • the operations may further include correlating the pattern of light intensities and the amount of flicker to determine a presence or an absence of a rolling band in the video.
  • computing device 300 can include rolling band module 312.
  • Rolling band module 312 can be configured to detect and/or correct a rolling band in a video.
  • rolling band module 312 can be configured to correlate a pattern of light intensities (e.g., based on row sum data from one or more cameras 318) and an amount of flicker (e.g., based on ALS data from one or more sensors 320) to determine a presence or an absence of a rolling band in the video.
  • computing device 300 can include one or more cameras 318.
  • Camera(s) 318 can include one or more image capture devices, such as still and/or video cameras, equipped to capture light and record the captured light in one or more images; that is, camera(s) 318 can generate image(s) of captured light.
  • the one or more images can be one or more still images and/or one or more images utilized in video imagery.
  • Camera(s) 318 can capture light and/or electromagnetic radiation emitted as visible light, infrared radiation, ultraviolet light, and/or as one or more other frequencies of light.
  • Camera(s) 318 can include a wide camera, a tele camera, an ultrawide camera, and so forth.
  • camera(s) 318 can be front-facing or rear-facing cameras with reference to computing device 300.
  • Camera(s) 318 can include camera components such as, but not limited to, an aperture, shutter, recording surface (e.g., photographic film and/or an image sensor), lens, and/or shutter button.
  • the camera components may be controlled at least in part by software executed by one or more processors 303.
  • computing device 300 can include one or more sensors 320. Sensors 320 can be configured to measure conditions within computing device 300 and/or conditions in an environment of computing device 300 and provide data about these conditions.
  • sensors 320 can include one or more of (i) sensors for obtaining data about computing device 300, such as, but not limited to, a thermometer for measuring a temperature of computing device 300, a battery sensor for measuring power of one or more batteries of power system 322, and/or other sensors measuring conditions of computing device 300; (ii) an identification sensor to identify other objects and/or devices, such as, but not limited to, a Radio Frequency Identification (RFID) reader, proximity sensor, one-dimensional barcode reader, two-dimensional barcode (e.g., Quick Response (QR) code) reader, and a laser tracker, where the identification sensors can be configured to read identifiers, such as RFID tags, barcodes, QR codes, and/or other devices and/or object configured to be read and provide at least identifying information; (iii)
  • RFID Radio Frequency Identification
  • Power system 322 can include one or more batteries 324 and/or one or more external power interfaces 326 for providing electrical power to computing device 300.
  • Each battery of the one or more batteries 324 can, when electrically coupled to the computing device 300, act as a source of stored electrical power for computing device 300.
  • One or more batteries 324 of power system 322 can be configured to be portable. Some or all of one or more batteries 324 can be readily removable from computing device 300. In other examples, some or all of one or more batteries 324 can be internal to computing device 300, and so may not be readily removable from computing device 300. Some or all of one or more batteries 324 can be rechargeable.
  • a rechargeable battery can be recharged via a wired connection between the battery and another power supply, such as by one or more power supplies that are external to computing device 300 and connected to computing device 300 via the one or more external power interfaces.
  • one or more batteries 324 can be non-rechargeable batteries.
  • One or more external power interfaces 326 of power system 322 can include one or more wired-power interfaces, such as a USB cable and/or a power cord, that enable wired electrical power connections to one or more power supplies that are external to computing device 300.
  • One or more external power interfaces 326 can include one or more wireless power interfaces, such as a Qi wireless charger, that enable wireless electrical power connections, such as via a Qi wireless charger, to one or more external power supplies.
  • computing device 300 can draw electrical power from the external power source via the established electrical power connection.
  • power system 322 can include related sensors, such as battery sensors associated with the one or more batteries or other types of electrical power sensors.
Example Methods of Operation
  • Figure 4 is a flowchart of a method, in accordance with example embodiments.
  • Method 400 may include various blocks or steps. The blocks or steps may be carried out individually or in combination. The blocks or steps may be carried out in any order and/or in series or in parallel. Further, blocks or steps may be omitted or added to method 400.
  • the blocks of method 400 may be carried out by various elements of computing device 300 as illustrated and described in reference to Figure 3.
  • Block 410 involves receiving, from an image sensor of an image capturing device, aggregated light intensity data based on respective pixel arrays for two consecutive frames of a video.
  • Block 420 involves determining a pattern of light intensities for the two consecutive frames based on the aggregated light intensity data.
  • Block 430 involves receiving, from an ambient light sensor, ambient data associated with the two consecutive frames.
  • Block 440 involves determining, based on the ambient data, a flicker measurement indicating an amount of flicker in the video.
  • Block 450 involves correlating the pattern of light intensities and the amount of flicker to determine a presence or an absence of a rolling band in the video.
  • Some embodiments involve determining a band frequency from the pattern of light intensities. Such embodiments involve determining a flicker frequency from the amount of flicker. The correlating of the pattern of light intensities and the amount of flicker involves determining whether the band frequency is within a range of the flicker frequency.
  • Some embodiments involve, based on a determination that the band frequency is not within a range of the flicker frequency, determining the absence of the rolling band in the video.
  • Some embodiments involve, based on a determination that the band frequency is within a range of the flicker frequency, determining whether a confidence level of the flicker measurement exceeds a confidence threshold.
  • Some embodiments involve, based on a determination that the confidence level of the flicker measurement does not exceed the confidence threshold, determining the absence of the rolling band in the video.
  • Some embodiments involve, based on a determination that the confidence level of the flicker measurement exceeds the confidence threshold, determining a band amplitude from the pattern of light intensities. Such embodiments involve determining whether the band amplitude exceeds a visibility threshold, wherein the visibility threshold indicates whether the rolling band is visible in the video.
  • Some embodiments involve determining that the band amplitude exceeds the visibility threshold. Such embodiments involve determining the presence of the rolling band in the video.
  • Some embodiments involve determining that the band amplitude does not exceed the visibility threshold. Such embodiments involve determining the absence of the rolling band in the video.
  • Some embodiments involve, responsive to a determining of the presence of the rolling band in the video, applying a band removal strategy to potentially correct the presence of the rolling band in the video.
  • the applying of the band removal strategy involves determining whether to increase an exposure time for the image capturing device.
  • Some embodiments involve identifying high brightness regions in the two consecutive frames, wherein the high brightness regions correspond to light intensities that exceed a brightness threshold. Such embodiments involve determining whether an amount of the identified high brightness regions in the two consecutive frames exceeds a size threshold.
  • Some embodiments involve, based on a determination that the amount of the identified high brightness regions in the two consecutive frames exceeds the size threshold, determining to increase the exposure time for the image capturing device to correct the presence of the rolling band in the video.
  • Some embodiments involve, based on a determination that the amount of the identified high brightness regions in the two consecutive frames does not exceed the size threshold, maintaining the exposure time for the image capturing device.
  • the determining of whether to increase the exposure time for the image capturing device may be based on a determining of whether the increase of the exposure time reduces a dynamic range of the video.
  • Some embodiments involve, upon a determination that the increase of the exposure time reduces the dynamic range of the video, maintaining the exposure time for the image capturing device.
  • Some embodiments involve, upon a determination that the increase of the exposure time does not reduce the dynamic range of the video, increasing the exposure time for the image capturing device.
  • Some embodiments involve, detecting, based on a trained machine learning model, a region of interest in the video. Such embodiments involve determining one or more characteristics of the region of interest. The determining of whether the increase of the exposure time reduces the dynamic range is based on the one or more characteristics of the region of interest.
  • the image capturing device may be a component of a mobile device.
  • the aggregated light intensity data includes respective row-wise sums of pixel values in the respective pixel arrays for the two consecutive frames, and wherein the pattern of light intensities is based on a difference between the respective row-wise sums of pixel values in the respective pixel arrays.
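The row-wise aggregation described above can be sketched as follows; the helper name and the toy frames are assumptions for illustration. Static scene content cancels in the difference, while a rolling band shows up as structure in the one-dimensional signal:

```python
def row_sum_pattern(frame_a, frame_b):
    """Aggregate each pixel array into per-row sums, then difference the
    two consecutive frames to expose a row-dependent brightness pattern."""
    sums_a = [sum(row) for row in frame_a]
    sums_b = [sum(row) for row in frame_b]
    return [a - b for a, b in zip(sums_a, sums_b)]


frame_a = [[10, 10], [30, 30], [10, 10]]  # middle row brightened by a band
frame_b = [[10, 10], [10, 10], [10, 10]]
print(row_sum_pattern(frame_a, frame_b))  # [0, 40, 0]
```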
  • Figure 5 is a flowchart of a method, in accordance with example embodiments.
  • Method 500 may include various blocks or steps. The blocks or steps may be carried out individually or in combination. The blocks or steps may be carried out in any order and/or in series or in parallel. Further, blocks or steps may be omitted or added to method 500. Additionally, there may be additional relationships between the blocks. As such, Figure 5 illustrates one embodiment of the techniques described herein.
  • the blocks of method 500 may be carried out by various elements of computing device 300 as illustrated and described in reference to Figure 3.
  • Block 505 involves analyzing aggregated light intensity data.
  • Block 510 involves analyzing data from an ambient light sensor (ALS).
  • ALS ambient light sensor
  • Block 515 involves correlating the row sum data and the ALS data.
  • Block 520 involves determining whether a band frequency is within range of a flicker frequency.
  • Upon a determination that the band frequency is not within range of the flicker frequency, method 500 proceeds to block 525.
  • Block 525 involves determining that there is an absence of a rolling band. Method 500 proceeds to block 530.
  • Block 530 involves determining that no correction is applied.
  • Upon a determination that the band frequency is within range of the flicker frequency, method 500 proceeds to block 535.
  • Block 535 involves determining that there is a presence of a rolling band. Method 500 proceeds to block 545.
  • Block 545 involves determining whether a band amplitude exceeds a visibility threshold.
  • Upon a determination that the band amplitude does not exceed the visibility threshold, method 500 proceeds to block 530, and no correction is applied.
  • Upon a determination that the band amplitude exceeds the visibility threshold, method 500 proceeds to block 550.
  • Block 550 involves determining whether a size of a high brightness region exceeds a size threshold.
  • Upon a determination that the size of the high brightness region does not exceed the size threshold, method 500 proceeds to block 555.
  • Block 555 involves maintaining the exposure time.
  • Upon a determination that the size of the high brightness region exceeds the size threshold, method 500 proceeds to block 560.
  • Block 560 involves determining whether an HDR is negatively impacted.
  • Upon a determination that the HDR is negatively impacted, method 500 proceeds to block 555, and the exposure time is maintained.
  • Upon a determination that the HDR is not negatively impacted, method 500 proceeds to block 565.
  • Block 565 involves increasing the exposure time.
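Taken together, blocks 520 through 565 form a small decision cascade, sketched below. The function and argument names are hypothetical; only the branch order follows the flowchart described above:

```python
def band_removal_decision(band_in_range, band_amplitude, visibility_threshold,
                          bright_fraction, size_threshold, hdr_impacted):
    """Mirror blocks 520-565: no correction unless a visible band is
    detected; increase exposure only when a large high-brightness region
    is present and the HDR rendering would not suffer."""
    if not band_in_range:
        return "no correction"            # blocks 525 and 530
    if band_amplitude <= visibility_threshold:
        return "no correction"            # block 530
    if bright_fraction <= size_threshold:
        return "maintain exposure time"   # block 555
    if hdr_impacted:
        return "maintain exposure time"   # block 555
    return "increase exposure time"       # block 565


print(band_removal_decision(True, 0.9, 0.5, 0.8, 0.5, False))
# increase exposure time
```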
  • a step or block that represents a processing of information can correspond to circuitry that can be configured to perform the specific logical functions of a herein-described method or technique.
  • a step or block that represents a processing of information can correspond to a module, a segment, or a portion of program code (including related data).
  • the program code can include one or more instructions executable by a processor for implementing specific logical functions or actions in the method or technique.
  • the program code and/or related data can be stored on any type of computer readable medium such as a storage device including a disk, hard drive, or other storage medium.
  • the computer readable medium can also include non-transitory computer readable media such as computer-readable media that store data for short periods of time like register memory, processor cache, and random access memory (RAM).
  • the computer readable media can also include non-transitory computer readable media that store program code and/or data for longer periods.
  • the computer readable media may include secondary or persistent long-term storage, like read only memory (ROM), optical or magnetic disks, compact disc read only memory (CD-ROM), for example.
  • the computer readable media can also be any other volatile or non-volatile storage systems.
  • a computer readable medium can be considered a computer readable storage medium, for example, or a tangible storage device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An example method includes receiving, from an image sensor of an image capturing device, aggregated light intensity data based on respective pixel arrays for two consecutive frames of a video. The method also includes determining a pattern of light intensities for the two consecutive frames based on the aggregated light intensity data. The method further includes receiving, from an ambient light sensor, ambient data associated with the two consecutive frames. The method also includes determining, based on the ambient data, a flicker measurement indicating an amount of flicker in the video. The method further includes correlating the pattern of light intensities and the amount of flicker to determine a presence or an absence of a rolling band in the video.
PCT/US2023/034065 2023-09-29 2023-09-29 Systems and methods for detection and mitigation of a rolling band Pending WO2025071563A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2023/034065 WO2025071563A1 (fr) 2023-09-29 2023-09-29 Systems and methods for detection and mitigation of a rolling band

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2023/034065 WO2025071563A1 (fr) 2023-09-29 2023-09-29 Systems and methods for detection and mitigation of a rolling band

Publications (1)

Publication Number Publication Date
WO2025071563A1 true WO2025071563A1 (fr) 2025-04-03

Family

ID=88585194

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/034065 Pending WO2025071563A1 (fr) Systems and methods for detection and mitigation of a rolling band

Country Status (1)

Country Link
WO (1) WO2025071563A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070153094A1 (en) * 2006-01-05 2007-07-05 Ying Noyes Automatic flicker correction in an image capture device
KR100968377B1 (ko) * 2009-01-28 2010-07-09 주식회사 코아로직 플리커 노이즈 제거장치와 제거방법, 및 그 제거장치를 포함한 영상처리장치

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070153094A1 (en) * 2006-01-05 2007-07-05 Ying Noyes Automatic flicker correction in an image capture device
KR100968377B1 (ko) * 2009-01-28 2010-07-09 주식회사 코아로직 플리커 노이즈 제거장치와 제거방법, 및 그 제거장치를 포함한 영상처리장치

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Exposure (photography) - Wikipedia", 18 May 2023 (2023-05-18), XP093153323, Retrieved from the Internet <URL:https://web.archive.org/web/20230518102000/https://en.wikipedia.org/wiki/Exposure_(photography)> *

Similar Documents

Publication Publication Date Title
CN108322646B (zh) Image processing method and apparatus, storage medium, and electronic device
US9866748B2 (en) System and method for controlling a camera based on processing an image captured by other camera
US9253375B2 (en) Camera obstruction detection
US12231777B2 (en) Exposure change control in low light environments
EP3520390B1 (fr) Recolorization of infrared image streams
US8675091B2 (en) Image data processing with multiple cameras
KR20170030933A (ko) Image processing device and automatic white balancing method thereof
EP4366289A1 (fr) Photographing method and related apparatus
US11323632B2 (en) Electronic device and method for increasing exposure control performance of a camera by adjusting exposure parameter of the camera
US20230021016A1 (en) Hybrid object detector and tracker
CN107563329A (zh) Image processing method and apparatus, computer-readable storage medium, and mobile terminal
CN113228621A (zh) Exposure control method and system
JP2013062711A (ja) Imaging device, captured image processing method, and program
WO2025071563A1 (fr) Systems and methods for detection and mitigation of a rolling band
US20250301228A1 (en) Systems and Methods for Detection and Mitigation of a Rolling Band using a Secondary Camera
CN111656766B (zh) Device for image-based services
CN115668274A (zh) Computer software module arrangement, circuit arrangement, and arrangement and method for improved image processing
WO2024239224A1 (fr) Image processing device, image processing method, and program
WO2025053835A1 (fr) Methods and systems for dynamic torch intensity calibration for image capture
WO2024076531A1 (fr) Hybrid autofocus system with robust priority focusing of macro objects

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23797959

Country of ref document: EP

Kind code of ref document: A1