EP3579139A1 - Systems and methods for generating calibrated skin tone profiles - Google Patents
- Publication number
- EP3579139A1 (Application EP18206578.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- region
- interest
- skin tone
- pixels
- digital camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0282—Rating or review of business operators or products
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D44/00—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
- A45D44/005—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms for selecting or displaying personal cosmetic colours or hairstyle
Definitions
- the present disclosure generally relates to systems and methods for generating skin tone profiles of individuals depicted in digital images.
- a computing device with a digital camera obtains a reference image depicting at least one reference color and calibrates parameters of the digital camera based on the at least one reference color.
- the computing device captures, by the digital camera, a digital image of an individual utilizing the calibrated parameters.
- the computing device defines a region of interest in a facial region of the individual depicted in the digital image captured by the digital camera.
- the computing device generates a skin tone profile for pixels within the region of interest and displays a predetermined makeup product recommendation based on the skin tone profile.
- Another embodiment is a system that comprises a digital camera, a memory storing instructions, and a processor coupled to the memory.
- the processor is configured by the instructions to obtain a reference image depicting at least one reference color and calibrate parameters of the digital camera based on the at least one reference color.
- the processor is further configured to capture, by the digital camera, a digital image of an individual utilizing the calibrated parameters.
- the processor is further configured to define a region of interest in a facial region of the individual depicted in the digital image captured by the digital camera.
- the processor is further configured to generate a skin tone profile for pixels within the region of interest and display a predetermined makeup product recommendation based on the skin tone profile.
- Another embodiment is a non-transitory computer-readable storage medium storing instructions to be implemented by a computing device having a processor, wherein the instructions, when executed by the processor, cause the computing device to obtain a reference image depicting at least one reference color and calibrate parameters of a digital camera based on the at least one reference color.
- the processor is further configured to capture, by the digital camera, a digital image of an individual utilizing the calibrated parameters.
- the processor is further configured to define a region of interest in a facial region of the individual depicted in the digital image captured by the digital camera.
- the processor is further configured to generate a skin tone profile for pixels within the region of interest and display a predetermined makeup product recommendation based on the skin tone profile.
- FIG. 1 is a block diagram of a computing device 102 in which the techniques for generating skin tone profiles disclosed herein may be implemented.
- the computing device 102 may be embodied as a computing device such as, but not limited to, a smartphone, a tablet computing device, a laptop, and so on.
- a profile generator 104 executes on a processor of the computing device 102 and includes a reference color extractor 106, a calibration unit 108, a camera interface 110, and a content analyzer 112.
- the reference color extractor 106 is configured to obtain a reference image depicting one or more reference colors where the reference image may be depicted on a white balance card, a banknote, or other source with a known color scheme.
- the calibration unit 108 is configured to calibrate parameters of the digital camera based on the one or more reference colors.
- the camera interface 110 is configured to cause a digital camera to capture a digital image of an individual.
- the digital image may be encoded in any of a number of formats including, but not limited to, JPEG (Joint Photographic Experts Group) files, TIFF (Tagged Image File Format) files, PNG (Portable Network Graphics) files, GIF (Graphics Interchange Format) files, BMP (bitmap) files or any number of other digital formats.
- the digital image may be derived from a still image of a video encoded in formats including, but not limited to, Motion Picture Experts Group (MPEG)-1, MPEG-2, MPEG-4, H.264, Third Generation Partnership Project (3GPP), 3GPP-2, Standard-Definition Video (SD-Video), High-Definition Video (HD-Video), Digital Versatile Disc (DVD) multimedia, Video Compact Disc (VCD) multimedia, High-Definition Digital Versatile Disc (HD-DVD) multimedia, Digital Television Video / High-definition Digital Television (DTV/HDTV) multimedia, Audio Video Interleave (AVI), Digital Video (DV), QuickTime (QT) file, Windows Media Video (WMV), Advanced System Format (ASF), Real Media (RM), Flash Media (FLV), an MPEG Audio Layer III (MP3), an MPEG Audio Layer II (MP2), Waveform Audio Format (WAV), Windows Media Audio (WMA), 360 degree video, 3D scan model, or any number of other digital formats.
- the content analyzer 112 is configured to define a region of interest in a facial region of the individual depicted in the digital image captured by the digital camera.
- the content analyzer 112 is further configured to generate a skin color profile for pixels within the region of interest.
- the content analyzer 112 is further configured to obtain makeup product recommendations 118 from a data store 116 based on the generated skin color profile and display the makeup product recommendation 118 in a user interface to the user of the computing device 102.
- FIG. 2 illustrates a schematic block diagram of the computing device 102 in FIG. 1 .
- the computing device 102 may be embodied in any one of a wide variety of wired and/or wireless computing devices, such as a desktop computer, portable computer, dedicated server computer, multiprocessor computing device, smart phone, tablet, and so forth.
- the computing device 102 comprises memory 214, a processing device 202, a number of input/output interfaces 204, a network interface 206, a display 208, a peripheral interface 211, and mass storage 226, wherein each of these components is connected across a local data bus 210.
- the processing device 202 may include any custom made or commercially available processor, a central processing unit (CPU) or an auxiliary processor among several processors associated with the computing device 102, a semiconductor based microprocessor (in the form of a microchip), a macroprocessor, one or more application specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and other well known electrical configurations comprising discrete elements both individually and in various combinations to coordinate the overall operation of the computing system.
- the memory 214 may include any one of a combination of volatile memory elements (e.g., random-access memory (RAM, such as DRAM, SRAM, etc.)) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.).
- the memory 214 typically comprises a native operating system 216, one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc.
- the applications may include application specific software which may comprise some or all the components of the computing device 102 depicted in FIG. 1 .
- the components are stored in memory 214 and executed by the processing device 202, thereby causing the processing device 202 to perform the operations/functions disclosed herein.
- the memory 214 can, and typically will, comprise other components which have been omitted for purposes of brevity.
- the components in the computing device 102 may be implemented by hardware and/or software.
- Input/output interfaces 204 provide any number of interfaces for the input and output of data.
- where the computing device 102 comprises a personal computer, these components may interface with one or more user input/output interfaces 204, which may comprise a keyboard or a mouse, as shown in FIG. 2 .
- the display 208 may comprise a computer monitor, a plasma screen for a PC, a liquid crystal display (LCD) on a hand held device, a touchscreen, or other display device.
- a non-transitory computer-readable medium stores programs for use by or in connection with an instruction execution system, apparatus, or device. More specific examples of a computer-readable medium may include by way of example and without limitation: a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), and a portable compact disc read-only memory (CDROM) (optical).
- FIG. 3 is a flowchart 300 in accordance with various embodiments for generating skin tone profiles performed by the computing device 102 of FIG. 1 . It is understood that the flowchart 300 of FIG. 3 provides merely an example of the different types of functional arrangements that may be employed to implement the operation of the various components of the computing device 102. As an alternative, the flowchart 300 of FIG. 3 may be viewed as depicting an example of steps of a method implemented in the computing device 102 according to one or more embodiments.
- Although the flowchart 300 of FIG. 3 shows a specific order of execution, it is understood that the order of execution may differ from that which is depicted. For example, the order of execution of two or more blocks may be scrambled relative to the order shown. Also, two or more blocks shown in succession in FIG. 3 may be executed concurrently or with partial concurrence. It is understood that all such variations are within the scope of the present disclosure.
- the computing device 102 obtains a reference image depicting at least one reference color.
- the reference image can be depicted on such objects as a white balance card, a color checker, a banknote, a credit card, photocopy paper, tissue paper, a mobile phone, or a non-glossy white object.
- the object depicting the reference image is located at a predefined distance from the digital camera.
- the computing device 102 calibrates parameters of the digital camera based on the at least one reference color, where such parameters may include white balance level, exposure compensation, gamma correction, and so on.
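The disclosure does not spell out the calibration math. As a rough sketch under the assumption that the reference image contains a patch of known color (e.g., a white balance card), per-channel white balance gains can be derived by mapping the measured patch color to its known value; all function names below are illustrative, not from the patent:

```python
def white_balance_gains(measured_rgb, target_rgb=(255, 255, 255)):
    """Per-channel gains mapping the measured reference color (e.g., the
    average RGB over a white balance card) to its known target color."""
    return tuple(t / max(m, 1e-6) for t, m in zip(target_rgb, measured_rgb))

def apply_gains(pixel, gains):
    """Apply calibration gains to one RGB pixel, clamped to [0, 255]."""
    return tuple(min(255, round(c * g)) for c, g in zip(pixel, gains))

# A white card photographed under warm light reads slightly orange;
# the derived gains pull it (and subsequent captures) back to neutral.
gains = white_balance_gains((250, 240, 220))
corrected = apply_gains((250, 240, 220), gains)
```

Exposure compensation and gamma correction could be calibrated analogously from the same reference patch.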
- the computing device 102 captures a digital image of an individual utilizing the calibrated parameters.
- the computing device 102 defines a region of interest in a facial region of the individual depicted in the digital image captured by the digital camera.
- the computing device 102 defines the region of interest by determining a color distance between pixels in the facial region and one or more predetermined target skin tones and designating pixels within a threshold color distance of the one or more predetermined target skin tones as part of the region of interest.
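A minimal sketch of this color-distance test, assuming RGB pixels and Euclidean distance (the patent does not fix the color space or metric, and the target tones and threshold below are illustrative):

```python
def color_distance(p, q):
    """Euclidean distance between two RGB colors."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def skin_mask(pixels, target_tones, threshold):
    """Flag a pixel for the region of interest when it lies within the
    threshold color distance of any predetermined target skin tone."""
    return [
        any(color_distance(p, t) <= threshold for t in target_tones)
        for p in pixels
    ]

targets = [(224, 172, 105), (141, 85, 36)]  # illustrative target skin tones
pixels = [(220, 170, 100), (30, 60, 200)]   # a skin-like pixel and a blue one
mask = skin_mask(pixels, targets, threshold=40.0)
```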
- the computing device 102 defines the region of interest by identifying locations of predetermined feature points within the facial region and defining a boundary of the region of interest based on the locations of the predetermined feature points.
- FIG. 4 illustrates definition of a region of interest 404 according to various embodiments.
- the computing device 102 analyzes the facial region 402 of the individual and identifies various feature points (shown as dots). The computing device 102 generates a region of interest 404 based on the location of the feature points.
- the computing device 102 detects the locations of various target feature points, which may comprise, for example, the eyes, nose, mouth, eyebrows, and so on.
- the target feature points may also include the overall facial contour of the user's face.
- the computing device 102 then defines a region of interest 404 based on the location of the feature points.
- the computing device 102 may be configured to define a boundary based on a series of parabolic curves defined at or near various feature points.
- the region of interest comprises the cheek and nose regions of the user.
- the region of interest may be predefined based on specific target regions or features of the user (e.g., the cheek and nose regions) where the boundary is then defined based on the actual feature points detected on the user's face such that the region of interest encompasses those target regions or features.
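The patent describes boundaries built from parabolic curves at or near feature points; as a simplified stand-in, the boundary can be approximated by a polygon through the detected feature points and pixel membership tested by ray casting (the landmark coordinates below are hypothetical):

```python
def point_in_polygon(x, y, poly):
    """Ray-casting test: is (x, y) inside the polygon of (x, y) vertices?"""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

# Hypothetical cheek landmark coordinates bounding the region of interest.
cheek_boundary = [(40, 60), (80, 55), (90, 100), (50, 110)]
inside = point_in_polygon(65, 80, cheek_boundary)   # within the cheek region
outside = point_in_polygon(10, 10, cheek_boundary)  # clearly outside
```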
- the computing device 102 generates a skin tone profile for pixels within the region of interest.
- the computing device 102 generates the skin tone profile by generating a luminance histogram for the pixels within the region of interest, removing predetermined portions of the luminance histogram to generate a target histogram portion, determining a dominant color value based on the target histogram portion, and generating the skin tone profile based on the determined dominant color value.
- the dominant color value may be determined by such techniques as calculating a mean of the target histogram, calculating a peak of the target histogram, calculating a weighted average of the target histogram, or by calculating a mean based on a mean-shift clustering algorithm of the target histogram.
- the luminance histogram illustrates the distribution of pixel brightness of the region of interest, where the pixel brightness is typically computed using either the Y component in the YUV color space or the L component in the Lab (or CIELAB) color space.
- FIG. 5 illustrates a luminance histogram 502 for pixels within the region of interest according to various embodiments.
- the computing device 102 removes predetermined portions 504, 506 of the histogram to generate a target histogram portion.
- One predetermined portion 506 may comprise, for example, the upper 30% of the histogram 502 that corresponds to pixels that are lighter (e.g., pixels that are part of a reflective portion). Specifically, the upper portion generally corresponds to the light reflection that occurs on the region of interest.
- Another predetermined portion 504 may comprise, for example, the lower 30% of the histogram 502 that corresponds to pixels that are darker (e.g., pixels that are part of a shadow region). Specifically, the lower portion generally corresponds to the shadow effect that occurs on the region of interest.
- An average color value is determined based on the remaining histogram portion 508 (i.e., the target histogram portion), and the skin tone profile is generated based on the determined average color value.
- the computing device 102 is able to reduce the impact of light and shadow components on the average color calculation, thereby providing more accurate skin tone estimation.
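A compact sketch of this histogram-trimming step, assuming BT.601 luma as the brightness measure and a simple mean of the retained band as the dominant color (the 30% cut-offs follow the example above; alternatives such as the histogram peak, a weighted average, or mean-shift clustering would slot in where the mean is computed):

```python
def luminance(rgb):
    """Pixel brightness via the BT.601 luma approximation of RGB."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def dominant_color(pixels, lower_cut=0.30, upper_cut=0.30):
    """Sort ROI pixels by luminance, drop the darkest band (shadows) and the
    brightest band (reflections), then average the remaining pixels."""
    ordered = sorted(pixels, key=luminance)
    lo = int(len(ordered) * lower_cut)
    hi = len(ordered) - int(len(ordered) * upper_cut)
    kept = ordered[lo:hi] or ordered  # fall back for very small regions
    return tuple(round(sum(p[c] for p in kept) / len(kept)) for c in range(3))

roi_pixels = [
    (10, 10, 10),                                       # shadow pixel
    (196, 148, 122), (200, 150, 120), (204, 152, 118),  # skin pixels
    (255, 255, 255),                                    # specular highlight
]
tone = dominant_color(roi_pixels)  # averages only the middle luminance band
```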
- the computing device 102 generates the skin tone profile by extracting an illumination layer and a reflectance layer from the pixels within the region of interest and generating the skin tone profile based on the reflectance layer. For some embodiments, the computing device 102 generates the skin tone profile by converting a detected skin tone from a first color space to a second color space based on a predefined transformation matrix or by mapping a detected skin tone from a first classification to a second classification based on a predefined lookup table.
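The transformation matrix is predefined but unspecified in the disclosure; as an illustration, applying a 3x3 matrix to map a detected tone between color spaces might look like the following, using the well-known BT.601 RGB-to-YUV coefficients as the example matrix:

```python
# Illustrative predefined 3x3 matrix: RGB -> YUV using BT.601 coefficients.
RGB_TO_YUV = [
    [ 0.299,  0.587,  0.114],
    [-0.147, -0.289,  0.436],
    [ 0.615, -0.515, -0.100],
]

def convert(color, matrix):
    """Map a color between spaces with a predefined 3x3 transformation."""
    return tuple(sum(m * c for m, c in zip(row, color)) for row in matrix)

y, u, v = convert((128, 128, 128), RGB_TO_YUV)  # neutral gray: chroma near zero
```

The lookup-table variant would instead index a predefined table with the detected tone's classification.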
- each product recommendation 118 in the data store 116 also includes a target RGB value or range of target RGB values associated with a target skin tone color.
- in lieu of the target RGB value(s), a target YUV value or range of target YUV values could be stored for each product recommendation 118.
- a target Lab value or range of target Lab values could be stored for each product recommendation 118.
- the computing device 102 obtains one or more product recommendations 118 from the data store 116 by matching the estimated skin tone profile with one or more target RGB/YUV/Lab value(s) of corresponding product recommendations 118. Thereafter, the process in FIG. 3 ends.
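The matching step is not detailed; one plausible sketch, assuming each product recommendation stores a target Lab value and matching is nearest-neighbor within a tolerance (the catalog entries and distance cutoff below are illustrative):

```python
def match_products(skin_lab, catalog, max_distance=10.0):
    """Return product names whose target Lab value lies within max_distance
    of the estimated skin tone, closest first."""
    scored = [
        (sum((a - b) ** 2 for a, b in zip(skin_lab, lab)) ** 0.5, name)
        for name, lab in catalog
    ]
    return [name for dist, name in sorted(scored) if dist <= max_distance]

# Hypothetical catalog entries: (product name, target Lab value).
catalog = [
    ("Foundation A", (65.0, 15.0, 20.0)),
    ("Foundation B", (45.0, 18.0, 25.0)),
]
matches = match_products((64.0, 14.5, 19.0), catalog)
```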
Description
- This application claims priority to, and the benefit of, U.S. Provisional Patent Application entitled, "Method and Apparatus of Skin Tone Estimation," having Serial No. 62/681,174, filed on June 6, 2018, which is incorporated by reference in its entirety.
- The present disclosure generally relates to systems and methods for generating skin tone profiles of individuals depicted in digital images.
- With the proliferation of smartphones, tablets, and other display devices, people can capture digital images virtually any time, and application programs for managing and editing captured digital content have become popular on smartphones and other portable display devices. However, due to variations in the color temperature setting of cameras, environmental lighting, and so on, it can be difficult to accurately estimate attributes (e.g., skin tone) of the facial region of an individual depicted in a digital image. Therefore, there is a need for an improved system and method for estimating skin tone profiles.
- Other systems, methods, features, and advantages of the present disclosure will be or become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims.
- Various aspects of the disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, with emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
- FIG. 1 is a block diagram of a computing device for generating skin tone profiles in accordance with various embodiments of the present disclosure.
- FIG. 2 is a schematic diagram of the computing device of FIG. 1 in accordance with various embodiments of the present disclosure.
- FIG. 3 is a top-level flowchart illustrating examples of functionality implemented as portions of the computing device of FIG. 1 for generating skin tone profiles according to various embodiments of the present disclosure.
- FIG. 4 illustrates a region of interest defined by the computing device in FIG. 1 according to various embodiments of the present disclosure.
- FIG. 5 illustrates a luminance histogram for pixels within the region of interest generated by the computing device in FIG. 1 according to various embodiments of the present disclosure.
- Various embodiments are disclosed for accurately generating skin tone profiles of individuals depicted in digital images. Accurate determination of skin tone profiles is important for such applications as performing virtual application of makeup effects, recommending compatible makeup products, and so on. A description of a system for generating skin tone profiles is now described followed by a discussion of the operation of the components within the system.
FIG. 1 is a block diagram of acomputing device 102 in which the techniques for generating skin tone profiles disclosed herein may be implemented. Thecomputing device 102 may be embodied as a computing device such as, but not limited to, a smartphone, a tablet computing device, a laptop, and so on. - A
profile generator 104 executes on a processor of thecomputing device 102 and includes areference color extractor 106, acalibration unit 108, acamera interface 110, and acontent analyzer 112. Thereference color extractor 106 is configured to obtain a reference image depicting one or more reference colors where the reference image may be depicted on a white balance card, a banknote, or other source with a known color scheme. Thecalibration unit 108 is configured to calibrate parameters of the digital camera based on the one or more reference colors. - The
camera interface 110 is configured to cause a digital camera to capture a digital image of an individual. As one of ordinary skill will appreciate, the digital image may be encoded in any of a number of formats including, but not limited to, JPEG (Joint Photographic Experts Group) files, TIFF (Tagged Image File Format) files, PNG (Portable Network Graphics) files, GIF (Graphics Interchange Format) files, BMP (bitmap) files or any number of other digital formats. Alternatively, the digital image may be derived from a still image of a video encoded in formats including, but not limited to, Motion Picture Experts Group (MPEG)-1, MPEG-2, MPEG-4, H.264, Third Generation Partnership Project (3GPP), 3GPP-2, Standard-Definition Video (SD-Video), High-Definition Video (HD-Video), Digital Versatile Disc (DVD) multimedia, Video Compact Disc (VCD) multimedia, High-Definition Digital Versatile Disc (HD-DVD) multimedia, Digital Television Video / High-definition Digital Television (DTV/HDTV) multimedia, Audio Video Interleave (AVI), Digital Video (DV), QuickTime (QT) file, Windows Media Video (WMV), Advanced System Format (ASF), Real Media (RM), Flash Media (FLV), an MPEG Audio Layer III (MP3), an MPEG Audio Layer II (MP2), Waveform Audio Format (WAV), Windows Media Audio (WMA), 360 degree video, 3D scan model, or any number of other digital formats. - The
content analyzer 112 is configured to define a region of interest in a facial region of the individual depicted in the digital image captured by the digital camera. Thecontent analyzer 112 is further configured to generate a skin color profile for pixels within the region of interest. Thecontent analyzer 112 is further configured to obtainmakeup product recommendations 118 from adata store 116 based on the generated skin color profile and display themakeup product recommendation 118 in a user interface to the user of thecomputing device 102. -
FIG. 2 illustrates a schematic block diagram of thecomputing device 102 inFIG. 1 . Thecomputing device 102 may be embodied in any one of a wide variety of wired and/or wireless computing devices, such as a desktop computer, portable computer, dedicated server computer, multiprocessor computing device, smart phone, tablet, and so forth. As shown inFIG. 2 , thecomputing device 102 comprisesmemory 214, aprocessing device 202, a number of input/output interfaces 204, anetwork interface 206, adisplay 208, aperipheral interface 211, andmass storage 226, wherein each of these components are connected across a local data bus 210. - The
processing device 202 may include any custom made or commercially available processor, a central processing unit (CPU) or an auxiliary processor among several processors associated with thecomputing device 102, a semiconductor based microprocessor (in the form of a microchip), a macroprocessor, one or more application specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and other well known electrical configurations comprising discrete elements both individually and in various combinations to coordinate the overall operation of the computing system. - The
memory 214 may include any one of a combination of volatile memory elements (e.g., random-access memory (RAM, such as DRAM, and SRAM, etc.)) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.). Thememory 214 typically comprises anative operating system 216, one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc. For example, the applications may include application specific software which may comprise some or all the components of thecomputing device 102 depicted inFIG. 1 . In accordance with such embodiments, the components are stored inmemory 214 and executed by theprocessing device 202, thereby causing theprocessing device 202 to perform the operations/functions disclosed herein. One of ordinary skill in the art will appreciate that thememory 214 can, and typically will, comprise other components which have been omitted for purposes of brevity. For some embodiments, the components in thecomputing device 102 may be implemented by hardware and/or software. - Input/
output interfaces 204 provide any number of interfaces for the input and output of data. For example, where thecomputing device 102 comprises a personal computer, these components may interface with one or more user input/output interfaces 204, which may comprise a keyboard or a mouse, as shown inFIG. 2 . Thedisplay 208 may comprise a computer monitor, a plasma screen for a PC, a liquid crystal display (LCD) on a hand held device, a touchscreen, or other display device. - In the context of this disclosure, a non-transitory computer-readable medium stores programs for use by or in connection with an instruction execution system, apparatus, or device. More specific examples of a computer-readable medium may include by way of example and without limitation: a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), and a portable compact disc read-only memory (CDROM) (optical).
- Reference is made to
FIG. 3 , which is aflowchart 300 in accordance with various embodiments for generating skin tone profiles performed by thecomputing device 102 ofFIG. 1 . It is understood that theflowchart 300 ofFIG. 3 provides merely an example of the different types of functional arrangements that may be employed to implement the operation of the various components of thecomputing device 102. As an alternative, theflowchart 300 ofFIG. 3 may be viewed as depicting an example of steps of a method implemented in thecomputing device 102 according to one or more embodiments. - Although the
flowchart 300 of FIG. 3 shows a specific order of execution, it is understood that the order of execution may differ from that which is depicted. For example, the order of execution of two or more blocks may be scrambled relative to the order shown. Also, two or more blocks shown in succession in FIG. 3 may be executed concurrently or with partial concurrence. It is understood that all such variations are within the scope of the present disclosure. - At
block 310, the computing device 102 obtains a reference image depicting at least one reference color. For some embodiments, the reference image can be depicted on such objects as a white balance card, a color checker, a banknote, a credit card, photocopy paper, tissue paper, a mobile phone, or a non-glossy white object. For such embodiments, the object depicting the reference image is located at a predefined distance from the digital camera. - At
block 320, the computing device 102 calibrates parameters of the digital camera based on the at least one reference color, where such parameters may include white balance level, exposure compensation, gamma correction, and so on. At block 330, the computing device 102 captures a digital image of an individual utilizing the calibrated parameters. - At
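The white-balance portion of the block 320 calibration can be illustrated with a minimal sketch. The function names and the green-channel anchoring below are illustrative assumptions, not the patented method:

```python
def white_balance_gains(reference_rgb):
    """Compute per-channel gains that map the captured reference color to
    neutral gray (equal R, G, B), anchoring on the green channel as in
    common white-patch schemes (an assumption here)."""
    r, g, b = reference_rgb
    return (g / r, 1.0, g / b)

def apply_gains(pixel, gains):
    """Apply calibration gains to one RGB pixel, clamping to the 8-bit range."""
    return tuple(min(255, round(c * k)) for c, k in zip(pixel, gains))
```

For a reference patch captured as (200, 180, 160), the gains (0.9, 1.0, 1.125) map that patch back to a neutral (180, 180, 180).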
block 340, the computing device 102 defines a region of interest in a facial region of the individual depicted in the digital image captured by the digital camera. For some embodiments, the computing device 102 defines the region of interest by determining a color distance between pixels in the facial region and one or more predetermined target skin tones and designating pixels within a threshold color distance of the one or more predetermined target skin tones as part of the region of interest. - For some embodiments, the
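The color-distance test described for block 340 can be sketched as follows. Euclidean distance in RGB is an assumption for illustration; the patent does not fix a particular metric:

```python
def color_distance(p, q):
    """Euclidean distance between two RGB colors."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def select_roi_pixels(pixels, target_tones, threshold):
    """Keep only pixels within `threshold` color distance of at least one
    predetermined target skin tone."""
    return [p for p in pixels
            if any(color_distance(p, t) <= threshold for t in target_tones)]
```

A pixel close to a target tone is retained, while a far-off pixel (e.g., hair or background) is excluded from the region of interest.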
computing device 102 defines the region of interest by identifying locations of predetermined feature points within the facial region and defining a boundary of the region of interest based on the locations of the predetermined feature points. To further illustrate, reference is made to FIG. 4, which illustrates definition of a region of interest 404 according to various embodiments. In the example shown, the computing device 102 analyzes the facial region 402 of the individual and identifies various feature points (shown as dots). The computing device 102 generates a region of interest 404 based on the location of the feature points. - In accordance with some embodiments, the
computing device 102 detects the locations of various target feature points, which may comprise, for example, the eyes, nose, mouth, eyebrows, and so on. The target feature points may also include the overall facial contour of the user's face. The computing device 102 then defines a region of interest 404 based on the location of the feature points. As shown, the computing device 102 may be configured to define a boundary based on a series of parabolic curves defined at or near various feature points. In the example shown, the region of interest comprises the cheek and nose regions of the user. That is, in some embodiments, the region of interest may be predefined based on specific target regions or features of the user (e.g., the cheek and nose regions), where the boundary is then defined based on the actual feature points detected on the user's face such that the region of interest encompasses those target regions or features. - Referring back to
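Once a boundary has been defined from the feature points, membership of a pixel in the region of interest reduces to a point-in-polygon test. The description above uses parabolic curves; the straight-edged polygon in this sketch is a simplifying assumption:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: return True if (x, y) lies inside the polygon,
    given as an ordered list of (x, y) vertices.  Counts how many edges a
    rightward ray from the point crosses; an odd count means inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

A production implementation would build the polygon from detected facial landmarks (cheek and nose contours) rather than the hard-coded vertices used for testing here.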
FIG. 3, at block 350, the computing device 102 generates a skin tone profile for pixels within the region of interest. For some embodiments, the computing device 102 generates the skin tone profile by generating a luminance histogram for the pixels within the region of interest, removing predetermined portions of the luminance histogram to generate a target histogram portion, determining a dominant color value based on the target histogram portion, and generating the skin tone profile based on the determined dominant color value. The dominant color value may be determined by such techniques as calculating a mean of the target histogram, calculating a peak of the target histogram, calculating a weighted average of the target histogram, or calculating a mean based on a mean-shift clustering algorithm of the target histogram. - The luminance histogram illustrates the distribution of pixel brightness of the region of interest, where the pixel brightness is typically computed using either the Y component in the YUV color space or the L component in the Lab (or CIELAB) color space. To further illustrate, reference is made to
FIG. 5, which illustrates a luminance histogram 502 for pixels within the region of interest according to various embodiments. For some embodiments, the computing device 102 removes predetermined portions 504, 506 of the histogram to generate a target histogram portion. - One
predetermined portion 506 may comprise, for example, the upper 30% of the histogram 502 that corresponds to pixels that are lighter (e.g., pixels that are part of a reflective portion). Specifically, the upper portion generally corresponds to the light reflection that occurs on the region of interest. Another predetermined portion 504 may comprise, for example, the lower 30% of the histogram 502 that corresponds to pixels that are darker (e.g., pixels that are part of a shadow region). Specifically, the lower portion generally corresponds to the shadow effect that occurs on the region of interest. - An average color value is determined based on the remaining histogram portion 508 (i.e., the target histogram portion), and the skin tone profile is generated based on the determined average color value. By excluding the predetermined upper and lower portions of the
histogram 502, the computing device 102 is able to reduce the impact of light and shadow components on the average color calculation, thereby providing more accurate skin tone estimation. - For some embodiments, the
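The trimming and averaging described above amount to a trimmed mean over the sorted luminance values. The 30% fractions follow the example portions 504/506; the function name and default fractions are illustrative assumptions:

```python
def trimmed_mean_luminance(luminances, lower=0.3, upper=0.3):
    """Sort pixel luminances, discard the darkest `lower` fraction (shadow
    portion 504) and the brightest `upper` fraction (reflection portion
    506), and average the remaining target portion (508)."""
    s = sorted(luminances)
    lo = int(len(s) * lower)
    hi = len(s) - int(len(s) * upper)
    kept = s[lo:hi]
    return sum(kept) / len(kept)
```

With ten sample luminances, the three darkest and three brightest values are dropped and the mean of the middle four is returned, so isolated glare or shadow pixels cannot skew the estimate.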
computing device 102 generates the skin tone profile by extracting an illumination layer and a reflectance layer from the pixels within the region of interest and generating the skin tone profile based on the reflectance layer. For some embodiments, the computing device 102 generates the skin tone profile by converting a detected skin tone from a first color space to a second color space based on a predefined transformation matrix or by mapping a detected skin tone from a first classification to a second classification based on a predefined lookup table. - At
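A conversion via a predefined transformation matrix can be sketched with the standard BT.601 RGB-to-YUV matrix. The patent does not specify which matrix it uses, so BT.601 is an assumption here:

```python
# BT.601 RGB -> YUV matrix (a publicly standardized matrix, used here as a
# stand-in for the patent's unspecified "predefined transformation matrix").
RGB_TO_YUV = [
    [ 0.299,    0.587,    0.114   ],
    [-0.14713, -0.28886,  0.436   ],
    [ 0.615,   -0.51499, -0.10001 ],
]

def convert(color, matrix):
    """Multiply a 3-component color vector by a 3x3 transformation matrix."""
    return tuple(sum(m * c for m, c in zip(row, color)) for row in matrix)
```

For pure white (255, 255, 255) the conversion yields full luminance (Y = 255) and near-zero chrominance (U, V ≈ 0), as expected of a neutral color.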
block 360, the computing device 102 displays a predetermined makeup product recommendation based on the skin tone profile. For some embodiments, each product recommendation 118 in the data store 116 (FIG. 1) also includes a target RGB value or range of target RGB values associated with a target skin tone color. Note that instead of target RGB value(s), a target YUV value or range of target YUV values could be stored for each product recommendation 118. Similarly, a target Lab value or range of target Lab values could be stored for each product recommendation 118. The computing device 102 obtains one or more product recommendations 118 from the data store 116 by matching the estimated skin tone profile with one or more target RGB/YUV/Lab value(s) of corresponding product recommendations 118. Thereafter, the process in FIG. 3 ends. - It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
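The matching in block 360 can be sketched as a nearest-target lookup over stored product values. The product names, the dictionary layout, and the use of squared Euclidean distance are illustrative assumptions; the data store 116 could equally hold YUV or Lab targets:

```python
def recommend(skin_tone, products):
    """Return the product whose stored target RGB value is closest to the
    estimated skin tone profile (squared Euclidean distance; the metric is
    an assumption)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(products, key=lambda name: dist(products[name], skin_tone))
```

An estimated profile near a light target tone is matched to the corresponding shade rather than a distant one.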
Claims (15)
- A method implemented in a computing device having a digital camera, comprising:
obtaining a reference image depicting at least one reference color;
calibrating parameters of the digital camera based on the at least one reference color;
capturing, by the digital camera, a digital image of an individual utilizing the calibrated parameters;
defining a region of interest in a facial region of the individual depicted in the digital image captured by the digital camera;
generating a skin tone profile for pixels within the region of interest; and
displaying a predetermined makeup product recommendation based on the skin tone profile.
- The method of claim 1, wherein the reference image is depicted on one of the following objects: a white balance card, a color checker, a banknote, a credit card, photocopy paper, tissue paper, a mobile phone, a non-glossy white object.
- The method of claim 2, wherein the object depicting the reference image is located at a predefined distance from the digital camera.
- The method of claim 1, wherein the parameters of the digital camera comprise at least one of: white balance level; exposure compensation; and gamma correction.
- The method of claim 1, wherein defining the region of interest in the facial region comprises:
determining a color distance between pixels in the facial region and one or more predetermined target skin tones; and
designating pixels within a threshold color distance of the one or more predetermined target skin tones as part of the region of interest.
- The method of claim 1, wherein defining the region of interest in the facial region comprises:
identifying locations of predetermined feature points within the facial region; and
defining a boundary of the region of interest based on the locations of the predetermined feature points.
- The method of claim 1, wherein generating the skin tone profile for the pixels within the region of interest comprises:
generating a luminance histogram for the pixels within the region of interest;
removing predetermined portions of the luminance histogram to generate a target histogram portion;
determining a dominant color value based on the target histogram portion; and
generating the skin tone profile based on the determined dominant color value.
- The method of claim 7, wherein determining the dominant color value based on the target histogram portion comprises one of:
calculating a mean of the target histogram;
calculating a peak of the target histogram;
calculating a weighted average of the target histogram; or
calculating a mean based on a mean-shift clustering algorithm of the target histogram.
- The method of claim 1, wherein generating the skin tone profile for the pixels within the region of interest comprises:
extracting an illumination layer and a reflectance layer from the pixels within the region of interest; and
generating the skin tone profile based on the reflectance layer.
- The method of claim 1, wherein generating the skin tone profile for the pixels within the region of interest comprises one of:
converting a detected skin tone from a first color space to a second color space based on a predefined transformation matrix; or
mapping a detected skin tone from a first classification to a second classification based on a predefined lookup table.
- A system, comprising:
a digital camera;
a memory storing instructions;
a processor coupled to the memory and configured by the instructions to at least:
obtain a reference image depicting at least one reference color;
calibrate parameters of the digital camera based on the at least one reference color;
capture, by the digital camera, a digital image of an individual utilizing the calibrated parameters;
define a region of interest in a facial region of the individual depicted in the digital image captured by the digital camera;
generate a skin tone profile for pixels within the region of interest; and
display a predetermined makeup product recommendation based on the skin tone profile.
- The system of claim 11, wherein the reference image is depicted on one of the following objects: a white balance card, a color checker, a banknote, a credit card, photocopy paper, tissue paper, a mobile phone, a non-glossy white object.
- The system of claim 11, wherein the parameters of the digital camera comprise at least one of: white balance level; exposure compensation; and gamma correction.
- The system of claim 11, wherein the processor defines the region of interest in the facial region by:
determining a color distance between pixels in the facial region and one or more predetermined target skin tones; and
designating pixels within a threshold color distance of the one or more predetermined target skin tones as part of the region of interest; or
wherein the processor defines the region of interest in the facial region by:
identifying locations of predetermined feature points within the facial region; and
defining a boundary of the region of interest based on the locations of the predetermined feature points; or
wherein the processor generates the skin tone profile for the pixels within the region of interest by:
generating a luminance histogram for the pixels within the region of interest;
removing predetermined portions of the luminance histogram to generate a target histogram portion;
determining a dominant color value based on the target histogram portion; and
generating the skin tone profile based on the determined dominant color value; or
wherein the processor generates the skin tone profile for the pixels within the region of interest by:
extracting an illumination layer and a reflectance layer from the pixels within the region of interest; and
generating the skin tone profile based on the reflectance layer.
- A non-transitory computer-readable storage medium storing instructions to be implemented by a computing device having a processor, wherein the instructions, when executed by the processor, cause the computing device to perform the method of any one of claims 1 to 10.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201862681174P | 2018-06-06 | 2018-06-06 | |
| US16/114,555 US10719729B2 (en) | 2018-06-06 | 2018-08-28 | Systems and methods for generating skin tone profiles |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP3579139A1 true EP3579139A1 (en) | 2019-12-11 |
Family
ID=64362329
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP18206578.9A Withdrawn EP3579139A1 (en) | 2018-06-06 | 2018-11-15 | Systems and methods for generating calibrated skin tone profiles |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US10719729B2 (en) |
| EP (1) | EP3579139A1 (en) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11935322B1 (en) | 2020-09-14 | 2024-03-19 | Apple Inc. | Obstruction-sensitive white point determination using face information |
| DE102023112912A1 (en) | 2023-05-16 | 2024-11-21 | Kai Fröhner | Method and system for color detection in facial images |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10891331B1 (en) | 2019-02-07 | 2021-01-12 | Pinterest, Inc. | Skin tone filter |
| US11182963B2 (en) * | 2019-04-03 | 2021-11-23 | Posnap, Inc. | Computerized system and method for providing a mobile augmented reality item display and selection experience |
| US11348366B2 (en) | 2019-04-23 | 2022-05-31 | The Procter And Gamble Company | Apparatus and method for determining cosmetic skin attributes |
| WO2020219612A1 (en) * | 2019-04-23 | 2020-10-29 | The Procter & Gamble Company | Apparatus and method for visualizing cosmetic skin attributes |
| US10936853B1 (en) * | 2019-10-04 | 2021-03-02 | Adobe Inc. | Skin tone assisted digital image color matching |
| CN117136553A (en) * | 2021-03-30 | 2023-11-28 | 斯纳普公司 | inclusive camera rig |
| US12307812B2 (en) | 2021-05-05 | 2025-05-20 | Perfect Mobile Corp. | System and method for personality prediction using multi-tiered analysis |
| US11816144B2 (en) | 2022-03-31 | 2023-11-14 | Pinterest, Inc. | Hair pattern determination and filtering |
| US20240193731A1 (en) * | 2022-12-12 | 2024-06-13 | Google Llc | Face region based automatic white balance in images |
| JP2025002837A (en) * | 2023-06-23 | 2025-01-09 | パナソニックIpマネジメント株式会社 | Imaging device |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070058858A1 (en) * | 2005-09-09 | 2007-03-15 | Michael Harville | Method and system for recommending a product based upon skin color estimated from an image |
| US20140267664A1 (en) * | 2013-03-15 | 2014-09-18 | Skin Republic, Inc. | Systems and methods for specifying and formulating customized topical agents |
| US20150326842A1 (en) * | 2014-12-10 | 2015-11-12 | Xiaoning Huai | Automatic White Balance with Facial Color Features as Reference Color Surfaces |
| US9449412B1 (en) * | 2012-05-22 | 2016-09-20 | Image Metrics Limited | Adaptive, calibrated simulation of cosmetic products on consumer devices |
Family Cites Families (32)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6980221B2 (en) * | 2001-07-06 | 2005-12-27 | Eastman Kodak Company | Method for representing a digital color image using a set of palette colors |
| US7844076B2 (en) | 2003-06-26 | 2010-11-30 | Fotonation Vision Limited | Digital image processing using face detection and skin tone information |
| WO2006024962A2 (en) | 2004-08-30 | 2006-03-09 | Sigrid Heuer | Cosmetic skin color determination and color mixing apparatus |
| FR2881858A1 (en) | 2005-02-04 | 2006-08-11 | Oreal | INTERACTIVE SYSTEM USEFUL IN COSMETICS AND METHOD FOR CONSTRUCTING A DATABASE |
| US7612794B2 (en) | 2005-05-25 | 2009-11-03 | Microsoft Corp. | System and method for applying digital make-up in video conferencing |
| US7689016B2 (en) * | 2005-05-27 | 2010-03-30 | Stoecker & Associates, A Subsidiary Of The Dermatology Center, Llc | Automatic detection of critical dermoscopy features for malignant melanoma diagnosis |
| US7522768B2 (en) | 2005-09-09 | 2009-04-21 | Hewlett-Packard Development Company, L.P. | Capture and systematic use of expert color analysis |
| KR100857463B1 (en) * | 2006-11-17 | 2008-09-08 | 주식회사신도리코 | Facial Region Detection Device and Correction Method for Photo Printing |
| WO2008108763A1 (en) | 2007-03-08 | 2008-09-12 | Hewlett-Packard Development Company, L.P. | Method and system for skin color estimation from an image |
| EP2043038A1 (en) * | 2007-05-08 | 2009-04-01 | Ruben Pais | Method and instruments for sale of personal care products |
| US7856118B2 (en) | 2007-07-20 | 2010-12-21 | The Procter & Gamble Company | Methods for recommending a personal care product and tools therefor |
| CN101796848A (en) * | 2007-09-03 | 2010-08-04 | Nxp股份有限公司 | Color enhancement |
| US9058765B1 (en) * | 2008-03-17 | 2015-06-16 | Taaz, Inc. | System and method for creating and sharing personalized virtual makeovers |
| US8027521B1 (en) * | 2008-03-25 | 2011-09-27 | Videomining Corporation | Method and system for robust human gender recognition using facial feature localization |
| US20100158357A1 (en) | 2008-12-19 | 2010-06-24 | Qualcomm Incorporated | Image processing method and system of skin color enhancement |
| US8634640B2 (en) | 2010-10-21 | 2014-01-21 | Hewlett-Packard Development Company, L.P. | Method and apparatus for selecting a color palette |
| US9135503B2 (en) * | 2010-11-09 | 2015-09-15 | Qualcomm Incorporated | Fingertip tracking for touchless user interface |
| CN103534664B (en) | 2011-05-12 | 2016-08-31 | 苹果公司 | There is sensing |
| US9024961B2 (en) * | 2011-12-19 | 2015-05-05 | Dolby Laboratories Licensing Corporation | Color grading apparatus and methods |
| FR2985064B1 (en) | 2011-12-23 | 2016-02-26 | Oreal | METHOD FOR DELIVERING COSMETIC ADVICE |
| JP2013207721A (en) | 2012-03-29 | 2013-10-07 | Fujifilm Corp | Image pickup device and its white balance correction method and white balance correction program |
| US9104908B1 (en) | 2012-05-22 | 2015-08-11 | Image Metrics Limited | Building systems for adaptive tracking of facial features across individuals and groups |
| US10558848B2 (en) * | 2017-10-05 | 2020-02-11 | Duelight Llc | System, method, and computer program for capturing an image with correct skin tone exposure |
| CN104320641B (en) | 2013-03-13 | 2017-04-12 | 豪威科技股份有限公司 | Apparatus and method for automated self-training of white balance by electronic cameras |
| US9064279B1 (en) | 2013-05-22 | 2015-06-23 | Sephora USA, Inc. | System for cosmetics matching based on skin tone |
| WO2015030705A1 (en) | 2013-08-26 | 2015-03-05 | Intel Corporation | Automatic white balancing with skin tone correction for image processing |
| CN104796683B (en) | 2014-01-22 | 2018-08-14 | 南京中兴软件有限责任公司 | A kind of method and system of calibration image color |
| US9251567B1 (en) * | 2014-03-13 | 2016-02-02 | Google Inc. | Providing color corrections to photos |
| US9760935B2 (en) * | 2014-05-20 | 2017-09-12 | Modiface Inc. | Method, system and computer program product for generating recommendations for products and treatments |
| JP6574074B2 (en) | 2016-02-08 | 2019-09-11 | イクアリティ コスメティクス インコーポレイテッドEquality Cosmetics, Inc. | Apparatus and method for formulating and dispensing visually customized cosmetics |
| CN106388781A (en) | 2016-09-29 | 2017-02-15 | 深圳可思美科技有限公司 | Method for detecting skin colors and pigmentation situation of skin |
| CN106846422B (en) | 2017-02-17 | 2020-06-19 | 深圳可思美科技有限公司 | Method for identifying skin sunscreen condition |
2018
- 2018-08-28 US US16/114,555 patent/US10719729B2/en active Active
- 2018-11-15 EP EP18206578.9A patent/EP3579139A1/en not_active Withdrawn
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11935322B1 (en) | 2020-09-14 | 2024-03-19 | Apple Inc. | Obstruction-sensitive white point determination using face information |
| DE102023112912A1 (en) | 2023-05-16 | 2024-11-21 | Kai Fröhner | Method and system for color detection in facial images |
| WO2024236113A1 (en) | 2023-05-16 | 2024-11-21 | Froehner Kai | Method and system for colour determination in facial images |
Also Published As
| Publication number | Publication date |
|---|---|
| US10719729B2 (en) | 2020-07-21 |
| US20190377969A1 (en) | 2019-12-12 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10719729B2 (en) | Systems and methods for generating skin tone profiles | |
| EP3690825B1 (en) | Systems and methods for virtual application of makeup effects based on lighting conditions and surface properties of makeup effects | |
| US20190130169A1 (en) | Image processing method and device, readable storage medium and electronic device | |
| KR101954851B1 (en) | Metadata-based image processing method and apparatus | |
| Didyk et al. | Enhancement of bright video features for HDR displays | |
| EP3491963A1 (en) | Systems and methods for identification and virtual application of cosmetic products | |
| US10304164B2 (en) | Image processing apparatus, image processing method, and storage medium for performing lighting processing for image data | |
| US10204432B2 (en) | Methods and systems for color processing of digital images | |
| US12333694B2 (en) | Image processing method and apparatus | |
| US10762665B2 (en) | Systems and methods for performing virtual application of makeup effects based on a source image | |
| WO2007072907A1 (en) | Gray-scale correcting method, gray-scale correcting device, gray-scale correcting program, and image device | |
| CN107871303B (en) | An image processing method and device | |
| US20180039864A1 (en) | Fast and accurate skin detection using online discriminative modeling | |
| CN107862663A (en) | Image processing method, device, readable storage medium and computer equipment | |
| CN105096267B (en) | A kind of method and apparatus that eye brightness is adjusted based on identification of taking pictures | |
| CN107172354A (en) | Method for processing video frequency, device, electronic equipment and storage medium | |
| CN107911625A (en) | Light measuring method, light measuring device, readable storage medium and computer equipment | |
| CN107993209A (en) | Image processing method, device, computer-readable recording medium and electronic equipment | |
| CN105957020A (en) | Image generator and image generation method | |
| US10789769B2 (en) | Systems and methods for image style transfer utilizing image mask pre-processing | |
| CN110570476A (en) | System, method and storage medium for execution on computing device | |
| US10789693B2 (en) | System and method for performing pre-processing for blending images | |
| US11182634B2 (en) | Systems and methods for modifying labeled content | |
| US11763509B2 (en) | Frame calibration for robust video synthesis | |
| CN113298888B (en) | Image processing method, device, equipment and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | AX | Request for extension of the European patent | Extension state: BA ME |
| | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| | 17P | Request for examination filed | Effective date: 20200609 |
| | RBV | Designated contracting states (corrected) | Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| | 17Q | First examination report despatched | Effective date: 20210730 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| | 18D | Application deemed to be withdrawn | Effective date: 20211210 |