WO2022001648A1 - Image processing method and apparatus, and device and medium - Google Patents
Image processing method and apparatus, and device and medium
- Publication number
- WO2022001648A1 (PCT/CN2021/100019)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- area
- image
- target object
- change trend
- contrast value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
Definitions
- the present application belongs to the technical field of image processing, and in particular relates to an image processing method, apparatus, device and medium.
- Matting is one of the most common operations in image processing: separating a certain part of an image from the rest of the image.
- A lasso tool, a marquee tool, a magic wand tool, a pen tool, or the like is usually used to cut out the image manually.
- the purpose of the embodiments of the present application is to provide an image processing method, apparatus, device, and medium, which can solve the problem of inaccurate matting.
- an embodiment of the present application provides an image processing method, including:
- acquiring N images, where the N images are images captured by an image acquisition component at different focusing distances;
- dividing each of the N images into M areas;
- acquiring the pixel contrast value of each area of each image;
- determining, according to the pixel contrast value, the coordinate information of the target object in the target image, where the target image is an image in the N images;
- acquiring the target object in the target image according to the coordinate information.
- an image processing apparatus including:
- a first acquisition module configured to acquire N images, wherein the N images are images captured by the image acquisition component at different focusing distances;
- a division module configured to divide each of the N images into M areas;
- a second acquisition module configured to acquire the pixel contrast value of each area of each image;
- a determining module configured to determine the coordinate information of the target object in the target image according to the pixel contrast value, wherein the target image is an image in N images;
- the third acquiring module is used for acquiring the target object in the target image according to the coordinate information.
- embodiments of the present application provide an electronic device, where the electronic device includes a processor, a memory, and a program or instruction stored in the memory and executable on the processor, and the program or instruction, when executed by the processor, implements the steps of the method according to the first aspect.
- an embodiment of the present application provides a computer-readable storage medium, where a program or an instruction is stored on the computer-readable storage medium, and when the program or instruction is executed by a processor, the steps of the method according to the first aspect are implemented.
- an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement the steps of the method according to the first aspect.
- each of the N images is divided into M areas; the coordinate information of the target object in the target image is determined according to the pixel contrast value of each area of each image; and the target object is obtained from the target image according to the coordinate information, thereby realizing the matting of the target object.
- the embodiments of the present application can thus cut out the target object automatically, which can improve the accuracy of the matting.
- FIG. 1 is a schematic flowchart of an image processing method provided by an embodiment of the present application.
- FIG. 2 is a schematic diagram of a target shooting scene provided by an embodiment of the present application.
- FIG. 3 is a schematic diagram of N images provided by an embodiment of the present application.
- FIG. 6 is a schematic diagram of sub-regional display of objects provided by an embodiment of the present application.
- FIG. 7 is a schematic diagram of a processed target image provided by an embodiment of the present application.
- FIG. 8 is a schematic structural diagram of an image processing apparatus provided by an embodiment of the present application.
- FIG. 9 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
- FIG. 1 is a schematic flowchart of an image processing method provided by an embodiment of the present application. As shown in Figure 1, the image processing method may include:
- S101: acquire N images, where the N images are images captured by the image acquisition component at different focusing distances;
- S102: divide each of the N images into M areas;
- S103: acquire the pixel contrast value of each area of each image;
- S104: determine the coordinate information of the target object in the target image according to the pixel contrast value, where the target image is an image in the N images;
- S105: acquire the target object in the target image according to the coordinate information.
- the execution subject may be an image processing apparatus, or a control module in the image processing apparatus for executing the image processing method.
- an image processing method performed by an image processing apparatus is used as an example to describe the image processing method provided by the embodiments of the present application.
- the image processing device acquires N images; divides each of the N images into M areas; acquires the pixel contrast value of each area of each image; determines the coordinate information of the target object in the target image according to the pixel contrast value; and obtains the target object in the target image according to the coordinate information.
- each of the N images is divided into M areas; the coordinate information of the target object in the target image is determined according to the pixel contrast value of each area of each image; and the target object is obtained from the target image according to the coordinate information, thereby realizing the matting of the target object.
- the embodiments of the present application can thus cut out the target object automatically, which can improve the accuracy of the matting.
- the image acquisition component may be a single image acquisition component.
- a single image acquisition component is used to collect N images of the target shooting scene while the focusing distance is gradually changed, and each of the N images is divided into M areas;
- the coordinate information of the target object in the target image is then determined according to the pixel contrast value of each area, and the target object in the target image is obtained according to the coordinate information. That is, the matting of the target object is realized, and the accuracy of the matting can be improved.
- the change trend of the focus distance may be a gradual change from small to large, or a gradual change from large to small.
- Focusing distance refers to the object-to-image distance, that is, the sum of the distance from the lens to the object and the distance from the lens to the photosensitive element.
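In standard thin-lens terms (a textbook identity, not language from the application), the focusing distance d defined above is the object distance u plus the image distance v, which the focal length f couples:

```latex
d = u + v, \qquad \frac{1}{f} = \frac{1}{u} + \frac{1}{v}
```

With f fixed, decreasing u increases v, so sweeping the focus gradually changes d in one direction.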
- FIG. 2 is a schematic diagram of a target shooting scene provided by an embodiment of the present application.
- N images of the target shooting scene shown in FIG. 2 are collected by the image acquisition component while the focusing distance is gradually changed, as shown in FIG. 3.
- FIG. 3 is a schematic diagram of N images provided by an embodiment of the present application; FIG. 3 shows four of these images as examples.
- FIG. 4 is a schematic diagram of area division provided by an embodiment of the present application; each square in FIG. 4 represents one area.
- each of the multiple regions obtained by dividing in S102 may include multiple pixels.
- the pixel contrast value of the area may be the sum of the pixel contrast values of the pixels included in the area.
- the pixel contrast value refers to the color difference between adjacent pixels.
- each of the multiple regions obtained by dividing in S102 may include only one pixel.
- the pixel contrast value of the area may be the pixel contrast value of one pixel included in the area.
- each area includes only one pixel, that is, the image is divided at pixel granularity, so that the coordinates of the target object in the target image are more accurate, thereby improving the accuracy of the matting.
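The division and per-area contrast steps above (S102 and S103) can be sketched as follows. This is a hypothetical NumPy illustration, not the application's implementation; the names `grid_regions`, `region_contrast`, and `m_side` are introduced here only for the example:

```python
import numpy as np

def grid_regions(img, m_side):
    """Divide one image into an m_side x m_side grid (M = m_side**2 areas).
    With m_side equal to the image side length, each area is a single
    pixel, i.e. the pixel-granularity division discussed above."""
    h, w = img.shape[:2]
    return [img[r * h // m_side:(r + 1) * h // m_side,
                c * w // m_side:(c + 1) * w // m_side]
            for r in range(m_side) for c in range(m_side)]

def region_contrast(region):
    """Pixel contrast value of an area, taken here as the summed absolute
    color difference between horizontally and vertically adjacent pixels."""
    region = region.astype(np.float64)
    return float(np.abs(np.diff(region, axis=0)).sum()
                 + np.abs(np.diff(region, axis=1)).sum())
```

Computing this for every area of every one of the N images yields an N x M table of contrast values; the behavior of each column over the focus sweep is what the classification of S104 examines.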
- S104 may include: classifying the M areas according to the change trend of the pixel contrast value and the change trend of the focusing distance to obtain a classification result; and determining, according to the classification result, the coordinate information of the target object in the target image.
- the change trend of the pixel contrast value includes but is not limited to: from large to small; from small to large; from large to small and then from small to large; and from small to large and then from large to small.
- the changing trend of the focusing distance includes, but is not limited to: from large to small and from small to large.
- classifying the M areas according to the change trend of the pixel contrast value and the change trend of the focusing distance to obtain the classification result may include: classifying a first area in the M areas as a background area, where the change trend of the pixel contrast value of the first area is the same as the change trend of the focusing distance; classifying a second area in the M areas as a foreground area, where the change trend of the pixel contrast value of the second area is opposite to the change trend of the focusing distance; and classifying a third area in the M areas as a subject area, where the change trend of the pixel contrast value of the third area is first the same as and then opposite to the change trend of the focusing distance, or first opposite to and then the same as the change trend of the focusing distance.
- the target object includes: at least one of a background area, a foreground area and a subject area.
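Assuming a sweep in which the focusing distance increases monotonically, the trend rule above can be sketched like this. It is a simplified, hypothetical test that compares only the endpoints and the peak position of the contrast series, not the application's actual trend analysis:

```python
import numpy as np

def classify_area(contrasts):
    """Classify one area from its pixel contrast values over an
    increasing-focusing-distance sweep:
      background: contrast trend matches the focusing-distance trend (rising)
      foreground: contrast trend is opposite (falling)
      subject:    contrast first rises, then falls (same, then opposite)"""
    c = np.asarray(contrasts, dtype=np.float64)
    peak = int(np.argmax(c))
    if peak == 0 and c[0] > c[-1]:
        return "foreground"
    if peak == len(c) - 1 and c[-1] > c[0]:
        return "background"
    return "subject"
```

For a sweep in the opposite direction, the roles of "rising" and "falling" simply swap, which corresponds to the first-opposite-then-same variant mentioned above.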
- FIG. 5 is a schematic diagram of a result of region classification provided by an embodiment of the present application.
- the rates of change of the pixel contrast values of the background area, the foreground area, and the subject area may be different.
- therefore, the M areas may also be classified according to the rate of change of the pixel contrast value and the change trend of the focusing distance to obtain a classification result.
- for example, the areas in the M areas whose pixel contrast value change rate is greater than a first rate are classified as foreground areas; the areas whose change rate is smaller than a second rate are classified as background areas; and the areas whose change rate is between the second rate and the first rate are classified as subject areas, where the first rate is greater than the second rate.
- alternatively, the areas in the M areas whose pixel contrast value change rate is greater than a third rate are classified as background areas; the areas whose change rate is smaller than a fourth rate are classified as foreground areas; and the areas whose change rate is between the fourth rate and the third rate are classified as subject areas, where the third rate is greater than the fourth rate.
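For the first threshold ordering above (change rate greater than the first rate means foreground, smaller than the second rate means background, in between means subject), a sketch might look like this. Using the mean absolute frame-to-frame difference as the "change rate" is an assumption of this example, not a definition from the application:

```python
import numpy as np

def classify_by_rate(contrasts, first_rate, second_rate):
    """Classify one area by how fast its pixel contrast value changes
    across the focus sweep; first_rate must be greater than second_rate."""
    if first_rate <= second_rate:
        raise ValueError("first_rate must be greater than second_rate")
    # Change rate taken here as the mean absolute frame-to-frame difference.
    rate = float(np.abs(np.diff(np.asarray(contrasts, dtype=np.float64))).mean())
    if rate > first_rate:
        return "foreground"
    if rate < second_rate:
        return "background"
    return "subject"
```

The alternative ordering with the third and fourth rates would swap the foreground and background branches.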
- the target image may be the image, among the N images, that is focused on the target object, that is, an image captured with the target object as the focus.
- for example, when the target object is the subject area, the pixel contrast value of the target object first increases and then decreases over the focus sweep. Therefore, the image corresponding to the maximum pixel contrast value of the target object can be determined as the target image. It can be understood that when the pixel contrast value of the target object is at its maximum, the image acquisition component is focused exactly on the target object, that is, the image is captured with the target object as the focus.
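Picking the target image as the frame where the target object's contrast peaks can be sketched as follows; `contrast_table` and `object_areas` are hypothetical names introduced for this example, with the table holding the contrast of each of the M areas in each of the N images:

```python
import numpy as np

def pick_target_image(contrast_table, object_areas):
    """contrast_table: array of shape (N, M); entry (n, m) is the pixel
    contrast value of area m in image n.  Returns the index of the image
    in which the target object's total contrast is maximal, i.e. the
    image focused on the target object."""
    table = np.asarray(contrast_table, dtype=np.float64)
    return int(table[:, list(object_areas)].sum(axis=1).argmax())
```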
- the image processing method provided by the embodiments of the present application may further include: performing first processing on the target object in the target image, and/or performing second processing on the objects in the target image other than the target object.
- the embodiments of the present application do not limit the specific processing manners of the first processing and the second processing; any available processing manner can be applied, for example: color burn processing, color gradient processing, soft light processing, sharpening processing, oil painting processing, colored pencil processing, and so on.
- the first processing and the second processing may be set according to actual requirements.
- the first processing may be a beauty processing.
- the second processing may be blurring processing.
- when the target object is the subject area, the target object can be made clearer by performing blurring processing on the areas in the target image other than the subject area.
- the image processing method provided by the embodiments of the present application may further include: displaying the target object in a fourth area of the screen, and displaying the objects in the target image other than the target object in a fifth area of the screen. That is, the target object is separated from the other objects in the target image. The first processing and the second processing are then performed, respectively, on the target object displayed in the fourth area and on the other objects displayed in the fifth area.
- FIG. 6 is a schematic diagram of displaying objects in sub-regions provided by an embodiment of the present application.
- the target object is the main area, and other objects except the target object are the background area and the foreground area.
- by performing the first processing on the target object displayed in the fourth area of the screen and the second processing on the other objects displayed in the fifth area of the screen, the risk of mis-processing the target object or the other objects, which may arise when both are processed directly on the target image, can be avoided.
- the image processing method provided by the embodiments of the present application may further include: merging the target object displayed in the fourth area, which has undergone the first processing, with the other objects in the target image displayed in the fifth area, which have undergone the second processing, to obtain the processed target image.
- the target object and other objects other than the target object in the target image are displayed in sub-regions, as shown in FIG. 6 .
- the first processing may be performed on the target object displayed in the fourth area, and/or the second processing may be performed on the other objects in the target image displayed in the fifth area.
- the two processed parts are then merged to obtain the processed target image, as shown in FIG. 7.
- FIG. 7 is a schematic diagram of a processed target image provided by an embodiment of the present application.
- the objects in the target image, namely the subject area, the background area, and the foreground area, may also be displayed in three separate areas. At least one of the objects displayed in these areas is then processed, and after the processing is completed, the three objects can be merged again to obtain the processed target image.
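The separate-process-merge flow described above can be sketched with a boolean mask standing in for the coordinate information. The box blur used as the "second processing" and the names `box_blur` and `subject_mask` are assumptions of this example:

```python
import numpy as np

def box_blur(img, k=3):
    """Simple k x k box blur, standing in for the blurring
    ('second processing') applied to the non-target objects."""
    pad = k // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def process_and_merge(target_image, subject_mask):
    """Keep the subject pixels unchanged (no 'first processing' here),
    blur everything else, and merge the two parts into one image."""
    blurred = box_blur(target_image)
    return np.where(subject_mask, target_image.astype(np.float64), blurred)
```

Processing the two parts separately, as in this sketch, guarantees that the blur never touches the subject pixels, mirroring the mis-processing argument made above.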
- FIG. 8 is a schematic structural diagram of an image processing apparatus provided by an embodiment of the present application. As shown in FIG. 8 , the image processing apparatus 800 may include:
- the first acquisition module 801 is configured to acquire N images, wherein the N images are images captured by the image acquisition component at different focusing distances;
- a dividing module 802 configured to divide each of the N images into M regions
- the second acquisition module 803 is used to acquire the pixel contrast value of each area of each image
- a determining module 804 configured to determine the coordinate information of the target object in the target image according to the pixel contrast value, where the target image is an image in the N images;
- the third obtaining module 805 is configured to obtain the target object in the target image according to the coordinate information.
- each of the N images is divided into M areas; the coordinate information of the target object in the target image is determined according to the pixel contrast value of each area of each image; and the target object is obtained from the target image according to the coordinate information, thereby realizing the matting of the target object.
- the embodiments of the present application can thus cut out the target object automatically, which can improve the accuracy of the matting.
- the determining module 804 may include:
- the classification sub-module is used to classify the M areas according to the change trend of the pixel contrast value and the change trend of the focus distance, and obtain the classification result;
- the determining sub-module is used for determining the coordinate information of the target object in the target image according to the classification result.
- the classification submodule may be specifically used for:
- classify a first area in the M areas as a background area, where the change trend of the pixel contrast value of the first area is the same as the change trend of the focusing distance;
- classify a second area in the M areas as a foreground area, where the change trend of the pixel contrast value of the second area is opposite to the change trend of the focusing distance;
- classify a third area in the M areas as a subject area, where the change trend of the pixel contrast value of the third area is first the same as and then opposite to the change trend of the focusing distance, or first opposite to and then the same as the change trend of the focusing distance;
- the target object includes: at least one of a background area, a foreground area and a subject area.
- the image processing apparatus 800 may further include:
- the display module is used for displaying the target object in the fourth area of the screen, and displaying other objects in the target image except the target object in the fifth area of the screen.
- the target image may be an image with the target object as the focus among the N images.
- the image processing apparatus in this embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal.
- the apparatus may be a mobile electronic device or a non-mobile electronic device.
- the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), etc.
- the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like, which is not specifically limited in the embodiments of the present application.
- the image processing apparatus in this embodiment of the present application may be an apparatus having an operating system.
- the operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
- the image processing apparatus provided in the embodiments of the present application can implement each process in the image processing method embodiments shown in FIG. 1 to FIG. 7 , and in order to avoid repetition, details are not repeated here.
- FIG. 9 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
- the electronic device 900 includes but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, and other components.
- the input unit 904 may include a graphics processor 9041 and a microphone 9042.
- the display unit 906 may include a display panel 9061.
- the user input unit 907 includes a touch panel 9071 and other input devices 9072.
- the memory 909 may be used to store software programs as well as various data.
- the memory 909 may mainly include a stored program area and a stored data area, wherein the stored program area may store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
- the electronic device 900 may also include a power supply (such as a battery) for supplying power to the various components; the power supply may be logically connected to the processor 910 through a power management system, so that functions such as charge management, discharge management, and power consumption management are implemented through the power management system.
- the structure of the electronic device shown in FIG. 9 does not constitute a limitation on the electronic device.
- the electronic device may include more or fewer components than shown, combine some components, or use a different arrangement of components, which will not be repeated here.
- the processor 910 is configured to: acquire N images, where the N images are images captured by the image acquisition component at different focusing distances; divide each of the N images into M areas; acquire the pixel contrast value of each area of each image; determine the coordinate information of the target object in the target image according to the pixel contrast value, where the target image is an image in the N images; and acquire the target object in the target image according to the coordinate information.
- each of the N images is divided into M areas; the coordinate information of the target object in the target image is determined according to the pixel contrast value of each area of each image; and the target object is obtained from the target image according to the coordinate information, thereby realizing the matting of the target object.
- the embodiments of the present application can thus cut out the target object automatically, which can improve the accuracy of the matting.
- the processor 910 may be specifically configured to:
- classify the M areas according to the change trend of the pixel contrast value and the change trend of the focusing distance to obtain a classification result;
- determine the coordinate information of the target object in the target image according to the classification result.
- the processor 910 may be specifically configured to:
- classify a first area in the M areas as a background area, where the change trend of the pixel contrast value of the first area is the same as the change trend of the focusing distance;
- classify a second area in the M areas as a foreground area, where the change trend of the pixel contrast value of the second area is opposite to the change trend of the focusing distance;
- classify a third area in the M areas as a subject area, where the change trend of the pixel contrast value of the third area is first the same as and then opposite to the change trend of the focusing distance, or first opposite to and then the same as the change trend of the focusing distance;
- the target object includes: at least one of a background area, a foreground area and a subject area.
- the display unit 906 may be used for:
- the target object is displayed in the fourth area of the screen, and other objects in the target image except the target object are displayed in the fifth area of the screen.
- an embodiment of the present application further provides an electronic device, including a processor 910, a memory 909, and a program or instruction stored in the memory 909 and executable on the processor 910, where the program or instruction, when executed by the processor 910, implements each process of the above image processing method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here.
- the electronic devices in the embodiments of the present application include the above-mentioned mobile electronic devices and non-mobile electronic devices.
- Embodiments of the present application further provide an electronic device configured to execute each process of the above image processing method embodiments, which can achieve the same technical effect; to avoid repetition, details are not repeated here.
- Embodiments of the present application further provide a computer-readable storage medium, where a program or an instruction is stored on the computer-readable storage medium; when the program or instruction is executed by a processor, each process of the above image processing method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
- the processor is the processor in the electronic device described in the foregoing embodiments.
- Examples of the computer-readable storage medium include non-transitory computer-readable storage media, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
- An embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above image processing method embodiments.
- An embodiment of the present application further provides a computer program product, which can be executed by a processor to implement each process of the above image processing method embodiments and achieve the same technical effect; to avoid repetition, details are not repeated here.
- the chip mentioned in the embodiments of the present application may also be referred to as a system-on-chip, a chip system, or a system-on-a-chip.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010611475.XA CN111866378A (zh) | 2020-06-30 | 2020-06-30 | Image processing method, apparatus, device and medium |
| CN202010611475.X | 2020-06-30 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022001648A1 true WO2022001648A1 (fr) | 2022-01-06 |
Family
ID=72989934
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2021/100019 Ceased WO2022001648A1 (fr) | 2021-06-15 | Image processing method and apparatus, and device and medium |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN111866378A (fr) |
| WO (1) | WO2022001648A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115002356A (zh) * | 2022-07-19 | 2022-09-02 | 深圳市安科讯实业有限公司 | Night vision method for digital video photography |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112362164B (zh) * | 2020-11-10 | 2022-01-18 | 广东电网有限责任公司 | Temperature monitoring method and apparatus for equipment, electronic device, and storage medium |
| CN113055603A (zh) * | 2021-03-31 | 2021-06-29 | 联想(北京)有限公司 | Image processing method and electronic device |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008294785A (ja) * | 2007-05-25 | 2008-12-04 | Sanyo Electric Co Ltd | Image processing device, imaging device, image file, and image processing method |
| US20120320239A1 (en) * | 2011-06-14 | 2012-12-20 | Pentax Ricoh Imaging Company, Ltd. | Image processing device and image processing method |
| CN102843510A (zh) * | 2011-06-14 | 2012-12-26 | 宾得理光映像有限公司 | Imaging device and distance information detection method |
| CN110189339A (zh) * | 2019-06-03 | 2019-08-30 | 重庆大学 | Depth-map-assisted active contour matting method and system |
| CN111246106A (zh) * | 2020-01-22 | 2020-06-05 | 维沃移动通信有限公司 | Image processing method, electronic device, and computer-readable storage medium |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101930533B (zh) * | 2009-06-19 | 2013-11-13 | 株式会社理光 | Apparatus and method for sky detection in an image acquisition device |
| KR20110020519A (ko) * | 2009-08-24 | 2011-03-03 | 삼성전자주식회사 | Digital photographing apparatus, control method thereof, and recording medium storing a program for executing the method |
| CN102338972A (zh) * | 2010-07-21 | 2012-02-01 | 华晶科技股份有限公司 | Method of assisted focusing on multiple face blocks |
| CN105629631B (zh) * | 2016-02-29 | 2020-01-10 | Oppo广东移动通信有限公司 | Control method, control device, and electronic device |
| CN108305215A (zh) * | 2018-01-23 | 2018-07-20 | 北京易智能科技有限公司 | Image processing method and system based on an intelligent mobile terminal |
| CN110336951A (zh) * | 2019-08-26 | 2019-10-15 | 厦门美图之家科技有限公司 | Contrast-detection focusing method and apparatus, and electronic device |
-
2020
- 2020-06-30 CN CN202010611475.XA patent/CN111866378A/zh active Pending
-
2021
- 2021-06-15 WO PCT/CN2021/100019 patent/WO2022001648A1/fr not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| CN111866378A (zh) | 2020-10-30 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN104135609B (zh) | Auxiliary photographing method, device and terminal | |
| CN112749613B (zh) | Video data processing method and apparatus, computer device, and storage medium | |
| US10317777B2 (en) | Automatic zooming method and apparatus | |
| WO2022001648A1 (fr) | Image processing method and apparatus, and device and medium | |
| WO2019237745A1 (fr) | Facial image processing method and apparatus, electronic device, and computer-readable storage medium | |
| CN111290684B (zh) | Image display method, image display apparatus, and terminal device | |
| CN106446223B (zh) | Map data processing method and apparatus | |
| CN106249508B (zh) | Autofocus method and system, and photographing apparatus | |
| WO2022042679A1 (fr) | Image processing method and apparatus, device, and storage medium | |
| CN112714253B (zh) | Video recording method and apparatus, electronic device, and readable storage medium | |
| WO2018166069A1 (fr) | Photographing preview method, graphical user interface, and terminal | |
| WO2021139178A1 (fr) | Image synthesis method and related device | |
| CN108961267B (zh) | Picture processing method, picture processing apparatus, and terminal device | |
| WO2021164328A1 (fr) | Image generation method, device, and storage medium | |
| US11770603B2 (en) | Image display method having visual effect of increasing size of target image, mobile terminal, and computer-readable storage medium | |
| CN112734661A (zh) | Image processing method and apparatus | |
| CN111638844A (zh) | Screen capture method and apparatus, and electronic device | |
| CN108305262A (zh) | Document scanning method, apparatus, and device | |
| CN111201773A (zh) | Photographing method and apparatus, mobile terminal, and computer-readable storage medium | |
| WO2022174826A1 (fr) | Image processing method and apparatus, device, and storage medium | |
| CN112150486B (zh) | Image processing method and apparatus | |
| CN112383708B (zh) | Photographing method and apparatus, electronic device, and readable storage medium | |
| CN113709370A (zh) | Image generation method and apparatus, electronic device, and readable storage medium | |
| CN113742430B (zh) | Method and system for determining the number of triangle structures formed by nodes in graph data | |
| CN114897915B (zh) | Image segmentation method and apparatus, electronic device, and storage medium | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21833084 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 21833084 Country of ref document: EP Kind code of ref document: A1 |
|
| 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 26-06-2023) |