
WO2016023406A1 - Method for photographing an object motion trajectory, mobile terminal and computer storage medium - Google Patents

Method for photographing an object motion trajectory, mobile terminal and computer storage medium

Info

Publication number
WO2016023406A1
WO2016023406A1 PCT/CN2015/083362 CN2015083362W WO2016023406A1 WO 2016023406 A1 WO2016023406 A1 WO 2016023406A1 CN 2015083362 W CN2015083362 W CN 2015083362W WO 2016023406 A1 WO2016023406 A1 WO 2016023406A1
Authority
WO
WIPO (PCT)
Prior art keywords
shooting
image
composite image
brightness
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2015/083362
Other languages
English (en)
Chinese (zh)
Inventor
邹明双
里强
苗雷
崔小辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Publication of WO2016023406A1 publication Critical patent/WO2016023406A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio

Definitions

  • The present invention relates to the field of imaging technology and, in particular, to a method for photographing an object motion trajectory, a mobile terminal, and a computer storage medium.
  • Current mobile terminals have a shooting function that relies on the processing algorithms provided by the camera hardware and the chip supplier, and offers only a few fixed shooting modes, such as focus and white balance.
  • Some non-professional shooting devices, such as mobile phones, tablet computers and other mobile terminals, can also capture an object's motion trajectory: the camera continuously collects images of the moving object, and all of the collected images are synthesized into a composite picture of the motion track.
  • However, an existing mobile terminal collects all images of the moving object with the same shooting parameters when shooting the motion track, so the user cannot adjust the shooting parameters according to actual needs during the shooting process, and the captured image of the object's motion trajectory therefore cannot meet the user's customization needs.
  • Embodiments of the present invention are directed to providing a method for photographing an object motion trajectory, a mobile terminal, and a computer storage medium, so that the captured image of the object's motion trajectory can satisfy the user's customization needs.
  • Embodiments of the present invention provide a method for photographing an object motion trajectory, which includes the following steps:
  • after the shooting starts, a parameter adjustment interface is displayed in the shooting window for the user to adjust the shooting parameters;
  • the shooting parameters adjusted by the user are received, and moving images of the object are continuously collected by the camera according to the adjusted shooting parameters;
  • image synthesis is performed on all of the collected moving images of the object to generate a composite image corresponding to the motion track of the object.
  • Optionally, displaying the parameter adjustment interface in the shooting window includes:
  • receiving a display instruction sent by the user triggering an adjustment control, and displaying the parameter adjustment interface in the shooting window according to the display instruction.
  • Optionally, displaying the parameter adjustment interface in the shooting window includes:
  • automatically displaying the parameter adjustment interface in the shooting window.
  • Optionally, the step of performing image synthesis on all of the collected moving images of the object to generate a composite image corresponding to the motion track of the object includes:
  • performing brightness synthesis processing on the selected current moving image of the object and the previously synthesized composite image to generate a composite image corresponding to the object motion track.
  • Optionally, the step of performing brightness synthesis processing on the selected current moving image of the object and the previously synthesized composite image includes:
  • for each pixel position, if the brightness of the pixel in the current image is greater than the brightness of the pixel at the corresponding position in the previously synthesized composite image, replacing the pixel in the previous composite image with the pixel in the current image, and performing image synthesis accordingly.
  • An embodiment of the present invention further provides a mobile terminal, where the mobile terminal includes:
  • a display module configured to display a parameter adjustment interface in the shooting window after the shooting starts, for the user to adjust the shooting parameters;
  • an acquisition module configured to receive the shooting parameters adjusted by the user and continuously collect moving images of the object through the camera according to the adjusted shooting parameters;
  • a synthesis module configured to perform image synthesis on all of the collected moving images of the object to generate a composite image corresponding to the motion track of the object.
  • Optionally, the display module is configured to:
  • receive a display instruction sent by the user triggering an adjustment control, and display the parameter adjustment interface in the shooting window according to the display instruction.
  • Optionally, the display module is configured to:
  • automatically display the parameter adjustment interface in the shooting window.
  • Optionally, the synthesis module is configured to:
  • perform brightness synthesis processing on the selected current moving image of the object and the previously synthesized composite image to generate a composite image corresponding to the object motion track.
  • Optionally, the synthesis module is configured to:
  • for each pixel position, if the brightness of the pixel in the current image is greater than the brightness of the pixel at the corresponding position in the previously synthesized composite image, replace the pixel in the previous composite image with the pixel in the current image, and perform image synthesis accordingly.
  • An embodiment of the present invention further provides a computer storage medium, where the computer storage medium stores computer-executable instructions, and the computer-executable instructions are used to perform at least one of the foregoing methods.
  • In the embodiments of the present invention, after the shooting starts, a parameter adjustment interface is displayed in the shooting window for the user to adjust the shooting parameters, moving images of the object are continuously collected by the camera according to the adjusted shooting parameters, and image synthesis is performed on all of the collected moving images to generate a composite image corresponding to the motion trajectory of the object.
  • Because a parameter adjustment interface for adjusting the shooting parameters is displayed during the shooting process, moving images of the object are collected according to the adjusted shooting parameters and synthesized into a composite image corresponding to the motion track of the object, and the user can adjust the shooting parameters according to actual needs during the shooting process, so that the captured image of the object's motion trajectory can meet the user's customization needs.
  • FIG. 1 is a schematic flowchart of a method for photographing an object motion trajectory according to an embodiment of the present invention
  • FIG. 2 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of another mobile terminal according to an embodiment of the present invention.
  • Embodiments of the present invention provide a method for photographing an object motion trajectory.
  • FIG. 1 is a schematic flow chart of a first embodiment of a method for photographing an object motion trajectory according to the present invention.
  • the method for photographing the motion trajectory of the object includes:
  • Step S10: after the shooting starts, a parameter adjustment interface is displayed in the shooting window for the user to adjust the shooting parameters;
  • The embodiment of the invention adds a motion track shooting mode to the shooting function of the mobile terminal, and the user can select either the motion track shooting mode or the normal shooting mode to capture the motion track of an object. In the motion track shooting mode, parameters such as exposure time, ISO, resolution, exposure compensation, and noise reduction are preset according to the requirements of motion track shooting scenes, and different parameter presets can also be provided for different starry-sky scenes over different regions, for the user to select when shooting.
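Purely as an illustration, such presets could be organized as named parameter profiles; the profile names and values in the sketch below are assumptions, not details given by the patent.

```python
# Illustrative only: hypothetical preset profiles for the motion track shooting mode.
# Parameter names and values are assumptions, not taken from the patent text.
STARRY_SKY_PRESETS = {
    "city_night_sky": {"exposure_time_s": 2.0, "iso": 400,
                       "resolution": (4000, 3000),
                       "exposure_compensation": 0.0, "noise_reduction": "high"},
    "rural_dark_sky": {"exposure_time_s": 8.0, "iso": 800,
                       "resolution": (4000, 3000),
                       "exposure_compensation": 0.5, "noise_reduction": "medium"},
}

def select_preset(scene_name: str) -> dict:
    """Return a copy of the chosen preset so the user can still adjust it while shooting."""
    return dict(STARRY_SKY_PRESETS[scene_name])
```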
  • The motion of the object is essentially motion relative to the mobile terminal that captures it, and the motion type may include the following three situations: first, the object is moving while the mobile terminal is stationary; second, the object is stationary while the mobile terminal is moving; third, the object and the mobile terminal are in the same state of motion, but their relative position changes.
  • When the user selects the motion track shooting mode and presses the shooting button or triggers the virtual shooting button, the mobile terminal starts motion track shooting and displays the parameter adjustment interface in the shooting window, so that the user can adjust the shooting parameters through the parameter adjustment interface.
  • After receiving a display instruction sent by the user triggering an adjustment control, the mobile terminal may display the parameter adjustment interface in the shooting window according to the display instruction.
  • When the user does not need to adjust the shooting parameters, the display can be cancelled through the adjustment control. The parameter adjustment interface can also be displayed automatically in the shooting window when the motion track shooting mode is entered. The parameter adjustment interface can be presented in any form and size, covering the shooting window or superimposed in semi-transparent form above it, and shooting parameters such as exposure time, exposure compensation, and white balance can be adjusted in it.
  • The parameter adjustment interface can also be displayed as a progress bar for adjusting a specific shooting parameter; for example, the user can adjust the exposure time at any time during the shooting process, so that the brightness and sharpness of the captured moving images of the object are not all the same.
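As a small illustration of what adjusting a parameter mid-shoot could look like in code, the shared dictionary and slider callback below are assumptions, not part of the patent; the idea is only that later frames pick up the new value.

```python
# Shared, mutable shooting parameters read before every capture (hypothetical layout).
shooting_params = {"exposure_time_s": 2.0, "exposure_compensation": 0.0, "white_balance": "auto"}

def on_exposure_slider_changed(new_value_s: float) -> None:
    """Hypothetical UI callback: the progress bar updates the exposure time at any time
    during shooting, so frames captured afterwards use the new value."""
    shooting_params["exposure_time_s"] = new_value_s
```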
  • Step S20: receiving the shooting parameters adjusted by the user, and continuously collecting the moving image of the object through the camera according to the adjusted shooting parameters;
  • The mobile terminal receives the adjusted shooting parameters and begins to continuously collect moving images of the object with the camera using the adjusted shooting parameters; the speed at which the camera continuously collects image data may be preset.
  • The shooting parameters include, but are not limited to, exposure time, ISO, and other common camera parameters.
  • The user can change the shooting parameters in real time on the camera interface, and the mobile terminal will capture moving images of the object according to the changed parameters. In this way, through the parameter adjustment interface provided by the mobile terminal, the parameters can be adjusted dynamically during shooting instead of being fixed, which greatly improves convenience and flexibility.
  • In order to ensure the continuity of the moving images of the object, the camera needs to collect at least ten images per second, and the subsequent image processing often cannot keep up with the image acquisition speed; it is therefore preferable to cache the image data in a cache module (if the processing speed of the mobile terminal is fast enough, the cache may also be omitted). Further, while collecting image data, the mobile terminal can adjust the acquisition speed in real time according to the remaining space of the cache module, thereby making maximum use of the processing capability of the mobile terminal and preventing data overflow, and the resulting data loss, caused by an excessive acquisition speed.
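A minimal sketch of this cache-and-adapt idea, assuming a bounded in-memory queue as the cache module and a hypothetical `camera.capture(params)` call; the thresholds and intervals are illustrative only.

```python
import queue
import time

frame_cache = queue.Queue(maxsize=64)  # the "cache module": a bounded frame buffer

def acquisition_loop(camera, params, stop_event, base_interval_s=0.1):
    """Continuously capture frames, slowing down as the cache fills to avoid overflow.

    `camera.capture(params)` is a hypothetical capture call; `params` may be updated
    elsewhere (e.g. by the parameter adjustment interface) between captures.
    """
    interval = base_interval_s
    while not stop_event.is_set():
        free = frame_cache.maxsize - frame_cache.qsize()
        if free < frame_cache.maxsize // 4:        # cache nearly full: slow down
            interval = min(interval * 2, 1.0)
        elif free > frame_cache.maxsize // 2:      # plenty of room: speed back up
            interval = max(interval / 2, base_interval_s)
        frame = camera.capture(params)
        try:
            frame_cache.put_nowait(frame)
        except queue.Full:
            pass                                   # drop rather than stall the camera
        time.sleep(interval)
```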
  • Step S30: performing image synthesis on all of the collected moving images of the object to generate a composite image corresponding to the object motion trajectory.
  • All of the moving images of the object are combined to generate a composite image corresponding to the motion track of the object. The image synthesis may be performed in real time as each moving image is acquired, or all of the collected moving images may first be cached in the cache module and then retrieved from it for unified synthesis after the shooting ends.
  • When the first moving image of the object is acquired or selected, it is taken as the image to be combined; after the second moving image is acquired or selected, it is synthesized with the image to be combined into a current composite image; each subsequently acquired or selected moving image is then combined in turn with the previously generated composite image, finally producing a composite image formed from all of the captured moving images of the object.
  • Step S30 is specifically: performing brightness synthesis processing on the selected current moving image of the object and the previously synthesized composite image to generate a composite image corresponding to the object motion track.
  • The image synthesis module of the mobile terminal either directly receives the captured moving images of the object, or selects moving images of the object from the cache module to perform image synthesis in real time and then resets the cache module, emptying its data to make room for subsequent data.
  • The image synthesis module performs brightness synthesis based on the brightness information of the current moving image of the object and of the last synthesized composite image to generate a composite image.
  • Performing brightness synthesis processing on the selected current moving image of the object and the previously synthesized composite image to generate the composite image corresponding to the motion track of the object specifically includes:
  • for each pixel position, if the brightness of the pixel in the current image is greater than the brightness of the pixel at the corresponding position in the previous composite image, replacing the pixel in the previous composite image with the pixel in the current image, and performing image synthesis accordingly.
  • That is, the image synthesis module determines whether the brightness of each pixel in the current moving image of the object is greater than the brightness of the corresponding pixel in the previously synthesized composite image; if so, the pixel in the previous composite image is replaced with the pixel in the current moving image. After all of the lower-brightness pixels in the previous composite image have been replaced in this way, the final composite image is obtained. In other words, the image synthesis of this embodiment uses a brightness-selection method: the already synthesized image (the last composite image) is taken as the base image, and the pixels in each subsequent image that are brighter than the corresponding base-image pixels are selected to replace them.
  • For example, the first moving image of the object has been acquired and serves as the basis (the past image). When the second moving image of the object (the current image) is acquired, each pixel of the first moving image is compared with the pixel at the corresponding position in the second moving image; if the pixel of the second image is brighter than that of the first, the pixel of the second moving image is extracted to replace the corresponding pixel of the first moving image, and a composite image is thereby obtained. The same processing is then applied to each subsequent moving image of the object on the basis of that composite image, finally yielding the composite image corresponding to all of the captured moving images of the object.
  • Suppose, for example, that the image consists of pixel unit 1, pixel unit 2, ..., pixel unit n, a total of n pixel units, and that the 200 pixel units from pixel unit 501 to pixel unit 700 are brighter in the current image than in the previous composite image. The image synthesis module then replaces pixel units 501 to 700 of the past image with pixel units 501 to 700 of the current image, and the new image obtained after the replacement is the composite image.
  • Compared with highlight superposition, this way of replacing bright spots captures the trajectory of star movement more clearly, preventing other bright spots next to the star track from becoming too bright and spoiling the star-trail effect.
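A minimal NumPy sketch of the brightness-selection synthesis just described, assuming 8-bit RGB frames and using a standard luma approximation for per-pixel brightness (the exact brightness measure is not specified in the text).

```python
import numpy as np

def brightness(img: np.ndarray) -> np.ndarray:
    """Approximate per-pixel brightness (luma) of an (H, W, 3) RGB frame."""
    r = img[..., 0].astype(np.float32)
    g = img[..., 1].astype(np.float32)
    b = img[..., 2].astype(np.float32)
    return 0.299 * r + 0.587 * g + 0.114 * b

def brightness_select(prev_composite: np.ndarray, current: np.ndarray) -> np.ndarray:
    """Replace pixels of the previous composite wherever the current frame is brighter."""
    brighter = brightness(current) > brightness(prev_composite)   # (H, W) boolean mask
    out = prev_composite.copy()
    out[brighter] = current[brighter]                             # per-pixel replacement
    return out

# Folding a sequence of captured frames into one motion-track composite:
# composite = frames[0]
# for frame in frames[1:]:
#     composite = brightness_select(composite, frame)
```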
  • The image synthesis module also performs noise reduction on the composite image, and controls the synthesis ratio of the newly synthesized image according to the exposure level of the existing image to suppress overexposure.
  • After generating the composite image corresponding to the motion track of the object, the mobile terminal displays the generated composite image on the display screen in real time, so that the user can preview the current motion track of the object in real time.
  • The composite image displayed by the mobile terminal is a compressed small-sized thumbnail, while the full-size image is stored; that is, display and storage run as two threads.
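A minimal sketch of this two-thread arrangement, assuming the composite is a NumPy-style array and that `show_thumbnail` and `write_full_size` are hypothetical display and storage callbacks; the patent only states that display and storage run as two threads.

```python
import threading

def make_thumbnail(image, max_side=256):
    """Crude downscale by striding; a real implementation would resample properly."""
    h, w = image.shape[:2]
    step = max(1, max(h, w) // max_side)
    return image[::step, ::step]

def publish_composite(composite, show_thumbnail, write_full_size):
    """Show a compressed thumbnail while the full-size composite is stored in parallel."""
    display_thread = threading.Thread(target=show_thumbnail, args=(make_thumbnail(composite),))
    storage_thread = threading.Thread(target=write_full_size, args=(composite,))
    display_thread.start()
    storage_thread.start()
    display_thread.join()
    storage_thread.join()
```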
  • In this embodiment, after the shooting starts, a parameter adjustment interface is displayed in the shooting window for the user to adjust the shooting parameters, moving images of the object are continuously collected by the camera according to the adjusted shooting parameters, and image synthesis is performed on all of the collected moving images to generate a composite image corresponding to the motion trajectory of the object. Because a parameter adjustment interface for adjusting the shooting parameters is displayed, moving images of the object are collected according to the adjusted shooting parameters and synthesized into a composite image corresponding to the motion track of the object, and the user can adjust the shooting parameters according to actual needs during the shooting process, so that the captured image of the object's motion track can meet the user's customization needs.
  • The embodiment of the invention further provides a mobile terminal.
  • FIG. 2 is a schematic diagram of the functional modules of a first embodiment of a mobile terminal according to the present invention.
  • The mobile terminal includes:
  • a display module 10 configured to display a parameter adjustment interface in the shooting window after the shooting starts, for the user to adjust the shooting parameters;
  • an acquisition module 20 configured to receive the shooting parameters adjusted by the user and continuously collect moving images of the object through the camera according to the adjusted shooting parameters;
  • a synthesis module 30 configured to perform image synthesis on all of the collected moving images of the object to generate a composite image corresponding to the motion track of the object.
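Purely as an illustration, the three modules might map onto classes along the lines of the following sketch; the class interfaces, the `camera.capture` call, the `show_overlay` UI call, and the crude per-channel-mean brightness test are assumptions, not details given by the patent.

```python
class DisplayModule:
    """Shows the parameter adjustment interface in the shooting window (UI call assumed)."""
    def show_parameter_ui(self, shooting_window) -> None:
        shooting_window.show_overlay("parameter_adjustment")

class AcquisitionModule:
    """Collects moving images of the object with the user-adjusted shooting parameters."""
    def __init__(self, camera):
        self.camera = camera
    def collect(self, params):
        return self.camera.capture(params)

class SynthesisModule:
    """Folds all collected frames (NumPy arrays) into one composite of the motion track."""
    def combine(self, frames):
        composite = frames[0].copy()
        for frame in frames[1:]:
            brighter = frame.mean(axis=-1) > composite.mean(axis=-1)  # crude brightness test
            composite[brighter] = frame[brighter]
        return composite
```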
  • As described above, the embodiment of the invention adds a motion track shooting mode to the shooting function of the mobile terminal, and the user can select either the motion track shooting mode or the normal shooting mode to capture the motion track of an object. In the motion track shooting mode, parameters such as exposure time, ISO, resolution, exposure compensation, and noise reduction are preset according to the requirements of motion track shooting scenes, and different parameter presets can also be provided for different starry-sky scenes over different regions, for the user to select when shooting.
  • The motion of the object is essentially motion relative to the mobile terminal that captures it, and the motion type may include the following three situations: first, the object is moving while the mobile terminal is stationary; second, the object is stationary while the mobile terminal is moving; third, the object and the mobile terminal are in the same state of motion, but their relative position changes.
  • When the user selects the motion track shooting mode and presses the shooting button or triggers the virtual shooting button, the mobile terminal starts motion track shooting, and the display module 10 displays the parameter adjustment interface in the shooting window so that the user can adjust the shooting parameters through the parameter adjustment interface.
  • After receiving a display instruction sent by the user triggering an adjustment control, the display module 10 may display the parameter adjustment interface in the shooting window according to the display instruction.
  • When the user does not need to adjust the shooting parameters, the display can be cancelled through the adjustment control. The parameter adjustment interface can also be displayed automatically in the shooting window when the motion track shooting mode is entered. The parameter adjustment interface can be presented in any form and size, covering the shooting window or superimposed in semi-transparent form above it, and shooting parameters such as exposure time, exposure compensation, and white balance can be adjusted in it.
  • The parameter adjustment interface can also be displayed as a progress bar for adjusting a specific shooting parameter; for example, the user can adjust the exposure time at any time during the shooting process, so that the brightness and sharpness of the captured moving images of the object are not all the same.
  • The mobile terminal receives the adjusted shooting parameters, and the acquisition module 20 starts to continuously collect moving images of the object with the camera using the adjusted shooting parameters; the speed at which the camera continuously collects image data may be preset.
  • The shooting parameters include, but are not limited to, exposure time, ISO, and other common camera parameters.
  • The user can change the shooting parameters in real time on the camera interface, and the mobile terminal will capture moving images of the object according to the changed parameters. In this way, through the parameter adjustment interface provided by the mobile terminal, the parameters can be adjusted dynamically during shooting instead of being fixed, which greatly improves convenience and flexibility.
  • In order to ensure the continuity of the moving images of the object, the camera needs to collect at least ten images per second, and the subsequent image processing often cannot keep up with the image acquisition speed; it is therefore preferable to cache the image data in a cache module (if the processing speed of the mobile terminal is fast enough, the cache may also be omitted). Further, while collecting image data, the mobile terminal can adjust the acquisition speed in real time according to the remaining space of the cache module, thereby making maximum use of the processing capability of the mobile terminal and preventing data overflow, and the resulting data loss, caused by an excessive acquisition speed.
  • All of the moving images of the object are combined by the synthesis module 30 to generate a composite image corresponding to the motion track of the object. The image synthesis may be performed in real time as each moving image is acquired, or all of the collected moving images may first be cached in the cache module and then retrieved from it for unified synthesis after the shooting ends.
  • When the first moving image of the object is acquired or selected, it is taken as the image to be combined; after the second moving image is acquired or selected, it is synthesized with the image to be combined into a current composite image; each subsequently acquired or selected moving image is then combined in turn with the previously generated composite image, finally producing a composite image formed from all of the captured moving images of the object.
  • Optionally, the synthesis module 30 is configured to:
  • perform brightness synthesis processing on the selected current moving image of the object and the previously synthesized composite image to generate a composite image corresponding to the object motion track.
  • The image synthesis module of the mobile terminal either directly receives the captured moving images of the object, or selects moving images of the object from the cache module to perform image synthesis in real time and then resets the cache module, emptying its data to make room for subsequent data.
  • The image synthesis module performs brightness synthesis based on the brightness information of the current moving image of the object and of the last synthesized composite image to generate a composite image.
  • Optionally, the synthesis module 30 is configured to:
  • for each pixel position, if the brightness of the pixel in the current image is greater than the brightness of the pixel at the corresponding position in the previous composite image, replace the pixel in the previous composite image with the pixel in the current image, and perform image synthesis accordingly.
  • That is, the image synthesis module determines whether the brightness of each pixel in the current moving image of the object is greater than the brightness of the corresponding pixel in the previously synthesized composite image; if so, the pixel in the previous composite image is replaced with the pixel in the current moving image. After all of the lower-brightness pixels in the previous composite image have been replaced in this way, the final composite image is obtained. In other words, the image synthesis of this embodiment uses a brightness-selection method: the already synthesized image (the last composite image) is taken as the base image, and the pixels in each subsequent image that are brighter than the corresponding base-image pixels are selected to replace them.
  • For example, the first moving image of the object has been acquired and serves as the basis (the past image). When the second moving image of the object (the current image) is acquired, each pixel of the first moving image is compared with the pixel at the corresponding position in the second moving image; if the pixel of the second image is brighter than that of the first, the pixel of the second moving image is extracted to replace the corresponding pixel of the first moving image, and a composite image is thereby obtained. The same processing is then applied to each subsequent moving image of the object on the basis of that composite image, finally yielding the composite image corresponding to all of the captured moving images of the object.
  • Suppose, for example, that the image consists of pixel unit 1, pixel unit 2, ..., pixel unit n, a total of n pixel units, and that the 200 pixel units from pixel unit 501 to pixel unit 700 are brighter in the current image than in the previous composite image. The image synthesis module then replaces pixel units 501 to 700 of the past image with pixel units 501 to 700 of the current image, and the new image obtained after the replacement is the composite image.
  • Compared with highlight superposition, this way of replacing bright spots captures the trajectory of star movement more clearly, preventing other bright spots next to the star track from becoming too bright and spoiling the star-trail effect.
  • The image synthesis module also performs noise reduction on the composite image, and controls the synthesis ratio of the newly synthesized image according to the exposure level of the existing image to suppress overexposure.
  • After generating the composite image corresponding to the motion track of the object, the mobile terminal displays the generated composite image on the display screen in real time, so that the user can preview the current motion track of the object in real time.
  • The composite image displayed by the mobile terminal is a compressed small-sized thumbnail, while the full-size image is stored; that is, display and storage run as two threads.
  • The mobile terminal may store each composite image locally, or may store only the last composite image generated when the shooting ends.
  • In this embodiment, after the shooting starts, a parameter adjustment interface is displayed in the shooting window for the user to adjust the shooting parameters, moving images of the object are continuously collected by the camera according to the adjusted shooting parameters, and image synthesis is performed on all of the collected moving images to generate a composite image corresponding to the motion trajectory of the object. Because a parameter adjustment interface for adjusting the shooting parameters is displayed, moving images of the object are collected according to the adjusted shooting parameters and synthesized into a composite image corresponding to the motion track of the object, and the user can adjust the shooting parameters according to actual needs during the shooting process, so that the captured image of the object's motion track can meet the user's customization needs.
  • An embodiment of the present invention further provides a computer storage medium, where the computer storage medium stores computer-executable instructions, and the computer-executable instructions are used to perform at least one of the foregoing methods, specifically the method shown in FIG. 1.
  • the computer storage medium may be various types of storage media such as a ROM/RAM, a magnetic disk, an optical disk, a DVD, or a USB flash drive.
  • the computer storage medium may be a non-transitory storage medium.
  • The photographic lens 101 is composed of a plurality of optical lenses for forming a subject image, and may be a single-focus lens or a zoom lens.
  • The photographic lens 101 can be moved in the optical-axis direction by the lens driving unit 111, which controls the focus position of the photographic lens 101 based on a control signal from the lens drive control unit 112 and, in the case of a zoom lens, also controls the focal length.
  • The lens drive control unit 112 performs drive control of the lens driving unit 111 in accordance with a control command from the microcomputer 107.
  • the imaging element 102 functions as an imaging unit that captures a subject image and acquires captured image data.
  • Photodiodes constituting the pixels are two-dimensionally arranged in a matrix on the imaging element 102; each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and this current is accumulated as charge by a capacitor connected to each photodiode.
  • The front surface of each pixel is provided with a Bayer-array RGB color filter.
  • The imaging element 102 is connected to an imaging circuit 103, which performs charge accumulation control and image signal readout control in the imaging element 102, reduces the reset noise of the read-out image signal (an analog image signal), performs waveform shaping, and further applies gain adjustment or the like to obtain an appropriate signal level.
  • the imaging circuit 103 is connected to the A/D conversion unit 104, which performs analog-to-digital conversion on the analog image signal, and outputs a digital image signal (hereinafter referred to as image data) to the bus 199.
  • the bus 199 is a transmission path for transmitting various data read or generated inside the camera.
  • The A/D conversion unit 104 is connected to the bus 199, to which an image processor 105, a JPEG processor 106, a microcomputer 107, an SDRAM (Synchronous DRAM) 108, a memory interface (hereinafter referred to as memory I/F) 109, and an LCD (Liquid Crystal Display) driver 110 are also connected.
  • The image processor 105 performs various kinds of image processing, such as OB subtraction, white balance adjustment, color matrix calculation, gamma conversion, color-difference signal processing, noise removal, simultaneous processing, and edge processing, on the image data based on the output of the imaging element 102.
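A toy sketch of chaining a few such stages, assuming already-demosaiced RGB data for brevity; the stand-in implementations (black level, channel gains, power-law gamma) are illustrative assumptions and do not reproduce the actual processing of image processor 105.

```python
import numpy as np

def ob_subtract(raw: np.ndarray, black_level: float = 64.0) -> np.ndarray:
    """Optical-black subtraction: remove the sensor's black-level offset."""
    return np.clip(raw.astype(np.float32) - black_level, 0.0, None)

def white_balance(img: np.ndarray, gains=(2.0, 1.0, 1.6)) -> np.ndarray:
    """Apply per-channel gains to the R, G and B planes (illustrative values)."""
    return img * np.asarray(gains, dtype=np.float32)

def gamma(img: np.ndarray, g: float = 2.2, white: float = 1023.0) -> np.ndarray:
    """Power-law gamma conversion from a 10-bit linear range to an 8-bit output."""
    return np.clip(255.0 * (img / white) ** (1.0 / g), 0, 255).astype(np.uint8)

def simple_pipeline(raw: np.ndarray) -> np.ndarray:
    """Chain a few of the stages attributed to image processor 105, in order."""
    return gamma(white_balance(ob_subtract(raw)))
```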
  • The JPEG processor 106 compresses the image data read out from the SDRAM 108 in accordance with the JPEG compression method, and also decompresses JPEG image data for image reproduction display.
  • For reproduction, the file recorded on the recording medium 115 is read, decompression processing is performed in the JPEG processor 106, and the decompressed image data is temporarily stored in the SDRAM 108 and displayed on the LCD 116.
  • In this embodiment, the JPEG method is adopted as the image compression/decompression method, but the compression/decompression method is of course not limited to this; other compression and decompression methods such as MPEG, TIFF, and H.264 can also be used.
  • The operation unit 113 includes, but is not limited to, physical or virtual buttons, which may include a power button, a camera button, an edit button, a dynamic image button, a reproduction button, a menu button, a cross button, an OK button, a delete button, and an enlarge button, as well as other operation members such as various input buttons and input keys.
  • The operation states of these operation members are detected, and the detection result is output to the microcomputer 107.
  • A touch panel is provided on the front surface of the LCD 116 serving as a display portion; it detects the user's touch position and outputs the touch position to the microcomputer 107.
  • The microcomputer 107 executes various processing sequences corresponding to the user's operation based on the detection result of the operation members from the operation unit 113 (equivalently, the microcomputer 107 may execute various processing sequences corresponding to the user's operation based on the detection result of the touch panel on the front of the LCD 116).
  • the flash memory 114 stores programs for executing various processing sequences of the microcomputer 107.
  • the microcomputer 107 performs overall control of the camera in accordance with the program. Further, the flash memory 114 stores various adjustment values of the camera, and the microcomputer 107 reads out the adjustment value, and performs control of the camera in accordance with the adjustment value.
  • the SDRAM 108 is an electrically rewritable volatile memory for temporarily storing image data or the like.
  • the SDRAM 108 temporarily stores image data output from the A/D conversion unit 104 and image data processed in the image processor 105, the JPEG processor 106, and the like.
  • the microcomputer 107 functions as a control unit of the entire camera, and collectively controls various processing sequences of the camera.
  • the microcomputer 107 is connected to the operation unit 113 and the flash memory 114.
  • The microcomputer 107 can control the apparatus in this embodiment to perform the following operations by executing a program:
  • after the shooting starts, a parameter adjustment interface is displayed in the shooting window for the user to adjust the shooting parameters;
  • the shooting parameters adjusted by the user are received, and moving images of the object are continuously collected by the camera according to the adjusted shooting parameters;
  • image synthesis is performed on all of the collected moving images of the object to generate a composite image corresponding to the motion track of the object.
  • Optionally, displaying the parameter adjustment interface in the shooting window includes:
  • receiving a display instruction sent by the user triggering an adjustment control, and displaying the parameter adjustment interface in the shooting window according to the display instruction.
  • Optionally, displaying the parameter adjustment interface in the shooting window includes:
  • automatically displaying the parameter adjustment interface in the shooting window.
  • Optionally, the step of performing image synthesis on all of the collected moving images of the object to generate a composite image corresponding to the motion track of the object includes:
  • performing brightness synthesis processing on the selected current moving image of the object and the previously synthesized composite image to generate a composite image corresponding to the object motion track.
  • Optionally, the step of performing brightness synthesis processing on the selected current moving image of the object and the previously synthesized composite image includes:
  • for each pixel position, if the brightness of the pixel in the current image is greater than the brightness of the pixel at the corresponding position in the previously synthesized composite image, replacing the pixel in the previous composite image with the pixel in the current image, and performing image synthesis accordingly.
  • the memory interface 109 is connected to the recording medium 115, and performs control for writing image data and a file header attached to the image data to the recording medium 115 and reading from the recording medium 115.
  • the recording medium 115 is, for example, a recording medium such as a memory card that can be detachably attached to the camera body.
  • the recording medium 115 is not limited thereto, and may be a hard disk or the like built in the camera body.
  • The LCD driver 110 is connected to the LCD 116. Image data processed by the image processor 105 is stored in the SDRAM and, for display, is read from the SDRAM and shown on the LCD 116; alternatively, image data compressed by the JPEG processor 106 is stored in the SDRAM, and the JPEG processor 106 reads the compressed image data from the SDRAM, decompresses it, and the decompressed image data is displayed on the LCD 116.
  • the LCD 116 is disposed on the back surface of the camera body or the like to perform image display.
  • the LCD 116 is provided with a touch panel that detects a user's touch operation.
  • In this embodiment, a liquid crystal display panel (LCD 116) is used as the display portion, but the present invention is not limited thereto, and various other display panels, such as organic EL panels, may be employed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

Disclosed are a shooting method for an object motion trajectory, a mobile terminal, and a computer storage medium. The shooting method for an object motion trajectory includes the following steps: after the shooting starts, displaying a parameter adjustment interface in a shooting window for a user to adjust a shooting parameter; receiving the shooting parameter adjusted by the user, and continuously collecting, according to the adjusted shooting parameter, moving images of an object by means of a camera; and performing image synthesis on all of the collected moving images of the object to generate a composite image corresponding to the motion trajectory of the object. During the shooting process, a parameter adjustment interface for adjusting the shooting parameter is displayed, so that moving images of the object are collected according to the adjusted shooting parameter and synthesized into a composite image corresponding to the motion trajectory of the object; the user can adjust the shooting parameter according to actual needs during the shooting process, so that the captured image of the object's motion trajectory can meet the user's customization needs.
PCT/CN2015/083362 2014-08-13 2015-07-06 Method for photographing an object motion trajectory, mobile terminal and computer storage medium Ceased WO2016023406A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410396398.5 2014-08-13
CN201410396398.5A CN104125407B (zh) 2014-08-13 2014-08-13 物体运动轨迹的拍摄方法和移动终端

Publications (1)

Publication Number Publication Date
WO2016023406A1 true WO2016023406A1 (fr) 2016-02-18

Family

ID=51770659

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/083362 Ceased WO2016023406A1 (fr) 2014-08-13 2015-07-06 Procédé de prise de vue pour trace de déplacement d'objet, terminal mobile et support de stockage informatique

Country Status (2)

Country Link
CN (1) CN104125407B (fr)
WO (1) WO2016023406A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109194865A (zh) * 2018-08-06 2019-01-11 光锐恒宇(北京)科技有限公司 图像生成方法、装置、智能终端和计算机可读存储介质
CN109565718A (zh) * 2018-11-15 2019-04-02 北京小米移动软件有限公司 传输消息的方法及装置

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104125407B (zh) * 2014-08-13 2018-09-04 努比亚技术有限公司 物体运动轨迹的拍摄方法和移动终端
CN104462407B (zh) * 2014-12-12 2017-10-13 南京大学 基于地图轨迹的场景感知模型的前端数据动态集成方法
CN104683697A (zh) * 2015-03-17 2015-06-03 努比亚技术有限公司 拍摄参数调整方法及装置
CN105141853B (zh) 2015-08-18 2019-02-05 联想(北京)有限公司 图像处理方法以及电子设备
CN105827991B (zh) * 2016-01-22 2019-05-17 维沃移动通信有限公司 一种运动对象的拍照方法及移动终端
CN105681675A (zh) * 2016-03-22 2016-06-15 珠海格力电器股份有限公司 移动终端及其拍照模式设置方法和装置
CN105721757A (zh) * 2016-04-28 2016-06-29 努比亚技术有限公司 一种调整拍摄参数的装置和方法
CN106060384B (zh) * 2016-05-31 2019-07-19 努比亚技术有限公司 拍照控制方法和装置
CN106357979A (zh) * 2016-09-19 2017-01-25 宇龙计算机通信科技(深圳)有限公司 一种摄影方法、装置和终端
CN106357983A (zh) * 2016-11-15 2017-01-25 上海传英信息技术有限公司 拍摄参数调整方法及用户终端
CN107016728A (zh) * 2017-03-08 2017-08-04 惠州Tcl移动通信有限公司 一种在虚拟现实场景中模拟摄影的方法及系统
CN108573467A (zh) * 2017-03-09 2018-09-25 南昌黑鲨科技有限公司 基于图像的轨迹合成方法、装置及终端
CN107277377A (zh) * 2017-08-03 2017-10-20 深圳传音控股有限公司 拍照方法及装置
CN108566513A (zh) * 2018-03-28 2018-09-21 深圳臻迪信息技术有限公司 一种无人机对运动目标的拍摄方法
CN110785992A (zh) * 2018-08-31 2020-02-11 深圳市大疆创新科技有限公司 相机的控制方法及相机
CN110913118B (zh) * 2018-09-17 2021-12-17 腾讯数码(天津)有限公司 视频处理方法、装置及存储介质
CN110785990A (zh) * 2018-10-31 2020-02-11 深圳市大疆创新科技有限公司 一种视频拍摄方法、装置及控制设备
CN111192286B (zh) * 2018-11-14 2025-02-21 西安中兴新软件有限责任公司 一种图像合成方法、电子设备及存储介质
CN109361870A (zh) * 2018-11-28 2019-02-19 维沃移动通信有限公司 一种拍照方法及终端设备
CN113810587B (zh) * 2020-05-29 2023-04-18 华为技术有限公司 一种图像处理方法及装置
WO2022027447A1 (fr) * 2020-08-06 2022-02-10 深圳市大疆创新科技有限公司 Procédé de traitement d'image, et caméra et terminal mobile
CN112702497B (zh) * 2020-12-28 2022-04-26 维沃移动通信有限公司 一种拍摄方法及装置
CN112948048A (zh) * 2021-03-25 2021-06-11 维沃移动通信(深圳)有限公司 信息处理方法、装置、电子设备及存储介质
CN113114933A (zh) * 2021-03-30 2021-07-13 维沃移动通信有限公司 图像拍摄方法、装置、电子设备和可读存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1642323A (zh) * 2004-01-17 2005-07-20 上海迪比特实业有限公司 具摄像装置的移动终端快速调节摄像控制参数的方法
US20080088710A1 (en) * 2006-10-16 2008-04-17 Casio Computer Co., Ltd. Imaging apparatus, continuous imaging method, and recording medium for recording a program
CN101815176A (zh) * 2010-04-15 2010-08-25 西安酷派软件科技有限公司 一种照片连拍的处理方法、系统及摄像设备
CN103472971A (zh) * 2013-09-03 2013-12-25 小米科技有限责任公司 一种设置拍摄参数的方法、装置及终端设备
CN103905730A (zh) * 2014-03-24 2014-07-02 深圳市中兴移动通信有限公司 移动终端的拍摄方法和移动终端
CN104125407A (zh) * 2014-08-13 2014-10-29 深圳市中兴移动通信有限公司 物体运动轨迹的拍摄方法和移动终端

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103841328B (zh) * 2014-02-27 2015-03-11 深圳市中兴移动通信有限公司 慢速快门拍摄方法和拍摄装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1642323A (zh) * 2004-01-17 2005-07-20 上海迪比特实业有限公司 具摄像装置的移动终端快速调节摄像控制参数的方法
US20080088710A1 (en) * 2006-10-16 2008-04-17 Casio Computer Co., Ltd. Imaging apparatus, continuous imaging method, and recording medium for recording a program
CN101815176A (zh) * 2010-04-15 2010-08-25 西安酷派软件科技有限公司 一种照片连拍的处理方法、系统及摄像设备
CN103472971A (zh) * 2013-09-03 2013-12-25 小米科技有限责任公司 一种设置拍摄参数的方法、装置及终端设备
CN103905730A (zh) * 2014-03-24 2014-07-02 深圳市中兴移动通信有限公司 移动终端的拍摄方法和移动终端
CN104125407A (zh) * 2014-08-13 2014-10-29 深圳市中兴移动通信有限公司 物体运动轨迹的拍摄方法和移动终端

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109194865A (zh) * 2018-08-06 2019-01-11 光锐恒宇(北京)科技有限公司 图像生成方法、装置、智能终端和计算机可读存储介质
CN109565718A (zh) * 2018-11-15 2019-04-02 北京小米移动软件有限公司 传输消息的方法及装置
CN109565718B (zh) * 2018-11-15 2023-11-17 北京小米移动软件有限公司 传输消息的方法及装置
US11997542B2 (en) 2018-11-15 2024-05-28 Beijing Xiaomi Mobile Software Co., Ltd. Method and device for message transmission

Also Published As

Publication number Publication date
CN104125407A (zh) 2014-10-29
CN104125407B (zh) 2018-09-04

Similar Documents

Publication Publication Date Title
WO2016023406A1 (fr) Method for photographing an object motion trajectory, mobile terminal and computer storage medium
US8687126B2 (en) Digital image signal processing method, medium for recording the method, and digital image signal processing apparatus
US9167163B2 (en) Digital image processing apparatus that displays first content, generates second content based on an input signal, and generates third content related to the second content, and method of controlling the same
CN104780324B (zh) 一种拍摄的方法和装置
US10419661B2 (en) Shooting method and shooting device
WO2016000515A1 (fr) Method and device for shooting a star trail video, and computer storage medium
CN109155815A (zh) 摄像装置及其设定画面
KR101948692B1 (ko) 촬영 장치 및 이미지 합성 방법
US20170302848A1 (en) Photographing method, device and computer storage medium
KR102424984B1 (ko) 복수 개의 카메라를 포함하는 전자 장치 및 그 동작 방법
JP6325841B2 (ja) 撮像装置、撮像方法、およびプログラム
JP2017220892A (ja) 画像処理装置及び画像処理方法
WO2016011859A1 (fr) Method for shooting a light-painting video, mobile terminal, and storage medium
KR20130069039A (ko) 디스플레이 장치, 방법, 및 컴퓨터 판독 가능 저장 매체
WO2016029746A1 (fr) Shooting method, shooting device and computer storage medium
JP5126207B2 (ja) 撮像装置
KR101889932B1 (ko) 촬영 장치 및 이에 적용되는 촬영 방법
WO2016008359A1 (fr) Method and device for synthesizing object motion tracking images, and computer storage medium
CN104956657B (zh) 摄像装置和图像处理方法
WO2016000514A1 (fr) Method and device for shooting a nebula video, and computer storage medium
WO2016011872A1 (fr) Image photographing method and apparatus, and computer storage medium
WO2016169488A1 (fr) Image processing method and apparatus, computer storage medium and terminal
JP5530304B2 (ja) 撮像装置および撮影画像表示方法
WO2016019786A1 (fr) Method and system for photographing the motion trajectory of an object, and computer storage medium
JP5910613B2 (ja) 画像合成装置及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15832105

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 10/07/2017)

122 Ep: pct application non-entry in european phase

Ref document number: 15832105

Country of ref document: EP

Kind code of ref document: A1