CN112136318A - Control device, imaging system, control method, and program - Google Patents
Control device, imaging system, control method, and program
- Publication number
- CN112136318A (Application No. CN202080002784.5A)
- Authority
- CN
- China
- Prior art keywords
- image pickup
- information
- support mechanism
- control
- imaging
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/56—Accessories
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B5/00—Adjustment of optical system relative to image or object surface other than for focusing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
- Focusing (AREA)
- Accessories Of Cameras (AREA)
- Automatic Focus Adjustment (AREA)
- Adjustment Of Camera Lenses (AREA)
Abstract
Depending on how an imaging apparatus rotatably supported by a support mechanism is operated, the execution timing of focus control of the imaging apparatus may be inappropriate. The control device of the present invention may be a control device that controls an imaging apparatus rotatably supported by the support mechanism. The control device may comprise circuitry configured to: acquire at least one of first information on the angular velocity of the imaging apparatus and second information on the acceleration of the imaging apparatus, together with an operation mode of the support mechanism for controlling the attitude of the imaging apparatus; and control the execution timing of focus control of the imaging apparatus based on the operation mode and the acquired at least one of the first information and the second information.
Description
This application claims priority from Japanese Patent Application No. JP2019-061606, filed on March 27, 2019, the contents of which are incorporated herein by reference.
The invention relates to a control device, an imaging system, a control method, and a program.
[Background Art Documents]
[Patent Documents]
[Patent Document 1]: Japanese Patent Laid-Open Publication No. 2012-22238
Disclosure of Invention
Technical problem to be solved by the invention
Depending on how an imaging apparatus rotatably supported by a support mechanism is operated, the execution timing of focus control of the imaging apparatus may be inappropriate.
A control device according to one aspect of the present invention may be a control device that controls an imaging apparatus rotatably supported by a support mechanism. The control device may comprise circuitry configured to: acquire at least one of first information on the angular velocity of the imaging apparatus and second information on the acceleration of the imaging apparatus, together with an operation mode of the support mechanism for controlling the attitude of the imaging apparatus; and control the execution timing of focus control of the imaging apparatus based on the operation mode and the acquired at least one of the first information and the second information.
The circuitry may be configured to further control the execution timing of the focus control based on the position of a focus lens included in the imaging apparatus.
The circuitry may be configured to: select at least one of the first information and the second information based on the operation mode of the support mechanism; determine the operating state of the imaging apparatus based on the selected at least one of the first information and the second information; and control the execution timing of the focus control based on the operating state of the imaging apparatus and the position of the focus lens.
The operation modes may include: a first operation mode in which the support mechanism is operated so that the attitude of the imaging apparatus changes following changes in the attitude of the base of the support mechanism; and a second operation mode in which the support mechanism is operated so as to maintain the attitude of the imaging apparatus.
The circuitry may be configured to determine the operating state of the imaging apparatus based on the first information and the second information when the support mechanism operates in the first operation mode.
The circuitry may be configured to determine the operating state of the imaging apparatus based on the second information when the support mechanism operates in the second operation mode.
The circuitry may be configured to control the execution timing of the focus control based on the operating state of the imaging apparatus and a predetermined correspondence between the position of the focus lens and the execution timing of the focus control.
The imaging apparatus may execute the focus control when the time during which a change in the contrast value derived from an image captured by the imaging apparatus continuously satisfies a predetermined condition reaches a predetermined time. The circuitry may control the execution timing of the focus control by adjusting the predetermined time.
An imaging system according to an aspect of the present invention may include the control device, the imaging device, and the support mechanism.
The imaging system may further include a grip member connected to the base of the support mechanism and having an operation interface for operating the imaging apparatus and the support mechanism.
A control method according to one aspect of the present invention may be a control method for controlling an imaging apparatus rotatably supported by a support mechanism. The control method may include acquiring at least one of first information on the angular velocity of the imaging apparatus and second information on the acceleration of the imaging apparatus, together with an operation mode of the support mechanism for controlling the attitude of the imaging apparatus. The control method may include controlling the execution timing of focus control of the imaging apparatus based on the operation mode and the acquired at least one of the first information and the second information.
The program according to one aspect of the present invention may be a program for causing a computer to function as the control device.
According to an aspect of the present invention, the execution timing of focus control of the imaging apparatus can be optimized in accordance with the motion of the imaging apparatus rotatably supported by the support mechanism.
In addition, the above summary does not list all necessary features of the present invention. Furthermore, sub-combinations of these feature sets may also constitute the invention.
Fig. 1 is an external perspective view of the camera system.
Fig. 2 is a diagram showing functional blocks of the image pickup system.
Fig. 3 is a diagram showing an example of the change of the angular velocity of the imaging apparatus around the roll axis (R), around the pitch axis (P), and around the yaw axis (Y) when the user walks while holding the imaging system.
Fig. 4 is a diagram showing an example of the change of the angular velocity of the imaging apparatus around the roll axis (R), around the pitch axis (P), and around the yaw axis (Y) when the user runs while holding the imaging system.
Fig. 5 is a diagram showing an example of the change in angular velocity of the imaging apparatus about the roll axis (R), about the pitch axis (P), and about the yaw axis (Y) when the user pans the imaging system.
Fig. 6 is a diagram showing an example of the change of angular velocity of the imaging apparatus about the roll axis (R), about the pitch axis (P), and about the yaw axis (Y) when the user changes from the stationary state to the walking state in the FPV mode.
Fig. 7 is a diagram showing an example of a change in vibration level of the image pickup apparatus when the user changes from the stationary state to the walking state in the FPV mode.
Fig. 8 is a diagram showing an example of the change in angular velocity of the imaging apparatus about the roll axis (R), about the pitch axis (P), and about the yaw axis (Y) when the user changes from the stationary state to the walking state in the fixed mode.
Fig. 9 is a diagram showing an example of the change in vibration level of the imaging apparatus when the user changes from the stationary state to the walking state in the fixed mode.
Fig. 10 is a diagram showing an example of a policy table indicating a predetermined correspondence between the operating state of the imaging apparatus, the position of the focus lens, and the AF (autofocus) execution timing.
Fig. 11 is a flowchart showing one example of the AF execution timing control procedure of the imaging control unit.
Fig. 12 is an external perspective view of another form of the imaging system.
Fig. 13 is a diagram showing one example of the hardware configuration.
Description of the symbols
10 image pickup system
100 image pickup device
110 image pickup control unit
120 image sensor
130 memory
150 lens control part
152 lens driving unit
154 lens
200 supporting mechanism
201 roll axis drive mechanism
202 pitch axis drive mechanism
203 yaw axis driving mechanism
204 base part
210 attitude control section
212 angular velocity sensor
214 acceleration sensor
300 grip part
301 operating interface
302 display unit
400 smart phone
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM
The present invention will be described below with reference to embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Moreover, all combinations of features described in the embodiments are not necessarily essential to the inventive solution. It will be apparent to those skilled in the art that various changes and modifications can be made in the following embodiments. It is apparent from the description of the claims that the modes to which such changes or improvements are made are included in the technical scope of the present invention.
The claims, the specification, the drawings, and the abstract contain matter subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of these documents as they appear in the patent office files or records, but otherwise reserves all copyright rights whatsoever.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "section" of a device responsible for performing an operation. The specified stages and "sections" may be implemented by dedicated circuitry, programmable circuitry, and/or a processor. Dedicated circuitry may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuitry may include reconfigurable hardware circuits, which may include circuits that perform logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).
A computer-readable medium may include any tangible device that can store instructions for execution by a suitable device, so that the computer-readable medium having instructions stored thereon can be executed to create means for implementing the operations specified in the flowcharts or block diagrams. Examples of the computer-readable medium may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, and the like. More specific examples of the computer-readable medium may include a floppy (registered trademark) disk, a flexible disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc, a memory stick, an integrated circuit card, and the like.
Computer-readable instructions may include either source code or object code written in any combination of one or more programming languages, including conventional procedural programming languages. The instructions may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, or state-setting data, and the languages may include object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), and C++, as well as the "C" programming language or similar programming languages. The computer-readable instructions may be provided to a processor or programmable circuitry of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet. The processor or programmable circuitry may execute the computer-readable instructions to create means for implementing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, and microcontrollers.
Fig. 1 is an external perspective view of an imaging system 10 according to the present embodiment. The imaging system 10 includes an imaging device 100, a support mechanism 200, and a grip 300. The support mechanism 200 supports the imaging apparatus 100 rotatably about a roll axis, a pitch axis, and a yaw axis, respectively, using actuators. The support mechanism 200 can change or maintain the attitude of the imaging apparatus 100 by rotating the imaging apparatus 100 about at least one of the roll axis, the pitch axis, and the yaw axis. The support mechanism 200 includes a roll axis drive mechanism 201, a pitch axis drive mechanism 202, and a yaw axis drive mechanism 203. The support mechanism 200 further comprises a base 204 for fixing the yaw axis drive mechanism 203. The grip 300 is fixed to the base 204. The grip 300 includes an operation interface 301 and a display 302. The imaging apparatus 100 is fixed to the pitch axis drive mechanism 202.
The operation interface 301 receives an operation instruction of the image pickup apparatus 100 and the support mechanism 200 from a user. The operation interface 301 may include a shutter/recording button that instructs the imaging apparatus 100 to perform shooting or recording. The operation interface 301 may include a power supply/function button that turns on or off the power supply of the image pickup system 10 and instructs switching of the still image shooting mode or the moving image shooting mode of the image pickup apparatus 100.
The display section 302 can display an image captured by the image capturing apparatus 100. The display unit 302 can display a menu screen for operating the imaging apparatus 100 and the support mechanism 200. The display unit 302 may be a touch panel display that receives an operation command for the imaging apparatus 100 and the support mechanism 200.
The user holds the grip 300 and takes a still image or a moving image by the imaging device 100. The image pickup apparatus 100 performs focus control. The image pickup apparatus 100 can perform contrast autofocus (contrast AF), image plane phase difference AF, and the like. The image pickup apparatus 100 may also perform focus control by predicting the focus position of the focus lens from the degrees of blur of at least two images captured by the image pickup apparatus 100.
When the time during which changes in the contrast value derived from images captured by the imaging apparatus 100 continuously satisfy a predetermined condition reaches a predetermined time, the imaging apparatus 100 executes focus control. The imaging apparatus 100 captures images at a predetermined frame rate and derives a contrast value for each frame. When the contrast value changes by a predetermined value or more for a predetermined number of consecutive frames, the imaging apparatus 100 executes focus control. To this end, the imaging apparatus 100 includes a counter that counts the number of frames in which the contrast value changes by the predetermined value or more. When the change in contrast value is less than the predetermined value, the imaging apparatus 100 resets the count value of the counter. When the count value of the counter reaches a predetermined threshold, the imaging apparatus 100 executes focus control.
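As a concrete illustration of the counter just described, the following is a minimal sketch in Python; the class name, the contrast-change threshold, and the trigger count are illustrative assumptions, since the patent does not specify concrete values.

```python
class ContrastAfTrigger:
    """Counts consecutive frames whose contrast value changes significantly."""

    def __init__(self, change_threshold: float = 5.0, trigger_count: int = 10):
        self.change_threshold = change_threshold  # minimum per-frame contrast change
        self.trigger_count = trigger_count        # frames required before AF runs
        self.count = 0
        self.prev_contrast = None

    def update(self, contrast: float) -> bool:
        """Feed one frame's contrast value; return True when AF should be executed."""
        if self.prev_contrast is not None:
            if abs(contrast - self.prev_contrast) >= self.change_threshold:
                self.count += 1          # change large enough: keep counting
            else:
                self.count = 0           # change too small: reset, as described above
        self.prev_contrast = contrast
        if self.count >= self.trigger_count:
            self.count = 0
            return True
        return False
```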
However, depending on the operating state of the imaging apparatus 100, there may be a case where the focus control is preferably executed before the count value of the counter reaches a predetermined threshold value. Alternatively, depending on the operating state of the imaging apparatus 100, it may be desirable not to execute the focus control when the count value of the counter reaches a predetermined threshold value.
For example, a user who has been walking while shooting with the imaging system 10 may stop walking and then shoot an object relatively close to the user. In this case, if the focus lens is at a position where an object at infinity is in focus, it is preferable that the imaging apparatus 100 execute focus control before the count value of the counter reaches the predetermined threshold. As described above, the optimum execution timing of the focus control varies depending on the operating state of the imaging apparatus 100.
In the present embodiment, therefore, the imaging system 10 optimizes the execution timing of the focus control in accordance with the operating state of the imaging apparatus 100 supported by the support mechanism 200.
Fig. 2 is a diagram showing functional blocks of the image pickup system 10. The imaging apparatus 100 includes an imaging control unit 110, an image sensor 120, a memory 130, a lens control unit 150, a lens driving unit 152, and a plurality of lenses 154.
The image sensor 120 may be formed of a CCD or a CMOS. The image sensor 120 outputs image data of the optical image formed by the plurality of lenses 154 to the image pickup control section 110. The imaging control unit 110 may be configured by a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The imaging control section 110 is an example of a circuit. The imaging control unit 110 can control the imaging apparatus 100 in accordance with an operation command of the imaging apparatus 100 from the grip 300.
The memory 130 may be a computer-readable recording medium and may include at least one of an SRAM, a DRAM, an EPROM, an EEPROM, and a flash memory such as a USB memory. The memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like. The memory 130 may be provided inside the housing of the imaging apparatus 100. The grip 300 may include another memory for storing image data captured by the imaging apparatus 100. The grip 300 may have a slot that allows the memory to be removed from the housing of the grip 300.
The plurality of lenses 154 may function as a zoom lens, a varifocal lens, and a focus lens. At least some or all of the plurality of lenses 154 are configured to be movable along the optical axis. The lens control unit 150 drives the lens driving unit 152 in accordance with a lens control command from the imaging control unit 110 to move one or more lenses 154 in the optical axis direction. The lens control commands are, for example, zoom control commands and focus control commands. The lens driving unit 152 may include a voice coil motor (VCM) that moves at least some or all of the plurality of lenses 154 in the optical axis direction. The lens driving unit 152 may include a motor such as a DC motor, a coreless motor, or an ultrasonic motor. The lens driving unit 152 may transmit power from the motor to at least some or all of the plurality of lenses 154 via a mechanism member such as a cam ring or a guide shaft to move them along the optical axis.
The imaging apparatus 100 further includes an attitude control unit 210, an angular velocity sensor 212, and an acceleration sensor 214. The angular velocity sensor 212 detects the angular velocity of the image pickup apparatus 100. The angular velocity sensor 212 detects angular velocities of the imaging apparatus 100 about the roll axis, the pitch axis, and the yaw axis, respectively. The attitude control section 210 acquires angular velocity information on the angular velocity of the image pickup apparatus 100 from the angular velocity sensor 212. The angular velocity information may indicate angular velocities of the imaging apparatus 100 about the roll axis, the pitch axis, and the yaw axis, respectively. The attitude control unit 210 acquires acceleration information on the acceleration of the image pickup apparatus 100 from the acceleration sensor 214. The acceleration information may indicate a vibration level indicating a magnitude of vibration of the image pickup apparatus 100. The acceleration information may indicate the acceleration of the imaging apparatus 100 in the roll axis, pitch axis, and yaw axis directions. The angular velocity information is one example of the first information. The acceleration information is one example of the second information.
The angular velocity sensor 212 and the acceleration sensor 214 may be provided in a housing for housing the image sensor 120, the lens 154, and the like. In the present embodiment, a mode in which the imaging apparatus 100 and the support mechanism 200 are integrated will be described. However, the support mechanism 200 may include a base that removably fixes the image pickup apparatus 100. In this case, the angular velocity sensor 212 and the acceleration sensor 214 may be provided outside the housing of the imaging apparatus 100 such as a base.
The attitude control unit 210 controls the support mechanism 200 based on the angular velocity information and the acceleration information so as to maintain or change the attitude of the imaging apparatus 100. The attitude control section 210 controls the support mechanism 200 so as to maintain or change the attitude of the image pickup apparatus 100 in accordance with the operation mode of the support mechanism 200 for controlling the attitude of the image pickup apparatus 100.
The operation modes include a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 of the support mechanism 200 is operated so that the attitude of the imaging apparatus 100 changes following changes in the attitude of the base 204 of the support mechanism 200. The operation modes include a mode in which, for this purpose, the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 are all operated; a mode in which only the pitch axis drive mechanism 202 and the yaw axis drive mechanism 203 are operated; and a mode in which only the yaw axis drive mechanism 203 is operated.
The operation modes may include: an FPV (first person view) mode in which the support mechanism 200 is operated so that the attitude of the imaging apparatus 100 changes following changes in the attitude of the base 204 of the support mechanism 200; and a fixed mode in which the support mechanism 200 is operated so as to maintain the attitude of the imaging apparatus 100.
The FPV mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated so that the attitude of the imaging apparatus 100 changes following the change in the attitude of the base 204 of the support mechanism 200. The fixed mode is a mode in which at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated to maintain the current posture of the imaging apparatus 100.
The imaging control unit 110 acquires angular velocity information from an angular velocity sensor 212 and acceleration information from an acceleration sensor 214 via the attitude control unit 210. The imaging control unit 110 acquires the operation mode of the support mechanism 200 from the attitude control unit 210. The imaging control unit 110 controls the timing of executing the focus control of the imaging apparatus 100 based on the operation mode of the support mechanism 200 and at least one of the angular velocity information and the acceleration information. The imaging control unit 110 may further control the timing of executing the focus control based on the position of the lens 154 functioning as a focus lens.
The imaging control unit 110 may select at least one of the angular velocity information and the acceleration information based on the operation mode of the support mechanism 200. When at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated in order to change the attitude of the imaging device 100 in accordance with the change in the attitude of the base 204 of the support mechanism 200, the imaging controller 110 may select angular velocity information and acceleration information. The imaging control unit 110 may select acceleration information when at least one of the roll axis drive mechanism 201, the pitch axis drive mechanism 202, and the yaw axis drive mechanism 203 is operated in order to maintain the attitude of the imaging device 100. The imaging control unit 110 may determine the operating state of the imaging device 100 based on at least one of the selected angular velocity information and the selected acceleration information. The imaging control unit 110 can control the execution timing of the focus control based on the operation state of the imaging apparatus 100 and the position of the lens 154.
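The selection logic described in this paragraph can be sketched as follows; the enum values and the function name are illustrative assumptions rather than identifiers defined in the patent.

```python
from enum import Enum, auto

class GimbalMode(Enum):
    FPV = auto()    # camera attitude follows the base 204
    FIXED = auto()  # camera attitude is held constant

def select_motion_signals(mode: GimbalMode, angular_velocity, acceleration):
    """Return the sensor signals used to determine the camera's operating state.

    In the follow (FPV-like) modes both angular velocity and acceleration are
    selected; in the fixed mode the gimbal cancels rotation, so only the
    acceleration (vibration level) information is selected.
    """
    if mode is GimbalMode.FPV:
        return angular_velocity, acceleration
    return None, acceleration
```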
Fig. 3 shows an example of the change of the angular velocity of the imaging apparatus 100 about the roll axis (R), about the pitch axis (P), and about the yaw axis (Y) when the user walks while holding the imaging system 10. Fig. 4 shows an example of the change of the angular velocity around the roll axis (R), around the pitch axis (P), and around the yaw axis (Y) when the user runs while holding the imaging system 10. Fig. 5 shows an example of the change in angular velocity about the roll axis (R), about the pitch axis (P), and about the yaw axis (Y) when the user pans the imaging system 10.
The imaging control unit 110 may determine the operating state of the imaging device 100 based on the angular velocity information shown in fig. 3 to 5. However, depending on the operation mode of the support mechanism 200, the imaging control unit 110 may not be able to accurately determine the operation state of the imaging apparatus 100 based on the angular velocity information. For example, when the support mechanism 200 is operated in the fixed mode, at least one of the angular velocities of the imaging apparatus 100 about the roll axis (R), about the pitch axis (P), and about the yaw axis (Y) does not substantially change even when the user is walking or running. In this case, the imaging control unit 110 cannot accurately determine the operating state of the imaging apparatus 100 based on the angular velocity information.
Fig. 6 shows an example of the change of the angular velocity of the imaging apparatus 100 about the roll axis (R), about the pitch axis (P), and about the yaw axis (Y) when the user changes from the stationary state to the walking state in the FPV mode. Fig. 7 shows an example of a change in the vibration level of the image pickup apparatus 100 when the user changes from the stationary state to the walking state in the FPV mode.
In the FPV mode, the imaging control unit 110 may determine the operation state of the imaging apparatus 100 based on changes in angular velocity of the imaging apparatus 100 around the roll axis (R), around the pitch axis (P), and around the yaw axis (Y), or changes in the vibration level of the imaging apparatus 100. In the FPV mode, the imaging control unit 110 may determine the operating state of the imaging apparatus 100 by comparing a predetermined change pattern of the angular velocity in the stationary state, the walking state, or the running state, for example, with a current angular velocity change pattern specified based on the angular velocity information.
In the FPV mode, the imaging control unit 110 may determine the operating state of the imaging apparatus 100 by comparing the vibration-level change patterns predetermined for the stationary state, the walking state, and the running state with the current vibration-level change pattern specified from the acceleration information. In the FPV mode, the imaging control unit 110 may also determine the operating state of the imaging apparatus 100 based on both the result of comparing the predetermined angular velocity change patterns with the angular velocity change pattern specified from the angular velocity information and the result of comparing the predetermined vibration-level change patterns with the vibration-level change pattern specified from the acceleration information.
The memory 130 may store angular velocity variation patterns and vibration level variation patterns in the stationary state, the walking state, and the running state, respectively, which are obtained by collecting actual measurement values at the time of actual movement of the user. The imaging control unit 110 may specify the operating state of the imaging apparatus 100 by referring to the angular velocity change pattern and the vibration level change pattern in the stationary state, the walking state, and the running state, respectively, stored in the memory 130. The memory 130 may store an angular velocity change pattern when the walking state or running state is changed to the stopped state, and a vibration level change pattern when the walking state or running state is changed to the stopped state. The angular velocity or the vibration level at the time of transition to the stopped state is represented by, for example, a change pattern in which the angular velocity or the vibration level becomes small.
Fig. 8 shows an example of the change in angular velocity of the imaging apparatus 100 about the roll axis (R), about the pitch axis (P), and about the yaw axis (Y) when the user changes from the stationary state to the walking state in the fixed mode. Fig. 9 shows an example of the change in vibration level of the imaging apparatus 100 when the user changes from the stationary state to the walking state in the fixed mode.
In the fixed mode, the support mechanism 200 operates so as to maintain the attitude of the imaging apparatus 100 without being affected by the user's motion. Therefore, even when the user changes from the stationary state to the walking state, the angular velocities of the imaging apparatus 100 about the roll axis (R), the pitch axis (P), and the yaw axis (Y) do not substantially change, and the imaging control unit 110 cannot accurately determine the operating state of the imaging apparatus 100 based on the angular velocity information. On the other hand, even in the fixed mode, the vibration level changes when the user changes from the stationary state to the walking state. That is, in the fixed mode, the imaging control unit 110 may determine the operating state of the imaging apparatus 100 by comparing the vibration-level change patterns predetermined for the stationary state, the walking state, and the running state with the current vibration-level change pattern specified from the acceleration information.
In combination with the above, when the support mechanism 200 operates in the FPV mode, the imaging control unit 110 can determine the operating state of the imaging apparatus 100 based on the angular velocity information and the acceleration information. When the support mechanism 200 operates in the FPV mode, the imaging control unit 110 may determine whether the operating state of the imaging apparatus 100 is any one of the still state, the walking state, and the running state based on a result of comparing the angular velocity change pattern specified in advance in each of the still state, the walking state, and the running state with the angular velocity change pattern specified based on the angular velocity information, and a result of comparing the vibration level change pattern specified in advance in each of the still state, the walking state, and the running state with the vibration level change pattern specified based on the acceleration information. The imaging control unit 110 can determine whether or not the operating state of the imaging apparatus 100 is the panning state shown in fig. 5 based on the respective comparison results.
When the support mechanism 200 operates in the fixed mode, the imaging control unit 110 may determine the operating state of the imaging device 100 based on the acceleration information. When the support mechanism 200 operates in the fixed mode, the imaging control unit 110 may determine whether the operating state of the imaging apparatus 100 is any one of the stationary state, the walking state, and the running state based on a result of comparison between a vibration level change pattern predetermined in advance in each of the stationary state, the walking state, and the running state and a vibration level change pattern specified based on the acceleration information.
The imaging control unit 110 may determine, as the operating state of the imaging apparatus 100, the operating state whose predetermined angular velocity change pattern has a similarity of at least a threshold value to the angular velocity change pattern specified from the angular velocity information. The imaging control unit 110 may determine, as the operating state of the imaging apparatus 100, the operating state whose predetermined vibration-level change pattern has a similarity of at least a threshold value to the vibration-level change pattern specified from the acceleration information. The imaging control unit 110 may determine, as the operating state of the imaging apparatus 100, the operating state for which both the similarity of the angular velocity change pattern and the similarity of the vibration-level change pattern are equal to or greater than the threshold values.
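A minimal sketch of this pattern-matching step is shown below, assuming reference patterns have been collected from actual measurements and stored as described above; the pattern data, the sample length of 64, the mean-absolute-difference similarity measure, and the threshold are all illustrative assumptions.

```python
import numpy as np

# Reference vibration-level patterns for each operating state (illustrative data;
# in practice they would come from measurements stored in the memory 130).
REFERENCE_VIBRATION_PATTERNS = {
    "stationary": np.full(64, 0.05),
    "walking": 0.3 * np.abs(np.sin(np.linspace(0, 8 * np.pi, 64))),
    "running": 0.8 * np.abs(np.sin(np.linspace(0, 16 * np.pi, 64))),
}

def classify_operating_state(current_pattern: np.ndarray, max_distance: float = 0.2):
    """Return the state whose reference pattern is closest to the current pattern,
    or None if no reference pattern is similar enough."""
    best_state, best_dist = None, max_distance
    for state, reference in REFERENCE_VIBRATION_PATTERNS.items():
        dist = float(np.mean(np.abs(current_pattern - reference)))
        if dist <= best_dist:
            best_state, best_dist = state, dist
    return best_state
```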
The imaging control unit 110 may control the execution timing of the focus control based on the operating state of the imaging apparatus 100 and a predetermined correspondence between the position of the focus lens and the focus control execution timing. When the time during which changes in the contrast value derived from images captured by the imaging apparatus 100 continuously satisfy a predetermined condition reaches a predetermined time, the imaging apparatus 100 executes focus control. The imaging control unit 110 can therefore control the execution timing of the focus control, that is, of autofocus (AF), by adjusting the predetermined time.
Fig. 10 shows an example of a policy table indicating a predetermined correspondence relationship between the operation state of the image pickup apparatus and the position of the focus lens and the AF execution timing. The imaging control section 110 may control the execution timing of AF according to the policy table shown in fig. 10. When the AF execution timing corresponding to the operation state of the image pickup apparatus 100 is registered in the policy table, the image pickup control part 110 may execute AF at the AF execution timing registered in the policy table.
Any one of a shortened mode, an immediate mode, a first delay mode, and a second delay mode is registered in the policy table as the AF execution timing. The shortened mode is a mode in which the AF execution timing is advanced. The imaging control unit 110 includes a counter that counts the number of frames in which the contrast value changes by a predetermined value or more. When the change in contrast value is less than the predetermined value, the imaging control unit 110 resets the count value of the counter. When the count value reaches a predetermined threshold, the imaging control unit 110 executes AF. In the shortened mode, the imaging control unit 110 can shorten the time until the count value reaches the predetermined first threshold, for example by increasing the amount added to the count value at each count. Alternatively, the imaging control unit 110 can advance the AF execution timing by executing AF when the count value reaches a second threshold smaller than the predetermined first threshold.
In the immediate mode, the imaging control unit 110 executes AF immediately, regardless of the count value of the counter. In the first delay mode, the imaging control unit 110 delays the AF execution timing by halving the current count value. In the second delay mode, the imaging control unit 110 does not execute AF until the count value reaches a third threshold larger than the predetermined first threshold, thereby delaying the AF execution timing.
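The four timing modes can be expressed as adjustments to the counter and its threshold, as in the following sketch; the halving in the first delay mode follows the text above, while the specific factors used for the shortened and second delay modes (half and double the first threshold) are illustrative assumptions.

```python
from enum import Enum, auto

class AfTiming(Enum):
    SHORTENED = auto()     # advance AF by using a smaller (second) threshold
    IMMEDIATE = auto()     # execute AF now, regardless of the counter
    FIRST_DELAY = auto()   # halve the current count value
    SECOND_DELAY = auto()  # require a larger (third) threshold before AF runs

def apply_timing_mode(mode: AfTiming, count: int, first_threshold: int):
    """Return (execute_af_now, adjusted_count, effective_threshold)."""
    if mode is AfTiming.IMMEDIATE:
        return True, 0, first_threshold
    if mode is AfTiming.SHORTENED:
        second_threshold = first_threshold // 2    # smaller than the first threshold
        return count >= second_threshold, count, second_threshold
    if mode is AfTiming.FIRST_DELAY:
        return False, count // 2, first_threshold  # halve the current count
    if mode is AfTiming.SECOND_DELAY:
        third_threshold = first_threshold * 2      # larger than the first threshold
        return count >= third_threshold, count, third_threshold
    return count >= first_threshold, count, first_threshold
```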
When the operation state of the image pickup apparatus 100 continues to be the walking state for 2 seconds or more, and the operation is stopped, and the position of the focus lens corresponds to a position near infinity, the image pickup control unit 110 may execute AF in the shortened mode.
When the operation state of the imaging device 100 continues to be a walking state, a running state, or a panning state (a state in which the imaging device 100 moves left and right slowly) for 1 second or more, and then the operation is completely stopped, and AF is not performed for 3 seconds or more, the imaging control unit 110 may perform AF in the immediate mode.
When the operation state of the image pickup apparatus 100 indicates a panning state that lasts 1.5 seconds or more, the position of the focus lens corresponds to a position in the close vicinity, and AF is not performed for 2 seconds or more, the image pickup control section 110 may perform AF in the immediate mode.
The imaging control unit 110 may execute AF in the first delay mode when the operating state of the imaging apparatus 100 indicates a panning state that lasts for 1.5 seconds or more and the position of the focus lens corresponds to a position near infinity.
When the operating state of the imaging apparatus 100 indicates a panning state lasting 8 seconds or more and AF has not been executed for 8 seconds or more, the imaging control unit 110 may execute AF in the second delay mode.
When the operating state of the image pickup apparatus 100 shows a walking state for less than 2 seconds and the position of the focus lens is a position corresponding to the close vicinity, the image pickup control section 110 may execute AF in the shortened mode.
When the operating state of the image pickup apparatus 100 indicates a running state that lasts 2 seconds or more and the position of the focus lens is a position corresponding to the vicinity of infinity, the image pickup control unit 110 may execute AF in the first delay mode.
The imaging control unit 110 may execute AF in the second delay mode when the operating state of the imaging apparatus 100 indicates a running state lasting 10 seconds or more and AF is not executed for 10 seconds or more.
The policy table shown in Fig. 10 is only one example. In the policy table, an AF execution timing may be registered in association with any operating state of the imaging apparatus 100 that can be determined from at least one of the angular velocity information and the acceleration information, and with any position of the focus lens.
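A few of the rows described above can be encoded as condition/timing rules, as in the following sketch; the dataclass fields, string labels, and the simplified way each condition is expressed are illustrative assumptions rather than the exact structure of the table in Fig. 10.

```python
from dataclasses import dataclass

@dataclass
class CameraStatus:
    state: str               # e.g. "panning", "running", "walking"
    state_duration: float    # seconds the current state has lasted
    lens_position: str       # e.g. "near_infinity", "close_range"
    seconds_since_af: float  # time elapsed since AF was last executed

# (condition, timing) pairs corresponding to some of the rows described above.
POLICY_RULES = [
    # panning for 1.5 s or more, lens at close range, no AF for 2 s -> immediate mode
    (lambda s: s.state == "panning" and s.state_duration >= 1.5
               and s.lens_position == "close_range" and s.seconds_since_af >= 2.0,
     "immediate"),
    # panning for 1.5 s or more, lens near infinity -> first delay mode
    (lambda s: s.state == "panning" and s.state_duration >= 1.5
               and s.lens_position == "near_infinity", "first_delay"),
    # running for 2 s or more, lens near infinity -> first delay mode
    (lambda s: s.state == "running" and s.state_duration >= 2.0
               and s.lens_position == "near_infinity", "first_delay"),
    # running for 10 s or more and no AF for 10 s -> second delay mode
    (lambda s: s.state == "running" and s.state_duration >= 10.0
               and s.seconds_since_af >= 10.0, "second_delay"),
]

def look_up_af_timing(status: CameraStatus):
    """Return the registered AF timing mode, or None if no rule matches."""
    for condition, timing in POLICY_RULES:
        if condition(status):
            return timing
    return None
```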
Fig. 11 is a flowchart showing one example of a control procedure of the AF execution timing of the image pickup control portion 110.
The imaging control unit 110 acquires angular velocity information and acceleration information of the imaging device 100 and operation mode information of the support mechanism 200 from the attitude control unit 210. Further, the imaging control unit 110 acquires lens position information of the focus lens from the lens control unit 150 (S100).
The imaging control unit 110 determines the operating state of the imaging apparatus 100 based on the angular velocity information, the acceleration information, the lens position information, and the operation mode information (S102). The imaging control unit 110 may select both the angular velocity information and the acceleration information, or only the acceleration information, based on the operation mode of the support mechanism 200, and may determine the operating state of the imaging apparatus 100 based on the selected information. The imaging control unit 110 then refers to the policy table stored in the memory 130 and determines whether the operating state of the imaging apparatus 100 is registered in the policy table (S104).
If the operation state of the image pickup apparatus 100 is registered in the policy table, the image pickup control section 110 controls the focus control execution timing of the image pickup apparatus 100 based on the policy table (S106). The imaging control unit 110 can select a mode corresponding to the operating state of the imaging apparatus 100 from among the shortened mode, the immediate mode, the first delay mode, and the second delay mode, and control the timing of executing the focus control of the imaging apparatus 100.
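Putting the steps of Fig. 11 together, one pass of the procedure might look like the following sketch; every object and method name here is an illustrative assumption introduced for readability, not an interface defined by the patent.

```python
def run_af_timing_control(attitude_ctrl, lens_ctrl, policy_table, imaging_ctrl):
    """One pass over steps S100-S106 of the flowchart."""
    # S100: acquire angular velocity, acceleration, operation mode, and lens position
    gyro = attitude_ctrl.angular_velocity_info()
    accel = attitude_ctrl.acceleration_info()
    mode = attitude_ctrl.operation_mode()
    lens_pos = lens_ctrl.focus_lens_position()

    # S102: select the signals according to the operation mode and classify the state
    state = imaging_ctrl.determine_operating_state(mode, gyro, accel)

    # S104: check whether the state (with the lens position) is registered in the table
    timing = policy_table.look_up(state, lens_pos)

    # S106: if registered, apply the registered AF execution timing
    if timing is not None:
        imaging_ctrl.apply_af_timing(timing)
```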
As described above, according to the present embodiment, the imaging system 10 controls the execution timing of the focus control based on the operation state of the imaging apparatus 100 and the position of the lens 154. Accordingly, the timing of executing the focus control can be controlled more appropriately in accordance with the operating state of the image pickup apparatus 100.
Fig. 12 shows another aspect of the imaging system 10. As shown in fig. 12, the camera system 10 can be used in a state in which a mobile terminal including a display such as a smartphone 400 is fixed to the side of the grip 300.
FIG. 13 illustrates one example of a computer 1200 in which aspects of the invention may be embodied, in whole or in part. The program installed on the computer 1200 can cause the computer 1200 to function as one or more "sections" of or operations associated with the apparatus according to the embodiment of the present invention. Alternatively, the program can cause the computer 1200 to execute the operation or the one or more "sections". The program enables the computer 1200 to execute the processes or the stages of the processes according to the embodiments of the present invention. Such programs may be executed by the CPU1212 to cause the computer 1200 to perform specified operations associated with some or all of the blocks in the flowchart and block diagrams described herein.
The computer 1200 of the present embodiment includes a CPU1212 and a RAM1214, which are connected to each other through a host controller 1210. The computer 1200 also includes a communication interface 1222, an input/output unit, which are connected to the host controller 1210 through the input/output controller 1220. Computer 1200 also includes a ROM 1230. The CPU1212 operates in accordance with programs stored in the ROM1230 and the RAM1214, thereby controlling the respective units.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores therein a boot program or the like executed by the computer 1200 at startup, and/or a program that depends on the hardware of the computer 1200. The program is provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. The program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 1200.
For example, when performing communication between the computer 1200 and an external device, the CPU1212 may execute a communication program loaded in the RAM1214, and instruct the communication interface 1222 to perform communication processing based on processing described in the communication program. The communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM1214 or a USB memory and transmits the read transmission data to a network, or writes reception data received from the network in a reception buffer or the like provided in the recording medium, under the control of the CPU 1212.
Further, the CPU1212 may cause the RAM1214 to read all or a necessary portion of a file or a database stored in an external recording medium such as a USB memory, and perform various types of processing on data on the RAM 1214. Then, the CPU1212 may write back the processed data to the external recording medium.
Various types of information, such as programs, data, tables, and databases, may be stored in the recording medium and subjected to information processing. With respect to data read from the RAM 1214, the CPU 1212 may execute various types of processing described throughout this disclosure and specified by the instruction sequences of the programs, including various types of operations, information processing, condition judgment, conditional branching, unconditional branching, information retrieval/replacement, and the like, and write the results back to the RAM 1214. Further, the CPU 1212 can retrieve information in files, databases, and the like within the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve an entry matching a condition that specifies the attribute value of the first attribute from among the plurality of entries, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying a predetermined condition.
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. In addition, a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the internet may be used as the computer-readable storage medium, so that the program can be provided to the computer 1200 via the network.
The present invention has been described above using the embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and modifications can be made in the above embodiments. It is apparent from the description of the claims that the modes to which such changes or improvements are made are included in the technical scope of the present invention.
It should be noted that the execution order of operations, procedures, steps, stages, and the like in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings can be realized in any order, unless expressly indicated by terms such as "before" or "prior to," and as long as the output of a preceding process is not used in a subsequent process. Even if the operational flows in the claims, the specification, and the drawings are described using terms such as "first" and "next" for convenience, this does not necessarily mean that the operations must be performed in that order.
Claims (12)
- A control device for controlling an image pickup device rotatably supported by a support mechanism, the control device comprising a circuit configured to: acquire at least one of first information on an angular velocity of the image pickup device and second information on an acceleration of the image pickup device, and an operation mode of the support mechanism for controlling an attitude of the image pickup device; and control an execution timing of focus control of the image pickup device based on the operation mode and the at least one of the first information and the second information.
- The control device of claim 1, wherein the circuit is configured to further control the execution timing of the focus control based on a position of a focus lens included in the image pickup device.
- The control device of claim 2, wherein the circuit is configured to: select at least one of the first information and the second information based on the operation mode of the support mechanism; determine an operating state of the image pickup device based on the at least one of the selected first information and the selected second information; and control the execution timing of the focus control based on the operating state of the image pickup device and the position of the focus lens.
- The control device according to claim 3, wherein the operation mode includes: a first operation mode in which the support mechanism is operated so that the attitude of the image pickup device changes in accordance with a change in an attitude of a base of the support mechanism; and a second operation mode in which the support mechanism is operated so as to maintain the attitude of the image pickup device.
- The control device of claim 4, wherein the circuit is configured to determine the operating state of the image pickup device based on the first information and the second information when the support mechanism operates in the first operation mode.
- The control device of claim 4, wherein the circuit is configured to determine the operating state of the image pickup device based on the second information when the support mechanism operates in the second operation mode.
- The control device of claim 3, wherein the circuit is configured to control the execution timing of the focus control based on the operating state of the image pickup device and a predetermined correspondence between the position of the focus lens and the execution timing of the focus control.
- The control device according to claim 1, wherein the image pickup device executes the focus control when a time during which a change in a contrast value derived from an image captured by the image pickup device continues to satisfy a predetermined condition reaches a predetermined time, and the circuit controls the execution timing of the focus control by adjusting the predetermined time.
- An imaging system comprising: the control device according to any one of claims 1 to 8; the image pickup device; and the support mechanism.
- The imaging system according to claim 9, further comprising a grip member that is connected to a base of the support mechanism and has an operation interface for operating the image pickup device and the support mechanism.
- A control method for controlling an image pickup device rotatably supported by a support mechanism, the method comprising: acquiring at least one of first information on an angular velocity of the image pickup device and second information on an acceleration of the image pickup device, and an operation mode of the support mechanism for controlling an attitude of the image pickup device; and controlling an execution timing of focus control of the image pickup device based on the operation mode and the at least one of the first information and the second information.
- A program for causing a computer to function as the control device according to any one of claims 1 to 8.
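The claims above describe a focus-control flow in which a circuit selects between angular-velocity and acceleration information according to the gimbal's operation mode, classifies the camera's operating state, and then shifts the timing at which contrast autofocus may run by adjusting how long a contrast change must persist. The Python sketch below is a minimal, hypothetical illustration of that flow; the mode names, thresholds, normalized focus-lens position, and the state-to-time mapping are assumptions made for illustration and are not taken from the patent or its claims.

```python
from enum import Enum, auto


class GimbalMode(Enum):
    FOLLOW = auto()  # first operation mode: camera attitude follows the base
    LOCK = auto()    # second operation mode: camera attitude is held fixed


class OperatingState(Enum):
    STATIONARY = auto()
    MOVING = auto()


def determine_operating_state(mode, angular_velocity, acceleration,
                              gyro_threshold=0.1, accel_threshold=0.2):
    """Select sensor information according to the gimbal mode, then classify
    the camera's operating state. Thresholds are illustrative only."""
    if mode is GimbalMode.FOLLOW:
        # In follow mode the camera rotates with the base, so both the
        # angular velocity and the acceleration are consulted.
        moving = (abs(angular_velocity) > gyro_threshold
                  or abs(acceleration) > accel_threshold)
    else:
        # In lock mode the gimbal cancels rotation, so only the
        # translational acceleration is informative.
        moving = abs(acceleration) > accel_threshold
    return OperatingState.MOVING if moving else OperatingState.STATIONARY


def af_stability_time(state, focus_position):
    """Return the time (seconds) a contrast change must persist before
    contrast AF is re-run. The mapping is a hypothetical correspondence
    between operating state, focus-lens position, and AF timing."""
    base = 0.5 if state is OperatingState.STATIONARY else 1.5
    # Wait a little longer when the focus lens is near an end of its
    # stroke (normalized position 0.0 to 1.0), where hunting is more visible.
    margin = min(focus_position, 1.0 - focus_position)
    return base + (0.3 if margin < 0.1 else 0.0)


def should_refocus(contrast_change_duration, state, focus_position):
    """Allow AF only after the contrast change has lasted long enough."""
    return contrast_change_duration >= af_stability_time(state, focus_position)


# Example: while moving in lock mode, a 1.0 s contrast change is not yet
# enough to trigger AF; after 2.0 s it is.
state = determine_operating_state(GimbalMode.LOCK, angular_velocity=0.02,
                                  acceleration=0.35)
print(should_refocus(1.0, state, focus_position=0.5))  # False
print(should_refocus(2.0, state, focus_position=0.5))  # True
```

In a real implementation the "predetermined time" would come from a calibrated correspondence table rather than the constants used here; the sketch only shows how mode-dependent sensor selection and a state-dependent delay fit together.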
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019061606A JP6686254B1 (en) | 2019-03-27 | 2019-03-27 | Control device, imaging system, control method, and program |
| JP2019-061606 | 2019-03-27 | | |
| PCT/CN2020/080211 WO2020192551A1 (en) | 2019-03-27 | 2020-03-19 | Control device, camera system, control method, and program |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN112136318A (en) | 2020-12-25 |
| CN112136318B (en) | 2022-04-01 |
Family ID: 70286883
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202080002784.5A Expired - Fee Related CN112136318B (en) | 2019-03-27 | 2020-03-19 | Control device, imaging system, control method, and computer-readable storage medium |
Country Status (3)
| Country | Link |
|---|---|
| JP (1) | JP6686254B1 (en) |
| CN (1) | CN112136318B (en) |
| WO (1) | WO2020192551A1 (en) |
- 2019-03-27: JP JP2019061606A patent/JP6686254B1/en, not_active Expired - Fee Related
- 2020-03-19: WO PCT/CN2020/080211 patent/WO2020192551A1/en, not_active Ceased
- 2020-03-19: CN CN202080002784.5A patent/CN112136318B/en, not_active Expired - Fee Related
Patent Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2010211166A (en) * | 2009-02-13 | 2010-09-24 | Fujitsu Ltd | Image pickup apparatus, portable terminal device, and focusing mechanism control method |
| CN102879976A (en) * | 2012-08-13 | 2013-01-16 | 深圳市先河系统技术有限公司 | Automatic focusing method and camera device |
| CN107026960A (en) * | 2015-12-25 | 2017-08-08 | 奥林巴斯株式会社 | Camera device |
| CN107079103A (en) * | 2016-05-31 | 2017-08-18 | 深圳市大疆灵眸科技有限公司 | Gimbal control method and device, and gimbal |
| WO2018137134A1 (en) * | 2017-01-24 | 2018-08-02 | Sz Dji Osmo Technology Co., Ltd. | Query response by a gimbal mounted camera |
| CN206786246U (en) * | 2017-05-11 | 2017-12-22 | 蔡子昊 | High-performance intelligent shooting gimbal |
| CN107864340A (en) * | 2017-12-13 | 2018-03-30 | 浙江大华技术股份有限公司 | Method for adjusting a photographing parameter, and photographing device |
| CN108259703A (en) * | 2017-12-31 | 2018-07-06 | 深圳市秦墨科技有限公司 | Gimbal follow-shooting control method and device, and gimbal |
| CN108184067A (en) * | 2018-01-18 | 2018-06-19 | 桂林智神信息技术有限公司 | Working method of a follow-focus system |
| CN208239858U (en) * | 2018-06-25 | 2018-12-14 | 智卓(深圳)电子科技有限公司 | Handheld gimbal with touch-key control |
Non-Patent Citations (1)
| Title |
|---|
| 无 (no author listed): "DJI大疆创新如影Ronin-S手持云台正式发售" ("DJI's Ronin-S handheld gimbal officially goes on sale"), 《数码影像时代》 (Digital Imaging Era) * |
Also Published As
| Publication number | Publication date |
|---|---|
| CN112136318B (en) | 2022-04-01 |
| JP6686254B1 (en) | 2020-04-22 |
| JP2020160349A (en) | 2020-10-01 |
| WO2020192551A1 (en) | 2020-10-01 |
Similar Documents
| Publication | Title |
|---|---|
| US10237491B2 (en) | Electronic apparatus, method of controlling the same, for capturing, storing, and reproducing multifocal images |
| JPWO2008114683A1 (en) | Computer program product for subject tracking, subject tracking device, and camera |
| CN106412437B (en) | Terminal focusing method and device and terminal |
| JP2019146022A (en) | Imaging device and imaging method |
| JP2009027212A (en) | Imaging device |
| CN105629428A (en) | Optical instrument and control method for lens |
| CN104811607B (en) | Photographic device, image capture method and recording medium |
| CN111052725B (en) | Control device, imaging device, control method, and program |
| US11394878B2 (en) | Image capturing apparatus, method of controlling image capturing apparatus, and storage medium |
| CN112585938B (en) | Control device, imaging device, control method, and program |
| CN112136318B (en) | Control device, imaging system, control method, and computer-readable storage medium |
| CN110830726B (en) | Automatic focusing method, device, equipment and storage medium |
| CN112313574B (en) | Control device, imaging system, control method, and program |
| CN111226433B (en) | Specifying device, control device, imaging device, specifying method, and program |
| WO2021088669A1 (en) | Control apparatus, photographing apparatus, control method and program |
| JP5932340B2 (en) | Focus adjustment device |
| JP6858065B2 (en) | Imaging device and its control method |
| JP6543876B2 (en) | CONTROL DEVICE, IMAGING SYSTEM, MOBILE OBJECT, CONTROL METHOD, AND PROGRAM |
| CN112369010B (en) | Control device, imaging device, and control method |
| US20250039548A1 (en) | Information processing apparatus, information processing method, and non-transitory computer-readable storage medium |
| JP6812618B1 (en) | Control device, imaging device, control method, and program |
| JP2008294667A (en) | IMAGING DEVICE, ITS CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM |
| JP2019185541A (en) | Image processing apparatus, image processing method, and program |
| JP2025179366A (en) | Image capture device, image capture device control method, and program |
| CN111936927A (en) | Control device, imaging device, system, control method, and program |
Legal Events
| Code | Title | Description |
|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20220401 |