
WO2019065551A1 - Imaging system, imaging device, imaging method, and imaging program - Google Patents

Imaging system, imaging device, imaging method, and imaging program

Info

Publication number
WO2019065551A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
unit
image data
sorting
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/035255
Other languages
English (en)
Japanese (ja)
Inventor
宏輔 栗林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Corp filed Critical Fujifilm Corp
Publication of WO2019065551A1 publication Critical patent/WO2019065551A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • The present invention relates to an imaging system, an imaging apparatus, an imaging method, and an imaging program, and more particularly to imaging support.
  • There is known an imaging apparatus provided with an imaging support function that acquires information on an imaging position and displays guide information, such as the imaging position and imaging angle, to the photographer based on an image captured at that position in the past.
  • Patent Document 1 describes an imaging device that uses augmented reality technology to guide the user to the position at which a reference photo was captured.
  • The imaging device described in Patent Document 1 uses augmented reality technology to display a virtual object on the liquid crystal screen on which the subject is displayed.
  • The virtual object changes in the same manner as the real subject according to the imaging position and the angle of view.
  • Patent Document 2 describes an imaging device that displays guidance for leading the photographer to the place where a reference photo was captured, in order to capture the same photo as the reference photo.
  • The imaging device described in Patent Document 2 displays, on the display unit, a reference photograph transmitted from a photograph posting site together with guidance information such as the movement distance to the imaging location, the movement direction, and the height difference.
  • Patent Document 3 describes an imaging device that, in order to capture the same photo as a reference photo, displays on a monitor in real time the shifts in latitude and longitude between the location where the reference photo was captured and the current position.
  • Patent Document 4 describes an imaging support system that displays images suitable for imaging support when many images are stored in an image database.
  • The imaging support system described in Patent Document 4 displays, on the display unit of the imaging device, a plurality of reference composition images arranged in descending order of priority.
  • Patent Document 1 to Patent Document 3 display guide information and the like based on a single reference photograph determined by the user.
  • Patent Document 1 to Patent Document 3 do not describe how to narrow down candidate reference photographs for imaging when a plurality of reference photographs are stored in a storage unit such as an image database. Nor do they describe how to display the narrowed-down reference photographs or how the user selects from them.
  • Although the imaging support system described in Patent Document 4 displays a plurality of reference composition images arranged in descending order of priority, Patent Document 4 does not describe how the plurality of displayed reference composition images are used.
  • The present invention has been made in view of such circumstances, and an object of the present invention is to provide an imaging system, an imaging apparatus, an imaging method, and an imaging program that enable selection of a reference picture to serve as a model of composition from a storage unit in which a plurality of reference pictures are stored, and that realize imaging using the selected reference picture.
  • The imaging system according to one aspect of the present invention comprises: an image data acquisition unit that acquires a plurality of image data to serve as references for composition from a storage unit in which a plurality of image data are stored; a sort condition setting unit that sets sort conditions for the plurality of image data acquired using the image data acquisition unit; a sorting unit that sorts the plurality of image data based on the sort conditions set using the sort condition setting unit; a display unit that displays the images corresponding to the plurality of image data sorted using the sorting unit, arranged in the sort order under the set sort conditions; a selection unit that selects an image from the plurality of images displayed using the display unit; an imaging condition setting unit that sets imaging conditions of the image data corresponding to the image selected using the selection unit; a composition support unit that indicates an imaging position and an imaging direction at which the same image as the selected image can be captured; and an imaging unit that performs imaging by applying the imaging conditions set using the imaging condition setting unit and the imaging position and imaging direction indicated using the composition support unit.
  • When a plurality of image data to serve as references for composition are acquired, the plurality of image data are sorted based on the sort conditions and displayed arranged in the sort order. This makes it possible to easily select the image data to serve as the reference for the composition.
  • Image data is selected from among the image data displayed in the sort order.
  • The imaging conditions, imaging position, and imaging direction of the selected image data are set in the imaging unit. This enables imaging using the imaging conditions and composition of the selected image.
  • An acquisition range specification unit may be provided to specify the acquisition range of the image data.
  • The imaging position of the image data is an example of the acquisition range of the image data.
  • The storage unit may store a plurality of image data obtained by imaging at a designated point and at points around the designated point, and the imaging unit may be configured to capture an image at the designated point.
  • The designated point can be automatically set to the current position of the imaging device.
  • The designated point may also be manually set by the user.
  • The sort condition setting unit may set, as a sort condition, at least one of the number of times of imaging, the number of times a social button has been operated, and the number of times the image data has been read from the storage unit in the past.
  • As indicators of the order of popularity, the number of times of imaging, the number of times the social button has been operated, and the number of times the image data has been read from the storage unit in the past can be set.
  • One or more sort conditions may be set. When a plurality of sort conditions are set, it is preferable to set priorities for the sort conditions.
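  • The following Python sketch (not part of the publication) illustrates one way such multi-key sorting with priorities could be implemented; the ReferencePhoto structure and its field names are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class ReferencePhoto:
    # Hypothetical metadata for a stored reference photo.
    photo_id: str
    shot_count: int   # number of times the scene was imaged
    like_count: int   # number of times the social button was operated
    read_count: int   # number of times read from the storage unit

def sort_reference_photos(photos, criteria):
    """Sort reference photos by several criteria, highest priority first.

    `criteria` is a list of attribute names ordered by priority,
    e.g. ["like_count", "shot_count"].
    """
    # Python's sort is stable, so applying the lowest-priority key first and
    # the highest-priority key last produces a multi-key ordering.
    result = list(photos)
    for key in reversed(criteria):
        result.sort(key=lambda p, k=key: getattr(p, k), reverse=True)
    return result

photos = [
    ReferencePhoto("A", shot_count=12, like_count=40, read_count=7),
    ReferencePhoto("B", shot_count=30, like_count=40, read_count=2),
    ReferencePhoto("C", shot_count=5, like_count=90, read_count=11),
]
for p in sort_reference_photos(photos, ["like_count", "shot_count"]):
    print(p.photo_id)  # C, B, A
```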
  • The sort condition setting unit may set conditions of the imaging device used for imaging as sort conditions.
  • Conditions of the imaging device can thus be set as sort conditions.
  • The degree of matching of the imaging device includes the degree of matching of an optical system such as the lens, the degree of matching of the imaging element, and the degree of matching of the manufacturer.
  • In a fifth aspect, the imaging system of the fourth aspect may be provided with an imaging log acquisition unit that acquires the imaging log of the imaging device used for imaging, and the conditions of the imaging device may be configured to include the imaging history of the imaging device acquired using the imaging log acquisition unit.
  • The imaging history of the imaging device used for imaging can thus be set as a sort condition.
  • The imaging history of the imaging device used for imaging can be acquired from the incidental information attached to the image data stored in the imaging device.
  • An imaging history storage unit that stores the imaging history may be provided.
  • The sort condition setting unit may set two or more types of sort conditions including the conditions of the imaging device.
  • Two or more types of sort conditions, including the conditions of the imaging device, can thus be set. This enables imaging that reflects both the conditions of the imaging device and conditions such as the order of popularity.
  • Priorities may be set for the sort conditions.
  • The display control unit may be configured to cause the display unit to display the images from the first place in the sort order under the sort conditions down to a predetermined rank.
  • According to the seventh aspect, it is possible to preferentially select an image that is higher in the sort order.
  • The display control unit may adjust the number of images to be displayed on the display unit in accordance with the display capability of the display unit.
  • According to the eighth aspect, it is possible to display on the display unit a number of images suited to the capability of the display unit.
  • The display capability includes the number of images that can be displayed, the resolution of the display unit, and the like.
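  • As a rough illustration only, the following sketch shows how a display control unit might choose the number of reference images from the display resolution; the thumbnail size and the upper bound of nine images are assumed values, not values from the publication.

```python
def images_to_display(display_width_px, display_height_px,
                      thumb_w=480, thumb_h=360, max_images=9):
    """Choose how many reference images to show based on display capability.

    The thumbnail size and the nine-image upper bound are illustrative
    assumptions.
    """
    cols = max(1, display_width_px // thumb_w)
    rows = max(1, display_height_px // thumb_h)
    return min(cols * rows, max_images)

print(images_to_display(1280, 720))   # e.g. a camera rear monitor: 4 images
print(images_to_display(1080, 1920))  # e.g. a phone display: 9 images
```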
  • The display control unit may be configured to display at least some of the images on the display unit in reduced form when the number of images to be displayed on the display unit is equal to or greater than a predetermined number.
  • According to the ninth aspect, it is possible to display a larger number of images on the display unit.
  • A tenth aspect is the imaging system according to any one of the seventh to ninth aspects, wherein the display control unit may superimpose on the subject, at the time of imaging at the designated point, at least one of the composition support information provided using the composition support unit and the imaging conditions set using the imaging condition setting unit.
  • This allows at least one of the composition support information and the imaging conditions to be referred to during imaging.
  • The display control unit may be configured to display at least one of the composition support information and the imaging conditions using augmented reality.
  • An augmented reality display can be used to present at least one of the composition support information and the imaging conditions.
  • It is preferable to generate an augmented reality image.
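  • The sketch below is an assumption-laden illustration, not the publication's method: it shows one simple way of superimposing composition support cues and imaging conditions on a live-view frame using OpenCV, with invented parameter names and drawing style.

```python
import cv2
import numpy as np

def overlay_composition_support(frame, move_direction_deg, distance_m, exposure_text):
    """Draw simplified composition-support cues on a live-view frame.

    Only a 2-D overlay for illustration; the actual augmented reality
    rendering is not specified here. All parameter names are assumptions.
    """
    h, w = frame.shape[:2]
    center = (w // 2, h // 2)
    # Arrow pointing toward the imaging position the user should move to.
    angle = np.deg2rad(move_direction_deg)
    tip = (int(center[0] + 120 * np.cos(angle)),
           int(center[1] - 120 * np.sin(angle)))
    cv2.arrowedLine(frame, center, tip, (0, 255, 0), 3, tipLength=0.3)
    # Text with the remaining distance and the imaging conditions to apply.
    cv2.putText(frame, f"move {distance_m:.1f} m", (20, 40),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.putText(frame, exposure_text, (20, 80),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    return frame

# Example usage with a dummy frame:
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
overlay_composition_support(frame, move_direction_deg=45, distance_m=3.2,
                            exposure_text="1/250 s  f/8  ISO 200")
```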
  • The imaging device may have a configuration including an imaging unit, a display unit, and a display control unit.
  • In the imaging device, display of at least one of the composition support information and the imaging conditions, and the corresponding display control, can be performed.
  • A server device and an imaging device connected to the server device via a network may be provided, with the server device including the storage unit.
  • The imaging device may be configured to include a display unit, an imaging condition setting unit, a composition support unit, and an imaging unit.
  • The imaging device can acquire a plurality of images from the server device.
  • The imaging device can arrange the plurality of images in the sort order and display them on the display unit of the imaging device.
  • The imaging device can set imaging conditions corresponding to the selected image.
  • The imaging device can indicate the composition corresponding to the selected image.
  • The server device may include a sort condition setting unit and a sorting unit.
  • Alternatively, the imaging device may be configured to include a sort condition setting unit and a sorting unit.
  • In that case, the imaging device can set the sort conditions, sort the plurality of images using the set sort conditions, and arrange and display the plurality of images on the display unit of the imaging device in the sort order.
  • The selection unit may be provided as an operation unit used to select an image from the plurality of images displayed using the display unit.
  • The user can operate the operation unit to select an image to serve as the reference for composition from among the plurality of images displayed in the sort order.
  • The imaging device according to another aspect comprises: an image data acquisition unit that acquires a plurality of image data to serve as references for composition from a plurality of image data stored in the storage unit; a sort condition setting unit that sets sort conditions for the plurality of image data acquired using the image data acquisition unit; a sorting unit that sorts the plurality of image data based on the sort conditions set using the sort condition setting unit; a display unit that displays the images corresponding to the sorted plurality of image data arranged in the sort order under the set sort conditions; a selection unit that selects an image from the plurality of images displayed using the display unit; an imaging condition setting unit that sets imaging conditions of the image data corresponding to the image selected using the selection unit; a composition support unit that indicates an imaging position and an imaging direction at which the same image as the selected image can be captured; and an imaging unit that performs imaging by applying the imaging conditions set using the imaging condition setting unit and the imaging position and imaging direction indicated using the composition support unit.
  • In the imaging device, matters similar to those specified in the second to fifteenth aspects can be combined as appropriate.
  • In that case, a component that carries out the processing or function specified in the imaging system can be understood as a component of the imaging device that carries out the corresponding processing or function.
  • The imaging method according to another aspect comprises: an image data acquisition step of acquiring a plurality of image data to serve as references for composition from among a plurality of image data stored in the storage unit; a sort condition setting step of setting sort conditions for the plurality of image data acquired in the image data acquisition step; a sorting step of sorting the plurality of image data based on the sort conditions set in the sort condition setting step; a display step of arranging and displaying the images corresponding to the sorted plurality of image data in the sort order under the set sort conditions; a selection step of selecting an image from the plurality of images displayed in the display step; an imaging condition setting step of setting imaging conditions of the image data corresponding to the selected image; a composition support step of indicating an imaging position and an imaging direction at which the same image as the selected image can be captured; and an imaging step of performing imaging by applying the imaging conditions set in the imaging condition setting step and the imaging position and imaging direction indicated in the composition support step.
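  • Purely as an illustration of the sequence of steps, the following sketch strings them together as one pipeline; `storage`, `display` and `camera` are hypothetical interfaces, and every method called on them is an assumption made only to show the order of the steps.

```python
def imaging_support_method(storage, display, camera, designated_point, sort_key):
    """Sketch of the described sequence of steps, under assumed interfaces."""
    # Image data acquisition step: reference photos for the designated point
    # and its surroundings.
    photos = storage.fetch_reference_photos(designated_point)
    # Sorting step, with the sort conditions expressed here as a key function.
    photos = sorted(photos, key=sort_key, reverse=True)
    # Display step: arrange the images in the sort order on the display unit.
    display.show_in_order(photos)
    # Selection step: the user picks one reference photo.
    selected = display.wait_for_selection()
    # Imaging condition setting step: apply the reference photo's conditions.
    camera.apply_conditions(selected.imaging_conditions)
    # Composition support step: indicate the imaging position and direction.
    display.show_composition_guide(selected.imaging_position,
                                   selected.imaging_direction)
    # Imaging step.
    return camera.capture()
```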
  • In the imaging method, matters similar to those specified in the second to fifteenth aspects can be combined as appropriate.
  • In that case, a component that carries out the processing or function specified in the imaging system can be understood as a component of the imaging method that carries out the corresponding processing or function.
  • The imaging program according to another aspect causes a computer to realize: an image data acquisition function of acquiring a plurality of image data to serve as references for composition from among a plurality of image data stored in the storage unit; a sort condition setting function of setting sort conditions for the plurality of image data acquired using the image data acquisition function; a sorting function of sorting the plurality of image data based on the sort conditions set using the sort condition setting function; a display function of displaying the images corresponding to the sorted plurality of image data arranged in the sort order under the set sort conditions; a selection function of selecting an image from the plurality of images displayed using the display function; an imaging condition setting function of setting imaging conditions of the image data corresponding to the image selected using the selection function; a composition support function of indicating an imaging position and an imaging direction at which the same image as the selected image can be captured; and an imaging function of performing imaging by applying the set imaging conditions and the indicated imaging position and imaging direction.
  • In the imaging program, matters similar to those specified in the second to fifteenth aspects can be combined as appropriate.
  • In that case, a component that carries out the processing or function specified in the imaging system can be understood as a component of the imaging program that carries out the corresponding processing or function.
  • An eighteenth aspect is an imaging device having at least one processor and at least one memory, wherein the processor realizes: an image data acquisition function of acquiring a plurality of image data from a storage unit in which the plurality of image data are stored; a sort condition setting function of setting sort conditions for the plurality of image data acquired using the image data acquisition function; a sorting function of sorting the plurality of image data based on the sort conditions set using the sort condition setting function; a display function of displaying the sorted images arranged in the sort order; a selection function of selecting an image from among the displayed images; an imaging condition setting function of setting imaging conditions for the image data corresponding to the image selected using the selection function; a composition support function of indicating an imaging position and an imaging direction at which the same image as the selected image can be captured; and an imaging function of performing imaging by applying the set imaging conditions and the indicated imaging position and imaging direction. This configuration can likewise be realized as an imaging system.
  • According to the present invention, when a plurality of image data to serve as references for composition are acquired, the plurality of image data are sorted based on the sort conditions and displayed arranged in the sort order. This makes it possible to easily select the image data to serve as the reference for the composition.
  • Image data is selected from among the image data displayed in the sort order.
  • The imaging conditions, imaging position, and imaging direction of the selected image data are set in the imaging unit. This enables imaging using the imaging conditions and composition of the selected image.
  • FIG. 1 is a front perspective view showing the appearance of an embodiment of a digital camera.
  • FIG. 2 is a rear perspective view of the digital camera shown in FIG.
  • FIG. 3 is a schematic block diagram of a finder device.
  • FIG. 4 is a schematic block diagram of the inside of the digital camera.
  • FIG. 5 is a block diagram of main functions realized by the camera microcomputer.
  • FIG. 6 is an explanatory view showing an outline of the imaging support function.
  • FIG. 7 is an explanatory view of a reference photograph showing an example of the reference photograph.
  • FIG. 8 is a block diagram of a camera microcomputer for realizing the imaging support function.
  • FIG. 9 is a flowchart showing the flow of the procedure of the imaging support function.
  • FIG. 10 is a diagram showing an example of the designated point setting screen.
  • FIG. 11 is a view showing an example of the result screen of the designated point automatic setting.
  • FIG. 12 is a view showing another example of the designated point setting screen.
  • FIG. 13 is a diagram showing an example of the sort reference setting screen.
  • FIG. 14 is a diagram showing an example of the sort criterion selection screen.
  • FIG. 15 is a view showing another example of the sort criterion selection screen.
  • FIG. 16 is a view showing an example of the reference photograph display screen.
  • FIG. 17 is a diagram showing an example of the reference photo selection screen.
  • FIG. 18 is a view showing an example of the composition information display screen.
  • FIG. 19 is a diagram showing an example of imaging condition display.
  • FIG. 20 shows an example of the exposure adjustment selection screen.
  • FIG. 21 is a block diagram of an imaging system showing an application example of a network system.
  • FIG. 22 is a block diagram of a modification of the imaging system shown in FIG.
  • FIG. 1 is a front perspective view showing the appearance of an embodiment of a digital camera.
  • FIG. 2 is a rear perspective view of the digital camera shown in FIG.
  • The digital camera 1 is an example of an imaging device.
  • The digital camera 1 shown in FIGS. 1 and 2 is a lens-integrated digital camera in which an imaging lens 4 is integrated with a camera body 2.
  • The digital camera 1 includes an imaging lens 4, an operation unit 6, a rear monitor 8, a finder device 10, and the like in the camera body 2.
  • The imaging lens 4 is provided on the front of the camera body 2.
  • The imaging lens 4 is a single focal length lens having a focusing function, and is configured by combining a plurality of lenses.
  • The operation unit includes the power lever 12, the shutter button 14, the first function button 16, the exposure correction dial 18, the shutter speed / sensitivity dial 20, the front command dial 22, the finder switching lever 24, the second function button 26, the focus mode switching lever 28, the view mode button 30, the metering mode button 32, the AE lock button 34, the rear command dial 36, the focus lever 38, the play button 40, the erase button 42, the back button 44, the cross button 46, the menu / OK button 48, the AF lock button 50, the quick menu button 52, and the like.
  • In FIGS. 1 and 2, illustration of the reference numeral 6 representing the operation unit is omitted.
  • The reference numeral 6 representing the operation unit is illustrated in FIG. 4.
  • AE is an abbreviation of automatic exposure.
  • AF is an abbreviation of auto focus.
  • the power lever 12, the shutter button 14, the first function button 16, the exposure correction dial 18, and the shutter speed / sensitivity dial 20 are provided on the upper surface of the camera body 2.
  • the front command dial 22, the finder switching lever 24, the second function button 26 and the focus mode switching lever 28 are provided on the front of the camera body 2.
  • the rear monitor 8 is provided on the rear of the camera body 2.
  • the rear monitor 8 is an example of a display unit.
  • the rear monitor 8 is configured of, for example, an LCD.
  • LCD is an abbreviation of liquid crystal display.
  • the finder device 10 is a hybrid type finder device capable of switching between the OVF mode and the EVF mode.
  • the finder device 10 includes a finder window 10 a on the front side of the camera body 2 and a finder eyepiece 10 b on the rear side.
  • OVF is an abbreviation of optical viewfinder.
  • EVF is an abbreviation of electronic viewfinder.
  • FIG. 3 is a schematic block diagram of a finder device.
  • the finder device 10 includes a finder first optical system 60, a finder LCD 62, a finder second optical system 64, and a liquid crystal shutter 66.
  • the finder first optical system 60 is an optical system for optically observing an object.
  • the finder first optical system 60 includes an objective lens group 60a and an eyepiece lens group 60b.
  • the finder first optical system 60 constitutes an optical finder.
  • the finder second optical system 64 is an optical system for guiding the image displayed on the finder LCD 62 to the light path of the finder first optical system 60.
  • the finder second optical system 64 and the finder LCD 62 constitute a superimposed display unit.
  • the finder second optical system 64 includes a half prism 64 a and a target lens group 64 b.
  • the half prism 64 a is disposed between the objective lens group 60 a and the eyepiece lens group 60 b of the finder first optical system 60.
  • the half prism 64a has a semipermeable membrane 64a1 on the inner slope.
  • the semipermeable membrane 64a1 splits the light incident on the half prism 64a into transmitted light and reflected light.
  • the target lens group 64b is disposed between the finder LCD 62 and the half prism 64a, and guides the light from the finder LCD 62 to the eyepiece group 60b via the half prism 64a.
  • the liquid crystal shutter 66 is disposed between the finder window 10a and the objective lens group 60a.
  • the liquid crystal shutter 66 opens and closes the finder window 10a. That is, when the finder window 10a is opened, the light from the finder window 10a is transmitted. When the finder window 10a is closed, the light from the finder window 10a is blocked.
  • In the finder device 10 having the above configuration, when the liquid crystal shutter 66 is opened, light from the subject is transmitted through the objective lens group 60a, the half prism 64a, and the eyepiece lens group 60b and guided to the finder eyepiece 10b.
  • When looking through the finder eyepiece 10b, it is therefore possible to observe an optical image of the subject.
  • An optical image of a subject is sometimes called an optical finder image.
  • When the liquid crystal shutter 66 is opened and an image is displayed on the finder LCD 62, light from the finder LCD 62 is incident on the half prism 64a via the target lens group 64b. The light incident on the half prism 64a is reflected by the semipermeable film 64a1 and is guided to the eyepiece lens group 60b. As a result, when looking through the finder eyepiece 10b, the image on the finder LCD 62 is observed superimposed on the optical finder image observed through the finder first optical system 60.
  • When the liquid crystal shutter 66 is closed, the incidence of light on the objective lens group 60a is blocked. As a result, even when looking through the finder eyepiece 10b, it is not possible to observe the optical finder image. On the other hand, the light from the finder LCD 62 is guided to the finder eyepiece 10b. Therefore, the image on the finder LCD 62 can be observed.
  • the mode in which the optical finder image can be observed is the OVF mode
  • the mode in which only the image on the finder LCD 62 can be observed is the EVF mode.
  • Switching between the OVF mode and the EVF mode is performed by operating the finder switching lever 24.
  • the finder switching lever 24 is provided so as to be swingable, and each time it is swung, the OVF mode and the EVF mode are alternately switched.
  • the OVF mode is set, the liquid crystal shutter 66 is opened, and when the EVF mode is set, the liquid crystal shutter 66 is closed.
  • FIG. 4 is a schematic block diagram of the inside of the digital camera.
  • the digital camera 1 includes an imaging lens 4, a finder device 10, an image sensor 70, an analog signal processing unit 74, a digital signal processing unit 76, a back monitor 8, a storage unit 78, a communication unit 80, an operation unit 6, a camera microcomputer 100, and the like. Prepare.
  • the imaging lens 4 is configured by combining a plurality of lenses.
  • the imaging lens 4 includes a focus lens drive mechanism 82 and an aperture drive mechanism 84.
  • the focus lens drive mechanism 82 includes an actuator and a drive circuit of the actuator.
  • the focus lens drive mechanism 82 moves the focus lens group 4 f along the optical axis L in accordance with an instruction from the camera microcomputer 100. This adjusts the focus.
  • the diaphragm drive mechanism 84 includes an actuator and a drive circuit of the actuator.
  • the diaphragm drive mechanism 84 operates the diaphragm 4i in accordance with an instruction from the camera microcomputer 100. That is, the diaphragm drive mechanism 84 switches the aperture of the diaphragm 4i. Thereby, the amount of light incident on the image sensor 70 is adjusted.
  • In the finder device 10, the display of the finder LCD 62 is controlled by the camera microcomputer 100 via the finder LCD driver 62dr. Further, in the finder device 10, the driving of the liquid crystal shutter 66 is controlled by the camera microcomputer 100 via the liquid crystal shutter driver 66dr.
  • the digital camera 1 may not have the finder device 10.
  • the image sensor 70 receives light passing through the imaging lens 4 to capture an image.
  • the image sensor 70 is an example of a component of an imaging unit.
  • the image sensor 70 is configured by a known two-dimensional image sensor such as a CMOS type or a CCD type.
  • CMOS is an abbreviation of complementary metal-oxide semiconductor.
  • CCD is an abbreviation of charge coupled device.
  • the analog signal processing unit 74 takes in an analog image signal for each pixel output from the image sensor 70 and performs signal processing on the analog image signal.
  • Signal processing for an analog image signal includes correlated double sampling processing, amplification processing, and the like.
  • the analog signal processing unit 74 includes an ADC, converts an analog image signal after predetermined signal processing into a digital image signal, and outputs the digital image signal.
  • ADC is an abbreviation of analog-to-digital converter.
  • An analog-to-digital converter may be called an AD converter.
  • the digital signal processing unit 76 takes in the digital image signal output from the analog signal processing unit 74, performs signal processing on the digital image signal, and generates image data. The generated image data is output to the camera microcomputer 100.
  • Signal processing on digital image signals includes tone conversion processing, white balance correction processing, gamma correction processing, synchronization processing, YC conversion processing, and the like.
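  • As a toy illustration of this kind of processing chain (not the camera's actual implementation), the following sketch applies white balance gains, gamma correction, and a YC conversion to an RGB frame; the gain and gamma values are arbitrary assumptions.

```python
import numpy as np

def develop(raw_rgb, wb_gains=(2.0, 1.0, 1.5), gamma=2.2):
    """Toy chain: white balance, gamma correction, and YC conversion."""
    img = raw_rgb.astype(np.float32) / 255.0
    # White balance correction: per-channel gains (assumed values).
    img *= np.array(wb_gains, dtype=np.float32)
    img = np.clip(img, 0.0, 1.0)
    # Gamma correction.
    img = img ** (1.0 / gamma)
    # YC conversion: BT.601 luma plus two colour-difference signals.
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) * 0.564
    cr = (r - y) * 0.713
    return np.stack([y, cb, cr], axis=-1)

frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
print(develop(frame).shape)  # (480, 640, 3)
```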
  • Y represents luminance.
  • C represents a color difference.
  • the digital signal processing unit 76 detects information of the brightness of the subject necessary for the exposure control based on the captured image signal. Information on the detected brightness of the subject is output to the camera microcomputer 100.
  • the digital signal processing unit 76 detects information of the contrast of the subject necessary for the autofocus control based on the captured image signal.
  • the detected contrast information is output to the camera microcomputer 100.
  • The display of the rear monitor 8 is controlled by the camera microcomputer 100 via the LCD driver 8dr.
  • the storage unit 78 stores various data including image data.
  • the storage unit 78 is configured to include a built-in memory and a control circuit that reads and writes data from and to the built-in memory.
  • the built-in memory is composed of, for example, a non-volatile memory such as an EEPROM. Reading and writing of data to the storage unit 78 is controlled by the camera microcomputer 100.
  • EEPROM is an abbreviation of electrically erasable programmable read-only memory.
  • The storage unit 78 can also be configured using an external memory such as a memory card.
  • When an external memory such as a memory card is used, the digital camera 1 is provided with a card slot or the like for loading the memory card.
  • the communication unit 80 wirelessly or wiredly communicates with an external device under the control of the camera microcomputer 100, and transmits / receives various signals to / from each other.
  • The communication method is not particularly limited; commonly used communication methods are employed, such as a communication method based on the wireless LAN standard, a communication method based on a specified low-power radio standard, a communication method using a mobile telephone network, and a communication method according to the USB standard.
  • LAN is an abbreviation of local area network.
  • USB is an abbreviation of universal serial bus.
  • The operation unit 6 outputs a signal corresponding to the operation of each operation member to the camera microcomputer 100.
  • the camera microcomputer 100 functions as a control unit that generally controls the entire operation of the digital camera 1.
  • the camera microcomputer 100 also functions as an arithmetic processing unit that calculates physical quantities necessary for controlling the digital camera 1.
  • the camera microcomputer 100 is configured using a computer including a CPU, a RAM, and a ROM, or a microcomputer.
  • CPU is an abbreviation of central processing unit.
  • RAM is an abbreviation of random access memory.
  • ROM is an abbreviation of read only memory.
  • the camera microcomputer 100 executes a focus control program, an exposure control program, a display control program, an image combining program, and the like to realize various functions.
  • the program executed by the camera microcomputer 100 and various data required for control executed by the camera microcomputer 100 are stored in the ROM.
  • FIG. 5 is a block diagram of main functions implemented by the camera microcomputer.
  • The camera microcomputer 100 functions as a focus control unit 112, an exposure setting unit 114, an aperture control unit 118, a rear monitor display control unit 120, a storage control unit 122, a communication control unit 124, a finder LCD display control unit 126, a liquid crystal shutter drive control unit 128, and the like.
  • the focus control unit 112 performs so-called contrast autofocus control. That is, the focus control unit 112 moves the focus lens group 4f shown in FIG. 4 from the near end to the infinite end via the focus lens drive mechanism 82, and detects the position where the contrast is maximum.
  • the focus control unit 112 shown in FIG. 5 moves the focus lens group 4f shown in FIG. 4 to a position where the contrast is maximum via the focus lens drive mechanism 82.
  • the exposure setting unit 114 illustrated in FIG. 5 sets the shutter speed and the aperture value that are appropriate for exposure based on the detection result of the brightness of the subject.
  • the exposure setting unit 114 may set the exposure period of the image sensor 70 shown in FIG. 4 instead of the shutter speed.
  • The aperture control unit 118 shown in FIG. 5 controls the aperture 4i shown in FIG. 4 via the aperture drive mechanism 84 so as to obtain the aperture value set by the exposure setting unit 114.
  • the rear monitor display control unit 120 shown in FIG. 5 controls the display of the rear monitor 8 via the LCD driver 8dr. For example, when the image data obtained by imaging is displayed on the back monitor 8, the image data is converted into a data format that can be displayed on the back monitor 8, and is output to the back monitor 8.
  • the storage control unit 122 controls reading and writing of data with respect to the storage unit 78.
  • Image data obtained by imaging is stored in the storage unit 78 via the storage control unit 122.
  • the image data stored in the storage unit 78 is reproduced, the image data is read from the storage unit 78 via the storage control unit 122.
  • the communication control unit 124 controls communication with an external device via the communication unit 80.
  • the finder LCD display control unit 126 controls the display of the finder LCD 62 via the finder LCD driver 62dr. As described above, the image displayed on the finder LCD 62 is superimposed on the optical finder image. Therefore, the finder LCD display control unit 126 also functions as a superimposed display control unit.
  • the finder LCD display control unit 126 shown in FIG. 5 is omitted.
  • the liquid crystal shutter drive control unit 128 controls the driving of the liquid crystal shutter 66 via the liquid crystal shutter driver 66dr to open and close the liquid crystal shutter 66.
  • the liquid crystal shutter drive control unit 128 shown in FIG. 5 is omitted.
  • the digital camera 1 described with reference to FIGS. 1 to 5 has an imaging support function.
  • the digital camera 1 When the digital camera 1 is switched to the imaging support mode, the digital camera 1 performs an imaging support function.
  • the imaging support function of the digital camera 1 will be described in detail below.
  • FIG. 6 is an explanatory view showing an outline of the imaging support function.
  • FIG. 6 illustrates a photographer 200 who is about to capture a landscape at a designated point using the digital camera 1.
  • When the photographer 200 causes the digital camera 1 to perform the imaging support function, it is possible to acquire reference photos of the landscape captured at the designated point and at points around the designated point.
  • the surrounding point of the designated point may be arbitrarily set as long as it is a point at which a reference photo serving as a reference for composition in imaging of the designated point is taken.
  • the point near the designated point may be set by the photographer, or may be automatically set based on the information of the designated point.
  • FIG. 7 is an explanatory view of a reference photograph showing an example of the reference photograph.
  • The first reference photo 210, the second reference photo 212, the third reference photo 214, and the fourth reference photo 216 shown in FIG. 7 are a plurality of reference photographs displayed on the back monitor 8 of the digital camera 1 shown in FIG. 6.
  • the photographer 200 shown in FIG. 6 can select a reference picture to be taken from a plurality of reference pictures displayed on the back monitor 8 of the digital camera 1.
  • the digital camera 1 uses the image data corresponding to the reference picture and incidental information of the image data corresponding to the reference picture to obtain the imaging condition of the digital camera 1.
  • FIG. 8 is a block diagram of a camera microcomputer for realizing the imaging support function.
  • The camera microcomputer 100 shown in FIG. 8 includes a position information acquisition unit 220, a reference photo acquisition unit 222, a sort criterion setting unit 224, a sorting unit 226, a selection unit 228, a composition support unit 230, an imaging condition setting unit 232, and an imaging history storage unit 234.
  • the position information acquisition unit 220 acquires information on the current position of the digital camera 1 from the position information transmission device 236 via the communication unit 80.
  • An example of the position information transmitting device 236 is a GPS satellite. GPS is an abbreviation of global positioning system.
  • Communication between the communication unit 80 and the position information transmitting device 236 may be performed via a public network (not shown).
  • the reference picture acquisition unit 222 acquires a plurality of image data corresponding to a plurality of reference pictures at the current position of the digital camera 1 from the image server 238 via the communication unit 80.
  • the plurality of image data corresponding to the plurality of reference photographs acquired using the reference photograph acquisition unit 222 is transmitted to the image processing unit 227.
  • the reference photograph acquisition unit 222 is an example of an image data acquisition unit.
  • a storage unit (not shown) of the image server 238 is an example of a storage unit that stores a plurality of image data obtained by performing imaging at a designated point and a point around the designated point.
  • The sort criterion setting unit 224 sets sort criteria. The sort criteria are input using the operation unit 6. The set sort criteria are transmitted to the image processing unit 227. The sort criteria correspond to the sort conditions.
  • the sort criterion setting unit 224 is an example of a sort condition setting unit.
  • the sort criteria setting unit 224 may set two or more sort criteria. When setting two or more types of sort criteria, the sort criteria setting unit 224 may set the priority of the sort criteria.
  • the sorting unit 226 sorts the plurality of reference photographs acquired by using the reference photograph acquiring unit 222 by using the sorting criterion set by using the sorting criterion setting unit 224. A plurality of image data corresponding to the plurality of reference photographs sorted using the sorting unit 226 is transmitted to the image processing unit 227.
  • the selection unit 228 selects a photograph to be captured from the plurality of sorted reference photographs.
  • the selection unit 228 can use the input information input using the operation unit 6 to select a photograph to be imaged from a plurality of reference photographs.
  • the composition support unit 230 causes the back monitor 8 to display composition support information that indicates the imaging position and imaging direction in which the same imaging as the reference photo selected using the selection unit 228 is possible.
  • the composition support information includes imaging position instruction information for instructing an imaging position, and imaging direction instruction information for instructing an imaging direction.
  • the imaging condition setting unit 232 sets imaging conditions of the image data corresponding to the reference photo selected by using the selection unit 228.
  • the imaging condition set using the imaging condition setting unit 232 may be displayed on the back monitor 8.
  • the image processing unit 227 acquires image data corresponding to a plurality of reference photographs acquired using the reference photograph acquisition unit 222.
  • the image processing unit 227 acquires information indicating the sorting result transmitted from the sorting unit 226.
  • the image processing unit 227 acquires selection information transmitted from the selection unit 228.
  • the image processing unit 227 acquires composition support information transmitted from the composition support unit 230.
  • the image processing unit 227 performs signal processing on image data and various information.
  • the image data subjected to the signal processing and various information are transmitted to the LCD driver 8 dr via the rear monitor display control unit 120.
  • the image processing unit is an example of a component of the display control unit.
  • the rear monitor 8 displays the image data and various information transmitted from the image processing unit 227 via the rear monitor display control unit 120 and the LCD driver 8dr.
  • the rear monitor display control unit 120 is an example of a component of the display control unit.
  • the LCD driver 8dr is an example of a component of the display control unit.
  • the camera microcomputer 100 may be configured by distributing the components shown in FIG. 5 and the components shown in FIG. 8 to a plurality of processors.
  • the imaging history storage unit 234 stores the imaging history of the digital camera 1.
  • the imaging history of the digital camera 1 is stored in association with the image data.
  • The imaging history is information reflecting the photographer's preferences and tendencies, such as settings used in past imaging. Examples of the imaging history include various adjustments such as angle, color, and white balance.
  • the imaging history of the digital camera 1 stored in the imaging history storage unit 234 is applied to the sort criterion of the reference photo. That is, the sort reference setting unit 224 can set the imaging history of the digital camera 1 read out from the imaging history storage unit 234 as the sort reference.
  • the sort reference setting unit 224 that reads out the imaging history from the imaging history storage unit 234 is an example of the imaging history acquisition unit.
  • FIG. 9 is a flowchart showing the flow of the procedure of the imaging support function.
  • the flowchart of the imaging support function shown in FIG. 9 is synonymous with the procedure of the imaging method using the imaging support function.
  • the imaging support function is started.
  • In the designated point information acquisition step S10, the position information acquisition unit 220 shown in FIG. 8 is used to acquire the position information of the designated point. The position information of the surroundings of the designated point may also be acquired.
  • the designated point may be the current position.
  • the designated point may be designated by the user using a map or the like. In the present embodiment, an aspect in which the designated point is the current position is exemplified.
  • the position information of the designated point acquired in the designated point information acquiring step S10 shown in FIG. 9 is transmitted to the image processing unit 227 shown in FIG.
  • the position information of the designated point may be stored using a storage unit (not shown).
  • In the image data acquisition step S12, the reference photos captured at the designated point and around the designated point are acquired using the reference photo acquisition unit 222 shown in FIG. 8.
  • The reference photo acquisition unit 222 shown in FIG. 8 acquires a plurality of reference photos.
  • The image data acquisition step S12 is an example of a step that realizes the image data acquisition function.
  • the process proceeds to the sort criterion setting determination step S14.
  • the sort criterion setting unit 224 shown in FIG. 8 determines whether or not the sort criterion is to be set.
  • the setting of the sort criteria includes an aspect of changing the preset sort criteria.
  • In the sort criterion setting determination step S14, when the sort criteria have not been set in advance, or when the previously set sort criteria are to be changed, the determination is YES. If the determination is YES, the process proceeds to the sort criterion setting step S16. In the sort criterion setting step S16, the sort criterion setting unit 224 shown in FIG. 8 sets the sort criteria.
  • The sort criteria can be set by applying the information input using the operation unit 6.
  • Information representing the sort criteria set using the sort criteria setting unit 224 is transmitted to the sort unit 226.
  • In the sort criterion setting step S16 shown in FIG. 9, after the sort criterion setting unit 224 shown in FIG. 8 sets the sort criteria, the process proceeds to the sorting step S18.
  • the sort criterion setting step S16 is an example of a sort condition setting step.
  • the step of setting the initial condition of the sort condition (not shown) is an example of the sort condition setting step.
  • NO is determined when the preset sort criterion is not changed. If the determination is NO, the process proceeds to the sorting step S18.
  • the sorting unit 226 shown in FIG. 8 performs sorting of a plurality of reference photographs acquired using the reference photograph acquiring unit 222, using the sorting criteria set in advance.
  • Examples of sorting conditions in the sorting step S18 include the order of popularity, the order of the degree of image matching, and the order of the degree of matching of the imaging device.
  • the sorting step S18 may be configured to be selectable in ascending order or descending order.
  • the process proceeds to the display step S20 shown in FIG.
  • the image processing unit 227 illustrated in FIG. 8 displays the top N reference photographs on the back monitor 8.
  • N is an integer of 1 or more.
  • the process proceeds to the selection step S22 in FIG.
  • a reference photograph to be captured is selected from the N reference photographs on the rear monitor 8.
  • The composition support unit 230 shown in FIG. 8 acquires composition information including information on the imaging position and information on the imaging direction at which the reference photo selected in the selection step S22 shown in FIG. 9 was captured.
  • The composition support unit 230 shown in FIG. 8 displays the composition information of the reference photo on the back monitor 8 via the image processing unit 227, the back monitor display control unit 120, and the LCD driver 8dr.
  • the imaging condition setting unit 232 shown in FIG. 8 acquires the imaging condition when the reference photograph selected in the selection step S22 of FIG. 9 is imaged.
  • The imaging conditions include an exposure setting, a focus setting, and the like. The imaging conditions used when the reference photo was captured can be read out from the incidental information attached to the image data of the reference photo.
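  • As an illustration, the incidental (Exif) information of a reference photo could be read as in the following sketch, which uses Pillow and assumes a recent version of the library; which tags are actually present depends on the camera that produced the file.

```python
from PIL import Image
from PIL.ExifTags import TAGS

def read_imaging_conditions(path):
    """Read a few imaging conditions from a reference photo's Exif data."""
    exif = Image.open(path).getexif()
    # Exposure-related tags live in the Exif sub-IFD (tag 0x8769).
    exif_ifd = exif.get_ifd(0x8769)
    named = {TAGS.get(tag, tag): value for tag, value in exif_ifd.items()}
    return {
        "exposure_time": named.get("ExposureTime"),
        "f_number": named.get("FNumber"),
        "iso": named.get("ISOSpeedRatings"),
        "focal_length": named.get("FocalLength"),
    }

# Hypothetical usage with a reference photo selected in step S22:
# print(read_imaging_conditions("reference_photo.jpg"))
```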
  • The imaging condition setting unit 232 shown in FIG. 8 sets the imaging conditions of the digital camera 1 shown in FIG. 6.
  • the reference photograph information setting step S24 is an example of the imaging condition setting step.
  • the reference photo information setting step S24 is an example of the composition support step.
  • the reference photo information setting step S24 is an example of a step for realizing the imaging condition setting function and the composition assistance function.
  • the process proceeds to the exposure determination step S26 shown in FIG.
  • the camera microcomputer 100 shown in FIG. 8 determines whether the exposure setting of the digital camera 1 shown in FIG. 6 is an appropriate exposure setting that matches the reference photo.
  • In the exposure determination step S26 shown in FIG. 9, when the camera microcomputer 100 shown in FIG. 8 determines that the exposure setting matches the reference photo, the determination is YES. In the case of a YES determination, the process proceeds to the imaging step S28 shown in FIG. 9.
  • In the exposure determination step S26, if the camera microcomputer 100 shown in FIG. 8 determines that the exposure setting does not match the reference photo and exposure adjustment is necessary, the determination is NO. If the determination is NO, the process proceeds to the exposure adjustment step S30 shown in FIG. 9.
  • In the exposure adjustment step S30, exposure adjustment is performed automatically or manually using the exposure setting unit 114 shown in FIG. 5.
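  • The following sketch illustrates, under assumed settings, how a match between the current exposure setting and the reference photo's exposure could be judged by comparing exposure values; the 0.3 EV tolerance is an arbitrary assumption.

```python
import math

def exposure_value(f_number, exposure_time_s, iso):
    """EV100-style exposure value from aperture, shutter speed and ISO."""
    return math.log2((f_number ** 2) / exposure_time_s) - math.log2(iso / 100)

def needs_exposure_adjustment(current, reference, tolerance_ev=0.3):
    """Decide whether the current settings deviate from the reference photo.

    `current` and `reference` are (f_number, exposure_time_s, iso) tuples.
    """
    return abs(exposure_value(*current) - exposure_value(*reference)) > tolerance_ev

reference = (8.0, 1/250, 200)   # from the reference photo's incidental data
current = (8.0, 1/125, 200)     # the camera's present settings
print(needs_exposure_adjustment(current, reference))  # True: one stop apart
```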
  • When, in the exposure adjustment step S30 shown in FIG. 9, the exposure has been adjusted to an appropriate exposure that matches the reference photo, the process proceeds to the imaging step S28.
  • In the imaging step S28, imaging using the digital camera 1 shown in FIG. 6 is performed.
  • Thereafter, the camera microcomputer 100 shown in FIG. 8 ends the imaging support method.
  • the aspect of applying the focus information of the reference photo as the focus information of imaging of the designated point is exemplified.
  • the focus in the imaging of the designated point may be confirmed, and the focus may be readjusted if the focus is not appropriate.
  • The screens displayed on the back monitor 8 of the digital camera 1 when the imaging support method is performed will be described using FIGS. 10 to 20.
  • Each screen shown in FIGS. 10 to 20 is generated using the image processing unit 227 and the rear monitor display control unit 120 shown in FIG. 8. The data of each screen is converted into a signal that can be displayed on the rear monitor 8 using the LCD driver 8dr.
  • FIG. 10 is a diagram showing an example of the designated point setting screen.
  • The designated point automatic setting screen 300 shown in FIG. 10 is used when the current position of the digital camera 1 shown in FIG. 6 is set as the designated point. Whether or not to automatically set the current position of the digital camera 1 as the designated point can be selected using the operation unit 6 shown in FIG. 4. When the OK button 302 of the designated point automatic setting screen 300 is operated, the selection is confirmed.
  • The position information acquisition unit 220 shown in FIG. 8 acquires information on the current position of the digital camera 1 from the position information transmitting device 236 via the communication unit 80. The acquired information on the current position of the digital camera 1 is displayed on the rear monitor 8.
  • FIG. 11 is a view showing an example of the result screen of the designated point automatic setting.
  • the result screen 310 of the designated point automatic setting shown in FIG. 11 displays the address of the designated point.
  • When the OK button 302 on the result screen 310 is operated, the designated point is decided.
  • When the return button 304 on the result screen 310 is operated, the display is switched to the designated point automatic setting screen 300 shown in FIG. 10.
  • FIG. 12 is a view showing another example of the designated point setting screen.
  • On the designated point manual setting screen 320 shown in FIG. 12, a position on the map can be manually selected.
  • The selected position on the map is set as the designated point.
  • The designated point manual setting screen 320 has a hierarchical structure in which the area including the selected position is switched to a more detailed screen, so that a more detailed position can be manually selected.
  • For example, a screen for selecting a region is the top hierarchy, a screen for selecting a prefecture is the second hierarchy, a screen for selecting a city is the third hierarchy, and the screen for selecting the most detailed position is the lowest hierarchy.
  • FIG. 12 shows the screen of the top hierarchy of the above example.
  • FIG. 13 is a diagram showing an example of the sort reference setting screen.
  • On the sort criterion setting selection screen 330 shown in FIG. 13, whether or not to set the sort criteria can be selected when no sort criteria have been set in advance.
  • On the sort criterion setting selection screen 330, whether or not to change the sort criteria can be selected when sort criteria have been set in advance.
  • FIG. 14 is a diagram showing an example of the sort criterion selection screen.
  • The sort criteria selection screen 332 shown in FIG. 14 is displayed when setting of the sort criteria is selected on the sort criterion setting selection screen 330 shown in FIG. 13. The same applies to the sort criteria selection screen 340 shown in FIG. 15.
  • FIG. 14 shows the sort criteria selection screen 332 using the pull-down menu 334.
  • the sort criteria displayed in pull-down menu 334 can be selected using operation unit 6 shown in FIG.
  • the OK button 302 of the sort criteria selection screen 332 shown in FIG. 14 is operated, sort criteria are determined.
  • FIG. 15 is a view showing another example of the sort criterion selection screen.
  • On the sort criterion selection screen 340 shown in FIG. 15, the similarity of images and the degree of coincidence of the camera can be selected as sort criteria.
  • Here, a camera is synonymous with an imaging device.
  • The sort criteria may include the order of popularity, the order of the degree of coincidence of the imaging device, and the order of the degree of similarity of the images.
  • The degree of coincidence of the camera is an example of a condition of the imaging device.
  • Examples of the order of popularity include the order of the number of times of imaging, the order of the number of times the social button is pressed, and the order of the number of times the image is read from the image server 238 shown in FIG. 8.
  • The number of likes shown in FIG. 14 corresponds to the number of times the social button is pressed.
  • Examples of the order of the degree of coincidence of the imaging device include the order of small difference in angle of view and the order of close lens specifications.
  • Examples of the order of the degree of similarity of the images include the order of short distance between the photographing position of the reference photo and the designated point, and the order of small difference in the similarity evaluation values of the images.
  • When the OK button 302 of the sort criterion selection screen 340 shown in FIG. 15 is operated, the sort criteria are determined. A sketch of sorting under these criteria follows.
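  • As a rough, non-authoritative illustration of how such sorting might be implemented, the following Python sketch orders hypothetical reference photo records under the three example criteria above; all field names, values, and tie-breaking rules are assumptions rather than details taken from the embodiment.

```python
# Hypothetical metadata for reference photos; the field names are illustrative
# assumptions, not the format actually used by the image server 238.
reference_photos = [
    {"id": 1, "like_count": 120, "angle_of_view_deg": 46.8,
     "distance_to_point_m": 35.0, "similarity": 0.82},
    {"id": 2, "like_count": 310, "angle_of_view_deg": 63.4,
     "distance_to_point_m": 12.0, "similarity": 0.91},
    {"id": 3, "like_count": 45, "angle_of_view_deg": 46.8,
     "distance_to_point_m": 80.0, "similarity": 0.67},
]

CAMERA_ANGLE_OF_VIEW_DEG = 46.8  # angle of view of the lens currently in use

def sort_reference_photos(photos, criterion):
    """Return the photos ordered so that index 0 best matches the chosen criterion."""
    if criterion == "popularity":
        # Order of the number of times the social button was pressed (like count).
        return sorted(photos, key=lambda p: p["like_count"], reverse=True)
    if criterion == "camera_match":
        # Order of small difference in angle of view from the camera in use.
        return sorted(photos,
                      key=lambda p: abs(p["angle_of_view_deg"] - CAMERA_ANGLE_OF_VIEW_DEG))
    if criterion == "image_similarity":
        # Order of short distance to the designated point, then of high similarity value.
        return sorted(photos, key=lambda p: (p["distance_to_point_m"], -p["similarity"]))
    raise ValueError(f"unknown sort criterion: {criterion}")

top_four = sort_reference_photos(reference_photos, "popularity")[:4]
```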
  • FIG. 16 is a view showing an example of the reference photograph display screen.
  • On the reference photo display screen 350 shown in FIG. 16, images corresponding to the reference photos ranked highest in the sort order, namely the first reference photo 362, the second reference photo 364, the third reference photo 366, and the fourth reference photo 368, are displayed side by side in that order.
  • FIG. 16 illustrates an example of displaying four reference images on the rear monitor 8 in accordance with the display capability of the rear monitor 8.
  • An example of the first reference photo 362 shown in FIG. 16 is the first reference photo 210 shown in FIG. 7.
  • An example of the second reference photo 364 shown in FIG. 16 is the second reference photo 212 shown in FIG. 7.
  • An example of the third reference photo 366 shown in FIG. 16 is the third reference photo 214 shown in FIG. 7.
  • An example of the fourth reference photo 368 shown in FIG. 16 is the fourth reference photo 216 shown in FIG. 7.
  • Some of the displayed images may be reduced images such as thumbnail images.
  • For example, images ranked second or lower in the sort order may be displayed as thumbnail images.
  • The number of reference photos displayed may be set according to the size of the display device that displays the reference photo display screen 350. For example, when reference photos are displayed on the display of a portable terminal with an imaging function, nine images can be displayed.
  • The reference photo display screen 350 may also be switched to a screen indicating that no reference photo of sorting rank five or lower exists. A sketch of adapting the displayed count and generating thumbnails follows.
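  • The idea of adapting the number of displayed reference photos to the display device and reducing some of them to thumbnails could look roughly like the following sketch; the resolution threshold, grid sizes, and thumbnail dimensions are assumptions, and the Pillow library is used only for convenience.

```python
from PIL import Image  # Pillow; used here only to produce reduced (thumbnail) images

def photos_per_screen(display_width_px, display_height_px):
    """Pick how many reference photos to show from the display resolution (assumed thresholds)."""
    if display_width_px * display_height_px >= 1920 * 1080:
        return 9   # e.g. a portable terminal with a large, high-resolution display
    return 4       # e.g. the rear monitor 8 of the digital camera 1

def make_thumbnail(path, max_size=(160, 120)):
    """Return a reduced image for a lower-ranked reference photo."""
    image = Image.open(path)
    image.thumbnail(max_size)  # resizes in place, preserving the aspect ratio
    return image
```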
  • FIG. 17 is a diagram showing an example of the reference photo selection screen.
  • On the reference photo selection screen shown in FIG. 17, the second reference photo 364 is selected, and the first reference photo 362, the third reference photo 366, and the fourth reference photo 368 are not selected.
  • When the OK button 302 is operated, the selection of the second reference photo 364 is finalized.
  • Augmented reality display may be applied to superimpose the reference photo on the through image.
  • The image processing unit 227 illustrated in FIG. 8 generates image data for augmented reality display from the image data corresponding to the reference photo selected as the imaging reference.
  • Known image processing can be applied to the generation of the image data for augmented reality display.
  • A detailed description of the image processing in the generation of the image data for augmented reality display is therefore omitted.
  • Augmented reality may be abbreviated as AR.
  • The image data for augmented reality display generated using the image processing unit 227 is displayed on the rear monitor 8 via the rear monitor display control unit 120 and the LCD driver 8dr.
  • On the rear monitor 8, the reference photo displayed in augmented reality and the through image are displayed superimposed on each other.
  • The superimposed display referred to here includes an aspect in which the reference photo is displayed in augmented reality outside the frame of the screen displaying the through image. The same applies to the superimposed display on the composition information display screen and the superimposed display of the imaging conditions described later. A blending sketch of such superimposed display follows.
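  • One plausible way to realize this kind of superimposed display is simple alpha blending of the selected reference photo over the live-view frame. The following sketch uses OpenCV and is only an assumption about how such blending could be done; it is not the actual processing performed by the image processing unit 227.

```python
import cv2  # OpenCV

def overlay_reference(through_frame, reference_photo, alpha=0.35):
    """Blend the reference photo semi-transparently over the through image (both BGR arrays)."""
    h, w = through_frame.shape[:2]
    # Match the reference photo to the live-view frame size before blending.
    reference_resized = cv2.resize(reference_photo, (w, h))
    # Weighted sum: the through image stays dominant, the reference photo acts as a faint guide.
    return cv2.addWeighted(through_frame, 1.0 - alpha, reference_resized, alpha, 0)
```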
  • FIG. 18 is a view showing an example of the composition information display screen.
  • On the composition information display screen shown in FIG. 18, an imaging position 374 and an imaging direction 376 are displayed on a two-dimensional map 372.
  • A three-dimensional map may be applied instead of the two-dimensional map 372.
  • As the two-dimensional map 372, a two-dimensional map acquired by aerial photography using an aerial imaging device can be applied.
  • It is preferable that information on the imaging position and information on the imaging direction be added to the image data corresponding to the reference photo as incidental information.
  • Augmented reality display can be applied to the composition information. That is, the image processing unit 227 shown in FIG. 8 is used to generate augmented reality image data representing the composition information.
  • The augmented reality image data of the composition information generated using the image processing unit 227 can be displayed on the rear monitor 8 via the rear monitor display control unit 120 and the LCD driver 8dr.
  • The augmented reality display of the composition information may be superimposed on the through image.
  • Alternatively, a two-screen display in which the through image and the composition information are displayed on separate screens may be applied.
  • In the two-screen display, it is preferable that the screen size of the through image and the screen size of the composition information be adjustable. A sketch of deriving distance and bearing for such composition guidance follows.
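  • Presenting the imaging position 374 and the imaging direction 376 relative to the user's current position requires a distance and a bearing between two latitude/longitude pairs. The following is a generic haversine and initial-bearing sketch, not a formula taken from the embodiment; the example coordinates are invented.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres and initial bearing in degrees from point 1 to point 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    distance = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return distance, bearing

# Example: from the current position to a reference photo's imaging position.
d_m, brg_deg = distance_and_bearing(35.6586, 139.7454, 35.6628, 139.7315)
```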
  • FIG. 19 is a diagram showing an example of imaging condition display.
  • FIG. 19 illustrates an example in which the imaging conditions are superimposed and displayed outside the frame f of the through image display screen 382 that displays the through image 380.
  • The through image display screen 382 shown in FIG. 19 displays imaging conditions such as the imaging mode, exposure mode, shutter speed, aperture value, exposure correction value, and set sensitivity.
  • The through image display screen 382 shown in FIG. 19 also displays information such as the number of images that can be shot and the remaining battery level.
  • The imaging conditions shown in FIG. 19 may instead be displayed inside the frame f. When the imaging conditions are displayed inside the frame f, it is preferable that the background of the imaging conditions be transparent or translucent to maintain the visibility of the through image.
  • Augmented reality display can also be applied to the imaging conditions. That is, the image processing unit 227 shown in FIG. 8 is used to generate augmented reality image data representing the imaging conditions.
  • The augmented reality image data of the imaging conditions generated using the image processing unit 227 can be displayed on the rear monitor 8 via the rear monitor display control unit 120 and the LCD driver 8dr.
  • The augmented reality display of the imaging conditions may be superimposed on the through image.
  • FIG. 20 shows an example of the exposure adjustment selection screen.
  • On the exposure adjustment selection screen 390 shown in FIG. 20, whether or not to perform exposure adjustment can be selected.
  • When the OK button 302 is operated, the selection is confirmed.
  • In the aspect configured to enable readjustment of the focus, a focus adjustment selection screen may likewise be displayed.
  • [Operation effect 1] A plurality of reference photos to be used as composition references in imaging at the designated point are acquired. The plurality of reference photos are sorted using the sort criteria. The reference photos at the top of the sort order are arranged in that order and displayed on the rear monitor of the digital camera. As a result, the user of the digital camera can easily select a reference photo to serve as the imaging reference from among the plurality of reference photos displayed on the rear monitor.
  • the imaging conditions of the selected reference photo are set in the digital camera.
  • The composition of the selected reference photo is also indicated. It thus becomes possible to capture an image to which the imaging conditions and composition of the selected reference photo are applied.
  • When the order of popularity is set as the sort criterion, it is possible to capture an image using the imaging conditions and composition of a popular reference photo.
  • When the degree of coincidence of the imaging device is set as the sort criterion, imaging suited to the performance of the digital camera used for imaging becomes possible.
  • When the imaging history of the digital camera used for imaging is set as the sort criterion, imaging according to that imaging history becomes possible.
  • Since the preference of the user of the digital camera is reflected in the imaging history, imaging according to the preference of the user of the digital camera is possible.
  • The rear monitor displays thumbnail images of at least some of the reference photos. This allows the rear monitor to display more reference photos.
  • The rear monitor superimposes the imaging conditions and the composition support information on the through image.
  • The user of the digital camera can thus view the through image, the imaging conditions, and the composition support information on one screen.
  • The rear monitor superimposes the imaging conditions and the composition support information on the through image using augmented reality display. This makes it possible to display the through image, the imaging conditions, and the composition support information in a well-balanced manner.
  • FIG. 21 is a block diagram of an imaging system showing an application example of a network system.
  • the imaging system 400 shown in FIG. 21 includes the digital camera 1 and a server apparatus 402.
  • the digital camera 1 and the server device 402 are configured to be able to communicate via the network 404.
  • the server apparatus 402 illustrated in FIG. 21 has the same configuration and the same function as the image server 238 illustrated in FIG. 8.
  • the network 404 shown in FIG. 21 is not limited in scale, communication system, etc., and any network can be applied.
  • the server device 402 includes an image database 410, a processing unit 412, and a server communication unit 414.
  • the image database 410 stores image data corresponding to a plurality of reference photographs.
  • the processing unit 412 controls reading of image data from the image database 410 and writing of image data to the image database 410. Also, the processing unit 412 extracts image data that matches the extraction condition from the plurality of image data stored in the image database 410.
  • As the extraction condition, a designated point set in the digital camera 1 is applied.
  • The processing unit 412 extracts image data corresponding to at least one of a reference photo matching the designated point set in the digital camera 1 and a reference photo matching the periphery of the designated point.
  • The server communication unit 414 executes communication with devices connected to the network 404. For example, when the extraction condition of the image data is transmitted from the digital camera 1 to the server device 402, the server device 402 acquires the extraction condition of the image data via the server communication unit 414.
  • When transmitting image data from the server device 402 to the digital camera 1, the server device 402 transmits the image data corresponding to the reference photos extracted from the image database 410 to the digital camera 1 via the server communication unit 414. A sketch of such proximity-based extraction follows.
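  • A minimal sketch of the kind of proximity filter the processing unit 412 might apply when extracting reference photos for the designated point and its periphery is shown below; the record layout, the 500 m radius, and the coordinates are assumptions, and a simple equirectangular distance approximation stands in for whatever matching the embodiment actually performs.

```python
import math

def extract_for_designated_point(records, designated_point, radius_m=500.0):
    """Return records whose imaging position lies within radius_m of the designated point.

    Each record is assumed to carry an 'imaging_position' (latitude, longitude) pair.
    An equirectangular approximation is adequate at the scale of a designated point
    and its periphery (a few hundred metres).
    """
    lat0, lon0 = designated_point
    matched = []
    for record in records:
        lat, lon = record["imaging_position"]
        dx = math.radians(lon - lon0) * math.cos(math.radians((lat + lat0) / 2))
        dy = math.radians(lat - lat0)
        distance_m = 6_371_000.0 * math.hypot(dx, dy)
        if distance_m <= radius_m:
            matched.append(record)
    return matched

# Example with invented coordinates: only the first record lies near the designated point.
records = [
    {"id": "A", "imaging_position": (35.6812, 139.7671)},
    {"id": "B", "imaging_position": (35.6586, 139.7454)},  # several kilometres away
]
nearby = extract_for_designated_point(records, (35.6812, 139.7671))
```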
  • the imaging system 3 of the digital camera 1 shown in FIG. 21 includes the imaging lens 4, the image sensor 70 and the analog signal processing unit 74 shown in FIG. 4.
  • A plurality of terminal devices are connected to the network 404.
  • Examples of the terminal devices include a computer and a portable terminal device.
  • FIG. 22 is a block diagram of a modification of the imaging system shown in FIG.
  • the imaging system 400A illustrated in FIG. 22 includes the sort reference setting unit 420 and the sort unit 422 in the server device 402A.
  • the imaging system 400A further includes a sort reference transmission unit 250 in the camera microcomputer 100A of the digital camera 1A.
  • the sort reference transmitting unit 250 transmits the sort condition to the sort reference setting unit 420 of the server apparatus 402A via the communication unit 80 and the server communication unit 414.
  • the sorting criterion setting unit 420 of the server device 402A sets sorting conditions transmitted from the sorting criterion transmission unit 250 of the digital camera 1A.
  • the sorting unit 422 sorts the plurality of reference photographs based on the sorting condition set by using the sorting criterion setting unit 420.
  • the sorting unit 422 transmits image data corresponding to the plurality of sorted reference photos to the camera microcomputer 100A of the digital camera 1A via the processing unit 412, the server communication unit 414, the network 404, and the communication unit 80.
  • The camera microcomputer 100A arranges the plurality of reference photos transmitted from the server device 402A in the sort order and displays them on the rear monitor 8. A sketch of this division of roles follows.
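  • The division of roles in FIG. 22, in which the camera transmits only a sorting criterion and the server returns image data already sorted, can be pictured roughly as follows; plain function calls stand in for the communication unit 80, the network 404, and the server communication unit 414, and the record fields are invented.

```python
# Hypothetical records, assumed to have already been extracted for the designated point.
IMAGE_DATABASE = [
    {"image_data": b"jpeg-bytes-1", "like_count": 310, "angle_of_view_diff": 16.6},
    {"image_data": b"jpeg-bytes-2", "like_count": 120, "angle_of_view_diff": 0.0},
]

def server_sort_and_return(records, sort_key, top_n=4):
    """Server side (sorting unit 422): sort the records and return the top_n image data items."""
    ranked = sorted(records, key=sort_key)
    return [record["image_data"] for record in ranked[:top_n]]

def camera_request_sorted_photos(sort_criterion):
    """Camera side (sort reference transmitting unit 250): send a criterion, receive sorted photos.

    In the actual system this exchange would pass through the communication unit 80,
    the network 404, and the server communication unit 414; here it is a direct call.
    """
    key_functions = {
        "popularity": lambda r: -r["like_count"],           # most liked first
        "camera_match": lambda r: r["angle_of_view_diff"],  # closest angle of view first
    }
    return server_sort_and_return(IMAGE_DATABASE, key_functions[sort_criterion])

sorted_photos = camera_request_sorted_photos("popularity")  # [b"jpeg-bytes-1", b"jpeg-bytes-2"]
```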
  • An imaging system to which a network system is applied can obtain the following effects.
  • the digital camera acquires a plurality of reference photographs from the image database of the server device.
  • the digital camera can use the reference photo stored in the image database of the server device, and does not need to have a large-capacity storage unit.
  • When the digital camera includes the sorting condition setting unit and the sorting unit, communication between the server device and the digital camera is unnecessary for setting the sorting conditions and obtaining the sorting results. Communication between the server device and the digital camera is then limited to the acquisition of reference photos by the digital camera. Furthermore, no processing by the server device is required for setting the sorting conditions or for sorting.
  • In the embodiments described above, the imaging support function is realized by a computer, but the hardware configuration for realizing these functions is not limited to this and can be realized by various processors.
  • The various processors include a CPU, which is a general-purpose processor that executes software to function as various processing units; a programmable logic device (PLD), such as an FPGA, whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, such as an ASIC, which is a processor having a circuit configuration designed exclusively to execute specific processing. Software is synonymous with program.
  • FPGA is an abbreviation of Field Programmable Gate Array.
  • PLD is an abbreviation of Programmable Logic Device.
  • ASIC is an abbreviation of Application Specific Integrated Circuit.
  • One processing unit may be configured by one of these various processors, or may be configured by two or more processors of the same type or different types. For example, it may be configured by a plurality of FPGAs, or may be configured by a combination of a CPU and an FPGA.
  • a plurality of processing units may be configured by one processor.
  • As an example of configuring a plurality of processing units with one processor, first, as represented by computers such as clients and servers, there is a form in which one processor is configured by a combination of one or more CPUs and software, and this processor functions as a plurality of processing units.
  • Second, as represented by a system on chip (SoC), there is a form in which a processor that realizes the functions of an entire system including a plurality of processing units with a single IC chip is used.
  • In this way, the various processing units are configured using one or more of the above various processors as a hardware structure.
  • SoC is an abbreviation of System On Chip.
  • IC is an abbreviation of Integrated Circuit.
  • More specifically, the hardware structure of these various processors is an electric circuit in which circuit elements such as semiconductor elements are combined.
  • The functions of the digital camera and the imaging system described above can be realized by causing a computer to execute a program. That is, it is possible to configure an imaging program that causes a computer to realize a reference photo acquisition function of acquiring reference photos, a sorting condition setting function of setting sorting conditions, a sorting function of sorting a plurality of reference photos, a display function of displaying the sorted reference photos, a selection function of selecting a reference photo to serve as a composition reference from the displayed reference photos, an imaging condition setting function of setting imaging conditions, a composition instructing function of instructing a composition, and an imaging function.
  • The imaging program may cause the computer to realize functions other than the functions of the imaging apparatus and imaging system described above.
  • It is also possible to configure a non-transitory computer-readable recording medium, such as a hard disk, a CD (Compact Disc), a DVD (Digital Versatile Disc), or various semiconductor memories, that stores the imaging program. An interpretive outline of the program's flow follows.
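  • As an interpretive outline only, the chain of functions listed above can be strung together into a single flow as sketched below; the stub camera class and the record fields are placeholders, not the actual units of the digital camera 1.

```python
def imaging_support_pipeline(reference_photos, sort_key, choose_index, camera):
    """Interpretive outline of the flow (cf. steps S12 to S28 summarized in the abstract)."""
    ranked = sorted(reference_photos, key=sort_key)             # sorting function
    shown = ranked[:4]                                          # display function would show these
    chosen = shown[choose_index]                                # selection function (user's choice)
    camera.apply_conditions(chosen["imaging_conditions"])       # imaging condition setting function
    camera.instruct_composition(chosen["imaging_position"],
                                chosen["imaging_direction"])    # composition instructing function
    return camera.capture()                                     # imaging function

class StubCamera:
    """Placeholder for the digital camera 1; it only records what the pipeline asked of it."""
    def apply_conditions(self, conditions): self.conditions = conditions
    def instruct_composition(self, position, direction): self.guide = (position, direction)
    def capture(self): return {"conditions": self.conditions, "guide": self.guide}

photos = [{"imaging_conditions": {"f_number": 8, "shutter": "1/250"},
           "imaging_position": (35.6628, 139.7315), "imaging_direction": 120.0,
           "like_count": 310}]
result = imaging_support_pipeline(photos, lambda p: -p["like_count"], 0, StubCamera())
```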
  • The term imaging represents a concept that includes photographing.
  • the imaging may include both still image acquisition and moving image acquisition.
  • An imaging device represents a concept that includes a camera.
  • a digital camera is an example of an imaging device.
  • a photograph may include a form in which an image is recorded on photo paper, and a form in which an image is displayed on a display device.
  • the image data represents an electrical signal representative of the image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to an imaging system, an imaging device, an imaging method, and an imaging program that make it possible to select, from a storage unit in which a plurality of reference photographs are stored, a reference photograph serving as a composition model, and to perform imaging using the selected reference photograph. A plurality of images serving as composition references are acquired (S12); a sorting condition is set for the plurality of image data items (S14, S16); the plurality of image data items are sorted according to the sorting condition (S18); the sorted images are displayed in the order of their sort ranks (S20); an image is selected from the plurality of displayed images (S22); an imaging condition of the image data corresponding to the selected image is set, and an imaging position and an imaging direction that allow the same image as the selected image to be captured are indicated (S24); and imaging is performed (S28).
PCT/JP2018/035255 2017-09-26 2018-09-25 Système de capture d'image, dispositif de capture d'image, procédé de capture d'image et programme de capture d'image Ceased WO2019065551A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-185259 2017-09-26
JP2017185259 2017-09-26

Publications (1)

Publication Number Publication Date
WO2019065551A1 true WO2019065551A1 (fr) 2019-04-04

Family

ID=65902168

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/035255 Ceased WO2019065551A1 (fr) 2017-09-26 2018-09-25 Système de capture d'image, dispositif de capture d'image, procédé de capture d'image et programme de capture d'image

Country Status (1)

Country Link
WO (1) WO2019065551A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013021680A (ja) * 2011-06-14 2013-01-31 Canon Inc 画像に関する処理支援システム、情報処理装置、及び画像に関する処理影支援方法
JP2013021473A (ja) * 2011-07-11 2013-01-31 Sony Corp 情報処理装置、情報取得方法およびコンピュータプログラム
JP2016152593A (ja) * 2015-02-19 2016-08-22 キヤノン株式会社 サーバ装置、携帯装置、撮像支援方法、コンピュータプログラム

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113840080A (zh) * 2020-06-23 2021-12-24 佳能株式会社 摄像设备、摄像设备的控制方法和计算机可读介质
JP2022018601A (ja) * 2020-07-16 2022-01-27 キヤノン株式会社 シミュレーション装置、コンピュータプログラム及び記憶媒体
JP7721258B2 (ja) 2020-07-16 2025-08-12 キヤノン株式会社 シミュレーション装置、画像処理装置、コンピュータプログラム及び記憶媒体
CN114125211A (zh) * 2020-08-28 2022-03-01 佳能株式会社 摄像设备、摄像设备的控制方法和计算机可读存储介质
CN114125211B (zh) * 2020-08-28 2025-09-02 佳能株式会社 摄像设备、摄像设备的控制方法和计算机可读存储介质

Similar Documents

Publication Publication Date Title
US10009540B2 (en) Image processing device, image capturing device, and image processing method for setting a combination parameter for combining a plurality of image data
JP6512810B2 (ja) 撮像装置および制御方法とプログラム
US20080303936A1 (en) Camera system
JP2009036986A (ja) 撮影装置および撮影装置の制御方法
JP5203657B2 (ja) 拡大表示機能付きカメラ
JPWO2007066629A1 (ja) カメラシステム、カメラ本体、交換レンズユニット、および撮像方法
CN108471503A (zh) 摄像设备及其控制方法
JP2017192086A (ja) 画像生成装置、画像観察装置、撮像装置および画像処理プログラム
WO2019065551A1 (fr) Système de capture d'image, dispositif de capture d'image, procédé de capture d'image et programme de capture d'image
JP2009053296A (ja) 撮影装置および撮影装置の制御方法
JP2009069170A (ja) 撮影装置および撮影装置の制御方法
JP4509081B2 (ja) デジタルカメラ及びデジタルカメラのプログラム
JP2019169985A (ja) 画像処理装置
JP2009036985A (ja) 撮影装置および撮影装置の制御方法
CN102062989A (zh) 相机系统
US8917331B2 (en) Digital photographing apparatus and method of controlling the same
JP4935559B2 (ja) 撮像装置
JP2009219085A (ja) 撮像装置
JP2009036987A (ja) 撮影装置および撮影装置の制御方法
US8345140B2 (en) Image capturing apparatus and method of controlling same
JP2009048123A (ja) 撮影装置および撮影装置の制御方法
JP2009086036A (ja) 撮影装置および撮影装置の制御方法
JP2004221785A (ja) カメラ制御装置
JP2007259004A (ja) デジタルカメラ、画像処理装置及び画像処理プログラム
WO2019049532A1 (fr) Dispositif de capture d'image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18861293

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18861293

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP