WO2014203523A1 - Imaging position determination device and imaging position determination method - Google Patents
Imaging position determination device and imaging position determination method
- Publication number
- WO2014203523A1 (application PCT/JP2014/003255, JP2014003255W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- state
- person
- monitoring area
- state map
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/141—Control of illumination
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19613—Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
Definitions
- The present disclosure relates to an imaging position determination device and an imaging position determination method suitable for use in a surveillance camera system.
- Some surveillance camera systems have an image processing device that extracts a face image of a person to be verified from captured images.
- One example is the image processing apparatus described in Patent Document 1. This apparatus extracts the face image of the person to be matched from the image captured by the imaging unit and, when that face image is not suitable for matching against the stored face images, adjusts the imaging direction, zoom magnification, and exposure amount of the imaging unit based on the face image.
- However, the apparatus of Patent Document 1 adjusts the imaging direction, zoom magnification, and exposure amount only after it has determined that the face image is unsuitable for matching, so it cannot adapt the camera to the situation in advance. Nor can it determine that a face with closed eyes, or a face not turned toward the camera, is unsuitable for face matching. One factor that changes the eye state and face orientation of the person to be matched is a change in lighting conditions: illumination by sunlight varies over time, and the camera adjustment must be changed to follow that variation. The apparatus of Patent Document 1 cannot appropriately predict the amount of this change, and it cannot determine that a face with closed eyes is unsuitable for matching.
- The present disclosure has been made in view of these circumstances. Its object is to provide an imaging position determination device and an imaging position determination method that, in image processing for face matching from captured images, can adjust the camera in accordance with the situation and can reduce the acquisition of closed-eye face images.
- An imaging position determination device of the present disclosure includes: an input unit that acquires an image and the position of a person in a monitoring area; a state extraction unit that extracts the state of the person from the image acquired by the input unit; a state map creation unit that creates, from the position and state of the person, a state map indicating the state of the person in the monitoring area; and a position determination unit that determines a shooting position of the person in the monitoring area using the state map.
- FIG. 1 is a block diagram illustrating a schematic configuration of an imaging position determination device according to an embodiment of the present disclosure.
- FIG. 2 is a view of the monitoring area of the imaging position determination device of FIG. 1 from directly above, and FIG. 3 is a view of the same monitoring area from the side.
- FIGS. 4 to 7 are examples of eye state maps generated by the eye state map generation unit of the imaging position determination device of FIG. 1.
- FIG. 8 is a flowchart showing the processing of the adjustment amount prediction unit of the imaging position determination device of FIG. 1, and FIGS. 9 to 14 are examples of the display information generated by the respective display screen generation units.
- FIG. 1 is a block diagram illustrating a schematic configuration of an imaging position determination device according to an embodiment of the present disclosure.
- As shown in FIG. 1, the imaging position determination device 1 includes a camera (imaging unit) 2, an image processing device 3, an optimum monitoring position display screen generation unit 4, a various-map display screen generation unit 5, a camera installation optimum position display screen generation unit 6, a predicted adjustment amount display screen generation unit 7, and a display unit 8.
- The camera 2 captures the monitoring area, acquires a monitoring image, and outputs it as image data.
- The image processing device 3 includes an image receiving unit (input unit) 30, a face detection unit 31, an appropriate face image determination unit 32, a face state map generation unit 33, an adjustment amount prediction unit (position determination unit) 34, and a camera setting change unit 35.
- The image receiving unit 30 acquires an image of a person in the monitoring area from the image data output from the camera 2.
- The face detection unit 31 performs face detection processing for detecting a human face from the image acquired by the image receiving unit 30.
- The appropriate face image determination unit 32 includes a face size determination unit 321 that determines the size of the face of the person detected by the face detection unit 31, a face brightness determination unit 322 that determines the brightness of that face, and an eye state detection unit (state extraction unit) 323 that detects the open/closed state of the eyes of the detected person.
- The face state map generation unit 33 includes an eye state map generation unit (state map generation unit) 331 that generates, from the position and state of the person, an eye state map indicating the open/closed state of people's eyes in the monitoring area; a face size map generation unit 332 that generates a face size map from the person's face size; and a face brightness map generation unit 333 that generates a face brightness map from the person's face brightness.
- The face state map generation unit 33 outputs the generated eye state map, face size map, and face brightness map to the various-map display screen generation unit 5 and the camera installation optimum position display screen generation unit 6.
- The adjustment amount prediction unit 34 determines the optimum monitoring position of the person in the monitoring area using the various maps generated by the face state map generation unit 33, and outputs the determined optimum monitoring position to the optimum monitoring position display screen generation unit 4.
- The adjustment amount prediction unit 34 also predicts the adjustment amount of the camera 2 (imaging direction, zoom magnification, exposure amount, etc.) at the determined optimum monitoring position and outputs the predicted adjustment amount to the camera setting change unit 35.
- The camera setting change unit 35 changes the camera settings using the adjustment amount predicted by the adjustment amount prediction unit 34.
- The optimum monitoring position display screen generation unit 4 generates display information for visually displaying the optimum monitoring position determined by the adjustment amount prediction unit 34.
- The various-map display screen generation unit 5 generates display information for visually displaying the various maps generated by the face state map generation unit 33.
- The camera installation optimum position display screen generation unit 6 analyzes the states of the various maps generated by the face state map generation unit 33 and generates display information proposing where the camera 2 should be installed for optimal monitoring.
- The predicted adjustment amount display screen generation unit 7 generates display information for visually displaying the adjustment amount predicted by the adjustment amount prediction unit 34.
- The display unit 8 visually displays the display information output from the optimum monitoring position display screen generation unit 4, the various-map display screen generation unit 5, the camera installation optimum position display screen generation unit 6, and the predicted adjustment amount display screen generation unit 7.
- FIGS. 2 and 3 are diagrams showing the monitoring area 50 of the camera 2.
- FIG. 2 is a view of the area 50 from directly above.
- FIG. 3 is a view of the area 50 from the side.
- The camera 2 is arranged near the store entrance and shoots a person 100 coming out of the store.
- The monitoring area 50 is partitioned at constant intervals from its near end to its far end, and each section is called a cell 51.
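As a concrete illustration of this partitioning, the sketch below models the monitoring area as a regular grid of cells in floor coordinates. This is a minimal sketch under assumed conventions: the metric floor frame, the rectangular area, and the `CellGrid` name are illustrative, not taken from the patent.

```python
import math

class CellGrid:
    """The monitoring area 50 modeled as a regular grid of cells 51."""

    def __init__(self, width_m: float, depth_m: float, cell_m: float):
        self.cell_m = cell_m
        self.cols = math.ceil(width_m / cell_m)  # across the camera's view
        self.rows = math.ceil(depth_m / cell_m)  # away from the camera

    def cell_of(self, x_m: float, y_m: float) -> tuple[int, int]:
        """Map a floor position (x to the right, y away from the camera,
        both in metres) to clamped (row, col) cell indices."""
        col = min(max(int(x_m / self.cell_m), 0), self.cols - 1)
        row = min(max(int(y_m / self.cell_m), 0), self.rows - 1)
        return row, col

# Example: a 6 m wide, 8 m deep area split into 1 m cells; a person
# 2.4 m to the right and 3.1 m from the camera falls into cell (3, 2).
grid = CellGrid(width_m=6.0, depth_m=8.0, cell_m=1.0)
print(grid.cell_of(2.4, 3.1))  # -> (3, 2)
```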
- FIGS. 4 and 6 are views of the eye state map from directly above, and FIGS. 5 and 7 are views of the eye state map from the side.
- In FIGS. 4 and 5, the eye state map shows the state of people's eyes in a certain time zone; the darker a cell, the more closed-eye images were observed there.
- In the time zone in which the eye state maps of FIGS. 4 and 5 were obtained, many closed-eye face images are observed in the cells 51 of the fourth column counting from the camera 2 side.
- In FIGS. 4 and 5 there are several cells 51 in which many closed-eye face images are observed, and the area corresponding to these cells 51 is referred to as the deterioration area 60.
- The imaging direction, zoom magnification, exposure amount, and the like of the camera 2 are therefore adjusted so that optimal face images, free of the closed-eye state, can be acquired.
- Specifically, the camera 2 is adjusted so that optimal face images (face images without closed eyes) are acquired in the area (optimum monitoring area) 70 corresponding to the cells 51 of the fifth column counting from the camera 2 side, so that highly accurate face matching can be performed.
- In FIGS. 6 and 7, the previous deterioration area is denoted 60B (B: Before) and the current deterioration area 60A (A: After).
- In the imaging position determination device 1, a plurality of eye state maps are generated within a predetermined time, and the shooting position of a person in the monitoring area 50 is changed according to the moving direction of the deterioration area in those maps. In FIGS. 6 and 7, the previous optimum monitoring area is 70B (B: Before) and the current optimum monitoring area is 70A (A: After).
- In other words, the imaging position determination device 1 changes the shooting position of the person in the monitoring area 50 so that many closed-eye face images are no longer observed.
- The eye state map generation unit 331 calculates, for each cell 51 in the monitoring area 50, a score indicating the degree of the closed-eye state, and integrates the cells to generate an eye state map.
- This eye state map is created a plurality of times within a predetermined time, and each time one is generated, the adjustment amount prediction unit 34 uses it to determine the position at which the person in the monitoring area 50 should be photographed (hereinafter referred to as the optimum monitoring position).
- The adjustment amount prediction unit 34 also predicts the adjustment amount of the camera 2 (imaging direction, zoom magnification, exposure amount, etc.) at the determined optimum monitoring position and outputs the predicted adjustment amount to the camera setting change unit 35.
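To make the per-cell scoring concrete, the sketch below (continuing the `CellGrid` sketch above) accumulates face detections into a closed-eye-ratio map for one time zone. The `Observation` record, the per-face closed-eye flag, and the ratio used as the score are illustrative assumptions; the patent does not specify the exact score.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    x_m: float          # floor position: metres to the right of the camera axis
    y_m: float          # floor position: metres away from the camera
    eyes_closed: bool   # per-face output of the eye state detection unit 323

def build_eye_state_map(grid: CellGrid,
                        observations: list[Observation]) -> list[list[float]]:
    """Return a rows x cols eye state map for one time zone, where each
    cell holds the closed-eye ratio (0.0 = all open, 1.0 = all closed)."""
    closed = [[0] * grid.cols for _ in range(grid.rows)]
    total = [[0] * grid.cols for _ in range(grid.rows)]
    for obs in observations:
        row, col = grid.cell_of(obs.x_m, obs.y_m)
        total[row][col] += 1
        closed[row][col] += obs.eyes_closed
    return [[closed[r][c] / total[r][c] if total[r][c] else 0.0
             for c in range(grid.cols)]
            for r in range(grid.rows)]
```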
- FIG. 8 is a flowchart showing the processing of the adjustment amount prediction unit 34.
- First, an eye state map for time zone A is input (step S1).
- Next, an eye state map for time zone B is input (step S2).
- The difference between the two maps is then calculated (step S3).
- The moving speed of the deterioration area (that is, the closed-eye area) is calculated from the difference (step S4).
- Finally, the optimum monitoring position is calculated (step S5).
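The sketch below walks through steps S1 to S5 on two maps built as above. Estimating the movement from the shift of the score-weighted centroid, and choosing the distance band farthest from the predicted deterioration area, are illustrative assumptions; the patent does not fix these formulas.

```python
def centroid(state_map: list[list[float]]) -> tuple[float, float]:
    """Score-weighted centre (row, col) of the closed-eye distribution."""
    w = sum(sum(row) for row in state_map)
    if w == 0.0:
        return 0.0, 0.0
    r = sum(i * v for i, row in enumerate(state_map) for v in row)
    c = sum(j * v for row in state_map for j, v in enumerate(row))
    return r / w, c / w

def predict_optimum_row(map_a: list[list[float]], map_b: list[list[float]],
                        dt_s: float, horizon_s: float) -> int:
    """Steps S1-S5 of FIG. 8 in miniature: from the maps of time zones A
    and B, estimate how fast the deterioration area is moving, extrapolate
    where it will be, and return the row (distance band from the camera)
    farthest from that predicted position."""
    ra, _ = centroid(map_a)                   # S1: map for time zone A
    rb, _ = centroid(map_b)                   # S2: map for time zone B
    vr = (rb - ra) / dt_s                     # S3+S4: difference -> speed (cells/s)
    predicted = rb + vr * horizon_s           # where the closed-eye area will be
    return max(range(len(map_b)),             # S5: optimum monitoring position
               key=lambda r: abs(r - predicted))
```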
- Face size and face brightness can be processed in the same manner.
- For face brightness, the fact that the influence of sunlight is continuous in time can be exploited, just as that continuity moves the closed-eye area in the eye state map. For face size, information such as a tendency for many small-faced children to visit the store can be used for prediction.
- The eye state map, face size map, and face brightness map may be created and used for the same time zone every day, or created and used for each season.
- If the lighting state can be controlled, the environment of the monitoring area can be improved by controlling the lighting state.
- The various-map display screen generation unit 5, which generates display information for the eye state map, face size map, and face brightness map, lets the user recognize positions where face matching accuracy may deteriorate, so accuracy can be improved further by making fine adjustments based on this information. The user can also confirm from the display information generated by the optimum monitoring position display screen generation unit 4 which position was finally chosen as the monitoring position, and can improve the monitoring environment of that position, for example by removing obstacles that block the flow line.
- The camera setting change unit 35 changes the settings of the camera 2 for the optimum monitoring position using the adjustment amount predicted by the adjustment amount prediction unit 34. That is, the optimum monitoring position of the person in the monitoring area 50 is changed according to the moving direction of the deterioration area in the eye state map. For example, suppose the eye state map created at one time is as shown in FIGS. 4 and 5, and the eye state map created some time later is as shown in FIGS. 6 and 7. From these maps it is detected that the cells with a poor eye state are moving away from the camera 2 over time, and from that speed the camera 2 is adjusted so that the optimum monitoring area moves from 70B to 70A. In this way the camera can be adjusted by an amount suited to the situation, because the angle of the sunlight, which is one of the causes of glare, changes continuously in time.
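As one way to turn the newly chosen optimum monitoring distance into concrete adjustment amounts (the patent names imaging direction, zoom magnification, and exposure amount but gives no formulas), simple pinhole geometry can be used. Everything below, including the helper name, camera height, face size, and pixel targets, is an illustrative assumption.

```python
import math

def camera_adjustment(distance_m: float, cam_height_m: float = 2.5,
                      eye_height_m: float = 1.6, face_height_m: float = 0.25,
                      target_face_px: int = 80,
                      sensor_px_per_rad: float = 1200.0) -> dict:
    """Predict a tilt angle (degrees) and zoom factor so that a face at
    the optimum monitoring distance is centred and rendered at roughly
    target_face_px pixels tall. All constants are placeholders."""
    tilt_rad = math.atan2(cam_height_m - eye_height_m, distance_m)
    face_angle_rad = face_height_m / distance_m   # small-angle approximation
    zoom = target_face_px / (face_angle_rad * sensor_px_per_rad)
    return {"tilt_deg": math.degrees(tilt_rad), "zoom_x": zoom}

# Example: the optimum monitoring area moved from the 4th to the 5th
# 1 m cell, so re-aim the camera at a subject 5 m away.
print(camera_adjustment(distance_m=5.0))  # ~10.2 deg tilt, ~1.33x zoom
```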
- The various-map display screen generation unit 5 generates display information drawn to the scale of the real space.
- FIGS. 9 and 10 are examples of display information generated by the various-map display screen generation unit 5; FIG. 9 is a view from directly above and FIG. 10 is a view from the side. Because this display information is shown on the display unit 8, the user can understand intuitively from the eye state map where exactly the eye state deteriorates. For example, as shown in FIG. 9, if there is a deterioration area 61 far to the left as viewed from the camera 2 and the cause is light entering from outside the store, further improvement measures such as lowering a blind or changing the direction of a mirror can be taken, further improving the monitoring environment.
- The optimum monitoring position display screen generation unit 4 generates display information for displaying the optimum monitoring position for face detection in the current situation.
- FIGS. 11 and 12 are examples of display information generated by the optimum monitoring position display screen generation unit 4; FIG. 11 is a view from directly above and FIG. 12 is a view from the side. Because the display information showing the optimum monitoring position 71 appears on the display unit 8, the user can, for example, avoid placing anything that blocks the flow line at the displayed location, or conversely arrange facilities so that passers-by pass through it easily, further improving the monitoring environment.
- The camera installation optimum position display screen generation unit 6 generates display information for displaying the three-dimensional position optimal for installing the camera 2.
- FIGS. 13 and 14 are examples of display information generated by the camera installation optimum position display screen generation unit 6; FIG. 13 shows the state before the camera monitoring position is adjusted, and FIG. 14 the state after. Because this display information is shown on the display unit 8, the user can see from the display where the camera 2 should be placed for optimal monitoring.
- For example, when the eye state map is in the state shown in FIG. 13, it can be inferred from the distribution of the map that the eye state deteriorates from the lower right toward the upper left, as indicated by the arrow 90; a possible cause is, for example, the draft of an air conditioner blowing directly on that area.
- In this case, the adjustment amount prediction unit 34 predicts the adjustment amount for the optimum monitoring position of the camera 2 as shown in the figure, the predicted adjustment amount display screen generation unit 7 generates display information for that adjustment amount, and by applying it the environment of the monitoring area 50 can be improved as shown in FIG. 14. Here the monitoring area before adjustment is 50B (B: Before) and the monitoring area after adjustment is 50A (A: After).
- As described above, the imaging position determination device 1 includes the image receiving unit 30 that acquires the image and position of a person in the monitoring area 50, the eye state detection unit 323 that detects the open/closed state of the person's eyes from the image acquired by the image receiving unit 30, and the eye state map generation unit 331 that generates an eye state map indicating the eye state of people in the monitoring area 50 from the open/closed state acquired by the eye state detection unit 323.
- FIG. 15 is a diagram illustrating an example of a two-dimensional image that can be observed through the camera 2.
- In FIG. 15, reference numeral 500 denotes the angle of view of the camera 2, and reference numerals 501 and 502 denote image acquisition areas (areas in the image captured by the camera 2 that correspond to shooting positions in the monitoring area).
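The sketch below shows one way a determined shooting position in the monitoring area could be mapped to such an image acquisition area. The pinhole model, the intrinsics, and the box around the expected head position are all illustrative assumptions; the patent does not specify the projection.

```python
import numpy as np

def image_acquisition_area(world_xyz: np.ndarray, box_m: float,
                           K: np.ndarray, R: np.ndarray,
                           t: np.ndarray) -> tuple[float, float, float, float]:
    """Project a box_m-wide square centred on a world-space shooting
    position (e.g. the expected head position) into the image and
    return the bounding ROI (u_min, v_min, u_max, v_max) in pixels."""
    h = box_m / 2.0
    corners = world_xyz + np.array([[-h, -h, 0.0], [h, -h, 0.0],
                                    [-h, h, 0.0], [h, h, 0.0]])
    cam = R @ corners.T + t.reshape(3, 1)   # world frame -> camera frame
    pix = K @ cam                           # pinhole projection
    uv = pix[:2] / pix[2]                   # perspective divide
    return uv[0].min(), uv[1].min(), uv[0].max(), uv[1].max()

# Toy example: camera at the origin looking along +Z with a 1000 px
# focal length; head position 5 m ahead and 0.9 m below the camera.
K = np.array([[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
print(image_acquisition_area(np.array([0.0, 0.9, 5.0]), 0.5, K, R, t))
```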
- An imaging position determination device of the present disclosure includes: an input unit that acquires an image and the position of a person in a monitoring area; a state extraction unit that extracts the state of the person from the image acquired by the input unit; a state map creation unit that creates, from the position and state of the person, a state map indicating the state of the person in the monitoring area; and a position determination unit that determines a shooting position of the person in the monitoring area using the state map.
- The state map creation unit creates the state map for each time zone.
- This makes it possible to determine the shooting position in accordance with the time zone.
- The imaging position determination device controls the illumination state in the monitoring area using the state map.
- This makes it possible to improve the environment of the monitoring area.
- The state of the person is the open/closed state of the person's eyes.
- The state map is created a plurality of times within a predetermined time, and the shooting position of the person in the monitoring area is changed according to the moving direction of the state map in the monitoring area.
- This allows the shooting position to be changed so as not to be affected by that movement, reducing the acquisition of closed-eye face images.
- The position of an obstacle in the monitoring area is estimated from the state map.
- The device further includes an imaging unit that captures the monitoring area and acquires a monitoring image, and the imaging unit determines, from the determined shooting position in the monitoring area, an image acquisition area in the monitoring image corresponding to that shooting position.
- The imaging unit is controlled based on the determined shooting position.
- The device further includes a display screen generation unit that generates display information for displaying the state map.
- This allows the created state map to be confirmed visually.
- The shooting position determination method of the present disclosure includes: an input step of acquiring an image and the position of a person in a monitoring area; a state extraction step of extracting the state of the person from the image acquired in the input step; a state map creation step of creating, from the position and state of the person, a state map indicating the state of the person in the monitoring area; and a position determination step of determining a shooting position of the person in the monitoring area using the state map.
- This makes it possible to determine a shooting position in accordance with the position and state of the person.
- The state map creation step creates the state map for each time zone.
- This makes it possible to determine the shooting position in accordance with the time zone.
- The shooting position determination method controls the illumination state in the monitoring area using the state map.
- This makes it possible to improve the environment of the monitoring area.
- The state of the person is the open/closed state of the person's eyes.
- The state map is created a plurality of times within a predetermined time, and the shooting position of the person in the monitoring area is changed according to the moving direction of the state map in the monitoring area.
- This allows the shooting position to be changed so as not to be affected by that movement, reducing the acquisition of closed-eye face images.
- The position of an obstacle in the monitoring area is estimated from the state map.
- The method further includes a shooting step of capturing the monitoring area to obtain a monitoring image, and the shooting step determines, from the determined shooting position in the monitoring area, an image acquisition area in the monitoring image corresponding to that shooting position.
- The shooting step is controlled based on the determined shooting position.
- The method further includes a display screen generation step of generating display information for displaying the state map.
- This allows the created state map to be confirmed visually.
- The present disclosure provides an imaging position determination device and an imaging position determination method that, in image processing for face matching from captured images, can adjust the camera according to the situation and reduce the acquisition of closed-eye face images, and it is applicable to surveillance camera systems.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Ophthalmology & Optometry (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2015522558A JP6218089B2 (ja) | 2013-06-18 | 2014-06-17 | 撮影位置決定装置及び撮影位置決定方法 |
| EP14813989.2A EP3013043A4 (fr) | 2013-06-18 | 2014-06-17 | Dispositif de détermination de position d'imagerie et procédé de détermination de position d'imagerie |
| US14/898,871 US9905010B2 (en) | 2013-06-18 | 2014-06-17 | Image position determination device and image position determination method for reducing an image of a closed eye person |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2013127594 | 2013-06-18 | ||
| JP2013-127594 | 2013-06-18 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014203523A1 (fr) | 2014-12-24 |
Family
ID=52104274
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2014/003255 Ceased WO2014203523A1 (fr) | 2013-06-18 | 2014-06-17 | Dispositif de détermination de position d'imagerie et procédé de détermination de position d'imagerie |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US9905010B2 (fr) |
| EP (1) | EP3013043A4 (fr) |
| JP (1) | JP6218089B2 (fr) |
| WO (1) | WO2014203523A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018180494A1 (fr) * | 2017-03-30 | 2018-10-04 | 日本電気株式会社 | Système, procédé et programme de surveillance d'image |
| RU2679730C1 (ru) * | 2016-09-20 | 2019-02-12 | Тосиба Инфрастракче Системз Энд Солюшнз Корпорейшн | Система сопоставления изображений и способ сопоставления изображений |
| WO2020026325A1 (fr) * | 2018-07-31 | 2020-02-06 | 日本電気株式会社 | Dispositif d'évaluation, dispositif de dérivation, procédé de surveillance, dispositif de surveillance, procédé d'évaluation, programme informatique, et procédé de dérivation |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3013043A4 (fr) * | 2013-06-18 | 2017-04-12 | Panasonic Intellectual Property Management Co., Ltd. | Dispositif de détermination de position d'imagerie et procédé de détermination de position d'imagerie |
| US10667007B2 (en) * | 2014-01-22 | 2020-05-26 | Lenovo (Singapore) Pte. Ltd. | Automated video content display control using eye detection |
| US10186124B1 (en) * | 2017-10-26 | 2019-01-22 | Scott Charles Mullins | Behavioral intrusion detection system |
| IL314289A (en) | 2019-04-10 | 2024-09-01 | Raptor Vision Llc | monitoring systems |
| EP4020981B1 (fr) | 2020-12-22 | 2025-11-12 | Axis AB | Caméra et procédé associé pour faciliter son installation |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004178229A (ja) * | 2002-11-26 | 2004-06-24 | Matsushita Electric Works Ltd | 人の存在位置検出装置とその検出方法及び同検出装置を用いた自律移動装置 |
| JP2005505209A (ja) * | 2001-09-27 | 2005-02-17 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | コンピュータベースの視覚監視用の最適マルチカメラセットアップ |
| JP2008009689A (ja) * | 2006-06-29 | 2008-01-17 | Matsushita Electric Ind Co Ltd | 顔登録装置、顔認証装置および顔登録方法 |
| JP2009086932A (ja) | 2007-09-28 | 2009-04-23 | Omron Corp | 画像処理装置および方法、並びにプログラム |
| JP2011114580A (ja) * | 2009-11-26 | 2011-06-09 | Panasonic Corp | 複数カメラ監視システム、携帯端末装置、センター装置及び複数カメラ監視方法 |
| JP2012525755A (ja) * | 2009-04-29 | 2012-10-22 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | カメラの最適視角位置を選択する方法 |
Family Cites Families (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7076102B2 (en) | 2001-09-27 | 2006-07-11 | Koninklijke Philips Electronics N.V. | Video monitoring system employing hierarchical hidden markov model (HMM) event learning and classification |
| US8711217B2 (en) | 2000-10-24 | 2014-04-29 | Objectvideo, Inc. | Video surveillance system employing video primitives |
| US7043075B2 (en) | 2001-09-27 | 2006-05-09 | Koninklijke Philips Electronics N.V. | Computer vision system and method employing hierarchical object classification scheme |
| US7110569B2 (en) | 2001-09-27 | 2006-09-19 | Koninklijke Philips Electronics N.V. | Video based detection of fall-down and other events |
| US20030058342A1 (en) | 2001-09-27 | 2003-03-27 | Koninklijke Philips Electronics N.V. | Optimal multi-camera setup for computer-based visual surveillance |
| US7369680B2 (en) | 2001-09-27 | 2008-05-06 | Koninklijke Philips Electronics N.V. | Method and apparatus for detecting an event based on patterns of behavior |
| US7202791B2 (en) | 2001-09-27 | 2007-04-10 | Koninklijke Philips N.V. | Method and apparatus for modeling behavior using a probability distribution function |
| US20030058237A1 (en) | 2001-09-27 | 2003-03-27 | Koninklijke Philips Electronics N.V. | Multi-layered background models for improved background-foreground segmentation |
| US20030058111A1 (en) | 2001-09-27 | 2003-03-27 | Koninklijke Philips Electronics N.V. | Computer vision based elderly care monitoring system |
| US7822241B2 (en) * | 2003-08-21 | 2010-10-26 | Koninklijke Philips Electronics N.V. | Device and method for combining two images |
| US20060171453A1 (en) | 2005-01-04 | 2006-08-03 | Rohlfing Thomas R | Video surveillance system |
| WO2006118563A1 (fr) | 2005-04-29 | 2006-11-09 | Chubb International Holdings Limited | Procede et dispositif de localisation constante de region d'interet |
| US20060255931A1 (en) | 2005-05-12 | 2006-11-16 | Hartsfield Andrew J | Modular design for a security system |
| US7574043B2 (en) | 2005-06-27 | 2009-08-11 | Mitsubishi Electric Research Laboratories, Inc. | Method for modeling cast shadows in videos |
| US8031970B2 (en) * | 2007-08-27 | 2011-10-04 | Arcsoft, Inc. | Method of restoring closed-eye portrait photo |
| JP4831434B2 (ja) * | 2007-12-27 | 2011-12-07 | アイシン・エィ・ダブリュ株式会社 | 地物情報収集装置及び地物情報収集プログラム、並びに自車位置認識装置及びナビゲーション装置 |
| GB2476500B (en) * | 2009-12-24 | 2012-06-20 | Infrared Integrated Syst Ltd | Activity mapping system |
| EP3013043A4 (fr) * | 2013-06-18 | 2017-04-12 | Panasonic Intellectual Property Management Co., Ltd. | Dispositif de détermination de position d'imagerie et procédé de détermination de position d'imagerie |
- 2014
- 2014-06-17 EP EP14813989.2A patent/EP3013043A4/fr not_active Withdrawn
- 2014-06-17 US US14/898,871 patent/US9905010B2/en active Active
- 2014-06-17 WO PCT/JP2014/003255 patent/WO2014203523A1/fr not_active Ceased
- 2014-06-17 JP JP2015522558A patent/JP6218089B2/ja active Active
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2005505209A (ja) * | 2001-09-27 | 2005-02-17 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | コンピュータベースの視覚監視用の最適マルチカメラセットアップ |
| JP2004178229A (ja) * | 2002-11-26 | 2004-06-24 | Matsushita Electric Works Ltd | 人の存在位置検出装置とその検出方法及び同検出装置を用いた自律移動装置 |
| JP2008009689A (ja) * | 2006-06-29 | 2008-01-17 | Matsushita Electric Ind Co Ltd | 顔登録装置、顔認証装置および顔登録方法 |
| JP2009086932A (ja) | 2007-09-28 | 2009-04-23 | Omron Corp | 画像処理装置および方法、並びにプログラム |
| JP2012525755A (ja) * | 2009-04-29 | 2012-10-22 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | カメラの最適視角位置を選択する方法 |
| JP2011114580A (ja) * | 2009-11-26 | 2011-06-09 | Panasonic Corp | 複数カメラ監視システム、携帯端末装置、センター装置及び複数カメラ監視方法 |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3013043A4 |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| RU2679730C1 (ru) * | 2016-09-20 | 2019-02-12 | Тосиба Инфрастракче Системз Энд Солюшнз Корпорейшн | Система сопоставления изображений и способ сопоставления изображений |
| US10515457B2 (en) | 2016-09-20 | 2019-12-24 | Kabushiki Kaisha Toshiba | Image collation system and image collation method |
| WO2018180494A1 (fr) * | 2017-03-30 | 2018-10-04 | 日本電気株式会社 | Système, procédé et programme de surveillance d'image |
| JPWO2018180494A1 (ja) * | 2017-03-30 | 2020-02-06 | 日本電気株式会社 | 映像監視システム、映像監視方法および映像監視プログラム |
| US10909385B2 (en) | 2017-03-30 | 2021-02-02 | Nec Corporation | Image monitoring system, image monitoring method, and image monitoring program |
| JP7147746B2 (ja) | 2017-03-30 | 2022-10-05 | 日本電気株式会社 | 映像監視システム、映像監視方法および映像監視プログラム |
| WO2020026325A1 (fr) * | 2018-07-31 | 2020-02-06 | 日本電気株式会社 | Dispositif d'évaluation, dispositif de dérivation, procédé de surveillance, dispositif de surveillance, procédé d'évaluation, programme informatique, et procédé de dérivation |
| US11328404B2 (en) | 2018-07-31 | 2022-05-10 | Nec Corporation | Evaluation apparatus, evaluation method, and non-transitory storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3013043A1 (fr) | 2016-04-27 |
| US9905010B2 (en) | 2018-02-27 |
| JPWO2014203523A1 (ja) | 2017-02-23 |
| EP3013043A4 (fr) | 2017-04-12 |
| US20160133021A1 (en) | 2016-05-12 |
| JP6218089B2 (ja) | 2017-10-25 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6218089B2 (ja) | 撮影位置決定装置及び撮影位置決定方法 | |
| JP5435307B2 (ja) | 車載カメラ装置 | |
| US20190206067A1 (en) | Image processing apparatus, monitoring system, image processing method,and program | |
| JP4979525B2 (ja) | マルチカメラシステム | |
| JP2013005234A5 (fr) | ||
| JP5127531B2 (ja) | 画像監視装置 | |
| CN105620364A (zh) | 提供驾驶信息的方法和装置 | |
| JP2016085739A (ja) | ビデオ処理で使用する少なくとも一つのパラメータの修正 | |
| EP2309454A2 (fr) | Appareil et procédé de détection de mouvement | |
| CN115103094B (zh) | 一种基于注视点的摄像头模组远视角调节方法及系统 | |
| JP2009116742A (ja) | 車載用画像処理装置、画像処理方法、および、プログラム | |
| JP6820075B2 (ja) | 乗員数検知システム、乗員数検知方法、およびプログラム | |
| JP2006211139A (ja) | 撮像装置 | |
| WO2013114862A1 (fr) | Dispositif et procédé de réglage d'appareil photo optimal | |
| WO2013094212A1 (fr) | Dispositif de contrôle d'exposition, dispositif d'imagerie, dispositif d'affichage d'image et méthode de contrôle d'exposition | |
| JP2008281385A (ja) | 画像処理装置 | |
| US20190041231A1 (en) | Eyeglasses-type wearable information terminal, control method thereof, and control program | |
| JP2015177371A (ja) | 監視装置 | |
| JP2007148988A (ja) | 顔認証装置、顔認証方法および入退場管理装置 | |
| US10715787B1 (en) | Depth imaging system and method for controlling depth imaging system thereof | |
| GB2529435A (en) | A Method of Generating A Framed Video Stream | |
| JP2010175966A (ja) | 画像処理装置及び撮像装置 | |
| JP2018201146A (ja) | 画像補正装置、画像補正方法、注目点認識装置、注目点認識方法及び異常検知システム | |
| JP4618062B2 (ja) | 監視装置、監視システム、および監視方法 | |
| JP4872396B2 (ja) | 画像編集装置、画像編集方法および画像編集プログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14813989; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 14898871; Country of ref document: US |
| | ENP | Entry into the national phase | Ref document number: 2015522558; Country of ref document: JP; Kind code of ref document: A |
| | REEP | Request for entry into the european phase | Ref document number: 2014813989; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2014813989; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |