US20170076465A1 - Image processing apparatus and image processing method - Google Patents
Image processing apparatus and image processing method
- Publication number: US20170076465A1 (U.S. application Ser. No. 15/341,086)
- Authority: US (United States)
- Prior art keywords: area, rectangular display, display area, line segment, detection
- Prior art date: 2011-05-20 (priority from Japanese Patent Application No. 2011-114110)
- Legal status: Abandoned (this is an assumption and not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/204—Image analysis; Analysis of motion
- G06T7/0044—Image analysis; Analysis of motion
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving reference images or patches
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
- G06T2207/10016—Indexing scheme for image analysis or image enhancement; image acquisition modality: Video; Image sequence
- G06T2207/30232—Indexing scheme for image analysis or image enhancement; subject of image: Surveillance
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Alarm Systems (AREA)
Abstract
An image processing apparatus includes a detection unit to detect that an object moving in an image displayed within a rectangular display area has passed through a detection line segment. A setting unit sets, based on a user operation, the detection line segment within the rectangular display area. A display control unit displays a setting area being set along at least one side of the rectangular display area and a remaining area of the rectangular display area other than the setting area, so that they can be distinguished. The setting area is an area in which false determination of the detection by the detection unit may occur due to at least one of a size of the object and a speed of the object.
Description
- This is a continuation application of copending U.S. patent application Ser. No. 13/454,441, filed Apr. 24, 2012, which is incorporated by reference herein in its entirety.
- This application also claims the benefit of Japanese Patent Application No. 2011-114110, filed May 20, 2011, which is hereby incorporated by reference herein in its entirety.
- Field of the Invention
- The present invention relates to an image processing apparatus and an image processing method for observation or monitoring using pictures.
- Description of the Related Art
- When detecting the passage of an object or a human body through a specific portion in a screen by using a picture obtained from a surveillance camera, or the like, a conventional technique tracks the object or human body detected from the picture within the screen and detects the passage through the specific portion.
- There is known a conventional technique of detecting an object from motion vectors, estimating a search position in the next frame, and tracking the object by template matching (for example, Japanese Patent Laid-Open No. 2002-373332). There is also known a technique as a conventional example, which performs face tracking based on motion information detected from the correlation between a current frame and a past frame (for example, Japanese Patent Laid-Open No. 2010-50934). It is possible to determine the passage of an object through a specific portion based on this tracking result. In general, when performing passage detection, the passage of an object is determined by detecting that a tracking line, which is the locus of object tracking, intersects with a set determination line segment or determination region frame.
- When, however, a passage detection line is set near the upper and lower ends or the left and right ends within an imaging screen, even an object that has passed over the passage detection line sometimes cannot be detected. Assume a scheme that tracks an object to be detected at its barycenter when detecting an object moving from outside the screen to inside it. In this case, in the first image that depicts the object after it has entered the screen, the barycentric position is already located inside the set detection line, that is, it has already passed the line. This is because, when the object appears in the imaging screen, the barycentric position is already located several pixels inward from the screen edge. For this reason, in this case, the tracking line of the barycentric position does not intersect with the passage detection determination line, and, hence, the passage of the object cannot be detected. This phenomenon is especially noticeable for a fast object.
- The following is another example of the inability to detect passage. When an object is located at the position of a screen end, it is not possible to determine the moving direction of the object in the image captured for the first time. This makes it impossible sometimes to detect the passage of the object.
- On the other hand, it is possible to use a detection method that determines the passage of an object to be detected when part of the object, instead of its barycenter, comes into contact with a detection line. This can avoid the above detection omission, but may cause a false detection. For example, when an object has merely passed near the detection line, part of the object may touch the line even though the object has not actually moved across it, and a false detection occurs. For this reason, the passage of an object is generally detected by tracking the barycenter of the object or the midpoint of a diagonal line of the object and using the intersection between its locus and a detection line.
- When settings are made to filter objects by size and to inhibit the detection of any object having a specific size or less, part of an object near a screen end falls outside of the screen. The apparent size of the object to be detected therefore decreases on the screen, and the object is excluded from detection at the time of filtering. This may also lead to detection omission.
- The present invention has been made in consideration of the above problems, and provides a technique for preventing false detection near a screen end when detecting the passage of an object or a human body through a specific portion within a screen by using the pictures obtained from a surveillance camera, or the like.
- According to a first aspect, the present invention provides an image processing apparatus comprising a detection unit that detects that an object moving within a display screen has passed through an object detection line segment set in the display screen, and a setting unit that sets a region, where the detection is inhibited, in a frame of the display screen, wherein the detection unit detects that the object has passed through the object detection line segment set in a region other than the region set by the setting unit.
- According to a second aspect, the present invention provides an image processing method comprising a detection step of detecting that an object moving within a display screen has passed through an object detection line segment set in the display screen, and a setting step of setting a region, where the detection is inhibited, in a frame of the display screen, wherein, in the detection step, it is detected that the object has passed through the object detection line segment set in a region other than a region set in the setting step.
- According to a third aspect, the present invention provides a non-transitory computer-readable storage medium recording a program for causing a computer to execute a detection step of detecting that an object moving within a display screen has passed through an object detection line segment set in the display screen, and a setting step of setting a region, where the detection is inhibited, in a frame of the display screen, wherein, in the detection step, it is detected that the object has passed through the object detection line segment set in a region other than a region set in the setting step.
- Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
- FIG. 1 is a block diagram showing an example of the functional arrangement of an image processing apparatus;
- FIG. 2 is a view showing an example of the arrangement of information managed by a locus management unit 104;
- FIG. 3A is a view showing the first arrangement example of parameters defining an object detection region;
- FIG. 3B is a view showing the second arrangement example of parameters defining an object detection region;
- FIG. 3C is a view showing the third arrangement example of parameters defining an object detection region;
- FIG. 4 is a view for explaining the processing performed by a locus information determination unit 106;
- FIG. 5 is a view for explaining the determination of the passage of an object;
- FIG. 6 is a view for explaining an application window;
- FIG. 7 is a flowchart showing setting processing for an object detection region and an inhibition region;
- FIG. 8 is a view for explaining setting processing for an inhibition region;
- FIG. 9 is a view showing a display example of an object detection region; and
- FIG. 10 is a view for explaining setting processing for an inhibition region.
- An embodiment of the present invention will be described below with reference to the accompanying drawings. The embodiment described below is an example of a concrete execution of the present invention, and one of the specific embodiments of the arrangements described in the scope of claims.
- This embodiment is directed to an image processing apparatus that displays a moving image depicting a moving object in a display screen to detect that the object moving in the display screen has passed through an object detection region set in the display screen.
- An example of the functional arrangement of the image processing apparatus according to this embodiment will be described first with reference to the block diagram of FIG. 1. An image processing apparatus 100 can be a general PC (Personal Computer), an image processing circuit mounted in a camera capable of capturing moving images, or another type of device, as long as it can implement the function of the image processing apparatus described above.
- A display device 190 formed by a CRT, liquid crystal screen, or the like, is connected to the image processing apparatus 100. The image processing apparatus 100 displays its processing result on the display device 190 in the form of images, characters, and the like. The following is a case in which a moving image is displayed on the display screen of the display device 190.
- An image acquisition unit 101 sequentially acquires the images of frames constituting a moving image depicting one or more objects that move in and out of the display screen or move across a plurality of frames within the display screen. The image acquisition unit 101 sequentially outputs the acquired images of the respective frames to an object detection unit 102. The image acquisition unit 101 may acquire such a moving image from an imaging device capable of capturing moving images or from a device holding such moving images in advance. That is, the source of moving images is not specifically limited.
- The object detection unit 102 detects an object depicted in the image of a frame received from the image acquisition unit 101 by using a technique such as a background differencing technique. Obviously, the object detection method to be used is not limited to any specific method. Upon detecting an object from the image of a given frame, the object detection unit 102 generates various kinds of information (to be described later) associated with the detection.
- When the object detection unit 102 detects the same object as that detected from the image of a frame immediately preceding a frame of interest, an object tracking unit 103 associates the objects in the respective frames with each other. Assume that the object tracking unit 103 assigns object ID=A to the object that the object detection unit 102 has detected from the image of the frame immediately preceding the frame of interest. When the object detection unit 102 also detects the same object from the frame of interest, the object tracking unit 103 also assigns object ID=A to the object. In this manner, when identical objects are detected throughout a plurality of frames, the same ID is assigned to each object. Note that a new object ID is assigned to an object newly detected in a frame of interest.
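The patent does not prescribe a particular association algorithm, so the following is only a minimal sketch of the frame-to-frame ID assignment described above, using a nearest-centroid rule; the function name `associate` and the distance threshold are assumptions:

```python
import math
from itertools import count

_next_id = count()

def associate(prev_tracks, detections, max_dist=50.0):
    """Give each detection the ID of the nearest previous track
    (within max_dist pixels); otherwise start a new track.
    prev_tracks: {object_id: (x, y)} centers from the previous frame.
    detections:  [(x, y), ...] centers detected in the frame of interest.
    Returns {object_id: (x, y)} for the current frame."""
    current = {}
    unused = dict(prev_tracks)
    for cx, cy in detections:
        best_id, best_d = None, max_dist
        for oid, (px, py) in unused.items():
            d = math.hypot(cx - px, cy - py)
            if d < best_d:
                best_id, best_d = oid, d
        if best_id is None:
            best_id = next(_next_id)   # newly appeared object: new ID
        else:
            del unused[best_id]        # same object as in the previous frame
        current[best_id] = (cx, cy)
    return current
```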
- A locus management unit 104 manages information for each object that is obtained by the object detection unit 102 and the object tracking unit 103. FIG. 2 shows an example of the arrangement of the information managed by the locus management unit 104.
- In management information 201 managed by the locus management unit 104, information (object information) 202 is managed for each object. In other words, in the management information 201, the object information 202 is managed for each object ID. In the object information 202 for one object, information 203 is managed for each frame (Timestamp) in which the object has been detected. The information 203 includes a detected coordinate position (Position), information (Boundingbox) that defines a circumscribed rectangle enclosing the region of the detected object, and the size of the object (size). Obviously, the pieces of information that can be included in object information are not limited to these, and any kind of information may be included as long as it provides the ability to implement the processing to be described below. A locus information determination unit 106 properly uses each piece of information managed by the locus management unit 104.
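As an illustration of the FIG. 2 arrangement, the managed information might be held as follows; the field names mirror the figure (Timestamp, Position, Boundingbox, size), while the class names and concrete types are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class FrameInfo:                      # information 203: one entry per frame
    timestamp: float                  # Timestamp of the frame
    position: tuple[float, float]     # Position: detected center coordinates
    bounding_box: tuple[float, float, float, float]  # Boundingbox: x, y, w, h
    size: float                       # size of the detected object

@dataclass
class ObjectInfo:                     # object information 202, one per object ID
    object_id: int
    frames: list[FrameInfo] = field(default_factory=list)

# management information 201: object information managed per object ID
management_info: dict[int, ObjectInfo] = {}
```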
- A determination parameter setting unit 105 acquires or sets parameters for the determination of the passage of an object in the image of each frame through an object detection region, that is, parameters that define the object detection region. The determination parameter setting unit 105 then sets the acquired or set parameters in the locus information determination unit 106.
- FIG. 3A shows an example of the arrangement of parameters acquired or set by the determination parameter setting unit 105. The parameters shown in FIG. 3A define a line segment (Line) connecting coordinates (10, 10) and coordinates (20, 30) on the display screen of the display device 190 as an object detection region. These parameters define that when an object having a size (Size) of 100 to 250 has passed through (cross) this object detection region (line segment), the object is regarded as a detection target.
- The locus information determination unit 106 performs passage determination processing for an object with respect to an object detection region, based on the parameters set by the determination parameter setting unit 105 and the information managed by the locus management unit 104. The processing performed by the locus information determination unit 106 when the parameters shown in FIG. 3A are set will be described with reference to FIG. 4.
- The locus information determination unit 106 determines whether a motion vector 404, from a circumscribed rectangle 402 of the object in the frame immediately preceding a frame of interest to a circumscribed rectangle 403 of the object in the frame of interest, has intersected with a line segment 401 defined by the parameters. To determine whether the motion vector has intersected with the line segment is to determine whether the object has passed through the line segment 401.
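The check of whether the motion vector 404 has intersected with the line segment 401 is a standard 2-D segment-segment intersection test; the following is a sketch (not taken from the patent) using cross-product orientations, exercised with the FIG. 3A detection line:

```python
def _orient(ax, ay, bx, by, cx, cy):
    """Sign of the cross product (b - a) x (c - a)."""
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)

def segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 (the motion vector) strictly crosses segment
    q1-q2 (the detection line); collinear touching counts as no crossing."""
    d1 = _orient(*q1, *q2, *p1)
    d2 = _orient(*q1, *q2, *p2)
    d3 = _orient(*p1, *p2, *q1)
    d4 = _orient(*p1, *p2, *q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

# Example: detection line of FIG. 3A, from (10, 10) to (20, 30), and a
# motion vector from the previous center to the current center.
crossed = segments_intersect((5, 25), (25, 15), (10, 10), (20, 30))
print(crossed)  # True: the object's center has passed through the line
```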
- Determination of the passage of an object through the above line segment, set on an end portion of the display screen of the display device 190, will be described below with reference to FIG. 5.
- A screen 501 is the display screen of the display device 190 that is displaying the image of a frame at time t1. A line segment 502 set on the right end of the screen 501 is a line segment defined as an object detection region by parameters. Although not displayed on the screen 501, a vehicle (a circumscribed rectangle 503 in FIG. 5) that will appear in the subsequent frame is approaching from the right side of the screen 501.
- A screen 504 is a display screen of the display device 190 that displays the image of a frame (a frame at time t2) succeeding the frame at time t1. Part of the vehicle is depicted in the screen 504. A circumscribed rectangle 505 is a circumscribed rectangle of the vehicle detected by the object detection unit 102 from the screen 504. A position 506 is the center position of the circumscribed rectangle 505. Assume that the vehicle has appeared within the display screen for the first time in this frame.
- A screen 507 is a display screen of the display device 190 that displays the image of a frame (a frame at time t3) succeeding the frame at time t2. All of the vehicle is depicted in the screen 507. A circumscribed rectangle 508 is a circumscribed rectangle of the vehicle detected by the object detection unit 102 from the screen 507. A position 509 is the center position of the circumscribed rectangle 508.
- When performing passage determination for the screen at time t3, the locus information determination unit 106 determines whether a line segment connecting the position 506 and the position 509 intersects with the line segment 502. In the case shown in FIG. 5, although the object (the vehicle) has in fact passed through the line segment 502, the line segment connecting the position 506 and the position 509 does not intersect with the line segment 502. For this reason, in this case, the locus information determination unit 106 determines that the vehicle has not passed through the line segment 502. The result is therefore a false determination.
- Providing an object detection region on an end portion of a screen in this manner will increase the possibility of such a false determination. To prevent such false determination, this embodiment sets a region near an end portion (frame) of the screen as an inhibition region for the "detection of whether an object moving in the screen has passed through the object detection region set in the screen".
- This inhibition region will be described with reference to FIG. 6. Reference numeral 601 denotes an application window that is displayed on the screen of the display device 190 to set an inhibition region, and 602 denotes a region for displaying the image of each frame acquired by the image acquisition unit 101. It is not indispensable to display the image of each frame in the region 602 in the following processing; the image (still image) of a given frame, or no image, may be displayed in the region 602.
- In this embodiment, a region that inhibits the above passage determination is set in the region 602. Referring to FIG. 6, an inhibition region is set as indicated by reference numeral 603. The inhibition region 603 is a region outside of a region 605 enclosed by the following four borders in the region 602:
- a border spaced apart from the top border of the region 602 by a set distance in an inward direction of the region 602,
- a border spaced apart from the bottom border of the region 602 by a set distance in an inward direction of the region 602,
- a border spaced apart from the left border of the region 602 by a set distance in an inward direction of the region 602, and
- a border spaced apart from the right border of the region 602 by a set distance in an inward direction of the region 602.
- In other words, this embodiment is configured to perform the above passage determination only for the region 605 enclosed by these four borders. Referring to FIG. 6, reference numeral 604 denotes the distance (set distance) between the bottom border of the region 605 and the bottom border of the region 602. Since the region 602 is a reduced region of the display screen of the display device 190, the inhibition region 603 is a reduced region of an inhibition region set on the actual display screen. On the actual display screen of the display device 190, a region outside of the region enclosed by the following four borders is an inhibition region for passage determination:
- a border spaced apart from the bottom border of the display screen by a set distance in an inward direction of the display screen,
- a border spaced apart from the left border of the display screen by a set distance in an inward direction of the display screen, and
- a border spaced apart from the right border of the display screen by a set distance in an inward direction of the display screen.
- The user can set the
inhibition region 603 described above in theregion 602, as well as an object detection region by using the determinationparameter setting unit 105. In this case, the determinationparameter setting unit 105 is implemented by an input device, such as a keyboard or mouse. Obviously, theinhibition region 603 may be determined in advance. Referring toFIG. 6 , the user sequentially sets points 606, 607, and 608 to set, as an object detection region, a series of line segments constituted by a line segment connecting the 606 and 607 and a line segment connecting thepoints 607 and 608. As described above, however, thepoints inhibition region 603 inhibits passage determination, and hence, the determinationparameter setting unit 105 inhibits an object detection region from being included in theinhibition region 603. In the case shown inFIG. 6 , since thepoint 608 is included in theinhibition region 603, when the user tries to set thepoint 608, the determinationparameter setting unit 105 inhibits the setting operation. Various kinds of methods of inhibiting setting operation are conceivable. If, for example, this apparatus uses a method of making the user operate the determinationparameter setting unit 105 to move the cursor and to set a point at the current position of the cursor, the apparatus may inhibit the cursor from moving to theinhibition region 603. Alternatively, the apparatus may allow the cursor to move to theinhibition region 603, but may reject a point setting operation, or may forcibly move a point set in theinhibition region 603 to the outside of theinhibition region 603. - In this manner, when setting an object detection region in this embodiment, the apparatus controls the setting processing so as to prevent the object detection region from overlapping an inhibition region. As a consequence, the apparatus performs passage determination in regions other than the region that inhibits passage determination, thereby preventing the above false determination.
- The determination result obtained by the locus
information determination unit 106 may be output to the outside via anoutside output unit 107. If theoutside output unit 107 is a display device formed by a CRT or a liquid crystal screen, theoutside output unit 107 may be used instead of thedisplay device 190. - Setting processing for an object detection region and an inhibition region will be described with reference to
FIG. 7 showing a flowchart for this processing. - In step S701, the locus
information determination unit 106 determines whether to continue the following processing, that is, to terminate this processing. The apparatus terminates the processing if a condition for terminating the processing is satisfied, for example, an instruction to terminate the processing is input. If the following processing is to be continued, the process advances to step S702. - In step S702, the locus
information determination unit 106 reads parameters like those shown inFIG. 3A , which have been acquired or set by the determinationparameter setting unit 105, from the determinationparameter setting unit 105. If the determinationparameter setting unit 105 has not acquired/set such parameters, the process advances to step S704, while skipping steps S702 and S703. - In step S703, the locus
information determination unit 106 calculates a region that permits detection of an object and a region that inhibits detection of an object from the parameters read in step S702. Processing performed in this step will be described with reference toFIGS. 3B and 8 . - Assume that the parameters read in step S702 are those having the arrangement shown in
FIG. 3B . According to the parameters shown inFIG. 3B , a line segment (Line) connecting coordinates (950, 250) and coordinates (950, 600) on the display screen of thedisplay device 190 is defined as an object detection region. In addition, the parameters define that when an object having a size (Size) of 300 to 400 has passed through this object detection region (line segment), the object is regarded as a detection target. - Upon acquiring such parameters from the determination
parameter setting unit 105, the locusinformation determination unit 106 may set aset distance 802 to half of the maximum size of the object to be detected, that is, 200. Assume that the coordinate positions of the upper left and lower right corners of adisplay screen 801 of thedisplay device 190 are respectively (0, 0) and (1200, 1000). In this case, the locusinformation determination unit 106 sets the inside of arectangular region 803 with the coordinate positions of the upper left and lower right corners being (200, 200) and (1000, 800), respectively, as a region that provides the ability to set an object detection region, and a region outside of therectangular region 803 as the above inhibition region. - Assume that the locus
information determination unit 106 has read parameters having the arrangement shown inFIG. 3C from the determinationparameter setting unit 105 in step S702. The parameters shown inFIG. 3C define a line segment (Line) connecting coordinates (1190, 250) and coordinates (1190, 600) on the display screen of thedisplay device 190 as an object detection region. These parameters also define that when an object having a size (Size) of 150 to 200 has passed through (cross) this object detection region (line segment), the object is regarded as a detection target. - Upon acquiring such parameters from the determination
parameter setting unit 105, the locusinformation determination unit 106 may set aset distance 1001 to half of the maximum size of the object to be detected, that is, 100, as shown inFIG. 10 . In this case, the locusinformation determination unit 106 sets the inside of arectangular region 1002 with the coordinate positions of the upper left and lower right corners being (100, 100) and (1100, 900), respectively, as a region that provides the ability to set an object detection region, and a region outside of therectangular region 1002 as the above inhibition region. In this case, part of the object detection region is included in the inhibition region. The apparatus may re-set either or both of an object detection region and an inhibition region so as to avoid the object detection region from being included in the inhibition region. If, for example, an object detection region is to be re-set, “Coordinate” of the parameters shown inFIG. 3C may be corrected to (1100, 250) and (1100, 600). - According to the above description, a set distance is obtained in accordance with the size of an object to be detected. However, another method may be used as a method of obtaining a set distance. For example, it is possible to obtain a set distance in accordance with the moving speed of this object in the screen. In this case, it is possible to obtain the moving distance of the object between a current frame and a past frame in the screen as a moving speed and to increase a set distance with an increase in obtained moving speed.
- Referring back to
FIG. 7 , in step S704, as shown inFIG. 6 , the apparatus displays an application window showing a region that inhibits detection of an object and an object detection region on the display screen of thedisplay device 190. A method of displaying each region is not limited to this. - When the process advances to step S704, while skipping steps S702 and S703, the apparatus does not display an object detection region or an inhibition region, and the user newly sets these regions by operating the determination
parameter setting unit 105 in step S704. - According to the above description, the width and height of an object to be detected are not designated. However, a width and a height may be added as setting items. In this case, when calculating set distances in step S703, it is possible to set a set distance at the top and bottom borders in accordance with a height, and a set distance at the left and right borders in accordance with a width. When, for example, setting a human body as an object to be detected, since the human body is an object that is longer in the widthwise direction than in the height direction, the set distance at the top and bottom borders is longer than that at the left and right borders. This also applies to a case in which a set distance is obtained in accordance with the moving speed of the object. That is, when parameters are set on the assumption that the moving speed of the same object varies in the horizontal and vertical directions depending on the installation conditions for a camera, it is possible to set a set distance at the top and bottom borders in accordance with the moving speed in the vertical direction and a set distance at the left and right borders in accordance with the moving speed in the horizontal direction.
- In step S705, the locus
information determination unit 106 determines whether an inhibition region or an object detection region has been set (re-set). This determination is to determine whether the user has set (re-set) an inhibition region or an object detection region by, for example, changing a set distance, editing or creating an object detection region in the application window by using the determinationparameter setting unit 105. If the apparatus determines, as a result of this determination, that the user has not set (re-set) any region, the process advances to step S707. If the apparatus determines that the user has set (re-set) a region, the process advances to step S706. - In step S706, the locus
information determination unit 106 reflects the change made by setting (re-setting) in the inhibition region and/or the object detection region. More specifically, the locusinformation determination unit 106 reflects information after a change in the object detection region in the information managed by thelocus management unit 104, and stores information defining the set (re-set) inhibition region. Assume that the object detection region is included in a region that inhibits the detection of any object at this time. The coping techniques described above are applied to such a case. - For example, in step S706, as shown in
FIG. 9 , if the user designates aregion 1201 as an object detection region, the apparatus may display not only theregion 1201, but also, aregion 1202 that is set by moving theregion 1201, so as to prevent it from being included in an inhibition region. - If a situation that will terminate this processing occurs (a condition for the termination of the processing is satisfied or an instruction to terminate the processing is input), the process returns to step S702 through step S707. If no such situation has occurred, the apparatus terminates the processing through step S707. Obviously, if the process returns to step S702 through steps S706 and S707, the change made in step S706 is reflected in the corresponding region.
- As has been described above, according to this embodiment, when a detection region is set near the upper and lower ends or the left and right ends within the imaging screen by using the picture obtained from a surveillance camera, or the like, it is possible to prevent false detection near an end of the screen.
- Obviously, although the above description has exemplified the determination of passage of an object through a region as an example of object detection, the present invention can be applied to other detection contents.
- Aspects of the present invention can also be realized by a computer of a system or an apparatus (or devices such as a CPU or an MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or an apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer, for example, via a network or from a recording medium of various types serving as the memory device (for example, a computer-readable medium).
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Claims (19)
1-8. (canceled)
9. An image processing apparatus comprising:
a detection unit configured to detect that an object moving in an image displayed within a rectangular display area has passed through a detection line segment;
a setting unit configured to set, based on a user operation, the detection line segment within the rectangular display area; and
a display control unit configured to display a setting area being set along at least one side of the rectangular display area and a remaining area of the rectangular display area other than the setting area, so that they can be distinguished,
wherein the setting area is an area in which false determination of the detection by the detection unit may occur due to at least one of a size of the object and a speed of the object.
10. The apparatus according to claim 9 , further comprising a control unit configured to control operation so that at least a portion of the detection line segment is not included within the setting area.
11. The apparatus according to claim 10 , wherein the control unit is configured to move the detection line segment so that at least a portion of the detection line segment is not included within the setting area.
12. The apparatus according to claim 10 , wherein the control unit is configured to re-set the setting area so that at least a portion of the detection line segment is not included within the setting area.
13. The apparatus according to claim 10 , wherein the control unit is configured to limit the user operation so that at least a portion of the detection line segment is not included within the setting area.
14. The apparatus according to claim 9 , wherein the setting area is an area where the detection is inhibited, and is an outside of an area enclosed by a border spaced apart from a top border of the rectangular display area by a set distance in an inward direction of the rectangular display area, a border spaced apart from a bottom border of the rectangular display area by a set distance in an inward direction of the rectangular display area, a border spaced apart from a left border of the rectangular display area by a set distance in an inward direction of the rectangular display area, and a border spaced apart from a right border of the rectangular display area by a set distance in an inward direction of the rectangular display area.
15. An image processing method comprising:
detecting that an object moving in an image displayed within a rectangular display area has passed through a detection line segment;
setting, based on a user operation, the detection line segment within the rectangular display area; and
displaying a setting area being set along at least one side of the rectangular display area and a remaining area of the rectangular display area other than the setting area, so that they can be distinguished,
wherein the setting area is an area in which false determination of the detection in the detecting may occur due to at least one of a size of the object and a speed of the object.
16. The method according to claim 15 , further comprising controlling operation so that at least a portion of the detection line segment is not included within the setting area.
17. The method according to claim 16 , wherein, in the controlling, the detection line segment is moved so that at least a portion of the detection line segment is not included within the setting area.
18. The method according to claim 16 , wherein, in the controlling, the setting area is re-set so that at least a portion of the detection line segment is not included within the setting area.
19. The method according to claim 16 , wherein, in the controlling, the user operation is limited so that at least a portion of the detection line segment is not included within the setting area.
20. The method according to claim 15 , wherein the setting area is an area where the detection is inhibited, and is an outside of an area enclosed by a border spaced apart from a top border of the rectangular display area by a set distance in an inward direction of the rectangular display area, a border spaced apart from a bottom border of the rectangular display area by a set distance in an inward direction of the rectangular display area, a border spaced apart from a left border of the rectangular display area by a set distance in an inward direction of the rectangular display area, and a border spaced apart from a right border of the rectangular display area by a set distance in an inward direction of the rectangular display area.
21. A non-transitory computer-readable storage medium storing a computer-program for causing a computer to function as:
a detection unit configured to detect that an object moving in an image displayed within a rectangular display area has passed through a detection line segment;
a setting unit configured to set, based on a user operation, the detection line segment within the rectangular display area; and
a display control unit configured to display a setting area being set along at least one side of the rectangular display area and a remaining area of the rectangular display area other than the setting area so that they can be distinguished,
wherein the setting area is an area in which false determination of the detection by the detection unit may occur due to at least one of a size of the object and a speed of the object.
22. The medium according to claim 21 , further comprising a control unit configured to control operation so that at least a portion of the detection line segment is not included within the setting area.
23. The medium according to claim 22 , wherein the control unit is configured to move the detection line segment so that at least a portion of the detection line segment is not included within the setting area.
24. The medium according to claim 22 , wherein the control unit is configured to re-set the setting area so that at least a portion of the detection line segment is not included within the setting area.
25. The medium according to claim 22 , wherein the control unit is configured to limit the user operation so that at least a portion of the detection line segment is not included within the setting area.
26. The medium according to claim 21 , wherein the setting area is an area where the detection is inhibited, and is an outside of an area enclosed by a border spaced apart from a top border of the rectangular display area by a set distance in an inward direction of the rectangular display area, a border spaced apart from a bottom border of the rectangular display area by a set distance in an inward direction of the rectangular display area, a border spaced apart from a left border of the rectangular display area by a set distance in an inward direction of the rectangular display area, and a border spaced apart from a right border of the rectangular display area by a set distance in an inward direction of the rectangular display area.
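As a reading aid for the geometry recited in claims 14, 20, and 26, the following Python sketch tests whether a point lies in the setting (inhibition) area, i.e., outside the inner area enclosed by the four inwardly spaced borders. It is an informal illustration only, with assumed parameter names, and is not part of the claims.

```python
def in_setting_area(px, py, frame_w, frame_h,
                    d_top, d_bottom, d_left, d_right):
    """Return True if point (px, py) lies in the inhibition (setting) area.

    The inner area is enclosed by borders spaced apart from the top,
    bottom, left, and right borders of the rectangular display area by
    their respective set distances in the inward direction; everything
    outside that inner area belongs to the setting area.
    """
    inside = (d_left <= px <= frame_w - d_right and
              d_top <= py <= frame_h - d_bottom)
    return not inside
```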
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/341,086 US20170076465A1 (en) | 2011-05-20 | 2016-11-02 | Image processing apparatus and image processing method |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2011114110A JP5885398B2 (en) | 2011-05-20 | 2011-05-20 | Image processing apparatus and image processing method |
| JP2011-114110 | 2011-05-20 | ||
| US13/454,441 US9514541B2 (en) | 2011-05-20 | 2012-04-24 | Image processing apparatus and image processing method |
| US15/341,086 US20170076465A1 (en) | 2011-05-20 | 2016-11-02 | Image processing apparatus and image processing method |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/454,441 Continuation US9514541B2 (en) | 2011-05-20 | 2012-04-24 | Image processing apparatus and image processing method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20170076465A1 (en) | 2017-03-16 |
Family ID=47174593
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/454,441 Active 2032-09-21 US9514541B2 (en) | 2011-05-20 | 2012-04-24 | Image processing apparatus and image processing method |
| US15/341,086 Abandoned US20170076465A1 (en) | 2011-05-20 | 2016-11-02 | Image processing apparatus and image processing method |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/454,441 Active 2032-09-21 US9514541B2 (en) | 2011-05-20 | 2012-04-24 | Image processing apparatus and image processing method |
Country Status (3)
| Country | Link |
|---|---|
| US (2) | US9514541B2 (en) |
| JP (1) | JP5885398B2 (en) |
| CN (1) | CN102800102B (en) |
Families Citing this family (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014051010A1 (en) * | 2012-09-28 | 2014-04-03 | 株式会社Jvcケンウッド | Diagnosis assistance device and diagnosis assistance method |
| MY168266A (en) * | 2013-01-29 | 2018-10-16 | Syusei Co Ltd | Surveillance system |
| JP6159179B2 (en) * | 2013-07-09 | 2017-07-05 | キヤノン株式会社 | Image processing apparatus and image processing method |
| EP2833325A1 (en) * | 2013-07-30 | 2015-02-04 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Apparatus and method for resource-adaptive object detection and tracking |
| JP2015142181A (en) * | 2014-01-27 | 2015-08-03 | キヤノン株式会社 | Control device and control method |
| US9659350B2 (en) * | 2014-01-31 | 2017-05-23 | Morpho, Inc. | Image processing device and image processing method for image correction, and non-transitory computer readable recording medium thereof |
| US9836816B2 (en) | 2014-04-05 | 2017-12-05 | Sony Interactive Entertainment America Llc | Varying effective resolution by screen location in graphics processing by approximating projection of vertices onto curved viewport |
| US10783696B2 (en) | 2014-04-05 | 2020-09-22 | Sony Interactive Entertainment LLC | Gradient adjustment for texture mapping to non-orthonormal grid |
| US10068311B2 | 2014-04-05 | 2018-09-04 | Sony Interactive Entertainment LLC | Varying effective resolution by screen location by changing active color sample count within multiple render targets |
| US9710957B2 (en) | 2014-04-05 | 2017-07-18 | Sony Interactive Entertainment America Llc | Graphics processing enhancement by tracking object and/or primitive identifiers |
| US9865074B2 (en) | 2014-04-05 | 2018-01-09 | Sony Interactive Entertainment America Llc | Method for efficient construction of high resolution display buffers |
| US9652882B2 (en) | 2014-04-05 | 2017-05-16 | Sony Interactive Entertainment America Llc | Gradient adjustment for texture mapping for multiple render targets with resolution that varies by screen location |
| JP6392370B2 (en) | 2014-04-05 | 2018-09-19 | ソニー インタラクティブ エンタテインメント アメリカ リミテッド ライアビリテイ カンパニー | An efficient re-rendering method for objects to change the viewport under various rendering and rasterization parameters |
| US11302054B2 (en) | 2014-04-05 | 2022-04-12 | Sony Interactive Entertainment Europe Limited | Varying effective resolution by screen location by changing active color sample count within multiple render targets |
| US9710881B2 (en) | 2014-04-05 | 2017-07-18 | Sony Interactive Entertainment America Llc | Varying effective resolution by screen location by altering rasterization parameters |
| US9495790B2 (en) | 2014-04-05 | 2016-11-15 | Sony Interactive Entertainment America Llc | Gradient adjustment for texture mapping to non-orthonormal grid |
| JP6587489B2 (en) * | 2015-10-07 | 2019-10-09 | キヤノン株式会社 | Image processing apparatus, image processing method, and image processing system |
| JP7123545B2 (en) * | 2017-10-30 | 2022-08-23 | キヤノン株式会社 | Information processing device, information processing method and program |
| JP6615847B2 (en) | 2017-11-08 | 2019-12-04 | 株式会社東芝 | Image processing apparatus, image processing system, image processing method, and program |
| JP7518640B2 (en) * | 2020-03-13 | 2024-07-18 | キヤノン株式会社 | Image processing device and image processing method |
| JP7642327B2 (en) | 2020-06-22 | 2025-03-10 | キヤノン株式会社 | IMAGING CONTROL DEVICE, IMAGING SYSTEM, CONTROL METHOD FOR IMAGING DEVICE, AND PROGRAM |
Family Cites Families (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4467838B2 (en) | 2001-06-15 | 2010-05-26 | パナソニック株式会社 | Image recognition apparatus and image recognition method |
| JP2003087772A (en) * | 2001-09-10 | 2003-03-20 | Fujitsu Ltd | Image control device |
| JP4468224B2 (en) | 2005-03-30 | 2010-05-26 | 株式会社日立製作所 | Elevator position detection system and method |
| JP4431532B2 (en) * | 2005-09-16 | 2010-03-17 | 富士フイルム株式会社 | Target image position detecting device and method, and program for controlling target image position detecting device |
| JP4752584B2 (en) * | 2006-04-11 | 2011-08-17 | ソニー株式会社 | Indicator light control program, information processing apparatus, and indicator light control method |
| US20090315712A1 (en) * | 2006-06-30 | 2009-12-24 | Ultrawave Design Holding B.V. | Surveillance method and system using object based rule checking |
| JP2008009341A (en) * | 2006-06-30 | 2008-01-17 | Sony Corp | Autofocus device and method, and imaging apparatus |
| US20110019873A1 (en) * | 2008-02-04 | 2011-01-27 | Konica Minolta Holdings, Inc. | Periphery monitoring device and periphery monitoring method |
| KR100977385B1 (en) * | 2008-04-10 | 2010-08-20 | 주식회사 팬택 | Mobile terminal capable of controlling a widget-type idle screen and a standby screen control method using the same |
| JP2010009134A (en) * | 2008-06-24 | 2010-01-14 | Sony Corp | Image processing system, image processing method, and program |
| JP5219697B2 (en) | 2008-08-25 | 2013-06-26 | キヤノン株式会社 | Image processing apparatus, imaging apparatus, control method for image processing apparatus, and program |
| JP4737270B2 (en) | 2008-10-31 | 2011-07-27 | 富士ゼロックス株式会社 | Image processing apparatus and program |
| JP2010128727A (en) * | 2008-11-27 | 2010-06-10 | Hitachi Kokusai Electric Inc | Image processor |
| US8509483B2 (en) * | 2011-01-31 | 2013-08-13 | Qualcomm Incorporated | Context aware augmentation interactions |
- 2011-05-20 JP JP2011114110A patent/JP5885398B2/en active Active
- 2012-04-24 US US13/454,441 patent/US9514541B2/en active Active
- 2012-05-21 CN CN201210157994.9A patent/CN102800102B/en active Active
- 2016-11-02 US US15/341,086 patent/US20170076465A1/en not_active Abandoned
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030214592A1 (en) * | 2002-03-01 | 2003-11-20 | Hiromasa Ikeyama | Image pickup device and image processing method |
| US20070126868A1 (en) * | 2005-12-06 | 2007-06-07 | Hitachi Kokusai Electric Inc. | Image processing apparatus, image processing system, and recording medium for programs therefor |
| US20070217780A1 (en) * | 2006-03-17 | 2007-09-20 | Shinichiro Hirooka | Object detection apparatus |
| US20100231713A1 (en) * | 2006-03-31 | 2010-09-16 | Matsushita Electric Industrial Co., Ltd. | Monitoring camera device |
| US20110267260A1 (en) * | 2010-04-30 | 2011-11-03 | Samsung Electronics Co., Ltd. | Interactive display apparatus and operating method thereof |
Non-Patent Citations (1)
| Title |
|---|
| Fujii Miyuki, Japanese Patent Application JP 2010-128727 * |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20190206064A1 (en) * | 2015-03-23 | 2019-07-04 | Nec Corporation | Monitoring apparatus, monitoring system, monitoring method, and computer-readable storage medium |
| US10957052B2 (en) * | 2015-03-23 | 2021-03-23 | Nec Corporation | Monitoring apparatus, monitoring system, monitoring method, and computer-readable storage medium |
| US20210174517A1 (en) * | 2015-03-23 | 2021-06-10 | Nec Corporation | Monitoring apparatus, monitoring system, monitoring method, and computer-readable storage medium |
| US11842499B2 (en) * | 2015-03-23 | 2023-12-12 | Nec Corporation | Monitoring apparatus, monitoring system, monitoring method, and computer-readable storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| US20120293486A1 (en) | 2012-11-22 |
| CN102800102B (en) | 2016-01-20 |
| CN102800102A (en) | 2012-11-28 |
| JP5885398B2 (en) | 2016-03-15 |
| JP2012243161A (en) | 2012-12-10 |
| US9514541B2 (en) | 2016-12-06 |
Similar Documents
| Publication | Title |
|---|---|
| US20170076465A1 (en) | Image processing apparatus and image processing method |
| US11756305B2 (en) | Control apparatus, control method, and storage medium |
| US10810438B2 (en) | Setting apparatus, output method, and non-transitory computer-readable storage medium |
| US9367734B2 (en) | Apparatus, control method, and storage medium for setting object detection region in an image |
| US10445887B2 (en) | Tracking processing device and tracking processing system provided with same, and tracking processing method |
| US10275639B2 (en) | Face detecting and tracking method, method for controlling rotation of robot head and robot |
| JP5925068B2 (en) | Video processing apparatus, video processing method, and program |
| KR101064573B1 (en) | System to track moving objects using particle filtration |
| JP5484184B2 (en) | Image processing apparatus, image processing method, and program |
| JP6381313B2 (en) | Control device, control method, and program |
| JP2005309746A (en) | Moving object tracking method, moving object tracking program and recording medium thereof, and moving object tracking apparatus |
| US11363241B2 (en) | Surveillance apparatus, surveillance method, and storage medium |
| JP2012150837A (en) | Gesture recognition device, gesture recognition method and program therefor |
| US9400929B2 (en) | Object detection device and method for detecting an object by performing a raster scan on a scan window |
| US20130265420A1 (en) | Video processing apparatus, video processing method, and recording medium |
| JP6991045B2 (en) | Image processing device, control method of image processing device |
| JP2005309740A (en) | Moving object tracking method, moving object tracking program and recording medium thereof, and moving object tracking apparatus |
| JPH05300516A (en) | Animation processor |
| JP6308612B2 (en) | Image processing apparatus, image processing method, and image processing program |
| JP6965419B2 (en) | Information processing equipment, information processing methods, and programs |
| EP3434624B1 (en) | Projection instruction device, parcel sorting system, and projection instruction method |
| JP2024114369A (en) | DETECTION APPARATUS, DETECTION METHOD, AND DETECTION PROGRAM |
| KR20110124441A (en) | Pan-Tilt-Zoom Camera and Object Detection Method of the Camera |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STCB | Information on status: application discontinuation | ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |