
WO2013161345A1 - Monitoring system and monitoring method - Google Patents


Info

Publication number
WO2013161345A1
WO2013161345A1 (PCT/JP2013/053277, JP2013053277W)
Authority
WO
WIPO (PCT)
Prior art keywords
monitoring
image
resolution
information
terminal device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2013/053277
Other languages
English (en)
Japanese (ja)
Inventor
照久 高野
秋彦 香西
真史 安原
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Publication of WO2013161345A1 publication Critical patent/WO2013161345A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/27Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R2021/0027Post collision measures, e.g. notifying emergency services
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/50Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the display information being shared, e.g. external display, data transfer to other traffic participants or centralised traffic controller

Definitions

  • the present invention relates to a monitoring system and a monitoring method.
  • This application claims priority based on Japanese Patent Application No. 2012-098545 filed on Apr. 24, 2012.
  • the contents described in the application are incorporated into the present application by reference and made a part of the description of the present application.
  • a security device is known that detects the occurrence of abnormalities by installing a plurality of security camera devices in shopping streets, at store entrances, at home entrances, and on other streets, and monitoring the surrounding images captured by the security camera devices (Patent Document 1).
  • an object of the present invention is to provide a monitoring system that reduces the amount of communication data and suppresses delay in information transmission by adjusting the amount of information in the monitoring image to be transmitted when the communication speed is equal to or lower than a predetermined value.
  • to achieve this object, a first image having a relatively high first resolution and a second image having a second resolution lower than the first resolution are mixed under a predetermined composition condition.
  • according to the present invention, when the communication speed is low, a monitoring image formed by mixing high-resolution information and low-resolution information at the monitoring terminal device mounted on each mobile unit is displayed on the central monitoring device, so that an increase in the amount of communication data can be suppressed while the transmission frequency of information is ensured. That is, when the communication speed is low, the central monitoring device can acquire rough information with a small amount of communication data a plurality of times instead of acquiring detailed information with a large amount of communication data once. Since the monitoring terminal devices are controlled in this way, information can be collected in real time without promoting the congestion of the communication line that causes the decrease in communication speed.
  • FIG. 1 is a schematic diagram showing a monitoring system according to an embodiment of the present invention. FIG. 2 is a block diagram showing the monitoring system of FIG. 1. FIG. 3 is a perspective view showing an example arrangement of the cameras.
  • Several further figures each show an example of the display image shown on the display of the central monitoring device.
  • the present invention is embodied in a monitoring system 1 in which a supervisor such as a police station, a fire department, or a security contractor centrally monitors the security of a town using images captured by cameras mounted on a plurality of moving bodies.
  • Each of the monitoring terminal devices 10 mounted on the plurality of moving bodies acquires position information, a monitoring image of the surroundings of the moving body, and time information at a predetermined timing, and transmits monitoring information including the position information, the monitoring image, and the time information to the central monitoring device 20 installed on the supervisor side via wireless communication.
  • the central monitoring device 20 accumulates monitoring information including at least a monitoring image and position information acquired from the monitoring terminal devices 10, superimposes the position information of each moving body on map information via a display or the like, and displays the monitoring image and time information captured at each moving body or each position. As shown in FIG. 1, the monitoring system 1 of this example includes monitoring terminal devices 10 that are mounted on moving bodies V and transmit monitoring information such as position information and monitoring images via a telecommunication network 30, and a central monitoring device 20 that acquires and processes the monitoring information.
  • when the communication speed of information received from a monitoring terminal device 10 is equal to or lower than a predetermined threshold, the monitoring system 1 according to the present embodiment transmits, via wireless communication, a monitoring information transmission command for generating a monitoring image mixed under a predetermined composition condition defined from the viewpoint of reducing the amount of communication data.
  • the monitoring terminal device 10 generates a monitoring image according to the monitoring information transmission command acquired via wireless communication, and transmits monitoring information including the monitoring image to the supervisor side via wireless communication.
  • as a result, the supervisor side acquires monitoring information including a monitoring image whose resolution has been adjusted according to the reduced communication speed, with a reduced amount of communication data.
  • the monitor can select the moving object V for which a monitoring image is to be generated based on the position of the moving object superimposed on the map information, the movement schedule of the moving object V acquired in advance, and the like.
  • the monitor can select the moving body V by pointing at the target moving body V with a pointer such as a cursor or a touch pen, or by touching the touch panel display screen with a finger. The monitor also specifies the direction they want to watch, based on the position and traveling direction of the moving body V, according to the location where an accident has occurred, a location designated by a report, a location to be monitored, or another point of interest.
  • the monitor inputs to the central monitoring device 20 information specifying the selected moving object V, information specifying one or more directions to be watched, and a change in the watching direction.
  • the monitor can input the gaze direction by designating the start point and end point, or the start point, intermediate point, and end point, of the range to be watched with a pointer such as a cursor or a touch pen, or by touching the touch panel display screen with a finger.
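As a rough illustration of this input, the designated start and end points can be converted into a range of gaze directions seen from the moving body V. The sketch below is purely illustrative; the coordinate handling and all names are assumptions, not taken from the patent.

```python
# Illustrative sketch (not from the patent): convert a designated start
# point and end point into a range of gaze directions, expressed as angles
# from the moving body's position.
import math


def gaze_range(vehicle_pos, start_point, end_point):
    """Angles (degrees) from the vehicle to the start and end points."""
    vx, vy = vehicle_pos
    angles = sorted(
        math.degrees(math.atan2(py - vy, px - vx))
        for px, py in (start_point, end_point))
    return tuple(angles)


# A point straight ahead (east) and one to the side (north) span 0..90 deg.
lo, hi = gaze_range((0.0, 0.0), (10.0, 0.0), (0.0, 10.0))
print(round(lo), round(hi))  # 0 90
```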
  • the mobile body V on which the monitoring terminal device 10 is mounted is not particularly limited as long as it travels in the target monitoring area, and includes mobile bodies V such as automobiles, motorcycles, industrial vehicles, and trams.
  • automobiles include private automobiles V2 and emergency automobiles V3, and in particular, taxis and route buses V1 that travel randomly and constantly in a predetermined area are preferably included.
  • FIG. 1 illustrates a taxi V1, a private vehicle V2, and an emergency vehicle V3 such as a police car, fire engine, or ambulance; these are collectively referred to as the moving body V or the passenger vehicle V.
  • FIG. 2 is a block diagram illustrating a specific configuration of the central monitoring device 20 and the monitoring terminal device 10.
  • the monitoring terminal device 10 and the central monitoring device 20 can communicate via the telecommunication network 30.
  • the communication device 13 of the monitoring terminal device 10 is a communication means capable of wireless communication, and exchanges information with the communication device 23 of the central monitoring device 20 via the telecommunication network 30.
  • when the telecommunication network 30 is a commercial telephone network, widely used mobile phone communication devices can be used; when the telecommunication network 30 is a dedicated network for the monitoring system 1 of this example, dedicated communication devices 13 and 23 can be used. A wireless LAN, WiFi (registered trademark), WiMAX (registered trademark), Bluetooth (registered trademark), a dedicated wireless line, or the like can also be used.
  • in order to monitor the town using the images captured by the cameras of the moving bodies V, the central monitoring device 20 has an input function for entering into a database the position information and monitoring images transmitted from the monitoring terminal devices 10 mounted on the moving bodies V, and a display control function for displaying the received monitoring images on the display 24 while superimposing the received position information on map information read from a map database.
  • the central monitoring device 20 of this embodiment has a function of selecting, with reference to the positions of the moving bodies V presented on the map information to the monitor, the moving body V from which a monitoring image is to be acquired, and generating a monitoring information transmission command for the monitoring terminal device 10 of the selected moving body V.
  • the central monitoring device 20 of the present embodiment has a communication speed calculation function for calculating the communication speed of information transmitted from the monitoring terminal device 10; a command generation function for generating, when the calculated communication speed is equal to or less than a predetermined threshold, a monitoring information transmission command that causes the monitoring terminal device 10 to generate a monitoring image in which a first image having a relatively high first resolution and a second image having a second resolution lower than the first resolution are mixed under a predetermined composition condition, and to transmit the monitoring information including the generated monitoring image to the central monitoring device 20; and a command transmission function for transmitting the generated monitoring information transmission command to the selected monitoring terminal device 10.
  • the central monitoring device 20 includes a central control device 21, an image processing device 22, a communication device 23, a display 24, and an input device 25.
  • the central monitoring device 20 according to the present embodiment has a database for storing monitoring information inside the central monitoring device 20, but may be provided outside the central monitoring device 20 as long as it is accessible.
  • the central control device 21 of the central monitoring device 20 is configured by a CPU, a ROM, and a RAM, and controls the image processing device 22, the communication device 23, and the display 24 so as to receive the position information, the monitoring image, and the time information transmitted from the monitoring terminal device 10, apply image processing as necessary, and display the result on the display 24.
  • the central controller 21 calculates a communication speed when communication with a certain mobile unit V is performed.
  • the communication speed of the present embodiment is a data transfer speed between the communication device 23 of the central monitoring device 20 and the communication device 13 of the monitoring terminal device 10.
  • the communication speed of the present embodiment can be determined based on the frame rate of the monitoring image received from the monitoring terminal device 10 on the central monitoring device 20 side.
  • the frame rate is the number of images processed (rewritten) per unit time in image display and moving image playback. If the communication speed between the communication device 23 of the central monitoring device 20 and the communication device 13 of the monitoring terminal device 10 is high, the frame rate related to the processing of the monitoring image received from the monitoring terminal device 10 on the central monitoring device 20 side becomes high; if the communication speed between the two is low, that frame rate becomes low. The communication speed can therefore be evaluated based on the frame rate on the central monitoring device 20 side.
  • the term “communication speed when the central monitoring device 20 and the monitoring terminal device 10 exchange information” includes “a frame rate related to processing of a monitoring image in the central monitoring device 20”. That is, in this embodiment, whether or not the communication speed is equal to or lower than the predetermined threshold can be determined based on whether or not the frame rate related to the monitoring image processing of the central monitoring device 20 is equal to or lower than the predetermined threshold.
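As a concrete illustration of this frame-rate test, the sketch below estimates a frame rate from the arrival times of received monitoring images and compares it with a threshold. All names and the 10 fps threshold are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch (not from the patent): judge whether the communication
# speed is low from the frame rate of received monitoring images.

FRAME_RATE_THRESHOLD_FPS = 10.0  # hypothetical predetermined threshold


def estimate_frame_rate(arrival_times):
    """Frames processed per second, estimated from frame arrival times."""
    if len(arrival_times) < 2:
        return 0.0
    elapsed = arrival_times[-1] - arrival_times[0]
    return (len(arrival_times) - 1) / elapsed if elapsed > 0 else 0.0


def communication_speed_is_low(arrival_times,
                               threshold=FRAME_RATE_THRESHOLD_FPS):
    """True when the frame rate is at or below the predetermined threshold."""
    return estimate_frame_rate(arrival_times) <= threshold


# Ten frames arriving 0.2 s apart give 5 fps, below a 10 fps threshold.
print(communication_speed_is_low([0.2 * i for i in range(10)]))  # True
```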
  • the communication speed can also be determined based on the time T from the timing at which the communication device 13 of the monitoring terminal device 10 transmits data to the communication device 23 of the central monitoring device 20 until the timing at which it receives a flag indicating completion of data reception from the communication device 23. If the communication speed between the communication device 23 of the central monitoring device 20 and the communication device 13 of the monitoring terminal device 10 is high, the time T from information transmission to receipt of the reception notification is short; if it is low, the time T becomes long. The communication speed can therefore be evaluated based on the time T required for completion of information transmission in the monitoring terminal device 10.
  • the term “communication speed when the central monitoring device 20 and the monitoring terminal device 10 exchange information” in this specification also includes “the time required for the monitoring terminal device 10 to transmit information to the central monitoring device 20”. That is, in the present embodiment, whether or not the communication speed is equal to or lower than the predetermined threshold can be determined by whether or not the time from when the monitoring terminal device 10 transmits information to the central monitoring device 20 until it receives a signal of information reception completion from the central monitoring device 20 is equal to or greater than a predetermined threshold.
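The time-based test can be sketched in the same spirit: measure the elapsed time T from sending data until the reception-complete flag arrives, and compare it with a threshold. The helper names and the 2-second threshold are illustrative assumptions, not from the patent.

```python
# Illustrative sketch (not from the patent): judge whether the communication
# speed is low from the time T between sending data and receiving the
# reception-complete flag from the central monitoring device.
import time

TIME_T_THRESHOLD_S = 2.0  # hypothetical predetermined threshold for T


def measure_time_t(send_and_wait_for_flag):
    """Elapsed time T for a call that blocks until the reception-complete
    flag arrives back from the central monitoring device."""
    start = time.monotonic()
    send_and_wait_for_flag()
    return time.monotonic() - start


def communication_speed_is_low(time_t, threshold=TIME_T_THRESHOLD_S):
    """True when T is at or above the predetermined threshold."""
    return time_t >= threshold


print(communication_speed_is_low(2.5))  # True
print(communication_speed_is_low(0.3))  # False
```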
  • the predetermined threshold regarding the communication speed can be set appropriately depending on the performance of the communication device 23 of the central monitoring device 20 and the communication device 13 of the monitoring terminal device 10, the communication environment around the location of the monitoring terminal device 10, and the communication environment around the location of the central monitoring device 20.
  • the threshold for evaluating the frame rate is a threshold for determining whether or not the communication speed is decreasing, and its technical significance is the same as that of the threshold for evaluating the communication speed. However, since the frame rate value has a meaning different from the communication speed, the threshold for evaluating the frame rate is set independently of the threshold for evaluating the communication speed. For the same reason, the threshold for evaluating the time from information transmission to transmission completion confirmation in the monitoring terminal device 10 is also set independently of the threshold for evaluating the communication speed.
  • when the communication speed at which the central monitoring device 20 and the monitoring terminal device 10 exchange information is equal to or below a predetermined threshold, the central control device 21 selects the moving body V, a moving body V existing in the vicinity of the moving body V, or a moving body V belonging to a predefined area to which the moving body V belongs, generates a monitoring information transmission command that causes the monitoring terminal device 10 of the selected moving body V to generate a monitoring image under a predetermined composition condition and to transmit the generated monitoring image to the central monitoring device 20, and transmits the command to the selected monitoring terminal device 10.
  • the case where the communication speed is equal to or below the predetermined threshold includes the case where the frame rate related to the monitoring image processing in the central monitoring device 20 is equal to or less than a predetermined value, and the case where the time from information transmission to transmission completion confirmation in the monitoring terminal device 10 is equal to or greater than a predetermined value.
  • the monitoring information transmission command includes a communication ID and other identifiers for identifying the monitoring terminal device 10 of the selected moving object V.
  • Selection of the moving body V can be performed based on a selection command for the moving body V input from the input device 25.
  • the moving body V whose communication speed is equal to or lower than a predetermined threshold, that is, the moving body V to which the monitoring information transmission command is to be transmitted, may be determined individually for each moving body V based on the actual communication speed with the central monitoring device 20.
  • alternatively, a plurality of moving bodies V may be selected together based on position information, or all moving bodies V existing in the communication area covered by a base station where the communication speed is reduced may be selected together. Further, one or a plurality of moving bodies V may be selected by automatically extracting moving bodies V existing in the vicinity of an accident occurrence point input from the outside, in the vicinity of a moving body V reporting the occurrence of an accident, or in the vicinity of a priority monitoring point defined in advance as a point where monitoring is important.
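The vicinity-based extraction described above can be sketched as a simple radius query over vehicle positions. Coordinates are plain x/y values in metres and all names are illustrative assumptions, not from the patent.

```python
# Illustrative sketch (not from the patent): select the moving bodies whose
# positions lie within a given radius of a point of interest, such as an
# accident site or a priority monitoring point.
import math


def select_nearby_vehicles(positions, point, radius_m):
    """Return the IDs of vehicles within radius_m of the given point."""
    px, py = point
    return [vid for vid, (x, y) in positions.items()
            if math.hypot(x - px, y - py) <= radius_m]


positions = {"taxi_V1": (100.0, 50.0),
             "private_V2": (900.0, 400.0),
             "ambulance_V3": (120.0, 80.0)}
print(select_nearby_vehicles(positions, (110.0, 60.0), 50.0))
```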
  • the monitoring terminal device 10 mounted on the moving body V selected by the monitor generates a monitoring image according to the composition condition included in the monitoring information transmission command, using the cameras 11a to 11d provided at different positions on the moving body V, and transmits monitoring information including the monitoring image to the central monitoring device 20. In this way, the central monitoring device 20 and the monitoring terminal device 10 of the moving body V cooperate to provide monitoring information for monitoring the town.
  • the image processing device 22 has a map database, displays map information from the map database on the display 24, and superimposes the position information detected by the position detection device 15 of the monitoring terminal device 10 on the map information. It also performs image processing for displaying, on the display 24, the monitoring image captured by the camera 11 of the monitoring terminal device 10 and processed by the image processing device 12.
  • the display 24 can be composed of, for example, a liquid crystal display device having a size capable of displaying two or more window screens on one screen, or two or more liquid crystal display devices each displaying two or more window screens.
  • One window screen displays the position information of each moving body V superimposed on the map information (see FIG. 1), and the other window screen displays a monitoring image generated based on the images captured by the cameras 11 of the moving body V.
  • the input device 25 is an input device such as a keyboard, a mouse, or a touch panel.
  • the input device 25 is used to specify a desired moving body V, to output a monitoring information transmission command to that moving body V, and to input processing commands for the various information displayed on the display 24.
  • the communication device 23 is a communication means capable of wireless communication, and exchanges information with the communication device 13 of the monitoring terminal device 10 via the telecommunication network 30.
  • when the telecommunication network 30 is a commercial telephone network, widely used mobile phone communication devices can be used; when the telecommunication network 30 is a dedicated network for the monitoring system 1 of this example, dedicated communication devices 13 and 23 can be used.
  • the monitoring terminal device 10 is a terminal device mounted on each of the plurality of moving bodies V.
  • the monitoring terminal device 10 has a position detection function for detecting the position information of its moving body V, a monitoring image generation function for generating a monitoring image based on the images of the surroundings of the moving body V captured by the cameras 11, a communication function for transmitting the position information, the monitoring image, and the time information acquired at a predetermined timing to the central monitoring device 20, and a communication function for receiving commands from the central monitoring device 20. For this purpose, each moving body V includes a plurality of cameras 11a to 11d, an image processing device 12, a communication device 13, a control device 14, and a position detection device 15.
  • the time information is mainly information used for post-event analysis, and may be omitted.
  • the plurality of cameras 11 mounted on the respective moving bodies V are constituted by CCD cameras or the like, take images of respective predetermined directions or predetermined areas around the moving body V, and output the image pickup signals to the image processing device 12.
  • the camera 11 of this embodiment can set the resolution, and can image the surroundings of the vehicle V with a predetermined resolution.
  • the image processing device 12 reads an imaging signal from the camera 11 and executes image processing for generating a monitoring image. Details of this image processing will be described later.
  • the position detection device 15 includes a GPS device and its correction device, and detects the current position of the moving object V and outputs it to the control device 14.
  • the control device 14 includes a CPU, a ROM, and a RAM. It receives the monitoring information transmission command from the central monitoring device 20 via the telecommunication network 30 and the communication device 13, controls the camera 11, the image processing device 12, the communication device 13, and the position detection device 15, and outputs the monitoring image generated by the image processing device 12, the position information of the moving body V detected by the position detection device 15, and the time information from the clock built into the CPU to the central monitoring device 20 via the communication device 13 and the telecommunication network 30.
  • in generating the monitoring image, the control device 14 follows the monitoring information transmission command acquired from the central monitoring device 20 via the communication device 13 and executes a monitoring image generation function that causes the image processing device 12 to generate, based on the predetermined composition information included in the command, a monitoring image including a first image having a relatively high first resolution and a second image having a second resolution lower than the first resolution, and a monitoring information transmission function that transmits the monitoring information including this monitoring image to the central monitoring device 20 via the communication device 13.
  • in the present embodiment, a monitoring image including at least a first image having a relatively high first resolution and a second image having a relatively low second resolution is described, but images of other resolutions may further be included.
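The mixed-resolution composition can be sketched as follows: only the image for the direction being watched keeps the first (high) resolution, while the remaining camera images are reduced to the second (lower) resolution before transmission. The downsampling method and all names are assumptions, not taken from the patent.

```python
# Illustrative sketch (not from the patent): compose a monitoring image set
# in which only the watched direction keeps the high resolution.


def downsample(image, factor=2):
    """Naive reduction to a lower resolution: keep every factor-th pixel."""
    return [row[::factor] for row in image[::factor]]


def compose_monitoring_images(camera_images, watched_direction):
    """High resolution for the watched direction, low for the others."""
    return {d: (img if d == watched_direction else downsample(img))
            for d, img in camera_images.items()}


dummy = [[0] * 4 for _ in range(4)]  # a 4x4 stand-in for a camera image
composed = compose_monitoring_images(
    {"front": dummy, "left": dummy, "rear": dummy, "right": dummy},
    watched_direction="front")
print(len(composed["front"]), len(composed["left"]))  # 4 2
```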
  • before receiving the monitoring information transmission command from the central monitoring device 20, the control device 14 causes the image processing device 12 to generate a monitoring image having a preset resolution and transmits it to the central monitoring device 20 via the communication device 13.
  • the cameras 11a to 11d are configured by using an image sensor such as a CCD, and the four cameras 11a to 11d are installed at different positions outside the passenger car V, respectively, and respectively photograph the four directions of front, rear, left and right around the moving body V.
  • the camera 11a, installed at a predetermined position at the front of the passenger car V such as the front grille, photographs objects and the road surface (front view) existing in the area SP1 in front of the passenger car V and in the space ahead of it.
  • the camera 11c, installed at a predetermined position at the rear of the passenger car V such as the rear finisher or the roof spoiler, photographs objects and the road surface (rear view) existing in the area SP3 behind the passenger car V and in the space behind it.
  • FIG. 4 is a view of the arrangement of the cameras 11a to 11d as viewed from above the passenger car V.
  • the camera 11a that captures the area SP1, the camera 11b that captures the area SP2, the camera 11c that captures the area SP3, and the camera 11d that captures the area SP4 are installed along the outer periphery VE of the body of the passenger car V. Going around in one direction, the camera 11b is installed to the left of the camera 11a, the camera 11c to the left of the camera 11b, the camera 11d to the left of the camera 11c, and the camera 11a to the left of the camera 11d. Going around in the opposite direction, the camera 11d is installed to the right of the camera 11a, the camera 11c to the right of the camera 11d, the camera 11b to the right of the camera 11c, and the camera 11a to the right of the camera 11b.
  • FIG. 5A shows an example of an image GSP1 in which the front camera 11a images the area SP1, FIG. 5B shows an example of an image GSP2 in which the left side camera 11b images the area SP2, FIG. 5C shows an example of an image GSP3 in which the rear camera 11c images the area SP3, and FIG. 5D shows an example of an image GSP4 in which the right side camera 11d images the area SP4.
  • the size of each image in the present embodiment is 480 vertical pixels × 640 horizontal pixels or 960 vertical pixels × 1280 horizontal pixels.
  • an image having a relatively high first resolution of height 960 pixels × width 1280 pixels is set as the first image, and an image having a relatively low second resolution of height 480 pixels × width 640 pixels is set as the second image.
  • the image size is not particularly limited, and may be any size that can be played back by a general terminal device at two relatively different resolutions.
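The two sizes above also indicate the data-amount saving: a 1280 × 960 first image carries four times as many raw pixels as a 640 × 480 second image, which is why substituting second-resolution images reduces the communication data amount. A one-line check (illustrative only):

```python
# Pixel-count ratio between the first (1280x960) and second (640x480)
# resolutions mentioned in the text.
ratio = (1280 * 960) / (640 * 480)
print(ratio)  # 4.0
```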
  • the number of cameras 11 and the positions of the cameras 11 can be appropriately determined according to the size, shape, detection area setting method, etc. of the passenger car V.
  • the plurality of cameras 11 described above are assigned identifiers according to their arrangement, and the control device 14 can identify each of the cameras 11 based on each identifier. Further, the control device 14 can transmit an imaging command and other commands to the specific camera 11 by attaching an identifier to the command signal.
  • the control device 14 controls the image processing device 12 to acquire the image signals picked up by the cameras 11; the image processing device 12 processes the imaging signal from each camera 11 and converts it into the monitoring images shown in FIGS. 5A to 5D. The control device 14 then generates a monitoring image based on the four monitoring images shown in FIGS. 5A to 5D (monitoring image generation function), associates with the monitoring image mapping information for projecting it onto the projection plane set on the side surface of a columnar projection model (mapping information addition function), and outputs the result to the central monitoring device 20.
  • the monitoring image generation function and the mapping information addition function will be described in detail.
  • the monitoring image is generated on the basis of the four monitoring images obtained by imaging the periphery of the passenger car V, and the process of associating the mapping information with the monitoring image may be executed by the monitoring terminal device 10, as in this example, or by the central monitoring device 20. In the latter case, the four monitoring images obtained by imaging the periphery of the passenger car V are transmitted as they are from the monitoring terminal device 10 to the central monitoring device 20, and the image processing device 22 and the central control device 21 of the central monitoring device 20 generate the monitoring image, associate the mapping information, and perform the projection conversion.
  • the control device 14 of the monitoring terminal device 10 controls the image processing device 12 to acquire the imaging signals of the respective cameras 11a to 11d, and generates one monitoring image in which the monitoring images of the cameras 11a to 11d, installed clockwise or counterclockwise along the outer periphery of the body of the passenger car V, are arranged in the installation order of these cameras 11a to 11d. In the present embodiment, the four cameras 11a to 11d are installed in the order of the cameras 11a, 11b, 11c, and 11d in the counterclockwise direction along the outer periphery VE of the body of the passenger car V; the control device 14 therefore connects the four images captured by the cameras 11a to 11d in the horizontal direction in accordance with this installation order (cameras 11a → 11b → 11c → 11d) to generate one monitoring image.
  • the images are arranged such that the ground contact surface (road surface) of the passenger vehicle V is the lower side, and the images are connected to each other at sides in the height direction (vertical direction) with respect to the road surface.
  • FIG. 6 is a diagram illustrating an example of the monitoring image K.
  • the monitoring image K of the present embodiment is an image in which, along the direction P from the left side to the right side of the drawing, the captured image GSP1 in which the front camera 11a images the area SP1, the captured image GSP2 in which the left side camera 11b images the area SP2, the captured image GSP3 in which the rear camera 11c images the area SP3, and the captured image GSP4 in which the right side camera 11d images the area SP4 are arranged in this order in the horizontal direction, and these four images are treated as a series of images.
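• The horizontal concatenation described above can be sketched in a few lines (a minimal illustration, assuming same-height images modeled as lists of pixel rows; all names are ours, not the patent's):

```python
# Sketch of the monitoring-image stitching: four same-height camera images
# are concatenated side by side in the cameras' installation order
# 11a -> 11b -> 11c -> 11d.  Images are modeled as lists of pixel rows.

def stitch_monitoring_image(images):
    """Concatenate same-height images horizontally, left to right."""
    height = len(images[0])
    assert all(len(img) == height for img in images), "heights must match"
    return [sum((img[row] for img in images), []) for row in range(height)]

# Four tiny 2x3 stand-ins for GSP1..GSP4, each filled with its camera index.
gsp = [[[i] * 3 for _ in range(2)] for i in range(1, 5)]
monitoring_k = stitch_monitoring_image(gsp)
# monitoring_k is 2 rows x 12 columns: GSP1 pixels, then GSP2, GSP3, GSP4.
```

Because the concatenation preserves the installation order, the left-to-right order of pixels in K directly encodes the counterclockwise order of the cameras around the body.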
  • since the monitoring image K generated in this way is displayed in order from the left end to the right with the image corresponding to the road surface (the ground contact surface of the moving body V) facing down, the supervisor can view it on the display 24 in the same manner as when looking around the moving body V clockwise. As described above, the monitoring image K in the mode shown in FIG. 6 can be reproduced in the expected arrangement order without the positional relationship becoming disordered, even when processing such as transmission and reception is performed.
  • when one monitoring image K is generated, four images acquired at substantially the same photographing timing of the cameras 11a to 11d are used. Since the information included in the monitoring image K is thereby synchronized, the situation around the moving body V at a given timing can be accurately expressed.
  • the monitoring images K, each generated from captured images with substantially the same imaging timing of the cameras 11, may also be stored over time so as to generate a moving-image monitoring image K including a plurality of monitoring images K per predetermined unit time. By generating the moving-image monitoring image K from images with the same imaging timing, changes in the situation around the moving body V can be accurately represented.
  • a conventional central monitoring device has the disadvantage that it cannot simultaneously watch images (moving images) in a plurality of directions and cannot monitor the entire periphery of the moving body V on one screen. In contrast, since the control device 14 of the present embodiment generates one monitoring image K from a plurality of images, moving images of different imaging directions can be played back simultaneously regardless of the functions of the central monitoring device 20. That is, by continuously reproducing the monitoring image K (moving image reproduction), the four images included in the monitoring image K are reproduced simultaneously, and changes in the state of regions in different directions can be monitored on one screen.
  • the first image of the first resolution of the present embodiment includes the images of 960 × 1280 pixels shown in FIGS. 5A to 5D, as well as the monitoring image K in which these four images are combined into one image and compressed to 960 × 1280 pixels as shown in FIG. 6; similarly, the second image of the second resolution includes the images of 480 × 640 pixels shown in FIGS. 5A to 5D, as well as the monitoring image K in which these four images are combined into one image and compressed to 1280 × 240 pixels.
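• The relation between the first-resolution and second-resolution images (halving of the linear resolution) can be illustrated as follows. This is only a sketch: the patent does not specify the scaling method, so a naive nearest-neighbour decimation is assumed here, on a toy image instead of the full 960 × 1280 one:

```python
def downsample(image, factor=2):
    """Keep every `factor`-th row and column (nearest-neighbour decimation)."""
    return [row[::factor] for row in image[::factor]]

# Toy stand-in: an 8x12 "first image"; halving its linear resolution gives
# a 4x6 "second image", mirroring the 960x1280 -> 480x640 relation above.
first = [[(r, c) for c in range(12)] for r in range(8)]
second = downsample(first)
```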
  • the control device 14 of the monitoring terminal device 10 can also attach a line figure indicating the boundary between the arranged images to the monitoring image K.
  • specifically, as line figures indicating the boundaries between the arranged images, the control device 14 can attach rectangular partition images Bb, Bc, Bd, Ba, and Ba′ between the images to the monitoring image K.
  • the partition image functions as a frame of each captured image.
  • since the image distortion is large in the vicinity of the boundaries of the captured images, arranging the partition images at the boundaries of the captured images makes it possible to hide the regions with large distortion or to suggest that the distortion there is large.
  • the control device 14 can generate the monitoring image K after correcting the distortion that arises when the four images are projected onto the projection plane set on the side surface of the projection model described later. Since image distortion is likely to occur in the vicinity of the boundaries of the captured images, it is desirable to correct the distortion of the captured images using an image conversion algorithm and a correction amount defined in advance.
  • the control device 14 reads from the ROM information on the same projection model as the projection model used for projecting the monitoring image K in the central monitoring device 20, projects the captured images onto the projection plane of that projection model, and can thereby correct in advance the distortion generated on the projection plane.
  • the image conversion algorithm and the correction amount can be appropriately defined according to the characteristics of the camera 11 and the shape of the projection model. By correcting in advance the distortion that arises when the monitoring image K is projected onto the projection plane of the projection model in this way, a monitoring image K with little distortion and good visibility can be provided. Correcting the distortion in advance also reduces the positional deviation between the images arranged side by side.
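• The predefined "image conversion algorithm and correction amount" can be sketched as a remap table computed in advance for a given camera and projection model. The table below is a made-up toy example (a horizontal mirror), not real calibration data:

```python
def correct_distortion(image, remap):
    """Build the corrected image: output[y][x] = image[sy][sx], where the
    (sy, sx) source coordinates come from a precomputed remap table."""
    return [[image[sy][sx] for (sy, sx) in row] for row in remap]

# A toy 2x2 remap that mirrors the image horizontally.
image = [[1, 2],
         [3, 4]]
remap = [[(0, 1), (0, 0)],
         [(1, 1), (1, 0)]]
corrected = correct_distortion(image, remap)
```

The design point is that, because the table is fixed per camera and projection model, the per-frame cost is a single lookup pass, which suits continuous moving-image generation.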
  • next, the mapping information addition function will be described.
  • the control device 14 executes a process of associating with the monitoring image K mapping information for projecting the generated monitoring image K onto the projection plane set on the side surface of a columnar projection model M whose bottom surface is the ground contact surface of the passenger car V.
  • the mapping information is information for allowing the central monitoring device 20 that has received the monitoring image K to easily recognize the projection reference position.
  • FIG. 8 is a diagram showing an example of the projection model M of the present embodiment
  • FIG. 9 is a schematic sectional view taken along the xy plane of the projection model M shown in FIG.
  • the projection model M of the present embodiment is a regular octagonal prism body having a regular octagonal bottom surface and a height along the vertical direction (z-axis direction in the figure).
  • the shape of the projection model M is not particularly limited as long as it is a column having side surfaces adjacent to each other along the boundary of the bottom surface; a cylinder, a prism such as a triangular prism, a quadrangular prism, or a hexagonal prism, or an antiprism having a polygonal bottom surface and triangular side surfaces can also be used.
  • the bottom surface of the projection model M of this embodiment is parallel to the ground contact surface of the passenger car V.
  • Projection surfaces Sa, Sb, Sc, and Sd (hereinafter collectively referred to as the projection surface S), onto which the image of the surroundings of the passenger car V in contact with the bottom surface of the projection model M is projected, are set on the inner surface of the side of the projection model M.
  • the projection surface S may also include a region spanning a part of the projection surface Sa and a part of the projection surface Sb, a part of Sb and a part of Sc, a part of Sc and a part of Sd, and a part of Sd and a part of Sa.
  • the monitoring image K is projected onto the projection surface S as an image of the passenger car V as viewed from viewpoints R (R1 to R8; hereinafter collectively referred to as the viewpoint R) set above the projection model M so as to surround the passenger car V.
  • the control device 14 also associates, as mapping information, the reference coordinates of the captured images arranged at the right end and the left end with the monitoring image K. Specifically, as mapping information (reference coordinates) indicating the start end position and the end position of the monitoring image K when projected onto the projection model M, the control device 14 attaches to the monitoring image K the coordinates A (x, y) of the upper left vertex of the captured image GSP1 arranged at the left end and the coordinates B (x, y) of the upper right vertex of the captured image GSP4 arranged at the right end.
  • the reference coordinates of the captured images indicating the start position or the end position are not particularly limited, and may instead be, for example, the lower left vertex of the captured image arranged at the left end or the lower right vertex of the captured image arranged at the right end.
  • the mapping information may be attached to each pixel of the image data of the monitoring image K, or may be managed as a file different from the monitoring image K.
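• As noted above, the mapping information may travel as a file separate from the image data. One way to sketch that is a small JSON record carrying the start/end reference coordinates; the field names below are illustrative, not from the patent:

```python
import json

def make_mapping_info(width, height):
    """Reference coordinates marking the start and end of the monitoring
    image K: upper-left vertex of the left-end image and upper-right
    vertex of the right-end image."""
    return {
        "start": {"x": 0, "y": 0},          # coordinates A (upper left)
        "end":   {"x": width - 1, "y": 0},  # coordinates B (upper right)
        "size":  {"w": width, "h": height},
    }

payload = json.dumps(make_mapping_info(1280, 240))
```

Keeping the mapping information in a sidecar record like this leaves the image stream untouched and lets the receiving side validate the projection reference before unpacking pixels.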
  • by associating the information indicating the start position or the end position of the monitoring image K with the monitoring image K as mapping information in this way, the central monitoring device 20 that receives the monitoring image K can easily recognize the reference position for the projection process, and can therefore sequentially and easily project the monitoring image K, in which the images are arranged in the installation order of the cameras 11a to 11d, onto the projection surface S on the side surface of the projection model M. That is, as shown in FIG. 9, the captured image GSP1 in front of the moving body V is projected onto the projection surface Sa located in the imaging direction of the camera 11a, the captured image GSP2 is projected onto the projection surface Sb located in the imaging direction of the camera 11b, the captured image GSP3 behind the moving body V is projected onto the projection surface Sc located in the imaging direction of the camera 11c, and the captured image GSP4 is projected onto the projection surface Sd located in the imaging direction of the camera 11d.
  • in other words, the monitoring image K projected onto the projection model M can show an image that appears as if one is looking around the passenger car V.
  • since the monitoring image K, in which the four images are arranged in a line in the horizontal direction in accordance with the installation order of the cameras 11a to 11d, is projected onto the side surfaces, likewise arranged in the horizontal direction, of the column of the projection model M, the image around the passenger car V can be reproduced in the monitoring image K projected onto the projection surface S of the columnar projection model M while maintaining its positional relationship.
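• The receiving side's inverse step, splitting K back into per-surface strips, can be sketched as follows (a simplification assuming four equal-width images and illustrative surface names Sa–Sd):

```python
def assign_to_projection_planes(monitoring_image, planes=("Sa", "Sb", "Sc", "Sd")):
    """Split K into equal-width vertical strips, one per projection
    surface, preserving the camera installation order."""
    width = len(monitoring_image[0])
    strip = width // len(planes)
    return {name: [row[i * strip:(i + 1) * strip] for row in monitoring_image]
            for i, name in enumerate(planes)}

k = [[1, 1, 2, 2, 3, 3, 4, 4]]       # one-row stand-in for K (GSP1..GSP4)
planes = assign_to_projection_planes(k)
```

Because the strip order matches the camera order, each strip lands on the surface facing that camera's imaging direction without any per-strip metadata beyond the start-end reference coordinates.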
  • the control device 14 of the present embodiment can store the correspondence between the coordinate values of the monitoring image K and the coordinate values of the projection surfaces S of the projection model M as mapping information and attach it to the monitoring image K; alternatively, this correspondence may be stored in the central monitoring device 20 in advance.
  • the positions of the viewpoint R and the projection plane S shown in FIGS. 8 and 9 are examples, and can be arbitrarily set.
  • the viewpoint R can be changed by the operation of the operator.
  • the relationship between the viewpoint R and the projection position of the monitoring image K is defined in advance, and when the position of the viewpoint R is changed, a predetermined coordinate transformation is performed so that the monitoring image K as viewed from the newly set viewpoint R can be projected onto the projection surface S (Sa to Sd). A known method can be used for this viewpoint conversion processing.
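• One piece of the predefined viewpoint-to-projection correspondence can be sketched as picking the facing surface plus its two neighbours for a given viewing angle, as in FIG. 10 where viewpoint R1 shows Sd, Sa, and Sb. The 90° spacing and the angle convention below are assumptions for illustration only:

```python
def visible_planes(view_angle_deg, planes=("Sa", "Sb", "Sc", "Sd")):
    """Return the surface facing the viewpoint and its two neighbours,
    assuming the named surfaces are spaced evenly around the model."""
    n = len(planes)
    idx = round(view_angle_deg / (360 / n)) % n
    return (planes[(idx - 1) % n], planes[idx], planes[(idx + 1) % n])
```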
  • as described above, the control device 14 generates the monitoring image K based on the monitoring images captured at a predetermined timing, associates the mapping information, the reference coordinates, and the information on the line figures (partition images) indicating the boundaries with the monitoring image K, and stores them over time according to the imaging timing.
  • the control device 14 may store the monitoring image K as one moving image file including a plurality of monitoring images K per predetermined unit time, or may store the monitoring image K in a form that can be transferred and reproduced by a streaming method.
  • the control device 14 of this embodiment can include in the monitoring information the moving speed acquired from a device that detects the moving speed of the moving body V, in this example the vehicle speed sensor 16 provided in the vehicle. This moving speed (vehicle speed) is used when the composition condition included in the monitoring information transmission command is set on the central monitoring device 20 side.
  • the communication device 23 of the central monitoring device 20 receives the monitoring image K transmitted from the monitoring terminal device 10 and the mapping information associated with the monitoring image K.
  • in the monitoring image K, the images of the four cameras 11 installed at different positions on the body of the passenger car V are arranged according to the installation order of the cameras 11a to 11d (the clockwise or counterclockwise order along the outer periphery of the body of the moving body V), and the monitoring image K is associated with mapping information for projecting the monitoring image K onto the projection surface S of the octagonal prism projection model M.
  • the communication device 23 transmits the acquired monitoring image K and mapping information to the image processing device 22.
  • the image processing device 22 reads the projection model M stored in advance and, based on the mapping information, generates a display image by projecting the monitoring image K onto the projection surfaces Sa to Sd set on the side surface of the octagonal prism projection model M whose bottom surface is the ground contact surface of the passenger car V shown in FIGS. 8 and 9. Specifically, according to the mapping information, each pixel of the received monitoring image K is projected onto each pixel of the projection surfaces Sa to Sd. Further, when projecting the monitoring image K onto the projection model M, the image processing device 22 recognizes the starting point of the monitoring image K (the right end or the left end of the monitoring image K) based on the reference coordinates received together with the monitoring image K, and performs the projection processing so that this starting point coincides with the starting point (the right end or the left end of the projection surface S) defined in advance on the projection model M. Furthermore, when projecting the monitoring image K onto the projection model M, the image processing device 22 arranges the line figures (partition images) indicating the boundaries of the images on the projection model M.
  • the partition image can be attached to the projection model M in advance, or can be attached to the monitoring image K after the projection processing.
  • the display 24 or the touch panel display 24 having a display function displays the monitoring image K projected on the projection surface S of the projection model M.
  • 10 to 17 show examples of display images of the monitoring image K.
  • FIG. 10 shows the monitoring image K projected onto the projection surfaces Sd, Sa, and Sb as viewed from the viewpoint R1 shown in FIGS. 8 and 9. An image of the moving body V viewed from each viewpoint R is pasted on the bottom surface of the projection model M, and the portions where no image is displayed between the projection surfaces Sd, Sa, and Sb are the “line figures indicating boundaries (partition images)”.
  • similarly, FIG. 11 shows the monitoring image K viewed from the viewpoint R2, FIG. 12 from the viewpoint R3, FIG. 13 from the viewpoint R4, FIG. 14 from the viewpoint R5, FIG. 15 from the viewpoint R6, FIG. 16 from the viewpoint R7, and FIG. 17 from the viewpoint R8.
  • since the terminal device 800 arranges the captured images of the cameras 1 installed on the body of the moving body V side by side (laterally) along the x-axis direction or the y-axis direction according to the installation order of the cameras 1, and the resulting monitoring image K is mapped (laterally) along the side surface of the columnar projection model M in the same order, the monitoring image K shown on the projection model M presents an image as seen when looking around the moving body V. That is, while staying at a position separated from the moving body V, the supervisor can obtain from the monitoring image K the same information as when riding on the moving body V and looking around.
  • for example, the captured image GSP1 of the camera 1a provided on the front grille of the moving body V is projected onto the projection surface S facing the front grille of the moving body V, the captured image GSP3 is projected onto the projection surface S facing the right side mirror of the moving body V, and the captured image GSP2 of the camera 1b provided on the left side mirror of the moving body V can be projected onto the projection surface S facing the left side mirror of the moving body V. Since the monitoring image K is projected while maintaining the positional relationship of the video around the moving body V, the supervisor can easily grasp what is happening around the moving body V.
  • since FIGS. 10 to 17 attached to the present application are still images, they cannot show the actual reproduction state of the display image, but each image shown on each projection surface S on the display screen of the display 24 is a moving image. That is, a moving image of the imaging area SP1 in front of the moving body V is projected onto the projection surface S facing the front grille of the moving body V, a moving image of the imaging area SP4 on the right side of the moving body V is displayed on the projection surface S facing the right side mirror of the moving body V, a moving image of the imaging area SP3 behind the moving body V is displayed on the projection surface S facing the rear part of the moving body V, and a moving image of the imaging area SP2 on the left side of the moving body V is projected onto the projection surface S facing the left side of the moving body V. That is, a plurality of moving-image monitoring images K based on captured images captured by different cameras 1 can be simultaneously reproduced on the projection surfaces S shown in FIGS. 10 to 17.
  • the viewpoint can be freely set and changed by the operation of the supervisor. Since the correspondence relationship between the viewpoint position and the projection surface S is defined in advance in the image processing device 22 or the display 24 described above, the monitoring image K corresponding to the changed viewpoint can be displayed on the display 24 based on this correspondence relationship.
  • the resolution of the images included in each display image shown in FIGS. 10 to 17 may be made common from the viewpoint of ease of viewing, or the images may have different resolutions from the viewpoint of reducing the amount of communication data. In the latter case, the resolutions of the images constituting the monitoring image K at a certain timing may differ, but the monitoring images K of a plurality of moving images can still be reproduced simultaneously.
  • next, the operation will be described. FIG. 18 shows one routine of the operation on the monitoring terminal device 10 side, and FIGS. 19A, 19B, and 19C are flowcharts showing the operation on the central monitoring device 20 side. In the monitoring terminal device 10, the surrounding video and the indoor video are acquired from the in-vehicle cameras 11 at predetermined time intervals (one routine shown in FIG. 18) and converted into a monitoring image by the image processing device 12 (step ST1). Further, the current position information of the passenger car V on which the monitoring terminal device 10 is mounted is detected by the position detection device 15 having a GPS (step ST2).
  • in step ST3, it is determined whether or not the report button 16 for reporting an abnormality has been pressed. If the report button 16 has been pressed, the process proceeds to step ST4, where the monitoring image acquired in step ST1 and the position information acquired in step ST2 are associated with time information, and these are transmitted as monitoring information, together with abnormality information indicating that an abnormality has occurred, to the central monitoring device 20 via the communication device 13 and the telecommunications network 30.
  • thereby, the occurrence of an abnormality related to security, such as an accident or a crime, is automatically transmitted to the central monitoring device 20 together with the position information of the passenger car V and the monitoring image of its surroundings, further strengthening monitoring in the city.
  • the monitoring image and the position information are acquired in the first steps ST1 and ST2, but the monitoring image and the position information may be acquired at a timing between steps ST3 and ST4.
  • step ST3 if the report button 16 is not pressed, the process proceeds to step ST5, where it communicates with the central monitoring device 20 and obtains a control command.
  • step ST6 the monitoring terminal device 10 determines whether or not a monitoring information transmission command has been acquired from the central monitoring device 20, and proceeds to step ST7 if a monitoring information transmission command has been acquired.
  • step ST7 it is determined whether or not the acquired monitoring information transmission command includes a command for generating a monitoring image in which images of different resolutions are mixed under a predetermined composition condition.
  • if such a command is included, the process proceeds to step ST8, where the resolution is changed: a resolution change command is transmitted to the image processing device 12, which performs image editing processing according to the resolution, or to the camera 11, which has a resolution setting function. If the acquired monitoring information transmission command does not include a command for generating a monitoring image in which images of different resolutions are mixed under a predetermined composition condition, a monitoring image of the relatively high first resolution is generated without changing the resolution. That is, when the composition information is included in the monitoring information transmission command, the monitoring image is generated by mixing in second images having the relatively low second resolution.
  • in step ST9, the control device 14 generates first images of the first resolution and second images of the second resolution according to the composition condition of the monitoring information transmission command, and mixes the first images and the second images at a ratio according to the composition condition. The “ratio at which the first image and the second image are mixed” in the composition condition is adjusted by the interval, frequency, or cycle at which the first images and the second images are generated and transmitted. That is, the mixing ratio of the first image and the second image in the composition condition can be adjusted by the ratio of the number of first images of the first resolution generated (or transmitted) per unit time to the number of second images of the second resolution generated (or transmitted) per unit time.
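• Adjusting the mixing ratio by generation/transmission frequency can be sketched as a simple frame schedule; the parameter name `high_every` is ours, and the 1-in-4 ratio is just an example value:

```python
def frame_resolutions(n_frames, high_every=4):
    """Label each frame 'first' (high resolution) or 'second' (low
    resolution) so that one frame in every `high_every` is sent at the
    first resolution."""
    return ["first" if i % high_every == 0 else "second"
            for i in range(n_frames)]

schedule = frame_resolutions(8, high_every=4)
```

Raising `high_every` lowers the share of high-resolution frames and thus the communication load, which is how the composition condition trades image detail against data volume.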
  • when the camera 11 has a resolution setting function, the control device 14 can cause it to capture the vehicle periphery at the first resolution or the second resolution and generate a monitoring image in which the first image and the second image are mixed under a predetermined composition condition. Alternatively, the first image of the first resolution and the second image of the second resolution can be generated by the image processing device 12, which has a function of processing a captured image captured by the camera 11 into an image of a predetermined resolution. In this way, the control device 14 causes the image processing device 12 to generate, according to the composition condition of the monitoring information transmission command, a monitoring image in which first images of the first resolution and second images of the second resolution lower than the first resolution are mixed.
  • the control device 14 transmits monitoring information including the monitoring image, time information, and position information according to the composition condition to the central monitoring device 20 according to the monitoring information transmission command.
  • the monitoring information may include the moving speed of the moving object V as necessary.
  • monitoring information such as a monitoring image, position information, and time information is stored in the memory of the monitoring terminal device 10.
  • when the monitoring information transmission command is not acquired from the central monitoring device 20, the process proceeds to step ST10, where it is determined whether or not the passenger car V exists in a predefined priority monitoring area. If the passenger car V exists in the priority monitoring area, monitoring information including a monitoring image is transmitted; from the viewpoint that a detailed monitoring image is preferable within the priority monitoring area, the monitoring information including the monitoring image is transmitted as it is. If the passenger car V does not exist in the priority monitoring area, the process proceeds to step ST11, and monitoring information not including the monitoring image, that is, time information and position information, is transmitted to the central monitoring device 20.
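• The terminal-side branching in steps ST3 to ST11 can be summarized roughly as follows (a simplification: the real flow also edits the image per the composition condition and attaches time and position information):

```python
def decide_transmission(report_pressed, has_send_command, in_priority_area):
    """Rough mirror of the flowchart branches described above."""
    if report_pressed:                        # ST3 -> ST4
        return "monitoring info + abnormality info"
    if has_send_command or in_priority_area:  # ST6 / ST10 -> ST9
        return "monitoring info"
    return "position and time only"           # ST11
```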
  • in step ST11 of FIG. 19A, on the central monitoring device 20 side, position information and time information are acquired from all the passenger cars V and stored in the database 26. The database 26 stores the monitoring information including monitoring images, position information, and time information acquired from the passenger cars V (monitoring terminal devices 10) in association with the position information; that is, if position information is designated, a series of monitoring information can be called up.
  • the monitoring information can include a mobile body ID (monitoring terminal device ID) for specifying the monitoring terminal device 10.
  • the mobile object ID may be the address of the communication device 13 of the monitoring terminal device 10.
  • in step ST12, based on the position information acquired in step ST11, the positions of the passenger cars V are displayed by superimposing dots on the map information of the map database shown on the display 24, as in the upper left of the figure. By looking at this map information, it is possible to see at a glance at which position each moving body V carrying a monitoring terminal device 10 is traveling, and to identify the moving body V carrying the monitoring terminal device 10 that exists at a position to be monitored. Since the position information of the passenger car V is acquired and transmitted at a predetermined timing in each routine, the supervisor can grasp the current position of the passenger car V in a timely manner.
  • in step ST13, it is determined whether or not abnormality information notified from the monitoring terminal device 10 of a passenger car V, that is, a notification that an abnormality relating to security such as an accident or a crime has occurred, has been received. This abnormality information is output when a passenger of the passenger car V presses the report button 16 of the monitoring terminal device 10. When abnormality information is received, the passenger car V that output the abnormality information is identified in step ST14, the monitoring image and the time information are received from the monitoring terminal device 10 of that passenger car, and the monitoring image is displayed on the display 24. Further, as shown in the upper left of FIG. 1, the passenger car displayed on the map information is highlighted, for example by changing its color, so that it can be distinguished from other passenger cars. Thereby, the position where the abnormality occurred can be visually recognized on the map information, and the abnormality content can be grasped on the display 24.
  • In step ST15, passenger vehicles V traveling in the vicinity (within a predetermined distance) of the passenger vehicle V that output the abnormality information are detected, and a monitoring information transmission command requesting a monitoring image and time information is output to those vehicles.
  • Thereby, monitoring information can also be acquired from passenger vehicles V traveling near the vehicle that output the abnormality information, so that, in addition to the monitoring image from that vehicle itself, the circumstances of the abnormality can be understood in detail.
  • The monitoring information transmission command can include a composition condition, which is described later.
  • In step ST16, the position information of the passenger vehicle V that output the abnormality information is transmitted to emergency vehicles such as police cars, ambulances, and fire engines.
  • A monitoring image may be attached to this transmission in order to convey the details of the abnormality.
  • Thereby, an emergency vehicle can be dispatched before a report arrives from the site, enabling a quick response to an accident or crime.
  • In step ST17, all position information, monitoring images, and time information received from the monitoring terminal devices 10 are recorded on the recording medium. This record is used to investigate the accident or crime afterward. If no abnormality information is present in step ST13, the process proceeds to step ST21 in FIG. 19B without performing steps ST14 to ST17.
  • In step ST21, it is determined whether an image information transmission command has been received from an emergency vehicle such as a police car, ambulance, or fire engine. If an image transmission command has been input, the process proceeds to step ST22, where it is determined whether a passenger car V exists in the area specified by the command. If such a passenger car V exists, the process proceeds to step ST23, and a monitoring information transmission command is output to the passenger vehicles V in the specified area. Thereby, image information from those passenger cars V can be acquired in step ST11 of FIG. 19A in the next routine and transferred to the emergency vehicle, fulfilling the intent of its transmission command. If the conditions of steps ST21 and ST22 are not met, the process proceeds to step ST24 without performing step ST23.
  • In step ST24, it is determined whether a passenger car V exists in the vicinity of a suspicious location, such as a preset crime-prone area; if so, the process proceeds to step ST25, and a monitoring information transmission command requesting a monitoring image is output to that passenger car V. Suspicious locations include streets and districts with poor public safety. As a result, monitoring of such streets can be strengthened, and crime prevention can be expected. If no passenger vehicle V exists near a suspicious location, the process proceeds to step ST26 without performing step ST25.
  • In step ST26, it is determined whether a passenger vehicle V exists in the vicinity of a priority monitoring position from which a priority monitoring object, whose details should be monitored, can be imaged. If a passenger vehicle V exists near the priority monitoring position, the process proceeds to step ST27, and a priority monitoring command requesting transmission of monitoring information including an enlarged monitoring image of the priority monitoring target is output to that vehicle. As a result, the priority monitoring target can be monitored in detail, and a suspicious object that might cause an incident or accident at the specified priority monitoring target can be detected effectively, so that crime prevention can be expected. If no passenger vehicle V exists near the priority monitoring position, the process proceeds to step ST28 without performing step ST27.
  • In step ST28, based on the position information received from each passenger car V, it is determined whether there is a route within a predetermined area requiring monitoring (not limited to suspicious locations and priority monitoring areas) on which no passenger car V has traveled within a predetermined time; if such a route exists, it is monitored whether a passenger vehicle V is traveling on it. If a passenger car V has most recently traveled onto the route, the process proceeds to step ST29, and a monitoring information transmission command requesting a monitoring image is output to that passenger car V. As a result, monitoring images can be acquired automatically for routes that lie outside the suspicious locations and priority monitoring areas and carry little passenger-car traffic. If no route satisfies the condition of step ST28, the process returns to step ST11 of FIG. 19A without performing step ST29.
  • The monitoring information transmission command described above can include a command requesting transmission of the monitoring image, position information, time information, and moving speed, and can further include an image resolution setting command applied when the monitoring image is created.
  • FIG. 19C is a flowchart showing the procedure for generating and transmitting a monitoring information transmission command that includes a predetermined composition condition.
  • The generation and transmission of the monitoring information transmission command of this embodiment will be described with reference to FIG. 19C and FIGS. 20 to 23.
  • FIGS. 20 to 23 are diagrams explaining the transmission state of the monitoring image under each composition condition.
  • The central monitoring device 20 records the monitoring image, position information, time information, and moving speed acquired from each passenger vehicle V on a recording medium in association with the identifier of each moving body V.
  • Step ST31 and the subsequent steps generate and transmit the next monitoring information transmission command based on the information acquired up to that point.
  • The central monitoring device 20 displays the monitoring image of the acquired monitoring information on the touch panel display 24, which has an input function, so that it can be presented to the monitor.
  • In step ST32, the central monitoring device 20 receives from the monitor information specifying the moving body V present at the position to be monitored. This operation is, for example, clicking the dot corresponding to the moving body V displayed in the map information shown in the upper left of FIG.
  • In step ST33, the central monitoring device 20 receives an input of the gaze direction the supervisor wants to observe.
  • The gaze direction is input by touching, tapping, or sliding a finger on the screen of the touch panel display 24.
  • The direction corresponding to the touched point can be specified as the gaze direction.
  • For example, the supervisor can select, within the monitoring image K, the captured image GSP2 obtained by the left side camera 11b imaging the area SP2.
  • Likewise, by touching the point PRa in the area of the touch panel display 24 where the captured image GSP4 is presented, the supervisor can input the right direction of the moving body V as the gaze direction.
  • The central monitoring device 20 calculates the communication speed of the communication (information exchange) with the monitoring terminal device 10 mounted on the identified vehicle V.
  • The communication speed in the present embodiment is the amount of data per unit time that the central monitoring device 20 receives from the monitoring terminal device 10 side.
  • The communication speed can be determined based on the frame rate of the monitoring image processing in the central monitoring device 20 and on the time required in the monitoring terminal device 10 from transmission of information to confirmation of transmission completion.
  • In step ST34, the central monitoring device 20 determines whether the communication speed is equal to or lower than a predetermined threshold (whether the frame rate is equal to or lower than a predetermined threshold, or the time required from transmission to completion of transmission is equal to or longer than a predetermined threshold).
  • If the central monitoring device 20 determines in step ST34 that the communication speed is equal to or lower than the predetermined threshold (the frame rate is at or below its threshold, or the time until information transmission completes is at or above its threshold), the process advances to step ST35.
  • In step ST35, generation of a composition condition for a monitoring image that includes images with different resolutions is started. The resolution can be specified for each frame.
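As a minimal illustration of the step ST34 test, the decision can be sketched as a comparison of measured throughput and frame rate against fixed limits (the threshold values and the function name below are assumptions for illustration, not taken from the patent):

```python
SPEED_THRESHOLD_BPS = 1_000_000   # assumed throughput threshold (bytes per second)
FRAME_RATE_THRESHOLD = 10.0       # assumed frame-rate threshold (frames per second)

def communication_degraded(bytes_received, elapsed_s, frame_rate):
    """Return True when the link is judged slow: throughput at or below its
    threshold, or the monitoring-image frame rate at or below its threshold."""
    throughput = bytes_received / elapsed_s  # data amount per unit time
    return throughput <= SPEED_THRESHOLD_BPS or frame_rate <= FRAME_RATE_THRESHOLD

print(communication_degraded(500_000, 1.0, 25.0))    # True: throughput too low
print(communication_degraded(5_000_000, 1.0, 30.0))  # False: normal state
```

When this test returns True, processing would proceed to the composition-condition generation of step ST35; otherwise the previous command remains in force.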
  • The composition conditions in this embodiment involve mixing, under a predetermined composition condition, a first image having a relatively high first resolution and a second image having a second resolution lower than the first resolution.
  • The composition condition in the present embodiment specifies that at least a first image and a second image of different resolutions are included at a predetermined ratio, but it can also specify that images of further different third, fourth, up to nth resolutions are included at predetermined ratios.
  • The composition condition of the present embodiment thus defines a combination of first images of the first resolution, which is the resolution of the image transmitted in the normal state where the communication speed is not at or below the predetermined threshold, and second images of the second resolution, which is lower than the first.
  • The composition conditions include the transmission frequency of the first and second images transmitted as the monitoring image, the proportions of first and second images included in the monitoring image transmitted per unit time, the transmission order of the first and second images, and the transmission interval (cycle) of the first and second images transmitted sequentially according to that order.
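The parameters just listed could be grouped into a single record, for instance as follows (a sketch only; the field names are hypothetical and do not appear in the patent):

```python
from dataclasses import dataclass

@dataclass
class CompositionCondition:
    """Illustrative container for one composition condition."""
    first_resolution: tuple    # e.g. (1280, 960): relatively high first resolution
    second_resolution: tuple   # e.g. (640, 480): lower second resolution
    firsts_per_cycle: int      # number of first images in one transmission cycle
    seconds_per_cycle: int     # number of second images in one transmission cycle
    interval_s: float          # transmission interval between successive images

    def images_per_cycle(self) -> int:
        # total images sent in one repetition of the transmission pattern
        return self.firsts_per_cycle + self.seconds_per_cycle

cond = CompositionCondition((1280, 960), (640, 480), 1, 2, 0.5)
print(cond.images_per_cycle())  # 3
```

Such a record would travel inside the monitoring information transmission command and drive image generation on the terminal side.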
  • FIG. 20 is a diagram for explaining the transmission state of the monitoring image based on the first composition condition.
  • As shown in (1), in the normal state in which the communication speed between the central monitoring device 20 and the monitoring terminal device 10 has not decreased, the monitoring terminal device 10 transmits first images of the relatively high first resolution to the central monitoring device 20 at timings t0 to t3 set at regular intervals.
  • As shown in (2) and (3), when the communication speed between the central monitoring device 20 and the monitoring terminal device 10 has dropped to or below the predetermined threshold, the monitoring terminal device 10 alternately transmits first images of the relatively high first resolution and second images of the relatively low second resolution to the central monitoring device 20 at timings t0 to t3 set at regular intervals.
  • The composition conditions shown in (2) and (3) of this example specify that, after a first image of the first resolution is transmitted, one or more second images of the second resolution are transmitted once a predetermined interval has elapsed.
  • Under the composition condition shown in (2) of this example, two second images (640 × 480 pixels × 2) are transmitted at t1 and t3, so the amount of information to be communicated can be reduced compared with case (1), in which only one first image (1280 × 960 pixels × 1) is transmitted at each timing.
  • Under the composition condition shown in (3) of this example, four second images (640 × 480 pixels × 4) are transmitted at t1 and t3, so the amount of information communicated is the same as in case (1), which transmits only one first image (1280 × 960 pixels × 1) per timing. However, whereas in case (1) two images are transmitted from t0 through t1, in case (3) as many as five images can be transmitted from t0 through t1. That is, a large number of monitoring images can be provided to the monitor while the amount of information is kept the same.
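The data-amount comparison made above for conditions (1) to (3) can be verified with simple arithmetic, using raw pixel counts as a stand-in for the transmitted data amount:

```python
first = 1280 * 960   # pixels in one first image (relatively high resolution)
second = 640 * 480   # pixels in one second image (relatively low resolution)

# Average pixels per transmission timing under each condition:
amount_1 = first                     # (1): one first image at every timing
amount_2 = (first + 2 * second) / 2  # (2): a first image alternating with two second images
amount_3 = (first + 4 * second) / 2  # (3): a first image alternating with four second images

print(4 * second == first)   # True: four second images carry as many pixels as one first image
print(amount_2 < amount_1)   # True: condition (2) reduces the communicated amount
print(amount_3 == amount_1)  # True: condition (3) keeps the amount unchanged
```

Real transmissions would be compressed, so these figures are only a proxy, but the relative ordering of the three conditions is what the patent relies on.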
  • In this way, the monitoring terminal device 10 mounted on each moving body V transmits to the central monitoring device 20 a monitoring image formed by mixing first images of high resolution and second images of low resolution, so an increase in the amount of information communicated can be suppressed while the transmission frequency of information is maintained. That is, when the communication speed is low, the monitoring terminal device 10 is controlled so that the central monitoring device 20 acquires many (a plurality of) small, less detailed images instead of a single detailed image carrying a large amount of communication data. Information can therefore be collected in real time without aggravating the congestion of the communication line that causes the drop in communication speed.
  • The central monitoring device 20 of this embodiment can generate a monitoring information transmission command for each monitoring direction, with the moving body V as the reference.
  • The central monitoring device 20 can include in the monitoring information transmission command information that specifies the monitoring directions (front, rear, right side, left side) relative to the moving body, and can associate a resolution and a transmission frequency with each monitoring direction.
  • The central monitoring device 20 stores in advance information that specifies the monitoring directions relative to the moving body V and associates them with the cameras 11a to 11d, which are installed at predetermined positions on the moving body V and image those monitoring directions; information identifying the cameras 11a to 11d can then be included in the monitoring information transmission command.
  • Alternatively, the association between the monitoring directions relative to the moving body V and the cameras 11a to 11d that image them may be stored on the moving body V side, and the monitoring terminal device 10 may identify the cameras 11a to 11d targeted by the monitoring information transmission command from the monitoring direction it contains.
  • As described above, the central monitoring device 20 of the present embodiment can create and execute a monitoring information transmission command for each direction around the moving body V.
  • For example, when the monitor inputs the traveling direction (forward) of the moving body V as the gaze direction, the camera 11a installed at the front of the moving body V is specified, and the first and second images can be generated based on its captured image.
  • FIG. 21 is a diagram for explaining the transmission state of the monitoring image based on the second composition condition.
  • As shown in (1) of FIG. 21, the same composition condition can be defined for all directions (front: Fr, rear: Rr, right: Right, left: Left), that is, for all the cameras 11a to 11d. In this example, the monitoring terminal device 10 alternately transmits, for all directions (the imaging directions of all the cameras 11a to 11d), first images of the relatively high first resolution and second images of the relatively low second resolution to the central monitoring device 20 at timings t0 to t3 set at a constant interval f0.
  • In this embodiment, as shown in FIG. 19C, it is determined in step ST40 whether the supervisor has input to the central monitoring device 20 the gaze direction he or she wants to observe among the surroundings of the moving body V on which the monitoring terminal device 10 is mounted. If the supervisor has designated a gaze direction, the process proceeds to step ST36.
  • In step ST36, when the gaze direction has been input, the central monitoring device 20 generates a monitoring information transmission command including a composition condition corresponding to that gaze direction. Since the gaze direction is the direction the monitor wants to observe particularly carefully, it is preferable to provide the monitor with a relatively high-resolution image for the gaze direction even when the communication speed has decreased.
  • For this reason, the central monitoring device 20 of this embodiment generates a composition condition under which a first image of the relatively high first resolution is generated from the captured image of the camera 11 that images the gaze direction, second images of the relatively low second resolution are generated from the captured images of the cameras 11 that image the other directions, and the first and second images of different resolutions are mixed at a predetermined ratio. This composition condition is included in the monitoring information transmission command transmitted to the monitoring terminal device 10.
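The rule in step ST36 amounts to assigning the first resolution to the gaze direction and the second resolution to every other direction; a minimal sketch (the direction names and function name are illustrative assumptions):

```python
FIRST_RES = (1280, 960)   # relatively high first resolution
SECOND_RES = (640, 480)   # relatively low second resolution
DIRECTIONS = ("front", "rear", "right", "left")  # imaged by cameras 11a-11d

def composition_for_gaze(gaze_direction):
    """Map each monitoring direction to the resolution requested for it."""
    if gaze_direction not in DIRECTIONS:
        raise ValueError("unknown direction: " + gaze_direction)
    return {d: (FIRST_RES if d == gaze_direction else SECOND_RES)
            for d in DIRECTIONS}

cond = composition_for_gaze("front")
print(cond["front"])  # (1280, 960): the gaze direction gets the high resolution
print(cond["left"])   # (640, 480): other directions get the low resolution
```

The resulting mapping is the kind of per-direction content a monitoring information transmission command would carry to the terminal.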
  • FIG. 22 is a diagram for explaining a transmission state of the monitoring image based on the third composition condition.
  • As shown in (1) of FIG. 22, the same composition condition can be defined for all directions (front: Fr, rear: Rr, right: Right, left: Left), that is, for all the cameras 11a to 11d: the monitoring terminal device 10 alternately transmits, for all directions (the imaging directions of all the cameras 11a to 11d), first images of the relatively high first resolution and second images of the lower second resolution to the central monitoring device 20 at timings t0 to t3 set at a constant interval f0.
  • In the present embodiment, as shown in (2) of FIG. 22, the monitoring terminal device 10 proceeds as follows for the gaze direction designated by the monitor (for example, forward: Fr). It generates a first image of the first resolution based on the captured image at timing P1 of the camera 11a, which images the front (the gaze direction), generates second images of the second resolution based on the captured images at timing P1 of the cameras 11b, 11c, and 11d, which image the rear, right side, and left side (the directions other than the gaze direction), and transmits the monitoring image obtained by mixing them to the central monitoring device 20 at timing t0.
  • Next, a first image of the first resolution is generated based on the captured image at timing P2 of the camera 11a, which images the front (the gaze direction); a first image of the first resolution is also generated based on the captured image at timing P2 of the camera 11c, which images the rear (a direction other than the gaze direction); second images of the second resolution are generated based on the captured images at timing P2 of the cameras 11b and 11d, which image the right and left sides; and the monitoring image in which these are mixed is transmitted to the central monitoring device 20 at timing t0.5.
  • In this way, a first image of the relatively high first resolution is generated for the gaze direction and second images of the relatively low second resolution are generated for the other directions. Monitoring images of a plurality of directions are thus obtained evenly, while weighting the resolution of the gaze direction more heavily than that of the other directions reduces the amount of information communicated between the central monitoring device 20 and the monitoring terminal device 10.
  • In step ST41, the central monitoring device 20 acquires, via the telecommunication network 30, the vehicle speed detected by the vehicle speed sensor 16 of the passenger vehicle V specified by the supervisor.
  • In step ST42, the central monitoring device 20 determines whether the vehicle speed is equal to or higher than a predetermined threshold; if so, the process proceeds to step ST37.
  • In step ST37, the central monitoring device 20 generates a composition condition that increases the proportion of second images of the relatively low second resolution.
  • When the vehicle speed is equal to or higher than the predetermined threshold, that is, when the moving body V is traveling fast, the video around the vehicle changes greatly from moment to moment. It is therefore preferable to provide the monitor with many images at short intervals when the vehicle speed is high, even when the communication speed has decreased. For this reason, the central monitoring device 20 of this embodiment generates such a composition condition.
  • FIG. 23 is a diagram for explaining a transmission state of the monitoring image based on the fourth composition condition.
  • As shown in (1) of FIG. 23, the monitoring terminal device 10 alternately transmits first images of the relatively high first resolution and second images of the relatively low second resolution to the central monitoring device 20 at timings t0 to t3 set at regular intervals, so that both are included at the same rate.
  • In contrast, as shown in (2), in accordance with the composition condition that increases the proportion of second images, the monitoring terminal device 10 transmits a first image of the first resolution at timing t0, transmits second images of the second resolution twice, at timings t1 and t2, and then transmits a first image of the first resolution at timing t3.
  • In (1) the ratio of first images to second images is 1:1, whereas in (2) the ratio is 1:2.
  • In this way, when the vehicle speed is equal to or higher than the predetermined threshold, the proportion of second images of the relatively low second resolution included in the monitoring image is increased. This reduces the amount of information required for communication between the central monitoring device 20 and the monitoring terminal device 10 while allowing the monitoring image around the vehicle, which varies greatly with the moving speed, to be provided in real time.
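Conditions (1) and (2) of FIG. 23 can be expressed as repeating transmission patterns selected by the vehicle-speed test of step ST42 (the threshold value is an assumption; the patent only specifies "a predetermined threshold"):

```python
SPEED_THRESHOLD_KMH = 60  # assumed value for the predetermined vehicle-speed threshold

def transmission_pattern(vehicle_speed_kmh):
    """Return the repeating sequence of image kinds to transmit:
    'F' = first image (high resolution), 'S' = second image (low resolution)."""
    if vehicle_speed_kmh >= SPEED_THRESHOLD_KMH:
        return ["F", "S", "S"]  # condition (2): first-to-second ratio 1:2
    return ["F", "S"]           # condition (1): first-to-second ratio 1:1

print(transmission_pattern(80))  # ['F', 'S', 'S']
print(transmission_pattern(30))  # ['F', 'S']
```

A faster vehicle thus yields more frames per unit of transmitted data, matching the rationale given above.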
  • In step ST38, the central monitoring device 20 generates a monitoring information transmission command including the composition conditions generated in steps ST35 to ST37.
  • The monitoring information transmission command including the composition conditions is then transmitted to the vehicle previously identified in step ST32. This monitoring information transmission command controls the next round of monitoring information generation.
  • If it is determined in step ST34 that the communication speed is not at or below the predetermined threshold (the frame rate exceeds its threshold, or the time required from the start of transmission to its completion is below its threshold), the monitoring terminal device 10 can simply continue operating on the previous monitoring information transmission command, so the processing from step ST35 onward is skipped and the process returns to step ST30. Further, when the supervisor does not specify a gaze direction in step ST40, the processing from step ST41 onward is performed. Furthermore, when the vehicle speed is below the predetermined threshold in step ST42, the process proceeds to step ST38.
  • The monitoring information transmission command is then transmitted to the specified vehicle V.
  • The monitoring terminal device 10 that has received the monitoring information transmission command generates first and second images so as to obtain a monitoring image that satisfies the composition condition.
  • To generate images of the required resolutions, the image processing device 12 may be caused to change the resolution of the captured image, or the resolution setting of the camera 11 itself may be changed so that captured images of the desired resolution are acquired directly.
  • FIG. 24 shows the time load of each process when generation of the first and second images is executed by the image processing device 12 alone, and when it is distributed between the cameras 11 and the image processing device 12. In FIG. 24, the temporal load of each process is indicated by a rectangular block.
  • The cameras 11 and the image processing device 12 can process four captured images in parallel. Therefore, the time tq2 required to process the four captured images in a distributed manner across the cameras 11 and the image processing device 12 to generate the first or second images is shorter than the time tq1 required for the image processing device 12 alone to process the four captured images.
  • Since the processing time for generating the first and second images is shortened, the time required to provide the monitoring information including the monitoring image to the central monitoring device 20 is also shortened. As a result, monitoring images can be provided to the monitor in real time.
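The relation tq2 < tq1 follows from a simple timing model in which the four conversions overlap when they are distributed across the cameras and the image processing device (the per-image time is an arbitrary illustrative unit, not a figure from the patent):

```python
PER_IMAGE_TIME = 1.0  # assumed time for one device to convert one captured image
NUM_IMAGES = 4        # one captured image per camera 11a-11d

# tq1: the image processing device 12 alone converts the four images one after another
tq1 = NUM_IMAGES * PER_IMAGE_TIME

# tq2: each camera converts its own image concurrently with the others,
# so the four conversions overlap and finish in roughly one image-time
tq2 = PER_IMAGE_TIME

print(tq2 < tq1)  # True: distributed processing shortens generation time
```

The model ignores coordination overhead, but it captures why FIG. 24 shows tq2 as the shorter interval.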
  • As described above, when the communication speed is low, the monitoring terminal device 10 mounted on each mobile body transmits to the central monitoring device 20 a monitoring image formed by mixing first images of the relatively high first resolution and second images of the relatively low second resolution, so an increase in the amount of communication data can be suppressed while the information transmission frequency is maintained. That is, when the communication speed is low, the monitoring terminal device 10 is controlled so that the central monitoring device 20 acquires rough information with a small amount of communication data several times instead of acquiring detailed information with a large amount of communication data once; information can therefore be collected in real time without aggravating the congestion of the communication line that causes the drop in communication speed.
  • Since the central monitoring device 20 of the monitoring system 1 according to this embodiment generates a monitoring information transmission command for each monitoring direction with the moving body V as the reference, when, for example, the monitor inputs the traveling direction of the moving body V as the gaze direction, the camera 11a installed at the front of the moving body V is specified and the first and second images are generated based on its captured image. Thereby, a monitoring image matching the monitor's intention can be acquired while the amount of information communicated between the central monitoring device 20 and the monitoring terminal device 10 is reduced by adjusting the resolution.
  • When a specific gaze direction that the monitor wants to observe around the moving body V is input, the central monitoring device 20 of the monitoring system 1 generates a composition condition under which a first image of the relatively high first resolution is generated from the captured image of the camera 11 that images the gaze direction and second images of the relatively low second resolution are generated from the captured images of the cameras 11 that image the other directions. Monitoring images of a plurality of directions can thus be obtained evenly, and weighting the resolution of the gaze direction over the other directions reduces the amount of information communicated between the central monitoring device 20 and the monitoring terminal device 10.
  • When the vehicle speed of the moving body V is equal to or higher than a predetermined threshold, the central monitoring device 20 of the monitoring system 1 increases the proportion of second images of the relatively low second resolution in the monitoring image. By weighting the ratio of second images according to whether the vehicle speed is at or above the threshold or below it, the amount of information communicated between the central monitoring device 20 and the monitoring terminal device 10 is reduced while real-time monitoring images of the vehicle surroundings, which vary greatly with the moving speed, can still be provided.
  • Since the monitoring terminal device 10 of the monitoring system 1 provides each camera 11 with a resolution setting function, the surroundings of the vehicle can be imaged at the first or second resolution in accordance with the composition condition of the monitoring information transmission command, and a monitoring image in which first and second images are mixed under the predetermined composition condition can be generated. Because the cameras 11 and the image processing device 12 can process four captured images in parallel, the processing time for generating the first and second images is shortened, and the time required to provide the monitoring information including the monitoring image to the central monitoring device 20 is shortened as well. As a result, monitoring images can be provided to the monitor in real time.
  • Since the monitoring terminal device 10 of the monitoring system 1 can cause the image processing device 12 to generate the first image of the first resolution and the second image of the second resolution, the actions and effects (1) to (5) above can be achieved.
  • When the communication speed is higher than the predetermined threshold, the monitoring terminal device 10 of the monitoring system 1 generates a monitoring image consisting of first images of the first resolution and transmits it to the central monitoring device 20, so a monitoring image of relatively high resolution is delivered and the monitor can observe the city based on detailed monitoring images.
  • In the above, the monitoring system 1 including the monitoring terminal device 10 and the central monitoring device 20 has been described as an example of the monitoring system including a monitoring terminal device and a central monitoring device according to the present invention, but the present invention is not limited to this.
  • As an example of the central monitoring device including the communication speed calculation unit, the command generation unit, and the command transmission unit according to the present invention, the central monitoring device 20 including the control device 21 having a communication speed calculation function, a command generation function, and a command transmission function, together with the image processing device 22, the communication device 23, the display 24, and the input device 25, has been described, but the present invention is not limited to this.
  • As an example of the monitoring terminal device according to the present invention, the monitoring terminal device 10 including the control device 14 having a monitoring image generation function and a monitoring information transmission function, together with the cameras 11a to 11d, the image processing device 12, the communication device 13, the position detection device 15, and the vehicle speed sensor 16, has been described, but the invention is not limited to this.
  • in the present embodiment, the position information of the passenger car V and the monitoring images from the cameras 11a to 11d are acquired, but a monitoring image from the fixed camera 11f installed in the city shown in FIG. may also be acquired. As the passenger car V that acquires the position information and the monitoring image, it is desirable to use the taxi V1 and bus
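The parallel per-camera processing and mixed-resolution composition described above can be sketched as follows. This is an illustrative Python sketch, not code from the patent: the composition condition (the front camera kept at the first resolution, the other three reduced to a half-resolution second image) and all function names are hypothetical examples.

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np


def scale_image(image: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour downsampling by an integer factor."""
    return image[::factor, ::factor]


def compose_monitoring_image(frames):
    """Process the four captured frames in parallel, then tile them into
    a single monitoring image under a hypothetical composition condition:
    the front camera (index 0) keeps the first (full) resolution and the
    other three are reduced to the second (half) resolution."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        processed = list(pool.map(
            lambda item: scale_image(item[1], 1 if item[0] == 0 else 2),
            enumerate(frames),
        ))
    # Pad the half-resolution tiles so every tile has the same height,
    # then place the four tiles side by side.
    height = max(f.shape[0] for f in processed)
    padded = [np.pad(f, ((0, height - f.shape[0]), (0, 0))) for f in processed]
    return np.hstack(padded)


frames = [np.ones((480, 640), dtype=np.uint8) for _ in range(4)]
monitoring_image = compose_monitoring_image(frames)
print(monitoring_image.shape)  # (480, 1600): one full-width tile plus three half-width tiles
```

Because each camera's frame is reduced independently, the four scaling steps run concurrently, which is the parallelism the passage credits with shortening the generation time.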
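The resolution selection driven by the communication speed can likewise be sketched. The threshold value, speed units, and function names below are hypothetical; the passage only states that the first-resolution image is sent when the communication speed exceeds a predetermined threshold.

```python
def select_monitoring_image(comm_speed_kbps, first_image, second_image,
                            threshold_kbps=1000.0):
    """Return the high-resolution first image when the measured
    communication speed exceeds the (hypothetical) threshold; otherwise
    fall back to the smaller second image so transmission stays timely."""
    if comm_speed_kbps > threshold_kbps:
        return first_image   # first resolution: detailed monitoring image
    return second_image      # second resolution: bounded transfer time


high = b"\x00" * 1_000_000  # stand-in for a first-resolution frame
low = b"\x00" * 100_000     # stand-in for a second-resolution frame
print(len(select_monitoring_image(2000.0, high, low)))  # 1000000 (fast link)
print(len(select_monitoring_image(300.0, high, low)))   # 100000 (slow link)
```

The design choice here is a simple trade-off: above the threshold the link can carry the larger payload in time, so detail wins; below it, timeliness wins.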

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
PCT/JP2013/053277 2012-04-24 2013-02-12 Monitoring system and monitoring method Ceased WO2013161345A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012098545 2012-04-24
JP2012-098545 2012-04-24

Publications (1)

Publication Number Publication Date
WO2013161345A1 true WO2013161345A1 (fr) 2013-10-31

Family

ID=49482687

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/053277 Ceased WO2013161345A1 (fr) 2012-04-24 2013-02-12 Monitoring system and monitoring method

Country Status (1)

Country Link
WO (1) WO2013161345A1 (fr)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007036615A (ja) * 2005-07-26 2007-02-08 Matsushita Electric Ind Co Ltd Camera control device and monitoring system
JP2007214769A (ja) * 2006-02-08 2007-08-23 Nissan Motor Co Ltd Vehicle video processing device, vehicle surroundings monitoring system, and video processing method
WO2008035745A1 (fr) * 2006-09-20 2008-03-27 Panasonic Corporation Monitor system, camera, and method for encoding video images
JP2011041311A (ja) * 2003-11-18 2011-02-24 Intergraph Software Technologies Co Digital video surveillance


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10659991B2 (en) 2016-12-06 2020-05-19 Nissan North America, Inc. Bandwidth constrained image processing for autonomous vehicles
JP2020514850A (ja) * 2016-12-06 2020-05-21 Nissan North America, Inc. Bandwidth constrained image processing for autonomous vehicles
CN108206928A (zh) * 2016-12-16 2018-06-26 无锡市蜂鸣屏蔽设备科技有限公司 Shielded door monitoring method
CN107333107A (zh) * 2017-07-21 2017-11-07 广东美的制冷设备有限公司 Monitoring shooting method, apparatus, and device thereof

Similar Documents

Publication Publication Date Title
JP5786963B2 (ja) Monitoring system
JP5648746B2 (ja) Vehicle monitoring device, vehicle monitoring system, terminal device, and vehicle monitoring method
JP5672862B2 (ja) Imaging device, imaging system, and imaging method
US20160159281A1 Vehicle and control method thereof
JP5811190B2 (ja) Monitoring system
JPWO2016194039A1 (ja) Information presentation system
JP7467402B2 (ja) Image processing system, mobile device, image processing method, and computer program
JP5064201B2 (ja) Image display system and camera output control method
JP6260174B2 (ja) Monitoring image presentation system
WO2013161345A1 (fr) Monitoring system and monitoring method
WO2013111494A1 (fr) Monitoring system
JP5790788B2 (ja) Monitoring system
WO2013111491A1 (fr) Monitoring system
JP5796638B2 (ja) Monitoring system
CN216331763U (zh) Intelligent automobile electronic rearview mirror device integrating panorama and BSD functions
WO2013111492A1 (fr) Monitoring system
JP2006148327A (ja) Image generation device
WO2013111479A1 (fr) Monitoring system
WO2013125301A1 (fr) Monitoring system
JP5812105B2 (ja) Monitoring system
JP2002036954A (ja) Vehicle surroundings monitoring device and vehicle surroundings monitoring system
WO2013111493A1 (fr) Monitoring system
WO2013129000A1 (fr) Monitoring system
WO2013136894A1 (fr) Tracking system and tracking method
JP6054738B2 (ja) Camera module, camera system, and image display method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13781098

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13781098

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP