
WO2013111493A1 - Monitoring system - Google Patents

Monitoring system

Info

Publication number
WO2013111493A1
WO2013111493A1 (PCT/JP2012/083475, JP2012083475W)
Authority
WO
WIPO (PCT)
Prior art keywords
monitoring
information
image
command
link
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2012/083475
Other languages
French (fr)
Japanese (ja)
Inventor
照久 高野
真史 安原
秋彦 香西
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Publication of WO2013111493A1
Current legal status: Ceased

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3453 Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3461 Preferred or disfavoured areas, e.g. dangerous zones, toll or emission zones, intersections, manoeuvre types or segments such as motorways, toll roads or ferries
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639 Details of the system layout
    • G08B13/19647 Systems specially adapted for intrusion detection in or around a vehicle
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a monitoring system.
  • This application claims priority based on Japanese Patent Application No. 2012-011439 filed on January 23, 2012; the contents of that application are incorporated into the present application by reference and made a part of its description.
  • A security device is known that detects the occurrence of abnormalities by installing multiple security camera devices at shopping streets, store entrances, home entrances, and other locations, and monitoring the surrounding images captured by those camera devices (Patent Document 1).
  • The object of the present invention is to provide a monitoring system capable of monitoring an entire city even when cameras mounted on moving bodies are used.
  • The present invention achieves this object by outputting a command for transmitting monitoring information of a monitoring point that is specified based on the traveling frequency of moving bodies calculated for each link.
  • According to the present invention, the central monitoring device can monitor the entire city using the monitoring terminal devices mounted on moving bodies that move about randomly.
  • FIG. 1 is a schematic diagram showing a monitoring system according to an embodiment of the present invention, and FIG. 2 is a block diagram of the monitoring system of FIG. 1. Further figures show the arrangement of the in-vehicle cameras, the main control contents on the central monitoring device side of the monitoring system of FIG. 1, an example of the database information, an example of the travel frequency information, and diagrams for explaining how a new route is obtained.
  • The monitoring system is embodied as a monitoring system 1 with which authorities such as a police station or a fire station centrally monitor the security of a city. That is, position information of each of a plurality of moving bodies, image information around the moving bodies, and time information are acquired at a predetermined timing and collected via wireless communication; the position information is displayed on the map information and, if necessary, the image information and time information are shown on the display. As shown in FIG. 1, the monitoring system 1 of this example therefore comprises monitoring terminal devices 10 that acquire monitoring information such as position information and image information, and a central monitoring device 20 that acquires and processes the monitoring information via the telecommunications network 30.
  • FIG. 2 is a block diagram showing a specific configuration of the monitoring terminal device 10 and the central monitoring device 20.
  • the monitoring system of this embodiment can acquire monitoring information related to monitoring points on links with low travel frequency.
  • The monitoring terminal device 10 is a terminal device mounted on each of the plurality of moving bodies V. It has a position detection function that detects the position information of the moving body V, an image generation function that captures the surroundings of the moving body with cameras and generates image information, a time detection function, and an information acquisition control function that acquires the position information, image information, and time information at a predetermined timing.
  • It further has a monitoring information generation function for generating monitoring information including the position information and/or the image information, a communication function for outputting the position information, image information, and time information to the central monitoring device 20 and for acquiring commands from the central monitoring device 20, and a function for reporting the occurrence of an abnormality.
  • The monitoring terminal device 10 of this embodiment also includes a display 18 and a navigation device 19. Note that the time information is mainly used for post-event analysis and may be omitted.
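As a rough illustration of the monitoring information these functions assemble, the following sketch defines one report record; the field names and types are assumptions for illustration, not terminology from the patent.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class MonitoringInfo:
    """One monitoring report from a terminal device 10 to the central device 20.

    The embodiment requires position information and/or image information;
    time information is mainly for post-event analysis and may be omitted.
    """
    terminal_id: str                          # identifies the monitoring terminal device 10
    latitude: float                           # position information from the GPS-based detector 15
    longitude: float
    captured_at: Optional[datetime] = None    # time information (optional)
    images: List[bytes] = field(default_factory=list)  # image information from cameras 11a-11e
    abnormality_reported: bool = False        # set when the notification button 16 is pressed
```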
  • The moving body V on which the monitoring terminal device 10 is mounted is not particularly limited as long as it travels in the target monitoring area, and includes moving bodies such as passenger cars, motorcycles, industrial vehicles, and trams.
  • Business passenger cars V1, private passenger cars V2, and emergency passenger cars V3 are all included, but a taxi or route bus V1 that travels constantly and randomly in a predetermined area is particularly preferable.
  • FIG. 1 illustrates a taxi V1, a private passenger car V2, and an emergency passenger car V3 such as a police car, a fire engine, or an ambulance; these are collectively referred to as the moving body V or the passenger car V.
  • Each moving body V includes a plurality of in-vehicle cameras 11a to 11e (hereinafter collectively referred to as cameras 11), an image processing device 12, a communication device 13, an in-vehicle control device 14, a position detection device 15, and a notification button 16.
  • the camera 11 is composed of a CCD camera or the like, images the surroundings of the moving object V, and outputs the image pickup signal to the image processing device 12.
  • the image processing device 12 reads an imaging signal from the camera 11 and performs image processing on the image information. Details of this image processing will be described later.
  • the position detection device 15 is composed of a GPS device and its correction device, etc., detects the current position of the moving object V, and outputs it to the in-vehicle control device 14.
  • The notification button 16 is a manual input button installed in the passenger compartment, used to input information for reporting an abnormality when the driver or a passenger finds an incident (an incident related to security such as an accident, fire, or crime). This information can include the position information of the moving body V that reported the abnormality.
  • The in-vehicle control device 14 includes a CPU, a ROM, and a RAM. When the notification button 16 is pressed, it controls the image processing device 12, the communication device 13, and the position detection device 15 so that the image information generated by the image processing device 12, the position information of the moving body V detected by the position detection device 15, and the time information from the clock built into the CPU are output to the central monitoring device 20 via the communication device 13 and the telecommunications network 30.
  • Likewise, when a command requesting information, such as an image transmission command, is acquired from the central monitoring device 20 via the telecommunications network 30 and the communication device 13, the in-vehicle control device 14 controls the image processing device 12, the communication device 13, and the position detection device 15, and outputs monitoring information including the image information generated by the image processing device 12, the position information of the moving body V detected by the position detection device 15, and the time information from the clock built into the CPU to the central monitoring device 20 through the communication device 13 and the telecommunications network 30.
  • The in-vehicle control device 14 can store monitoring information including the image information, position information, and time information for at least a predetermined time.
  • The communication device 13 is a communication means capable of wireless communication, and exchanges information with the communication device 23 of the central monitoring device 20 via the telecommunications network 30.
  • When the telecommunications network 30 is a commercial telephone network, widely available mobile phone communication devices can be used; when the telecommunications network 30 is a dedicated telecommunications network for the monitoring system 1 of this example, dedicated communication devices 13 and 23 can be used. A wireless LAN, WiFi (registered trademark), WiMAX (registered trademark), Bluetooth (registered trademark), a dedicated wireless line, or the like can also be used.
  • The central monitoring device 20 includes an information acquisition function for acquiring the position information and image information output from the monitoring terminal devices 10 described above, a storage function for storing the acquired monitoring information in the database 26 in association with the position information, the database 26 and map information MP including link information, a travel frequency calculation function for calculating the travel frequency of the passenger cars V for each link of the link information, and a command output function for specifying, based on the calculated travel frequency, a monitoring point included in a link to be monitored and outputting a command for transmitting the monitoring information of that monitoring point.
  • It further includes a display control function for displaying the map information MP read from the map database, superimposing the received position information on the map information MP, and displaying the acquired image information on the display 24.
  • The central monitoring device 20 can also be provided with a command output function that specifies a monitoring point included in the link to be monitored based on the calculated traveling frequency and outputs a command for moving the moving body V to the specified monitoring point and transmitting the monitoring information of that monitoring point.
  • The central control device 21 includes a CPU, a ROM, and a RAM, controls the image processing device 22, the communication device 23, and the display 24, and receives the position information, image information, and time information transmitted from the monitoring terminal devices 10; the images are displayed on the display 24 after being subjected to image processing as necessary.
  • The image processing device 22 has a map database, displays the map information MP from the map database on the display 24, and superimposes the position information detected by the position detection devices 15 of the monitoring terminal devices 10 on the map information MP. It also performs image processing for displaying on the display 24 the image information captured by the in-vehicle cameras 11 of the monitoring terminal devices 10 and processed by the image processing devices 12.
  • The display 24 can be composed of, for example, a liquid crystal display device large enough to display two window screens on one screen, or of two liquid crystal display devices each displaying one of the two window screens.
  • One window screen displays the position information of each moving body V superimposed on the map information MP (see FIG. 1), and the other window screen displays the image information captured by the in-vehicle cameras 11.
  • The input device 25 is constituted by a keyboard or a mouse, and is used to input an information acquisition command to be output to a desired moving body V or to input various information processing commands displayed on the display 24.
  • The monitoring point can be input by the supervisor via the input device 25. The supervisor can designate a monitoring point by clicking (selecting and inputting) the icon of a point superimposed on the map information MP, and can set a monitoring area based on this monitoring point.
  • The communication device 23 is a communication means capable of wireless communication, and exchanges information with the communication devices 13 of the monitoring terminal devices 10 via the telecommunications network 30. As with the communication device 13, widely available mobile phone communication devices can be used when the telecommunications network 30 is a commercial telephone network, and dedicated communication devices 13 and 23 can be used when the telecommunications network 30 is a dedicated telecommunications network for the monitoring system 1 of this example.
  • The cameras 11a to 11e are configured using an image sensor such as a CCD; the four in-vehicle cameras 11a to 11d are installed at different positions outside the passenger car V and respectively image four directions around the vehicle.
  • The camera 11 of this embodiment has a zoom-up function for enlarging the subject, and can arbitrarily change its focal length or its imaging magnification according to a control command.
  • The in-vehicle camera 11a, installed at a predetermined position at the front of the passenger car V such as the front grille, images objects and the road surface in the area SP1 in front of the passenger car V and in the space ahead of it (front view).
  • The in-vehicle camera 11b, installed at a predetermined position on the left side of the passenger car V such as the left side mirror, images objects and the road surface in the area SP2 on the left side of the passenger car V and in the surrounding space (left side view).
  • The in-vehicle camera 11c, installed at a predetermined position at the rear of the passenger car V such as the rear finisher or the roof spoiler, images objects and the road surface in the area SP3 behind the passenger car V and in the space behind it (rear view).
  • The in-vehicle camera 11d, installed at a predetermined position on the right side of the passenger car V such as the right side mirror, images objects and the road surface in the area SP4 on the right side of the passenger car V and in the surrounding space (right side view).
  • One in-vehicle camera 11e is installed, for example, on the ceiling of the passenger compartment and, as shown in the figure, images the area SP5 inside the compartment; it is used for crime prevention or crime reporting.
  • FIG. 4 shows the arrangement of the in-vehicle cameras 11a to 11e as viewed from above the passenger car V.
  • The in-vehicle camera 11a that images the area SP1, the in-vehicle camera 11b that images the area SP2, the in-vehicle camera 11c that images the area SP3, and the in-vehicle camera 11d that images the area SP4 are installed along the outer periphery VE of the body, in either the counterclockwise or the clockwise direction.
  • Viewed counterclockwise, the in-vehicle camera 11b is installed to the left of the in-vehicle camera 11a, the in-vehicle camera 11c to the left of the in-vehicle camera 11b, the in-vehicle camera 11d to the left of the in-vehicle camera 11c, and the in-vehicle camera 11a to the left of the in-vehicle camera 11d.
  • Viewed clockwise, the in-vehicle camera 11d is installed to the right of the in-vehicle camera 11a, the in-vehicle camera 11c to the right of the in-vehicle camera 11d, the in-vehicle camera 11b to the right of the in-vehicle camera 11c, and the in-vehicle camera 11a to the right of the in-vehicle camera 11b.
  • FIG. 5A shows an example of an image GSP1 in which the front in-vehicle camera 11a images the area SP1, FIG. 5B shows an example of an image GSP2 in which the left-side in-vehicle camera 11b images the area SP2, FIG. 5C shows an example of an image GSP3 in which the rear in-vehicle camera 11c images the area SP3, FIG. 5D shows an example of an image GSP4 in which the right-side in-vehicle camera 11d images the area SP4, and FIG. 5E shows an example of an image captured by the indoor in-vehicle camera 11e.
  • The size of each image is 480 pixels vertically by 640 pixels horizontally.
  • the image size is not particularly limited, and may be any size as long as a general terminal device can reproduce a moving image.
  • the number and position of the in-vehicle camera 11 can be appropriately determined according to the size, shape, detection area setting method, etc. of the passenger car V.
  • the plurality of in-vehicle cameras 11 described above are assigned identifiers corresponding to the respective arrangements, and the in-vehicle control device 14 can identify each of the in-vehicle cameras 11 based on each identifier.
  • the vehicle-mounted control apparatus 14 can transmit an imaging command and other commands to a specific vehicle-mounted camera 11 by attaching an identifier to the command signal.
  • The in-vehicle control device 14 controls the image processing device 12 to acquire the imaging signals captured by the in-vehicle cameras 11, and the image processing device 12 processes the imaging signals from the respective in-vehicle cameras 11 and converts them into the image information shown in FIGS. 5A to 5E. The in-vehicle control device 14 then generates a monitoring image based on the four pieces of image information shown in FIGS. 5A to 5D (image generation function), associates with the monitoring image the mapping information for projecting it onto the projection plane set on the side surface of a columnar projection model (mapping information addition function), and outputs the result to the central monitoring device 20.
  • The image generation function and the mapping information addition function will now be described in detail.
  • The process of generating a monitoring image based on the four pieces of image information captured around the passenger car V and associating the mapping information with the monitoring image is executed by the monitoring terminal device 10 in this example, but it can also be executed by the central monitoring device 20. In that case, the four pieces of image information captured around the passenger car V are transmitted as they are from the monitoring terminal device 10 to the central monitoring device 20, and the image processing device 22 and central control device 21 of the central monitoring device 20 generate the monitoring image, associate the mapping information, and perform the projection conversion.
  • The in-vehicle control device 14 of the monitoring terminal device 10 of the present embodiment controls the image processing device 12 to acquire the imaging signals of the in-vehicle cameras 11a to 11e, and generates one monitoring image in which the image information of the in-vehicle cameras 11a to 11d installed clockwise or counterclockwise along the outer periphery of the body of the passenger car V is arranged in the order in which those cameras are installed.
  • In the present embodiment, the four in-vehicle cameras 11a to 11d are installed in the order 11a, 11b, 11c, 11d counterclockwise along the outer periphery VE of the body of the passenger car V. The in-vehicle control device 14 therefore connects the four images captured by the in-vehicle cameras 11a to 11d in the horizontal direction, in accordance with their installation order (in-vehicle cameras 11a ⇒ 11b ⇒ 11c ⇒ 11d), to generate a single monitoring image. In the monitoring image of the present embodiment, each image is arranged with the ground contact surface (road surface) of the passenger car V at its lower side, and the images are joined to each other at their sides in the height (vertical) direction with respect to the road surface.
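A minimal sketch of this concatenation step, assuming each camera frame arrives as a NumPy array of identical height and channel count; the function name and the use of NumPy are illustrative choices, not part of the patent.

```python
import numpy as np

def build_monitoring_image(front, left, rear, right):
    """Join the four frames side by side in installation order
    (11a -> 11b -> 11c -> 11d, i.e. counterclockwise along the body),
    with the road surface at the bottom of every frame."""
    frames = (front, left, rear, right)
    heights = {f.shape[0] for f in frames}
    assert len(heights) == 1, "all frames must share the same height"
    # e.g. four 480 x 640 frames become one 480 x 2560 monitoring image
    return np.hstack(frames)
```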
  • FIG. 6 is a diagram illustrating an example of the monitoring image K.
  • The monitoring image K of the present embodiment is formed by arranging, along the direction P from the left side to the right side of the drawing, the captured image GSP1 in which the front in-vehicle camera 11a images the area SP1, the captured image GSP2 in which the left-side in-vehicle camera 11b images the area SP2, the captured image GSP3 in which the rear in-vehicle camera 11c images the area SP3, and the captured image GSP4 in which the right-side in-vehicle camera 11d images the area SP4, in this order in the horizontal direction as a series of images.
  • Because the monitoring image K generated in this way is viewed from its left end toward the right with the image portions corresponding to the road surface (the vehicle contact surface) facing downward, the monitor can view it on the display 24 in a manner similar to looking around the vehicle V counterclockwise.
  • When one monitoring image K is generated, four images acquired at substantially the same photographing timing of the in-vehicle cameras 11a to 11d are used. The information contained in the monitoring image K is thereby synchronized, so the situation around the vehicle at a given timing can be accurately expressed.
  • The monitoring images K generated from captured images with substantially the same imaging timing may also be stored over time to generate a moving-image monitoring image K containing a plurality of monitoring images K per predetermined unit time. Generating the moving-image monitoring image K from images with the same imaging timing makes it possible to accurately represent changes in the situation around the vehicle.
  • A conventional central monitoring device has the disadvantage that it cannot simultaneously watch images (moving images) in a plurality of directions and cannot monitor the entire vehicle periphery on a single screen.
  • In contrast, the in-vehicle control device 14 of this embodiment produces one monitoring image K from the images of the plurality of in-vehicle cameras, so the entire periphery of the vehicle can be monitored at once.
  • The monitoring terminal device 10 of the present embodiment therefore generates the monitoring image K while compressing the amount of image data so that the number of pixels of the monitoring image K is substantially the same as the number of pixels of a single image from the in-vehicle cameras 11a to 11d.
  • Since the size of each image shown in FIGS. 5A to 5D is 480 × 640 pixels, compression processing is performed so that the size of the monitoring image K becomes 1280 × 240 pixels. This keeps the pixel count at 1280 × 240 = 307,200, the same as a single 480 × 640 frame, so that image processing and image reproduction can be performed by a general terminal device.
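A sketch of this compression step under the sizes given above; OpenCV's resize is used purely as an example of a resampling routine, not as the method prescribed by the patent.

```python
import cv2  # the choice of OpenCV is an assumption; any resampling routine would do

def compress_monitoring_image(monitoring_image, width=1280, height=240):
    """Shrink the stitched 480 x 2560 monitoring image to 1280 x 240 so that its
    pixel count (307,200) matches that of a single 480 x 640 camera frame."""
    return cv2.resize(monitoring_image, (width, height), interpolation=cv2.INTER_AREA)
```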
  • The in-vehicle control device 14 of the present embodiment can also attach to the monitoring image K a line figure indicating the boundary of each arranged image.
  • For example, the in-vehicle control device 14 can attach to the monitoring image K rectangular partition images Bb, Bc, Bd, Ba, and Ba′ between the arranged images as line figures indicating their boundaries.
  • The partition image functions as a frame for each captured image.
  • Since image distortion is large in the vicinity of the boundary of each captured image, arranging a partition image at the boundary makes it possible to hide the region with large distortion or to suggest that the distortion there is large.
  • The in-vehicle control device 14 of this embodiment can also generate the monitoring image K after correcting the distortion of the captured images. Distortion is likely to occur in the peripheral regions of each captured image, so it is desirable to correct the distortion of the captured images in advance using a predefined image conversion algorithm and correction amount.
  • The in-vehicle control device 14 may read out from the ROM the same projection model information as the projection model used for projecting the monitoring image K in the central monitoring device 20, project the captured images onto the projection plane of that projection model, and correct in advance the distortion generated on the projection surface.
  • The image conversion algorithm and the correction amount can be appropriately defined according to the characteristics of the in-vehicle cameras 11 and the shape of the projection model. Correcting in advance the distortion that arises when the monitoring image K is projected onto the projection plane of the projection model makes it possible to provide a monitoring image K with good visibility and little distortion, and also reduces the positional deviation between the images arranged side by side.
  • Next, the mapping information addition function will be described.
  • The in-vehicle control device 14 executes a process of associating with the monitoring image K the mapping information needed to project the generated monitoring image K onto the projection plane set on the side surface of a columnar projection model M whose bottom surface is the ground contact surface of the passenger car V.
  • The mapping information is information that allows the central monitoring device 20 receiving the monitoring image K to easily recognize the projection reference position.
  • FIG. 8 is a diagram showing an example of the projection model M of the present embodiment
  • FIG. 9 is a schematic sectional view taken along the xy plane of the projection model M shown in FIG.
  • The projection model M of this embodiment is a regular octagonal prism having a regular octagonal bottom surface and a height along the vertical direction (the z-axis direction in the figure).
  • The shape of the projection model M is not particularly limited as long as it is a columnar body having side surfaces adjacent to one another along the boundary of the bottom surface; a cylinder, a prism such as a triangular, quadrangular, or hexagonal prism, or an antiprism having a polygonal bottom surface and triangular side surfaces can also be used.
  • The bottom surface of the projection model M of this embodiment is parallel to the ground contact surface of the passenger car V.
  • Projection surfaces Sa, Sb, Sc, and Sd (hereinafter collectively referred to as the projection surface S), onto which an image of the surroundings of the passenger car V in contact with the bottom surface of the projection model M is projected, are set on the inner side surfaces of the projection model M.
  • The projection surface S may also be formed of surfaces spanning part of the projection surface Sa and part of the projection surface Sb, part of Sb and part of Sc, part of Sc and part of Sd, and part of Sd and part of Sa.
  • The monitoring image K is projected onto the projection plane S as an image of the surroundings of the passenger car V viewed from viewpoints R (R1 to R8, hereinafter collectively referred to as viewpoint R) above the projection model M surrounding the passenger car V.
  • The in-vehicle control device 14 associates with the monitoring image K, as mapping information, the reference coordinates of the captured images arranged at the right end or the left end of the monitoring image K.
  • For example, as mapping information (reference coordinates) indicating the start position or end position of the monitoring image K when it is projected onto the projection model M, the coordinates A(x, y) of the upper-left vertex of the captured image GSP1 and the coordinates B(x, y) of the upper-right vertex of the captured image GSP2 are attached to the monitoring image K.
  • The reference coordinates indicating the start position or the end position are not particularly limited, and may instead be the lower-left vertex of the captured image arranged at the left end or the lower-right vertex of the captured image arranged at the right end.
  • The mapping information may be attached to each pixel of the image data of the monitoring image K, or may be managed as a file separate from the monitoring image K.
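As one way of realizing the "separate file" option above, the mapping information could be written as a small sidecar file next to the monitoring image; the file naming, field names, and the example coordinates are assumptions for illustration.

```python
import json

def attach_mapping_info(image_path, start_xy, end_xy):
    """Store the projection reference coordinates (start / end of the monitoring
    image K on the projection model M) in a sidecar file next to the image."""
    mapping = {"start_reference": list(start_xy), "end_reference": list(end_xy)}
    with open(image_path + ".mapping.json", "w") as f:
        json.dump(mapping, f)

# hypothetical example coordinates:
# attach_mapping_info("monitoring_0001.png", (0, 0), (2559, 0))
```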
  • By associating with the monitoring image K, as mapping information, the information indicating its start position or end position, that is, the reference coordinates used as the reference in the projection processing, the central monitoring device 20 that receives the monitoring image K can easily recognize the reference position for the projection processing, and can therefore easily project the monitoring image K, in which the images are arranged in the installation order of the in-vehicle cameras 11a to 11d, onto the projection surface S on the side surface of the projection model M in that order.
  • That is, as shown in the figure, the captured image GSP1 of the area in front of the vehicle is projected onto the projection surface Sa positioned in the imaging direction of the in-vehicle camera 11a, the captured image GSP2 of the left side of the vehicle onto the projection surface Sb positioned in the imaging direction of the in-vehicle camera 11b, the captured image GSP3 of the area behind the vehicle onto the projection surface Sc positioned in the imaging direction of the in-vehicle camera 11c, and the captured image GSP4 of the right side of the vehicle onto the projection surface Sd positioned in the imaging direction of the in-vehicle camera 11d.
  • The monitoring image K projected onto the projection model M thus presents an image that can be seen as if looking around the passenger car V. That is, since the monitoring image K, in which the four images are arranged in a horizontal line according to the installation order of the in-vehicle cameras 11a to 11d, is projected onto side surfaces that are likewise arranged in the horizontal direction around the column of the projection model M, the image around the passenger car V is reproduced on the projection surface S of the columnar projection model M while its positional relationships are maintained.
  • The in-vehicle control device 14 of the present embodiment may also store, as mapping information, the correspondence between each coordinate value of the monitoring image K and the coordinate values of each projection plane S of the projection model M and attach it to the monitoring image K, or this correspondence may be stored in the central monitoring device 20 in advance.
  • the positions of the viewpoint R and the projection plane S shown in FIGS. 8 and 9 are examples, and can be arbitrarily set.
  • the viewpoint R can be changed by the operation of the operator.
  • The relationship between the viewpoint R and the projection position of the monitoring image K is defined in advance, and when the position of the viewpoint R is changed, a predetermined coordinate transformation is performed so that the monitoring image K as seen from the newly set viewpoint R can be projected onto the projection surfaces S (Sa to Sd). A known method can be used for this viewpoint conversion processing.
  • As described above, the in-vehicle control device 14 generates the monitoring image K based on the image information captured at a predetermined timing, associates with it the mapping information (reference coordinates) and the line figure (partition image) information indicating the boundaries, and stores these over time according to the imaging timing.
  • The in-vehicle control device 14 may store the monitoring image K as a single moving-image file containing a plurality of monitoring images K per predetermined unit time, or may store the monitoring image K in a form that can be transferred and reproduced by a streaming method.
  • The communication device 23 of the central monitoring device 20 receives the monitoring image K transmitted from the monitoring terminal device 10 and the mapping information associated with it.
  • As described above, this monitoring image K is an image in which the images of the four in-vehicle cameras 11a to 11d, installed at different positions on the body of the passenger car V along its outer periphery in the clockwise or counterclockwise direction, are arranged according to their installation order (the clockwise or counterclockwise order along the outer periphery of the body of the vehicle V).
  • The monitoring image K is also associated with the mapping information for projecting the monitoring image K onto the projection planes S of the octagonal prism projection model M.
  • the communication device 23 transmits the acquired monitoring image K and mapping information to the image processing device 22.
  • The image processing device 22 reads the projection model M stored in advance and, based on the mapping information, generates a display image by projecting the monitoring image K onto the projection planes Sa to Sd set on the side surfaces of the octagonal prism projection model M whose bottom surface is the ground contact surface of the passenger car V shown in FIGS. 8 and 9. Specifically, each pixel of the received monitoring image K is projected onto the corresponding pixel of the projection surfaces Sa to Sd according to the mapping information.
  • When projecting the monitoring image K onto the projection model M, the image processing device 22 recognizes the start point of the monitoring image K (its right end or left end) based on the reference coordinates received together with the monitoring image K, and performs the projection processing so that this start point coincides with the start point (the right end or left end of the projection surface S) defined in advance on the projection model M. The image processing device 22 also arranges a line figure (partition image) indicating the boundary of each image on the projection model M.
  • The partition image can be attached to the projection model M in advance, or can be attached to the monitoring image K after the projection processing.
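A simplified sketch of how the received monitoring image K might be cut into the textures for the projection planes Sa to Sd, assuming the start reference is given as an x offset and that each camera's portion spans one quarter of the image width; these are assumptions for illustration, not the patent's projection procedure.

```python
import numpy as np

def split_onto_projection_planes(monitoring_image, start_x=0):
    """Cut the monitoring image K into four textures for the planes Sa-Sd.
    start_x is the column of the start reference coordinate from the mapping info."""
    width = monitoring_image.shape[1]
    quarter = width // 4
    aligned = np.roll(monitoring_image, -start_x, axis=1)  # put the start reference at column 0
    return {
        "Sa": aligned[:, 0 * quarter:1 * quarter],  # front image GSP1
        "Sb": aligned[:, 1 * quarter:2 * quarter],  # left image GSP2
        "Sc": aligned[:, 2 * quarter:3 * quarter],  # rear image GSP3
        "Sd": aligned[:, 3 * quarter:4 * quarter],  # right image GSP4
    }
```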
  • the display 24 displays the monitoring image K projected on the projection plane S of the projection model M.
  • FIG. 10 shows an example of a display image of the monitoring image K.
  • Using the input device 25 such as a mouse or a keyboard, or the display 24 configured as a touch-panel input device, the supervisor can freely set and change the viewpoint. Since the correspondence between the viewpoint position and the projection plane S is defined in advance in the image processing device 22 or the display 24, the monitoring image K corresponding to the changed viewpoint can be displayed on the display 24 based on this correspondence.
  • FIG. 11 is a flowchart showing the operation on the monitoring terminal device 10 side, and FIGS. 12A and 12B are flowcharts showing the operation on the central monitoring device 20 side.
  • FIG. 13 shows an example of the database information, and FIG. 14 shows an example of the travel frequency information.
  • FIG. 15 is a first diagram and FIG. 16 is a second diagram for explaining how a new route is obtained.
  • The processing shown in FIG. 11 is executed in the monitoring terminal device 10 mounted on the passenger car V. The monitoring terminal device 10 can be mounted on a passenger car V3 belonging to a supervisor, such as a police vehicle or an emergency vehicle, and can also be mounted on a business passenger car V1 such as a taxi or a bus, or on a private passenger car V2. That is, based on the monitoring information acquired from a passenger car V3 belonging to the monitor, such as a police car, a command for transmitting the monitoring information of a monitoring point can be transmitted to a business passenger car V1 such as a taxi or a bus or to a private passenger car V2.
  • Conversely, a command to transmit the monitoring information of a monitoring point can be transmitted to a passenger car V3 belonging to a monitor such as a police car.
  • In this way, a taxi or the like can be guided to a monitoring point on a link that police vehicles travel with low frequency, or a police vehicle can be guided to a monitoring point on a link that taxis or the like travel with low frequency.
  • As shown in FIG. 11, the monitoring terminal device 10 acquires surrounding video and in-cabin video from the in-vehicle cameras 11 at a predetermined time interval (one routine of FIG. 11), and the image processing device 12 converts the video into image information (step ST1). The current position information of the passenger car V on which the monitoring terminal device 10 is mounted is then detected by the GPS-based position detection device 15 (step ST2).
  • In step ST3, it is determined whether or not the notification button 16 for reporting an abnormality has been pressed. If it has been pressed, the process proceeds to step ST4, where the image information acquired in step ST1 and the position information acquired in step ST2 are associated with the time information from the CPU's clock, and these are transmitted as monitoring information to the central monitoring device 20 via the communication device 13 and the telecommunications network 30 together with abnormality information indicating that an abnormality has occurred.
  • The occurrence of an abnormality related to security, such as an accident or a crime, is thus automatically transmitted to the central monitoring device 20 together with the position information of the passenger car V and the image information around it, which further strengthens monitoring in the city.
  • In this example the image information and the position information are acquired in the first steps ST1 and ST2, but they may instead be acquired at a timing between steps ST3 and ST4.
  • If the notification button 16 has not been pressed in step ST3, the process proceeds to step ST5, where the device communicates with the central monitoring device 20 and obtains a control command.
  • In step ST6, the monitoring terminal device 10 determines whether or not a monitoring information transmission command related to a monitoring point has been acquired from the central monitoring device 20. If such a command has been acquired, the process proceeds to step ST7, where it is determined whether or not the acquired monitoring information transmission command includes a route change command. If a route change command is included, the process proceeds to step ST8.
  • In step ST8, the monitoring terminal device 10 displays the designated monitoring point included in the route change command, or a link including the monitoring point, on the output device 18. The output device of this embodiment includes the display and a speaker. The display method is not particularly limited; the name of a facility at the monitoring point, the name of the street containing the link, and the like can be presented to the occupant as text or voice.
  • The navigation device 19 may also superimpose the monitoring point, or the link including it, on the map information MP and present it on the display 18. The occupant can thereby confirm the monitoring point before the route is changed; since it may be impossible for business reasons to stop by a monitoring point or monitoring link, presenting it first improves the occupant's convenience. The process then proceeds to step ST9.
  • In step ST9, the monitoring terminal device 10 causes the navigation device 19 to calculate a route passing through the monitoring point included in the monitoring information transmission command, or through the link including that monitoring point. As the method for calculating a route from the current position to the destination via the designated monitoring point, any method known at the time of filing can be used. If the occupant can travel along the newly calculated route, the occupant sets the route and route guidance is executed. The process then proceeds to step ST10: when the vehicle arrives at the designated monitoring point or link, the surroundings are imaged by the cameras 11 and image information is generated. After arrival, the process proceeds to step ST101, and monitoring information including the image information, position information, and time information is transmitted to the central monitoring device 20 in accordance with the contents of the monitoring information transmission command.
  • If the monitoring information transmission command does not include a route change command in step ST7, the process proceeds to step ST101 and, without changing the route, monitoring information including image information, position information, and time information is transmitted to the central monitoring device 20 in accordance with the monitoring information transmission command acquired in step ST6.
  • The monitoring information transmission command in the present embodiment does not necessarily require that image information be included. When the monitoring information request command is sent to the police or another supervisor, the situation is confirmed visually under the safety confirmation authority of the police or the like, and a simple report such as "abnormality / no abnormality" can serve as the monitoring information. When a storage command is included in the monitoring information transmission command, the image information, position information, and time information are stored.
  • Even if no monitoring information transmission command has been acquired from the central monitoring device 20 in step ST6, if the passenger car V is inside a predefined priority monitoring area (step ST102), the process proceeds to step ST103 and monitoring information including image information is transmitted. If no image transmission command has been acquired and the vehicle is not in a priority monitoring area, the process proceeds to step ST104 and monitoring information not including image information, that is, time information and position information, is transmitted to the central monitoring device 20.
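The branch structure of one routine of FIG. 11 can be condensed roughly as follows; every helper name (capture_surroundings, locate, fetch_command, and so on) is a placeholder standing in for the functions described above, not an API defined by the patent.

```python
def terminal_routine(terminal, central, navigation):
    """One pass of the FIG. 11 loop on the monitoring terminal device 10 (sketch)."""
    images = terminal.capture_surroundings()                     # ST1: cameras 11 -> image information
    position = terminal.locate()                                 # ST2: GPS-based position detection
    now = terminal.clock()

    if terminal.report_button_pressed():                         # ST3
        central.send(images=images, position=position, time=now, abnormality=True)  # ST4
        return

    command = central.fetch_command()                            # ST5
    if command is not None and command.requests_monitoring_info:  # ST6
        if command.route_change:                                 # ST7
            navigation.show_monitoring_point(command.monitoring_point)  # ST8
            navigation.route_via(command.monitoring_point)              # ST9
        # ST10 / ST101: image the surroundings at the monitoring point, then report
        central.send(images=terminal.capture_surroundings(),
                     position=terminal.locate(), time=terminal.clock())
    elif terminal.in_priority_monitoring_area(position):         # ST102 -> ST103
        central.send(images=images, position=position, time=now)
    else:                                                        # ST104: no image information
        central.send(position=position, time=now)
```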
  • FIG. 13 is a diagram illustrating an example of the information stored in the database 26.
  • Monitoring information including the image information, position information, and time information acquired from the passenger cars V (monitoring terminal devices 10) is stored in association with the position information; if position information is designated, the corresponding series of monitoring information can be retrieved.
  • The link ID to which the position information belongs and the vehicle speed of the passenger car may also be associated with the record.
  • The monitoring information can include a moving body ID (monitoring terminal device ID) that identifies the monitoring terminal device 10; the moving body ID may be the address of the communication device 13 of the monitoring terminal device 10.
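A guess at the shape of the database 26 as an SQLite table; the column names are assumptions chosen to mirror the fields listed above.

```python
import sqlite3

schema = """
CREATE TABLE IF NOT EXISTS monitoring_info (
    terminal_id   TEXT,   -- moving body ID, e.g. the address of communication device 13
    latitude      REAL,   -- position information
    longitude     REAL,
    link_id       TEXT,   -- link of the map information MP the position falls on
    vehicle_speed REAL,
    captured_at   TEXT,   -- time information
    image         BLOB    -- image information (may be absent)
);
"""

conn = sqlite3.connect("database26.sqlite")
conn.execute(schema)
conn.commit()
```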
  • In step ST12, based on the position information acquired in step ST11, the passenger cars V are displayed on the map information MP of the map database shown on the display 24, as in the upper left of FIG. 1. Since the position information of the passenger cars V is acquired and transmitted at a predetermined timing in each routine of FIG. 11, the supervisor can grasp the current position of each passenger car V in a timely manner.
  • In step ST13, it is determined whether or not abnormality information notified from the monitoring terminal device 10 of a passenger car V, that is, a notification that an abnormality relating to security such as an accident or a crime has occurred, has been received. This abnormality information is output when an occupant of the passenger car V presses the notification button 16 of the monitoring terminal device 10.
  • If there is abnormality information, the passenger car V from which it was output is identified in step ST14, image information and time information are received from that vehicle's monitoring terminal device 10, and the image information is displayed on the display 24. In addition, as shown in the upper left of FIG. 1, the passenger car displayed on the map information MP is highlighted, for example by changing its color, so that it can be distinguished from the other passenger cars. The position where the abnormality occurred can thereby be recognized on the map information MP, and the content of the abnormality can be grasped on the display 24.
  • In step ST15, a passenger car V traveling in the vicinity (within a predetermined distance) of the passenger car V that output the abnormality information is detected, and a transmission command for image information and time information is output to that passenger car V. Since image information can thus also be acquired from passenger cars V traveling near the vehicle that output the abnormality information, the content of the abnormality can be grasped in detail.
  • In step ST16, the position information of the passenger car V that output the abnormality information is transmitted to an emergency passenger car such as a police car, an ambulance, or a fire engine. Image information may be attached and transmitted as well in order to convey the content of the abnormality. The emergency passenger car can thus be dispatched before a report from the site arrives, making it possible to respond quickly to accidents and crimes.
  • In step ST17, all of the position information, image information, and time information received from the monitoring terminal devices 10 is recorded on a recording medium. This record is used to resolve accidents and crimes after the fact. If there is no abnormality information in step ST13, the process proceeds to step ST21 without performing steps ST14 to ST17.
  • step ST13 if there is no abnormality information, the processing of steps ST111 to T114 is performed.
  • The central monitoring device 20 refers to the database 26 and calculates and stores the traveling frequency for each link.
  • Specifically, the central monitoring device 20 refers to the map information MP it holds and, based on the collected data shown in FIG. 13, counts the number of passenger cars V whose position information is plotted on each link included in the map information MP; the traveling frequency can be calculated from this count.
  • An example of the travel frequency data of this embodiment is shown in FIG. 14. As shown in FIG. 14, the traveling frequency, expressed as the number of vehicles per unit time, is associated for each link with the current number of vehicles traveling on that link.
  • The unit time for calculating the traveling frequency can be set arbitrarily, for example one hour, four hours, half a day, 8:00 to 17:00, 17:00 or later, one day, and so on. The traveling frequency may also be calculated separately for weekdays and holidays by identifying whether a day is a weekday or a holiday.
  • Indicating the current number of traveling vehicles makes it possible to recognize the real-time traveling situation. For example, if the traveling frequency is low but the current number of traveling vehicles is large, it can be judged that there is no need to dispatch a passenger car V for monitoring; conversely, if the traveling frequency is high but the current number is extremely low, it can be judged that a passenger car V for monitoring needs to be dispatched specially.
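A compact sketch of the per-link counting described above; the report records, the unit-time window, and the freshness window for the current vehicle count are all assumptions for illustration.

```python
from collections import Counter

def travel_frequency(reports, unit_hours):
    """Vehicles per unit time for each link, counted from the collected records
    (each record is assumed to carry a link_id)."""
    counts = Counter(r["link_id"] for r in reports)
    return {link: n / unit_hours for link, n in counts.items()}

def current_vehicle_count(reports, link_id, now, window_minutes=5):
    """Distinct terminals that reported a position on the link within the last
    few minutes; the window length is an arbitrary example."""
    recent = {r["terminal_id"] for r in reports
              if r["link_id"] == link_id
              and (now - r["captured_at"]).total_seconds() <= window_minutes * 60}
    return len(recent)
```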
  • Next, the central monitoring device 20 specifies a monitoring point based on the traveling frequency.
  • The method of specifying the monitoring point is not particularly limited; a link whose traveling frequency is less than a predetermined value, or a point on such a link, can be specified as the monitoring point.
  • The traveling-frequency threshold used to specify the monitoring point may be provided for each link in consideration of its traffic volume, or a common threshold may be used.
  • The central monitoring device 20 can also specify as a monitoring point a point on a link other than the links whose traveling frequency is higher than a predetermined value.
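The selection itself then reduces to a threshold test, with either a per-link threshold or a common one; the numeric default below is an arbitrary example.

```python
def select_monitoring_links(frequency_by_link, threshold_by_link=None, default_threshold=2.0):
    """Links whose travel frequency falls below the applicable threshold
    become candidates for monitoring points."""
    thresholds = threshold_by_link or {}
    return [link for link, freq in frequency_by_link.items()
            if freq < thresholds.get(link, default_threshold)]
```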
  • The central monitoring device 20 then creates a route change command passing through the specified monitoring point and, in step ST114, transmits a monitoring information transmission command for that monitoring point, including the route change command, to the monitoring terminal device 10.
  • The monitoring information transmission command does not necessarily include an image information transmission command, and may be a command that requests only a report of a judgment such as "no abnormality / abnormality". This is because, when the monitoring information transmission command is transmitted to a police vehicle V3 belonging to the monitor, a monitoring effect is obtained simply from the judgment of a monitor such as the police, who has the authority of safety confirmation.
  • On the other hand, when the monitoring information transmission command is sent to a vehicle V2 belonging to a taxi company, a transport company, or another party other than the monitor, the monitoring information transmission command preferably includes a command requesting transmission of image information. This is because safety cannot be confirmed by a taxi company; safety must be confirmed by the police, a security company, or another party with the authority of safety confirmation, based on the provided image information.
  • A monitoring information transmission command containing only the information on the monitoring point may also be sent to the monitoring terminal device 10. In this case, the information on the monitoring point is presented to the occupant on the monitoring terminal device 10 side, and the occupant can have the navigation device 19 of the passenger car V search for a route passing through the monitoring point.
  • A command to move to the monitoring point may also be included in the monitoring information transmission command sent to the monitoring terminal device 10. Rather than simply indicating the position of the monitoring point, this forces the moving body V to go to the monitoring point, so the monitoring information at the monitoring point can be acquired reliably.
  • The form of the command to move to the monitoring point is not limited; it may be a voice command, a text command, or a destination setting command or travel route setting command for the navigation device 19.
  • The command to move to the monitoring point may be sent only to the monitoring terminal devices 10 (moving bodies) existing within a predetermined distance of the monitoring point. It may also be sent only to the monitoring terminal devices 10 (moving bodies) belonging to a monitor such as the police or the fire department, or, depending on the case, only to the monitoring terminal devices 10 mounted on business-use or private moving bodies V.
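A guess at the payload of the monitoring information transmission command and at how its recipients might be narrowed down; every field and the distance value are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class MonitoringCommand:
    monitoring_point: tuple        # (lat, lon) of the specified monitoring point
    route_change: bool = True      # ask the navigation device 19 to route via the point
    request_image: bool = True     # may be False for monitor-owned vehicles (police etc.)
    move_to_point: bool = False    # force the vehicle to the point rather than just show it

def choose_recipients(terminals, point, max_distance_km=2.0, monitor_owned_only=False):
    """Send the command only to terminals within a predetermined distance of the
    monitoring point, optionally only to vehicles belonging to the monitor."""
    return [t for t in terminals
            if t.distance_to(point) <= max_distance_km
            and (not monitor_owned_only or t.monitor_owned)]
```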
  • An example of the route change command is shown in FIGS. 15 and 16.
  • In the example of FIG. 15, the route originally selected by the vehicle itself is A ⇒ B ⇒ C ⇒ F (R11, R12). When a route change command including the monitoring link Z, or a monitoring point on the link Z, where the traveling frequency of the passenger cars V carrying the monitoring terminal devices 10 is low, is acquired from the central monitoring device 20 and it is judged that the route can be changed, the route is changed to A ⇒ D ⇒ E ⇒ F (R21, R22) so as to pass through the link Z.
  • In the example of FIG. 16, the route originally selected by the vehicle itself is likewise A ⇒ B ⇒ C ⇒ F (R31, R32). When a route change command including a monitoring link or monitoring point on a link other than the link W, where the traveling frequency of the passenger cars V carrying the monitoring terminal devices 10 is high, is acquired from the central monitoring device 20 and it is judged that the route can be changed, the route is changed to A ⇒ B ⇒ E ⇒ F (R41, R42) so as not to pass through the link W, that is, so as to pass through links other than the link W.
  • The passenger car V equipped with the monitoring terminal device 10 moves along the changed route to a link other than the link W, and sends to the central monitoring device 20 monitoring information including image information captured at a monitoring point on a link other than the link W, or monitoring information including the presence or absence of an abnormality there.
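The two replanning patterns of FIGS. 15 and 16 can be expressed with any ordinary shortest-path routine; networkx is used here only as an example, and the graph, node names, and link labels are assumptions.

```python
import networkx as nx  # example only; any shortest-path implementation would do

def route_via(graph, origin, destination, via):
    """FIG. 15 pattern: replan so the route passes through the monitoring point/link
    (e.g. A->B->C->F becomes A->D->E->F through link Z)."""
    first = nx.shortest_path(graph, origin, via)
    second = nx.shortest_path(graph, via, destination)
    return first + second[1:]

def route_avoiding(graph, origin, destination, avoid_edge):
    """FIG. 16 pattern: avoid the high-frequency link W by removing it before planning
    (e.g. A->B->C->F becomes A->B->E->F)."""
    pruned = graph.copy()
    pruned.remove_edge(*avoid_edge)
    return nx.shortest_path(pruned, origin, destination)
```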
  • the central monitoring device 10 can acquire monitoring information from the monitoring terminal device 10 mounted on the passenger car V belonging to a monitor such as a police or a security company.
  • a command for requesting monitoring information of a monitoring point specified based on the traveling frequency derived from the position information included in the monitoring information is mounted on the passenger car V belonging to a monitor such as the police or the security company. It may be sent to the monitoring terminal device 10 or may be sent to the monitoring terminal device 10 mounted on a business or private passenger car V such as a taxi or a transport vehicle other than the supervisor.
  • another police vehicle V3 may be dispatched to a monitoring point where the traveling frequency of the police vehicle V3 is low, or a business vehicle V2 such as a taxi may be dispatched to a monitoring point where the traveling frequency of the police vehicle V3 is low.
  • the central monitoring device 20 can acquire monitoring information from the monitoring terminal device 10 mounted on the passenger car V belonging to a business or an individual other than the monitor, such as a taxi or a transportation company.
  • a command requesting the monitoring information of the monitoring point specified based on the traveling frequency derived from the position information included in the monitoring information may likewise be sent to such monitoring terminal devices 10.
  • the police vehicle V3 may be dispatched to a monitoring point where the travel frequency of the business vehicle V2 is low (not high), or another business vehicle V2 such as a taxi may be dispatched to such a monitoring point.
  • in step ST21 shown in FIG. 12B, it is determined whether there is an image information transmission command from an emergency passenger car such as a police car, an ambulance, or a fire engine. If an image transmission command has been input, the process proceeds to step ST22. In step ST22, it is determined whether or not a passenger vehicle V exists in the area specified by the image information transmission command; if one exists, the process proceeds to step ST23. In step ST23, an image information transmission command is output to the passenger vehicle V existing in the specified area. As a result, image information from that passenger car V can be acquired in step ST11 of FIG. 12A in the next routine, and it can be transferred to the emergency passenger car or used in accordance with the purpose of the transmission command from the emergency passenger car. If the conditions of steps ST21 and ST22 are not met, the process proceeds to step ST24 without performing the processes of steps ST21 to ST23.
  • in step ST24, it is determined whether or not there is a passenger car V in the vicinity of a suspicious location such as a preset crime-prone area; if so, the process proceeds to step ST25 and an image information transmission command is output to that passenger car V. Suspicious locations are, for example, streets and alleys with poor security. As a result, the monitoring of such streets can be strengthened and crime prevention can be expected. If no passenger vehicle V exists in the region near the suspicious location, the process proceeds to step ST26 without performing the process of step ST25.
  • in step ST26, it is determined whether or not there is a passenger vehicle V in the vicinity of a priority monitoring position from which a priority monitoring object whose details should be monitored can be imaged. If a passenger vehicle V exists in the vicinity of the priority monitoring position, the process proceeds to step ST27, and a priority monitoring command requesting transmission of image information in which the priority monitoring target is magnified is output to that passenger vehicle V. As a result, the priority monitoring target can be monitored in detail, and a suspicious object that could cause an incident or an accident at the specified priority monitoring target can be detected effectively, so that crime prevention can be expected. If there is no passenger vehicle V in the vicinity of the priority monitoring position, the process proceeds to step ST28 without performing the process of step ST27.
  • in step ST28, based on the position information received from each passenger car V, it is determined whether there is a route within a predetermined area required to be monitored (not limited to the suspicious location and the priority monitoring area) on which no passenger car V has traveled within a predetermined time. When there is such a route, it is monitored whether a passenger vehicle V comes to travel on that route. If a passenger car V has most recently traveled on the route, the process proceeds to step ST29 and an image information transmission command is output to that passenger car V. Thereby, image information of a route that lies outside the suspicious location and the priority monitoring area and has a small traffic volume of passenger cars V can be acquired automatically. If there is no route that satisfies the condition of step ST28, the process returns to step ST11 of FIG. 12A without performing the process of step ST29.
  • the monitoring system of the present embodiment has the following effects. (1) Since the monitoring system 1 of the present example outputs a command to transmit monitoring information of a monitoring point specified based on the traveling frequency of the passenger cars V calculated for each link, a command requesting transmission of monitoring information related to a monitoring point on a link where the traveling frequency of the passenger vehicles V is low is transmitted. Therefore, even if a link with a low traveling frequency arises because the passenger cars V move randomly, monitoring information related to the monitoring point of that link can be acquired. As a result, the central monitoring device 20 can uniformly monitor the city using the monitoring terminal devices 10 mounted on randomly moving passenger cars.
  • since an image transmission command for transmitting monitoring information including image information of the monitoring point is output as the command, a party other than the monitor, such as a taxi company, can provide image information of the monitoring point specified based on the traveling frequency on behalf of the monitor.
  • since the monitoring system 1 of this example selects a point on a link with a low traveling frequency as the monitoring point, the passenger car V on which the monitoring terminal device 10 is mounted can be made to travel all over the city, and the same effect as (2) above can be achieved.
  • a point on a link other than a link with a high traveling frequency, that is, a point on a link with a low traveling frequency, is selected as the monitoring point. Therefore, by avoiding links with a high traveling frequency, the passenger car V can be guided to links where the traveling frequency is not high. As a result, the passenger car V on which the monitoring terminal device 10 is mounted can be made to travel evenly through the city, and the same effect as (2) above can be achieved.
  • since monitoring information is acquired from the monitoring terminal device 10 mounted on the passenger car V belonging to a monitor such as the police or a security company, it is possible to generate a command according to the traveling frequency for each link of the passenger cars belonging to that monitor.
  • in the monitoring system 1, since monitoring information is acquired from the monitoring terminal device 10 mounted on the passenger car V belonging to a business or an individual other than the monitor, such as a taxi or a transportation company, a command according to the traveling frequency for each link of the passenger cars belonging to that business or individual can be generated. As a result, such vehicles can be guided to links where their travel frequency is low (not high), supplementing the monitoring performed by a monitor such as the police.
  • a business vehicle V2 such as a taxi can be dispatched to a monitoring point where the traveling frequency of the police vehicle V3 is low, and another business vehicle V2 such as a taxi can be dispatched to a monitoring point where the travel frequency of the business vehicle V2 is low.
  • the monitoring point included in the command or the link including the monitoring point is displayed superimposed on the map information MP.
  • the occupant can visually check the position of the monitoring point or the link including the monitoring point with respect to the route intended by the passenger.
  • the monitoring point can be incorporated as a via point on the route to the intended destination.
  • the monitoring method of this example has the same operations and effects as the monitoring system including the monitoring terminal device 10 and the central monitoring device 20.
  • the position information of the passenger car V and the image information from the in-vehicle cameras 11a to 11e are acquired.
  • image information from the fixed camera 11f installed in the city, shown in FIG., may also be obtained.
  • as the passenger car V that acquires the position information and the image information, it is desirable to use a taxi V1 or a route bus that travels in a predetermined area as shown in FIG. 1, but a private passenger car V2 or an emergency passenger car V3 may also be used.
  • the image information from the in-vehicle camera 11e inside the cabin may be omitted.
  • the number of the four on-vehicle cameras 11a to 11d may be three or less, particularly in an environment where image information can be acquired from many passenger cars V, such as a monitoring area where there is a large amount of traffic.
  • the central control device 21 corresponds to travel frequency calculation means and command output means
  • the communication device 23 and input device 25 correspond to information acquisition means, abnormality information reception means and command output means according to the present invention.
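The route change handled in FIGS. 15 and 16 amounts to choosing, among candidate routes, one that passes through a designated low-frequency monitoring link or avoids a designated high-frequency link. The following is a minimal Python sketch of that selection logic only; the function name, route representation, and link labels are illustrative assumptions, not taken from the patent.

```python
def choose_route(candidate_routes, must_include=None, must_avoid=None):
    """Pick a route (a list of link IDs) consistent with a route change command:
    one that passes through a low-frequency monitoring link (must_include),
    or one that stays off a high-frequency link (must_avoid)."""
    for route in candidate_routes:
        if must_include is not None and must_include not in route:
            continue
        if must_avoid is not None and must_avoid in route:
            continue
        return route
    return None  # keep the current route if no candidate satisfies the command


# Example loosely mirroring the text: the original choice corresponds to
# A->B->C->F, and the command asks the vehicle to pass through monitoring link Z.
routes = [["R11", "R12"],        # original route A->B->C->F
          ["R21", "Z", "R22"]]   # alternative A->D->E->F via monitoring link Z
print(choose_route(routes, must_include="Z"))   # -> ['R21', 'Z', 'R22']
```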

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Alarm Systems (AREA)

Description

Monitoring system

The present invention relates to a monitoring system.
This application claims priority based on Japanese Patent Application No. 2012-011439 filed on January 23, 2012. For designated countries where incorporation by reference of documents is permitted, the contents described in that application are incorporated into the present application by reference and made a part of the description of the present application.

A security device is known that detects the occurrence of an abnormality by installing a plurality of security camera devices in shopping streets, at store entrances, at home entrances, and elsewhere in the city, and monitoring the surrounding images captured by those security camera devices (Patent Document 1).

JP 2011-215767 A

However, when the city is monitored using cameras mounted on moving bodies that move at random, links with a high traveling frequency and links with a low traveling frequency arise, so there is a problem that the city cannot be monitored evenly.

An object of the present invention is to provide a monitoring system capable of monitoring the entire city evenly even when cameras mounted on moving bodies are used.

The present invention achieves the above object by outputting a command to transmit monitoring information of a monitoring point specified based on the traveling frequency of moving bodies calculated for each link.

According to the present invention, a command requesting transmission of monitoring information related to a monitoring point on a link where the traveling frequency of the moving bodies is low is transmitted. Therefore, even if a link with a low traveling frequency arises because the moving bodies move randomly, the monitoring information related to the monitoring point of that link can be acquired. As a result, the central monitoring device can monitor the entire city evenly using the monitoring terminal devices mounted on randomly moving bodies.

A schematic diagram showing a monitoring system according to an embodiment of the present invention.
A block diagram showing the monitoring system of FIG. 1.
A perspective view showing the arrangement of the in-vehicle cameras and their imaging ranges in the monitoring system of FIG. 1.
A plan view showing the arrangement of the in-vehicle cameras and their imaging ranges in the monitoring system of FIG. 1.
A diagram showing an example of an image captured by the front in-vehicle camera.
A diagram showing an example of an image captured by the right-side in-vehicle camera.
A diagram showing an example of an image captured by the rear in-vehicle camera.
A diagram showing an example of an image captured by the left-side in-vehicle camera.
A diagram showing an example of an image captured by the in-vehicle camera inside the cabin.
A diagram showing an example of a monitoring image generated based on a plurality of images.
A diagram for explaining distortion correction processing of the monitoring image.
A schematic diagram showing an example of a projection model.
A schematic cross-sectional view along the xy plane of the projection model shown in FIG. 10.
A diagram showing an example of an image displayed on the display of the central monitoring device.
A flowchart showing the main control contents on the monitoring terminal device side of the monitoring system of FIG. 1.
A flowchart (part 1) showing the main control contents on the central monitoring device side of the monitoring system of FIG. 1.
A flowchart (part 2) showing the main control contents on the central monitoring device side of the monitoring system of FIG. 1.
A diagram showing an example of information in the database.
A diagram showing an example of information relating to the traveling frequency.
A first diagram for explaining how to obtain a new route.
A second diagram for explaining how to obtain a new route.

In the embodiment described below, the monitoring system according to the present invention is embodied as a monitoring system 1 in which authorities such as a police station and a fire station centrally monitor the security of a city. That is, the position information of each of a plurality of moving bodies, image information around each moving body, and time information are acquired at a predetermined timing, and these pieces of position information, image information, and time information are transmitted via wireless communication to a central monitoring device installed at the authorities. The position information is displayed on map information, and the image information and the time information are displayed on a display as necessary. For this purpose, as shown in FIG. 1, the monitoring system 1 of this example includes monitoring terminal devices 10 that acquire monitoring information such as position information and image information, and a central monitoring device 20 that acquires and processes the monitoring information via a telecommunications network 30.

FIG. 2 is a block diagram showing a specific configuration of the monitoring terminal device 10 and the central monitoring device 20. The monitoring system of this embodiment can acquire monitoring information related to monitoring points on links with a low traveling frequency.

The monitoring terminal device 10 is a terminal device mounted on each of a plurality of moving bodies V, and has a position detection function for detecting the position information of each of the plurality of moving bodies V, an image generation function for generating image information by imaging the surroundings of the moving body with cameras mounted on each of the moving bodies, a time detection function, an information acquisition control function for acquiring the position information, the image information, and the time information at a predetermined timing, a monitoring information generation function for generating monitoring information including the acquired position information and/or image information, a communication function for outputting the position information, the image information, and the time information to the central monitoring device 20 and acquiring commands from the central monitoring device 20, and a function for reporting the occurrence of an abnormality. For this purpose, it includes a plurality of in-vehicle cameras 11a to 11e, an image processing device 12, a communication device 13, an in-vehicle control device 14, a position detection device 15, and a notification button 16. The monitoring terminal device 10 of this embodiment also includes a display 18 and a navigation device 19. Since the time information is mainly used for post-event analysis, it may be omitted.

The moving body V on which the monitoring terminal device 10 is mounted is not particularly limited as long as it travels in the target monitoring area, and includes moving bodies such as passenger cars, motorcycles, industrial vehicles, and trams. Passenger cars include business vehicles V1, private passenger cars V2, and emergency passenger cars V3; among these, taxis and route buses V1 that travel randomly and constantly in a predetermined area are particularly suitable. FIG. 1 illustrates a taxi V1, a private passenger car V2, and an emergency passenger car V3 such as a police car, a fire engine, or an ambulance; these are collectively referred to as the moving body V or the passenger car V.

Each moving body V is equipped with a plurality of in-vehicle cameras 11a to 11e (hereinafter collectively referred to as the cameras 11), an image processing device 12, a communication device 13, an in-vehicle control device 14, a position detection device 15, and a notification button 16. The camera 11 is composed of a CCD camera or the like, images the surroundings of the moving body V, and outputs the imaging signal to the image processing device 12. The image processing device 12 reads the imaging signal from the camera 11 and converts it into image information. Details of this image processing will be described later.

The position detection device 15 is composed of a GPS device and its correction device and the like, detects the current position of the moving body V, and outputs it to the in-vehicle control device 14. The notification button 16 is an input button installed in the passenger compartment, and is a manual button for inputting information to report an abnormality when the driver or a passenger discovers an incident (an event related to security such as an accident, a fire, or a crime). This information can include the position information of the moving body V that reported the abnormality.

The in-vehicle control device 14 is composed of a CPU, a ROM, and a RAM. When the notification button 16 is pressed, it controls the image processing device 12, the communication device 13, and the position detection device 15, and outputs the image information generated by the image processing device 12, the position information of the moving body V detected by the position detection device 15, and the time information from a clock built into the CPU to the central monitoring device 20 via the communication device 13 and the telecommunications network 30. It also acquires commands requesting information, such as an image transmission command, from the central monitoring device 20 via the telecommunications network 30 and the communication device 13; it then controls the image processing device 12, the communication device 13, and the position detection device 15, and outputs monitoring information including the image information generated by the image processing device 12, the position information of the moving body V detected by the position detection device 15, and the time information from the clock built into the CPU to the central monitoring device 20 via the communication device 13 and the telecommunications network 30. The in-vehicle control device 14 can store the monitoring information, including the image information, the position information, and the time information, for at least a predetermined time.
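As a rough illustration of the items the in-vehicle control device 14 is described as bundling together, the following Python sketch packages image, position, and time information into one monitoring-information record; the field names and function are illustrative assumptions, not part of the patent.

```python
import time

def build_monitoring_info(image_bytes, position, abnormal=False):
    """Package the items the in-vehicle control device sends to the central
    monitoring device: image information, position information, time information,
    and an abnormality flag set when the notification button is pressed."""
    return {
        "image": image_bytes,    # monitoring image produced by the image processor
        "position": position,    # (latitude, longitude) from the GPS-based detector
        "time": time.time(),     # time information from the on-board clock
        "abnormal": abnormal,    # True when the occupant reported an incident
    }
```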

The communication device 13 is communication means capable of wireless communication, and exchanges information with the communication device 23 of the central monitoring device 20 via the telecommunications network 30. When the telecommunications network 30 is a commercial telephone network, a general-purpose mobile phone communication device can be used; when the telecommunications network 30 is a telecommunications network dedicated to the monitoring system 1 of this example, dedicated communication devices 13 and 23 can be used. Instead of the telecommunications network 30, a wireless LAN, WiFi (registered trademark), WiMAX (registered trademark), Bluetooth (registered trademark), a dedicated wireless line, or the like can also be used.

The central monitoring device 20 has an information acquisition function for acquiring the position information and the image information output from the monitoring terminal devices 10 described above, a storage function for storing the acquired monitoring information in the database 26 in association with the position information, a travel frequency calculation function for calculating, with reference to the database 26 and map information MP including link information, the traveling frequency of the passenger cars V for each link of the link information, a command output function for specifying a monitoring point included in a link to be monitored based on the calculated traveling frequency and outputting a command to transmit the monitoring information of that monitoring point, and a display control function for displaying the map information MP read from the map database, displaying the received position information on the map information MP, and displaying the received image information on the display 24.
The central monitoring device 20 can also have a command output function for specifying a monitoring point included in a link to be monitored based on the calculated traveling frequency, and outputting a command to move a moving body V to the specified monitoring point and to transmit the monitoring information of that monitoring point.
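The per-link traveling frequency described here can be pictured as counting how often position fixes from the terminals map onto each road link, and then taking points on under-visited links as monitoring points. The sketch below, in Python, illustrates that idea only; the map-matching function, link midpoints, and threshold are illustrative assumptions rather than the patent's method.

```python
from collections import Counter

def travel_frequency_per_link(position_records, match_to_link):
    """Derive a per-link traversal count from stored position information.
    position_records: iterable of (lat, lon, time) tuples from the terminals.
    match_to_link: assumed map-matching function returning a link ID."""
    return Counter(match_to_link(lat, lon) for (lat, lon, _t) in position_records)

def pick_monitoring_points(link_frequency, link_midpoints, threshold=5):
    """Select a representative point on every link whose traversal count is below
    the threshold; links absent from link_frequency were never traversed (count 0)."""
    return {link: point for link, point in link_midpoints.items()
            if link_frequency.get(link, 0) < threshold}
```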

The central control device 21 is composed of a CPU, a ROM, and a RAM, controls the image processing device 22, the communication device 23, and the display 24, receives the position information, the image information, and the time information transmitted from the monitoring terminal devices 10, and displays them on the display 24 after applying image processing as necessary.

The image processing device 22 has a map database, displays the map information MP from the map database on the display 24, and superimposes the position information detected by the position detection device 15 of the monitoring terminal device 10 on the map information MP. It also performs image processing for displaying, on the display 24, the image information captured by the in-vehicle cameras 11 of the monitoring terminal device 10 and processed by the image processing device 12.

The display 24 can be configured by, for example, a liquid crystal display device large enough to display two window screens on one screen, or by two liquid crystal display devices each displaying one of the two window screens. One window screen displays a screen in which the position information of each moving body V is superimposed on the map information MP (see FIG. 1), and the other window screen displays the image information captured by the in-vehicle cameras 11.

The input device 25 is composed of a keyboard or a mouse, and is used to input an information acquisition command to be output to a desired moving body V, or to input processing commands for the various kinds of information displayed on the display 24. When a link and/or a monitoring point with a low traveling frequency of the passenger cars V, as described above, is selected, the monitor can also input this monitoring point via the input device 25. Although not particularly limited, the monitor can designate a monitoring point by clicking (selecting and inputting) the icon of a point superimposed on the map information MP, and can set a monitoring area based on this monitoring point.

The communication device 23 is communication means capable of wireless communication, and exchanges information with the communication device 13 of the monitoring terminal device 10 via the telecommunications network 30. When the telecommunications network 30 is a commercial telephone network, a general-purpose mobile phone communication device can be used; when the telecommunications network 30 is a telecommunications network dedicated to the monitoring system 1 of this example, dedicated communication devices 13 and 23 can be used.

Next, the mounting positions and imaging ranges of the in-vehicle cameras 11a to 11e will be described. Here, a passenger car V is taken as an example of the moving body V. The cameras 11a to 11e are configured using an imaging element such as a CCD; the four in-vehicle cameras 11a to 11d are installed at different positions outside the passenger car V and respectively capture four directions around the vehicle. The camera 11 of this embodiment has a zoom function for magnifying a subject, and can arbitrarily change the focal length or the imaging magnification according to a control command.

For example, as shown in FIG. 3, the in-vehicle camera 11a installed at a predetermined position at the front of the passenger car V, such as the front grille, captures objects or the road surface in the area SP1 in front of the passenger car V and in the space ahead of it (front view). The in-vehicle camera 11b installed at a predetermined position on the left side of the passenger car V, such as at the left side mirror, captures objects or the road surface in the area SP2 on the left side of the passenger car V and in the surrounding space (left side view). The in-vehicle camera 11c installed at a predetermined position at the rear of the passenger car V, such as at the rear finisher or the roof spoiler, captures objects or the road surface in the area SP3 behind the passenger car V and in the space behind it (rear view). The in-vehicle camera 11d installed at a predetermined position on the right side of the passenger car V, such as at the right side mirror, captures objects or the road surface in the area SP4 on the right side of the passenger car V and in the surrounding space (right side view). Although not shown in FIG. 3, one in-vehicle camera 11e is installed, for example, on the ceiling inside the passenger car, images the interior area SP5 as shown in FIG. 4, and is used for preventing or reporting crimes such as taxi fare evasion or robbery.

FIG. 4 is a view of the arrangement of the in-vehicle cameras 11a to 11e as seen from above the passenger car V. As shown in the figure, the four cameras, namely the in-vehicle camera 11a that images the area SP1, the in-vehicle camera 11b that images the area SP2, the in-vehicle camera 11c that images the area SP3, and the in-vehicle camera 11d that images the area SP4, are installed along the outer periphery VE of the body of the passenger car V in the counterclockwise or clockwise direction. That is, when the outer periphery VE of the body of the passenger car V is followed counterclockwise, as indicated by the arrow C in the figure, the in-vehicle camera 11b is installed to the left of the in-vehicle camera 11a, the in-vehicle camera 11c to the left of the in-vehicle camera 11b, the in-vehicle camera 11d to the left of the in-vehicle camera 11c, and the in-vehicle camera 11a to the left of the in-vehicle camera 11d. Conversely, when the outer periphery VE of the body of the passenger car V is followed clockwise, opposite to the direction of the arrow C, the in-vehicle camera 11d is installed to the right of the in-vehicle camera 11a, the in-vehicle camera 11c to the right of the in-vehicle camera 11d, the in-vehicle camera 11b to the right of the in-vehicle camera 11c, and the in-vehicle camera 11a to the right of the in-vehicle camera 11b.

FIG. 5A shows an example of an image GSP1 in which the front in-vehicle camera 11a images the area SP1, FIG. 5B shows an example of an image GSP2 in which the left-side in-vehicle camera 11b images the area SP2, FIG. 5C shows an example of an image GSP3 in which the rear in-vehicle camera 11c images the area SP3, FIG. 5D shows an example of an image GSP4 in which the right-side in-vehicle camera 11d images the area SP4, and FIG. 5E shows an example of an image GSP5 in which the interior in-vehicle camera 11e images the interior area SP5. The size of each image is 480 pixels high by 640 pixels wide. The image size is not particularly limited, and may be any size that allows moving-image playback on a general terminal device.

The number and positions of the in-vehicle cameras 11 can be determined as appropriate according to the size and shape of the passenger car V, the method of setting the detection areas, and so on. Each of the plurality of in-vehicle cameras 11 described above is assigned an identifier corresponding to its arrangement, and the in-vehicle control device 14 can identify each in-vehicle camera 11 based on its identifier. The in-vehicle control device 14 can also send an imaging command or another command to a specific in-vehicle camera 11 by attaching its identifier to the command signal.

The in-vehicle control device 14 controls the image processing device 12 to acquire the imaging signals captured by the in-vehicle cameras 11, and the image processing device 12 processes the imaging signals from the in-vehicle cameras 11 and converts them into the image information shown in FIGS. 5A to 5E. The in-vehicle control device 14 then generates a monitoring image based on the four pieces of image information shown in FIGS. 5A to 5D (image generation function), associates with the monitoring image the mapping information for projecting it onto a projection plane set on the side surface of a columnar projection model (mapping information addition function), and outputs the result to the central monitoring device 20. The image generation function and the mapping information addition function are described in detail below.

The processing of generating a monitoring image based on the four pieces of image information obtained by imaging the surroundings of the passenger car V and associating the mapping information with it can be executed by the monitoring terminal device 10, as in this example, or by the central monitoring device 20. In the latter case, the four pieces of image information obtained by imaging the surroundings of the passenger car V are transmitted as they are from the monitoring terminal device 10 to the central monitoring device 20, and the image processing device 22 and the central control device 21 of the central monitoring device 20 generate the monitoring image, associate the mapping information with it, and perform the projection conversion.

First, the image generation function will be described. The in-vehicle control device 14 of the monitoring terminal device 10 of this embodiment controls the image processing device 12 to acquire the imaging signals of the in-vehicle cameras 11a to 11e, and generates a single monitoring image in which the image information of the in-vehicle cameras 11a to 11d, installed clockwise or counterclockwise along the outer periphery of the body of the passenger car V, is arranged in the order in which these in-vehicle cameras 11a to 11d are installed.

As described above, in this embodiment the four in-vehicle cameras 11a to 11d are installed counterclockwise along the outer periphery VE of the body of the passenger car V in the order of the cameras 11a, 11b, 11c, and 11d. The in-vehicle control device 14 therefore joins the four images captured by the in-vehicle cameras 11a to 11d horizontally, in the order in which the cameras are installed (11a → 11b → 11c → 11d), to generate a single monitoring image. In the monitoring image of this embodiment, each image is arranged so that the ground contact surface (road surface) of the passenger car V forms its lower side, and the images are connected to each other at their sides in the height direction (vertical direction) with respect to the road surface.
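A minimal sketch of this horizontal joining step, assuming the four frames arrive as NumPy arrays of identical height; the function name and dummy frame sizes are illustrative, not taken from the patent.

```python
import numpy as np

def build_monitoring_image(front, left, rear, right):
    """Join the four camera frames side by side in installation order
    (11a -> 11b -> 11c -> 11d), with the road surface at the bottom edge
    of each frame. Frames are assumed to be HxWx3 uint8 arrays."""
    return np.hstack([front, left, rear, right])

# four dummy 480x640 frames produce one 480x2560 monitoring image
frames = [np.zeros((480, 640, 3), dtype=np.uint8) for _ in range(4)]
panorama = build_monitoring_image(*frames)
print(panorama.shape)   # -> (480, 2560, 3)
```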

FIG. 6 is a diagram showing an example of the monitoring image K. As shown in the figure, the monitoring image K of this embodiment is composed, along the direction P from the left side to the right side of the drawing, of the captured image GSP1 in which the front in-vehicle camera 11a images the area SP1, the captured image GSP2 in which the left-side in-vehicle camera 11b images the area SP2, the captured image GSP3 in which the rear in-vehicle camera 11c images the area SP3, and the captured image GSP4 in which the right-side in-vehicle camera 11d images the area SP4, arranged horizontally in this order as one continuous series of images. By displaying the monitoring image K generated in this way in order from the left end to the right, with the image portion corresponding to the road surface (the contact surface of the vehicle) at the bottom, the monitor can view it on the display 24 as if looking around the vehicle V counterclockwise.

When one monitoring image K is generated, four images acquired with the capturing timings of the in-vehicle cameras 11a to 11d substantially simultaneous are used. Since the information contained in the monitoring image K can thereby be synchronized, the situation around the vehicle at a given timing can be represented accurately.

The monitoring images K generated from captured images whose capturing timings are substantially simultaneous may also be stored over time, and a moving-image monitoring image K containing a plurality of monitoring images K per predetermined unit time may be generated. By generating the moving-image monitoring image K based on images with the same capturing timing, changes in the situation around the vehicle can be represented accurately.

If the images of each imaging area were stored over time and a moving-image monitoring image generated for each imaging area were transmitted to the central monitoring device 20, some central monitoring devices 20 might not be able to play back a plurality of moving images at the same time, depending on their functions. In such a conventional central monitoring device 20, a plurality of moving images cannot be reproduced and displayed simultaneously, so when each moving image is reproduced, the screen has to be switched and the moving images reproduced one by one. That is, with the conventional central monitoring device 20, images (moving images) in a plurality of directions cannot be viewed at the same time, and the entire surroundings of the vehicle cannot be monitored on a single screen.

In contrast, the in-vehicle control device 14 of this embodiment generates one monitoring image K from a plurality of images, so images in different imaging directions can be played back as a moving image simultaneously regardless of the functions of the central monitoring device 20. That is, by reproducing the monitoring image K continuously (moving image reproduction), the four images included in the monitoring image K are reproduced continuously and simultaneously, and state changes in areas in different directions can be monitored on a single screen.

The monitoring terminal device 10 of this embodiment can also generate the monitoring image K by compressing the image data so that the number of pixels of the monitoring image K is substantially the same as the number of pixels of an image from the in-vehicle cameras 11a to 11d. While the size of each image shown in FIGS. 5A to 5D is 480 × 640 pixels, in this embodiment compression is performed so that the size of the monitoring image K becomes 1280 × 240 pixels, as shown in FIG. 6. As a result, the size of the monitoring image K (1280 × 240 = 307,200 pixels) is equal to the size of each individual image (480 × 640 = 307,200 pixels), so image processing and image playback can be performed regardless of the functions of the central monitoring device 20 that receives the monitoring image K.
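A quick check of the pixel budget described above, using the figures given in the text; the breakdown of 1280 × 240 into four 320 × 240 tiles is an illustrative assumption consistent with the horizontal joining, not stated explicitly in the patent.

```python
per_camera = 480 * 640          # one captured image: 307,200 pixels
panorama   = 1280 * 240         # compressed monitoring image K: 307,200 pixels
assert per_camera == panorama   # K fits within the pixel count of a single camera image

# each of the four images would therefore shrink to roughly 320 x 240 before joining
print(1280 // 4, 240)           # -> 320 240
```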

The in-vehicle control device 14 of this embodiment can also add to the monitoring image K line figures indicating the boundaries of the arranged images. Taking the monitoring image K shown in FIG. 6 as an example, the in-vehicle control device 14 can add rectangular partition images Bb, Bc, Bd, Ba, and Ba' between the images as line figures indicating the boundaries of the arranged images. By placing partition images at the boundaries of the four images in this way, each image with a different imaging direction can be recognized separately within the series of the monitoring image K; that is, the partition images function as frames for the captured images. Also, since image distortion is large near the boundary of each captured image, placing a partition image at the boundary makes it possible to hide the image of the highly distorted region, or to indicate that the distortion there is large.

The in-vehicle control device 14 of this embodiment can also generate the monitoring image K after correcting the distortion that would arise when the four images are projected onto the projection plane set on the side surface of the projection model described later. The peripheral regions of a captured image are prone to distortion, and in particular, with in-vehicle cameras 11 using wide-angle lenses, the distortion of the captured image tends to be large. It is therefore desirable to correct the distortion of the captured images using a predefined image conversion algorithm and correction amount.

Although not particularly limited, as shown in FIG. 7, the in-vehicle control device 14 can read from its ROM information on the same projection model as the one onto which the central monitoring device 20 projects the monitoring image K, project the captured images onto the projection plane of that projection model, and correct in advance the distortion that arises on the projection plane. The image conversion algorithm and the correction amount can be defined appropriately according to the characteristics of the in-vehicle cameras 11 and the shape of the projection model. By correcting in advance the distortion that occurs when the image K is projected onto the projection plane of the projection model, a monitoring image K with little distortion and good visibility can be provided. Correcting the distortion in advance also reduces the positional misalignment between the images arranged side by side.

Next, the mapping information addition function will be described. In the monitoring terminal device 10 of this embodiment, the in-vehicle control device 14 executes processing to associate with the monitoring image K the mapping information for projecting the generated monitoring image K onto the projection plane set on the side surface of a columnar projection model M whose bottom surface is the ground contact surface of the passenger car V. The mapping information is information that allows the central monitoring device 20 receiving the monitoring image K to easily recognize the projection reference position. FIG. 8 is a diagram showing an example of the projection model M of this embodiment, and FIG. 9 is a schematic cross-sectional view along the xy plane of the projection model M shown in FIG. 8.

As shown in FIGS. 8 and 9, the projection model M of this embodiment is a regular octagonal prism whose bottom surface is a regular octagon and which has height along the vertical direction (the z-axis direction in the figures). The shape of the projection model M is not particularly limited as long as it is a columnar body having side surfaces adjacent to one another along the boundary of the bottom surface; it may be a cylinder, a prism such as a triangular, quadrangular, or hexagonal prism, or an antiprism with a polygonal bottom surface and triangular side surfaces.

As shown in the figures, the bottom surface of the projection model M of this embodiment is parallel to the ground contact surface of the passenger car V. Projection surfaces Sa, Sb, Sc, and Sd (hereinafter collectively referred to as the projection surface S), onto which images of the surroundings of the passenger car V standing on the bottom surface of the projection model M are projected, are set on the inner side surfaces of the projection model M. The projection surface S can also be composed of a part of the projection surface Sa and a part of the projection surface Sb, a part of the projection surface Sb and a part of the projection surface Sc, a part of the projection surface Sc and a part of the projection surface Sd, and a part of the projection surface Sd and a part of the projection surface Sa. The monitoring image K is projected onto the projection surface S as an image that looks down on the passenger car V from viewpoints R (R1 to R8, hereinafter collectively referred to as the viewpoint R) above the projection model M surrounding the passenger car V.

The in-vehicle control device 14 of this embodiment associates with the monitoring image K, as the mapping information, the reference coordinates of the captured images arranged at the right end or the left end. Taking the monitoring image K shown in FIG. 6 as an example, the in-vehicle control device 14 attaches to the monitoring image K, as mapping information (reference coordinates) indicating the start or end position of the monitoring image K when it is projected onto the projection model M, the coordinates A(x, y) of the upper-left vertex of the captured image GSP1 arranged at the right end and the coordinates B(x, y) of the upper-right vertex of the captured image GSP2 arranged at the left end. The reference coordinates indicating the start or end position are not particularly limited, and may be the lower-left vertex of the monitoring image K arranged at the left end or the lower-right vertex of the monitoring image K arranged at the right end. The mapping information may be attached to each pixel of the image data of the monitoring image K, or may be managed as a file separate from the monitoring image K.
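A minimal sketch of how such reference coordinates might accompany the monitoring image when it is sent; the container layout and field names are illustrative assumptions, since the patent only requires that the start/end reference coordinates be associated with the image.

```python
def attach_mapping_info(monitoring_image, start_xy, end_xy):
    """Bundle the reference coordinates that tell the receiver where the
    projection of the panorama starts and ends on the projection model."""
    return {
        "image": monitoring_image,
        "mapping": {
            "start": start_xy,   # e.g. coordinates A(x, y) in FIG. 6
            "end": end_xy,       # e.g. coordinates B(x, y) in FIG. 6
        },
    }
```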

In this way, by associating with the monitoring image K, as the mapping information, the information indicating the start or end position of the monitoring image K, that is, the reference coordinates used as the reference in the projection processing, the central monitoring device 20 that receives the monitoring image K can easily recognize the reference position for the projection processing, and can therefore project the monitoring image K, arranged in the installation order of the in-vehicle cameras 11a to 11d, onto the projection surface S on the side surfaces of the projection model M easily, quickly, and in sequence. That is, as shown in FIG. 9, the captured image GSP1 of the area in front of the vehicle is projected onto the projection surface Sa located in the imaging direction of the in-vehicle camera 11a, the captured image GSP2 of the left side of the vehicle is projected onto the projection surface Sb located in the imaging direction of the in-vehicle camera 11b, the captured image GSP3 of the area behind the vehicle is projected onto the projection surface Sc located in the imaging direction of the in-vehicle camera 11c, and the captured image GSP4 of the right side of the vehicle is projected onto the projection surface Sd located in the imaging direction of the in-vehicle camera 11d.

As a result, the monitoring image K projected onto the projection model M shows an image as if one were looking around the surroundings of the passenger car V. That is, since the monitoring image K, which contains the four images arranged in a single horizontal row according to the installation order of the in-vehicle cameras 11a to 11d, is projected onto the side surfaces of the columnar projection model M, which are likewise arranged horizontally, the image around the passenger car V can be reproduced in the monitoring image K projected onto the projection surface S of the columnar projection model M while its positional relationships are maintained.

The in-vehicle control device 14 of the present embodiment can store the correspondence between each coordinate value of the monitoring image K and the coordinate values of each projection surface S of the projection model M as mapping information and attach it to the monitoring image K; alternatively, this correspondence may be stored in the central monitoring device 20 in advance.

The positions of the viewpoint R and the projection surfaces S shown in FIGS. 8 and 9 are examples and can be set arbitrarily. In particular, the viewpoint R can be changed by an operation of the operator. The relationship between the viewpoint R and the projection position of the monitoring image K is defined in advance, and when the position of the viewpoint R is changed, the monitoring image K as seen from the newly set viewpoint R can be projected onto the projection surfaces S (Sa to Sd) by executing a predetermined coordinate transformation. A known method can be used for this viewpoint conversion processing.
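
The disclosure only states that a known coordinate transformation is used; the sketch below shows one generic way such a viewpoint change can be realized, a world-to-camera ("look-at") transform applied to the vertices of a projection surface. The surface coordinates and viewpoint values are invented for illustration.

```python
# Minimal sketch of a viewpoint change as a rigid coordinate transformation.
import numpy as np

def look_at(viewpoint, target, up=(0.0, 0.0, 1.0)):
    """Return a 4x4 world-to-camera matrix for a camera at `viewpoint` looking at `target`."""
    eye, target, up = map(np.asarray, (viewpoint, target, up))
    forward = target - eye
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, forward)
    rot = np.stack([right, true_up, -forward])   # 3x3 rotation, rows are camera axes
    m = np.eye(4)
    m[:3, :3] = rot
    m[:3, 3] = -rot @ eye                        # p_cam = R (p_world - eye)
    return m

# Re-express the vertices of projection surface Sa in the frame of a new viewpoint R.
surface_sa = np.array([[1.0, -1.0, 0.0], [1.0, 1.0, 0.0], [1.0, 1.0, 1.5], [1.0, -1.0, 1.5]])
world_to_cam = look_at(viewpoint=(3.0, 0.0, 1.2), target=(0.0, 0.0, 0.5))
sa_in_camera = (world_to_cam @ np.c_[surface_sa, np.ones(4)].T).T[:, :3]
print(sa_in_camera)
```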

As described above, the in-vehicle control device 14 of the present embodiment generates the monitoring image K based on image information captured at predetermined timings, associates with this monitoring image K the mapping information, the reference coordinates, and the information on the line figures (partition images) indicating the boundaries, and stores them chronologically according to the imaging timing. Although not particularly limited, the in-vehicle control device 14 may store the monitoring images K as a single video file containing a plurality of monitoring images K per predetermined unit time, or may store the monitoring images K in a form that can be transferred and played back by streaming.

The communication device 23 of the central monitoring device 20 receives the monitoring image K transmitted from the monitoring terminal device 10 and the mapping information associated with it. The image information captured by the in-vehicle camera 11e in the cabin is received separately. As described above, the monitoring image K consists of the images of the four in-vehicle cameras 11 installed at different positions on the body of the passenger car V, arranged according to the installation order of the in-vehicle cameras 11a to 11d placed clockwise or counterclockwise along the outer periphery of the body of the vehicle V. The monitoring image K is also associated with mapping information for projecting the monitoring image K onto the projection surfaces S of the octagonal-prism projection model M. The communication device 23 passes the acquired monitoring image K and mapping information to the image processing device 22.

The image processing device 22 reads out the projection model M stored in advance and, based on the mapping information, generates a display image in which the monitoring image K is projected onto the projection surfaces Sa to Sd set on the side faces of the octagonal-prism projection model M whose bottom face is the ground contact surface of the passenger car V shown in FIGS. 8 and 9. Specifically, each pixel of the received monitoring image K is projected onto the corresponding pixel of the projection surfaces Sa to Sd according to the mapping information. When projecting the monitoring image K onto the projection model M, the image processing device 22 recognizes the start point of the monitoring image K (its right or left end) from the reference coordinates received together with the monitoring image K, and performs the projection so that this start point coincides with the start point defined in advance on the projection model M (the right or left end of the projection surface S). When projecting the monitoring image K onto the projection model M, the image processing device 22 also places line figures (partition images) indicating the boundaries between the individual images on the projection model M. The partition images may be attached to the projection model M in advance, or may be attached to the monitoring image K after the projection processing.
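
A heavily simplified sketch of this step is given below: the monitoring image K is treated as one horizontal strip of four camera images, the reference start column from the mapping information aligns the strip, and each quarter is handed to its projection surface. A real implementation would warp each part onto the 3-D surface; the frame size and the equal-width split are assumptions.

```python
# Simplified sketch of assigning the received monitoring image K to surfaces Sa-Sd.
import numpy as np

def split_onto_surfaces(monitoring_image: np.ndarray, start_column: int, n_surfaces: int = 4):
    """Return the image regions for surfaces Sa..Sd in camera order 11a..11d."""
    # Shift the strip so that it begins at the reference start column (coordinate A).
    k = np.roll(monitoring_image, -start_column, axis=1)
    return np.array_split(k, n_surfaces, axis=1)

monitoring_k = np.zeros((480, 4 * 640, 3), dtype=np.uint8)   # assumed frame size
regions = split_onto_surfaces(monitoring_k, start_column=0)
surfaces = dict(zip(["Sa", "Sb", "Sc", "Sd"], regions))
print({name: img.shape for name, img in surfaces.items()})
```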

The display 24 shows the monitoring image K projected onto the projection surfaces S of the projection model M. FIG. 10 shows an example of a display image of the monitoring image K. By using an input device 25 such as a mouse or keyboard, or by making the display 24 a touch-panel input device 25, the viewpoint can be freely set and changed by the operator. Since the correspondence between the viewpoint position and the projection surfaces S is defined in advance in the image processing device 22 or the display 24, the monitoring image K corresponding to the changed viewpoint can be displayed on the display 24 based on this correspondence.

Next, the operation of the monitoring system 1 according to the present embodiment will be described. FIG. 11 is a flowchart showing the operation on the monitoring terminal device 10 side, FIGS. 12A and 12B are flowcharts showing the operation on the central monitoring device 20 side, FIG. 13 shows an example of the information in the database, FIG. 14 shows an example of information on travel frequency, FIG. 15 is a first diagram for explaining how a new route is obtained, and FIG. 16 is a second diagram for explaining how a new route is obtained.

The processing shown in FIG. 11 is executed in the monitoring terminal device 10 mounted on a passenger car V. The device can be mounted on a passenger car V3 belonging to a supervising organization, such as a police vehicle or an emergency vehicle, and also on a commercial passenger car V1 such as a taxi or bus, or on a private passenger car V2, neither of which belongs to a supervising organization. In other words, based on monitoring information acquired from a passenger car V3 belonging to a supervising organization, such as a police vehicle, a command to transmit monitoring information for a monitoring point can be sent to a commercial passenger car V1 such as a taxi or bus, or to a private car V2. Conversely, based on monitoring information acquired from a commercial passenger car V1 such as a taxi or bus, or from a private car V2, a command to transmit monitoring information for a monitoring point can be sent to a passenger car V3 belonging to a supervising organization such as a police vehicle. As described in detail later, this makes it possible to guide a taxi or the like to a monitoring point on a link that police vehicles pass infrequently, or to guide a police vehicle to a monitoring point on a link that taxis and the like pass infrequently. Furthermore, a police vehicle can be guided back to a monitoring point on a link that police vehicles have passed infrequently in the past, and a taxi or the like can be guided back to a monitoring point on a link that taxis and the like have passed infrequently in the past.

As shown in FIG. 11, the monitoring terminal device 10 acquires the surrounding video and the cabin video from the in-vehicle cameras 11 at predetermined time intervals (one routine in the figure), and the image processing device 12 converts them into image information (step ST1). The current position information of the passenger car V on which the monitoring terminal device 10 is mounted is also detected by the position detection device 15 equipped with GPS (step ST2).

In step ST3, it is determined whether the report button 16 for reporting an abnormality has been pressed. If the report button 16 has been pressed, the process proceeds to step ST4, where the image information acquired in step ST1, the position information acquired in step ST2, and the time information of the CPU are associated with one another and transmitted, together with abnormality information indicating that an abnormality has occurred, as monitoring information to the central monitoring device 20 via the communication device 13 and the telecommunication network 30. In this way, the occurrence of a security-related abnormality such as an accident or a crime is automatically reported to the central monitoring device 20 together with the position information of the passenger car V and the image information of its surroundings, so that monitoring of the town is further strengthened. In this example the image information and the position information are acquired in the first steps ST1 and ST2, but they may instead be acquired at a timing between steps ST3 and ST4.
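
A small sketch of how the monitoring information sent in step ST4 could be bundled is shown below. The message layout (field names, JSON encoding, Base64 image) is an assumption for illustration; the embodiment only requires that the image, position, and time information be associated and sent together with the abnormality indication.

```python
# Sketch of assembling the abnormality report of step ST4.
import json, time, base64

def build_abnormality_report(jpeg_bytes: bytes, latitude: float, longitude: float) -> bytes:
    message = {
        "type": "monitoring_information",
        "abnormality": True,                      # set because report button 16 was pressed
        "time": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "position": {"lat": latitude, "lon": longitude},
        "image": base64.b64encode(jpeg_bytes).decode("ascii"),
    }
    return json.dumps(message).encode("utf-8")

payload = build_abnormality_report(b"\xff\xd8...", 35.6812, 139.7671)  # truncated JPEG placeholder
# communication_device.send(payload)  # hypothetical transmission call over network 30
```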

Returning to step ST3, if the report button 16 has not been pressed, the process proceeds to step ST5, where the device communicates with the central monitoring device 20 and acquires a control command.

Subsequently, in step ST6, the monitoring terminal device 10 determines whether a monitoring information transmission command relating to a monitoring point has been acquired from the central monitoring device 20. If such a command has been acquired, the process proceeds to step ST7, where it is determined whether the acquired monitoring information transmission command includes a route change command. If a route change command is included, the process proceeds to step ST8.

In step ST8, the monitoring terminal device 10 displays on the output device 18 the designated monitoring point included in the route change command, or the link that includes the monitoring point. The output device of the present embodiment includes a display and a speaker. The display method is not particularly limited; the name of a facility at the monitoring point, the name of the street containing the link, and the like can be presented to the occupant by text or voice. The navigation device 19 may also superimpose the monitoring point, or the link containing it, on the map information MP and present it on the display 18. This allows the occupant to visually confirm the position of the monitoring point, or of the link containing it, relative to the route the occupant intends to take.

The driver of the passenger car V then confirms the monitoring point or the monitoring link to be monitored, and performs step ST9 and the subsequent steps after judging that a stop there is possible. Because stopping at the monitoring point or the monitoring link may be impossible for business reasons, presenting it once in this way improves convenience for the occupant.

In the following step ST9, the monitoring terminal device 10 causes the navigation device 19 to calculate a route that passes through the monitoring point included in the monitoring information transmission command, or through the link containing that monitoring point. Any method known at the time of filing can be used to calculate a route from the current position to the destination via the designated monitoring point. If the occupant can travel along the newly calculated route, the occupant sets that route and has route guidance executed. The process then proceeds to step ST10: upon arriving at the designated monitoring point or link, the camera 11 images the surroundings and image information is generated. After arriving at the designated monitoring point or link, the process proceeds to step ST101, where monitoring information including the image information, position information, and time information is transmitted to the central monitoring device 20 in accordance with the contents of the monitoring information transmission command. If, in step ST7, the monitoring information transmission command does not include a route change command, the process proceeds to step ST101 and, without changing the route, monitoring information including image information, position information, and time information is transmitted to the central monitoring device 20 in accordance with the monitoring information transmission command acquired in step ST6.
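
The routing itself is left to known methods; as a sketch, the "via a designated monitoring point" idea of step ST9 can be realized by concatenating two shortest paths, using a generic graph library in place of the navigation device 19. The tiny road graph and its edge lengths are invented for illustration.

```python
# Sketch of a route from the current position to the destination via a monitoring point.
import networkx as nx

road = nx.Graph()
road.add_weighted_edges_from([
    ("A", "B", 2.0), ("B", "C", 2.0), ("C", "F", 2.0),
    ("A", "D", 2.5), ("D", "E", 2.5), ("E", "F", 2.5),
])

def route_via(graph, start, waypoint, goal, weight="weight"):
    """Concatenate the start->waypoint and waypoint->goal shortest paths."""
    first = nx.shortest_path(graph, start, waypoint, weight=weight)
    second = nx.shortest_path(graph, waypoint, goal, weight=weight)
    return first + second[1:]

print(route_via(road, "A", "E", "F"))   # e.g. a monitoring point on link D-E -> ['A', 'D', 'E', 'F']
```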

Note that the monitoring information transmission command in the present embodiment does not necessarily require image information to be included. This is because, when the request for monitoring information is sent to a supervisor such as the police, visual confirmation is performed under the safety-confirmation authority of the police or the like, and a simple report such as "abnormality present / no abnormality" can serve as the monitoring information. If the monitoring information transmission command includes a storage command, the image information, position information, and time information are stored.

Returning to step ST6, even when no monitoring information transmission command is acquired from the central monitoring device 20, if the passenger car V is located in a predefined priority monitoring area in step ST102, the process proceeds to step ST103 and monitoring information including image information is transmitted. On the other hand, if no image transmission command has been acquired and the vehicle is not in a priority monitoring area, the process proceeds to step ST104, and monitoring information not including image information, that is, time information and position information, is transmitted to the central monitoring device 20.

In step ST11 of FIG. 12A, position information and time information are acquired from all passenger cars V and accumulated in the database 26. FIG. 13 shows an example of the information accumulated in the database 26. As shown in FIG. 13, the monitoring information acquired from a passenger car V (monitoring terminal device 10), including image information, position information, and time information, is stored in association with the position information. In other words, designating a position allows the corresponding series of monitoring information to be retrieved. The link ID to which the position information belongs and the vehicle speed of the passenger car may also be associated with it. The monitoring information can further include a mobile body ID (monitoring terminal device ID) for identifying the monitoring terminal device 10. The mobile body ID may be the address of the communication device 13 of the monitoring terminal device 10.
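
One possible layout for database 26 corresponding to FIG. 13 is sketched below: monitoring information keyed by position, optionally carrying the link ID, vehicle speed, and mobile body ID. The table and column names are assumptions for illustration.

```python
# Sketch of a relational layout for the monitoring-information database 26.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE monitoring_information (
        mobile_id   TEXT,   -- monitoring terminal device ID (e.g. address of communication device 13)
        time        TEXT,   -- time information
        latitude    REAL,   -- position information
        longitude   REAL,
        link_id     TEXT,   -- link the position belongs to (optional)
        speed_kmh   REAL,   -- vehicle speed (optional)
        image_path  TEXT    -- reference to stored image information, if any
    )
""")
conn.execute(
    "INSERT INTO monitoring_information VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("V1-0001", "2012-01-23T10:15:00", 35.6812, 139.7671, "L0012", 32.0, None),
)
# Designating a position (here, its link) retrieves the associated series of monitoring information.
rows = conn.execute("SELECT * FROM monitoring_information WHERE link_id = ?", ("L0012",)).fetchall()
print(rows)
```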

In step ST12, based on the position information acquired in step ST11, the passenger cars V are displayed on the map information MP of the map database shown on the display 24, as illustrated in the upper left of FIG. 1. Since the position information of each passenger car V is acquired and transmitted at the predetermined timing of each routine in FIG. 11, the supervisor can grasp the current position of each passenger car V in a timely manner.

In step ST13, it is determined whether abnormality information reported from the monitoring terminal device 10 of a passenger car V, that is, a report that a security-related abnormality such as an accident or a crime has occurred, has been received. This abnormality information is output when an occupant of the passenger car V presses the report button 16 of the monitoring terminal device 10. If there is abnormality information, the passenger car V that output it is identified in step ST14, image information and time information are received from the monitoring terminal device 10 of that passenger car, and the image information is displayed on the display 24. In addition, as shown in the upper left of FIG. 1, that passenger car is highlighted on the map information MP, for example by changing its color, so that it can be distinguished from the other passenger cars. The position where the abnormality occurred can thus be seen on the map information MP, and the nature of the abnormality can be grasped on the display 24.

In the next step ST15, passenger cars V traveling in the vicinity (within a predetermined distance) of the passenger car V that output the abnormality information are detected, and a command to transmit image information and time information is output to those passenger cars V. Image information can thereby be acquired from passenger cars V traveling near the passenger car V that output the abnormality information, so the nature of the abnormality can be grasped in detail from multiple items of image information in addition to the image information from the reporting passenger car V.
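
The nearby-vehicle detection of step ST15 can be sketched as a simple radius test on the most recent GPS positions; the haversine formula is a standard way to compare such coordinates, and the 500 m radius is an assumed value rather than one specified by the embodiment.

```python
# Sketch of step ST15: find vehicles within a predetermined distance of the reporter.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS-84 points in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def vehicles_near(reporter_pos, latest_positions, radius_m=500.0):
    """latest_positions: {mobile_id: (lat, lon)} from the most recent routine."""
    return [vid for vid, (lat, lon) in latest_positions.items()
            if haversine_m(reporter_pos[0], reporter_pos[1], lat, lon) <= radius_m]

positions = {"V1-0001": (35.6812, 139.7671), "V1-0002": (35.6830, 139.7690), "V3-0001": (35.70, 139.80)}
print(vehicles_near((35.6815, 139.7675), positions))   # ['V1-0001', 'V1-0002']
```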

In step ST16, the position information of the passenger car V that output the abnormality information is transmitted to emergency vehicles such as police cars, ambulances, and fire engines. In this case, image information may be attached and transmitted in order to convey the nature of the abnormality. This makes it possible to dispatch emergency vehicles before a report from the scene arrives, enabling a rapid response to accidents and crimes.

In step ST17, all position information, image information, and time information received from the monitoring terminal devices 10 is recorded on a recording medium. This record is used to resolve accidents and crimes after they occur. If there is no abnormality information in step ST13, the process proceeds to step ST21 without performing the processing of steps ST14 to ST17.

Here, returning to step ST13, if there is no abnormality information, the processing of steps ST111 to ST114 is performed.

In step ST111, the central monitoring device 20 refers to the database 26 and calculates and stores the travel frequency for each link. Based on the collected data shown in FIG. 13, the central monitoring device 20 refers to the map information MP that it holds and can calculate the travel frequency for each link included in the map information MP by counting the number of passenger cars V whose position information is plotted on that link. FIG. 14 shows an example of the travel frequency data of the present embodiment. As shown in FIG. 14, the travel frequency, expressed as the number of vehicles per unit time, and the number of vehicles currently traveling on the link are associated with each link. The unit time used to calculate the travel frequency can be set arbitrarily: one hour, four hours, half a day, from 8 o'clock to 5 o'clock, after 5 o'clock, one day, and so on. The unit time can also distinguish weekdays from holidays, so that the travel frequency is obtained separately for weekdays and holidays. Showing the current number of traveling vehicles also makes it possible to recognize the real-time traffic situation. For example, if the travel frequency is low but the current number of traveling vehicles is large, it can be judged that there is no particular need to send a passenger car V for monitoring. Conversely, if the travel frequency is high but the current number of traveling vehicles is extremely low, it can be judged that a passenger car V does need to be sent for monitoring.
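
A minimal sketch of the per-link counting in step ST111 is given below: position records are grouped by link and by unit-time bucket, and the distinct vehicles in each group are counted. The record layout matches the database sketch above, and the one-hour bucket is an assumed unit time.

```python
# Sketch of step ST111: travel frequency per link and per unit time.
from collections import defaultdict
from datetime import datetime

def travel_frequency(records, bucket="%Y-%m-%d %H"):
    """records: iterable of (mobile_id, iso_time, link_id).
    Returns {(link_id, time_bucket): number of distinct vehicles}."""
    seen = defaultdict(set)
    for mobile_id, iso_time, link_id in records:
        t = datetime.fromisoformat(iso_time).strftime(bucket)
        seen[(link_id, t)].add(mobile_id)
    return {key: len(vehicles) for key, vehicles in seen.items()}

records = [
    ("V1-0001", "2012-01-23T10:15:00", "L0012"),
    ("V1-0002", "2012-01-23T10:40:00", "L0012"),
    ("V1-0001", "2012-01-23T10:50:00", "L0034"),
]
print(travel_frequency(records))   # {('L0012', '2012-01-23 10'): 2, ('L0034', '2012-01-23 10'): 1}
```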

In the following step ST112, the central monitoring device 20 specifies a monitoring point based on the travel frequency. The method of specifying the monitoring point is not particularly limited, but a link whose travel frequency is below a predetermined value, or a point on such a link, can be specified as the monitoring point. The travel frequency threshold used to specify monitoring points may be set for each link in consideration of its traffic volume, or a common threshold may be used.
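
The threshold test of step ST112 can be sketched as below, supporting both per-link thresholds and a common one. The default of 3 vehicles per unit time is an assumed value for illustration.

```python
# Sketch of step ST112: pick monitoring candidates from links below a travel-frequency threshold.
def monitoring_links(frequency_per_link, thresholds=None, common_threshold=3):
    """frequency_per_link: {link_id: vehicles per unit time}."""
    thresholds = thresholds or {}
    return [link for link, freq in frequency_per_link.items()
            if freq < thresholds.get(link, common_threshold)]

freq = {"L0012": 12, "L0034": 1, "L0056": 2}
print(monitoring_links(freq))                            # ['L0034', 'L0056']
print(monitoring_links(freq, thresholds={"L0056": 1}))   # ['L0034']
```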

The central monitoring device 20 can also specify, as monitoring points, points on links other than links whose travel frequency is equal to or higher than a predetermined value. In this way, in addition to sending passenger cars V to points on low-travel-frequency links as monitoring points, high-travel-frequency links can be avoided and points on the other links used as monitoring points so that the travel frequency is equalized, allowing the links of the town to be monitored evenly and without gaps.

In the next step ST113, the central monitoring device 20 creates a route change command that passes through the specified monitoring point, and in step ST114 it transmits a monitoring information transmission command for the monitoring point, including the route change command, to the monitoring terminal device 10. The monitoring information transmission command of the present embodiment does not necessarily include a command to transmit image information; it may be a command that requests only a report of a judgment such as "no abnormality / abnormality present". When the monitoring information transmission command is sent to a police vehicle V3 belonging to a supervising organization, the monitoring effect can be obtained from the judgment alone of a supervisor, such as the police, who has safety-confirmation authority. On the other hand, when the monitoring information transmission command is sent to a vehicle V2 belonging to a taxi operator, transport operator, or other party that is not a supervisor, the monitoring information transmission command preferably includes a command requesting the transmission of image information. Since safety cannot be confirmed by a taxi operator, safety must be confirmed by the police, a security company, or another party with safety-confirmation authority based on the provided image information.
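
The decision described here (image information required only when the recipient is not a supervisor) can be sketched as a small message builder. The message fields are assumptions for illustration, not a defined protocol of the embodiment.

```python
# Sketch of steps ST113-ST114: building the monitoring information transmission command.
def build_transmission_command(monitoring_point, link_id, recipient_is_supervisor):
    return {
        "type": "monitoring_information_transmission_command",
        "monitoring_point": monitoring_point,            # (lat, lon)
        "link_id": link_id,
        "route_change": True,                            # ask the terminal to route via the point
        "require_image": not recipient_is_supervisor,    # a simple "abnormality / no abnormality"
                                                         # report suffices for a supervisor
    }

print(build_transmission_command((35.69, 139.77), "L0034", recipient_is_supervisor=False))
```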

After specifying the monitoring point in step ST112, a monitoring information transmission command containing only the information on that monitoring point may instead be sent to the monitoring terminal device 10. In that case, the information on the monitoring point is presented to the occupant on the monitoring terminal device 10 side, and the occupant can have the navigation device 19 on the passenger car V side search for a route via the monitoring point.

After specifying the monitoring point in step ST112, a command to move to that monitoring point may be included in the monitoring information transmission command and sent to the monitoring terminal device 10. Since the mobile body V can be made to head for the monitoring point rather than merely being shown its position, the monitoring information at the monitoring point can be acquired reliably. The form of the command to move to the monitoring point is of course not limited; it may be a voice command, a text command, a destination setting command for the navigation device 19, or a travel route setting command. The command to move to the monitoring point may also be sent only to monitoring terminal devices 10 (mobile bodies) located within a predetermined distance of the monitoring point. The command may be sent only to monitoring terminal devices 10 (mobile bodies) belonging to supervising organizations such as the police or the fire department, or, depending on the case, only to monitoring terminal devices 10 mounted on commercial or private mobile bodies V.

Examples of the route change command are shown in FIGS. 15 and 16. As shown in FIG. 15, suppose that when a certain passenger car V is about to travel from point A to point F, the route it has chosen on its own is A → B → C → F (R11, R12). When the vehicle acquires from the central monitoring device 20 a route change command containing a monitoring link Z on which passenger cars V equipped with the monitoring terminal device 10 travel infrequently, or a monitoring point on link Z, and judges that a route change is possible, it changes the route to A → D → E → F (R21, R22) so as to pass through link Z. The passenger car V equipped with the monitoring terminal device 10 heads for link Z along the changed route and sends to the central monitoring device 20 monitoring information including image information captured at the monitoring point on link Z, or monitoring information indicating whether there is any abnormality on link Z.

Also, as shown in FIG. 16 and as in the previous example, suppose the route the vehicle has chosen on its own is A → B → C → F (R31, R32). When the vehicle acquires from the central monitoring device 20 a route change command containing a monitoring link W on which passenger cars V equipped with the monitoring terminal device 10 travel frequently, or a monitoring point on a link other than link W, and judges that a route change is possible, it changes the route to A → B → E → F (R41, R42) so as not to pass through link W, that is, so as to pass through links other than link W. The passenger car V equipped with the monitoring terminal device 10 heads for links other than link W along the changed route and sends to the central monitoring device 20 monitoring information including image information captured at monitoring points on links other than link W, or monitoring information indicating whether there is any abnormality on those links.
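
The FIG. 16 case can be sketched by removing the frequently travelled link W from the search graph so that the shortest path avoids it. The graph weights and the identification of W with edge (B, C) are assumptions for illustration.

```python
# Sketch of avoiding a high-travel-frequency link W when re-routing.
import networkx as nx

road = nx.Graph()
road.add_weighted_edges_from([
    ("A", "B", 2.0), ("B", "C", 2.0), ("C", "F", 2.0),
    ("B", "E", 2.5), ("E", "F", 2.5),
])

def route_avoiding(graph, start, goal, avoided_edge, weight="weight"):
    pruned = graph.copy()
    pruned.remove_edge(*avoided_edge)
    return nx.shortest_path(pruned, start, goal, weight=weight)

print(nx.shortest_path(road, "A", "F", weight="weight"))   # ['A', 'B', 'C', 'F']  (original route)
print(route_avoiding(road, "A", "F", ("B", "C")))          # ['A', 'B', 'E', 'F']  (avoids link W)
```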

Although not particularly limited, the central monitoring device 20 can acquire monitoring information from a monitoring terminal device 10 mounted on a passenger car V belonging to a supervising organization such as the police or a security company. A command requesting monitoring information for a monitoring point specified based on the travel frequency derived from the position information contained in that monitoring information may then be sent to a monitoring terminal device 10 mounted on a passenger car V likewise belonging to the police, a security company, or another supervising organization, or to a monitoring terminal device 10 mounted on a commercial or private passenger car V, such as a taxi or transport vehicle, that does not belong to a supervising organization. In other words, another police vehicle V3 may be dispatched to a monitoring point where police vehicles V3 travel infrequently, or a commercial vehicle V2 such as a taxi may be dispatched to a monitoring point where police vehicles V3 travel infrequently.

Conversely, the central monitoring device 20 can acquire monitoring information from a monitoring terminal device 10 mounted on a passenger car V belonging to an operator or individual other than a supervisor, such as a taxi or a transport company. A command requesting monitoring information for a monitoring point specified based on the travel frequency derived from the position information contained in that monitoring information may then be sent to a monitoring terminal device 10 mounted on a passenger car V belonging to a supervising organization such as the police or a security company, or to a monitoring terminal device 10 mounted on a commercial or private passenger car V, such as a taxi or transport vehicle, that does not belong to a supervising organization. In other words, a police vehicle V3 may be dispatched to a monitoring point where commercial vehicles V2 travel infrequently (not frequently), or another commercial vehicle V2 such as a taxi may be dispatched to such a monitoring point.

In step ST21 shown in FIG. 12B, it is determined whether there is a command to transmit image information from an emergency vehicle such as a police car, an ambulance, or a fire engine. If an image transmission command has been input, the process proceeds to step ST22. In step ST22, it is determined whether any passenger car V is present in the area specified by the image information transmission command, and if so, the process proceeds to step ST23. In step ST23, an image information transmission command is output to the passenger cars V present in the area specified by the command. Image information from those passenger cars V can then be acquired in step ST11 of FIG. 12A in the next routine and transferred to the emergency vehicle, or used to understand the purpose of the transmission command from the emergency vehicle. If the determination in step ST21 or ST22 is negative, the process proceeds to step ST24 without performing the processing of steps ST22 and ST23.

In step ST24, it is determined whether a passenger car V is present in the area near a preset suspicious location, such as a crime-prone area. If so, the process proceeds to step ST25 and an image information transmission command is output to that passenger car V. A suspicious location is a street, district, or the like with poor public safety. This makes it possible to strengthen the monitoring of suspicious streets and districts, which can be expected to help prevent crime. If no passenger car V is present in the area near a suspicious location, the process proceeds to step ST26 without performing the processing of step ST25.

In step ST26, it is determined whether a passenger car V is present near a priority monitoring position from which a priority monitoring target, whose details should be watched, can be imaged. If a passenger car V is present near the priority monitoring position, the process proceeds to step ST27, and a priority monitoring command requesting the transmission of image information in which the priority monitoring target is magnified is output to that passenger car V. This allows the priority monitoring target to be monitored in detail and makes it possible to effectively detect suspicious objects that could cause incidents or accidents at the specified priority monitoring target, which can be expected to help prevent crime. If no passenger car V is present near the priority monitoring position, the process proceeds to step ST28 without performing the processing of step ST27.

In step ST28, based on the position information received from each passenger car V, it is determined whether there is any road within a predetermined area requiring monitoring (not limited to suspicious locations and priority monitoring areas) on which no passenger car V has traveled within a certain time. If there is such a road, whether any passenger car V is traveling on it is monitored. If a passenger car V is about to travel on that road, the process proceeds to step ST29 and an image information transmission command is output to that passenger car V. In this way, image information can be acquired automatically for roads outside suspicious locations and priority monitoring areas on which passenger car traffic is light. If there is no road satisfying the condition of step ST28, the process returns to step ST11 of FIG. 12A without performing the processing of step ST29.

As described above, the monitoring system of the present embodiment provides the following effects.
(1) The monitoring system 1 of this example outputs a command to transmit monitoring information for a monitoring point specified based on the travel frequency of the passenger cars V calculated for each link; that is, it transmits a command requesting monitoring information for a monitoring point on a link on which passenger cars V travel infrequently. Therefore, even when the random movement of the passenger cars V produces a link with a low travel frequency, monitoring information for a monitoring point on that link can be acquired. As a result, the central monitoring device 20 can monitor the whole town evenly using the monitoring terminal devices 10 mounted on randomly moving passenger cars.

(2) In the monitoring system 1 of this example, an image transmission command to transmit monitoring information including image information of the monitoring point is output as the command, so a taxi operator or other party that is not the supervisor can, on the supervisor's behalf, provide the supervisor with image information captured at the monitoring point specified based on the travel frequency. The supervisor can thus judge the safety of each monitoring point without actually going to the site. As a result, the monitoring level can be improved by monitoring the whole town without gaps while the cost required for monitoring is reduced.

(3) The monitoring system 1 of this example selects points on low-travel-frequency links as monitoring points, so the passenger cars V equipped with the monitoring terminal device 10 can be made to travel evenly throughout the town, and the same effects as in (2) above can be obtained.

(4) The monitoring system 1 of this example selects, as monitoring points, points on links other than high-travel-frequency links, that is, on links whose travel frequency is not high, so the passenger cars V can be guided away from high-travel-frequency links and toward links whose travel frequency is not high. As a result, the passenger cars V equipped with the monitoring terminal device 10 can be made to travel evenly throughout the town, and the same effects as in (2) above can be obtained.

(5) In the monitoring system 1 of this example, monitoring information is acquired from monitoring terminal devices 10 mounted on passenger cars V belonging to supervising organizations such as the police or security companies, so commands can be generated according to the per-link travel frequency of the passenger cars belonging to those organizations. As a result, supervisors can be concentrated in places where public safety is poor and danger is high, while other passenger cars V are directed to links on which the supervisors travel infrequently (not frequently).

(6) In the monitoring system 1 of this example, monitoring information is acquired from monitoring terminal devices 10 mounted on passenger cars V belonging to parties other than supervisors, such as taxis and transport companies, so commands can be generated according to the per-link travel frequency of the passenger cars belonging to those operators and individuals. As a result, supervisors such as the police can be directed to links on which those operators and individuals travel infrequently (not frequently), supplementing the monitoring.

(7) In the monitoring system 1 of this example, the command is sent to monitoring terminal devices 10 mounted on passenger cars V belonging to a supervising organization, so another police vehicle V3 can be dispatched to a monitoring point where police vehicles V3 travel infrequently, and a police vehicle V3 can also be dispatched to a monitoring point where commercial vehicles V2 travel infrequently.

(8) In the monitoring system 1 of this example, the command is sent to monitoring terminal devices 10 mounted on commercial or private mobile bodies other than supervisors, so a commercial vehicle V2 such as a taxi can be dispatched to a monitoring point where police vehicles V3 travel infrequently, and another commercial vehicle V2 such as a taxi can be dispatched to a monitoring point where commercial vehicles V2 travel infrequently.

(9) By including a command to move to the specified monitoring point in the monitoring information transmission command and sending it to the monitoring terminal device 10, the mobile body V can be made to head for the monitoring point, so that the monitoring information at the monitoring point can be acquired reliably.

(10) In the monitoring system 1 of this example, the monitoring terminal device 10 displays the monitoring point included in the command, or the link containing that monitoring point, so the occupant of the passenger car V can carry out their work while being aware of the monitoring point for which monitoring is requested.

(11) In the monitoring system 1 of this example, the navigation device on the monitoring terminal device 10 side displays the monitoring point included in the command, or the link containing that monitoring point, superimposed on the map information MP, so the occupant of the passenger car V can visually confirm the position of the monitoring point, or of the link containing it, relative to the route the occupant intends to take.

(12) The monitoring system 1 of this example searches for a route that passes through the monitoring point included in the command, or through the link containing that monitoring point, so the monitoring point can be incorporated as a waypoint into the route to the occupant's intended destination.

(13) The monitoring method of this example provides the same operations and effects as the monitoring system comprising the monitoring terminal device 10 and the central monitoring device 20 described above.

In the embodiment described above, the position information of the passenger cars V and the image information from the in-vehicle cameras 11a to 11e are acquired, but they may be acquired in combination with image information from the fixed cameras 11f installed around the town shown in FIG. 1. As the passenger cars V from which position information and image information are acquired, it is desirable to use taxis V1 and buses that travel in predetermined areas as shown in FIG. 1, but private passenger cars V2 and emergency passenger cars V3 may also be used.

In the embodiment described above, five in-vehicle cameras are mounted on the passenger car V, and images of the full 360° surroundings are acquired as image information using four of them, the in-vehicle cameras 11a to 11d; however, the in-cabin camera 11e may be omitted. The number of the four in-vehicle cameras 11a to 11d may also be reduced to three or fewer, particularly in environments where image information can be acquired from many passenger cars V, such as monitoring areas with heavy traffic.

The central control device 21 described above corresponds to the travel frequency calculation means and the command output means, and the communication device 23 and the input device 25 correspond to the information acquisition means, the abnormality information reception means, and the command output means according to the present invention.

DESCRIPTION OF SYMBOLS
1 ... Vehicle monitoring system
10 ... Monitoring terminal device
11, 11a to 11e ... In-vehicle camera
11f ... Fixed street camera
12 ... Image processing device
13 ... Communication device
14 ... In-vehicle control device
15 ... Position detection device
16 ... Report button
18 ... Output device (display, speaker)
19 ... Navigation device
20 ... Central monitoring device
21 ... Central control device
22 ... Image processing device
23 ... Communication device
24 ... Display
25 ... Input device
30 ... Telecommunication network
V, V1, V2, V3 ... Mobile body
M ... Projection model
S, Sa, Sb, Sc, Sd ... Projection plane
R1 to R8 ... Viewpoint

Claims (13)

1. A monitoring system comprising a central monitoring device that acquires monitoring information via wireless communication from a monitoring terminal device comprising position detection means for detecting position information of each of a plurality of mobile bodies and image generation means mounted on each of the plurality of mobile bodies for imaging the surroundings of the mobile body to generate image information, wherein
the central monitoring device comprises:
information acquisition means for acquiring monitoring information including at least the position information and time information output from the monitoring terminal device;
travel frequency calculation means for calculating a travel frequency of the mobile bodies for each link of link information with reference to the acquired monitoring information and map information including the link information; and
command output means for specifying a monitoring point to be monitored based on the calculated travel frequency and outputting, to the monitoring terminal device, a command to transmit monitoring information of the monitoring point.

2. The monitoring system according to claim 1, wherein the command output means of the central monitoring device outputs, as the command, an image transmission command to transmit monitoring information including image information of the monitoring point.

3. The monitoring system according to claim 1 or 2, wherein the command output means of the central monitoring device selects a point on a link having a low travel frequency as the monitoring point.

4. The monitoring system according to claim 1 or 2, wherein the command output means of the central monitoring device selects a point on a link other than a link having a high passing frequency as the monitoring point.

5. The monitoring system according to any one of claims 1 to 4, wherein the information acquisition means of the central monitoring device acquires the monitoring information from the monitoring terminal device mounted on a mobile body belonging to a supervisor.

6. The monitoring system according to any one of claims 1 to 5, wherein the information acquisition means of the central monitoring device acquires the monitoring information from the monitoring terminal device mounted on a commercial or private mobile body not belonging to a supervisor.

7. The monitoring system according to any one of claims 1 to 6, wherein the command output means of the central monitoring device sends the command to the monitoring terminal device mounted on a mobile body belonging to a supervisor.

8. The monitoring system according to any one of claims 1 to 7, wherein the command output means of the central monitoring device sends the command to the monitoring terminal device mounted on a commercial or private mobile body not belonging to a supervisor.

9. The monitoring system according to any one of claims 1 to 8, wherein the command, output by the command output means of the central monitoring device, to transmit the monitoring information of the selected monitoring point includes a command to move the mobile body to the monitoring point.

10. The monitoring system according to any one of claims 1 to 9, wherein the monitoring terminal device comprises an output device, and the output device displays the monitoring point included in the command or a link including the monitoring point.

11. The monitoring system according to claim 10, wherein the monitoring terminal device comprises a navigation device having map information, and the navigation device superimposes the monitoring point included in the command, or a link including the monitoring point, on the map information and displays it on the output device.

12. The monitoring system according to any one of claims 1 to 9, wherein the monitoring terminal device comprises a navigation device having map information and a route search function for searching for a route from a current position to a destination, and the navigation device searches for the route so that it passes through the monitoring point included in the command or a link including the monitoring point.
 複数の移動体に搭載された監視端末装置から、各移動体の位置情報を少なくとも含む監視情報を取得するステップと、
 前記取得した監視情報及びリンク情報を含む地図情報を参照して、前記リンク情報のリンクごとに前記移動体の走行頻度を算出するステップと、
 前記算出された走行頻度に基づいて監視すべきリンクに含まれる監視地点を特定し、前記監視地点の監視情報を送信する旨の指令を出力するステップと、を備えることを特徴とする監視方法。
Obtaining monitoring information including at least position information of each moving body from monitoring terminal devices mounted on a plurality of moving bodies;
With reference to map information including the acquired monitoring information and link information, calculating a traveling frequency of the moving body for each link of the link information;
A monitoring method comprising: specifying a monitoring point included in a link to be monitored based on the calculated traveling frequency, and outputting a command to transmit monitoring information of the monitoring point.
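The method of claim 13 amounts to counting probe-vehicle traversals per map link, flagging links whose count falls below some level, and issuing image-transmission commands for points on those links. The Python sketch below is only an illustration of that flow under stated assumptions; the names (PositionReport, select_monitoring_links, the threshold parameter, and the command dictionary layout) are hypothetical and do not appear in the patent.

# Illustrative sketch (hypothetical names, not from the patent): count how often
# probe vehicles traverse each map link, treat rarely traversed links as blind
# spots, and build image-transmission commands for a point on each such link.
from collections import Counter
from dataclasses import dataclass

@dataclass
class PositionReport:
    vehicle_id: str
    link_id: str       # map link the reported position was matched to
    timestamp: float

def link_frequencies(reports):
    """Traveling frequency per link: number of position reports matched to it."""
    return Counter(r.link_id for r in reports)

def select_monitoring_links(freq, all_links, threshold):
    """Links traversed fewer than `threshold` times are candidates for monitoring."""
    return [link for link in all_links if freq.get(link, 0) < threshold]

def build_commands(monitoring_links, point_on_link):
    """One 'send an image of this point' command per selected link."""
    return [{"type": "send_image",
             "link_id": link,
             "point": point_on_link[link]}   # (latitude, longitude) of the monitoring point
            for link in monitoring_links]

# Example: link L002 has no reports, so it is selected and a command is issued.
reports = [PositionReport("taxi-1", "L001", 0.0), PositionReport("bus-2", "L001", 5.0)]
commands = build_commands(
    select_monitoring_links(link_frequencies(reports), ["L001", "L002"], threshold=1),
    {"L001": (35.6812, 139.7671), "L002": (35.6762, 139.6503)},
)

Counting raw traversals is the simplest proxy for the claimed "traveling frequency"; a practical system would presumably normalize the counts by time window and fleet size before comparing links.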
PCT/JP2012/083475 2012-01-23 2012-12-25 Monitoring system Ceased WO2013111493A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-011439 2012-01-23
JP2012011439 2012-01-23

Publications (1)

Publication Number Publication Date
WO2013111493A1 true WO2013111493A1 (en) 2013-08-01

Family

ID=48873232

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/083475 Ceased WO2013111493A1 (en) 2012-01-23 2012-12-25 Monitoring system

Country Status (1)

Country Link
WO (1) WO2013111493A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006266760A (en) * 2005-03-22 2006-10-05 Hitachi Ltd Navigation device, navigation method, navigation program, server device, and navigation information distribution system
JP2008039687A (en) * 2006-08-09 2008-02-21 Denso Corp Road map updating system and vehicle-side device used for the same
WO2010150348A1 (en) * 2009-06-23 2010-12-29 パイオニア株式会社 Video recording/reproduction device, video recording/reproduction method, and video recording/reproduction program
JP2011022161A (en) * 2010-10-04 2011-02-03 Yupiteru Corp Traffic monitoring point detector and program
JP2011114580A (en) * 2009-11-26 2011-06-09 Panasonic Corp Multiple cameras monitoring system, mobile terminal apparatus, center apparatus, and method for monitoring multiple cameras
JP2011215767A (en) * 2010-03-31 2011-10-27 Zenrin Datacom Co Ltd Server device, method of using security camera images, program for using security camera images, and security camera system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105300388A (en) * 2015-10-10 2016-02-03 东圳医疗器械(上海)有限公司 Scooter monitoring method and intelligent terminal
WO2020071358A1 (en) * 2018-10-02 2020-04-09 シャープ株式会社 Display control device, display control device control method, display control program, and recording medium
JPWO2020071358A1 (en) * 2018-10-02 2021-09-02 シャープ株式会社 Display control device, control method of display control device, display control program and recording medium

Similar Documents

Publication Publication Date Title
JP5786963B2 (en) Monitoring system
JP7574900B2 (en) Video sharing system, video sharing method and program
US10572738B2 (en) Method and system for detecting a threat or other suspicious activity in the vicinity of a person or vehicle
US10572737B2 (en) Methods and system for detecting a threat or other suspicious activity in the vicinity of a person
US10572740B2 (en) Method and system for detecting a threat or other suspicious activity in the vicinity of a motor vehicle
US10572739B2 (en) Method and system for detecting a threat or other suspicious activity in the vicinity of a stopped emergency vehicle
JP6451840B2 (en) Information presentation system
JP2012195793A (en) Vehicle periphery monitoring device
JP7349888B2 (en) Driving support method and in-vehicle device
JP5811190B2 (en) Monitoring system
WO2012137367A1 (en) Image accumulation system
WO2013111494A1 (en) Monitoring system
JP7348724B2 (en) In-vehicle device and display method
JP5790788B2 (en) Monitoring system
KR20110108861A (en) Bus control method and bus terminal device
WO2013111491A1 (en) Monitoring system
WO2013111493A1 (en) Monitoring system
WO2013111479A1 (en) Monitoring system
WO2013125301A1 (en) Surveillance system
WO2013111492A1 (en) Monitoring system
WO2013161345A1 (en) Monitoring system and monitoring method
JP7140043B2 (en) Information processing equipment
JP5796638B2 (en) Monitoring system
JP5812105B2 (en) Monitoring system
JP2011258068A (en) Traffic information provision system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12866752

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12866752

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP