WO2021230049A1 - Information notification system - Google Patents
Information notification system
- Publication number
- WO2021230049A1 (PCT/JP2021/016518)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- user
- danger
- degree
- contact
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/04—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using a single signalling line, e.g. in a closed loop
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/005—Traffic control systems for road vehicles including pedestrian guidance indicator
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- One aspect of the present invention relates to an information notification system.
- Patent Document 1 describes a device that, when an object image crosses the alert line toward the wearer's side in a captured image, alerts the wearer through an earphone and a display worn by the wearer.
- One aspect of the present invention has been made in view of the above circumstances, and relates to an information notification system capable of appropriately notifying a user of necessary information according to the degree of danger of the user.
- The information notification system includes an acquisition unit that acquires one or more captured images of the area around the user, captured by a terminal worn by the user; a detection unit that detects an object in the area around the user based on the one or more captured images; and a determination unit that determines, based on the result detected by the detection unit, the degree of danger that the detected object will come into contact with the user.
- The system further includes a generation unit that generates notification information indicating danger to the user based on the degree of danger, and an output unit that outputs the notification information to the terminal. The higher the degree of danger, the more easily the generated notification information lets the user recognize the danger of contact with the object.
- In this information notification system, an object approaching the user is recognized based on one or more captured images of the area around the user, and the degree of danger that the object will come into contact with the user is determined. Notification information is then generated so that the higher the degree of danger, the more easily the user recognizes the danger of contact with the object, and the notification information is output to the terminal. With this configuration, the user can be notified in a manner that matches the risk of contact with the object. For example, when the degree of danger is high, notification information that merely tells the user an object is approaching would not suit the determined degree of danger.
- In such a case, notification information emphasizing that the danger is imminent is output to the user, so that contact between the object and the user can be appropriately avoided.
- According to the information notification system of one aspect of the present invention, necessary information can be appropriately notified to the user according to the degree of danger of the user.
- FIG. 1 is a diagram illustrating an outline of an information notification system according to the present embodiment.
- FIG. 2 is a block diagram showing a functional configuration of the information notification system according to the present embodiment.
- FIG. 3 is a diagram illustrating notification information.
- FIG. 4 is a diagram illustrating a degree of danger that an object comes into contact with a user.
- FIG. 5 is a diagram showing an example of an image displayed on a communication terminal.
- FIG. 6 is a diagram showing an example of a danger image displayed on a communication terminal.
- FIG. 7 is a diagram showing an example of a danger image displayed on a communication terminal.
- FIG. 8 is a flowchart showing a process performed by the information notification system according to the present embodiment.
- FIG. 9 is a flowchart showing a process performed by the information notification system according to the present embodiment.
- FIG. 10 is a diagram showing a hardware configuration of a communication terminal, an object detection server, and a determination server included in the information notification system according to the present embodiment.
- FIG. 1 is a diagram illustrating an outline of the information notification system 1 according to the present embodiment.
- FIG. 2 is a block diagram showing a functional configuration of the information notification system 1 according to the present embodiment.
- The information notification system 1 shown in FIGS. 1 and 2 is a system for notifying the user of various information, including notification information indicating that the user is in danger.
- the information notification system 1 includes a communication terminal (terminal) 10 attached to the user, an object detection server 30, and a determination server 50.
- FIG. 1 illustrates an image P1 which is one of a plurality of temporally continuous captured images captured by the communication terminal 10.
- the object detection server 30 detects an object in the captured image for a plurality of captured images captured by the communication terminal 10, and transmits information about the detected object to the determination server 50.
- the object detection server 30 detects the person H1 as an object based on a plurality of captured images including the image P1.
- the determination server 50 determines the degree of risk that the detected object will come into contact with the user based on the detection result detected by the object detection server 30. Then, the determination server 50 generates notification information based on the determined risk level.
- the notification information is information indicating that the user is in danger.
- the determination server 50 generates notification information so that the higher the degree of danger, the easier it is for the user to recognize the danger of contact with an object.
- the notification information includes sound information and image information.
- the sound information is information notified by sound in the speaker of the communication terminal 10, and is information on dangerous sounds that make it easier for the user to recognize the danger of contact with an object as the degree of danger increases.
- the image information is information to be notified (displayed) on the screen of the communication terminal 10, and is information related to a dangerous image in which the higher the degree of danger, the easier it is for the user to recognize the danger of contact with an object.
- the details of the image information and the sound information will be described later. Then, when the determination server 50 outputs the notification information to the communication terminal 10, a dangerous sound is emitted from the speaker of the communication terminal 10, and a dangerous image is displayed on the screen of the communication terminal 10.
- In the example shown in FIG. 1, the person (object) H1 detected as an object is running toward the user from the front, and satisfies a predetermined condition for determining that the person H1 has a high risk of contacting the user.
- In this case, the determination server 50 determines that the risk of the person H1 contacting the user is "high" (details will be described later), and generates notification information so that the user can easily recognize the danger of contact with the person H1. Then, when the determination server 50 outputs the notification information to the communication terminal 10, as shown in FIG. 3, a dangerous sound M1 is emitted from the speaker of the communication terminal 10, and an image P2, which is a dangerous image, is displayed on the screen of the communication terminal 10.
- The mode in which the dangerous sound and the dangerous image are output is only an example; only one of them may be output, or other notification information may be output.
- In this way, the information notification system 1 notifies the user of notification information indicating that the user is in danger.
- Although only one communication terminal 10 is shown in FIGS. 1 and 2, there may be a plurality of communication terminals 10.
- the communication terminal 10 is, for example, a terminal configured to perform wireless communication.
- the communication terminal 10 is a terminal worn by a user, and is, for example, a goggle-type wearable device.
- A plurality of temporally continuous captured images are captured by the camera mounted on the communication terminal 10.
- the communication terminal 10 transmits the plurality of captured images captured to the object detection server 30.
- the acquired plurality of captured images are used for detecting an object by the object detection server 30 and the like.
- the communication terminal 10 has a storage unit 11, a transmission unit 12, and an output unit 13.
- the storage unit 11 stores various information such as a plurality of captured images and notification information acquired from the determination server 50.
- the transmission unit 12 transmits a plurality of captured images to the object detection server 30 and the determination server 50.
- the output unit 13 outputs a specific output based on the notification information stored in the storage unit 11. Specifically, the output unit 13 may emit a dangerous sound from the speaker included in the communication terminal 10. Further, the output unit 13 may display a dangerous image on the screen of the communication terminal 10.
- the object detection server 30 is a server that detects an object in the area around the user based on a plurality of captured images acquired from the communication terminal 10.
- the object detection server 30 detects an object for each of the plurality of captured images.
- the object detection server 30 has a storage unit 31, an acquisition unit 32, and a detection unit 33 as functional components.
- the storage unit 31 stores the data 300.
- The data 300 associates a template of each object listed in advance as an object that can come into contact with the user with the name of that object. The data 300 may be limited to objects selected as capable of contacting the user, or may cover various objects without any such selection.
- The storage unit 31 may be provided outside the object detection server 30. That is, the data 300 may be stored in a server external to the object detection server 30.
- the acquisition unit 32 acquires a plurality of captured images that are continuous in time from the communication terminal 10.
- Each captured image is an image captured by a communication terminal 10 worn by the user and relating to an area around the user.
- The plurality of captured images may be any number sufficient for the determination unit 53, described later, to estimate the movement of the object, the distance between the user and the object, the approach speed toward the user, and the like.
- the detection unit 33 detects an object in the area around the user based on the plurality of captured images captured by the communication terminal 10 and the data 300 stored in the storage unit 31. Specifically, the detection unit 33 detects an object captured in each captured image by a known image recognition process using the data of the template of the data 300. Then, the detection unit 33 transmits the object information, which is the information about the detected object, to the determination server 50 as the detection result.
- the object information includes the name of the detected object, the position information of the object in each captured image, the time information when the captured image was captured, and the like.
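As a hypothetical sketch, the object information exchanged between the two servers could be represented as a simple record. The field names and types here are illustrative assumptions, not taken from the publication:

```python
from dataclasses import dataclass

@dataclass
class ObjectInfo:
    """Detection result sent from the object detection server 30 to the determination server 50."""
    name: str          # name of the detected object, e.g. "person"
    position: tuple    # position of the object in the captured image (x, y in pixels)
    timestamp: float   # time at which the captured image was captured (seconds)

# One record per detected object per captured image.
info = ObjectInfo(name="person", position=(320.0, 240.0), timestamp=12.5)
print(info.name)
```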
- the determination server 50 determines the degree of danger that the object detected by the detection unit 33 comes into contact with the user based on the result detected by the detection unit 33. Then, the determination server 50 generates notification information indicating that it is dangerous to the user based on the degree of danger, and outputs the generated notification information to the communication terminal 10.
- the determination server 50 includes a storage unit 51, an acquisition unit 52, a determination unit 53, a generation unit 54, and an output unit 55.
- the storage unit 51 stores information used for various processes performed by the determination server 50, such as determination of the degree of danger. Specifically, the storage unit 51 stores a plurality of captured images acquired from the communication terminal 10, object information acquired from the object detection server 30, and the like.
- the acquisition unit 52 acquires object information from the object detection server 30. Further, the acquisition unit 52 acquires a plurality of captured images from the communication terminal 10. The plurality of captured images are used to generate notification information by the generation unit 54, which will be described later.
- the determination unit 53 determines the degree of risk that the object detected by the detection unit 33 will come into contact with the user based on the result (object information) detected by the detection unit 33.
- the method of determining the degree of danger by the determination unit 53 will be described with reference to an example in which a plurality of captured images including the image P1 shown in FIG. 1 are acquired.
- The determination unit 53 determines the degree of danger based on the positions of the object in the plurality of captured images detected by the detection unit 33. As shown in FIG. 4, in the present embodiment, the degree of danger is classified into four ranks: "high", "medium", "low", and "none".
- The determination unit 53 determines each of the contact possibility, the distance between the user and the object, and the approach speed of the object toward the user, and based on these determination results, selects the most suitable degree of danger from "high", "medium", "low", and "none".
- Contact possibility is the possibility that an object will come into contact with the user, and is classified as, for example, either "contact" or "non-contact".
- the determination unit 53 determines the possibility of contact by performing the frontal determination and the movement vector determination.
- the frontal determination is the determination of the position of the object with respect to the user.
- the movement vector determination is a determination of a region where an object is predicted to move.
- In the frontal determination, the determination unit 53 determines whether or not the object is located in the frontal region with respect to the user based on the positions of the object in the plurality of captured images. In the present embodiment, the determination unit 53 determines whether or not the object is located within a predetermined angle from the user toward the straight-ahead front.
- The predetermined angle is, for example, ±15°. That is, in the present embodiment, the "frontal region" means a region of ±15° from the user toward the straight-ahead front when viewed from the vertical direction.
- the determination unit 53 determines whether or not the object is located in the front area based on the position of the object included in the object information.
- When the determination unit 53 determines in the frontal determination that the object is located within the above-mentioned ±15° region, the result is "front"; when it determines that the object is not located within that region, the result is "non-front".
- The frontal region is not limited to the region of ±15° from the user toward the straight-ahead front, and may be a region of another angle.
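The frontal determination can be sketched as follows. The ground coordinates and the conversion of an object position into an angle are illustrative assumptions; the publication only specifies the ±15° region:

```python
import math

FRONTAL_HALF_ANGLE_DEG = 15.0  # the ±15° frontal region of this embodiment

def frontal_determination(object_x: float, object_z: float) -> str:
    """Return "front" when the object lies within ±15° of the user's straight-ahead axis.

    object_x: lateral offset of the object relative to the user (m, right positive)
    object_z: forward distance of the object from the user (m)
    """
    if object_z <= 0:
        return "non-front"  # object beside or behind the user
    angle = math.degrees(math.atan2(abs(object_x), object_z))
    return "front" if angle <= FRONTAL_HALF_ANGLE_DEG else "non-front"

print(frontal_determination(1.0, 10.0))  # ~5.7° off-axis: front
print(frontal_determination(5.0, 10.0))  # ~26.6° off-axis: non-front
```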
- the determination unit 53 determines the movement vector of the object based on the position of the object included in the object information. Specifically, the determination unit 53 calculates the movement vector of the object by taking the difference in the position of the object between the continuous captured images.
- For example, the determination unit 53 takes the difference between the position of the object in one of the plurality of images (hereinafter, the "first image") and the position of the object in the image captured next after the first image (the "second image"). The determination unit 53 then takes the difference between the position of the object in the second image and the position of the object in the image captured after the second image (the "third image").
- In the movement vector determination, the determination unit 53 determines the region in which the object is predicted to move based on the calculated movement vector. Specifically, for an object in the frontal region, the determination unit 53 determines whether the object moves while staying in the frontal region or moves toward a region different from the frontal region. For an object in a region different from the frontal region, the determination unit 53 determines whether the object moves while staying outside the frontal region or moves toward the frontal region. In this way, the determination unit 53 determines the region in which the object is predicted to move based on the movement vector derived from the positions of the object in the plurality of captured images.
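A minimal sketch of the movement vector determination, assuming object positions have already been converted to ground coordinates (x lateral, z forward, in metres). The one-step extrapolation of the movement vector is an illustrative simplification:

```python
def movement_vector(p1, p2):
    """Difference of the object's positions between two consecutive captured images."""
    return (p2[0] - p1[0], p2[1] - p1[1])

def movement_vector_determination(positions, in_frontal_region):
    """Predict whether the object stays in, leaves, or moves toward the frontal region.

    positions: object positions in temporally consecutive captured images
    in_frontal_region: predicate telling whether a position lies in the frontal region
    """
    vx, vz = movement_vector(positions[-2], positions[-1])
    predicted = (positions[-1][0] + vx, positions[-1][1] + vz)  # extrapolate one step
    now, later = in_frontal_region(positions[-1]), in_frontal_region(predicted)
    if now:
        return "stays in frontal region" if later else "moves out of frontal region"
    return "moves toward frontal region" if later else "stays outside frontal region"

# ±15° frontal region: |x| <= z * tan(15°)
frontal = lambda p: p[1] > 0 and abs(p[0]) <= p[1] * 0.2679
print(movement_vector_determination([(0.0, 10.0), (0.0, 8.0)], frontal))
# stays in frontal region
```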
- The determination unit 53 performs the contact determination based on the respective results of the frontal determination and the movement vector determination. Specifically, when the result of the frontal determination is "front" and the result of the movement vector determination is "moves while staying in the frontal region", the determination unit 53 determines "contact". This is because an object located in the region in front of the user and moving within that region is likely to come into contact with the user.
- On the other hand, when the result of the frontal determination is "front" but the result of the movement vector determination is "moves toward a region different from the frontal region", the determination unit 53 determines "non-contact". This is because even if the object is located in the frontal region, the risk of the object contacting the user is low if the object is moving toward a different region. That is, when the object is located in the region in front of the user and is estimated, based on the movement vector, to move to a region different from the frontal region, the determination unit 53 judges the contact possibility to be lower than when the object is estimated to move while staying in the frontal region.
- Further, when the result of the frontal determination is "non-front" and the result of the movement vector determination is "moves while staying outside the frontal region", the determination unit 53 determines "non-contact". This is because an object moving in a region different from the frontal region is unlikely to come into contact with the user.
- On the other hand, when the result of the frontal determination is "non-front" but the result of the movement vector determination is "moves toward the frontal region", the determination unit 53 determines "contact". This is because when an object moving in a region different from the frontal region moves toward the frontal region, the risk of contact with the user is higher than when the object stays outside the frontal region. That is, when the object is located in a region different from the region in front of the user and is estimated, based on the movement vector, to move toward the frontal region, the determination unit 53 judges the contact possibility to be higher than when the object is estimated to move while staying outside the frontal region.
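Combining the four cases above, the contact determination reduces to a small decision rule. This is a sketch of the logic as described, not the patent's implementation:

```python
def contact_determination(frontal_result: str, vector_result: str) -> str:
    """Contact possibility from the frontal and movement vector determinations."""
    if frontal_result == "front":
        # In front of the user: contact unless the object is leaving the region.
        return "contact" if vector_result == "stays in frontal region" else "non-contact"
    # Outside the frontal region: contact only if the object is heading into it.
    return "contact" if vector_result == "moves toward frontal region" else "non-contact"

print(contact_determination("front", "stays in frontal region"))         # contact
print(contact_determination("non-front", "moves toward frontal region"))  # contact
print(contact_determination("front", "moves out of frontal region"))      # non-contact
```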
- The determination unit 53 also determines the distance between the user and the object based on the positions of the object in the plurality of captured images detected by the detection unit 33. As an example, the determination unit 53 estimates the distance between the user and the object based on the position information and time information of the object included in the object information. When the determination unit 53 determines that the estimated distance is smaller than a threshold value (second threshold value), it determines that the distance between the user and the object is "near"; when it determines that the estimated distance is not smaller than the threshold value, it determines that the distance is "far".
- This threshold value is a predetermined value, namely the limit allowable value of the distance between the user and the object at which the risk of the object coming into contact with the user is generally considered to be low.
- The determination unit 53 determines the approach speed based on the positions of the object in the plurality of captured images detected by the detection unit 33. As an example, the determination unit 53 estimates the approach speed based on the position information and time information of the object included in the object information. When the determination unit 53 determines that the estimated approach speed is larger than a threshold value (first threshold value), it determines that the approach speed is "fast"; when it determines that the estimated approach speed is not larger than the threshold value, it determines that the approach speed is "slow".
- This threshold value is a predetermined value, namely the limit allowable value of the approach speed at which the risk of the object coming into contact with the user is generally considered to be low.
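The two threshold comparisons can be sketched as below. The concrete threshold values are illustrative assumptions, since the publication only refers to them as the first and second threshold values:

```python
SECOND_THRESHOLD_M = 10.0   # limit allowable distance (assumed value)
FIRST_THRESHOLD_MPS = 2.0   # limit allowable approach speed (assumed value)

def distance_determination(distance_m: float) -> str:
    """Return "near" when the estimated user-object distance is smaller than the second threshold."""
    return "near" if distance_m < SECOND_THRESHOLD_M else "far"

def speed_determination(speed_mps: float) -> str:
    """Return "fast" when the estimated approach speed is larger than the first threshold."""
    return "fast" if speed_mps > FIRST_THRESHOLD_MPS else "slow"

print(distance_determination(5.0), speed_determination(3.0))   # near fast
```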
- The determination unit 53 determines the degree of danger based on the results of the contact possibility, distance, and approach speed determinations. First, when the determination unit 53 determines "contact" in the contact determination and "near" in the distance determination, it determines that the degree of danger is "high" regardless of the approach speed determination result.
- the determination unit 53 determines "contact” in the contact determination of the possibility of contact and "far” in the determination of the distance between the user and the object, the degree of danger is determined in consideration of the determination of the approach speed. judge. Specifically, when the determination unit 53 determines "contact” in the contact determination of the possibility of contact, “far” in the determination of the distance between the user and the object, and “fast” in the determination of the approach speed, the degree of danger. Is determined to be “medium”. On the other hand, when the determination unit 53 determines "contact” in the contact determination of the possibility of contact, “far” in the determination of the distance between the user and the object, and “slow” in the determination of the approach speed, the risk level is “low”. “. Further, when the determination unit 53 determines "non-contact” in the contact determination of the possibility of contact, it determines that the degree of danger is “none” regardless of the determination result of the distance between the user and the object and the approach speed. do.
- That is, when the determination unit 53 determines that the approach speed is larger than the first threshold value, it determines a higher degree of danger than when the approach speed is not larger than the first threshold value. Likewise, when the determination unit 53 determines that the distance between the user and the object is smaller than the second threshold value, it determines a higher degree of danger than when the distance is not smaller than the second threshold value.
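The rank selection of FIG. 4 then reduces to a lookup over the three determination results (a sketch of the rule as described above):

```python
def danger_degree(contact: str, distance: str, speed: str) -> str:
    """Degree of danger from the contact, distance, and approach speed determinations."""
    if contact == "non-contact":
        return "none"      # regardless of distance and approach speed
    if distance == "near":
        return "high"      # regardless of approach speed
    return "medium" if speed == "fast" else "low"

# The person H1 of FIG. 1: contact, far, fast -> "medium".
print(danger_degree("contact", "far", "fast"))  # medium
```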
- In the example shown in FIG. 1, the acquisition unit 32 acquires a plurality of captured images including the image P1.
- The detection unit 33 transmits information about the detected person H1 (the object name "person", and the position information and time information of the person H1 in each captured image) to the determination server 50 as the detection result.
- the acquisition unit 52 acquires information about the person H1, and the determination unit 53 determines the degree of danger.
- First, the determination unit 53 determines the contact possibility (that is, performs the contact determination). Specifically, based on the positions of the person H1 in the plurality of captured images including the image P1, since the person H1 (object) is located within the ±15° region toward the straight-ahead front, the determination unit 53 determines "front" in the frontal determination (that is, the person H1 is located in the frontal region). Further, in the example shown in FIG. 1, the person H1 moves while staying in the frontal region.
- the determination unit 53 determines "contact" in the contact determination based on the results of the frontal determination and the movement vector determination.
- the determination unit 53 determines the distance between the user and the person H1 and determines the approach speed.
- In the example shown in FIG. 1, the distance between the user and the person H1 is not smaller than the threshold value, and the person H1 approaches the user at a speed larger than the threshold value.
- the determination unit 53 determines that the distance between the user and the object is "far” and the approach speed is "fast” based on the position of the person H1 in the plurality of captured images including the image P1.
- Based on the results of the above determinations (contact possibility: "contact", distance between the user and the person H1: "far", approach speed: "fast"), the determination unit 53 determines that the degree of danger of the person H1 coming into contact with the user is "medium" (see FIG. 4). In this way, the degree of danger in the example shown in FIG. 1 is determined.
- the generation unit 54 generates notification information indicating that the user is dangerous based on the degree of danger determined by the determination unit 53.
- the generation unit 54 generates notification information so that the higher the risk, the easier it is for the user to recognize the danger of contact with an object.
- the notification information includes sound information and image information.
- the sound information is information for emitting a dangerous sound in the speaker of the communication terminal 10.
- the generation unit 54 generates sound information (notification information) so that a dangerous sound (sound according to the degree of danger) is emitted from a place corresponding to the position of the object in the speaker.
- The "place corresponding to the position of the object in the speaker" is, for example, the place in the speaker of the communication terminal 10 worn by the user (hereinafter simply referred to as the "speaker") that reflects the position of the object as seen by the user.
- For example, when the object as seen by the user is located in front of and to the left of the user (hereinafter simply "front left"), the place on the left side of the user's body in the speaker is the place corresponding to the position of the object.
- The generation unit 54 generates the notification information based on the position information of the object included in the object information so that the speaker emits a dangerous sound conveying the position of the object in the user's left-right direction. Specifically, for example, when the object as seen by the user is located at the front left, the generation unit 54 generates the notification information so that the dangerous sound is emitted from a place close to the user's left ear in the speaker.
- Further, the generation unit 54 may generate the notification information so that the speaker emits a dangerous sound conveying the position of the object in three dimensions (that is, in the user's left-right and front-back directions). Specifically, for example, when the object as seen by the user is located at the front left, the generation unit 54 generates the notification information so that the dangerous sound is emitted from a place close to the user's left ear in the speaker and the user hears it as coming from the front-left direction.
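One way to realize the left-right placement of the dangerous sound is constant-power stereo panning. This is an illustrative sketch under assumed ground coordinates, not a rendering method stated in the publication:

```python
import math

def stereo_gains(object_x: float, object_z: float) -> tuple:
    """Left/right speaker gains so the dangerous sound appears to come from the object.

    object_x: lateral offset of the object (m, right positive)
    object_z: forward distance of the object from the user (m)
    """
    angle = math.atan2(object_x, object_z)             # negative = left of the user
    pan = max(-1.0, min(1.0, angle / (math.pi / 2)))   # map ±90° to ±1
    left = math.cos((pan + 1.0) * math.pi / 4.0)       # constant-power pan law
    right = math.sin((pan + 1.0) * math.pi / 4.0)
    return left, right

left, right = stereo_gains(-3.0, 3.0)  # object at the front left, 45° off-axis
print(left > right)                    # True: louder near the user's left ear
```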
- A dangerous sound is a sound according to the degree of danger.
- The volume, content, and the like of the "sound according to the degree of danger" change with the level of the degree of danger: for example, the higher the degree of danger, the louder the sound, and the higher the degree of danger, the more strongly the sound calls the user's attention.
- For example, the generation unit 54 generates, as the dangerous sound, a sound that conveys the position of the object, the name of the object, the fact that the object is approaching, and the mode in which the object is approaching.
- the sound is generated based on, for example, the name of the object included in the object information, the distance between the user and the object, and the approach speed.
- Examples of the mode in which the object is approaching include, when the object is a person, modes such as running or walking. In the example shown in FIG. 3, the person H1, who is the object, is running toward the user from 5 m ahead in front of the user. Therefore, the generation unit 54 generates the dangerous sound M1 conveying that the person H1 is running and approaching from 5 m ahead in front.
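The content and volume of the dangerous sound can be sketched as below. The message wording and the volume values are illustrative assumptions, not taken from the publication:

```python
# Louder as the degree of danger rises (assumed values).
VOLUME = {"high": 1.0, "medium": 0.6, "low": 0.3}

def danger_message(name: str, distance_m: float, mode: str) -> str:
    """Spoken content conveying the object's name, position, and mode of approach."""
    return f"A {name} is {mode} toward you from {distance_m:.0f} m ahead."

# The dangerous sound M1 of FIG. 3: a running person 5 m ahead, danger degree "medium".
print(danger_message("person", 5.0, "running"), "at volume", VOLUME["medium"])
```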
- the generation unit 54 generates image information (notification information) so that a danger image is displayed on the screen of the communication terminal 10 (hereinafter simply referred to as the "screen").
- the danger image is an image that makes the danger of contact with the object easier for the user to recognize as the degree of danger increases.
- Information about the object is displayed in the danger image.
- the information about the object is, for example, the captured image acquired by the acquisition unit 32; that is, the danger image is an image based on the captured image. Further, the danger image may include (superimpose) other information different from the object depending on the degree of danger (specifically, the lower the degree of danger, the more such information is included).
- the other information is information different from the object, such as information related to a message application (for example, e-mail to the user) or a map application, and is acquired from, for example, another server (not shown).
- the generation unit 54 generates image information so that temporally continuous danger images, each corresponding to one of the plurality of captured images, are displayed on the screen.
- the generation unit 54 generates a normal image, in which other information is superimposed on the captured image, when there is no object in the area in front of the user or when the determination unit 53 determines that the degree of danger is "none".
- the generation unit 54 generates image information so that temporally continuous normal images, each corresponding to one of the plurality of captured images, are displayed on the screen.
- the image P3 shown in FIG. 5 is an example of a normal image: an image in which other information (information different from the information about the object) Z, namely message information, weather information, and map information, is superimposed on one image Pn included in the plurality of captured images. In the image P3, the other information Z is superimposed on the image Pn in a non-transparent display state.
- when the determination unit 53 determines that the degree of danger is "low", the generation unit 54 generates a danger image in which the other information is superimposed on the captured image.
- the image (danger image) P4 shown in FIG. 6 is an example of the danger image when the degree of danger is "low": an image in which the other information Z is superimposed on one image Pd included in the plurality of captured images. In the image P4, however, the image Pd (the information about the object) is conspicuously displayed.
- the generation unit 54 generates the danger image by setting the display of the other information to a predetermined transparency.
- the other information Z included in the image P3 is in a non-transparent display state, while the other information Z included in the image P4 is displayed semi-transparently.
- in the image information when the degree of danger is determined to be "low", the other information Z is thus less conspicuous than in the normal image (in other words, the information about the object, the image Pd, is generated so as to be conspicuous).
- the generation unit 54 generates only the captured image as the danger image when the determination unit 53 determines that the degree of danger is "medium" or "high". That is, the danger image generated when the degree of danger is determined to be "medium" or "high" has no other information superimposed on the captured image, and in this respect differs from the danger image generated when the degree of danger is determined to be "low".
- the image (danger image) P2 shown in FIG. 7 is an example of the danger image when the determination unit 53 determines that the degree of danger is "medium" or "high". In the image P2, the information about the object is conspicuously displayed.
- in this case, the generation unit 54 generates the danger image with the other information hidden.
- while the other information Z is displayed semi-transparently in the image P4 generated when the degree of danger is "low", the other information Z is not included in the image P2.
- in the image P2, a frame surrounding the person H1, who is the object, is displayed.
- in the image information when the degree of danger is determined to be "medium" or "high", the other information is thus generated so as to be less conspicuous than when the degree of danger is determined to be "low" (in other words, the information about the object is made even more conspicuous).
- in this way, the generation unit 54 generates image information (notification information) so that the higher the degree of danger, the easier it is for the user to recognize the danger of contact with the object.
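The relationship between the determined degree of danger and the rendering of the other information Z can be summarized as a small lookup. This is only a sketch: the opacity values are assumptions chosen to be consistent with the described images P3, P4, and P2, not values fixed by the patent.

```python
# Opacity of the "other information" Z overlay for each degree of danger,
# mirroring the normal image P3 (opaque), the low-danger image P4
# (semi-transparent), and the medium/high-danger image P2 (hidden).
OVERLAY_OPACITY = {
    "none":   1.0,  # normal image: other information fully opaque (P3)
    "low":    0.5,  # danger image: other information semi-transparent (P4)
    "medium": 0.0,  # danger image: other information hidden (P2)
    "high":   0.0,  # danger image: other information hidden, object framed (P2)
}

def overlay_opacity(degree_of_danger):
    """Return how opaquely the other information Z is drawn on the image."""
    return OVERLAY_OPACITY[degree_of_danger]
```

A renderer would multiply the overlay's alpha channel by this value before compositing it onto the captured image.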
- the output unit 55 outputs the notification information generated by the generation unit 54.
- the output notification information is acquired by the communication terminal 10.
- a danger sound is then emitted from the speaker of the communication terminal 10, and a danger image is displayed on the screen of the communication terminal 10.
- when the degree of danger is determined to be "high", the notification information (sound information and image information corresponding to the degree of danger "high") is output: the danger sound M1 is emitted from the speaker of the communication terminal 10, and the image P2, which is a danger image, is displayed on the screen of the communication terminal 10 (see FIG. 3).
- the danger sound M1 is emitted from the place in the speaker corresponding to the person H1 (a central place in the user's left-right direction).
- the danger sound M1 is emitted as a sound corresponding to the degree of danger "high", at a louder volume than when the degree of danger is determined to be "low" or "medium".
- the image P2 is one of the temporally continuous danger images corresponding to the plurality of captured images, and is an image in which a frame surrounding the person H1 is superimposed on the image P1. As described above, in the information notification system 1, the user is notified of notification information indicating that the user is in danger.
- FIG. 8 is a flowchart showing a process performed by the information notification system 1.
- the object detection server 30 acquires a plurality of temporally continuous captured images from the communication terminal 10 attached to the user (step S11). Specifically, the object detection server 30 acquires a plurality of captured images, captured by the communication terminal 10, relating to the area around the user.
- the object detection server 30 detects an object in the area around the user based on the plurality of captured images acquired in step S11 and the data 300 stored in the object detection server 30 (step S12). The object information detected by the object detection server 30 is then transmitted to the determination server 50.
- the determination server 50 determines the degree of danger of the detected object coming into contact with the user based on the result (object information) detected by the object detection server 30 (step S13).
- the process of step S13 will be described in detail with reference to the flowchart of FIG.
- the determination server 50 determines whether or not the object is located in the area in front of the user based on the object information (step S21). Specifically, the determination unit 53 determines whether or not the object is located in a region within ±15° of the user's front direction.
- when it is determined that the object is located in the front area (step S21: YES), the determination server 50 determines, based on the movement vector of the object, whether or not the object moves toward an area different from the front area (step S22). Specifically, the determination server 50 calculates the movement vector based on the positions of the object included in the object information, and determines the region toward which the object is predicted to move based on the calculated movement vector. When it is determined that the object moves toward an area different from the front area (step S22: YES), the determination server 50 determines that the degree of danger is "none" (step S23), and the process shown in FIG. 9 ends. On the other hand, when it is determined that the object does not move toward an area different from the front area (step S22: NO), the process proceeds to step S25.
- when it is determined that the object is not located in the front area (step S21: NO), the determination server 50 determines, based on the movement vector of the object, whether or not the object moves toward the front area (step S24). If it is determined that the object moves toward the front area (step S24: YES), the process proceeds to step S25. On the other hand, when it is determined that the object does not move toward the front area (step S24: NO), the process proceeds to step S23.
- the determination server 50 determines whether or not the distance between the user and the object, estimated based on the position information and the time information of the object included in the object information, is smaller than the threshold value (second threshold value) (step S25).
- when it is determined that the estimated distance is smaller than the threshold value (step S25: YES), the determination server 50 determines that the degree of danger is "high" (step S26), and the process shown in FIG. 9 ends.
- when it is determined that the estimated distance is not smaller than the threshold value (step S25: NO), the determination server 50 determines whether or not the approach speed, estimated based on the position information and the time information of the object included in the object information, is larger than the threshold value (first threshold value) (step S27).
- when it is determined that the estimated approach speed is larger than the threshold value (step S27: YES), the determination server 50 determines that the degree of danger is "medium" (step S28), and the process shown in FIG. 9 ends. On the other hand, when it is determined that the estimated approach speed is not larger than the threshold value (step S27: NO), the determination server 50 determines that the degree of danger is "low" (step S29), and the process shown in FIG. 9 ends.
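The branching of steps S21 to S29 can be sketched as a single function. The argument names and the concrete threshold values are illustrative assumptions; the patent does not fix any numeric thresholds.

```python
def determine_degree_of_danger(in_front_area, moves_toward_front,
                               distance_m, approach_speed_mps,
                               first_threshold=1.0, second_threshold=3.0):
    """Sketch of the flow of FIG. 9 (steps S21-S29).

    `moves_toward_front` stands in for the movement-vector check: for an
    object already in the front area it means the object does NOT move
    toward a different area (S22: NO); for an object outside the front
    area it means the object moves toward it (S24: YES).
    """
    if in_front_area:
        # S22: object in the front area but heading out of it
        if not moves_toward_front:
            return "none"                  # S23
    else:
        # S24: object outside the front area and not heading into it
        if not moves_toward_front:
            return "none"                  # S23
    # S25: a close object is immediately dangerous
    if distance_m < second_threshold:
        return "high"                      # S26
    # S27: a fast-approaching object is moderately dangerous
    if approach_speed_mps > first_threshold:
        return "medium"                    # S28
    return "low"                           # S29
```

Note that the distance check (S25) takes priority over the speed check (S27), so a slow but very close object is still classified "high".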
- the determination server 50 then generates notification information indicating that the user is in danger based on the degree of danger (step S14). Specifically, the determination server 50 generates sound information so that the danger sound is emitted from the place in the speaker corresponding to the position of the object, and generates image information so that the danger image is displayed on the screen of the communication terminal 10.
- the determination server 50 outputs the generated notification information to the communication terminal 10 (step S15).
- the notification information is acquired by the communication terminal 10, whereupon a danger sound is emitted from the speaker of the communication terminal 10 and a danger image is displayed on the screen of the communication terminal 10.
- as described above, the information notification system 1 includes: an acquisition unit 32 that acquires one or a plurality of captured images, captured by the communication terminal 10 mounted on the user, relating to the area around the user; a detection unit 33 that detects an object in the area around the user based on the one or plurality of captured images; a determination unit 53 that determines, based on the result detected by the detection unit 33, the degree of danger of the object detected by the detection unit 33 coming into contact with the user; a generation unit 54 that generates notification information indicating that the user is in danger based on the degree of danger; and an output unit 55 that outputs the notification information to the communication terminal 10. The generation unit 54 generates the notification information so that the higher the degree of danger, the easier it is for the user to recognize the danger of contact with the object.
- in the information notification system 1, an object approaching the user is recognized based on the plurality of captured images relating to the area around the user, and the degree of danger of the object coming into contact with the user is determined. Notification information is then generated so that the danger of contact with the object becomes easier for the user to recognize as the degree of danger increases, and the notification information is output to the communication terminal 10. With such a configuration, the user can be notified in a manner suited to the danger of contact with the object. Specifically, for example, when the danger of the object coming into contact with the user is low, outputting only notification information to the extent of informing the user of the approach of the object prevents the user from being annoyed by excessive notification unsuited to the determined degree of danger.
- on the other hand, when the danger is high, notification information emphasizing that the danger is imminent is output to the user, so that contact between the object and the user can be appropriately avoided.
- according to the information notification system 1, necessary information can thus be appropriately notified to the user according to the user's degree of danger.
- the information notification system 1 also suppresses excessive notifications unsuited to the determined degree of danger, and thus has the technical effect of reducing the processing load.
- the acquisition unit 32 acquires a plurality of temporally continuous captured images, and the determination unit 53 determines the degree of danger of the object coming into contact with the user based on the positions of the object in the plurality of captured images detected by the detection unit 33. By taking into account the position of the object across the plurality of temporally continuous captured images (for example, the change in the position of the object between the captured images), the degree of danger of the object coming into contact with the user can be determined with higher accuracy.
- the determination unit 53 determines the contact possibility, that is, the possibility that the object comes into contact with the user, based on the positions of the object in the plurality of captured images, and determines the degree of danger based on the determination result of the contact possibility.
- the contact possibility is determined based on whether or not the object is located in the area in front of the user and on the movement vector of the object derived from the positions of the object in the plurality of captured images.
- in other words, the contact possibility of the object is determined based on the position of the object with respect to the user and the moving direction of the object. By determining the degree of danger from the determination result of the contact possibility obtained in this way, the degree of danger can be raised when, for example, the possibility of contact with the object is high, and can therefore be determined with higher accuracy.
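The two quantities this determination rests on, the front-area check and the movement vector, can be sketched minimally under an assumed user-centred coordinate system (x: the user's left-right axis, y: the front-back axis; the patent does not specify a coordinate convention):

```python
import math

def movement_vector(pos_prev, pos_curr):
    """Movement vector of the object between two temporally
    consecutive captured images, as (dx, dy)."""
    return (pos_curr[0] - pos_prev[0], pos_curr[1] - pos_prev[1])

def in_front_area(position, half_angle_deg=15.0):
    """True if the object lies within +/-15 degrees of the user's
    front direction (the region checked in step S21)."""
    x, y = position
    if y <= 0:  # object level with or behind the user
        return False
    return abs(math.degrees(math.atan2(x, y))) <= half_angle_deg
```

Whether the object "moves toward the front area" can then be estimated by checking whether adding the movement vector to the current position brings it inside `in_front_area`.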
- when the determination unit 53 estimates that the object is located in an area different from the front area with respect to the user and, based on the movement vector, estimates that the object moves toward the front area, it judges the contact possibility to be higher than when it estimates that the object stays and moves within the area different from the front area.
- in general, an object moving in an area different from the front area is unlikely to come into contact with the user.
- however, when such an object moves toward the front area, the danger of the object coming into contact with the user is considered higher than when the object stays and moves within the area different from the front area.
- if the determination unit 53 determined the degree of danger based only on the position of the object with respect to the user, the contact possibility would be judged low even when the object moves toward the front area, and as a result the object might come into contact with the user.
- in the information notification system 1, the contact possibility is determined in consideration of whether or not an object located in an area different from the front area moves toward the front area, so that the degree of danger can be determined with higher accuracy.
- conversely, when the determination unit 53 estimates that the object is located in the front area with respect to the user and, based on the movement vector, estimates that the object moves toward an area different from the front area, it judges the contact possibility to be lower than when it estimates that the object stays and moves within the front area.
- in general, an object moving in the front area is likely to come into contact with the user.
- however, if the determination unit 53 determined the degree of danger based only on the position of the object with respect to the user, the contact possibility would be judged high whenever the object is located in the front area, even if the object merely crosses in front of the user, and as a result excessive notification unsuited to the determined degree of danger might be given.
- in the information notification system 1, the contact possibility is determined in consideration of whether or not an object located in the front area moves toward an area different from the front area, so that the degree of danger can be determined with higher accuracy.
- when the determination unit 53 determines that the speed at which the object moves in the direction approaching the user is greater than the first threshold value, it determines the degree of danger to be higher than when it determines that the speed is not greater than the first threshold value.
- the danger of the object coming into contact with the user depends on the speed at which the object approaches the user. Specifically, for example, even when the object moves in the front area, if the object approaches the user at a low speed, the danger of contact between the user and the object is considered low.
- in the information notification system 1, the degree of danger is determined in consideration of the speed at which the object approaches the user, so that it can be determined with higher accuracy.
- when the determination unit 53 determines that the distance between the user and the object is smaller than the second threshold value, it determines the degree of danger to be higher than when it determines that the distance is not smaller than the second threshold value.
- the danger of the object coming into contact with the user depends on the distance between the user and the object. Specifically, for example, even if the object is moving in the area in front of the user, if the object is located far from the user, the danger of the object coming into contact with the user can be said to be low.
- in the information notification system 1, the degree of danger is determined in consideration of the distance between the object and the user, so that it can be determined with higher accuracy.
- the notification information includes information notified by sound from the speaker of the communication terminal 10, and the generation unit 54 generates the notification information so that the sound is emitted from the place in the speaker corresponding to the position of the object. As a result, even a visually impaired user, for example, can be appropriately notified according to the user's degree of danger, and the user can more easily grasp the location of an object with a high possibility of contact.
- the notification information includes information notified by a danger image displayed on the screen of the communication terminal 10 in a mode corresponding to the degree of danger, and the generation unit 54 generates the notification information so that, in the danger image, the information about the object becomes more conspicuous as the degree of danger increases. In particular, in the information notification system 1, the generation unit 54 generates the notification information so that the information about the object is made conspicuous by making the other information less conspicuous as the degree of danger increases.
- according to the information notification system 1, necessary information can be appropriately notified to the user by adjusting the balance between the display of the information about the object and the display of the other information according to the user's degree of danger.
- the communication terminal 10, the object detection server 30, and the determination server 50 may each be physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007, and the like.
- in the following description, the word "device" can be read as a circuit, a device, a unit, or the like.
- the hardware configurations of the communication terminal 10, the object detection server 30, and the determination server 50 may include one or more of the devices shown in FIG. 10, or may be configured without some of the devices.
- each function in the communication terminal 10, the object detection server 30, and the determination server 50 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, whereby the processor 1001 performs operations and controls communication by the communication device 1004 and the reading and/or writing of data in the memory 1002 and the storage 1003.
- the processor 1001 controls the entire computer by, for example, operating an operating system.
- the processor 1001 may be configured by a central processing unit (CPU: Central Processing Unit) including an interface with peripheral devices, a control device, an arithmetic unit, a register, and the like.
- the control function of the detection unit 33 of the object detection server 30 may be realized by the processor 1001.
- the processor 1001 reads a program (program code), software modules, and data from the storage 1003 and/or the communication device 1004 into the memory 1002, and executes various processes in accordance with them.
- as the program, a program that causes a computer to execute at least a part of the operations described in the above embodiment is used.
- for example, the control function of the detection unit 33 of the object detection server 30 may be realized by a control program stored in the memory 1002 and operated by the processor 1001, and the other functional blocks may be realized similarly.
- the various processes described above may be executed by one processor 1001, or may be executed simultaneously or sequentially by two or more processors 1001.
- Processor 1001 may be mounted on one or more chips.
- the program may be transmitted from the network via a telecommunication line.
- the memory 1002 is a computer-readable recording medium, and may be composed of at least one of, for example, a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable ROM), and a RAM (Random Access Memory).
- the memory 1002 may be referred to as a register, a cache, a main memory (main storage device), or the like.
- the memory 1002 can store a program (program code), a software module, and the like that can be executed to implement the wireless communication method according to the embodiment of the present invention.
- the storage 1003 is a computer-readable recording medium, and may be composed of at least one of, for example, an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, a magnetic strip, and the like.
- the storage 1003 may be referred to as an auxiliary storage device.
- the above-mentioned storage medium may be, for example, a database, a server, or another suitable medium including the memory 1002 and/or the storage 1003.
- the communication device 1004 is hardware (a transmission/reception device) for communication between computers via a wired and/or wireless network, and is also referred to as, for example, a network device, a network controller, a network card, or a communication module.
- the input device 1005 is an input device (for example, a keyboard, a mouse, a microphone, a switch, a button, a sensor, etc.) that accepts an input from the outside.
- the output device 1006 is an output device (for example, a display, a speaker, an LED lamp, etc.) that outputs to the outside.
- the input device 1005 and the output device 1006 may have an integrated configuration (for example, a touch panel).
- each device such as the processor 1001 and the memory 1002 is connected by the bus 1007 for communicating information.
- the bus 1007 may be composed of a single bus or may be composed of different buses between the devices.
- the communication terminal 10, the object detection server 30, and the determination server 50 may be configured to include hardware such as a microprocessor, a digital signal processor (DSP: Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), and an FPGA (Field Programmable Gate Array), and a part or all of each functional block may be realized by such hardware.
- for example, the processor 1001 may be implemented by at least one of these pieces of hardware.
- although the information notification system 1 has been described as being configured to include the communication terminal 10, the object detection server 30, and the determination server 50, the present invention is not limited to this, and each function of the information notification system 1 may be realized by the communication terminal 10 alone.
- similarly, although the object detection server 30 has been described as having the acquisition unit 32 and the detection unit 33, and the determination server 50 as having the determination unit 53, the generation unit 54, and the output unit 55, another server may include some or all of these functional components, and the communication terminal 10 may include some of them.
- the degree of danger may also be determined for an object located in the area behind the user or to the user's right or left, and notification information may be generated based on that degree of danger.
- although the notification information has been described as sound information and image information, the notification information may be only sound information or only image information, or may be other information such as optical information for lighting a lamp of the communication terminal 10.
- the acquisition unit 32 may acquire one captured image, captured by the communication terminal 10 mounted on the user, relating to the area around the user, and the detection unit 33 may detect an object in the area around the user based on that one captured image.
- the determination unit 53 may determine the contact possibility based on whether or not the object is located in the area in front of the user and on the movement vector of the object derived from the positions of the object in a plurality of captured images.
- the determination unit 53 may determine the degree of danger based on the result detected by the detection unit 33 in other ways. As an example, the determination unit 53 may determine the degree of danger based only on the contact possibility, or may determine the degree of danger using a method different from the method of comprehensively considering the contact possibility, the distance between the user and the object, and the approach speed.
- the generation unit 54 may likewise generate the notification information in other ways so that, in the danger image, the information about the object becomes more conspicuous as the degree of danger increases. As an example, the generation unit 54 may generate the notification information so that the other information is displayed smaller as the degree of danger increases.
- information other than the notification information may also be notified to the user.
- specifically, information related to sounds other than the notification information may be emitted from the speaker included in the communication terminal 10.
- for example, the detection unit 33 may detect, as an object, the object H2, which is a signboard placed on the road, in addition to the person H1, and the generation unit 54 may generate, based on the information about the object H2, information for emitting a sound conveying the position and the name of the object.
- the object H2 is placed 5 m in front of the user and 1 m to the user's left. The generation unit 54 therefore generates information for emitting a sound M2 indicating that there is a signboard 5 m ahead and 1 m to the left. Further, for example, the detection unit 33 may detect, as an object, the object H3, which is a signboard bearing a store name on the street, and the generation unit 54 may generate, based on the information about the object H3, information for emitting a sound M3 conveying the position of the object H3, the type of store indicated by the object H3, and the name of the store.
- the type of store and the name of the store are stored in, for example, the data 300 of the object detection server 30.
- when the output unit 55 outputs the information for emitting these sounds, the sound M2 is emitted from the place in the speaker corresponding to the object H2 (the left side in the left-right direction of the speaker with the communication terminal 10 attached to the user), and the sound M3 is emitted from the place corresponding to the object H3 (the center in the left-right direction of the speaker with the communication terminal 10 attached to the user).
- Each aspect/embodiment described in the present specification may be applied to systems utilizing LTE (Long Term Evolution), LTE-A (LTE-Advanced), SUPER 3G, IMT-Advanced, 4G, 5G, FRA (Future Radio Access), W-CDMA (registered trademark), GSM (registered trademark), CDMA2000, UMB (Ultra Mobile Broadband), IEEE 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, UWB (Ultra-Wideband), WiMAX (registered trademark), and other suitable systems, and/or to next-generation systems extended based on them.
- the input/output information and the like may be stored in a specific place (for example, a memory) or may be managed by a management table. Input/output information may be overwritten, updated, or appended. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
- the determination may be made by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by numerical comparison (for example, comparison with a predetermined value).
- notification of predetermined information (for example, notification of "being X") is not limited to explicit notification, and may be performed implicitly (for example, by not performing the notification of the predetermined information).
- Software, whether referred to as software, firmware, middleware, microcode, hardware description language, or by any other name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, execution threads, procedures, functions, and the like.
- software, instructions, etc. may be transmitted and received via a transmission medium.
- for example, when software is transmitted from a website, a server, or another remote source using wired technology such as coaxial cable, fiber optic cable, twisted pair and digital subscriber line (DSL) and/or wireless technology such as infrared, radio and microwave, these wired and/or wireless technologies are included within the definition of a transmission medium.
- the information, signals, etc. described herein may be represented using any of a variety of different techniques.
- data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination thereof.
- information, parameters, and the like described in the present specification may be represented by an absolute value, by a relative value from a predetermined value, or by other corresponding information.
- the communication terminal 10 may also be referred to by those skilled in the art as a mobile communication terminal, subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term.
- any reference to an element using a designation such as "first" or "second" as used herein does not generally limit the quantity or order of those elements. These designations may be used herein as a convenient way of distinguishing between two or more elements. Thus, a reference to first and second elements does not mean that only two elements may be adopted there, or that the first element must in some way precede the second element.
- 1 ... Information notification system, 10 ... Communication terminal (terminal), 32 ... Acquisition unit, 33 ... Detection unit, 53 ... Determination unit, 54 ... Generation unit, 55 ... Output unit, H1 ... Person (object), M1 ... Danger sound (sound), P1 ... Image (captured image), P2, P4 ... Images (danger images), Pd ... Image (information about the object), Z ... Other information (information different from the information about the object).
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Health & Medical Sciences (AREA)
- Epidemiology (AREA)
- Pain & Pain Management (AREA)
- Physical Education & Sports Medicine (AREA)
- Rehabilitation Therapy (AREA)
- Life Sciences & Earth Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Alarm Systems (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
- Rehabilitation Tools (AREA)
- Closed-Circuit Television Systems (AREA)
- Emergency Alarm Devices (AREA)
Abstract
The invention relates to an information notification system comprising: an acquisition unit that acquires one or more captured images of the area around a user, captured by a communication terminal worn by the user; a detection unit that detects an object in the area around the user on the basis of the captured image(s); a determination unit that determines, on the basis of the detection results of the detection unit, the degree of danger that the object detected by the detection unit will come into contact with the user; a generation unit that generates, on the basis of the degree of danger, notification information indicating that the user is in danger; and an output unit that outputs the notification information to the communication terminal. The generation unit generates the notification information such that the higher the degree of danger, the more easily the user perceives the degree of danger of contact with the object.
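The pipeline described in the abstract (detect objects around the user, determine a degree of danger of contact, then generate notification information that is easier to perceive the higher the danger) can be sketched as follows. This is a minimal illustration under stated assumptions: the object representation, the 10 m / 5 m/s normalization constants, and the way volume and repetition scale with danger are invented for the example, not values from the application.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str                # e.g. "person", "bicycle"
    distance_m: float         # estimated distance from the user
    closing_speed_mps: float  # positive when the object approaches the user

def danger_degree(obj):
    """Illustrative score in [0, 1]: closer, faster-approaching objects score higher."""
    proximity = max(0.0, 1.0 - obj.distance_m / 10.0)          # 0 beyond 10 m
    approach = min(1.0, max(0.0, obj.closing_speed_mps / 5.0)) # saturates at 5 m/s
    return min(1.0, 0.5 * proximity + 0.5 * approach)

def notification(obj):
    """The higher the degree of danger, the louder and more repeated the warning,
    so the user perceives the danger of contact more easily."""
    d = danger_degree(obj)
    return {"message": f"{obj.label} approaching",
            "volume": d,                # louder with higher danger
            "repeat": 1 + int(d * 4)}   # repeated more often with higher danger
```

A real implementation would derive distance and closing speed from the captured images (for example, from object size change across frames); here they are given directly to keep the sketch self-contained.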
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022521807A JP7504201B2 (ja) | 2020-05-12 | 2021-04-23 | Information notification system |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020083731 | 2020-05-12 | ||
| JP2020-083731 | 2020-05-12 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021230049A1 true WO2021230049A1 (fr) | 2021-11-18 |
Family
ID=78525674
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/016518 Ceased WO2021230049A1 (fr) | 2020-05-12 | 2021-04-23 | Information notification system |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP7504201B2 (fr) |
| WO (1) | WO2021230049A1 (fr) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007148835A (ja) * | 2005-11-28 | 2007-06-14 | Fujitsu Ten Ltd | Object discrimination device, notification control device, object discrimination method, and object discrimination program |
| JP2009110065A (ja) * | 2007-10-26 | 2009-05-21 | Toyota Central R&D Labs Inc | Driving support device |
| JP2015118667A (ja) * | 2013-12-20 | 2015-06-25 | Taisei Kaken Co., Ltd. | Approach notification device |
| US20160253560A1 (en) * | 2015-02-27 | 2016-09-01 | Sony Corporation | Visibility enhancement devices, systems, and methods |
| JP2016186786A (ja) * | 2015-01-21 | 2016-10-27 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable smart device for danger detection and warning based on image and audio data |
| JP2017536595A (ja) * | 2014-09-26 | 2017-12-07 | Harman International Industries, Inc. | Pedestrian information system |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014096661A (ja) * | 2012-11-08 | 2014-05-22 | International Business Machines Corporation | Method for concealing a moving object in a moving image in real time during shooting, and moving-image shooting device and program for the device |
| EP3001289A4 (fr) * | 2013-05-23 | 2017-01-18 | Pioneer Corporation | Display controller |
| JP2019066564A (ja) * | 2017-09-28 | 2019-04-25 | Nippon Seiki Co., Ltd. | Display device, display control method, and program |
2021
- 2021-04-23 JP JP2022521807A patent/JP7504201B2/ja active Active
- 2021-04-23 WO PCT/JP2021/016518 patent/WO2021230049A1/fr not_active Ceased
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007148835A (ja) * | 2005-11-28 | 2007-06-14 | Fujitsu Ten Ltd | Object discrimination device, notification control device, object discrimination method, and object discrimination program |
| JP2009110065A (ja) * | 2007-10-26 | 2009-05-21 | Toyota Central R&D Labs Inc | Driving support device |
| JP2015118667A (ja) * | 2013-12-20 | 2015-06-25 | Taisei Kaken Co., Ltd. | Approach notification device |
| JP2017536595A (ja) * | 2014-09-26 | 2017-12-07 | Harman International Industries, Inc. | Pedestrian information system |
| JP2016186786A (ja) * | 2015-01-21 | 2016-10-27 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable smart device for danger detection and warning based on image and audio data |
| US20160253560A1 (en) * | 2015-02-27 | 2016-09-01 | Sony Corporation | Visibility enhancement devices, systems, and methods |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2021230049A1 (fr) | 2021-11-18 |
| JP7504201B2 (ja) | 2024-06-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN115175086B (zh) | System and method for spatial audio enabling safe headphone use during exercise and commuting | |
| KR101668165B1 (ko) | Displaying sound indications on a wearable computing system | |
| KR102244856B1 (ko) | Method for providing user interaction with a wearable device, and wearable device performing the same | |
| US10096301B2 (en) | Method for controlling function and electronic device thereof | |
| US9390607B2 (en) | Smart device safety mechanism | |
| US20150094118A1 (en) | Mobile device edge view display insert | |
| US10848606B2 (en) | Divided display of multiple cameras | |
| KR20150129423A (ko) | Electronic device and gesture recognition method of electronic device | |
| US20150158426A1 (en) | Apparatus, control method thereof and computer-readable storage medium | |
| US10085107B2 (en) | Sound signal reproduction device, sound signal reproduction method, program, and recording medium | |
| KR20200081466A (ko) | Image processing method, image processing apparatus, electronic device, and storage medium | |
| KR20150099650A (ko) | Method and apparatus for displaying biometric information | |
| US9826303B2 (en) | Portable terminal and portable terminal system | |
| JP2024506809A (ja) | Method and apparatus for identifying dangerous behavior, electronic device, and storage medium | |
| CN107219920A (zh) | 基于场景的ar眼镜识别方法、装置和ar眼镜 | |
| US12444142B2 (en) | Positioning system to position a terminal carried by a user in a vehicle | |
| US20220198794A1 (en) | Related information output device | |
| WO2021230049A1 (fr) | Information notification system | |
| WO2023084945A1 (fr) | Output control device | |
| KR101614315B1 (ko) | Wearable device and control method therefor | |
| JP7246255B2 (ja) | Information processing apparatus and program | |
| KR102374400B1 (ko) | Image processing method and device, electronic device, and storage medium | |
| US20230091669A1 (en) | Information processing apparatus, information processing system, and non-transitory computer readable medium storing program | |
| WO2020230892A1 (fr) | Processing device | |
| WO2021172137A1 (fr) | Content sharing system and terminal | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21803162 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2022521807 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 21803162 Country of ref document: EP Kind code of ref document: A1 |