
WO2024236748A1 - Abnormality determination device and abnormality determination method - Google Patents

Abnormality determination device and abnormality determination method Download PDF

Info

Publication number
WO2024236748A1
WO2024236748A1 (application PCT/JP2023/018362)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
state
value
neural network
output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2023/018362
Other languages
French (fr)
Japanese (ja)
Inventor
拓也 上杉
将人 後町
勝己 高橋
廣愛 浅見
洋 酒巻
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to JP2025520317A priority Critical patent/JP7696532B2/en
Priority to PCT/JP2023/018362 priority patent/WO2024236748A1/en
Publication of WO2024236748A1 publication Critical patent/WO2024236748A1/en
Anticipated expiration legal-status Critical
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/10Interfaces, programming languages or software development kits, e.g. for simulating neural networks

Definitions

  • the disclosed technology relates to an abnormality determination technology that determines the condition of a subject.
  • Among abnormality determination techniques, there are those that determine the state of a person to be judged using the results of measurements taken by a sensor device (such as a camera, radar, or infrared sensor).
  • For example, abnormality determination techniques that judge the state of a subject of judgment, such as the driver (occupant) of a vehicle (mobile body), concern the safety of that subject, so it is desirable to make the determination with high accuracy and avoid erroneous determinations.
  • Deep learning uses a mathematical model called a neural network (NN).
  • a neural network outputs results through layers such as an input layer, a hidden layer (intermediate layer), and an output layer.
  • Patent Document 1 discloses an identification device that uses a convolutional neural network (CNN), which is a type of neural network, to identify when a driver of a mobile object is in an abnormal state. Specifically, in the process of identifying the driver's condition (e.g., whether or not the driver is in an abnormal state) based on an acquired image, the identification device of Patent Document 1 obtains the driver's skeletal information from the image using a convolutional neural network. The identification device of Patent Document 1 improves the accuracy of the driver's skeletal information by using a neural network.
  • CNN convolutional neural network
  • The neural network extracts from an image, and uses, a large number of judgment indices indicating the state of the person being judged (for example, many types of indices such as the open/closed state of the eyelids, the number of blinks, the position of the face, the state of brain waves, the heat-dissipation state, and driving performance (reaction time, etc.)).
  • Patent Document 1 merely outputs skeletal information from an image via a neural network, and is not capable of solving the above-mentioned problems.
  • the present disclosure aims to solve the above problems and to enable anomaly detection technology to quickly output highly accurate detection results using a neural network.
  • The abnormality determination device of the present disclosure includes: an image acquisition unit that acquires and outputs an image; a neural network unit that has a plurality of layers in a neural network, acquires the image output by the image acquisition unit, and outputs a judgment result indicating a state of a judgment target included in the image; an output storage unit that stores output data output by the neural network unit; a state comparison unit that prestores a reference value serving as a reference for an input value to a first layer, which is at least one of the multiple layers of the neural network unit, and judges a change in the state of the judgment target using a difference value between the reference value and a current value, which is a value about to be newly input to the first layer; and a processing control unit that, when the state comparison unit determines that the state of the judgment target has not changed, instructs the neural network unit not to execute processing in the first layer and subsequent layers, and instructs the output storage unit to output the stored output data.
  • the present disclosure has the effect of enabling anomaly detection technology to quickly output highly accurate detection results using a neural network.
  • FIG. 1 is a diagram showing an example of the configuration of an abnormality warning device 100A and an abnormality determination device 1000A according to a first embodiment of the present disclosure.
  • FIG. 2 is a flowchart showing an example of the processing of the abnormality warning device 100A and the abnormality determination device 1000A according to the first embodiment of the present disclosure.
  • FIG. 3 is a flowchart showing an example of more detailed processing of the abnormality warning device 100A and the abnormality determination device 1000A according to the first embodiment of the present disclosure.
  • FIG. 4 is a diagram showing an example of the configuration of an abnormality warning device 100B and an abnormality determination device 1000B according to the second embodiment of the present disclosure.
  • FIG. 5 is a diagram showing an example of the internal configuration of the abnormality warning device 100B and the feature extraction unit 1300B in the abnormality determination device 1000B.
  • FIG. 6 is a diagram showing an example of the internal configuration of the abnormality warning device 100B and the abnormality state determination unit 1500B in the abnormality determination device 1000B.
  • FIG. 7 is a flowchart showing an example of processing by the abnormality warning device 100B and the abnormality determination device 1000B according to the second embodiment of the present disclosure.
  • FIG. 8 is a flowchart showing an example of more detailed processing of the abnormality warning device 100B and the abnormality determination device 1000B according to the second embodiment of the present disclosure.
  • FIG. 9 is a flowchart showing an example of a process of the feature extraction unit 1300B in a case where the state comparison unit 1030 (first state comparison unit 1030B) in the feature extraction unit 1300B according to the second embodiment of the present disclosure is the image state comparison unit 1341.
  • FIG. 10 is a flowchart illustrating an example of a process of the feature extraction unit 1300B in the second embodiment of the present disclosure when the state comparison unit 1030 (first state comparison unit 1030B) is a convolutional layer state comparison unit.
  • FIG. 11 is a flowchart illustrating an example of a process performed by the feature extraction unit 1300B in the second embodiment of the present disclosure when the state comparison unit 1030 (first state comparison unit 1030B) is a pooling layer state comparison unit.
  • FIG. 12 is a flowchart showing an example of processing of the abnormal state determination unit 1500B in the case where the state comparison unit (second state comparison unit 1540B) in the abnormal state determination unit 1500B according to the second embodiment of the present disclosure is the first fully connected layer state comparison unit 1541.
  • FIG. 13 is a flowchart showing an example of processing by the abnormal state determination unit 1500B in the second embodiment of the present disclosure when the state comparison unit (second state comparison unit 1540B) in the abnormal state determination unit 1500B is a second fully connected layer state comparison unit.
  • FIG. 14 is a diagram showing an example of the configuration of an abnormality warning device 100C and an abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • FIG. 15 is a diagram showing an example of the internal configuration of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C.
  • FIG. 16 is a diagram showing an example of the internal configuration of the abnormality warning device 100C and the abnormality state determination unit 1500C in the abnormality determination device 1000C.
  • FIG. 17 is a flowchart showing an example of the processing of the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • FIG. 18 is a flowchart showing a detailed first example of the processing of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • FIG. 19 is a flowchart showing a detailed first example of the processing of the abnormal state determination unit 1500C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • FIG. 20 is a flowchart showing a second detailed example of the processing of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • FIG. 21 is a flowchart showing a second detailed example of the processing of the abnormal state determination unit 1500C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • FIG. 22 is a diagram illustrating a first example of a hardware configuration for realizing the functions of the present disclosure.
  • FIG. 23 is a diagram illustrating a second example of a hardware configuration for realizing the functions of the present disclosure.
  • Embodiment 1. In the first embodiment, one mode of the basic configuration of the present disclosure will be described.
  • FIG. 1 is a diagram showing an example of the configuration of an abnormality warning device 100A and an abnormality determination device 1000A according to a first embodiment of the present disclosure.
  • the abnormality warning device 100A acquires an image, uses the image to judge the state of an object to be judged that is captured in the image, and outputs a warning if the state of the object to be judged indicates an abnormality.
  • the abnormality warning device 100A shown in FIG. 1 includes an abnormality determination device 1000A and a warning output unit 2000.
  • the abnormality determination device 1000A obtains an image and uses the image to output the state of the object to be determined captured in the image.
  • the state of the object to be determined is output data including a value in the form of, for example, a determination value or a probability value.
  • the abnormality determination device 1000A includes an image acquisition unit 1010, a neural network unit 1020, a state comparison unit 1030, a processing control unit 1040, and an output storage unit 1050.
  • the image acquisition unit 1010 acquires and outputs an image.
  • the image acquisition unit 1010 acquires an image by inputting an image (moving image or still image) captured by a camera, for example.
  • the camera is, for example, an imaging device installed inside the vehicle cabin, which captures an image of a determination target present inside the vehicle cabin (the determination target is, for example, a living body such as a passenger including the driver) and outputs the captured image.
  • When the image is a moving image, the camera divides the moving image into still images (frames) at regular time intervals, and the divided images are input to the image acquisition unit 1010.
  • the neural network unit 1020 is composed of a neural network.
  • the neural network unit 1020 has multiple layers in a neural network, acquires the image output by the image acquisition unit 1010, and outputs output data that is a judgment result indicating the state of the judgment target contained in the image.
  • the multiple layers in a neural network are an input layer, a hidden layer (an intermediate layer), and an output layer, and more specifically, layers such as a convolutional layer and a pooling layer.
  • a neural network with many layers (deep) is particularly called a DNN (Deep Neural Network), and its derivatives include, for example, a CNN (Convolutional Neural Network) and an RNN (Recurrent Neural Network).
  • CNN is often used in the fields of object recognition and image recognition, while RNN is often used in time series processing, voice recognition, natural language processing, and the like.
  • the neural network in the present disclosure is applicable regardless of the form of the neural network. In this description, the form of CNN will be described as an example.
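The layer types named above (convolutional and pooling layers of a CNN) can be illustrated with a minimal sketch. The kernel size, pooling window, and single-channel layout below are illustrative assumptions for explanation only, not the configuration claimed in this disclosure.

```python
import numpy as np

def relu(x):
    # Common activation applied after a convolutional layer.
    return np.maximum(0.0, x)

def conv2d(image, kernel):
    # Naive single-channel "valid" convolution (no padding, stride 1).
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    # Non-overlapping max pooling, the downsampling step of a CNN.
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))
```

A 6x6 input convolved with a 3x3 kernel yields a 4x4 feature map, which max pooling reduces to 2x2; stacking such layers is what the neural network unit 1020 does at larger scale.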
  • the state comparison unit 1030 judges a change in the state of the object to be judged.
  • the state comparison unit 1030 pre-stores a reference value that is a standard for the input value to the first layer, which is at least one of the multiple layers of the neural network unit 1020, and determines a change in the state of the object to be judged using the difference value between the reference value and the current value, which is the value that is to be newly input to the first layer.
  • the first layer may be a predetermined one of all layers in the neural network unit 1020, a predetermined portion of all layers, or all layers.
  • The reference value is a previous value, that is, an input value of the first layer (at least one layer) stored as a result of the previous processing by the multiple layers of the neural network unit 1020; it is used to obtain a difference value between that previous value and a current value, that is, an input value that is about to be newly input to the first layer.
  • the reference value can use typical processing results that have been learned and modeled in advance.
  • the reference value may be a typical output value of a pre-modeled intermediate layer in a neural network.
  • the reference value may use a part of the results determined based on advance information.
  • the partial results determined based on the prior information are assumed to be, for example, pixels that have a high probability of indicating the presence of a head. Also, for example, pixels that are known to have a large change but low importance, such as the background, may not be used.
  • the reference value may be the output value of a predetermined node among the nodes included in the neural network. This allows for reduced memory usage and data handling, resulting in faster processing speeds.
  • The state comparison unit 1030 uses the difference value and a pre-stored threshold value to determine the magnitude of the difference between the reference value and the current value, and judges a change in the state of the judgment target based on the result. Specifically, on an image-by-image basis for example, the state comparison unit 1030 determines that the state of the judgment target is unchanged when the sum of squares of the difference values, or alternatively the sum of their absolute values, is smaller than the pre-stored threshold value.
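The comparison above can be sketched as follows. The metric names and the shape of the threshold test are assumptions for illustration; the disclosure only requires that some aggregate of the difference values be compared against a pre-stored threshold.

```python
import numpy as np

def state_unchanged(current, reference, threshold, metric="sum_sq"):
    # Return True when the value about to enter the first layer is close
    # enough to the stored reference that the state is judged unchanged.
    diff = current - reference
    if metric == "sum_sq":
        score = float(np.sum(diff ** 2))      # sum of squared differences
    else:
        score = float(np.sum(np.abs(diff)))   # sum of absolute differences
    return score < threshold
```

When this returns True, the processing control unit may skip the first layer and subsequent layers and reuse the stored output.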
  • Depending on whether the state of the judgment target has changed, the processing control unit 1040 issues a command to restrict the current processing operation in the neural network unit 1020, and also issues a command to output the output data produced by the previous processing of the neural network unit 1020.
  • Specifically, the processing control unit 1040 instructs the neural network unit 1020 not to execute processing in the first layer, which is at least one of the multiple layers, or in subsequent layers, and also instructs the output storage unit 1050 to output the stored output data.
  • The output data is, for example, a judgment value, which is a value indicating the state of the person to be judged. In the embodiments of the present disclosure, a judgment value that includes a probabilistic element is also referred to as a probability value.
  • the output storage unit 1050 stores the output data (judgment value (probability value)) output by the neural network unit 1020.
  • When the output storage unit 1050 receives a command from the processing control unit 1040, it outputs the stored output data.
  • the output storage unit 1050 outputs the output data to the warning output unit 2000, for example.
  • the output storage unit 1050 holds the latest output data output as a result of processing performed by the neural network unit 1020. For example, every time the output storage unit 1050 obtains output data output by the neural network unit 1020, it erases the output data previously stored and stores the latest output data.
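The "store only the latest result" behavior of the output storage unit can be sketched as a small cache; the class and method names are illustrative assumptions, not terms from the disclosure.

```python
class OutputStorage:
    """Holds only the most recent output data of the neural network unit;
    each new store erases the previously stored value."""

    def __init__(self):
        self._latest = None

    def store(self, output_data):
        # The previously held output data is discarded.
        self._latest = output_data

    def output(self):
        # Returned on command from the processing control unit.
        return self._latest
```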
  • the alarm output unit 2000 acquires output data, and outputs an alarm signal based on the output data to an alarm device (not shown) or the like.
  • the warning output unit 2000 acquires output data that is a judgment value indicating the drowsy state or dozing state of the person to be judged, and outputs a warning to the person to be judged in accordance with the judgment value.
  • The abnormality determination device 1000A shown in FIG. 1 is configured not to include the alarm output unit 2000, but may be configured to include it; when configured in this manner, the abnormality determination device 1000A is equivalent to the abnormality warning device 100A shown in FIG. 1. In the following explanation, the abnormality determination device 1000A is described as including the alarm output unit 2000, except where it is necessary to distinguish between the abnormality warning device 100A and the abnormality determination device 1000A.
  • The abnormality determination device 1000A may be configured to include a control unit (not shown), a storage unit (not shown), and a communication unit (not shown).
  • a control unit (not shown) controls the entire abnormality determination device 1000A and each of its components.
  • the control unit (not shown) starts up the abnormality determination device 1000A in response to, for example, an external command.
  • a storage unit (not shown) stores each piece of data used in the abnormality determination device 1000A.
  • the storage unit stores, for example, outputs (output data) from each component in the abnormality determination device 1000A, and outputs data requested by each component to the component that has made the request.
  • a communication unit (not shown) communicates with an external device. For example, communication is performed between the abnormality determination device 1000A and an imaging device such as an in-vehicle camera. In addition, for example, if the abnormality determination device 1000A does not have a display unit or an audio output unit, communication is performed between the abnormality determination device 1000A and an external device such as a display unit or an audio output device.
  • FIG. 2 is a flowchart showing an example of the processing of the abnormality warning device 100A and the abnormality determination device 1000A according to the first embodiment of the present disclosure.
  • the abnormality determination device 1000A starts the process shown in FIG. 2 when an image is input from a camera, for example.
  • Abnormality determination device 1000A executes image acquisition processing (step ST100). In the image acquisition process, the image acquisition unit 1010 of the abnormality determination device 1000A acquires and outputs an image.
  • Abnormality determination device 1000A executes a state storage and state comparison process (step ST200).
  • the state comparison unit 1030 of the abnormality determination device 1000A stores the previous value, which is the input value of at least one layer, that is, the first layer, among the processing results in all layers of the neural network unit 1020 for at least the first time after the start of processing.
  • For the first processing, the state comparison unit 1030 does not perform the state comparison process and simply stores the previous value; from the second processing onwards, it judges whether there has been a change in the state of the judgment target.
  • the state comparison unit 1030 pre-stores a reference value that is a standard for the input value to the first layer, which is at least one of the multiple layers of the neural network unit 1020, and determines a change in the state of the object to be judged using the difference value between the reference value and the current value, which is the value that is to be newly input to the first layer.
  • Abnormality determination device 1000A executes a process control process (step ST300).
  • Depending on whether the state of the judgment target has changed, the processing control unit 1040 of the abnormality determination device 1000A issues a command to restrict the current processing operation in the neural network unit 1020, and also issues a command to output the output data produced by the previous processing of the neural network unit 1020.
  • Specifically, the processing control unit 1040 instructs the neural network unit 1020 not to execute processing in the first layer, which is at least one of the multiple layers, or in subsequent layers, and also instructs the output storage unit 1050 to output the stored output data.
  • Abnormality determination device 1000A executes a state output process (step ST400).
  • the neural network unit 1020 or the output storage unit 1050 of the abnormality determination device 1000A outputs output data.
  • the neural network unit 1020 outputs output data to the warning output unit 2000.
  • When the output storage unit 1050 receives a command from the processing control unit 1040, it outputs the latest stored output data to the alarm output unit 2000.
  • Abnormality determination device 1000A executes an alarm output process (step ST500).
  • the alarm output unit 2000 of the abnormality determination device 1000A acquires output data and outputs an alarm signal based on the output data to an alarm device (not shown) etc.
  • the alarm output unit 2000 determines whether to output an alarm based on a determination value included in the output data, and when it determines to output an alarm, outputs an alarm signal to an alarm device (not shown) etc.
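The alarm decision in step ST500 reduces to comparing the judgment value against a criterion. The sketch below assumes a probability-like judgment value and an illustrative threshold of 0.8; the disclosure does not specify a particular threshold.

```python
def should_output_alarm(judgment_value, alarm_threshold=0.8):
    # Decide whether to send an alarm signal to the alarm device
    # based on the judgment value contained in the output data.
    return judgment_value >= alarm_threshold
```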
  • When the abnormality determination device 1000A has executed the process of step ST500, the series of processes shown in FIG. 2 ends, and the process is repeated from step ST100. For example, when the camera is turned off, the abnormality determination device 1000A is also turned off in conjunction with the camera.
  • FIG. 3 is a flowchart showing an example of more detailed processing of the abnormality warning device 100A and the abnormality determination device 1000A according to the first embodiment of the present disclosure.
  • When the abnormality determination device 1000A starts the process shown in FIG. 3, it first executes the image acquisition process (step ST100), similarly to step ST100 described above.
  • Abnormality determination device 1000A executes a storage process (step ST201).
  • the state comparison unit 1030 of the abnormality determination device 1000A stores the previous value, which is the input value of at least one layer, the first layer, among the processing results in all layers of the neural network unit 1020 for at least the first time after the start of processing. Furthermore, the state comparison unit 1030 stores the input values of the first layer each time processing is performed in the first layer.
  • the state comparison unit 1030 of the abnormality determination device 1000A executes a previous storage determination process to determine whether the data has been stored previously (step ST202).
  • the state comparison unit 1030 of the abnormality determination device 1000A executes a comparison process between the current value and the previous value (step ST203).
  • the state comparison unit 1030 calculates the difference (difference value) between the current value, which is the value that is about to be newly input to the first layer, and the previous value (reference value).
  • State comparison section 1030 of abnormality determination device 1000A determines a change in the state of the object to be determined using the difference value (step ST204).
  • the state comparison unit 1030 uses the difference value and a pre-stored threshold value to determine the magnitude of the difference between the reference value and the current value, and determines a change in the state of the object to be determined based on the result. Specifically, for example, in units of images, if the sum of squares of the difference values is smaller than a pre-stored threshold value, the state comparison unit 1030 determines that the state of the determination target is unchanged. More specifically, the state comparison unit 1030 determines that the state of the determination target is unchanged when the sum of the absolute values of the difference values is smaller than a pre-stored threshold value, for example, for each image.
  • Suppose it is determined in step ST204 that the difference is equal to or greater than the threshold value, that is, that the state of the judgment target has changed.
  • the processing control unit 1040 of the abnormality determination device 1000A executes a normal processing command process (step ST301).
  • the processing control unit 1040 outputs a command to execute processing in the first layer of the neural network unit 1020.
  • the neural network unit 1020 of the abnormality determination device 1000A executes output processing (step ST401).
  • the neural network unit 1020 outputs the output data to the alarm output unit 2000.
  • Output storage unit 1050 of abnormality determination device 1000A executes an output storage process (step ST402).
  • the output storage section 1050 stores the output data output by the neural network section 1020 .
  • When it is determined in step ST204 that the difference is smaller than the threshold value, that is, that the state of the judgment target has not changed, the processing control unit 1040 of the abnormality determination device 1000A executes the omission processing command process (step ST302).
  • the processing control unit 1040 instructs the neural network unit 1020 not to execute processing in the first layer or subsequent layers, which is at least one layer among the multiple layers, and also instructs the output storage unit 1050 to output the output data stored therein.
  • the output storage unit 1050 of the abnormality determination device 1000A executes a stored data output process (step ST403).
  • When the output storage unit 1050 receives a command from the processing control unit 1040, it outputs the stored output data to the alarm output unit 2000.
  • the alarm output unit 2000 executes an alarm output process (step ST500).
  • the alarm output unit 2000 determines whether to output an alarm based on the judgment value contained in the output data, similar to the processing of step ST500 already described, and if it determines to output an alarm, outputs an alarm signal to an alarm device or the like (not shown).
  • When the abnormality determination device 1000A has executed the process of step ST500, it ends the series of processes shown in FIG. 3 and repeats the process from step ST100. For example, when the camera is turned off, the abnormality determination device 1000A is also turned off in conjunction with the camera.
  • the configuration and processing described above allows the anomaly determination device to omit processing of layers in the neural network depending on the conditions.
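One processing cycle of FIG. 3 can be sketched end to end. The function below is an illustrative assumption (names, the sum-of-squares metric, and the dict-based state are not from the disclosure): it stores the new input (ST201), compares it with the previous value (ST203-ST204), and either runs the layers and caches the output (ST301, ST401-ST402) or skips them and returns the cached output (ST302, ST403).

```python
import numpy as np

def run_frame(image, layers, state, threshold):
    """One cycle: skip the network and reuse the cached output when the
    input is judged unchanged; otherwise run all layers and cache."""
    prev = state.get("previous_input")
    state["previous_input"] = image          # storage process (ST201)
    if prev is not None:
        diff = float(np.sum((image - prev) ** 2))   # comparison (ST203)
        if diff < threshold and state.get("cached_output") is not None:
            # omission processing command: return stored output (ST302/ST403)
            return state["cached_output"], True
    out = image
    for layer in layers:
        out = layer(out)                     # normal processing (ST301/ST401)
    state["cached_output"] = out             # output storage (ST402)
    return out, False
```

On consecutive near-identical frames the second call returns the cached result without touching the layers, which is the source of the speed-up described above.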
  • the abnormality determination device of the present disclosure has the following configuration. "An image acquisition unit that acquires and outputs an image; a neural network unit having a plurality of layers in a neural network, acquiring an image output by the image acquisition unit, and outputting a judgment result indicating a state of a judgment target included in the image; an output storage unit that stores output data (judgment value (probability value)) output by the neural network unit; a state comparison unit that prestores a reference value serving as a reference for an input value to a first layer, which is at least one of the multiple layers of the neural network unit, and judges a change in the state of the judgment target using a difference value between the reference value and a current value, which is a value about to be newly input to the first layer; and a processing control unit that, when the state comparison unit determines that the state of the judgment target has not changed, instructs the neural network unit not to execute processing in the first layer and subsequent layers, and instructs the output storage unit to output the stored output data."
  • the abnormality determination method of the present disclosure has the following configuration.
  • the image acquisition unit outputs the acquired image to the neural network unit, the neural network unit having a plurality of layers in a neural network acquires the image output by the image acquisition unit, and outputs a judgment result indicating a state of a judgment target included in the image;
  • an output storage unit stores the output data (judgment value (probability value)) output by the neural network unit;
  • a state comparison unit pre-stores a reference value that is a reference for an input value to a first layer that is at least one of the multiple layers of the neural network unit, and judges a change in the state of the object to be judged using a difference value between the reference value and a current value that is a value that is to be newly input to the first layer;
• when the state comparison unit determines that the state of the object to be determined has not changed, the processing control unit instructs the neural network unit not to execute processing in the first layer and subsequent layers, and instructs the output storage unit to output the output data stored in the output storage unit.
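The configuration above reduces to a simple caching rule: when the value about to enter a monitored layer is close enough to the stored reference value, the remaining layers are skipped and the stored output data is reused. The following is a minimal illustrative sketch only, not part of the disclosure; it assumes NumPy arrays, a sum-of-absolute-differences criterion, and hypothetical names (`forward`, `cache`, `threshold`):

```python
import numpy as np

def forward(x, layers, cache, threshold=1.0):
    """Run the layers only when the input to the first monitored layer
    differs enough from the stored reference value; otherwise reuse
    the stored output data."""
    ref = cache.get("reference")
    if ref is not None and np.sum(np.abs(x - ref)) < threshold:
        # State judged unchanged: skip the first layer and all
        # subsequent layers and output the stored result.
        return cache["output"]
    cache["reference"] = x.copy()   # store the new reference value
    out = x
    for layer in layers:            # normal processing of every layer
        out = layer(out)
    cache["output"] = out           # store the output data for reuse
    return out
```

On the first call nothing is stored, so every layer runs; on later calls with a near-identical input, the cached output is returned without any layer computation.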
• the abnormality determination device of the present disclosure is further configured as follows. "The abnormality determination device, wherein the reference value is a previous value stored for each layer as a result of the previous processing in the neural network." As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network. Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.
• the abnormality determination device of the present disclosure is further configured as follows. "The abnormality determination device, wherein the reference value is a typical output value of a pre-modeled intermediate layer in the neural network." As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network. Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.
• the abnormality determination device of the present disclosure is further configured as follows. "The abnormality determination device, wherein the reference value is an output value of a predetermined node among the nodes included in the neural network." As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network. Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the above abnormality determination method.
• the abnormality determination device of the present disclosure is further configured as follows. "The abnormality determination device, wherein the state comparison unit determines a change in the state of the object to be determined based on a result of determining, using the difference value and a pre-stored threshold value, whether the difference between the reference value and the current value is large or small." As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network. Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.
• the abnormality determination device of the present disclosure is further configured as follows. "The abnormality determination device, wherein the state comparison unit determines that the state of the object to be determined is unchanged when the sum of squares of the difference values is smaller than a pre-stored threshold value for each image."
  • the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network. Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.
• the abnormality determination device of the present disclosure is further configured as follows. "The abnormality determination device, wherein the state comparison unit determines that the state of the object to be determined is unchanged when the sum of the absolute values of the difference values is smaller than a pre-stored threshold value for each image."
  • the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network. Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.
• the abnormality determination device of the present disclosure is further configured as follows. "The abnormality determination device further including an alarm output unit that acquires the output data, which is a judgment value indicating a drowsy state or a dozing state of a person to be judged, and outputs an alarm to the person to be judged according to the judgment value."
  • the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network. Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.
• In Embodiment 2, a form in which the basic mechanism of the present disclosure is applied to layers in the neural networks of a feature extraction unit and an abnormal state determination unit will be described.
  • FIG. 4 is a diagram showing an example of the configuration of an abnormality warning device 100B and an abnormality determination device 1000B according to the second embodiment of the present disclosure.
  • the abnormality warning device 100B includes an abnormality determination device 1000B and a warning output unit 2000.
  • the abnormality determination device 1000B obtains an image, and uses the image to output the state of the determination target captured in the image.
  • the abnormality determination device 1000B shown in FIG. 4 includes an image acquisition unit 1100B, a feature extraction unit 1300B, and an abnormal state determination unit 1500B.
  • the neural network unit already described is configured to include, as will be described later, a first neural network unit 1320 and a second neural network unit 1520.
  • First neural network unit 1320 is included in feature extraction unit 1300B
  • second neural network unit 1520 is included in abnormal state determination unit 1500B.
• the output storage unit already described is configured to include a first output storage unit 1390 and a second output storage unit 1590.
• the first output storage unit 1390 is included in the feature extraction unit 1300B, and the second output storage unit 1590 is included in the abnormal state determination unit 1500B.
  • the state comparison unit already described is configured to include a first state comparison unit 1340 and a second state comparison unit 1540.
  • First state comparison unit 1340 is included in feature extraction unit 1300B
  • second state comparison unit 1540 is included in abnormal state determination unit 1500B.
  • the process control unit already described is configured to include a first process control unit 1380 and a second process control unit 1580.
  • the first process control unit 1380 is included in the feature extraction unit 1300B
  • the second process control unit 1580 is included in the abnormal state determination unit 1500B.
• Since the image acquisition unit 1100B is similar to the image acquisition unit 1100A already described, a detailed description of the image acquisition unit 1100B is omitted here.
  • the feature extraction unit 1300B uses the image to output a feature map that represents the characteristic state of the subject contained in the image. For example, the feature extraction unit 1300B extracts parts of the image that show signs of drowsiness, such as the eyelids and eyeballs, and generates a feature map that represents the state of the subject that is characteristic of an abnormal state such as drowsiness.
  • FIG. 5 is a diagram showing an example of the internal configuration of the abnormality warning device 100B and the feature extraction unit 1300B in the abnormality determination device 1000B.
  • the feature extraction unit 1300B shown in FIG. 5 is configured to include a neural network unit (first neural network unit 1320), a state comparison unit (first state comparison unit 1340B), a processing control unit (first processing control unit 1380B), and an output storage unit (first output storage unit 1390).
  • the first neural network unit 1320 has multiple layers that are part of the layers that make up the neural network, acquires the image output by the image acquisition unit 1100, and outputs a feature map that represents the characteristic state of the object to be judged contained in the image.
  • the first neural network unit 1320 shown in FIG. 5 includes an image branching unit 1321, a convolution layer unit 1322, a pooling layer unit 1325, and an image combination unit 1328.
  • the image branching unit 1321 branches and inputs an image to multiple nodes in a layer of a neural network.
  • the image is branched according to the number of subsequent convolutional layers and pooling layers. In FIG. 5, the image is branched into two.
  • the convolution layer unit 1322 performs filtering to extract characteristic parts of the object to be determined in the image.
  • the convolution layer unit 1322 shown in FIG. 5 includes a first convolution layer 1323 and a second convolution layer 1324.
  • the first convolutional layer 1323 and the second convolutional layer 1324 perform filtering to extract face (body) parts for detecting a drowsy state, for example, by convolution processing (cross-correlation processing) using pre-prepared convolution filters of size 3x3 or 5x5.
  • the pooling layer unit 1325 shown in FIG. 5 includes a first pooling layer 1326 and a second pooling layer 1327 .
  • the first pooling layer 1326 and the second pooling layer 1327 generate an image relating to features that are robust against image position, for example, by calculating the maximum or average value for each predetermined region.
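As an illustrative sketch only (the filter values and sizes below are arbitrary examples, not the pre-prepared filters of the embodiment), the convolution (cross-correlation) and pooling operations described above can be written as follows, assuming NumPy:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid cross-correlation of a 2-D image with a small filter
    (e.g. 3x3 or 5x5), as performed by a convolution layer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(image, size=2):
    """Maximum value for each size x size region, as performed by a
    pooling layer to obtain features robust to image position."""
    h, w = image.shape
    out = np.zeros((h // size, w // size))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = image[i * size:(i + 1) * size,
                              j * size:(j + 1) * size].max()
    return out
```

An average-pooling variant would simply replace `.max()` with `.mean()`, matching the "maximum or average value for each predetermined region" described above.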
• the first neural network unit 1320 shown in FIG. 5 includes two pairs of convolutional layers and pooling layers, but it may instead include one pair, or three or more pairs.
• the first pooling layer 1326 and the second pooling layer 1327 may be followed by two or more convolution layers and two or more pooling layers.
  • a normalized linear unit layer (activation function) or the like may be included after the convolution layer.
  • the image combination unit 1328 combines multiple images output through the convolution layer and the pooling layer.
  • the image combining unit 1328 extracts areas that indicate signs of drowsiness, such as the eyelids and eyeballs, and generates a feature map.
  • the first state comparison unit 1340B prestores a reference value, which is the standard for the input value to each layer in the first neural network unit 1320, and uses the difference between the reference value and the current value, which is the value that is about to be newly input to the layer, to determine a change in the state of the object to be determined.
  • the reference value is a previous value, which is an input value for each layer, stored as a result of the previous processing by the multiple layers of the first neural network unit 1320, and is used to obtain a difference value between the previous value and a current value, which is an input value that is about to be newly input to the layer.
• the first state comparison unit 1340B does not perform state comparison processing for the first processing (processing for the first image) and simply stores the value, but determines changes in the state of the object to be judged in the second and subsequent processing (processing for the second and subsequent images).
  • the following values may be stored and used as the reference values.
  • the reference value can use typical processing results that have been learned and modeled in advance.
  • the reference value may be a typical output value of a pre-modeled intermediate layer in a neural network.
  • the reference value may use a part of the results determined based on advance information.
  • the partial results determined based on the prior information are assumed to be, for example, pixels that have a high probability of indicating the presence of a head. Also, for example, pixels that are known to have a large change but low importance, such as the background, may not be used.
  • the reference value may be the output value of a predetermined node among the nodes included in the neural network. This allows for reduced memory usage and data handling, resulting in faster processing speeds.
• the first state comparison unit 1340B uses the difference value and a pre-stored threshold value to determine whether the difference between the reference value and the current value is large or small, and determines a change in the state of the object to be determined based on the result. Specifically, the first state comparison unit 1340B determines that the state of the determination target is unchanged when the sum of squares of the difference values is smaller than a pre-stored threshold value for each image. Alternatively, the first state comparison unit 1340B determines that the state of the determination target is unchanged when the sum of the absolute values of the difference values, computed for each image, is smaller than a pre-stored threshold value.
• Here, "storing" means holding the values of the two-dimensional images (feature maps) of the respective input sources.
• "Comparison" means, for example, taking the absolute value of the difference for each element (pixel) between the stored two-dimensional image (from the previous frame) and the latest two-dimensional image, and then comparing the sum or average with a predetermined threshold value.
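A minimal sketch of the comparison just described, assuming the stored and latest feature maps are NumPy arrays and using a sum-of-absolute-differences criterion (the function name and threshold handling are hypothetical):

```python
import numpy as np

def state_unchanged(stored_map, current_map, threshold):
    """The state is judged unchanged when the per-pixel sum of
    absolute differences between the stored feature map (previous
    frame) and the latest one is smaller than the threshold value."""
    return bool(np.sum(np.abs(current_map - stored_map)) < threshold)
```

Replacing `np.sum` with `np.mean` gives the average-based variant mentioned above; squaring the differences instead of taking absolute values gives the sum-of-squares criterion.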
  • the image state comparison unit 1341 shown in FIG. 5 includes a storage unit 1341a and a comparison processing unit 1341b.
• when the first state comparison unit 1340B determines that the state of the object to be determined has not changed, the first processing control unit 1380B instructs the first neural network unit 1320 not to execute processing in the second layer and subsequent layers, and also instructs the first output storage unit 1390 to output the feature map stored therein.
• the first process control unit 1380B shown in FIG. 5 includes a feature extraction process control unit 1381B.
• the feature extraction process control unit 1381B executes the function of the first process control unit 1380B in the feature extraction unit 1300B.
  • the first output storage section 1390 stores the output data (decision value (probability value)) output by the first neural network section 1320 .
  • the first output storage unit 1390 stores the feature map output by the first neural network unit 1320 .
  • the combined image storage unit 1391 stores a feature map, which is a combined image combined and output by the first neural network unit 1320 .
  • FIG. 6 is a diagram showing an example of the internal configuration of the abnormality warning device 100B and the abnormality state determination unit 1500B in the abnormality determination device 1000B.
  • the abnormal condition determination unit 1500B uses a feature map, which is a two-dimensional image, to output a determination value indicating the condition of the object to be determined.
• the abnormal state determination unit 1500B shown in FIG. 6 is configured to include a neural network unit (second neural network unit 1520), a state comparison unit (second state comparison unit 1540B), and an output storage unit (second output storage unit 1590).
  • the neural network unit (second neural network unit 1520) has multiple layers in a neural network, acquires the feature map output by the feature extraction unit 1300, and uses the feature map to output a judgment value indicating the state of the object to be judged as output data.
  • the neural network unit (second neural network unit 1520 ) shown in FIG. 6 includes a state classification unit 1525 and a probability output layer 1529 .
  • the state classification unit 1525 has a function of, for example, converting a two-dimensional image (feature map) into a one-dimensional vector, and further consolidating the output into an indication of the drowsy state (for example, four outputs: eyelids: dozing/not dozing, eyeballs: dozing/not dozing).
  • the state classification unit 1525 shown in FIG. 6 includes a first fully connected layer 1527 and a second fully connected layer 1528.
  • the state classification unit 1525 generates a one-dimensional vector whose number of elements is the desired number of outputs through a first fully connected layer 1527 and a second fully connected layer 1528 .
  • the probability output layer 1529 applies a softmax function, for example, and sets the sum of the output values to 1.0, thereby giving the output results a probabilistic meaning.
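The softmax step of the probability output layer can be sketched as follows; this is the standard formulation, shown for illustration only:

```python
import numpy as np

def softmax(v):
    """Normalize a vector so its outputs sum to 1.0 and can be read
    as probabilities; subtracting the maximum first is a common
    numerical-stability measure."""
    e = np.exp(v - np.max(v))
    return e / np.sum(e)
```

Applied to the consolidated outputs (e.g. the four eyelid/eyeball dozing indicators described above), the largest input maps to the highest probability while the total remains 1.0.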
• the second neural network unit 1520 shown in FIG. 6 has two fully connected layers, but it may instead be configured with one fully connected layer, or with three or more fully connected layers.
  • the second state comparison unit 1540B prestores a reference value that is a standard for the input value to the third layer, which is at least one of the layers in the second neural network unit 1520, and uses the difference between the reference value and the current value, which is the value that is about to be newly input to the third layer, to determine a change in the state of the object to be determined.
• the reference value is a previous value, which is an input value for each layer, stored as a result of the previous processing by the multiple layers of the second neural network unit 1520, and is used to obtain a difference value between the previous value and a current value, which is an input value that is about to be newly input to the third layer.
  • the second state comparison unit 1540B does not perform state comparison processing for the first processing (processing for the first image) and simply stores the values, but determines changes in the state of the object to be judged for the second and subsequent processing (processing for the second image).
  • the following values may be stored and used as the reference values.
  • the reference value can use typical processing results that have been learned and modeled in advance.
  • the reference value may be a typical output value of a pre-modeled intermediate layer in a neural network.
  • the reference value may use a part of the results determined based on advance information.
  • the partial results determined based on the prior information are assumed to be, for example, pixels that have a high probability of indicating the presence of a head. Also, for example, pixels that are known to have a large change but low importance, such as the background, may not be used.
  • the reference value may be the output value of a predetermined node among the nodes included in the neural network. This allows for reduced memory usage and data handling, resulting in faster processing speeds.
• the second state comparison unit 1540B uses the difference value and a pre-stored threshold value to determine whether the difference between the reference value and the current value is large or small, and determines a change in the state of the object to be determined based on the result. Specifically, the second state comparison unit 1540B determines that the state of the determination target is unchanged when the sum of squares of the difference values is smaller than a pre-stored threshold value for each image. Alternatively, the second state comparison unit 1540B determines that the state of the determination target is unchanged when the sum of the absolute values of the difference values, computed for each image, is smaller than a pre-stored threshold value.
  • the second state comparison unit 1540B shown in FIG. 6 includes a first fully connected layer state comparison unit 1541.
• Here, "storing" means holding the values of the one-dimensional vectors of the respective input sources.
• "Comparison" means, for example, taking the absolute value of the difference for each element between the stored one-dimensional vector (from the previous frame) and the latest one-dimensional vector, and then comparing the sum or average with a predetermined threshold value.
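The pairing of a storage unit (holding the previous vector) with a comparison processing unit can be sketched as a small class, assuming 1-D NumPy vectors and a mean-absolute-difference criterion; the class and method names are hypothetical:

```python
import numpy as np

class LayerStateComparator:
    """Holds the previous input vector (storage unit) and compares it
    with the latest one (comparison processing unit)."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.stored = None                     # previous 1-D vector

    def changed(self, current):
        """True when the state is judged to have changed, or on the
        first call, when nothing is stored yet."""
        previous, self.stored = self.stored, current.copy()
        if previous is None:
            return True                        # first frame: always process
        diff = float(np.mean(np.abs(current - previous)))
        return diff >= self.threshold
```

A processing control unit could consult `changed()` before the second fully connected layer and, when it returns False, reuse the stored judgment value instead of recomputing it.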
  • the first fully connected layer state comparison unit 1541 shown in FIG. 6 includes a storage unit 1541a and a comparison processing unit 1541b.
  • the storage unit 1541 a stores a reference value, which is an output value of the first fully connected layer 1527 and an input value of the second fully connected layer 1528 , for each output by the first fully connected layer 1527 .
  • the comparison processing unit 1541b compares the current value with a reference value.
• when the second state comparison unit 1540B determines that the state of the object to be judged has not changed, the second processing control unit 1580B instructs the second neural network unit 1520 not to execute processing in the third layer and subsequent layers, and also instructs the second output storage unit 1590 to output the judgment value stored therein as output data.
  • the second process control unit 1580B shown in FIG. 6 includes an abnormal state determination process control unit 1581B.
  • the abnormal state determination process control unit 1581B functions to limit the processing of the second neural network unit 1520 in the abnormal state determination unit 1500B.
  • the second output storage unit 1590 stores the decision value output by the second neural network unit 1520 .
• the probability storage unit 1591 stores the judgment value output by the second neural network unit 1520.
  • the judgment value is a value indicating the state of the person to be judged, and is, for example, a probability value output by the probability output layer 1529 so that the output result has a probabilistic meaning.
  • the warning output unit 2000 acquires output data that is a judgment value indicating the drowsy state or dozing state of the person to be judged, and outputs a warning to the person to be judged in accordance with the judgment value.
  • the abnormality determination device 1000B shown in FIG. 4 is shown as being configured not to include the alarm output unit 2000, but may be configured to include the alarm output unit 2000. When configured in this manner, the abnormality determination device 1000B is equivalent to the abnormality warning device 100B shown in FIG. 4. In the following explanation, the abnormality determination device 1000B will be described as being configured to include the alarm output unit 2000, except in cases where it is necessary to distinguish between the abnormality warning device 100B and the abnormality determination device 1000B.
  • abnormality determination device 1000B may be configured to include a control unit (not shown), a storage unit (not shown), and a communication unit (not shown).
  • a control unit (not shown) controls the entire abnormality determination device 1000B and each of its components.
• the control unit (not shown) starts up the abnormality determination device 1000B in accordance with, for example, an external command.
  • a storage unit (not shown) stores each piece of data used in the abnormality determination device 1000B.
  • the storage unit stores, for example, outputs (output data) from each component in the abnormality determination device 1000B, and outputs data requested by each component to the component that has made the request.
  • the communication unit (not shown) communicates with an external device. For example, the communication unit communicates between the abnormality determination device 1000B and an imaging device such as an in-vehicle camera. In addition, for example, if the abnormality determination device 1000B does not have a display unit or an audio output unit, the communication unit communicates between the abnormality determination device 1000B and an external device such as a display unit or an audio output device.
  • FIG. 7 is a flowchart showing an example of processing by the abnormality warning device 100B and the abnormality determination device 1000B according to the second embodiment of the present disclosure.
• the abnormality determination device 1000B starts the process shown in FIG. 7.
  • Abnormality determination device 1000B executes image acquisition processing (step ST2100).
  • the image acquisition unit 1100 of the abnormality determination device 1000B acquires and outputs an image.
  • Abnormality determination device 1000B executes a storage and state comparison process (step ST2200).
  • the first state comparison unit 1340B of the abnormality determination device 1000B stores the previous value (reference value), which is the input value to the layer, for each layer of the first neural network unit 1320 for at least the first time after the process starts.
  • Abnormality determination device 1000B executes a feature extraction process control process (step ST2300).
• the feature extraction process control unit 1381B of the abnormality determination device 1000B executes a normal process command or an omission process command to the first neural network unit 1320 depending on the determination result by the first state comparison unit 1340B. Furthermore, when issuing an omission processing command, the feature extraction processing control unit 1381B issues an output command to the combined image storage unit 1391.
  • Abnormality determination device 1000B executes combined image output processing (step ST2400).
  • the first neural network unit 1320 executes normal processing and outputs a combined image, or the combined image storage unit 1391 outputs the previous combined image, whereby the feature extraction unit 1300 outputs a combined image.
  • Abnormality determination device 1000B executes a process of storing the fully connected layer states and comparing the fully connected layer states (step ST2500).
  • the second state comparison unit 1540 in the abnormality determination device 1000B pre-stores, for each of the multiple layers of the second neural network unit 1520, a reference value that is a standard for the input value to that layer, and determines a change in the state of the object to be determined using the difference value between the reference value and the current value, which is the value that is to be newly input to that layer.
  • the second state comparison unit 1540 does not perform state comparison processing for the first processing (processing for the first image) and simply stores the values, but determines changes in the state of the object to be judged for the second and subsequent processing (processing for the second image).
  • Abnormality determination device 1000B executes an abnormal state determination process control process (step ST2600).
  • the abnormal state determination process control unit 1581B issues a normal command or an omission command to the second neural network unit 1520.
• when issuing an omission command, the abnormal state determination process control unit 1581B issues an output command to the probability storage unit 1591.
  • Abnormality determination device 1000B executes a result output process (step ST2700).
  • the second neural network unit 1520 executes normal processing and outputs a judgment value, or the probability storage unit 1591 outputs a previous judgment value (probability value), so that the abnormal state judgment unit 1500 outputs the judgment value as output data.
  • Abnormality determination device 1000B executes an alarm output process (step ST2800).
  • the alarm output unit 2000 of the abnormality determination device 1000B acquires output data and outputs an alarm signal based on the output data to an alarm device (not shown) etc.
  • the alarm output unit 2000 determines whether to output an alarm based on a determination value included in the output data, and when it determines to output an alarm, outputs an alarm signal to an alarm device (not shown) etc.
• When the abnormality determining device 1000B executes the process of step ST2800, it ends the series of processes shown in FIG. 7 and repeats the process from step ST2100.
  • the abnormality determination device 1000B is also turned off in conjunction with, for example, the camera being turned off.
  • FIG. 8 is a flowchart showing an example of more detailed processing of the abnormality warning device 100B and the abnormality determination device 1000B according to the second embodiment of the present disclosure.
• the flowchart in FIG. 8 shows an example of processing equivalent to the processing from step ST2200 to step ST2700 in the flowchart in FIG. 7.
• When the feature extraction unit 1300B in the abnormality determination device 1000B starts the process of step ST2200, it first executes a storage process (step ST2201).
  • the state comparison unit 1340 of the feature extraction unit 1300B stores a previous value that is an input value of the first layer, which is at least one layer among all layers of the neural network unit 1320 (first neural network unit 1320). Furthermore, the state comparison unit 1340 stores the input values of the first layer each time processing is performed in the first layer.
  • the feature extraction unit 1300B in the abnormality determination device 1000B executes a previous storage determination process (step ST2202).
  • the feature extraction unit 1300B in the abnormality determination device 1000B executes a comparison process (step ST2203).
  • the state comparison unit 1340 of the feature extraction unit 1300B calculates the difference (differential value) between the current value, which is the value that is about to be newly input to the first layer, and the previous value (reference value).
  • Feature extraction unit 1300B in abnormality determination device 1000B executes a difference determination process (step ST2204).
  • state comparison unit 1340 of feature extraction unit 1300B uses the difference value to determine a change in the state of the determination target. Specifically, for example, in units of images, if the sum of squares of the difference values is smaller than a pre-stored threshold value, the state comparison section 1340 determines that the state of the determination target is unchanged. More specifically, the state comparison unit 1340 determines that the state of the determination target is unchanged when the sum of the absolute values of the difference values is smaller than a pre-stored threshold value, for example, for each image.
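The decision in steps ST2202 to ST2204, which selects between the normal processing command and the omission processing command, can be sketched as follows (the command strings and function name are hypothetical; the sum-of-squares criterion follows the text above):

```python
import numpy as np

def decide_command(previous, current, threshold):
    """Return "normal" (execute the first layer and subsequent layers)
    or "omit" (output the stored data instead), following steps
    ST2202 to ST2204."""
    if previous is None:                      # not previously stored
        return "normal"
    sum_sq = float(np.sum((current - previous) ** 2))
    if sum_sq < threshold:                    # state judged unchanged
        return "omit"
    return "normal"                           # state judged changed
```

The "normal" branch corresponds to step ST2301 (normal processing command) and the "omit" branch to step ST2302 (omission processing command).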
• If it is determined in step ST2202 that the data was not previously stored (step ST2202 "NO"), or if the state comparison unit 1340 determines that the difference is equal to or greater than the threshold value and that the state of the object to be determined has changed (step ST2204 "NO"), the feature extraction unit 1300B in the abnormality determination device 1000B executes a normal processing command (step ST2301).
  • the processing control unit 1380 of the feature extraction unit 1300B outputs a command to execute processing in the first layer of the neural network unit 1320 (first neural network unit 1320).
  • the feature extraction unit 1300B in the abnormality determination device 1000B executes output processing (step ST2401).
  • the first neural network unit 1320 of the feature extraction unit 1300B executes processing according to the normal processing command, and outputs output data indicating the determination result.
  • the feature extraction unit 1300B in the abnormality determination device 1000B executes an output storage process (step ST2402).
  • the output storage unit 1390 (first output storage unit 1390) of the feature extraction unit 1300B stores the output data output by the neural network unit 1320.
  • the feature extraction unit 1300B in the abnormality determination device 1000B executes an omission processing command (step ST2302).
  • the processing control unit 1380 of the feature extraction unit 1300B instructs the neural network unit 1320 not to execute processing in the first layer or subsequent layers, which is at least one layer out of the multiple layers, and instructs the output storage unit 1390 to output the output data stored therein.
  • the feature extraction unit 1300B in the abnormality determination device 1000B executes a stored data output process (step ST2403).
  • in the stored data output process, when the output storage unit 1390 (the combined image storage unit 1391 of the output storage unit 1390) of the feature extraction unit 1300B receives a command from the process control unit 1380, it outputs the stored output data, that is, the combined image, to the alarm output unit 2000.
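The sequence of steps ST2201 to ST2403 above can be sketched as follows. The class and method names, the flat-list data layout, and `run_network` standing in for the first neural network unit 1320 are all illustrative assumptions; the sum-of-squares criterion is one of the alternatives described above.

```python
class FeatureExtractionSketch:
    """Store the layer input, compare it with the previous value, and
    either run the network normally or skip it and return the stored
    output (steps ST2201 to ST2403)."""

    def __init__(self, threshold, run_network):
        self.threshold = threshold
        self.run_network = run_network   # normal processing (ST2301/ST2401)
        self.previous = None             # previous input value (ST2201)
        self.stored_output = None        # output storage unit (ST2402)

    def process(self, current):
        previous, self.previous = self.previous, list(current)       # ST2201
        if previous is not None:                                     # ST2202
            diff = sum((c - p) ** 2 for c, p in zip(current, previous))  # ST2203
            if diff < self.threshold and self.stored_output is not None:  # ST2204
                # omission processing command: skip the network and
                # output the stored data (ST2302/ST2403)
                return self.stored_output
        self.stored_output = self.run_network(current)               # ST2401/ST2402
        return self.stored_output
```

When consecutive inputs are nearly identical, the network runs only once and later calls return the stored output, which is the processing-time saving the omission processing command aims at.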
  • the abnormal state determination unit 1500B in the abnormality determination device 1000B executes a fully connected layer output storage process (step ST2501).
  • the state comparison unit 1540 of the abnormal state determination unit 1500B stores the value output from the fully connected layer (the first fully connected layer 1527 or the second fully connected layer 1528) in the state classification unit 1525 as the state of the fully connected layer.
  • the abnormal state determination unit 1500B in the abnormality determination device 1000B executes a previous storage determination process (step ST2502).
  • when the abnormal state determination unit 1500B in the abnormality determination device 1000B determines that a previous value has been stored ("YES" in step ST2502), it executes a comparison process (step ST2503).
  • the state comparison unit 1540 of the abnormal state determination unit 1500B calculates the difference (differential value) between the current value, which is the value that is about to be newly input to the first layer (the second fully connected layer 1528 or the probability output layer 1529), and the previous value (reference value).
  • the abnormal state determination unit 1500B in the abnormality determination device 1000B executes a difference determination process (step ST2504).
  • the state comparison unit 1540 of the abnormal state determination unit 1500B determines the magnitude of the difference between the reference value and the current value using the difference between the current value and the previous value (reference value) and a pre-stored threshold value.
  • when it is determined in step ST2504 that the difference is equal to or greater than the threshold value (step ST2504 "NO"), processing proceeds to step ST2601.
  • the abnormal state determination unit 1500B in the abnormality determination device 1000B executes a normal processing command (step ST2601).
  • the processing control unit 1580 of the abnormal state determination unit 1500B issues a normal processing command to the neural network unit 1520 (second neural network unit 1520).
  • the abnormal state determination unit 1500B in the abnormality determination device 1000B executes an output process (step ST2701).
  • the neural network unit 1520 (second neural network unit 1520) of the abnormal state determination unit 1500B outputs the determination value as output data.
  • the abnormal state determination unit 1500B in the abnormality determination device 1000B executes an output storage process (step ST2702).
  • the output storage unit 1590 (second output storage unit 1590) in the abnormal state determination unit 1500B stores the determination value output by the neural network unit 1520 (second neural network unit 1520).
  • the abnormal state determination unit 1500B in the abnormality determination device 1000B executes an omission processing command (step ST2602).
  • the processing control unit 1580 of the abnormal state determination unit 1500B instructs the second neural network unit 1520 not to execute processing in layers subsequent to the layer to be determined, and instructs the second output storage unit 1590 to output the determination value stored therein.
  • the abnormal state determination unit 1500B in the abnormality determination device 1000B executes a stored data output process (step ST2703).
  • in the stored data output process, the second output storage unit 1590 of the abnormal state determination unit 1500B outputs the stored determination value in accordance with the command from the processing control unit 1580.
  • the configuration may be one or a combination of the following: the image state comparison unit 1341, a convolution layer state comparison unit that determines a state change in the convolution layer (see the convolution layer state comparison unit 1342 in the embodiment described later), a pooling layer state comparison unit that determines a state change in the pooling layer (see the pooling layer state comparison unit 1343 in the embodiment described later), a first fully connected layer state comparison unit that determines a state change in the first fully connected layer 1527 (see the first fully connected layer state comparison unit 1541 in the embodiment described later), and a second fully connected layer state comparison unit that determines a state change in the second fully connected layer 1528 (see the second fully connected layer state comparison unit 1542 in the embodiment described later).
  • FIG. 9 is a flowchart showing an example of a process of the feature extraction unit 1300B in the case where the state comparison unit (first state comparison unit 1340B) in the feature extraction unit 1300B according to the second embodiment of the present disclosure is the image state comparison unit 1341.
  • the first state comparison unit 1340B of the feature extraction unit 1300B acquires an image acquisition command and executes an image storage process (step ST2211).
  • the image state comparison unit 1341 of the first state comparison unit 1340B stores the image that is the input data for the first neural network unit 1320. Furthermore, every time the first neural network unit 1320 acquires an image, the image state comparison unit 1341 stores the image (the image before being processed by the first neural network unit 1320) in the storage unit 1341a.
  • the image state comparison unit 1341 executes a process of determining whether the image has been previously stored (step ST2212).
  • the image state comparison unit 1341 refers to the storage unit 1341a to determine whether or not the previously input image is stored.
  • the image state comparison unit 1341 executes a process of comparing the current image with the previous image (step ST2213).
  • the comparison processing unit 1341b of the image state comparison unit 1341 compares the currently input image with the previously input image.
  • the comparison processing unit 1341b executes a process of determining whether the difference is less than a threshold value (step ST2214).
  • the comparison processing unit 1341b uses the difference value between the image input this time and the image input last time and a pre-stored threshold value to determine the magnitude of the difference between a reference value (the image input last time) and a current value (the image to be input this time).
  • the feature extraction processing control unit 1381, which is the first processing control unit 1380, issues a normal processing command to the first neural network unit 1320 (step ST2311).
  • the first neural network unit 1320 executes normal processing in accordance with the normal processing command, and outputs a combined image (step ST2411).
  • the combined image storage unit 1391, which is the first output storage unit 1390, stores the combined image output by the first neural network unit 1320 (step ST2412).
  • the feature extraction processing control unit 1381, which is the first processing control unit 1380, issues an omission processing command to the first neural network unit 1320 (step ST2312) and also issues an output command to the combined image storage unit 1391, which is the first output storage unit 1390 (step ST2413).
  • in accordance with the omission processing command, the first neural network unit 1320 does not execute processing of the subsequent layers.
  • the combined image storage unit 1391 outputs the combined image in accordance with the output command.
  • FIG. 10 is a flowchart illustrating an example of a process of the feature extraction unit 1300B in the second embodiment of the present disclosure when the state comparison unit (first state comparison unit 1340B) in the feature extraction unit 1300B is a convolutional layer state comparison unit.
  • the first state comparison unit 1340B of the feature extraction unit 1300B executes a convolution layer state storage process (step ST2221).
  • the convolution layer state comparison unit of the first state comparison unit 1340B stores an input value (a value before being processed by the pooling layer unit 1325) which is an output value of the convolution layer unit 1322 and is input data to the pooling layer unit 1325.
  • the convolution layer state comparison unit stores the output value (the value before being processed by the pooling layer unit 1325) in the storage unit.
  • the convolution layer state comparison unit executes a process of determining whether the data has been stored previously (step ST2222).
  • the convolution layer state comparison unit refers to the storage unit and determines whether a previously input value (reference value: in this explanation, the previous value) is stored.
  • when the convolution layer state comparison unit determines that the previous state has been stored ("YES" in step ST2222), it executes a comparison process between the current state and the previous state (step ST2223).
  • the comparison processing unit executes a process of determining whether the difference is less than a threshold value (step ST2224).
  • the comparison processing unit determines the magnitude of the difference between a reference value (the value input previously) and the current value (the value being input currently) using a difference value between the value being input currently (the current value) and the value input previously (the previous value) and a pre-stored threshold value.
  • the feature extraction processing control unit 1381, which is the first processing control unit 1380, issues a normal processing command to the first neural network unit 1320 (step ST2321).
  • the first neural network unit 1320 executes normal processing in accordance with the normal processing command, and outputs a combined image (step ST2421).
  • the combined image storage unit 1391, which is the first output storage unit 1390, stores the combined image output by the first neural network unit 1320 (step ST2422).
  • the feature extraction processing control unit 1381B, which is the first processing control unit 1380, issues an omission processing command to the first neural network unit 1320 (step ST2322) and also issues an output command to the combined image storage unit 1391, which is the first output storage unit 1390 (step ST2423).
  • in accordance with the omission processing command, the first neural network unit 1320 does not execute processing of the subsequent layers.
  • the combined image storage unit 1391 outputs the combined image in accordance with the output command.
  • FIG. 11 is a flowchart illustrating an example of a process performed by the feature extraction unit 1300B in the second embodiment of the present disclosure when the state comparison unit (first state comparison unit 1340B) in the feature extraction unit 1300B is a pooling layer state comparison unit.
  • the first state comparison unit 1340B of the feature extraction unit 1300B executes a pooling layer state storage process (step ST2221).
  • the pooling layer state comparison unit of the first state comparison unit 1340B stores the input value (the value before being processed by the image combination unit 1328), which is the output value of the pooling layer unit 1325 and is input data to the image combination unit 1328.
  • the pooling layer state comparison unit stores the output value (the value before being processed by the image combination unit 1328) in the storage unit.
  • the pooling layer state comparison unit executes a process of determining whether or not the data has been stored previously (step ST2222).
  • the pooling layer state comparison unit refers to the storage unit and determines whether a previously input value (reference value: in this explanation, the previous value) is stored.
  • the pooling layer state comparison unit executes a process of comparing the current data with the previous data (step ST2223).
  • the comparison processing unit of the pooling layer state comparison unit compares the value currently being input to the image combination unit 1328 (current value) with the value previously input (previous value).
  • the comparison processing unit executes a process of determining whether the difference is less than a threshold value (step ST2224).
  • the comparison processing unit determines the magnitude of the difference between a reference value (the value input previously) and the current value (the value being input currently) using a difference value between the value being input currently (the current value) and the value input previously (the previous value) and a pre-stored threshold value.
  • the feature extraction processing control unit 1381B, which is the first processing control unit 1380, issues a normal processing command to the first neural network unit 1320 (step ST2321).
  • the first neural network unit 1320 executes normal processing in accordance with the normal processing command, and outputs a combined image (step ST2421).
  • the combined image storage unit 1391, which is the first output storage unit 1390, stores the combined image output by the first neural network unit 1320 (step ST2422).
  • the feature extraction processing control unit 1381B, which is the first processing control unit 1380, issues an omission processing command to the first neural network unit 1320 (step ST2322) and also issues an output command to the combined image storage unit 1391, which is the first output storage unit 1390 (step ST2423).
  • in accordance with the omission processing command, the first neural network unit 1320 does not execute processing of the subsequent layers.
  • the combined image storage unit 1391 outputs the combined image in accordance with the output command.
  • FIG. 12 is a flowchart showing an example of processing of the abnormal state determination unit 1500B in the case where the state comparison unit (second state comparison unit 1540B) in the abnormal state determination unit 1500B according to the second embodiment of the present disclosure is the first layer state comparison unit 1541.
  • when the abnormal state determination unit 1500B executes processing on the data output by the feature extraction unit 1300B (step ST2511), the second state comparison unit 1540B first acquires the first fully connected layer output (step ST2512). It then executes a first fully connected layer state storage process (step ST2513). The first layer state comparison unit 1541 in the second state comparison unit 1540B stores the output value from the first fully connected layer 1527, which is the input value of the second fully connected layer 1528 (the value before being processed by the second fully connected layer 1528).
  • the first layer state comparing unit 1541 executes a process of determining whether or not the data has been previously stored (step ST2514).
  • the first layer state comparison unit 1541 refers to the storage unit 1541a and determines whether a reference value (previous value) is stored.
  • when the first layer state comparison unit 1541 determines that the previous value has been stored ("YES" in step ST2514), it executes a comparison process between the current value and the previous value (step ST2515).
  • the first fully connected layer state comparison unit 1541 compares the current value with a reference value (previous value).
  • the first layer state comparing unit 1541 executes a process of determining whether the difference is smaller than a threshold value (step ST2516).
  • the comparison processing unit 1541b judges whether the difference between the current value and the reference value (previous value) is large or small, using a difference between the current value and the reference value and a pre-stored threshold value.
  • the abnormal state determination processing control unit 1581B, which is the second processing control unit 1580, issues a normal processing command to the second neural network unit 1520 (step ST2611).
  • the second neural network unit 1520 executes normal processing in accordance with the normal processing command, and outputs a determination value (step ST2711).
  • the probability storage unit 1591, which is the second output storage unit 1590, stores the determination value output by the second neural network unit 1520 (step ST2712).
  • the abnormal state determination processing control unit 1581B, which is the second processing control unit 1580, executes an omission processing command (step ST2612).
  • the abnormal state determination processing control unit 1581B instructs the second neural network unit 1520 not to execute processing in layers subsequent to the layer to be determined, and also instructs it to output the determination value stored in the second output storage unit 1590.
  • in accordance with the omission processing command, the second neural network unit 1520 does not execute processing of the subsequent layers.
  • the probability storage unit 1591, which is the second output storage unit 1590, outputs the determination value in accordance with the output command (step ST2713).
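The partial skip of steps ST2511 to ST2713 can be sketched as follows: the first fully connected layer has already run, its output is compared with the previous one, and the second fully connected layer and probability output layer execute only when the state changed. All names here are illustrative assumptions, `state` is a plain dict carrying the previous FC1 output and the stored determination value, and the sum-of-absolute-values criterion is one of the alternatives described above.

```python
def classify_with_skip(fc1_out, state, threshold, fc2, prob_layer):
    """Run the later layers only when the first fully connected layer's
    output changed; otherwise return the stored determination value."""
    previous = state.get("previous_fc1")
    state["previous_fc1"] = list(fc1_out)                          # ST2513
    if previous is not None:                                       # ST2514
        diff = sum(abs(c - p) for c, p in zip(fc1_out, previous))  # ST2515
        if diff < threshold and "stored" in state:                 # ST2516
            # omission command: skip FC2 and the probability output
            # layer, output the stored value (ST2612/ST2713)
            return state["stored"]
    value = prob_layer(fc2(fc1_out))                               # ST2611/ST2711
    state["stored"] = value                                        # ST2712
    return value
```

Compared with the feature extraction case, only the layers after the compared layer are skipped here, which is why the comparison point can be placed at either fully connected layer.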
  • FIG. 13 is a flowchart showing an example of processing by abnormal state determination unit 1500B in the case where the state comparison unit (second state comparison unit 1540B) in abnormal state determination unit 1500B according to embodiment 2 of the present disclosure is the second layer state comparison unit.
  • the abnormal state determination unit 1500B executes processing on the data output by the feature extraction unit 1300B (step ST2521).
  • the second fully connected layer 1528 then executes processing on the output of the first fully connected layer 1527 (step ST2522).
  • the second state comparison unit 1540B acquires the second fully connected layer output (step ST2523).
  • the second state comparison unit 1540B executes a second fully connected layer state storage process (step ST2524).
  • the second layer state comparison unit in the second state comparison unit 1540B stores the output value from the second fully connected layer 1528, which is the input value of the probability output layer 1529 (the value before being processed by the probability output layer 1529).
  • the second layer state comparing unit executes a process of determining whether the data has been stored previously (step ST2525).
  • the second layer state comparison unit refers to the storage unit and determines whether a reference value (previous value) is stored.
  • when the second layer state comparison unit determines that the previous value has been stored ("YES" in step ST2525), it executes a comparison process between the current value and the previous value (step ST2526).
  • the second layer state comparison unit compares the current value with the reference value (previous value).
  • the second layer state comparing unit executes a process of determining whether the difference is smaller than a threshold value (step ST2527).
  • the comparison processing unit determines whether the difference between the current value and the reference value (previous value) is large or small, using a difference between the current value and the reference value (previous value) and a pre-stored threshold value.
  • the abnormal state determination processing control unit 1581B, which is the second processing control unit 1580, issues a normal processing command to the second neural network unit 1520 (step ST2621).
  • the second neural network unit 1520 executes normal processing in accordance with the normal processing command, and outputs a determination value (step ST2721).
  • the probability storage unit 1591, which is the second output storage unit 1590, stores the determination value output by the second neural network unit 1520 (step ST2722).
  • the abnormal state determination processing control unit 1581B, which is the second processing control unit 1580, executes an omission processing command (step ST2622).
  • the abnormal state determination processing control unit 1581B instructs the second neural network unit 1520 not to execute processing in layers subsequent to the layer to be determined, and also instructs it to output the determination value stored in the second output storage unit 1590.
  • in accordance with the omission processing command, the second neural network unit 1520 does not execute processing of the subsequent layers.
  • the probability storage unit 1591, which is the second output storage unit 1590, outputs the determination value in accordance with the output command (step ST2723).
  • the abnormality determination device of the present disclosure is further configured as follows. "The neural network unit includes a first neural network unit and a second neural network unit, the output storage unit includes a first output storage unit and a second output storage unit, the state comparison unit includes a first state comparison unit and a second state comparison unit, and the processing control unit includes a first processing control unit and a second processing control unit; the first neural network unit has a plurality of layers that are a part of the layers constituting the neural network, acquires an image output by the image acquisition unit, and outputs a feature map that represents a characteristic state of the object to be determined that is included in the image; the first output storage unit stores the feature map output by the first neural network unit; and the first state comparison unit pre-stores a reference value that is a reference for an input value to a second layer that is at least one of the layers in the first neural network unit, and determines a change in the state of the object to be determined by using a difference value between the reference value and a current value."
  • the present disclosure has the effect of providing an abnormality determination device that can use a neural network in an abnormality determination technique to more quickly output highly accurate determination results. Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.
  • the abnormality determination device of the present disclosure is further configured as follows. "The first state comparison unit and the second state comparison unit determine a change in the state of the object to be determined based on a result of determining whether the difference between the reference value and the current value is large or small, using the difference value and a pre-stored threshold value." As a result, the present disclosure has the effect of providing an abnormality determination device that can use a neural network in an abnormality determination technique to more quickly output highly accurate determination results. Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.
  • the abnormality determination device of the present disclosure is further configured as follows. "The first state comparison unit and the second state comparison unit determine that the state of the object to be determined is unchanged when, for each image, the sum of squares of the difference values is smaller than a pre-stored threshold value."
  • the present disclosure has the effect of providing an abnormality determination device that can use a neural network in an abnormality determination technique to more quickly output highly accurate determination results. Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.
  • the abnormality determination device of the present disclosure is further configured as follows. "The first state comparison unit and the second state comparison unit determine that the state of the object to be determined is unchanged when, for each image, the sum of the absolute values of the difference values is smaller than a pre-stored threshold value."
  • the present disclosure has the effect of providing an abnormality determination device that can use a neural network in an abnormality determination technique to more quickly output highly accurate determination results. Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.
  • In Embodiment 3, a form in which the basic mechanism of the present disclosure is applied to all layers in a neural network will be described. In the third embodiment, descriptions of the configuration and processing already described will be omitted as appropriate.
  • FIG. 14 is a diagram showing an example of the configuration of an abnormality warning device 100C and an abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • the abnormality warning device 100C includes an abnormality determination device 1000C and a warning output unit 2000.
  • the abnormality determination device 1000C obtains an image, and uses the image to output the state of the object to be determined that is captured in the image.
  • An abnormality determination device 1000C shown in FIG. 14 includes an image acquisition unit 1100C, a feature extraction unit 1300C, and an abnormal state determination unit 1500C.
  • the neural network unit already described is configured to include, as will be described later, a first neural network unit 1320 and a second neural network unit 1520.
  • First neural network unit 1320 is included in feature extraction unit 1300C
  • second neural network unit 1520 is included in abnormal state determination unit 1500C.
  • the output storage unit already described is configured to include a first output storage unit 1390 and a second output storage unit.
  • the first output storage unit 1390 is included in the feature extraction unit 1300C, and the second output storage unit is included in the abnormal state determination unit 1500C.
  • the state comparison unit already described is configured to include a first state comparison unit 1340 and a second state comparison unit 1540.
  • First state comparison unit 1340 is included in feature extraction unit 1300C
  • second state comparison unit 1540 is included in abnormal state determination unit 1500C.
  • the process control unit already described is configured to include a first process control unit 1380 and a second process control unit 1580.
  • the first process control unit 1380 is included in the feature extraction unit 1300C
  • the second process control unit 1580 is included in the abnormal state determination unit 1500C.
  • the image acquisition unit 1100C is similar to the image acquisition units 1100A and 1100B already described, a detailed description of the image acquisition unit 1100C will be omitted here.
  • the feature extraction unit 1300C uses the image to output a feature map that represents the characteristic state of the subject contained in the image. For example, the feature extraction unit 1300C extracts parts of the image that show signs of drowsiness, such as the eyelids and eyeballs, and generates a feature map that represents the state of the subject that is characteristic of an abnormal state such as drowsiness.
  • FIG. 15 is a diagram showing an example of the internal configuration of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C.
  • the feature extraction unit 1300C shown in FIG. 15 is configured to include a neural network unit (first neural network unit 1320), a state comparison unit (first state comparison unit 1340C), a processing control unit (first processing control unit 1380C), and an output storage unit (first output storage unit 1390).
  • the first neural network unit 1320 has multiple layers that are part of the layers that make up the neural network, acquires the image output by the image acquisition unit 1100, and outputs a feature map that represents the characteristic state of the object to be judged contained in the image.
  • the first neural network unit 1320 shown in FIG. 15 includes an image branching unit 1321, a convolution layer unit 1322, a pooling layer unit 1325, and an image combination unit 1328.
  • the image branching unit 1321 branches and inputs an image to multiple nodes in a layer of a neural network.
  • the image is branched according to the number of subsequent convolutional layers and pooling layers. In FIG. 15, the image is branched into two.
  • the convolution layer unit 1322 performs filtering to extract characteristic parts of the object to be determined in the image.
  • the convolution layer unit 1322 shown in FIG. 15 includes a first convolution layer 1323 and a second convolution layer 1324.
  • the first convolutional layer 1323 and the second convolutional layer 1324 perform filtering to extract face (body) parts for detecting a drowsy state, for example, by convolution processing (cross-correlation processing) using pre-prepared convolution filters of size 3x3 or 5x5.
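The filtering performed by the convolution layers above can be illustrated with a minimal "valid" cross-correlation, where a kernel slides over the image and the element-wise products are summed. The pure-Python list layout and the small kernel size are illustrative only; an actual implementation would use pre-prepared 3x3 or 5x5 filters as described.

```python
def cross_correlate(image, kernel):
    """'Valid' cross-correlation of a 2D image (list of rows) with a
    2D kernel; no padding, stride 1."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            # sum of element-wise products over the kernel window
            out[i][j] = sum(
                image[i + u][j + v] * kernel[u][v]
                for u in range(kh) for v in range(kw)
            )
    return out
```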
  • the pooling layer unit 1325 shown in FIG. 15 includes a first pooling layer 1326 and a second pooling layer 1327 .
  • the first pooling layer 1326 and the second pooling layer 1327 generate an image relating to features that are robust against image position, for example, by calculating the maximum or average value for each predetermined region.
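The per-region maximum described above can be sketched as follows; non-overlapping regions and a pure-Python list layout are assumptions made to keep the sketch self-contained. Replacing `max` with an average would give the other pooling variant mentioned.

```python
def max_pool(image, size=2):
    """Maximum over each non-overlapping size x size region, which
    makes the resulting features more robust to small positional
    shifts in the image."""
    oh, ow = len(image) // size, len(image[0]) // size
    return [
        [
            max(
                image[i * size + u][j * size + v]
                for u in range(size) for v in range(size)
            )
            for j in range(ow)
        ]
        for i in range(oh)
    ]
```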
  • the first neural network unit 1320 shown in FIG. 15 includes two pairs of convolutional layers and pooling layers, but it is also effective to include one pair or three or more pairs.
  • the first pooling layer 1326 and the second pooling layer 1327 may be followed by two or more convolution layers and two or more pooling layers.
• a rectified linear unit (ReLU) layer (activation function) or the like may be included after each convolution layer.
  • the image combination unit 1328 combines multiple images output through the convolution layer and the pooling layer.
  • the image combining unit 1328 extracts areas that indicate signs of drowsiness, such as the eyelids and eyeballs, and generates a feature map.
  • the first state comparison unit 1340C prestores a reference value, which is the standard for the input value to each layer in the first neural network unit 1320, and uses the difference between the reference value and the current value, which is the value that is about to be newly input to the layer, to determine a change in the state of the object to be determined.
  • the reference value is a previous value, which is an input value for each layer, stored as a result of the previous processing by the multiple layers of the first neural network unit 1320, and is used to obtain a difference value between the previous value and a current value, which is an input value that is about to be newly input to the layer.
• the first state comparison unit 1340 does not perform state comparison processing for the first processing (processing for the first image) and simply stores the value, but in the second and subsequent processing (processing from the second image onward), it determines changes in the state of the object to be judged.
  • the following values may be stored and used as the reference values.
  • the reference value can use typical processing results that have been learned and modeled in advance.
  • the reference value may be a typical output value of a pre-modeled intermediate layer in a neural network.
  • the reference value may use a part of the results determined based on advance information.
  • the partial results determined based on the prior information are assumed to be, for example, pixels that have a high probability of indicating the presence of a head. Also, for example, pixels that are known to have a large change but low importance, such as the background, may not be used.
  • the reference value may be the output value of a predetermined node among the nodes included in the neural network. This allows for reduced memory usage and data handling, resulting in faster processing speeds.
• the first state comparison section 1340 uses the difference value and a pre-stored threshold value to determine the magnitude of the difference between the reference value and the current value, and determines a change in the state of the object to be determined based on the result. Specifically, the first state comparing section 1340 determines that the state of the determination target is unchanged when the sum of squares of the difference values computed for each image is smaller than a pre-stored threshold value. Alternatively, the first state comparing section 1340 determines that the state of the determination target is unchanged when the sum of the absolute values of the difference values recorded in image units is smaller than a pre-stored threshold value.
  • the first state comparison unit 1340C shown in FIG. 15 includes an image state comparison unit 1341, a convolution layer state comparison unit 1342, and a pooling layer state comparison unit 1343.
  • "storing" means holding the values of the two-dimensional images (feature maps) that are the respective input sources.
  • comparison means for example, taking the absolute value of the difference for each element (pixel) between a stored two-dimensional image (of the previous time/frame) and the latest two-dimensional image, and further comparing the sum or average with a predetermined threshold value.
  • the image state comparison unit 1341 shown in FIG. 15 includes a storage unit 1341a and a comparison processing unit 1341b.
  • the convolutional layer state comparison unit 1342 shown in FIG. 15 includes a storage unit 1342a and a comparison processing unit 1342b.
  • the pooling layer state comparison unit 1343 shown in FIG. 15 includes a storage unit 1343a and a comparison processing unit 1343b.
• the first processing control unit 1380C instructs the first neural network unit 1320 not to execute processing in the layer being judged and subsequent layers, and also instructs the first output storage unit 1390 to output the feature map stored in the first output storage unit 1390.
  • the first process control unit 1380C shown in FIG. 15 includes a feature extraction process control unit 1381C.
  • the feature extraction process control unit 1381C executes the function of the first process control unit 1380C in the feature extraction unit 1300.
• the first output storage section 1390 stores the output data (feature map) output by the first neural network section 1320.
  • the first output storage unit 1390 stores the feature map output by the first neural network unit 1320 .
  • the combined image storage unit 1391 stores a feature map, which is a combined image combined and output by the first neural network unit 1320 .
  • the abnormal condition determination unit 1500C uses a feature map, which is a two-dimensional image, to output a determination value indicating the condition of the object to be determined.
  • FIG. 16 is a diagram showing an example of the internal configuration of the abnormality warning device 100C and the abnormality state determination unit 1500C in the abnormality determination device 1000C.
• the abnormal state determination unit 1500C shown in FIG. 16 is configured to include a neural network unit (second neural network unit 1520), a state comparison unit (second state comparison unit 1540), and an output storage unit (second output storage unit 1590).
  • the neural network unit (second neural network unit 1520) has multiple layers in a neural network, acquires the feature map output by the feature extraction unit 1300, and uses the feature map to output a judgment value indicating the state of the object to be judged as output data.
  • the neural network unit (second neural network unit 1520 ) shown in FIG. 16 includes a state classification unit 1525 and a probability output layer 1529 .
  • the state classification unit 1525 has a function of, for example, converting a two-dimensional image (feature map) into a one-dimensional vector, and further consolidating the output into an indication of the drowsy state (for example, four outputs: eyelids: dozing/not dozing, eyeballs: dozing/not dozing).
  • the state classification unit 1525 shown in FIG. 16 includes a first fully connected layer 1527 and a second fully connected layer 1528.
  • the state classification unit 1525 generates a one-dimensional vector whose number of elements is the desired number of outputs through a first fully connected layer 1527 and a second fully connected layer 1528 .
  • the probability output layer 1529 applies a softmax function, for example, and sets the sum of the output values to 1.0, thereby giving the output results a probabilistic meaning.
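The state classification and probability output stages described above — flattening a 2D feature map into a 1D vector, passing it through fully connected layers, and applying a softmax so the outputs sum to 1.0 — can be sketched as follows. The feature map, weights, and the two-class "dozing / not dozing" interpretation are hypothetical placeholders.

```python
import math

def fully_connected(vec, weights, biases):
    """One fully connected layer: out[k] = sum_i vec[i] * weights[k][i] + biases[k]."""
    return [sum(v * w for v, w in zip(vec, row)) + b
            for row, b in zip(weights, biases)]

def softmax(vec):
    """Probability output layer: exponentiate and normalize so the outputs sum to 1.0,
    giving the results a probabilistic meaning."""
    exps = [math.exp(v) for v in vec]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical 2x2 feature map flattened into a 1D vector, then mapped to two
# outputs (e.g. "dozing" / "not dozing"); the weights are illustrative only.
feature_map = [[0.5, 1.0], [0.0, 2.0]]
flat = [v for row in feature_map for v in row]
logits = fully_connected(flat, weights=[[1, 0, 0, 1], [0, 1, 1, 0]], biases=[0.0, 0.0])
probs = softmax(logits)  # judgment values: probabilities summing to 1.0
```

In the embodiment the number of elements of the final vector equals the desired number of outputs (for example, four: eyelids dozing/not dozing and eyeballs dozing/not dozing).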
• the second neural network unit 1520 shown in FIG. 16 has two fully connected layers, but it may be configured to have one fully connected layer, or three or more fully connected layers.
  • the second state comparison unit 1540 prestores a reference value, which is the standard for the input value to each layer of the second neural network unit 1520, and uses the difference between the reference value and the current value, which is the value that is about to be newly input to the layer, to determine a change in the state of the object to be determined.
• the reference value is a previous value, which is an input value for each layer, stored as a result of the previous processing by the multiple layers of the second neural network unit 1520, and is used to obtain a difference value between the previous value and a current value, which is an input value that is about to be newly input to that layer.
  • the previous value is used as the reference value.
• the second state comparison unit 1540 does not perform state comparison processing for the first processing (processing for the first image) and simply stores the value, but determines changes in the state of the object to be judged in the second and subsequent processing (processing from the second image onward).
  • the following values may be stored and used as the reference values.
  • the reference value can use typical processing results that have been learned and modeled in advance.
  • the reference value may be a typical output value of a pre-modeled intermediate layer in a neural network.
  • the reference value may use a part of the results determined based on advance information.
  • the partial results determined based on the prior information are assumed to be, for example, pixels that have a high probability of indicating the presence of a head. Also, for example, pixels that are known to have a large change but low importance, such as the background, may not be used.
  • the reference value may be the output value of a predetermined node among the nodes included in the neural network. This allows for reduced memory usage and data handling, resulting in faster processing speeds.
• the second state comparison section 1540 uses the difference value and a pre-stored threshold value to determine the magnitude of the difference between the reference value and the current value, and determines a change in the state of the object to be determined based on the result. Specifically, when the sum of squares of the difference values computed for each image is smaller than a pre-stored threshold value, second state comparing section 1540 determines that the state of the determination target is unchanged. Alternatively, when the sum of the absolute values of the difference values recorded in image units is smaller than a pre-stored threshold value, the second state comparing section 1540 determines that the state of the determination target is unchanged.
• the second state comparison unit 1540 shown in FIG. 16 includes a first layer state comparison unit 1541 and a second layer state comparison unit 1542.
  • the first layer state comparison unit 1541 is also referred to as a first fully connected layer state comparison unit 1541.
  • the second layer state comparison unit 1542 is also referred to as a second fully connected layer state comparison unit 1542.
  • "storing" means holding the values of the one-dimensional vectors of the respective input sources.
  • comparison means for example, taking the absolute value of the difference between a stored one-dimensional vector (of the previous time/frame) and the latest one-dimensional vector for each element, and then comparing the sum or average with a predetermined threshold value.
  • the first fully connected layer state comparison unit 1541 shown in FIG. 16 includes a storage unit 1541a and a comparison processing unit 1541b.
  • the storage unit 1541a stores a reference value, which is an output value of the first fully connected layer 1527 and an input value of the second fully connected layer, for each output by the first fully connected layer 1527.
  • the comparison processing unit 1541b compares the current value with a reference value.
  • the second fully connected layer state comparison unit 1542 shown in FIG. 16 includes a storage unit 1542a and a comparison processing unit 1542b.
  • the storage unit 1542a stores a reference value, which is an output value of the second fully connected layer and an input value of the probability output layer 1529, for each output by the second fully connected layer.
  • the comparison processing unit 1542b compares the current value with a reference value.
• the second processing control unit 1580C instructs the second neural network unit 1520 not to execute processing in the layer of the object to be judged and subsequent layers, and also instructs the second output storage unit 1590 to output the judgment value stored in the second output storage unit 1590 as output data.
  • the second process control unit 1580C shown in FIG. 16 includes an abnormal state determination process control unit 1581C.
  • the abnormal state determination processing control unit 1581C functions to limit the processing of the second neural network unit 1520 in the abnormal state determination unit 1500C.
  • the second output storage unit 1590 stores the decision value output by the second neural network unit 1520 .
  • the probability storage unit 1591 stores the judgment value output by the second neural network unit 1520 .
  • the judgment value is a value indicating the state of the person to be judged, and is, for example, a probability value output by the probability output layer 1529 so that the output result has a probabilistic meaning.
  • the warning output unit 2000 acquires output data that is a judgment value indicating the drowsy state or dozing state of the person to be judged, and outputs a warning to the person to be judged in accordance with the judgment value.
  • the abnormality determination device 1000C shown in FIG. 14 is shown as being configured not to include the alarm output unit 2000, but may be configured to include the alarm output unit 2000. When configured in this manner, the abnormality determination device 1000C is equivalent to the abnormality warning device 100C shown in FIG. 14. In the following explanation, the abnormality determination device 1000C will be explained as being configured to include the alarm output unit 2000, except in cases where it is necessary to distinguish between the abnormality warning device 100C and the abnormality determination device 1000C.
  • abnormality determination device 1000C may be configured to include a control unit (not shown), a storage unit (not shown), and a communication unit (not shown).
  • a control unit (not shown) controls the entire abnormality determination device 1000C and each of its components.
  • the control unit (not shown) starts up the abnormality determination device 1000C in response to, for example, an external command.
  • a storage unit (not shown) stores each piece of data used in the abnormality determination device 1000C.
  • the storage unit stores, for example, outputs (output data) from each component in the abnormality determination device 1000C, and outputs data requested by each component to the component that has made the request.
  • the communication unit (not shown) communicates with an external device. For example, the communication unit communicates between the abnormality determination device 1000C and an imaging device such as an in-vehicle camera. In addition, for example, if the abnormality determination device 1000C does not have a display unit or an audio output unit, the communication unit communicates between the abnormality determination device 1000C and an external device such as a display unit or an audio output device.
  • FIG. 17 is a flowchart showing an example of the processing of the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • the abnormality determination device 1000C starts the process shown in FIG. 17 when an image is input from a camera, for example.
  • Abnormality determination device 1000C executes image acquisition processing (step ST3100).
  • the image acquisition unit 1100 of the abnormality determination device 1000C acquires and outputs an image.
  • abnormality determination device 1000C executes a storage and state comparison process (step ST3200).
  • the first state comparison unit 1340C of the abnormality determination device 1000C stores the previous values, which are the input values for each layer, which are the processing results for all layers of the first neural network unit 1320 for at least the first time after the start of processing.
• the first state comparison unit 1340 does not perform state comparison processing for the first processing (processing for the first image) and simply stores the values, but in the second and subsequent processing (processing from the second image onward), it determines whether there has been a change in the state of the object to be judged.
  • abnormality determination device 1000C executes feature extraction process control (step ST3300).
  • the feature extraction process control unit 1381 of the abnormality determination device 1000C executes a normal process command or an omission process command to the first neural network unit 1320 depending on the determination result by the first state comparison unit 1340. Furthermore, when issuing an omission processing command, the feature extraction processing control unit 1381 issues an output command to the combined image storage unit 1391 .
  • abnormality determination device 1000C executes combined image output processing (step ST3400).
  • the first neural network unit 1320 executes normal processing and outputs a combined image, or the combined image storage unit 1391 outputs the previous combined image, whereby the feature extraction unit 1300 outputs a combined image.
  • Abnormality determination device 1000C executes a process of storing the fully connected layer states and comparing the fully connected layer states (step ST3500).
  • the second state comparison unit 1540 in the abnormality determination device 1000C pre-stores, for each of the multiple layers of the second neural network unit 1520, a reference value that is a standard for the input value to that layer, and determines a change in the state of the object to be determined using the difference value between the reference value and the current value, which is the value that is to be newly input to that layer.
• the second state comparison unit 1540 does not perform state comparison processing for the first processing (processing for the first image) and simply stores the values, but determines changes in the state of the object to be judged in the second and subsequent processing (processing from the second image onward).
  • Abnormality determination device 1000C executes an abnormal state determination process control process (step ST3600).
  • the abnormal state determination process control unit 1581C issues a normal command or an omission command to the second neural network unit 1520.
  • the abnormal state determination process control unit 1581C issues an output command to the probability storage unit 1591.
  • Abnormality determination device 1000C executes a result output process (step ST3700).
• the second neural network unit 1520 executes normal processing and outputs a judgment value, or the probability storage unit 1591 outputs the previous judgment value (probability value), which causes the abnormal state determination unit 1500C to output the judgment value as output data.
  • Abnormality determination device 1000C executes an alarm output process (step ST3800).
  • the alarm output unit 2000 of the abnormality determination device 1000C acquires output data and outputs an alarm signal based on the output data to an alarm device (not shown) etc.
  • the alarm output unit 2000 determines whether to output an alarm based on a determination value included in the output data, and when it determines to output an alarm, outputs an alarm signal to an alarm device (not shown) etc.
• after abnormality determining device 1000C executes the process of step ST3800, it ends the series of processes shown in FIG. 17 and repeats the process from step ST3100.
  • the abnormality determination device 1000C is also turned off in conjunction with, for example, the camera being turned off.
  • FIG. 18 is a flowchart showing a detailed first example of the processing of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • the first state comparison section 1340C of the feature extraction section 1300C executes an image storage process (step ST3210).
  • the image state comparison section 1341 of the first state comparison section 1340C stores the image that is the input data for the first neural network section 1320. Furthermore, every time the first neural network unit 1320 acquires an image, the image state comparison unit 1341 stores the image in the storage unit 1341a.
  • the image state comparison section 1341 executes a process of determining whether the image has been previously stored (step ST3211).
  • the image state comparison section 1341 refers to the storage section 1341a to determine whether the previously input image is stored therein.
  • the image state comparison unit 1341 executes a process of comparing the current image with the previous image (step ST3212).
  • the comparison processing section 1341b of the image state comparison section 1341 compares the currently input image with the previously input image.
  • the comparison processing unit 1341b executes a process of determining whether the difference is less than a threshold value (step ST3213).
  • the comparison processing unit 1341b judges the magnitude of the difference between the reference value and the current value using a difference value between the currently input image and the previously input image and a pre-stored threshold value.
  • the convolution layer state comparison unit 1342 executes a convolution layer state storage process (step ST3214).
  • the convolution layer state comparison unit 1342 stores input data for the convolution layer in the storage unit 1342a.
  • the convolution layer state comparison unit 1342 executes a process of determining whether or not the data has been previously stored (step ST3215).
  • the convolution layer state comparison unit 1342 refers to the storage unit 1342a and determines whether a previous value, which is a value input previously, is stored.
• if the convolution layer state comparison unit 1342 determines that the previous state has been stored ("YES" in step ST3215), it executes a comparison process between the current state and the previous state (step ST3216).
  • the feature extraction section 1300C executes a process of determining whether the difference is less than a threshold value (step ST3217).
  • the comparison processing unit 1342b judges whether the difference between the previous value and the current value is large or small, using a difference value between the currently input image and the previously input image and a pre-stored threshold value.
  • the pooling layer state comparing unit 1343 executes a pooling layer state storage process (step ST3218). In the pooling layer state storage process, the pooling layer state comparison unit 1343 stores input data for the pooling layer in the storage unit 1343a.
  • the comparison processing unit 1343b in the pooling layer state comparison unit 1343 executes a process to determine whether the data was previously stored (step ST3219).
• if the comparison processing unit 1343b of the pooling layer state comparison unit 1343 determines that the value was stored previously ("YES" in step ST3219), it executes a comparison process between the current value and the reference value (previous value) (step ST3220).
  • the comparison processing unit 1343b of the pooling layer state comparison unit 1343 executes a process to determine whether the difference between the reference value (previous value) and the current value is less than a threshold value (step ST3221).
• when the comparison processing unit 1341b of the image state comparison unit 1341, the comparison processing unit 1342b of the convolution layer state comparison unit 1342, or the comparison processing unit 1343b of the pooling layer state comparison unit 1343 determines that the difference between the reference value (previous value) and the current value is equal to or greater than the threshold value ("NO" in step ST3213, "NO" in step ST3217, or "NO" in step ST3221), the feature extraction unit 1300C executes normal processing command processing (step ST3311). In the normal processing command process, the feature extraction processing control unit 1381 issues a normal processing command to the first neural network unit 1320.
  • the feature extraction section 1300C executes an output process (step ST3411).
  • the first neural network portion 1320 of the feature extraction portion 1300C outputs the combined image.
  • the feature extraction section 1300C executes an output storage process (step ST3412).
  • the combined image storage section 1391 of the output storage section in the feature extraction section 1300C stores the combined image output from the first neural network section 1320.
• the feature extraction unit 1300C executes omission processing command processing (step ST3312).
  • the feature extraction processing control unit 1381 commands the first neural network unit 1320 not to execute processing in layers subsequent to the layer to be judged, and also commands the first output storage unit 1390 to output the feature map stored therein.
  • the feature extraction section 1300C executes a stored data output process (step ST3413).
• the feature extraction unit 1300C then ends the process shown in FIG. 18.
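The overall control flow of FIG. 18 — store the input to each layer, compare it with the previously stored value, and when the difference is below the threshold skip the remaining layers and reuse the stored output — can be sketched as follows. The layer functions, the scalar inputs, and the threshold are simplified placeholders, not the embodiment's actual convolution and pooling layers.

```python
def run_with_skip(layers, x, store, threshold):
    """Run a chain of layers; if the input to any layer is close to the
    previously stored input for that layer, omit the remaining layers and
    reuse the stored final output (omission processing)."""
    for idx, layer in enumerate(layers):
        prev = store.get(idx)
        if prev is not None and abs(x - prev) < threshold:
            return store["output"], True   # skip: output stored result
        store[idx] = x                     # layer state storage process
        x = layer(x)                       # normal processing for this layer
    store["output"] = x                    # output storage process
    return x, False

# Stand-ins for convolution / pooling layers.
layers = [lambda v: v * 2, lambda v: v + 1]
store = {}
y1, skipped1 = run_with_skip(layers, 10.0, store, threshold=0.5)  # first frame: full run
y2, skipped2 = run_with_skip(layers, 10.1, store, threshold=0.5)  # near-identical frame
```

On the second, near-identical frame the difference at the first layer is below the threshold, so the stored output is returned without executing any layer, which is the processing-time saving the embodiment targets.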
  • FIG. 19 is a flowchart showing a detailed first example of the processing of the abnormal state determination unit 1500C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • the abnormal condition determination unit 1500C starts processing when it acquires the feature map output from the feature extraction unit 1300C.
  • the abnormal condition determination unit 1500C first executes processing on the data output by the feature extraction unit 1300C (step ST3510).
  • the abnormal state determination unit 1500C acquires the fully connected layer output (step ST3511).
  • the second state comparison section 1540 in the abnormal state determination section 1500C obtains the output value output from the first fully connected layer 1527.
  • the first fully connected layer state comparison unit 1541 executes a first fully connected layer state storage process (step ST3512).
  • the first fully connected layer state comparison unit 1541 stores the output value from the first fully connected layer 1527, which is the input value to the second fully connected layer 1528, as a reference value.
  • the first fully connected layer state comparing unit 1541 executes a process of judging whether the state has been stored previously (step ST3513).
  • the first fully connected layer state comparison unit 1541 refers to the storage unit 1541a to determine whether a reference value (previous value) is stored.
• if the first fully connected layer state comparison unit 1541 determines that the previous value has been stored ("YES" in step ST3513), it executes a comparison process between the current value and the previous value (step ST3514).
  • the first fully connected layer state comparison unit 1541 compares the current value with a reference value (previous value).
  • the first fully connected layer state comparing unit 1541 executes a process of determining whether the difference is smaller than a threshold value (step ST3515).
  • the comparison processing unit 1541b judges whether the difference between the current value and the reference value (previous value) is large or small, using a difference between the current value and the reference value and a pre-stored threshold value.
  • the second fully connected layer state comparison unit 1542 stores the output value from the second fully connected layer 1528, which is the input value to the probability output layer 1529, as a reference value.
  • the second fully connected layer state comparing unit 1542 executes a process of judging whether the state has been stored previously (step ST3517).
  • the second fully connected layer state comparison unit 1542 refers to the storage unit 1542a to determine whether a reference value (previous value) is stored.
• if the second fully connected layer state comparison unit 1542 determines that the previous value has been stored ("YES" in step ST3517), it executes a comparison process between the current value and the previous value (step ST3518).
  • the second fully connected layer state comparison unit 1542 compares the current value with a reference value (previous value).
  • the second fully connected layer state comparing unit 1542 executes a process of determining whether the difference is smaller than the threshold value (step ST3519).
  • the comparison processing unit 1542b judges whether the difference between the current value and the reference value (previous value) is large or small, using a difference between the current value and the reference value (previous value) and a pre-stored threshold value.
• when it is determined in step ST3515 that the difference is equal to or greater than the threshold value ("NO" in step ST3515), or in step ST3519 that the difference is equal to or greater than the threshold value ("NO" in step ST3519), abnormal state determination section 1500C executes output processing (step ST3711).
  • Neural network section 1520 (second neural network section 1520) in abnormal state determination section 1500C outputs a determination value.
  • Abnormal state determination section 1500C executes an output storage process (step ST3712).
  • An output storage unit 1590 (second output storage unit 1590) in the abnormal state determination unit 1500C stores the determination value output by the neural network unit 1520 (second neural network unit 1520).
  • the abnormal state determination unit 1500C executes an omission processing command (step ST3519).
  • the abnormal state judgment processing control unit 1581 instructs the second neural network unit 1520 not to execute processing in layers subsequent to the layer to be judged, and also instructs the second output storage unit 1590 to output the judgment value stored therein.
  • Abnormal state determination section 1500C executes a stored data output process (step ST3713).
  • the second output storage section 1590 outputs the decision value.
  • FIG. 20 is a flowchart showing a second detailed example of the processing of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • the first state comparison section 1340C of the feature extraction section 1300C executes an image storage process (step ST3210).
  • the image state comparison section 1341 of the first state comparison section 1340C stores the image that is the input data for the first neural network section 1320. Furthermore, every time the first neural network unit 1320 acquires an image, the image state comparison unit 1341 stores the image in the storage unit 1341a.
  • the image state comparison section 1341 executes a process of determining whether the image has been previously stored (step ST3211).
  • the image state comparison section 1341 refers to the storage section 1341a to determine whether the previously input image is stored therein.
  • the image state comparison unit 1341 executes a process of comparing the current image with the previous image (step ST3212).
  • the comparison processing section 1341b of the image state comparison section 1341 compares the currently input image with the previously input image.
  • the comparison processing unit 1341b executes a process to determine whether the difference is less than a threshold value (step ST3213).
  • the comparison processing unit 1341b determines the magnitude of the difference between the reference value and the current value using the difference value between the currently input image and the previously input image and a pre-stored threshold value.
  • If it is determined that a value has not been stored previously ("NO" in step ST3211), or if it is determined that the difference between the previous value and the current value is equal to or greater than the threshold value ("NO" in step ST3213), the feature extraction processing control unit 1381C issues a normal processing command (step ST3231), and processing is performed by the convolution layer unit 1322.
  • Next, the feature extraction unit 1300C executes a convolution layer state storage process (step ST3214).
  • Specifically, the convolution layer state comparison unit 1342 stores the input data for the convolution layer in the storage unit 1342a.
  • The convolution layer state comparison unit 1342 then executes a process to determine whether a previous value has been stored (step ST3215).
  • Specifically, the convolution layer state comparison unit 1342 refers to the storage unit 1342a and determines whether a previous value (reference value), which is a value input previously, has been stored.
  • If it is determined that the previous state has been stored ("YES" in step ST3215), a comparison process between the current state and the previous state is executed (step ST3216).
  • The feature extraction unit 1300C then executes a process to determine whether the difference is less than a threshold value (step ST3217). Specifically, the comparison processing unit 1342b of the convolution layer state comparison unit 1342 determines the magnitude of the difference between the previous value and the current value using the difference between the value to be input this time (current value) and the value input last time (previous value) to the pooling layer unit 1325 and a pre-stored threshold value.
  • If the convolution layer state comparison unit 1342 determines that the value was not stored previously ("NO" in step ST3215), or if the comparison processing unit 1342b of the convolution layer state comparison unit 1342 determines that the difference between the previous value and the current value is equal to or greater than the threshold value ("NO" in step ST3217), the feature extraction processing control unit 1381C issues a normal processing command (step ST3232), and processing is performed by the pooling layer unit 1325.
  • Next, the pooling layer state comparison unit 1343 stores the input data for the pooling layer (the input value to be processed) in the storage unit 1343a.
  • The comparison processing unit 1343b in the pooling layer state comparison unit 1343 then executes a process to determine whether data was previously stored (step ST3219).
  • If the comparison processing unit 1343b of the pooling layer state comparison unit 1343 determines that the value was stored previously ("YES" in step ST3219), it executes a comparison process between the current value and the reference value (previous value) (step ST3220).
  • The comparison processing unit 1343b of the pooling layer state comparison unit 1343 then executes a process to determine whether the difference between the reference value (previous value) and the current value is less than a threshold value (step ST3221).
  • If it is determined that the value was not stored previously ("NO" in step ST3219), or that the difference is equal to or greater than the threshold value ("NO" in step ST3221), the feature extraction unit 1300C executes a normal processing command process (step ST3311).
  • Specifically, the feature extraction processing control unit 1381C issues a normal processing command to the first neural network unit 1320.
  • The image combination unit 1328 in the first neural network unit 1320 then executes image combination processing.
  • The feature extraction unit 1300C executes an output storage process (step ST3412).
  • Specifically, the combined image storage unit 1391 of the first output storage unit 1390 in the feature extraction unit 1300C stores the combined image output from the first neural network unit 1320.
  • If the difference is less than the threshold value ("YES" in step ST3221), the feature extraction unit 1300C executes an omission processing command process (step ST3312).
  • Specifically, the feature extraction processing control unit 1381C commands the first neural network unit 1320 not to execute processing in the layers subsequent to the layer being judged, and also commands the first output storage unit 1390 to output the feature map stored therein.
  • The feature extraction unit 1300C then executes a stored data output process (step ST3413).
  • After the above, the feature extraction unit 1300C ends the process shown in FIG. 20.
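The per-layer comparison performed in the steps above (store the previous input, compare it with the current input, and decide by threshold whether the change is small) can be sketched as follows. This is a minimal illustrative sketch and not the disclosed implementation: the class name, the mean-absolute-difference metric, and the threshold value are all assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch of the state comparison performed at each layer
# (e.g. steps ST3211-ST3213 for the image, ST3215-ST3217 for the
# convolution layer): the previously stored input (reference value) is
# compared with the value about to be input (current value), and the
# difference decides whether subsequent layers may be skipped.

class StateComparator:
    def __init__(self, threshold: float):
        self.threshold = threshold
        self.reference = None  # plays the role of storage units 1341a/1342a/1343a

    def state_changed(self, current: np.ndarray) -> bool:
        """Return True when normal processing is required."""
        if self.reference is None:           # "NO" at the previous-value check
            self.reference = current.copy()
            return True                      # no reference yet -> normal processing
        diff = float(np.mean(np.abs(current - self.reference)))
        self.reference = current.copy()      # state storage process runs each time
        return diff >= self.threshold        # difference >= threshold -> normal processing

comparator = StateComparator(threshold=0.1)
first = np.zeros((4, 4))
assert comparator.state_changed(first)              # nothing stored yet
assert not comparator.state_changed(first + 0.01)   # small change -> skip later layers
assert comparator.state_changed(first + 1.0)        # large change -> normal processing
```

A comparator of this kind would be instantiated once per monitored layer, mirroring the separate image, convolution layer, and pooling layer state comparison units.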
  • FIG. 21 is a flowchart showing a second detailed example of the processing of the abnormal state determination unit 1500C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
  • The abnormal state determination unit 1500C starts processing when it acquires the feature map output from the feature extraction unit 1300C.
  • The abnormal state determination unit 1500C first executes processing on the data output by the feature extraction unit 1300C (step ST3510).
  • The abnormal state determination unit 1500C then acquires the fully connected layer output (step ST3511).
  • Specifically, the second state comparison unit 1540 in the abnormal state determination unit 1500C obtains the output value output from the first fully connected layer 1527.
  • The first fully connected layer state comparison unit 1541 executes a first fully connected layer state storage process (step ST3512).
  • Specifically, the first fully connected layer state comparison unit 1541 stores the output value from the first fully connected layer 1527, which is the input value to the second fully connected layer 1528 (the value before being processed by the second fully connected layer 1528), as a reference value.
  • The first fully connected layer state comparison unit 1541 then executes a process of determining whether data has been previously stored (step ST3513).
  • Specifically, the first fully connected layer state comparison unit 1541 refers to the storage unit 1541a and determines whether a reference value (previous value) is stored.
  • If the first fully connected layer state comparison unit 1541 determines that the previous value has been stored ("YES" in step ST3513), it executes a comparison process between the current value and the previous value (step ST3514).
  • Specifically, the first fully connected layer state comparison unit 1541 compares the current value with the reference value (previous value).
  • The first fully connected layer state comparison unit 1541 then executes a process of determining whether the difference is less than a threshold value (step ST3515).
  • Specifically, the comparison processing unit 1541b in the first fully connected layer state comparison unit 1541 determines the magnitude of the difference between the reference value and the current value using the difference between the current value and the reference value (previous value) and a pre-stored threshold value.
  • If it is determined that the previous value has not been stored ("NO" in step ST3513), or that the difference is equal to or greater than the threshold value ("NO" in step ST3515), the abnormal state determination processing control unit 1581C issues a normal processing command (step ST3520), and processing is performed by the second fully connected layer 1528.
  • Next, the second fully connected layer state comparison unit 1542 executes a second fully connected layer state storage process (step ST3516).
  • Specifically, the second fully connected layer state comparison unit 1542 stores the output value from the second fully connected layer 1528, which is the input value to the probability output layer 1529 (the value before being processed by the probability output layer 1529), as a reference value.
  • The second fully connected layer state comparison unit 1542 then executes a process to determine whether a previous value has been stored (step ST3517).
  • Specifically, the second fully connected layer state comparison unit 1542 refers to the storage unit 1542a to determine whether a reference value (previous value) has been stored.
  • If the second fully connected layer state comparison unit 1542 determines that the previous value has been stored ("YES" in step ST3517), it executes a comparison process between the current value and the previous value (step ST3518).
  • Specifically, the second fully connected layer state comparison unit 1542 compares the current value with the reference value (previous value).
  • The second fully connected layer state comparison unit 1542 then executes a process of determining whether the difference is less than the threshold value (step ST3519).
  • Specifically, the comparison processing unit 1542b in the second fully connected layer state comparison unit 1542 determines the magnitude of the difference between the reference value and the current value using the difference between the current value and the reference value (previous value) and a pre-stored threshold value.
  • If it is determined that the data was not previously stored ("NO" in step ST3517), or if the comparison processing unit 1542b in the second fully connected layer state comparison unit 1542 determines that the difference is equal to or greater than the threshold value ("NO" in step ST3519), the abnormal state determination unit 1500C executes a normal processing command process (step ST3611).
  • The abnormal state determination unit 1500C then executes output processing (step ST3711).
  • Specifically, the second neural network unit 1520 in the abnormal state determination unit 1500C outputs a determination value.
  • The abnormal state determination unit 1500C executes an output storage process (step ST3712).
  • Specifically, the second output storage unit 1590 in the abnormal state determination unit 1500C stores the determination value output by the second neural network unit 1520 in the probability storage unit 1591.
  • If the difference is less than the threshold value ("YES" in step ST3519), the abnormal state determination unit 1500C executes an omission processing command process (step ST3612).
  • Specifically, the abnormal state determination processing control unit 1581C instructs the second neural network unit 1520 not to execute processing in the layers subsequent to the layer being judged, and also instructs it to output the determination value stored in the second output storage unit 1590.
  • Finally, the abnormal state determination unit 1500C executes a stored data output process (step ST3713).
  • Specifically, the second output storage unit 1590 receives a command from the abnormal state determination processing control unit 1581C and outputs the determination value (probability value) stored in the probability storage unit 1591.
  • As described above, when each state storage and comparison processing unit determines that the change is small, subsequent processing can be avoided, which reduces the amount of calculation and enables faster processing. Because the result output at that time is the previous highly accurate determination result, the output of highly accurate results can be continued.
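The skip-and-reuse control flow summarized above can be sketched as follows, assuming a simple sequential model. The function name, the checkpoint structure, and the scalar "layers" are hypothetical stand-ins for the processing control unit, the state comparison units, and the output storage unit; they are not taken from the disclosure.

```python
# Hypothetical sketch of the processing control described above: layers are
# evaluated in order, and when the input to a checkpointed layer has changed
# little since the previous run, the remaining layers are skipped and the
# stored (previous, highly accurate) output is returned instead.

def forward(layers, checkpoints, x, cache):
    """layers: list of callables; checkpoints: {layer index: (reference, threshold)}."""
    for i, layer in enumerate(layers):
        if i in checkpoints:
            reference, threshold = checkpoints[i]
            if reference is not None and abs(x - reference) < threshold:
                return cache["output"]       # omission command: reuse the stored result
            checkpoints[i] = (x, threshold)  # update the stored reference value
        x = layer(x)                         # normal processing command
    cache["output"] = x                      # output storage unit keeps the result
    return x

layers = [lambda v: v * 2, lambda v: v + 3]
checkpoints = {1: (None, 0.5)}               # watch the input to the second layer
cache = {"output": None}
assert forward(layers, checkpoints, 1.0, cache) == 5.0   # 1*2=2, 2+3=5
assert forward(layers, checkpoints, 1.1, cache) == 5.0   # 2.2 vs 2.0 < 0.5 -> cached
assert forward(layers, checkpoints, 2.0, cache) == 7.0   # 4.0 vs 2.0 >= 0.5 -> recompute
```

In the disclosed device the inputs are feature maps rather than scalars and there are multiple checkpoints (image, convolution, pooling, and fully connected layers), but the control decision at each checkpoint follows this same pattern.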
  • As described above, the abnormality determination device of the present disclosure is further configured as follows: the neural network unit includes a first neural network unit and a second neural network unit,
  • the output storage unit includes a first output storage unit and a second output storage unit,
  • the state comparison unit includes a first state comparison unit and a second state comparison unit, and
  • the processing control unit includes a first processing control unit and a second processing control unit, the first neural network unit having a plurality of layers that are a part of the layers constituting the neural network, acquiring the image output by the image acquisition unit, and outputting a feature map that represents a characteristic state of the object to be determined that is included in the image;
  • the first output storage unit stores the feature map output by the first neural network unit; and
  • the first state comparison unit stores in advance a reference value that is a reference for an input value to each of the plurality of layers in the first neural network unit, and judges a change in the state of the object to be judged using a difference value between the reference value and a current value that is a value to be newly input to each of those layers.
  • As described above, the present disclosure has the effect of providing an abnormality determination device that can use a neural network in an abnormality determination technique to more quickly output highly accurate determination results. The present disclosure achieves the same effect by applying the above configuration to the abnormality determination method as well.
  • FIG. 22 is a diagram illustrating a first example of a hardware configuration for realizing the functions of the present disclosure.
  • FIG. 23 is a diagram illustrating a second example of a hardware configuration for realizing the functions of the present disclosure.
  • The abnormality warning device 100 (100A, 100B, 100C) and the abnormality determination device 1000 (1000A, 1000B, 1000C) of the present disclosure are each realized by hardware such as that shown in FIG. 22 or FIG. 23.
  • Each of the abnormality warning devices 100 (100A, 100B, 100C) and the abnormality determination devices 1000 (1000A, 1000B, 1000C) is configured with, for example, a processor 10001, a memory 10002, and a communication circuit 10004.
  • The processor 10001 and the memory 10002 are installed in, for example, a computer.
  • The memory 10002 stores programs for causing the computer to function as the image acquisition unit 1010, the neural network unit 1020, the state comparison unit 1030, the processing control unit 1040, the output storage unit 1050, the image acquisition units 1100, 1100B, and 1100C, the feature extraction units 1300, 1300B, and 1300C, the first neural network unit 1320, the image branching unit 1321, and the convolution layer unit 1322.
  • The processor 10001 reads out and executes the programs stored in the memory 10002, thereby controlling the image acquisition unit 1010, the neural network unit 1020, the state comparison unit 1030, the processing control unit 1040, the output storage unit 1050, the image acquisition units 1100, 1100B, and 1100C, the feature extraction units 1300, 1300B, and 1300C, the first neural network unit 1320, the image branching unit 1321, the convolution layer unit 1322, the first convolution layer 1323, the second convolution layer 1324, the pooling layer unit 1325, the first pooling layer 1326, the second pooling layer 1327, the image combination unit 1328, the first state comparison units 1340, 1340B, and 1340C, the image state comparison unit 1341, the comparison processing unit 1341b, the convolution layer state comparison unit 1342, the comparison processing unit 1342b, the pooling layer state comparison unit 1343, the comparison processing unit 1343b, the first processing control units 1380, 1380B, and 1380C, the feature extraction processing control units 1381, 1381B, and 1381C, and the abnormal state determination units 1500, 1500B, and 1500C.
  • A storage unit (not shown) is realized by the memory 10002 or another memory (not shown). Furthermore, the storage units 1341a, 1342a, and 1343a, the first output storage unit 1390, the combined image storage unit 1391, the storage units 1541a and 1542a, the second output storage unit 1590, and the probability storage unit 1591 in the abnormality warning device 100 (100A, 100B, 100C) and the abnormality determination device 1000 (1000A, 1000B, 1000C) are realized by the memory 10002 or another memory (not shown). Furthermore, the communication circuit 10004 realizes a communication unit (not shown).
  • The processor 10001 is, for example, a central processing unit (CPU), a graphics processing unit (GPU), a microprocessor, a microcontroller, or a digital signal processor (DSP).
  • The memory 10002 may be a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable Read Only Memory), or a flash memory; a magnetic disk such as a hard disk or a flexible disk; an optical disk such as a CD (Compact Disc) or a DVD (Digital Versatile Disc); or a magneto-optical disk.
  • The processor 10001 and the memory 10002 or the communication circuit 10004 are connected in a state in which data can be transmitted between them.
  • The processor 10001, the memory 10002, and the communication circuit 10004 are also connected via the input/output interface 10003 in a state in which data can be transmitted to and from other hardware.
  • In the second example shown in FIG. 23, the functions are realized by a processing circuit 20001, which may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field-Programmable Gate Array), a SoC (System-on-a-Chip), or a system LSI (Large-Scale Integration).
  • The memory 20002 or another memory (not shown) implements a storage unit (not shown).
  • The memory 20002 may be a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable Read Only Memory), or a flash memory; a magnetic disk such as a hard disk or a flexible disk; an optical disk such as a CD (Compact Disc) or a DVD (Digital Versatile Disc); or a magneto-optical disk.
  • The communication circuit 20004 realizes a communication unit (not shown).
  • The processing circuit 20001 and the memory 20002 or the communication circuit 20004 are connected in a state in which they can transmit data to each other.
  • The processing circuit 20001, the memory 20002, and the communication circuit 20004 are connected via the input/output interface 20003 in a state in which they can transmit data to other hardware.
  • As described above, the present disclosure provides an abnormality determination technology that can quickly output highly accurate determination results using a neural network without performing calculations in all layers of the neural network. For example, this enables rapid and highly accurate abnormality determination, such as drowsiness detection, and is therefore suitable for application to in-vehicle driver monitoring systems.
  • 100, 100A, 100B, 100C: abnormality warning device, 1000, 1000A, 1000B, 1000C: abnormality determination device, 1010: image acquisition unit, 1020: neural network unit, 1030: state comparison unit, 1040: processing control unit, 1050: output storage unit, 1100, 1100B, 1100C: image acquisition unit, 1300, 1300B, 1300C: feature extraction unit, 1320: neural network unit (first neural network unit), 1321: image branching unit, 1322: convolution layer unit, 1323: first convolution layer, 1324: second convolution layer, 1325: pooling layer unit, 1326: first pooling layer, 1327: second pooling layer, 1328: image combination unit, 1340, 1340B, 1340C: state comparison unit (first state comparison unit), 1341: image state comparison unit, 1341a: storage unit, 1341b: comparison processing unit, 1342: convolution layer state comparison unit, 1342a: storage unit, 1342b: comparison processing unit, 1343: pooling layer state comparison unit, 1343a: storage unit, 1343b: comparison processing unit, 1380, 1380

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

An abnormality determination device (1000) comprises: an image acquisition unit (1010) that acquires and outputs an image; a neural network unit (1020) that has a plurality of layers in a neural network, the neural network unit acquiring the image outputted by the image acquisition unit and outputting a determination result indicating the state of a determination object included in the image; an output storage unit (1050) that stores the output data outputted by the neural network unit; a state comparison unit (1030) that stores in advance a reference value serving as a reference for an input value to a first layer that is at least one layer among the plurality of layers of the neural network unit, and determines a change in the state of the determination object by using a difference value between the reference value and a current value that is to be newly inputted to the first layer; and a processing control unit (1040) that, in cases where the state comparison unit determines that the state of the determination object has not changed, instructs the neural network unit not to execute processing in layers after the first layer and instructs that the output data stored in the output storage unit be outputted.
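As a rough illustration only, the five components named in the abstract might be combined as below. Every name and interface here is an assumption made for the sketch, not the claimed implementation; the image is reduced to a scalar and the network to one callable layer for brevity.

```python
# Hypothetical sketch mapping the components in the abstract to a minimal
# object structure: image acquisition (1010), neural network unit (1020),
# state comparison (1030), processing control (1040), output storage (1050).

class AbnormalityDeterminationDevice:
    def __init__(self, acquire_image, network_layers, threshold):
        self.acquire_image = acquire_image    # image acquisition unit (1010)
        self.layers = network_layers          # neural network unit (1020)
        self.threshold = threshold            # used by state comparison unit (1030)
        self.reference = None                 # reference value held by (1030)
        self.stored_output = None             # output storage unit (1050)

    def determine(self):
        image = self.acquire_image()
        # state comparison unit: compare the current value with the reference value
        if self.reference is not None and abs(image - self.reference) < self.threshold:
            # processing control unit: skip the layers and reuse the stored output
            return self.stored_output
        self.reference = image
        result = image
        for layer in self.layers:             # normal processing in all layers
            result = layer(result)
        self.stored_output = result
        return result

frames = iter([1.0, 1.02, 3.0])
device = AbnormalityDeterminationDevice(lambda: next(frames),
                                        [lambda v: v * 10], threshold=0.1)
assert device.determine() == 10.0   # first frame: full processing
assert device.determine() == 10.0   # small change: stored output reused
assert device.determine() == 30.0   # large change: recomputed
```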

Description

Abnormality determination device and abnormality determination method

The disclosed technology relates to an abnormality determination technology that determines the condition of a subject.

Among the abnormality determination techniques, there are those that determine the condition of a person to be determined by using the results of measurements taken by a sensor device (such as a camera, radar, or infrared sensor).
Among such abnormality judgment technologies, for example, in abnormality judgment technologies that judge the state of a subject of judgment, such as a driver (occupant) of a vehicle (mobile body), since this concerns the safety of the subject of judgment, it is desirable to make the judgment with high accuracy to avoid erroneous judgments.
In response to this, in recent years, research into anomaly determination technology using deep learning has been progressing. Deep learning uses a mathematical model called a neural network (NN). A neural network outputs results through layers such as an input layer, a hidden layer (intermediate layer), and an output layer.
Patent Document 1 discloses an identification device that uses a convolutional neural network (CNN), which is a type of neural network, to identify when a driver of a mobile object is in an abnormal state.
Specifically, in the process of identifying the driver's condition (e.g., whether or not the driver is in an abnormal state) based on an acquired image, the identification device of Patent Document 1 obtains the driver's skeletal information from the image using a convolutional neural network.
The identification device of Patent Document 1 improves the accuracy of the driver's skeletal information by using a neural network.

JP 2020-052869 A

On the other hand, in the abnormality determination technology for a moving object as described above, there is often a demand for rapid determination as well as high accuracy.
When attempting to achieve high accuracy using the above-mentioned neural network in anomaly judgment technology, for example, a method can be adopted in which the neural network extracts and uses a large number of judgment indices indicating the state of the person being judged from an image (for example, a large number of types of indices such as the open/closed state of the eyelids, the number of blinks, the position of the face, the state of brain waves, the heat dissipation state, driving performance (reaction time, etc.)).
If one were to use the above method to output highly accurate abnormality determination results using a neural network, the number of layers in the neural network would increase or the configuration would become complex, resulting in an increase in the amount and time of calculations.
Therefore, there has been a problem with anomaly determination technology in that it is difficult to quickly output highly accurate determination results using a neural network.
The identification device of Patent Document 1 merely outputs skeletal information from an image via a neural network, and cannot solve the above-mentioned problem.

The present disclosure aims to solve the above problems and to enable abnormality determination technology to quickly output highly accurate determination results using a neural network.

The abnormality determination device of the present disclosure includes:
an image acquisition unit that acquires and outputs an image;
a neural network unit having a plurality of layers in a neural network, acquiring the image output by the image acquisition unit, and outputting a judgment result indicating a state of a judgment target included in the image;
an output storage unit that stores output data output by the neural network unit;
a state comparison unit that stores in advance a reference value that is a reference for an input value to a first layer that is at least one of the multiple layers of the neural network unit, and judges a change in the state of the object to be judged using a difference value between the reference value and a current value that is a value to be newly input to the first layer; and
a processing control unit that, when the state comparison unit determines that the state of the object to be judged has not changed, instructs the neural network unit not to execute processing in the first layer and subsequent layers and instructs the neural network unit to output the output data stored in the output storage unit.

According to the present disclosure, in an abnormality determination technique, it is possible to quickly output highly accurate determination results using a neural network.

FIG. 1 is a diagram showing an example of the configuration of an abnormality warning device 100A and an abnormality determination device 1000A according to Embodiment 1 of the present disclosure.
FIG. 2 is a flowchart showing an example of the processing of the abnormality warning device 100A and the abnormality determination device 1000A according to Embodiment 1 of the present disclosure.
FIG. 3 is a flowchart showing an example of more detailed processing of the abnormality warning device 100A and the abnormality determination device 1000A according to Embodiment 1 of the present disclosure.
FIG. 4 is a diagram showing an example of the configuration of an abnormality warning device 100B and an abnormality determination device 1000B according to Embodiment 2 of the present disclosure.
FIG. 5 is a diagram showing an example of the internal configuration of the feature extraction unit 1300B in the abnormality warning device 100B and the abnormality determination device 1000B.
FIG. 6 is a diagram showing an example of the internal configuration of the abnormal state determination unit 1500B in the abnormality warning device 100B and the abnormality determination device 1000B.
FIG. 7 is a flowchart showing an example of the processing of the abnormality warning device 100B and the abnormality determination device 1000B according to Embodiment 2 of the present disclosure.
FIG. 8 is a flowchart showing an example of more detailed processing of the abnormality warning device 100B and the abnormality determination device 1000B according to Embodiment 2 of the present disclosure.
FIG. 9 is a flowchart showing an example of the processing of the feature extraction unit 1300B when the state comparison unit 1030 (first state comparison unit 1030B) in the feature extraction unit 1300B according to Embodiment 2 of the present disclosure is the image state comparison unit 1341.
FIG. 10 is a flowchart showing an example of the processing of the feature extraction unit 1300B when the state comparison unit 1030 (first state comparison unit 1030B) in the feature extraction unit 1300B according to Embodiment 2 of the present disclosure is a convolution layer state comparison unit.
FIG. 11 is a flowchart showing an example of the processing of the feature extraction unit 1300B when the state comparison unit 1030 (first state comparison unit 1030B) in the feature extraction unit 1300B according to Embodiment 2 of the present disclosure is a pooling layer state comparison unit.
FIG. 12 is a flowchart showing an example of the processing of the abnormal state determination unit 1500B when the state comparison unit (second state comparison unit 1540B) in the abnormal state determination unit 1500B according to Embodiment 2 of the present disclosure is the first fully connected layer state comparison unit 1541.
FIG. 13 is a flowchart showing an example of the processing of the abnormal state determination unit 1500B when the state comparison unit (second state comparison unit 1540B) in the abnormal state determination unit 1500B according to Embodiment 2 of the present disclosure is a second fully connected layer state comparison unit.
FIG. 14 is a diagram showing an example of the configuration of an abnormality warning device 100C and an abnormality determination device 1000C according to Embodiment 3 of the present disclosure.
FIG. 15 is a diagram showing an example of the internal configuration of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C.
FIG. 16 is a diagram showing an example of the internal configuration of the abnormal state determination unit 1500C in the abnormality warning device 100C and the abnormality determination device 1000C.
FIG. 17 is a flowchart showing an example of the processing of the abnormality warning device 100C and the abnormality determination device 1000C according to Embodiment 3 of the present disclosure.
FIG. 18 is a flowchart showing a first detailed example of the processing of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C according to Embodiment 3 of the present disclosure.
FIG. 19 is a flowchart showing a first detailed example of the processing of the abnormal state determination unit 1500C in the abnormality warning device 100C and the abnormality determination device 1000C according to Embodiment 3 of the present disclosure.
FIG. 20 is a flowchart showing a second detailed example of the processing of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C according to Embodiment 3 of the present disclosure.
FIG. 21 is a flowchart showing a second detailed example of the processing of the abnormal state determination unit 1500C in the abnormality warning device 100C and the abnormality determination device 1000C according to Embodiment 3 of the present disclosure.
FIG. 22 is a diagram illustrating a first example of a hardware configuration for realizing the functions of the present disclosure.
FIG. 23 is a diagram illustrating a second example of a hardware configuration for realizing the functions of the present disclosure.

In order to explain the present disclosure in more detail, embodiments of the present disclosure will be described below with reference to the accompanying drawings.
In the following description, ordinal numbers such as "first,""second," and "third" may be used. These terms are used for convenience to facilitate understanding of the contents of the embodiments, and are not limited to the order that may result from these ordinal numbers.

Embodiment 1.
In the first embodiment, one mode of a basic configuration of the present disclosure will be described.

An example of the configuration of the abnormality warning device 100A and the abnormality determination device 1000A according to the first embodiment will be described.
FIG. 1 is a diagram showing an example of the configuration of an abnormality warning device 100A and an abnormality determination device 1000A according to a first embodiment of the present disclosure.
The abnormality warning device 100A acquires an image, uses the image to judge the state of an object to be judged that is captured in the image, and outputs a warning if the state of the object to be judged indicates an abnormality.
The abnormality warning device 100A shown in FIG. 1 includes an abnormality determination device 1000A and a warning output unit 2000.

The abnormality determination device 1000A obtains an image and uses the image to output the state of the object to be determined captured in the image. The state of the object to be determined is output data including a value in the form of, for example, a determination value or a probability value.
The abnormality determination device 1000A includes an image acquisition unit 1010, a neural network unit 1020, a state comparison unit 1030, a processing control unit 1040, and an output storage unit 1050.

The image acquisition unit 1010 acquires and outputs an image.
The image acquisition unit 1010 acquires an image by inputting an image (moving image or still image) captured by a camera, for example.
The camera is, for example, an imaging device installed inside the vehicle cabin, which captures an image of a determination target present inside the vehicle cabin (the determination target is, for example, a living body such as a passenger including the driver) and outputs the captured image.
When the image is a moving image, the camera divides the moving image into still images (frames) at regular time intervals, and the divided images are input to the image acquisition unit 1010.

The neural network unit 1020 is composed of a neural network.
The neural network unit 1020 has multiple layers in a neural network, acquires the image output by the image acquisition unit 1010, and outputs output data that is a judgment result indicating the state of the judgment target contained in the image.
The multiple layers in a neural network are an input layer, a hidden layer (an intermediate layer), and an output layer, and more specifically, layers such as a convolutional layer and a pooling layer.
Here, a neural network with many layers (deep) is particularly called a DNN (Deep Neural Network), and its derivatives include, for example, a CNN (Convolutional Neural Network) and an RNN (Recurrent Neural Network). CNN is often used in the fields of object recognition and image recognition, while RNN is often used in time series processing, voice recognition, natural language processing, and the like. The neural network in the present disclosure is applicable regardless of the form of the neural network. In this description, the form of CNN will be described as an example.
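As a point of reference, the layer structure described above (convolutional layer, pooling layer, output layer) can be sketched in a few lines of NumPy. This is a generic toy CNN forward pass, not the network of the present disclosure; the function names, the shapes, and the sigmoid output are assumptions made purely for illustration.

```python
import numpy as np

def conv2d(x, kernel):
    # "valid" 2-D convolution (single channel, single kernel, stride 1)
    kh, kw = kernel.shape
    h, w = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    # non-overlapping max pooling
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

def forward(image, kernel, weights):
    # convolutional layer -> ReLU -> pooling layer -> fully connected output layer
    feat = np.maximum(conv2d(image, kernel), 0.0)
    feat = max_pool(feat)
    logit = float(feat.ravel() @ weights)
    return 1.0 / (1.0 + np.exp(-logit))  # probability-like judgment value in (0, 1)
```

With an 8x8 input and a 3x3 kernel, the convolution yields a 6x6 feature map, pooling reduces it to 3x3, and the fully connected layer produces a single judgment value.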

The state comparison unit 1030 judges a change in the state of the object to be judged.
The state comparison unit 1030 pre-stores a reference value that is a standard for the input value to the first layer, which is at least one of the multiple layers of the neural network unit 1020, and determines a change in the state of the object to be judged using the difference value between the reference value and the current value, which is the value that is to be newly input to the first layer.
The first layer may be a predetermined one of all layers in the neural network unit 1020, a predetermined portion of all layers, or all layers.

The reference value is a previous value, which is an input value of at least one layer, the first layer, stored as a result of previous processing by multiple layers of the neural network unit 1020, and is used to obtain a difference value between the previous value and a current value, which is an input value that is about to be newly input to the first layer. In the explanation of the processing described later, a case in which the previous value is used as the reference value will be explained.
However, the following values may be stored and used as the reference values.

Typical processing results that have been learned and modeled in advance can be used as the reference value.
In other words, the reference value may be a typical output value of a pre-modeled intermediate layer in a neural network.
By using typical processing results, comparisons can be made without being influenced by noise or anomalies contained in the actual observed images.

In addition, a subset of the results determined based on prior information can be used as the reference value.
The partial results determined based on the prior information are assumed to be, for example, pixels that have a high probability of indicating the presence of a head.
Also, for example, pixels that are known to have a large change but low importance, such as the background, may not be used.
In other words, the reference value may be the output value of a predetermined node among the nodes included in the neural network.
This allows for reduced memory usage and data handling, resulting in faster processing speeds.
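The node-subset idea above can be sketched as follows. Here `node_indices` is a hypothetical parameter standing in for the predetermined nodes (for example, pixels with a high head-presence probability); the function name and signature are not defined in the present disclosure.

```python
import numpy as np

def masked_difference(reference, current, node_indices):
    """Compare only predetermined nodes, reducing memory use and computation.

    node_indices would be chosen offline from prior information, e.g. pixels
    with a high head-presence probability; background pixels that change a
    lot but matter little are simply never stored or compared.
    """
    ref = np.asarray(reference, dtype=float).ravel()[node_indices]
    cur = np.asarray(current, dtype=float).ravel()[node_indices]
    return cur - ref
```

Only the selected nodes need to be kept as the reference value, which is the memory and speed benefit described above.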

The state comparison unit 1030 uses the difference value and a pre-stored threshold value to determine the magnitude of the difference between the reference value and the current value, and determines a change in the state of the object to be determined based on the result.
Specifically, for example, in units of images, if the sum of squares of the difference values is smaller than a pre-stored threshold value, the state comparison unit 1030 determines that the state of the determination target is unchanged.
More specifically, the state comparison unit 1030 determines that the state of the determination target is unchanged when the sum of the absolute values of the difference values is smaller than a pre-stored threshold value, for example, for each image.
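The two criteria above (sum of squared differences, or sum of absolute differences, each compared against a pre-stored threshold) can be sketched as follows; the function name and the `metric` switch are assumptions made for illustration.

```python
import numpy as np

def state_unchanged(reference, current, threshold, metric="sq"):
    """Return True when the judgment target is considered unchanged.

    metric="sq"  : sum of squared differences  (first criterion above)
    metric="abs" : sum of absolute differences (second criterion above)
    """
    diff = np.asarray(current, dtype=float) - np.asarray(reference, dtype=float)
    score = np.sum(diff ** 2) if metric == "sq" else np.sum(np.abs(diff))
    return bool(score < threshold)
```

A True result corresponds to the "no change" decision that lets later layers be skipped.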

In response to changes in the state of the object to be judged, the processing control unit 1040 issues a command to restrict the current processing operation in the neural network unit 1020, and also issues a command to output the output data output by the previous processing of the neural network unit 1020.
When the state comparison unit 1030 determines that the state of the object to be judged has not changed ("no change" includes cases where the change is small), the processing control unit 1040 instructs the neural network unit 1020 not to execute the processing in the layers from the first layer onward, the first layer being at least one of the multiple layers, and also instructs the output storage unit 1050 to output the stored output data.
The output data is, for example, a judgment value that is a value indicating the state of the person to be judged. In the embodiment of the present disclosure, the judgment value is also referred to as a probability value that includes a probabilistic element.

The output storage unit 1050 stores the output data (judgment value (probability value)) output by the neural network unit 1020.
When the output storage unit 1050 receives a command from the process control unit 1040, it outputs the stored output data.
The output storage unit 1050 outputs the output data to the warning output unit 2000, for example.
The output storage unit 1050 holds the latest output data output as a result of processing performed by the neural network unit 1020. For example, every time the output storage unit 1050 obtains output data output by the neural network unit 1020, it erases the output data previously stored and stores the latest output data.
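The store-latest-and-discard behavior described above can be sketched as a minimal class; the class and method names are assumptions made for illustration.

```python
class OutputStore:
    """Holds only the most recent output data (judgment value)."""

    def __init__(self):
        self._latest = None

    def store(self, output):
        # storing new output data discards the previously stored data
        self._latest = output

    def emit(self):
        # called on a command from the processing controller;
        # returns the latest stored output data
        return self._latest
```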

The alarm output unit 2000 acquires output data, and outputs an alarm signal based on the output data to an alarm device (not shown) or the like.
The warning output unit 2000 acquires output data that is a judgment value indicating the drowsy state or dozing state of the person to be judged, and outputs a warning to the person to be judged in accordance with the judgment value.

Although the abnormality determination device 1000A shown in FIG. 1 is configured without the alarm output unit 2000, it may be configured to include the alarm output unit 2000. When configured in this manner, the abnormality determination device 1000A is equivalent to the abnormality warning device 100A shown in FIG. 14. In the following description, the abnormality determination device 1000A is assumed to include the alarm output unit 2000, except where it is necessary to distinguish between the abnormality warning device 100A and the abnormality determination device 1000A.

In addition to the above configuration, abnormality determination device 1000A may be configured to include a control unit (not shown), a storage unit (not shown), and a communication unit (not shown).
A control unit (not shown) controls the entire abnormality determination device 1000A and each of its components. The control unit (not shown) starts up the abnormality determination device 1000A in response to, for example, an external command. The control unit (not shown) also controls the state (operation state = start-up, shutdown, sleep, etc.) of the abnormality determination device 1000A.
A storage unit (not shown) stores each piece of data used in the abnormality determination device 1000A. The storage unit (not shown) stores, for example, outputs (output data) from each component in the abnormality determination device 1000A, and outputs data requested by each component to the component that has made the request.
A communication unit (not shown) communicates with an external device. For example, communication is performed between the abnormality determination device 1000A and an imaging device such as an in-vehicle camera. In addition, for example, if the abnormality determination device 1000A does not have a display unit or an audio output unit, communication is performed between the abnormality determination device 1000A and an external device such as a display unit or an audio output device.

An example of the processing of the abnormality warning device 100A and the abnormality determination device 1000A according to the first embodiment will be described.
FIG. 2 is a flowchart showing an example of the processing of the abnormality warning device 100A and the abnormality determination device 1000A according to the first embodiment of the present disclosure.
The abnormality determination device 1000A starts the process shown in FIG. 2 when an image is input from a camera, for example.

Abnormality determination device 1000A executes image acquisition processing (step ST100).
In the image acquisition process, the image acquisition unit 1010 of the abnormality determination device 1000A acquires and outputs an image.

Abnormality determination device 1000A executes a state storage and state comparison process (step ST200).
In the state storage and state comparison process, the state comparison unit 1030 of the abnormality determination device 1000A stores the previous value, which is the input value of at least one layer, that is, the first layer, among the processing results in all layers of the neural network unit 1020 for at least the first time after the start of processing.
For the first round of processing, the state comparison unit 1030 does not perform the state comparison and simply stores the previous value; from the second round onward, it judges changes in the state of the object to be judged.
The state comparison unit 1030 pre-stores a reference value that is a standard for the input value to the first layer, which is at least one of the multiple layers of the neural network unit 1020, and determines a change in the state of the object to be judged using the difference value between the reference value and the current value, which is the value that is to be newly input to the first layer.

Abnormality determination device 1000A executes a process control process (step ST300).
In the process control process, the process control unit 1040 of the abnormality determination device 1000A issues a command to restrict the current processing operation in the neural network unit 1020 in response to a change in the state of the object to be determined, and also issues a command to output the output data output by the previous processing of the neural network unit 1020.
Specifically, when the state comparison unit 1030 determines that the state of the object to be judged has not changed ("no change" includes cases where the change is small), the processing control unit 1040 instructs the neural network unit 1020 not to execute the processing in the layers from the first layer onward, the first layer being at least one of the multiple layers, and also instructs the output storage unit 1050 to output the stored output data.

Abnormality determination device 1000A executes a state output process (step ST400).
In the state output process, the neural network unit 1020 or the output storage unit 1050 of the abnormality determination device 1000A outputs output data.
When processing has been performed in all layers, the neural network unit 1020 outputs output data to the warning output unit 2000.
When the output storage unit 1050 receives a command from the processing control unit 1040, it outputs the latest stored output data to the alarm output unit 2000.

Abnormality determination device 1000A executes an alarm output process (step ST500).
In the alarm output process, the alarm output unit 2000 of the abnormality determination device 1000A acquires output data and outputs an alarm signal based on the output data to an alarm device (not shown) etc. The alarm output unit 2000 determines whether to output an alarm based on a determination value included in the output data, and when it determines to output an alarm, outputs an alarm signal to an alarm device (not shown) etc.

After executing the process of step ST500, the abnormality determination device 1000A ends the series of processes shown in FIG. 2 and repeats the process from step ST100.
For example, when the camera is turned off, the abnormality determination device 1000A is also turned off in conjunction with the camera.

A detailed example of the processing of the abnormality warning device 100A and the abnormality determination device 1000A according to the first embodiment will be described.
FIG. 3 is a flowchart showing an example of more detailed processing of the abnormality warning device 100A and the abnormality determination device 1000A according to the first embodiment of the present disclosure.
When abnormality determination device 1000A starts the process shown in FIG. 3, first, similarly to step ST100 described above, it executes image acquisition processing (step ST100).

Abnormality determination device 1000A executes a storage process (step ST201).
In the storage process, the state comparison unit 1030 of the abnormality determination device 1000A stores the previous value, which is the input value of at least one layer, the first layer, among the processing results in all layers of the neural network unit 1020 for at least the first time after the start of processing.
Furthermore, the state comparison unit 1030 stores the input values of the first layer each time processing is performed in the first layer.

The state comparison unit 1030 of the abnormality determination device 1000A executes a previous-storage determination process to determine whether a previous value has been stored (step ST202).

The state comparison unit 1030 of the abnormality determination device 1000A executes a process of comparing the current value with the previous value (step ST203). The state comparison unit 1030 calculates the difference (difference value) between the current value, which is the value about to be newly input to the first layer, and the previous value (reference value).

State comparison section 1030 of abnormality determination device 1000A determines a change in the state of the object to be determined using the difference value (step ST204).
The state comparison unit 1030 uses the difference value and a pre-stored threshold value to determine the magnitude of the difference between the reference value and the current value, and determines a change in the state of the object to be determined based on the result.
Specifically, for example, in units of images, if the sum of squares of the difference values is smaller than a pre-stored threshold value, the state comparison unit 1030 determines that the state of the determination target is unchanged.
More specifically, the state comparison unit 1030 determines that the state of the determination target is unchanged when the sum of the absolute values of the difference values is smaller than a pre-stored threshold value, for example, for each image.

If the state comparison unit 1030 of the abnormality determination device 1000A determines that the difference is equal to or greater than the threshold value, and therefore that the state of the object to be judged has changed (step ST204 "NO"), the processing control unit 1040 of the abnormality determination device 1000A executes normal processing command processing (step ST301). The processing control unit 1040 outputs a command to execute the processing in the first layer of the neural network unit 1020.

The neural network unit 1020 of the abnormality determination device 1000A executes output processing (step ST401). The neural network unit 1020 outputs the output data to the alarm output unit 2000.

Output storage unit 1050 of abnormality determination device 1000A executes an output storage process (step ST402).
The output storage unit 1050 stores the output data output by the neural network unit 1020.

If the state comparison unit 1030 of the abnormality determination device 1000A determines that the difference is less than the threshold value and that the state to be determined has not changed (step ST204 ``YES''), the processing control unit 1040 of the abnormality determination device 1000A executes an omission processing command processing (step ST302).
In the omission processing command processing, the processing control unit 1040 instructs the neural network unit 1020 not to execute processing in the first layer or subsequent layers, which is at least one layer among the multiple layers, and also instructs the output storage unit 1050 to output the output data stored therein.

The output storage unit 1050 of the abnormality determination device 1000A executes stored-data output processing (step ST403). Upon receiving the command from the processing control unit 1040, the output storage unit 1050 outputs the stored output data to the alarm output unit 2000.

The alarm output unit 2000 executes an alarm output process (step ST500).
The alarm output unit 2000 determines whether to output an alarm based on the judgment value contained in the output data, similar to the processing of step ST500 already described, and if it determines to output an alarm, outputs an alarm signal to an alarm device or the like (not shown).

When abnormality determination device 1000A executes the process of step ST500, it ends the series of processes shown in FIG. 3 and repeats the process from step ST100.
For example, when the camera is turned off, the abnormality determination device 1000A is also turned off in conjunction with the camera.
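The overall flow of FIG. 3 (compare the input of the monitored first layer with the previous value; when unchanged, skip the remaining layers and reuse the stored output; otherwise run the remaining layers and refresh the stored output) can be sketched as follows. The layers are modeled as plain callables, and all names, the dictionary-based state, and the squared-difference criterion are assumptions made for illustration.

```python
import numpy as np

def judge(frame, layers, store, state):
    """One iteration of the flow of FIG. 3 (sketch).

    layers[0] produces the input to the monitored first layer; the remaining
    callables are the first layer and subsequent layers. state holds the
    previous value and the threshold; store holds the latest output data.
    """
    x = layers[0](frame)                      # ST100/ST201: acquire and store
    prev = state.get("prev")
    state["prev"] = x
    if prev is not None and np.sum((x - prev) ** 2) < state["thr"]:
        return store["out"]                   # ST302/ST403: skip, reuse output
    for layer in layers[1:]:                  # ST301/ST401: normal processing
        x = layer(x)
    store["out"] = x                          # ST402: store latest output
    return x
```

On an unchanged frame the remaining layers never run, which is exactly the saving the processing control unit is meant to provide.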

With the configuration and processing described above, the abnormality determination device can omit the processing of layers in the neural network depending on the conditions.

The abnormality determination device of the present disclosure has the following configuration.

"An image acquisition unit that acquires and outputs an image;
a neural network unit having a plurality of layers in a neural network, acquiring an image output by the image acquisition unit, and outputting a judgment result indicating a state of a judgment target included in the image;
an output storage unit that stores output data (judgment value (probability value)) output by the neural network unit;
a state comparison unit that prestores a reference value that is a reference for an input value to a first layer that is at least one of the multiple layers of the neural network unit, and judges a change in the state of the object to be judged using a difference value between the reference value and a current value that is a value that is to be newly input to the first layer;
a processing control unit that, when the state comparison unit determines that the state of the object to be judged has not changed, instructs the neural network unit not to execute processing in the first layer and subsequent layers, and instructs the output storage unit to output the stored output data;
An abnormality determination device equipped with the above."

As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network.

The abnormality determination method of the present disclosure has the following configuration.

"An image acquisition unit outputs the acquired image to a neural network unit,
the neural network unit having a plurality of layers in a neural network acquires the image output by the image acquisition unit, and outputs a judgment result indicating a state of a judgment target included in the image;
an output storage unit stores the output data (judgment value (probability value)) output by the neural network unit;
a state comparison unit pre-stores a reference value that is a reference for an input value to a first layer that is at least one of the multiple layers of the neural network unit, and judges a change in the state of the object to be judged using a difference value between the reference value and a current value that is a value that is to be newly input to the first layer;
When the state comparison unit determines that the state of the object to be determined has not changed, the processing control unit instructs the neural network unit not to execute processing in the first layer and subsequent layers, and instructs the neural network unit to output the output data stored in the output storage unit.
Method for determining abnormal conditions.

As a result, the present disclosure has the effect of providing an anomaly determination method that makes it possible to quickly output highly accurate determination results using a neural network.
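As a non-limiting illustration, the early-exit flow recited above can be sketched in Python. The class name `EarlyExitJudge`, the split of the network into `upper` (layers before the first layer) and `lower` (the first layer and subsequent layers), and the sum-of-squares change criterion with its threshold are assumptions introduced only for this sketch, not limitations of the disclosure:

```python
import numpy as np

class EarlyExitJudge:
    """Sketch of the claimed early-exit judgment flow (all names hypothetical)."""

    def __init__(self, upper, lower, threshold):
        self.upper = upper          # layers before the first layer
        self.lower = lower          # the first layer and subsequent layers
        self.threshold = threshold  # change-detection threshold (assumed)
        self.reference = None       # reference value: previous input to the first layer
        self.stored_output = None   # output data stored from the previous full pass

    def judge(self, image):
        current = self.upper(image)  # current value about to enter the first layer
        if self.reference is not None and self.stored_output is not None:
            # difference value between reference value and current value
            diff = np.sum((current - self.reference) ** 2)
            if diff < self.threshold:
                # state unchanged: do not execute the first layer and
                # subsequent layers; output the stored output data instead
                return self.stored_output
        self.reference = current.copy()
        self.stored_output = self.lower(current)  # run the remaining layers
        return self.stored_output
```

In this sketch the lower layers run only when the input to the first layer has effectively changed; otherwise the previously stored judgment value is returned immediately, which is the source of the speed-up described.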

The abnormality determination device of the present disclosure is further configured as follows.

"The anomaly determination device, wherein the reference value is a previous value stored for each layer as a result of the previous processing in the neural network."

As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network.
Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.

The abnormality determination device of the present disclosure is further configured as follows.

"The reference value is a typical output value of a pre-modeled intermediate layer in the neural network.
An abnormality determination device characterized by the above.

As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network.
Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.

The abnormality determination device of the present disclosure is further configured as follows.

"The reference value is an output value of a predetermined node among the nodes included in the neural network.
An abnormality determination device characterized by the above.

As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network.
Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the above abnormality determination method.

The abnormality determination device of the present disclosure is further configured as follows.

"The state comparison unit determines a change in the state of the object to be determined based on the result of judging, using the difference value and a pre-stored threshold value, the magnitude of the difference between the reference value and the current value.
An abnormality determination device characterized by the above.

As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network.
Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.

The abnormality determination device of the present disclosure is further configured as follows.

"The state comparison unit determines that the state of the object to be determined is unchanged when, for each image, the sum of the squares of the difference values is smaller than a pre-stored threshold value.
An abnormality determination device characterized by the above.

As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network.
Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.

The abnormality determination device of the present disclosure is further configured as follows.

"The state comparison unit determines that the state of the object to be determined is unchanged when, for each image, the sum of the absolute values of the difference values is smaller than a pre-stored threshold value.
An abnormality determination device characterized by the above.

As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network.
Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.

The abnormality determination device of the present disclosure is further configured as follows.

"The device further includes an alarm output unit that acquires the output data, which is a judgment value indicating a drowsy state or a dozing state of the person to be judged, and outputs an alarm to the person to be judged according to the judgment value.
An abnormality determination device characterized by the above.

As a result, the present disclosure has the effect of providing an abnormality determination device in an abnormality determination technique that can quickly output highly accurate determination results using a neural network.
Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.

Embodiment 2.
In the second embodiment, a form in which the basic mechanism of the present disclosure is applied to layers in the neural networks of a feature extraction unit and an abnormal state determination unit will be described.

An example of the configuration of an abnormality warning device 100B and an abnormality determination device 1000B according to the second embodiment will be described.
FIG. 4 is a diagram showing an example of the configuration of an abnormality warning device 100B and an abnormality determination device 1000B according to the second embodiment of the present disclosure.
The abnormality warning device 100B includes an abnormality determination device 1000B and a warning output unit 2000.

The abnormality determination device 1000B obtains an image, and uses the image to output the state of the determination target captured in the image.
The abnormality determination device 1000B shown in FIG. 4 includes an image acquisition unit 1100B, a feature extraction unit 1300B, and an abnormal state determination unit 1500B.
Here, in abnormality determination device 1000B, the neural network unit already described is configured to include, as will be described later, a first neural network unit 1320 and a second neural network unit 1520. First neural network unit 1320 is included in feature extraction unit 1300B, and second neural network unit 1520 is included in abnormal state determination unit 1500B.
In addition, in the abnormality determination device 1000B, the output storage unit already described is configured to include a first output storage unit 1390 and a second output storage unit. The first output storage unit 1390 is included in the feature extraction unit 1300B, and the second output storage unit is included in the abnormal state determination unit 1500B.
In addition, in abnormality determination device 1000B, the state comparison unit already described is configured to include a first state comparison unit 1340 and a second state comparison unit 1540. First state comparison unit 1340 is included in feature extraction unit 1300B, and second state comparison unit 1540 is included in abnormal state determination unit 1500B.
In addition, in the abnormality determination device 1000B, the process control unit already described is configured to include a first process control unit 1380 and a second process control unit 1580. The first process control unit 1380 is included in the feature extraction unit 1300B, and the second process control unit 1580 is included in the abnormal state determination unit 1500B.

Since the image acquisition unit 1100B is similar to the image acquisition unit 1100A already described, a detailed description of the image acquisition unit 1100B is omitted here.

The feature extraction unit 1300B uses the image to output a feature map that represents the characteristic state of the subject contained in the image. For example, the feature extraction unit 1300B extracts parts of the image that show signs of drowsiness, such as the eyelids and eyeballs, and generates a feature map that represents the state of the subject that is characteristic of an abnormal state such as drowsiness.

An example of the internal configuration of the feature extraction unit 1300B will be described.
FIG. 5 is a diagram showing an example of the internal configuration of the abnormality warning device 100B and the feature extraction unit 1300B in the abnormality determination device 1000B.
The feature extraction unit 1300B shown in FIG. 5 is configured to include a neural network unit (first neural network unit 1320), a state comparison unit (first state comparison unit 1340B), a processing control unit (first processing control unit 1380B), and an output storage unit (first output storage unit 1390).

The first neural network unit 1320 has multiple layers that are part of the layers that make up the neural network, acquires the image output by the image acquisition unit 1100, and outputs a feature map that represents the characteristic state of the object to be judged contained in the image.
The first neural network unit 1320 shown in FIG. 5 includes an image branching unit 1321, a convolution layer unit 1322, a pooling layer unit 1325, and an image combination unit 1328.

The image branching unit 1321 branches an image and inputs it to multiple nodes in a layer of the neural network. The image is branched according to the number of subsequent convolution layers and pooling layers; in FIG. 5, it is branched into two.

The convolution layer unit 1322 performs filtering to extract characteristic parts of the object to be determined in the image.
The convolution layer unit 1322 shown in FIG. 5 includes a first convolution layer 1323 and a second convolution layer 1324.
The first convolutional layer 1323 and the second convolutional layer 1324 perform filtering to extract face (body) parts for detecting a drowsy state, for example, by convolution processing (cross-correlation processing) using pre-prepared convolution filters of size 3x3 or 5x5.

The pooling layer unit 1325 shown in FIG. 5 includes a first pooling layer 1326 and a second pooling layer 1327 .
The first pooling layer 1326 and the second pooling layer 1327 generate an image relating to features that are robust against image position, for example, by calculating the maximum or average value for each predetermined region.
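As an illustrative aid, the convolution (cross-correlation) and max-pooling operations described for these layers can be sketched as follows; the filter contents, filter sizes, and pooling window are assumptions for the sketch, not limitations of the embodiment:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid cross-correlation with a small filter (as in the convolution layers)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # slide the filter over the image and take the weighted sum
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool2d(image, size=2):
    """Maximum over each size x size region (as in the pooling layers)."""
    h, w = image.shape
    out = np.empty((h // size, w // size))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = image[i * size:(i + 1) * size, j * size:(j + 1) * size].max()
    return out
```

Taking the maximum (or, alternatively, the average) per region is what makes the resulting feature values robust against small shifts in image position, as stated above.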

The first neural network unit 1320 shown in FIG. 5 includes two pairs of convolutional layers and pooling layers, but it is also effective to include one pair or three or more pairs.
In addition, the first pooling layer 1326 and the second pooling layer 1327 may each be followed by two or more further convolution layers and pooling layers.
In addition, a normalized linear unit layer (activation function) or the like (not shown) may be included after the convolution layer.

The image combination unit 1328 combines multiple images output through the convolution layer and the pooling layer.
The image combining unit 1328 extracts areas that indicate signs of drowsiness, such as the eyelids and eyeballs, and generates a feature map.

The first state comparison unit 1340B prestores, for each of the multiple layers in the first neural network unit 1320, a reference value that is the standard for the input value to that layer, and judges a change in the state of the object to be judged using the difference value between the reference value and the current value, which is the value that is about to be newly input to that layer.

The reference value is a previous value, which is an input value for each layer, stored as a result of the previous processing by the multiple layers of the first neural network unit 1320, and is used to obtain a difference value between the previous value and a current value, which is an input value that is about to be newly input to the layer. In the explanation of the processing described later, a case in which the previous value is used as the reference value will be explained.
In this case, the first state comparison unit 1340 does not perform state comparison processing for the first processing (processing for the first image) and simply stores the value, but in the second and subsequent processing (processing for the second image), it determines changes in the state of the object to be judged.
However, the following values may be stored and used as the reference values.

The reference value can use typical processing results that have been learned and modeled in advance.
In other words, the reference value may be a typical output value of a pre-modeled intermediate layer in a neural network.
By using typical processing results, comparisons can be made without being influenced by noise or anomalies contained in the actual observed images.

In addition, the reference value may use a part of the results determined based on advance information.
The partial results determined based on the prior information are assumed to be, for example, pixels that have a high probability of indicating the presence of a head.
Also, for example, pixels that are known to have a large change but low importance, such as the background, may not be used.
In other words, the reference value may be the output value of a predetermined node among the nodes included in the neural network.
This allows for reduced memory usage and data handling, resulting in faster processing speeds.
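A minimal sketch of restricting the comparison to such predetermined nodes follows; the index set `node_indices` (e.g., pixels with a high head-presence probability) and the absolute-sum criterion are hypothetical assumptions for illustration:

```python
import numpy as np

def changed_on_selected_nodes(reference, current, node_indices, threshold):
    """Compare only predetermined node outputs; ignore unimportant ones
    (e.g. background pixels known to vary but carry little information)."""
    ref = np.asarray(reference).ravel()[node_indices]
    cur = np.asarray(current).ravel()[node_indices]
    # True means "state changed"; below threshold means "no change"
    return np.sum(np.abs(cur - ref)) >= threshold
```

Because only the selected node outputs are stored and compared, both the memory footprint and the per-frame comparison cost shrink, which is the speed-up noted above.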

The first state comparison section 1340 uses the difference value and a pre-stored threshold value to determine the magnitude of the difference between the reference value and the current value, and determines a change in the state of the object to be determined based on the result.
Specifically, the first state comparing section 1340 determines that the state of the determination target is unchanged when the sum of squares of the difference values is smaller than a pre-stored threshold value for each image.
More specifically, the first state comparison unit 1340 determines that the state of the determination target is unchanged when, for each image, the sum of the absolute values of the difference values is smaller than a pre-stored threshold value.

The first state comparison unit 1340B shown in FIG. 5 includes an image state comparison unit 1341.
In the image state comparison unit 1341, "storing" means holding the values of the two-dimensional images (feature maps) of the respective input sources.
In addition, in the image state comparison unit 1341, comparison means, for example, taking the absolute value of the difference for each element (pixel) between a stored two-dimensional image (from the previous time/previous frame) and the latest two-dimensional image, and then comparing the sum or average with a predetermined threshold value.

The image state comparison unit 1341 shown in FIG. 5 includes a storage unit 1341a and a comparison processing unit 1341b.

When the first state comparison unit 1340 determines that the state of the object to be determined has not changed, the first processing control unit 1380B instructs the first neural network unit 1320 not to execute processing in the second layer and subsequent layers for that object, and instructs that the feature map stored in the first output storage unit 1390 be output.
The first processing control unit 1380B shown in FIG. 5 includes a feature extraction processing control unit 1381B.
The feature extraction process control unit 1381B executes the function of the first process control unit 1380B in the feature extraction unit 1300.

The first output storage section 1390 stores the output data (decision value (probability value)) output by the first neural network section 1320 .
The first output storage unit 1390 stores the feature map output by the first neural network unit 1320 .
The first output storage unit 1390 shown in FIG. 5 includes a combined image storage unit 1391.
The combined image storage unit 1391 stores a feature map, which is a combined image combined and output by the first neural network unit 1320 .

An example of the internal configuration of the abnormal state determination unit 1500B will be described.
FIG. 6 is a diagram showing an example of the internal configuration of the abnormality warning device 100B and the abnormality state determination unit 1500B in the abnormality determination device 1000B.
The abnormal condition determination unit 1500B uses a feature map, which is a two-dimensional image, to output a determination value indicating the condition of the object to be determined.
The abnormal state determination unit 1500B shown in FIG. 6 is configured to include a neural network unit (second neural network unit 1520), a state comparison unit (second state comparison unit 1540), and an output storage unit (second output storage unit) 1590.

The neural network unit (second neural network unit 1520) has multiple layers in a neural network, acquires the feature map output by the feature extraction unit 1300, and uses the feature map to output a judgment value indicating the state of the object to be judged as output data.
The neural network unit (second neural network unit 1520 ) shown in FIG. 6 includes a state classification unit 1525 and a probability output layer 1529 .

The state classification unit 1525 has a function of, for example, converting a two-dimensional image (feature map) into a one-dimensional vector, and further consolidating the output into an indication of the drowsy state (for example, four outputs: eyelids: dozing/not dozing, eyeballs: dozing/not dozing).
The state classification unit 1525 shown in FIG. 6 includes a first fully connected layer 1527 and a second fully connected layer 1528.
The state classification unit 1525 generates a one-dimensional vector whose number of elements is the desired number of outputs through a first fully connected layer 1527 and a second fully connected layer 1528 .
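As a non-limiting sketch of one such fully connected layer, a 2-D feature map is flattened into a 1-D vector and passed through an affine transform; the weight matrix and bias values are illustrative assumptions:

```python
import numpy as np

def fully_connected(feature_map, weights, bias):
    """One fully connected layer: flatten the 2-D feature map into a
    1-D vector, then apply an affine transform to the desired output size."""
    return weights @ np.asarray(feature_map).ravel() + bias
```

Stacking two such layers, with the output size of the last one equal to the number of state classes (e.g., four outputs for eyelid/eyeball dozing states), yields the 1-D vector described above.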

The probability output layer 1529 applies, for example, a softmax function so that the output values sum to 1.0, giving the output results a probabilistic meaning.
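A minimal sketch of this probability output layer, using the standard numerically stabilized form of softmax (the stabilization by subtracting the maximum is an implementation choice, not recited in the text):

```python
import numpy as np

def softmax(v):
    """Probability output layer: exponentiate and normalize so the
    outputs sum to 1.0 (max subtracted for numerical stability)."""
    e = np.exp(v - np.max(v))
    return e / e.sum()
```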

The second neural network unit 1520 shown in FIG. 6 has two fully connected layers, but a configuration with no fully connected layer, or with three or more, may also be used.

The second state comparison unit 1540B prestores a reference value that is the standard for the input value to a third layer, which is at least one of the layers in the second neural network unit 1520, and judges a change in the state of the object to be judged using the difference value between the reference value and the current value, which is the value that is about to be newly input to the third layer.

The reference value is a previous value, which is the input value to each layer, stored as a result of the previous processing by the multiple layers of the second neural network unit 1520, and is used to obtain a difference value from the current value, which is the input value that is about to be newly input to the third layer. In the explanation of the processing described later, the case in which the previous value is used as the reference value is explained.
In this case, the second state comparison unit 1540B does not perform state comparison processing for the first processing (processing for the first image) and simply stores the values, but determines changes in the state of the object to be judged for the second and subsequent processing (processing for the second image).
However, the following values may be stored and used as the reference values.

The reference value can use typical processing results that have been learned and modeled in advance.
In other words, the reference value may be a typical output value of a pre-modeled intermediate layer in a neural network.
By using typical processing results, comparisons can be made without being influenced by noise or anomalies contained in the actual observed images.

In addition, the reference value may use a part of the results determined based on advance information.
The partial results determined based on the prior information are assumed to be, for example, pixels that have a high probability of indicating the presence of a head.
Also, for example, pixels that are known to have a large change but low importance, such as the background, may not be used.
In other words, the reference value may be the output value of a predetermined node among the nodes included in the neural network.
This allows for reduced memory usage and data handling, resulting in faster processing speeds.

The second state comparison section 1540B uses the difference value and a pre-stored threshold value to determine the magnitude of the difference between the reference value and the current value, and determines a change in the state of the object to be determined based on the result.
Specifically, when the sum of squares of the difference values is smaller than a pre-stored threshold value for each image, second state comparison section 1540B determines that the state of the determination target is unchanged.
More specifically, the second state comparison unit 1540B determines that the state of the determination target is unchanged when, for each image, the sum of the absolute values of the difference values is smaller than a pre-stored threshold value.

The second state comparison unit 1540B shown in FIG. 6 includes a first fully connected layer state comparison unit 1541.
In the first fully connected layer state comparison unit 1541, "storing" means holding the values of the one-dimensional vectors of the respective input sources.
In addition, in the first fully connected layer state comparison unit 1541, comparison means, for example, taking the absolute value of the difference for each element between the stored one-dimensional vector (of the previous time/previous frame) and the latest one-dimensional vector, and then comparing the sum or average with a predetermined threshold value.

The first fully connected layer state comparison unit 1541 shown in FIG. 6 includes a storage unit 1541a and a comparison processing unit 1541b.
The storage unit 1541 a stores a reference value, which is an output value of the first fully connected layer 1527 and an input value of the second fully connected layer 1528 , for each output by the first fully connected layer 1527 .
The comparison processing unit 1541b compares the current value with a reference value.

When the second state comparison unit 1540 determines that the state of the determination target has not changed, the second processing control unit 1580B instructs the second neural network unit 1520 not to execute processing in that layer and subsequent layers for the determination target, and also instructs the second output storage unit to output the determination value stored therein as output data.
The second process control unit 1580B shown in FIG. 6 includes an abnormal state determination process control unit 1581B.
The abnormal state determination process control unit 1581B functions to limit the processing of the second neural network unit 1520 in the abnormal state determination unit 1500B.

The second output storage unit 1590 stores the decision value output by the second neural network unit 1520 .
The output storage unit (second output storage unit) 1590 shown in FIG. 6 includes a probability storage unit 1591.

The probability storage unit stores the judgment value output by the second neural network unit 1520 .
The judgment value is a value indicating the state of the person to be judged, and is, for example, a probability value output by the probability output layer 1529 so that the output result has a probabilistic meaning.
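The specification describes the probability output layer 1529 as giving the output a probabilistic meaning but does not fix the function used. A common choice for this role is softmax over the raw scores from the last fully connected layer; the sketch below assumes softmax, and the class ordering in the comment is purely illustrative.

```python
# Illustrative sketch (an assumption, not the patented implementation): raw
# scores from the last fully connected layer are converted with softmax so
# that the judgment values can be read as probabilities summing to 1.
import math

def softmax(scores):
    m = max(scores)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# e.g. hypothetical scores for (normal, drowsy, dozing)
probs = softmax([2.0, 1.0, 0.1])
print(probs)  # three non-negative values summing to 1.0
```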

Returning to the explanation of FIG. 4.
The warning output unit 2000 acquires output data that is a judgment value indicating the drowsy state or dozing state of the person to be judged, and outputs a warning to the person to be judged in accordance with the judgment value.

 The abnormality determination device 1000B shown in FIG. 4 is shown as being configured not to include the alarm output unit 2000, but may be configured to include the alarm output unit 2000. When configured in this manner, the abnormality determination device 1000B is equivalent to the abnormality warning device 100B shown in FIG. 4. In the following explanation, the abnormality determination device 1000B will be described as being configured to include the alarm output unit 2000, except in cases where it is necessary to distinguish between the abnormality warning device 100B and the abnormality determination device 1000B.

In addition to the above configuration, abnormality determination device 1000B may be configured to include a control unit (not shown), a storage unit (not shown), and a communication unit (not shown).
A control unit (not shown) controls the entire abnormality determination device 1000B and each of its components. The control unit (not shown) starts up the abnormality determination device 1000B in accordance with, for example, an external command. The control unit (not shown) also controls the state (operation state = start-up, shutdown, sleep, etc.) of the abnormality determination device 1000B.
A storage unit (not shown) stores each piece of data used in the abnormality determination device 1000B. The storage unit (not shown) stores, for example, outputs (output data) from each component in the abnormality determination device 1000B, and outputs data requested by each component to the component that has made the request.
The communication unit (not shown) communicates with an external device. For example, the communication unit communicates between the abnormality determination device 1000B and an imaging device such as an in-vehicle camera. In addition, for example, if the abnormality determination device 1000B does not have a display unit or an audio output unit, the communication unit communicates between the abnormality determination device 1000B and an external device such as a display unit or an audio output device.

An example of the processing of the abnormality warning device 100B and the abnormality determination device 1000B according to the second embodiment will be described.
FIG. 7 is a flowchart showing an example of processing by the abnormality warning device 100B and the abnormality determination device 1000B according to the second embodiment of the present disclosure.
When an image is input from a camera, for example, the abnormality determination device 1000B starts the process shown in FIG. 7.

Abnormality determination device 1000B executes image acquisition processing (step ST2100).
In the image acquisition process, the image acquisition unit 1100 of the abnormality determination device 1000B acquires and outputs an image.

Abnormality determination device 1000B executes a storage and state comparison process (step ST2200).
In the state storage and state comparison process, the first state comparison unit 1340B of the abnormality determination device 1000B stores the previous value (reference value), which is the input value to the layer, for each layer of the first neural network unit 1320 for at least the first time after the process starts.
The first state comparison unit 1340 does not perform state comparison processing for the first processing (processing for the first image) and simply stores the value (previous value = reference value), but for the second and subsequent processing (processing for the second image), it determines changes in the state of the object to be judged.
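The behavior described above — store only on the first image, compare from the second image onward, updating the stored value each time — can be sketched as the following stateful comparator. The class name, return labels, and threshold are illustrative assumptions, not part of the specification.

```python
# Minimal, hypothetical comparator mirroring the described behavior: on the
# first frame it only stores the layer input (previous value = reference
# value); from the second frame on it compares the new input against the
# stored one and refreshes the stored value for the next frame.
class StateComparator:
    def __init__(self, threshold):
        self.threshold = threshold
        self.reference = None  # nothing stored yet (before the 1st image)

    def check(self, value):
        """Return 'store' on first use, else 'unchanged' or 'changed'."""
        if self.reference is None:
            self.reference = list(value)   # 1st image: store only
            return "store"
        diff = sum(abs(a - b) for a, b in zip(self.reference, value))
        self.reference = list(value)       # store for the next comparison
        return "unchanged" if diff < self.threshold else "changed"

cmp_unit = StateComparator(threshold=0.5)
print(cmp_unit.check([1.0, 2.0]))   # "store"      (1st image)
print(cmp_unit.check([1.0, 2.1]))   # "unchanged"  (small difference)
print(cmp_unit.check([3.0, 2.1]))   # "changed"    (large difference)
```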

Abnormality determination device 1000B executes a feature extraction process control process (step ST2300).
The feature extraction process control unit 1381 of the abnormality determination device 1000B executes a normal process command or an omission process command to the first neural network unit 1320 depending on the determination result by the first state comparison unit 1340.
Furthermore, when issuing an omission processing command, the feature extraction processing control unit 1381 issues an output command to the combined image storage unit 1391 .

Abnormality determination device 1000B executes combined image output processing (step ST2400).
The first neural network unit 1320 executes normal processing and outputs a combined image, or the combined image storage unit 1391 outputs the previous combined image, whereby the feature extraction unit 1300 outputs a combined image.

Abnormality determination device 1000B executes a process of storing the fully connected layer states and comparing the fully connected layer states (step ST2500).
The second state comparison unit 1540 in the abnormality determination device 1000B pre-stores, for each of the multiple layers of the second neural network unit 1520, a reference value that is a standard for the input value to that layer, and determines a change in the state of the object to be determined using the difference value between the reference value and the current value, which is the value that is to be newly input to that layer.
The second state comparison unit 1540 does not perform state comparison processing for the first processing (processing for the first image) and simply stores the values, but determines changes in the state of the object to be judged for the second and subsequent processing (processing for the second image).

Abnormality determination device 1000B executes an abnormal state determination process control process (step ST2600).
The abnormal state determination process control unit 1581B issues a normal command or an omission command to the second neural network unit 1520. When issuing an omission command, the abnormal state determination process control unit 1581B issues an output command to the probability storage unit 1591.

Abnormality determination device 1000B executes a result output process (step ST2700).
The second neural network unit 1520 executes normal processing and outputs a judgment value, or the probability storage unit 1591 outputs a previous judgment value (probability value), so that the abnormal state judgment unit 1500 outputs the judgment value as output data.

Abnormality determination device 1000B executes an alarm output process (step ST2800).
In the alarm output process, the alarm output unit 2000 of the abnormality determination device 1000B acquires output data and outputs an alarm signal based on the output data to an alarm device (not shown) etc. The alarm output unit 2000 determines whether to output an alarm based on a determination value included in the output data, and when it determines to output an alarm, outputs an alarm signal to an alarm device (not shown) etc.

When abnormality determination device 1000B executes the process of step ST2800, it ends the series of processes shown in FIG. 7 and repeats the process from step ST2100.
In addition, the abnormality determination device 1000B is also turned off in conjunction with, for example, the camera being turned off.

Here, a detailed example of the processing of the abnormality warning device 100B and the abnormality determination device 1000B according to the second embodiment will be described.
FIG. 8 is a flowchart showing an example of more detailed processing of the abnormality warning device 100B and the abnormality determination device 1000B according to the second embodiment of the present disclosure.
The flowchart in FIG. 8 shows an example of processing equivalent to the processing from step ST2200 to step ST2700 in the flowchart in FIG. 7.
When the feature extraction unit 1300B in the abnormality determination device 1000B starts the process of step ST2200, the feature extraction unit 1300B first executes a storage process (step ST2201). In the storage process, the state comparison unit 1340 of the feature extraction unit 1300B stores a previous value that is an input value of the first layer, which is at least one layer among all layers of the neural network unit 1320 (first neural network unit 1320).
Furthermore, the state comparison unit 1340 stores the input values of the first layer each time processing is performed in the first layer.

 Next, the feature extraction unit 1300B in the abnormality determination device 1000B executes a previous storage determination process (step ST2202). In the previous storage determination process, the state comparison unit 1340 of the feature extraction unit 1300B determines whether the previous input value (previous value = reference value) is stored.

 The feature extraction unit 1300B in the abnormality determination device 1000B executes a comparison process (step ST2203). In the comparison process, the state comparison unit 1340 of the feature extraction unit 1300B calculates the difference (difference value) between the current value, which is the value that is about to be newly input to the first layer, and the previous value (reference value).

Feature extraction unit 1300B in abnormality determination device 1000B executes a difference determination process (step ST2204). In the difference determination process, state comparison unit 1340 of feature extraction unit 1300B uses the difference value to determine a change in the state of the determination target.
Specifically, for example, in units of images, if the sum of squares of the difference values is smaller than a pre-stored threshold value, the state comparison section 1340 determines that the state of the determination target is unchanged.
Alternatively, the state comparison unit 1340 determines that the state of the determination target is unchanged when the sum of the absolute values of the difference values is smaller than a pre-stored threshold value, for example, in units of images.
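The two per-image criteria described above can be illustrated as follows: one judges "unchanged" by the sum of squared differences, the other by the sum of absolute differences, each against its own pre-stored threshold. The function names, toy pixel data, and thresholds are assumptions for illustration only.

```python
# Hypothetical per-image change criteria: the state is judged "unchanged"
# when the sum of squared differences (or, alternatively, the sum of
# absolute differences) between consecutive frames is below a threshold.
def unchanged_by_sse(prev_img, cur_img, threshold):
    """Sum-of-squares criterion over flattened pixel values."""
    return sum((a - b) ** 2 for a, b in zip(prev_img, cur_img)) < threshold

def unchanged_by_sad(prev_img, cur_img, threshold):
    """Sum-of-absolute-differences criterion over flattened pixel values."""
    return sum(abs(a - b) for a, b in zip(prev_img, cur_img)) < threshold

# flattened pixel values of two consecutive frames (toy data)
frame1 = [10, 20, 30, 40]
frame2 = [10, 21, 30, 39]
print(unchanged_by_sse(frame1, frame2, threshold=5))  # 1 + 1 = 2 < 5 -> True
print(unchanged_by_sad(frame1, frame2, threshold=1))  # 1 + 1 = 2 >= 1 -> False
```

Note that the two criteria weight large pixel changes differently, so their thresholds would be tuned independently.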

 If the state comparison unit 1340 of the feature extraction unit 1300B determines that the data was not previously stored (step ST2202 "NO"), or if the state comparison unit 1340 determines that the difference is equal to or greater than the threshold value and that the state of the object to be determined has changed (step ST2204 "NO"), the feature extraction unit 1300B in the abnormality determination device 1000B executes a normal processing command (step ST2301). In the normal processing command, the processing control unit 1380 of the feature extraction unit 1300B outputs a command to execute processing in the first layer of the neural network unit 1320 (first neural network unit 1320).

 Next, the feature extraction unit 1300B in the abnormality determination device 1000B executes output processing (step ST2401). In the output processing, the first neural network unit 1320 of the feature extraction unit 1300B executes processing according to the normal processing command, and outputs output data indicating the determination result.

 Next, the feature extraction unit 1300B in the abnormality determination device 1000B executes an output storage process (step ST2402). In the output storage process, the output storage unit 1390 (first output storage unit 1390) of the feature extraction unit 1300B stores the output data output by the neural network unit 1320.

 If the state comparison unit 1340 determines that the difference is less than the threshold value and that the state of the object to be determined has not changed (step ST2204 "YES"), the feature extraction unit 1300B in the abnormality determination device 1000B executes an omission processing command (step ST2302). In the omission processing command, the processing control unit 1380 of the feature extraction unit 1300B instructs the neural network unit 1320 not to execute processing in the first layer, which is at least one of the multiple layers, and subsequent layers, and instructs the output storage unit 1390 to output the output data stored therein.

 The feature extraction unit 1300B in the abnormality determination device 1000B executes a stored data output process (step ST2403). In the stored data output process, when the output storage unit 1390 (the combined image storage unit 1391 of the output storage unit 1390) of the feature extraction unit 1300B receives a command from the processing control unit 1380, it outputs the stored output data, that is, the combined image, to the alarm output unit 2000.

 The abnormal state determination unit 1500B in the abnormality determination device 1000B executes a fully connected layer output storage process (step ST2501). In the fully connected layer output storage process, the state comparison unit 1540 of the abnormal state determination unit 1500B stores the value output from the fully connected layer (the first fully connected layer 1527 or the second fully connected layer 1528) in the state classification unit 1525 as the state of the fully connected layer.

 The abnormal state determination unit 1500B in the abnormality determination device 1000B executes a previous storage determination process (step ST2502). In the previous storage determination process, the state comparison unit 1540 of the abnormal state determination unit 1500B determines whether the previous input value (previous value = reference value) is stored.

 If the abnormal state determination unit 1500B in the abnormality determination device 1000B determines that a previous value has been stored (step ST2502 "YES"), it executes a comparison process (step ST2503). In the comparison process, the state comparison unit 1540 of the abnormal state determination unit 1500B calculates the difference (difference value) between the current value, which is the value that is about to be newly input to the first layer (the second fully connected layer 1528 or the probability output layer 1529), and the previous value (reference value).

 The abnormal state determination unit 1500B in the abnormality determination device 1000B executes a difference determination process (step ST2504). In the difference determination process, the state comparison unit 1540 of the abnormal state determination unit 1500B determines the magnitude of the difference between the reference value and the current value using the difference value between the current value and the previous value (reference value) and a pre-stored threshold value.

 If the state comparison unit 1540 of the abnormal state determination unit 1500B determines that the difference is equal to or greater than the threshold value (step ST2504 "NO"), the abnormal state determination unit 1500B in the abnormality determination device 1000B executes a normal processing command (step ST2601). In the normal processing command, the processing control unit 1580 of the abnormal state determination unit 1500B issues a normal processing command to the neural network unit 1520 (second neural network unit 1520).

 The abnormal state determination unit 1500B in the abnormality determination device 1000B executes an output process (step ST2701). In the output process, the neural network unit 1520 (second neural network unit 1520) of the abnormal state determination unit 1500B outputs the determination value as output data.

 The abnormal state determination unit 1500B in the abnormality determination device 1000B executes an output storage process (step ST2702). In the output storage process, the output storage unit 1590 (second output storage unit 1590) in the abnormal state determination unit 1500B stores the determination value output by the neural network unit 1520 (second neural network unit 1520).

 The abnormal state determination unit 1500B in the abnormality determination device 1000B executes an omission processing command (step ST2602). In the omission processing command, the processing control unit 1580 of the abnormal state determination unit 1500B instructs the second neural network unit 1520 not to execute processing in layers subsequent to the layer to be determined, and instructs the second output storage unit to output the determination value stored therein.

 The abnormal state determination unit 1500B in the abnormality determination device 1000B executes a stored data output process (step ST2703). In the stored data output process, the second output storage unit 1590 of the abnormal state determination unit 1500B outputs the stored determination value, that is, the value previously output by the neural network unit 1520 (second neural network unit 1520).

 When the second neural network unit 1520 or the second output storage unit 1590 outputs the determination value, the series of processes shown in FIG. 8 ends.

 In the figure, an example of a configuration is shown in which the first state comparison unit 1340B has an image state comparison unit 1341, and the second state comparison unit 1540B has a first fully connected layer state comparison unit 1541. However, the configuration may be any one or a combination of the following: the image state comparison unit 1341; a convolutional layer state comparison unit that determines a state change in the convolutional layer (see the convolutional layer state comparison unit 1342 in an embodiment described later); a pooling layer state comparison unit that determines a state change in the pooling layer (see the pooling layer state comparison unit 1343 in an embodiment described later); the first fully connected layer state comparison unit (see the first fully connected layer state comparison unit 1541 in an embodiment described later); and a second fully connected layer state comparison unit that determines a state change in the second fully connected layer 1528 (see the second fully connected layer state comparison unit 1542 in an embodiment described later). For example, when any one of them is adopted, this can be realized by the processing shown in FIGS. 9, 10, 11, 12, and 13 below.

An example of the process of the feature extraction unit 1300B when the state comparison unit (first state comparison unit 1340B) is the image state comparison unit 1341 will be described.
FIG. 9 is a flowchart showing an example of a process of the feature extraction unit 1300B in the case where the state comparison unit (first state comparison unit 1340B) in the feature extraction unit 1300B according to the second embodiment of the present disclosure is the image state comparison unit 1341.

When the feature extraction section 1300B starts the process, first, the first state comparison section 1340B of the feature extraction section 1300B acquires an image and executes an image storage process (step ST2211).
In the image storage process, the image state comparison section 1341 of the first state comparison section 1340B stores the image that is the input data for the first neural network section 1320.
Furthermore, every time the first neural network unit 1320 acquires an image, the image state comparison unit 1341 stores the image (the image before being processed by the first neural network unit 1320) in the storage unit 1341a.

The image state comparison section 1341 executes a process of determining whether the image has been previously stored (step ST2212).
The image state comparison section 1341 refers to the storage section 1341a to determine whether or not the previously input image is stored.

When it is determined that the image has been stored previously ("YES" in step ST2212), the image state comparison unit 1341 executes a process of comparing the current image with the previous image (step ST2213).
The comparison processing section 1341b of the image state comparison section 1341 compares the currently input image with the previously input image.

The comparison processing unit 1341b executes a process of determining whether the difference is less than a threshold value (step ST2214).
The comparison processing unit 1341b uses the difference value between the image input this time and the image input last time and a pre-stored threshold value to determine the magnitude of the difference between a reference value (the image input last time) and a current value (the image to be input this time).

If the image state comparison unit 1341 determines that no value was stored last time (step ST2212 ``NO''), or if the image state comparison unit 1341 determines that the difference between the previous value (reference value) and the current value is greater than or equal to a threshold value (step ST2214 ``NO''), the feature extraction processing control unit 1381, which is the first processing control unit 1380, issues a normal processing command to the first neural network unit 1320 (step ST2311).
First neural network unit 1320 executes normal processing in accordance with the normal processing command, and outputs a combined image (step ST2411).
The combined image storage section 1391, which is the first output storage section 1390, stores the combined image output by the first neural network section 1320 (step ST2412).

If the image state comparison unit 1341 determines that the difference between the previous value (reference value) and the current value is smaller than the threshold value (step ST2214 "YES"), the feature extraction processing control unit 1381, which is the first processing control unit 1380, issues an omission processing command to the first neural network unit 1320 (step ST2312) and also issues an output command to the combined image storage unit 1391, which is the first output storage unit 1390 (step ST2413).
In accordance with the omission processing command, the first neural network unit 1320 does not execute processing of the subsequent layers in the first neural network unit 1320 .
The combined image storage unit 1391 outputs the combined image in accordance with the output command.
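The flow of steps ST2211 through ST2413 above can be sketched end to end: when the new image barely differs from the stored one, the omission path returns the cached combined image (steps ST2312 and ST2413); otherwise the normal path runs the network and caches its output (steps ST2311, ST2411, ST2412). `run_network` is a placeholder standing in for the first neural network unit, and all names and thresholds are illustrative assumptions.

```python
# Hypothetical single-frame step combining the comparison, the processing
# command, and the output cache described in FIG. 9's flow.
def process_frame(image, state, threshold=1.0, run_network=sum):
    prev = state.get("image")
    state["image"] = image                      # store for the next comparison
    if prev is not None:
        diff = sum(abs(a - b) for a, b in zip(prev, image))
        if diff < threshold:                    # omission path (ST2312/ST2413)
            return state["cached"], "skipped"   # re-use cached combined image
    output = run_network(image)                 # normal path (ST2311/ST2411)
    state["cached"] = output                    # cache combined image (ST2412)
    return output, "computed"

state = {}
print(process_frame([1, 2, 3], state))          # first frame -> computed
print(process_frame([1, 2, 3], state))          # identical frame -> skipped
print(process_frame([9, 2, 3], state))          # changed frame -> computed
```

The point of the design is that the expensive network call is bypassed whenever the input comparison judges "no change," at the cost of storing one input and one output per monitored layer.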

Next, an example of the processing of the feature extraction unit 1300B in a case where the state comparison unit (first state comparison unit 1340B) is a convolution layer state comparison unit (see the convolution layer state comparison unit 1342 in the embodiment described later) will be described.
FIG. 10 is a flowchart illustrating an example of a process of the feature extraction unit 1300B in the second embodiment of the present disclosure when the state comparison unit (first state comparison unit 1340B) in the feature extraction unit 1300B is a convolutional layer state comparison unit.

When the feature extraction unit 1300B starts processing, first, the first state comparison unit 1340B of the feature extraction unit 1300B executes a convolution layer state storage process (step ST2221).
In the convolution layer state storage process, the convolution layer state comparison unit of the first state comparison unit 1340B stores an input value (a value before being processed by the pooling layer unit 1325) that is an output value of the convolution layer unit 1322 and is input data to the pooling layer unit 1325.
In addition, every time the convolution layer unit 1322 outputs an output value, the convolution layer state comparison unit stores the output value (the value before being processed by the pooling layer unit 1325) in the storage unit.

The convolution layer state comparison unit executes a process of determining whether a value was stored previously (step ST2222).
The convolution layer state comparison unit refers to the storage unit and determines whether a previously input value (reference value; in this explanation, the previous value) is stored.

When the convolution layer state comparison unit determines that a value was stored previously ("YES" in step ST2222), it executes a comparison process between the current value and the previous value (step ST2223).
The comparison processing unit of the convolution layer state comparison unit compares the value that is about to be input to the pooling layer (current value) with the value that was previously input (previous value = reference value).

The comparison processing unit executes a process of determining whether the difference is less than a threshold value (step ST2224).
The comparison processing unit determines the magnitude of the difference between the reference value (the previously input value) and the current value (the value about to be input) using the difference value between the two and a pre-stored threshold value.

If the convolution layer state comparison unit determines that no value was stored previously ("NO" in step ST2222), or if the convolution layer state comparison unit determines that the difference between the previous value (reference value) and the current value is equal to or greater than the threshold value ("NO" in step ST2224), the feature extraction processing control unit 1381, which is the first processing control unit 1380, issues a normal processing command to the first neural network unit 1320 (step ST2321).
The first neural network unit 1320 executes normal processing in accordance with the normal processing command and outputs a combined image (step ST2421).
The combined image storage unit 1391, which is the first output storage unit 1390, stores the combined image output by the first neural network unit 1320 (step ST2422).

If the convolution layer state comparison unit determines that the difference between the previous value (reference value) and the current value is smaller than the threshold value ("YES" in step ST2224), the feature extraction processing control unit 1381B, which is the first processing control unit 1380, issues an omission processing command to the first neural network unit 1320 (step ST2322) and an output command to the combined image storage unit 1391, which is the first output storage unit 1390 (step ST2423).
In accordance with the omission processing command, the first neural network unit 1320 does not execute processing of its subsequent layers.
The combined image storage unit 1391 outputs the stored combined image in accordance with the output command.

Next, an example of the processing of the feature extraction unit 1300B in a case where the state comparison unit (first state comparison unit 1340B) is a pooling layer state comparison unit (see the pooling layer state comparison unit 1343 in the embodiment described later) will be described.
FIG. 11 is a flowchart showing an example of the processing of the feature extraction unit 1300B according to the second embodiment of the present disclosure in a case where the state comparison unit (first state comparison unit 1340B) in the feature extraction unit 1300B is a pooling layer state comparison unit.

When the feature extraction unit 1300B starts processing, first, the first state comparison unit 1340B of the feature extraction unit 1300B executes a pooling layer state storage process (step ST2221).
In the pooling layer state storage process, the pooling layer state comparison unit of the first state comparison unit 1340B stores an input value (a value before being processed by the image combination unit 1328) that is an output value of the pooling layer unit 1325 and is input data to the image combination unit 1328.
In addition, every time the pooling layer unit 1325 outputs an output value, the pooling layer state comparison unit stores the output value (the value before being processed by the image combination unit 1328) in the storage unit.

The pooling layer state comparison unit executes a process of determining whether a value was stored previously (step ST2222).
The pooling layer state comparison unit refers to the storage unit and determines whether a previously input value (reference value; in this explanation, the previous value) is stored.

When the pooling layer state comparison unit determines that a value was stored previously ("YES" in step ST2222), it executes a comparison process between the current value and the previous value (step ST2223).
The comparison processing unit of the pooling layer state comparison unit compares the value that is about to be input to the image combination unit 1328 (current value) with the value that was previously input (previous value).

The comparison processing unit executes a process of determining whether the difference is less than a threshold value (step ST2224).
The comparison processing unit determines the magnitude of the difference between the reference value (the previously input value) and the current value (the value about to be input) using the difference value between the two and a pre-stored threshold value.

If the pooling layer state comparison unit determines that no value was stored previously ("NO" in step ST2222), or if the pooling layer state comparison unit determines that the difference between the previous value (reference value) and the current value is equal to or greater than the threshold value ("NO" in step ST2224), the feature extraction processing control unit 1381B, which is the first processing control unit 1380, issues a normal processing command to the first neural network unit 1320 (step ST2321).
The first neural network unit 1320 executes normal processing in accordance with the normal processing command and outputs a combined image (step ST2421).
The combined image storage unit 1391, which is the first output storage unit 1390, stores the combined image output by the first neural network unit 1320 (step ST2422).

If the pooling layer state comparison unit determines that the difference between the previous value (reference value) and the current value is smaller than the threshold value ("YES" in step ST2224), the feature extraction processing control unit 1381B, which is the first processing control unit 1380, issues an omission processing command to the first neural network unit 1320 (step ST2322) and an output command to the combined image storage unit 1391, which is the first output storage unit 1390 (step ST2423).
In accordance with the omission processing command, the first neural network unit 1320 does not execute processing of its subsequent layers.
The combined image storage unit 1391 outputs the stored combined image in accordance with the output command.

Next, an example of the processing of the abnormal state determination unit 1500B in a case where the state comparison unit (second state comparison unit 1540B) is the first layer state comparison unit 1541 will be described.
FIG. 12 is a flowchart showing an example of the processing of the abnormal state determination unit 1500B according to the second embodiment of the present disclosure in a case where the state comparison unit (second state comparison unit 1540B) in the abnormal state determination unit 1500B is the first layer state comparison unit 1541.

When the abnormal state determination unit 1500B executes processing on the data output by the feature extraction unit 1300B (step ST2511), the second state comparison unit 1540B first acquires the output of the first fully connected layer (step ST2512) and then executes a first fully connected layer state storage process (step ST2513). The first layer state comparison unit 1541 in the second state comparison unit 1540B stores the output value from the first fully connected layer 1527, which is the input value of the second fully connected layer 1528 (the value before being processed by the second fully connected layer 1528).

The first layer state comparison unit 1541 executes a process of determining whether a value was stored previously (step ST2514).
The first layer state comparison unit 1541 refers to the storage unit 1541a and determines whether a reference value (previous value) is stored.

When the first layer state comparison unit 1541 determines that a value was stored previously ("YES" in step ST2514), it executes a comparison process between the current value and the previous value (step ST2515). The first layer state comparison unit 1541 compares the current value with the reference value (previous value).

The first layer state comparison unit 1541 executes a process of determining whether the difference is smaller than a threshold value (step ST2516).
The comparison processing unit 1541b determines the magnitude of the difference between the reference value and the current value using the difference value between the current value and the reference value (previous value) and a pre-stored threshold value.

If the comparison processing unit 1541b determines that no value was stored previously ("NO" in step ST2514), or if the first layer state comparison unit 1541 determines that the difference is equal to or greater than the threshold value ("NO" in step ST2516), the abnormal state determination processing control unit 1581B, which is the second processing control unit 1580, issues a normal processing command to the second neural network unit 1520 (step ST2611).
The second neural network unit 1520 executes normal processing in accordance with the normal processing command and outputs a determination value (step ST2711).
The probability storage unit 1591, which is the second output storage unit 1590, stores the determination value output by the second neural network unit 1520 (step ST2712).

If the first layer state comparison unit 1541 determines that the difference is smaller than the threshold value ("YES" in step ST2516), the abnormal state determination processing control unit 1581B, which is the second processing control unit 1580, executes an omission processing command (step ST2612).
With the omission processing command, the abnormal state determination processing control unit 1581B instructs the second neural network unit 1520 not to execute processing in the layers subsequent to the layer subject to the determination, and also instructs the second output storage unit 1590 to output the stored determination value.
In accordance with the omission processing command, the second neural network unit 1520 does not execute processing of its subsequent layers.
The probability storage unit 1591, which is the second output storage unit 1590, outputs the determination value in accordance with the output command (step ST2713).

Next, an example of the processing of the abnormal state determination unit 1500B in a case where the state comparison unit (second state comparison unit 1540B) is a second fully connected layer state comparison unit (see the second fully connected layer state comparison unit 1542 in the embodiment described later) will be described.
FIG. 13 is a flowchart showing an example of the processing of the abnormal state determination unit 1500B according to the second embodiment of the present disclosure in a case where the state comparison unit (second state comparison unit 1540B) in the abnormal state determination unit 1500B is the second layer state comparison unit.

When the abnormal state determination unit 1500B executes processing on the data output by the feature extraction unit 1300B (step ST2521), the second fully connected layer 1528 then executes processing on the output of the first fully connected layer (step ST2522).
Next, the second state comparison unit 1540B acquires the output of the second fully connected layer (step ST2523) and then executes a second fully connected layer state storage process (step ST2524). The second layer state comparison unit in the second state comparison unit 1540B stores the output value from the second fully connected layer 1528, which is the input value of the probability output layer 1529 (the value before being processed by the probability output layer 1529).

The second layer state comparison unit executes a process of determining whether a value was stored previously (step ST2525).
The second layer state comparison unit refers to the storage unit and determines whether a reference value (previous value) is stored.

When the second layer state comparison unit determines that a value was stored previously ("YES" in step ST2525), it executes a comparison process between the current value and the previous value (step ST2526). The second layer state comparison unit compares the current value with the reference value (previous value).

The second layer state comparison unit executes a process of determining whether the difference is smaller than a threshold value (step ST2527).
The comparison processing unit determines the magnitude of the difference between the reference value and the current value using the difference value between the current value and the reference value (previous value) and a pre-stored threshold value.

If the comparison processing unit determines that no value was stored previously ("NO" in step ST2525), or if the second layer state comparison unit determines that the difference is equal to or greater than the threshold value ("NO" in step ST2527), the abnormal state determination processing control unit 1581B, which is the second processing control unit 1580, issues a normal processing command to the second neural network unit 1520 (step ST2621).
The second neural network unit 1520 executes normal processing in accordance with the normal processing command and outputs a determination value (step ST2721).
The probability storage unit 1591, which is the second output storage unit 1590, stores the determination value output by the second neural network unit 1520 (step ST2722).

If the second layer state comparison unit determines that the difference is smaller than the threshold value ("YES" in step ST2527), the abnormal state determination processing control unit 1581B, which is the second processing control unit 1580, executes an omission processing command (step ST2622).
With the omission processing command, the abnormal state determination processing control unit 1581B instructs the second neural network unit 1520 not to execute processing in the layers subsequent to the layer subject to the determination, and also instructs the second output storage unit 1590 to output the stored determination value.
In accordance with the omission processing command, the second neural network unit 1520 does not execute processing of its subsequent layers.
The probability storage unit 1591, which is the second output storage unit 1590, outputs the determination value in accordance with the output command (step ST2723).

The abnormality determination device of the present disclosure is further configured as follows.

"The neural network unit includes a first neural network unit and a second neural network unit,
the output storage unit includes a first output storage unit and a second output storage unit,
the state comparison unit includes a first state comparison unit and a second state comparison unit,
the processing control unit includes a first processing control unit and a second processing control unit,
and the device comprises:
a feature extraction unit including:
the first neural network unit, which has a plurality of layers that are a part of the layers constituting the neural network, acquires the image output by the image acquisition unit, and outputs a feature map representing a characteristic state of the object to be determined included in the image;
the first output storage unit, which stores the feature map output by the first neural network unit;
the first state comparison unit, which stores in advance a reference value serving as a reference for an input value to a second layer, the second layer being at least one of the layers in the first neural network unit, and determines a change in the state of the object to be determined using a difference value between the reference value and a current value, the current value being a value about to be newly input to the second layer; and
the first processing control unit, which, when the first state comparison unit determines that the state of the object to be determined has not changed, instructs the first neural network unit not to execute processing in the second layer and subsequent layers for the object to be determined, and instructs that the feature map stored in the first output storage unit be output; and
an abnormal state determination unit including:
the second neural network unit, which has a plurality of layers that are a part of the layers constituting the neural network, acquires the feature map output by the feature extraction unit, and outputs, as the output data, a determination value indicating the state of the object to be determined using the feature map;
the second output storage unit, which stores the determination value output by the second neural network unit;
the second state comparison unit, which stores in advance a reference value serving as a reference for an input value to a third layer, the third layer being at least one of the layers in the second neural network unit, and determines a change in the state of the object to be determined using a difference value between the reference value and a current value, the current value being a value about to be newly input to the third layer; and
the second processing control unit, which, when the second state comparison unit determines that the state of the object to be determined has not changed, instructs the second neural network unit not to execute processing in the third layer and subsequent layers for the object to be determined, and instructs that the determination value stored in the second output storage unit be output as the output data."

As a result, the present disclosure has the effect of providing an abnormality determination device that, in abnormality determination technology, makes it possible to output highly accurate determination results even more quickly using a neural network.
Furthermore, by applying the above configuration to the abnormality determination method, the present disclosure achieves effects similar to the above.
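The two-stage configuration described above, a feature extraction stage (first neural network unit) followed by an abnormal state determination stage (second neural network unit), each guarded by its own "no change" check, can be illustrated as follows. All names are hypothetical, the stand-in networks are trivial placeholders, and the sum-of-squares comparison is one example criterion; this is a sketch of the idea, not the disclosed implementation.

```python
def run_stage(value, stage, compute):
    """Run one stage, reusing its stored output when the input is unchanged."""
    prev, cached = stage["prev"], stage["out"]
    if prev is not None and cached is not None:
        diff = sum((v - p) ** 2 for v, p in zip(value, prev))
        if diff < stage["thr"]:
            stage["prev"] = value
            return cached          # omit subsequent layers; output stored value
    stage["prev"] = value
    stage["out"] = compute(value)  # normal processing
    return stage["out"]

def extract_features(image):
    return [v * 0.5 for v in image]  # stand-in for the first neural network unit

def judge_abnormality(features):
    return sum(features)             # stand-in for the second neural network unit

def run_pipeline(image, state):
    features = run_stage(image, state["feature"], extract_features)
    return run_stage(features, state["judge"], judge_abnormality)
```

When the input image is unchanged, the first stage returns its stored feature map, so the second stage also sees an unchanged input and returns its stored determination value: both networks are skipped end to end.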

The abnormality determination device of the present disclosure is further configured as follows.

"The first state comparison unit and the second state comparison unit determine a change in the state of the object to be determined based on a result of determining the magnitude of the difference between the reference value and the current value using the difference value and a pre-stored threshold value."

As a result, the present disclosure has the effect of providing an abnormality determination device that, in abnormality determination technology, makes it possible to output highly accurate determination results even more quickly using a neural network.
Furthermore, by applying the above configuration to the abnormality determination method, the present disclosure achieves effects similar to the above.

The abnormality determination device of the present disclosure is further configured as follows.

"The first state comparison unit and the second state comparison unit determine that the state of the object to be determined has not changed when, for each image unit, the sum of squares of the difference values is smaller than a pre-stored threshold value."

As a result, the present disclosure has the effect of providing an abnormality determination device that, in abnormality determination technology, makes it possible to output highly accurate determination results even more quickly using a neural network.
Furthermore, by applying the above configuration to the abnormality determination method, the present disclosure achieves effects similar to the above.
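The sum-of-squares criterion described above can be sketched as follows. The function name is hypothetical, and the flat lists stand in for the values input to the layer under comparison (taken over the image unit); the threshold would be the pre-stored value from the disclosure.

```python
def state_unchanged_ssd(current, reference, threshold):
    """True ("no change") when the sum of squared differences between the
    current values and the reference values is smaller than the threshold."""
    ssd = sum((c - r) ** 2 for c, r in zip(current, reference))
    return ssd < threshold
```

A True result corresponds to the omission processing command being issued; a False result corresponds to the normal processing command.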

The abnormality determination device of the present disclosure is further configured as follows.

"The first state comparison unit and the second state comparison unit determine that the state of the object to be determined has not changed when, for each image unit, the sum of the absolute values of the difference values is smaller than a pre-stored threshold value."

As a result, the present disclosure has the effect of providing an abnormality determination device that, in abnormality determination technology, makes it possible to output highly accurate determination results even more quickly using a neural network.
Furthermore, by applying the above configuration to the abnormality determination method, the present disclosure achieves effects similar to the above.
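The sum-of-absolute-values criterion described above can be sketched in the same way. As before, the function name is hypothetical and the flat lists stand in for the values input to the layer under comparison for one image unit.

```python
def state_unchanged_sad(current, reference, threshold):
    """True ("no change") when the sum of absolute differences between the
    current values and the reference values is smaller than the threshold."""
    sad = sum(abs(c - r) for c, r in zip(current, reference))
    return sad < threshold
```

Compared with the sum-of-squares variant, this criterion weights all deviations linearly, so a single large local change influences the decision less strongly.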

Embodiment 3.
In the third embodiment, a form in which the basic mechanism of the present disclosure is applied to all of the layers in a neural network will be described.
In the third embodiment, descriptions of configurations and processing that have already been explained will be omitted as appropriate.

An example of the configuration of the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment will be described.
FIG. 14 is a diagram showing an example of the configuration of the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
The abnormality warning device 100C includes the abnormality determination device 1000C and the warning output unit 2000.

The abnormality determination device 1000C acquires an image and uses the image to output the state of the object to be determined that is captured in the image.
The abnormality determination device 1000C shown in FIG. 14 includes an image acquisition unit 1100C, a feature extraction unit 1300C, and an abnormal state determination unit 1500C.
Here, in the abnormality determination device 1000C, the neural network unit already described includes, as will be described later, a first neural network unit 1320 and a second neural network unit 1520. The first neural network unit 1320 is included in the feature extraction unit 1300C, and the second neural network unit 1520 is included in the abnormal state determination unit 1500C.
In addition, in the abnormality determination device 1000C, the output storage unit already described includes a first output storage unit 1390 and a second output storage unit. The first output storage unit 1390 is included in the feature extraction unit 1300C, and the second output storage unit is included in the abnormal state determination unit 1500C.
Similarly, in the abnormality determination device 1000C, the state comparison unit already described includes a first state comparison unit 1340 and a second state comparison unit 1540. The first state comparison unit 1340 is included in the feature extraction unit 1300C, and the second state comparison unit 1540 is included in the abnormal state determination unit 1500C.
Also, in the abnormality determination device 1000C, the processing control unit already described includes a first processing control unit 1380 and a second processing control unit 1580. The first processing control unit 1380 is included in the feature extraction unit 1300C, and the second processing control unit 1580 is included in the abnormal state determination unit 1500C.

Since the image acquisition unit 1100C is similar to the image acquisition units 1100A and 1100B already described, a detailed description of the image acquisition unit 1100C will be omitted here.

The feature extraction unit 1300C uses the image to output a feature map that represents the characteristic state of the subject contained in the image. For example, the feature extraction unit 1300C extracts parts of the image that show signs of drowsiness, such as the eyelids and eyeballs, and generates a feature map that represents the state of the subject that is characteristic of an abnormal state such as drowsiness.

An example of the internal configuration of the feature extraction unit 1300C will be described.
FIG. 15 is a diagram showing an example of the internal configuration of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C.
The feature extraction unit 1300C shown in FIG. 15 is configured to include a neural network unit (first neural network unit 1320), a state comparison unit (first state comparison unit 1340C), a processing control unit (first processing control unit 1380C), and an output storage unit (first output storage unit 1390).

The first neural network unit 1320 has multiple layers that are part of the layers that make up the neural network, acquires the image output by the image acquisition unit 1100, and outputs a feature map that represents the characteristic state of the object to be judged contained in the image.
The first neural network unit 1320 shown in FIG. 15 includes an image branching unit 1321, a convolution layer unit 1322, a pooling layer unit 1325, and an image combination unit 1328.

The image branching unit 1321 branches and inputs an image to multiple nodes in a layer of a neural network. The image is branched according to the number of subsequent convolutional layers and pooling layers. In FIG. 15, the image is branched into two.

The convolution layer unit 1322 performs filtering to extract characteristic parts of the object to be determined in the image.
The convolution layer unit 1322 shown in FIG. 15 includes a first convolution layer 1323 and a second convolution layer 1324.
The first convolutional layer 1323 and the second convolutional layer 1324 perform filtering to extract face (body) parts for detecting a drowsy state, for example, by convolution processing (cross-correlation processing) using pre-prepared convolution filters of size 3x3 or 5x5.
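As a non-limiting sketch of the convolution processing (cross-correlation) performed by the convolution layers, the following Python function slides a 3x3 filter over an image; the function name, the example filter, and the test image are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

def convolve2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid-mode 2-D cross-correlation, as performed by a convolutional layer."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Sum of the element-wise product of the filter and the window.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A 3x3 edge-emphasizing (Laplacian-like) filter applied to a small test image.
image = np.arange(25, dtype=float).reshape(5, 5)
kernel = np.array([[0, -1, 0], [-1, 4, -1], [0, -1, 0]], dtype=float)
print(convolve2d(image, kernel).shape)  # (3, 3)
```

In practice the filter coefficients would be learned during training rather than fixed by hand as above.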

The pooling layer unit 1325 shown in FIG. 15 includes a first pooling layer 1326 and a second pooling layer 1327 .
The first pooling layer 1326 and the second pooling layer 1327 generate an image relating to features that are robust against image position, for example, by calculating the maximum or average value for each predetermined region.
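A minimal sketch of the maximum/average pooling described above, for non-overlapping regions of a fixed size; the function name and region size are illustrative assumptions.

```python
import numpy as np

def pool2d(x: np.ndarray, size: int = 2, mode: str = "max") -> np.ndarray:
    """Take the maximum or average of each non-overlapping size x size region."""
    h, w = x.shape[0] // size, x.shape[1] // size
    # Group pixels into (row-block, row-offset, col-block, col-offset).
    blocks = x[:h * size, :w * size].reshape(h, size, w, size)
    return blocks.max(axis=(1, 3)) if mode == "max" else blocks.mean(axis=(1, 3))

x = np.array([[1, 2, 5, 6],
              [3, 4, 7, 8],
              [0, 0, 1, 1],
              [0, 0, 1, 1]], dtype=float)
print(pool2d(x, 2, "max"))   # [[4. 8.] [0. 1.]]
print(pool2d(x, 2, "mean"))  # [[2.5 6.5] [0.  1. ]]
```

Reducing each region to a single value in this way is what makes the resulting features robust against small shifts in image position.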

The first neural network unit 1320 shown in FIG. 15 includes two pairs of convolutional layers and pooling layers, but it is also effective to include one pair or three or more pairs.
In addition, the first pooling layer and the second pooling layer 1327 may be followed by two or more convolution layers and two or more pooling layers.
In addition, a normalized linear unit layer (activation function) or the like (not shown) may be included after the convolution layer.

The image combination unit 1328 combines multiple images output through the convolution layer and the pooling layer.
The image combining unit 1328 extracts areas that indicate signs of drowsiness, such as the eyelids and eyeballs, and generates a feature map.

The first state comparison unit 1340C prestores a reference value, which is the standard for the input value to each layer in the first neural network unit 1320, and uses the difference between the reference value and the current value, which is the value that is about to be newly input to the layer, to determine a change in the state of the object to be determined.

The reference value is a previous value, which is an input value for each layer, stored as a result of the previous processing by the multiple layers of the first neural network unit 1320, and is used to obtain a difference value between the previous value and a current value, which is an input value that is about to be newly input to the layer. In the explanation of the processing described later, a case in which the previous value is used as the reference value will be explained.
In this case, the first state comparison unit 1340 does not perform state comparison processing for the first processing (processing for the first image) and simply stores the value, but in the second and subsequent processing (processing for the second image), it determines changes in the state of the object to be judged.
However, the following values may be stored and used as the reference values.

Typical processing results that have been learned and modeled in advance can be used as the reference value.
In other words, the reference value may be a typical output value of a pre-modeled intermediate layer in a neural network.
By using typical processing results, comparisons can be made without being influenced by noise or anomalies contained in the actual observed images.

In addition, a subset of the results selected based on prior information can be used as the reference value.
For example, only pixels with a high probability of containing the head may be adopted.
Conversely, pixels known to change greatly but to be of low importance, such as the background, may be excluded.
In other words, the reference value may be the output value of a predetermined node among the nodes included in the neural network.
This allows for reduced memory usage and data handling, resulting in faster processing speeds.

The first state comparison section 1340 uses the difference value and a pre-stored threshold value to determine the magnitude of the difference between the reference value and the current value, and determines a change in the state of the object to be determined based on the result.
Specifically, the first state comparing section 1340 determines that the state of the determination target is unchanged when the sum of squares of the difference values is smaller than a pre-stored threshold value for each image.
Alternatively, the first state comparing section 1340 determines that the state of the determination target is unchanged when, for each image, the sum of the absolute values of the difference values is smaller than a pre-stored threshold value.

The first state comparison unit 1340C shown in FIG. 15 includes an image state comparison unit 1341, a convolution layer state comparison unit 1342, and a pooling layer state comparison unit 1343.
In the image state comparison unit 1341, the convolution layer state comparison unit 1342, and the pooling layer state comparison unit 1343, "storing" means holding the values of the two-dimensional images (feature maps) that are the respective input sources.
Furthermore, in the image state comparison unit 1341, the convolution layer state comparison unit 1342, and the pooling layer state comparison unit 1343, comparison means, for example, taking the absolute value of the difference for each element (pixel) between a stored two-dimensional image (of the previous time/frame) and the latest two-dimensional image, and further comparing the sum or average with a predetermined threshold value.
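The comparison described above can be sketched as follows; the function name, the choice of the mean rather than the sum, and the threshold values are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np

def state_changed(previous: np.ndarray, current: np.ndarray, threshold: float) -> bool:
    """Compare a stored feature map (previous time/frame) against the latest one.

    Takes the absolute difference for each element (pixel), averages it, and
    compares the result with a predetermined threshold. Returns True when the
    state is judged to have changed; when it returns False, later layers can
    reuse the stored result instead of recomputing.
    """
    return float(np.mean(np.abs(current - previous))) >= threshold

prev = np.zeros((4, 4))
assert state_changed(prev, prev + 0.001, threshold=0.01) is False  # below threshold: no change
assert state_changed(prev, prev + 0.5, threshold=0.01) is True     # large change detected
```

The same comparison applies unchanged to the one-dimensional vectors handled by the fully connected layer state comparison units described later.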

The image state comparison unit 1341 shown in FIG. 15 includes a storage unit 1341a and a comparison processing unit 1341b.

The convolutional layer state comparison unit 1342 shown in FIG. 15 includes a storage unit 1342a and a comparison processing unit 1342b.

The pooling layer state comparison unit 1343 shown in FIG. 15 includes a storage unit 1343a and a comparison processing unit 1343b.

When the first state comparison unit 1340 determines that the state of the object to be judged has not changed, the first processing control unit 1380C instructs the first neural network unit 1320 not to execute processing in layers subsequent to the layer of the object to be judged, and also instructs the first output storage unit 1390 to output the feature map stored in the first output storage unit 1390.
The first process control unit 1380C shown in FIG. 15 includes a feature extraction process control unit 1381C.
The feature extraction process control unit 1381C executes the function of the first process control unit 1380C in the feature extraction unit 1300.

The first output storage section 1390 stores the output data (decision value (probability value)) output by the first neural network section 1320 .
The first output storage unit 1390 stores the feature map output by the first neural network unit 1320 .
The first output storage unit 1390 shown in FIG. 15 includes a combined image storage unit 1391.
The combined image storage unit 1391 stores a feature map, which is a combined image combined and output by the first neural network unit 1320 .

Returning to the explanation of FIG. 14.
The abnormal condition determination unit 1500C uses a feature map, which is a two-dimensional image, to output a determination value indicating the condition of the object to be determined.

An example of the internal configuration of the abnormal state determination unit 1500C will be described.
FIG. 16 is a diagram showing an example of the internal configuration of the abnormal state determination unit 1500C in the abnormality warning device 100C and the abnormality determination device 1000C.
The abnormal state determination unit 1500C shown in FIG. 16 is configured to include a neural network unit (second neural network unit 1520), a state comparison unit (second state comparison unit 1540), and an output storage unit (second output storage unit) 1590.

The neural network unit (second neural network unit 1520) has multiple layers in a neural network, acquires the feature map output by the feature extraction unit 1300, and uses the feature map to output a judgment value indicating the state of the object to be judged as output data.
The neural network unit (second neural network unit 1520 ) shown in FIG. 16 includes a state classification unit 1525 and a probability output layer 1529 .

The state classification unit 1525 has a function of, for example, converting a two-dimensional image (feature map) into a one-dimensional vector, and further consolidating the output into an indication of the drowsy state (for example, four outputs: eyelids: dozing/not dozing, eyeballs: dozing/not dozing).
The state classification unit 1525 shown in FIG. 16 includes a first fully connected layer 1527 and a second fully connected layer 1528.
The state classification unit 1525 generates a one-dimensional vector whose number of elements is the desired number of outputs through a first fully connected layer 1527 and a second fully connected layer 1528 .

The probability output layer 1529 applies a softmax function, for example, and sets the sum of the output values to 1.0, thereby giving the output results a probabilistic meaning.
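The softmax normalization performed by the probability output layer can be sketched as follows; the function name and the example raw scores are illustrative assumptions.

```python
import math

def softmax(values):
    """Normalize raw scores so they sum to 1.0 and can be read as probabilities."""
    # Subtracting the maximum before exponentiation avoids overflow and does
    # not change the result.
    exps = [math.exp(v - max(values)) for v in values]
    total = sum(exps)
    return [e / total for e in exps]

# Four hypothetical raw scores, one per output
# (e.g. eyelids: dozing / not dozing, eyeballs: dozing / not dozing).
probs = softmax([2.0, 0.5, 1.0, 0.1])
print(probs)
print(sum(probs))  # 1.0 (up to floating-point rounding)
```

The largest raw score maps to the largest probability, so the downstream judgment can read the output vector directly as the likelihood of each state.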

The second neural network unit 1520 shown in FIG. 16 has two fully connected layers, but it may also be configured with no fully connected layers, or with three or more.

The second state comparison unit 1540 prestores a reference value, which is the standard for the input value to each layer of the second neural network unit 1520, and uses the difference between the reference value and the current value, which is the value that is about to be newly input to the layer, to determine a change in the state of the object to be determined.

The reference value is a previous value, which is an input value for each layer, stored as a result of the previous processing by the multiple layers of the second neural network unit 1520, and is used to obtain a difference value between the previous value and a current value, which is an input value that is about to be newly input to the first layer. In the explanation of the processing to be described later, the previous value is used as the reference value.
In this case, the second state comparison unit 1540 does not perform state comparison processing for the first processing (processing for the first image) and simply stores the value, but determines changes in the state of the object to be judged for the second and subsequent processing (processing for the second image).
However, the following values may be stored and used as the reference values.

Typical processing results that have been learned and modeled in advance can be used as the reference value.
In other words, the reference value may be a typical output value of a pre-modeled intermediate layer in a neural network.
By using typical processing results, comparisons can be made without being influenced by noise or anomalies contained in the actual observed images.

In addition, a subset of the results selected based on prior information can be used as the reference value.
For example, only pixels with a high probability of containing the head may be adopted.
Conversely, pixels known to change greatly but to be of low importance, such as the background, may be excluded.
In other words, the reference value may be the output value of a predetermined node among the nodes included in the neural network.
This allows for reduced memory usage and data handling, resulting in faster processing speeds.

The second state comparison section 1540 uses the difference value and a pre-stored threshold value to determine the magnitude of the difference between the reference value and the current value, and determines a change in the state of the object to be determined based on the result.
Specifically, when the sum of squares of the difference values is smaller than a pre-stored threshold value for each image, second state comparing section 1540 determines that the state of the determination target is unchanged.
Alternatively, the second state comparing section 1540 determines that the state of the determination target is unchanged when, for each image, the sum of the absolute values of the difference values is smaller than a pre-stored threshold value.

The second state comparison unit 1540C shown in FIG. 16 includes a first layer state comparison unit 1541 and a second layer state comparison unit 1542. The first layer state comparison unit 1541 is also referred to as the first fully connected layer state comparison unit 1541. The second layer state comparison unit 1542 is also referred to as the second fully connected layer state comparison unit 1542.
In the first fully connected layer state comparison unit 1541 and the second fully connected layer state comparison unit 1542, "storing" means holding the values of the one-dimensional vectors of the respective input sources.
In the first fully connected layer state comparison unit 1541 and the second fully connected layer state comparison unit 1542, comparison means, for example, taking the absolute value of the difference between a stored one-dimensional vector (of the previous time/frame) and the latest one-dimensional vector for each element, and then comparing the sum or average with a predetermined threshold value.

The first fully connected layer state comparison unit 1541 shown in FIG. 16 includes a storage unit 1541a and a comparison processing unit 1541b.
The storage unit 1541a stores a reference value, which is an output value of the first fully connected layer 1527 and an input value of the second fully connected layer, for each output by the first fully connected layer 1527.
The comparison processing unit 1541b compares the current value with a reference value.

The second fully connected layer state comparison unit 1542 shown in FIG. 16 includes a storage unit 1542a and a comparison processing unit 1542b.
The storage unit 1542a stores a reference value, which is an output value of the second fully connected layer and an input value of the probability output layer 1529, for each output by the second fully connected layer.
The comparison processing unit 1542b compares the current value with a reference value.

When the second state comparison unit 1540 determines that the state of the object to be judged has not changed, the second processing control unit 1580C instructs the second neural network unit 1520 not to execute processing in the layer of the object to be judged and subsequent layers, and also instructs the second output storage unit to output the judgment value stored therein as output data.
The second process control unit 1580C shown in FIG. 16 includes an abnormal state determination process control unit 1581C.
The abnormal state determination processing control unit 1581C functions to limit the processing of the second neural network unit 1520 in the abnormal state determination unit 1500C.

The second output storage unit 1590 stores the decision value output by the second neural network unit 1520 .
The output storage section (second output storage section) 1590 shown in FIG. 16 includes a probability storage section 1591.

The probability storage unit 1591 stores the judgment value output by the second neural network unit 1520 .
The judgment value is a value indicating the state of the person to be judged, and is, for example, a probability value output by the probability output layer 1529 so that the output result has a probabilistic meaning.

Returning to the explanation of FIG. 14.
The warning output unit 2000 acquires output data that is a judgment value indicating the drowsy state or dozing state of the person to be judged, and outputs a warning to the person to be judged in accordance with the judgment value.

The abnormality determination device 1000C shown in FIG. 14 is shown as being configured not to include the alarm output unit 2000, but may be configured to include the alarm output unit 2000. When configured in this manner, the abnormality determination device 1000C is equivalent to the abnormality warning device 100C shown in FIG. 14. In the following explanation, the abnormality determination device 1000C will be explained as being configured to include the alarm output unit 2000, except in cases where it is necessary to distinguish between the abnormality warning device 100C and the abnormality determination device 1000C.

In addition to the above configuration, abnormality determination device 1000C may be configured to include a control unit (not shown), a storage unit (not shown), and a communication unit (not shown).
A control unit (not shown) controls the entire abnormality determination device 1000C and each of its components. The control unit (not shown) starts up the abnormality determination device 1000C in response to, for example, an external command. The control unit (not shown) also controls the state (operation state = start-up, shutdown, sleep, etc.) of the abnormality determination device 1000C.
A storage unit (not shown) stores each piece of data used in the abnormality determination device 1000C. The storage unit (not shown) stores, for example, outputs (output data) from each component in the abnormality determination device 1000C, and outputs data requested by each component to the component that has made the request.
The communication unit (not shown) communicates with an external device. For example, the communication unit communicates between the abnormality determination device 1000C and an imaging device such as an in-vehicle camera. In addition, for example, if the abnormality determination device 1000C does not have a display unit or an audio output unit, the communication unit communicates between the abnormality determination device 1000C and an external device such as a display unit or an audio output device.

An example of the processing of the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment will be described.
FIG. 17 is a flowchart showing an example of the processing of the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
The abnormality determination device 1000C starts the process shown in FIG. 17 when an image is input from a camera, for example.

Abnormality determination device 1000C executes image acquisition processing (step ST3100).
In the image acquisition process, the image acquisition unit 1100 of the abnormality determination device 1000C acquires and outputs an image.

Next, abnormality determination device 1000C executes a storage and state comparison process (step ST3200).
In the state storage and state comparison process, the first state comparison unit 1340C of the abnormality determination device 1000C stores, as the previous values, the per-layer input values that are the processing results of at least the first pass through all layers of the first neural network unit 1320 after the start of processing.
The first state comparison unit 1340 does not perform state comparison processing for the first processing (processing for the first image) and simply stores the values, but in the second and subsequent processing (processing for the second image), it determines whether there has been a change in the state of the object to be judged.

Next, abnormality determination device 1000C executes feature extraction process control (step ST3300).
The feature extraction process control unit 1381 of the abnormality determination device 1000C executes a normal process command or an omission process command to the first neural network unit 1320 depending on the determination result by the first state comparison unit 1340.
Furthermore, when issuing an omission processing command, the feature extraction processing control unit 1381 issues an output command to the combined image storage unit 1391 .

Next, abnormality determination device 1000C executes combined image output processing (step ST3400).
The first neural network unit 1320 executes normal processing and outputs a combined image, or the combined image storage unit 1391 outputs the previous combined image, whereby the feature extraction unit 1300 outputs a combined image.
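The control flow of the feature extraction steps (ST3200 through ST3400) can be sketched as follows; the function names, the threshold value, and the use of flat lists as stand-ins for images are illustrative assumptions, not taken from the disclosure.

```python
def mean_abs_diff(a, b):
    """Average absolute per-element difference between two stored states."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def feature_extraction_step(image, previous, stored_map, run_network, threshold=0.01):
    """One pass of state comparison followed by process control.

    When the new image differs little from the previous one, the omission
    command is issued and the stored combined image is reused; otherwise the
    normal command runs the network again. Returns (feature map, ran_network).
    """
    if previous is not None and mean_abs_diff(image, previous) < threshold:
        return stored_map, False       # omission: reuse the stored result
    return run_network(image), True    # normal processing: recompute

# Tiny demo: a stand-in "network" that just doubles each value.
run = lambda img: [v * 2 for v in img]
first = [1.0, 2.0, 3.0]
fmap, ran = feature_extraction_step(first, None, None, run)
print(ran)   # True: the first frame is always processed
fmap2, ran2 = feature_extraction_step([1.0, 2.0, 3.0], first, fmap, run)
print(ran2)  # False: unchanged input, the stored map is reused
```

The abnormal state determination side (steps ST3500 through ST3700) follows the same pattern, with one-dimensional vectors in place of images and the probability storage unit in place of the combined image storage unit.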

Abnormality determination device 1000C executes a process of storing the fully connected layer states and comparing the fully connected layer states (step ST3500).
The second state comparison unit 1540 in the abnormality determination device 1000C pre-stores, for each of the multiple layers of the second neural network unit 1520, a reference value that is a standard for the input value to that layer, and determines a change in the state of the object to be determined using the difference value between the reference value and the current value, which is the value that is to be newly input to that layer.
The second state comparison unit 1540 performs no state comparison on the first pass (processing of the first image) and simply stores the values, but on the second and subsequent passes (processing of the second image onward) it determines changes in the state of the object to be judged.
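The per-layer bookkeeping can be sketched as below. This is an illustrative Python fragment under assumed names; a mean absolute difference stands in for whatever metric an implementation uses, and the sketch updates the stored value on every pass (previous-value comparison) — an embodiment with a fixed pre-stored reference value differs only in skipping that update.

```python
import numpy as np

class LayerStateComparator:
    """Keeps one stored value per layer (cf. storage units 1541a, 1542a)
    and reports whether a new input differs enough to require processing."""

    def __init__(self, thresholds):
        self.thresholds = thresholds  # layer name -> threshold
        self.stored = {}              # layer name -> previously seen value

    def changed(self, layer, current):
        ref = self.stored.get(layer)
        self.stored[layer] = current  # current value becomes the new stored value
        if ref is None:
            return True  # first pass: nothing to compare, process normally
        diff = float(np.mean(np.abs(current - ref)))
        return diff >= self.thresholds[layer]
```

A `False` return corresponds to the omission command (skip the layer and replay the stored output); `True` corresponds to the normal processing command.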

Abnormality determination device 1000C executes an abnormal state determination process control process (step ST3600).
The abnormal state determination process control unit 1581C issues a normal command or an omission command to the second neural network unit 1520. When issuing an omission command, the abnormal state determination process control unit 1581C issues an output command to the probability storage unit 1591.

Abnormality determination device 1000C executes a result output process (step ST3700).
Either the second neural network unit 1520 executes normal processing and outputs a judgment value, or the probability storage unit 1591 outputs the previous judgment value (probability value); in either case, the abnormality determination unit outputs the judgment value as output data.

Abnormality determination device 1000C executes an alarm output process (step ST3800).
In the alarm output process, the alarm output unit 2000 of the abnormality determination device 1000C acquires output data and outputs an alarm signal based on the output data to an alarm device (not shown) etc. The alarm output unit 2000 determines whether to output an alarm based on a determination value included in the output data, and when it determines to output an alarm, outputs an alarm signal to an alarm device (not shown) etc.
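The alarm decision can be sketched as a simple threshold on the judgment value. This is an illustrative fragment only; the function names, the dictionary shape of the output data, and the 0.5 default threshold are all assumptions, and `emit` stands in for signaling the (unshown) alarm device.

```python
def should_alarm(judgment_value, alarm_threshold=0.5):
    """Decide whether an alarm signal should be emitted from the judgment
    value (an abnormality probability). The 0.5 default is an assumption."""
    return judgment_value >= alarm_threshold


def alarm_output(output_data, emit):
    """Emit an alarm signal when the judgment value warrants it."""
    if should_alarm(output_data["judgment_value"]):
        emit("alarm")
```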

When abnormality determining device 1000C executes the process of step ST3800, it ends the series of processes shown in FIG. 17 and repeats the process from step ST3100.
Note that the abnormality determination device 1000C is powered off in conjunction with, for example, the camera being powered off.

A detailed example of the processing of the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment will be described.
FIG. 18 is a flowchart showing a detailed first example of the processing of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.

When the feature extraction section 1300C starts the process, first, the first state comparison section 1340C of the feature extraction section 1300C executes an image storage process (step ST3210).
In the image storage process, the image state comparison section 1341 of the first state comparison section 1340C stores the image that is the input data for the first neural network section 1320.
Furthermore, every time the first neural network unit 1320 acquires an image, the image state comparison unit 1341 stores the image in the storage unit 1341a.

The image state comparison section 1341 executes a process of determining whether the image has been previously stored (step ST3211).
The image state comparison section 1341 refers to the storage section 1341a to determine whether the previously input image is stored therein.

When it is determined that the image has been stored previously ("YES" in step ST3211), the image state comparison unit 1341 executes a process of comparing the current image with the previous image (step ST3212).
The comparison processing section 1341b of the image state comparison section 1341 compares the currently input image with the previously input image.

The comparison processing unit 1341b executes a process of determining whether the difference is less than a threshold value (step ST3213).
The comparison processing unit 1341b judges the magnitude of the difference between the reference value and the current value using a difference value between the currently input image and the previously input image and a pre-stored threshold value.
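The image-level comparison can be sketched as follows. This is an illustrative Python fragment, not the patented implementation; the function name and the use of a mean absolute pixel difference as the comparison metric are assumptions.

```python
import numpy as np

def images_differ(current, previous, threshold):
    """Return True when the mean absolute pixel difference between the
    current and previous images is at or above the threshold, i.e. the
    state of the object under judgment is considered to have changed."""
    diff = np.mean(np.abs(current.astype(np.float32) -
                          previous.astype(np.float32)))
    return bool(diff >= threshold)
```

A `False` result here corresponds to step ST3213 "YES" (difference below the threshold), which leads to the omission processing command.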

When the image state comparison unit 1341 judges that the difference between the previous value and the current value is greater than the threshold value ("NO" in step ST3213), the convolution layer state comparison unit 1342 executes a convolution layer state storage process (step ST3214).
In the convolution layer state storage process, the convolution layer state comparison unit 1342 stores input data for the convolution layer in the storage unit 1342a.

The convolution layer state comparison unit 1342 executes a process of determining whether or not the data has been previously stored (step ST3215).
The convolution layer state comparison unit 1342 refers to the storage unit 1342a and determines whether a previous value, which is a value input previously, is stored.

If the comparison processing unit 1342b of the convolutional layer state comparison unit 1342 determines that the previous state has been stored (step ST3215 "YES"), it executes a comparison process between the current state and the previous state (step ST3216).

The feature extraction section 1300C executes a process of determining whether the difference is less than a threshold value (step ST3217).
The comparison processing unit 1342b judges whether the difference between the previous value and the current value is large or small, using a difference value between the currently input image and the previously input image and a pre-stored threshold value.

If the comparison processing unit 1342b of the convolution layer state comparing unit 1342 determines that the difference between the previous value and the current value is greater than the threshold value ("NO" in step ST3217), the pooling layer state comparing unit 1343 executes a pooling layer state storage process (step ST3218).
In the pooling layer state storage process, the pooling layer state comparison unit 1343 stores input data for the pooling layer in the storage unit 1343a.

The comparison processing unit 1343b in the pooling layer state comparison unit 1343 executes a process to determine whether the data was previously stored (step ST3219).

If the comparison processing unit 1343b of the pooling layer state comparison unit 1343 determines that the value was stored previously ("YES" in step ST3219), it executes a comparison process between the current value and the reference value (previous value) (step ST3220).

The comparison processing unit 1343b of the pooling layer state comparison unit 1343 executes a process to determine whether the difference between the reference value (previous value) and the current value is less than a threshold value (step ST3221).

If the comparison processing unit 1341b of the image state comparison unit 1341, the comparison processing unit 1342b of the convolution layer state comparison unit 1342, and the comparison processing unit 1343b of the pooling layer state comparison unit 1343 determine that the difference between the reference value (previous value) and the current value is greater than the threshold value (step ST3213 "NO", step ST3217 "NO", step ST3221 "NO"), the feature extraction unit 1300C executes normal processing command processing (step ST3311).
In the normal processing command process, the feature extraction processing control unit 1381 issues a normal processing command to the first neural network unit 1320.

The feature extraction section 1300C executes an output process (step ST3411).
The first neural network portion 1320 of the feature extraction portion 1300C outputs the combined image.

The feature extraction section 1300C executes an output storage process (step ST3412).
The combined image storage section 1391 of the output storage section in the feature extraction section 1300C stores the combined image output from the first neural network section 1320.

If any of the comparison processing unit 1341b of the image state comparison unit 1341, the comparison processing unit 1342b of the convolution layer state comparison unit 1342, and the comparison processing unit 1343b of the pooling layer state comparison unit 1343 determines that the difference between the previous value and the current value is smaller than the threshold value (step ST3213 "YES", step ST3217 "YES", step ST3221 "YES"), the feature extraction unit 1300C executes an omission processing command processing (step ST3312).
In the omission processing command processing, the feature extraction processing control unit 1381 commands the first neural network unit 1320 not to execute processing in layers subsequent to the layer to be judged, and also commands the first output storage unit 1390 to output the feature map stored therein.
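The omission command amounts to an early exit through the layer stack: as soon as one layer's incoming value is close enough to its stored previous value, the remaining layers are skipped and a stored output is re-emitted. The Python sketch below illustrates this under assumed names; it is not the disclosed implementation, and the mean-absolute-difference metric is an assumption.

```python
import numpy as np

def run_with_early_exit(x, layers, prev_inputs, thresholds, cached_output):
    """Run a stack of layers, but skip the rest of the stack (omission
    command) when a layer's input barely differs from its previous input.

    layers: list of (name, fn) pairs applied in order.
    prev_inputs: dict mapping layer name -> input seen on the last pass.
    Returns (output, used_cache)."""
    value = x
    for name, fn in layers:
        ref = prev_inputs.get(name)
        prev_inputs[name] = value  # store the current input as the new previous value
        if ref is not None and np.mean(np.abs(value - ref)) < thresholds[name]:
            # Difference below the threshold: omit this layer and all later
            # ones, and replay the stored output instead.
            return cached_output, True
        value = fn(value)
    return value, False
```

On an unchanged input, the first comparison already triggers the early exit, so none of the layer functions run again.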

The feature extraction section 1300C executes a stored data output process (step ST3413).
The feature extraction unit 1300C ends the process shown in FIG. 18.

FIG. 19 is a flowchart showing a detailed first example of the processing of the abnormal state determination unit 1500C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
The abnormal state determination unit 1500C starts processing when it acquires the feature map output from the feature extraction unit 1300C. The abnormal state determination unit 1500C first executes processing on the data output by the feature extraction unit 1300C (step ST3510).

The abnormal state determination unit 1500C acquires the fully connected layer output (step ST3511).
The second state comparison section 1540 in the abnormal state determination section 1500C obtains the output value output from the first fully connected layer 1527.

The first fully connected layer state comparison unit 1541 executes a first fully connected layer state storage process (step ST3512). The first fully connected layer state comparison unit 1541 stores the output value from the first fully connected layer 1527, which is the input value to the second fully connected layer 1528, as a reference value.

The first fully connected layer state comparing unit 1541 executes a process of judging whether the state has been stored previously (step ST3513).
The first fully connected layer state comparison unit 1541 refers to the storage unit 1541a to determine whether a reference value (previous value) is stored.

If the first fully connected layer state comparison unit 1541 determines that the previous value has been stored ("YES" in step ST3513), it executes a comparison process between the current value and the previous value (step ST3514). The first fully connected layer state comparison unit 1541 compares the current value with a reference value (previous value).

The first fully connected layer state comparing unit 1541 executes a process of determining whether the difference is smaller than a threshold value (step ST3515).
The comparison processing unit 1541b judges whether the difference between the current value and the reference value (previous value) is large or small, using a difference between the current value and the reference value and a pre-stored threshold value.

If the first fully connected layer state comparison unit 1541 determines that the difference is greater than the threshold value, normal processing is executed in the second fully connected layer 1528, and the second fully connected layer state comparison unit 1542 executes second fully connected layer state storage processing (step ST3516).
The second fully connected layer state comparison unit 1542 stores the output value from the second fully connected layer 1528, which is the input value to the probability output layer 1529, as a reference value.

The second fully connected layer state comparing unit 1542 executes a process of judging whether the state has been stored previously (step ST3517).
The second fully connected layer state comparison unit 1542 refers to the storage unit 1542a to determine whether a reference value (previous value) is stored.

If the second fully connected layer state comparison unit 1542 determines that the previous value has been stored ("YES" in step ST3517), it executes a comparison process between the current value and the previous value (step ST3518). The second fully connected layer state comparison unit 1542 compares the current value with a reference value (previous value).

The second fully connected layer state comparing unit 1542 executes a process of determining whether the difference is smaller than the threshold value (step ST3519).
The comparison processing unit 1542b judges whether the difference between the current value and the reference value (previous value) is large or small, using a difference between the current value and the reference value (previous value) and a pre-stored threshold value.

If the first fully connected layer state comparison unit 1541 determines that the difference is greater than the threshold value (step ST3515 "NO"), and if the second fully connected layer state comparison unit 1542 determines that the difference is greater than the threshold value (step ST3519 "NO"), the abnormal state determination unit 1500C executes normal processing command processing (step ST3611).

Abnormal state determination section 1500C executes output processing (step ST3711).
Neural network section 1520 (second neural network section 1520) in abnormal state determination section 1500C outputs a determination value.

Abnormal state determination section 1500C executes an output storage process (step ST3712).
An output storage unit 1590 (second output storage unit 1590) in the abnormal state determination unit 1500C stores the determination value output by the neural network unit 1520 (second neural network unit 1520).

When the first fully connected layer state comparison unit 1541 determines that the difference is smaller than the threshold value (step ST3515 "YES"), or when the second fully connected layer state comparison unit 1542 determines that the difference is smaller than the threshold value (step ST3519 "YES"), the abnormal state determination unit 1500C executes an omission processing command (step ST3612).
In the omission processing command, the abnormal state judgment processing control unit 1581 instructs the second neural network unit 1520 not to execute processing in layers subsequent to the layer to be judged, and also instructs the second output storage unit 1590 to output the judgment value stored therein.

Abnormal state determination section 1500C executes a stored data output process (step ST3713).
The second output storage section 1590 outputs the decision value.

When the second neural network unit 1520 or the second output storage unit 1590 outputs the judgment value, the series of processes shown in FIG. 19 ends.
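Taken together, FIGS. 18 and 19 describe each stage as "recompute on change, otherwise replay the stored output." A minimal closure-based sketch of that contract follows; the names are assumptions, and exact equality stands in for the thresholded difference the text describes.

```python
import numpy as np

def make_stage(fn):
    """Wrap a processing stage so it recomputes only when its input has
    changed, replaying the stored output otherwise. A per-stage
    simplification of the per-layer scheme in the text."""
    state = {"prev_in": None, "prev_out": None}

    def stage(x):
        if state["prev_in"] is not None and np.array_equal(x, state["prev_in"]):
            return state["prev_out"], True   # omission: reuse the stored output
        state["prev_in"] = x
        state["prev_out"] = fn(x)
        return state["prev_out"], False      # normal processing
    return stage
```

Feeding the same frame twice yields the same result while running the wrapped function only once, which is the computational saving the omission command targets.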

Next, another detailed example of the processing of the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment will be described.
FIG. 20 is a flowchart showing a second detailed example of the processing of the feature extraction unit 1300C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.

When the feature extraction section 1300C starts the process, first, the first state comparison section 1340C of the feature extraction section 1300C executes an image storage process (step ST3210).
In the image storage process, the image state comparison section 1341 of the first state comparison section 1340C stores the image that is the input data for the first neural network section 1320.
Furthermore, every time the first neural network unit 1320 acquires an image, the image state comparison unit 1341 stores the image in the storage unit 1341a.

The image state comparison section 1341 executes a process of determining whether the image has been previously stored (step ST3211).
The image state comparison section 1341 refers to the storage section 1341a to determine whether the previously input image is stored therein.

When it is determined that the image has been stored previously ("YES" in step ST3211), the image state comparison unit 1341 executes a process of comparing the current image with the previous image (step ST3212).
The comparison processing section 1341b of the image state comparison section 1341 compares the currently input image with the previously input image.

The comparison processing unit 1341b executes a process to determine whether the difference is less than a threshold value (step ST3213). The comparison processing unit 1341b determines the magnitude of the difference between the reference value and the current value using the difference value between the currently input image and the previously input image and a pre-stored threshold value.

If the image state comparison unit 1341 determines that the value has not been stored previously (step ST3211 "NO"), or if the image state comparison unit 1341 determines that the difference between the previous value and the current value is equal to or greater than a threshold value (step ST3213 "NO"), the feature extraction processing control unit 1381C issues a normal processing command (step ST3231), and processing is performed by the convolution layer unit 1322.
Next, the convolution layer state comparison unit 1342 executes a convolution layer state storage process (step ST3214). In the convolution layer state storage process, the convolution layer state comparison unit 1342 stores input data for the convolution layer in the storage unit 1342a.

The convolutional layer state comparison unit 1342 executes a process to determine whether a previous value has been stored (step ST3215). The convolutional layer state comparison unit 1342 refers to the storage unit 1342a and determines whether a previous value (reference value), which is a value input previously, has been stored.

If the comparison processing unit 1342b of the convolutional layer state comparison unit 1342 determines that the previous state has been stored (step ST3215 "YES"), it executes a comparison process between the current state and the previous state (step ST3216).

The feature extraction unit 1300C executes a process to determine whether the difference is less than a threshold value (step ST3217). Specifically, the comparison processing unit 1342b of the convolution layer state comparison unit 1342 determines the magnitude of the difference between the previous value and the current value using the difference between the value to be input this time (current value) and the value input last time (previous value) to the pooling layer unit 1325 and a pre-stored threshold value.

If the convolutional layer state comparison unit 1342 determines that the value was not stored previously ("NO" in step ST3215), or if the comparison processing unit 1342b of the convolutional layer state comparison unit 1342 determines that the difference between the previous value and the current value is equal to or greater than a threshold value ("NO" in step ST3217), the feature extraction processing control unit 1381C issues a normal processing command (step ST3232), and processing is performed by the pooling layer unit 1325.
Next, in the pooling layer state storage process, the pooling layer state comparison unit 1343 stores the input data for the pooling layer (the input value to be processed) in the storage unit 1343a.

Then, the comparison processing unit 1343b in the pooling layer state comparison unit 1343 executes a process to determine whether the data was previously stored (step ST3219).

If the comparison processing unit 1343b of the pooling layer state comparison unit 1343 determines that the value was stored previously ("YES" in step ST3219), it executes a comparison process between the current value and the reference value (previous value) (step ST3220).

The comparison processing unit 1343b of the pooling layer state comparison unit 1343 executes a process to determine whether the difference between the reference value (previous value) and the current value is less than a threshold value (step ST3221).

If the comparison processing unit 1343b of the pooling layer state comparison unit 1343 determines that the difference between the reference value (previous value) and the current value is equal to or greater than the threshold value (step ST3221 "NO"), the feature extraction unit 1300C executes normal processing command processing (step ST3311). In the normal processing command processing, the feature extraction processing control unit 1381 issues a normal processing command to the first neural network unit 1320. The image combination unit 1328 in the first neural network unit 1320 executes image combination processing.

The feature extraction section 1300C executes an output storage process (step ST3412).
The combined image storage section 1391 of the output storage section in the feature extraction section 1300C stores the combined image output from the first neural network section 1320.

If any of the comparison processing unit 1341b of the image state comparison unit 1341, the comparison processing unit 1342b of the convolution layer state comparison unit 1342, and the comparison processing unit 1343b of the pooling layer state comparison unit 1343 determines that the difference between the previous value and the current value is smaller than the threshold value (the difference value is less than the threshold value) (step ST3213 "YES", step ST3217 "YES", step ST3221 "YES"), the feature extraction unit 1300C executes an omission processing command processing (step ST3312).
In the omission processing command processing, the feature extraction processing control unit 1381 commands the first neural network unit 1320 not to execute processing in layers subsequent to the layer to be judged, and also commands the first output storage unit 1390 to output the feature map stored therein.

The feature extraction section 1300C executes a stored data output process (step ST3413).
The feature extraction unit 1300C ends the process shown in FIG. 20.

FIG. 21 is a flowchart showing a second detailed example of the processing of the abnormal state determination unit 1500C in the abnormality warning device 100C and the abnormality determination device 1000C according to the third embodiment of the present disclosure.
The abnormal state determination unit 1500C starts processing when it acquires the feature map output from the feature extraction unit 1300C. The abnormal state determination unit 1500C first executes processing on the data output by the feature extraction unit 1300C (step ST3510).

The abnormal state determination unit 1500C acquires the fully connected layer output (step ST3511).
The second state comparison section 1540 in the abnormal state determination section 1500C obtains the output value output from the first fully connected layer 1527.

The first layer state comparison unit 1541 executes a first layer state storage process (step ST3512). The first fully connected layer state comparison unit 1541 stores the output value from the first fully connected layer 1527, which is the input value to the second fully connected layer 1528 (the value before being processed by the second fully connected layer 1528), as a reference value.

The first layer state comparison unit 1541 executes a process of determining whether a previous value has been stored (step ST3513).
The first layer state comparison unit 1541 refers to the storage unit 1541a and determines whether a reference value (previous value) is stored.

If the first layer state comparison unit 1541 determines that a previous value has been stored (step ST3513 "YES"), it executes a comparison process between the current value and the previous value (step ST3514). The first fully connected layer state comparison unit 1541 compares the current value with the reference value (previous value).

The first layer state comparison unit 1541 executes a process of determining whether the difference is smaller than a threshold value (step ST3515). The comparison processing unit 1541b in the first layer state comparison unit 1541 determines the magnitude of the difference between the reference value and the current value using the difference between the current value and the reference value (previous value) and a pre-stored threshold value.
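The magnitude check performed by the comparison processing units (1541b, 1542b) can be sketched as follows. This is an illustrative sketch, not the disclosed implementation: the function name, the use of a mean absolute difference as the "difference value", and the list representation of a layer input are all assumptions.

```python
def diff_below_threshold(current, reference, threshold):
    """Return True when the value newly input to a layer differs from the
    stored reference (previous) value by less than the threshold.

    The element-wise differences are aggregated here as a mean absolute
    difference; the disclosure only requires "a difference value" between
    the current value and the reference value.
    """
    diff = sum(abs(c - r) for c, r in zip(current, reference)) / len(current)
    return diff < threshold
```

A True result corresponds to the "difference < threshold" branch (e.g. step ST3515 "YES"), i.e. the state of the judgment target is treated as unchanged.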

If the first layer state comparison unit 1541 determines that no previous value was stored (step ST3513 "NO"), or if the comparison processing unit 1541b in the first layer state comparison unit 1541 determines that the difference between the current value and the reference value (previous value) is equal to or greater than the threshold value (step ST3515 "NO"), the abnormal state determination processing control unit 1581C issues a normal processing command (step ST3520), and processing is performed by the second fully connected layer 1528.

Then, the second fully connected layer state comparison unit 1542 executes a second fully connected layer state storage process (step ST3516). The second fully connected layer state comparison unit 1542 stores, as a reference value, the output value from the second fully connected layer 1528, which is the input value to the probability output layer 1529 (the value before being processed by the probability output layer 1529).

Then, the second fully connected layer state comparison unit 1542 executes a process to determine whether a previous value has been stored (step ST3517). The second fully connected layer state comparison unit 1542 refers to the storage unit 1542a to determine whether a reference value (previous value) is stored.

If the second fully connected layer state comparison unit 1542 determines that a previous value has been stored (step ST3517 "YES"), it executes a comparison process between the current value and the previous value (step ST3518). The second fully connected layer state comparison unit 1542 compares the current value with the reference value (previous value).

The second fully connected layer state comparison unit 1542 executes a process of determining whether the difference is less than the threshold value (step ST3519). The comparison processing unit 1542b in the second fully connected layer state comparison unit 1542 determines the magnitude of the difference between the reference value and the current value using the difference between the current value and the reference value (previous value) and a pre-stored threshold value.

If the second fully connected layer state comparison unit 1542 determines that no previous value was stored (step ST3517 "NO"), or if the comparison processing unit 1542b in the second fully connected layer state comparison unit 1542 determines that the difference is equal to or greater than the threshold value (step ST3519 "NO"), the abnormal state determination unit 1500C executes a normal processing command process (step ST3611).

The abnormal state determination unit 1500C executes output processing (step ST3711).
The second neural network unit 1520 in the abnormal state determination unit 1500C outputs a judgment value.

The abnormal state determination unit 1500C executes an output storage process (step ST3712).
The second output storage unit 1590 in the abnormal state determination unit 1500C stores the judgment value output by the second neural network unit 1520 in the probability storage unit 1591.

If the first layer state comparison unit 1541 determines that the difference is smaller than the threshold value (below the threshold), or if the comparison processing unit 1542b in the second layer state comparison unit 1542 determines that the difference is smaller than the threshold value, the abnormal state determination unit 1500C executes an omission processing command (step ST3519).
In the omission processing command, the abnormal state judgment processing control unit 1581C instructs the second neural network unit 1520 not to execute processing in layers subsequent to the layer to be judged, and also instructs it to output the judgment value stored in the second output storage unit 1590.

The abnormal state determination unit 1500C executes the stored data output process (step ST3713). The second output storage unit 1590 receives a command from the abnormal state determination processing control unit 1581C and outputs the judgment value (probability value) stored in the probability storage unit 1591.

When the second neural network unit 1520 or the second output storage unit 1590 outputs the judgment value (probability value), the series of processes shown in FIG. 21 ends.

As described above, according to the third embodiment, if each state storage and comparison processing unit determines that the change is small, it is possible to avoid subsequent processing, thereby reducing the amount of calculations and enabling faster processing.
At this time, the result that is output is the previous highly accurate determination result, so that the output of highly accurate results can be continued.
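The skip mechanism of this embodiment (store each layer's input as a reference, compare it with the next frame's input, and reuse the stored output when the change is small) can be sketched roughly as follows. All names, the cache layout, and the difference metric are assumptions for illustration, not the patented implementation.

```python
def diff_small(a, b, threshold):
    # Mean absolute difference as an illustrative "difference value".
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a) < threshold

def run_with_skip(layers, x, cache, thresholds):
    """Run a stack of layers frame by frame; if the input to some layer is
    almost unchanged from the previous frame, skip that layer and all
    subsequent layers and return the previously stored final output."""
    for i, layer in enumerate(layers):
        prev = cache.get(i)   # reference value stored on the previous pass
        cache[i] = x          # state storage process: keep the current value
        if prev is not None and diff_small(x, prev, thresholds[i]):
            # Omission processing command: later layers are not executed and
            # the stored (previous, high-accuracy) output is reused.
            return cache["out"]
        x = layer(x)          # normal processing command: execute the layer
    cache["out"] = x          # output storage process
    return x
```

On a second frame whose values are nearly identical to the first, the loop exits at the first layer and returns the cached output, saving the computation of every remaining layer.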

The abnormality determination device of the present disclosure is further configured as follows.

"The neural network unit includes a first neural network unit and a second neural network unit,
The output storage unit includes a first output storage unit and a second output storage unit,
the state comparison unit includes a first state comparison unit and a second state comparison unit,
The processing control unit includes a first processing control unit and a second processing control unit,
the first neural network unit having a plurality of layers that are a part of the layers constituting the neural network, acquiring an image output by the image acquisition unit, and outputting a feature map that represents a characteristic state of the object to be determined that is included in the image;
the first output storage unit that stores the feature map output by the first neural network unit;
the first state comparison unit, which stores in advance a reference value that is a reference for an input value to each of a plurality of layers in the first neural network unit, and judges a change in the state of the object to be judged using a difference value between the reference value and a current value that is a value that is to be newly input to the layer;
and,
the first processing control unit instructing the first neural network unit not to execute processing in the layer of the object to be determined and subsequent layers when the first state comparison unit determines that the state of the object to be determined has not changed, and instructing the first output storage unit to output the feature map stored therein;
A feature extraction unit including:
the second neural network unit having a plurality of layers in the neural network, acquiring the feature map output by the feature extraction unit, and outputting a judgment value indicating a state of the object to be judged as the output data using the feature map;
the second output storage unit that stores the judgment value output by the second neural network unit;
the second state comparison unit, which stores in advance a reference value that is a reference for an input value to each layer of the second neural network unit, and judges a change in the state of the object to be judged using a difference value between the reference value and a current value that is a value that is to be newly input to the layer;
and,
the second processing control unit instructing the second neural network unit not to execute processing in the layer of the object to be judged and subsequent layers when the second state comparison unit judges that the state of the object to be judged has not changed, and instructing the second neural network unit to output the judgment value stored in the second output storage unit as the output data;
An abnormality state determination unit including:
An abnormality determination device equipped with the above."

As a result, the present disclosure has the effect of providing an abnormality determination device that can use a neural network in an abnormality determination technique to more quickly output highly accurate determination results.
Furthermore, the present disclosure achieves the same effect as the above by applying the above configuration to the abnormality determination method.
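The two-stage configuration described above (a feature extraction unit followed by an abnormal state determination unit, each with its own neural network part, output storage, state comparison, and processing control) could be composed along the following lines. This is a hedged sketch under assumed names and a list-based data representation; it is not the disclosed implementation.

```python
class SkippableStage:
    """One stage with its own layers, per-layer reference values,
    thresholds, and an output storage for the last fully computed result."""

    def __init__(self, layers, thresholds):
        self.layers = layers
        self.thresholds = thresholds
        self.refs = {}           # per-layer reference (previous) inputs
        self.stored_out = None   # output storage unit

    def __call__(self, x):
        for i, layer in enumerate(self.layers):
            prev = self.refs.get(i)
            self.refs[i] = x
            if prev is not None and self._small(x, prev, self.thresholds[i]):
                # No change: skip this and later layers, reuse stored output.
                return self.stored_out
            x = layer(x)
        self.stored_out = x
        return x

    @staticmethod
    def _small(a, b, th):
        return sum(abs(p - q) for p, q in zip(a, b)) / len(a) < th

# First stage outputs a "feature map"; second stage outputs a judgment value.
feature_extractor = SkippableStage([lambda v: [x + 1 for x in v]], [0.01])
state_determiner = SkippableStage([lambda v: [x * 10 for x in v]], [0.01])

def judge(image):
    return state_determiner(feature_extractor(image))
```

Note that even when the first stage skips, it still emits its stored feature map, so the second stage can in turn compare that map against its own reference and skip as well.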

Here, a hardware configuration for realizing the functions of the abnormality warning device 100 (100A, 100B, 100C) and the abnormality determination device 1000 (1000A, 1000B, 1000C) of the present disclosure described above will be described.
FIG. 22 is a diagram illustrating a first example of a hardware configuration for realizing the functions of the present disclosure.
FIG. 23 is a diagram illustrating a second example of a hardware configuration for realizing the functions of the present disclosure.
The abnormality warning device 100 (100A, 100B, 100C) and the abnormality determination device 1000 (1000A, 1000B, 1000C) of the present disclosure are each realized by hardware such as that shown in FIG. 22 or FIG. 23.

As shown in FIG. 22, each of the abnormality warning devices 100 (100A, 100B, 100C) and the abnormality determination devices 1000 (1000A, 1000B, 1000C) is configured with, for example, a processor 10001, a memory 10002, and a communication circuit 10004.
The processor 10001 and the memory 10002 are installed in, for example, a computer.
The memory 10002 stores a program for causing the computer to function as the image acquisition unit 1010, the neural network unit 1020, the state comparison unit 1030, the processing control unit 1040, the output storage unit 1050, the image acquisition units 1100, 1100B, 1100C, the feature extraction units 1300, 1300B, 1300C, the first neural network unit 1320, the image branching unit 1321, the convolution layer unit 1322, the first convolution layer 1323, the second convolution layer 1324, the pooling layer unit 1325, the first pooling layer 1326, the second pooling layer 1327, the image combination unit 1328, the first state comparison units 1340, 1340B, 1340C, the image state comparison unit 1341, the comparison processing unit 1341b, the convolution layer state comparison unit 1342, the comparison processing unit 1342b, the pooling layer state comparison unit 1343, the comparison processing unit 1343b, the first processing control units 1380, 1380B, 1380C, the feature extraction processing control units 1381, 1381B, 1381C, the abnormal state determination units 1500, 1500B, 1500C, the second neural network unit 1520, the state classification unit 1525, the first fully connected layer 1527, the second fully connected layer 1528, the probability output layer 1529, the second state comparison unit 1540, the first fully connected layer state comparison unit 1541, the comparison processing unit 1541b, the second fully connected layer state comparison unit 1542, the comparison processing unit 1542b, the second processing control units 1580, 1580B, 1580C, the abnormal state determination processing control units 1581, 1581B, 1581C, the alarm output unit 2000, and a control unit not shown.
The processor 10001 reads out and executes the program stored in the memory 10002, thereby realizing the functions of these units.
Furthermore, a storage unit (not shown) is realized by the memory 10002 or another memory (not shown). Furthermore, the storage units 1341a, 1342a, 1343a, the first output storage unit 1390, the combined image storage unit 1391, the storage units 1541a, 1542a, the second output storage unit 1590, and the probability storage unit 1591 in the abnormality warning device 100 (100A, 100B, 100C) and the abnormality determination device 1000 (1000A, 1000B, 1000C) are realized by the memory 10002 or another memory (not shown).
Furthermore, the communication circuit 10004 realizes a communication unit (not shown).
The processor 10001 is, for example, a central processing unit (CPU), a graphics processing unit (GPU), a microprocessor, a microcontroller, or a digital signal processor (DSP).
The memory 10002 may be a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable Read Only Memory), or a flash memory, or a magnetic disk such as a hard disk or a flexible disk, or an optical disk such as a CD (Compact Disc) or a DVD (Digital Versatile Disc), or a magneto-optical disk.
The processor 10001 and the memory 10002 or the communication circuit 10004 are connected in a state in which data can be transmitted between them. The processor 10001, the memory 10002, and the communication circuit 10004 are also connected via the input/output interface 10003 in a state in which data can be transmitted between them and other hardware.

Or, the image acquisition unit 1010, the neural network unit 1020, the state comparison unit 1030, the processing control unit 1040, the output storage unit 1050, the image acquisition units 1100, 1100B, 1100C, the feature extraction units 1300, 1300B, 1300C, the first neural network unit 1320, the image branching unit 1321, the convolution layer unit 1322, the first convolution layer 1323, the second A convolution layer 1324, a pooling layer unit 1325, a first pooling layer 1326, a second pooling layer 1327, an image combination unit 1328, a first state comparison unit 1340, 1340B, 1340C, an image state comparison unit 1341, a comparison processing unit 1341b, a convolution layer state comparison unit 1342, a comparison processing unit 1342b, a pooling layer state comparison unit 1343, a comparison processing unit 1343b, a first The functions of the processing control units 1380, 1380B, 1380C, feature extraction processing control units 1381, 1381B, 1381C, abnormal state determination units 1500, 1500B, 1500C, second neural network unit 1520, state classification unit 1525, first fully connected layer 1527, second fully connected layer 1528, probability output layer 1529, second state comparison unit 1540, first fully connected layer state comparison unit 1541, comparison processing unit 1541b, second fully connected layer state comparison unit 1542, comparison processing unit 1542b, second processing control units 1580, 1580B, 1580C, abnormal state determination processing control units 1581, 1581B, 1581C, alarm output unit 2000, and a control unit not shown in the figure may be realized by a dedicated processing circuit 20001, as shown in FIG. 23.
The processing circuit 20001 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field-Programmable Gate Array), a SoC (System-on-a-Chip), or a system LSI (Large-Scale Integration).
Furthermore, the memory 20002 or another memory (not shown) implements a storage unit (not shown).
The memory 20002 may be a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable Read Only Memory), or a flash memory, or a magnetic disk such as a hard disk or a flexible disk, or an optical disk such as a CD (Compact Disc) or a DVD (Digital Versatile Disc), or a magneto-optical disk.
Furthermore, the communication circuit 20004 realizes a communication unit (not shown).
The processing circuit 20001 and the memory 20002 or the communication circuit 20004 are connected in a state in which they can transmit data to each other. In addition, the processing circuit 20001, the memory 20002, and the communication circuit 20004 are connected in a state in which they can transmit data to other hardware via the input/output interface 20003.
The image acquisition unit 1010, the neural network unit 1020, the state comparison unit 1030, the processing control unit 1040, the output storage unit 1050, the image acquisition units 1100, 1100B, and 1100C, the feature extraction units 1300, 1300B, and 1300C, the first neural network unit 1320, the image branching unit 1321, the convolution layer unit 1322, the first convolution layer 1323, the second A convolutional layer 1324, a pooling layer unit 1325, a first pooling layer 1326, a second pooling layer 1327, an image combination unit 1328, a first state comparison unit 1340, 1340B, 1340C, an image state comparison unit 1341, a comparison processing unit 1341b, a convolutional layer state comparison unit 1342, a comparison processing unit 1342b, a pooling layer state comparison unit 1343, a comparison processing unit 1343b, The functions of the first processing control units 1380, 1380B, 1380C, feature extraction processing control units 1381, 1381B, 1381C, abnormal state determination units 1500, 1500B, 1500C, second neural network unit 1520, state classification unit 1525, first fully connected layer 1527, second fully connected layer 1528, probability output layer 1529, second state comparison unit 1540, first fully connected layer state comparison unit 1541, comparison processing unit 1541b, second fully connected layer state comparison unit 1542, comparison processing unit 1542b, second processing control units 1580, 1580B, 1580C, abnormal state determination processing control units 1581, 1581B, 1581C, alarm output unit 2000, and a control unit not shown in the figure may be realized by separate processing circuits, or may be realized together by a processing circuit.

Alternatively, some of the functions of the units listed above, from the image acquisition unit 1010 through the control unit not shown, may be realized by the processor 10001 and the memory 10002, and the remaining functions may be realized by the processing circuit 20001.

Note that the present disclosure allows free combination of the embodiments, modification of any component of the embodiments, or omission of any component of the embodiments.

The present disclosure makes it possible, in abnormality determination technology, to quickly output highly accurate determination results using a neural network without performing calculations in all layers of the neural network. This enables fast, highly accurate abnormality determination such as drowsiness detection, and is therefore suitable for application to, for example, in-vehicle driver monitoring systems.

100, 100A, 100B, 100C: abnormality warning device, 1000, 1000A, 1000B, 1000C: abnormality determination device, 1010: image acquisition unit, 1020: neural network unit, 1030: state comparison unit, 1040: processing control unit, 1050: output storage unit, 1100, 1100B, 1100C: image acquisition unit, 1300, 1300B, 1300C: feature extraction unit, 1320: neural network unit (first neural network unit), 1321: image branching unit, 1322: convolution layer unit, 1323: first convolution layer, 1324: second convolution layer, 1325: pooling layer unit, 1326: first pooling layer, 1327: second pooling layer, 1328: image combination unit, 1340, 1340B, 1340C: state comparison unit (first state comparison unit), 1341: image state comparison unit, 1341a: storage unit, 1341b: comparison processing unit, 1342: convolution layer state comparison unit, 1342a: storage unit, 1342b: comparison processing unit, 1343: pooling layer state comparison unit, 1343a: storage unit, 1343b: comparison processing unit, 1380, 1380B, 1380C: processing control unit (first processing control unit), 1381, 1381B, 1381C: feature extraction processing control unit, 1390: output storage unit (first output storage unit), 1391: combined image storage unit, 1500, 1500B, 1500C: abnormal state determination unit, 1520: neural network unit (second neural network unit), 1525: state classification unit, 1527: first fully connected layer, 1528: second fully connected layer, 1529: probability output layer, 1540: state comparison unit (second state comparison unit), 1541: first layer state comparison unit (first fully connected layer state comparison unit), 1541a: storage unit, 1541b: comparison processing unit, 1542: second layer state comparison unit (second fully connected layer state comparison unit), 1542a: storage unit, 1542b: comparison processing unit, 1580, 1580B, 1580C: processing control unit (second processing control unit), 1581, 1581B, 1581C: abnormal state determination processing control unit, 1590: output storage unit (second output storage unit), 1591: probability storage unit, 2000: alarm output unit, 10001: processor, 10002: memory, 10003: input/output interface, 10004: communication circuit, 20001: processing circuit, 20002: memory, 20003: input/output interface, 20004: communication circuit.

Claims (14)

1. An abnormality determination device comprising:
 an image acquisition unit that acquires and outputs an image;
 a neural network unit that has a plurality of layers in a neural network, acquires the image output by the image acquisition unit, and outputs a determination result indicating a state of a determination target included in the image;
 an output storage unit that stores output data output by the neural network unit;
 a state comparison unit that stores in advance a reference value serving as a reference for an input value to a first layer, the first layer being at least one of the plurality of layers of the neural network unit, and determines a change in the state of the determination target using a difference value between the reference value and a current value, the current value being a value about to be newly input to the first layer; and
 a processing control unit that, when the state comparison unit determines that the state of the determination target has not changed, instructs the neural network unit not to execute processing in the first layer and subsequent layers, and instructs output of the output data stored in the output storage unit.
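The mechanism described above — comparing the fresh input of a guarded layer against a stored reference value and reusing the stored output when the difference is negligible — can be illustrated with a minimal sketch. All class and variable names, the toy two-layer "network", and the threshold are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

class SkipLayerNetwork:
    """Sketch of the claimed control flow: before running the designated
    "first layer", its new input (current value) is compared against a stored
    reference value; if the difference is below a threshold, processing of the
    first layer and all subsequent layers is skipped and the stored output
    data is returned instead."""

    def __init__(self, layers, skip_index, threshold):
        self.layers = layers            # list of callables, one per layer
        self.skip_index = skip_index    # index of the guarded "first layer"
        self.threshold = threshold      # pre-stored threshold
        self.reference = None           # reference value for the guarded layer
        self.cached_output = None       # plays the role of the output storage unit

    def forward(self, x):
        for i, layer in enumerate(self.layers):
            if i == self.skip_index:
                if (self.reference is not None
                        and np.sum((x - self.reference) ** 2) < self.threshold):
                    # State unchanged: skip layers i..N, reuse stored output.
                    return self.cached_output
                self.reference = x      # update reference with the current value
            x = layer(x)
        self.cached_output = x          # store the new output data
        return x

# Hypothetical two-layer "network" for demonstration only.
net = SkipLayerNetwork([lambda v: v * 2.0, lambda v: v + 1.0],
                       skip_index=1, threshold=1e-6)
a = net.forward(np.array([1.0, 2.0]))
b = net.forward(np.array([1.0, 2.0]))   # identical input: cached result reused
print(np.array_equal(a, b))             # True
```

In a real convolutional network the guarded layer's input would be a feature map rather than a small vector, but the skip decision has the same shape: one comparison against a stored reference, then either full processing or an immediate return of stored data.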
2. The abnormality determination device according to claim 1, wherein
 the neural network unit includes a first neural network unit and a second neural network unit,
 the output storage unit includes a first output storage unit and a second output storage unit,
 the state comparison unit includes a first state comparison unit and a second state comparison unit,
 the processing control unit includes a first processing control unit and a second processing control unit, and
 the abnormality determination device comprises:
 a feature extraction unit including:
  the first neural network unit, which has a plurality of layers that are a part of the layers constituting the neural network, acquires the image output by the image acquisition unit, and outputs a feature map representing a characteristic state of the determination target included in the image;
  the first output storage unit, which stores the feature map output by the first neural network unit;
  the first state comparison unit, which stores in advance a reference value serving as a reference for an input value to a second layer, the second layer being at least one of the layers in the first neural network unit, and determines a change in the state of the determination target using a difference value between the reference value and a current value, the current value being a value about to be newly input to the second layer; and
  the first processing control unit, which, when the first state comparison unit determines that the state of the determination target has not changed, instructs the first neural network unit not to execute processing in the second layer and subsequent layers for the determination target, and instructs output of the feature map stored in the first output storage unit; and
 an abnormal state determination unit including:
  the second neural network unit, which has a plurality of layers that are a part of the layers constituting the neural network, acquires the feature map output by the feature extraction unit, and outputs, as the output data, a determination value indicating the state of the determination target using the feature map;
  the second output storage unit, which stores the determination value output by the second neural network unit;
  the second state comparison unit, which stores in advance a reference value serving as a reference for an input value to a third layer, the third layer being at least one of the layers in the second neural network unit, and determines a change in the state of the determination target using a difference value between the reference value and a current value, the current value being a value about to be newly input to the third layer; and
  the second processing control unit, which, when the second state comparison unit determines that the state of the determination target has not changed, instructs the second neural network unit not to execute processing in the third layer and subsequent layers for the determination target, and instructs output, as the output data, of the determination value stored in the second output storage unit.
3. The abnormality determination device according to claim 1, wherein
 the neural network unit includes a first neural network unit and a second neural network unit,
 the output storage unit includes a first output storage unit and a second output storage unit,
 the state comparison unit includes a first state comparison unit and a second state comparison unit,
 the processing control unit includes a first processing control unit and a second processing control unit, and
 the abnormality determination device comprises:
 a feature extraction unit including:
  the first neural network unit, which has a plurality of layers that are a part of the layers constituting the neural network, acquires the image output by the image acquisition unit, and outputs a feature map representing a characteristic state of the determination target included in the image;
  the first output storage unit, which stores the feature map output by the first neural network unit;
  the first state comparison unit, which stores in advance, for each of the plurality of layers in the first neural network unit, a reference value serving as a reference for an input value to the layer, and determines a change in the state of the determination target using a difference value between the reference value and a current value, the current value being a value about to be newly input to the layer; and
  the first processing control unit, which, when the first state comparison unit determines that the state of the determination target has not changed, instructs the first neural network unit not to execute processing in that layer and subsequent layers for the determination target, and instructs output of the feature map stored in the first output storage unit; and
 an abnormal state determination unit including:
  the second neural network unit, which has a plurality of layers in the neural network, acquires the feature map output by the feature extraction unit, and outputs, as the output data, a determination value indicating the state of the determination target using the feature map;
  the second output storage unit, which stores the determination value output by the second neural network unit;
  the second state comparison unit, which stores in advance, for each of the plurality of layers of the second neural network unit, a reference value serving as a reference for an input value to the layer, and determines a change in the state of the determination target using a difference value between the reference value and a current value, the current value being a value about to be newly input to the layer; and
  the second processing control unit, which, when the second state comparison unit determines that the state of the determination target has not changed, instructs the second neural network unit not to execute processing in that layer and subsequent layers for the determination target, and instructs output, as the output data, of the determination value stored in the second output storage unit.
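The two-stage arrangement of a feature extraction unit feeding an abnormal state determination unit, each with its own skip logic, can be sketched as a cascade of guarded stages. The stage contents (a ReLU and a summation) and thresholds are placeholders of my own, chosen only to make the control flow runnable.

```python
import numpy as np

def l2_changed(current, reference, threshold):
    """Change test at a guarded layer: sum of squared differences between the
    current value and the stored reference value, against a threshold."""
    return reference is None or np.sum((current - reference) ** 2) >= threshold

class GuardedStage:
    """One stage (feature extraction or abnormal state determination): runs
    its layers unless the guarded input is unchanged, in which case the
    stored output is returned without further processing."""
    def __init__(self, layers, threshold):
        self.layers, self.threshold = layers, threshold
        self.reference, self.stored = None, None

    def run(self, x):
        if not l2_changed(x, self.reference, self.threshold):
            return self.stored          # skip this stage's layers entirely
        self.reference = x              # update reference with current value
        for layer in self.layers:
            x = layer(x)
        self.stored = x                 # store this stage's output
        return x

# Cascade: even when the first stage runs, the second stage can still be
# skipped if the feature map it receives is unchanged.
feature_extraction = GuardedStage([lambda v: np.maximum(v, 0.0)], 1e-6)
state_determination = GuardedStage([lambda v: v.sum(keepdims=True)], 1e-6)

img = np.array([0.5, -0.2, 1.0])
result = state_determination.run(feature_extraction.run(img))
print(result)   # -> [1.5]
```

The per-stage split mirrors the claimed benefit: a change confined to early convolutional features never wakes the fully connected classification layers, and an unchanged image skips both stages.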
4. The abnormality determination device according to any one of claims 1 to 3, wherein the reference value is a stored previous value for each layer resulting from the previous processing in the neural network.
5. The abnormality determination device according to any one of claims 1 to 3, wherein the reference value is an output value of a typical intermediate layer of the neural network that has been modeled in advance.
6. The abnormality determination device according to any one of claims 1 to 3, wherein the reference value is an output value of a predetermined node among nodes included in the neural network.
7. The abnormality determination device according to claim 1, wherein the state comparison unit determines a change in the state of the determination target based on a result of determining the magnitude of the difference between the reference value and the current value using the difference value and a pre-stored threshold value.
8. The abnormality determination device according to claim 7, wherein the state comparison unit determines that the state of the determination target has not changed when, for each image, the sum of squares of the difference values is smaller than a pre-stored threshold value.
9. The abnormality determination device according to claim 2 or claim 3, wherein at least one of the first state comparison unit and the second state comparison unit determines a change in the state of the determination target based on a result of determining the magnitude of the difference between the reference value and the current value using the difference value and a pre-stored threshold value.
10. The abnormality determination device according to claim 9, wherein at least one of the first state comparison unit and the second state comparison unit determines that the state of the determination target has not changed when, for each image, the sum of squares of the difference values is smaller than a pre-stored threshold value.
11. The abnormality determination device according to claim 7, wherein the state comparison unit determines that the state of the determination target has not changed when, for each image, the sum of the absolute values of the difference values is smaller than a pre-stored threshold value.
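The two change tests above — the sum of squares of the difference values and the sum of absolute values of the difference values, each compared against a pre-stored threshold on a per-image basis — differ in how strongly they weight large individual differences. The array values and thresholds below are illustrative only.

```python
import numpy as np

def unchanged_sum_of_squares(diff, threshold):
    """No change if the sum of squares of the difference values
    is smaller than the pre-stored threshold."""
    return float(np.sum(diff ** 2)) < threshold

def unchanged_sum_of_abs(diff, threshold):
    """No change if the sum of absolute values of the difference values
    is smaller than the pre-stored threshold."""
    return float(np.sum(np.abs(diff))) < threshold

# Hypothetical per-image difference map: current value minus reference value.
reference = np.array([[0.10, 0.20], [0.30, 0.40]])
current   = np.array([[0.11, 0.20], [0.30, 0.41]])
diff = current - reference

# Squaring shrinks sub-unit differences, so the same threshold can pass the
# squared test while failing the absolute-value test.
print(unchanged_sum_of_squares(diff, threshold=1e-3))   # True  (~2e-4 < 1e-3)
print(unchanged_sum_of_abs(diff, threshold=1e-3))       # False (~0.02 >= 1e-3)
```

In practice the threshold would be tuned separately for each metric, since for differences smaller than 1 the squared sum is systematically smaller than the absolute sum.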
12. The abnormality determination device according to claim 9, wherein at least one of the first state comparison unit and the second state comparison unit determines that the state of the determination target has not changed when, for each image, the sum of the absolute values of the difference values is smaller than a pre-stored threshold value.
13. The abnormality determination device according to any one of claims 1 to 12, further comprising an alarm output unit that acquires the output data, the output data being a determination value indicating a drowsy state or a dozing state of a person to be determined, and outputs an alarm to the person to be determined according to the determination value.
14. An abnormality determination method comprising:
 outputting, by an image acquisition unit, an acquired image to a neural network unit;
 acquiring, by the neural network unit having a plurality of layers in a neural network, the image output by the image acquisition unit, and outputting a determination result indicating a state of a determination target included in the image;
 storing, by an output storage unit, output data output by the neural network unit;
 storing in advance, by a state comparison unit, a reference value serving as a reference for an input value to a first layer, the first layer being at least one of the plurality of layers of the neural network unit, and determining a change in the state of the determination target using a difference value between the reference value and a current value, the current value being a value about to be newly input to the first layer; and
 when the state comparison unit determines that the state of the determination target has not changed, instructing, by a processing control unit, the neural network unit not to execute processing in the first layer and subsequent layers, and instructing output of the output data stored in the output storage unit.
PCT/JP2023/018362 2023-05-17 2023-05-17 Abnormality determination device and abnormality determination method Pending WO2024236748A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2025520317A JP7696532B2 (en) 2023-05-17 2023-05-17 Abnormality determination device and abnormality determination method
PCT/JP2023/018362 WO2024236748A1 (en) 2023-05-17 2023-05-17 Abnormality determination device and abnormality determination method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/018362 WO2024236748A1 (en) 2023-05-17 2023-05-17 Abnormality determination device and abnormality determination method

Publications (1)

Publication Number Publication Date
WO2024236748A1 true WO2024236748A1 (en) 2024-11-21

Family

ID=93518926

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/018362 Pending WO2024236748A1 (en) 2023-05-17 2023-05-17 Abnormality determination device and abnormality determination method

Country Status (2)

Country Link
JP (1) JP7696532B2 (en)
WO (1) WO2024236748A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06139224A (en) * 1992-09-08 1994-05-20 Hitachi Ltd Information processing device and monitoring device
US20200174748A1 (en) * 2018-11-30 2020-06-04 Advanced Micro Devices, Inc. Sorting Instances of Input Data for Processing through a Neural Network
US20210072984A1 (en) * 2019-09-10 2021-03-11 Micron Technology, Inc. Re-USING PROCESSING ELEMENTS OF AN ARTIFICIAL INTELLIGENCE PROCESSOR
WO2022070947A1 (en) * 2020-09-30 2022-04-07 ソニーセミコンダクタソリューションズ株式会社 Signal processing device, imaging device, and signal processing method


Also Published As

Publication number Publication date
JPWO2024236748A1 (en) 2024-11-21
JP7696532B2 (en) 2025-06-20

Similar Documents

Publication Publication Date Title
US12159232B2 (en) Apparatus and method with neural network implementation of domain adaptation
EP3509011B1 (en) Apparatuses and methods for recognizing a facial expression robust against change in facial expression
US20230071940A1 (en) Long range lidar-based speed estimation
US20190164057A1 (en) Mapping and quantification of influence of neural network features for explainable artificial intelligence
EP3674974B1 (en) Apparatus and method with user verification
CN107303907B (en) Device and method for determining drowsiness of driver
CN111295676B (en) Method and device for automatically generating artificial neural networks
JP6969254B2 (en) Image processing equipment and programs
CN113807298B (en) Pedestrian crossing intention prediction method and device, electronic equipment and readable storage medium
CN113243021A (en) Method for training a neural network
JP7696532B2 (en) Abnormality determination device and abnormality determination method
US12354292B2 (en) Object ranging apparatus, method, and computer readable medium
US11636698B2 (en) Image processing method and apparatus with neural network adjustment
CN112714916A (en) Machine learning system, method of creating machine learning system, computer program, and apparatus
EP4125063A1 (en) Methods and systems for occupancy class prediction and methods and systems for occlusion value determination
EP4002270A1 (en) Image recognition evaluation program, image recognition evaluation method, evaluation device, and evaluation system
EP3792814B1 (en) Method and system for selecting an operation mode for an at least partly self-driving vehicle
CN117657170B (en) Intelligent safety and whole vehicle control method and system for new energy automobile
KR20210079823A (en) Electronic device and Method for controlling the electronic device thereof
CN113168571A (en) Method for training a neural network
US11893086B2 (en) Shape-biased image classification using deep convolutional networks
KR102324231B1 (en) Apparatus for Detecting Emergency Situation while Driving through Facial Expression Classification
JP7175381B2 (en) Arousal level estimation device, automatic driving support device, and arousal level estimation method
CN114186223A (en) System and method for improving measurements of an intrusion detection system by transforming one-dimensional measurements into a multi-dimensional image
CN112529000A (en) Apparatus and method for training segmentation model

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23937490

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2025520317

Country of ref document: JP