WO2017150073A1 - Work operation analysis system, work operation analysis method, and work operation analysis program - Google Patents
- Publication number
- WO2017150073A1 (PCT/JP2017/004022)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- analysis
- information
- time
- work
- time stamp
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Definitions
- the present invention relates to a technique for recognizing work operations.
- Patent Document 1 discloses a method using moving images.
- feature data is calculated from a moving image, and the moving image is divided by finding a time-series change of the feature data, that is, a change in motion.
- time-series feature data or a symbol string representing time-series feature data is acquired from the divided moving images, and the motion is analyzed using them. This means that a complicated operation is divided into simpler operations, and there is an advantage that even a complicated operation can be analyzed.
- to solve this problem, a work motion analysis system comprises a production apparatus and an analysis apparatus. The production apparatus includes a sensor, a device information generation unit that generates device information including a plurality of time stamps specifying the times at which the sensor issued notifications, and a transmission unit that transmits the device information to the analysis apparatus. The analysis apparatus includes a reception unit that receives the device information, an analysis information buffer that temporarily stores analysis information acquired from an imaging device, and an analysis unit that waits until the analysis information acquired at each of the times indicated by the plurality of time stamps has been stored in the analysis information buffer, and then analyzes the analysis information over the range specified using those times.
- FIG. 1 is a configuration diagram of a work operation analysis system in Embodiment 1.
- FIG. 2 is a flowchart illustrating the operation of the production apparatus according to the first embodiment. FIG. 3 is a flowchart illustrating the operation of the analysis apparatus according to the first embodiment. FIG. 4 is a diagram showing the components of the device information. FIG. 5 shows the structure of the data stored in the trigger generation table.
- FIG. 1 is a system configuration diagram showing an overall image of a work motion analysis system.
- the work motion analysis system includes one or more production apparatuses 100 and an analysis apparatus 102.
- the production apparatus 100 includes an apparatus information transmission unit 110, an apparatus information generation unit 111, one or more sensors 112, a control unit 113, and a production unit 114.
- the analysis device 102 includes a device information reception unit 120, a trigger generation unit 121, a trigger generation table 123, a trigger buffer 131, an analysis information division unit 132, an analysis information buffer 133, a camera ID table 134, An operation model selection unit 135, an operation model storage unit 136, an analysis unit 137, a log storage unit 138, and a display unit 139 are included.
- One or more cameras 140 are connected to the analysis information buffer 133.
- a personal authentication device 150 and a scheduler 151 are connected to the analysis device 102.
- FIG. 2 is a flowchart showing the operation of the production apparatus 100. This flowchart will be described with reference to FIG.
- S200 is processing in which the device information generation unit 111 acquires a sensor ID from the sensor 112.
- the sensor ID is information that can identify the sensor 112. This process makes it clear which sensor has issued the alarm.
- the sensor 112 is, for example, a contact sensor, and can acquire the timing when the worker picks up the product from the production apparatus or the timing when the product is introduced.
- the former means the work start timing of the worker, and the latter means the work end timing of the worker.
- the sensor 112 may be a start switch of the production unit 114.
- the control unit 113 receives the notification of the sensor 112 as the start switch, and controls the production unit 114 to perform production-related processing such as cutting, caulking, welding, and conveyance.
- the sensor 112 can acquire the timing when the worker inputs the product into the production apparatus and activates the production apparatus. This means the work end timing of the worker.
- S201 is a process in which the device information generation unit 111 generates the device information 400 shown in FIG. 4 by adding, to the sensor ID acquired in S200, at least a production apparatus ID that can identify the production apparatus and a time stamp that can identify the time. The time stamp represents the timing at which the device information generation unit 111 detected the notification from the sensor 112.
- alternatively, if the sensor 112 includes a clock capable of time synchronization, the device information generation unit 111 may receive the sensor ID together with the timing at which the sensor 112 issued its notification. In that case, in S201 the device information generation unit 111 only needs to add the production apparatus ID to the received information.
- FIG. 3 is a flowchart showing the operation of the analysis apparatus 102. This flowchart will be described with reference to FIGS.
- S300 is a process in which the device information receiving unit 120 receives the device information 400 from the device information transmitting unit 110.
- the data 500 stored in the trigger generation table 123 is a production apparatus ID, a sensor ID, a work ID, and a work start / end flag, and these are associated with each other.
- S301 is a process in which the trigger generation unit 121 reads the work ID and the work start / end flag corresponding to the sensor ID and the production apparatus ID included in the device information 400 with reference to the trigger generation table 123.
- as a result, a trigger 600 shown in FIG. 6 is generated, which includes at least that work ID, that work start/end flag, and the time stamp included in the device information.
- the timing indicated by the time stamp included in the trigger 600 is simply referred to as the timing indicated by the trigger 600.
- the work start / end flag indicates whether the timing indicated by the trigger 600 represents the work start or end timing.
- the former is called the start trigger and the latter the end trigger. Since two types of work process are assumed, those for which both a start trigger and an end trigger are generated (with pair trigger) and those for which only a start trigger or only an end trigger is generated (without pair trigger), the work start/end flag also contains information on whether a pair trigger exists.
- the trigger generation table 123 there may be no information in the trigger generation table 123 depending on the combination of the production device ID and the sensor ID. In this case, it means that the corresponding work does not exist, and the trigger 600 is not generated.
- S310 is a process of temporarily storing the trigger 600 in the trigger buffer 131.
- S311 is a process of temporarily storing analysis information read from the camera 140 in the analysis information buffer 133.
- the analysis information stored here may be a moving image or information representing a posture of a person extracted from the moving image. Further, the posture information of a person may be position information of a part such as a person's hand, foot, shoulder, or head.
- the data in the analysis information buffer 133 is stored as time-series information for each camera ID.
- FIG. 7 shows, as a simulation diagram 701 of the analysis information, the case where position information of parts such as a person's hands, feet, shoulders, and head is accumulated as time-series information.
- S312 is a process in which the analysis information dividing unit 132 reads the trigger 600 stored in the trigger buffer 131. If the read trigger has a pair trigger, read it after both pair triggers are available. If the read trigger has no pair trigger, it is read as it is.
- the analysis information dividing unit 132 refers to the camera ID table 134 and acquires the corresponding camera ID from the work ID included in the trigger 600.
- the data 800 in the camera ID table 134 is data in which a work ID and a camera ID are associated with each other.
- S313 is processing in which the analysis information dividing unit 132 divides and reads analysis information from the analysis information buffer 133 according to the acquired camera ID and the time stamp included in the trigger.
- the three patterns of division processing will be described with reference to FIGS.
- the first pattern is when there is a pair trigger.
- the analysis information dividing unit 132 divides and reads analysis information in a range between both timings from the analysis information buffer 133 according to the timings indicated by the start trigger and the end trigger.
- FIG. 9 illustrates the first pattern: in the simulation diagram 701 of the analysis information, the first example 900 of the division target, namely the range between the first example 901 of the timing indicated by the start trigger and the first example 902 of the timing indicated by the end trigger, is divided out.
- the second pattern is when there is no pair trigger and the trigger is a start trigger.
- the range between the timing indicated by the start trigger and the timing after a predetermined time with respect to the timing indicated by the start trigger is divided and read.
- FIG. 10 illustrates the second pattern: in the simulation diagram 701 of the analysis information, the second example 1000 of the division target, namely the range between the second example 1001 of the timing indicated by the start trigger and the timing 1002 a predetermined time after it, is divided out.
- the predetermined range may be a value obtained by adding a margin to the standard work time of the corresponding work process.
- the third pattern is when there is no pair trigger and the trigger is an end trigger.
- in this case, the range between a timing a predetermined time before the timing indicated by the end trigger and the timing indicated by the end trigger is divided and read.
- FIG. 11 illustrates the third pattern: in the simulation diagram 701 of the analysis information, the third example 1100 of the division target, namely the range between the second example 1101 of the timing indicated by the end trigger and the timing 1102 a predetermined time before it, is divided out.
- the predetermined range may be a value obtained by adding a margin to the standard work time of the corresponding work process.
- any of the above three patterns if the analysis information is not stored in time, it waits until the analysis information in the range to be read is stored.
- S314 is a process in which the behavior model selection unit 135 selects a behavior model corresponding to the work ID from the behavior model storage unit 136 in accordance with the work ID included in the trigger 600.
- alternatively, the motion model corresponding to both the work ID of the trigger and a personal ID that identifies the person performing the corresponding work, acquired from the personal authentication device 150 or the scheduler 151, may be selected.
- the personal authentication device 150 acquires the personal ID of the person performing the work corresponding to the work ID by biometric recognition or via an authentication device carried by that person.
- the scheduler 151 holds information on when each person works in which work process, and can identify the personal ID of the person performing the work from the work ID and time stamp included in the trigger 600.
- the data 1200 stored in the behavior model storage unit 136 associates a work ID, a personal ID, and a numerically expressed behavior model with one another. When personal IDs are not used, a behavior model is stored for each work ID.
- the motion model may represent human motion information as time-series information of motion vectors on an image or a representative value thereof, or may be time-series information about a human posture or a representative value thereof. Furthermore, it may be expressed as a probability distribution related to them.
- when expressed as a probability distribution, the model can be represented parametrically as a Gaussian distribution or a Gaussian mixture distribution, or non-parametrically using a frequency distribution or a Parzen window.
- S315 is a process in which the analysis unit 137 performs analysis using the analysis information divided in step S313 and the operation model selected in S314. Specifically, the degree of deviation representing how much the action indicated by the divided analysis information deviates from the selected action model is calculated and output as an analysis result.
- the dynamic programming method can be used if the selected behavior model is represented by time series information, and the Euclidean distance can be used if it is represented by a representative value.
- when the behavior model is expressed as a probability distribution, the Mahalanobis distance can be used if the distribution is Gaussian; if it is expressed as another probability distribution, the degree of deviation can be calculated from the probability that the divided analysis information would be generated under the model. Even when the behavior model is represented by a hidden Markov model, that generation probability, and hence the degree of deviation, can be calculated.
- the analysis result may be a plurality of values obtained by calculation using a plurality of methods. For example, if the degree of deviation is calculated by selectively using different parts of the human body such as the whole body, hand, foot, upper body, and lower body in the analysis information, an analysis result corresponding to each part can be obtained.
- data 1300 stored in the log storage unit is data in which a time stamp, a work ID, a personal ID, and an analysis result are associated with each other.
- the time stamp is stored corresponding to the trigger 600 used in step S313. If the work start trigger is used, the time stamp (work start time stamp) is stored. If the work end trigger is used, the time stamp (work end time stamp) is stored. If both are used, save both.
- S317 is a process in which the display unit 139 displays the time stamp, work ID, personal ID, and analysis result on the production apparatus 100.
- the time stamp displayed corresponds to the trigger 600 used in step S313. If the work start trigger was used, its time stamp (the work start time stamp) is displayed; if the work end trigger was used, its time stamp (the work end time stamp) is displayed; if both were used, both are displayed.
- FIG. 14 is a diagram illustrating an example 1400 of a screen displayed on the display unit 139. If any analysis result exceeds a predetermined threshold, a message indicating that a deviating action has occurred is displayed as shown in FIG. 14; if the analysis result was calculated using analysis information for a specific part of the human body, simultaneously displaying that part, as in FIG. 14, makes the display easier to understand. A worker or supervisor who sees the display can then respond, for example by discarding the product produced when the deviating action occurred or by giving appropriate guidance to the worker who performed it.
Description
The present invention relates to a technique for recognizing work operations.
In the field of assembly and processing, workers are required to assemble products in accordance with a prescribed standard operation. Because the standard operation is defined as the operation necessary to maintain product quality, when a worker performs an operation that differs from it (a deviating operation), the product being worked on at that time is more likely to have a quality problem. A function that automatically detects deviating operations from human motion information acquired with various sensors (a deviating-operation detection function) is therefore required. When a deviating operation is detected by this function, quality can be assured by, for example, re-inspecting the product being worked on at that time or discarding the product itself.
As a method of automatically analyzing human motion, the invention described in Patent Document 1 discloses a method using moving images. In that method, feature data is first calculated from a moving image, and the moving image is divided by finding time-series changes in the feature data, that is, changes in motion. Time-series feature data, or a symbol string representing it, is then acquired from the divided moving images and used to analyze the motion. This amounts to dividing a complicated operation into simpler operations, with the advantage that even a complicated operation can be analyzed.

Specifically, Patent Document 1 discloses a configuration in which "feature data is calculated based on statistics of local motion information, the moving image data is divided based on time-series changes of the data, and time-series feature data is calculated for each divided section."

In the invention of Patent Document 1, however, the moving image is divided using changes in motion. Because the start and end timings of a work operation do not necessarily coincide with changes in motion, those timings cannot be acquired; this is the problem to be solved.
To solve this problem, a work motion analysis system according to the present invention comprises a production apparatus and an analysis apparatus. The production apparatus includes a sensor, a device information generation unit that generates device information including a plurality of time stamps specifying the times at which the sensor issued notifications, and a transmission unit that transmits the device information to the analysis apparatus. The analysis apparatus includes a reception unit that receives the device information, an analysis information buffer that temporarily stores analysis information acquired from an imaging device, and an analysis unit that waits until the analysis information acquired at each of the times indicated by the plurality of time stamps in the received device information has been stored in the analysis information buffer, and then analyzes the analysis information over the range specified using those times.

By acquiring the work start timing, the work end timing, or both from the production apparatus, analysis can be performed only on the time period during which work is in progress, or at least on a time period that contains the work.

Embodiments are described below with reference to the drawings.
FIG. 1 is a system configuration diagram showing the overall work motion analysis system. The work motion analysis system comprises one or more production apparatuses 100 and an analysis apparatus 102.

The production apparatus 100 comprises a device information transmission unit 110, a device information generation unit 111, one or more sensors 112, a control unit 113, and a production unit 114.

The analysis apparatus 102 comprises a device information reception unit 120, a trigger generation unit 121, a trigger generation table 123, a trigger buffer 131, an analysis information division unit 132, an analysis information buffer 133, a camera ID table 134, a motion model selection unit 135, a motion model storage unit 136, an analysis unit 137, a log storage unit 138, and a display unit 139. One or more cameras 140 are connected to the analysis information buffer 133, and a personal authentication device 150 and a scheduler 151 are connected to the analysis apparatus 102.

The operation of the production apparatus 100 is described below with reference to FIG. 2, and that of the analysis apparatus 102 with reference to FIG. 3.
FIG. 2 is a flowchart showing the operation of the production apparatus 100, described here with reference to FIG. 4.

S200 is a step in which the device information generation unit 111 acquires a sensor ID from the sensor 112. The sensor ID is information that identifies the sensor 112, so this step establishes which sensor issued the notification. The sensor 112 is, for example, a contact sensor, and can capture the timing at which a worker removes a product from the production apparatus or the timing at which a product is loaded into it. The former corresponds to the worker's work start timing, the latter to the work end timing.

The sensor 112 may also be the start switch of the production unit 114. In that case, the control unit 113 receives the notification from the sensor 112 acting as a start switch and controls the production unit 114 to perform production processing such as cutting, caulking, welding, or conveyance. The sensor 112 thus captures the timing at which the worker loaded a product into the production apparatus and started it, which corresponds to the worker's work end timing.

S201 is a step in which the device information generation unit 111 generates the device information 400 shown in FIG. 4 by adding, to the sensor ID acquired in S200, at least a production apparatus ID that identifies the production apparatus and a time stamp that identifies the time. The time stamp represents the timing at which the device information generation unit 111 detected the notification from the sensor 112. Alternatively, if the sensor 112 has a clock capable of time synchronization, the device information generation unit 111 may in S200 receive the sensor ID together with the timing at which the sensor 112 issued its notification; in that case, in S201 it only needs to add the production apparatus ID to the received information.
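As a concrete illustration, S200 and S201 can be sketched as follows. This is a minimal sketch, not the patented implementation; the record layout, field names, and function name are assumptions made for illustration.

```python
import time

def make_device_info(sensor_id, production_device_id, timestamp=None):
    """Build a device-information record in the spirit of S200/S201.

    If the sensor has a time-synchronized clock it may supply its own
    timestamp; otherwise the moment of detection is stamped here.
    """
    return {
        "production_device_id": production_device_id,  # identifies the production apparatus
        "sensor_id": sensor_id,                        # identifies which sensor issued
        "timestamp": timestamp if timestamp is not None else time.time(),
    }
```

A sensor-supplied timestamp simply passes through, mirroring the alternative where the sensor's own clock is used.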
S202 is a step in which the device information transmission unit 110 transmits the device information 400 to the analysis apparatus 102.

FIG. 3 is a flowchart showing the operation of the analysis apparatus 102, described here with reference to FIGS. 5 to 14.

S300 is a step in which the device information reception unit 120 receives the device information 400 from the device information transmission unit 110. As shown in FIG. 5, the data 500 stored in the trigger generation table 123 associates a production apparatus ID, a sensor ID, a work ID, and a work start/end flag.
S301 is a step in which the trigger generation unit 121 refers to the trigger generation table 123 and reads the work ID and work start/end flag corresponding to the sensor ID and production apparatus ID contained in the device information 400. As a result, the trigger 600 shown in FIG. 6 is generated, consisting of at least that work ID, that work start/end flag, and the time stamp contained in the device information. Hereinafter, the timing indicated by the time stamp contained in the trigger 600 is simply called the timing indicated by the trigger 600.

The work start/end flag indicates whether the timing indicated by the trigger 600 represents the work start timing or the work end timing; in the former case the trigger is called a start trigger, in the latter an end trigger. Two kinds of work process are assumed: processes for which both a start trigger and an end trigger are generated (with pair trigger) and processes for which only a start trigger or only an end trigger is generated (without pair trigger). The work start/end flag therefore also carries information on whether a pair trigger exists.

Depending on the combination of production apparatus ID and sensor ID, the trigger generation table 123 may contain no entry. This means that no corresponding work exists, and no trigger 600 is generated.
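The table lookup of S301, including the case where the table has no entry for a given combination, can be sketched as follows. The table contents, key layout, and flag encoding here are illustrative assumptions, not values from the patent.

```python
# Hypothetical trigger-generation table keyed by (production device ID, sensor ID).
# Each value is (work ID, flag); the flag encodes start/end and pairing.
TRIGGER_TABLE = {
    ("P1", "S_pickup"): ("W1", "start_paired"),
    ("P1", "S_switch"): ("W1", "end_paired"),
}

def generate_trigger(device_info, table=TRIGGER_TABLE):
    """Generate a trigger from device information, or None if no work matches."""
    key = (device_info["production_device_id"], device_info["sensor_id"])
    entry = table.get(key)
    if entry is None:
        # No corresponding work exists for this combination: no trigger is generated.
        return None
    work_id, flag = entry
    return {"work_id": work_id, "flag": flag,
            "timestamp": device_info["timestamp"]}
```

The `None` return corresponds to the case described above where the table holds no information for the combination.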
S310 is a step of temporarily storing the trigger 600 in the trigger buffer 131.

S311 is a step of temporarily storing analysis information read from the camera 140 in the analysis information buffer 133. The analysis information stored here may be a moving image, or information representing a person's posture extracted from the moving image; the posture information may in turn be position information of body parts such as the hands, feet, shoulders, and head. In any case, as shown in FIG. 7, the data in the analysis information buffer 133 is stored as time-series information per camera ID. FIG. 7 shows, as the simulated analysis information 701, the case where position information of parts such as the hands, feet, shoulders, and head is accumulated as time-series information.
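The per-camera time-series storage of S311 might look like the following minimal sketch; the class and method names are assumptions for illustration.

```python
from collections import defaultdict

class AnalysisInfoBuffer:
    """Stores analysis info (e.g. joint positions) as a time series per camera ID."""

    def __init__(self):
        # camera_id -> list of (timestamp, payload), appended in arrival order
        self._frames = defaultdict(list)

    def append(self, camera_id, t, payload):
        self._frames[camera_id].append((t, payload))

    def latest_time(self, camera_id):
        """Newest stored timestamp for a camera, or None if nothing stored yet."""
        frames = self._frames[camera_id]
        return frames[-1][0] if frames else None

    def slice(self, camera_id, t_start, t_end):
        """Return frames whose timestamps fall in [t_start, t_end]."""
        return [(t, p) for (t, p) in self._frames[camera_id]
                if t_start <= t <= t_end]
```

`slice` is what the division step later uses to cut out the range between two trigger timings, and `latest_time` lets a caller check whether accumulation has caught up.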
S312 is a step in which the analysis information division unit 132 reads a trigger 600 stored in the trigger buffer 131. If the trigger has a pair trigger, it is read only after both triggers of the pair are present; if it has no pair trigger, it is read as is. The analysis information division unit 132 then refers to the camera ID table 134 and obtains the camera ID corresponding to the work ID contained in the trigger 600. As shown in FIG. 8, the data 800 in the camera ID table 134 associates work IDs with camera IDs.
S313 is a step in which the analysis information division unit 132 divides and reads analysis information from the analysis information buffer 133 according to the obtained camera ID and the time stamp contained in the trigger. The three patterns of division processing are described below with reference to FIGS. 9 to 11.

The first pattern applies when there is a pair trigger. In this case, the analysis information division unit 132 divides out and reads, from the analysis information buffer 133, the analysis information in the range between the timings indicated by the start trigger and the end trigger. FIG. 9 illustrates this: in the simulated analysis information 701, the first division target 900, namely the range between the first example 901 of a start-trigger timing and the first example 902 of an end-trigger timing, is divided out.
The second pattern applies when there is no pair trigger and the trigger is a start trigger. In this case, the range between the timing indicated by the start trigger and a timing a predetermined time after it is divided out and read. FIG. 10 illustrates this: in the simulated analysis information 701, the second division target 1000, namely the range between the second example 1001 of a start-trigger timing and the timing 1002 a predetermined time after it, is divided out. The predetermined time is preferably the standard work time of the corresponding work process plus a margin.

The third pattern applies when there is no pair trigger and the trigger is an end trigger. In this case, the range between a timing a predetermined time before the timing indicated by the end trigger and the timing indicated by the end trigger is divided out and read. FIG. 11 illustrates this: in the simulated analysis information 701, the third division target 1100, namely the range between the second example 1101 of an end-trigger timing and the timing 1102 a predetermined time before it, is divided out. Again, the predetermined time is preferably the standard work time of the corresponding work process plus a margin.

In all three patterns, if accumulation of the analysis information has not caught up, the unit waits until the analysis information for the range to be read has been accumulated.
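The three division patterns above reduce to choosing a time range from the available triggers. A sketch, where the two margins stand in for "standard work time plus margin" and are illustrative parameters rather than values from the patent:

```python
def division_range(trigger_pair, margin_after, margin_before):
    """Return (t_start, t_end) for the three division patterns of S313.

    trigger_pair: (start_trigger, end_trigger); either element may be None
    when there is no pair trigger. Each trigger is a dict with a "timestamp".
    """
    start, end = trigger_pair
    if start is not None and end is not None:
        # Pattern 1: pair trigger; range between the two timings.
        return (start["timestamp"], end["timestamp"])
    if start is not None:
        # Pattern 2: start trigger only; extend forward by the margin.
        return (start["timestamp"], start["timestamp"] + margin_after)
    if end is not None:
        # Pattern 3: end trigger only; extend backward by the margin.
        return (end["timestamp"] - margin_before, end["timestamp"])
    raise ValueError("at least one trigger is required")
```

A caller would then wait until the buffer's newest timestamp reaches `t_end` before slicing, matching the waiting behavior described above.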
S314 is a step in which the motion model selection unit 135 selects, from the motion model storage unit 136, the motion model corresponding to the work ID contained in the trigger 600. Alternatively, the motion model may be selected from both the trigger's work ID and a personal ID, acquired from the personal authentication device 150 or the scheduler 151, that identifies the person performing the work. The personal authentication device 150 obtains the personal ID of the person performing the work corresponding to the work ID by biometric recognition or via an authentication device carried by that person. The scheduler 151 holds information on when each person works in which work process, and can identify the personal ID of the person performing the work from the work ID and time stamp contained in the trigger 600.

As shown in FIG. 12, the data 1200 stored in the motion model storage unit 136 associates a work ID, a personal ID, and a numerically expressed motion model. When personal IDs are not used, one motion model is stored per work ID. The motion model may represent human motion as time-series information of motion vectors on the image or as a representative value thereof, or as time-series information about the person's posture or a representative value thereof; it may further be expressed as a probability distribution over these. When expressed as a probability distribution, it can be represented parametrically as a Gaussian distribution or a Gaussian mixture, or non-parametrically using a frequency distribution or a Parzen window.
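The scheduler lookup described for S314, identifying the worker from the trigger's work ID and time stamp, can be sketched as a simple interval search. The schedule format and entry names below are assumptions for illustration.

```python
# Hypothetical schedule: who performs which work process during which interval.
SCHEDULE = [
    {"person_id": "A01", "work_id": "W1", "from": 9 * 3600, "to": 12 * 3600},
    {"person_id": "B02", "work_id": "W1", "from": 12 * 3600, "to": 17 * 3600},
]

def person_for(work_id, timestamp, schedule=SCHEDULE):
    """Resolve the personal ID from a trigger's work ID and time stamp."""
    for entry in schedule:
        if entry["work_id"] == work_id and entry["from"] <= timestamp < entry["to"]:
            return entry["person_id"]
    return None  # nobody scheduled for this work at this time
```

The returned personal ID would then be combined with the work ID to pick the per-person motion model.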
S315 is a step in which the analysis unit 137 performs analysis using the analysis information divided in S313 and the motion model selected in S314. Specifically, it calculates a degree of deviation representing how far the motion indicated by the divided analysis information deviates from the selected motion model, and outputs it as the analysis result. As the calculation method, dynamic programming can be used if the selected motion model is expressed as time-series information, and the Euclidean distance if it is expressed as a representative value. If the motion model is expressed as a probability distribution, the Mahalanobis distance can be used when the distribution is Gaussian; for other distributions, the degree of deviation can be computed from the probability that the divided analysis information would be generated under the model. Even when the motion model is a hidden Markov model, that generation probability, and hence the degree of deviation, can be calculated.
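Two of the deviation-degree calculations mentioned for S315 can be sketched concretely: the Euclidean distance against a representative value, and the Mahalanobis distance against a Gaussian model. A diagonal covariance is assumed here purely for simplicity; the patent does not fix the covariance form.

```python
import math

def euclidean_deviation(observed, representative):
    """Deviation when the model is a representative (mean) feature vector."""
    return math.sqrt(sum((o - r) ** 2 for o, r in zip(observed, representative)))

def mahalanobis_deviation(observed, mean, variances):
    """Deviation when the model is a Gaussian with a diagonal covariance
    (a simplifying assumption): each squared difference is scaled by its variance."""
    return math.sqrt(sum((o - m) ** 2 / v
                         for o, m, v in zip(observed, mean, variances)))
```

With unit variances the Mahalanobis distance reduces to the Euclidean distance, which is a quick sanity check on the implementation.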
The analysis result may also be a plurality of values computed by a plurality of methods. For example, if the degree of deviation is calculated by selectively using different body parts in the analysis information, such as the whole body, hands, feet, upper body, or lower body, an analysis result is obtained for each part.
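Computing one deviation score per selected body part, as described above, might look like the following sketch; the part names and the 2-D position format are illustrative assumptions.

```python
def per_part_deviation(frame, model_means, parts):
    """Compute a deviation score for each selected body part.

    frame / model_means: dicts mapping a part name to an (x, y) position;
    here the score is the Euclidean distance to the model's mean position.
    """
    scores = {}
    for part in parts:
        (x, y), (mx, my) = frame[part], model_means[part]
        scores[part] = ((x - mx) ** 2 + (y - my) ** 2) ** 0.5
    return scores
```

The resulting per-part scores are exactly the kind of plural analysis result the text describes, and they feed the part-specific display mentioned later.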
S316 is a step of storing the analysis result obtained in this way in the log storage unit 138. As shown in FIG. 13, the data 1300 stored in the log storage unit associates a time stamp, a work ID, a personal ID, and an analysis result. The time stamp stored corresponds to the trigger 600 used in S313: if a work start trigger was used, its time stamp (the work start time stamp) is stored; if a work end trigger was used, its time stamp (the work end time stamp) is stored; if both were used, both are stored.

S317 is a step in which the display unit 139 displays the time stamp, work ID, personal ID, and analysis result on the production apparatus 100. As in S316, the time stamp displayed corresponds to the trigger 600 used in S313: the work start time stamp, the work end time stamp, or both.
FIG. 14 shows an example 1400 of a screen displayed on the display unit 139. If any analysis result exceeds a predetermined threshold, a message indicating that a deviating operation has occurred is displayed as in FIG. 14; if the analysis result was computed from the analysis information of a specific body part, displaying that part at the same time, as in FIG. 14, makes the display easier to understand. A worker or supervisor who sees the display unit 139 can then respond, for example by discarding the product involved in the deviating operation or giving appropriate guidance to the worker who performed it.
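The threshold check behind the FIG. 14 display can be sketched as follows; the message wording and parameter names are illustrative, not the patent's actual screen text.

```python
def alert_message(results, threshold):
    """Build an on-screen message when any per-part result exceeds the threshold.

    results: dict mapping a body-part label to its deviation score.
    Returns None when nothing exceeds the threshold (no alert needed).
    """
    offending = [part for part, score in results.items() if score > threshold]
    if not offending:
        return None
    # Name the offending parts so the display is easier to understand.
    return "Deviating action detected (parts: %s)" % ", ".join(sorted(offending))
```

A `None` result corresponds to the normal case where no message is shown.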
DESCRIPTION OF REFERENCE SIGNS
100 production apparatus
102 analysis apparatus
110 device information transmission unit
111 device information generation unit
112 sensor
113 control unit
114 production unit
120 device information reception unit
121 trigger generation unit
123 trigger generation table
131 trigger buffer
132 analysis information division unit
133 analysis information buffer
134 camera ID table
135 motion model selection unit
136 motion model storage unit
137 analysis unit
138 log storage unit
139 display unit
140 camera
150 personal authentication device
151 scheduler
400 device information
600 trigger
Claims (5)
A work motion analysis system comprising a production apparatus and an analysis apparatus, wherein:
the production apparatus comprises a sensor, an apparatus information generation unit that generates apparatus information including a plurality of time stamps specifying times at which the sensor was triggered, and a transmission unit that transmits the apparatus information to the analysis apparatus;
the analysis apparatus comprises a reception unit that receives the apparatus information, an analysis information buffer that temporarily stores analysis information acquired from an imaging device, and an analysis unit; and
the analysis unit waits until the analysis information acquired at each of the times indicated by the plurality of time stamps included in the received apparatus information has been stored in the analysis information buffer, and then analyzes the analysis information over a range specified using the times indicated by the plurality of time stamps.
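The claimed wait-then-analyze behavior of the analysis unit can be sketched as follows. The buffer representation (a mapping from time stamp to frame data), the polling loop, and the timeout are assumptions for the illustration; the claim itself only requires waiting until the buffered information covers the time stamps and then analyzing the delimited range.

```python
import time

def analyze_range(buffer, timestamps, poll=0.01, timeout=5.0):
    """Wait until the buffer holds analysis information for every given
    time stamp, then return the buffered frames inside the range those
    time stamps delimit."""
    deadline = time.monotonic() + timeout
    while not all(t in buffer for t in timestamps):
        if time.monotonic() >= deadline:
            raise TimeoutError("analysis information not yet buffered")
        time.sleep(poll)  # information for some time stamp still missing
    lo, hi = min(timestamps), max(timestamps)
    return {t: frame for t, frame in buffer.items() if lo <= t <= hi}
```

Waiting on the time stamps before analyzing is what guarantees the range is complete even when the sensor-side apparatus information arrives before the corresponding camera frames.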
The work motion analysis system according to claim 1, wherein:
the plurality of time stamps include a first time stamp indicating the time at which a worker started work and a second time stamp indicating the time at which the worker finished work; and
the range is the range specified by the first time stamp and the second time stamp.
The work motion analysis system according to claim 2, wherein:
the sensor comprises a contact sensor and a start switch; and
the first time stamp is a time stamp specifying the time at which the contact sensor was triggered, and the second time stamp is a time stamp specifying the time at which the start switch was triggered.
A work motion analysis method comprising:
a step of generating apparatus information including a plurality of time stamps capable of specifying times at which a sensor was triggered;
a step of temporarily storing analysis information acquired from an imaging device in an analysis information buffer;
a step of waiting until the analysis information acquired at each of the times indicated by the plurality of time stamps has been stored in the analysis information buffer; and
a step of analyzing the analysis information over a range specified using the times indicated by the plurality of time stamps.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016-039514 | 2016-03-02 | ||
| JP2016039514A JP6849312B2 (en) | 2016-03-02 | 2016-03-02 | Work motion recognition system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017150073A1 (en) | 2017-09-08 |
Family
ID=59742726
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/004022 Ceased WO2017150073A1 (en) | 2016-03-02 | 2017-02-03 | Work operation analysis system, work operation analysis method, and work operation analysis program |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP6849312B2 (en) |
| WO (1) | WO2017150073A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020152879A1 (en) * | 2019-01-23 | 2020-07-30 | Omron Corporation | Operation analysis device, operation analysis method, operation analysis program, and operation analysis system |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7192860B2 (en) | 2018-04-26 | 2022-12-20 | 日本電気株式会社 | Motion estimation system, motion estimation method, and motion estimation program |
| JP6779413B2 (en) * | 2018-05-31 | 2020-11-04 | 三菱電機株式会社 | Work analyzer |
| JP7406441B2 (en) * | 2020-04-08 | 2023-12-27 | 株式会社日立製作所 | Manufacturing defect factor search method and manufacturing defect factor search device |
| JP7016936B1 (en) * | 2020-11-25 | 2022-02-07 | 日立建機株式会社 | Operation grasp system |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007122518A (en) * | 2005-10-28 | 2007-05-17 | Omron Corp | Filter parameter setting device, filtering processor, filter parameter setting method, operation time measuring system, control program and storage medium |
| JP2007243846A (en) * | 2006-03-10 | 2007-09-20 | Matsushita Electric Ind Co Ltd | Image extraction analysis apparatus, image extraction analysis system, and image extraction analysis method |
| JP2011034234A (en) * | 2009-07-30 | 2011-02-17 | Kozo Keikaku Engineering Inc | Movement analysis device, movement analysis method and movement analysis program |
| JP2012003649A (en) * | 2010-06-21 | 2012-01-05 | Kozo Keikaku Engineering Inc | Work analysis apparatus, work analysis method and program |
| JP2012023414A (en) * | 2010-07-12 | 2012-02-02 | Kozo Keikaku Engineering Inc | Simulation apparatus, simulation method, and program |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH04259803A (en) * | 1991-02-15 | 1992-09-16 | Nissan Motor Co Ltd | Work movement measurement device |
| JPH11120163A (en) * | 1997-10-17 | 1999-04-30 | Toyota Motor Corp | Operator motion simulation device and production line examination device |
| JP2010015205A (en) * | 2008-07-01 | 2010-01-21 | Meidensha Corp | Failure diagnosing system and method for semiconductor manufacturing device |
| JP5284179B2 (en) * | 2009-05-21 | 2013-09-11 | トヨタ自動車東日本株式会社 | Work determination system, work determination method, and recording medium recording the work determination method |
| JP5830780B2 (en) * | 2011-12-02 | 2015-12-09 | 株式会社日立製作所 | Business analysis apparatus, business analysis system, and business analysis method |
| JP2015114761A (en) * | 2013-12-10 | 2015-06-22 | 株式会社東芝 | Information processing system, electronic device, method, and program |
2016
- 2016-03-02: JP application JP2016039514A filed (patent JP6849312B2, status: Active)
2017
- 2017-02-03: WO application PCT/JP2017/004022 filed (publication WO2017150073A1, status: Ceased, not entered into national phase)
Also Published As
| Publication number | Publication date |
|---|---|
| JP2017156978A (en) | 2017-09-07 |
| JP6849312B2 (en) | 2021-03-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2017150073A1 (en) | Work operation analysis system, work operation analysis method, and work operation analysis program | |
| JP2022009097A (en) | Discrimination device, discrimination method and program | |
| JP7265667B2 (en) | Work support system and work support method | |
| US10713770B2 (en) | Analysis apparatus and analysis method | |
| EP3686840A1 (en) | Abnormality detection device and abnormality detection method | |
| US11199561B2 (en) | System and method for standardized evaluation of activity sequences | |
| WO2019172093A1 (en) | Work action analysis system and method for analyzing work movement | |
| JP2020091801A (en) | Work analysis system and work analysis method | |
| WO2018076992A1 (en) | Production-line monitoring system and method | |
| US20180165622A1 (en) | Action analysis device, acton analysis method, and analysis program | |
| US10537244B1 (en) | Using eye tracking to label computer vision datasets | |
| JP6270488B2 (en) | Operator monitoring control device and operator monitoring control method | |
| JP2005215927A (en) | Behavior recognition system | |
| CN103713545A (en) | Operation guiding method, device and system | |
| CN113516092B (en) | Method and device for determining target behavior, storage medium and electronic device | |
| US20180126561A1 (en) | Generation device, control method, robot device, call system, and computer-readable recording medium | |
| CN105607736B (en) | A kind of information display method and terminal | |
| JP2014016798A (en) | Information processor and program | |
| CN115035576B (en) | User emotion recognition method, device, equipment and medium based on face video | |
| US12106566B2 (en) | Image processing apparatus that manages checking work performed on articles and image processing method thereof | |
| WO2022209638A1 (en) | Work instruction system | |
| JP2022031286A (en) | Information processing equipment, methods and programs | |
| US20230065834A1 (en) | Behavior analysis device and behavior analysis method | |
| JP5779302B1 (en) | Information processing apparatus, information processing method, and program | |
| CN113147176B (en) | Method for detecting operation specification of silk-screen link |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17759544; Country of ref document: EP; Kind code of ref document: A1 |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17759544; Country of ref document: EP; Kind code of ref document: A1 |