
WO2016084142A1 - Work assistance system and work assistance method - Google Patents

Work assistance system and work assistance method

Info

Publication number
WO2016084142A1
Authority
WO
WIPO (PCT)
Prior art keywords
work
hand
information
support system
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2014/081163
Other languages
French (fr)
Japanese (ja)
Inventor
浩彦 佐川 (Hirohiko Sagawa)
栗原 恒弥 (Tsuneya Kurihara)
洋登 永吉 (Hiroto Nagayoshi)
春美 清水 (Harumi Shimizu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to PCT/JP2014/081163
Publication of WO2016084142A1
Anticipated expiration
Legal status: Ceased

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00: General purpose image data processing
    • G06T 19/00: Manipulating 3D models or images for computer graphics

Definitions

  • the present invention relates to a work support system.
  • For example, a technique has been proposed in which the positions of the work target and the tool are identified from an image and, if the tool position is included in the work target position, it is determined that the work is being performed in the correct procedure. A technique for determining that a device is broken has also been proposed.
  • The method of operating a separately prepared switch to shift to the next work procedure requires the worker to perform an operation that is not part of normal work. For this reason, such a method may hinder the smooth progress of work.
  • Methods that navigate the work by detecting with a marker or a touch sensor that work has been performed on the work object, as described in the related art and Patent Document 1, also affect the work flow. Such methods have the further problem that attaching a marker or a touch sensor is often impossible or difficult in the case of installing or assembling large-scale facilities.
  • Such a method can determine that some work has been performed on the work target, but cannot determine whether the work target is in the correct state.
  • Likewise, with methods that determine the end of work based on sensor data, it is often difficult to install or operate the sensors in work such as installation and assembly.
  • An object of the present invention is to provide a work support system that presents work procedures to a worker in order and navigates the worker at appropriate timing, without adding special devices such as markers or sensors to the work target. More specifically, the object is to provide a technique that determines the state of the work target and automatically switches the work content presented to the worker based on the determination result, thereby navigating the worker effectively and reliably without interfering with the worker's normal work.
  • A representative example of the present invention is a work support system including: an input unit that outputs a photographed image of a target object that is a work target and of the periphery of the target object; a storage unit that stores a first model indicating a three-dimensional model of the target object; a detection unit that detects, from the image, the hand of the user performing the work and the target object; a relationship determination unit that determines from the image whether the hand and the target object overlap; a state determination unit that, when the hand and the target object do not overlap, extracts first information on the target object from the image and second information on the target object from the first model, and determines the work state by comparing the first information with the second information; and an output unit that outputs information according to the work state.
  • The drawings include: a flowchart illustrating processing by the work support system according to the first embodiment; an explanatory diagram showing the point cloud data of the working space data of the first embodiment; an explanatory diagram showing a three-dimensional model representing the hand and arm shapes of the first embodiment; an explanatory diagram showing the working space data after deleting the point cloud data corresponding to the worker's hand and arm in the first embodiment; an explanatory diagram showing the work target areas; a flowchart illustrating processing by the work support system according to the second embodiment; an explanatory diagram showing an image acquired as working space data in the second embodiment; an explanatory diagram showing the worker's hand area in the second embodiment; a flowchart illustrating processing by the work support system according to the third embodiment; a flowchart illustrating processing by the work support system according to the fourth embodiment; and an explanatory diagram showing the working space data when the worker of the fourth embodiment performs work while holding a tool or the like.
  • Embodiment 1 of the present invention will be described with reference to FIGS.
  • FIG. 1 is a block diagram illustrating a physical configuration of the work support system according to the first embodiment.
  • FIG. 1 shows a configuration of a work support system when a general work support system executes the processing of this embodiment.
  • The work support system includes an information processing apparatus 101, an input device 102, an output device 103, a work space model 104, a work procedure 105, a normal model 106, a storage device 107, and work space data 115.
  • the information processing apparatus 101 is an apparatus for executing various programs stored in the storage device 107.
  • the information processing apparatus 101 includes at least one processor and a memory.
  • the processor included in the information processing apparatus 101 is, for example, a CPU.
  • the memory included in the information processing apparatus 101 includes a ROM that is a nonvolatile storage element and a RAM that is a volatile storage element.
  • the ROM stores an immutable program (for example, BIOS).
  • the RAM is a high-speed and volatile storage element such as a DRAM (Dynamic Random Access Memory), and temporarily stores a program stored in the storage device 107 and data used when the program is executed.
  • The storage device 107, the work space model 104, the work procedure 105, the normal model 106, and the work space data 115 are large-capacity, nonvolatile storage devices such as a magnetic storage device (HDD) or a flash memory (SSD), and store the programs executed by the information processing apparatus 101 and the data used when the programs are executed.
  • the input device 102 includes a device that acquires, as the working space data of the present embodiment, data obtained by photographing a work target that is an object on which the worker performs work and a space where the work target exists.
  • the input device 102 receives an instruction from the worker and sends it to the program.
  • the input device 102 may include an input device in a general computer such as a keyboard, a mouse, or a touch panel.
  • the input device 102 includes a depth sensor that acquires a distance image having a pixel value as a distance to an object.
  • the input device 102 converts the distance image acquired by the depth sensor into point cloud data representing information of the target space as a set of points on the three-dimensional space by using the focal length and the center position of the depth sensor.
  • the working space data in this embodiment is mainly expressed as point cloud data.
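  • For illustration, the conversion from a distance image to point cloud data can be sketched as follows. This is a minimal sketch assuming a pinhole depth-camera model; the function name and the intrinsic parameters (fx, fy, cx, cy) are illustrative and not taken from the patent.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    # Back-project every pixel of a distance image into 3D space using the
    # depth sensor's focal lengths (fx, fy) and image center (cx, cy).
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float32)
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading
```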
  • the output device 103 is a device that presents the results of processing by the work support system of the present embodiment to the worker.
  • the output device 103 may include, for example, a monitor used in a general work support system, or may include a head-mounted display device called an HMD (head mounted display).
  • the work space model 104 is a storage area for storing a three-dimensional work space model that represents the entire space in which work is performed.
  • the work space three-dimensional model may be represented by point cloud data in the work space model 104 in the same manner as the work space data.
  • the work support system of this embodiment also uses a different work space three-dimensional model for each work.
  • the work space model 104 stores point cloud data indicating the work space three-dimensional model and an identifier (for example, name) for identifying the work in which the work space three-dimensional model is used in association with each other.
  • The work space model 104 may store information on a three-dimensional model using polygons, as generally used in CG and the like.
  • the program that accesses the work space model 104 may divide the polygon in the work space model 104 until it reaches a predetermined size, and extract only the coordinates of the vertexes of each polygon after the division.
  • the program can easily convert the polygon representing the three-dimensional model into point cloud data indicating the extracted coordinates.
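  • As an illustration of this polygon-to-point-cloud conversion, the following sketch subdivides each triangle of a mesh until its edges fall below a size threshold and collects the resulting vertices. The function name and the max_edge parameter are assumptions, not part of the patent.

```python
import numpy as np

def polygons_to_points(triangles, max_edge=0.01):
    # triangles: iterable of (a, b, c) vertex triples in 3D.
    points = []

    def subdivide(a, b, c):
        edges = [np.linalg.norm(b - a), np.linalg.norm(c - b), np.linalg.norm(a - c)]
        if max(edges) <= max_edge:
            points.extend([a, b, c])
            return
        ab, bc, ca = (a + b) / 2, (b + c) / 2, (c + a) / 2  # edge midpoints
        subdivide(a, ab, ca); subdivide(ab, b, bc)
        subdivide(ca, bc, c); subdivide(ab, bc, ca)

    for a, b, c in triangles:
        subdivide(np.asarray(a, float), np.asarray(b, float), np.asarray(c, float))
    return np.unique(np.round(np.array(points), 6), axis=0)  # deduplicate vertices
```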
  • the work procedure 105 includes a series of work procedures for carrying out the work, information for specifying the work object that is the object to be worked, and information on the position of the work object in the work space three-dimensional model.
  • the work procedure 105 may include information indicating the state of the work target such as the size, shape, color, and temperature of the work target.
  • the normal-time model 106 includes a normal-time object (three-dimensional) model that represents an ideal state (position, size, shape, color, temperature, and the like) after the work of each work object indicated by the work procedure 105.
  • the three-dimensional model stored in the normal-time model 106 may be expressed by point cloud data, like the working space data and the working space three-dimensional model.
  • The normal model 106 holds the point cloud data representing each work target in association with information on the related work and work procedures (the work name 401 described later, the work procedure areas 403 and 407, and the like).
  • the normal model 106 may hold a three-dimensional model using polygons.
  • a program that accesses the normal model 106 may convert the polygon into point cloud data by the method described above.
  • the working space data 115 includes working space data indicating the working space acquired during the work.
  • the working space data 115 includes point cloud data indicating the shape, size, and position of an object in the working space, and may include data indicating the temperature of the object in the working space.
  • the storage device 107 includes an information presentation program 108, an input program 113, a correspondence calculation program 114, a hand detection program 109, a work target detection program 110, a relationship determination program 111, and a state determination program 112.
  • the information presentation program 108 is a program that controls other programs held in the storage device 107 and presents information to the worker according to the contents of the work procedure 105.
  • the input program 113 is a program for acquiring working space data using the input device 102 after the work starts.
  • the correspondence calculation program 114 is a program for obtaining a correspondence between the work space data and the work space three-dimensional model.
  • the hand detection program 109 is a program for detecting an area of the worker's hand from the working space data.
  • the work target detection program 110 is a program that detects a work target area in the working space data based on the correspondence obtained by the correspondence calculation program 114.
  • The relationship determination program 111 determines whether the work target area and the hand area overlap. Specifically, for example, the relationship determination program 111 determines whether the worker's hand area detected by the hand detection program 109 exists within a predetermined range of positions that includes the work target area detected by the work target detection program 110.
  • The state determination program 112 extracts the data of the work target area from the working space data, and further extracts the normal object model corresponding to the work target from the normal model 106.
  • the state determination program 112 compares the extracted work area data with the extracted normal object model to determine whether the work object is in an appropriate state, that is, whether the work has been properly completed. Determine.
  • FIG. 2 is an explanatory diagram showing work to which the work support system of the first embodiment is applied.
  • FIG. 2 shows the work support system of FIG. 1 together with the work target on which the worker works.
  • The configuration shown in FIG. 2 includes a terminal 201, a touch panel 202, a depth sensor 203, a helmet 204, an HMD 205, a facility 206, and a lever 207.
  • the terminal 201 is a portable terminal that presents information regarding work to the worker or accepts instructions from the worker in the work support system.
  • the terminal 201 may be any device as long as it has an input device and an output device, and may be a tablet terminal or a smartphone.
  • the work support system shown in FIG. 1 may be implemented by the terminal 201, the depth sensor 203, and the HMD 205.
  • Alternatively, a computer connected to the terminal 201 by wire or wirelessly may implement the information processing apparatus 101, the storage device 107, the work space model 104, the work procedure 105, the normal model 106, and the work space data 115 shown in FIG. 1.
  • the terminal 201 includes, for example, a touch panel 202 as the input device 102 and the output device 103.
  • the touch panel 202 is operated by an operator. The worker controls the operation of the work support system by inputting an instruction to the touch panel 202.
  • The worker operates the touch panel 202, for example, when starting or ending work in the work support system, or when returning to the original work procedure if the work support system erroneously shifts to the next work procedure.
  • the depth sensor 203 measures the distance between the work target and an object around the work target and the depth sensor 203.
  • the depth sensor 203 acquires the measurement result as working space data, and the input program 113 stores the obtained working space data in the working space data 115.
  • The depth sensor 203 is installed on the worker or in the vicinity of the worker.
  • The depth sensor 203 shown in FIG. 2 is mounted on the helmet 204, so information on the objects in the worker's field of view, including the work target and the worker's hand, can be acquired from the worker's viewpoint. Since the state determination program 112 of this embodiment can therefore determine the work state based on the work target as the worker sees it, it can determine the work state accurately.
  • However, the depth sensor 203 may be installed at any position from which the shapes of the objects in the space including the work target and the worker's hand can be acquired. For example, the depth sensor 203 may be installed in the HMD 205, or at any position on the worker's body from which information on the space including the work target can be acquired, such as the worker's shoulder or chest. The depth sensor 203 may also be installed on a ceiling or the like near the worker.
  • the HMD 205 is a head-mounted display device.
  • the HMD 205 corresponds to the output device 103 in FIG.
  • the facility 206 is a device or a structure having a work target.
  • the worker works on a work target included in the facility 206 in order to perform maintenance or change of the facility 206.
  • On the facility 206, a lever 207, a meter 208, a meter 209, a switch 210, a switch 211, a lamp 212, a lamp 213, a lamp 214, and the like are arranged as work targets.
  • Point cloud data indicating the shape of the facility 206 is stored in the work space model 104 in advance.
  • the input program 113 and the input device 102 may acquire point cloud data indicating the shape of the facility 206 in advance before work and store it in the work space model 104.
  • the administrator of the work support system may set point cloud data indicating the shape of the facility 206 in the work space model 104 in advance before work.
  • FIG. 3 shows an image when a three-dimensional work space model expressing the facility 206 as point cloud data is arranged in the three-dimensional space.
  • FIG. 3 is an explanatory diagram illustrating the point cloud data 301 of the facility 206 of the work support system according to the first embodiment.
  • the point cloud data 301 shown in FIG. 3 represents the shape of the facility 206 by a set of points in a three-dimensional space.
  • the point cloud data 301 shown in FIG. 3 represents only information regarding the position (coordinates) of each point in the three-dimensional space.
  • the depth sensor 203 may acquire the color information or temperature information of each point together, and the input device 102 may store the point cloud data 301 and the color information or temperature information in the work space model 104.
  • For example, when the depth sensor 203 acquires the point cloud data 301, it may also acquire a color image or a temperature image taken from the same direction; the color information or temperature information of each point in the point cloud data 301 may then be acquired based on the correspondence between that image and the point cloud data 301.
  • FIG. 4 is an explanatory diagram showing a data format stored in the work procedure 105 of the first embodiment.
  • the work procedure 105 includes a name 401 and areas 402 to 410.
  • Name 401 indicates an identifier such as a name given to the work.
  • the work of this embodiment includes a plurality of work procedures.
  • the identifier of the name 401 is expressed by an arbitrary character string, for example.
  • the work procedure 105 indicates a plurality of work procedures included in a plurality of works.
  • the information presentation program 108 presents a work name 401 to the worker in order to select a work to be started.
  • the area 402 indicates the number of work procedures included in the work indicated by the name 401.
  • Areas 403 to 406 include information on the first work procedure included in the work indicated by the name 401, and areas 407 to 410 include information on the last work procedure included in the work indicated by the name 401.
  • the area 403 and the area 407 include an identifier (for example, a name) assigned to each work procedure.
  • the identifiers of the area 403 and the area 407 are expressed by an arbitrary character string, for example.
  • the area 404 and the area 408 are information indicating detailed work contents of each work procedure, and may be expressed by a character string that can be understood by the worker. Further, the area 404 and the area 408 may include an image, a moving image, or CG indicating the work content. In addition, the facility 206 including the work target may be reflected in the image or the like indicated by the region 404 and the region 408.
  • the area 405 and the area 409 include work target identifiers (for example, names) in each work procedure.
  • work target identifiers for example, names
  • the area 405 and the area 409 may include a plurality of work target identifiers.
  • the area 406 and the area 410 indicate the position coordinates on the work space three-dimensional model of the work object indicated by the area 405 and the area 409.
  • An area 406 and an area 410 indicate an identifier (for example, name) of the work space three-dimensional model including the work object, and a three-dimensional area of the work object on the work space three-dimensional model.
  • For example, the area 406 and the area 410 may hold information on the work target area expressed as a combination of the center position of a cuboid circumscribing the work target on the work space three-dimensional model and its sizes in the x, y, and z axis directions.
  • The area 406 and the area 410 may further hold the rotation of the cuboid circumscribing the work target, expressed as rotation angles or a rotation matrix.
  • the work target area may be expressed by a combination of a plurality of rectangular parallelepipeds or a combination of a rectangular parallelepiped and another shape such as a cylinder or a sphere.
  • the contents of the region 406 and the region 410 may be expressed by any method as long as the method can express the three-dimensional region to be worked on the work space three-dimensional model.
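  • A minimal sketch of such a work procedure record is shown below. All class and field names are illustrative; the patent only prescribes the contents (name 401, procedure count 402, and the per-procedure areas 403 to 410).

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class TargetRegion:
    # Cuboid circumscribing a work target on the work space 3D model
    # (areas 406/410): center plus extents along the x, y, z axes,
    # with an optional rotation matrix.
    model_id: str                                  # which work space 3D model
    center: Tuple[float, float, float]
    size: Tuple[float, float, float]
    rotation: Optional[List[List[float]]] = None

@dataclass
class WorkStep:
    # One work procedure (areas 403-406 / 407-410).
    step_name: str               # area 403/407: procedure identifier
    description: str             # area 404/408: text; an image or video may accompany it
    target_ids: List[str]        # area 405/409: work target identifiers
    regions: List[TargetRegion]  # area 406/410: target areas on the 3D model

@dataclass
class WorkProcedure:
    # One entry of the work procedure 105.
    work_name: str                                        # name 401
    steps: List[WorkStep] = field(default_factory=list)   # count = area 402
```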
  • FIG. 5 is an explanatory diagram illustrating a work target area and a work space three-dimensional model according to the first embodiment.
  • FIG. 5 is an image obtained by superimposing and displaying the work target area registered in the work procedure 105 on the work space three-dimensional model of FIG.
  • the work target area shown in FIG. 5 is represented by a rectangular parallelepiped.
  • a region 501 shown in FIG. 5 is a region corresponding to the lever 207 shown in FIG.
  • Regions 502 and 503 correspond to the meters 208 and 209, regions 504 and 505 correspond to the switches 210 and 211, and regions 506, 507, and 508 correspond to the lamps 212, 213, and 214.
  • FIG. 6 is a functional block diagram illustrating processing by the work support system according to the first embodiment.
  • The work support system includes functional units such as a working space data input unit 2001, a working space data-work space three-dimensional model correspondence calculation unit 2002, a work information presentation unit 2003, a hand detection unit 2004, a work target region detection unit 2005, a work target-hand relationship determination unit 2006, and a work target state determination unit 2007.
  • the functional units included in the work support system according to the first embodiment may be implemented by each program illustrated in FIG. 1 or may be implemented by a physical device.
  • the working space data input unit 2001 acquires the shape around the work target being worked as working space data.
  • The working space data-work space three-dimensional model correspondence calculation unit 2002 compares the acquired working space data with the work space three-dimensional model, thereby extracting the data indicating the shape of the facility 206 that includes the work target.
  • The hand detection unit 2004 detects the area indicating the shape of the worker's hand from the acquired working space data.
  • the work target area detection unit 2005 detects a work target area indicating the shape of the work target based on the work procedure 105 from the data of the facility 206 including the work target.
  • The work target-hand relationship determination unit 2006 determines the positional relationship between the detected hand region and the detected work target region. When it determines that the hand region does not overlap the work target region, the work target state determination unit 2007 extracts the work target data from the working space data and further extracts, from the normal model 106, the normal object model indicating the correct shape of the work target after the operation. The work target state determination unit 2007 then compares the extracted work target data with the extracted normal object model.
  • When the comparison shows that the work target is in the correct state, the work information presentation unit 2003 outputs that the work has been properly completed.
  • FIG. 7 is a flowchart illustrating processing by the work support system according to the first embodiment.
  • First, the information presentation program 108 presents a list of works that can be performed to the worker and causes the worker to select the work to be performed (601). Specifically, the information presentation program 108 extracts at least one work name 401 from the work procedures stored in the work procedure 105 and outputs the extracted names as a work list via the output device 103.
  • the information presentation program 108 accepts the name of the work selected using the keyboard, mouse, touch panel, or the like included in the input device 102.
  • Next, the information presentation program 108 acquires from the work procedure 105 at least one work procedure corresponding to the work name selected in step 601, and further acquires the work space three-dimensional model corresponding to the selected work name from the work space model 104 (602).
  • the information presentation program 108 sets the first work procedure in the obtained work procedure as an execution target (603). Specifically, the information presentation program 108 outputs the first work procedure via the output device 103 in step 603. Thus, the worker can recognize the first work procedure.
  • the work procedure output here is the content indicated by the area 404 of the work procedure 105, and may be an image or a moving image showing the facility 206.
  • the input program 113 acquires the point cloud data of the working space data via the depth sensor 203 provided in the input device 102 (604).
  • the input program 113 stores the acquired point cloud data in the working space data 115.
  • FIG. 8 is an explanatory diagram showing point cloud data of the working space data of the first embodiment.
  • the working space data shown in FIG. 8 is point cloud data that three-dimensionally shows the shape of an object existing in the space including the work target being worked and the hand of the worker.
  • the point cloud data shown in FIG. 8 includes point cloud data 701 corresponding to a part of the work space three-dimensional model, and point cloud data 702 and 703 indicating the shape of the operator's hand.
  • the origin of the working space data is the position of the depth sensor.
  • the work space data acquired by the input program 113 and the work space three-dimensional model stored in the work space model 104 are expressed in different coordinate systems.
  • Because the depth sensor of the first embodiment acquires point cloud data, that is, a three-dimensional model of the work target, the state determination program 112 can accurately determine the work state in the process described later.
  • the hand detection program 109 detects the position and posture of the operator's hand and arm from the working space data acquired in step 604 (605).
  • As the method for detecting the hand and arm, the hand detection program 109 may use any method that detects the position and posture of a specific shape from point cloud data.
  • For example, the hand detection program 109 may hold in advance a three-dimensional model representing the shape of the hand and arm during work, as shown in FIG. 9, and may detect the position and posture of the worker's hand and arm by finding the position where this pre-held three-dimensional model best matches the working space data.
  • FIG. 9 is an explanatory diagram showing a three-dimensional model 801 representing the hand and arm shapes of the first embodiment.
  • a three-dimensional model 801 shown in FIG. 9 is a three-dimensional model using a polygon showing the shape of the right hand.
  • The hand detection program 109 may divide such polygons and convert the three-dimensional model of the hand and arm into point cloud data in advance by the method described above.
  • As the matching method, the hand detection program 109 may use, for example, the ICP (Iterative Closest Point) algorithm, which finds the position where two sets of point cloud data (here, the three-dimensional model 801 and the working space data) best match. In step 605, the hand detection program 109 uses such a method to obtain a transformation matrix that converts the hand and arm three-dimensional model 801 so as to match the working space data, that is, the position and posture of the three-dimensional model 801 in the working space data.
  • the hand detection program 109 needs to acquire an initial position when performing alignment.
  • the hand detection program 109 may acquire in advance the average position and posture of the predicted hand and arm as the initial position.
  • Alternatively, the hand detection program 109 may set a plurality of representative positions and postures as initial positions and obtain the hand and arm position and posture from the initial position that yields the best match between the working space data and the three-dimensional model of the hand and arm.
  • the hand detection program 109 may hold a three-dimensional model of the left hand in addition to the three-dimensional model 801 shown in FIG.
  • the hand detection program 109 can detect the positions and postures of both hands and arms of the worker in the working space data by using the above-described method used for the three-dimensional model 801 of the right hand.
  • the correspondence calculation program 114 deletes the point cloud data corresponding to the hand and arm detected in step 605 from the working space data.
  • Any method may be used as long as the point cloud data corresponding to the detected position and posture of the hand and arm can be identified in the working space data and deleted.
  • For example, the correspondence calculation program 114 converts the three-dimensional model 801 of the hand and arm into point cloud data in the working space data using the rotation and translation matrix obtained in step 605. The correspondence calculation program 114 then treats the set of points in the working space data that lie within a predetermined distance of any point of the converted point cloud data as the point cloud data corresponding to the worker's hand and arm, and deletes those points from the working space data, as sketched below.
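  • The deletion step can be sketched as follows, using a k-d tree to find, for every working-space point, the distance to the nearest point of the transformed hand/arm model. The radius value and function name are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_hand_points(working_space_pts, hand_model_pts, radius=0.01):
    # Split the working space data into points farther than `radius` from the
    # hand/arm model (kept) and points within `radius` (deleted as the hand).
    tree = cKDTree(hand_model_pts)
    dists, _ = tree.query(working_space_pts, k=1)
    keep = dists > radius
    return working_space_pts[keep], working_space_pts[~keep]
```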
  • FIG. 10 is an explanatory diagram showing the working space data after deleting the point cloud data corresponding to the hand and arm of the worker of the first embodiment.
  • the correspondence calculation program 114 leaves the point cloud data 701 indicating the facility 206 in the working space data shown in FIG. 10, but deletes the point cloud data from the areas 902 and 903 where the point cloud data 702 and 703 existed.
  • the correspondence calculation program 114 obtains a transformation matrix representing the correspondence between the working space data and the work space three-dimensional model after removing the point cloud data corresponding to the operator's hand and arm (606).
  • the correspondence calculation program 114 can obtain information indicating the correspondence by obtaining a position where the working space data and the work space three-dimensional model most closely match.
  • As the method for obtaining the correspondence between the working space data and the work space three-dimensional model, the correspondence calculation program 114 may use any method that obtains the position and orientation at which the two sets of three-dimensional data best match and acquires information indicating that position and orientation as the correspondence.
  • the correspondence calculation program 114 may use, for example, the aforementioned ICP algorithm or NDT. Then, the correspondence calculation program 114 may obtain a conversion matrix that rotates and translates the work space data so as to best match the three-dimensional work space model as information indicating the correspondence.
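  • As one concrete possibility (the patent leaves the choice of algorithm open), the alignment can be sketched with Open3D's point-to-point ICP, which returns the 4x4 rotation-and-translation matrix used as the correspondence here and in step 605:

```python
import numpy as np
import open3d as o3d

def align_point_clouds(source_pts, target_pts, max_dist=0.02, init=np.eye(4)):
    # Rigidly align `source_pts` to `target_pts`; max_dist bounds the
    # correspondence search and `init` is the initial position (see step 605).
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(source_pts))
    tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(target_pts))
    reg = o3d.pipelines.registration.registration_icp(
        src, tgt, max_dist, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return reg.transformation  # 4x4 transformation matrix
```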
  • Next, the work target detection program 110 uses the information obtained in step 606, which indicates the correspondence between the working space data and the work space three-dimensional model, to obtain the area in the working space data of the work target (corresponding to the area 405 of the work procedure 105) for the work procedure being executed (607).
  • Specifically, the work target detection program 110 obtains the transformation matrix that rotates and translates the work space three-dimensional model so as to best match the working space data, and then reads the work target area defined on the work space three-dimensional model shown in FIG. 5. The work target detection program 110 converts the read work target area using the obtained transformation matrix, thereby obtaining the work target area in the working space data, as sketched below.
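  • Converting a work target area with the obtained matrix can be sketched as follows; the cuboid is taken as the center-plus-sizes representation of areas 406 and 410, and the names are illustrative.

```python
import numpy as np

def region_to_working_space(center, size, transform):
    # Map the eight corners of a circumscribing cuboid defined on the work
    # space 3D model into the working space data coordinates using the 4x4
    # model-to-data transform obtained in step 606.
    cx, cy, cz = center
    sx, sy, sz = size
    corners = np.array([[cx + dx * sx / 2, cy + dy * sy / 2, cz + dz * sz / 2]
                        for dx in (-1, 1) for dy in (-1, 1) for dz in (-1, 1)])
    homo = np.hstack([corners, np.ones((8, 1))])  # homogeneous coordinates
    return (transform @ homo.T).T[:, :3]
```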
  • FIG. 11 is an explanatory diagram showing work target areas in the working space data of the first embodiment.
  • Areas 1001 and 1002 are work target areas in the working space data.
  • The information presentation program 108 outputs the information indicating the work target area in the working space data obtained in step 607, together with the work procedure being output, via the output device 103 (608).
  • FIG. 12 is an explanatory diagram showing a screen 1100 that displays information on the work procedure and the work target area on the distance image acquired by the depth sensor 203 of the first embodiment.
  • the screen 1100 includes an area 1101 and an area 1102.
  • An area 1101 displays the contents of the work procedure being executed as a character string.
  • An area 1102 displays a work target area in the work procedure being performed.
  • The area 1102 shown in FIG. 12 only shows the work target area. However, when the work procedure 105 also stores information such as a symbol indicating the operation method (including graphics such as arrows or circles) or an image or a moving image, the information presentation program 108 may display a symbol such as an arrow or a circle at the work target position, and may display an image or a moving image in the area 1102 or in its vicinity.
  • The information presentation program 108 may also obtain a conversion matrix for converting a position on the distance image into a position on a color image, and may then use this conversion matrix to display the area 1101 and the area 1102 on the color image.
  • the information presentation program 108 may display only the area 1101 and the area 1102 on the HMD.
  • After step 608, the relationship determination program 111 determines whether the worker's hand overlaps the work target (609), based on the worker's hand area in the working space data detected in step 605 and the work target area in the working space data obtained in step 607.
  • Specifically, from the point cloud data corresponding to the hand and arm that was deleted from the working space data, the relationship determination program 111 extracts only the point cloud data close to the hand portion of the three-dimensional model as the point cloud data corresponding to the hand. The relationship determination program 111 then takes the area of the extracted point cloud data as the worker's hand area in the working space data.
  • FIG. 13 is an explanatory diagram illustrating an area of the hand of the worker according to the first embodiment.
  • An area 1201 shown in FIG. 13 is an area corresponding to the hand.
  • An area 1001 shown in FIG. 13 is the work target area obtained in step 607.
  • For example, the relationship determination program 111 obtains a rectangular parallelepiped circumscribing the point cloud data corresponding to the hand, as shown by the area 1201 in FIG. 13, and takes the area 1201 as the hand area. In step 609, the relationship determination program 111 determines whether the worker's hand overlaps the work target by determining whether the obtained worker's hand area 1201 exists in the work target area 1001.
  • Note that the relationship determination program 111 may judge that the worker's hand overlaps the work target when the worker's hand area 1201 exists within a predetermined range of positions that includes the work target area 1001, as sketched below.
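  • A sketch of this overlap test is given below, with each area reduced to the axis-aligned box circumscribing its point cloud; the margin parameter stands in for the "predetermined range of positions" and is an assumption.

```python
def hand_overlaps_target(hand_box, target_box, margin=0.0):
    # Each box is (min_corner, max_corner), three coordinates each.
    # The hand is judged to overlap the work target when the boxes
    # intersect after enlarging the target box by `margin`.
    (hmin, hmax), (tmin, tmax) = hand_box, target_box
    return all(hmin[i] <= tmax[i] + margin and hmax[i] >= tmin[i] - margin
               for i in range(3))
```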
  • The relationship determination program 111 then branches according to the determination result of step 609 (610). When it is determined that the worker's hand area 1201 overlaps the work target area 1001, the relationship determination program 111 judges that the worker is performing the work and that the work state does not need to be determined, and the processing shown in FIG. 7 returns to step 604.
  • If it is determined in step 610 that the worker's hand area 1201 does not overlap the work target area 1001, the relationship determination program 111 judges that the worker has finished one work procedure and that this is the timing at which the work state should be determined. The relationship determination program 111 then causes the state determination program 112 to execute step 611.
  • By executing steps 609 and 610 to determine the positional relationship between the worker's hand and the work target, and executing step 611 when the worker's hand has moved away from the work target, the relationship determination program 111 of the first embodiment can efficiently extract an appropriate timing for determining the state of the work target.
  • In step 611, the state determination program 112 extracts the point cloud data corresponding to the work target from the working space data, based on the work target area 1001 obtained in step 607. Further, the state determination program 112 extracts from the normal model 106 the normal object (three-dimensional) model representing the ideal post-work state of the work target for the work procedure set in step 603 or step 614.
  • the state determination program 112 calculates a difference between the point cloud data of the work target in the working space data and the acquired normal object (three-dimensional) model.
  • the state determination program 112 may use any method as long as it calculates the difference between the point cloud data to be worked and the normal object model in step 611. A specific method is shown below.
  • For example, the state determination program 112 obtains, for each point of the point cloud data of the acquired normal object model, the difference in distance to the closest point of the work target point cloud data, and calculates the sum (or the average) of these distances as the difference between the two.
  • In doing so, the state determination program 112 may acquire from the working space data 115, as the work target point cloud data, the point cloud data of the work target area enlarged by a predetermined ratio.
  • the state determination program 112 uses the method described in step 605 and step 606 to align the positions of the normal object model and the point cloud data of the work target.
  • the state determination program 112 converts one point cloud data so as to best match the other point cloud data using the result of matching the positions.
  • Then, the state determination program 112 may obtain the difference between the work target area and the normal object model by calculating the distance difference between the point cloud data of the work target and the point cloud data of the normal object model, using the above-described method of summing the point distances.
  • In step 611, the state determination program 112 may instead obtain the similarity between the point cloud data of the work target and the ideal normal three-dimensional model, rather than the difference in distance between them.
  • If the obtained difference is larger than a predetermined threshold, the state determination program 112 determines in step 611 that the state of the work target is not appropriate; if the obtained difference is equal to or smaller than the predetermined threshold, the state determination program 112 determines in step 611 that the state of the work target is appropriate, as sketched below.
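  • The distance-sum comparison and threshold test of step 611 can be sketched as follows (nearest-neighbor distances via a k-d tree; the clouds are assumed to be already aligned as described above):

```python
import numpy as np
from scipy.spatial import cKDTree

def work_state_is_appropriate(target_pts, normal_model_pts, threshold):
    # For each point of the normal object model, find the closest work
    # target point; the mean (or sum) of those distances is the difference.
    tree = cKDTree(target_pts)
    dists, _ = tree.query(normal_model_pts, k=1)
    difference = dists.mean()
    return difference <= threshold, difference
```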
  • After step 611, the state determination program 112 branches according to the determination result of step 611 (612). If the work target state is appropriate, one work procedure has been completed without any problem, and the information presentation program 108 executes step 613.
  • If the work target state is not appropriate, the processing in FIG. 7 returns to step 604. In this case, the information presentation program 108 may use the output device 103 to present to the worker, by a message or a symbol, information indicating that the work has not been completed.
  • In step 613, the information presentation program 108 uses the output device 103 to present to the worker information indicating that the work procedure being performed has been completed. For example, the information presentation program 108 may present the message 1301 of FIG. 14 to the worker.
  • FIG. 14 is an explanatory view showing a screen displayed after one work procedure of the first embodiment is completed.
  • the message 1301 shown in FIG. 14 is displayed in step 613 and indicates that one work procedure has been completed without any problem.
  • Next, the information presentation program 108 refers to the work procedure 105 and sets the next work procedure as the work procedure to be executed (614). By outputting whether or not each work procedure has been completed in step 614, or when returning from step 612 to step 604, the information presentation program 108 can present to the worker the state of the work, which indicates whether the worker is performing the series of work appropriately. Thus, the work support system according to the first embodiment can appropriately navigate the worker.
  • After step 614, the information presentation program 108 checks whether all work procedures of the work being executed have been completed (615). If not all work procedures have been completed, step 604 is executed for the next work procedure. When all work procedures have been completed, the information presentation program 108 executes step 616.
  • In step 616, the information presentation program 108 uses the output device 103 to present to the worker information indicating that all work procedures have been completed; for example, it may present a message such as "work has been completed". After presenting the information to the worker, the information presentation program 108 ends the processing shown in FIG. 7.
  • the output device 103 may include a device that outputs sound or sound such as a speaker in addition to a display device such as a monitor or HMD generally used in a computer. Then, the information presentation program 108 may present information to the worker by sound or voice.
  • The relationship determination program 111 may also determine whether the worker's hand exists in a work target area related to a work procedure that is not being executed. When the relationship determination program 111 determines that the work target and the hand do not overlap but the worker's hand exists in a work target area related to a work procedure different from the one currently being executed, the information presentation program 108 may use the output device 103 to present to the worker a warning indicating that work is being performed on a different work target.
  • the information presentation program 108 can cause the user to review the work procedure and appropriately navigate the work.
  • the relationship determination program 111 may determine whether the hand region overlaps the work target region based on the results of steps 604 to 610 executed a plurality of times in the past.
  • For example, the relationship determination program 111 may determine whether the worker's hand exists in the work target area by executing steps 604 to 609 a plurality of times at predetermined time intervals. In this case, if the relationship determination program 111 determines that the worker's hand existed in the work target area for a predetermined time and then did not exist in the work target area for a predetermined time, it may determine in step 610 that the hand region does not overlap the work target region.
  • the work state can be determined by the state determination program 112 at an appropriate timing.
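  • The multi-frame judgment above can be sketched as a small debouncer; the class name and the t_in/t_out parameters are illustrative stand-ins for the "predetermined time".

```python
import time

class HandPresenceDebouncer:
    # Report that a work procedure may have ended only after the hand was
    # continuously inside the work target area for t_in seconds and then
    # continuously outside it for t_out seconds.
    def __init__(self, t_in=1.0, t_out=1.0):
        self.t_in, self.t_out = t_in, t_out
        self.entered_at = None          # start of the current "inside" run
        self.left_at = None             # start of the current "outside" run
        self.stayed_long_enough = False

    def update(self, hand_in_area, now=None):
        # Call once per acquisition cycle (steps 604-609); returns True
        # when the state determination of step 611 should run.
        now = time.monotonic() if now is None else now
        if hand_in_area:
            self.entered_at = self.entered_at or now
            if now - self.entered_at >= self.t_in:
                self.stayed_long_enough = True
            self.left_at = None
            return False
        self.entered_at = None
        if not self.stayed_long_enough:
            return False
        self.left_at = self.left_at or now
        if now - self.left_at >= self.t_out:
            self.stayed_long_enough = False
            self.left_at = None
            return True
        return False
```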
  • In addition, the relationship determination program 111 may identify whether the detected hand is one hand or both hands. Specifically, when the relationship determination program 111 determines that both hands were present in the work target area before a certain time and one hand is no longer present in the work target area after that time, it may determine that the work has been completed and execute step 611.
  • This identification is useful, for example, in work where the work target must always be supported with one hand; even in such work, an appropriate timing for determining the work state can be extracted.
  • In addition, when the relationship determination program 111 determines in step 610 that the worker's hand has existed in the work target area for more than a predetermined time, the information presentation program 108 may present to the worker a warning or caution indicating that the work is delayed or the like.
  • In step 611, the state determination program 112 extracts the point cloud data corresponding to the work target from the working space data, based on the work target area in the working space data obtained in step 607.
  • the point cloud data may be stored in a database such as the storage device 107 in association with the position of the work target on the work space model 104 and the work target in the work procedure 105.
  • When the work support system according to the first embodiment thus holds data representing the time-series change of each work target, the administrator can confirm, for example, the change of the work target over time in cases where the work to be performed is an inspection of the work target.
  • the information presentation program 108 may accept a correct work procedure input by the operator via a keyboard, mouse, touch panel, or the like. Then, the information presentation program 108 may set the accepted work procedure.
  • Alternatively, the information presentation program 108 may acquire the movement of the worker's hand with the depth sensor and shift to the correct work procedure when the worker's hand makes a predetermined movement.
  • In step 611, it is determined whether the difference between the point cloud data of the work target extracted from the working space data and the normal object model is larger than a predetermined threshold; however, the content of the information presented to the worker may be changed based on the magnitude of the difference.
  • For example, if the difference obtained in step 611 is equal to or smaller than a predetermined first threshold, the information presentation program 108 may present to the worker an indication that the work procedure being executed has been completed (such as "the XX procedure has been completed"). If the difference is larger than the first threshold but smaller than a predetermined second threshold (> first threshold), the information presentation program 108 may present to the worker an indication that the state is close to the correct one (such as "turn a little more").
  • The work procedure 105 may hold in advance, for each work procedure, the thresholds according to the work content and the contents to be presented to the worker, as sketched below.
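  • The two-threshold feedback can be sketched as follows; the messages and parameter names are placeholders for the per-procedure values held in the work procedure 105.

```python
def feedback_message(difference, first_threshold, second_threshold):
    # first_threshold < second_threshold, both taken from the work procedure.
    if difference <= first_threshold:
        return "The procedure has been completed."
    if difference < second_threshold:
        return "Close to the correct state (e.g., turn a little more)."
    return "The procedure has not been completed yet."
```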
  • Further, when the processing of steps 604 to 611 is repeated for more than a predetermined time without the work procedure being completed, the information presentation program 108 may use the output device 103 to present to the worker a warning or a reminder indicating that the work is delayed. Thereby, the information presentation program 108 can appropriately navigate the worker.
  • As described above, the work support system according to the first embodiment determines whether one work procedure in a continuous work has finished by determining whether the work target area and the hand area overlap. For this reason, the work support system according to the first embodiment can determine the state of the work target at an appropriate timing and can effectively switch the work procedure information presented to the worker. As a result, the worker can be navigated effectively so as to reliably perform the work.
  • Furthermore, since the work support system determines the state of the work target based on three-dimensional point cloud data, the determination accuracy can be improved.
  • Embodiment 2 of the present invention will be described with reference to FIGS.
  • In the first embodiment, a depth sensor capable of acquiring three-dimensional data is used as the means by which the input device 102 inputs the working space data. In the second embodiment, a general image is acquired as the working space data instead.
  • the work support system according to the second embodiment includes a camera instead of the depth sensor 203, and the camera according to the second embodiment acquires the work space data as a two-dimensional image.
  • The configuration in the second embodiment is the same as in FIG. 1. However, the work space three-dimensional model held by the work space model 104 is a three-dimensional model using polygons, as generally used in CG and the like, and the normal model 106 holds images representing the ideal states of the work targets. Further, the processing executed by the information presentation program 108 through the state determination program 112 is the processing shown in FIG. 15.
  • FIG. 15 is a flowchart illustrating processing by the work support system according to the second embodiment.
  • steps 601 to 603, step 608, step 610, and steps 612 to 616 are the same processing as in the first embodiment.
  • the processing shown in FIG. 15 is different from the processing shown in FIG. 7 in steps 1401 to 1406 in the second embodiment.
  • In step 1401, the input program 113 first acquires an image as the working space data using a camera.
  • FIG. 16 is an explanatory diagram showing an image acquired as working space data of the second embodiment.
  • the image shown in FIG. 16 is acquired by the camera of Example 2 and shows the working space data.
  • The image shown in FIG. 16 includes the facility 1501 and the images 1502 and 1503 of the worker's hands.
  • the hand detection program 109 detects the operator's hand and arm from the working space data acquired in step 1401 (1402).
  • the method for detecting the hand and arm in step 1402 may be any method as long as the hand detection program 109 recognizes a region having a specific shape from the image.
  • the hand detection program 109 may detect the area of the operator's hand and arm from the working space data by detecting an area having the same color as the hand and arm.
  • Alternatively, the hand detection program 109 may hold in advance a plurality of representative template images of the worker's hand and arm, superimpose each template image on the working space data while changing its position and rotation angle, and detect the worker's hand and arm area from the working space data by finding the template image that best matches the working space data, together with its position and rotation angle, as sketched below.
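  • The template search can be sketched with OpenCV's normalized cross-correlation; the rotation-angle search is omitted for brevity, and the function name is illustrative.

```python
import cv2

def detect_hand_region(frame_gray, templates):
    # Slide each pre-registered hand/arm template over the working space
    # image and keep the best-scoring location.
    best_box, best_score = None, -1.0
    for tmpl in templates:
        res = cv2.matchTemplate(frame_gray, tmpl, cv2.TM_CCOEFF_NORMED)
        _, score, _, top_left = cv2.minMaxLoc(res)
        if score > best_score:
            h, w = tmpl.shape[:2]
            best_box, best_score = (top_left[0], top_left[1], w, h), score
    return best_box, best_score  # (x, y, width, height) and match confidence
```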
  • Next, the correspondence calculation program 114 deletes the detected hand and arm regions from the working space data (1403). Specifically, when the hand detection program 109 detects as the worker's hand and arm an area having the same color as the hand and arm, the correspondence calculation program 114 may delete the hand and arm regions by painting all the pixels in the detected regions with a single color (for example, black).
  • When template images are used, the correspondence calculation program 114 may delete the hand and arm regions by painting with a single color all the pixels of the region in the working space data over which the best-matching template image was superimposed.
  • the correspondence calculation program 114 further obtains a correspondence between the work space data after deleting the hand and arm regions of the worker and the work space three-dimensional model.
  • the correspondence calculation program 114 may obtain the correspondence by using a method of aligning positions so that the image of the work space data and the work space three-dimensional model best match.
  • For example, the correspondence calculation program 114 may obtain the correspondence by using a method based on edge information extracted from the image and from the CG model (V. Lepetit, L. Vacchetti, D. Thalmann and P. Fua, "Fully Automated and Stable Registration for Augmented Reality Applications," Proc. ISMAR 2003, pp. 93-102, 2003).
  • In this case, because edge information is also extracted near the boundaries of the deleted hand and arm regions, the correspondence calculation program 114 excludes such edge information.
  • Similarly, when another method is used to obtain the correspondence between the working space data and the work space three-dimensional model, the correspondence calculation program 114 obtains the correspondence after excluding the information extracted from the worker's hand and arm regions and their boundaries.
  • As a result, the correspondence calculation program 114 obtains, as the correspondence, a transformation matrix that rotates and translates the work space three-dimensional model so as to best match the working space data.
  • Next, the work target detection program 110 uses the transformation matrix obtained in step 1403, which represents the correspondence between the working space data and the work space three-dimensional model, to obtain the area in the working space data of each work target related to the work procedure being executed (1404). That is, in step 1404, the work target detection program 110 of the second embodiment obtains the corresponding area on the image from the three-dimensional area defined on the work space three-dimensional model.
  • For example, when the work target area is represented by a cuboid, the work target detection program 110 converts the coordinates of each vertex of the cuboid to positions on the image by perspective projection using the transformation matrix obtained in step 1403. The work target detection program 110 then obtains the area on the image of the work target by connecting the converted vertex positions on the image with straight lines according to the relationships between the vertices.
  • When the work target area is represented by another shape, the work target detection program 110 selects a plurality of characteristic points on the edges or vertices of the area, converts their three-dimensional positions to positions on the image by perspective projection, and obtains the area on the image of the work target as the area formed by connecting the converted points, as sketched below.
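  • The projection of region vertices to image positions can be sketched as follows, assuming pinhole intrinsics (fx, fy, cx, cy) for the camera; the names are illustrative.

```python
import numpy as np

def project_region_to_image(points_3d, transform, fx, fy, cx, cy):
    # Apply the rigid model-to-camera transform from step 1403, then
    # perspective-project each vertex onto the image plane.
    homo = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    cam = (transform @ homo.T).T[:, :3]
    u = fx * cam[:, 0] / cam[:, 2] + cx
    v = fy * cam[:, 1] / cam[:, 2] + cy
    return np.stack([u, v], axis=1)  # connect these points to outline the area
```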
After step 1404, the information presentation program 108 executes step 608. After step 608, the relationship determination program 111 determines, based on the region of the worker's hand in the working space data detected in step 1402 and the work target region in the working space data obtained in step 1404, whether the worker's hand overlaps the work target, as in step 609 of the first embodiment (1405). In step 1405, the relationship determination program 111 determines whether the region of the worker's hand exists within the work target region or within a predetermined position range including the work target region. As the region of the worker's hand in the working space data, the relationship determination program 111 selects only the pixels corresponding to the hand portion from the pixels inside the hand and arm regions deleted from the working space data in step 1403.

FIG. 17 is an explanatory diagram illustrating the region of the worker's hand in the second embodiment. A region 1601 shown in FIG. 17 is the hand region, and a region 1602 is the work target region. The relationship determination program 111 obtains a rectangle circumscribing the pixels determined to correspond to the hand and regards the rectangular area as the hand region. In step 1405, the relationship determination program 111 determines whether the obtained region 1601 of the worker's hand overlaps the work target region 1602; a sketch of this test follows.

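A minimal sketch of the circumscribing rectangle and the overlap test, assuming the hand pixels are given as a binary NumPy mask; the helper names are hypothetical.

```python
import numpy as np

def hand_rect(hand_mask: np.ndarray):
    """Rectangle (x0, y0, x1, y1) circumscribing the pixels labeled as hand."""
    ys, xs = np.nonzero(hand_mask)
    return xs.min(), ys.min(), xs.max(), ys.max()

def rects_overlap(a, b) -> bool:
    """True when two (x0, y0, x1, y1) rectangles intersect."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1
```
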
After step 1405, the relationship determination program 111 executes step 610. In step 610 of the second embodiment, when it is determined that the region 1601 of the worker's hand does not overlap the work target region 1602, the state determination program 112 starts step 1406. In step 1406, the state determination program 112 extracts the image corresponding to the work target from the working space data based on the work target region obtained in step 1404, and acquires an image representing the ideal state of the corresponding work target from the normal model 106.

The state determination program 112 then compares the work target image in the working space data with the ideal work target image and calculates the difference between the two. The work target image in the working space data and the image registered as the normal object model do not need to have been photographed from the same position and direction; the state determination program 112 may obtain the difference after aligning the positions so that the two images best match, for example by using a two-dimensional tracking technique (Hirohiko Sagawa, Yudai Urano, Tsuneya Kurihara, "Development of a work support system using simple AR based on two-dimensional tracking," Proceedings of the Human Interface Symposium 2013, 2013); a sketch of one such comparison follows.

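As one possible stand-in for such an align-then-compare step, the sketch below uses OpenCV template matching rather than the cited two-dimensional tracking method; it assumes the ideal image is no larger than the cropped scene region and that both images share the same size convention and format.

```python
import cv2
import numpy as np

def target_difference(scene_crop: np.ndarray, ideal_img: np.ndarray) -> float:
    """Align the ideal image inside the cropped work target and return a
    mean absolute pixel difference (small value = close to the ideal state)."""
    res = cv2.matchTemplate(scene_crop, ideal_img, cv2.TM_CCOEFF_NORMED)
    _, _, _, top_left = cv2.minMaxLoc(res)  # location of the best match
    x, y = top_left
    h, w = ideal_img.shape[:2]
    aligned = scene_crop[y:y + h, x:x + w]
    return float(np.mean(cv2.absdiff(aligned, ideal_img)))
```
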
After step 1406, the state determination program 112 executes step 612. As described above, the work support system of the second embodiment can determine the state of the work target at an appropriate timing and can effectively switch the work procedure information presented to the worker.

Embodiment 3 of the present invention will be described with reference to FIG. 18. In Example 1, when it is determined that the worker's hand does not overlap the work target, the process proceeds to calculating the difference between the work target data in the working space data and the normal object model. The work support system of the third embodiment additionally executes a process of determining whether a pre-registered hand motion of the worker has been performed. Specifically, the work support system executes the process of calculating the difference between the work target data in the working space data and the normal object model when the worker's hand no longer exists in the work target region after the predetermined hand motion has been performed.

The configuration of the work support system of the third embodiment is the same as that shown in FIG. 1, except that the work procedure 105 of the third embodiment holds information on hand motions for determining whether a registered motion has been performed. An administrator or the like may store the information on hand motions in association with the information on each work procedure in the format of the work procedure 105 shown in FIG. 4, or the work support system may include a storage area that holds the name of each work procedure together with the information on the hand motion. The information on the hand motion may be, for example, time-series data of position coordinates representing the hand trajectory and rotation matrices representing its orientation; it may be any information that represents changes in the trajectory and posture of the hand, such as a representation of the moving direction by symbols. A possible record layout is sketched below.

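One possible record layout for such time-series data, assuming positions and rotation matrices as NumPy arrays; the class and field names are illustrative assumptions.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class HandSample:
    t: float              # timestamp in seconds
    position: np.ndarray  # (3,) hand position in workspace coordinates
    rotation: np.ndarray  # (3, 3) rotation matrix for the hand orientation

# A registered hand motion would then be a list[HandSample] stored in
# association with the corresponding work procedure.
```
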
FIG. 18 is a flowchart illustrating the processing by the work support system of the third embodiment. Steps 601 to 608, step 609, and steps 611 to 616 are the same as those of the first embodiment; steps 1701 and 1702 differ from the processing of the first embodiment shown in FIG. 7.

The process of step 1701 is executed by the relationship determination program 111 of the third embodiment. The relationship determination program 111 obtains the hand position and posture from the hand region detected in step 605, and determines whether the predetermined motion has been performed by comparing the time-series data of hand positions and postures obtained in the past executions of step 608, up to the start of the current step 608, with the information on the predetermined hand motion registered in association with the work procedure 105 being executed (1701).

As the position of the hand, the relationship determination program 111 may, for example, regard the position of the center of gravity of the hand region, or may match the hand region detected from the working space data against a detailed hand model and take the palm center position as the hand position. The relationship determination program 111 can obtain the hand posture by finding the cuboid that best circumscribes the region of the worker's hand and calculating a rotation matrix representing its orientation, or by calculating a rotation matrix representing the direction of the palm by matching the detected hand region against the detailed hand model.

To compare the time-series data of hand positions and postures up to the execution of step 1701 with the predetermined hand motion, the relationship determination program 111 may use a well-known time-series comparison method, for example dynamic programming (DP matching) (Uchida, "DP Matching Overview: Basics and Various Extensions," IEICE Technical Report, PRMU 2006-166, pp. 31-36, 2006). The relationship determination program 111 calculates an index representing the difference or similarity between the two from the comparison result, and may determine whether the predetermined motion has been performed by comparing the calculated index with a predetermined threshold: when the index represents a difference, the motion is determined to have been performed if the index is smaller than the threshold; when the index represents a similarity, the motion is determined to have been performed if the index is greater than the threshold. A DP-matching sketch follows.

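A minimal DP-matching (dynamic time warping) sketch along the lines of the cited method, assuming each frame is encoded as a fixed-length feature vector; the feature encoding and threshold are assumptions for illustration.

```python
import numpy as np

def dtw_distance(seq_a: np.ndarray, seq_b: np.ndarray) -> float:
    """DP-matching distance between two pose sequences of shape (N, D) and (M, D),
    e.g. concatenated position and flattened rotation features per frame."""
    n, m = len(seq_a), len(seq_b)
    d = np.full((n + 1, m + 1), np.inf)
    d[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            d[i, j] = cost + min(d[i - 1, j], d[i, j - 1], d[i - 1, j - 1])
    return float(d[n, m])

# performed = dtw_distance(observed, registered) < THRESHOLD
```
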
When it is determined that the predetermined motion has been performed, the relationship determination program 111 stores the time of that determination in the storage device 107 or the like, and then executes step 609. Next, the relationship determination program 111 determines, based on the determination results of steps 1701 and 609, whether the hand has moved away from the work target after the worker's hand performed the predetermined motion (1702). Specifically, when the predetermined motion was performed before the start of step 1702 and the worker's hand does not exist in the work target region after the motion was performed, the relationship determination program 111 determines that the work target and the hand do not overlap, and causes the state determination program 112 to execute step 611. This gating can be outlined as follows.

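In outline, the step 1702 gating could look like the following sketch; the function and argument names are hypothetical.

```python
from typing import Optional

def ready_for_state_check(motion_done_time: Optional[float],
                          hand_in_target_region: bool) -> bool:
    """The work target state is examined only after the registered motion has
    been observed (motion_done_time is set) and the hand has since left the
    work target region."""
    return motion_done_time is not None and not hand_in_target_region
```
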
The third embodiment described above can also be applied to the second embodiment, in which an image acquired from a camera is used as the working space data. In step 1701 described above, whether the predetermined hand motion was performed is determined based only on the comparison between the time-series data of the position and posture of the worker's hand and the predetermined hand motion. However, the relationship determination program 111 may determine that the predetermined hand motion has been performed only when the motion is performed within a predetermined position range from the work target region. Alternatively, the relationship determination program 111 may extract the direction of the hand and the direction of the hand movement, and determine that the predetermined hand motion has been performed when the work target region exists in the extracted direction of movement and the movement is close to the predetermined motion. In this way, the relationship determination program 111 can regard the timing at which a specific movement indicating the end of a work procedure is performed as an appropriate timing for determining the work state; for example, it can proceed to step 611, which determines the state of the work target, at the timing when a "pointing" motion is performed toward the work target.

In step 1701, the relationship determination program 111 may also extract the direction of the hand movement regardless of whether the predetermined hand motion is determined to have been performed, and determine whether the work target in the extracted direction is the work target related to the work procedure being executed. When it is not, the relationship determination program 111 may use the information presentation program 108 and the output device 103 to present a warning to the worker indicating that work is being performed on a different work target. By doing so, the information presentation program 108 can cause the worker to review the work procedure, the relationship determination program 111 can prompt the worker to confirm the content of the work again, and the worker can be navigated appropriately.

When a predetermined time has elapsed without the worker's hand performing the predetermined motion in step 1702, the relationship determination program 111 may use the information presentation program 108 and the output device 103 to present a warning or alert to the worker indicating, for example, that the work is delayed. Because the work support system of the third embodiment determines whether the state of the work target is appropriate based on whether the worker has performed the predetermined motion and whether the hand has been released from the work target, it can determine the end of a work procedure at an appropriate timing after the work has been appropriately executed, and can thus effectively navigate the worker at an appropriate timing.

Embodiment 4 of the present invention will be described with reference to FIGS. 19 and 20. The work support system of Example 1 shifted to the process of calculating the difference between the work target data in the working space data and the normal object model based on the relationship between the worker's hand and the work target. The work support system of the fourth embodiment assumes the case where the worker performs the work while holding a tool or the like: it determines the positional relationship between the work target region and the worker's hand, and further, when the worker's hand does not overlap the work target region, determines whether a tool or the like exists between the worker's hand and the work target region. When it does, the work support system of the fourth embodiment determines whether the region of the tool or the like overlaps the work target region. The work support system of the fourth embodiment then executes the process of calculating the difference between the work target data in the working space data and the normal object model when neither the hand nor the tool overlaps the work target region, and determines whether the work procedure has been completed normally.

The configuration of the work support system of the fourth embodiment is the same as that of FIG. 1, but its processing is partially different. FIG. 19 is a flowchart illustrating the processing by the work support system of the fourth embodiment. Steps 601 to 609 and steps 611 to 616 are the same as those of the first embodiment; steps 1801 and 1802 differ from the processing shown in FIG. 7.

The process of step 1801 is executed by the relationship determination program 111 of the fourth embodiment. The relationship determination program 111 detects an object such as a tool, and determines whether the object overlaps the work target (1801). The relationship determination program 111 may execute step 1801 in particular when it is determined in step 609 that the worker's hand does not overlap the work target region. By doing so, the relationship determination program 111 prevents an erroneous determination in step 1802 that a work procedure has been completed.

FIG. 20 is an explanatory diagram showing the working space data when the worker of the fourth embodiment is working while holding a tool or the like. In FIG. 20, a region 1901 is point cloud data corresponding to an object such as a tool. The region 1001 is the work target region obtained in step 607, as in the region 1001 shown in FIGS. 11 and 13, and the region 1201 is the region of the worker's hand obtained in step 609, as in the region 1201 shown in FIG. 13. In step 1801, the relationship determination program 111 extracts, from the working space data, the point cloud data in the space between the work target region 1001 and the worker's hand region 1201. When such point cloud data exists, the relationship determination program 111 can determine that a tool or the like held by the worker exists over the work target region.

To detect the tool, the relationship determination program 111 may hold in advance a three-dimensional model of the tool or the like to be used, in the same manner as the hand detection method described in the first embodiment. The relationship determination program 111 extracts the point cloud data between the work target region and the worker's hand region from the working space data, and may align the positions so that the extracted point cloud data and the three-dimensional model of the tool best match. The relationship determination program 111 may then determine that the worker is holding the tool or the like when the difference between the three-dimensional model of the tool and the extracted point cloud data is smaller than a predetermined threshold. As in the case of hand detection, the relationship determination program 111 may specify the region of the tool or the like in the working space data based on the result of the alignment between the three-dimensional model and the extracted point cloud data, and may further determine whether the tool or the like overlaps the work target region based on the overlap between the specified tool region and the work target region. The work procedure 105 may hold the three-dimensional model of the tool or the like to be used in association with each work procedure. A sketch of extracting the in-between points follows.

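A minimal sketch of extracting the point cloud data between the two regions, assuming axis-aligned (lo, hi) boxes and NumPy point arrays; the subsequent alignment with the tool model (e.g., an ICP-style routine) is left abstract, and all names are illustrative.

```python
import numpy as np

def in_box(points: np.ndarray, lo: np.ndarray, hi: np.ndarray) -> np.ndarray:
    """Boolean mask of the points inside an axis-aligned box (lo, hi)."""
    return np.all((points >= lo) & (points <= hi), axis=1)

def points_between(points: np.ndarray, hand_box, target_box) -> np.ndarray:
    """Working space points lying between the hand region and the work target
    region, excluding the points inside either region itself."""
    lo = np.minimum(hand_box[0], target_box[0])
    hi = np.maximum(hand_box[1], target_box[1])
    hull = in_box(points, lo, hi)
    return points[hull & ~in_box(points, *hand_box) & ~in_box(points, *target_box)]

# If the extracted points can be aligned to the registered tool model with a
# residual below a threshold, the worker is regarded as holding the tool.
```
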
The relationship determination program 111 determines, based on the determinations of steps 609 and 1801, whether either the worker's hand or an object such as a tool overlaps the work target region (1802). When neither the worker's hand nor the object such as a tool is included in (overlaps) the work target region, the state determination program 112 starts step 611; when the worker's hand or the object such as a tool is included in the work target region, the input program 113 starts step 604.

Example 4 can also be applied to Example 2, in which an image acquired from a camera is used as the working space data. Even when the worker performs work while holding a tool or the like, the process of determining the state of the work target can be executed at an appropriate timing, so this embodiment is applicable to more general work.

Note that a part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment. It is also possible to add, delete, or replace other configurations for a part of the configuration of each embodiment. Each of the above-described configurations, functions, processing units, processing procedures, and the like may be realized in hardware by designing a part or all of them with, for example, an integrated circuit, or may be realized in software by a processor interpreting and executing a program that implements each function. Information such as programs, tables, and files that implement the functions can be stored in a storage device such as a memory, a hard disk, or an SSD (Solid State Drive), or on a recording medium such as an IC card, an SD card, or a DVD. The control lines and information lines shown are those considered necessary for the explanation; not all control lines and information lines in an actual product are necessarily shown. In practice, almost all components may be considered to be connected to each other.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

A work assistance system comprising: a camera which outputs a captured image of a target object, which is an object to be worked on, and of the surroundings thereof; a storage unit which stores a first model representing a three-dimensional model of the target object; a detection unit which detects, from the image, both a hand of a user performing the work, and the target object; a relationship determination unit which determines from the image whether the hand and the target object overlap each other; a condition determination unit which, if the hand and the target object do not overlap each other, extracts first information about the target object from the image and second information about the target object from the first model, and determines the working conditions on the basis of the results of a comparison between the first information and the second information; and an output unit which outputs information based on the working conditions.

Description

Work support system and work support method

Conventionally, in the work of assembling parts, a technique has been proposed in which the positions where the parts are to be attached are displayed superimposed on the parts in the work space using a projector, the assembled state of each part is determined from an image when the worker presses a switch at the completion of each assembly, and, if the state passes, the process proceeds to the next assembly step.

Furthermore, techniques have been proposed for determining, from the worker's posture, position, and moving direction together with work procedure data, whether the work the worker is about to perform is correct; for determining that work has been performed on a work target by detecting that a touch sensor attached to the work target has been touched; and for determining that the work has been completed correctly by detecting changes in sensor data obtained from the work target (see, for example, Patent Document 1).

Patent Document 1: JP 2005-250990 A

By using these conventional techniques, it is possible to construct a work support system that presents the work procedures to the worker in order and navigates the worker.

On the other hand, by using a technique for determining the state of the work target based on an image and the technique for detecting changes in sensor data described in Patent Document 1, it is possible to construct a work support system that determines that the work has been completed correctly and automatically presents information on the next work procedure.

However, when a method of constantly determining the state of the work target based on images is used, there is a problem that the possibility of erroneously determining that the work has been completed increases.

In order to solve the above problems, the present invention provides a work support system comprising: a camera that outputs a captured image of a target object of the work and its surroundings; a storage unit that stores a first model representing a three-dimensional model of the target object; a detection unit that detects, from the image, the hand of the user performing the work and the target object; a relationship determination unit that determines from the image whether the hand and the target object overlap; a state determination unit that, when the hand and the target object do not overlap, extracts first information on the target object from the image and second information on the target object from the first model, and determines the work state based on a result of comparing the first information with the second information; and an output unit that outputs information according to the work state.

According to the present invention, the worker can be effectively navigated so that the work can be performed reliably.

Problems, configurations, and effects other than those described above will become apparent from the following description of the embodiments.

FIG. 1 is a block diagram illustrating the physical configuration of the work support system of Example 1.
FIG. 2 is an explanatory diagram illustrating work in the work support system of Example 1.
FIG. 3 is an explanatory diagram illustrating the point cloud data of the equipment of the work support system of Example 1.
FIG. 4 is an explanatory diagram illustrating the data format stored in the work procedure of Example 1.
FIG. 5 is an explanatory diagram illustrating the work target regions and the workspace three-dimensional model of Example 1.
FIG. 6 is a functional block diagram illustrating the processing by the work support system of Example 1.
FIG. 7 is a flowchart illustrating the processing by the work support system of Example 1.
FIG. 8 is an explanatory diagram illustrating the point cloud data of the working space data of Example 1.
FIG. 9 is an explanatory diagram illustrating a three-dimensional model representing the shapes of a hand and arm of Example 1.
FIG. 10 is an explanatory diagram illustrating the working space data after the point cloud data corresponding to the worker's hand and arm of Example 1 has been deleted.
FIG. 11 is an explanatory diagram illustrating the work target region in the working space data of Example 1.
FIG. 12 is an explanatory diagram illustrating a screen displaying the work procedure and information on the work target region on the distance image acquired by the depth sensor of Example 1.
FIG. 13 is an explanatory diagram illustrating the region of the worker's hand of Example 1.
FIG. 14 is an explanatory diagram illustrating a screen displayed after one work procedure of Example 1 is completed.
FIG. 15 is a flowchart illustrating the processing by the work support system of Example 2.
FIG. 16 is an explanatory diagram illustrating an image acquired as the working space data of Example 2.
FIG. 17 is an explanatory diagram illustrating the region of the worker's hand of Example 2.
FIG. 18 is a flowchart illustrating the processing by the work support system of Example 3.
FIG. 19 is a flowchart illustrating the processing by the work support system of Example 4.
FIG. 20 is an explanatory diagram illustrating the working space data when the worker of Example 4 is working while holding a tool or the like.

The work support system of this embodiment, which navigates a worker's work by presenting the work content and the work target to the worker at an appropriate timing in various types of work such as construction or the maintenance and inspection of equipment, is described below.

Embodiment 1 of the present invention will be described with reference to FIGS. 1 to 14.

FIG. 1 is a block diagram illustrating the physical configuration of the work support system of the first embodiment.

FIG. 1 shows the configuration of the work support system when a general work support system executes the processing of this embodiment.

The work support system of this embodiment includes an information processing apparatus 101, an input device 102, an output device 103, a work space model 104, a work procedure 105, a normal model 106, a storage device 107, and working space data 115.

The information processing apparatus 101 is a device for executing the various programs stored in the storage device 107, and includes at least one processor and a memory.

The processor included in the information processing apparatus 101 is, for example, a CPU. The memory included in the information processing apparatus 101 includes a ROM, which is a nonvolatile storage element, and a RAM, which is a volatile storage element. The ROM stores an invariable program (for example, a BIOS).

The RAM is a high-speed, volatile storage element such as a DRAM (Dynamic Random Access Memory), and temporarily stores the programs stored in the storage device 107 and the data used when the programs are executed.

The storage device 107, the work space model 104, the work procedure 105, the normal model 106, and the working space data 115 are, for example, large-capacity, nonvolatile storage devices such as magnetic storage devices (HDD) or flash memory (SSD), and store the programs executed by the information processing apparatus 101 and the data used when the programs are executed.

The input device 102 includes a device that acquires, as the working space data of this embodiment, data obtained by photographing a work target, which is an object on which the worker performs work, and the space in which the work target exists. The input device 102 also receives instructions from the worker and passes them to the programs. The input device 102 may include an input device of a general computer, such as a keyboard, a mouse, or a touch panel.

In the first embodiment, the input device 102 includes a depth sensor that acquires a distance image whose pixel values are the distances to objects. The input device 102 converts the distance image acquired by the depth sensor into point cloud data, which represents the information of the target space as a set of points in three-dimensional space, by using the focal length and the center position of the depth sensor. The working space data in this embodiment is mainly expressed as point cloud data; a conversion sketch follows.
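The conversion from a distance image to point cloud data can be sketched as follows, assuming a pinhole depth camera with focal lengths fx, fy and principal point (cx, cy); the function name and parameterization are illustrative assumptions.

```python
import numpy as np

def depth_to_points(depth: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    """Convert a depth image (distance per pixel) into an N x 3 point cloud
    using the depth sensor's focal lengths and principal point."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels
```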

The output device 103 is a device that presents the results of the processing by the work support system of this embodiment to the worker. The output device 103 may include, for example, a monitor used in a general work support system, or a head-mounted display device called an HMD (head mounted display).

The work space model 104 is a storage area that stores a workspace three-dimensional model representing the entire space in which the work is performed. The workspace three-dimensional model may be expressed in the work space model 104 by point cloud data, like the working space data.

A general work support system desirably stores information on a plurality of works, and the work support system of this embodiment also uses a different workspace three-dimensional model for each work. For this reason, the work space model 104 stores the point cloud data representing a workspace three-dimensional model in association with an identifier (for example, a name) identifying the work in which that model is used.

Note that the work space model 104 may store information on a three-dimensional model using polygons as generally used in CG and the like. A program that accesses the work space model 104 may divide the polygons in the work space model 104 until they reach a predetermined size and extract only the coordinates of the vertices of each divided polygon; in this way, the program can easily convert the polygons representing the three-dimensional model into point cloud data indicating the extracted coordinates, as in the sketch below.
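A minimal sketch of such a polygon-to-point-cloud conversion for triangles, subdividing until an edge-length threshold is met; the recursion scheme and names are illustrative assumptions.

```python
import numpy as np

def triangle_to_points(tri: np.ndarray, max_edge: float) -> list:
    """Recursively split a triangle (3 x 3 array of vertices) until every edge
    is shorter than max_edge, then return the accumulated vertices."""
    a, b, c = tri
    if max(np.linalg.norm(a - b), np.linalg.norm(b - c),
           np.linalg.norm(c - a)) <= max_edge:
        return [a, b, c]
    ab, bc, ca = (a + b) / 2, (b + c) / 2, (c + a) / 2  # edge midpoints
    pts = []
    for sub in ([a, ab, ca], [ab, b, bc], [ca, bc, c], [ab, bc, ca]):
        pts += triangle_to_points(np.array(sub), max_edge)
    return pts
```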

The work procedure 105 includes a series of work procedures for carrying out the work, information for specifying the work target, which is the object on which the work is performed, and information on the position of the work target in the workspace three-dimensional model. The work procedure 105 may also include information indicating the state of the work target, such as its size, shape, color, and temperature.

The normal model 106 includes normal object (three-dimensional) models representing the ideal post-work state (position, size, shape, color, temperature, and the like) of each work target indicated by the work procedure 105. The three-dimensional models stored in the normal model 106 may be expressed by point cloud data, like the working space data and the workspace three-dimensional model.

The ideal state of the three-dimensional model corresponding to a given work target may differ depending on the work procedure even within the same work. For this reason, the normal model 106 holds the point cloud data representing a work target in association with information on the related work and work procedure (the work name 401 and the work procedure areas 403 and 407 described later, and the like).

The normal model 106 may hold three-dimensional models using polygons, and a program that accesses the normal model 106 may convert the polygons into point cloud data by the method described above.

The working space data 115 includes working space data indicating the work space acquired while the work is being carried out. The working space data 115 includes point cloud data indicating the shapes, sizes, and positions of the objects in the work space, and may also include data indicating, for example, the temperatures of those objects.

The storage device 107 includes an information presentation program 108, an input program 113, a correspondence calculation program 114, a hand detection program 109, a work target detection program 110, a relationship determination program 111, and a state determination program 112. The information presentation program 108 controls the other programs held in the storage device 107 and presents information to the worker according to the contents of the work procedure 105.

The input program 113 acquires the working space data using the input device 102 after the work starts. The correspondence calculation program 114 obtains the correspondence between the working space data and the workspace three-dimensional model. The hand detection program 109 detects the region of the worker's hand from the working space data.

The work target detection program 110 detects the region of the work target in the working space data based on the correspondence obtained by the correspondence calculation program 114.

The relationship determination program 111 determines whether the work target region and the hand region overlap. Specifically, for example, the relationship determination program 111 determines whether they overlap by determining whether the region of the worker's hand detected by the hand detection program 109 exists within a predetermined position range including the work target region detected by the work target detection program 110; a sketch of such a test follows.
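A minimal sketch of such an overlap test, assuming both regions are axis-aligned three-dimensional boxes and the "predetermined position range" is modeled as a margin; all names are illustrative assumptions.

```python
import numpy as np

def boxes_overlap(a_lo: np.ndarray, a_hi: np.ndarray,
                  b_lo: np.ndarray, b_hi: np.ndarray,
                  margin: float = 0.0) -> bool:
    """Axis-aligned 3-D overlap test; margin expands the second box to cover
    a surrounding position range."""
    return bool(np.all(a_lo <= b_hi + margin) and np.all(b_lo - margin <= a_hi))
```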

When it is determined that the work target region and the hand region do not overlap, the state determination program 112 extracts the data of the work target region from the working space data and further extracts the normal object model corresponding to the work target from the normal model 106.

The state determination program 112 then determines whether the work target is in an appropriate state, that is, whether the work has been completed appropriately, by comparing the extracted data of the work target region with the extracted normal object model.

FIG. 2 is an explanatory diagram showing work to which the work support system of the first embodiment is applied.

The work support system shown in FIG. 2 includes the work support system shown in FIG. 1 and the work target on which the worker works; it includes a terminal 201, a touch panel 202, a depth sensor 203, a helmet 204, an HMD 205, equipment 206, and a lever 207.

The terminal 201 is a portable terminal that presents information about the work to the worker and accepts instructions from the worker. The terminal 201 may be any device that has an input device and an output device, such as a tablet terminal or a smartphone.

The work support system shown in FIG. 1 may be implemented by the terminal 201, the depth sensor 203, and the HMD 205. Alternatively, a computer connected to the terminal 201 by wire or wirelessly may implement the information processing apparatus 101, the storage device 107, the work space model 104, the work procedure 105, the normal model 106, and the working space data 115 shown in FIG. 1.

The terminal 201 includes, for example, a touch panel 202 as the input device 102 and the output device 103. The touch panel 202 is operated by the worker, who controls the operation of the work support system by inputting instructions on it.

For example, the worker operates the touch panel 202 when starting or ending work in the work support system, or when returning to the original work procedure after the work support system has erroneously shifted to the next work procedure.

The depth sensor 203 measures the distances from the depth sensor 203 to the work target and to the objects around it, and acquires the measurement results as working space data; the input program 113 stores the acquired data in the working space data 115. The depth sensor 203 is installed on the worker or in the vicinity of the worker.

The depth sensor 203 shown in FIG. 2 is mounted on the helmet 204 and can therefore acquire, from the worker's viewpoint, information on the objects in the worker's field of view, including the work target and the worker's hand. Since the state determination program 112 of this embodiment can determine the state of the work based on the work target as perceived by the worker, it can determine the state of the work accurately.

The depth sensor 203 may be installed at any position from which the shapes of the objects in the space including the work target and the worker's hand can be acquired. For example, the depth sensor 203 may be mounted on the HMD 205, at any position on the worker's body from which information on the space including the work target can be acquired, such as the worker's shoulder or chest, or on a ceiling or the like near the worker.

The HMD 205 is a head-mounted display device and corresponds to the output device 103 in FIG. 1.

The equipment 206 is a device, structure, or the like having work targets. The worker works on the work targets of the equipment 206 in order to maintain or modify the equipment 206. On the equipment 206, a lever 207, meters 208 and 209, switches 210 and 211, and lamps 212, 213, and 214, which are work targets, are arranged.

In the work space model 104, point cloud data indicating the shape of the equipment 206 is stored in advance. The input program 113 and the input device 102 (depth sensor 203) may acquire the point cloud data indicating the shape of the equipment 206 before the work and store it in the work space model 104, or the administrator of the work support system may set the point cloud data in the work space model 104 in advance.

FIG. 3 shows an image of the workspace three-dimensional model expressing the equipment 206 as point cloud data arranged in three-dimensional space.

FIG. 3 is an explanatory diagram illustrating the point cloud data 301 of the equipment 206 of the work support system of the first embodiment.

The point cloud data 301 shown in FIG. 3 expresses the shape of the equipment 206 as a set of points in three-dimensional space and represents only the positions (coordinates) of the points. However, the depth sensor 203 may additionally acquire color information or temperature information for each point, and the input device 102 may store the point cloud data 301 together with the color or temperature information in the work space model 104.

In that case, when acquiring the point cloud data 301, the depth sensor 203 may acquire a color image or a temperature image photographed from the same direction, and acquire the color or temperature information of each point in the point cloud data 301 based on the correspondence between the color image or the temperature image and the point cloud data 301.

FIG. 4 is an explanatory diagram showing the data format stored in the work procedure 105 of the first embodiment.

The work procedure 105 includes a name 401 and areas 402 to 410.

The name 401 is an identifier, such as a name, given to a work; a work in this embodiment includes a plurality of work procedures. The identifier of the name 401 is expressed, for example, by an arbitrary character string. The work procedure 105 indicates the work procedures included in a plurality of works, and the information presentation program 108 presents the work names 401 to the worker so that the worker can select the work to be started.

The area 402 indicates the number of work procedures included in the work indicated by the name 401. Areas 403 to 406 include information on the first work procedure included in that work, and areas 407 to 410 include information on the last work procedure.

The areas 403 and 407 include the identifier (for example, a name) assigned to each work procedure, expressed, for example, by an arbitrary character string.

The areas 404 and 408 are information indicating the detailed work contents of each work procedure and may be expressed by character strings that the worker can understand. The areas 404 and 408 may also include images, moving images, CG, or the like indicating the work contents, and the images may show the equipment 206 including the work target.

The areas 405 and 409 include the identifiers (for example, names) of the work targets of each work procedure. When the worker operates a plurality of work targets in one work procedure, the areas 405 and 409 may include a plurality of work target identifiers.

The areas 406 and 410 indicate the position coordinates, on the workspace three-dimensional model, of the work targets indicated by the areas 405 and 409, together with the identifier (for example, a name) of the workspace three-dimensional model including the work target and the three-dimensional region of the work target on that model.

For example, the areas 406 and 410 may hold information on the work target region expressed as a combination of the center position, on the workspace three-dimensional model, of a cuboid circumscribing the work target and its sizes in the x, y, and z axis directions, together with an angle or rotation matrix representing the rotation of the cuboid. Furthermore, when the work target has a complicated shape, the work target region may be expressed by a combination of a plurality of cuboids, or by a combination of cuboids and other shapes such as cylinders or spheres. The contents of the areas 406 and 410 may be expressed by any method capable of representing the three-dimensional region of the work target on the workspace three-dimensional model; one possible representation is sketched below.
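One possible encoding of such a region, assuming NumPy arrays; the class layout is an illustrative assumption, not the format prescribed by this description.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TargetRegion:
    center: np.ndarray    # (3,) center on the workspace model
    size: np.ndarray      # (3,) extents along the x, y, z axes
    rotation: np.ndarray  # (3, 3) rotation of the circumscribing cuboid

    def corners(self) -> np.ndarray:
        """8 corners of the cuboid in workspace coordinates."""
        signs = np.array([[sx, sy, sz]
                          for sx in (-1, 1) for sy in (-1, 1) for sz in (-1, 1)])
        return self.center + (signs * self.size / 2) @ self.rotation.T
```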

FIG. 5 is an explanatory diagram showing the work target regions and the workspace three-dimensional model of the first embodiment; it is an image in which the work target regions registered in the work procedure 105 are superimposed on the workspace three-dimensional model of FIG. 3.

The work target regions shown in FIG. 5 are expressed by cuboids. A region 501 shown in FIG. 5 corresponds to the lever 207 shown in FIG. 2; regions 502 and 503 correspond to the meters 208 and 209, regions 504 and 505 correspond to the switches 210 and 211, and regions 506, 507, and 508 correspond to the lamps 212, 213, and 214.

FIG. 6 is a functional block diagram illustrating the processing by the work support system of the first embodiment.

The work support system of the first embodiment has functional units such as a working space data input unit 2001, a working space data-workspace three-dimensional model correspondence calculation unit 2002, a work information presentation unit 2003, a hand detection unit 2004, a work target region detection unit 2005, a work target-hand relationship determination unit 2006, and a work target state determination unit 2007. These functional units may be implemented by the programs shown in FIG. 1, or by physical devices.

The working space data input unit 2001 acquires the shape around the work target during the work as working space data. The working space data-workspace three-dimensional model correspondence calculation unit 2002 extracts data indicating the shape of the equipment 206 including the work target by comparing the acquired working space data with the workspace three-dimensional model.

Meanwhile, the hand detection unit 2004 detects, from the acquired working space data, the worker's region indicating the shape of the worker's hand. The work target region detection unit 2005 detects, from the data of the equipment 206 including the work target, the work target region indicating the shape of the work target based on the work procedure 105.

The work target-hand relationship determination unit 2006 determines the positional relationship between the detected hand region and the detected work target region. When the work target-hand relationship determination unit 2006 determines that the detected hand region does not overlap the detected work target region, the work target state determination unit 2007 extracts the data of the work target from the working space data, further extracts from the normal model 106 the normal object model indicating the correct post-work shape of the work target, and compares the extracted work target data with the extracted normal object model.

When the work target state determination unit 2007 determines as a result of the comparison that the work target region has an appropriate shape, the work information presentation unit 2003 outputs that the work has been completed appropriately.

Details of the processing when the programs shown in FIG. 1 execute the processing shown in FIG. 6 are described below.

FIG. 7 is a flowchart illustrating processing by the work support system of the first embodiment.

First, the information presentation program 108 presents a list of executable work to the worker and has the worker select the work to be performed (601). Specifically, the information presentation program 108 extracts at least one work name 401 from the work procedures stored in the work procedure 105, and outputs the extracted name or names as a work list via the output device 103.

The information presentation program 108 also accepts the name of the work selected using a keyboard, mouse, touch panel, or the like included in the input device 102.

After step 601, the information presentation program 108 acquires from the work procedure 105 at least one work procedure corresponding to the work name selected in step 601, and further acquires from the work space model 104 the work space three-dimensional model corresponding to the selected work name (602).

After step 602, the information presentation program 108 sets the first of the acquired work procedures as the one to be performed (603). Specifically, in step 603 the information presentation program 108 outputs the first work procedure via the output device 103. The worker can thereby recognize the first work procedure.

The work procedure output here is the content indicated by field 404 of the work procedure 105, and may be an image or a video showing the facility 206.

After step 603, the input program 113 acquires point cloud data as the working-space data via the depth sensor 203 provided in the input device 102 (604). The input program 113 then stores the acquired point cloud data in the working-space data 115.

FIG. 8 is an explanatory diagram showing the point cloud data of the working-space data of the first embodiment.

The working-space data shown in FIG. 8 is point cloud data that represents in three dimensions the shapes of the objects existing in the space containing the work target during work and the worker's hands. The point cloud data shown in FIG. 8 includes point cloud data 701 corresponding to part of the work space three-dimensional model, and point cloud data 702 and 703 indicating the shapes of the worker's hands.

Here, the origin of the working-space data is the position of the depth sensor. In general, the working-space data acquired by the input program 113 and the work space three-dimensional model stored in the work space model 104 are expressed in different coordinate systems.

Because the depth sensor of the first embodiment acquires point cloud data, which is a three-dimensional model of the work target, the state determination program 112 can accurately determine the state of the work in the processing described later.

After step 604, the hand detection program 109 detects the position and posture of the worker's hand and arm from the working-space data acquired in step 604 (605). As the method for detecting the hand and arm, the hand detection program 109 may use any method capable of detecting the position and posture of a specific shape from point cloud data.

For example, the hand detection program 109 may hold in advance a three-dimensional model representing the shape of a hand and arm during work, as shown in FIG. 9. The hand detection program 109 may then detect the position and posture of the worker's hand and arm by finding the position at which the pre-held three-dimensional model of the hand and arm best matches the working-space data.

FIG. 9 is an explanatory diagram showing a three-dimensional model 801 representing the shape of the hand and arm of the first embodiment.

The three-dimensional model 801 shown in FIG. 9 is a polygon-based three-dimensional model representing the shape of a right hand. The hand detection program 109 may subdivide such polygons to convert the three-dimensional model of the hand and arm into point cloud data in advance and hold it for use in the method described above.

As the method for finding the position at which two sets of point cloud data (in this embodiment, the three-dimensional model 801 and the working-space data) best match, the hand detection program 109 may use, for example, the ICP (Iterative Closest Point) algorithm (S. Rusinkiewicz and M. Levoy, "Efficient Variants of the ICP Algorithm", Third International Conference on 3-D Digital Imaging and Modeling, 2001) or NDT (Normal Distributions Transform) (P. Biber and W. Strasser, "The Normal Distributions Transform: A New Approach to Laser Scan Matching", Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2003).

In step 605, by using these methods, the hand detection program 109 obtains a transformation matrix for rotating and translating the three-dimensional model 801 of the hand and arm so that it matches the working-space data, that is, the position and posture of the three-dimensional model 801 in the working-space data.

When these methods are used, the hand detection program 109 needs to obtain an initial position for the alignment. When the working-space data shown in FIG. 8 can be expected in advance, for example, the hand and arm appear in the coordinate system of the point cloud data with the right-elbow side at the lower right front and the right hand at the upper left back. The hand detection program 109 may therefore obtain in advance, as the initial position, the expected average position and posture of the hand and arm.

Alternatively, the hand detection program 109 may use a plurality of representative positions and postures as initial positions and obtain the position and posture of the hand and arm from the initialization for which the working-space data and the three-dimensional model of the hand and arm best match, as in the sketch below.
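By way of illustration of this alignment, the following is a minimal ICP sketch in Python. NumPy/SciPy, the function names, and the fixed iteration count are assumptions of the sketch, not part of the embodiment; any ICP or NDT implementation could be substituted.

```python
# Minimal ICP sketch for step 605: pose the hand/arm model 801 in the
# working-space data. Assumes NumPy/SciPy; illustrative only.
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping point set src onto dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # avoid reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(model_pts, scene_pts, init_R=np.eye(3), init_t=np.zeros(3), iters=30):
    """Align the hand/arm model point cloud to the working-space data."""
    tree = cKDTree(scene_pts)
    R, t = init_R, init_t
    for _ in range(iters):
        moved = model_pts @ R.T + t
        _, idx = tree.query(moved)      # closest scene point per model point
        R, t = best_fit_transform(model_pts, scene_pts[idx])
    return R, t                         # pose of model 801 in the working-space data
```

The initial pose discussed above would be supplied as init_R and init_t; running the sketch from several representative initial poses and keeping the best-matching result corresponds to the alternative described in the preceding paragraph.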

The hand detection program 109 may also hold a three-dimensional model of the left hand in addition to the three-dimensional model 801 shown in FIG. 9. By applying the method described above for the right-hand three-dimensional model 801, the hand detection program 109 can then detect the positions and postures of both hands and both arms of the worker in the working-space data.

After step 605, the correspondence calculation program 114 deletes the point cloud data corresponding to the hand and arm detected in step 605 from the working-space data. Here, the correspondence calculation program 114 may use any method that identifies and deletes the point cloud data corresponding to the hand and arm based on the position and posture of the hand and arm detected from the working-space data.

For example, the correspondence calculation program 114 converts the three-dimensional model 801 of the hand and arm into point cloud data in the coordinate system of the working-space data, using the rotation-and-translation transformation matrix obtained in step 605. The correspondence calculation program 114 then deletes from the working-space data, as the point cloud data corresponding to the worker's hand and arm, the set of working-space data points lying within a predetermined distance of any point of the converted point cloud data.
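A minimal sketch of this deletion step, under the same assumptions (NumPy/SciPy; the threshold eps is a value chosen for illustration):

```python
# Split the working-space data into hand/arm points and the remainder,
# given the model pose (R, t) obtained in step 605. Illustrative only.
import numpy as np
from scipy.spatial import cKDTree

def remove_hand_points(scene_pts, model_pts, R, t, eps=0.01):
    """Return (scene without hand/arm points, the removed hand/arm points)."""
    moved = model_pts @ R.T + t                  # model 801 posed in the scene
    dist, _ = cKDTree(moved).query(scene_pts)    # distance of each scene point to the model
    hand_mask = dist < eps                       # "within a predetermined distance"
    return scene_pts[~hand_mask], scene_pts[hand_mask]
```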

FIG. 10 is an explanatory diagram showing the working-space data of the first embodiment after the point cloud data corresponding to the worker's hand and arm has been deleted.

In the working-space data shown in FIG. 10, the correspondence calculation program 114 leaves the point cloud data 701 indicating the facility 206, but deletes the point cloud data from the regions 902 and 903 where the point cloud data 702 and 703 used to be.

Furthermore, the correspondence calculation program 114 obtains a transformation matrix representing the correspondence between the working-space data after removal of the point cloud data corresponding to the worker's hand and arm and the work space three-dimensional model (606). The correspondence calculation program 114 can obtain the information indicating this correspondence by finding the position at which the working-space data and the work space three-dimensional model best match.

As the method for obtaining the correspondence between the working-space data and the work space three-dimensional model, the correspondence calculation program 114 may use any method that finds the position and orientation at which the two sets of three-dimensional data best match and acquires information indicating that position and orientation as the information indicating the correspondence.

The correspondence calculation program 114 may use, for example, the ICP algorithm or NDT described above. The correspondence calculation program 114 may then obtain, as the information indicating the correspondence, a transformation matrix that rotates and translates the working-space data so as to best match the work space three-dimensional model.

After step 606, the work target detection program 110 uses the information indicating the correspondence between the working-space data and the work space three-dimensional model obtained in step 606 to obtain the region, in the working-space data, of the work target (corresponding to field 405 of the work procedure 105) associated with the work procedure being performed (607).

After obtaining a transformation matrix that rotates and translates the work space three-dimensional model so as to best match the working-space data, the work target detection program 110 reads out the work target region placed on the work space three-dimensional model shown in FIG. 5. The work target detection program 110 then transforms the read work target region using the obtained transformation matrix. The work target detection program 110 can thereby obtain the work target region in the working-space data.
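As a sketch of step 607, assuming the work target region is stored as the eight corner points of a rectangular parallelepiped as in FIG. 5, and that R and t denote the model-to-scene rotation and translation obtained in step 606:

```python
# Transform a work target region from model coordinates into the
# working-space data. Illustrative only; assumes NumPy.
import numpy as np

def region_in_working_space(corners_model, R, t):
    """Return the transformed corners plus an axis-aligned box around them."""
    corners = np.asarray(corners_model) @ R.T + t
    # an axis-aligned box around the transformed corners is convenient for step 609
    return corners, corners.min(axis=0), corners.max(axis=0)
```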

FIG. 11 is an explanatory diagram showing the work target regions in the working-space data of the first embodiment.

Regions 1001 and 1002 shown in FIG. 11 correspond to the regions 501 and 502 in FIG. 5. Regions 1001 and 1002 are the work target regions in the working-space data.

After step 607, the information presentation program 108 outputs, via the output device 103, the information indicating the region of the work target in the working-space data obtained in step 607, together with the work procedure that was being output (608).

FIG. 12 is an explanatory diagram showing a screen 1100 that displays information on the work procedure and the work target region over the distance image acquired by the depth sensor 203 of the first embodiment.

The screen 1100 shown in FIG. 12 displays the distance image acquired by the depth sensor 203, the content of the work procedure (field 404 of the work procedure 105, and so on), and the work target region. The screen 1100 includes an area 1101 and an area 1102. The area 1101 displays the content of the work procedure being performed as a character string. The area 1102 displays the work target region of the work procedure being performed.

The area 1102 shown in FIG. 12 only indicates the work target region. However, when the work procedure 105 also stores information such as symbols indicating the operation method (including figures such as arrows and circles), images, or videos, the information presentation program 108 may display a symbol such as an arrow or a circle, or an image or a video, at the position of the work target, in the area 1102 or in its vicinity.

When the input device 102 uses a depth sensor 203 that can acquire a color image and a distance image from the same direction, the information presentation program 108 may obtain a transformation matrix that converts positions on the distance image into positions on the color image. The information presentation program 108 may then use this transformation matrix to display the area 1101 and the area 1102 on the color image.

When the output device 103 is an HMD of the type through which the scenery ahead can be viewed directly, the information presentation program 108 may display only the area 1101 and the area 1102 on the HMD.

After step 608, the relationship determination program 111 determines whether the worker's hand overlaps the work target, based on the region of the worker's hand in the working-space data detected in step 605 and the work target region in the working-space data obtained in step 607 (609).

Here, from the point cloud data corresponding to the hand and arm deleted from the working-space data, the relationship determination program 111 extracts, as the point cloud data corresponding to the hand, only the point cloud data determined in step 606 to be close to the hand portion of the three-dimensional model. The relationship determination program 111 then treats the region of the extracted point cloud data as the region of the worker's hand in the working-space data.

FIG. 13 is an explanatory diagram showing the region of the worker's hand of the first embodiment.

Region 1201 shown in FIG. 13 is the region corresponding to the hand. Region 1001 shown in FIG. 13 is the work target region obtained in step 607.

For example, the relationship determination program 111 obtains a rectangular parallelepiped circumscribing the point cloud data corresponding to the hand, as shown by the region 1201 in FIG. 13, and takes the region 1201 as the hand region. In step 609, the relationship determination program 111 then determines whether the worker's hand overlaps the work target by determining whether the obtained region 1201 of the worker's hand lies inside the work target region 1001.

The relationship determination program 111 of this embodiment may also determine that the worker's hand overlaps the work target when the region 1201 of the worker's hand lies within a predetermined position range that includes the work target region 1001; a sketch of this test follows.
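The following sketch illustrates the overlap test of steps 609 and 610, assuming axis-aligned boxes in the working-space data; the margin parameter widens the target region into the predetermined position range mentioned above and is an illustrative assumption:

```python
# Bounding-box overlap test between the hand region 1201 and the work
# target region 1001. Illustrative only; assumes NumPy.
import numpy as np

def aabb(points):
    """Axis-aligned bounding box (min corner, max corner) of a point set."""
    pts = np.asarray(points)
    return pts.min(axis=0), pts.max(axis=0)

def hand_overlaps_target(hand_pts, target_min, target_max, margin=0.0):
    h_min, h_max = aabb(hand_pts)
    lo = np.asarray(target_min) - margin
    hi = np.asarray(target_max) + margin
    # two boxes overlap iff their intervals overlap on every axis
    return bool(np.all(h_max >= lo) and np.all(h_min <= hi))
```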

After step 609, the relationship determination program 111 determines, according to the result of step 609, whether the region 1201 of the worker's hand overlaps the work target region 1001 (610). When it is determined that the region 1201 of the worker's hand overlaps the work target region 1001, the relationship determination program 111 determines that the worker is still performing the work and that there is no need to determine the state of the work. The processing shown in FIG. 7 then returns to step 604.

When it is determined in step 610 that the region 1201 of the worker's hand does not overlap the work target region 1001, the relationship determination program 111 determines that the worker has finished one of the work procedures and that this is the timing at which the work state should be determined. The relationship determination program 111 then causes the state determination program 112 to execute step 611.

By executing steps 609 and 610 to determine the positional relationship between the worker's hand and the work target, and executing step 611 when the worker's hand has moved away from the work target, the relationship determination program 111 of the first embodiment can efficiently extract appropriate timings for determining the state of the work target.

In step 611, the state determination program 112 extracts the point cloud data corresponding to the work target from the working-space data, based on the work target region 1001 in the working-space data obtained in step 607. The state determination program 112 also extracts from the normal-state model 106 a normal-state object (three-dimensional) model representing the ideal state of the work target after the work according to the work procedure set in step 603 or step 614.

In step 611, the state determination program 112 further calculates the difference between the point cloud data of the work target in the working-space data and the acquired normal-state object (three-dimensional) model. The state determination program 112 may use any method in step 611 for calculating this difference. A specific method is described below.

When the normal-state model 106 holds a normal-state object model with the same position and posture as the point cloud data indicating the work target region in the work space three-dimensional model, the state determination program 112 obtains, for each point of the acquired normal-state object model, the distance to the nearest point of the point cloud data of the work target. The state determination program 112 then takes the sum of these distances as the difference between the point cloud data of the work target in the working-space data and the acquired normal-state object model.
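A minimal sketch of this difference measure, again assuming NumPy/SciPy; the threshold used in the subsequent decision is task-dependent, as described below:

```python
# Sum of nearest-neighbor distances from each normal-state model point to
# the extracted work target points (step 611). Illustrative only.
import numpy as np
from scipy.spatial import cKDTree

def state_difference(target_pts, normal_model_pts):
    dist, _ = cKDTree(target_pts).query(normal_model_pts)
    return float(dist.sum())

def state_is_appropriate(target_pts, normal_model_pts, threshold):
    return state_difference(target_pts, normal_model_pts) <= threshold
```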

When the position and posture of the work target held by the work space model 104 differ from those of the normal-state object model held by the normal-state model 106, the administrator or the like stores in advance a normal-state object model corresponding to a region obtained by enlarging the work target region in the work space model 104 by a predetermined ratio. The state determination program 112 then acquires from the working-space data 115, as the point cloud data of the work target, the point cloud data of the work target region enlarged by that ratio.

The state determination program 112 then uses the method described in steps 605 and 606 to align the positions of such a normal-state object model and the point cloud data of the work target. Using the result of this alignment, the state determination program 112 transforms one of the point clouds so that it best matches the other.

Thereafter, the state determination program 112 may calculate the distance differences between the point cloud data of the work target and the point cloud data of the normal-state object model using the above-described method of summing the point distances, and thereby obtain the difference between the work target region and the normal-state object model.

In step 611, the state determination program 112 may also obtain, as the measure of difference between the point cloud data of the work target and the ideal normal-state three-dimensional model, a similarity between the two.

When the magnitude of the obtained difference is larger than a predetermined threshold, the state determination program 112 determines in step 611 that the state of the work target is not appropriate. When the magnitude of the obtained difference is equal to or smaller than the predetermined threshold, the state determination program 112 determines in step 611 that the state of the work target is appropriate.

By determining the state of the work target based on the difference between the three-dimensional model of the work target extracted from the working-space data and the three-dimensional model of the work target extracted from the normal-state three-dimensional model, the state determination program 112 can accurately determine whether the state of the work is appropriate.

After step 611, the state determination program 112 determines, according to the result of step 611, whether the state of the work target is appropriate (612). When the state of the work target is appropriate, the work procedure in question has been completed without any problem, so the information presentation program 108 executes step 613.

When the state of the work target is not appropriate, the processing of FIG. 7 returns to step 604. When returning from step 612 to step 604, the information presentation program 108 may use the output device 103 to present to the worker information indicating, by a message, symbol, or the like, that the work has not been completed.

In step 613, the information presentation program 108 uses the output device 103 to present to the worker information indicating that the work procedure being performed has been completed. For example, the information presentation program 108 may present the message 1301 of FIG. 14 to the worker.

FIG. 14 is an explanatory diagram showing a screen displayed after one work procedure of the first embodiment has been completed.

The message 1301 shown in FIG. 14 is displayed in step 613 and indicates that one work procedure has been completed without any problem.

After step 613, the information presentation program 108 refers to the work procedure 105 and sets the next work procedure as the one to be performed (614). By outputting, in step 614 or when returning from step 612 to step 604, whether or not the work procedure has been completed, the information presentation program 108 can present to the worker the state of the work, which shows whether the worker is performing the series of work appropriately. The work support system of the first embodiment can thereby navigate the worker appropriately.

After step 614, the information presentation program 108 checks whether all of the work procedures to be performed have been completed (615). If not all work procedures have been completed, the information presentation program 108 executes step 604 in order to perform the next work procedure. When all work procedures have been completed, the information presentation program 108 executes step 616.

In step 616, the information presentation program 108 uses the output device 103 to present to the worker information indicating that all work procedures have been completed. For example, the information presentation program 108 may present a message such as "The work has been completed" to the worker. After presenting the information, the information presentation program 108 ends the processing shown in FIG. 7.

In the first embodiment described above, the output device 103 may include a device that outputs sound or voice, such as a speaker, in addition to a display device such as a monitor or an HMD generally used with computers. The information presentation program 108 may then present information to the worker by sound or voice.

In step 609, the relationship determination program 111 may also determine whether the worker's hand is in a work target region related to a work procedure that is not being performed. When the relationship determination program 111 determines that the work target and the hand do not overlap and that the worker's hand is in a work target region related to a work procedure different from the one being performed, the information presentation program 108 may use the output device 103 to present to the worker a warning indicating that the worker is working on a different work target.

The information presentation program 108 can thereby have the worker review the work procedure and navigate the work appropriately.

In step 610, the relationship determination program 111 may also determine whether the hand region overlaps the work target region based on the results of steps 604 to 610 executed multiple times in the past.

Specifically, the relationship determination program 111 may determine whether the worker's hand is in the work target region by executing steps 604 to 609 multiple times at predetermined time intervals. In this case, the relationship determination program 111 may determine in step 610 that the hand region does not overlap the work target region only when it has determined that the worker's hand was present in the work target region for a predetermined time and was thereafter absent from the work target region for a predetermined time; a sketch of such a check follows.
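A minimal sketch of such a debounced check follows; the frame-based dwell counts are illustrative assumptions (roughly 0.5 s at 30 fps), not values prescribed by the embodiment:

```python
# Debounced end-of-work detector: the hand must dwell on the target for a
# while, then stay away for a while, before step 611 is triggered.
class HandDwellTracker:
    def __init__(self, enter_frames=15, leave_frames=15):
        self.enter_frames = enter_frames   # frames the hand must stay on the target
        self.leave_frames = leave_frames   # frames the hand must stay away
        self.in_count = 0
        self.out_count = 0
        self.was_working = False

    def update(self, hand_on_target: bool) -> bool:
        """Return True once the hand has dwelt on the target and then left it."""
        if hand_on_target:
            self.in_count += 1
            self.out_count = 0
            if self.in_count >= self.enter_frames:
                self.was_working = True
        else:
            self.out_count += 1
            self.in_count = 0
            if self.was_working and self.out_count >= self.leave_frames:
                self.was_working = False
                return True                # appropriate timing for step 611
        return False
```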

This prevents the relationship determination program 111 from erroneously determining that a work procedure has been completed when the worker accidentally takes a hand off the work target for a short time during the procedure, and allows the state determination program 112 to determine the work state at an appropriate timing.

In step 610 described above, the relationship determination program 111 may also distinguish between one hand and both hands. Specifically, when the relationship determination program 111 determines that both hands were in the work target region before a certain point in time and that one hand is no longer in the work target region after that point, it may determine that the work has been completed and execute step 611.

By determining whether the hands overlap the work target based on the positions of both hands and of one hand, the relationship determination program 111 can extract appropriate timings for determining the state of the work even for work in which, for example, the work target must always be supported with one hand.

In step 610, when the relationship determination program 111 determines that the worker's hand has been determined to be in the work target region for more than a predetermined time, the information presentation program 108 may use the output device 103 to present to the worker a warning or alert indicating that the work is delayed or the like.

In step 611, when the state determination program 112 extracts the point cloud data corresponding to the work target from the working-space data based on the work target region obtained in step 607, it may store the extracted point cloud data in a database such as the storage device 107, associated with the position of the work target in the work space model 104 and with the work target in the work procedure 105.

The work support system of the first embodiment thereby holds data representing the time-series changes of each work target, so that, for example, when the work performed is an inspection of the work target, an administrator can check how the work target has changed over time.

When an error occurs in the determination result of step 612 and the work procedure does not advance even though the work procedure being performed has been completed, or when the work procedure advances even though the work procedure being performed has not been completed, the information presentation program 108 may accept the correct work procedure input by the worker via a keyboard, mouse, touch panel, or the like. The information presentation program 108 may then set the accepted work procedure.

As another method of moving to the correct work procedure, the information presentation program 108 may acquire the movement of the worker's hand with the depth sensor and move to the correct work procedure when the worker's hand makes a predetermined movement.

In step 611 described above, it was determined whether the difference between the point cloud data of the work target extracted from the working-space data and the normal-state object model is larger than a predetermined threshold; in step 613, however, the information presentation program 108 may change the content of the information presented to the worker according to the magnitude of the difference.

For example, when the difference obtained in step 611 is smaller than a predetermined first threshold, the information presentation program 108 may present to the worker content indicating that the work procedure being performed has been completed (such as "Procedure XX has been completed"). When the difference obtained in step 611 is smaller than a predetermined second threshold (larger than the first threshold), the information presentation program 108 may present to the worker content indicating that the state is close to the correct one (such as "Turn it a little more").

In this case, the work procedure 105 may hold in advance, for each work procedure, the thresholds and the content to be presented to the worker according to the content of the work; a sketch of this graded feedback follows.
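A minimal sketch of the graded feedback; the function name, threshold names, and messages are illustrative assumptions, with the per-procedure values assumed to be read from the work procedure 105:

```python
# Choose a message based on how close the work target is to its normal state.
def feedback_message(diff, first_threshold, second_threshold):
    """first_threshold < second_threshold; both are per-procedure values."""
    if diff < first_threshold:
        return "Procedure completed."
    if diff < second_threshold:
        return "Almost there - turn it a little more."
    return None   # keep presenting the current procedure
```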

When, as a result of executing steps 604 to 611 multiple times, a predetermined time has elapsed without the state of the work target being determined to be appropriate, the information presentation program 108 may use the output device 103 to present to the worker a warning or alert indicating that the work is delayed or the like. The information presentation program 108 can thereby navigate the worker appropriately.

According to the first embodiment, the work support system determines whether one work procedure in a continuous series of work has been completed by determining whether the work target region and the hand region overlap. The work support system of the first embodiment can therefore determine the state of the work target at an appropriate timing and effectively switch the work procedure information presented to the worker. As a result, it can effectively navigate the worker so that the work is reliably performed.

In addition, because the work support system of the first embodiment determines the state of the work target based on three-dimensional point cloud data, the accuracy of the determination can also be improved.

A second embodiment of the present invention will be described with reference to FIGS. 15 to 17.

In the first embodiment, a depth sensor capable of acquiring three-dimensional data was used as the means for inputting the working-space data in the input device 102. In the second embodiment, a camera that captures ordinary images is used instead. That is, the work support system of the second embodiment includes a camera in place of the depth sensor 203, and the camera of the second embodiment acquires the working-space data as a two-dimensional image.

The configuration in the second embodiment is the same as in FIG. 1. However, the work space three-dimensional model held by the work space model 104 is a three-dimensional model using polygons as generally used in CG and the like, and the normal-state model 106 holds images representing the ideal states of the work targets. The processing executed by the information presentation program 108 through the state determination program 112 is the processing shown in FIG. 15.

FIG. 15 is a flowchart illustrating processing by the work support system of the second embodiment.

In the flowchart shown in FIG. 15, steps 601 to 603, step 608, step 610, and steps 612 to 616 are the same processing as in the first embodiment. The processing shown in FIG. 15 differs from the processing shown in FIG. 7 in steps 1401 to 1406 of the second embodiment.

Steps 1401 to 1406 are described below. In step 1401, the input program 113 first acquires an image, which is the working-space data, using the camera.

FIG. 16 is an explanatory diagram showing an image acquired as the working-space data of the second embodiment.

The image shown in FIG. 16 is acquired by the camera of the second embodiment and represents the working-space data. The image shown in FIG. 16 includes the facility 1501 and images 1502 and 1503 of the worker's hands.

After step 1401, the hand detection program 109 detects the worker's hand and arm from the working-space data acquired in step 1401 (1402). Any method may be used for detecting the hand and arm in step 1402, as long as the hand detection program 109 can recognize a region of a specific shape in the image.

For example, the hand detection program 109 may detect the region of the worker's hand and arm in the working-space data by detecting a region having the same color as the hand and arm; a sketch of such a color-based detector follows. Alternatively, the hand detection program 109 may hold in advance a plurality of representative template images of the worker's hand and arm, superimpose them on the working-space data while changing their position and rotation angle, and detect the region of the worker's hand and arm by finding the template image, position, and rotation angle that best match the working-space data.
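A minimal sketch of the color-based variant, assuming OpenCV; the HSV skin-color bounds are purely illustrative and would in practice be calibrated to the worker's hand and the lighting:

```python
# Color-based hand/arm detection for step 1402. Illustrative only.
import cv2
import numpy as np

def detect_hand_region(image_bgr):
    """Return a binary mask of pixels whose color matches the hand/arm color."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)     # assumed lower HSV bound
    upper = np.array([25, 255, 255], dtype=np.uint8)  # assumed upper HSV bound
    mask = cv2.inRange(hsv, lower, upper)
    # remove small speckles so only coherent hand/arm regions remain
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask
```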

After step 1402, the correspondence calculation program 114 deletes the detected hand and arm regions from the working-space data (1403). Specifically, when the hand detection program 109 has detected regions having the same color as the hand and arm, the correspondence calculation program 114 may delete the hand and arm regions by filling all the pixels of the detected regions in the working-space data with a single color (for example, black).

When the hand detection program 109 has detected the region of the worker's hand and arm using a template image, the correspondence calculation program 114 may delete the hand and arm regions by filling with a single color all the pixels of the region in the working-space data that the template image covers when superimposed on the working-space data.

In step 1403, the correspondence calculation program 114 further obtains the correspondence between the working-space data after deletion of the worker's hand and arm regions and the work space three-dimensional model. The correspondence calculation program 114 may obtain the correspondence by using a method that aligns the image of the working-space data and the work space three-dimensional model so that they best match.

For example, the correspondence calculation program 114 may obtain the correspondence using a method based on edge information extracted from the image and the CG model (V. Lepetit, L. Vacchetti, D. Thalmann and P. Fua, "Fully Automated and Stable Registration for Augmented Reality Applications", ISMAR 2003, pp. 93-102, 2003) or the like.

When this method is used, the correspondence calculation program 114 would also extract edge information near the boundaries of the deleted hand and arm regions, so it excludes such extracted edge information. Likewise, when another method is used to obtain the correspondence between the working-space data and the work space three-dimensional model, the correspondence calculation program 114 similarly obtains the correspondence after excluding the information extracted from the worker's hand and arm regions and their boundaries.

When the correspondence is obtained by a method such as the above, the correspondence calculation program 114 obtains, as the correspondence, a transformation matrix that rotates and translates the work space three-dimensional model so as to best match the working-space data.

After step 1403, the work target detection program 110 uses the transformation matrix representing the correspondence between the working-space data and the work space three-dimensional model obtained in step 1403 to obtain the region, in the working-space data, of each work target related to the work procedure being performed (1404). That is, in step 1404 the work target detection program 110 of the second embodiment obtains, from the three-dimensional region defined on the work space three-dimensional model, the corresponding region on the image.

When the work target region on the work space three-dimensional model is registered as a cuboid as shown in FIG. 5, the work target detection program 110 converts the coordinates of each vertex of the cuboid into a position on the image by a perspective projection transformation using the transformation matrix obtained as a result of step 1403. The work target detection program 110 then obtains the region of the work target on the image by connecting the converted vertex positions with straight lines according to the relationships between the vertices.

Furthermore, even when the work target region on the work space three-dimensional model has an arbitrary shape, the work target detection program 110 selects a plurality of characteristic points, such as points on edges or at vertices of the region, and converts their three-dimensional positions into positions on the image by the perspective projection transformation. The work target detection program 110 then obtains the region of the work target on the image as the region represented by connecting the converted points.
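A minimal sketch of this projection, assuming a pinhole camera with intrinsic matrix K and the model-to-camera rotation R and translation t obtained in step 1403:

```python
# Perspective projection of 3D region points onto the image (step 1404).
# Illustrative only; assumes NumPy and a 3x3 intrinsic matrix K.
import numpy as np

def project_points(points_model, K, R, t):
    """Return the 2D image positions of 3D points (cuboid corners, edge points, ...)."""
    cam = np.asarray(points_model) @ R.T + t   # into camera coordinates
    uvw = cam @ np.asarray(K).T                # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]            # divide by depth -> pixel coordinates
```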

After step 1404, the information presentation program 108 executes step 608. After step 608, the relationship determination program 111 determines whether the worker's hand overlaps the work target, in the same manner as in step 609 of the first embodiment, based on the region of the worker's hand in the working-space data detected in step 1402 and the work target region in the working-space data obtained in step 1404 (1405).

Specifically, in step 1405 the relationship determination program 111 determines whether the region of the worker's hand lies inside the work target region or inside a predetermined position range that includes the work target region.

The relationship determination program 111 selects, as the region of the worker's hand in the working-space data, only the pixels corresponding to the hand portion from among the pixels inside the hand and arm regions deleted from the working-space data in step 1403.

FIG. 17 is an explanatory diagram showing the region of the worker's hand of the second embodiment.

Region 1601 shown in FIG. 17 is the hand region. Region 1602 shown in FIG. 17 is the work target region.

For example, the relationship determination program 111 obtains a rectangle circumscribing the pixels determined to correspond to the hand, and takes the rectangular region as the hand region. In step 1405, the relationship determination program 111 determines whether the obtained region 1601 of the worker's hand overlaps the work target region 1602.

The relationship determination program 111 then executes step 610. When it is determined in step 610 of the second embodiment that the region 1601 of the worker's hand does not overlap the work target region 1602, the state determination program 112 starts step 1406.

In step 1406, the state determination program 112 extracts the image corresponding to the work target from the working-space data, based on the work target region in the working-space data obtained in step 1404. The state determination program 112 also acquires from the normal-state model 106 an image representing the ideal state of the relevant work target.

In step 1406, the state determination program 112 further compares the image of the work target in the working-space data with the image of the ideal work target and calculates the difference between the two. Here, the image of the work target in the working-space data and the image registered as the normal-state object model need not be images of the object taken from the same position and direction. The state determination program 112 may obtain the difference between the two after aligning them so that they best match.

As a method of aligning the two images, the state determination program 112 may use a two-dimensional tracking technique (H. Sagawa, Y. Urano, and T. Kurihara, "Development of a work support system using simple AR based on two-dimensional tracking," Proceedings of the Human Interface Symposium 2013, 2013) or the like.
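As a simple illustration of the comparison itself, the following sketch computes the mean absolute difference between the two images, assuming they have already been aligned and cropped to the same size (an assumption of this sketch):

```python
# Pixelwise difference between the aligned work target image and the
# normal-state image (step 1406). Illustrative only; assumes NumPy.
import numpy as np

def image_difference(target_img, normal_img):
    """Mean absolute per-pixel difference; smaller means closer to the normal state."""
    a = target_img.astype(np.float32)
    b = normal_img.astype(np.float32)
    return float(np.mean(np.abs(a - b)))
```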

 ステップ1406の後、状態判定プログラム112は、ステップ612を実行する。 After step 1406, the state determination program 112 executes step 612.

 実施例2によれば、作業支援システムは、二次元の画像を用いた場合でも、適切なタイミングで作業対象の状態を判定し、作業者に提示する作業手順の情報を効果的に切り替えることが可能となる。 According to the second embodiment, even when a two-dimensional image is used, the work support system can determine the state of the work target at an appropriate timing and effectively switch the work procedure information presented to the worker. It becomes possible.

 本発明における実施例3を図18を用いて説明する。 Embodiment 3 of the present invention will be described with reference to FIG.

 実施例1では、作業者の手が作業対象に重ならないと判定された場合に、作業中空間データにおける作業対象のデータと正常時対象物モデルとの差を算出する処理に移行した。実施例3の作業支援システムは、あらかじめ登録されている作業者の手の動作が行われたかを判定する処理をさらに実行する。 In Example 1, when it is determined that the operator's hand does not overlap the work target, the process proceeds to a process of calculating the difference between the work target data and the normal object model in the working space data. The work support system according to the third embodiment further executes a process of determining whether a pre-registered worker's hand action has been performed.

 さらに、実施例3の作業支援システムは、所定の手の動作が行われた後、作業者の手が作業対象の領域に存在しない状態となった場合に、作業中空間データにおける作業対象のデータと正常時対象物モデルとの差を算出する処理を実行する。 Furthermore, the work support system according to the third embodiment is configured so that the work target data in the working space data is obtained when the operator's hand does not exist in the work target area after the operation of the predetermined hand is performed. And a process for calculating a difference between the normal object model and the normal object model.

 実施例3の作業支援システムの構成は、図1と同様である。ただし、実施例3の作業手順105は、手の動作を判定するための手の動作に関する情報を保持する。 The configuration of the work support system of the third embodiment is the same as that shown in FIG. However, the work procedure 105 of the third embodiment holds information regarding hand movements for determining hand movements.

 手の動作は作業手順の内容によって異なる。このため、管理者等は、図4に示す作業手順105のフォーマットに、各作業手順の情報に関連付けて手の動作に関する情報を格納しても良い。また、実施例3の作業支援システムが、作業手順の名称と手の動作に関する情報とを対応して保持する記憶領域を有しても良い。 手 の Hand movements vary depending on the work procedure. For this reason, the administrator or the like may store information related to hand movements in association with information on each work procedure in the format of the work procedure 105 shown in FIG. In addition, the work support system according to the third embodiment may include a storage area that holds the name of the work procedure and the information related to the hand movement.

The hand motion information may be, for example, time-series data consisting of position coordinates representing the hand trajectory and rotation matrices representing the hand orientation. Any representation that captures the change in the hand trajectory and posture may be used, such as a symbol sequence representing the direction of movement.
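
As an illustration, one possible concrete form of such time-series data, assuming positions as 3-vectors and orientations as 3x3 rotation matrices sampled per frame, is sketched below; the type and field names are hypothetical, not taken from the specification.

```python
# Hypothetical representation of a registered hand motion: one sample per
# frame, holding the hand position (3-vector) and orientation (3x3 rotation
# matrix). A motion is simply a time-ordered list of such samples.
from dataclasses import dataclass
import numpy as np

@dataclass
class HandSample:
    t: float                 # capture time in seconds
    position: np.ndarray     # shape (3,), hand position in workspace coords
    rotation: np.ndarray     # shape (3, 3), hand orientation

HandMotion = list            # a motion is an ordered list of HandSample
```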

FIG. 18 is a flowchart illustrating the processing performed by the work support system of the third embodiment.

In the processing shown in FIG. 18, steps 601 to 608, step 609, and steps 611 to 616 are the same as in the first embodiment, while steps 1701 and 1702 differ from the processing shown in FIG. 7. The processing of step 1701 is executed by the relationship determination program 111 of the third embodiment.

After step 608, the relationship determination program 111 obtains the position and posture of the hand from the hand region detected in step 605. The relationship determination program 111 then determines whether the predetermined motion has been performed by comparing the time-series data based on the hand positions and postures obtained in the multiple past executions up to the start of the current step with the information on the predetermined hand motion registered in association with the work procedure 105 being executed (1701).

Since the region of the worker's hand is identified from the working space data in step 605, the relationship determination program 111 may, for example, regard the centroid of the hand region as the hand position. Alternatively, the relationship determination program 111 may align the hand region detected from the working space data with a detailed hand model and take the center of the palm as the hand position.

In step 1701, the relationship determination program 111 can obtain the hand posture by finding the rectangular parallelepiped that best circumscribes the worker's hand region and calculating the rotation matrix representing its orientation. Alternatively, the relationship determination program 111 may align the hand region detected from the working space data with a detailed hand model and calculate the rotation matrix representing the direction of the palm.
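
A minimal sketch of these two estimates follows, computing the centroid of the hand's point cloud as the position and a PCA-based rotation matrix as a stand-in for the orientation of a best circumscribing box; treating the principal axes as that orientation is an assumption, since the specification does not fix the fitting method.

```python
# Hypothetical sketch: hand position as the centroid of the hand's point
# cloud, and hand posture as the principal axes of that cloud (PCA), which
# approximates the orientation of a best-fitting circumscribing box.
import numpy as np

def hand_pose(points):           # points: (N, 3) array of the hand region
    position = points.mean(axis=0)
    centered = points - position
    # Eigenvectors of the covariance matrix give the principal axes,
    # returned by eigh in ascending order of eigenvalue.
    _, vecs = np.linalg.eigh(np.cov(centered.T))
    rotation = vecs[:, ::-1]     # order axes from largest to smallest extent
    if np.linalg.det(rotation) < 0:
        rotation[:, -1] *= -1    # keep a right-handed rotation matrix
    return position, rotation
```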

Furthermore, to compare the time series of hand positions and postures accumulated up to the execution of step 1701 with the predetermined hand motion, the relationship determination program 111 may use a well-known time-series comparison method such as dynamic programming (DP matching) (Uchida, "An Overview of DP Matching: Basics and Various Extensions," IEICE Technical Report, PRMU2006-166, pp. 31-36, 2006).

From the result of this comparison, the relationship determination program 111 calculates an index representing the difference or similarity between the two. The relationship determination program 111 may then determine whether the predetermined motion has been performed by comparing the calculated index with a predetermined threshold.

For example, if the comparison yields an index whose value grows with the difference, the relationship determination program 111 determines that the predetermined motion has been performed when the calculated index is smaller than the predetermined threshold.

Conversely, if the comparison yields a similarity whose value grows as the difference shrinks, the relationship determination program 111 determines that the predetermined motion has been performed when the calculated similarity is greater than the threshold. When the relationship determination program 111 determines that the predetermined motion has been performed, it saves the time of that determination in the storage device 107 or the like. The relationship determination program 111 then executes step 609.
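
The DP-matching comparison and threshold test might look like the following sketch, which uses a plain dynamic-time-warping distance over hand positions only; omitting the orientation terms and normalizing by path length are simplifications of the cited method, and the threshold value is an illustrative assumption.

```python
# Hypothetical DP-matching (DTW) sketch: align the observed time series of
# hand positions with a registered motion and accept it when the normalized
# alignment cost falls below a threshold. Orientation is omitted for brevity.
import numpy as np

def dtw_cost(observed, registered):   # lists/arrays of (3,) positions
    n, m = len(observed), len(registered)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(observed[i - 1] - registered[j - 1])
            D[i, j] = d + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m] / (n + m)           # normalize by a path-length bound

def motion_performed(observed, registered, threshold=0.05):
    # Smaller cost means smaller difference, so accept below the threshold.
    return dtw_cost(observed, registered) < threshold
```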

After step 609, the relationship determination program 111 determines, based on the results of steps 1701 and 609, whether the hand no longer overlaps the work target after the worker's hand has performed the predetermined motion (1702). Specifically, when the predetermined motion has been performed before step 1702 starts, and the worker's hand is not present in the work target area after that motion, the relationship determination program 111 determines that the work target and the hand do not overlap. The relationship determination program 111 then causes the state determination program 112 to execute step 611.

Alternatively, when the predetermined motion is determined to have been performed at a time within a predetermined period before the point at which the worker's hand was determined to be absent from the work target area, the relationship determination program 111 may determine that the work target and the hand do not overlap, and cause the state determination program 112 to execute step 611.
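
This time-window variant of the step-1702 decision could be expressed as in the sketch below, where the window length and the way the motion time is stored are assumptions made for illustration.

```python
# Hypothetical gating for step 1702: the hand must have left the work-target
# area, and the predetermined motion must have been recognized within the
# last `window` seconds before that moment.
def procedure_finished(hand_in_target_area, motion_time, now, window=3.0):
    if hand_in_target_area or motion_time is None:
        return False                     # hand still working, or no motion seen
    return (now - motion_time) <= window # motion happened recently enough
```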

The third embodiment described above is also applicable to the second embodiment, in which images acquired from a camera are used as the working space data.

In step 1701 described above, whether the predetermined hand motion was performed is determined based solely on the comparison between the time series of the worker's hand positions and postures and the predetermined hand motion. However, the relationship determination program 111 may additionally determine that the predetermined hand motion has been performed only when the motion occurred within a predetermined distance of the work target area.

In step 1701, the relationship determination program 111 may also extract the direction of the hand and the direction of the hand's movement. When the work target area lies in the extracted direction of movement and the hand movement is close to the predetermined motion, the relationship determination program 111 may determine that the predetermined hand motion has been performed.
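
Checking whether the work target area lies along the extracted movement direction could be done as in the sketch below, which approximates the target region by its centroid and uses an angular tolerance; both simplifications are assumptions, not details fixed by the specification.

```python
# Hypothetical check: does the hand's movement direction point at the
# work-target region? The region is approximated by its centroid, and the
# angular tolerance is an illustrative assumption.
import numpy as np

def moving_toward(hand_pos, hand_velocity, target_centroid, max_angle_deg=20.0):
    v = hand_velocity / (np.linalg.norm(hand_velocity) + 1e-9)
    to_target = target_centroid - hand_pos
    to_target = to_target / (np.linalg.norm(to_target) + 1e-9)
    cos_angle = float(np.dot(v, to_target))
    # Accept when the movement direction is within the angular tolerance.
    return cos_angle >= np.cos(np.deg2rad(max_angle_deg))
```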

In this way, the relationship determination program 111 can treat the moment at which a specific motion indicating the end of the work procedure is performed as the appropriate timing for determining the work state. For example, the relationship determination program 111 can proceed to step 611, which determines the state of the work target, at the moment a "pointing" gesture is performed toward the work target.

In step 1701, the relationship determination program 111 may also extract the direction of the hand's movement regardless of whether the predetermined hand motion is determined to have been performed, and determine whether the work target lying in the extracted direction is the one associated with the work procedure currently being executed.

When the relationship determination program 111 detects that the hand is moving toward an area that is not a work target associated with the current work procedure, for example toward a position where a different work target is detected, it may use the information presentation program 108 and the output device 103 to present the worker with a warning indicating that work is about to be performed on a different work target.

This allows the relationship determination program 111 to prompt the worker to confirm the content of the work again, review the work procedure, and be navigated appropriately.

In step 1702, when the worker's hand has not performed the predetermined motion for more than a predetermined period, the relationship determination program 111 may use the information presentation program 108 and the output device 103 to present the worker with, for example, a warning or reminder indicating that the work is delayed.

According to the third embodiment described above, the work support system determines whether the state of the work target is appropriate based on whether the worker has performed the predetermined motion and whether the hand has been removed from the work target. The work support system can therefore determine the end of a work procedure at an appropriate timing after the work has been properly performed, and can thereby navigate the worker effectively at the appropriate timing.

A fourth embodiment of the present invention will be described with reference to FIGS. 19 and 20.

The work support system of the first embodiment proceeds to calculating the difference between the work target data in the working space data and the normal object model based on the relationship between the worker's hand and the work target.

The work support system of the fourth embodiment assumes a case in which the worker performs the work while holding a tool or the like. It determines the positional relationship between the work target area and the worker's hand, and when the worker's hand does not overlap the work target area, it further determines whether a tool or the like is present between the worker's hand and the work target area.

When a tool or the like is determined to be present, the work support system of the fourth embodiment determines whether the region of the tool overlaps the work target area. When neither the hand nor the tool overlaps the work target area, the work support system of the fourth embodiment calculates the difference between the work target data in the working space data and the normal object model, and determines whether the work procedure has ended normally.

The configuration of the work support system of the fourth embodiment is also the same as that of the work support system of FIG. 1, but part of its processing differs from that of the first embodiment.

FIG. 19 is a flowchart illustrating the processing performed by the work support system of the fourth embodiment.

In the processing of FIG. 19, steps 601 to 609 and steps 611 to 616 are the same as in the first embodiment, while steps 1801 and 1802 differ from the processing shown in FIG. 7. The processing of step 1801 is executed by the relationship determination program 111 of the fourth embodiment.

After step 609, the relationship determination program 111 detects an object such as a tool and determines whether the object overlaps the work target (1801). The relationship determination program 111 may execute step 1801 particularly when it is determined in step 609 that the worker's hand does not overlap the work target area. This prevents the relationship determination program 111 from erroneously determining in step 1802 that a work procedure has ended in the case where the worker's hand is away from the predetermined position range containing the work target but the worker is still working on the target using an object such as a tool.

FIG. 20 is an explanatory diagram showing the working space data when the worker of the fourth embodiment performs the work while holding a tool or the like.

In FIG. 20, a region 1901 is point cloud data corresponding to an object such as a tool. A region 1001 is the work target area obtained in step 607, as in FIGS. 11 and 13, and a region 1202 is the worker's hand region obtained in step 609.

In step 1801, the relationship determination program 111 extracts from the working space data the point cloud data lying in the space between the work target area 1001 and the worker's hand area 1202. When the extracted point cloud data is in contact with both the work target area and the worker's hand area, the relationship determination program 111 can determine that a tool or the like held by the worker is present in the work target area.

As with the hand detection method described in the first embodiment, the relationship determination program 111 may also hold in advance a three-dimensional model of each tool to be used. The relationship determination program 111 may then extract the point cloud data between the work target area and the worker's hand area from the working space data and align the extracted point cloud data with the three-dimensional tool model so that the two match best.

When the difference between the three-dimensional tool model and the extracted point cloud data is smaller than a predetermined threshold, the relationship determination program 111 may determine that the worker is holding the tool.
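
A simplified sketch of this step-1801 test follows, gathering the points between the hand and target regions and comparing them against a stored tool model via a nearest-neighbor residual; the bounding-box approximation of "between," the residual metric, and the threshold are assumptions, and a full implementation would first align the clouds (for example with ICP).

```python
# Hypothetical sketch of step 1801: gather the points lying between the
# hand region and the work-target region, then decide whether they match a
# pre-registered tool model. Regions are approximated by axis-aligned boxes
# given as (min_corner, max_corner) pairs of (3,) arrays.
import numpy as np
from scipy.spatial import cKDTree

def points_between(cloud, hand_box, target_box):
    lo = np.minimum(hand_box[0], target_box[0])
    hi = np.maximum(hand_box[1], target_box[1])
    mask = np.all((cloud >= lo) & (cloud <= hi), axis=1)
    return cloud[mask]

def tool_present(cloud, hand_box, target_box, tool_model, threshold=0.01):
    between = points_between(cloud, hand_box, target_box)
    if len(between) == 0:
        return False
    # Mean nearest-neighbor distance from observed points to the tool model;
    # a small residual suggests the observed points match the tool.
    dists, _ = cKDTree(tool_model).query(between)
    return float(np.mean(dists)) < threshold
```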

In this case, as with hand detection, the relationship determination program 111 may identify the region of the tool in the working space data based on the result of aligning the three-dimensional model with the extracted point cloud data. The relationship determination program 111 may then further determine whether the tool overlaps the work target area based on the overlap between the identified tool region and the work target area. The work procedure 105 may also hold the three-dimensional model of each tool to be used in association with each work procedure.

After step 1801, the relationship determination program 111 determines, based on the determinations in steps 609 and 1801, whether the worker's hand or an object such as a tool overlaps the work target area (1802). When neither the worker's hand nor the object such as a tool is included in (overlaps) the work target area, the state determination program 112 starts step 611; when either is included in (overlaps) the work target area, the input program 113 starts step 604.

The fourth embodiment is also applicable to the second embodiment, in which images acquired from a camera are used as the working space data.

According to the fourth embodiment described above, the process of determining the state of the work target can be performed at an appropriate timing even when the worker performs the work while holding a tool or the like, so the present embodiment is applicable to more general work.

The present invention is not limited to the embodiments described above and includes various modifications. For example, the embodiments above have been described in detail for ease of understanding, and the invention is not necessarily limited to configurations having all of the described components.

Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment. For part of the configuration of each embodiment, other configurations can be added, deleted, or substituted.

Each of the configurations, functions, processing units, processing procedures, and the like described above may be realized in hardware, in part or in whole, for example by designing them as integrated circuits. Each of the configurations, functions, and the like described above may also be realized in software by a processor interpreting and executing a program that implements each function. Information such as the programs, tables, and files that implement each function can be stored in a recording device such as a memory, a hard disk, or an SSD (Solid State Drive), or on a recording medium such as an IC card, an SD card, or a DVD.

The control lines and information lines shown are those considered necessary for the explanation; not all control lines and information lines of a product are necessarily shown. In practice, almost all components are interconnected.

Claims (15)

1. A work support system comprising:
a camera that outputs an image capturing an object that is a target of work and the surroundings of the object;
a storage unit that stores a first model representing a three-dimensional model of the object;
a detection unit that detects, from the image, the hand of a user performing the work and the object;
a relationship determination unit that determines from the image whether the hand and the object overlap;
a state determination unit that, when the hand and the object do not overlap, extracts first information on the object from the image, extracts second information on the object from the first model, and determines a work state based on a result of comparing the first information with the second information; and
an output unit that outputs information according to the work state.

2. The work support system according to claim 1, wherein
the first model represents an ideal shape of the object after the work,
the camera outputs the image representing a three-dimensional model of the object and objects existing around the object,
the detection unit detects, from the image, a second model that is a three-dimensional model of the object represented by the image and a three-dimensional model of the hand,
the relationship determination unit determines whether the three-dimensional model of the hand and the second model overlap, and
when the three-dimensional model of the hand and the second model of the object do not overlap, the state determination unit extracts information on the second model of the object as the first information and further extracts the second information on the object from the first model.

3. The work support system according to claim 1, wherein the camera outputs an image capturing a space including the field of view of the user performing the work, as seen from the viewpoint of the user, as the image capturing the object and the surroundings of the object.

4. The work support system according to claim 1, wherein
the camera outputs a plurality of the images, and
the relationship determination unit determines that the hand and the object do not overlap when it detects from the plurality of images that, after the hand has been present within a predetermined position range including the object for at least a first predetermined time, the hand is absent from the predetermined position range including the object for at least a second predetermined time.

5. The work support system according to claim 1, wherein
the camera outputs a plurality of the images,
the detection unit detects both hands of the user from the plurality of images, and
the relationship determination unit determines that the hands and the object do not overlap when it detects from the plurality of images that, after both hands of the user have been present within a predetermined position range including the object for at least a first predetermined time, the hands of the user are absent from the predetermined position range including the object for at least a second predetermined time.

6. The work support system according to claim 1, wherein
the camera outputs a plurality of the images, and
the relationship determination unit acquires the motion of the hand from the plurality of images, and determines that the hand and the object do not overlap when it detects from the plurality of images that, after the hand has performed a predetermined motion within a predetermined position range including the object, the hand is absent from the predetermined position range including the object for at least a predetermined time.

7. The work support system according to claim 1, wherein
the camera outputs a plurality of the images, and
the relationship determination unit acquires the motion of the hand from the plurality of images, and determines that the hand and the object do not overlap when it detects from the plurality of images that, after the hand has performed a predetermined motion toward the direction in which the object exists, the hand is absent from a predetermined position range including the object for at least a predetermined time.

8. The work support system according to claim 1, wherein the relationship determination unit
determines, using the image, whether an object that is neither the target object nor the hand exists between the target object and the hand,
when such an object exists between the target object and the hand, determines, using the image, whether the object is included in a predetermined position range including the target object, and
determines that the hand and the target object do not overlap when the hand is absent from the predetermined position range including the target object and the object is absent from the predetermined position range including the target object.

9. The work support system according to claim 1, wherein the state determination unit
obtains a difference between the object detected from the image and the first model by comparing the first information with the second information, and
determines, as the work state, whether the work has been properly completed based on the difference.

10. The work support system according to claim 9, wherein
the storage unit holds information on a next object that is the target of the work to be performed after the current work,
when the state determination unit determines that the work has been properly completed, the output unit outputs, as the information according to the work state, an indication that the work has been properly completed and the information on the next object, and
when the state determination unit determines that the work has not been properly completed, the output unit outputs, as the information according to the work state, an indication that the work has not been properly completed.

11. The work support system according to claim 9, wherein
the camera outputs a plurality of the images over a predetermined time,
the state determination unit obtains a plurality of differences between the objects detected from the plurality of images and the first model, and
when the state determination unit determines, based on the plurality of differences, that the work has not been properly completed within the predetermined time, the output unit outputs a warning as the information according to the work state.

12. The work support system according to claim 1, wherein
the storage unit holds information on another object that is the target of work different from the current work,
the relationship determination unit determines whether the hand is outside a predetermined position range including the object and is present within a predetermined position range including the other object, and
when the hand is outside the predetermined position range including the object and is present within the predetermined position range including the other object, the output unit outputs information indicating a warning.

13. The work support system according to claim 1, wherein
the camera outputs a plurality of the images,
the relationship determination unit acquires the direction in which the hand moves from the plurality of images and determines whether the direction in which the hand moves is a direction toward a predetermined position range including the object, and
when the direction in which the hand moves is not a direction toward the predetermined position range including the object, the output unit outputs information indicating a warning.

14. A work support method performed by a work support system, the work support system comprising a processor, a memory, and a camera that outputs an image capturing an object that is a target of work and the surroundings of the object, the method comprising:
a storage procedure in which the processor stores a first model representing a three-dimensional model of the object;
a detection procedure in which the processor detects, from the image, the hand of a user performing the work and the object;
a relationship determination procedure in which the processor determines whether the hand and the object overlap;
a state determination procedure in which, when the hand and the object do not overlap, the processor extracts first information on the object from the image, extracts second information on the object from the first model, and determines a work state based on a result of comparing the first information with the second information; and
an output procedure in which the processor outputs information according to the work state.

15. The work support method according to claim 14, wherein
the first model represents an ideal shape of the object after the work,
the method includes a procedure in which the camera outputs the image representing a three-dimensional model of the object and objects existing around the object,
the detection procedure includes a procedure in which the processor detects, from the image, a second model that is a three-dimensional model of the object represented by the image and a three-dimensional model of the hand,
the relationship determination procedure includes a procedure in which the processor determines whether the three-dimensional model of the hand and the second model overlap, and
the state determination procedure includes a procedure in which, when the three-dimensional model of the hand and the second model of the object do not overlap, the processor extracts information on the second model of the object as the first information and further extracts the second information on the object from the first model.