
WO2025004362A1 - Remote operation assistance system and method - Google Patents

Remote operation assistance system and method Download PDF

Info

Publication number
WO2025004362A1
WO2025004362A1 (PCT/JP2023/024474)
Authority
WO
WIPO (PCT)
Prior art keywords
operator
user
terminal
fingers
movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2023/024474
Other languages
French (fr)
Japanese (ja)
Inventor
由美 村上
伸二 宮原
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NTT Inc
Original Assignee
Nippon Telegraph and Telephone Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corp filed Critical Nippon Telegraph and Telephone Corp
Priority to PCT/JP2023/024474 priority Critical patent/WO2025004362A1/en
Publication of WO2025004362A1 publication Critical patent/WO2025004362A1/en
Anticipated expiration legal-status Critical
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • One aspect of the present invention relates to a remote operation support system and a remote operation support method used to remotely support, for example, the operation of a smartphone.
  • For example, one known service that supports users who are unfamiliar with operating smartphones connects the user's smartphone to an operator terminal at a service center via a network, and the operator provides instructions on how to operate the smartphone while the screen is shared between the two terminals (see, for example, Non-Patent Document 1).
  • With this service, the user can receive assistance with operating the smartphone from the comfort of their own home, as if the operator were face-to-face.
  • However, with this type of conventional service, the user confirms the operation method only by visually viewing the screen as the operator operates.
  • As a result, when the user later tries to perform the operation they saw the operator perform, they may be unable to reproduce it because their hand and finger movements differ from the operator's.
  • This invention was made with the above in mind, and aims to provide technology that enables users who are unfamiliar with operation or who have difficulty moving their fingers at will to reliably operate an information terminal using their fingers.
  • One aspect of the remote operation support system or method according to the present invention is a system that connects a user terminal and an operator terminal via a network and transmits the operator's operations to a user while sharing a display screen.
  • In this system, the operator terminal first detects motion information reflecting the movement of the operator's fingers, and transmits the detected motion information from the operator terminal to the user terminal via the network.
  • The user terminal receives the motion information transmitted from the operator terminal, generates a stimulation control signal based on the received motion information to cause the user's fingers to move in the same way as the movement of the operator's fingers, and presents a stimulation to a body part related to the movement of the user's fingers in response to the generated stimulation control signal.
  • The motion information reflecting the movement of the operator's fingers is, for example, an electromyographic signal.
  • Based on this motion information, the user terminal applies stimuli such as electrical stimulation to the areas related to the movement of the user's fingers, so as to cause the user's fingers to move in the same way as the operator's fingers.
  • In this way, the movement of the operator's fingers is transmitted as-is to the user's fingers, and the user can directly recognize the movement of the operator's fingers at a remote location as the movement of his or her own fingers.
  • Thus, one aspect of the present invention provides technology that enables even users who are unfamiliar with operation or who have difficulty moving their fingers at will to reliably operate an information terminal using their fingers.
  • FIG. 1 is a diagram showing an example of the configuration of a remote operation support system according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing an example of the configuration of an operator terminal in the remote operation support system shown in FIG. 1.
  • FIG. 3 is a block diagram showing an example of the configuration of a user terminal in the remote operation support system shown in FIG. 1.
  • FIG. 4 is a flowchart showing an example of a processing procedure and processing contents of the operator terminal and the user terminal shown in FIG. 2 and FIG. 3.
  • FIG. 5A is a diagram showing an example of an operation to be supported.
  • FIG. 5B is a diagram showing another example of an operation to be supported.
  • FIG. 1 is a diagram showing an example of the configuration of a remote operation support system according to an embodiment of the present invention.
  • The remote operation support system connects a user terminal UT used by the user receiving support with an operator terminal OT used by the operator providing the support via a network NW. In this state, a display screen is shared between the user terminal UT and the operator terminal OT, and while looking at this display screen, the operator supports the user in operating the terminal using their fingers.
  • Operation gloves HE1 and HE2 are worn on the hands of the operator and the user, respectively.
  • A sensor unit and a stimulus presentation unit are installed in or connected to each of the gloves HE1 and HE2.
  • the sensor unit is composed of electromyogram sensors ES1 and ES2 and their detection electrodes.
  • the detection electrodes include, for example, multiple sets of two electrodes that form a pair, and these sets are placed in locations inside the glove that correspond to each finger.
  • the electromyogram sensors ES1 and ES2 measure the surface electromyogram (EMG) of the skin, and measure the skin surface potential difference that appears between the two electrodes for each set of electrodes as an electromyogram signal.
  • the electromyogram sensors ES1 and ES2 detect amplitude changes corresponding to muscle activity associated with the finger movement accompanying the swipe operation.
  • the electromyogram sensors ES1 and ES2 then transmit the detected electromyogram signals together with identification information of the electrode set that detected the signals to the operator terminal OT and the user terminal UT via, for example, a signal cable or a wireless line that employs a low-power wireless data transmission standard such as Bluetooth (registered trademark).
  • the stimulation presentation unit is composed of electrical stimulation generators ED1 and ED2 and stimulation presentation electrodes.
  • the stimulation presentation electrodes include, for example, multiple sets of two electrodes that form a pair. Each electrode set is placed at a position in the glove that corresponds to the muscle that moves each finger.
  • the electrical stimulation generators ED1 and ED2 individually apply electrical stimulation to the muscles that move each finger, generating a low-frequency stimulation signal suitable for muscular contraction by, for example, superimposing a stimulation frequency of 15 to 200 Hz on a reference frequency of several kHz to 20 kHz.
  • the reference frequency, stimulation frequency and signal strength of the electrical stimulation signal can be arbitrarily controlled by the electrical stimulation control signal output from the operator terminal OT and the user terminal UT, and can be arbitrarily set according to the type and condition of the muscle to be stimulated, etc.
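The patent gives no implementation details, but the stimulation signal described above (a several-kHz carrier with a low 15 to 200 Hz stimulation frequency superimposed) can be sketched as follows. All function names, default values, and the square-gate modulation scheme are illustrative assumptions, not part of the patent.

```python
import math

def stimulation_waveform(carrier_hz=4000.0, burst_hz=50.0, amplitude=1.0,
                         duration_s=0.1, sample_rate=40000):
    """Generate a burst-modulated stimulation waveform: a kHz-range
    carrier gated on and off at the low stimulation frequency."""
    n = int(duration_s * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        carrier = math.sin(2 * math.pi * carrier_hz * t)
        # Gate the carrier with a square envelope at the burst frequency
        gate = 1.0 if math.sin(2 * math.pi * burst_hz * t) >= 0 else 0.0
        samples.append(amplitude * carrier * gate)
    return samples

wave = stimulation_waveform()
print(len(wave))
```

Adjusting `carrier_hz`, `burst_hz`, and `amplitude` would correspond to the reference frequency, stimulation frequency, and signal strength that the electrical stimulation control signal is said to control.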
  • the network NW comprises, for example, a wide area network with the Internet at its core, and an access network for accessing this wide area network.
  • Examples of the access network that can be used include a public communication network using wired or wireless connections, a Local Area Network (LAN) using wired or wireless connections, and a Cable Television (CATV) network.
  • Alternatively, the network NW may be configured as an in-house network such as a LAN or wireless LAN.
  • FIG. 2 is a block diagram showing an example of the hardware and software configuration of the operator terminal OT.
  • the operator terminal OT is, for example, a smartphone, and has a control unit 11 that uses a hardware processor such as a central processing unit (CPU).
  • A storage unit having a program storage unit 12 and a data storage unit 13, an input/output interface (hereinafter, the interface will be abbreviated as I/F) unit 14, and a communication I/F unit 15 are connected to this control unit 11 via a bus (not shown).
  • a PDA terminal with a touch panel or a notebook personal computer may also be used as the operator terminal OT.
  • a touch panel type input/output device is connected to the input/output I/F unit 14.
  • the touch panel type input/output device is configured by arranging an input device 162, which is, for example, a piezoelectric or electrostatic input sheet, on the display surface of a display device 161, which uses, for example, liquid crystal or organic EL.
  • the input/output I/F unit 14 is also connected to the electromyogram sensor ES1 and the electrical stimulation generator ED1 described above.
  • E11 and E12 are a detection electrode and a stimulation presentation electrode connected to the electromyogram sensor ES1 and the electrical stimulation generator ED1, respectively.
  • the communication I/F unit 15 transmits and receives information data to and from the user terminal UT via the network NW in accordance with the communication protocol defined by the network NW.
  • the program storage unit 12 is configured by combining, for example, a non-volatile memory such as a solid state drive (SSD) as a storage medium that can be written to and read from at any time, and a non-volatile memory such as a read only memory (ROM), and stores application programs necessary to execute various control processes according to one embodiment, in addition to middleware such as an operating system (OS).
  • Hereinafter, the OS and the application programs will be collectively referred to as the program.
  • the data storage unit 13 is, for example, a combination of a non-volatile memory such as an SSD, which can be written to and read from at any time, and a volatile memory such as a RAM (Random Access Memory), and is used to temporarily store transmitted and received data, as well as data generated in the process of providing operational assistance to the user using electrical stimulation.
  • the control unit 11 includes a screen sharing control processing unit 111, an electromyographic signal receiving processing unit 112, an electrical stimulation control processing unit 113, and an electromyographic signal transmitting processing unit 114 as processing functions necessary to implement one embodiment. All of these processing units 111 to 114 are realized by causing the hardware processor of the control unit 11 to execute application programs stored in the program storage unit 12.
  • the screen sharing control processing unit 111 shares display screen data used for operational support with the user terminal UT when a communication link is established with the user terminal UT.
  • the myoelectric signal receiving and processing unit 112 receives the user's myoelectric signal transmitted from the user terminal UT via the communication I/F unit 15, and passes the received user's myoelectric signal to the electrical stimulation control processing unit 113.
  • the electrical stimulation control processing unit 113 generates an electrical stimulation control signal in response to the user's myoelectric signal to cause the operator's finger to move in the same way as the user's finger as represented by the myoelectric signal, and outputs the generated electrical stimulation control signal from the input/output I/F unit 14 to the electrical stimulation generator ED1.
  • the myoelectric signal transmission processing unit 114 receives the operator's myoelectric signal detected by the myoelectric sensor ES1 from the input/output I/F unit 14, and transmits the received operator's myoelectric signal from the communication I/F unit 15 to the user terminal UT.
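The exchanges above send an EMG signal together with the identification information of the electrode set that detected it. The patent does not define a wire format, so the JSON layout below is purely a hypothetical sketch of how such a message might be serialized between the terminals.

```python
import json

def pack_emg_message(electrode_set_id, samples, sample_rate_hz):
    """Serialize an EMG reading, tagged with the electrode set that
    produced it, for transmission between terminals (illustrative format)."""
    return json.dumps({
        "type": "emg",
        "electrode_set": electrode_set_id,
        "sample_rate_hz": sample_rate_hz,
        "samples": samples,
    })

def unpack_emg_message(raw):
    """Parse a message produced by pack_emg_message."""
    msg = json.loads(raw)
    assert msg["type"] == "emg"
    return msg["electrode_set"], msg["samples"], msg["sample_rate_hz"]

raw = pack_emg_message("index_finger", [0.01, 0.12, -0.08], 1000)
print(unpack_emg_message(raw))
```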
  • FIG. 3 is a block diagram showing an example of the hardware and software configuration of the user terminal UT.
  • the user terminal UT is a smartphone, just like the operator terminal OT described above. Note that the user terminal UT can also be an information terminal using a touch panel, such as a PDA.
  • the user terminal UT comprises a control unit 21 using a hardware processor such as a central processing unit (CPU), a storage unit having a program storage unit 22 and a data storage unit 23, an input/output I/F unit 24, and a communication I/F unit 25, all connected via a bus.
  • The input/output I/F unit 24 is connected to a touch panel type input/output device in which a sheet-like input device 262 is arranged on the display surface of a display device 261.
  • The input/output I/F unit 24 is also connected to an electromyogram sensor ES2 and an electrical stimulation generator ED2.
  • a detection electrode E21 is connected to the electromyogram sensor ES2, and a stimulation presentation electrode E22 is connected to the electrical stimulation generator ED2.
  • the communication I/F unit 25 transmits and receives information data to and from the operator terminal OT via the network NW in accordance with the communication protocol defined by the network NW.
  • the program storage unit 22 is configured by combining, for example, a non-volatile memory such as an SSD as a storage medium that can be written to and read from at any time, and a non-volatile memory such as a ROM, and stores application programs necessary for executing various control processes according to one embodiment, in addition to middleware such as an OS.
  • the OS and each application program will be collectively referred to as a program.
  • the data storage unit 23 is, for example, a combination of a non-volatile memory such as an SSD that can be written to and read from at any time as a storage medium, and a volatile memory such as a RAM, and is used to temporarily store transmitted and received data, as well as data generated in the process of receiving operational assistance using electrical stimulation.
  • the control unit 21 includes, as processing functions necessary to implement one embodiment, a screen sharing control processing unit 211, an EMG signal transmission processing unit 212, an EMG signal reception processing unit 213, and an electrical stimulation control processing unit 214. All of these processing units 211 to 214 are realized by causing the hardware processor of the control unit 21 to execute application programs stored in the program storage unit 22.
  • the screen sharing control processing unit 211 shares display screen data used to receive operational assistance with the operator terminal OT when a communication link is established with the operator terminal OT.
  • the myoelectric signal transmission processing unit 212 receives the user's myoelectric signal detected by the myoelectric sensor ES2 from the input/output I/F unit 24, and transmits the received user's myoelectric signal from the communication I/F unit 25 to the operator terminal OT.
  • the myoelectric signal receiving and processing unit 213 receives the operator's myoelectric signal transmitted from the operator terminal OT via the communication I/F unit 25, and passes the received operator's myoelectric signal to the electrical stimulation control processing unit 214.
  • the electrical stimulation control processing unit 214 generates an electrical stimulation control signal in response to the operator's myoelectric signal to cause the user's finger to move in the same way as the operator's finger represented by the myoelectric signal, and outputs the generated electrical stimulation control signal from the input/output I/F unit 24 to the electrical stimulation generator ED2.
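How the electrical stimulation control processing units 113 and 214 map a received myoelectric signal to a control signal is not specified in the source. One naive possibility, sketched below, scales the mean rectified EMG amplitude into a stimulation strength; the gain, safety cap, and fixed application time are illustrative assumptions only.

```python
def make_stimulation_command(electrode_set_id, emg_samples,
                             gain=5.0, max_strength=1.0, apply_ms=200):
    """Derive a stimulation control signal from a received EMG burst.
    The mean rectified EMG amplitude is scaled into a stimulation
    strength and capped at a safe maximum (all constants illustrative)."""
    mean_rectified = sum(abs(s) for s in emg_samples) / len(emg_samples)
    strength = min(gain * mean_rectified, max_strength)
    return {
        "electrode_set": electrode_set_id,   # which finger's muscle to drive
        "strength": strength,                # stimulation amplitude
        "apply_ms": apply_ms,                # application time
    }

cmd = make_stimulation_command("thumb", [0.1, -0.1, 0.2])
print(cmd)
```

The resulting dictionary mirrors the control signal described later in the text: a strength value, an application time, and an identifier for the electrode set corresponding to the finger to be moved.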
  • FIG. 4 is a flowchart showing an example of the processing procedure and processing content of operation support control executed by each of the control units 11, 21 of the operator terminal OT and the user terminal UT of the remote operation support system.
  • the electromyogram sensors ES1, ES2 and the electrical stimulation generators ED1, ED2 may both be integrally installed in the gloves HE1, HE2, or may be connected by a signal cable.
  • electromyogram sensors ES1, ES2 and the electrical stimulation generators ED1, ED2 are paired in advance with the operator terminal OT and the user terminal UT to enable communication, for example, via Bluetooth.
  • The control unit 11 of the operator terminal OT establishes a communication link with the user terminal UT in response to access from the user terminal UT, and then, upon receiving an operation assistance request in step S11, launches an application and notifies the operator of this fact.
  • In step S12, the control unit 21 of the user terminal UT reads out display screen data to be operated from the data storage unit 23, or downloads it from a website, and displays it on the display device 261.
  • At the same time, the display screen data is transmitted from the communication I/F unit 25 to the operator terminal OT.
  • The control unit 11 of the operator terminal OT, under the control of the screen sharing control processing unit 111, receives the display screen data transmitted from the user terminal UT via the communication I/F unit 15 in step S13, and displays the received display screen data on the display device 161.
  • Although map data is used here as the display screen data, it may also be image data such as photographs or text data.
  • Figures 5A and 5B show, as an example of such an operation, a pinch-in operation and a pinch-out operation using two fingers simultaneously. Note that other examples of the operation may include an operation using three fingers simultaneously, a drag operation using one finger, a flick operation, a double tap operation, etc.
  • When the control unit 21 of the user terminal UT detects the user's trial operation in step S14, it acquires, under the control of the myoelectric signal transmission processing unit 212 and via the input/output I/F unit 24 in step S15, a myoelectric signal (including identification information of the electrode set from which the signal was detected) corresponding to the finger movement associated with the trial operation, as detected by the myoelectric sensor ES2.
  • the myoelectric signal transmission processing unit 212 then transmits the acquired myoelectric signal of the user from the communication I/F unit 25 to the operator terminal OT in step S16.
  • The control unit 21 of the user terminal UT also transmits the operated display screen data from the communication I/F unit 25 to the operator terminal OT.
  • When the control unit 11 of the operator terminal OT, under the control of the screen sharing control processing unit 111, receives the display screen data transmitted from the user terminal UT via the communication I/F unit 15 in step S17, it updates, in step S18, the display screen data displayed on the display device 161 to the received display screen data.
  • The control unit 11 of the operator terminal OT also receives, under the control of the myoelectric signal receiving and processing unit 112, the user's myoelectric signal transmitted from the user terminal UT via the communication I/F unit 15 in step S17.
  • When the control unit 11 of the operator terminal OT receives the user's myoelectric signal, it generates, in step S19 under the control of the electrical stimulation control processing unit 113, an electrical stimulation control signal for causing the operator's finger to move in the same way as the user's finger represented by the user's myoelectric signal.
  • This electrical stimulation control signal includes values specifying the strength (amplitude) and application time of the electrical stimulation, and an identification signal specifying the electrode set corresponding to the finger to which the electrical stimulation is to be applied.
  • the electrical stimulation control processing unit 113 then outputs the generated electrical stimulation control signal from the input/output I/F unit 14 to the electrical stimulation generator ED1.
  • As a result, an electrical stimulation signal of the strength specified by the electrical stimulation control signal is generated by the electrical stimulation generator ED1 for the specified length of time, and the generated electrical stimulation signal is applied to the set of stimulation presentation electrodes corresponding to the finger to be moved.
  • In this way, electrical stimulation is applied to the muscles that move the operator's finger corresponding to the finger used by the user in the trial operation, thereby moving the operator's finger.
  • In other words, the movement of the user's finger is transferred to the operator's finger. The operator can therefore recognize the finger movement associated with the user's trial operation from the movement of his or her own finger, and can directly confirm any points in the user's trial operation that need to be corrected.
  • The operator then refers to the confirmed finger movement of the user and performs a model operation, such as a pinch-in operation or a pinch-out operation, on the display screen of the operator terminal OT.
  • In step S21, the control unit 11 acquires, under the control of the myoelectric signal transmission processing unit 114 and via the input/output I/F unit 14, a myoelectric signal (including identification information of the electrode set that detected the signal) corresponding to the finger movement associated with the model operation, as detected by the myoelectric sensor ES1.
  • The myoelectric signal transmission processing unit 114 then transmits the acquired myoelectric signal of the operator from the communication I/F unit 15 to the user terminal UT in step S22.
  • The control unit 21 of the user terminal UT, under the control of the myoelectric signal receiving and processing unit 213, receives the operator's myoelectric signal transmitted from the operator terminal OT via the communication I/F unit 25 in step S23.
  • Upon receiving the operator's myoelectric signal, the control unit 21 of the user terminal UT, under the control of the electrical stimulation control processing unit 214, generates in step S24 an electrical stimulation control signal for causing the user's finger to move in the same way as the movement of the operator's finger represented by the operator's myoelectric signal.
  • This electrical stimulation control signal includes values specifying the strength (amplitude) and application time of the electrical stimulation, and an identification signal specifying the electrode set corresponding to the finger to which the electrical stimulation is to be applied.
  • the electrical stimulation control processing unit 214 then outputs the generated electrical stimulation control signal from the input/output I/F unit 24 to the electrical stimulation generator ED2.
  • As a result, an electrical stimulation signal of the strength specified by the electrical stimulation control signal is generated by the electrical stimulation generator ED2 for the specified length of time, and the generated electrical stimulation signal is applied to the set of stimulation presentation electrodes corresponding to the finger to be moved.
  • In this way, electrical stimulation is applied to the muscles that move the user's finger corresponding to the finger used by the operator for the model operation, thereby moving the user's finger.
  • In other words, the operator's model finger movement is transferred to the user's finger. The user can therefore directly recognize the finger movement associated with the operator's model operation as the movement of his or her own finger.
  • the display screen data after this operation is sent from the communication I/F unit 25 to the operator terminal OT in step S25 under the control of the screen sharing control processing unit 211.
  • The control unit 11 of the operator terminal OT, under the control of the screen sharing control processing unit 111, receives the display image data transmitted from the user terminal UT via the communication I/F unit 15 in step S26, and in step S27 outputs the received display image data from the input/output I/F unit 14 to the display device 161 for display.
  • the operator can check the results of the user's operations by looking at the display screen data.
  • In step S28, the control unit 21 of the user terminal UT monitors whether the user performs an operation to end the operational assistance. If not, the control unit 21 returns to step S14 and repeats the series of operational assistance processes in steps S14 to S28. When a request to end the operational assistance is detected, the control unit 21 transmits the end request from the communication I/F unit 25 to the operator terminal OT and then closes the application, ending the operational assistance process.
  • In step S29, the control unit 11 of the operator terminal OT monitors for receipt of the operation assistance end request sent from the user terminal UT. If the end request has not been received, the process returns to step S17 and the operation assistance control in steps S17 to S29 continues. If the end request is received, the control unit 11 of the operator terminal OT notifies the operator of this fact with a message and closes the application, ending the operation assistance.
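Steps S14 to S28 on the user-terminal side form a loop. The control flow (not the actual signal processing) can be sketched as below, with the network transport and electrical stimulation stubbed out; the step labels follow the flowchart, while everything else is illustrative.

```python
def assistance_session(trial_ops):
    """Minimal sketch of the user-terminal loop (steps S14-S28):
    for each trial operation, send the user's EMG to the operator,
    receive the operator's model EMG back, stimulate the user's
    fingers, and share the updated screen; stop on an end request."""
    log = []
    for op in trial_ops:
        if op == "end":                        # S28: end request detected
            log.append("end_sent_to_operator")
            break
        log.append(f"S15/S16: send user EMG for {op}")
        log.append(f"S23: receive operator model EMG for {op}")
        log.append(f"S24: stimulate user's fingers for {op}")
        log.append(f"S25: share updated screen for {op}")
    return log

log = assistance_session(["pinch-in", "pinch-out", "end"])
print(len(log))
```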
  • As described above, in one embodiment, the finger movement accompanying the operator's model operation, such as a swipe operation, is detected by the electromyographic sensor ES1, and the detected electromyographic signal is transmitted from the operator terminal OT to the user terminal UT.
  • When the user terminal UT receives the electromyographic signal, it generates an electrical stimulation control signal for causing the user's finger to make the same movement as the operator's model finger movement represented by the electromyographic signal, and supplies this to the electrical stimulation generator ED2, which applies an electrical stimulation to the user's finger to move it.
  • Furthermore, prior to providing operational assistance to the user, the user first performs a trial operation, and a myoelectric signal representing the finger movement at that time is transmitted from the user terminal UT to the operator terminal OT.
  • The operator terminal OT then generates an electrical stimulation control signal based on the myoelectric signal and applies an electrical stimulation to the operator's finger, thereby directly communicating the user's finger movement to the operator.
  • In the above embodiment, an electromyogram is detected as motion information reflecting the finger movement accompanying the operator's model operation; instead, for example, the position of the fingertip and its acceleration in multiple axial directions may be detected, and the user's finger movement may be controlled based on the detected fingertip position and acceleration.
  • In this case, an actuator or the like may be used to control the user's finger movement.
  • Likewise, in the above embodiment, electromyograms are detected as motion information reflecting the finger movements accompanying the user's operation, but skin electrodermal activity may be detected instead or in addition.
  • In that case, the degree of difficulty the user experiences in the operation is estimated based on the detected skin electrodermal activity, and the result is presented to the operator, allowing the operator to provide more detailed support by, for example, adjusting the speed of the model operation.
  • This invention is not limited to the above-described embodiment as it is; in the implementation stage, the components can be modified and embodied without departing from the gist of the invention.
  • various inventions can be formed by appropriately combining multiple components disclosed in the above-described embodiment. For example, some components may be deleted from all the components shown in the embodiment. Furthermore, components from different embodiments may be appropriately combined.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to one aspect of this invention, in a system that connects a user terminal and an operator terminal over a network and transfers an operation of the operator to a user in a state in which a display screen is shared, first, operation information in which the movement of a finger of the operator is reflected is detected in the operator terminal, and the detected operation information is transmitted from the operator terminal to the user terminal over the network. In response, the user terminal receives the operation information transmitted from the operator terminal, generates a stimulus control signal for generating the same movement as the finger of the operator in the finger of the user on the basis of the received operation information, and presents a stimulus to a body part related to the movement of the finger of the user in response to the generated stimulus control signal.

Description

Remote operation support system and method

One aspect of the present invention relates to a remote operation support system and a remote operation support method used to remotely support, for example, the operation of a smartphone.

For example, one service that supports users who are unfamiliar with operating smartphones is known, which connects the user's smartphone to an operator terminal at a service center via a network, and the operator provides instructions on how to operate the smartphone while the screen is shared between the two terminals (see, for example, Non-Patent Document 1). By using this service, the user can receive assistance with how to operate the smartphone from the comfort of their own home, as if the operator were face-to-face.

"Anshin Remote Support", NTT Docomo, Internet <URL: https://www.docomo.ne.jp/service/remote_support/>

However, in this type of conventional service, the user confirms the operation method by visually observing the screen as the operator operates it. As a result, when the user later tries to perform an operation they have watched the operator perform, their hand and finger movements may differ from the operator's, and they may be unable to reproduce the operation. In particular, for users who have difficulty moving their fingers at will due to a disability or other cause, it is difficult to reproduce with their own fingers the operator's operation that they observed.

This invention was made in view of the above circumstances, and aims to provide technology that enables even users who are unfamiliar with operation, or who have difficulty moving their fingers at will, to reliably operate an information terminal using their fingers.

In order to solve the above problem, one aspect of the remote operation support system or remote operation support method according to this invention is executed by a system that connects a user terminal and an operator terminal via a network and conveys the operator's operations to the user while a display screen is shared. First, the operator terminal detects motion information reflecting the movement of the operator's fingers, and transmits the detected motion information from the operator terminal to the user terminal via the network. The user terminal then receives the motion information transmitted from the operator terminal, generates, on the basis of the received motion information, a stimulation control signal for producing in the user's fingers the same movement as the operator's fingers, and presents a stimulus to a body part related to the movement of the user's fingers in accordance with the generated stimulation control signal.

According to this aspect of the invention, motion information reflecting the movement of the operator's fingers, for example an electromyographic (EMG) signal, is detected and transmitted to the user terminal, and the user terminal, on the basis of the motion information, applies a stimulus such as electrical stimulation to the body parts involved in the movement of the user's fingers so as to produce in them the same movement as the operator's fingers. The finger movements accompanying the operator's operation are thus conveyed directly to the user's fingers, allowing the user to perceive the finger movements of the remotely located operator as movements of their own fingers.

In other words, one aspect of the present invention can provide technology that enables even users who are unfamiliar with operation, or who have difficulty moving their fingers at will, to reliably operate an information terminal using their fingers.

FIG. 1 is a diagram showing an example of the configuration of a remote operation support system according to an embodiment of the present invention.
FIG. 2 is a block diagram showing an example of the configuration of an operator terminal in the remote operation support system shown in FIG. 1.
FIG. 3 is a block diagram showing an example of the configuration of a user terminal in the remote operation support system shown in FIG. 1.
FIG. 4 is a flowchart showing an example of the processing procedure and processing contents of the operator terminal and the user terminal shown in FIG. 2 and FIG. 3.
FIG. 5A is a diagram showing an example of an operation to be supported.
FIG. 5B is a diagram showing another example of an operation to be supported.

Embodiments of the present invention will be described below with reference to the drawings.

[One embodiment]
(Configuration example)
(1) System
FIG. 1 is a diagram showing an example of the configuration of a remote operation support system according to an embodiment of the present invention.

The remote operation support system according to an embodiment of the present invention connects a user terminal UT used by a user receiving support and an operator terminal OT used by an operator providing support via a network NW; in this state, a display screen is shared between the user terminal UT and the operator terminal OT, and while viewing this display screen the operator supports the user in operating the terminal with their fingers.

Operation gloves HE1 and HE2 are worn on the hands of the operator and the user, respectively. A sensor unit and a stimulus presentation unit are installed in or connected to each of the gloves HE1 and HE2.

The sensor unit is composed of an electromyographic sensor ES1 or ES2 and its detection electrodes. The detection electrodes comprise, for example, multiple sets of two paired electrodes, and these sets are placed at positions inside the glove corresponding to each finger. The electromyographic sensors ES1 and ES2 measure the surface electromyogram (EMG) of the skin; for each electrode set, the skin surface potential difference appearing between the two electrodes is measured as an EMG signal.

More specifically, the electromyographic sensors ES1 and ES2 detect amplitude changes corresponding to the activity of the muscles involved in the finger movements accompanying a swipe operation. The sensors then transmit the detected EMG signals, together with identification information of the electrode set that detected them, to the operator terminal OT or the user terminal UT via, for example, a signal cable or a wireless link employing a low-power wireless data transmission standard such as Bluetooth (registered trademark).
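The document does not specify a wire format for the EMG signals; the following sketch merely illustrates how an EMG frame might be bundled with the identification information of its detecting electrode set before transmission. All field layouts and names here are hypothetical, for illustration only.

```python
import struct
import time

def pack_emg_sample(electrode_set_id: int, samples_uv: list) -> bytes:
    """Serialize one EMG frame: message type, electrode-set ID,
    timestamp, and EMG samples in microvolts (hypothetical layout)."""
    header = struct.pack("<BId", 0x01, electrode_set_id, time.time())
    body = struct.pack(f"<H{len(samples_uv)}f", len(samples_uv), *samples_uv)
    return header + body

def unpack_emg_sample(frame: bytes):
    """Recover (electrode_set_id, timestamp, samples) from a packed frame."""
    _msg_type, electrode_set_id, ts = struct.unpack_from("<BId", frame, 0)
    offset = struct.calcsize("<BId")
    (n,) = struct.unpack_from("<H", frame, offset)
    samples = struct.unpack_from(f"<{n}f", frame, offset + 2)
    return electrode_set_id, ts, list(samples)
```

A frame like this could be carried equally over a Bluetooth link to the terminal or over the network NW between terminals.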

The stimulus presentation unit is composed of an electrical stimulation generator ED1 or ED2 and stimulus presentation electrodes. The stimulus presentation electrodes comprise, for example, multiple sets of two paired electrodes. Each electrode set is placed at a position inside the glove corresponding to the muscle that moves each finger.

The electrical stimulation generators ED1 and ED2 individually apply electrical stimulation to the muscle that moves each finger, generating a low-frequency stimulation signal suited to inducing muscle contraction by, for example, superimposing a stimulation frequency of 15 to 200 Hz on a reference frequency of several kHz to 20 kHz. The reference frequency, stimulation frequency, and signal strength of the electrical stimulation signal can be arbitrarily controlled by the electrical stimulation control signal output from the operator terminal OT or the user terminal UT, and are set according to the type and condition of the muscle to be stimulated.
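As a rough illustration of such a waveform, the sketch below superimposes a low stimulation frequency on a kHz-range reference (carrier) frequency by gating the carrier with a rectified low-frequency envelope. The burst-modulation interpretation and all parameter values are assumptions for illustration, not taken from the document.

```python
import math

def stimulation_waveform(duration_s=0.02, fs=100_000,
                         carrier_hz=4_000, stim_hz=50, amplitude=1.0):
    """Burst-modulated stimulation waveform: a kHz-range carrier gated by
    a rectified low-frequency envelope, yielding an effective stimulation
    rate of stim_hz bursts per second (hypothetical parameters)."""
    n = int(duration_s * fs)
    wave = []
    for i in range(n):
        t = i / fs
        carrier = math.sin(2 * math.pi * carrier_hz * t)
        # Rectified low-frequency sine gates the carrier into bursts.
        envelope = max(0.0, math.sin(2 * math.pi * stim_hz * t))
        wave.append(amplitude * carrier * envelope)
    return wave
```

Varying `carrier_hz`, `stim_hz`, and `amplitude` corresponds to the reference frequency, stimulation frequency, and signal strength that the text says the control signal can adjust.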

The network NW comprises, for example, a wide area network with the Internet at its core, and an access network for accessing this wide area network. As the access network, for example, a wired or wireless public communication network, a wired or wireless LAN (Local Area Network), or a CATV (Cable Television) network is used. When the remote operation support system is operated within a company or business establishment, for example, the network NW is configured as a premises network such as a LAN or wireless LAN.

(2) Operator terminal OT
FIG. 2 is a block diagram showing an example of the hardware and software configuration of the operator terminal OT.

The operator terminal OT is, for example, a smartphone, and comprises a control unit 11 using a hardware processor such as a central processing unit (CPU). A storage unit having a program storage unit 12 and a data storage unit 13, an input/output interface (hereinafter abbreviated as I/F) unit 14, and a communication I/F unit 15 are connected to this control unit 11 via a bus (not shown).

Besides a smartphone, a PDA terminal or a notebook personal computer equipped with a touch panel may also be used as the operator terminal OT.

A touch panel type input/output device is connected to the input/output I/F unit 14. The touch panel type input/output device is configured by arranging an input device 162, consisting of, for example, a piezoelectric or electrostatic input sheet, on the display surface of a display device 161 using, for example, liquid crystal or organic EL.

The electromyographic sensor ES1 and the electrical stimulation generator ED1 described above are also connected to the input/output I/F unit 14. E11 and E12 denote the detection electrodes and the stimulus presentation electrodes connected to the electromyographic sensor ES1 and the electrical stimulation generator ED1, respectively.

The communication I/F unit 15 transmits and receives information data to and from the user terminal UT via the network NW in accordance with the communication protocol defined for the network NW.

The program storage unit 12 is configured by combining, for example, a non-volatile memory that can be written to and read from at any time, such as an SSD (Solid State Drive), and a non-volatile memory such as a ROM (Read Only Memory) as storage media, and stores, in addition to middleware such as an OS (Operating System), the application programs necessary to execute the various control processes according to the embodiment. Hereinafter, the OS and the application programs are collectively referred to as programs.

The data storage unit 13 combines, for example, a non-volatile memory that can be written to and read from at any time, such as an SSD, and a volatile memory such as a RAM (Random Access Memory) as storage media, and is used to temporarily hold transmitted and received data, as well as data generated in the process of providing the user with operation support using electrical stimulation.

The control unit 11 comprises, as processing functions necessary to implement the embodiment, a screen sharing control processing unit 111, an EMG signal reception processing unit 112, an electrical stimulation control processing unit 113, and an EMG signal transmission processing unit 114. Each of these processing units 111 to 114 is realized by causing the hardware processor of the control unit 11 to execute an application program stored in the program storage unit 12.

With a communication link established with the user terminal UT, the screen sharing control processing unit 111 shares with the user terminal UT the display screen data used for operation support.

The EMG signal reception processing unit 112 receives the user's EMG signal transmitted from the user terminal UT via the communication I/F unit 15, and passes the received EMG signal to the electrical stimulation control processing unit 113.

In response to the user's EMG signal, the electrical stimulation control processing unit 113 generates an electrical stimulation control signal for producing in the operator's fingers the same movement as the user's finger movement represented by the EMG signal, and outputs the generated electrical stimulation control signal from the input/output I/F unit 14 to the electrical stimulation generator ED1.

The EMG signal transmission processing unit 114 receives the operator's EMG signal detected by the electromyographic sensor ES1 from the input/output I/F unit 14, and transmits the received EMG signal from the communication I/F unit 15 to the user terminal UT.

(3) User terminal UT
FIG. 3 is a block diagram showing an example of the hardware and software configuration of the user terminal UT.

The user terminal UT is a smartphone, like the operator terminal OT described above. An information terminal using a touch panel, such as a PDA, can also be used as the user terminal UT.

In the user terminal UT, a storage unit having a program storage unit 22 and a data storage unit 23, an input/output I/F unit 24, and a communication I/F unit 25 are connected via a bus to a control unit 21 using a hardware processor such as a central processing unit (CPU).

A touch panel type input/output device, in which a sheet-like input device 262 is arranged on the display surface of a display device 261, is connected to the input/output I/F unit 24.

The electromyographic sensor ES2 and the electrical stimulation generator ED2 are also connected to the input/output I/F unit 24. A detection electrode E21 is connected to the electromyographic sensor ES2, and a stimulus presentation electrode E22 is connected to the electrical stimulation generator ED2.

The communication I/F unit 25 transmits and receives information data to and from the operator terminal OT via the network NW in accordance with the communication protocol defined for the network NW.

The program storage unit 22 is configured by combining, for example, a non-volatile memory that can be written to and read from at any time, such as an SSD, and a non-volatile memory such as a ROM as storage media, and stores, in addition to middleware such as an OS, the application programs necessary to execute the various control processes according to the embodiment. Hereinafter, the OS and the application programs are collectively referred to as programs.

The data storage unit 23 combines, for example, a non-volatile memory that can be written to and read from at any time, such as an SSD, and a volatile memory such as a RAM as storage media, and is used to temporarily hold transmitted and received data, as well as data generated in the process of receiving operation support using electrical stimulation.

The control unit 21 comprises, as processing functions necessary to implement the embodiment, a screen sharing control processing unit 211, an EMG signal transmission processing unit 212, an EMG signal reception processing unit 213, and an electrical stimulation control processing unit 214. Each of these processing units 211 to 214 is realized by causing the hardware processor of the control unit 21 to execute an application program stored in the program storage unit 22.

With a communication link established with the operator terminal OT, the screen sharing control processing unit 211 shares with the operator terminal OT the display screen data used for receiving operation support.

The EMG signal transmission processing unit 212 receives the user's EMG signal detected by the electromyographic sensor ES2 from the input/output I/F unit 24, and transmits the received EMG signal from the communication I/F unit 25 to the operator terminal OT.

The EMG signal reception processing unit 213 receives the operator's EMG signal transmitted from the operator terminal OT via the communication I/F unit 25, and passes the received EMG signal to the electrical stimulation control processing unit 214.

In response to the operator's EMG signal, the electrical stimulation control processing unit 214 generates an electrical stimulation control signal for producing in the user's fingers the same movement as the operator's finger movement represented by the EMG signal, and outputs the generated electrical stimulation control signal from the input/output I/F unit 24 to the electrical stimulation generator ED2.

(Operation example)
Next, an operation example of the remote operation support system configured as described above will be described.

FIG. 4 is a flowchart showing an example of the processing procedure and processing contents of the operation support control executed by the control units 11 and 21 of the operator terminal OT and the user terminal UT of the remote operation support system.

(1) Advance preparation for operation support
To receive operation support on the user terminal UT, the user first puts a glove HE2 on the hand that will perform the operation. Likewise, the operator puts a glove HE1 on the operating hand. As illustrated in FIG. 1, the gloves HE1 and HE2 are integrally provided with the electromyographic sensors ES1 and ES2, and the electrical stimulation generators ED1 and ED2 are connected to them via signal cables.

The electromyographic sensors ES1, ES2 and the electrical stimulation generators ED1, ED2 may both be integrally installed in the gloves HE1, HE2, or may be connected by signal cables.

In addition, the electromyographic sensors ES1, ES2 and the electrical stimulation generators ED1, ED2 are paired in advance with the operator terminal OT and the user terminal UT so that they can communicate, for example, via Bluetooth.

(2) Application activation
When the user wishes to receive guidance from the operator on operating the smartphone, the user inputs an operation support request on the user terminal UT, for example by tapping an icon. Upon detecting the operation support request in step S10, the control unit 21 of the user terminal UT activates an application, executes processing for establishing a communication link with the operator terminal OT, and then transmits the operation support request to the operator terminal OT.

In response, the control unit 11 of the operator terminal OT establishes a communication link with the user terminal UT in response to the access from the user terminal UT, and upon receiving the operation support request in step S11, activates an application and notifies the operator accordingly.

(3) Trial operation by the user
In response to the user's trial operation, in step S12 the control unit 21 of the user terminal UT reads the display screen data to be operated on from the data storage unit 23, or downloads it from a website, and displays it on the display device 261. At the same time, under the control of the screen sharing control processing unit 211, it transmits the display screen data from the communication I/F unit 25 to the operator terminal OT.

In response, under the control of the screen sharing control processing unit 111, the control unit 11 of the operator terminal OT receives, in step S13, the display screen data transmitted from the user terminal UT via the communication I/F unit 15, and displays the received display screen data on the display device 161.

Map data, for example, is used as the display screen data, but image data such as photographs or document data may also be used.

Now, suppose that in this state the user moves the fingers wearing the glove HE2 on the input device 262 of the user terminal UT to perform a swipe operation on the display screen data. FIGS. 5A and 5B show, as examples of such operations, a pinch-in operation and a pinch-out operation using two fingers simultaneously. Other examples include operations using three fingers simultaneously, and drag, flick, and double-tap operations using one finger.
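For illustration, a two-finger gesture such as the pinch-in and pinch-out operations of FIGS. 5A and 5B can be distinguished by the change in distance between the two touch points. The function and threshold below are hypothetical, not part of the described system.

```python
import math

def classify_two_finger_gesture(start_pts, end_pts, threshold=10.0):
    """Classify a two-finger gesture as 'pinch-in', 'pinch-out', or 'none'
    from the change in distance (in pixels) between the two touch points.
    start_pts / end_pts are each a pair of (x, y) coordinates."""
    def dist(pts):
        (x1, y1), (x2, y2) = pts
        return math.hypot(x2 - x1, y2 - y1)

    delta = dist(end_pts) - dist(start_pts)
    if delta > threshold:
        return "pinch-out"   # fingers moved apart, e.g. zooming in on a map
    if delta < -threshold:
        return "pinch-in"    # fingers moved together, e.g. zooming out
    return "none"
```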

Upon detecting the user's trial operation in step S14, the control unit 21 of the user terminal UT acquires, in step S15 under the control of the EMG signal transmission processing unit 212 and via the input/output I/F unit 24, the EMG signal (including the identification information of the detecting electrode set) detected by the electromyographic sensor ES2 and corresponding to the finger movement accompanying the trial operation. The EMG signal transmission processing unit 212 then transmits the acquired EMG signal of the user from the communication I/F unit 25 to the operator terminal OT in step S16. At the same time, under the control of the screen sharing control processing unit 211, the control unit 21 of the user terminal UT transmits the operated display screen data from the communication I/F unit 25 to the operator terminal OT.

In response, when the control unit 11 of the operator terminal OT, under the control of the screen sharing control processing unit 111, receives in step S17 the display screen data transmitted from the user terminal UT via the communication I/F unit 15, it updates, in step S18, the display screen data displayed on the display device 161 to the received display screen data.

At the same time, the control unit 11 of the operator terminal OT, under the control of the EMG signal reception processing unit 112, receives in step S17 the user's EMG signal transmitted from the user terminal UT via the communication I/F unit 15.

Upon receiving the user's EMG signal, the control unit 11 of the operator terminal OT, under the control of the electrical stimulation control processing unit 113, generates in step S19 an electrical stimulation control signal for producing in the operator's fingers the same movement as the user's finger movement represented by the EMG signal. This electrical stimulation control signal includes values specifying the strength (amplitude) and application time of the electrical stimulation, and an identification signal specifying the electrode set corresponding to the finger to which the electrical stimulation is to be applied. The electrical stimulation control processing unit 113 then outputs the generated electrical stimulation control signal from the input/output I/F unit 14 to the electrical stimulation generator ED1.
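The derivation of an electrical stimulation control signal from a received EMG signal might look like the following sketch. The mean-absolute-value feature and the linear amplitude mapping are illustrative assumptions; the document specifies only that the control signal carries the stimulation strength (amplitude), application time, and electrode-set identification.

```python
def make_stimulation_control_signal(emg_frame):
    """Derive a stimulation control signal from a received EMG frame.
    The output fields mirror what the text says the control signal carries;
    the feature extraction and scaling rule are hypothetical placeholders."""
    samples = emg_frame["samples_uv"]
    # Mean absolute value (MAV) is a common EMG intensity feature.
    mav = sum(abs(s) for s in samples) / len(samples)
    # Hypothetical linear mapping from EMG intensity (microvolts) to
    # stimulation amplitude (mA), clamped to a safe range.
    amplitude_ma = min(10.0, max(0.0, 0.05 * mav))
    return {
        # Drive the electrode set for the same finger on the receiving side.
        "electrode_set_id": emg_frame["electrode_set_id"],
        "amplitude_ma": amplitude_ma,
        "duration_ms": emg_frame.get("duration_ms", 200),
    }
```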

As a result, the electrical stimulation generator ED1 generates an electrical stimulation signal of the strength specified by the electrical stimulation control signal for the specified length of time, and the generated signal is applied to the set of stimulus presentation electrodes corresponding to the finger to be moved. Electrical stimulation is thus applied to the muscles that move the operator's finger corresponding to the finger the user used in the trial operation, causing the operator's finger to move. In other words, the movement of the user's fingers is transferred to the operator's fingers. The operator can therefore recognize the finger movements accompanying the user's trial operation through the movement of their own fingers, and can directly confirm the points of the user's trial operation that should be corrected.

(4) Support for the user's operation
Referring to what was confirmed about the user's finger movements, the operator performs a model operation, for example a pinch-in or pinch-out operation, on the display screen of the operator terminal OT. Upon detecting the model operation in step S20, the control unit 11 of the operator terminal OT acquires, in step S21 under the control of the EMG signal transmission processing unit 114 and via the input/output I/F unit 14, the EMG signal (including the identification information of the detecting electrode set) detected by the electromyographic sensor ES1 and corresponding to the finger movement accompanying the model operation. The EMG signal transmission processing unit 114 then transmits the acquired EMG signal of the operator from the communication I/F unit 15 to the user terminal UT in step S22.

In response, the control unit 21 of the user terminal UT, under the control of the EMG signal reception processing unit 213, receives in step S23 the operator's EMG signal transmitted from the operator terminal OT via the communication I/F unit 25.

 Upon receiving the operator's myoelectric signal, the control unit 21 of the user terminal UT, under the control of the electrical stimulation control processing unit 214, generates in step S24 an electrical stimulation control signal for causing the user's finger to make the same movement as the operator's finger movement represented by the myoelectric signal. This electrical stimulation control signal includes values specifying the strength (amplitude) and application time of the electrical stimulation, and an identification signal specifying the electrode set corresponding to the finger to which the stimulation is to be applied. The electrical stimulation control processing unit 214 then outputs the generated electrical stimulation control signal from the input/output I/F unit 24 to the electrical stimulation generator ED2.
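As an illustration of step S24, the following sketch derives a stimulation control signal (amplitude, application time, electrode-set ID) from a received myoelectric burst. The envelope calculation, gain, safety clamp, and all names are illustrative assumptions; the specification does not define a concrete conversion rule.

```python
# Hypothetical sketch: map the mean rectified EMG amplitude to a bounded
# stimulation current, and the burst length to the application time.

def emg_to_stimulation(emg_samples, electrode_set_id,
                       sample_interval_ms=10, gain=5.0, max_mA=4.0):
    """Convert a received EMG burst into stimulation drive parameters."""
    envelope = sum(abs(s) for s in emg_samples) / len(emg_samples)
    amplitude = min(gain * envelope, max_mA)   # clamp for safety
    duration = len(emg_samples) * sample_interval_ms
    return {
        "electrode_set": electrode_set_id,
        "amplitude_mA": round(amplitude, 2),
        "duration_ms": duration,
    }

ctrl = emg_to_stimulation([0.1, 0.4, 0.3, 0.2], "E22-2")
```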

 As a result, the electrical stimulation generator ED2 generates an electrical stimulation signal of the strength specified by the electrical stimulation control signal for the specified length of time, and the generated signal is applied to the set of stimulation presentation electrodes corresponding to the finger to be moved. Electrical stimulation is thus applied to the muscles that move the finger of the user corresponding to the finger the operator used for the operation, causing the user's finger to move. In other words, the operator's model finger movement is transferred to the user's finger. The user can therefore directly perceive the finger movement of the operator's model operation as the movement of his or her own finger.

 When the display screen is operated in response to the user's finger movements, the display screen data after the operation is transmitted from the communication I/F unit 25 to the operator terminal OT in step S25 under the control of the screen sharing control processing unit 211.

 In response, under the control of the screen sharing control processing unit 111, the control unit 11 of the operator terminal OT receives the display screen data transmitted from the user terminal UT via the communication I/F unit 15 in step S26, and in step S27 outputs the received display screen data from the input/output I/F unit 14 to the display device 161 for display.

 The operator can therefore check the result of the user's operation by looking at the display screen data.

 (5) Determination of the end of assistance control
 In step S28, the control unit 21 of the user terminal UT monitors whether the user performs an operation to end the operation assistance. If no such ending operation is performed, the process returns to step S14 and repeats the series of operation assistance processes in steps S14 to S28. When a request to end the operation assistance is detected, the control unit 21 transmits the operation assistance end request from the communication I/F unit 25 to the operator terminal OT, and then closes the application to end the operation assistance process.
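The monitoring loop of steps S28 and S29 can be sketched as follows. The event names and message strings are illustrative assumptions; the specification describes only the behavior, not a concrete implementation.

```python
# Hypothetical sketch: the user terminal repeats the assistance loop
# until an end request is raised, then notifies the operator terminal.

def run_assistance(events):
    """Consume UI events; return the messages sent to the operator
    terminal, ending when an 'end_assist' event arrives."""
    sent = []
    for event in events:
        if event == "end_assist":
            sent.append("assist_end_request")   # forwarded to OT, app closes
            break
        sent.append(f"screen_update:{event}")   # normal assistance traffic
    return sent

log = run_assistance(["pinch_in", "swipe", "end_assist", "tap"])
```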

 In response, in step S29, the control unit 11 of the operator terminal OT monitors for receipt of the operation assistance end request transmitted from the user terminal UT. If the operation assistance end request is not received, the process returns to step S17, and the operation assistance control in steps S17 to S29 continues. On the other hand, when the operation assistance end request is received, the control unit 11 of the operator terminal OT notifies the operator with a message to that effect and closes the application to end the operation assistance.

 (Action and effects)
 As described above, in one embodiment, the finger movement accompanying, for example, the operator's model swipe operation is detected by the myoelectric sensor ES1, and the detected myoelectric signal is transmitted from the operator terminal OT to the user terminal UT. Upon receiving the myoelectric signal, the user terminal UT generates an electrical stimulation control signal for causing the user's finger to make the same movement as the operator's model finger movement represented by the myoelectric signal, and supplies it to the electrical stimulation generator ED2, which applies electrical stimulation to the user's finger to move it.

 Therefore, for users who are unfamiliar with smartphone operation, or who have difficulty moving their fingers freely due to a disability or other factors, the operator's model operation can be conveyed directly as finger movement. This allows such users to reliably perform operations that are relatively hard to grasp, such as swipe operations.

 In one embodiment, prior to providing operation assistance to the user, the user first performs an operation, and a myoelectric signal representing the finger movement at that time is transmitted from the user terminal UT to the operator terminal OT. The operator terminal OT then generates an electrical stimulation control signal based on the myoelectric signal and applies electrical stimulation to the operator's finger, thereby conveying the user's finger movement directly to the operator.

 As a result, the operator can grasp what needs to be corrected in the user's finger movement before performing the model operation. This makes it possible to provide more accurate and efficient support for smartphone operation based on the characteristics of each user's finger movement.

 [Other embodiments]
 (1) In the embodiment above, a myoelectric signal is detected as the motion information reflecting the finger movement accompanying the operator's model operation. Instead, for example, the fingertip position and its acceleration along multiple axes may be detected, and the user's finger movement may be controlled based on the detected fingertip position and acceleration. In this case, an actuator or the like can be used to control the user's finger movement.
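This alternative can be illustrated with a minimal kinematic sketch: predicting the next fingertip position per axis from the detected position and acceleration, which an actuator controller could then track. The constant-acceleration step, the time step `dt`, and the function name are assumptions; the specification names no control law.

```python
# Hypothetical sketch of the alternative embodiment: compute an actuator
# target from the detected fingertip position and per-axis acceleration.

def actuator_target(position, acceleration, dt=0.02):
    """One constant-acceleration prediction step per axis:
    x' = x + (1/2) * a * dt^2  (initial velocity assumed zero)."""
    return tuple(p + 0.5 * a * dt * dt for p, a in zip(position, acceleration))

target = actuator_target((10.0, 5.0, 0.0), (0.0, 2.0, -1.0))
```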

 (2) In the embodiment above, a myoelectric signal is detected as the motion information reflecting the finger movement accompanying the user's operation. In addition, for example, electrodermal activity may be detected. In this case, the degree to which the user struggles with the operation is estimated from the detected electrodermal activity and the result is presented to the operator, allowing the operator to provide finer-grained support, for example by adjusting the speed of the model operation.
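One possible estimation rule is sketched below: count skin-conductance rises above a baseline and map the rate to a coarse difficulty label for the operator. The thresholds, labels, and function name are illustrative assumptions; the specification does not specify an estimation method.

```python
# Hypothetical sketch: estimate how much the user is struggling from
# electrodermal activity (EDA) samples.

def difficulty_from_eda(eda_samples, baseline, threshold=0.05):
    """Count samples exceeding baseline + threshold and map the peak
    rate to a coarse difficulty label presented to the operator."""
    peaks = sum(1 for s in eda_samples if s > baseline + threshold)
    rate = peaks / len(eda_samples)
    if rate > 0.5:
        return "high"
    if rate > 0.2:
        return "medium"
    return "low"

label = difficulty_from_eda([0.31, 0.42, 0.45, 0.30], baseline=0.30)
```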

 (3) In addition, the types and functional configurations of the operator terminal and the user terminal, their processing procedures and contents, the types of operations to be supported, and the like may be modified in various ways without departing from the gist of this invention.

 Although an embodiment of this invention has been described in detail above, the foregoing description is in all respects merely an illustration of this invention. It goes without saying that various improvements and modifications can be made without departing from the scope of this invention. In other words, in implementing this invention, a specific configuration according to the embodiment may be adopted as appropriate.

 In short, this invention is not limited to the above embodiment as it is; in the implementation stage, the components can be modified and embodied without departing from the gist of the invention. Various inventions can also be formed by appropriately combining the plurality of components disclosed in the above embodiment. For example, some components may be deleted from all the components shown in the embodiment, and components from different embodiments may be combined as appropriate.

 UT…User terminal
 OT…Operator terminal
 NW…Network
 ES1, ES2…Myoelectric sensors
 E11, E21…Detection electrodes
 ED1, ED2…Electrical stimulation generators
 E12, E22…Stimulation presentation electrodes
 11, 21…Control units
 12, 22…Program storage units
 13, 23…Data storage units
 14, 24…Input/output I/F units
 15, 25…Communication I/F units
 161, 261…Display devices
 162, 262…Input devices
 111, 211…Screen sharing control processing units
 112, 213…Myoelectric signal reception processing units
 113, 214…Electrical stimulation control processing units
 114, 212…Myoelectric signal transmission processing units

Claims (7)

1. A remote operation support system that connects a user terminal and an operator terminal via a network and transmits an operator's operation to a user while sharing a display screen, the system comprising:
 a first sensor unit that detects operator operation information reflecting a movement of the operator's fingers;
 a transmission processing unit provided in the operator terminal and configured to transmit the detected operator operation information to the user terminal via the network;
 a reception processing unit provided in the user terminal and configured to receive the operator operation information transmitted from the operator terminal;
 a stimulus control unit provided in the user terminal and configured to generate, based on the received operator operation information, a user stimulus control signal for causing the user's fingers to make the same movement as the movement of the operator's fingers; and
 a first stimulus presentation unit that presents a stimulus to a body part related to the movement of the user's fingers in response to the generated user stimulus control signal.

2. The remote operation support system according to claim 1, further comprising:
 a second sensor unit that detects user operation information reflecting a movement of the user's fingers;
 a second transmission processing unit provided in the user terminal and configured to transmit the detected user operation information to the operator terminal via the network;
 a second reception processing unit provided in the operator terminal and configured to receive the user operation information transmitted from the user terminal;
 a second stimulus control unit provided in the operator terminal and configured to generate, based on the received user operation information, an operator stimulus control signal for causing the operator's fingers to make the same movement as the movement of the user's fingers; and
 a second stimulus presentation unit that presents a stimulus to a body part related to the movement of the operator's fingers in response to the generated operator stimulus control signal.

3. The remote operation support system according to claim 1, wherein
 the first sensor unit detects, as the operator operation information, a myoelectric signal reflecting a movement of the operator's fingers, and
 the first stimulus presentation unit applies electrical stimulation to a muscle site that moves the user's fingers.

4. The remote operation support system according to claim 2, wherein
 the second sensor unit detects, as the user operation information, a myoelectric signal reflecting a movement of the user's fingers, and
 the second stimulus presentation unit applies electrical stimulation to a muscle site that moves the operator's fingers.

5. The remote operation support system according to claim 1, wherein the first sensor unit detects the operator operation information reflecting a movement of the operator's fingers associated with a swipe operation.

6. The remote operation support system according to claim 2, wherein the second sensor unit detects the user operation information reflecting a movement of the user's fingers associated with a swipe operation.

7. A remote operation support method executed by a system that connects a user terminal and an operator terminal via a network and transmits an operator's operation to a user while sharing a display screen, the method comprising:
 detecting motion information reflecting a movement of the operator's fingers;
 transmitting the detected motion information from the operator terminal to the user terminal via the network;
 receiving, at the user terminal, the motion information transmitted from the operator terminal;
 generating, based on the received motion information, a stimulation control signal for causing the user's fingers to make the same movement as the movement of the operator's fingers; and
 presenting a stimulus to a body part related to the movement of the user's fingers in response to the generated stimulation control signal.
PCT/JP2023/024474 2023-06-30 2023-06-30 Remote operation assistance system and method Pending WO2025004362A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/024474 WO2025004362A1 (en) 2023-06-30 2023-06-30 Remote operation assistance system and method


Publications (1)

Publication Number Publication Date
WO2025004362A1 true WO2025004362A1 (en) 2025-01-02

Family

ID=93938192

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/024474 Pending WO2025004362A1 (en) 2023-06-30 2023-06-30 Remote operation assistance system and method

Country Status (1)

Country Link
WO (1) WO2025004362A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003500128A (en) * 1999-05-21 2003-01-07 チャールズ クック,マイケル Computer game feedback assembly
JP2007136041A (en) * 2005-11-22 2007-06-07 Tokyo Institute Of Technology Learning support device, learning support method, virtual human interface device, virtual human interface method, virtual human interface system, program for realizing these devices, and recording medium
JP2018187421A (en) * 2013-09-09 2018-11-29 イマージョン コーポレーションImmersion Corporation Electrical Stimulation Haptic Feedback Interface



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23943750

Country of ref document: EP

Kind code of ref document: A1