WO2022009338A1 - Information Processing Terminal, Remote Control Method, and Program - Google Patents
- Publication number
- WO2022009338A1 (PCT/JP2020/026710)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- controlled device
- control
- information processing
- processing terminal
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72415—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q9/00—Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/90—Additional features
- G08C2201/93—Remote control using other portable devices, e.g. mobile phone, PDA, laptop
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/70—Services for machine-to-machine communication [M2M] or machine type communication [MTC]
Definitions
- the present invention relates to a remote control technique for performing remote control using a mobile information terminal.
- HMD head-mounted display
- On the other hand, the number of devices operated by remote controllers (hereinafter "remote controls") is increasing. In response, learning remote controls that consolidate various remote control functions (by storing each of them) and smart remote controls (which manage device operations via Internet communication such as Wi-Fi or wireless LAN) have become common. For example, Patent Document 1 discloses a remote control system in which a directional general-purpose remote controller is pointed at a target remote-controlled device to establish a wireless link, the general-purpose remote controller receives an operation application transmitted from the remote-controlled device, and the remote-controlled device is then operated with the general-purpose remote controller.
- the device can be selected by simply pointing the general-purpose remote controller at the device to be remotely controlled, and the selected device can be remotely controlled using the general-purpose remote controller.
- ready-made remote controllers do not necessarily have a user-friendly interface: they may have buttons for functions that are rarely used, and the arrangement of the function buttons may not match the user's preference.
- the trouble of searching for the remote controller itself is not eliminated.
- the present invention has been made in view of the above circumstances, and an object of the present invention is to provide a remote control technology that eliminates the trouble of searching for a remote controller and further improves user convenience.
- the present invention is an information processing terminal provided with a communication device, which remotely controls a controlled device by transmitting a control command to the controlled device via the communication device. It is characterized by including: a virtual remote controller generation unit that identifies, from an image of the surroundings of the information processing terminal, the controlled device and a desired control operation, and generates a reception object that accepts a user's operation instruction for the identified desired control operation; an operation reception unit that receives the user's operation instruction via the reception object; and a command transmission unit that transmits, to the controlled device, the control command corresponding to the operation instruction received by the operation reception unit.
- according to the present invention, a remote control technology can be provided that eliminates the trouble of searching for a remote controller and further improves user convenience. Problems, configurations, and effects other than those described above will be clarified by the description of the following embodiments.
- This embodiment is intended, for example, for elderly users and others who easily misplace a remote controller (remote control device). Such users not only misplace their remote controls easily, but also often find the many buttons on off-the-shelf remote controls bothersome.
- In the present embodiment, to solve these problems, the user simply writes the minimum necessary control operations by hand or the like on a suitable sheet of paper, which can then be used as if it were a real remote controller.
- FIG. 1 is a diagram for explaining an outline of the present embodiment.
- the user 901 wearing the HMD (head-mounted display) 100 prepares a sheet of paper or the like on which the name of the device to be operated (the controlled device) and the desired control operation are written.
- hereinafter, the paper or the like on which this is written is referred to as the presentation surface 300,
- and the written information is referred to as the description information 310.
- the HMD 100 generates the remote control function of the controlled device as a virtual remote controller image 400 from the description information 310 on the presentation surface 300 seen by the user 901 through the display 131. The user 901's operation on the presentation surface 300 is then accepted as an operation on the virtual remote controller image 400, and the controlled device is operated from the HMD 100.
- the virtual remote controller image 400 is generated by the HMD 100 reading the description information 310 on the presentation surface 300 and associating it with the control command of the controlled device.
- the generated virtual remote control image 400 may be displayed on the display 131 of the HMD 100.
- the desired control operation described on the presentation surface 300 may be displayed on the display 131 of the HMD 100 as a function button (reception object) when it is recognized.
- the user 901 performs an operation such as a gesture to give an instruction to the function button, thereby controlling the controlled device.
- the presentation surface 300 may be, for example, a sheet-like member such as the above-mentioned paper. Further, the description information 310 may be described by handwriting. However, the presentation surface 300 and the description information 310 are not limited to this.
- the user 901 can operate the controlled device as if the presentation surface 300 prepared by the user is used as a remote controller.
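As a minimal sketch of the flow described above — recognized description information 310 is matched against a control-command table to produce reception objects — the generation step might look like the following. All names, the table contents, and the recognition result are hypothetical; actual recognition would come from the out-camera and character recognition.

```python
from dataclasses import dataclass

# Hypothetical control-command table: device name -> operation -> command string.
CONTROL_DATA = {
    "air conditioner": {"on": "AC_POWER_ON", "off": "AC_POWER_OFF"},
}

@dataclass
class ReceptionObject:
    """A virtual function button generated from one handwritten entry."""
    label: str
    command: str

def generate_virtual_remote(description_info: list[str]):
    """Map recognized handwriting (device name followed by desired
    operations) to reception objects, as the virtual remote controller
    generation unit is described to do."""
    device = description_info[0].lower()
    buttons = []
    for word in description_info[1:]:
        cmd = CONTROL_DATA.get(device, {}).get(word.lower())
        if cmd:
            buttons.append(ReceptionObject(label=word, command=cmd))
    return device, buttons

# The user wrote "Air conditioner", "On", "Off" on the presentation surface:
device, buttons = generate_virtual_remote(["Air conditioner", "On", "Off"])
print(device, [b.command for b in buttons])
```

Each `ReceptionObject` would then be rendered as a function button on the display 131 and wired to the operation reception unit.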
- FIG. 2 is a system configuration diagram of the remote control system 500 of the present embodiment.
- the remote control system 500 of the present embodiment includes an HMD 100, a controlled device 510, a wireless router 520, and a network server 550.
- the wireless router 520 is connected to the network server 550 via the network 530.
- the HMD 100 is connected to the network server 550 via the wireless router 520.
- the controlled device 510 is connected to the wireless router 520 by a wired LAN or a wireless LAN.
- the HMD 100 transmits a control command to the controlled device 510 via the wireless router 520.
- the user 901 wearing the HMD 100 can control the controlled device 510 by remote control.
- the network server 550 is an external server that stores control data for controlling the controlled device 510.
- the network server 550 transmits the control data of the controlled device 510, requested by the HMD 100 via the wireless router 520 and the network 530, back to the requesting HMD 100 via the network 530 and the wireless router 520.
- although one network server 550 is shown in FIG. 2, the number of network servers is not limited to this.
- for example, each manufacturer of the controlled device 510 may provide its own network server 550.
- the network server 550 is not always necessary. After the HMD 100 acquires the necessary information from the network server 550, the controlled device 510 is controlled by communication between the HMD 100, the wireless router 520, and the controlled device 510.
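The two-phase communication pattern described here — fetch control data from the network server 550 once, then command the controlled device 510 directly through the wireless router 520 — can be sketched as follows. The JSON layout, model name, and command codes are invented for illustration; the server response is simulated by a canned string so the sketch is self-contained.

```python
import json

# Stand-in for the network server 550's response: control data for one
# device model (this layout is an assumption, not from the patent).
SERVER_RESPONSE = json.dumps({
    "model": "AC-1234",
    "commands": {"power_on": "0x01", "power_off": "0x02", "temp_up": "0x03"},
})

def fetch_control_data(raw_response: str) -> dict:
    """Phase 1: the HMD 100 obtains control data from the network server
    550 via the wireless router 520 and the network 530 (here simulated
    by parsing a canned response)."""
    return json.loads(raw_response)

def build_command_packet(control_data: dict, operation: str) -> bytes:
    """Phase 2: with the control data cached on the HMD, commands go
    directly to the controlled device 510 through the wireless router
    520; the server is no longer needed."""
    code = control_data["commands"][operation]
    return f'{control_data["model"]}:{code}'.encode()

data = fetch_control_data(SERVER_RESPONSE)
packet = build_command_packet(data, "power_on")
print(packet)  # b'AC-1234:0x01'
```

Caching the control data locally is what makes the server optional after the first lookup, matching the bullet above.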
- the controlled device 510 is a device that the user 901 wearing the HMD 100 wants to control with a remote controller.
- examples include home appliances such as air conditioners, lighting equipment, TVs, electric fans, gas water heaters, and floor heating, and other devices to which HEMS (Home Energy Management System), a household energy management system, can be applied.
- the controlled device 510 will be described by taking an air conditioner as an example.
- the air conditioner is assumed to be a household room air conditioner.
- the HMD 100 is a device that the user 901 wears on the head and displays the processed information on the display 131.
- the HMD 100 of the present embodiment has the shape of spectacles or goggles, and includes a display 131 at the lens position of the spectacles.
- the display 131 may be a transmissive type or a non-transparent type.
- the user 901 can observe the situation in the real space through the display 131.
- an augmented reality (AR) object is displayed on the display 131. Therefore, the user 901 can simultaneously visually recognize both the AR object displayed on the display 131 and the situation in the real space.
- FIG. 3 is a block diagram showing a configuration example of the HMD 100 of the present embodiment.
- the same components as those shown in FIGS. 1 and 2 are designated by the same reference numerals.
- the HMD 100 of the present embodiment includes a main processor 111, a system bus 112, a storage device 110, an operation reception device 120, an image processing device 130, a voice processing device 140, a sensor 150, a communication device 160, an expansion interface (I/F) unit 105, and a timer 106.
- the main processor 111 is a main control unit that controls the entire HMD 100 according to a predetermined program.
- the main processor 111 is realized by a CPU (Central Processing Unit) or a microprocessor unit (MPU).
- the main processor 111 performs the operation control processing of the entire HMD 100 by executing a program such as an operating system (Operating System: OS) stored in the storage device 110 and various operation control applications. Further, the main processor 111 controls the activation operation of various applications.
- the main processor 111 performs processing according to a clock signal measured and output by the timer 106.
- the system bus 112 is a data communication path for transmitting / receiving data between the main processor 111 and each part in the HMD 100.
- the storage device 110 includes a RAM 113, a ROM 114, and a flash memory 115.
- the storage device 110 stores programs such as the operating system and various applications that handle, for example, music, images, and documents.
- it also stores information data such as base data required for basic operations of the operating system and file data used by the various applications. For example, when a music application installed in the HMD 100 is activated and music file data is selected, the selected music file data is played back in the HMD 100, and the desired music can be heard.
- RAM 113 is a program area for executing basic operation programs and other application programs. Further, the RAM 113 is a temporary storage area for temporarily holding data as needed when executing various application programs. The RAM 113 may be integrated with the main processor 111.
- the ROM 114 and the flash memory 115 store various programs for realizing the functions of the HMD 100, sensor information including operation setting values and detection values from the sensor 150 described later, and various display data such as virtual objects and contents.
- the flash memory 115 stores an operation program downloaded from the network 530 and various data created by the operation program. Each operation program stored in the flash memory 115 can be updated and expanded by a download process from each server device on the network 530.
- the flash memory 115 can store contents such as moving images, still images, and sounds downloaded from the network 530. It also stores data such as moving images and still images taken by the in-camera 134 or the out-camera 133.
- the ROM 114 and the flash memory 115 are so-called non-volatile storages that hold the stored information even when the HMD 100 is not supplied with power from the outside.
- the HMD 100 can realize various functions by the main processor 111 expanding and executing a new application program stored in the built-in memory storage in the RAM 113.
- the built-in memory storage needs to hold the stored information even when the power is not supplied to the HMD 100. Therefore, for the built-in memory storage, for example, a device such as a flash ROM, an SSD (Solid State Drive), or an HDD (Hard Disk Drive) is used.
- the operation receiving device 120 receives an input of an operation instruction to the HMD 100.
- a button switch 121 and a touch panel 122 are provided.
- the button switch 121 is a power key, a volume key, a home key, or the like.
- the touch panel 122 receives an operation instruction by the touch pad.
- the HMD 100 of the present embodiment does not necessarily have to include all of these operation reception devices 120. Further, the operation of the HMD 100 may be accepted via another information processing terminal device connected by wired communication or wireless communication.
- the operation receiving device 120 may be provided at a position or in a form on the HMD 100 where the user 901 can easily perform an input operation. It may also be separated from the main body of the HMD 100 and connected by wire or wirelessly. The line of sight of the user 901 may also be used: for example, an input operation screen is displayed on the display 131, and input operation information is captured from the position on the input operation screen at which the line of sight of the user 901, detected by the in-camera 134 described later, is directed. Alternatively, a pointer may be displayed on the input operation screen and operated to capture the input operation information. The user 901 may also utter a voice indicating an input operation, which is collected by a microphone 143 described later to capture the input operation information.
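Mapping a detected line-of-sight position to an element of the input operation screen amounts to a simple hit test. The sketch below assumes rectangular screen elements; the element names and coordinates are hypothetical.

```python
def hit_test(gaze_xy, elements):
    """Return the name of the screen element the gaze point falls on,
    or None if it hits none. elements: list of (name, (x, y, w, h))."""
    gx, gy = gaze_xy
    for name, (x, y, w, h) in elements:
        if x <= gx < x + w and y <= gy < y + h:
            return name
    return None

# Two function buttons laid out on the input operation screen:
screen = [("power", (0, 0, 100, 50)), ("temp_up", (0, 60, 100, 50))]
print(hit_test((30, 75), screen))  # temp_up
```

The same hit test would serve the pointer-based variant, with the pointer position substituted for the gaze point.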
- the image processing device 130 is an image (video) processor and includes a display 131, an out-camera 133, and an in-camera 134.
- the display 131 is a display device (display) such as a liquid crystal panel, and presents the image data processed by the image signal processing unit to the user of the HMD 100.
- the display 131 includes a display for the left eye and a display for the right eye.
- the display 131 may be a transmissive (optical transmissive) display or a non-transmissive (video transmissive) display.
- the optical transmissive display is equipped with a projection unit that projects various information such as playback information by the startup application and notification information to the user 901, and a transparent half mirror that displays various projected information in an image in front of the eyes.
- the video transmissive display includes a liquid crystal panel or the like that displays various information together with an image of the real-space object in front of the eyes taken by the out-camera 133. Through the display 131, the user 901 visually recognizes reproduced information such as music, images, and documents from an activated application in addition to the image of the field of view in front of the user.
- the image signal processing unit is an image (video) signal processor that processes images input from the out-camera 133 and the in-camera 134. Further, the image signal processing unit superimposes the object created by the main processor 111 or the like on the input image and outputs the object to the display 131.
- the image signal processing unit may be realized by the main processor 111, or may be realized by providing a processor dedicated to the image separately from the main processor 111.
- an image or the like imitating a remote control screen is displayed as an AR object on the display 131.
- the out-camera 133 and the in-camera 134 are cameras that input image data of the surroundings or of an object by converting the light entering from the lens into an electric signal using an imaging element such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor.
- the out-camera 133 acquires an image around the HMD 100.
- the out-camera 133 is installed in front of the HMD 100 and captures the front view field state.
- depending on the type of the display 131, the HMD 100 may be of the optical transmissive type or the video transmissive type; in both cases, the out-camera 133 captures the field-of-view state in front of the eyes.
- the in-camera 134 acquires an image in a region different from that of the out-camera 133. For example, the eyes of the user 901 are photographed.
- the in-camera 134 functions as a line-of-sight detection sensor together with a processing unit that performs line-of-sight identification processing.
- the line-of-sight detection sensor includes a right line-of-sight detection unit and a left line-of-sight detection unit, and detects the line of sight of the right eye and the left eye, respectively.
- the process of detecting the line of sight may be performed by using a well-known technique generally used as an eye tracking process.
- for example, the face is irradiated with an infrared LED (Light Emitting Diode) and photographed with an infrared camera; the position of the reflected light produced on the cornea by the infrared LED irradiation (the corneal reflex) is used as a reference point, and the line of sight is detected from the position of the pupil relative to the corneal reflex. For this purpose, an infrared camera and an infrared LED are provided.
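The pupil-relative-to-corneal-reflex scheme can be illustrated with a toy calculation: the gaze estimate is a scaled version of the pupil's offset from the glint. The linear gains stand in for a per-user calibration, and all numbers are illustrative, not from the patent.

```python
def estimate_gaze(pupil_xy, glint_xy, ax=0.5, ay=0.5):
    """Estimate gaze direction (e.g., in degrees) from the pupil center's
    offset relative to the corneal reflex (glint), the reference point.
    ax/ay are calibration gains (made-up values here)."""
    dx = pupil_xy[0] - glint_xy[0]
    dy = pupil_xy[1] - glint_xy[1]
    return (ax * dx, ay * dy)

# Pupil 2 px to the right of the glint -> gaze deflected to the right:
print(estimate_gaze((12.0, 8.0), (10.0, 8.0)))  # (1.0, 0.0)
```

Real eye trackers fit a richer mapping (and often use both eyes, as the left and right line-of-sight detection units suggest), but the reference-point idea is the same.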
- the voice processing device 140 is an audio processor that processes voice. It includes a speaker 141 and a microphone 143.
- the speaker 141 outputs the audio signal processed by the audio signal processing unit to the outside.
- the audio signal processing unit is an audio signal processor.
- in the HMD 100, for example, headphones or the like are used as the speaker 141.
- the microphone 143 converts the voice of the user, etc. into voice data and inputs it.
- a microphone for ambient sound and a microphone for the user's voice may be provided; these collect external sounds and the user 901's own utterances, respectively. Headphones may also be connected; worn on the ears of the user 901, they present audio to the user 901.
- the sensor 150 is a group of sensors for detecting the state of the HMD 100.
- in the present embodiment, a GPS (Global Positioning System) receiving unit 151, a gyro sensor 152, a geomagnetic sensor 153, and an acceleration sensor 154 detect the position, movement, tilt, direction, and the like of the HMD 100, and a depth sensor 155 acquires distance information from the HMD 100 to an object.
- other sensors may be further provided.
- the acceleration sensor 154 is a sensor that detects acceleration, which is a change in speed per unit time, and can capture movement, vibration, impact, and the like.
- the gyro sensor 152 is a sensor that detects the angular velocity in the rotation direction, and can capture the state of the vertical, horizontal, and diagonal postures. The movement of the HMD 100 can be detected by using the acceleration sensor 154 and the gyro sensor 152 mounted on the HMD 100.
- the geomagnetic sensor 153 is a sensor that detects the magnetic force of the earth, and detects the direction in which the HMD 100 is facing.
- the movement of the HMD 100 may also be detected by capturing the change in geomagnetism caused by the movement of the HMD 100, using a three-axis type sensor that detects geomagnetism in the vertical direction in addition to the front-back and left-right directions.
- the depth sensor 155 is a sensor that can capture the shape of an object such as a person or an object as a three-dimensional object.
- for the depth sensor 155, for example, LiDAR (Light Detection and Ranging), a TOF (Time of Flight) sensor, or a millimeter-wave radar that detects the state of an object is used.
- the communication device 160 is a communication interface that performs wireless communication with other information terminals in the vicinity by short-range wireless communication, wireless LAN, or base station communication. In wireless communication, data is transmitted and received via the transmission / reception antenna.
- the communication device 160 includes a LAN (Local Area Network) communication interface (I/F) 161, a telephone network communication I/F 162, a BT (Bluetooth®) communication I/F 163, an infrared communication I/F 164, and the like.
- the LAN communication I / F161 is connected to a network 530 such as the Internet via a wireless access point or the like, and transmits / receives data to / from each server on the network 530.
- the telephone network communication I / F 162 is connected to a communication network using a mobile telephone communication network, and transmits / receives information to / from a server on the communication network.
- the communication methods used include, for example, the GSM (registered trademark) (Global System for Mobile Communications) method, the W-CDMA (Wideband Code Division Multiple Access) method, the CDMA2000 method, and the UMTS (Universal Mobile Telecommunications System) method.
- mobile communication systems of the third generation (3G) and later, such as LTE (Long Term Evolution) and 5G (5th generation), may also be used.
- the BT communication I / F163 is an interface for communicating with an external device according to the Bluetooth standard.
- the infrared communication I / F164 is an interface for communicating with an external device by infrared rays.
- short-range communication such as IrDA (Infrared Data Association, registered trademark), Zigbee (registered trademark), or HomeRF (Home Radio Frequency, registered trademark) may also be used.
- it may be performed using a wireless LAN such as Wi-Fi (registered trademark).
- UWB (Ultra Wide Band) may also be used.
- the LAN communication I/F 161, the telephone network communication I/F 162, the BT communication I/F 163, and the infrared communication I/F 164 each include an encoding circuit, a decoding circuit, an antenna, and the like.
- the communication device 160 may use other methods, such as optical communication or sound-wave communication, as a means of wireless communication. In that case, a light emitting/receiving unit or a sound wave output/input unit is used instead of the transmitting/receiving antenna, respectively.
- the expansion interface unit 105 is a group of interfaces for expanding the functions of the HMD 100; in the present embodiment, it includes a charging terminal, a video/audio interface, a USB (Universal Serial Bus) interface, a memory interface, and the like.
- the video/audio interface inputs video/audio signals from an external video/audio output device, outputs video/audio signals to an external video/audio input device, and the like.
- the USB interface connects USB devices.
- the memory interface connects a memory card or other memory medium to send and receive data.
- the configuration example of the HMD 100 shown in FIG. 3 includes components that are not essential to the present embodiment, and the effect of the present embodiment is not impaired even if these are omitted. Further, configurations not shown, such as a digital broadcast receiving function and an electronic money payment function, may be added.
- FIG. 4 is a functional block diagram of the HMD 100 of the present embodiment. Each function shown in this figure is realized by the main processor 111 loading a program stored in the internal memory storage into the RAM 113 and executing the program.
- the HMD 100 includes a main control unit 210, a virtual remote controller generation unit 220, an operation reception unit 230, a command transmission unit 240, and a data storage unit 250.
- the data storage unit 250 stores an analysis table 251, a control data table 252, a remote control table 253, a gesture operation table 254, and processing data 255.
- the data storage unit 250 is built in the internal memory storage or the RAM 113.
- the main control unit 210 controls the operation of the entire HMD 100.
- the virtual remote controller generation unit 220 generates the virtual remote controller of the controlled device 510 from the description information 310 on the presentation surface 300.
- the virtual remote controller is a function that allows the user 901 of the HMD 100 to use the description information 310 on the presentation surface 300 like a remote controller of the controlled device 510.
- the virtual remote controller generation unit 220 of the present embodiment generates a virtual remote controller image 400 in which the function buttons of the remote controller are superimposed and presented on the corresponding description information 310 of the presentation surface 300.
- the operation reception unit 230, which will be described later, receives an operation instruction from the user 901 for a function button on the virtual remote controller image 400, and the command transmission unit 240 transmits a control command to the controlled device 510.
- the virtual remote controller generation unit 220 of the present embodiment includes an image acquisition unit 221, a specific information acquisition unit 222, a control data acquisition unit 223, a correspondence unit 224, and a virtual image generation unit 225.
- the image acquisition unit 221 acquires an image of the presentation surface 300, which is the base of the virtual remote controller.
- the out-camera 133 captures the real space and acquires the surrounding image.
- the real space to be photographed includes the description information 310 of the presentation surface 300.
- the specific information acquisition unit 222 performs the specific information acquisition process.
- the specific information acquisition process is a process of analyzing the surrounding image acquired by the image acquisition unit 221 and identifying the controlled device (the device that the user 901 wants to operate) and the desired control operation (the control operation that the user wants to perform).
- the description information 310 on the presentation surface 300 is analyzed to identify the controlled device and the desired controlled operation.
- the analysis table 251 is used.
- the specific results are output as controlled device specific information and desired control operation specific information, respectively.
- the analysis table 251 and the specific information acquisition process will be described later.
- the control data acquisition unit 223 performs the control data acquisition process, which acquires the control data of the controlled device 510 specified by the specific information acquisition unit 222.
- the control data acquisition unit 223 first determines whether the control data of the controlled device 510 is stored in the control data table 252 of the own device (HMD 100). If it is not stored, the unit checks whether there is, among the connectable controlled devices 510, a controlled device 510 that is not yet registered in the control data table 252.
- in that case, information on the controlled device 510, such as the device name and model number, is acquired from the controlled device 510, and the control data is acquired from the network server 550.
- the control data acquired from the network server 550 is stored in the control data table 252 in association with the identification information (ID) given to each controlled device 510.
- control data includes possible control operations for each controlled device 510 and control commands for each control operation.
- the correspondence unit 224 performs the correspondence process, which associates each desired control operation specified by the specific information acquisition unit 222 with a control command included in the control data.
- the associated result is stored in the remote control table 253.
- the remote control table 253 is the control data of the virtual remote control desired by the user 901. The details of the corresponding processing and the remote control table 253 will be described later.
- the virtual image generation unit 225 generates a virtual remote controller image 400 that functions as a virtual remote controller and displays it on the display 131.
- the display position is a position that matches the presentation surface 300 with respect to the line-of-sight direction of the user 901.
- the virtual image generation unit 225 generates the virtual remote control image 400 by using the remote control table 253.
- the virtual remote control image 400 includes a controlled device information display area 410 and a desired control operation display area 420.
- the controlled device name 411 is displayed as information for identifying the controlled device specified by the specific information acquisition unit 222. In the example of this figure, it is displayed as "air conditioner".
- in the desired control operation display area 420, reception objects 421 and 422, which are function buttons for receiving the instructions of the user 901 for the desired control operations specified by the specific information acquisition unit 222, are displayed.
- the reception objects 421 and 422 are created for each desired control operation.
- the display position of the reception objects 421 and 422 is the position on the presentation surface 300 where the information corresponding to the desired control operation is described in the description information 310.
- the operation reception unit 230 receives the operation instruction of the user 901 for the virtual remote controller (virtual remote controller image 400).
- an operation on the reception objects 421 and 422 is accepted as an operation instruction for the corresponding desired control operation.
- the operation instruction is given by, for example, a gesture operation or a line-of-sight operation.
- the operation reception unit 230 analyzes these operations with reference to the gesture operation table 254 and the like, and detects the operation instruction. When there is a predetermined operation, it is determined that the reception objects 421 and 422 are selected. Then, the desired control operation associated with the reception objects 421 and 422 is specified.
- the operation reception unit 230 detects the gesture operation of the user 901 by the movement of the hand extracted from the image taken by the sensor 150 provided in the HMD 100 or the out-camera 133. Further, the line-of-sight operation is detected by using the function of the line-of-sight detection sensor.
- the command transmission unit 240 transmits a control command associated with the specified desired control operation to the controlled device.
- transmission is performed to the controlled device 510 via the communication device 160 and the wireless router 520.
- the control command is acquired from the remote control table 253.
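- as a rough illustration of how the units described above fit together, the following Python sketch models the flow from image analysis to command lookup. All class, function, and command names here are hypothetical and are not part of the patent; the callables stand in for the specific information acquisition unit 222 and the control data acquisition unit 223.

```python
# Hypothetical sketch of the virtual remote controller pipeline (units 220-240).
class VirtualRemoteController:
    def __init__(self, analyze, fetch_control_data):
        self.analyze = analyze                       # stands in for unit 222
        self.fetch_control_data = fetch_control_data # stands in for unit 223

    def generate(self, surrounding_image):
        # Specific information acquisition: identify the device and desired operations.
        device, desired_ops = self.analyze(surrounding_image)
        # Control data acquisition: commands available for the identified device.
        control_data = self.fetch_control_data(device)
        # Correspondence process: keep only operations that have a control command.
        remote_table = {op: control_data[op]
                        for op in desired_ops if op in control_data}
        return device, remote_table


# Stand-in analysis and control data; the command codes are invented.
vrc = VirtualRemoteController(
    analyze=lambda image: ("air conditioner", ["cooling function on", "stop"]),
    fetch_control_data=lambda dev: {"cooling function on": "CMD_COOL",
                                    "stop": "CMD_STOP"},
)
device, remote_table = vrc.generate(surrounding_image=None)
```

- the resulting remote_table here plays the role of the remote control table 253: the operation reception unit would map a button press to a key of this table, and the command transmission unit would send the associated command.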
- the analysis table 251 is used when the specific information acquisition unit 222 analyzes the description information 310.
- This is a table in which analysis results are registered in association with character strings, figures, colors, etc. (hereinafter, character strings, etc.) that may be described in the description information 310.
- FIG. 6 is a diagram showing an example of the analysis table 251 of the present embodiment. As shown in this figure, in the analysis table 251, the analysis content 251b and the description type 251c are registered in association with each description content 251a.
- in the description content 251a, candidates such as character strings that the user 901 may write as the description information 310 on the presentation surface 300 are stored.
- specifically, the names of controlled devices 510, such as air conditioner, cooler, lighting fixture, light, lamp, and television, and the names of desired control operations, such as cooling, heating, dehumidification, ventilation, stop, lighting, and extinguishing, are stored.
- various different names and descriptions are also stored for each name of a controlled device 510 and each name of a desired control operation. These are character strings and the like that the user 901 may write.
- the analysis content 251b stores the analysis content of the name of the controlled device 510 or the name of the desired control operation described in the description content 251a.
- the analysis content is information that uniquely identifies a character string or the like that may be described by a large number of users 901.
- when the description content 251a is the name of a controlled device 510, a device name identifying the controlled device 510 is stored. For example, when the description content 251a is "air conditioner", "cooler", or the like, "air conditioner device" is stored, and when it is "lighting fixture", "light", or "lamp", "lighting equipment" is stored.
- when the description content 251a is the name of a desired control operation, information specifying that control operation is stored. Specifically, when the description content 251a is "cooling" or the like, "cooling function on" is stored, and when it is "stop" or the like, "stop" is stored.
- the description type 251c stores what kind of thing the description content 251a represents; in the present embodiment, either "controlled device" or "desired control operation" is registered.
- the specific information acquisition unit 222 analyzes the description information 310 on the presentation surface 300 and extracts character strings and the like. Information matching each extracted character string is then searched for in the description content 251a of the analysis table 251, and the analysis content 251b and description type 251c registered in correspondence with the detected entry are extracted. When the description type 251c indicates a controlled device, the analysis content 251b is output as the controlled device specific information; when the description type 251c indicates a desired control operation, the analysis content 251b is output as the desired control operation specific information.
- that is, the specific information acquisition unit 222 collates the content described as the description information 310 on the presentation surface 300 with the description content 251a, and uses the analysis content 251b and description type 251c of the matching entry as the analysis result.
- for example, when the description information 310 includes "cooling", "cold", phonetic renderings of these, or the like, "cooling function on" and "desired control operation" are obtained as the analysis result.
- when the description information 310 includes "electricity", "denki", "lighting", "shoumei", "light", or the like, "lighting equipment" and "controlled device" are obtained as the analysis result.
- the analysis table 251 is an example; descriptions of other character strings and their analysis contents are omitted here, and each character string is merely illustrative. Further, as described above, the description information 310 may use figures (illustrations), pictograms (snow marks, flame marks, etc.), symbols, and the like in addition to character strings.
- when information described in the description information 310 is not registered in the analysis table 251, the specific information acquisition unit 222 newly registers the analysis result (character string or the like) of the description information 310 in the description content 251a. In this case, the corresponding analysis content 251b and description type 251c are registered by accepting input from the user 901.
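- as a concrete illustration, the lookup described above could be modeled as follows. The table entries and function are hypothetical examples based on FIG. 6, not the actual implementation.

```python
# Hypothetical in-memory form of the analysis table 251:
# description content 251a -> (analysis content 251b, description type 251c).
ANALYSIS_TABLE = {
    "air conditioner": ("air conditioner device", "controlled device"),
    "cooler":          ("air conditioner device", "controlled device"),
    "light":           ("lighting equipment",     "controlled device"),
    "lamp":            ("lighting equipment",     "controlled device"),
    "cooling":         ("cooling function on",    "desired control operation"),
    "stop":            ("stop",                   "desired control operation"),
}

def analyze_description(extracted_strings):
    """Sort extracted strings into device and operation identifications."""
    devices, operations = [], []
    for s in extracted_strings:
        entry = ANALYSIS_TABLE.get(s.lower())
        if entry is None:
            continue  # unregistered strings would be offered to the user for registration
        analysis_content, description_type = entry
        if description_type == "controlled device":
            devices.append(analysis_content)
        else:
            operations.append(analysis_content)
    return devices, operations

devices, operations = analyze_description(["air conditioner", "cooling", "stop"])
```

- with the example of FIG. 1, this lookup yields "air conditioner device" as the controlled device identification and "cooling function on" and "stop" as the desired control operations.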
- the control data table 252 stores the control data of the controlled device 510.
- FIG. 7A is a diagram showing an example of the control data table 252 of the present embodiment. As shown in this figure, the control data table 252 stores, in association with one another, the controlled device name 252a, its identification information (ID) 252b, the control data 252c of the controlled device 510, and the detailed information 252d of the controlled device 510.
- the ID 252b may be any information as long as the control data of the controlled device 510 is uniquely determined. For example, a model number or the like set by the manufacturer may be used.
- the control data 252c is a set of control commands for each possible control operation in the controlled device.
- the detailed information 252d is, for example, a name, a maker name, a physical address, or the like.
- the detailed information 252d is used for connection between the HMD 100 and the controlled device 510 and the like.
- the detailed information 252d may be stored as a sub-table associated with the ID 252b instead of on the same control data table 252.
- in the control data table 252, the above information is registered in advance for controlled devices 510 to which the user 901 has previously connected and controlled devices 510 that may be used. The control data acquisition unit 223 may also acquire and register such data from the network server 550.
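- the local-first lookup with a network fallback described above might be sketched as follows. The IDs, command codes, and server callback are invented for illustration and do not reflect the actual data format.

```python
# Hypothetical shape of the control data table 252, keyed by ID 252b.
CONTROL_DATA_TABLE = {
    "AC-1234": {
        "name": "air conditioner",                                   # name 252a
        "control_data": {"cooling function on": "IR_A1",
                         "stop": "IR_FF"},                           # data 252c
        "details": {"maker": "ExampleCorp",
                    "address": "00:11:22:33:44:55"},                 # details 252d
    },
}

def acquire_control_data(device_id, query_server):
    """Look up the local table first; fall back to the network server 550."""
    record = CONTROL_DATA_TABLE.get(device_id)
    if record is None:
        record = query_server(device_id)
        if record is None:
            return None  # acquisition failed -> NG display process
        CONTROL_DATA_TABLE[device_id] = record  # cache under the device ID
    return record["control_data"]

local = acquire_control_data("AC-1234", query_server=lambda _id: None)
remote = acquire_control_data(
    "TV-9",
    query_server=lambda _id: {"name": "television",
                              "control_data": {"power on": "IR_01"},
                              "details": {}},
)
```

- caching the fetched record under the device ID mirrors the description that control data acquired from the network server 550 is stored in the control data table 252 in association with the ID of each controlled device 510.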
- the remote control table 253 is a table in which information for specifying the controlled device 510 to be controlled by the virtual remote control and control commands for each desired control operation are stored.
- FIG. 7B is a diagram showing an example of the remote control table 253 of the present embodiment.
- the description type 253a, the analysis content 253b, the identification information (ID) 253c, the control command 253d, the area 253e, and the display data 253f are stored in the remote control table 253.
- the remote control table 253 may also store the creation date and time 253g.
- the description type 253a is the type of the description information 310 (controlled device or desired control operation).
- the description type 251c obtained as a result of analysis by the specific information acquisition unit 222 using the analysis table 251 is stored.
- the analysis content 253b is the result of the specific information acquisition unit 222 analyzing the description information 310.
- the analysis content 251b of the analysis table 251 is stored.
- the ID 253c stores an ID that uniquely identifies the controlled device.
- the ID of the controlled device 510 is acquired by the specific information acquisition unit 222 from the controlled device main body, the network, or the like based on the analysis content 251b of the controlled device.
- the control command 253d is a control command corresponding to the control operation when the description type 253a is a desired control operation.
- the correspondence unit 224 acquires the control command from the control data table 252 and stores it here.
- the display data 253f is data to be displayed on the virtual remote controller image 400.
- the area 253e is information for specifying the display area of the virtual remote control image 400.
- the analysis result by the specific information acquisition unit 222 is stored.
- in the area 253e, for example, information specified by pixel positions in the image acquired by the HMD 100 is stored.
- the virtual image generation unit 225 generates a reception object of the function button group by using the handwritten character string described on the presentation surface 300 as a function button.
- when the virtual image generation unit 225 generates the virtual remote controller image 400, it displays, in the controlled device information display area 410, the information registered in the display data 253f of the record whose description type 253a is "controlled device". In the case of FIG. 7B, this is "air conditioner".
- similarly, the virtual image generation unit 225 displays, in the desired control operation display area 420 of the virtual remote controller image 400, the display data 253f of all records whose description type 253a is "desired control operation". In the case of FIG. 7B, these are "cooling" and "stop".
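- the selection of display data from the remote control table 253 might be sketched as follows. The row fields loosely mirror description type 253a, display data 253f, control command 253d, and area 253e; the concrete values and coordinates are invented.

```python
# Hypothetical rows of the remote control table 253.
REMOTE_CONTROL_TABLE = [
    {"type": "controlled device", "display": "air conditioner",
     "area": (10, 10, 120, 40)},
    {"type": "desired control operation", "display": "cooling",
     "command": "IR_A1", "area": (10, 60, 80, 90)},
    {"type": "desired control operation", "display": "stop",
     "command": "IR_FF", "area": (10, 100, 80, 130)},
]

def build_virtual_remote_image(table):
    """Split the table into the device info area and the function buttons."""
    device_info = [row["display"] for row in table
                   if row["type"] == "controlled device"]
    buttons = [(row["display"], row["area"]) for row in table
               if row["type"] == "desired control operation"]
    return {"device_info": device_info, "buttons": buttons}

image = build_virtual_remote_image(REMOTE_CONTROL_TABLE)
```

- each button carries its area 253e so that the reception object can be drawn over the place on the presentation surface 300 where the corresponding description information 310 is written.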
- the main control unit 210 erases the display of the virtual remote controller image 400 when a predetermined period elapses or when an instruction is received from the user 901. At this time, the remote control table 253 is also erased at the same time. As a result, wasteful memory consumption can be avoided.
- the predetermined period is, for example, 1 minute, 1 hour, or 1 week from the registration date and time.
- alternatively, only the remote control table 253 may be erased after a further predetermined time has elapsed. In this case, even after the display of the virtual remote control image 400 is erased, if the user 901 looks at the controlled device 510 before the remote control table 253 is erased, the virtual remote control image 400 is displayed again and operations are accepted.
- the remote control table 253 may be configured to be deleted when the virtual remote control generation process for the next controlled device 510 is started.
- the gesture operation table 254 is a table in which operation instructions are associated and registered for each gesture of the user 901. It is used when specifying an operation instruction from the gesture operation of the user 901 detected by the operation reception unit 230.
- FIG. 8 is a diagram showing an example of the gesture operation table 254 of this embodiment. As shown in this figure, the gesture operation table 254 includes a gesture operation 254a and an operation instruction 254b.
- the gesture operation 254a stores gestures of the user 901 that the operation reception unit 230 may detect.
- the operation instruction 254b stores the content of the operation instruction corresponding to each gesture.
- the content of the operation instruction is predetermined.
- User 901 may register a desired operation instruction.
- when the operation reception unit 230 detects, as a gesture operation of the user 901, an operation of touching a function button (reception object 421 or 422), it determines from the gesture operation table 254 that this is an instruction to execute the corresponding function (here, the control operation), and acquires the control command associated with that execution instruction from the remote controller table 253.
- the command transmission unit 240 transmits the control command to the controlled device 510.
- when an operation of dragging a presented function button is detected as a gesture operation, the operation reception unit 230 determines that it is an instruction to move the function button. Further, when an operation of sliding out of the function button display area while pressing a presented function button is detected, the operation reception unit 230 determines that it is an instruction to delete the function button. When an operation of drawing an arc so as to surround the entire presented function button group is detected, the operation reception unit 230 determines that it is an instruction to select the function buttons enclosed by the arc. When a gesture of the user 901 moving a hand sideways is detected, the operation reception unit 230 determines that it is an instruction to end the function button display.
- the in-camera 134 may detect the line of sight of the user 901 and select a function button or the like.
- the user 901 can freely set the arrangement of the function buttons.
- the data required for processing, the data generated during processing, and the data generated as a result of processing are stored in the processing data 255.
- by using each of these functions and data, the user 901 obtains an operating environment in which the presentation surface 300 behaves as if it were a remote controller, simply by viewing the presentation surface 300 through the display 131 of the HMD 100.
- FIG. 9 is a processing flow of the remote control processing of the present embodiment. This process is started when the user 901 instructs or the HMD 100 detects the presentation surface 300.
- here, a case where the data is deleted after a predetermined period has elapsed since the virtual remote controller image 400 was created is described as an example.
- the image acquisition unit 221 acquires a surrounding image (step S1101).
- the specific information acquisition unit 222 performs a specific information acquisition process of acquiring specific information (controlled device specific information and desired control operation specific information) from the surrounding image (step S1102).
- the control data acquisition unit 223 performs a control data acquisition process for acquiring the control data of the controlled device 510 based on the controlled device identification information (step S1103).
- the control data is acquired from the storage device 110 if it is in the storage device 110 of the own device, or from the network server 550 via the communication device 160 if it is not.
- next, the control data acquisition unit 223 determines whether the control information on the device specified as the controlled device could be acquired in step S1103 (step S1104).
- if the control data can be obtained from either the own device or the network server 550 (S1104; Yes), the correspondence unit 224 performs the correspondence process to complete the remote control table 253 (step S1105).
- the control command is associated with the desired control operation.
- the virtual image generation unit 225 refers to the remote control table 253 and generates a virtual remote control image (step S1106).
- the virtual image generation unit 225 creates display data including the reception objects 421 and 422 of the function buttons associated with the control commands and the display of the names of the controlled devices 510.
- the display data is generated so as to be arranged in the corresponding areas of the surrounding image acquired by the image acquisition unit 221. For example, as shown in FIG. 1, it is generated so that the function button that accepts instructions to the controlled device is displayed in the area of the presentation surface 300 where "air conditioner" is written, the function button that accepts the "cooling" instruction in the area where "cooling" is written, and the function button that accepts the "stop" instruction in the area where "stop" is written.
- the virtual remote controller image does not necessarily have to be displayed; in that case, the area information of the function buttons (reception objects 421 and 422) is retained, and instructions from the user 901 are still accepted.
- that is, the operation reception unit 230 receives the operation instructions of the user 901 by holding the information on the arrangement positions of the reception objects 421 and 422 and the corresponding control operations.
- next, the main control unit 210 causes the timer 106 to start counting (step S1107). It is then determined whether the timer value tm exceeds the predetermined period Th (step S1108); if tm exceeds Th (tm > Th), the main control unit 210 erases the remote control table 253 (step S1111), and the process ends.
- when the operation reception unit 230 receives an operation instruction for a desired control operation from the user 901 (step S1109), it extracts the control command for the received control operation from the remote control table 253.
- the command transmission unit 240 then transmits the extracted control command to the controlled device 510 via the communication device 160 (step S1110), and the process returns to step S1108.
- if the control data could not be acquired in step S1104 (S1104; No), the main control unit 210 performs a display process for the case where control data could not be acquired (NG display process, step S1112), and ends the process.
- in the NG display process, for example, a message indicating that the control data of the specified controlled device 510 could not be obtained, or a message indicating that the virtual remote controller cannot be created, is displayed on the display 131.
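- the FIG. 9 flow can be sketched step by step as follows. Every callable is a hypothetical stand-in for the corresponding unit described above, and the return strings merely label which branch was taken.

```python
# Sketch of the remote control process of FIG. 9 (steps S1101-S1112).
def remote_control_process(acquire_image, acquire_specific_info,
                           acquire_control_data, build_table,
                           wait_for_operation, send_command, period_expired):
    image = acquire_image()                                # S1101
    device_id, desired_ops = acquire_specific_info(image)  # S1102
    control_data = acquire_control_data(device_id)         # S1103
    if control_data is None:                               # S1104: No
        return "NG display"                                # S1112
    table = build_table(desired_ops, control_data)         # S1105 (S1106: display)
    while not period_expired():                            # S1107-S1108
        operation = wait_for_operation()                   # S1109
        if operation in table:
            send_command(table[operation])                 # S1110
    return "table erased"                                  # S1111
```

- a test double for each step is enough to exercise both the normal path and the NG branch, which is how the sketch is checked here.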
- FIG. 10 is a processing flow of the specific information acquisition process of the present embodiment. This process is a process of analyzing the surrounding image and completing the description type 253a, the analysis content 253b, and the ID 253c of the remote controller table 253.
- the specific information acquisition unit 222 first performs the presentation surface shape grasping process (step S1201).
- This presentation surface shape grasping process is a process for grasping the shape of the presentation surface 300. Specifically, the surrounding image taken by the out-camera 133 is analyzed, and the predetermined shape of the presentation surface 300 is detected. In this embodiment, the outer frame 305 is detected. The area of the detected outer frame 305 is used as a scanning area for detecting the name of the controlled device 510 and information on the desired control operation. As described above, the name of the controlled device and the desired control operation are drawn on the presentation surface 300 as the description information 310.
- the shape of the outer frame 305 is, for example, a rectangle, but the shape is not limited to this. The shape also does not have to be an exact figure: in handwriting, the lines of the rectangle may waver and the corners may be slightly rounded, so the frame is judged to be a rectangle based on how rectangular it is. A mark and characters drawn inside the rectangular outer frame 305 may also be combined to identify the presentation surface 300. For example, the presentation surface 300 may be identified by features such as the characters "remote control", a circled "ri" mark, or the user's favorite mark.
- the specific information acquisition unit 222 determines whether description information 310 that may correspond to controlled device information or desired control operation information exists in the scanning area of the presentation surface 300 (step S1202). When determining the presence or absence of such information, for example, the scanning area is binarized and a simple image analysis, such as checking whether there is a continuous black region of a predetermined size or more, is performed.
- if no such information exists, the NG display process is performed (step S1211), and the process ends.
- a message or the like indicating that the description information 310 cannot be detected is output.
- alternatively, the degree of resemblance to the presentation surface 300 may be calculated when determining whether the presentation surface 300 is present, and the message may be displayed when that degree is equal to or higher than a certain threshold value.
- if such information exists, the specific information acquisition unit 222 performs a presentation surface analysis process that analyzes the scanning region in detail (step S1203).
- in the presentation surface analysis process, the description information 310 is analyzed, and controlled device information candidates and desired control operation information candidates are extracted.
- the candidates are extracted by comparing the analysis result of the description information 310 with the description content 251a registered in the analysis table 251. For example, in the example of FIG. 1, "air conditioner" is extracted as a controlled device information candidate, and "cooling" and "stop" are extracted as desired control operation candidates. At this time, the analysis content 251b corresponding to each candidate is also determined.
- the specific information acquisition unit 222 performs an ID acquisition process for acquiring the identification information (ID) of the controlled device 510 from the controlled device information candidate (step S1204).
- ID is information that uniquely identifies the controlled device 510, for example, a model number or the like.
- it may be detailed device information such as the manufacturer name of the controlled device 510, the type of the device, and the physical address.
- the specific information acquisition unit 222 selects, for example, the type of device by comparing the controlled device information candidate with the analysis table 251.
- the type of device is, for example, the information registered in the analysis content 251b of the analysis table 251.
- the identification information (ID) 252b of all records in which the selected device type is registered as the controlled device name 252a is then acquired from the control data table 252. Using the detailed information 252d corresponding to each acquired identification information (ID) 252b, a connection establishment request may be sent to the surroundings to determine whether a response is obtained; as a result, responses are received only from devices of the corresponding type.
- through the ID acquisition process, it is possible to determine whether there is a controlled device 510 in the vicinity to which a control command can be transmitted by the virtual remote controller. That is, the specific information acquisition unit 222 determines whether a response is obtained within a predetermined period (step S1205).
- when a response is obtained within the predetermined period (S1205; Yes), the specific information acquisition unit 222 outputs the extracted and identified analysis content 251b, its description type 251c, and the ID 252b to the control data acquisition unit 223. The specific information acquisition unit 222 also registers the extracted and identified information in the remote controller table 253 (step S1206), and ends the process.
- the information to be registered is the description type 253a, the analysis content 253b, the ID 253c, and the area 253e and the display data 253f obtained by the presentation surface analysis process.
- step S1205 if no response is obtained (S1205; No), the process proceeds to step S1211, NG display processing is performed, and the processing is terminated.
- NG display processing in this case, a message indicating that there is no controlled device 510 capable of transmitting an operation command by the virtual remote controller is output in the surrounding area.
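The ID acquisition flow above (steps S1204 to S1211) can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the table layouts, the address field, and the `probe` stub are assumptions made up for the example.

```python
# Illustrative sketch of the ID acquisition process (steps S1204-S1205).
# All table structures and the probe function are assumptions; the patent
# does not define concrete data formats.

# Analysis table 251: expected notations -> analysis content (device type)
ANALYSIS_TABLE = {
    "air conditioner": "air_conditioner",
    "cooler": "air_conditioner",
    "lighting": "light",
}

# Control data table 252: records keyed by controlled device name 252a
CONTROL_DATA_TABLE = [
    {"name": "air_conditioner", "id": "AC-1234", "details": {"addr": "192.0.2.10"}},
    {"name": "air_conditioner", "id": "AC-5678", "details": {"addr": "192.0.2.11"}},
    {"name": "light", "id": "LT-0001", "details": {"addr": "192.0.2.20"}},
]

def probe(details, timeout_s=1.0):
    """Send a connection establishment request; return True on response.
    Stubbed here -- a real terminal would use its communication device."""
    return details["addr"].startswith("192.0.2.1")

def acquire_id(candidate_label):
    """Return the IDs of nearby devices matching the written label."""
    device_type = ANALYSIS_TABLE.get(candidate_label)  # compare with table 251
    if device_type is None:
        return []
    # Collect all records whose controlled device name matches the type
    matches = [r for r in CONTROL_DATA_TABLE if r["name"] == device_type]
    # Probe each candidate; only devices of the corresponding type respond
    return [r["id"] for r in matches if probe(r["details"])]

print(acquire_id("air conditioner"))  # IDs of responding air conditioners
```

An empty result corresponds to the "no response" branch (S1205; No) that leads to the NG display processing.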
- a selection rule for the case where responses are obtained from a plurality of controlled devices 510 is set in advance. The rule is, for example, to select the device that transmitted the earliest response, to select the device in the line-of-sight direction of the user 901, or to present the candidates to the user 901 and accept a selection.
- the specific information acquisition unit 222 may search for a connectable controlled device 510 when no response is obtained from the desired device. At this time, when a new controlled device 510 that is not registered in the control data table 252 is detected, the detailed information of the new controlled device 510 is acquired and registered in the control data table 252. It is then confirmed whether any of the new controlled devices 510 corresponds to the controlled device information candidate.
- the user 901 may also acquire the ID 252b by photographing the manufacturer name and model number of the controlled device 510 with the out-camera 133. Alternatively, the external shape of the controlled device 510 may be photographed and an inquiry made to the network server 550 together with the photographed image. The user 901 may also acquire the ID from the manufacturer's homepage or the like. In this case, the user 901 may write part of the detailed information by hand on the presentation surface 300 so that it is acquired as shooting data, or may utter part of the detailed information so that it is acquired by the voice processing device 140.
- an ID is assigned to each controlled device 510 at the time of its registration in the control data table 252.
- the user 901 does not need to write the name of the controlled device 510 accurately as the controlled device information on the presentation surface 300. For example, it may be written with other words such as "cooler" or "cold air conditioner", or drawn as a picture recognizable as an air conditioner.
- various expressions that the user 901 is expected to write are registered in advance in the analysis table 251.
- FIG. 11 is a processing flow of the control data acquisition process of the present embodiment. This process acquires the control data of the controlled device 510 identified by the specific information acquisition process.
- the control data acquisition unit 223 searches the control data table 252 based on the controlled device identification information, which includes the ID of the controlled device 510 identified by the specific information acquisition unit 222, to determine whether the control data of the device is registered (step S1301). Specifically, the control data table 252 is searched by the ID, and it is determined whether a record with a matching ID 252b is registered.
- if it is registered, the control data acquisition unit 223 performs the control command extraction process (step S1302). Specifically, the control data of the matching record is referred to, and for each desired control operation, the control command associated with the analysis content of that desired control operation is extracted.
- when the specific information acquisition unit 222 has acquired the ID of the controlled device 510 from the control data table 252 during the specific information acquisition process, the control command may be extracted from the control data 252c associated with the acquired ID 252b without searching the control data table 252 again.
- the control data acquisition unit 223 registers each of the extracted control commands in the remote controller table 253 in association with the desired control operation (step S1303), and ends the process.
- if the control data is not registered, the control data acquisition unit 223 requests the control data corresponding to the ID from the network server 550 (step S1304). It then determines whether the returned control data has been received (step S1305).
- when the control data is received, the control data acquisition unit 223 registers it in the control data table 252 in association with the controlled device name and the ID (step S1306), and moves to step S1302.
- if a request is made to the network server 550 but a response indicating that there is no control data is obtained, or if there is no response within a predetermined period (S1305; No), the control data acquisition unit 223 performs the NG display process (step S1311) and ends the process. In this case, a message or the like indicating that the control data of the controlled device 510 could not be obtained is output.
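The local-table-first, server-fallback flow of steps S1301 to S1311 can be sketched as below. The data shapes and the server stub are illustrative assumptions, not the patent's actual structures.

```python
# Hedged sketch of the control data acquisition flow (steps S1301-S1311).

CONTROL_DATA_TABLE = {
    # ID 252b -> control data 252c: analysis content -> control command
    "AC-1234": {"cooling": "CMD_COOL_ON", "stop": "CMD_STOP"},
}

def request_from_server(device_id):
    """Stand-in for querying the network server 550; None means no data."""
    server_db = {"LT-0001": {"on": "CMD_ON", "off": "CMD_OFF"}}
    return server_db.get(device_id)

def acquire_control_data(device_id, desired_ops):
    # Step S1301: is the device's control data registered locally?
    control_data = CONTROL_DATA_TABLE.get(device_id)
    if control_data is None:
        # Steps S1304-S1306: ask the server and cache the reply locally
        control_data = request_from_server(device_id)
        if control_data is None:
            return None  # step S1311: NG display
        CONTROL_DATA_TABLE[device_id] = control_data
    # Step S1302: extract one control command per desired operation
    remote_table = {op: control_data.get(op) for op in desired_ops}
    return remote_table  # step S1303: registered in remote controller table 253

print(acquire_control_data("AC-1234", ["cooling", "stop"]))
print(acquire_control_data("LT-0001", ["on", "off"]))   # fetched from server
print(acquire_control_data("XX-9999", ["anything"]))    # None -> NG display
```

Caching the server reply in the local table mirrors the embodiment's point that a device controlled once can later be controlled without going through the network 530.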
- as described above, the HMD 100 of the present embodiment includes the communication device 160 and remotely controls the controlled device 510 by transmitting control commands to it via the communication device 160. It identifies the controlled device 510 and the desired control operation from the surrounding image, which is an image of the surroundings, and generates the reception objects 421 and 422 that receive the operation instructions of the user 901 for the identified desired control operations.
- the HMD 100 comprises the virtual remote controller generation unit 220, the operation reception unit 230 that receives the operation instruction by the user 901 via the reception objects 421 and 422, and the command transmission unit 240 that transmits the control command corresponding to the operation instruction received by the operation reception unit 230 to the controlled device 510.
- the surrounding image includes the presentation surface 300, which bears notations indicating the desired control operations.
- the virtual remote controller generation unit 220 comprises: the specific information acquisition unit 222, which acquires from the surrounding image the controlled device specifying information that identifies the controlled device 510 and the desired control operation specifying information that specifies the desired control operation; the control data acquisition unit 223, which acquires the control data of the controlled device using the controlled device specifying information; the association unit 224, which associates the control commands included in the control data for controlling the controlled device 510 with the desired control operations; and the virtual image generation unit 225, which generates a virtual remote controller image 400 that arranges the reception objects 421 and 422 in the areas where the notations indicating the desired control operations are displayed.
- that is, the surface presenting the remote controller function buttons (the presentation surface 300) is imaged by the out-camera 133 mounted on the HMD 100, and from the figures and characters drawn on the presentation surface 300, the controlled device 510 to be operated and its functions (the desired control operations) are grasped. The function buttons of the remote controller that controls the controlled device 510 are then presented as AR objects (function button images, etc.) at the places where the functions are drawn. In other words, the AR objects are superimposed on the characters and images printed or written on the presentation surface 300.
- the HMD 100 controls the controlled device by accepting operations such as selection, movement, and erasure of the function buttons through gesture operations of the user 901 on the function buttons.
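The capture-recognize-overlay pipeline above can be illustrated with a small sketch: each notation recognized on the presentation surface yields a reception object placed over the region where the notation was drawn. The OCR-result format and the names here are assumptions for illustration only.

```python
# Minimal sketch: build reception objects (function buttons) from
# recognized notations and place them over the handwritten regions.

from dataclasses import dataclass

@dataclass
class ReceptionObject:
    label: str      # recognized desired control operation
    region: tuple   # (x, y, w, h) where the notation was drawn
    command: str    # control command associated with the label

def build_virtual_remote(ocr_results, command_map):
    """ocr_results: list of (text, region) pairs from the surface image."""
    buttons = []
    for text, region in ocr_results:
        command = command_map.get(text)
        if command is not None:
            # Place the AR function button over the handwritten notation
            buttons.append(ReceptionObject(text, region, command))
    return buttons

ocr = [("cooling", (10, 40, 80, 30)), ("stop", (10, 80, 80, 30))]
commands = {"cooling": "CMD_COOL_ON", "stop": "CMD_STOP"}
for b in build_virtual_remote(ocr, commands):
    print(b.label, b.region, b.command)
```

Keeping the button's region equal to the notation's bounding box is what lets the AR object appear superimposed on the user's own handwriting.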
- therefore, according to the present embodiment, a virtual remote controller can be generated at any time, and in this virtual remote controller, only the functions that the user 901 requires are displayed as function buttons.
- moreover, the presentation surface 300, which is the basis of the virtual remote controller, can be drawn on paper or the like and attached to a refrigerator, or a plurality of sheets can be bound like a notebook and reused as needed. As a result, even a user 901 who is notably forgetful can easily control the controlled device 510 without searching for the remote controller.
- in the present embodiment, the user 901 can freely set up and generate a virtual remote controller in which the user 901 selects the desired functions and function buttons are arranged to realize the selected functions.
- even if the user 901 does not have a physical remote controller at hand, writing the necessary operations by hand on paper or the like makes the paper function as a remote controller as it is. Therefore, the user 901 does not need to search for the remote controller. Only the necessary functions are created as function buttons, avoiding clutter, and their arrangement can be laid out as desired, making the controller easy to use.
- thus, the trouble of searching for a remote controller is eliminated, and the convenience of the user 901 is further improved.
- further, according to the present embodiment, the control data table 252 is stored in the data storage unit of the HMD 100 for any controlled device 510 for which a virtual remote controller has been generated even once. Therefore, subsequent control of that controlled device 510 can be performed without going through the network 530, so problems in the network 530 (delay due to congestion of the network line, line disconnection, etc.) need not be considered.
- on the other hand, the remote controller table 253 is deleted when it is no longer used, and is created again the next time it is needed. Therefore, the user 901 does not need to search for a virtual remote controller in the data storage unit 250 of the HMD 100, and wasteful memory consumption can be suppressed.
- <Modification 1> In the above embodiment, if the desired control operations written by the user 901 on the presentation surface 300 include a function that the controlled device does not have, the absence of that function may be clearly indicated.
- an example of the virtual remote controller image in this case is shown in FIG. 12(b). Here, the case where the user 901 has written on the presentation surface 300 as shown in FIG. 12(a) is taken as an example: the user 901 writes "air conditioner" as the controlled device name 311 and "cooling", "stop", and "heating" as the desired control operations 312.
- suppose the control data acquired by the control data acquisition unit 223 contains no control command corresponding to heating. In this case, the control data acquisition unit 223 notifies the virtual image generation unit 225 of this.
- for the "cooling" and "stop" control operations, the virtual image generation unit 225 generates and displays function buttons associated with control commands by the method of the above embodiment. For "heating", however, an unavailable indication 431 showing that it cannot be used is displayed in response to the notification from the control data acquisition unit 223.
- the unavailable indication 431 may also be displayed on the controlled device name 411 of the virtual remote controller to notify the user 901.
- in the example of FIG. 12(b), an AR object marked with an X is superimposed, but the unavailable indication 431 is not limited to this. It may be an AR object bearing a character string such as "There is no corresponding heating function" or "There is no air conditioner that can be operated." These character strings may also be output as voice from the speaker 141.
- this modification can reduce unnecessary operations by the user.
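The logic of Modification 1 amounts to partitioning the written operations into those with a matching control command and those without. A minimal sketch, with data shapes assumed for illustration:

```python
# Sketch of Modification 1: written operations with no matching control
# command are flagged for the unavailable indication 431 instead of
# becoming active function buttons.

def classify_operations(written_ops, control_data):
    """Split written operations into usable buttons and unavailable ones."""
    usable, unavailable = {}, []
    for op in written_ops:
        cmd = control_data.get(op)
        if cmd is None:
            unavailable.append(op)   # render with the unavailable mark 431
        else:
            usable[op] = cmd         # render as a normal function button
    return usable, unavailable

control_data = {"cooling": "CMD_COOL_ON", "stop": "CMD_STOP"}
usable, unavailable = classify_operations(
    ["cooling", "stop", "heating"], control_data)
print(usable)        # buttons with commands
print(unavailable)   # operations the device does not support
```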
- in the above embodiment, the controlled device 510 has been described as an air conditioner, but the controlled device 510 is not limited to an air conditioner. For example, it may be a lighting fixture.
- FIG. 13(a) shows an example of the presentation surface 301 created by the user 901 when the controlled device 510 is a lighting fixture, and FIG. 13(b) shows an example of the virtual remote controller image 401 created by the HMD 100 analyzing the presentation surface 301. As shown in these figures, from the description information 310, a virtual remote controller image 401 is generated in which "lighting" is displayed as the controlled device name 411 and "on" and "off" are displayed as the reception objects 421 and 422, respectively.
- further, in the above embodiment, the control command from the HMD 100 to the controlled device 510 is transmitted via the network 530. However, transmission of the control command from the HMD 100 to the controlled device 510 is not limited to this.
- for example, the control command may be transmitted directly from the HMD 100 to the controlled device 510 using a short-range communication I/F such as infrared communication or Bluetooth. In this case, the command transmission unit 240 transmits the control command corresponding to the control operation selected by the user 901 via the BT communication I/F 163 or the infrared communication I/F 164.
- this allows the HMD 100 to control the controlled device 510 using the short-range communication I/F without going through the network 530, so problems in the network 530 (delay due to congestion of the network line, line disconnection, etc.) need not be considered.
- the user 901 may also be allowed to select which communication means is used to transmit the control command from the HMD 100 to the controlled device 510.
- further, in the above embodiment, the control data of the controlled device is registered in the HMD 100 in advance or acquired from the network server 550. However, the control data may be acquired from the controlled device 510 itself. In this case, the control data acquisition unit 223 accesses the controlled device 510 via any of the LAN communication I/F 161, the BT communication I/F 163, and the infrared communication I/F 164, and acquires the data from the controlled device 510.
- instead of the control command, the infrared pattern (data format) that a general remote controller generates for each control operation may be acquired. In this case, the command transmission unit 240 transmits the infrared pattern associated with the operation of the user 901 to the controlled device 510. This makes it possible to output an infrared pattern substantially the same as that generated by an ordinary remote controller, so instructions can be transmitted with high accuracy.
- typical infrared data formats used by remote controllers include the NEC format, the format of the Association for Electric Home Appliances (AEHA format), and the SONY format. The infrared data format adopted by the controlled device 510 is used.
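As an illustration of one such format, the following sketch builds an NEC-format frame as a list of mark/space durations. The timings follow the widely documented NEC protocol (a 9 ms leader mark and 4.5 ms space, then address, inverted address, command, and inverted command, each 8 bits LSB-first, with a final stop mark); the function itself is an illustrative assumption, not part of the patent.

```python
# Hedged sketch of building an NEC-format infrared frame.
# "0" bit = 562.5 us mark + 562.5 us space; "1" bit = 562.5 us mark +
# 1687.5 us space, per the commonly documented NEC protocol timings.

T = 562.5  # base unit in microseconds

def nec_frame(address, command):
    """Return (mark_us, space_us) pairs for one NEC transmission."""
    def byte_bits(b):
        return [(b >> i) & 1 for i in range(8)]  # LSB first
    payload = (byte_bits(address) + byte_bits(address ^ 0xFF) +
               byte_bits(command) + byte_bits(command ^ 0xFF))
    pulses = [(9000.0, 4500.0)]                   # leader
    for bit in payload:
        pulses.append((T, 3 * T if bit else T))   # data bits
    pulses.append((T, 0.0))                       # stop mark
    return pulses

frame = nec_frame(0x04, 0x08)
print(len(frame))  # 1 leader + 32 data bits + 1 stop
```

A terminal storing such pulse lists per control operation could replay them through an infrared I/F, which is the idea behind transmitting the acquired infrared pattern instead of an abstract control command.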
- in the above embodiment, the information processing terminal used by the user 901 has been described taking the HMD 100 as an example, but the information processing terminal is not limited to this. For example, a portable information terminal such as a smartphone or a tablet terminal may be used.
- FIG. 15 shows an outline of the virtual remote controller processing and a display example when a smartphone is used.
- as shown in this figure, the user 901 holds the smartphone 101 up between the presentation surface 300 and himself or herself, photographs an area including the description information 310 of the presentation surface 300, and obtains a surrounding image.
- the smartphone 101 analyzes the surrounding image, identifies the controlled device 510 and the desired control operations, and associates control commands or infrared formats with them. It then creates the virtual remote controller image 400 and displays it on the display 131 of the smartphone 101. When an operation instruction from the user 901 is received via the virtual remote controller image 400, the smartphone 101 transmits the control command (or infrared format) corresponding to the operation instruction to the controlled device 510.
- the generated virtual remote controller image 400 includes the controlled device information display area 410 and the desired control operation display area 420. In this example, the reception object 421 for "cooling", which accepts an instruction to start cooling, and the reception object 422 for "stop", which accepts an instruction to stop, are displayed.
- FIG. 15 shows the hardware configuration of the smartphone 101 that realizes this. The hardware configuration of the smartphone 101 is basically the same as that of the HMD 100.
- however, the smartphone 101 is not worn like the HMD 100 but is used while held in the hand of the user 901. Therefore, the operation reception device 120 differs in part: the smartphone 101 includes not only the button switch 121 and the touch panel 122 but also operation keys 123 as the operation reception device 120.
- the functional configuration of the smartphone 101 that realizes the above embodiment is also basically the same as that of the HMD 100. However, the operation reception unit 230 of the smartphone 101 does not need to detect and analyze gesture operations of the user 901. Instead, the intention of the user 901 is accepted by detecting touch operations on the function button display areas of the touch panel 122.
- this modification can be realized with any equivalent or similar hardware or software configuration; for example, a notebook PC, a tablet PC, or the like may be used.
- in the above embodiment, the presentation surface 300 has been described on the premise that it is a sheet such as paper. However, the presentation surface 300 is not limited to this. For example, it may be a whiteboard or the like. Any material and shape are acceptable as long as the user 901 can write on it freely and it can be photographed.
- the presentation surface 300 may also be provided by, for example, the manufacturer of the controlled device 510. Alternatively, the description of the remote controller in the manual may be used, or the remote controller itself provided with the controlled device 510 may be used.
- when the description information 310 is prepared in advance, that is, when a manual or the remote controller itself, which is not created by the user 901, is used as the presentation surface 300, it is not necessary to generate reception objects for all the function buttons when the virtual remote controller is created.
- in this case, the specific information acquisition unit 222 analyzes the image of the presentation surface 300, identifies the control operations, presents them to the user 901, and accepts a selection of the control operations to be displayed in the virtual remote controller image 400. The control operations whose selection has been accepted are then specified as the desired control operations.
- in the above embodiment, the specific information acquisition unit 222 analyzes the description information 310 to identify characters. For this analysis, existing character recognition such as an OCR (Optical Character Recognition) function is used. However, the analysis is not limited to this.
- for example, the user 901 may be asked to register handwritten characters in advance, and the analysis may refer to that registration using a technique such as pattern matching. At that time, the user 901 may also be asked to input corresponding digital data, and the handwritten characters and the digital data may be registered in association with each other. In this case as well, at the next presentation surface information acquisition process, pattern matching is performed using the registered handwritten characters as patterns to identify the characters.
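The registration-based alternative to OCR can be sketched as follows: the user registers handwritten samples together with their digital text, and later writing is identified by nearest-pattern matching. The feature extraction here is a deliberately crude placeholder; a real implementation would use proper shape features.

```python
# Illustrative sketch of matching handwriting against registered samples.

def features(strokes):
    """Toy feature vector: stroke count and total number of points."""
    total = sum(len(s) for s in strokes)
    return (len(strokes), total)

REGISTERED = {}  # feature vector -> digital text

def register(strokes, text):
    """Store a handwritten sample in association with its digital data."""
    REGISTERED[features(strokes)] = text

def match(strokes):
    """Return the registered text whose features are closest."""
    f = features(strokes)
    best = min(REGISTERED, key=lambda r: (r[0] - f[0]) ** 2 + (r[1] - f[1]) ** 2)
    return REGISTERED[best]

register([[1, 2, 3], [4, 5]], "cooling")   # 2 strokes, 5 points
register([[1, 2]], "stop")                 # 1 stroke, 2 points
print(match([[9, 9, 9], [8, 8]]))          # nearest registered sample
```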
- in the above embodiment, the description information 310 has been described taking the case of a character string as an example. However, the description information 310 is not limited to character strings; it may be figures, colors, and the like. In this case, figures, colors, and the like are registered in advance in the description content 251a of the analysis table 251.
- this registration may be performed by the user 901. That is, the user 901 registers a favorite figure or color in advance as representing a specific controlled device 510 or a specific desired control operation. As a result, the user 901 can generate a virtual remote controller without writing any characters.
- voice may also be registered in the analysis table 251. That is, voice data is registered in place of, or in addition to, the description content 251a, and the analysis content 251b and the description type 251c are registered in association with the voice data.
- further, in the above embodiment, the display of the virtual remote controller image 400 is erased after a predetermined time has elapsed. However, this is not limiting; for example, the display of the virtual remote controller image 400 may be erased in response to an explicit instruction from the user 901. The remote controller table 253, which is the basis for generating the virtual remote controller image 400, may also be erased in synchronization with the timing of erasing the display of the virtual remote controller image 400.
- if layout information of the function buttons of the controlled device 510 (the arrangement of the function buttons on its remote controller) can be obtained from the manufacturer's homepage via the network 530, the layout information may additionally be stored in the data storage unit 250. The virtual image generation unit 225 may then generate the virtual remote controller image 400 according to the arrangement of the function buttons of the controlled device 510 instead of the arrangement on the presentation surface 300.
- the configuration for realizing the technique of the present invention is not limited to the above-described embodiments and modifications, and various variations are conceivable. For example, part of the configuration of one embodiment or modification may be replaced with the configuration of another example, and the configuration of another example may be added to the configuration of one embodiment or modification. All of these belong to the scope of the present invention. The numerical values and messages appearing in the text and figures are merely examples, and using different ones does not impair the effect of the present invention.
- some or all of the above-described functions of the present invention may be realized in hardware, for example, by designing them as integrated circuits. They may also be realized in software by having a microprocessor unit or the like interpret and execute programs that realize the respective functions. Hardware and software may be used together.
- the software may be stored in the internal memory storage of the HMD 100 or the like in advance at the time of product shipment, may be acquired from various server devices on the Internet after shipment, or may be provided and acquired via a memory card, an optical disc, or the like.
- the control lines and information lines shown in the figures indicate those considered necessary for explanation, and do not necessarily show all the control lines and information lines on the product. In practice, almost all components can be considered to be interconnected.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- Computing Systems (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- User Interface Of Digital Computer (AREA)
- Selective Calling Equipment (AREA)
Abstract
Description
Hereinafter, the remote operation system 500 of the present embodiment that realizes the above will be described, starting with the system configuration. FIG. 2 is a system configuration diagram of the remote operation system 500 of the present embodiment.
Next, the hardware configuration and functional blocks of the HMD 100 of the present embodiment will be described with reference to the drawings.
The hardware configuration of the HMD 100 of the present embodiment that realizes the above processing will be described below. FIG. 3 is a block diagram showing a configuration example of the HMD 100 of the present embodiment. In these figures, the same reference numerals are given to the same components as those shown in FIGS. 1 and 2.
Next, the functional configuration of the HMD 100 related to the present embodiment will be described. FIG. 4 is a functional block diagram of the HMD 100 of the present embodiment. Each function shown in this figure is realized by the main processor 111 loading a program stored in the internal memory storage into the RAM 113 and executing it.
Here, an example of the analysis table 251 will be described. The analysis table 251 is used when the specific information acquisition unit 222 analyzes the description information 310. It is a table in which analysis results are registered in association with the character strings, figures, colors, and the like (hereinafter, character strings etc.) that may be written in the description information 310.
The control data table 252 stores the control data of the controlled devices 510. FIG. 7(a) is a diagram showing an example of the control data table 252 of the present embodiment. As shown in this figure, the control data table 252 stores, in association with the controlled device name 252a, the identification information (ID) 252b, the control data 252c of the controlled device 510, and the detailed information 252d of the controlled device 510.
Next, the remote controller table 253 will be described. The remote controller table 253 is a table that stores information specifying the controlled device 510 targeted by the virtual remote controller and a control command for each desired control operation.
Next, an example of the gesture action table 254 will be described. The gesture action table 254 is a table in which an operation instruction is registered in association with each gesture of the user 901. It is used to identify an operation instruction from the gesture action of the user 901 detected by the operation reception unit 230.
Next, the flow of the remote operation processing, including the virtual remote controller generation processing, by the HMD 100 of the present embodiment will be described. FIG. 9 is a processing flow of the remote operation processing of the present embodiment. This processing is started by an instruction from the user 901 or by the HMD 100 detecting the presentation surface 300. Here, the case where the data is erased after a predetermined period has elapsed since the virtual remote controller image 400 was created will be described as an example.
Next, the flow of the specific information acquisition processing by the specific information acquisition unit 222 will be described. FIG. 10 is a processing flow of the specific information acquisition processing of the present embodiment. This processing analyzes the surrounding image and completes the description type 253a, the analysis content 253b, and the ID 253c of the remote controller table 253.
210: main control unit, 220: virtual remote controller generation unit, 221: image acquisition unit, 222: specific information acquisition unit, 223: control data acquisition unit, 224: association unit, 225: virtual image generation unit, 230: operation reception unit, 240: command transmission unit, 250: data storage unit,
251: analysis table, 251a: description content, 251b: analysis content, 251c: description type, 252: control data table, 252a: controlled device name, 252b: ID, 252c: control data, 252d: detailed information, 253: remote controller table, 253a: description type, 253b: analysis content, 253c: ID, 253d: control command, 253e: area, 253f: display data, 253g: creation date and time, 254: gesture action table, 254a: gesture action, 254b: operation instruction, 255: processing data,
300: presentation surface, 301: presentation surface, 305: outer frame, 310: description information, 311: controlled device name, 312: desired control operation, 400: virtual remote controller image, 401: virtual remote controller image, 410: controlled device information display area, 411: controlled device name, 420: desired control operation display area, 421: reception object, 422: reception object, 423: reception object, 431: unavailable indication,
500: remote operation system, 510: controlled device, 520: wireless router, 530: network, 550: network server, 901: user
Claims (11)
- An information processing terminal comprising a communication device, the information processing terminal remotely controlling a controlled device by transmitting a control command to the controlled device via the communication device, the information processing terminal comprising: a virtual remote controller generation unit that identifies, from a surrounding image that is an image of the surroundings of the information processing terminal, the controlled device and a desired control operation that is a desired control operation, and generates a reception object that receives a user's operation instruction for the identified desired control operation; an operation reception unit that receives the operation instruction by the user via the reception object; and a command transmission unit that transmits the control command corresponding to the operation instruction received by the operation reception unit to the controlled device.
- The information processing terminal according to claim 1, wherein the surrounding image includes a presentation surface having a notation indicating the desired control operation, and the virtual remote controller generation unit comprises: a specific information acquisition unit that acquires, from the surrounding image, controlled device specifying information that specifies the controlled device and desired control operation specifying information that specifies the desired control operation; a control data acquisition unit that acquires control data of the controlled device using the controlled device specifying information; an association unit that associates a control command included in the control data for controlling the controlled device with the desired control operation specified by the desired control operation specifying information; and a virtual image generation unit that generates the reception object and generates a virtual remote controller image in which the reception object is arranged in the area where the notation indicating the desired control operation is displayed when the surrounding image is displayed on a display of the information processing terminal.
- The information processing terminal according to claim 2, wherein the virtual image generation unit displays the generated virtual remote controller image on a display provided in the information processing terminal for a predetermined period.
- The information processing terminal according to claim 2, wherein, when a control operation not accepted by the controlled device is identified as the desired control operation, the virtual image generation unit applies a predetermined unavailable indication to the reception object.
- The information processing terminal according to claim 2, further comprising a control data table that stores the control data of the controlled device in association with the controlled device, wherein the control data acquisition unit acquires the control data from any one of the control data table, an external server, and the controlled device.
- The information processing terminal according to claim 1, comprising, as the communication device that outputs the control command, at least one of a LAN communication interface and a short-range communication interface.
- The information processing terminal according to claim 2, wherein the presentation surface is sheet-shaped, the presentation surface further includes a notation indicating the controlled device, and the notation indicating the controlled device and the notation indicating the desired control operation are written by the user.
- The information processing terminal according to any one of claims 1 to 7, wherein the information processing terminal is a head-mounted display.
- The information processing terminal according to any one of claims 1 to 7, wherein the information processing terminal is a smartphone.
- A remote control method in an information processing terminal that comprises a communication device and remotely controls a controlled device by transmitting a control command to the controlled device via the communication device, the method comprising: a virtual remote controller generation step of identifying, from a surrounding image that is an image of the surroundings of the information processing terminal, the controlled device and a desired control operation that is a desired control operation, and generating a reception object that receives a user's operation instruction for the identified desired control operation; an operation reception step of receiving the operation instruction by the user via the reception object; and a command transmission step of transmitting the control command corresponding to the received operation instruction to the controlled device.
- A program for causing a computer of an information processing terminal that comprises a communication device and remotely controls a controlled device by transmitting a control command to the controlled device via the communication device to realize: a virtual remote controller generation function of identifying, from a surrounding image that is an image of the surroundings of the information processing terminal, the controlled device and a desired control operation that is a desired control operation, and generating a reception object that receives a user's operation instruction for the identified desired control operation; an operation reception function of receiving the operation instruction by the user via the reception object; and a command transmission function of transmitting the control command corresponding to the received operation instruction to the controlled device.
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022534560A JP7515590B2 (ja) | 2020-07-08 | 2020-07-08 | 情報処理端末、遠隔制御方法およびプログラム |
| US18/011,879 US12380698B2 (en) | 2020-07-08 | 2020-07-08 | Information processing terminal and remote control method |
| PCT/JP2020/026710 WO2022009338A1 (ja) | 2020-07-08 | 2020-07-08 | 情報処理端末、遠隔制御方法およびプログラム |
| CN202080102493.3A CN115997388A (zh) | 2020-07-08 | 2020-07-08 | 信息处理终端、远程控制方法以及程序 |
| JP2024106510A JP7755002B2 (ja) | 2020-07-08 | 2024-07-02 | 遠隔制御方法 |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2020/026710 WO2022009338A1 (ja) | 2020-07-08 | 2020-07-08 | 情報処理端末、遠隔制御方法およびプログラム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022009338A1 true WO2022009338A1 (ja) | 2022-01-13 |
Family
ID=79552378
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/026710 Ceased WO2022009338A1 (ja) | 2020-07-08 | 2020-07-08 | 情報処理端末、遠隔制御方法およびプログラム |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US12380698B2 (ja) |
| JP (2) | JP7515590B2 (ja) |
| CN (1) | CN115997388A (ja) |
| WO (1) | WO2022009338A1 (ja) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7727981B1 (ja) * | 2025-01-22 | 2025-08-22 | 木村 力也 | 様々な電化製品の操作ができるリモートコントロールシステム及びそのシステムを備えたリモコン及びロボット及びそのシステムを設定するアプリケーション |
| WO2025182230A1 (ja) * | 2024-02-29 | 2025-09-04 | 株式会社Lixil | 制御装置、水栓システム、ホームシステム、装置診断方法および装置診断プログラム |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2023138873A (ja) * | 2020-08-21 | 2023-10-03 | ソニーグループ株式会社 | 情報処理装置、情報処理システム、情報処理方法及びプログラム |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0918971A (ja) * | 1995-06-28 | 1997-01-17 | Matsushita Electric Ind Co Ltd | 電子機器操作装置 |
| JP2007511128A (ja) * | 2003-11-04 | 2007-04-26 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | タッチスクリーンを有するユニバーサル・リモートコントロール装置 |
| JP2009517949A (ja) * | 2005-11-30 | 2009-04-30 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 汎用リモコン装置の設定 |
| JP2012105131A (ja) * | 2010-11-11 | 2012-05-31 | Sony Corp | サーバ装置、表示操作端末、および遠隔操作システム |
| JP2013172432A (ja) * | 2012-02-23 | 2013-09-02 | Panasonic Corp | 機器制御装置、機器制御方法、機器制御プログラム、及び集積回路 |
Family Cites Families (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4588395B2 (ja) * | 2004-09-24 | 2010-12-01 | Fujitsu Ltd | Information processing terminal |
| US8196055B2 (en) * | 2006-01-30 | 2012-06-05 | Microsoft Corporation | Controlling application windows in an operating system |
| JP4203525B1 (ja) * | 2007-06-13 | 2009-01-07 | 吉田 健治 | Input device for equipment using dot patterns, receiving device for controlled equipment, terminal device, and paper controller |
| KR101444407B1 (ko) * | 2010-11-02 | 2014-09-29 | Electronics and Telecommunications Research Institute | Augmented-reality-based device control apparatus and method using short-range wireless communication |
| JP5681713B2 (ja) | 2011-03-29 | 2015-03-11 | Panasonic IP Management Co., Ltd. | Remote operation system and remote controller |
| WO2013093906A1 (en) * | 2011-09-19 | 2013-06-27 | Eyesight Mobile Technologies Ltd. | Touch free interface for augmented reality systems |
| JP6318740B2 (ja) * | 2014-03-17 | 2018-05-09 | Ricoh Co., Ltd. | Device control system, device control apparatus, device control method, and program |
| JP6500477B2 (ja) * | 2015-02-12 | 2019-04-17 | Seiko Epson Corporation | Head-mounted display device, control system, control method for head-mounted display device, and computer program |
| KR101595957B1 (ko) * | 2014-06-12 | 2016-02-18 | LG Electronics Inc. | Mobile terminal and control system |
| JP6367031B2 (ja) | 2014-07-17 | 2018-08-01 | Tokyo Metropolitan University | Electronic device remote operation system and program |
| US10447785B2 (en) * | 2014-11-17 | 2019-10-15 | LG Electronics Inc. | Digital device and method for controlling same |
| WO2016171512A1 (ko) * | 2015-04-23 | 2016-10-27 | LG Electronics Inc. | Remote control device capable of remotely controlling a plurality of devices |
| KR102383130B1 (ko) * | 2016-01-18 | 2022-04-08 | Samsung Electronics Co., Ltd. | Method for controlling a function and electronic device supporting the same |
| CN107104996B (zh) * | 2016-02-19 | 2021-05-18 | Tencent Technology (Shenzhen) Co., Ltd. | User location verification method and apparatus, and controlled device access method and apparatus |
| JP2017151894A (ja) * | 2016-02-26 | 2017-08-31 | Sony Mobile Communications Inc. | Information processing device, information processing method, and program |
| KR102655584B1 (ko) * | 2017-01-02 | 2024-04-08 | Samsung Electronics Co., Ltd. | Display device and method for controlling the same |
| CN106990894B (zh) * | 2017-03-21 | 2020-08-11 | Beijing Xiaomi Mobile Software Co., Ltd. | Control method and apparatus for smart devices |
| KR102411124B1 (ko) * | 2017-10-27 | 2022-06-21 | Samsung Electronics Co., Ltd. | Electronic device and method for performing tasks using an external device in the electronic device |
| KR102040939B1 (ko) * | 2019-07-15 | 2019-11-27 | Hanwha Techwin Co., Ltd. | Surveillance system and operating method thereof |
| US11328692B2 (en) * | 2019-08-06 | 2022-05-10 | Alexandra Cartier | Head-mounted situational awareness system and method of operation |
2020
- 2020-07-08 US US18/011,879 patent/US12380698B2/en active Active
- 2020-07-08 JP JP2022534560A patent/JP7515590B2/ja active Active
- 2020-07-08 CN CN202080102493.3A patent/CN115997388A/zh active Pending
- 2020-07-08 WO PCT/JP2020/026710 patent/WO2022009338A1/ja not_active Ceased

2024
- 2024-07-02 JP JP2024106510A patent/JP7755002B2/ja active Active
Also Published As
| Publication number | Publication date |
|---|---|
| JP7515590B2 (ja) | 2024-07-12 |
| CN115997388A (zh) | 2023-04-21 |
| US20230245456A1 (en) | 2023-08-03 |
| US12380698B2 (en) | 2025-08-05 |
| JP2024123273A (ja) | 2024-09-10 |
| JPWO2022009338A1 (ja) | 2022-01-13 |
| JP7755002B2 (ja) | 2025-10-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11699271B2 (en) | Beacons for localization and content delivery to wearable devices | |
| JP6256339B2 (ja) | Control device and storage medium | |
| JP7755002B2 (ja) | Remote control method | |
| US10685059B2 (en) | Portable electronic device and method for generating a summary of video data | |
| CN105306084B (zh) | Glasses-type terminal and control method thereof | |
| CN103577102B (zh) | Method, system and apparatus for tagging information about an image | |
| TWI423112B (zh) | Portable virtual input operation device and operation method thereof | |
| CN109088803B (zh) | AR remote control device, smart home remote control system and method | |
| US20170185276A1 (en) | Method for electronic device to control object and electronic device | |
| US20150381885A1 (en) | Glass-type terminal and method for controlling the same | |
| KR101847200B1 (ko) | Object control method and system | |
| US11373650B2 (en) | Information processing device and information processing method | |
| WO2018000200A1 (zh) | Terminal for controlling an electronic device and processing method thereof | |
| CN106662926A (zh) | System and method for gesture interaction in a ubiquitous computing environment | |
| US20180196503A1 (en) | Information processing device, information processing method, and program | |
| US20170083268A1 (en) | Mobile terminal and method of controlling the same | |
| US20210216768A1 (en) | Mobile terminal and method for controlling the same | |
| CN209514548U (zh) | AR search device and article search system based on the AR search device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20944373; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2022534560; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20944373; Country of ref document: EP; Kind code of ref document: A1 |
| | WWG | Wipo information: grant in national office | Ref document number: 18011879; Country of ref document: US |