WO2024154297A1 - Control device, display system, control method, and recording medium
- Publication number: WO2024154297A1 (application PCT/JP2023/001534)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- aerial image
- touch panel
- user
- optical element
- Prior art date
- Legal status: Ceased
Classifications
- G — PHYSICS; G06 — COMPUTING OR CALCULATING; COUNTING; G06F — ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03 — Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042 — Digitisers characterised by opto-electronic transducing means
- G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484 — GUI interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0487 — GUI interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488 — GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- This disclosure relates to a control device, a display system, a control method, and a recording medium.
- Non-contact displays that project images onto a screen in mid-air are known (see, for example, Patent Documents 1 to 3).
- In such a display, a sensor detects operations performed by the user on the mid-air image.
- One example of the objective of this disclosure is to provide a control device or the like that enables various operations to be performed when using a non-contact display.
- the control device includes a receiving means for receiving, as different operations, an operation on an aerial image detected by a sensor included in a non-contact display and an operation on a touch panel superimposed on a portion of the surface of an optical element included in the non-contact display; a processing means for performing processing according to the operation on the aerial image when an operation on the aerial image is received, and processing according to the operation on the touch panel when an operation on the touch panel is received; and an output control means for controlling display on a display included in the non-contact display based on a processing result of the processing means.
- a display system comprising a non-contact display and a control device
- the non-contact display includes a touch panel, an optical element, a display, and a sensor
- the touch panel is disposed over a portion of a surface of the optical element
- the touch panel detects an operation on the touch panel itself
- the sensor detects an operation on the display and an aerial image formed by the optical element
- the control device comprises a receiving means for receiving an operation on the aerial image and an operation on the touch panel as different operations; a processing means for performing processing according to the operation on the aerial image when an operation on the aerial image is received, and processing according to the operation on the touch panel when an operation on the touch panel is received; and an output control means for controlling the display on the display based on the processing result of the processing means.
- the control device includes a receiving means for receiving an operation on an aerial image detected by a sensor included in a non-contact display installed in a vehicle and an operation on a touch panel provided over a portion of the surface of an optical element included in the non-contact display; a processing means for performing a process corresponding to the operation on the aerial image when an operation on the aerial image is received, and a process corresponding to the operation on the touch panel when an operation on the touch panel is received; and an output control means for controlling a display on a display included in the non-contact display based on a processing result of the processing means. The optical element, the display, and the touch panel are installed so that an operation surface of the touch panel and a surface of the optical element form an acute angle with a display surface of the display; the operation on the aerial image is an operation by a passenger in a front passenger seat of the vehicle, and the operation on the touch panel is an operation by a driver of the vehicle; and, while the vehicle is traveling, the receiving means receives an operation on the aerial image and does not receive an operation on the touch panel.
- a control method accepts, as different operations, an operation on an aerial image detected by a sensor included in a non-contact display and an operation on a touch panel superimposed on a portion of the surface of an optical element included in the non-contact display; when an operation on the aerial image is accepted, performs processing according to the operation on the aerial image; when an operation on the touch panel is accepted, performs processing according to the operation on the touch panel; and controls the display on a display included in the non-contact display based on the processing results.
- a control method includes receiving an operation on an aerial image detected by a sensor included in a non-contact display installed in a vehicle and an operation on a touch panel provided over a portion of a surface of an optical element included in the non-contact display;
- processing is performed in accordance with the operation on the aerial image
- processing is performed in accordance with the operation on the touch panel
- display on a display included in the non-contact display is controlled, the optical element, the display, and the touch panel are installed so that an operation surface of the touch panel and a surface of the optical element form an acute angle with a display surface of the display
- the operation on the aerial image is an operation by a passenger in the front passenger seat of the vehicle
- the operation on the touch panel is an operation by the driver of the vehicle, and in the acceptance, while the vehicle is traveling, an operation on the aerial image is accepted, and an operation on the touch panel is not accepted.
- a program in one aspect of the present disclosure causes a computer to accept, as different operations, an operation on an aerial image detected by a sensor included in a non-contact display and an operation on a touch panel superimposed on a portion of the surface of an optical element included in the non-contact display; to perform processing corresponding to the operation on the aerial image when an operation on the aerial image is accepted; to perform processing corresponding to the operation on the touch panel when an operation on the touch panel is accepted; and to control display on a display included in the non-contact display based on the processing results.
- a program causes a computer to execute a process that receives an operation on an aerial image detected by a sensor included in a non-contact display installed in a vehicle and an operation on a touch panel provided over a portion of a surface of an optical element included in the non-contact display; that performs a process corresponding to the operation on the aerial image when an operation on the aerial image is received, and a process corresponding to the operation on the touch panel when an operation on the touch panel is received; and that controls display on a display included in the non-contact display based on a result of the process. The optical element, the display, and the touch panel are installed such that an operating surface of the touch panel and a surface of the optical element form an acute angle with a display surface of the display; the operation on the aerial image is an operation by a passenger in a front passenger seat of the vehicle, and the operation on the touch panel is an operation by a driver of the vehicle; and, while the vehicle is traveling, the receiving process receives an operation on the aerial image and does not receive an operation on the touch panel.
- Each program may be stored on a non-transitory computer-readable recording medium.
- FIG. 1 is an explanatory diagram illustrating an example of a display system.
- FIGS. 2A and 2B are explanatory diagrams (parts 1 and 2) showing the aerial display in a simplified manner.
- FIG. 3 is an explanatory diagram illustrating an example of a hardware configuration of a computer.
- FIG. 4 is a block diagram showing a configuration example of a control device according to the first embodiment.
- FIG. 5 is an explanatory diagram showing a first installation example of an aerial display used by multiple users.
- FIG. 6 is an explanatory diagram showing a second installation example of an aerial display used by multiple users.
- FIG. 7 is an explanatory diagram showing an example in which a user is imaged for the purpose of correcting an operation position.
- FIG. 8 is an explanatory diagram showing an example of a screen for correcting a standing position.
- FIG. 9 is a flowchart showing an example of an operation of the control device when used by a plurality of users.
- FIG. 10 is an explanatory diagram showing an example of installation in which one user operates both the touch panel and the aerial image.
- FIG. 11A is an explanatory diagram showing an example in which an operation is determined to be an operation on an aerial image, and FIG. 11B is an explanatory diagram showing an example in which an operation is determined not to be an operation on an aerial image.
- FIG. 12 is an explanatory diagram showing an example of installation of an imaging device for detecting the position of a user's hand.
- FIG. 13 is an explanatory diagram showing an example in which the aerial display is used as a POS terminal.
- FIG. 14 is a flowchart showing an example of an operation of the control device when used by a plurality of users.
- FIG. 15 is an explanatory diagram showing an example of mirror placement when an aerial image is superimposed on an image reflected in the mirror.
- FIG. 16 is an explanatory diagram showing an example of a superimposed display.
- FIG. 17 is an explanatory diagram showing an example of installation of an imaging device for making the display follow the movement of a user.
- FIG. 18 is an explanatory diagram showing an example of making a display follow the movement of a user.
- FIG. 19 is an explanatory diagram showing an example of the installation of another display when an aerial image is superimposed on an image shown on the other display.
- FIG. 20 is an explanatory diagram showing an example of how an object is placed when an aerial image is superimposed on the object.
- FIG. 21 is an explanatory diagram showing an example in which an aerial image is superimposed on an object.
- FIG. 22 is a block diagram showing a configuration example of a control device according to a second embodiment.
- FIG. 23 is a flowchart showing an example of an operation of the control device according to the second embodiment.
- The control device, display system, control method, program, and non-transitory recording medium recording the program according to the present disclosure will be described in detail.
- the disclosed technology is not limited to these embodiments.
- an aerial display is described as an example of a non-contact display, but the non-contact display is not limited to this example.
- (Embodiment 1) FIG. 1 is an explanatory diagram showing an example of a display system.
- the display system 1 includes an aerial display 10 and a control device 11.
- the control device 11 controls the aerial display 10.
- the control device 11 accepts user operations via the aerial display 10.
- the control device 11 also causes the aerial display 10 to display information.
- the aerial display 10 and the control device 11 are connected via a communication network NT.
- the display system 1 may further include other devices.
- the display system 1 may include an imaging device that captures an image of the user.
- the control device 11 and the imaging device are connected via a communication network NT.
- the communication network NT that connects the imaging device and the control device 11 and the communication network NT that connects the aerial display 10 and the control device 11 may be the same or different, and are not particularly limited.
- FIGS. 2A and 2B are explanatory diagrams showing the aerial display 10 in a simplified manner.
- the aerial display 10 includes, for example, an optical element 101, a display 102, a sensor 105, and a touch panel 103.
- the X-axis, Y-axis, and Z-axis are defined for ease of explanation.
- the plane of the X-axis and Y-axis is, for example, approximately parallel to the surface of the optical element 101 and the operation surface of the touch panel 103.
- the aerial display 10 is shown as viewed from the positive direction of the X-axis.
- the aerial display 10 has display directionality. Therefore, when a user views the aerial display 10 from the positive direction of the X-axis, the aerial image 104 cannot actually be seen, but the aerial image 104 is illustrated for ease of understanding.
- the optical element 101 passes the light emitted by the display 102 and forms an identical image on the side opposite the display 102.
- this image is the aerial image 104.
- the sensor 105 is used to operate the aerial image 104.
- the sensor 105 is, for example, a motion sensor or a 3D (three-dimensional) sensor.
- the angle between the display 102 and the optical element 101 may be controllable. This changes the angle between the aerial image 104 and the optical element 101, making it possible to change the orientation of the aerial image 104.
- the display 102 may be installed so that the optical element 101 and the aerial image 104 are approximately parallel.
- the touch panel 103 is installed on at least a part of the surface of the optical element 101.
- the operation surface of the touch panel 103 is approximately parallel to the surface of the optical element 101.
- the touch panel 103 may be attached to at least a part of the optical element 101.
- the touch panel 103 may be integrated with a protective glass for the optical element 101.
- the touch panel 103 may be embedded in the optical element 101.
- the aerial display 10 may also have a vibration generating device that generates vibrations.
- the vibration generating device may generate vibrations in response to an operation under the control of the control device 11.
- the device that generates vibrations in response to an operation on the aerial image 104 and the device that generates vibrations in response to an operation on the touch panel 103 may be different devices.
- the aerial display 10 may also have an audio output device that outputs audio.
- the audio output device may output audio in response to an operation under the control of the control device 11.
- the device that outputs audio in response to an operation on the aerial image 104 and the device that outputs audio in response to an operation on the touch panel 103 may be different devices.
- FIG. 3 is an explanatory diagram showing an example of the hardware configuration of a computer.
- some or all of the devices can be realized using any combination of a computer 80 and a program as shown in FIG. 3.
- the computer 80 has, for example, a processor 801, a ROM (Read Only Memory) 802, a RAM (Random Access Memory) 803, and a storage device 804.
- the computer 80 also has a communication interface 805 and an input/output interface 806.
- Each component is connected to the others via, for example, a bus 807. Note that the number of each component is not particularly limited; there may be one or more of each.
- the processor 801 controls the entire computer 80.
- the processor 801 may be, for example, a CPU (Central Processing Unit), a DSP (Digital Signal Processor), a GPU (Graphics Processing Unit), an MPU (Micro Processing Unit), an FPU (Floating point number Processing Unit), a PPU (Physics Processing Unit), a TPU (Tensor Processing Unit), a quantum processor, a microcontroller, or a combination of these.
- the computer 80 also has a ROM 802, a RAM 803, and a storage device 804 as storage units.
- Examples of the storage device 804 include semiconductor memory such as flash memory, a hard disk drive (HDD), and a solid state drive (SSD).
- the storage device 804 stores an OS (operating system) program, application programs, and programs according to each embodiment.
- the ROM 802 stores application programs and programs according to each embodiment.
- the RAM 803 is used as a work area for the processor 801.
- the processor 801 loads programs stored in the storage device 804, the ROM 802, etc., and executes each process coded in the programs. The processor 801 may also download various programs via the communication network NT. The processor 801 functions as part or all of the computer 80, and may execute the processes or instructions in the illustrated flowcharts based on the programs.
- the communication interface 805 is connected to a communication network NT, such as a LAN (Local Area Network) or a WAN (Wide Area Network), via a wireless or wired communication line.
- the communication network NT may be composed of multiple communication networks NT.
- the computer 80 is connected to an external device or an external computer 80 via the communication network NT.
- the communication interface 805 serves as an interface between the communication network NT and the inside of the computer 80.
- the communication interface 805 also controls the input and output of data from the external device or the external computer 80.
- the input/output interface 806 is connected to at least one of an input device, an output device, and an input/output device.
- the connection method may be wireless or wired.
- Examples of the input device include a keyboard, a mouse, and a microphone.
- Examples of the output device include a display device, a lighting device, and an audio output device that outputs audio.
- Examples of the input/output device include a touch panel display.
- the input device, output device, and input/output device may be built into the computer 80 or may be external.
- Computer 80 may have some of the components shown in FIG. 3, and may have components other than those shown in FIG. 3. For example, computer 80 may have a drive device or the like, and processor 801 may read out programs and data stored in a recording medium attached to the drive device or the like to RAM 803. Examples of non-transitory tangible recording media include optical disks, flexible disks, magneto-optical disks, and USB (Universal Serial Bus) memories. Also, as described above, computer 80 may have input devices such as a keyboard and a mouse, and an output device such as a display 102.
- FIG. 4 is a block diagram showing an example of the configuration of the control device 11 according to the first embodiment.
- the control device 11 includes a reception unit 111, a processing unit 112, an output control unit 113, a correction unit 114, a determination unit 115, and a detection unit 116.
- the reception unit 111 receives an operation on the aerial image 104 detected by the sensor 105 and an operation on the touch panel 103. For example, the reception unit 111 may receive these as different operations. There is no particular limitation on how the operation on the aerial image 104 differs from the operation on the touch panel 103. Taking a button operation as an example, the two operations being different may mean that, even if the same button is pressed, the processing performed by the processing unit 112 after the button is pressed differs.
- when the processing unit 112 receives an operation on the aerial image 104, it performs processing according to the operation on the aerial image 104. Likewise, when the processing unit 112 receives an operation on the touch panel 103, it performs processing according to the operation on the touch panel 103.
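The division of roles between the reception unit 111 and the processing unit 112 can be pictured as a small dispatcher. The following Python sketch is illustrative only: the `Operation` type, its `source` field, and the handler names are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

AERIAL = "aerial_image"   # operation detected by the sensor 105
TOUCH = "touch_panel"     # operation detected by the touch panel 103

@dataclass
class Operation:
    source: str   # AERIAL or TOUCH
    x: float      # operation position
    y: float

def process_aerial(op: Operation) -> str:
    # processing according to an operation on the aerial image
    return f"aerial-image action at ({op.x}, {op.y})"

def process_touch(op: Operation) -> str:
    # processing according to an operation on the touch panel
    return f"touch-panel action at ({op.x}, {op.y})"

def receive(op: Operation) -> str:
    # The two sources are treated as different operations, so the same
    # position can trigger different processing.
    if op.source == AERIAL:
        return process_aerial(op)
    return process_touch(op)

print(receive(Operation(AERIAL, 10, 20)))  # aerial-image action
print(receive(Operation(TOUCH, 10, 20)))   # touch-panel action
```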
- the output control unit 113 also changes the color of the position where an operation on the aerial image 104 has been performed.
- the output control unit 113 also changes the color of the position where an operation on the touch panel 103 has been performed.
- the output control unit 113 may keep changing the color of the operated position on the touch panel 103 so that it follows the touched position until the finger is removed.
- the output control unit 113 may return the color of the changed position to its original color when the finger is removed.
- the color of the position where an operation on the aerial image 104 has been performed and the color of the position where an operation on the touch panel 103 has been performed may be different. This allows the user to distinguish whether they are operating the aerial image 104 or the touch panel 103 by the color.
- the output control unit 113 may also control audio output.
- the output control unit 113 may control the sound output when operating the aerial image 104 to be different from the sound output when operating the touch panel 103. This allows each user to determine by sound whether they are operating the aerial image 104 or the touch panel 103.
- the output control unit 113 may also control the generation of vibrations.
- the output control unit 113 may control the vibrations generated when operating the aerial image 104 to be different from the vibrations generated when operating the touch panel 103.
- the vibrations generated when operating the aerial image 104 may be aerial haptics using ultrasound or the like.
- the vibrations generated when operating the touch panel 103 may be an existing technology such as the haptics of a smartphone or the like. It is only necessary that the vibrations generated when operating the aerial image 104 and the vibrations generated when operating the touch panel 103 are different enough to be distinguishable. This allows each user to determine whether they are operating the aerial image 104 or the touch panel 103 by the vibrations.
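The per-channel feedback described above (color, sound, and vibration differing between the two operation types) amounts to a lookup keyed by operation source. A minimal sketch under that assumption; the concrete colors, sound file names, and haptics labels below are placeholders, not values from the disclosure.

```python
# Feedback chosen by operation source, so the user can tell by color,
# sound, or vibration whether the aerial image or the touch panel
# was operated. All values are illustrative placeholders.
FEEDBACK = {
    "aerial_image": {
        "highlight_color": "#00a0ff",
        "sound": "chime_a.wav",
        "vibration": "ultrasound_aerial_haptics",
    },
    "touch_panel": {
        "highlight_color": "#ffa000",
        "sound": "chime_b.wav",
        "vibration": "panel_haptics",
    },
}

def feedback_for(source: str) -> dict:
    return FEEDBACK[source]

print(feedback_for("aerial_image")["highlight_color"])  # "#00a0ff"
```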
- FIG. 5 is an explanatory diagram showing an installation example 1 of the aerial display 10 when used by multiple users.
- the surface of the optical element 101 of the aerial display 10 and the operation surface of the touch panel 103 are installed approximately parallel to a wall, a vehicle dashboard, or the like.
- in FIG. 5, the users are viewed from above, so their heads are shown.
- the display 102 is installed so that the operation surface of the touch panel 103 and the operation surface of the aerial image 104 are not parallel.
- the optical element 101, the display 102, and the touch panel 103 are installed so that the operation surface of the touch panel 103 and the surface of the optical element 101 form an acute angle with the display surface of the display 102.
- the first user views the display 102 via the aerial image 104. Meanwhile, the second user views the display 102 directly.
- the orientation of the screens seen when the first user and the second user view the aerial image 104 and the display 102 from the front will be the same.
- the aerial image 104 may appear brighter than the display 102.
- the angle that the display 102 forms with the optical element 101 and the touch panel 103 may be determined based on the position of the driver's face.
- the position of the driver's face may be the position of an average driver's face or the position of an individual driver's face.
- the position of the individual driver's face may be detected from an image captured by an imaging device installed in the vehicle, for example. Specifically, for example, the imaging device captures an image of the driver's face. Then, the detection unit 116 detects the position of the driver's face from the captured image.
- the display system 1 may determine the angle between the display 102 and the optical element 101 based on the detected position of the face, and control the position of the display 102 to be the determined angle.
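One way to derive such an angle is simple trigonometry from the detected face position. The sketch below assumes a flat coordinate system anchored at the panel; it is an illustrative calculation, not the patent's (unspecified) method.

```python
import math

def display_angle_deg(face_x: float, face_z: float,
                      panel_x: float = 0.0, panel_z: float = 0.0) -> float:
    """Angle that points the display surface toward the driver's face.

    face_x / face_z: detected face position in a panel-anchored
    coordinate system (x: lateral distance, z: height above the touch
    panel plane). The geometry is illustrative; the disclosure only
    states that the angle may be determined from the face position.
    """
    dx = face_x - panel_x
    dz = face_z - panel_z
    return math.degrees(math.atan2(dz, dx))

# Example: a face 0.5 m to the side of and 0.3 m above the panel plane.
print(round(display_angle_deg(0.5, 0.3), 1))  # 31.0 degrees
```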
- the reception unit 111 receives operations on the aerial image 104 and operations on the touch panel 103.
- the operations on the touch panel 103 may be operations by the driver of the vehicle, and the operations on the aerial image 104 may be operations by a passenger other than the driver.
- the passenger other than the driver may be, for example, a passenger sitting in the passenger seat.
- the reception unit 111 receives operations on the touch panel 103 and receives operations on the aerial image 104 when the vehicle is not in motion.
- the reception unit 111 does not receive operations on the touch panel 103 and receives operations on the aerial image 104 when the vehicle is in motion.
- the reception unit 111 may not receive operations on the touch panel 103, which are operations by the driver, but may receive operations on the aerial image 104, which are operations by passengers other than the driver. This allows the driver and the passenger in the front passenger seat to operate the same screen content using the touch panel 103 and the aerial image 104, respectively. The driver cannot operate the device while the vehicle is in motion, but the passenger in the front passenger seat can operate it even while the vehicle is in motion.
- General touch panels used as display devices for car navigation devices are controlled so that they cannot be operated while driving, in order to prevent a decrease in safety. This control is performed to prevent the driver from operating the device while driving. However, since it is difficult to distinguish whether the person in the passenger seat or the driver is operating the device, there are cases where touch panel operations are rejected altogether, regardless of who is operating the device.
- When the aerial display 10 is installed in a vehicle as a display device as shown in FIG. 5, the driver cannot see the aerial image 104 and has difficulty operating it. Therefore, by using the aerial display 10 including the touch panel 103, the person in the passenger seat can operate the aerial image 104 while the vehicle is driving, but the driver cannot operate the touch panel 103. The passenger in the passenger seat can thus operate the device while safety is maintained during driving, so various operations can be performed.
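The gating rule described here (while the vehicle is traveling, aerial-image operations are received and touch-panel operations are not) reduces to a two-input predicate. A minimal sketch, with hypothetical source labels:

```python
def accept_operation(source: str, vehicle_moving: bool) -> bool:
    """Return True if the operation should be received.

    While the vehicle is traveling, operations on the touch panel
    (the driver's side) are rejected, and operations on the aerial
    image (the front passenger's side) are still received.
    """
    if vehicle_moving and source == "touch_panel":
        return False
    return True

print(accept_operation("touch_panel", vehicle_moving=True))   # False
print(accept_operation("aerial_image", vehicle_moving=True))  # True
print(accept_operation("touch_panel", vehicle_moving=False))  # True
```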
- FIG. 6 is an explanatory diagram showing a second installation example of the aerial display 10 when used by multiple users.
- the aerial display 10 may be installed on a stand or a wall so that the surface of the optical element 101 of the aerial display 10 and the operation surface of the touch panel 103 are approximately parallel to the plane of the stand or the plane of the wall.
- the display 102 is installed so that the operation surface of the touch panel 103 and the operation surface of the aerial image 104 are not parallel.
- the reception unit 111 receives an operation on the aerial image 104 by a first user and an operation on the touch panel 103 by a second user.
- the first user may be a main user
- the second user may be a sub-user.
- the first user may be a sub-user
- the second user may be a main user.
- the first user sees the display on the display 102 via the aerial image 104. Meanwhile, the second user sees the display 102 directly. By setting it up as shown in FIG. 6, the second user sees a screen that is inverted from the screen seen by the first user.
- the aerial display 10 may be installed in a store.
- the aerial display 10 may be used as a display device for performing processes such as product registration, product settlement, and product search in the store.
- the first user who operates the aerial image 104 is a customer
- the second user who operates the touch panel 103 is a store clerk.
- the customer and the store clerk can view the same screen content on the aerial image 104 and the direct display 102, respectively.
- the image viewed by the store clerk is an inverted image of the aerial image 104.
- the reception unit 111 receives an operation by a customer on the aerial image 104 and an operation by a store clerk on the touch panel 103 as different operations.
- the operation by a customer may be an operation related to product registration, an operation related to payment processing, or an operation related to product search.
- the operation by the store clerk may be an operation to support the customer's operation, or an operation to perform processing exclusive to the store clerk.
- the operation by a customer on the aerial image 104 and the operation by a store clerk on the touch panel 103 being different may be, for example, that even if the same button is pressed, the processing by the processing unit 112 after the button is pressed is different.
- the operation by a customer on the aerial image 104 and the operation by a store clerk on the touch panel 103 being different may be a button that is only valid on one side.
- the screen may have a store clerk button, and the reception unit 111 may not accept the store clerk button as an operation when the store clerk button is touched on the aerial image 104, but may accept the store clerk button as an operation when the store clerk button is touched on the touch panel 103.
- the processing unit 112 may transition from the customer mode to the store clerk mode.
- the reception unit 111 receives an operation on the aerial image 104 and receives an operation on the touch panel 103.
- the reception unit 111 may not receive an operation on the aerial image 104, but may receive an operation on the touch panel 103. More specifically, as an example of transitioning to the store clerk mode, when the reception unit 111 receives a touch on the touch panel 103, the output control unit 113 displays a screen on which a store clerk code can be input. Then, the reception unit 111 may receive an input of a store clerk code as the predetermined operation. Then, when the processing unit 112 receives an input of the store clerk code, it may transition to the store clerk mode.
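The customer-to-clerk mode transition described above can be modeled as a small state machine. All names and the clerk-code check below are illustrative assumptions; the disclosure only says that a clerk-code entry screen is shown after a touch on the touch panel and that the mode changes once the code is input.

```python
class ModeController:
    """Minimal sketch of the customer/store-clerk mode transition.

    A touch on the touch panel brings up a clerk-code entry screen;
    entering a valid code switches to clerk mode, in which operations
    on the aerial image are no longer received. The code check is a
    placeholder; the disclosure does not specify how codes are
    validated.
    """

    def __init__(self, valid_codes: set[str]):
        self.mode = "customer"
        self.valid_codes = valid_codes
        self.code_entry_visible = False

    def on_touch_panel_touched(self) -> None:
        # Output control: display a screen where a clerk code can be input.
        self.code_entry_visible = True

    def on_code_entered(self, code: str) -> None:
        if self.code_entry_visible and code in self.valid_codes:
            self.mode = "clerk"

    def accepts(self, source: str) -> bool:
        if self.mode == "clerk":
            return source == "touch_panel"  # aerial-image operations ignored
        return True

ctrl = ModeController({"1234"})
ctrl.on_touch_panel_touched()
ctrl.on_code_entered("1234")
print(ctrl.mode, ctrl.accepts("aerial_image"))  # clerk False
```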
- the first user who operates the aerial image 104 may be a store clerk, and the second user who operates the touch panel 103 may be a customer.
- the processing unit 112 may transition from the customer mode to the store clerk mode.
- the reception unit 111 receives an operation on the touch panel 103 and receives an operation on the aerial image 104.
- the reception unit 111 may not receive an operation on the touch panel 103, but may receive an operation on the aerial image 104.
- the position where the user operates on the touch panel 103 may deviate from the position where the user actually wants to operate. Also, the position where the user operates on the aerial image 104 may deviate from the position where the user actually wants to operate. For this reason, the operated position may be corrected. An example of the correction of the operation position will be described with reference to FIG. 7.
- FIG. 7 is an explanatory diagram showing an example in which a user is imaged to correct the operation position.
- there is a distance between the display 102 and the touch panel 103; that is, there is a gap between them. Since the positional relationship between the display 102 and the touch panel 103 varies depending on the position of the user's viewpoint, when an operation is performed on the touch panel 103 against the display on the display 102, the operation position may be shifted. In other words, the position on the touch panel 103 corresponding to the area of the display 102 that the second user is gazing at changes depending on the position of the eyes of the second user.
- the imaging device 31 is installed so that it can image the second user.
- the detection unit 116 detects the position of the face of the second user from the image of the second user captured by the imaging device 31.
- the correction unit 114 corrects the position of the operation on the accepted touch panel 103 based on the position of the face of the second user. Specifically, for example, the correction unit 114 calculates a correction amount between the position of an operation on the touch panel 103 and the position of the display on the display 102 based on the position of the face of the second user.
- the reception unit 111 receives an operation on the touch panel 103
- the correction unit 114 corrects the position of the received operation by the correction amount.
- the processing unit 112 performs processing as if the operation was performed at the corrected position.
- the position where the user operates on the aerial image 104 may differ from the position where the user actually wants to operate.
- the imaging device 37 detects the position of the hand of the first user.
- the imaging device 37 may be, for example, a 3D camera.
- the correction unit 114 corrects the position of the operation on the aerial image 104 based on the position of the hand of the first user detected by the imaging device 37.
- the correction unit 114 calculates the amount of correction between the position of the operation on the aerial image 104 and the display position on the display 102 based on the position of the hand of the first user.
- the reception unit 111 receives an operation on the aerial image 104
- the correction unit 114 corrects the position of the operation on the aerial image 104 based on the amount of correction.
- the processing unit 112 processes the operation as an operation at the corrected position.
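The correction amount for the touch panel case can be derived with similar triangles between the viewpoint, the panel, and the display behind it. A minimal sketch under the simplifying assumption of parallel panel and display planes (the actual installation places them at an angle, so the real computation would be more involved); all names are illustrative.

```python
def corrected_point(touch_x: float, eye_x: float, eye_z: float,
                    gap: float) -> float:
    """Project a touch on the panel onto the display behind it.

    The touch panel lies at z = 0 and the display surface at z = -gap,
    both treated as parallel planes for this sketch. The intended point
    is where the line from the eye through the touched point meets the
    display plane, found by similar triangles. One axis is shown; the
    y axis is analogous.
    """
    # Extend the eye-to-touch line by `gap` below the panel plane.
    return touch_x + (touch_x - eye_x) * gap / eye_z

# Eye 40 cm above the panel and 10 cm to the left of the touched point,
# display 5 cm behind the panel: the intended point is shifted right.
print(corrected_point(touch_x=10.0, eye_x=0.0, eye_z=40.0, gap=5.0))
# 11.25
```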
- FIG. 8 is an explanatory diagram showing an example of a screen for correcting a standing position.
- the first user may change his/her standing position so that the desired display is visible.
- the first user is asked to change his/her standing position to a position where a number is placed inside a circle.
- the second user may be asked to change his/her position so that the desired display is visible.
- (Flowchart) FIG. 9 is a flowchart showing an example of the operation of the control device 11 when multiple users use the device.
- the output control unit 113 controls the display on the display 102 (step S101).
- the reception unit 111 receives an operation on the aerial image 104 by the first user (step S102).
- the processing unit 112 performs processing according to the operation (step S103), and the process returns to step S101.
- the reception unit 111 receives an operation on the touch panel 103 by the second user (step S104).
- the processing unit 112 performs processing according to the operation on the touch panel 103 (step S105), and the process returns to step S101.
- the flowchart can be ended at any time.
- FIG. 10 is an explanatory diagram showing an example of installation when one user operates both the touch panel 103 and the aerial image 104.
- the aerial display 10 is installed on a table such as a desk so that the surface of the optical element 101 of the aerial display 10 and the operating surface of the touch panel 103 are approximately parallel to the plane of the table.
- the display 102 is installed so that the operation surface of the touch panel 103 and the operation surface of the aerial image 104 are not parallel. However, when one user operates the aerial display 10, the display 102 may be installed so that the operation surface of the touch panel 103 and the operation surface of the aerial image 104 are approximately parallel.
- the determination unit 115 may determine whether the operation is directed to the aerial image 104 based on the position of the hand, for example.
- FIG. 11A is an explanatory diagram showing an example in which an operation is determined to be an operation on the aerial image 104.
- FIG. 11B is an explanatory diagram showing an example in which an operation is determined not to be an operation on the aerial image 104.
- the sensor 105 is a 3D sensor, and the sensor 105 may detect the position of the hand.
- the determination unit 115 determines whether the user's hand is on the optical element 101 side beyond the aerial image 104 based on the position of the user's hand detected by the sensor 105.
- if the hand is not on the optical element 101 side, as shown in FIG. 11A, the reception unit 111 receives the operation on the aerial image 104.
- as shown in FIG. 11B, when the hand is on the optical element 101 side, the reception unit 111 does not receive the operation on the aerial image 104.
- when there is no hand on the optical element 101 side, the reception unit 111 does not receive operations on the touch panel 103. Also, as shown in FIG. 11B, when there is a hand on the optical element 101 side, the reception unit 111 does not receive operations on the aerial image 104. This makes it possible to prevent operations on the touch panel 103 by other users from being received while a user is performing an operation on the aerial image 104.
- in this case, the reception unit 111 may receive operations on the touch panel 103.
- the method for detecting the hand position is not particularly limited and may be detection by the sensor 105 or detection from an image captured by an imaging device.
- FIG. 12 is an explanatory diagram showing an example of the installation of an imaging device for detecting the position of the user's hand.
- the sensor 105 is installed at hand, whereas the imaging device 38 is installed in a position where it can capture an image of the periphery of the aerial image 104.
- the imaging device 38 may be an imaging device that includes a function capable of measuring distance information.
- the imaging device 38 captures an image of the user's hand.
- the dashed line shows an example of the range captured by the imaging device 38.
- the detection unit 116 detects the position of the user's hand from the image of the user captured by the imaging device 38.
- the determination unit 115 determines whether the user's hand is beyond the aerial image 104 and on the optical element 101 side based on the position of the user's hand. If the hand is not on the optical element 101 side, the reception unit 111 accepts an operation on the aerial image 104. If the hand is on the optical element 101 side, the reception unit 111 does not accept an operation on the aerial image 104.
- the position of the finger may be detected in more detail.
- the determination unit 115 determines whether the user's finger is located beyond the aerial image 104 on the optical element 101 side based on the detected position of the user's finger. If there is no finger on the optical element 101 side, the reception unit 111 accepts an operation on the aerial image 104. If there is a finger on the optical element 101 side, the reception unit 111 does not accept an operation on the aerial image 104.
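The determination of whether the hand (or finger) has passed beyond the aerial image reduces to comparing its depth with the position of the image plane. A minimal sketch, assuming a coordinate axis perpendicular to the optical element; the coordinates and margin are illustrative assumptions.

```python
def classify_hand(hand_z: float, image_z: float, margin: float = 0.0) -> str:
    """Decide whether the hand has passed beyond the aerial image.

    Coordinates follow the figures: the optical element is at z = 0 and
    the aerial image plane at z = image_z (> 0); the user reaches in
    from z > image_z. If the hand is between the optical element and
    the image plane, it is treated as being on the optical element
    side, and the operation is not received as an aerial-image
    operation. `margin` is an illustrative tolerance.
    """
    if hand_z < image_z - margin:
        return "optical_element_side"   # reject aerial-image operation
    return "aerial_image_operation"     # accept aerial-image operation

print(classify_hand(hand_z=0.12, image_z=0.10))  # aerial_image_operation
print(classify_hand(hand_z=0.05, image_z=0.10))  # optical_element_side
```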
- a left click operation of a mouse and a right click operation of a mouse may be realized by an operation on the aerial image 104 and an operation on the touch panel 103.
- the operation on the aerial image 104 may correspond to a left click operation of a mouse.
- the operation on the aerial image 104 is an operation corresponding to a specified position on the aerial image 104.
- the operation on the touch panel 103 may correspond to a right click operation of a mouse.
- the operation on the touch panel 103 may be at least one of an operation to instruct the display of a menu for performing an operation related to the position where the operation on the touch panel 103 was performed, an operation to instruct the display of a menu for making settings related to the position where the operation on the touch panel 103 was performed, and an operation to instruct the display of information related to the position where the operation on the touch panel 103 was performed.
- the general-purpose operation and the detailed operation may be realized by an operation on the aerial image 104 and an operation on the touch panel 103.
- the general-purpose operation may be, for example, an operation that is common to all users, and the detailed operation may be an operation with a different target depending on the user.
- since an operation on the touch panel 103 has higher operability than an operation on the aerial image 104, the operation on the aerial image 104 may be a general-purpose operation, and the operation on the touch panel 103 may be a detailed operation.
- the general-purpose operation may be a selection operation by the user touching, and the detailed operation may be an operation such as writing characters using a finger or a pen.
- the detailed operation may be an operation of writing characters such as signing.
- normal operations and special operations may be realized by operations on the aerial image 104 and operations on the touch panel 103.
- since operations on the touch panel 103 are easier to perform than operations on the aerial image 104, the operations on the aerial image 104 may be normal operations, and the operations on the touch panel 103 may be special operations.
- a normal operation is an operation to register a product
- a special operation is, for example, an operation to cancel a product registration that has been made.
- important operations and unimportant operations may be realized by operations on the aerial image 104 and operations on the touch panel 103.
- Operations on the touch panel 103 are easier to perform than operations on the aerial image 104, so the operations on the aerial image 104 may be unimportant operations and the operations on the touch panel 103 may be important operations.
- operations that lead to payment such as operations to instruct the execution of payment, cannot be easily undone and involve the transfer of money, so they can be considered to be important operations.
- FIG. 13 is an explanatory diagram showing an example in which the aerial display 10 is used as a POS (Point Of Sales) terminal.
- FIG. 13 shows an example in which the aerial display 10 shown in FIGS. 2A and 2B is also used as a self-service POS terminal that can be operated by customers.
- FIG. 13 shows a reader 32 that reads barcodes and a reader 33 that reads credit cards, along with the aerial display 10.
- a confirm button for confirming the payment and a back button for returning to the process before the payment are displayed on the aerial image 104.
- the back button can be operated by operating the aerial image 104
- the confirm button can be operated by operating the touch panel 103.
- operations related to products may be operations on the touch panel 103
- operations related to shopping other than products may be operations on the aerial image 104.
- Operations related to products may be, for example, operations for registering products.
- an operation for registering a product may be a decision operation for registering, in a product purchase list, a product whose product code has been read, or an operation for inputting the quantity of a product whose product code has been obtained, and is not particularly limited.
- an operation such as deciding to ship may be an operation on the touch panel 103
- operations other than shipping may be operations on the aerial image 104.
- Operations other than shipping may be, for example, an operation for inputting the delivery destination or address.
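These button and operation assignments amount to a static mapping from each operation to the input surface on which it is valid. A minimal sketch; the button names are illustrative assumptions, not from the disclosure.

```python
# Illustrative mapping of POS operations to the two input surfaces:
# the confirm (payment) button responds only to the touch panel, the
# back button only to the aerial image, and so on.
BUTTON_SURFACE = {
    "confirm_payment": "touch_panel",   # important / payment-related
    "back": "aerial_image",             # easily undone, non-payment
    "register_product": "touch_panel",
    "input_address": "aerial_image",
}

def button_accepts(button: str, source: str) -> bool:
    return BUTTON_SURFACE.get(button) == source

print(button_accepts("confirm_payment", "aerial_image"))  # False
print(button_accepts("confirm_payment", "touch_panel"))   # True
```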
- (Flowchart) FIG. 14 is a flowchart showing an example of the operation of the control device 11 when multiple users use the device.
- the output control unit 113 controls the display on the display 102 (step S111).
- the determination unit 115 determines whether the user's hand is beyond the aerial image 104 and on the optical element 101 side based on the position of the user's hand (step S112).
- step S112 If the user's hand is beyond the aerial image 104 and not on the optical element 101 side (step S112: No), the reception unit 111 receives an operation on the aerial image 104 (step S113). Then, the processing unit 112 performs processing according to the operation on the aerial image 104 (step S114) and returns to step S111.
- step S112 If the user's hand is beyond the aerial image 104 and on the optical element 101 side (step S112: Yes), the reception unit 111 receives an operation on the touch panel 103 (step S115). Here, if the user's hand is beyond the aerial image 104 and on the optical element 101 side, the reception unit 111 does not receive an operation on the aerial image 104. Then, the processing unit 112 performs processing according to the operation on the touch panel 103 (step S116) and returns to step S111.
- the flowchart may be ended as appropriate.
- the process in which the determination unit 115 determines whether a user's hand is beyond the aerial image 104 and on the optical element 101 side may also be performed when multiple users perform operations. For example, the determination unit 115 may determine whether a first user's hand is beyond the aerial image 104 and on the optical element 101 side. If the first user's hand is beyond the aerial image 104 and on the optical element 101 side, the reception unit 111 does not accept an operation on the aerial image 104.
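Steps S111 to S116 can be summarized as one pass of a loop that first classifies the hand position and then routes the operation. A minimal sketch, reusing the simplified depth comparison from above; the geometry and the operation payload are assumptions.

```python
def control_loop_step(hand_z: float, image_z: float, op) -> str:
    """One pass of the flow in FIG. 14 (steps S111 to S116), sketched.

    S111: display control happens before each pass (not modeled here).
    S112: decide from the hand position whether the hand is beyond the
          aerial image on the optical element side.
    S113/S114: if not, receive and process an aerial-image operation.
    S115/S116: if so, receive and process a touch-panel operation
               (an aerial-image operation is not received).
    `op` stands in for the detected operation payload.
    """
    on_element_side = hand_z < image_z  # S112, simplified geometry
    if not on_element_side:
        return f"processed aerial-image operation: {op}"   # S113, S114
    return f"processed touch-panel operation: {op}"        # S115, S116

print(control_loop_step(0.12, 0.10, "tap"))  # aerial-image path
print(control_loop_step(0.05, 0.10, "tap"))  # touch-panel path
```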
- (Example of superimposed display of the aerial image 104) As an application example of the aerial display 10, an example will be described in which the aerial image 104 is superimposed on an image reflected in a mirror, on an actual object or living thing, or on another display 102.
- FIG. 15 is an explanatory diagram showing an example of mirror installation when the aerial image 104 is superimposed on the image reflected in the mirror.
- the display system 1 further includes a mirror 34.
- the mirror 34 is disposed on the side opposite the aerial image 104 with respect to the optical element 101 so that the image reflected in the mirror 34 and the aerial image 104 produced by the aerial display 10 are superimposed. That is, the mirror 34 is disposed opposite the aerial image 104 with respect to the optical element 101 included in the aerial display 10. This allows the user to see the mirror 34 on an extension of the line of sight along which the aerial image 104 is seen. Therefore, the user sees the image of himself/herself reflected in the mirror 34 superimposed on the aerial image 104.
- FIG. 16 is an explanatory diagram showing an example of a superimposed display.
- a user is reflected in the mirror 34.
- a comment such as "Slow down" is displayed in the aerial image 104.
- the comment is superimposed on the user reflected in the mirror 34.
- the aerial display 10 can be used to superimpose graphics such as text and pictures onto the image reflected in the mirror 34, just like a smart mirror.
- FIG. 17 is an explanatory diagram showing an example of the installation of an imaging device to make the display follow the user's movements.
- the display system 1 may further include an imaging device 39.
- the imaging device 39 is installed on the optical element 101 included in the aerial display 10, and captures an image of the user or of the user reflected in the mirror 34.
- the detection unit 116 detects the user's movements from the image captured by the imaging device 39. Then, the output control unit 113 determines the display of the display 102 of the aerial display 10 based on the user's movements.
- FIG. 18 is an explanatory diagram showing an example of making the display follow the user's movements.
- the user is reflected in the mirror 34.
- the comment "here" is displayed in the aerial image 104.
- the imaging device 39 captures an image of the user.
- the detection unit 116 detects the user's movement from the image captured by the imaging device 39.
- the output control unit 113 then controls the display of the display 102 of the aerial display 10 based on the user's movement so that the position of the comment "here" displayed in the aerial image 104 follows the user's movement.
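Making the displayed comment follow the user reduces to applying the displacement of a tracked point between frames to the annotation's display position. A minimal sketch; the tracking itself (performed by the detection unit 116 on images from the imaging device 39) is stubbed out, and all names are illustrative.

```python
def follow_annotation(prev_anchor, new_anchor, annotation_pos):
    """Shift an annotation so it follows the detected user movement.

    prev_anchor / new_anchor: a tracked point on the user (for example
    a hand or face position) detected from successive frames. Only the
    frame-to-frame displacement is applied to the displayed comment's
    position; the detection step is outside this sketch.
    """
    dx = new_anchor[0] - prev_anchor[0]
    dy = new_anchor[1] - prev_anchor[1]
    return (annotation_pos[0] + dx, annotation_pos[1] + dy)

# The tracked point moved 3 px right and 1 px up between frames, so the
# "here" comment moves by the same offset.
print(follow_annotation((100, 50), (103, 49), (120, 40)))  # (123, 39)
```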
- Examples of the use of the aerial image 104 superimposed on the image reflected in the mirror 34 include when providing instruction or practice such as tooth brushing instruction, makeup instruction, exercise instruction, and work instruction for store clerks.
- the difference between this and a smart mirror is that there is a distance between the aerial image 104 and the image reflected on the mirror 34, which is the object that is superimposed on the aerial image 104.
- Another difference from a smart mirror is that the imaging device 39 that captures the image of the user can be fixedly positioned.
- FIG. 19 is an explanatory diagram showing an example of the installation of another display (the second display 35) in the case where the aerial image 104 is superimposed on an image shown on that display.
- the display system 1 further includes a second display 35.
- the second display 35 is disposed on the side opposite the aerial image 104 with respect to the optical element 101 so that the display of the second display 35 and the aerial image 104 produced by the aerial display 10 are superimposed. That is, the second display 35 is disposed opposite the aerial image 104 with respect to the optical element 101 included in the aerial display 10. This allows the user to view the screen of the second display 35 on an extension of the line of sight along which the aerial image 104 is seen. Therefore, the aerial image 104 is superimposed on the image displayed on the second display 35.
- the output control unit 113 may control the display on the second display 35, for example.
- (Example of superimposing the aerial image 104 on an actual object) FIG. 20 is an explanatory diagram showing an example of object placement when the aerial image 104 is superimposed on the object.
- the display system 1 further includes an object 36.
- the object 36 is disposed opposite the aerial image 104 with respect to the optical element 101 included in the aerial display 10, on an extension of the line of sight along which the aerial image 104 is seen, so that the object 36 and the aerial image 104 produced by the aerial display 10 are superimposed.
- the display system 1 further includes an imaging device 39 capable of capturing an image of the object 36.
- the imaging device 39 captures an image of the object 36.
- the detection unit 116 detects the object 36 from the captured image.
- the processing unit 112 may then determine the display content of the display 102 based on the detected object 36.
- the output control unit 113 causes the display 102 to display the determined display content.
- FIG. 21 is an explanatory diagram showing an example of superimposing an aerial image 104 on an object 36.
- a painting is taken as an example of the object 36 for explanation.
- the imaging device 39 captures an image of the painting.
- the detection unit 116 detects the painting from the captured image.
- the processing unit 112 may determine the display content of the display 102 based on the detected object 36.
- the output control unit 113 causes the display 102 to display the determined display content.
- the output control unit 113 may cause a description of the painting to be displayed on the display 102. As a result, the description of the painting is superimposed on the painting.
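Choosing display content from the detected object can be sketched as recognition followed by a lookup. The detector below is a stub, and the labels and description strings are placeholders; the disclosure does not specify a recognition method.

```python
# Minimal sketch of choosing display content from a detected object.
# Any recognition method (e.g., an image classifier) could fill the
# detector's role; descriptions are illustrative placeholders.
DESCRIPTIONS = {
    "painting": "A description of this painting.",
    "sculpture": "A description of this sculpture.",
}

def detect_object(image) -> str:
    # Placeholder: a real system would run image recognition here.
    return "painting"

def display_content_for(image) -> str:
    label = detect_object(image)
    # Content determined from the detected object; shown on the
    # display 102 so that it appears superimposed on the object via
    # the aerial image 104.
    return DESCRIPTIONS.get(label, "")

print(display_content_for(image=None))  # description of the painting
```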
- the object 36 is not limited to a painting, and may be, for example, a plant or a living thing.
- the installation examples in Figs. 15, 19, and 20 are merely examples.
- for example, the aerial display 10 may be installed on the ceiling so that the aerial image 104 is superimposed on an image displayed on the floor or on an actual object 36.
- the display system 1 accepts operations on the aerial image 104 and operations on the touch panel 103 superimposed on a portion of the surface of the optical element 101 as different operations. This allows various operations to be performed using the aerial display 10.
- an operation on the aerial image 104 is an operation by a first user
- an operation on the touch panel 103 is an operation by a second user. In this way, multiple users can simultaneously perform different operations on the aerial display 10.
- the display system 1 corrects the positional relationship between the display of the display 102 and the touch panel 103. This makes it possible to more accurately recognize the operation that the user wants to perform.
- the aerial display 10 when used as a display device for a car navigation device, if the driver operates the touch panel 103 and the passenger in the front passenger seat operates the aerial image 104, the display system 1 will accept operations on the aerial image 104 while the vehicle is moving, but will not accept operations on the touch panel 103. This allows the passenger in the front passenger seat to operate the aerial display 10 while ensuring the safety of the driver.
- the aerial display 10 may also be used as a display device in a store.
- operations on the touch panel 103 may be performed by a store clerk, and operations on the aerial image 104 may be performed by a customer.
- the store clerk may support the customer while the customer is operating the device.
- the display system 1 may also output different sounds when an operation is performed on the aerial image 104 and when an operation is performed on the touch panel 103.
- similarly, the display system 1 may output different vibrations when an operation is performed on the aerial image 104 and when an operation is performed on the touch panel 103.
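- As a rough illustration with invented tone and vibration values, the channel-dependent feedback might be tabulated as:

```python
# Feedback-differentiation sketch for the description above; the concrete
# tone and vibration values are invented for illustration.
FEEDBACK = {
    "aerial_image": {"tone_hz": 880, "vibration_ms": 10},
    "touch_panel":  {"tone_hz": 440, "vibration_ms": 30},
}

def feedback_for(source: str) -> dict:
    """Return the sound/vibration pattern for the operated channel."""
    return FEEDBACK[source]

print(feedback_for("aerial_image"))  # distinct from the touch-panel feedback
print(feedback_for("touch_panel"))
```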
- mouse-style operations may also be realized by combining operations on the touch panel 103 with operations on the aerial image 104.
- an operation on the aerial image 104 is an operation corresponding to a specified position on the aerial image 104.
- an operation on the touch panel 103 is at least one of: an operation instructing display of a menu for performing operations related to the position where the touch-panel operation was performed; an operation instructing display of a menu for making settings related to that position; and an operation instructing display of information related to that position.
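- A hypothetical mapping of these mouse-like semantics (all operation names assumed) is:

```python
# Mouse-style mapping sketch (assumed names): an aerial-image operation acts
# like selecting a position, a touch-panel operation like opening one of the
# three context actions listed above for that position.
def on_aerial(pos):
    return ("select", pos)                        # comparable to a left click

def on_touch(pos, kind="menu"):
    assert kind in {"menu", "settings", "info"}   # the three listed behaviours
    return (f"show_{kind}", pos)                  # comparable to a right-click action

print(on_aerial((0.2, 0.8)))
print(on_touch((0.2, 0.8), "info"))
```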
- in a store, operations on the touch panel 103 may be the payment-related operations among the operations for product registration and payment, and
- operations on the aerial image 104 may be the operations for product registration and payment other than those related to payment.
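- As one possible reading of this split, with invented operation names:

```python
# Illustrative split of store operations between the two channels; the
# operation names are assumptions, not taken from this description.
PAYMENT_OPS = {"select_payment_method", "confirm_payment"}
REGISTRATION_OPS = {"scan_item", "remove_item", "show_total"}

def allowed(source: str, op: str) -> bool:
    if source == "touch_panel":            # payment-related operations
        return op in PAYMENT_OPS
    if source == "aerial_image":           # everything other than payment
        return op in REGISTRATION_OPS
    return False

assert allowed("touch_panel", "confirm_payment")
assert not allowed("aerial_image", "confirm_payment")
assert allowed("aerial_image", "scan_item")
```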
- the display system 1 also determines, based on the position of the hand of the user performing an operation on the aerial image 104, whether the hand has passed beyond the aerial image 104 to the optical element 101 side.
- the hand position may be detected by the sensor 105 that detects operations on the aerial image 104, or it may be detected from an image captured by an imaging device 38 that images the hand of the user performing the operation.
- the display system 1 then accepts an operation on the aerial image 104 when the hand is not on the optical element 101 side, and does not accept an operation on the aerial image 104 when the hand is on the optical element 101 side. This makes it possible to distinguish between an operation on the aerial image 104 and an operation on the touch panel 103.
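- Reduced to a single depth axis, this determination might be sketched as follows (the threshold value and the 1-D simplification are assumptions):

```python
# Geometric sketch of the determination above, reduced to one axis that runs
# from the user toward the optical element 101; the aerial image 104 sits at
# AERIAL_IMAGE_DEPTH. The constant is an assumption for illustration.
AERIAL_IMAGE_DEPTH = 0.30   # metres, assumed

def hand_beyond_image(hand_depth: float) -> bool:
    """True when the hand has passed the aerial image toward the optical
    element, i.e. the gesture is probably aimed at the touch panel 103."""
    return hand_depth > AERIAL_IMAGE_DEPTH

def accept_aerial_operation(hand_depth: float) -> bool:
    return not hand_beyond_image(hand_depth)

assert accept_aerial_operation(0.25)      # hand in front of the image
assert not accept_aerial_operation(0.40)  # hand past the image
```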
- the display system 1 also detects the position of the user's face from an image captured by the imaging device 37 of the user performing an operation on the aerial image 104.
- the display system 1 corrects the positional relationship between the operation on the aerial image 104 and the operation on the touch panel 103 based on the detected position of the user's face.
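- Under a simple pinhole-style geometry (an assumption not stated in this description), the face-based correction could be sketched as:

```python
# Parallax-correction sketch for the description above: the user aims at the
# aerial image 104 but the touch lands on the panel 103 behind it, so the
# touch point is projected back onto the image plane along the line from the
# face. The geometry and all values here are assumptions.
def corrected_touch_x(touch_x: float, face_x: float,
                      face_to_image: float, image_to_panel: float) -> float:
    # Similar triangles: the offset seen on the image plane is the panel
    # offset scaled by (face-to-image) / (face-to-panel) distance.
    scale = face_to_image / (face_to_image + image_to_panel)
    return face_x + (touch_x - face_x) * scale

# Face 0.5 m from the image, panel 0.1 m behind it: a touch at x = 0.12
# corresponds to x = 0.10 on the aerial image.
print(corrected_touch_x(touch_x=0.12, face_x=0.0,
                        face_to_image=0.5, image_to_panel=0.1))
```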
- FIG. 22 is a block diagram showing an example of a configuration of a control device according to the second embodiment.
- the control device 21 includes a reception unit 211, a processing unit 212, and an output control unit 213.
- the reception unit 211 may provide the basic functions of the reception unit 111 in the first embodiment.
- the processing unit 212 may provide the basic functions of the processing unit 112 in the first embodiment.
- the output control unit 213 may provide the basic functions of the output control unit 113 in the first embodiment.
- the reception unit 211 receives, as separate operations, an operation on an aerial image detected by a sensor included in the non-contact display and an operation on a touch panel provided over a portion of the surface of an optical element included in the non-contact display.
- when the processing unit 212 receives an operation on the aerial image, it performs processing according to the operation on the aerial image; when it receives an operation on the touch panel, it performs processing according to the operation on the touch panel.
- the output control unit 213 controls the display on the display included in the non-contact display based on the processing results by the processing unit 212.
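- A structural sketch of these three units, with assumed method names:

```python
# Structural sketch of the control device 21 (method names assumed), showing
# the reception / processing / output-control split described above.
class ControlDevice21:
    def receive(self, source: str, pos):
        # reception unit 211: aerial and touch operations stay distinct
        return {"source": source, "pos": pos}

    def process(self, op):
        # processing unit 212: branch on the operation's source
        return f"processed {op['source']} operation at {op['pos']}"

    def output(self, result):
        # output control unit 213: update the non-contact display
        print("display <-", result)

dev = ControlDevice21()
dev.output(dev.process(dev.receive("aerial", (0.1, 0.2))))
dev.output(dev.process(dev.receive("touch", (0.1, 0.2))))
```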
- FIG. 23 is a flowchart showing an example of the operation of the control device 21 according to the second embodiment.
- the output control unit 213 controls the display (step S201).
- the reception unit 211 receives an operation on the aerial image (step S202).
- the processing unit 212 performs processing according to the operation on the aerial image (step S203), and the process returns to step S201.
- the output control unit 213 controls the display according to the operation, based on the processing result by the processing unit 212.
- the reception unit 211 receives an operation on the touch panel (step S204).
- the processing unit 212 performs processing according to the operation on the touch panel (step S205), and the process returns to step S201.
- the output control unit 213 controls the display according to the operation, based on the processing result by the processing unit 212.
- the flowchart can be ended at any time.
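- The flowchart loop might be sketched as follows, assuming a simple event queue (the queue-based structure and names are assumptions):

```python
# Event-loop sketch mirroring steps S201 to S205 of the flowchart above.
import queue

events: "queue.Queue[tuple]" = queue.Queue()

def run_once() -> None:
    print("output control unit 213: refresh display")        # S201
    kind, payload = events.get()                              # wait for input
    if kind == "aerial":                                      # S202
        print("processing unit 212: aerial op", payload)      # S203
    elif kind == "touch":                                     # S204
        print("processing unit 212: touch op", payload)       # S205
    # control returns to S201 on the next call

events.put(("aerial", (0.3, 0.6)))
events.put(("touch", (0.3, 0.6)))
run_once()
run_once()
```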
- in this way, the control device 21 accepts operations on the aerial image and operations on the touch panel as different operations, and performs processing according to each operation. This makes it possible to perform various operations when using a non-contact display.
- the hardware configuration of the control device 21 when implemented by a computer may be the same as the hardware configuration of the control device 11 described in the first embodiment, and detailed description thereof is omitted.
- the display system 1 may be configured to include only some of the functional units and information described above.
- likewise, the control devices 11 and 21 may be configured to include only some of the functional units and information described above.
- the control device 11 may be configured to include a reception unit 111, a processing unit 112, an output control unit 113, a correction unit 114, and a determination unit 115.
- the control device 11 may be configured to include a reception unit 111, a processing unit 112, an output control unit 113 ... a determination unit 115, and a detection unit 116.
- each embodiment is not limited to the above-mentioned examples, and can be modified in various ways.
- the configuration of the display system 1 and the configuration of the control devices 11, 21 in the embodiments are not particularly limited.
- the control devices 11, 21 may be realized by different devices depending on the function or data.
- the functional units of the control devices 11, 21 may be configured by multiple servers and realized as a control system.
- each functional unit of the control devices 11, 21 may be realized by a database server including each DB (Database) and a server having each functional unit.
- each screen is merely an example and is not particularly limited. Buttons, lists, check boxes, information display fields, input fields, etc. (not shown) may be added to each screen. Furthermore, the background color of the screen, etc. may be changed.
- the process of generating information to be displayed on a display included in the aerial display 10 may be performed by an output control unit. This process may also be performed by a display included in the aerial display 10.
- each device may be realized by any combination of a different computer and program for each component.
- multiple components of each device may be realized by any combination of a single computer and program.
- each device may be realized by circuits for a specific application. Further, some or all of the components of each device may be realized by general-purpose circuits including a processor such as an FPGA (Field Programmable Gate Array). Further, some or all of the components of each device may be realized by a combination of circuits for a specific application and general-purpose circuits. Further, these circuits may be a single integrated circuit. Alternatively, these circuits may be divided into multiple integrated circuits. The multiple integrated circuits may be configured by being connected via a bus or the like.
- each device may be realized by multiple computers, circuits, etc.
- the multiple computers, circuits, etc. may be centralized or distributed.
- the control method described in each embodiment is realized by being executed by a control device. It is also realized, for example, by a computer such as a server or a terminal device executing a program prepared in advance.
- the programs described in each embodiment are recorded on a computer-readable recording medium such as an HDD, SSD, flexible disk, optical disk, magneto-optical disk, or USB memory.
- the programs are then read from the recording medium by a computer and executed.
- the programs may also be distributed via a communications network NT.
- each component of the display system and the control devices in the embodiments described above may have its functions realized by dedicated hardware.
- each component may also be realized by software, such as a program executed by a computer.
- each component may also be realized by a combination of hardware and software.
- (Appendix 1) A control device comprising: a receiving means for receiving, as separate operations, an operation on an aerial image detected by a sensor included in a non-contact display and an operation on a touch panel provided so as to overlap a portion of a surface of an optical element included in the non-contact display; a processing means for performing a process corresponding to the operation on the aerial image when an operation on the aerial image is received, and for performing a process corresponding to the operation on the touch panel when an operation on the touch panel is received; and an output control means for controlling display on a display included in the non-contact display based on a processing result by the processing means.
- (Appendix 2) The control device according to Appendix 1, wherein the operation on the aerial image is an operation by a first user and the operation on the touch panel is an operation by a second user.
- (Appendix 3) The control device according to Appendix 2, comprising: a detection means for detecting the position of the face of the second user from an image of the second user captured by an imaging device; and a correction means for correcting the position of a received operation on the touch panel based on the position of the face of the second user.
- (Appendix 4) The control device wherein, when the non-contact display is installed in a vehicle, the first user is a passenger in the front passenger seat of the vehicle, the second user is the driver of the vehicle, and the reception means receives an operation on the aerial image while the vehicle is traveling and does not receive an operation on the touch panel.
- the operation on the touch panel is at least one of: an operation instructing display of a menu for performing operations related to the position on the touch panel where the operation was performed; an operation instructing display of a menu for making settings related to that position; and an operation instructing display of information related to that position.
- The control device according to any one of Appendixes 1 to 8, wherein the operation on the touch panel is the payment operation among the operations related to product registration and payment at a store, and the operation on the aerial image is an operation related to product registration and payment other than the payment operation.
- (Appendix 10) The control device further comprising a determination means for determining, based on the position of the hand of a user performing an operation on the aerial image, whether the hand is located beyond the aerial image on the optical element side, wherein the acceptance means accepts an operation on the aerial image when the hand is not on the optical element side and does not accept the operation on the aerial image when the hand is on the optical element side.
- (Appendix 11) The control device according to Appendix 10, wherein the determination means determines whether the hand of the user is located beyond the aerial image on the optical element side based on the position of the hand detected by the sensor.
- (Appendix 12) The control device comprising a detection means for detecting the hand of the user from an image, captured by an imaging device, of the hand of the user performing an operation on the aerial image, wherein the determination means determines whether the hand is located beyond the aerial image on the optical element side based on the detected position of the hand.
- (Appendix 14) A display system comprising a non-contact display and a control device, wherein the non-contact display comprises a touch panel, an optical element, a display, and a sensor; the touch panel is disposed over a portion of a surface of the optical element; the touch panel detects operations on the touch panel itself; and the sensor detects operations on an aerial image formed by the display and the optical element.
- The control device includes: a receiving means for receiving an operation on the aerial image and an operation on the touch panel as different operations; a processing means for performing a process corresponding to the operation on the aerial image when an operation on the aerial image is received, and for performing a process corresponding to the operation on the touch panel when an operation on the touch panel is received; and an output control means for controlling display on the display based on a processing result by the processing means.
- (Appendix 15) The display system according to Appendix 14, comprising a determination means for determining whether the hand of a user performing an operation on the aerial image is located beyond the aerial image on the optical element side.
- (Appendix 16) The display system according to Appendix 14 or 15, further comprising a mirror, wherein the mirror is disposed on the side of the optical element opposite the aerial image such that an image reflected in the mirror and the aerial image are superimposed on each other.
- (Appendix 17) The display system further comprising an imaging device, wherein the imaging device images a user performing an operation on the aerial image, the control device includes a detection means for detecting a movement of the user from an image captured by the imaging device, and the output control means controls the display on the display based on the movement of the user.
- (Appendix 18) The display system wherein the accepting means accepts an operation on the touch panel as an operation on the user reflected in the mirror.
- (Appendix 19) The display system further comprising a second display, wherein the second display is disposed on the side of the optical element opposite the aerial image such that the display of the second display and the aerial image are superimposed on each other.
- (Appendix 20) The display system comprising an object, wherein the object is disposed on the side of the optical element opposite the aerial image such that the object and the aerial image are superimposed on each other.
- (Appendix 21) The display system according to Appendix 20, further comprising an imaging device for imaging the object and a detection means for detecting the object from an image captured by the imaging device, wherein the processing means determines the content to be displayed on the display based on the detected object and the output control means causes the display to display the determined content.
- A control device comprising: a receiving means for receiving an operation on an aerial image detected by a sensor included in a non-contact display installed in a vehicle and an operation on a touch panel provided so as to overlap a part of a surface of an optical element included in the non-contact display; a processing means for performing a process corresponding to the operation on the aerial image when an operation on the aerial image is received, and for performing a process corresponding to the operation on the touch panel when an operation on the touch panel is received; and an output control means for controlling display on a display included in the non-contact display based on a processing result by the processing means, wherein the operation on the aerial image is an operation by a passenger in the front passenger seat of the vehicle, the operation on the touch panel is an operation by the driver of the vehicle, and the reception means receives an operation on the aerial image while the vehicle is traveling and does not receive an operation on the touch panel.
- A control method comprising: accepting, as different operations, an operation on an aerial image detected by a sensor included in a non-contact display and an operation on a touch panel provided so as to overlap a part of a surface of an optical element included in the non-contact display; performing a process corresponding to the operation on the aerial image when an operation on the aerial image is received; performing a process corresponding to the operation on the touch panel when an operation on the touch panel is received; and controlling display on a display included in the non-contact display based on a processing result.
- (Appendix 24) A control method comprising: receiving an operation on an aerial image detected by a sensor included in a non-contact display installed in a vehicle and an operation on a touch panel provided so as to overlap a part of a surface of an optical element included in the non-contact display; performing a process corresponding to the operation on the aerial image when an operation on the aerial image is received; performing a process corresponding to the operation on the touch panel when an operation on the touch panel is received; and controlling display on a display included in the non-contact display based on a processing result, wherein the operation on the aerial image is an operation by a passenger in the front passenger seat of the vehicle, the operation on the touch panel is an operation by the driver of the vehicle, and, in the receiving, an operation on the aerial image is received while the vehicle is traveling and an operation on the touch panel is not received.
- A computer-readable non-transitory recording medium recording a program for causing a computer to execute a process comprising: accepting, as different operations, an operation on an aerial image detected by a sensor included in a non-contact display and an operation on a touch panel provided so as to overlap a part of a surface of an optical element included in the non-contact display; performing a process corresponding to the operation on the aerial image when an operation on the aerial image is received; performing a process corresponding to the operation on the touch panel when an operation on the touch panel is received; and controlling display on a display included in the non-contact display based on a processing result.
- the operation on the aerial image is an operation by a passenger in the front passenger seat of the vehicle, the operation on the touch panel is an operation by the driver of the vehicle, and an operation on the aerial image is received while the vehicle is traveling, whereas an operation on the touch panel is not received (recording medium).
- the operation on the aerial image is an operation by a passenger in the front passenger seat of the vehicle, the operation on the touch panel is an operation by the driver of the vehicle, and an operation on the aerial image is received while the vehicle is traveling, whereas an operation on the touch panel is not received (program).
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The control device according to the present invention controls display on a non-contact display unit. The control device comprises an acceptance unit, a processing unit, and an output control unit. The non-contact display unit includes a touch panel disposed so as to overlap a portion of a surface of an optical element included in the non-contact display unit. The acceptance unit accepts an operation performed on an aerial image detected by a sensor included in the non-contact display unit and an operation performed on the touch panel as different operations. Upon accepting an operation on the aerial image, the processing unit performs processing according to the operation on the aerial image; upon accepting an operation on the touch panel, it performs processing according to the operation on the touch panel. The output control unit controls display on the display unit on the basis of the processing results from the processing unit.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024571540A JPWO2024154297A5 (ja) | 2023-01-19 | | Control device, display system, control method, and program |
| PCT/JP2023/001534 WO2024154297A1 (fr) | 2023-01-19 | 2023-01-19 | Control device, display system, control method, and recording medium |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2023/001534 WO2024154297A1 (fr) | 2023-01-19 | 2023-01-19 | Control device, display system, control method, and recording medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024154297A1 (fr) | 2024-07-25 |
Family
ID=91955638
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2023/001534 Ceased WO2024154297A1 (fr) | 2023-01-19 | 2023-01-19 | Dispositif de commande, système d'affichage, procédé de commande et support d'enregistrement |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024154297A1 (fr) |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2009044437A1 (fr) * | 2007-10-01 | 2009-04-09 | Pioneer Corporation | Image display device |
| JP2014049023A (ja) * | 2012-09-03 | 2014-03-17 | Sharp Corp | Input device |
| JP2016006447A (ja) * | 2014-06-20 | 2016-01-14 | Funai Electric Co., Ltd. | Image display device |
| WO2016152300A1 (fr) * | 2015-03-25 | 2016-09-29 | Kyocera Document Solutions Inc. | Information processing device |
| JP2016184222A (ja) * | 2015-03-25 | 2016-10-20 | Kyocera Document Solutions Inc. | Visible image forming device and image forming device |
| JP2017073128A (ja) * | 2015-10-08 | 2017-04-13 | Funai Electric Co., Ltd. | Spatial input device |
| KR20180135649A (ko) * | 2017-06-13 | 2018-12-21 | Kwangwoon University Industry-Academic Collaboration Foundation | Spatial image input system |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2024154297A1 (fr) | 2024-07-25 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| KR102665643B1 (ko) | | Method for controlling avatar display and electronic device therefor |
| KR102342267B1 (ko) | | Portable device and method for changing the screen of the portable device |
| EP3203429B1 (fr) | | Mobile electronic device and electronic payment method |
| CN104756051B (zh) | | Input position correction table for an input panel |
| CN102508572B (zh) | | Touch gesture notification dismissal techniques |
| JP5802247B2 (ja) | | Information processing device |
| KR20190021595A (ko) | | Electronic device and method for controlling the electronic device |
| CN102236471A (zh) | | Information processing device, information generation method, and recording medium |
| US20180275414A1 (en) | | Display device and display method |
| EP3985484A1 (fr) | | Calibration method, calibration device, and contactless gesture control method |
| CN105138247A (zh) | | Presenting a user interface on a first device upon detecting that a second device is in proximity to the first device |
| CN108200416B (zh) | | Coordinate mapping method and device for a projected image in a projection apparatus, and projection apparatus |
| JP2010511945A (ja) | | Interactive input system and method |
| CN119053979A (zh) | | User interfaces for initiating transactions |
| US20250284348A1 (en) | | Electronic device and control method of the same |
| JP2017126225A (ja) | | Image processing device, method, and program |
| KR20150026647A (ko) | | Method and apparatus for verifying handwritten signature input |
| CN103809817B (zh) | | Optical touch system and method for determining object position thereof |
| KR102681016B1 (ko) | | Operating method for providing item-related information and electronic device supporting the same |
| WO2024154297A1 (fr) | | Control device, display system, control method, and recording medium |
| JP5999236B2 (ja) | | Information processing system, control method therefor, and program; and information processing device, control method therefor, and program |
| KR102278882B1 (ko) | | Product sales service device based on dynamic screen switching, product sales system based on dynamic screen switching, method for selling products based on dynamic screen switching, and recording medium on which a computer program is recorded |
| JP7006767B2 (ja) | | Image-recognition checkout device, image-recognition checkout system, accounting processing method, and program |
| CN110895458A (zh) | | Information processing method and device, and non-transitory computer-readable storage medium |
| US10802700B2 (en) | | Information processing apparatus and information processing method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23917510 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2024571540 Country of ref document: JP Kind code of ref document: A |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2024571540 Country of ref document: JP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |