
WO2012032842A1 - Display system and detection method - Google Patents


Info

Publication number
WO2012032842A1
WO2012032842A1 (PCT/JP2011/065579)
Authority
WO
WIPO (PCT)
Prior art keywords
display
sensor array
distance measuring
display area
display system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2011/065579
Other languages
French (fr)
Japanese (ja)
Inventor
圭一 山本
倫大 河合
和也 辻埜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp Corp
Original Assignee
Sharp Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corp
Publication of WO2012032842A1

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers

Definitions

  • the present invention relates to a display system and a detection method.
  • the present invention particularly relates to a display system that displays a two-dimensional image in the air and a method for detecting the position of an object in the two-dimensional image.
  • Patent Document 1 discloses a stereoscopic two-dimensional image display device as the display system.
  • the stereoscopic two-dimensional image display device includes a display unit, a microlens array, a position detection sensor, and a control unit.
  • the display unit includes an image display surface for displaying a two-dimensional image.
  • the microlens array displays a two-dimensional image on a stereoscopic image display surface in a pseudo-stereoscopic manner by forming light emitted from the image display surface on a stereoscopic image display surface separated from the image display surface.
  • the position detection sensor is arranged in association with the stereoscopic image display surface, and outputs a signal corresponding to the position subjected to physical action from the outside.
  • the control unit changes the image in the stereoscopic image display surface according to the output signal from the position detection sensor.
  • Patent Document 2 (JP-A-9-55152) discloses a display device having a touchless panel switch as the display system.
  • In the touchless panel switch, when a finger enters a predetermined detection area, light projected from a light projecting element is reflected by the finger and is incident on a light receiving element.
  • At least one reflection-type optical sensor, composed of the light projecting element and the light receiving element, is installed in the space around a gradient index lens element for each predetermined region.
  • Patent Document 3 discloses a display device including an imaging element as the display system.
  • The display device forms a projection object, which is a two-dimensional or three-dimensional object, on the opposite side of the imaging element as a two-dimensional image or as a real image of a three-dimensional image. A more detailed description follows.
  • The imaging element is an optical element that bends light beams as light passes through the element surface, which constitutes one plane.
  • The imaging element is configured by arranging a plurality of unit optical elements, each of which reflects light with one or more mirror surfaces arranged perpendicular, or nearly perpendicular, to the element surface.
  • By reflecting, at the mirror surfaces, the light emitted from the projection object arranged on one side of the element surface as that light passes through the element surface, the imaging element forms a real image in a space with no physical entity on the other side of the element surface.
  • Patent Document 4 discloses an optical distance measuring sensor including a long-distance light receiving unit and a short-distance light receiving unit.
  • In Patent Document 1, position detection sensors must be arranged so as to surround the periphery of the two-dimensional image displayed in the air, so a frame is required around the aerial image. It is therefore difficult for the user to feel a difference between the image displayed by the stereoscopic two-dimensional image display device of Patent Document 1 and the image displayed by a general display that shows an image on a display panel.
  • In Patent Document 2, a single sensor detects that an object such as a finger is located at one predetermined position in the aerial two-dimensional image. Covering the display area that shows the two-dimensional image therefore requires many sensors, and determining the installation position of each sensor is very difficult.
  • In Patent Document 3, it cannot be detected at which position in the real image of the formed two-dimensional or three-dimensional image an object is located.
  • The present invention has been made in view of the above problems, and its purpose is to provide a display system capable of detecting the position of an object in a two-dimensional image with a simple configuration, without surrounding the aerial two-dimensional image with a frame, and a detection method in such a display system.
  • A display system according to an aspect of the present invention includes a display, an optical element that displays a two-dimensional image in an aerial display area based on an image displayed on the display, a processor, a first sensor array in which a plurality of distance measuring sensors are arranged in a row in a first direction, and a memory storing first data indicating the correspondence between the output voltage and the distance for each distance measuring sensor.
  • the normal direction of the display area is a direction perpendicular to the first direction.
  • Each of the plurality of distance measuring sensors emits light in a second direction perpendicular to the first direction and the normal direction and in a direction approaching the display area.
  • Each of the plurality of distance measuring sensors receives light reflected by the object among the emitted light, and outputs a voltage value based on the distance to the object.
  • the processor detects the position of the object in the two-dimensional image based on each voltage value output from each of the plurality of distance measuring sensors and the first data.
  • the first sensor array is disposed at a position intersecting with a plane including the display area.
  • the light emitted from each of the plurality of distance measuring sensors passes through the display area.
  • the display system further includes a first moving mechanism that moves the display in the third direction and a second moving mechanism that moves the first sensor array in the fourth direction.
  • the memory further stores second data indicating the correspondence between the position of the display and the position of the first sensor array.
  • the display area moves based on the movement of the display.
  • the processor determines the position of the first sensor array based on the position of the display and the second data.
  • the processor moves the first sensor array to the determined position using the second moving mechanism.
  • the third direction is a direction perpendicular to the display surface of the display.
  • the display area moves in the third direction based on the movement of the display in the third direction.
  • the fourth direction is the same direction as the third direction.
  • the third direction is a direction perpendicular to the display surface of the display.
  • the direction of the normal of the display area is different from the third direction.
  • the display area moves in the normal direction based on the movement of the display in the third direction.
  • the fourth direction is a direction having a component of the normal direction.
  • the processor detects the position of the object in the two-dimensional image based on each voltage value output from each of the plurality of distance measuring sensors, the first data, and the position of the first sensor array or the position of the display.
  • the third direction is a direction in which the incident angle of the light with respect to the optical element based on the image displayed on the display is changed.
  • the fourth direction is a direction in which the emission angle of each light emitted from the plurality of distance measuring sensors is changed.
  • the display system further includes a second sensor array in which a plurality of distance measuring sensors are arranged in a row in the first direction.
  • the second sensor array is arranged in parallel with the first sensor array and emits light in the same direction as the first sensor array.
  • the processor displays at least one object in the display area.
  • the processor detects the position of the object based on the voltage value output from each of the plurality of distance measuring sensors included in the second sensor array and the first data.
  • the processor determines whether or not a position on a plane including a display area corresponding to the detected position of the object is included in an area displaying at least one object in the display area.
  • the processor changes the display mode of the object in the display area from the first display mode to the second display mode based on the determination that the processor is included in the area for displaying the object.
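The determination described in the two bullets above is essentially a point-in-rectangle test followed by a mode switch. The following sketch is illustrative only (the region type, coordinates, and mode names are invented, not taken from the patent):

```python
# Illustrative hit test: does the detected fingertip position fall inside
# the rectangle occupied by an object (e.g. an icon) in the display area?
from dataclasses import dataclass

@dataclass
class ObjectRegion:
    x: float       # left edge, in display-area coordinates (invented units)
    y: float       # bottom edge
    width: float
    height: float

def contains(region: ObjectRegion, px: float, py: float) -> bool:
    """True if the detected position (px, py) lies inside the region."""
    return (region.x <= px <= region.x + region.width and
            region.y <= py <= region.y + region.height)

icon = ObjectRegion(x=10.0, y=20.0, width=30.0, height=15.0)
# Switch from the first display mode to the second on a hit:
display_mode = "second" if contains(icon, 25.0, 27.0) else "first"
```

When the test succeeds, a real implementation would redraw the object in its second display mode; here the mode is just recorded in a string.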
  • the processor displays at least one object in the display area.
  • the distance measuring sensor in the first sensor array is provided at a position corresponding to at least one object.
  • each of the plurality of distance measuring sensors in the first sensor array includes one light emitting element and two light receiving elements each receiving reflected light of light emitted from the light emitting element.
  • the processor displays at least one object in the display area.
  • the processor executes processing associated with the object based on the fact that the position of the object detected by the first sensor array is included in the area displaying at least one object.
  • the detection method is a detection method in a display system that detects the position of an object in a two-dimensional image displayed in an aerial display area.
  • the display system includes a processor, a sensor array in which a plurality of distance measuring sensors are arranged in a row in a first direction, and a memory that stores data indicating a correspondence relationship between an output voltage and a distance in each distance measuring sensor.
  • the normal direction of the display area is a direction perpendicular to the first direction.
  • the detection method includes: a step in which each of the plurality of distance measuring sensors emits light in a second direction that is perpendicular to the first direction and the normal direction and approaches the display area; a step in which each of the distance measuring sensors receives light reflected by the object out of the emitted light and outputs a voltage value based on the distance to the object; and a step in which the processor detects the position of the object in the two-dimensional image based on each voltage value output by each of the plurality of distance measuring sensors and the data.
  • the present invention it is possible to detect the position of an object in a two-dimensional image with a simple configuration without surrounding the two-dimensional image displayed in the air with a frame.
  • FIG. 2 is a cross-sectional view taken along line II-II in FIG. 1. Further figures show a block diagram of part of the hardware configuration of the display system, the structure of the sensor array, the external appearance of the distance measuring sensor, and the irradiation range of the light emitted by the sensor array.
  • FIG. 12 is a cross-sectional view taken along line XII-XII in FIG. 11. Further figures show a block diagram of part of the hardware configuration of the display system, and the external appearance and usage state of another display system.
  • FIG. 15 is a cross-sectional view taken along line XV-XV in FIG. 14. Further figures show a block diagram of part of the hardware configuration of the display system, the first and second display modes, and a flowchart of the display-mode change process in the display system.
  • FIG. 23 is a cross-sectional view taken along line XXIII-XXIII. Further figures show a block diagram of part of the hardware configuration of the display system, the correspondence between each distance measuring sensor and the display area, and the external appearance and usage state of another display system.
  • FIG. 27 is a cross-sectional view taken along line XXVII-XXVII in FIG. 26.
  • FIG. 30 is a cross-sectional view taken along line XXX-XXX in FIG. 29. A further figure shows a block diagram of part of the hardware configuration of the display system.
  • In the following description, a "direction" may represent two different directions.
  • "Two different directions" represents, for example, two directions opposite to each other.
  • For example, the direction of the X axis represents both the positive direction of the X axis and the negative direction of the X axis.
  • Note that "two different directions" also includes, for example, two directions that do not differ by 180 degrees, such as a clockwise direction and a counterclockwise direction.
  • FIG. 1 is a diagram illustrating an appearance and a usage state of the display system 1.
  • the display system 1 includes a housing 10, an opening 20, a sensor array 30, and a microlens array (optical element) 40.
  • the microlens array 40 transmits light emitted from the display (see FIG. 2) in the housing 10 and displays a two-dimensional image (hereinafter also referred to as “aerial image”) in a rectangular display area 810 in the air.
  • the display area 810 is an area surrounded by four sides 810a, 810b, 810c, and 810d. Note that the side 810a and the side 810b are parallel, and the side 810c and the side 810d are parallel.
  • the direction of the normal line of the display area 810 is the Z direction.
  • the opening 20 has a rectangular shape.
  • the opening 20 is formed below the display area 810 and along the side 810b of the display area 810.
  • the sensor array 30 has a plurality of distance measuring sensors arranged in a row in the X direction.
  • the sensor array 30 is disposed along the opening 20 in the housing 10. Specifically, the sensor array 30 is installed such that the sensing surface of each distance measuring sensor faces the display area 810. Details of the arrangement of the sensor array will be described later.
  • the user touches the aerial image displayed in the display area 810 with the user's finger 910, for example.
  • the user touches the object with the finger 910 in order to select the object included in the aerial image.
  • An object is, for example, an icon image. Note that since the aerial image is not a physical object, even if the two-dimensional image is touched, there is no physical contact with the two-dimensional image.
  • FIG. 2 is a cross-sectional view taken along line II-II in FIG.
  • the display system 1 includes a sensor array 30, a microlens array 40, and a display 50 in a housing 10.
  • the microlens array 40 is a light wave control device in which microlenses are arranged in a lattice pattern.
  • Display 50 displays an image in the direction of microlens array 40. As described above, the image displayed on the display 50 is displayed in the display area 810 as an aerial image by the microlens array 40.
  • Each distance measuring sensor of the sensor array 30 emits light in the Y direction, which is perpendicular to both the direction in which the sensors are arranged (the X direction) and the normal direction of the display area 810 (the Z direction), in a direction approaching the display area 810. More specifically, the sensor array 30 is arranged at a position that intersects a plane including the display area 810; that is, the sensor array 30 is arranged parallel to the sides 810a and 810b of the display area 810 (see FIG. 1).
  • The sensor array 30 may be arranged so that the light emitted from the distance measuring sensors passes through the display area 810 (that is, the light overlaps the display area 810), or so that the light travels along the display area 810.
  • In the following, the case where the light emitted from the distance measuring sensors passes through the display area 810 will be described as an example.
  • FIG. 3 is a block diagram showing a part of the hardware configuration of the display system 1.
  • the display system 1 includes a sensor array 30, a display 50, a CPU (Central Processing Unit) 60, a memory 70, a display driving device 80, and an A / D (Analog / Digital) converter 90.
  • the display system 1 includes a device (speaker or the like) that generates sound. The same applies to other display systems described later.
  • Sensor array 30 outputs an analog voltage value as a sensing result to A / D converter 90.
  • the A / D converter 90 converts an analog voltage value into a digital voltage value.
  • the A / D converter 90 sends the converted digital voltage value to the CPU 60.
  • the memory 70 includes, for example, a ROM, a RAM, and a flash memory.
  • the memory 70 stores various data such as a program executed by the display system 1 and association data 71.
  • the association data 71 will be described later.
  • CPU 60 executes a program stored in memory 70 in advance.
  • the CPU 60 refers to the voltage value acquired from the A / D converter 90 and the association data 71 and executes processing to be described later.
  • the display driving device 80 receives a command from the CPU 60 and drives the display 50.
  • FIG. 4 is a diagram for explaining the configuration of the sensor array 30.
  • In the sensor array 30, eight distance measuring sensors 31 to 38 are arranged in a row. Each of the distance measuring sensors 31 to 38 emits light in the Y direction, receives the light reflected by the object out of the emitted light, and outputs a voltage value based on the distance to the object to the A/D converter 90.
  • The distance measuring sensors 31 to 38 are arranged at equal intervals, and each is at the same distance from the display area 810.
  • The distance measuring sensors 31 to 38 have the same configuration and differ only in their positions within the sensor array 30. Although the sensor array 30 here has eight distance measuring sensors, the number is not limited to eight.
  • FIG. 5 is a diagram showing the external appearance of the distance measuring sensor 31.
  • the distance measuring sensor 31 includes a light emitting device 311 and a light receiving device 312.
  • the light emitting device 311 emits light to the outside.
  • the light receiving device 312 receives reflected light of the light emitted from the light emitting device 311.
  • the light receiving device 312 also receives light other than reflected light (for example, indoor light or external light).
  • FIG. 6 is a diagram for explaining the irradiation range of the light emitted from the sensor array 30.
  • FIG. 6A is a diagram showing an irradiation range of light emitted from the light emitting device 311 of the distance measuring sensor 31 constituting the sensor array 30.
  • The light emitting device 311 of the distance measuring sensor 31 emits infrared rays. The irradiation range has a spread of ±1.5° with respect to the optical axis.
  • FIG. 6B is a diagram showing the irradiation range of the light emitted from the distance measuring sensors 31 to 38. As shown in FIG. 6B, the eight distance measuring sensors can irradiate the entire range of the display area 810 with light.
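As a back-of-the-envelope illustration (not part of the patent), the ±1.5° beam spread implies a minimum distance from the array at which adjacent diverging beams touch, giving gap-free coverage of the display area; the 20 mm sensor pitch below is an invented value:

```python
import math

def min_full_coverage_distance(pitch_mm: float, half_angle_deg: float = 1.5) -> float:
    """Distance from the sensor row at which each beam has spread to the
    sensor pitch (beam width = 2 * d * tan(half_angle)), so that the eight
    beams cover the area between sensors without gaps."""
    return pitch_mm / (2.0 * math.tan(math.radians(half_angle_deg)))

# Assumed 20 mm pitch between distance measuring sensors (invented figure).
d = min_full_coverage_distance(20.0)   # roughly 0.38 m for a 20 mm pitch
```

A wider pitch pushes the gap-free region further from the array, which is one reason the narrow ±1.5° spread matters for covering the display area with only eight sensors.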
  • the display system 1 does not perform sensing in the Z direction.
  • FIG. 7 is a first diagram for explaining the principle of measurement by the distance measuring sensor 31.
  • the light emitting device 311 includes an infrared LED (Light Emitting Diode) 311a which is a light emitting element, and a lens 311b.
  • the light receiving device 312 includes an infrared light receiving element 312a and a lens 312b.
  • the infrared LED 311a emits infrared rays to the lens 311b.
  • the infrared rays emitted from the lens 311b are reflected by the object 950 that is a sensing target.
  • the reflected infrared light passes through the lens 312b and enters the infrared light receiving element 312a.
  • the infrared light emitted from the infrared LED 311a is reflected by the object 960.
  • Infrared light reflected by the object 960 passes through the lens 312b and enters the infrared light receiving element 312a as described above.
  • FIG. 8 is a second diagram for explaining the principle of measurement by the distance measuring sensor 31. Referring to FIG. 8, when the distance d is greater than d1, the distance measuring sensor 31 has a characteristic that the output voltage decreases as the distance to the sensing target (objects 950, 960, finger 910) increases.
  • The distance range that can be detected by the distance measuring sensor 31 is from the distance d1 to the distance d2, the range over which the output voltage does not fall to or below a certain value.
  • The sensor array 30 is arranged so that the display area 810 falls within the distance range detectable by the distance measuring sensors 31 to 38.
  • The association data 71 stored in the memory 70 is data representing the characteristics of FIG. 8. Specifically, the association data 71 indicates the correspondence between the output voltage and the distance for each of the distance measuring sensors 31 to 38. In the present embodiment, since the eight distance measuring sensors have the same configuration, it suffices for the association data 71 to indicate the correspondence between the output voltage and the distance for one distance measuring sensor.
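A lookup against the association data 71 might be sketched as a piecewise-linear interpolation over stored (voltage, distance) pairs. All numeric values below are invented for illustration; only the shape of the curve (voltage falling as distance grows, with a bounded detectable range) follows the description above:

```python
# Hypothetical association data: (output voltage, distance) pairs, with
# voltage decreasing monotonically as distance grows beyond d1.
ASSOCIATION_DATA = [  # (voltage [V], distance [cm]) - invented values
    (2.8, 4.0),   # d1: closest usable distance
    (2.0, 8.0),
    (1.2, 15.0),
    (0.6, 30.0),  # d2: farthest usable distance
]

def voltage_to_distance(v: float):
    """Linearly interpolate the stored voltage-to-distance correspondence.
    Returns None when the voltage falls outside the detectable range."""
    if v > ASSOCIATION_DATA[0][0] or v < ASSOCIATION_DATA[-1][0]:
        return None
    for (v_hi, d_lo), (v_lo, d_hi) in zip(ASSOCIATION_DATA, ASSOCIATION_DATA[1:]):
        if v_lo <= v <= v_hi:
            t = (v_hi - v) / (v_hi - v_lo)   # 0 at v_hi, 1 at v_lo
            return d_lo + t * (d_hi - d_lo)
    return None
```

Because all eight sensors share one configuration, a single table like this can serve the whole array, as noted above.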
  • FIG. 9 is a diagram showing an example of measurement by the sensor array 30.
  • FIG. 9A is a diagram for explaining output voltages of the distance measuring sensors 31 to 38 when the user touches the area 911 in the display area 810 with the finger 910.
  • FIG. 9B is a diagram for explaining output voltages of the distance measuring sensors 31 to 38 when the user touches the area 912 in the display area 810 with the finger 910.
  • FIG. 9C is a diagram for explaining output voltages of the distance measuring sensors 31 to 38 when the user touches the area 913 in the display area 810 with the finger 910.
  • the CPU 60 specifies the position in the Y direction of the area 911 in the display area 810 based on the voltage value V2 that is the highest voltage value and the association data 71.
  • the CPU 60 specifies the position in the X direction of the area 911 in the display area 810 based on the position of the distance measuring sensor 33. Specifically, in the memory 70, data in which each distance measuring sensor is associated with the X coordinate in the display area 810 is stored in advance. The CPU 60 specifies the X coordinate associated with the distance measuring sensor having the highest output voltage value based on the data.
  • the CPU 60 specifies the position in the Y direction of the area 912 in the display area 810 based on the voltage value V3 and the association data 71. Further, the CPU 60 specifies the position in the X direction of the area 912 in the display area 810 based on the position of the distance measuring sensor 35. Since the region 912 is farther from the sensor array 30 than the region 911 illustrated in FIG. 9A, the voltage value V3 is smaller than the voltage value V2.
  • the CPU 60 specifies the position of the area 913 in the Y direction in the display area 810 based on the voltage value V4 and the association data 71.
  • the CPU 60 specifies the position in the X direction of the area 913 in the display area 810 based on the positions of the distance measuring sensors 34 and 35. Specifically, the CPU 60 sets the average value of the X coordinate associated with the distance measuring sensor 34 and the X coordinate associated with the distance measuring sensor 35 as the position of the region 913 in the X direction.
  • the voltage value V4 is smaller than the voltage value V2.
  • The voltage value V4 is larger than the voltage value V3 because the region 913 is closer to the sensor array 30 than the region 912 shown in FIG. 9B.
  • the CPU 60 can detect which position in the display area 810 has been touched. That is, the CPU 60 can detect the position of the finger 910 in the aerial image displayed in the display area 810.
  • the CPU 60 calculates the coordinate value of the position of the finger 910 in the display area 810 based on the output of the sensor array 30. Based on the coordinate value, the CPU 60 can determine that a command for selecting the object has been input by the user.
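The position calculation just described (Y from the peak output voltage via the association data, X from the coordinate assigned to the responding sensor, averaging when two sensors respond equally as in region 913) can be sketched as follows. The sensor X coordinates and the linear voltage-to-distance stand-in are invented values, not taken from the patent:

```python
SENSOR_X = [5, 15, 25, 35, 45, 55, 65, 75]   # assumed X coordinate per sensor

def voltage_to_distance(v):
    """Stand-in for the association data 71: higher voltage means a
    shorter distance (invented linear relation, invented range)."""
    return None if not (0.5 <= v <= 3.0) else 32.0 - 10.0 * v

def detect_position(voltages):
    """Return (x, y) of the touch in the display area, or None."""
    v_max = max(voltages)
    y = voltage_to_distance(v_max)       # Y from the peak voltage
    if y is None:
        return None                      # nothing in the detectable range
    # X: average the coordinates of all sensors at the peak voltage,
    # as done for region 913 where two adjacent sensors respond equally.
    xs = [SENSOR_X[i] for i, v in enumerate(voltages) if v == v_max]
    return (sum(xs) / len(xs), y)

# Two adjacent sensors (34 and 35 in the text) responding equally:
pos = detect_position([0.1, 0.1, 0.1, 2.0, 2.0, 0.1, 0.1, 0.1])
```

With a single responding sensor the X coordinate is simply that sensor's assigned coordinate, matching the region 911 and 912 cases above.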
  • FIG. 10 is a flowchart showing the flow of processing in the display system 1.
  • the distance measuring sensors 31 to 38 are referred to as a first distance measuring sensor, a second distance measuring sensor,..., An eighth distance measuring sensor, respectively.
  • CPU 60 sets the value of variable j to 1.
  • In step S4, the j-th distance measuring sensor emits infrared rays.
  • the j-th distance measuring sensor receives infrared light (reflected light) reflected by the finger 910.
  • In step S8, the j-th distance measuring sensor outputs a voltage value based on the distance to the object (that is, the finger 910). More precisely, the output voltage is based on the received infrared light together with any received indoor or external light.
  • In step S10, the CPU 60 increases the value of j by one; that is, the CPU 60 increments j.
  • In step S12, the CPU 60 determines whether j is greater than 8.
  • If the CPU 60 determines that j is greater than 8 (YES in step S12), it acquires, in step S14, the voltage values output from the distance measuring sensors 31 to 38 from the A/D converter 90. Specifically, for each distance measuring sensor, the CPU 60 acquires the digital value obtained by A/D conversion of the analog voltage value output by that sensor. If the CPU 60 determines that j is not greater than 8 (NO in step S12), the process returns to step S4.
  • In step S16, the CPU 60 detects the position of the finger 910 in the two-dimensional image based on each voltage value output by each distance measuring sensor and the association data 71.
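The S2-to-S16 flow above amounts to polling the eight sensors in turn and handing the digitized voltages to the position detector. A minimal sketch, with an invented stub standing in for the sensor drivers and the A/D conversion:

```python
class StubSensor:
    """Hypothetical stand-in for one distance measuring sensor's driver."""
    def __init__(self, voltage):
        self._v = voltage
    def emit(self):                      # emit infrared light
        pass
    def receive(self):                   # receive the reflected light
        return self._v
    def output_voltage(self, reflected): # voltage based on distance
        return reflected

def measure_all(sensors):
    """Mirror of the Fig. 10 loop, steps S2 through S14."""
    voltages = []
    j = 1                                # S2: set j to 1
    while j <= 8:                        # S12: repeat until j exceeds 8
        sensor = sensors[j - 1]
        sensor.emit()                    # S4: emit infrared rays
        reflected = sensor.receive()     # receive reflection from the finger
        voltages.append(sensor.output_voltage(reflected))  # S8
        j += 1                           # S10: increment j
    return voltages                      # S14: CPU reads all digitized values

readings = measure_all(
    [StubSensor(v) for v in (0.1, 0.2, 2.2, 0.3, 0.1, 0.1, 0.1, 0.1)])
```

Step S16 would then pass `readings` together with the association data 71 to the position-detection routine.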
  • As described above, by installing the sensor array 30, in which the plurality of distance measuring sensors 31 to 38 are arranged in a row, below the two-dimensional image (aerial image), the position of an object in the two-dimensional image can be determined. Therefore, the display system 1 can detect the position of the object in the two-dimensional image with a simple configuration, without surrounding the aerial two-dimensional image with a frame.
  • FIG. 11 is a diagram illustrating an appearance and a usage state of a display system 1A which is a first modification of the display system 1.
  • the display system 1A includes a housing 10A, an opening 20A, a sensor array 30, and a microlens array 40.
  • the opening 20A has a rectangular shape. Like the opening 20, the opening 20A is formed below the display area 810 along the side 810b of the display area 810. The length of the opening 20A in the X direction is the same as that of the opening 20, while its length in the Z direction is longer than that of the opening 20.
  • the sensor array 30 is configured to be movable in the Z direction.
  • FIG. 12 is a cross-sectional view taken along line XII-XII in FIG.
  • the display system 1A includes a sensor array 30, a microlens array 40, and a display 50 in a housing 10A.
  • the display system 1A includes a moving mechanism 110 (see FIG. 12) that translates the display in the direction of the arrow 711 (third direction).
  • the display system 1A includes a moving mechanism 120 (see FIG. 12) that translates the sensor array in the direction of the arrow 713 (fourth direction).
  • the direction of the arrow 711 and the direction of the arrow 713 are the Z direction. That is, the direction of the arrow 711 and the direction of the arrow 713 are directions perpendicular to the display surface of the display 50.
  • the moving mechanisms 110 and 120 are configured using, for example, an actuator.
  • When the display 50 moves, the position of the display area 810 also moves in the direction of the arrow 712. Specifically, when the display 50 moves in the negative direction of the Z axis, the display area 810 moves in parallel in the positive direction of the Z axis by a distance corresponding to the amount of movement of the display 50. When the display 50 moves in the positive direction of the Z axis, the display area 810 moves in parallel in the negative direction of the Z axis by a distance corresponding to the amount of movement of the display 50.
  • The moving mechanism 120 moves the sensor array 30 in the same direction, and by the same amount, as the display area 810. In other words, the sensor array 30 moves in a direction and by an amount corresponding to the movement of the display 50. Thus, in the display system 1A, the sensor array 30 moves in conjunction with the movement of the display 50.
  • FIG. 13 is a block diagram showing a part of the hardware configuration of the display system 1A.
  • The display system 1A includes a sensor array 30, a display 50, a CPU 60, a memory 70A, a display driving device 80, an A/D converter 90, a moving mechanism 110, and a moving mechanism 120.
  • The memory 70A stores positional relationship data 72 in addition to the programs and data stored in the memory 70.
  • the positional relationship data 72 is data indicating the correspondence between the position of the display 50 and the position of the sensor array 30.
  • When the CPU 60 receives a command to move the moving mechanism 110 from an input device (for example, an operation key; not shown), the CPU 60 sends the moving mechanism 110 a command to move the display 50 in the direction and by the amount corresponding to the received command.
  • the moving mechanism 110 moves the display 50 based on a command from the CPU 60.
  • When the CPU 60 receives such a command to move the moving mechanism 110, the CPU 60 further determines the position of the sensor array 30 based on the position of the display 50 and the positional relationship data 72. When the position of the sensor array 30 is determined, the CPU 60 moves the sensor array 30 to the determined position using the moving mechanism 120.
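As an illustration only, the lookup-and-follow behavior described above (the CPU consulting the positional relationship data 72 and driving the two moving mechanisms) might be sketched as follows. The table contents, units, and all names here are assumptions made for the sketch and do not appear in the patent.

```python
# Hypothetical positional relationship data (72):
# display Z position (mm) -> sensor array Z position (mm).
POSITIONAL_RELATIONSHIP = {0: 0, -5: 5, -10: 10, 5: -5, 10: -10}

class Controller:
    """Toy stand-in for the CPU 60 driving moving mechanisms 110 and 120."""

    def __init__(self):
        self.display_z = 0
        self.sensor_array_z = 0

    def move_display(self, new_z):
        """Move the display, then move the sensor array to the position
        associated with the new display position, so the array follows
        the (mirrored) movement of the display area."""
        if new_z not in POSITIONAL_RELATIONSHIP:
            raise ValueError("no entry for this display position")
        self.display_z = new_z                                # mechanism 110
        self.sensor_array_z = POSITIONAL_RELATIONSHIP[new_z]  # mechanism 120

ctrl = Controller()
ctrl.move_display(-5)
print(ctrl.sensor_array_z)  # the sensor array has followed the display area
```

A dictionary lookup is the simplest possible model; a real implementation would presumably interpolate or compute the mapping from the optics.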
  • the display system 1A can change the position of the display area 810 in the Z-axis direction. That is, the display system 1A can change the position of the aerial image. Further, the display system 1A causes the sensor array 30 to follow the movement of the display area 810. Therefore, the display system 1A can detect the position of the object in the two-dimensional image while allowing the position of the display area 810 to be changed.
  • FIG. 14 is a diagram illustrating an appearance and a usage state of a display system 1B which is a second modification of the display system 1.
  • display system 1B includes a housing 10B, an opening 20B, an opening 20C, a sensor array 30, a microlens array 40, and a sensor array 140.
  • The opening 20B and the opening 20C have the same shape as the opening 20. Similarly to the opening 20, the opening 20B is formed below the display area 810 along the side 810b of the display area 810. The opening 20C is formed parallel to the opening 20B, on the negative Z-axis side of the opening 20B.
  • the sensor array 30 is disposed along the opening 20B in the housing 10B. As described above, the sensor array 30 is installed such that the sensing surfaces of the distance measuring sensors 31 to 38 face the display area 810.
  • the sensor array 140 has the same configuration as the sensor array 30. That is, the sensor array 140 has eight distance measuring sensors arranged in a row in the X-axis direction. The sensor array 140 is arranged in parallel with the sensor array 30 at a position where it intersects with a virtual plane (not shown) parallel to the display area 810.
  • the sensor array 140 is disposed at a position obtained by translating the sensor array 30 in the negative direction of the Z axis. That is, the X coordinate values of the eight distance measuring sensors of the sensor array 140 and the eight distance measuring sensors 31 to 38 of the sensor array 30 are all the same. Also, the Y coordinate values of the sensor array 140 and the sensor array 30 are the same.
  • the display system 1B includes a sensor array 30, a microlens array 40, a display 50, and a sensor array 140 in a housing 10B.
  • the sensing area 610 by the sensor array 140 is formed in the virtual plane. That is, the virtual plane includes the sensing area 610.
  • the sensor array 140 is installed so that the sensing surface of each distance measuring sensor of the sensor array 140 faces the Y direction. That is, the sensor array 140 emits infrared rays in the same direction as the sensor array 30 (the positive direction of the Y axis).
  • FIG. 16 is a block diagram showing a part of the hardware configuration of the display system 1B.
  • The display system 1B includes a sensor array 30, a display 50, a CPU 60, a memory 70, a display driving device 80, an A/D converter 90, a sensor array 140, and an A/D converter 150.
  • Sensor array 140 outputs an analog voltage value as a sensing result to A / D converter 150.
  • the A / D converter 150 converts an analog voltage value into a digital voltage value.
  • the A / D converter 150 sends the converted digital voltage value to the CPU 60.
  • the CPU 60 detects the position of the finger 910 in the virtual plane based on the voltage value output by each of the distance measuring sensors included in the sensor array 140 and the association data 71. More precisely, the CPU 60 detects the position of the finger 910 in the sensing area 610 based on the voltage value output by each of the distance measuring sensors included in the sensor array 140 and the association data 71.
  • CPU 60 determines whether or not the position on the plane including the display area 810 corresponding to the position of the finger 910 on the virtual plane is included in the area for displaying the object in the display area 810. More precisely, the CPU 60 determines whether or not the position in the display area 810 corresponding to the position of the finger 910 in the sensing area 610 is included in the area for displaying the object in the display area 810.
  • Based on the determination that the position is included in the area for displaying the object, the CPU 60 changes the display mode of the object on the display 50 from the first display mode to the second display mode.
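The determination above is essentially a point-in-rectangle hit test after mapping the finger position from the sensing area 610 to display-area coordinates. A minimal sketch, assuming (for illustration only) that the two areas share XY coordinates and that object display areas are axis-aligned rectangles:

```python
# Hypothetical object display rectangles in display-area coordinates:
# (x_min, y_min, x_max, y_max). Values are assumptions for the sketch.
OBJECTS = {
    "A": (0, 0, 30, 20),
    "B": (40, 0, 70, 20),
}

def object_at(finger_x, finger_y):
    """Return the name of the object whose display rectangle contains the
    mapped finger position, or None if the finger is over no object."""
    for name, (x0, y0, x1, y1) in OBJECTS.items():
        if x0 <= finger_x <= x1 and y0 <= finger_y <= y1:
            return name
    return None

print(object_at(50, 10))  # finger over object B
print(object_at(35, 10))  # between objects: no hit
```

If a hit is found, the display driver would then switch that object from the first display mode (normal color/size) to the second (highlight color or enlarged size), as in FIG. 17.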
  • FIG. 17 is a diagram for explaining the first display mode and the second display mode.
  • FIG. 17A is a diagram illustrating a first example of the display mode after the change.
  • FIG. 17B is a diagram showing a second example of the display mode after the change.
  • Assume that the position in the plane including the display area 810 that corresponds to the position of the finger 910 in the virtual plane is included in the display area of the object B in the display area 810. More specifically, assume that the XY-coordinate values of the finger 910 in the sensing area 610 fall within the display area of the object B in the display area 810. In this case, the CPU 60 changes the color of the object B on the display 50 from the normal color to a color different from the normal color.
  • Alternatively, the CPU 60 changes the size of the object B on the display 50 from the normal size to a size larger than the normal size.
  • FIG. 18 is a flowchart showing the flow of display mode change processing in the display system 1B. More specifically, FIG. 18 shows processing performed when the objects A to F shown in FIG. 17 are displayed.
  • In step S22, the CPU 60 determines whether the finger 910 has touched the sensing area 610 based on the output of the sensor array 140.
  • If the CPU 60 determines that the finger 910 has touched the sensing area 610 (YES in step S22), the CPU 60 determines in step S24 whether the position of the finger 910 in the sensing area 610 corresponds to one of the objects A to F in the display area 810.
  • If the CPU 60 determines that the finger 910 has not touched the sensing area 610 (NO in step S22), the CPU 60 returns the process to step S22.
  • If the CPU 60 determines that the position corresponds to one of the objects A to F (YES in step S24), it changes the color of the object corresponding to the position of the finger 910 in step S26. If the CPU 60 determines that the position does not correspond to any of the objects A to F (NO in step S24), it generates an error sound in step S34.
  • In step S28, the CPU 60 determines, based on the output of the sensor array 30, whether the finger 910 has touched any of the objects A to F in the display area 810. If the CPU 60 determines that an object has been touched (YES in step S28), it determines in step S30 whether the object at the position touched by the finger 910 is the same as the object whose color was changed. If the CPU 60 determines that no object has been touched (NO in step S28), the process returns to step S28.
  • In step S32, the CPU 60 performs the operation corresponding to the object touched by the finger 910.
  • If the CPU 60 determines that they are not the same object (NO in step S30), the process proceeds to step S36.
  • When the finger 910 reaches the sensing area 610, the display system 1B changes the display mode of the object at the position corresponding to the reached position. The user can therefore know in advance which object will be selected by moving the finger 910 further in the negative Z direction. The display system 1B is thus superior in operability to the display system 1.
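The flow of FIG. 18 can be sketched as a small control routine. This is a hedged paraphrase only: the two sensor arrays are replaced by callables, the step comments follow the flowchart, and every helper name is an assumption made for the sketch (the patent does not specify what step S36 does, so the sketch merely branches there).

```python
def display_mode_process(finger_in_610, object_under_finger, finger_touch_810,
                         highlight, act, error_sound):
    """One pass of the FIG. 18 display-mode change processing.

    finger_in_610()       -> True once the finger reaches sensing area 610
    object_under_finger() -> object name at the finger position, or None
    finger_touch_810()    -> object name touched in display area 810, or None
    """
    # S22: poll until the finger reaches sensing area 610.
    while not finger_in_610():
        pass
    # S24: does the finger position correspond to one of objects A-F?
    hovered = object_under_finger()
    if hovered is None:
        error_sound()              # S34: no object at that position
        return
    highlight(hovered)             # S26: change the object's color
    # S28: touch detection in display area 810 by sensor array 30.
    touched = finger_touch_810()
    if touched == hovered:         # S30: same object as the highlighted one?
        act(touched)               # S32: perform the corresponding operation
    else:
        error_sound()              # proceed to S36 (modeled here as an error)

# Demo with stub sensors: the finger hovers over and then touches object B.
log = []
display_mode_process(
    finger_in_610=lambda: True,
    object_under_finger=lambda: "B",
    finger_touch_810=lambda: "B",
    highlight=lambda o: log.append("highlight " + o),
    act=lambda o: log.append("act " + o),
    error_sound=lambda: log.append("error"))
print(log)
```

The busy-wait at S22/S28 stands in for the repeated sampling loops of the flowchart; real firmware would poll the A/D converters on a timer.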
  • the arrangement of the sensor array 30 and the sensor array 140 may be reversed. That is, the sensor array 140 may be arranged so that the sensing area 610 is sandwiched between the display area 810 and the microlens array 40.
  • the display system according to the third modification (hereinafter referred to as “display system 1C”) is different from the display system 1 in the configuration of the sensor array. More specifically, the sensor array of the display system 1C is different from the display system 1 in the configuration of the distance measuring sensor.
  • Each distance measuring sensor in the display system 1C includes one light emitting device and two light receiving devices. In this respect, the display system 1C differs from the display system 1, in which each distance measuring sensor includes one light emitting device and one light receiving device. Note that the number of distance measuring sensors is the same in the display system 1C and the display system 1.
  • FIG. 19 is a diagram for explaining the configuration of the distance measuring sensor 31A of the display system 1C.
  • the light emitting device 311 includes the infrared LED 311a and the lens 311b as described above.
  • the light receiving device 312 includes an infrared light receiving element 312a and a lens 312b.
  • the light receiving device 313 includes an infrared light receiving element 313a and a lens 313b. The characteristics of the infrared light receiving element 313a are different from those of the infrared light receiving element 312a.
  • FIG. 20 is a diagram showing the output characteristics of the sensors of the display system 1C, that is, the characteristics of the infrared light receiving element 312a and the infrared light receiving element 313a.
  • The infrared light receiving element 313a has a characteristic such that, when the distance d is greater than di (di < d2), the output voltage decreases as the distance to the sensing object increases. Therefore, the distance measuring sensor 31A can sense an object existing between the distances d1 and d3.
  • the infrared LED 311a emits infrared rays to the lens 311b.
  • the infrared rays emitted from the lens 311b are reflected by the object 950 that is a sensing target.
  • the reflected infrared light passes through the lens 312b and enters the infrared light receiving element 312a.
  • the infrared light emitted from the infrared LED 311a is reflected by the object 970.
  • the infrared light reflected by the object 970 passes through the lens 313b and enters the infrared light receiving element 313a. In this way, the display system 1C can also sense the object 970 existing at the distances d2 to d3.
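The range extension described above can be illustrated with a toy calibration model: one receiving element covers the near band (roughly d1 to d2) and the other takes over beyond it, so the sensor as a whole covers d1 to d3. The calibration tables, voltages, and distances below are invented for the sketch; only the "two receivers, two overlapping bands" idea comes from the patent.

```python
def voltage_to_distance(v, table):
    """Linearly interpolate a (voltage, distance) calibration table.
    Entries are sorted by decreasing voltage (the output voltage falls as
    the object moves away in the monotonic region of the characteristic).
    Returns None when the voltage is outside the calibrated range."""
    for (v_hi, d_hi), (v_lo, d_lo) in zip(table, table[1:]):
        if v_lo <= v <= v_hi:
            t = (v_hi - v) / (v_hi - v_lo)
            return d_hi + t * (d_lo - d_hi)
    return None

# Assumed calibration tables for the two light receiving elements.
NEAR_312A = [(3.0, 10.0), (1.0, 40.0)]    # element 312a: roughly d1..d2
FAR_313A = [(2.0, 40.0), (0.5, 100.0)]    # element 313a: roughly d2..d3

def estimate_distance(v_312a, v_313a):
    """Prefer element 312a within its calibrated band; otherwise fall
    back to element 313a, extending the overall sensing range to d1-d3."""
    d = voltage_to_distance(v_312a, NEAR_312A)
    if d is None:
        d = voltage_to_distance(v_313a, FAR_313A)
    return d

print(estimate_distance(2.0, 0.0))   # near object, handled by element 312a
print(estimate_distance(0.2, 1.25))  # far object, handled by element 313a
```

The fall-back order is one possible policy; combining the two outputs by weighting would also be plausible.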
  • the display system according to the fourth modification (hereinafter referred to as “display system 1D”) is different from the display system 1 in the configuration of the sensor array. More specifically, the sensor array of the display system 1D is different from the display system 1 in the number of distance measuring sensors.
  • FIG. 21 is a diagram for explaining the sensor array 30A of the display system 1D.
  • CPU 60 executes a program for displaying an object at a predetermined position on display 50.
  • the display 50 displays the objects G to N at eight predetermined positions.
  • The X-direction positions of the objects H and G, of the objects J and I, of the objects L and K, and of the objects N and M are respectively the same.
  • Therefore, the sensor array 30A only has to include distance measuring sensors 31, 33, 36, and 38 at the four positions corresponding to the objects displayed on the display 50.
  • In the display system 1D, the number of distance measuring sensors can be reduced as compared with the display system 1. Therefore, the manufacturing cost of the display system 1D can be lower than that of the display system 1.
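One way to picture the reduced sensor count: since the eight objects G to N share four X positions, the sensor index identifies the column and the measured distance identifies the row. The grid layout, row labels, and threshold below are assumptions for the sketch, not details from FIG. 21.

```python
# Four sensors, one per object column (an assumption about FIG. 21).
SENSOR_TO_COLUMN = {31: 0, 33: 1, 36: 2, 38: 3}

# Assumed placement: which letter sits in the near vs. far row is a guess.
OBJECT_GRID = [["G", "I", "K", "M"],   # near row (shorter measured distance)
               ["H", "J", "L", "N"]]   # far row (longer measured distance)

ROW_BOUNDARY_MM = 50.0  # assumed distance separating the two rows

def selected_object(sensor_id, distance_mm):
    """Resolve the touched object from which sensor fired (column) and
    the distance it measured (row)."""
    col = SENSOR_TO_COLUMN[sensor_id]
    row = 0 if distance_mm < ROW_BOUNDARY_MM else 1
    return OBJECT_GRID[row][col]
```

This is why four sensors suffice for eight objects: the distance axis disambiguates the objects that share an X position.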
  • FIG. 22 is a diagram illustrating an appearance and a usage state of the display system 2.
  • the display system 2 includes a housing 11, an opening 21, a sensor array 30, and an optical element 41.
  • the optical element 41 transmits light emitted from the display (see FIG. 23) in the housing 11 and displays a two-dimensional image (aerial image) in the rectangular display area 820 in the air.
  • the optical element 41 for example, the imaging element of Patent Document 3 described as the background art can be used.
  • the display area 820 is an area surrounded by four sides 820a, 820b, 820c, and 820d. Note that the side 820a and the side 820b are parallel, and the side 820c and the side 820d are parallel.
  • the direction of the normal line of the display area 820 is the z direction in the xyz coordinate system.
  • the display area 820 is parallel to the xy plane.
  • the xyz coordinate system is a coordinate system obtained by rotating the XYZ coordinate system by a predetermined angle about the X axis as a rotation axis.
  • the opening 21 has a rectangular shape.
  • the opening 21 is formed along the side 820b of the display area 820 below the display area 820 (in the negative y-axis direction).
  • the sensor array 30 has a plurality of ranging sensors 31 to 38 arranged in a row in the x direction.
  • the sensor array 30 is arranged along the opening 21 in the housing 11. Specifically, the sensor array 30 is installed so that the sensing surfaces of the distance measuring sensors 31 to 38 face the display area 820.
  • the user touches the aerial image displayed in the display area 820 with, for example, the user's finger 910. Specifically, the user touches the object with the finger 910 in order to select the object included in the aerial image.
  • the display system 2 includes a sensor array 30 and a display 50 in the housing 11.
  • the display system 2 includes an optical element 41 in an opening provided on the surface of the housing 11.
  • the display 50 displays an image in the direction of the optical element 41.
  • the image displayed on the display 50 is displayed in the display area 820 as an aerial image by the optical element 41. More detailed description is as follows.
  • The display 50 is installed with an inclination of 90°−θa with respect to the optical element 41.
  • Therefore, the incident angle, with respect to the optical element 41, of the light emitted from the display 50 is also 90°−θa.
  • The optical element 41 emits the light received from the display 50 at an emission angle of 90°−θb.
  • the image displayed on the display 50 is displayed in the display area 820 as an aerial image.
  • Each of the distance measuring sensors 31 to 38 of the sensor array 30 emits light in the y direction, which is perpendicular to the arrangement direction (x direction) of the distance measuring sensors and to the normal direction (z direction) of the display area 820, and which is a direction approaching the display area 820. More specifically, the sensor array 30 is disposed at a position that intersects a plane including the display area 820. That is, the sensor array 30 is arranged at a position parallel to the sides 820a and 820b of the display area 820 (see FIG. 22).
  • The sensor array 30 may be arranged so that the light emitted from the distance measuring sensors 31 to 38 passes through the display area 820 (that is, so that the light overlaps the display area 820), or so that the light passes in the vicinity of the display area 820.
  • In the following, the case where the light emitted from the distance measuring sensors 31 to 38 passes through the display area 820 will be described as an example.
  • FIG. 24 is a block diagram showing a part of the hardware configuration of the display system 2.
  • display system 2 includes a sensor array 30, a display 50, a CPU 60, a memory 70, a display driving device 80, and an A / D converter 90. That is, the display system 2 has the same configuration as the display system 1 according to the first embodiment. Therefore, the description about each hardware of the display system 2 is not repeated here.
  • FIG. 25 is a diagram showing a correspondence relationship between each of the distance measuring sensors 31 to 38 and the display area 820.
  • each of the distance measuring sensors 31 to 38 emits light in the y direction.
  • Each of the distance measuring sensors 31 to 38 receives the light reflected by the object among the emitted light, and outputs a voltage value based on the distance to the object to the A / D converter 90.
  • The position of the object in the two-dimensional image can be determined by installing the sensor array 30, in which the plurality of distance measuring sensors 31 to 38 are arranged in a row, below the two-dimensional image. Therefore, the display system 2 can detect the position of the object in the two-dimensional image with a simple configuration, without surrounding the two-dimensional image displayed in the air with a frame.
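The detection principle above can be sketched in a few lines: each sensor in the row fixes one x position, and the distance it measures gives the y coordinate. The sensor pitch, the "strongest reflection wins" selection rule, and the linear calibration are assumptions for illustration; the real system would use the association data 71 (FIG. 8) as its calibration.

```python
SENSOR_PITCH_MM = 20.0   # assumed spacing of sensors 31-38 along x

def detect_position(voltages, voltage_to_distance):
    """Return (x, y) of the object in display-area coordinates.

    voltages: list of 8 output voltages, index 0 = sensor 31
    voltage_to_distance: calibration function (stand-in for data 71)
    """
    # The sensor seeing the strongest reflection is the one in front of
    # the object; its index gives the x coordinate.
    i = max(range(len(voltages)), key=lambda k: voltages[k])
    x = i * SENSOR_PITCH_MM
    # Its measured distance gives the y coordinate.
    y = voltage_to_distance(voltages[i])
    return x, y

# Example with a toy linear calibration (an assumption, not FIG. 8):
pos = detect_position([0.1, 0.1, 2.5, 0.2, 0.1, 0.1, 0.1, 0.1],
                      lambda v: 120.0 - 40.0 * v)
print(pos)  # the object is in front of sensor 33
```

The x resolution is limited by the sensor pitch, which is why the sensors are arranged in a row along the side of the display area.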
  • FIG. 26 is a diagram illustrating an appearance and a usage state of a display system 2A which is a first modification of the display system 2.
  • The display system 2A includes a housing 11A, an opening 21A, a sensor array 30, and an optical element 41.
  • The opening 21A has a rectangular shape. Similarly to the opening 21, the opening 21A is formed below the display area 820 along the side 820b of the display area 820.
  • the length of the opening 21A in the X direction is the same as that of the opening 21.
  • The length of the opening 21A in the Y direction is longer than that of the opening 21.
  • the sensor array 30 is configured to be movable in the Y direction.
  • FIG. 27 is a cross-sectional view taken along line XXVII-XXVII in FIG. 26.
  • display system 2A includes a sensor array 30 and a display 50 in housing 11A.
  • the display system 2A includes an optical element 41 in an opening provided on the surface of the housing 11A.
  • the position of the optical element 41 in the display system 2A is the same as the position of the optical element 41 in the display system 2.
  • the display system 2A includes a moving mechanism 110A (see FIG. 28) that translates the display 50 in the direction of the arrow 721 (third direction).
  • the display system 2A includes a moving mechanism 120A (see FIG. 28) that translates the sensor array 30 in the direction of the arrow 723 (fourth direction).
  • the direction of the arrow 721 is a direction in which the incident angle of the light emitted from the display 50 to the optical element 41 does not change.
  • the direction of the arrow 723 is the Y-axis direction. Note that the direction of the arrow 723 is also a direction having a component in the normal direction of the display region 820.
  • When the display 50 is moved, the position of the display area 820 also moves in the direction of the arrow 722.
  • the display area 820 moves in the direction of the normal line of the display area 820 based on the movement of the display 50. Note that the direction of the normal line is different from the direction of the arrow 721.
  • When the display 50 moves in one direction along the arrow 721, the display area 820 moves in parallel, away from the optical element 41, by a distance corresponding to the amount of movement of the display 50.
  • When the display 50 moves in the opposite direction, the display area 820 moves in parallel, toward the optical element 41, by a distance corresponding to the amount of movement of the display 50.
  • the sensor array 30 is moved in the direction according to the movement of the display area 820 by the moving mechanism 120A. In other words, the sensor array 30 moves in the direction corresponding to the movement of the display 50 by the amount corresponding to the movement amount of the display 50. Thus, in the display system 2A, the sensor array 30 moves in conjunction with the movement of the display 50.
  • FIG. 28 is a block diagram showing a part of the hardware configuration of the display system 2A.
  • The display system 2A includes a sensor array 30, a display 50, a CPU 60, a memory 70A, a display driving device 80, an A/D converter 90, a moving mechanism 110A, and a moving mechanism 120A.
  • the memory 70A stores the positional relationship data 72 as described above.
  • the positional relationship data 72 is data indicating the correspondence between the position of the display 50 and the position of the sensor array 30.
  • When the CPU 60 receives a command to move the moving mechanism 110A from an input device (for example, an operation key; not shown), the CPU 60 sends the moving mechanism 110A a command to move the display 50 in the direction and by the amount corresponding to the received command.
  • The moving mechanism 110A moves the display 50 based on a command from the CPU 60.
  • When the CPU 60 receives such a command to move the moving mechanism 110A, the CPU 60 further determines the position of the sensor array 30 based on the position of the display 50 and the positional relationship data 72. When the position of the sensor array 30 is determined, the CPU 60 moves the sensor array 30 to the determined position using the moving mechanism 120A.
  • the distance from the display area 820 changes according to the position of the sensor array 30.
  • The voltage value output by the sensor array 30 differs depending on the position of the sensor array 30. Therefore, in order to accurately detect the y coordinate value of the position of the finger 910 in the two-dimensional image, the CPU 60 needs to take the position of the sensor array 30 into consideration when specifying the y coordinate value of the position of the finger 910.
  • The CPU 60 detects the position of the finger 910 in the two-dimensional image based on the voltage values output from the distance measuring sensors 31 to 38, the association data 71, and the position of the sensor array 30. For example, a program that obtains the y coordinate value in the display area 820 by adding a distance reflecting the moving distance of the sensor array 30 to the distance obtained from the output voltage using the characteristics of the distance measuring sensor (see FIG. 8) may be stored in the memory 70A.
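The correction just described reduces to one addition: the distance read through the sensor characteristic is measured from the array's current position, so the array's displacement must be added to land in display-area coordinates. The function names, the sign convention, and the linear calibration below are assumptions for the sketch.

```python
def finger_y_in_display_area(output_voltage, sensor_array_offset_mm,
                             voltage_to_distance):
    """y coordinate of the finger in display-area coordinates.

    output_voltage:         reading from the distance measuring sensor
    sensor_array_offset_mm: how far the array has been translated away
                            from its reference position (assumed sign)
    voltage_to_distance:    calibration function (stand-in for FIG. 8)
    """
    measured = voltage_to_distance(output_voltage)  # distance from the sensor
    return measured + sensor_array_offset_mm        # account for the moved array

# With a toy linear calibration and the array moved 15 mm away:
y = finger_y_in_display_area(1.5, 15.0, lambda v: 100.0 - 30.0 * v)
print(y)  # 100 - 45 + 15 = 70.0
```

The alternative configuration mentioned below (correcting by the display's moving distance instead) would differ only in which offset is added.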
  • the display system 2A can change the position of the display area 820 in the z-axis direction. That is, the display system 2A can change the position of the aerial image. Further, the display system 2A causes the sensor array 30 to follow the movement of the display area 820. Therefore, the display system 2A can detect the position of the object in the two-dimensional image while allowing the position of the display area 820 to be changed.
  • Alternatively, the display system 2A may be configured so that the CPU 60 detects the position of the finger 910 in the two-dimensional image based on the voltage values output from the distance measuring sensors 31 to 38, the association data 71, and the position of the display 50. In this case, for example, a program that obtains the y coordinate value in the display area 820 by adding a distance reflecting the moving distance of the display 50 to the distance obtained from the output voltage using the characteristics of the distance measuring sensor (see FIG. 8) may be stored in the memory 70A.
  • FIG. 29 is a diagram illustrating an appearance and a usage state of a display system 2B which is a second modification of the display system 2.
  • display system 2B includes a casing 11B, an opening 21B, a sensor array 30, and an optical element 41.
  • The opening 21B has a rectangular shape. Similarly to the opening 21A, the opening 21B is formed below the display area 820 along the side 820b of the display area 820. The length of the opening 21B in the X direction is the same as that of the opening 21. The length of the opening 21B in the Y direction is longer than that of the opening 21.
  • the sensor array 30 is configured to be movable in a predetermined direction (see FIG. 30). For example, the opening 21B has the same shape as the opening 21A of the display system 2A.
  • FIG. 30 is a cross-sectional view taken along line XXX-XXX in FIG. 29.
  • the display system 2B includes a sensor array 30 and a display 50 in a housing 11B.
  • the display system 2B includes an optical element 41 in an opening provided on the surface of the housing 11B. Note that the position of the optical element 41 in the display system 2B is the same as the position of the optical element 41 in the display system 2.
  • The display system 2B includes a moving mechanism 110B (see FIG. 31) that moves the display 50 in the direction of the arrow 731 (third direction). Further, the display system 2B includes a moving mechanism 120B (see FIG. 31) that moves the sensor array 30 in the direction of the arrow 733 (fourth direction).
  • the direction of the arrow 731 is a direction in which the incident angle of the light with respect to the optical element 41 based on the image displayed on the display 50 is changed.
  • the direction of the arrow 733 is a direction in which the emission angle of each light emitted from the plurality of distance measuring sensors 31 to 38 is changed.
  • the display 50 rotates within a predetermined first range around a predetermined first rotation axis (not shown).
  • the sensor array 30 rotates within a predetermined second range around a predetermined second rotation axis J1.
  • The position of the display area 830 is also moved in the direction of the arrow 732. Specifically, the display area 820 rotates around the second rotation axis J1 based on the movement of the display 50.
  • When the display 50 rotates in one direction, the display area 820 moves away from the optical element 41 by a distance corresponding to the movement amount (rotation amount) of the display 50.
  • When the display 50 rotates in the opposite direction, the display area 820 moves toward the optical element 41 by a distance corresponding to the movement amount (rotation amount) of the display 50.
  • The moving mechanism 120B moves the sensor array 30 in a direction according to the movement of the display area 820. More specifically, the sensor array 30 rotates in the direction corresponding to the movement of the display 50 by an amount corresponding to the rotation amount of the display 50. Thus, in the display system 2B, the sensor array 30 moves in conjunction with the movement of the display 50.
  • FIG. 31 is a block diagram showing a part of the hardware configuration of the display system 2B.
  • The display system 2B includes a sensor array 30, a display 50, a CPU 60, a memory 70A, a display driving device 80, an A/D converter 90, a moving mechanism 110B, and a moving mechanism 120B.
  • When the CPU 60 receives a command to move the moving mechanism 110B from an input device (for example, an operation key; not shown), the CPU 60 sends the moving mechanism 110B a command to move the display 50 in the direction and by the amount corresponding to the received command.
  • The moving mechanism 110B moves the display 50 based on a command from the CPU 60.
  • When the CPU 60 receives such a command to move the moving mechanism 110B, the CPU 60 further determines the position of the sensor array 30 based on the position of the display 50 and the positional relationship data 72. When the position of the sensor array 30 is determined, the CPU 60 moves the sensor array 30 to the determined position using the moving mechanism 120B.
  • the display system 2B can change the angle of the display area 820 with respect to the optical element 41. That is, the display system 2B can change the angle of the aerial image with respect to the optical element 41. Further, the display system 2B causes the sensor array 30 to follow the movement of the display area 820. For this reason, the display system 2B can detect the position of the object in the two-dimensional image while allowing the angle of the display area 820 to the optical element 41 to be changed.
  • the display system 2 may be modified like a display system 1B (see FIG. 15) which is a second modification of the display system 1. That is, in the display system 2, the sensor array 140 may be arranged in parallel with the sensor array 30.
  • the display areas 810 and 820 are described as rectangular display areas. However, the display areas are not limited to rectangles.
  • The shape of the display area is a shape corresponding to the shape of the display surface of the display 50, the shape of the microlens array 40, or the shape of the optical element 41. Therefore, when the shape of the display surface of the display, the shape of the microlens array, or the shape of the optical element used in a display system differs from those of the display 50, the microlens array 40, and the optical element 41, the display area has a shape corresponding to that different shape. Even if the display area is not rectangular, according to the above-described configuration, the position of the object in the two-dimensional image can be detected with a simple configuration, without surrounding the two-dimensional image displayed in the air with a frame.
  • 1, 1A, 1B, 1C, 1D, 2, 2A, 2B display system, 10, 10A, 10B, 11, 11A, 11B housing, 20, 20A, 20B, 20C, 21, 21A, 21B opening, 30, 30A, 140 sensor array, 31-38, 31A distance measuring sensor, 40 microlens array, 41 optical element, 50 display, 70, 70A memory, 71 association data, 72 positional relationship data, 80 display driving device, 90, 150 A/D converter, 110, 110A, 110B, 120, 120A, 120B moving mechanism, 311 light emitting device, 311a infrared LED, 312, 313 light receiving device, 312a, 313a infrared light receiving element, 610 sensing area, 810, 820, 830 display area, 910 finger, A to N objects.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

A display system (1) is provided with a processor; a sensor array (30) arranged in a row in a first direction; and a memory which stores data indicative of the correspondence relationship between the output voltage and the distance for each distance measuring sensor. A direction normal to a display area (810) is perpendicular to the first direction. Each of a plurality of distance measuring sensors emits light in a second direction perpendicular to the first direction and to the direction normal to the display area (810), and in a direction approaching the display area (810). Each of the plurality of distance measuring sensors receives, of the light emitted, the light reflected from an object and outputs a voltage value based on the distance to the object. The processor detects the position of the object in a two-dimensional image on the basis of the voltage values output by the plurality of distance measuring sensors and the data stored in the memory.

Description

Display system and detection method

The present invention relates to a display system and a detection method, and more particularly to a display system that displays a two-dimensional image in the air and a method for detecting the position of an object in that two-dimensional image.

Conventionally, display systems that display a two-dimensional image in the air are known.
Japanese Patent Laying-Open No. 2005-141102 (Patent Document 1) discloses a stereoscopic two-dimensional image display device as such a display system. The device includes a display unit, a microlens array, a position detection sensor, and a control unit.

The display unit has an image display surface that displays a two-dimensional image. The microlens array forms the light emitted from the image display surface into an image on a stereoscopic image display surface spaced apart from the image display surface, thereby displaying the two-dimensional image in a pseudo-stereoscopic manner. The position detection sensor is arranged in correspondence with the stereoscopic image display surface and outputs a signal corresponding to the position at which a physical action is received from outside. The control unit changes the image on the stereoscopic image display surface according to the output signal from the position detection sensor.

Japanese Patent Laying-Open No. 9-55152 (Patent Document 2) discloses a display device having a touchless panel switch as such a display system. In the touchless panel switch, at least one reflective optical sensor, consisting of a light projecting element and a light receiving element, is installed in the space around a gradient-index lens element for each predetermined region, so that when a finger enters a region to be sensed, light from the projecting element is reflected by the finger and enters the receiving element.

Display systems that display two-dimensional and three-dimensional images in the air are also known.

International Publication No. WO 2007/116639 (Patent Document 3) discloses a display device including an imaging element as such a display system. The device forms a real image of a projection object, which is a two-dimensional or three-dimensional object, as a two-dimensional or three-dimensional image on the side of the imaging element opposite the object. In more detail, it operates as follows.

The imaging element is an optical element that bends light rays as they pass through an element surface forming a single plane. It is constructed by arranging a plurality of unit optical elements, each of which reflects light with one or more mirror surfaces arranged perpendicular, or nearly perpendicular, to the element surface. Light emitted from the projection object placed on one side of the element surface is reflected by these mirror surfaces as it passes through, forming a real image in a space with no physical entity on the other side of the element surface.

Japanese Patent Laying-Open No. 6-18258 (Patent Document 4) discloses an optical distance measuring sensor including a light receiving unit for long distances and a light receiving unit for short distances.

Patent Document 1: Japanese Patent Laying-Open No. 2005-141102
Patent Document 2: Japanese Patent Laying-Open No. 9-55152
Patent Document 3: International Publication No. WO 2007/116639
Patent Document 4: Japanese Patent Laying-Open No. 6-18258

However, Patent Document 1 requires position detection sensors to be arranged so as to surround the two-dimensional image displayed in the air; a frame is therefore needed around that image. As a result, the user has difficulty perceiving any difference between the image displayed by the stereoscopic two-dimensional image display device of Patent Document 1 and an image displayed on the panel of an ordinary display.

In Patent Document 2, a single sensor detects that an object such as a finger has reached one predetermined position in the aerial two-dimensional image. Covering the entire display area of the two-dimensional image therefore requires a large number of sensors, and positioning each sensor is very difficult.

Patent Document 3 cannot detect at which position within the formed real image, whether of a two-dimensional or a three-dimensional image, an object is present.

The present invention has been made in view of the above problems, and its object is to provide a display system capable of detecting, with a simple configuration, the position of an object in a two-dimensional image displayed in the air without surrounding the image with a frame, and a detection method for such a display system.

According to an aspect of the present invention, a display system includes: a display; an optical element that displays a two-dimensional image in an aerial display area based on an image shown on the display; a processor; a first sensor array in which a plurality of distance measuring sensors are arranged in a row in a first direction; and a memory storing first data indicating the correspondence between output voltage and distance for each distance measuring sensor. The normal direction of the display area is perpendicular to the first direction. Each of the distance measuring sensors emits light in a second direction perpendicular to both the first direction and the normal direction, toward the display area. Each distance measuring sensor receives the portion of the emitted light reflected by an object and outputs a voltage value based on the distance to that object. The processor detects the position of the object in the two-dimensional image based on the voltage values output by the distance measuring sensors and the first data.

Preferably, the first sensor array is disposed at a position intersecting the plane that includes the display area.

Preferably, the light emitted from each of the distance measuring sensors passes through the display area.
Preferably, the display system further includes a first moving mechanism that moves the display in a third direction and a second moving mechanism that moves the first sensor array in a fourth direction. The memory further stores second data indicating the correspondence between the position of the display and the position of the first sensor array. The display area moves as the display moves. The processor determines the position of the first sensor array based on the position of the display and the second data, and moves the first sensor array to the determined position using the second moving mechanism.

Preferably, the third direction is perpendicular to the display surface of the display. The display area moves in the third direction as the display moves in the third direction. The fourth direction is the same as the third direction.

Preferably, the third direction is perpendicular to the display surface of the display, and the normal direction of the display area differs from the third direction. The display area moves in the normal direction as the display moves in the third direction. The fourth direction has a component in the normal direction. The processor detects the position of the object in the two-dimensional image based on the voltage values output by the distance measuring sensors, the first data, and the position of the first sensor array or the position of the display.

Preferably, the third direction is a direction that changes the angle at which light based on the image shown on the display is incident on the optical element. The fourth direction is a direction that changes the emission angle of the light emitted by the distance measuring sensors.

Preferably, the display system further includes a second sensor array in which a plurality of distance measuring sensors are arranged in a row in the first direction. The second sensor array is arranged parallel to the first sensor array and emits light in the same direction as the first sensor array. The processor displays at least one object in the display area. Based on the voltage values output by the distance measuring sensors of the second sensor array and the first data, the processor detects the position of the physical object and determines whether the corresponding position on the plane including the display area falls within the region of the display area in which an object is displayed. When it determines that the position falls within the region displaying an object, the processor changes the display mode of that object in the display area from a first display mode to a second display mode.

Preferably, the processor displays at least one object in the display area, and the distance measuring sensors of the first sensor array are provided at positions corresponding to the at least one object.

Preferably, each distance measuring sensor of the first sensor array includes one light emitting element and two light receiving elements, each of which receives reflected light of the light emitted by the light emitting element.

Preferably, the processor displays at least one object in the display area and, when the position of the physical object detected by the first sensor array falls within the region displaying the at least one object, executes the processing associated with that object.

According to another aspect of the present invention, a detection method is provided for a display system that detects the position of an object in a two-dimensional image displayed in an aerial display area. The display system includes a processor, a sensor array in which a plurality of distance measuring sensors are arranged in a row in a first direction, and a memory storing data indicating the correspondence between output voltage and distance for each distance measuring sensor. The normal direction of the display area is perpendicular to the first direction. The detection method includes the steps of: each distance measuring sensor emitting light in a second direction perpendicular to both the first direction and the normal direction, toward the display area; each distance measuring sensor receiving the portion of the emitted light reflected by an object and outputting a voltage value based on the distance to that object; and the processor detecting the position of the object in the two-dimensional image based on the output voltage values and the data.

According to the present invention, the position of an object in a two-dimensional image displayed in the air can be detected with a simple configuration without surrounding the image with a frame.

FIG. 1 shows the appearance and usage state of a display system.
FIG. 2 is a cross-sectional view taken along line II-II in FIG. 1.
FIG. 3 is a block diagram showing part of the hardware configuration of the display system.
FIG. 4 illustrates the configuration of a sensor array.
FIG. 5 shows the appearance of a distance measuring sensor.
FIG. 6 illustrates the irradiation range of the light emitted by the sensor array.
FIG. 7 is a first diagram illustrating the measurement principle of the distance measuring sensor.
FIG. 8 is a second diagram illustrating the measurement principle of the distance measuring sensor.
FIG. 9 shows an example of measurement by the sensor array.
FIG. 10 is a flowchart showing the flow of processing in the display system.
FIG. 11 shows the appearance and usage state of another display system.
FIG. 12 is a cross-sectional view taken along line XII-XII in FIG. 11.
FIG. 13 is a block diagram showing part of the hardware configuration of that display system.
FIG. 14 shows the appearance and usage state of still another display system.
FIG. 15 is a cross-sectional view taken along line XV-XV in FIG. 14.
FIG. 16 is a block diagram showing part of the hardware configuration of that display system.
FIG. 17 illustrates a first display mode and a second display mode.
FIG. 18 is a flowchart showing the flow of display mode change processing in the display system.
FIG. 19 illustrates the configuration of the distance measuring sensor of still another display system.
FIG. 20 shows the output characteristics of the sensor of that display system.
FIG. 21 illustrates the sensor array of still another display system.
FIG. 22 shows the appearance and usage state of still another display system.
FIG. 23 is a cross-sectional view taken along line XXIII-XXIII in FIG. 22.
FIG. 24 is a block diagram showing part of the hardware configuration of that display system.
FIG. 25 shows the correspondence between each distance measuring sensor and the display area.
FIG. 26 shows the appearance and usage state of still another display system.
FIG. 27 is a cross-sectional view taken along line XXVII-XXVII in FIG. 26.
FIG. 28 is a block diagram showing part of the hardware configuration of that display system.
FIG. 29 shows the appearance and usage state of still another display system.
FIG. 30 is a cross-sectional view taken along line XXX-XXX in FIG. 29.
FIG. 31 is a block diagram showing part of the hardware configuration of that display system.

Embodiments of the present invention will be described below with reference to the drawings. In the following description, the same parts are denoted by the same reference numerals; their names and functions are also the same, so detailed descriptions of them will not be repeated.

In the following, a "direction" represents two mutually different orientations, for example two orientations opposite to each other. For instance, the direction of the X axis covers both the positive and the negative orientation of the X axis. "Two mutually different orientations" also includes pairs of orientations that do not differ by 180 degrees, such as the clockwise and counterclockwise orientations.

[Embodiment 1]
FIG. 1 shows the appearance and usage state of a display system 1. Referring to FIG. 1, the display system 1 includes a housing 10, an opening 20, a sensor array 30, and a microlens array (optical element) 40.

The microlens array 40 transmits the light emitted by a display inside the housing 10 (see FIG. 2) and displays a two-dimensional image (hereinafter also called an "aerial image") in a rectangular display area 810 in the air. The display area 810 is surrounded by four sides 810a, 810b, 810c, and 810d; side 810a is parallel to side 810b, and side 810c is parallel to side 810d. The normal direction of the display area 810 is the Z direction.

The opening 20 is rectangular and is formed below the display area 810, along its side 810b.

The sensor array 30 has a plurality of distance measuring sensors arranged in a row in the X direction and is disposed inside the housing 10 along the opening 20. Specifically, the sensor array 30 is installed so that the sensing surface of each distance measuring sensor faces the display area 810. The arrangement of the sensor array is described in detail later.

In the display system 1, the user touches the aerial image displayed in the display area 810 with, for example, a finger 910. Specifically, to select an object included in the aerial image, the user touches that object with the finger 910. An object is, for example, an icon image. Because the aerial image is not a physical object, touching the two-dimensional image involves no physical contact with it.

FIG. 2 is a cross-sectional view taken along line II-II in FIG. 1. Referring to FIG. 2, the display system 1 includes, inside the housing 10, the sensor array 30, the microlens array 40, and a display 50.

The microlens array 40 is a light wave control device in which microlenses are arranged in a lattice.

The display 50 displays an image toward the microlens array 40. As described above, the image displayed on the display 50 is shown in the display area 810 as an aerial image by the microlens array 40.

Each distance measuring sensor of the sensor array 30 emits light toward the display area 810 in the Y direction, which is perpendicular to both the direction in which the sensors are arranged (the X direction) and the normal direction of the display area 810 (the Z direction). More specifically, the sensor array 30 is arranged at a position intersecting the plane that includes the display area 810; that is, it is parallel to the sides 810a and 810b of the display area 810 (see FIG. 1).

The sensor array 30 may be arranged so that the light emitted by the distance measuring sensors passes through the display area 810 (that is, the light overlaps the display area 810), or so that the light travels along the display area 810. The following description takes as an example the case where the emitted light passes through the display area 810.

FIG. 3 is a block diagram showing part of the hardware configuration of the display system 1. Referring to FIG. 3, the display system 1 includes the sensor array 30, the display 50, a CPU (Central Processing Unit) 60, a memory 70, a display driving device 80, and an A/D (Analog/Digital) converter 90.

The display system 1 also includes a device that produces sound, such as a speaker. The same applies to the other display systems described later.

The sensor array 30 outputs analog voltage values, its sensing results, to the A/D converter 90. The A/D converter 90 converts the analog voltage values to digital voltage values and sends them to the CPU 60.

The memory 70 consists of, for example, ROM, RAM, and flash memory. It stores the programs executed by the display system 1 and various data such as association data 71, which is described later.

The CPU 60 executes the programs stored in advance in the memory 70. The CPU 60 also refers to the voltage values obtained from the A/D converter 90 and the association data 71 to execute the processing described later.

The display driving device 80 drives the display 50 in response to commands from the CPU 60.

FIG. 4 illustrates the configuration of the sensor array 30. The sensor array 30 has eight distance measuring sensors 31 to 38 arranged in a row. Each of the distance measuring sensors 31 to 38 emits light in the Y direction, receives the portion of that light reflected by an object, and outputs a voltage value based on the distance to the object to the A/D converter 90.

The distance measuring sensors 31 to 38 are arranged at equal intervals, and the distance from each sensor to the display area 810 is the same. The sensors 31 to 38 have identical configurations, differing only in their positions within the sensor array 30. The number of distance measuring sensors in the sensor array 30 is eight here, but is not limited to this.

FIG. 5 shows the appearance of the distance measuring sensor 31. Referring to FIG. 5, the distance measuring sensor 31 includes a light emitting device 311 and a light receiving device 312. The light emitting device 311 emits light outward. The light receiving device 312 receives the reflected portion of that light; it also receives light other than the reflection, such as indoor light or outside light.

FIG. 6 illustrates the irradiation range of the light emitted by the sensor array 30. FIG. 6(a) shows the irradiation range of the light emitted by the light emitting device 311 of the distance measuring sensor 31. Referring to FIG. 6(a), the light emitting device 311 emits light whose irradiation range widens with the distance d. For example, when the distance measuring sensor 31 emits infrared light, the irradiation range spreads ±1.5° around the optical axis.

FIG. 6(b) shows the irradiation range of the light emitted by the distance measuring sensors 31 to 38. Referring to FIG. 6(b), the eight distance measuring sensors can irradiate the entire display area 810.
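As a rough sanity check on this coverage, the width of one sensor's beam at a given distance follows directly from the stated ±1.5° spread. The short sketch below is illustrative only; the 1.5° half-angle is the value quoted above, while the distances used are assumptions.

```python
import math

def beam_width_mm(distance_mm, half_angle_deg=1.5):
    """Full width of the emitted beam at a given distance,
    assuming a +/-1.5 degree spread around the optical axis."""
    return 2.0 * distance_mm * math.tan(math.radians(half_angle_deg))

# At 200 mm from the sensor, each beam is only about 10 mm wide,
# which is why several sensors in a row are needed to cover the
# whole display area.
```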

Thus, in the display system 1, the display area 810 and the detection area of the sensor array 30 (hereinafter also called the "sensing area") overlap. Note that the display system 1 does not perform sensing in the Z direction.

FIG. 7 is a first diagram illustrating the measurement principle of the distance measuring sensor 31. Referring to FIG. 7, the light emitting device 311 includes an infrared LED (Light Emitting Diode) 311a, which is a light emitting element, and a lens 311b. The light receiving device 312 includes an infrared light receiving element 312a and a lens 312b.

The infrared LED 311a emits infrared light through the lens 311b. The infrared light emitted through the lens 311b is reflected by an object 950, the sensing target. The reflected infrared light passes through the lens 312b and enters the infrared light receiving element 312a.

When an object 960 is present instead of the object 950, the infrared light emitted from the infrared LED 311a is reflected by the object 960. As above, the reflected infrared light passes through the lens 312b and enters the infrared light receiving element 312a.

The light emitted by the distance measuring sensor 31 is not limited to infrared light.
FIG. 8 is a second diagram illustrating the measurement principle of the distance measuring sensor 31. Referring to FIG. 8, when the distance d is greater than d1, the distance measuring sensor 31 has the characteristic that its output voltage decreases as the distance to the sensing target (object 950 or 960, or finger 910) increases.

The distance range detectable by the distance measuring sensor 31 extends from the distance d1 to the distance d2, beyond which the output voltage falls to or below a fixed value. The sensor array 30 is configured with distance measuring sensors 31 to 38 whose detectable range includes the display area 810.

The association data 71 stored in the memory 70 represents the characteristic of FIG. 8. Specifically, the association data 71 indicates the correspondence between output voltage and distance for each of the distance measuring sensors 31 to 38. In this embodiment, since the eight distance measuring sensors have the same configuration, the association data 71 need only describe the voltage-distance correspondence of a single sensor.
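A minimal sketch of how such association data might be used in software to convert a sensor's output voltage back into a distance. The table values and the linear interpolation are assumptions for illustration; the patent only specifies that the data maps output voltage to distance along a monotonically decreasing curve like that of FIG. 8.

```python
# Assumed calibration points (output voltage in V, distance in mm),
# sampled from a monotonically decreasing curve as in FIG. 8.
# The values are illustrative, not from the patent.
ASSOCIATION_DATA = [(2.0, 50.0), (1.5, 100.0), (1.0, 200.0), (0.5, 400.0)]

def voltage_to_distance(v, table=ASSOCIATION_DATA):
    """Convert an output voltage to a distance by linear interpolation."""
    pts = sorted(table)  # ascending voltage
    if v <= pts[0][0]:
        return pts[0][1]   # farther than the calibrated range: clamp
    if v >= pts[-1][0]:
        return pts[-1][1]  # nearer than the calibrated range: clamp
    for (v0, d0), (v1, d1) in zip(pts, pts[1:]):
        if v0 <= v <= v1:
            t = (v - v0) / (v1 - v0)
            return d0 + t * (d1 - d0)
```

A denser table, or a fitted curve, would give a finer distance resolution at the cost of more memory.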

 図9は、センサアレイ30による測定例を示した図である。図9(a)は、表示領域810における領域911をユーザが指910で触ったときの各測距センサ31~38の出力電圧を説明するための図である。図9(b)は、表示領域810における領域912をユーザが指910で触ったときの各測距センサ31~38の出力電圧を説明するための図である。図9(c)は、表示領域810における領域913をユーザが指910で触ったときの各測距センサ31~38の出力電圧を説明するための図である。 FIG. 9 is a diagram showing an example of measurement by the sensor array 30. FIG. 9A is a diagram for explaining output voltages of the distance measuring sensors 31 to 38 when the user touches the area 911 in the display area 810 with the finger 910. FIG. 9B is a diagram for explaining output voltages of the distance measuring sensors 31 to 38 when the user touches the area 912 in the display area 810 with the finger 910. FIG. 9C is a diagram for explaining output voltages of the distance measuring sensors 31 to 38 when the user touches the area 913 in the display area 810 with the finger 910.

 図9(a)を参照して、領域911を指910で触った場合、測距センサ33が出力する電圧値V2は、他の測距センサが出力する電圧値V1に比べて高くなる。この場合、CPU60は、最も高い電圧値である電圧値V2と対応付けデータ71とに基づいて、表示領域810における領域911のY方向の位置を特定する。 Referring to FIG. 9A, when the area 911 is touched with the finger 910, the voltage value V2 output from the distance measuring sensor 33 is higher than the voltage value V1 output from the other distance measuring sensors. In this case, the CPU 60 specifies the position in the Y direction of the area 911 in the display area 810 based on the voltage value V2, which is the highest voltage value, and the association data 71.

 また、CPU60は、測距センサ33の位置に基づき、表示領域810における領域911のX方向の位置を特定する。具体的には、メモリ70には、各測距センサと表示領域810におけるX座標とを対応付けられたデータが予め格納されている。CPU60は、出力が最も高い電圧値となった測距センサに対応付けられたX座標を当該データに基づいて特定する。 Further, the CPU 60 specifies the position in the X direction of the area 911 in the display area 810 based on the position of the distance measuring sensor 33. Specifically, in the memory 70, data in which each distance measuring sensor is associated with the X coordinate in the display area 810 is stored in advance. The CPU 60 specifies the X coordinate associated with the distance measuring sensor having the highest output voltage value based on the data.

 図9(b)を参照して、領域912を指910で触った場合、測距センサ35が出力する電圧値V3は、他の測距センサが出力する電圧値V1に比べて高くなる。この場合、CPU60は、電圧値V3と対応付けデータ71とに基づいて、表示領域810における領域912のY方向の位置を特定する。また、CPU60は、測距センサ35の位置に基づき、表示領域810における領域912のX方向の位置を特定する。なお、領域912は、図9(a)に示した領域911よりもセンサアレイ30から離れているため、電圧値V3は電圧値V2よりも小さくなる。 Referring to FIG. 9B, when the area 912 is touched with the finger 910, the voltage value V3 output from the distance measuring sensor 35 is higher than the voltage value V1 output from the other distance measuring sensors. In this case, the CPU 60 specifies the position in the Y direction of the area 912 in the display area 810 based on the voltage value V3 and the association data 71. Further, the CPU 60 specifies the position in the X direction of the area 912 in the display area 810 based on the position of the distance measuring sensor 35. Since the region 912 is farther from the sensor array 30 than the region 911 illustrated in FIG. 9A, the voltage value V3 is smaller than the voltage value V2.

 図9(c)を参照して、領域913を指910で触った場合、測距センサ34,35が出力する電圧値V4は、他の測距センサが出力する電圧値V1に比べて高くなる。この場合、CPU60は、電圧値V4と対応付けデータ71とに基づいて、表示領域810における領域913のY方向の位置を特定する。 Referring to FIG. 9C, when the region 913 is touched with the finger 910, the voltage value V4 output from the distance measuring sensors 34 and 35 is higher than the voltage value V1 output from the other distance measuring sensors. In this case, the CPU 60 specifies the position in the Y direction of the area 913 in the display area 810 based on the voltage value V4 and the association data 71.

 また、CPU60は、測距センサ34,35の位置に基づき、表示領域810における領域913のX方向の位置を特定する。具体的には、CPU60は、測距センサ34に対応付けられたX座標と、測距センサ35に対応付けられたX座標との平均値を領域913のX方向の位置とする。 Further, the CPU 60 specifies the position in the X direction of the area 913 in the display area 810 based on the positions of the distance measuring sensors 34 and 35. Specifically, the CPU 60 sets the average value of the X coordinate associated with the distance measuring sensor 34 and the X coordinate associated with the distance measuring sensor 35 as the position of the region 913 in the X direction.

 なお、領域913は、図9(a)に示した領域911よりもセンサアレイ30から離れているため、電圧値V4は電圧値V2よりも小さくなる。また、図9(b)に示した領域912よりもセンサアレイ30に近いため、電圧値V4は電圧値V3よりも大きくなる。 Note that, since the region 913 is farther from the sensor array 30 than the region 911 shown in FIG. 9A, the voltage value V4 is smaller than the voltage value V2. In addition, since the region 913 is closer to the sensor array 30 than the region 912 shown in FIG. 9B, the voltage value V4 is larger than the voltage value V3.

 このような構成により、ユーザが指910で表示領域810を触ると、CPU60は、表示領域810のどの位置が触られたかを検出することができる。つまり、CPU60は、表示領域810に表示されている空中画像における指910の位置を検出できる。 With this configuration, when the user touches the display area 810 with the finger 910, the CPU 60 can detect which position in the display area 810 has been touched. That is, the CPU 60 can detect the position of the finger 910 in the aerial image displayed in the display area 810.
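 The coordinate determination described with reference to FIGS. 9A to 9C can be sketched as follows. The per-sensor X coordinates, the touch threshold, and the voltage-to-distance callable are assumptions for illustration only; the embodiment stores the sensor-to-X-coordinate mapping in the memory 70.

```python
# Hypothetical X coordinate of each of the eight distance measuring sensors.
SENSOR_X = [10, 30, 50, 70, 90, 110, 130, 150]

def detect_position(voltages, voltage_to_distance, threshold):
    """Return the (x, y) touch position in the display area, or None.

    Y comes from the peak output voltage via the association data
    (FIGS. 9A and 9B); X comes from the position of the peak sensor.
    When adjacent sensors tie at the peak, as in FIG. 9C, their X
    coordinates are averaged."""
    v_max = max(voltages)
    if v_max < threshold:
        return None  # no finger in the display area
    # All sensors sharing the peak value (exact tie is an idealization).
    peaks = [i for i, v in enumerate(voltages) if v == v_max]
    x = sum(SENSOR_X[i] for i in peaks) / len(peaks)
    y = voltage_to_distance(v_max)
    return (x, y)
```

A usage example: with a peak on sensor 33 (index 2) the X position is that sensor's coordinate, while a tie between sensors 34 and 35 (indices 3 and 4) yields the average of their coordinates, matching the FIG. 9C behavior.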

 たとえば、表示領域810に選択可能なオブジェクトが表示されている場合、ユーザが指910で当該オブジェクトを触ったとする。このとき、CPU60は、センサアレイ30の出力に基づいて表示領域810における指910の位置の座標値を算出する。CPU60は、当該座標値に基づき、当該オブジェクトを選択する指令がユーザにより入力されたと判断できる。 For example, when a selectable object is displayed in the display area 810, it is assumed that the user touches the object with the finger 910. At this time, the CPU 60 calculates the coordinate value of the position of the finger 910 in the display area 810 based on the output of the sensor array 30. Based on the coordinate value, the CPU 60 can determine that a command for selecting the object has been input by the user.

 図10は、表示システム1における処理の流れを示したフローチャートである。なお、以下では、測距センサ31~38を、それぞれ、1番目の測距センサ、2番目の測距センサ、…、8番目の測距センサと称する。図10を参照して、ステップS2において、CPU60は、変数jの値を1とする。ステップS4において、j番目の測距センサが赤外線を出射する。ステップS6において、j番目の測距センサが、指910によって反射した赤外線(反射光)を受光する。 FIG. 10 is a flowchart showing the flow of processing in the display system 1. In the following description, the distance measuring sensors 31 to 38 are referred to as the first distance measuring sensor, the second distance measuring sensor,..., the eighth distance measuring sensor, respectively. Referring to FIG. 10, in step S2, the CPU 60 sets the value of the variable j to 1. In step S4, the j-th ranging sensor emits infrared rays. In step S6, the j-th distance measuring sensor receives the infrared light (reflected light) reflected by the finger 910.

 ステップS8において、j番目の測距センサは、物体(つまり指910)までの距離に基づく電圧値を出力する。より詳細には、j番目の測距センサは、受光した赤外線による電圧値と、受光した室内光や外光による電圧値を出力する。ステップS10において、CPU60は、jの値を1つ増加させる。つまり、CPU60は、jについてインクリメントを行なう。ステップS12において、CPU60は、jが8よりも大きいか否かを判断する。 In step S8, the j-th ranging sensor outputs a voltage value based on the distance to the object (that is, the finger 910). More specifically, the j-th distance measuring sensor outputs a voltage value based on the received infrared light and a voltage value based on the received room light or external light. In step S10, the CPU 60 increases the value of j by one. That is, the CPU 60 increments j. In step S12, the CPU 60 determines whether j is greater than 8.

 CPU60は、大きいと判断した場合(ステップS12においてYES)、ステップS14において、各測距センサ31~38が出力した電圧値をA/Dコンバータ90から取得する。具体的には、測距センサ31~38が出力した電圧値(アナログ値)をA/Dコンバータ90によってA/D変換することにより得られる値(デジタル値)を、測距センサ毎に取得する。CPUは、大きくないと判断した場合(ステップS12においてNO)、処理をステップS4に進める。 When the CPU 60 determines that j is greater than 8 (YES in step S12), in step S14 it acquires the voltage values output from the distance measuring sensors 31 to 38 from the A/D converter 90. Specifically, the value (digital value) obtained by A/D converting the voltage value (analog value) output from each of the distance measuring sensors 31 to 38 with the A/D converter 90 is acquired for each distance measuring sensor. When the CPU 60 determines that j is not greater than 8 (NO in step S12), the process returns to step S4.

 ステップS16において、CPU60は、測距センサの各々が出力した各電圧値と対応付けデータ71とに基づいて、2次元画像における指910の位置を検出する。 In step S16, the CPU 60 detects the position of the finger 910 in the two-dimensional image based on the voltage values output by the distance measuring sensors and the association data 71.
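 The scan sequence of FIG. 10 can be summarized in code as follows. The sensor and A/D converter objects are hypothetical stand-ins for the hardware interfaces, and the position calculation is delegated to a callable:

```python
NUM_SENSORS = 8

def scan_sensor_array(sensors, adc, locate):
    """One pass of the FIG. 10 flow: each sensor in turn emits infrared
    light (S4) and receives the reflection, producing an analog voltage
    (S6, S8); once all eight have fired (S2, S10, S12), the digitized
    values are read back (S14) and the finger position is derived from
    them together with the association data (S16)."""
    for j in range(NUM_SENSORS):  # loop counter j, steps S2/S10/S12
        sensors[j].emit()         # S4: emit infrared light
        sensors[j].receive()      # S6/S8: reflected light -> output voltage
    voltages = [adc.read(j) for j in range(NUM_SENSORS)]  # S14: A/D values
    return locate(voltages)       # S16: position from the voltages
```

This polls the sensors one at a time, which mirrors the sequential loop of the flowchart rather than any simultaneous sampling.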

 以上のように、表示システム1では、複数の測距センサ31~38を列状に配置したセンサアレイ30を2次元画像(空中画像)の下方向に設置することにより、2次元画像における物体の位置を判断できる。したがって、表示システム1では、空中に表示させた2次元画像の周囲を枠で囲むことなく、簡易な構成で2次元画像における物体の位置を検出可能となる。 As described above, in the display system 1, by installing the sensor array 30, in which the plurality of distance measuring sensors 31 to 38 are arranged in a row, below the two-dimensional image (aerial image), the position of an object in the two-dimensional image can be determined. Therefore, the display system 1 can detect the position of an object in the two-dimensional image with a simple configuration, without surrounding the two-dimensional image displayed in the air with a frame.

 <第1の変形例>
 図11は、表示システム1の第1の変形例である表示システム1Aの外観および使用状態を示した図である。図11を参照して、表示システム1Aは、筐体10Aと、開口部20Aと、センサアレイ30と、マイクロレンズアレイ40とを備える。
<First Modification>
FIG. 11 is a diagram illustrating the appearance and usage state of a display system 1A, which is a first modification of the display system 1. Referring to FIG. 11, the display system 1A includes a housing 10A, an opening 20A, a sensor array 30, and a microlens array 40.

 開口部20Aは、矩形状である。開口部20Aは、開口部20と同様に、表示領域810の下方に、表示領域810の辺810bに沿って形成されている。開口部20AのX方向の長さは、開口部20と同じである。開口部20AのZ方向の長さは、開口部20よりも長い。センサアレイ30は、Z方向に移動可能に構成されている。 The opening 20A has a rectangular shape. Similarly to the opening 20, the opening 20A is formed below the display area 810 along the side 810b of the display area 810. The length of the opening 20A in the X direction is the same as that of the opening 20. The length of the opening 20A in the Z direction is longer than that of the opening 20. The sensor array 30 is configured to be movable in the Z direction.

 図12は、図11におけるXII-XII線矢視断面図である。図12を参照して、表示システム1Aは、筐体10A内に、センサアレイ30と、マイクロレンズアレイ40と、ディスプレイ50とを備える。 FIG. 12 is a cross-sectional view taken along line XII-XII in FIG. Referring to FIG. 12, the display system 1A includes a sensor array 30, a microlens array 40, and a display 50 in a housing 10A.

 表示システム1Aは、ディスプレイを矢印711の方向(第3の方向)に平行移動させる移動機構110(図12参照)を備えている。また、表示システム1Aは、センサアレイを矢印713の方向(第4の方向)に平行移動させる移動機構120(図12参照)を備えている。なお、矢印711の方向および矢印713の方向はZ方向である。つまり、矢印711の方向および矢印713の方向は、ディスプレイ50の表示面に垂直な方向である。なお、移動機構110,120は、たとえばアクチュエータを用いて構成される。 The display system 1A includes a moving mechanism 110 (see FIG. 12) that translates the display in the direction of the arrow 711 (third direction). In addition, the display system 1A includes a moving mechanism 120 (see FIG. 12) that translates the sensor array in the direction of the arrow 713 (fourth direction). The direction of the arrow 711 and the direction of the arrow 713 are the Z direction. That is, the direction of the arrow 711 and the direction of the arrow 713 are directions perpendicular to the display surface of the display 50. The moving mechanisms 110 and 120 are configured using, for example, an actuator.

 ディスプレイ50が移動機構110により移動すると、表示領域810の位置も矢印712の方向に移動する。具体的には、ディスプレイ50がZ軸の負の向きに移動すると、表示領域810は、Z軸の正の向きにディスプレイ50の移動量に応じた距離だけ平行移動する。ディスプレイ50がZ軸の正の向きに移動すると、表示領域810は、Z軸の負の向きにディスプレイ50の移動量に応じた距離だけ平行移動する。 When the display 50 is moved by the moving mechanism 110, the position of the display area 810 is also moved in the direction of the arrow 712. Specifically, when the display 50 moves in the negative direction of the Z axis, the display area 810 moves in parallel in the positive direction of the Z axis by a distance corresponding to the amount of movement of the display 50. When the display 50 moves in the positive direction of the Z axis, the display area 810 moves in parallel in the negative direction of the Z axis by a distance corresponding to the amount of movement of the display 50.

 センサアレイ30は、移動機構120によって、表示領域810の移動方向と同じ方向に、表示領域810の移動量と同じ量だけ移動する。言い換えれば、センサアレイ30は、ディスプレイ50の移動に応じた向きに、ディスプレイの50の移動量に応じた量だけ移動する。このように、表示システム1Aにおいては、ディスプレイ50の移動に連動して、センサアレイ30が移動する。 The sensor array 30 is moved by the moving mechanism 120 in the same direction as the moving direction of the display area 810 by the same amount as the moving amount of the display area 810. In other words, the sensor array 30 moves in the direction corresponding to the movement of the display 50 by the amount corresponding to the movement amount of the display 50. Thus, in the display system 1A, the sensor array 30 moves in conjunction with the movement of the display 50.

 図13は、表示システム1Aのハードウェア構成の一部を示したブロック図である。図13を参照して、表示システム1Aは、センサアレイ30と、ディスプレイ50と、CPU60と、メモリ70Aと、ディスプレイ駆動装置80と、A/Dコンバータ90と、移動機構110と、移動機構120とを備える。 FIG. 13 is a block diagram showing a part of the hardware configuration of the display system 1A. Referring to FIG. 13, the display system 1A includes a sensor array 30, a display 50, a CPU 60, a memory 70A, a display driving device 80, an A/D converter 90, a moving mechanism 110, and a moving mechanism 120.

 メモリ70Aは、メモリ70が格納しているプログラムおよびデータに加え、さらに位置関係データ72を格納している。位置関係データ72は、ディスプレイ50の位置と、センサアレイ30の位置との対応関係を示したデータである。 The memory 70A stores positional relationship data 72 in addition to the programs and data stored in the memory 70. The positional relationship data 72 is data indicating the correspondence between the position of the display 50 and the position of the sensor array 30.

 CPU60は、図示しない入力装置(たとえば操作キー)から移動機構110を移動させる指令を受け付けた場合、当該指令に応じた方向および量だけディスプレイ50を移動させるための指令を移動機構110に送る。移動機構110は、CPU60からの指令に基づき、ディスプレイ50を移動させる。 When the CPU 60 receives a command for moving the moving mechanism 110 from an input device (for example, an operation key) (not shown), the CPU 60 sends a command for moving the display 50 to the moving mechanism 110 by a direction and an amount corresponding to the command. The moving mechanism 110 moves the display 50 based on a command from the CPU 60.

 CPU60は、上記のように移動機構110を移動させる指令を受け付けた場合、さらに、ディスプレイ50の位置と位置関係データ72とに基づき、センサアレイ30の位置を決定する。CPU60は、センサアレイ30の位置を決定すると、移動機構120を用いてセンサアレイ30を決定した位置に移動させる。 When the CPU 60 receives an instruction to move the moving mechanism 110 as described above, the CPU 60 further determines the position of the sensor array 30 based on the position of the display 50 and the positional relationship data 72. When the position of the sensor array 30 is determined, the CPU 60 moves the sensor array 30 to the determined position using the moving mechanism 120.
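 The linkage between the moving mechanisms 110 and 120 can be sketched as follows. Modeling the positional relationship data 72 as a fixed 1:1 gain is an assumption; the embodiment only states that the sensor array follows the display area by the same amount and in the same direction.

```python
def move_display_and_sensor(display_z, sensor_z, delta, gain=1.0):
    """Move the display by `delta` along the Z axis (moving mechanism 110).
    The display area 810 shifts in the opposite Z direction by a distance
    corresponding to the display movement, and the sensor array follows
    the display area exactly (moving mechanism 120). `gain` stands in
    for the positional relationship data 72 and is assumed to be 1:1."""
    new_display_z = display_z + delta
    # Display area (and hence the sensor array) moves opposite to the
    # display, per the FIG. 12 description.
    new_sensor_z = sensor_z - gain * delta
    return new_display_z, new_sensor_z
```

For example, moving the display in the negative Z direction shifts the sensor array (with the display area) in the positive Z direction by the corresponding amount.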

 以上のように、表示システム1Aは、表示領域810のZ軸方向の位置を変更することができる。つまり、表示システム1Aは、空中画像の位置を変更できる。また、表示システム1Aは、センサアレイ30を表示領域810の移動に追従させる。このため、表示システム1Aは、表示領域810の位置を変更可能としつつも、2次元画像における物体の位置を検出可能となる。 As described above, the display system 1A can change the position of the display area 810 in the Z-axis direction. That is, the display system 1A can change the position of the aerial image. Further, the display system 1A causes the sensor array 30 to follow the movement of the display area 810. Therefore, the display system 1A can detect the position of the object in the two-dimensional image while allowing the position of the display area 810 to be changed.

 <第2の変形例>
 図14は、表示システム1の第2の変形例である表示システム1Bの外観および使用状態を示した図である。図14を参照して、表示システム1Bは、筐体10Bと、開口部20Bと、開口部20Cと、センサアレイ30と、マイクロレンズアレイ40と、センサアレイ140とを備える。
<Second Modification>
FIG. 14 is a diagram illustrating an appearance and a usage state of a display system 1B which is a second modification of the display system 1. Referring to FIG. 14, display system 1B includes a housing 10B, an opening 20B, an opening 20C, a sensor array 30, a microlens array 40, and a sensor array 140.

 開口部20Bおよび開口部20Cは、開口部20と同じ形状である。開口部20Bは、開口部20と同様に、表示領域810の下方に、表示領域810の辺810bに沿って形成されている。開口部20Cは、開口部20BよりもZ軸負の向きに、開口部20Bと平行となるように形成されている。 The opening 20B and the opening 20C have the same shape as the opening 20. Similarly to the opening 20, the opening 20B is formed below the display area 810 along the side 810b of the display area 810. The opening 20C is formed parallel to the opening 20B, on the Z-axis negative side of the opening 20B.

 センサアレイ30は、筐体10B内において開口部20Bに沿って配置されている。上述したように、センサアレイ30は、各測距センサ31~38のセンシング面が表示領域810を向くように設置されている。 The sensor array 30 is disposed along the opening 20B in the housing 10B. As described above, the sensor array 30 is installed such that the sensing surfaces of the distance measuring sensors 31 to 38 face the display area 810.

 センサアレイ140は、センサアレイ30と同様の構成を有する。つまり、センサアレイ140は、8個の測距センサをX軸方向に列状に配置している。センサアレイ140は、表示領域810に平行な仮想平面(図示せず)と交わる位置に、センサアレイ30と平行に配置されている。 The sensor array 140 has the same configuration as the sensor array 30. That is, the sensor array 140 has eight distance measuring sensors arranged in a row in the X-axis direction. The sensor array 140 is arranged in parallel with the sensor array 30 at a position where it intersects with a virtual plane (not shown) parallel to the display area 810.

 具体的には、センサアレイ140は、センサアレイ30をZ軸の負の向きに平行移動した位置に配置されている。つまり、センサアレイ140の8個の測距センサと、センサアレイ30の8個の測距センサ31~38とについてのX座標の値が全て同一である。また、センサアレイ140とセンサアレイ30とのY座標の値も同じである。 Specifically, the sensor array 140 is disposed at a position obtained by translating the sensor array 30 in the negative direction of the Z axis. That is, the X coordinate values of the eight distance measuring sensors of the sensor array 140 and the eight distance measuring sensors 31 to 38 of the sensor array 30 are all the same. Also, the Y coordinate values of the sensor array 140 and the sensor array 30 are the same.

 図15は、図14におけるXV-XV線矢視断面図である。図15を参照して、表示システム1Bは、筐体10B内に、センサアレイ30と、マイクロレンズアレイ40と、ディスプレイ50と、センサアレイ140とを備える。 FIG. 15 is a cross-sectional view taken along line XV-XV in FIG. 14. Referring to FIG. 15, the display system 1B includes a sensor array 30, a microlens array 40, a display 50, and a sensor array 140 in a housing 10B.

 センサアレイ140によるセンシング領域610は、上記仮想平面内に形成される。つまり、仮想平面は、センシング領域610を含んでいる。センサアレイ140は、センサアレイ140の各測距センサのセンシング面がY方向を向くように設置されている。つまり、センサアレイ140は、センサアレイ30と同じ向き(Y軸の正の向き)に赤外線を出射する。 The sensing area 610 by the sensor array 140 is formed in the virtual plane. That is, the virtual plane includes the sensing area 610. The sensor array 140 is installed so that the sensing surface of each distance measuring sensor of the sensor array 140 faces the Y direction. That is, the sensor array 140 emits infrared rays in the same direction as the sensor array 30 (the positive direction of the Y axis).

 図16は、表示システム1Bのハードウェア構成の一部を示したブロック図である。図16を参照して、表示システム1Bは、センサアレイ30と、ディスプレイ50と、CPU60と、メモリ70と、ディスプレイ駆動装置80と、A/Dコンバータ90と、センサアレイ140と、A/Dコンバータ150とを備える。 FIG. 16 is a block diagram showing a part of the hardware configuration of the display system 1B. Referring to FIG. 16, the display system 1B includes a sensor array 30, a display 50, a CPU 60, a memory 70, a display driving device 80, an A/D converter 90, a sensor array 140, and an A/D converter 150.

 センサアレイ140は、センシング結果であるアナログの電圧値をA/Dコンバータ150に出力する。A/Dコンバータ150は、アナログの電圧値をデジタルの電圧値に変換する。A/Dコンバータ150は、変換後のデジタルの電圧値を、CPU60に送る。 Sensor array 140 outputs an analog voltage value as a sensing result to A / D converter 150. The A / D converter 150 converts an analog voltage value into a digital voltage value. The A / D converter 150 sends the converted digital voltage value to the CPU 60.

 CPU60は、センサアレイ140に含まれる測距センサの各々が出力した電圧値と、対応付けデータ71とに基づいて、仮想平面における指910の位置を検出する。より正確には、CPU60は、センサアレイ140に含まれる測距センサの各々が出力した電圧値と、対応付けデータ71とに基づいて、センシング領域610における指910の位置を検出する。 The CPU 60 detects the position of the finger 910 in the virtual plane based on the voltage value output by each of the distance measuring sensors included in the sensor array 140 and the association data 71. More precisely, the CPU 60 detects the position of the finger 910 in the sensing area 610 based on the voltage value output by each of the distance measuring sensors included in the sensor array 140 and the association data 71.

 CPU60は、仮想平面における指910の位置に対応する表示領域810を含む平面における位置が、表示領域810におけるオブジェクトを表示する領域に含まれるか否かを判断する。より正確には、CPU60は、センシング領域610における指910の位置に対応する表示領域810における位置が、表示領域810におけるオブジェクトを表示する領域に含まれるか否かを判断する。 CPU 60 determines whether or not the position on the plane including the display area 810 corresponding to the position of the finger 910 on the virtual plane is included in the area for displaying the object in the display area 810. More precisely, the CPU 60 determines whether or not the position in the display area 810 corresponding to the position of the finger 910 in the sensing area 610 is included in the area for displaying the object in the display area 810.

 CPU60は、オブジェクトを表示する領域に含まれると判断したことに基づき、ディスプレイ50における当該オブジェクトの表示態様を第1の表示態様から第2の表示態様に変更する。 When the CPU 60 determines that the position is included in the area in which the object is displayed, it changes the display mode of that object on the display 50 from the first display mode to the second display mode.

 図17は、第1の表示態様と第2の表示態様とを説明するための図である。図17(a)は、変更後の表示態様についての1つ目の例を示した図である。図17(b)は、変更後の表示態様についての2つ目の例を示した図である。 FIG. 17 is a diagram for explaining the first display mode and the second display mode. FIG. 17A is a diagram illustrating a first example of the display mode after the change. FIG. 17B is a diagram showing a second example of the display mode after the change.

 図17(a)を参照して、ユーザが指910をZ軸の正の方向に進めた結果(図15参照)、仮想平面における指910の位置に対応する表示領域810を含む平面における位置が、表示領域810におけるオブジェクトBを表示する領域に含まれたとする。より具体的には、センシング領域610における指910のXY座標の値が、表示領域810のオブジェクトBの表示領域に含まれる値であったとする。この場合、CPU60は、ディスプレイ50におけるオブジェクトBの色を、通常の色から、通常の色とは異なる色に変更する。 Referring to FIG. 17A, suppose that, as a result of the user moving the finger 910 in the positive direction of the Z axis (see FIG. 15), the position in the plane including the display area 810 that corresponds to the position of the finger 910 in the virtual plane falls within the area of the display area 810 in which the object B is displayed. More specifically, suppose that the XY coordinate values of the finger 910 in the sensing area 610 are values included in the display area of the object B in the display area 810. In this case, the CPU 60 changes the color of the object B on the display 50 from the normal color to a color different from the normal color.

 図17(b)を参照して、センシング領域610における指910のXY座標の値が、表示領域810のオブジェクトBの表示領域に含まれる値であった場合、CPU60は、ディスプレイ50におけるオブジェクトBのサイズを、通常のサイズから、通常のサイズよりも大きなサイズに変更する。 Referring to FIG. 17B, when the XY coordinate values of the finger 910 in the sensing area 610 are values included in the display area of the object B in the display area 810, the CPU 60 changes the size of the object B on the display 50 from the normal size to a size larger than the normal size.

 図18は、表示システム1Bにおける表示態様の変更処理の流れを示したフローチャートである。より詳しくは、図18は、図17に示したオブジェクトA~Fが表示されている場合に行なわれる処理である。 FIG. 18 is a flowchart showing the flow of display mode change processing in the display system 1B. More specifically, FIG. 18 shows processing performed when the objects A to F shown in FIG. 17 are displayed.

 図18を参照して、ステップS22において、CPU60は、センサアレイ140の出力に基づき、指910がセンシング領域610に触れたか否かを判断する。CPU60は、指910がセンシング領域610に触れたと判断すると(ステップS22においてYES)、ステップS24において、センシング領域610における指910の位置は、表示領域810におけるオブジェクトA~Fに対応する位置であるか否かを判断する。CPU60は、指910がセンシング領域610に触れていないと判断すると(ステップS22においてNO)、処理をステップS22に進める。 Referring to FIG. 18, in step S22, the CPU 60 determines, based on the output of the sensor array 140, whether the finger 910 has touched the sensing area 610. When the CPU 60 determines that the finger 910 has touched the sensing area 610 (YES in step S22), in step S24 it determines whether the position of the finger 910 in the sensing area 610 corresponds to any of the objects A to F in the display area 810. When the CPU 60 determines that the finger 910 has not touched the sensing area 610 (NO in step S22), the process returns to step S22.

 CPU60は、オブジェクトA~Fに対応する位置であると判断すると(ステップS24においてYES)、ステップS26において、指910の位置に対応するオブジェクトの色を変更する。CPU60は、オブジェクトA~Fに対応する位置でないと判断すると(ステップS24においてNO)、ステップS34において、エラー音を発生させる。 When the CPU 60 determines that the position corresponds to one of the objects A to F (YES in step S24), it changes the color of the object corresponding to the position of the finger 910 in step S26. When the CPU 60 determines that the position does not correspond to any of the objects A to F (NO in step S24), it generates an error sound in step S34.

 ステップS28において、CPU60は、センサアレイ30の出力に基づき、指910が表示領域810のオブジェクトA~Fのいずれかに触れたか否かを判断する。CPU60は、触れたと判断した場合(ステップS28においてYES)、ステップS30において、指910が触れた位置のオブジェクトと、色を変更したオブジェクトとが同じであるか否かを判断する。CPU60は、触れていないと判断した場合(ステップS28においてNO)、処理をステップS28に進める。 In step S28, based on the output of the sensor array 30, the CPU 60 determines whether or not the finger 910 has touched any of the objects A to F in the display area 810. If CPU 60 determines that the object has been touched (YES in step S28), it determines in step S30 whether the object at the position touched by finger 910 is the same as the object whose color has been changed. When CPU 60 determines that it is not touched (NO in step S28), the process proceeds to step S28.

 CPU60は、同じオブジェクトであると判断した場合(ステップS30においてYES)、ステップS32において、指910が触れたオブジェクトに対応した動作を実行する。CPU60は、同じオブジェクトではないと判断した場合(ステップS30においてNO)、ステップS36において、エラー音を発生させる。 When CPU 60 determines that they are the same object (YES in step S30), in step S32, CPU 60 performs an operation corresponding to the object touched by finger 910. When CPU 60 determines that they are not the same object (NO in step S30), it generates an error sound in step S36.
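 The two-stage selection of FIG. 18 — highlight on the front sensing plane, confirm on the display plane — can be sketched as follows. The hit-testing callable and the callbacks are illustrative assumptions, not part of the embodiment:

```python
def two_stage_select(hit_object_at, on_highlight, on_error, pre_pos, touch_pos):
    """FIG. 18 flow: the object under the finger on the sensing area 610
    is highlighted (S24/S26); the later touch on the display area 810 is
    accepted only if it lands on the same object (S28/S30/S32); any
    mismatch raises an error signal (S34/S36)."""
    highlighted = hit_object_at(pre_pos)   # S24: finger enters sensing area 610
    if highlighted is None:
        on_error()                         # S34: no object at that position
        return None
    on_highlight(highlighted)              # S26: change display mode (e.g. color)
    selected = hit_object_at(touch_pos)    # S28: finger reaches display area 810
    if selected == highlighted:
        return selected                    # S32: execute the object's action
    on_error()                             # S36: touched a different object
    return None
```

A caller would map the returned object to its action; returning `None` corresponds to the error branches of the flowchart.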

 以上のように、表示システム1Bは、センシング領域610に指910が到達した場合、到達した位置に対応する位置のオブジェクトの表示態様を変更する。したがって、ユーザは、さらにZ軸の負の向きに指910を移動させれば、どのオブジェクトを選択できるかを事前に知ることができる。それゆえ、表示システム1Bは、表示システム1よりも操作性に優れる。 As described above, when the finger 910 reaches the sensing area 610, the display system 1B changes the display mode of the object at the position corresponding to the reached position. Therefore, the user can know in advance which object will be selected if the finger 910 is moved further in the negative direction of the Z axis. The display system 1B is therefore superior in operability to the display system 1.

 なお、センサアレイ30とセンサアレイ140との配置を逆にしてもよい。つまり、センシング領域610が、表示領域810とマイクロレンズアレイ40との間に挟まれるように、センサアレイ140を配置してもよい。 The arrangement of the sensor array 30 and the sensor array 140 may be reversed. That is, the sensor array 140 may be arranged so that the sensing area 610 is sandwiched between the display area 810 and the microlens array 40.

 <第3の変形例>
 次に、表示システム1の第3の変形例について説明する。第3の変形例に係る表示システム(以下、「表示システム1C」と称する)は、センサアレイの構成が表示システム1と異なる。より詳しくは、表示システム1Cのセンサアレイは、測距センサの構成が表示システム1とは異なる。
<Third Modification>
Next, a third modification of the display system 1 will be described. The display system according to the third modification (hereinafter referred to as “display system 1C”) is different from the display system 1 in the configuration of the sensor array. More specifically, the sensor array of the display system 1C is different from the display system 1 in the configuration of the distance measuring sensor.

 表示システム1Cにおける測距センサの各々は、1つの発光装置と2つの受光装置とを備える。この点において、各測距センサにおいて1つの発光装置と1つの受光装置とを備える表示システム1とは異なる。なお、測距センサの数は、表示システム1Cと表示システム1とでは同じである。 Each distance measuring sensor in the display system 1C includes one light emitting device and two light receiving devices. In this respect, it differs from the display system 1, in which each distance measuring sensor includes one light emitting device and one light receiving device. Note that the number of distance measuring sensors is the same in the display system 1C and the display system 1.

 図19は、表示システム1Cの測距センサ31Aの構成を説明するための図である。図19を参照して、発光装置311は、上述したとおり、赤外線LED311aとレンズ311bとを備える。受光装置312は、赤外線受光素子312aとレンズ312bとを備える。受光装置313は、赤外線受光素子313aとレンズ313bとを備える。赤外線受光素子313aの特性は、赤外線受光素子312aと異なる。 FIG. 19 is a diagram for explaining the configuration of the distance measuring sensor 31A of the display system 1C. Referring to FIG. 19, the light emitting device 311 includes the infrared LED 311a and the lens 311b as described above. The light receiving device 312 includes an infrared light receiving element 312a and a lens 312b. The light receiving device 313 includes an infrared light receiving element 313a and a lens 313b. The characteristics of the infrared light receiving element 313a are different from those of the infrared light receiving element 312a.

 図20は、表示システム1Cのセンサの出力特性を示した図である。具体的には、図20は、赤外線受光素子313aの特性と、赤外線受光素子312aの特性とを示した図である。図20を参照して、赤外線受光素子313aは、距離dがdi(di≦d2)より大きい場合には、センシング対象との距離が長くなるほど出力電圧が低下する特性を有する。したがって、測距センサ31Aは、距離d1~d3の間に存在する物体をセンシングすることができる。 FIG. 20 shows the output characteristics of the sensor of the display system 1C. Specifically, FIG. 20 is a diagram showing the characteristics of the infrared light receiving element 313a and the characteristics of the infrared light receiving element 312a. Referring to FIG. 20, infrared light receiving element 313a has a characteristic that when the distance d is greater than di (di ≦ d2), the output voltage decreases as the distance to the sensing object increases. Therefore, the distance measuring sensor 31A can sense an object existing between the distances d1 to d3.

 再び図19に戻り、赤外線LED311aは、レンズ311bに対して赤外線を出射する。レンズ311bから出射した赤外線は、センシング対象である物体950により反射する。当該反射した赤外線は、レンズ312bを通過して、赤外線受光素子312aに入射する。物体950,960の代わりに物体970があるときは、赤外線LED311aから出射した赤外線は、物体970により反射する。物体970により反射した赤外線は、レンズ313bを通過して、赤外線受光素子313aに入射する。このようにして、表示システム1Cは、距離d2~d3に存在する物体970についてもセンシングできる。 Referring back to FIG. 19, the infrared LED 311a emits infrared rays to the lens 311b. The infrared rays emitted from the lens 311b are reflected by the object 950 that is a sensing target. The reflected infrared light passes through the lens 312b and enters the infrared light receiving element 312a. When there is an object 970 instead of the objects 950 and 960, the infrared light emitted from the infrared LED 311a is reflected by the object 970. The infrared light reflected by the object 970 passes through the lens 313b and enters the infrared light receiving element 313a. In this way, the display system 1C can also sense the object 970 existing at the distances d2 to d3.
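 The way the third modification extends the sensing range can be sketched as follows. The noise floor `v_min` and the "prefer the near element" rule are assumptions; the embodiment only states that element 312a covers objects between d1 and d2 while element 313a covers objects out to d3.

```python
def combined_distance(v_near, v_far, near_to_dist, far_to_dist, v_min=0.1):
    """Derive a distance from the two receiving elements of sensor 31A.
    `v_near` is the output of element 312a (range d1..d2) and `v_far`
    that of element 313a (range roughly d2..d3); each callable converts
    its element's voltage to a distance via that element's association
    data. Use whichever element actually received a reflection above
    the assumed noise floor `v_min`."""
    if v_near > v_min:
        return near_to_dist(v_near)   # object within d1..d2 (element 312a)
    if v_far > v_min:
        return far_to_dist(v_far)     # object within d2..d3 (element 313a)
    return None                       # nothing within the sensing range
```

With this arrangement the sensor can report objects such as the object 970 at distances beyond d2, which a single receiving element could not resolve.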

 <第4の変形例>
 次に、表示システム1の第4の変形例について説明する。第4の変形例に係る表示システム(以下、「表示システム1D」と称する)は、センサアレイの構成が表示システム1と異なる。より詳しくは、表示システム1Dのセンサアレイは、測距センサの数が表示システム1とは異なる。
<Fourth Modification>
Next, a fourth modification of the display system 1 will be described. The display system according to the fourth modification (hereinafter referred to as “display system 1D”) is different from the display system 1 in the configuration of the sensor array. More specifically, the sensor array of the display system 1D is different from the display system 1 in the number of distance measuring sensors.

 図21は、表示システム1Dのセンサアレイ30Aを説明するための図である。図21を参照して、CPU60は、ディスプレイ50の予め定められた位置にオブジェクトを表示させるプログラムを実行する。たとえば、ディスプレイ50は、8つの予め定められた位置にオブジェクトG~Nを表示する。オブジェクトHとオブジェクトGとの位置、オブジェクトJとオブジェクトIとの位置、オブジェクトLとオブジェクトKとの位置、およびオブジェクトNとオブジェクトMとの位置は、それぞれ同一である。 FIG. 21 is a diagram for explaining the sensor array 30A of the display system 1D. Referring to FIG. 21, CPU 60 executes a program for displaying an object at a predetermined position on display 50. For example, the display 50 displays the objects G to N at eight predetermined positions. The positions of the objects H and G, the positions of the objects J and I, the positions of the objects L and K, and the positions of the objects N and M are the same.

 このようなプログラムをCPU60が実行することが前提である場合には、センサアレイ30は、ディスプレイ50に表示される各オブジェクトに対応する4つの位置に測距センサ31,33,36,38を備えていればよい。 When it is assumed that the CPU 60 executes such a program, the sensor array 30 need only include the distance measuring sensors 31, 33, 36, and 38 at the four positions corresponding to the objects displayed on the display 50.

 このような構成では、表示システム1に比べて測距センサの数を減らすことができる。したがって、表示システム1Dでは、製造コストを、表示システム1の製造コストよりも低減することができる。 In such a configuration, the number of distance measuring sensors can be reduced compared with the display system 1. Therefore, the manufacturing cost of the display system 1D can be made lower than that of the display system 1.

 [実施の形態2]
 図22は、表示システム2の外観および使用状態を示した図である。図22を参照して、表示システム2は、筐体11と、開口部21と、センサアレイ30と、光学素子41とを備える。
[Embodiment 2]
FIG. 22 is a diagram illustrating an appearance and a usage state of the display system 2. With reference to FIG. 22, the display system 2 includes a housing 11, an opening 21, a sensor array 30, and an optical element 41.

 光学素子41は、筐体11内のディスプレイ(図23参照)が発した光を透過して、空中の矩形の表示領域820に2次元画像(空中画像)を表示する。光学素子41としては、たとえば、背景技術として説明した特許文献3の結像素子を用いることができる。 The optical element 41 transmits light emitted from the display (see FIG. 23) in the housing 11 and displays a two-dimensional image (aerial image) in the rectangular display area 820 in the air. As the optical element 41, for example, the imaging element of Patent Document 3 described as the background art can be used.

 表示領域820は、4つの辺820a,820b,820c,820dに囲まれた領域である。なお、辺820aと辺820bとは平行であり、辺820cと辺820dとは平行である。表示領域820の法線の方向は、xyz座標系においてz方向である。また、表示領域820は、xy平面に平行である。 The display area 820 is an area surrounded by four sides 820a, 820b, 820c, and 820d. Note that the side 820a and the side 820b are parallel, and the side 820c and the side 820d are parallel. The direction of the normal line of the display area 820 is the z direction in the xyz coordinate system. The display area 820 is parallel to the xy plane.

 なお、xyz座標系のx方向と、XYZ座標系のX方向とは平行である。xyz座標系は、XYZ座標系をX軸を回転軸として所定の角度回転させた座標系である。 Note that the x direction of the xyz coordinate system and the X direction of the XYZ coordinate system are parallel. The xyz coordinate system is a coordinate system obtained by rotating the XYZ coordinate system by a predetermined angle about the X axis as a rotation axis.

 開口部21は、矩形状である。開口部21は、表示領域820の下方(y軸の負の向き)に、表示領域820の辺820bに沿って形成されている。 The opening 21 has a rectangular shape. The opening 21 is formed along the side 820b of the display area 820 below the display area 820 (in the negative y-axis direction).

 センサアレイ30は、複数の測距センサ31~38をx方向に列状に配置している。センサアレイ30は、筐体11内において開口部21に沿って配置されている。具体的には、センサアレイ30は、各測距センサ31~38のセンシング面が表示領域820を向くように設置されている。 The sensor array 30 has a plurality of ranging sensors 31 to 38 arranged in a row in the x direction. The sensor array 30 is arranged along the opening 21 in the housing 11. Specifically, the sensor array 30 is installed so that the sensing surfaces of the distance measuring sensors 31 to 38 face the display area 820.

 表示システム2においては、ユーザは、表示領域820に表示された空中画像を、たとえばユーザの指910で触る。具体的には、ユーザは、空中画像に含まれているオブジェクトを選択するために、指910で当該オブジェクトを触る。 In the display system 2, the user touches the aerial image displayed in the display area 820 with, for example, the user's finger 910. Specifically, the user touches the object with the finger 910 in order to select the object included in the aerial image.

 図23は、図22におけるXXIII-XXIII線矢視断面図である。図23を参照して、表示システム2は、筐体11内に、センサアレイ30と、ディスプレイ50とを備える。表示システム2は、筐体11の表面に設けられた開口部に、光学素子41を備える。 23 is a cross-sectional view taken along line XXIII-XXIII in FIG. Referring to FIG. 23, the display system 2 includes a sensor array 30 and a display 50 in the housing 11. The display system 2 includes an optical element 41 in an opening provided on the surface of the housing 11.

　ディスプレイ50は、光学素子41の方向に画像を表示する。ディスプレイ50が表示した画像が、光学素子41により空中画像として表示領域820に表示される。より詳しく説明すれば、以下の通りである。 The display 50 displays an image toward the optical element 41. The image displayed on the display 50 is displayed in the display area 820 as an aerial image by the optical element 41. A more detailed description is as follows.

　ディスプレイ50は、光学素子41に対して、90°-θa傾いた状態で設置されている。なお、ディスプレイ50が出射する光についての光学素子41に対する入射角も、90°-θaとなる。光学素子41は、ディスプレイ50が出射した光を、出射角90°-θbで出射する。これにより、表示領域820に、ディスプレイ50が表示した画像が空中画像として表示される。 The display 50 is installed at an inclination of 90°−θa with respect to the optical element 41. The incident angle on the optical element 41 of the light emitted from the display 50 is therefore also 90°−θa. The optical element 41 emits the light received from the display 50 at an emission angle of 90°−θb. As a result, the image displayed on the display 50 appears in the display area 820 as an aerial image.

　センサアレイ30の測距センサ31~38の各々は、測距センサの配列方向(x方向)および表示領域820の法線の方向(z方向)と垂直なy方向であって、表示領域820に近づく向きに光を出射する。より具体的には、センサアレイ30は、表示領域820を含む平面と交わる位置に配置されている。つまり、センサアレイ30は、表示領域820の辺820a,820bと平行となる位置に配されている(図22参照)。 Each of the distance measuring sensors 31 to 38 of the sensor array 30 emits light in the y direction, which is perpendicular to both the arrangement direction of the sensors (x direction) and the normal direction of the display area 820 (z direction), in the direction approaching the display area 820. More specifically, the sensor array 30 is disposed at a position that intersects the plane including the display area 820. That is, the sensor array 30 is arranged at a position parallel to the sides 820a and 820b of the display area 820 (see FIG. 22).

　なお、センサアレイ30が出射する光の光路と光学素子41とがなす角θc(つまり、表示領域820と光学素子41とがなす角)とθbとの間には、θc=90°-θbの関係が成立する。 Note that the angle θc formed by the optical path of the light emitted from the sensor array 30 and the optical element 41 (that is, the angle formed by the display area 820 and the optical element 41) and the angle θb satisfy the relationship θc = 90°−θb.

　センサアレイ30は、測距センサ31~38が出射する光が表示領域820を通過する(つまり、光が表示領域820と重なる)ように配置されていてもよいし、あるいは当該光が表示領域820に沿って進むように配置されていてもよい。以下では、測距センサ31~38が出射する光が表示領域820を通過するように配置される場合を例に挙げて説明する。 The sensor array 30 may be arranged so that the light emitted from the distance measuring sensors 31 to 38 passes through the display area 820 (that is, the light overlaps the display area 820), or so that the light travels along the display area 820. Hereinafter, the case where the sensor array is arranged so that the light emitted from the distance measuring sensors 31 to 38 passes through the display area 820 will be described as an example.

 図24は、表示システム2のハードウェア構成の一部を示したブロック図である。図24を参照して、表示システム2は、センサアレイ30と、ディスプレイ50と、CPU60と、メモリ70と、ディスプレイ駆動装置80と、A/Dコンバータ90とを備える。つまり、表示システム2は、実施の形態1に係る表示システム1と同様の構成を有する。したがって、表示システム2の各ハードウェアについての説明は、ここでは繰り返さない。 FIG. 24 is a block diagram showing a part of the hardware configuration of the display system 2. Referring to FIG. 24, display system 2 includes a sensor array 30, a display 50, a CPU 60, a memory 70, a display driving device 80, and an A / D converter 90. That is, the display system 2 has the same configuration as the display system 1 according to the first embodiment. Therefore, the description about each hardware of the display system 2 is not repeated here.

 図25は、各測距センサ31~38と、表示領域820との対応関係を示した図である。図25を参照して、各測距センサ31~38は、y方向に光を出射する。測距センサ31~38の各々は、出射した光のうち物体により反射された光を受光して、当該物体までの距離に基づく電圧値をA/Dコンバータ90に出力する。 FIG. 25 is a diagram showing a correspondence relationship between each of the distance measuring sensors 31 to 38 and the display area 820. Referring to FIG. 25, each of the distance measuring sensors 31 to 38 emits light in the y direction. Each of the distance measuring sensors 31 to 38 receives the light reflected by the object among the emitted light, and outputs a voltage value based on the distance to the object to the A / D converter 90.

　以上のように、表示システム2では、複数の測距センサ31~38を列状に配置したセンサアレイ30を2次元画像の下方向に設置することにより、2次元画像における物体の位置を判断できる。したがって、表示システム2では、空中に表示させた2次元画像の周囲を枠で囲むことなく、簡易な構成で2次元画像における物体の位置を検出可能となる。 As described above, in the display system 2, the position of an object in the two-dimensional image can be determined by installing the sensor array 30, in which the plurality of distance measuring sensors 31 to 38 are arranged in a row, below the two-dimensional image. Therefore, the display system 2 can detect the position of an object in the two-dimensional image with a simple configuration, without surrounding the two-dimensional image displayed in the air with a frame.
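The detection principle described above (each sensor reports a voltage that encodes distance to the reflecting object, the association data maps voltage to distance, and the sensor's position in the row gives the x coordinate) can be sketched as follows. This is a hypothetical illustration only: the sensor pitch, the voltage-to-distance table, and the threshold are invented values, not taken from the specification.

```python
# Hypothetical sketch of the position detection described above.
# The pitch, voltage table, and threshold are illustrative values.

SENSOR_PITCH_MM = 10.0  # assumed spacing of sensors 31-38 along x

# Association-data-like table: output voltage (V) -> distance (mm),
# assumed monotonically decreasing, as is typical of IR ranging sensors.
VOLTAGE_TO_DISTANCE = [(2.8, 40.0), (2.0, 70.0), (1.2, 130.0), (0.6, 250.0)]

def voltage_to_distance(v):
    """Linearly interpolate a distance from the voltage table."""
    pts = VOLTAGE_TO_DISTANCE
    if v >= pts[0][0]:
        return pts[0][1]
    if v <= pts[-1][0]:
        return pts[-1][1]
    for (v1, d1), (v2, d2) in zip(pts, pts[1:]):
        if v2 <= v <= v1:
            t = (v1 - v) / (v1 - v2)
            return d1 + t * (d2 - d1)

def detect_position(voltages, threshold=0.5):
    """Return (x, y) in mm of the nearest reflecting object, or None.

    x comes from which sensor saw the reflection; y from its distance.
    """
    best = None
    for i, v in enumerate(voltages):
        if v < threshold:          # no reflection seen by this sensor
            continue
        y = voltage_to_distance(v)
        x = i * SENSOR_PITCH_MM
        if best is None or y < best[1]:
            best = (x, y)
    return best
```

For instance, with eight sensors spaced 10 mm apart, a reflection seen only by the third sensor (index 2) at 2.0 V would be reported as x = 20 mm, y = 70 mm under these assumed table values.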

 <第1の変形例>
 図26は、表示システム2の第1の変形例である表示システム2Aの外観および使用状態を示した図である。図26を参照して、表示システム2Aは、筐体11Aと、開口部21Aと、センサアレイ30と、光学素子41とを備える。
<First Modification>
FIG. 26 is a diagram illustrating an appearance and a usage state of a display system 2A which is a first modification of the display system 2. With reference to FIG. 26, the display system 2A includes a housing 11A, an opening 21A, a sensor array 30, and an optical element 41.

　開口部21Aは、矩形状である。開口部21Aは、開口部21と同様に、表示領域820の下方に、表示領域820の辺820bに沿って形成されている。開口部21AのX方向の長さは、開口部21と同じである。開口部21AのY方向の長さは、開口部21よりも長い。センサアレイ30は、Y方向に移動可能に構成されている。 The opening 21A has a rectangular shape. Similarly to the opening 21, the opening 21A is formed below the display area 820 along the side 820b of the display area 820. The length of the opening 21A in the X direction is the same as that of the opening 21. The length of the opening 21A in the Y direction is longer than that of the opening 21. The sensor array 30 is configured to be movable in the Y direction.

 図27は、図26におけるXXVII-XXVII線矢視断面図である。図27を参照して、表示システム2Aは、筐体11A内に、センサアレイ30と、ディスプレイ50とを備える。表示システム2Aは、筐体11Aの表面に設けられた開口部に、光学素子41を備える。なお、表示システム2Aにおける光学素子41の位置は、表示システム2における光学素子41の位置と同じである。 FIG. 27 is a sectional view taken along line XXVII-XXVII in FIG. Referring to FIG. 27, display system 2A includes a sensor array 30 and a display 50 in housing 11A. The display system 2A includes an optical element 41 in an opening provided on the surface of the housing 11A. The position of the optical element 41 in the display system 2A is the same as the position of the optical element 41 in the display system 2.

 表示システム2Aは、ディスプレイ50を矢印721の方向(第3の方向)に平行移動させる移動機構110A(図28参照)を備えている。また、表示システム2Aは、センサアレイ30を矢印723の方向(第4の方向)に平行移動させる移動機構120A(図28参照)を備えている。なお、矢印721の方向は、ディスプレイ50が出射する光についての光学素子41への入射角が変化しない方向である。矢印723の方向はY軸方向である。なお、矢印723の方向は、表示領域820の法線方向の成分を有する方向でもある。 The display system 2A includes a moving mechanism 110A (see FIG. 28) that translates the display 50 in the direction of the arrow 721 (third direction). In addition, the display system 2A includes a moving mechanism 120A (see FIG. 28) that translates the sensor array 30 in the direction of the arrow 723 (fourth direction). Note that the direction of the arrow 721 is a direction in which the incident angle of the light emitted from the display 50 to the optical element 41 does not change. The direction of the arrow 723 is the Y-axis direction. Note that the direction of the arrow 723 is also a direction having a component in the normal direction of the display region 820.

 ディスプレイ50が移動機構110Aにより移動すると、表示領域820の位置も矢印722の方向に移動する。言い換えれば、表示領域820は、ディスプレイ50の移動に基づき、表示領域820の法線の方向に移動する。なお、当該法線の方向と、矢印721の方向とは異なる。 When the display 50 is moved by the moving mechanism 110A, the position of the display area 820 is also moved in the direction of the arrow 722. In other words, the display area 820 moves in the direction of the normal line of the display area 820 based on the movement of the display 50. Note that the direction of the normal line is different from the direction of the arrow 721.

 具体的には、ディスプレイ50が光学素子41から遠ざかる向きに移動すると、表示領域820は、光学素子41から遠ざかる向きに、ディスプレイ50の移動量に応じた距離だけ平行移動する。ディスプレイ50が光学素子41に近づく向きに移動すると、表示領域820は、光学素子41に近づく向きにディスプレイ50の移動量に応じた距離だけ平行移動する。 Specifically, when the display 50 moves in a direction away from the optical element 41, the display area 820 moves in parallel in a direction away from the optical element 41 by a distance corresponding to the amount of movement of the display 50. When the display 50 moves in the direction approaching the optical element 41, the display area 820 moves in parallel by a distance corresponding to the amount of movement of the display 50 in the direction approaching the optical element 41.

　センサアレイ30は、移動機構120Aによって、表示領域820の移動に応じた向きに移動する。言い換えれば、センサアレイ30は、ディスプレイ50の移動に応じた向きに、ディスプレイ50の移動量に応じた量だけ移動する。このように、表示システム2Aにおいては、ディスプレイ50の移動に連動して、センサアレイ30が移動する。 The sensor array 30 is moved by the moving mechanism 120A in a direction corresponding to the movement of the display area 820. In other words, the sensor array 30 moves in the direction corresponding to the movement of the display 50, by an amount corresponding to the amount of movement of the display 50. Thus, in the display system 2A, the sensor array 30 moves in conjunction with the movement of the display 50.

　図28は、表示システム2Aのハードウェア構成の一部を示したブロック図である。図28を参照して、表示システム2Aは、センサアレイ30と、ディスプレイ50と、CPU60と、メモリ70Aと、ディスプレイ駆動装置80と、A/Dコンバータ90と、移動機構110Aと、移動機構120Aとを備える。 FIG. 28 is a block diagram showing a part of the hardware configuration of the display system 2A. Referring to FIG. 28, the display system 2A includes the sensor array 30, the display 50, the CPU 60, a memory 70A, the display driving device 80, the A/D converter 90, a moving mechanism 110A, and a moving mechanism 120A.

 メモリ70Aは、上述したように、位置関係データ72を格納している。位置関係データ72は、ディスプレイ50の位置と、センサアレイ30の位置との対応関係を示したデータである。 The memory 70A stores the positional relationship data 72 as described above. The positional relationship data 72 is data indicating the correspondence between the position of the display 50 and the position of the sensor array 30.

　CPU60は、図示しない入力装置(たとえば操作キー)から移動機構110Aを移動させる指令を受け付けた場合、当該指令に応じた方向および量だけディスプレイ50を移動させるための指令を移動機構110Aに送る。移動機構110Aは、CPU60からの指令に基づき、ディスプレイ50を移動させる。 When the CPU 60 receives a command to move the moving mechanism 110A from an input device (for example, an operation key) (not shown), the CPU 60 sends the moving mechanism 110A a command to move the display 50 by the direction and amount corresponding to the received command. The moving mechanism 110A moves the display 50 based on the command from the CPU 60.

 CPU60は、上記のように移動機構110Aを移動させる指令を受け付けた場合、さらに、ディスプレイ50の位置と位置関係データ72とに基づき、センサアレイ30の位置を決定する。CPU60は、センサアレイ30の位置を決定すると、移動機構120Aを用いてセンサアレイ30を決定した位置に移動させる。 When the CPU 60 receives a command to move the moving mechanism 110A as described above, the CPU 60 further determines the position of the sensor array 30 based on the position of the display 50 and the positional relationship data 72. When the position of the sensor array 30 is determined, the CPU 60 moves the sensor array 30 to the determined position using the moving mechanism 120A.
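The interplay of the positional relationship data 72 and the two moving mechanisms can be sketched as below. The table values, the assumption of linear interpolation between entries, and all function names are illustrative; the specification does not fix the data format or the form of the relationship.

```python
# Hypothetical sketch: positional relationship data 72 as a lookup table
# mapping a display position to the matching sensor array position (mm).
POSITION_TABLE = {
    0.0: 0.0,
    5.0: 3.5,
    10.0: 7.0,
}

def sensor_array_position(display_pos):
    """Look up the sensor array position, interpolating between entries."""
    keys = sorted(POSITION_TABLE)
    if display_pos <= keys[0]:
        return POSITION_TABLE[keys[0]]
    if display_pos >= keys[-1]:
        return POSITION_TABLE[keys[-1]]
    for k1, k2 in zip(keys, keys[1:]):
        if k1 <= display_pos <= k2:
            t = (display_pos - k1) / (k2 - k1)
            return POSITION_TABLE[k1] + t * (POSITION_TABLE[k2] - POSITION_TABLE[k1])

def move_display_and_follow(display_pos, move_display, move_sensor_array):
    """Move the display, then move the sensor array to the matching position,
    mirroring the two-step command sequence described above."""
    move_display(display_pos)                           # moving mechanism 110A
    move_sensor_array(sensor_array_position(display_pos))  # moving mechanism 120A
```

In this sketch, `move_display` and `move_sensor_array` stand in for the commands sent to the moving mechanisms 110A and 120A.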

　ところで、表示システム2Aでは、センサアレイ30の位置に応じて、表示領域820との距離が変化する。このため、2次元画像における同じ位置を指910で接触している場合であっても、センサアレイ30の位置に応じて、センサアレイ30が出力する電圧値は異なる。それゆえ、2次元画像における指910の位置のy座標値を正確に検出するために、CPU60は、指910の位置のy座標値を特定する際にはセンサアレイ30の位置を考慮した処理が必要となる。 In the display system 2A, the distance between the sensor array 30 and the display area 820 changes according to the position of the sensor array 30. For this reason, even when the finger 910 touches the same position in the two-dimensional image, the voltage value output by the sensor array 30 differs depending on the position of the sensor array 30. Therefore, to accurately detect the y coordinate value of the position of the finger 910 in the two-dimensional image, the CPU 60 must take the position of the sensor array 30 into account when specifying that y coordinate value.

　このため、表示システム2Aにおいては、CPU60は、測距センサ31~38の各々が出力した各電圧値と、対応付けデータ71と、センサアレイ30の位置とに基づいて、2次元画像における指910の位置を検出する。たとえば、測距センサの特性(図8参照)において出力電圧から得られた距離に対して、センサアレイ30の移動距離を考慮した距離を加算して、表示領域820におけるy座標値とするプログラムを、メモリ70Aに格納すればよい。 For this reason, in the display system 2A, the CPU 60 detects the position of the finger 910 in the two-dimensional image based on each voltage value output from the distance measuring sensors 31 to 38, the association data 71, and the position of the sensor array 30. For example, a program that adds a distance reflecting the movement of the sensor array 30 to the distance obtained from the output voltage according to the sensor characteristics (see FIG. 8), and uses the sum as the y coordinate value in the display area 820, may be stored in the memory 70A.
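A minimal sketch of the correction described in this paragraph, assuming the sensor array's displacement simply adds to the distance obtained from the output voltage (the sign convention, units, and reference position are illustrative assumptions):

```python
# Hypothetical sketch of the y-coordinate correction described above.
# Positive sensor_array_y_mm means the array has moved away from the
# display area relative to its calibrated reference position.

REFERENCE_SENSOR_Y_MM = 0.0  # assumed sensor array position at calibration

def corrected_y(measured_distance_mm, sensor_array_y_mm):
    """Offset the voltage-derived distance by the sensor array's
    displacement so the result stays in display-area coordinates."""
    offset = sensor_array_y_mm - REFERENCE_SENSOR_Y_MM
    return measured_distance_mm + offset
```

The same shape of correction applies when the display position is used instead of the sensor array position, with the offset derived from the display's movement.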

 以上のように、表示システム2Aは、表示領域820のz軸方向の位置を変更することができる。つまり、表示システム2Aは、空中画像の位置を変更できる。また、表示システム2Aは、センサアレイ30を表示領域820の移動に追従させる。このため、表示システム2Aは、表示領域820の位置を変更可能としつつも、2次元画像における物体の位置を検出可能となる。 As described above, the display system 2A can change the position of the display area 820 in the z-axis direction. That is, the display system 2A can change the position of the aerial image. Further, the display system 2A causes the sensor array 30 to follow the movement of the display area 820. Therefore, the display system 2A can detect the position of the object in the two-dimensional image while allowing the position of the display area 820 to be changed.

　なお、CPU60は、測距センサ31~38の各々が出力した各電圧値と、対応付けデータ71と、ディスプレイ50の位置とに基づいて、2次元画像における指910の位置を検出するように、表示システム2Aを構成してもよい。この場合、たとえば、測距センサの特性(図8参照)において出力電圧から得られた距離に対して、ディスプレイ50の移動距離を考慮した距離を加算して、表示領域820におけるy座標値とするプログラムを、メモリ70Aに格納すればよい。 Note that the display system 2A may be configured so that the CPU 60 detects the position of the finger 910 in the two-dimensional image based on each voltage value output from the distance measuring sensors 31 to 38, the association data 71, and the position of the display 50. In this case, for example, a program that adds a distance reflecting the movement of the display 50 to the distance obtained from the output voltage according to the sensor characteristics (see FIG. 8), and uses the sum as the y coordinate value in the display area 820, may be stored in the memory 70A.

 <第2の変形例>
 図29は、表示システム2の第2の変形例である表示システム2Bの外観および使用状態を示した図である。図29を参照して、表示システム2Bは、筐体11Bと、開口部21Bと、センサアレイ30と、光学素子41とを備える。
<Second Modification>
FIG. 29 is a diagram illustrating an appearance and a usage state of a display system 2B which is a second modification of the display system 2. Referring to FIG. 29, display system 2B includes a casing 11B, an opening 21B, a sensor array 30, and an optical element 41.

　開口部21Bは、矩形状である。開口部21Bは、開口部21Aと同様に、表示領域820の下方に、表示領域820の辺820bに沿って形成されている。開口部21BのX方向の長さは、開口部21と同じである。開口部21BのY方向の長さは、開口部21よりも長い。センサアレイ30は、予め定められた方向(図30参照)に移動可能に構成されている。たとえば、開口部21Bは、表示システム2Aの開口部21Aと同様な形状を有する。 The opening 21B has a rectangular shape. Similarly to the opening 21A, the opening 21B is formed below the display area 820 along the side 820b of the display area 820. The length of the opening 21B in the X direction is the same as that of the opening 21. The length of the opening 21B in the Y direction is longer than that of the opening 21. The sensor array 30 is configured to be movable in a predetermined direction (see FIG. 30). For example, the opening 21B has the same shape as the opening 21A of the display system 2A.

 図30は、図29におけるXXX-XXX線矢視断面図である。図30を参照して、表示システム2Bは、筐体11B内に、センサアレイ30と、ディスプレイ50とを備える。表示システム2Bは、筐体11Bの表面に設けられた開口部に、光学素子41を備える。なお、表示システム2Bにおける光学素子41の位置は、表示システム2における光学素子41の位置と同じである。 FIG. 30 is a cross-sectional view taken along line XXX-XXX in FIG. Referring to FIG. 30, the display system 2B includes a sensor array 30 and a display 50 in a housing 11B. The display system 2B includes an optical element 41 in an opening provided on the surface of the housing 11B. Note that the position of the optical element 41 in the display system 2B is the same as the position of the optical element 41 in the display system 2.

　表示システム2Bは、ディスプレイ50を矢印731の方向(第3の方向)に平行移動させる移動機構110B(図31参照)を備えている。また、表示システム2Bは、センサアレイ30を矢印733の方向(第4の方向)に平行移動させる移動機構120B(図31参照)を備えている。なお、矢印731の方向は、ディスプレイ50に表示された画像に基づく光の光学素子41に対する入射角を変更する方向である。矢印733の方向は、複数の測距センサ31~38が出射する各光の出射角度を変化させる方向である。 The display system 2B includes a moving mechanism 110B (see FIG. 31) that translates the display 50 in the direction of the arrow 731 (third direction). The display system 2B also includes a moving mechanism 120B (see FIG. 31) that translates the sensor array 30 in the direction of the arrow 733 (fourth direction). Note that the direction of the arrow 731 is a direction that changes the incident angle, on the optical element 41, of the light based on the image displayed on the display 50. The direction of the arrow 733 is a direction that changes the emission angle of each light beam emitted from the plurality of distance measuring sensors 31 to 38.

 より具体的には、ディスプレイ50は、予め定められた第1の回転軸(図示せず)を中心に予め定められた第1の範囲内で回転する。また、センサアレイ30は、予め定められた第2の回転軸J1を中心に予め定められた第2の範囲内で回転する。 More specifically, the display 50 rotates within a predetermined first range around a predetermined first rotation axis (not shown). In addition, the sensor array 30 rotates within a predetermined second range around a predetermined second rotation axis J1.

　ディスプレイ50が移動機構110Bにより移動すると、表示領域820の位置も矢印732の方向に移動する。具体的には、表示領域820は、ディスプレイ50の移動に基づき、第2の回転軸J1を中心に回転する。 When the display 50 is moved by the moving mechanism 110B, the position of the display area 820 also moves in the direction of the arrow 732. Specifically, the display area 820 rotates around the second rotation axis J1 based on the movement of the display 50.

 具体的には、ディスプレイ50が光学素子41から遠ざかる向きに移動すると、表示領域820は、光学素子41から遠ざかる向きに、ディスプレイ50の移動量(回転量)に応じた距離だけ移動する。ディスプレイ50が光学素子41に近づく向きに移動すると、表示領域820は、光学素子41に近づく向きにディスプレイ50の移動量(回転量)に応じた距離だけ移動する。 Specifically, when the display 50 moves in a direction away from the optical element 41, the display area 820 moves in a direction away from the optical element 41 by a distance corresponding to the movement amount (rotation amount) of the display 50. When the display 50 moves in a direction approaching the optical element 41, the display area 820 moves by a distance corresponding to the movement amount (rotation amount) of the display 50 in a direction approaching the optical element 41.

　センサアレイ30は、移動機構120Bによって、表示領域820の移動に応じた向きに移動する。より具体的には、センサアレイ30は、ディスプレイ50の移動に応じた向きに、ディスプレイ50の回転量に応じた量だけ回転する。このように、表示システム2Bにおいては、ディスプレイ50の移動に連動して、センサアレイ30が移動する。 The sensor array 30 is moved by the moving mechanism 120B in a direction corresponding to the movement of the display area 820. More specifically, the sensor array 30 rotates in the direction corresponding to the movement of the display 50, by an amount corresponding to the rotation amount of the display 50. Thus, in the display system 2B, the sensor array 30 moves in conjunction with the movement of the display 50.

　図31は、表示システム2Bのハードウェア構成の一部を示したブロック図である。図31を参照して、表示システム2Bは、センサアレイ30と、ディスプレイ50と、CPU60と、メモリ70Aと、ディスプレイ駆動装置80と、A/Dコンバータ90と、移動機構110Bと、移動機構120Bとを備える。 FIG. 31 is a block diagram showing a part of the hardware configuration of the display system 2B. Referring to FIG. 31, the display system 2B includes the sensor array 30, the display 50, the CPU 60, the memory 70A, the display driving device 80, the A/D converter 90, a moving mechanism 110B, and a moving mechanism 120B.

　CPU60は、図示しない入力装置(たとえば操作キー)から移動機構110Bを移動させる指令を受け付けた場合、当該指令に応じた方向および量だけディスプレイ50を移動させるための指令を移動機構110Bに送る。移動機構110Bは、CPU60からの指令に基づき、ディスプレイ50を移動させる。 When the CPU 60 receives a command to move the moving mechanism 110B from an input device (for example, an operation key) (not shown), the CPU 60 sends the moving mechanism 110B a command to move the display 50 by the direction and amount corresponding to the received command. The moving mechanism 110B moves the display 50 based on the command from the CPU 60.

 CPU60は、上記のように移動機構110Bを移動させる指令を受け付けた場合、さらに、ディスプレイ50の位置と位置関係データ72とに基づき、センサアレイ30の位置を決定する。CPU60は、センサアレイ30の位置を決定すると、移動機構120Bを用いてセンサアレイ30を決定した位置に移動させる。 When the CPU 60 receives a command to move the moving mechanism 110B as described above, the CPU 60 further determines the position of the sensor array 30 based on the position of the display 50 and the positional relationship data 72. When the position of the sensor array 30 is determined, the CPU 60 moves the sensor array 30 to the determined position using the moving mechanism 120B.

 以上のように、表示システム2Bは、光学素子41に対する表示領域820の角度を変更することができる。つまり、表示システム2Bは、光学素子41に対する空中画像の角度を変更できる。また、表示システム2Bは、センサアレイ30を表示領域820の移動に追従させる。このため、表示システム2Bは、光学素子41に対する表示領域820の角度を変更可能としつつも、2次元画像における物体の位置を検出可能となる。 As described above, the display system 2B can change the angle of the display area 820 with respect to the optical element 41. That is, the display system 2B can change the angle of the aerial image with respect to the optical element 41. Further, the display system 2B causes the sensor array 30 to follow the movement of the display area 820. For this reason, the display system 2B can detect the position of the object in the two-dimensional image while allowing the angle of the display area 820 to the optical element 41 to be changed.

 <第3の変形例>
 センサアレイ30に含まれる測距センサと、センサアレイ140に含まれる測距センサとの特性は、必ずしも同じである必要はない。この場合、センサアレイ30に含まれる各測距センサ31~38における出力電圧と距離との対応関係を示したデータ(対応付けデータ71)と、センサアレイ140に含まれる各測距センサにおける出力電圧と距離との対応関係を示したデータとをメモリ70,70Aに格納しておけばよい。
<Third Modification>
The characteristics of the distance measuring sensors included in the sensor array 30 and those of the distance measuring sensors included in the sensor array 140 are not necessarily the same. In this case, data (association data 71) indicating the correspondence between output voltage and distance for each of the distance measuring sensors 31 to 38 included in the sensor array 30, and data indicating the correspondence between output voltage and distance for each distance measuring sensor included in the sensor array 140, may be stored in the memories 70 and 70A.

 <第4の変形例>
 表示システム1の第2の変形例である表示システム1B(図15参照)のように、表示システム2を変形してもよい。つまり、表示システム2において、センサアレイ30と平行にセンサアレイ140を配置してもよい。
<Fourth Modification>
The display system 2 may be modified like a display system 1B (see FIG. 15) which is a second modification of the display system 1. That is, in the display system 2, the sensor array 140 may be arranged in parallel with the sensor array 30.

 なお、上記においては、表示領域810,820を矩形の表示領域として説明したが、矩形に限定されるものではない。表示領域810の形状は、ディスプレイ50の表示面の形状、マイクロレンズアレイ40の形状、光学素子41の形状に応じた形状となる。このため、表示システムに用いるディスプレイの表示面の形状、マイクロレンズアレイの形状、光学素子の形状を、ディスプレイ50の表示面の形状、マイクロレンズアレイ40の形状、光学素子41の形状とは異なる形状にした場合、表示領域は当該異なる形状に応じた形状となる。表示領域が矩形でなくても、上述した構成によれば、空中に表示させた2次元画像の周囲を枠で囲むことなく、簡易な構成で2次元画像における物体の位置を検出可能となる。 In the above description, the display areas 810 and 820 are described as rectangular display areas. However, the display areas are not limited to rectangles. The shape of the display area 810 is a shape corresponding to the shape of the display surface of the display 50, the shape of the microlens array 40, and the shape of the optical element 41. For this reason, the shape of the display surface of the display used in the display system, the shape of the microlens array, and the shape of the optical element are different from the shape of the display surface of the display 50, the shape of the microlens array 40, and the shape of the optical element 41. In this case, the display area has a shape corresponding to the different shape. Even if the display area is not rectangular, according to the above-described configuration, the position of the object in the two-dimensional image can be detected with a simple configuration without surrounding the two-dimensional image displayed in the air with a frame.

 今回開示された実施の形態は例示であって、上記内容のみに制限されるものではない。本発明の範囲は請求の範囲によって示され、請求の範囲と均等の意味および範囲内でのすべての変更が含まれることが意図される。 The embodiment disclosed this time is an example, and is not limited to the above contents. The scope of the present invention is defined by the terms of the claims, and is intended to include any modifications within the scope and meaning equivalent to the terms of the claims.

　1,1A,1B,1C,1D,2,2A,2B 表示システム、10,10A,10B,11,11A,11B 筐体、20,20A,20B,20C,21,21A,21B 開口部、30,30A,140 センサアレイ、31~38,31A 測距センサ、40 マイクロレンズアレイ、41 光学素子、50 ディスプレイ、70,70A メモリ、71 対応付けデータ、72 位置関係データ、80 ディスプレイ駆動装置、90 A/Dコンバータ、110,120,110A,110B,120A,120B 移動機構、150 A/Dコンバータ、311 発光装置、312,313 受光装置、312a,313a 赤外線受光素子、610 センシング領域、810,820,830 表示領域、910 指、A~N オブジェクト、311a 赤外線LED、312a,313a 赤外線受光素子。 1, 1A, 1B, 1C, 1D, 2, 2A, 2B display system, 10, 10A, 10B, 11, 11A, 11B housing, 20, 20A, 20B, 20C, 21, 21A, 21B opening, 30, 30A, 140 sensor array, 31 to 38, 31A distance measuring sensor, 40 microlens array, 41 optical element, 50 display, 70, 70A memory, 71 association data, 72 positional relationship data, 80 display driving device, 90 A/D converter, 110, 120, 110A, 110B, 120A, 120B moving mechanism, 150 A/D converter, 311 light emitting device, 312, 313 light receiving device, 312a, 313a infrared light receiving element, 610 sensing area, 810, 820, 830 display area, 910 finger, A to N object, 311a infrared LED, 312a, 313a infrared light receiving element.

Claims (12)

 ディスプレイ(50)と、
 前記ディスプレイに表示された画像に基づいて、空中の表示領域(810,820)に2次元画像を表示する光学素子(40,41)と、
 プロセッサ(60)と、
 複数の測距センサを第1の方向に列状に配置した第1のセンサアレイ(30,30A)と、
 各前記測距センサにおける出力電圧と距離との対応関係を示した第1のデータを格納したメモリ(70,70A)とを備え、
 前記表示領域の法線の方向は、前記第1の方向に垂直な方向であり、
 前記複数の測距センサの各々は、前記第1の方向および前記法線の方向と垂直な第2の方向であって、前記表示領域に近づく向きに、光を出射し、
 前記複数の測距センサの各々は、前記出射された光のうち物体により反射された光を受光して、当該物体までの距離に基づく電圧値を出力し、
 前記プロセッサは、前記複数の測距センサの各々が出力した各電圧値と、前記第1のデータとに基づいて、前記2次元画像における前記物体の位置を検出する、表示システム(1,1A,1B,1C,1D,2,2A,2B)。
A display (50);
An optical element (40, 41) for displaying a two-dimensional image in an aerial display area (810, 820) based on the image displayed on the display;
A processor (60);
A first sensor array (30, 30A) in which a plurality of distance measuring sensors are arranged in a row in a first direction;
A memory (70, 70A) storing first data indicating a correspondence relationship between an output voltage and a distance in each distance measuring sensor;
The normal direction of the display area is a direction perpendicular to the first direction;
Each of the plurality of distance measuring sensors emits light in a second direction perpendicular to the first direction and the normal direction and approaching the display area,
Each of the plurality of distance measuring sensors receives light reflected by an object out of the emitted light, and outputs a voltage value based on a distance to the object,
The processor detects a position of the object in the two-dimensional image based on each voltage value output from each of the plurality of distance measuring sensors and the first data, a display system (1, 1A, 1B, 1C, 1D, 2, 2A, 2B).
　前記第1のセンサアレイは、前記表示領域を含む平面と交わる位置に配置されている、請求項1に記載の表示システム。 The display system according to claim 1, wherein the first sensor array is disposed at a position intersecting with a plane including the display area.

　前記複数の測距センサの各々が出射する光は、前記表示領域を通過する、請求項2に記載の表示システム。 The display system according to claim 2, wherein the light emitted from each of the plurality of distance measuring sensors passes through the display area.

　前記表示システムは、
 前記ディスプレイを第3の方向に移動させる第1の移動機構(110,110A,110B)と、
 前記第1のセンサアレイを第4の方向に移動させる第2の移動機構(120,120A,120B)とをさらに備え、
 前記メモリは、前記ディスプレイの位置と前記第1のセンサアレイの位置との対応関係を示した第2のデータをさらに格納しており、
 前記表示領域は、前記ディスプレイの移動に基づいて移動し、
 前記プロセッサは、
  前記ディスプレイの位置と前記第2のデータとに基づき、前記第1のセンサアレイの位置を決定し、
  前記第2の移動機構を用いて、前記第1のセンサアレイを前記決定した位置に移動させる、請求項1から3のいずれか1項に記載の表示システム(1A,2A,2B)。
The display system includes:
A first moving mechanism (110, 110A, 110B) for moving the display in a third direction;
A second moving mechanism (120, 120A, 120B) for moving the first sensor array in a fourth direction;
The memory further stores second data indicating a correspondence relationship between the position of the display and the position of the first sensor array;
The display area moves based on the movement of the display,
The processor is
Determining a position of the first sensor array based on the position of the display and the second data;
The display system (1A, 2A, 2B) according to any one of claims 1 to 3, wherein the first sensor array is moved to the determined position using the second moving mechanism.
 前記第3の方向は、前記ディスプレイの表示面に垂直な方向であり、
 前記表示領域は、前記ディスプレイの前記第3の方向への移動に基づき、当該第3の方向に移動し、
 前記第4の方向は、前記第3の方向と同じ方向である、請求項4に記載の表示システム(1A)。
The third direction is a direction perpendicular to a display surface of the display;
The display area moves in the third direction based on the movement of the display in the third direction,
The display system (1A) according to claim 4, wherein the fourth direction is the same direction as the third direction.
 前記第3の方向は、前記ディスプレイの表示面に垂直な方向であり、
 前記表示領域の前記法線の方向は、前記第3の方向とは異なり、
 前記表示領域は、前記ディスプレイの前記第3の方向への移動に基づき、前記法線の方向に移動し、
 前記第4の方向は、前記法線の方向の成分を有する方向であり、
 前記プロセッサは、前記複数の測距センサの各々が出力した各電圧値と、前記第1のデータと、前記第1のセンサアレイの位置または前記ディスプレイの位置とに基づいて、前記2次元画像における物体の位置を検出する、請求項4に記載の表示システム(2A)。
The third direction is a direction perpendicular to a display surface of the display;
The direction of the normal of the display area is different from the third direction,
The display area moves in the direction of the normal based on the movement of the display in the third direction,
The fourth direction is a direction having a component in the direction of the normal line,
The display system (2A) according to claim 4, wherein the processor detects the position of the object in the two-dimensional image based on each voltage value output from each of the plurality of distance measuring sensors, the first data, and the position of the first sensor array or the position of the display.
 前記第3の方向は、前記ディスプレイに表示された画像に基づく光の前記光学素子に対する入射角を変更する方向であり、
 前記第4の方向は、前記複数の測距センサが出射する各光の出射角度を変化させる方向である、請求項4に記載の表示システム(2B)。
The third direction is a direction for changing an incident angle of light with respect to the optical element based on an image displayed on the display;
The display system (2B) according to claim 4, wherein the fourth direction is a direction in which an emission angle of each light emitted from the plurality of distance measuring sensors is changed.
The display system (1B) according to any one of claims 1 to 7, further comprising
a second sensor array (140) in which a plurality of distance measuring sensors are arranged in a row in the first direction, wherein
the second sensor array is arranged in parallel with the first sensor array and emits light in the same direction as the first sensor array, and
the processor
displays at least one object in the display area,
detects the position of the object based on the voltage values output by the plurality of distance measuring sensors included in the second sensor array and the first data,
determines whether the position, in the plane including the display area, that corresponds to the detected position of the object is included in the region of the display area in which the at least one object is displayed, and
changes the display mode of the object in the display area from a first display mode to a second display mode when it determines that the position is included in the region in which the object is displayed.
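The hit test recited in this claim can be illustrated with a short sketch. This is a hypothetical illustration, not code from the patent: the function names, the rectangle format `(left, top, width, height)`, and the mode labels `"first"`/`"second"` are all assumptions made for the example.

```python
# Hypothetical sketch of the claimed hit test: check whether the detected
# position in the plane of the display area falls inside the rectangle in
# which an object is displayed, and if so switch that object's display mode.
def hit_object(pos, obj_rect):
    """pos: (x, y) detected in the display-area plane.
    obj_rect: (left, top, width, height) of a displayed object."""
    x, y = pos
    left, top, w, h = obj_rect
    return left <= x < left + w and top <= y < top + h

def update_display_mode(pos, objects):
    """objects: dict name -> (rect, mode). Returns a new dict in which any
    object containing pos is switched to the second display mode, as the
    claim recites; all other objects keep their current mode."""
    out = {}
    for name, (rect, mode) in objects.items():
        if pos is not None and hit_object(pos, rect):
            out[name] = (rect, "second")  # e.g. highlighted / pressed state
        else:
            out[name] = (rect, mode)
    return out
```

A position detected inside an on-screen button would thus flip that button, and only that button, into the second display mode.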
The display system (1D) according to any one of claims 1 to 8, wherein
the processor displays at least one object in the display area, and
the distance measuring sensors in the first sensor array (30A) are provided at positions corresponding to the at least one object.
The display system (1C) according to any one of claims 1 to 9, wherein each of the plurality of distance measuring sensors in the first sensor array includes one light emitting element and two or more light receiving elements that each receive reflected light of the light emitted by the light emitting element.

The display system according to any one of claims 1 to 10, wherein
the processor
displays at least one object in the display area, and
executes the processing associated with the at least one object based on the position of the object detected by the first sensor array being included in the region in which the at least one object is displayed.
A detection method in a display system (1, 1A, 1B, 1C, 1D, 2, 2A, 2B) that detects the position of an object in a two-dimensional image displayed in an aerial display area (810, 820), wherein
the display system includes a processor (60), a sensor array (30, 30A) in which a plurality of distance measuring sensors are arranged in a row in a first direction, and a memory (70, 70A) storing data indicating the correspondence between the output voltage and the distance for each distance measuring sensor, and the direction of the normal of the display area is perpendicular to the first direction,
the detection method comprising the steps of:
each of the plurality of distance measuring sensors emitting light in a second direction that is perpendicular to the first direction and the direction of the normal, toward the display area (S4);
each of the plurality of distance measuring sensors receiving light reflected by an object out of the emitted light and outputting a voltage value based on the distance to the object (S6, S8); and
the processor detecting the position of the object in the two-dimensional image based on the voltage values output by the plurality of distance measuring sensors and the data (S16).
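The claimed steps S4 through S16 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the calibration table, the sensor pitch, the peak-picking strategy, and all names are hypothetical. Analog triangulation distance sensors typically output a higher voltage for a closer object, which is the convention assumed here for the stored voltage-distance data.

```python
# Hypothetical sketch of the claimed detection: the memory (70) holds
# per-sensor voltage-to-distance correspondence data; the processor (60)
# converts each sensor's output voltage to a distance and locates the
# object in the two-dimensional display area.
import bisect

# Assumed correspondence data: (output voltage in V, distance in mm).
VOLTAGE_TO_DISTANCE = [(2.8, 40.0), (2.0, 80.0), (1.2, 150.0), (0.6, 300.0)]

SENSOR_PITCH_MM = 10.0  # assumed spacing of sensors along the first direction

def voltage_to_distance(v):
    """Linearly interpolate a distance from an output voltage (S6, S8)."""
    pts = sorted(VOLTAGE_TO_DISTANCE)  # ascending voltage
    volts = [p[0] for p in pts]
    if v <= volts[0]:
        return pts[0][1]
    if v >= volts[-1]:
        return pts[-1][1]
    i = bisect.bisect_left(volts, v)
    (v0, d0), (v1, d1) = pts[i - 1], pts[i]
    t = (v - v0) / (v1 - v0)
    return d0 + t * (d1 - d0)

def detect_position(voltages, threshold=0.5):
    """Detect the object's 2D position (S16): the sensor with the strongest
    reading gives the coordinate along the first direction (array axis);
    its interpolated distance gives the coordinate along the second
    direction. Returns None when no sensor sees a reflection."""
    best = max(range(len(voltages)), key=lambda i: voltages[i])
    if voltages[best] < threshold:
        return None  # no object in front of the array
    x = best * SENSOR_PITCH_MM
    y = voltage_to_distance(voltages[best])
    return (x, y)
```

For example, a strong reading on the third sensor maps to a position 20 mm along the array at the distance interpolated from that sensor's voltage.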
PCT/JP2011/065579 2010-09-06 2011-07-07 Display system and detection method Ceased WO2012032842A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010199062 2010-09-06
JP2010-199062 2010-09-06

Publications (1)

Publication Number Publication Date
WO2012032842A1 true WO2012032842A1 (en) 2012-03-15

Family

ID=45810447

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/065579 Ceased WO2012032842A1 (en) 2010-09-06 2011-07-07 Display system and detection method

Country Status (1)

Country Link
WO (1) WO2012032842A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2849038A1 (en) * 2013-09-17 2015-03-18 Funai Electric Co., Ltd. Spatial coordinate identification device
JP2016006564A (en) * 2014-06-20 2016-01-14 船井電機株式会社 Image display device
JP2016006447A (en) * 2014-06-20 2016-01-14 船井電機株式会社 Image display device
JP2016009396A (en) * 2014-06-25 2016-01-18 船井電機株式会社 Input device
JP2016009204A (en) * 2014-06-20 2016-01-18 船井電機株式会社 Image display device
JP2016048347A (en) * 2014-08-28 2016-04-07 エヌカント株式会社 Information display system
EP2957997B1 (en) * 2014-06-20 2020-11-18 Funai Electric Co., Ltd. Image display device
WO2022018972A1 (en) * 2020-07-22 2022-01-27 日本電産サンキョー株式会社 Input device and input device control method
JP2023168064A (en) * 2022-05-13 2023-11-24 株式会社村上開明堂 aerial control device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006038509A1 (en) * 2004-10-07 2006-04-13 Pioneer Corporation Stereoscopic two-dimensional image display device
JP2006323521A (en) * 2005-05-17 2006-11-30 Takenaka Komuten Co Ltd Non-contact type input device


Similar Documents

Publication Publication Date Title
WO2012032842A1 (en) Display system and detection method
KR101926406B1 (en) Position sensing systems for use in touch screens and prismatic film used therein
WO2018146867A1 (en) Control device
WO2012032851A1 (en) Display system and detection method
US10303305B2 (en) Scanning touch systems
WO2017125984A1 (en) Aerial display device
KR20010014970A (en) Optical unit for detecting object and coordinate input apparatus using the same
JP2018031925A (en) Aerial display device
US20220074573A1 (en) Optical System for Noise Mitigation
JP6721875B2 (en) Non-contact input device
KR20010041694A (en) Optical sensor system for detecting the position of an object
JP5451538B2 (en) Coordinate input device
JP2015060296A (en) Spatial coordinate specification device
TW201324259A (en) User interface display device
US20240019715A1 (en) Air floating video display apparatus
JP2014202951A (en) Image projection device and operation matter detection method
CN201853211U (en) Laser Optical Touch Module
JP2023006618A (en) Space floating image display device
TW201327324A (en) Optical touch control module
JP2011258039A (en) Position detector
JP7710938B2 (en) Space-floating image information display system and 3D sensing device used therein
JP4034328B2 (en) Luminescence detection device and coordinate detection device
JP6315127B2 (en) Input device, aerial image interaction system, and input method
US12493195B1 (en) Projector and sensing system used therefor
CN221124870U (en) Distance measuring device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11823321

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11823321

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP