WO2013081671A1 - Interference and contamination compensation for multi-touch input devices - Google Patents
Interference and contamination compensation for multi-touch input devices
- Publication number
- WO2013081671A1 (PCT/US2012/044055)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- digital image
- input device
- touch input
- digital
- infrared light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0202—Constructional details or processes of manufacture of the input device
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0238—Programmable keyboards
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03543—Mice or pucks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04109—FTIR in optical digitiser, i.e. touch detection by frustrating the total internal reflection within an optical waveguide due to changes of optical properties or deformation at the touch location
Definitions
- keyboards, mice, and touchpads may not be suitable for environments where debris is common, such as an industrial or manufacturing environment where dirt, grease, or other debris may interfere with the proper functioning of the keyboard, mouse, or touchpad.
- a keyboard cover, such as a clear plastic or silicone cover, may provide some protection in such an environment, but it may also interfere with or degrade the performance of the device, such as the ability to depress single keys precisely.
- keyboards, mice, and touchpads may not be suitable for environments where a clean and sterile surface is needed, such as in a health care or hospital environment.
- a keyboard, mouse, or touchpad surface may have gaps, edges, crevices, or other surface areas that are difficult to clean or sanitize.
- traditional keyboards, mice, and touchpads provide a fixed input configuration.
- mechanical keyboards are created with keys of fixed size and layout.
- mice have fixed buttons and scroll wheels, and touchpads have a fixed input area.
- although software can be used to map fixed keyboard keys to new functions, it cannot change the physical size, position, or layout of the keys.
- a FTIR multi-touch input device can comprise a base portion, a transparent interface panel connected to the base portion, an interface map attached to the interface panel, an infrared light source for emitting infrared light inside the interface panel using total internal reflection, and one or more digital cameras, where the one or more digital cameras are configured to detect infrared light that is scattered from the interface panel using FTIR when the interface panel is touched.
- the interface map can be a custom or pre-defined removably attached interface map (e.g., a transparent polycarbonate film).
- the interface map can depict a layout comprising one or more touch areas (e.g., one or more keys and/or one or more touchpad zones).
- a method can be provided for creating a custom interface map for a multi-touch input device (e.g., a frustrated total internal reflection (FTIR) multi-touch input device).
- the method can comprise receiving layout information for a custom interface map, where the layout information defines a plurality of touch areas, and where each touch area is one of a key and a touchpad zone, generating a layout for the custom interface map according to the layout information, and generating a configuration file from the received layout
- the method can also comprise outputting the configuration file (e.g., storing the configuration file or sending the configuration file to a user) and/or providing the layout for printing on a transparent film to create the custom interface map.
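The layout-to-configuration-file step described above can be sketched as a simple serialization. The JSON format, field names, and dimensions below are illustrative assumptions, not details from the patent:

```python
import json

def generate_config(layout_info, path="custom_map.json"):
    """Write a configuration file for a custom interface map (a sketch).

    layout_info: a list of touch areas, each a dict giving the area's
    type ("key" or "touchpad_zone"), its location and size on the
    interface map, and the function performed when it is touched.
    """
    config = {"version": 1, "touch_areas": []}
    for area in layout_info:
        config["touch_areas"].append({
            "type": area["type"],            # "key" or "touchpad_zone"
            "x": area["x"], "y": area["y"],  # location on the map
            "width": area["width"], "height": area["height"],
            "function": area["function"],    # e.g., a key code or "cursor"
        })
    with open(path, "w") as f:
        json.dump(config, f, indent=2)
    return config

# Hypothetical layout: one key and one touchpad zone.
layout = [
    {"type": "key", "x": 10, "y": 10, "width": 18, "height": 18,
     "function": "KEY_A"},
    {"type": "touchpad_zone", "x": 40, "y": 10, "width": 80, "height": 60,
     "function": "cursor"},
]
```

The resulting file could then be loaded on the input device to associate touch locations with the printed layout.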
- a FTIR multi-touch input device can comprise a base portion, a transparent interface panel connected to the base portion, an infrared light source for emitting infrared light inside the interface panel using total internal reflection, an ultraviolet light source for emitting ultra-violet light inside the transparent interface panel, and one or more digital cameras, where the one or more digital cameras are configured to detect infrared light that is scattered from the interface panel using FTIR when the interface panel is touched.
- the ultraviolet light can provide a sterilization effect to the transparent interface panel.
- a custom interface map and configuration file can be provided for a multi-touch input device (e.g., a FTIR multi-touch input device), comprising a custom interface map that is defined by layout information, where the layout information comprises, for each of a plurality of touch areas: configuration of the touch area, location of the touch area on the custom interface map, and function performed by the touch area when touched, and comprising a configuration file corresponding to the custom interface map, where the configuration file is loadable on the multi-touch input device to configure the multi-touch input device to use the custom interface map and process touch events according to the layout information.
- a method, implemented at least in part by a frustrated total internal reflection (FTIR) multi-touch input device, can be provided for removing interference.
- the method can comprise capturing a first digital image, where the first digital image is captured when an infrared light source of the FTIR multi-touch input device is turned off, capturing a second digital image, where the second digital image is captured when the infrared light source of the FTIR multi-touch input device is turned on, and processing the first and second digital images to remove infrared interference.
- a frustrated total internal reflection multi-touch input device can comprise a transparent interface panel, an infrared light source for emitting infrared light inside the interface panel using total internal reflection, and a digital camera, where the digital camera is configured to detect infrared light that is scattered from the interface panel using FTIR when the interface panel is touched.
- the input device can be configured to perform operations comprising detecting infrared interference based at least in part from digital images captured using the digital camera when the infrared light source is off, and compensating for the detected infrared interference in digital images captured using the digital camera when the infrared light source is on
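The two-frame interference-compensation procedure above amounts to subtracting the ambient-IR frame (source off) from the lit frame (source on). A minimal sketch, assuming 8-bit grayscale frames as NumPy arrays (an implementation choice, not a detail from the patent):

```python
import numpy as np

def remove_interference(frame_ir_off, frame_ir_on):
    """Subtract ambient IR, captured with the device's own IR source
    off, from a frame captured with the source on. Brighter pixels
    mean more detected IR; any residual brightness after subtraction
    is attributed to light from the device itself (e.g., touches)."""
    off = frame_ir_off.astype(np.int16)  # widen to avoid uint8 underflow
    on = frame_ir_on.astype(np.int16)
    return np.clip(on - off, 0, 255).astype(np.uint8)
```

Synchronizing capture with the IR source's on/off cycle would yield the frame pairs this function expects.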
- a method, implemented at least in part by a frustrated total internal reflection (FTIR) multi-touch input device, can be provided for removing contamination from digital images.
- the method can comprise capturing a first digital image, where the first digital image is captured when an infrared light source of the FTIR multi-touch input device is turned on, and where the first digital image is captured when an interface panel of the FTIR multi-touch input device is not being touched, determining contamination information based at least in part upon the first digital image, where the contamination information is determined from detected infrared light present in the first digital image, capturing a plurality of additional digital images, where the plurality of additional digital images are captured when the infrared light source of the FTIR multi-touch input device is turned on, and processing the plurality of additional digital images based at least in part upon the contamination information to remove contamination from the plurality of additional digital images
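The contamination-removal method above can be sketched as a baseline subtraction: a frame captured with the IR source on but the panel untouched records the scatter from contamination, and that baseline is subtracted from later frames. NumPy arrays and the 8-bit range are assumptions:

```python
import numpy as np

def contamination_baseline(untouched_frame):
    """Derive contamination information from a frame captured with the
    IR source on while the interface panel is not being touched: any
    detected IR in that frame is scatter from contamination (e.g.,
    dirt or smudges on the panel)."""
    return untouched_frame.astype(np.int16)

def remove_contamination(frame, baseline):
    """Subtract the contamination baseline from a later frame so that
    only IR scattered by actual touches remains."""
    return np.clip(frame.astype(np.int16) - baseline, 0, 255).astype(np.uint8)
```

In practice the baseline might be refreshed periodically, since contamination on the panel can change over time.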
- FIG. 1 is a perspective view depicting an example multi-touch input device, which can be in the format of a keyboard.
- FIG. 2 is a perspective view depicting an example multi-touch input device, which can be in the format of a mouse or touchpad.
- FIG. 3 is a side elevation view of the multi-touch input device of Fig. 1 or Fig. 2.
- FIG. 4 is an exploded view depicting an example multi-touch input device.
- FIG. 5 is a schematic side elevation view of an example multi-touch input device in use, depicting scattered IR light from a touch.
- FIGS. 6A and 6B are plan views of example interface maps with a keyboard layout.
- FIG. 7 is a plan view of an example interface map with a mouse or touchpad layout.
- FIG. 8 is a flowchart of an exemplary method for creating a custom interface map for a multi-touch input device.
- FIGS. 9A, 9B, and 9C are diagrams depicting example digital images, including images depicting infrared interference.
- FIG. 10 is a flowchart of an exemplary method for removing infrared interference.
- FIGS. 11A, 11B, and 11C are diagrams depicting example digital images, including images depicting scattered light from contamination.
- FIG. 12 is a flowchart of an exemplary method for removing contamination from digital images.
- FIG. 13 is a system diagram depicting a generalized example of a suitable computing system in which the described innovations may be implemented
- Multi-touch input devices can be provided in the format of keyboards, mice, touchpads, or the like.
- a multi-touch input device can be provided in the format of a traditional keyboard, comprising letter keys, number keys, function keys, special keys, etc.
- a multi-touch input device can be provided in the format of a mouse or touchpad, comprising a touchpad zone for cursor movement and one or more buttons.
- a multi-touch input device can also comprise functionality of a keyboard in addition to a mouse or touchpad.
- a layout for a multi- touch input device can comprise keys and buttons in combination with touchpad zones (e.g., in a pre-defined or user-defined layout).
- a multi-touch input device (e.g., a keyboard, mouse, touchpad, and the like) can comprise a touch sensitive surface (e.g., an interface panel).
- the touch sensitive surface can be a sheet of material, such as glass or plastic, within which light (e.g., infrared (IR), ultraviolet (UV), and/or visible light) is reflected internally (e.g., via total internal reflection).
- the interface panel is made from glass or a glass blend, such as a borosilicate type glass.
- the size or dimensions of a multi-touch input device can vary according to implementation details.
- a multi-touch input device that will be used as a traditional keyboard can be sized similarly to a traditional keyboard.
- An input device that will be used as a touchpad with a small number of buttons can be sized accordingly (e.g., smaller than a traditional keyboard).
- Multi-touch input devices can detect touch by an object (e.g., a person's finger, stylus, or other type of object) using FTIR technology.
- Multi-touch input devices can detect touch by more than one object (e.g., multiple fingers) at once (e.g., by imaging the interface panel to detect one or more touch events as spots or blobs of scattered IR light).
- a multi-touch keyboard, mouse, or touchpad has a glass touch area where activation of standard keys or manipulation of 1D or 2D sliders can occur (e.g., 1D volume control or zoom sliders, or 2D cursor movement zones).
- the multi-touch input device can be supported by a base part that houses control circuitry, cameras, communication connections, and/or a battery.
- the input device can have raised surface indicators (e.g., small bumps) to indicate specific locations on the surface to a user (e.g., bumps on the J and F keys to help with finger placement).
- the input device can provide (e.g., via software or firmware) selectable functions to create keystroke sounds for each touch.
- the input device can include a backlight that can be turned on or off.
- FTIR technology can use infrared light-emitting diodes (LEDs) placed at the edge of a panel which transmits the wavelength(s) of light produced by the LEDs. Because of the low angle at which the light impacts the internal glass surface, it is internally reflected or "bounced around inside the glass," much like looking through a tube to see the reflections on the inside walls. When the glass is touched it frustrates the reflection and the IR light is scattered, with some of the light being scattered downward, out of the glass, which allows a camera to see it. The device can then determine the location and send the appropriate information to a computer or other computing device.
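Determining the touch location from the downward-scattered IR amounts to finding bright blobs in a grayscale camera frame and taking their centroids. A minimal sketch; the threshold, minimum area, and flood-fill approach are illustrative assumptions, not details from the patent:

```python
import numpy as np

def detect_touches(frame, threshold=128, min_area=4):
    """Find bright blobs of scattered IR in a grayscale frame.
    Returns a list of (cx, cy, area) tuples, one per detected touch."""
    mask = frame >= threshold
    seen = np.zeros_like(mask, dtype=bool)
    touches = []
    h, w = mask.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                # Flood-fill one 4-connected blob of bright pixels.
                stack, pixels = [(y, x)], []
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if len(pixels) >= min_area:  # ignore single-pixel noise
                    ys, xs = zip(*pixels)
                    touches.append((sum(xs) / len(xs),
                                    sum(ys) / len(ys), len(pixels)))
    return touches
```

A production implementation would more likely use a connected-component routine such as `scipy.ndimage.label`, but the logic is the same.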
- the multi-touch input devices described herein can be completely
- the interface map can be easily changed by affixing (e.g., placing on or adhering) clear static sheets that can be purchased or printed custom by the end user. This allows the user to adapt to specific languages or dedicate areas of the input device to anything they choose, in addition to selecting standard layouts for users who want to use the number pad area just for track pad function for instance. Flexible software can allow even more versatility by providing the ability to develop special functionality for any desired interface map layout.
- the multi-touch input devices described herein can provide advantages over existing FTIR technology and conventional capacitive touch screen systems.
- the input device technology described herein can be less expensive and use less power, because the touch panel is separate from the display screen, and no projection apparatus is needed to project the interface map onto the touch panel.
- custom interface maps can be developed and removably applied to satisfy a variety of standard and special purpose applications.
- the touch surface can be easily cleaned and can support sterilization techniques.
- Other advantages of the multi-touch input device technology are described elsewhere herein.
- This section describes different aspects of multi-touch input devices that utilize FTIR techniques. Different features and elements of the devices described in this section can be used separately or in combination with other features and elements described in this section and/or described elsewhere herein.
- Fig. 1 is a perspective view depicting an example multi-touch input device 100.
- the multi-touch input device 100 can be in the format of a keyboard.
- the multi-touch input device 100 comprises a transparent interface panel 102, a base portion 104, a housing portion 107 connecting the interface panel 102 to the base portion 104, a light source (not depicted) for projecting light into the interface panel 102 (e.g., for directing IR light to an edge of the interface panel 102), and one or more digital cameras 116 positioned to scan a surface of the interface panel 102.
- the multi-touch input device 100 also comprises an interface map 126 (e.g., a custom or pre-defined interface map), such as a transparent film with a printed layout, affixed to the interface panel 102.
- the digital cameras 116 are mounted in the base portion 104, distal from interface panel 102, in order to provide adequate field of vision.
- the digital cameras 116 can scan (e.g., continuously capture images or frames a number of times per second, such as 30 or 60 frames per second) at least the area of the interface panel 102 corresponding to the interface map 126.
- the light source can flicker at a frequency and pulse length which is synchronized with the scan rate of the corresponding digital cameras 116. Synchronizing the light source and the digital cameras 116 can help prevent interference from other potential light sources (e.g., IR light), such as IR remote control devices.
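The synchronization described above pairs each camera frame with a known state of the flickering IR source. A sketch of that idea, assuming a 50% duty cycle and phase-locked timing (idealizations not stated in the patent):

```python
def frame_schedule(capture_fps, flash_hz):
    """Label each camera frame in one second as captured while the IR
    source is on or off, given a capture rate that is a multiple of
    the flash rate. The on/off labels tell later processing which
    frames can serve as ambient-interference references."""
    assert capture_fps % flash_hz == 0, \
        "capture rate should be a multiple of the flash rate"
    frames_per_flash = capture_fps // flash_hz
    schedule = []
    for i in range(capture_fps):
        phase = (i % frames_per_flash) / frames_per_flash
        schedule.append("on" if phase < 0.5 else "off")
    return schedule
```

For example, a camera at 60 frames per second with a 30 Hz flash alternates lit and unlit frames, giving the frame pairs needed for interference subtraction.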
- Fig. 2 is a perspective view depicting an example multi-touch input device 200.
- the multi-touch input device 200 can be in the format of a touchpad or mouse.
- the multi-touch input device 200 comprises a transparent interface panel 202, a base portion 204, a housing portion 207 connecting the interface panel 202 to the base portion 204, a light source (not depicted) for projecting light into the interface panel 202, and one or more digital cameras 216 positioned to scan a surface of the interface panel 202.
- the multi-touch input device 200 also comprises an interface map 226 (e.g., a custom or pre-defined interface map) affixed to the interface panel 202.
- the multi-touch input devices depicted in Figs. 1 and 2 comprise many of the same design features.
- the input device of Fig. 2 can be a smaller version (e.g., in width, length, and/or height) of the input device of Fig. 1.
- the input device of Fig. 2 may have a different interface map 226 (e.g., a touchpad style interface map instead of a keyboard style interface map) and may have a fewer number of digital cameras 216 (e.g., only one digital camera instead of multiple digital cameras).
- the multi-touch input devices depicted in Figs. 1 and 2 can be configured for different applications. For example, a larger input device, such as depicted in Fig. 1, can operate as a typical keyboard (e.g., it can operate in place of a standard mechanical keyboard without any special software or drivers).
- a smaller input device, such as depicted in Fig. 2 can operate as a typical mouse (e.g., it can operate in place of a standard opto-mechanical mouse or touchpad without any special software or drivers). In this way, the input device can be directly connected to a computer system and operate as a keyboard or mouse without the need for additional software or drivers.
- Fig. 1 generally depicts an input device with a keyboard type layout
- Fig. 2 generally depicts an input device with a mouse or touchpad type layout
- the input devices are not limited to such layouts.
- Fig. 1 can be configured with touchpad zones (e.g., including a touchpad zone that performs cursor movement), separately or in combination with keys performing standard or custom functions.
- the multi-touch input devices depicted in Figs. 1 and 2 can detect touch events by an object (e.g., a person's finger, stylus, or other type of object) using FTIR technology.
- the touch events can be, for example, a touch (e.g., a touch or press by a finger with a large enough area to indicate a user's desire to activate a key or button), a rest (e.g., a touch or press by a finger for a longer duration to indicate the user does not desire to activate a key or button), a movement (e.g., movement of a touch area over successive frames to indicate a user's desire to perform a movement action, such as cursor movement on a touchpad zone).
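The touch/rest/movement distinction above can be sketched as a small classifier over a tracked contact's area, duration, and centroid displacement. The threshold values are illustrative assumptions, not figures from the patent:

```python
def classify_touch(area_px, duration_frames, moved_px,
                   min_touch_area=20, rest_frames=30, move_threshold=3):
    """Classify a tracked contact on the interface panel (a sketch).

    A contact whose centroid moves between frames is a movement (e.g.,
    cursor motion in a touchpad zone); one held for many frames is a
    rest (a finger resting, not activating anything); a large enough
    brief contact is a touch (key or button activation)."""
    if moved_px > move_threshold:
        return "movement"
    if duration_frames >= rest_frames:
        return "rest"
    if area_px >= min_touch_area:
        return "touch"
    return "ignore"  # too small to be intentional
```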
- the transparent interface panel (e.g., 102 or 202) can be a panel made out of a material, such as glass or plastic, within which light (e.g., infrared (IR), ultraviolet (UV), and/or visible light) is reflected internally.
- the panel can be a curved panel (e.g., curved in one direction).
- the base portion (e.g., 104 or 204) of the multi-touch input devices depicted in Figs. 1 and 2 can provide support for the input device and can house various components.
- the base portion can house the digital cameras (e.g., 116 or 216).
- the digital cameras can be housed in the base portion in a position to view (e.g., capture images) of the underside of the interface panel (e.g., 102 or 202).
- the base portion can house components needed for communication with a computing device (e.g., a computer system, such as a desktop computer, or another type of computing device), such as a USB connection and/or wireless communication technology (e.g., Bluetooth).
- the base portion can house control components (e.g., processors or other types of controllers) for operating the input device (e.g., controlling the digital cameras to scan or image the underside of the interface panel, process the information, and communicate results to an associated computing device).
- the housing portion (e.g., 107 or 207) of the multi-touch input devices depicted in Figs. 1 and 2 can connect the base portion (e.g., 104 or 204) to the interface panel (e.g., 102 or 202).
- the housing portion can house a light source (e.g., IR, UV, and/or visible light sources, such as LEDs).
- the light source can be located in the housing portion or in the base portion, and be positioned for directing light into an edge of the interface panel (e.g., positioned to directly shine light into the edge of the interface panel or to indirectly conduct light, such as via a light pipe).
- the light source of the multi-touch input devices depicted in Figs. 1 and 2 can provide UV, IR, and/or visible light.
- one or more IR LEDs provide a light source within the interface panel, for detection by the digital cameras using FTIR technology.
- one or more UV LEDs provide a light source to the interface panel and provide a sterilization effect to the interface panel. UV, IR, and/or visible light sources can be used separately or in combination.
- the digital cameras (e.g., 116 or 216) of the multi-touch input devices depicted in Figs. 1 and 2 capture images (e.g., continuously capture a sequence of images as a video stream) of the interface panel (e.g., 102 or 202).
- the digital cameras can be configured to capture at a specific rate (e.g., 30, 50, or 80 pictures or frames per second (FPS)).
- the digital cameras can be configured to capture at a specific rate that depends on the rate of a light source.
- the digital camera capture rate can be set accordingly (e.g., to the same rate, such as 15Hz or 30Hz, or to a different rate, such as 30Hz or 60Hz, which may be a multiple of the flashing rate of the IR light).
- the digital cameras can be selected, or filtered, to be primarily sensitive to specific types of light, such as IR light.
- the digital images captured by the digital cameras can depict the presence and absence of infrared light.
- the digital images can be grayscale digital images in which areas of detected infrared light are lighter (e.g., where the lighter the grayscale value, the greater the intensity of IR light) and areas with little or no detected infrared light are darker (e.g., where the darker the grayscale value, the lesser the intensity of IR light).
- the multi-touch input devices depicted in Figs. 1 and 2 can include an interface map (e.g., 126 or 226).
- the interface map depicts a layout for the various key and/or touchpad areas of the input device.
- the interface map can depict key areas for a standard keyboard (e.g., depicting letters, numbers, special characters, arrow keys, function keys, etc.).
- the interface map is a separate sheet (e.g., a static plastic sheet) that is removably attached to the top surface of the interface panel (e.g., a separate sheet or film that is placed on the interface panel or adhered to the interface panel).
- the interface map can be attached to the bottom surface of the interface panel.
- the interface map is permanently attached to the top or bottom surface of the interface panel (e.g., printed directly on the surface of the interface panel, etched into the interface panel, or otherwise permanently applied or attached).
- the interface map can be a custom interface map (e.g., user-defined or selected from a number of layout options or combinations of elements) or a pre-defined interface map (e.g., a standard keyboard layout in a specific language).
- the multi-touch input devices depicted in Figs. 1 and 2 can include a controller (e.g., one or more processors, integrated circuits, and/or associated components).
- the controller can be located, for example, in the base portion (e.g., 104 or 204) of the multi-touch input device.
- the controller can control functions of the input device, such as communication (e.g., wired or wireless communication) with an associated computing device, operation of the digital cameras, processing of images to detect touch events, translation of touch events into specific key presses or touchpad movement (e.g., using an interface map), sterilization cleaning cycles, processing of images to remove interference and/or contamination, communication with an associated computer as a standard keyboard and/or mouse, and other functions performed by the input device.
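The controller's translation of touch events into key presses using an interface map reduces to hit-testing the touch coordinates against the layout's touch areas. A sketch using the same hypothetical area records as the configuration-file example:

```python
def resolve_touch(x, y, touch_areas):
    """Map a touch location to the touch area it falls in, using the
    layout information from a loaded configuration file. Returns the
    area's function (e.g., a key code, or "cursor" for a touchpad
    zone), or None if the touch misses every defined area."""
    for area in touch_areas:
        if (area["x"] <= x < area["x"] + area["width"]
                and area["y"] <= y < area["y"] + area["height"]):
            return area["function"]
    return None
```

The controller could then report the resolved function to the associated computer over its standard keyboard or mouse interface.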
- Fig. 3 is a side elevation view of the multi-touch input device 100 of Fig. 1 or 200 of Fig. 2, showing the transparent interface panel 102 or 202, the base portion 104 or 204, the housing portion 107 or 207 connecting the interface panel to the base portion, a light source (not depicted) for projecting light into the interface panel, and one or more digital cameras 116 or 216 positioned to scan a surface of the interface panel.
- the multi-touch input device also has the interface map 126, 226 (e.g., a visually perceptible and/or tactile perceptible interface map) attached (e.g., removably attached or permanently attached) to the top 318 (as shown) and/or bottom 320 surface of the interface panel.
- the interface map can be selected from a plurality of pre-existing layouts or a custom layout can be created for a particular user or purpose.
- the interface map can be configured in a single piece, or multiple interface map parts can be provided.
- the base portion 104 or 204 and/or housing portion 107 or 207 can comprise various components for operating the multi-touch input device 100, 200.
- the components can include controllers (e.g., one or more processors), one or more of the cameras 116 or 216, power supplies, interfaces to other systems (e.g., USB and/or wireless connection components for communicating with other computing devices), LEDs, light pipes, etc.
- the multi-touch input device has a streamlined aesthetic appearance.
- the interface panel is slightly curved and attached only at one end, such that its forward edge is cantilevered over the cameras and the forward edge of the base.
- the multi-touch input device 400 comprises a transparent interface panel 402 (e.g., corresponding to 102 or 202), an interface map 426 (e.g., corresponding to 126 or 226), a base portion 404 (e.g., corresponding to 104 or 204) coupled to interface panel 402, a housing portion 407 (e.g., corresponding to 107 or 207) connecting the interface panel 402 to the base portion 404, and three digital cameras 416 positioned to scan a surface of the interface panel 402. Depending on implementation details, a different number of digital cameras 416 can be used.
- Fig. 4 also depicts components that are internal to the multi-touch input device 400 and that may be present in the multi-touch input devices depicted in Figs. 1 and 2.
- the exploded view of the multi-touch input device 400 depicts eight light sources (e.g., comprising IR and/or UV LEDs), two of which are depicted at 410.
- the light sources (e.g., 410) are optically coupled to an edge of the interface panel 402, such that light is internally reflected inside the interface panel 402.
- Light from the light sources (e.g., 410) can be optically coupled to the edge of the interface panel 402 using light pipes.
- a different number of light sources (e.g., 410) can be used.
- Fig. 4 illustrates just one example of how the components (including internal components) of the multi-touch input device can be designed and configured.
- Fig. 5 depicts a schematic side elevation view of an example multi-touch input device 500 in use, depicting scattered IR light from a touch.
- IR light is being directed to an edge of the interface panel 502 via one or more IR LEDs 510.
- the reflectivity at the surface of the interface panel 502 is altered, causing some of the IR light to "escape" (to be scattered outward) from the point of contact 530.
- the light that escapes is detected by the digital cameras 516 as a point source of IR light.
- the digital cameras 516 can capture images (or frames) multiple times per second, recording the presence or absence of any point sources during each capture.
- the captured images can be analyzed (e.g., to determine touch events, such as using blob detection) and compared to an interface map to determine the corresponding location on the interface map that has been touched.
- the captured images can also be analyzed to determine other types of touch events, such as a tap, hold (e.g., for repetitive key entry), multiple touches at different locations, movement, etc.
- each touch location (e.g., as determined by a center point of the touch location) can activate a corresponding key or other function according to the interface map and its key and/or touch zone layout.
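The touch-processing flow described above (detect the bright blob of scattered IR light, take its center point, and look up the corresponding key in the interface map) can be sketched as follows. This is an illustrative, non-limiting sketch: the key names, key areas, and brightness threshold are assumptions, not values from the patent.

```python
import numpy as np

# Hypothetical interface-map layout: key name -> (x, y, width, height)
# in camera-image pixels. Names and sizes are illustrative assumptions.
KEY_LAYOUT = {
    "A": (0, 0, 10, 10),
    "S": (10, 0, 10, 10),
}

def touch_centroid(frame, threshold=128):
    """Return the (x, y) centroid of bright (scattered-IR) pixels, or None."""
    ys, xs = np.nonzero(frame > threshold)
    if xs.size == 0:
        return None
    return (xs.mean(), ys.mean())

def key_for_touch(point, layout=KEY_LAYOUT):
    """Map a touch center point to the key whose area contains it."""
    if point is None:
        return None
    x, y = point
    for name, (kx, ky, kw, kh) in layout.items():
        if kx <= x < kx + kw and ky <= y < ky + kh:
            return name
    return None

# A bright blob centered near (14, 5) falls inside the "S" key area.
frame = np.zeros((10, 20), dtype=np.uint8)
frame[4:7, 13:16] = 255  # scattered IR light from a touch
```

In practice a blob-detection step (as mentioned above) would handle multiple simultaneous touches; the single-centroid version here keeps the mapping step clear.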
- a multi-touch input device can include an interface map (e.g., 126 or 226).
- the interface map provides a layout for keys and/or touchpad areas on the interface panel (e.g., 102 or 202) surface.
- the interface map can be a plastic film that is affixed or adhered (e.g., removably affixed or adhered) to the interface panel (e.g., to the top or bottom surface of the interface panel).
- the interface map is a polycarbonate film (e.g., a transparent polycarbonate film).
- the interface map can be removably adhered to the interface panel such that the interface map can be later removed (e.g., and replaced with a different interface map).
- the interface map can be a transparent or semi-transparent sheet or film (e.g., transparent to at least IR wavelengths).
- the interface map provides a layout for touch areas (e.g., keys and/or touchpad zones) on the interface panel.
- the layout can be a pre-defined layout or a custom (e.g., user-defined) layout.
- pre-defined layouts can be provided for standard keyboard keys in a variety of languages.
- the layout can be printed or imprinted (e.g., using various printing technologies, such as solvent inkjet printing, UV inkjet printing, laser printing, etc.) on the interface map.
- the layout of a keyboard key can be depicted on the interface map as an outline of the key area and a symbol indicating the function performed by the key (e.g., the symbol "A" for a key that will enter the letter "A" when touched).
- Figs. 6A and 6B are plan views of example interface maps, 600 and 640.
- the first example interface map 600 in Fig. 6A is an English language keyboard layout (e.g., a pre-defined layout).
- the keyboard layout includes letter keys, number keys, function keys, special keys (e.g., control keys, shift keys, etc.), and a number pad 610.
- the second example interface map 640 is an English language keyboard layout (e.g., a pre-defined layout), which is similar to the first example layout 600, but includes a touchpad zone 650 instead of the number pad 610.
- the second example layout 640 also includes keys which can function as mouse buttons, a right mouse button “RMB” and a left mouse button “LMB” 652, as well as a zoom slider 650.
- the second example interface map 640 can perform the functions of both a keyboard and a mouse.
- the example interface maps 600 and 640 can represent pre-defined layouts. For example, they can be provided as two pre-defined alternative interface maps for a multi-touch input device.
- the example interface maps 600 and 640 can be transparent films (e.g., polycarbonate films) on which the keyboard layouts are printed.
- Fig. 7 is a plan view of an example interface map 700.
- the example interface map 700 represents a mouse or touchpad layout (e.g., a pre-defined layout).
- the mouse/touchpad layout includes a touchpad zone 710 (e.g., for moving a cursor on a computer screen) and eight buttons 720.
- the eight buttons can include mouse buttons (e.g., a right mouse button "RMB” and a left mouse button "LMB”) and/or buttons that perform other functions (e.g., pre-defined or user-assignable functions).
- the layout of the interface map can be defined using layout information.
- the layout information (e.g., size and location of keys and touchpad zones) can be used when creating the interface map (e.g., by printing a layout on plastic film, according to the layout information, to create the interface map) and when processing touches to determine actions to perform (e.g., key presses or touchpad movement).
- the layout information for keyboard keys can include size (e.g., height and width), shape (e.g., square, round, etc.), and/or function information (e.g., a key, symbol, or function corresponding to the key).
- Touchpad zone layout information can include size (e.g., height and width), shape (e.g., square, round, etc.), and/or function information (e.g., an indication of how touch information is to be processed, such as a 1 -dimension slider or a 2-dimension area).
- the layout information can comprise configuration of the touchpad area (e.g., size, shape, and/or type), location of the touch area on the interface map, and the function performed by the touch area when used.
- Table 1 below depicts a simplified example of layout information for a number of keys and a touchpad area (touchpad zone).
- the "type” column indicates whether the element is a key or touchpad area
- the "height x width” column indicates the height and width of the key or touchpad area
- the "location” column indicates an x/y offset from the upper-left corner of the interface map where the key or touchpad area is located
- the "key/function” column indicates how touches within the key or touchpad area are processed.
- the interface map can be a pre-defined interface map.
- a multi-touch input device can be provided with a standard pre-defined interface map (e.g., a standard English-language keyboard interface map), or multiple standard pre-defined interface maps can be provided from which a selection can be made.
- a purchaser of a multi-touch input device may be provided with the option of a standard keyboard layout interface map with a number pad or a standard keyboard layout interface map with a touchpad in place of the number pad.
- One or more standard pre-defined interface maps can be provided with an input device (e.g., pre-attached to the input device or user-attached), or purchased separately.
- the interface map can be a custom or user-defined interface map.
- a custom interface map can be defined (e.g., by a user) using pre-defined elements and/or by defining the layout using individual or custom elements.
- Pre-defined elements can include blocks of keys (e.g., a block of letters, numbers, and associated space bar, control key, etc., a block of arrow keys, a block of function keys, a block of keys for a number pad, etc.) or pre-defined touchpad areas (e.g., a touchpad zone of a pre-determined size for use as a cursor movement or mouse area).
- Defining the layout using individual or custom elements can include selecting individual elements (e.g., specific keys, such as letters, numbers, function keys, arrow keys, etc., including location and/or size) and/or defining custom elements (e.g., defining the custom element by size, shape, function, etc.).
- a custom element can be a key of a specific size, shape, and location that performs a simple or complex function (e.g., a key that types a sequence of letters or that performs one or more actions).
- Interface maps can be created using a web site. For example, a user can create a custom interface map using a web site (e.g., using a graphical user interface to create the layout of the interface map). The user can then order or purchase the custom interface map. The custom interface map can be printed and delivered to the user. The user can then attach the custom interface map to a multi-touch input device. The user can also receive a configuration or data file (e.g., a firmware file) comprising layout information for the custom interface map (e.g., as a downloaded file or on a storage device received with the interface map).
- the user can install (e.g., via a firmware update) the configuration or data file on the multi-touch input device (e.g., via a USB or wireless connection from the user's computer), thus configuring the multi-touch input device to process touch events according to the custom interface map created by the user.
- a custom driver can be provided allowing a computing device (e.g., a computer) to process events from the input device (e.g., events for custom functions, other than standard key and mouse events).
- Interface maps can also be created locally by a user.
- a user can create a custom interface map using software on the user's computer (e.g., a custom interface map design application).
- the user can then order or purchase the custom interface map (e.g., by uploading or sending the custom interface map to the manufacturer or to a third party), or the user can locally print the custom interface map.
- the user can use a variety of printers (e.g., UV printers, inkjet printers, and solvent printers) to print the custom interface map on a sheet or film (e.g., on a polycarbonate film) and attach the custom interface map to a multi-touch input device.
- the user can also generate (e.g., via the software on the user's computer) a configuration or data file (e.g., a firmware file) comprising layout information for the custom interface map.
- the user can install (e.g., via a firmware update) the configuration or data file on the multi-touch input device (e.g., via a USB or wireless connection from the user's computer), thus configuring the multi-touch input device to process touch events according to the custom interface map created by the user (e.g., by detecting one or more touches by a finger or other object and mapping the location(s), according to the custom interface map, to generate key press and/or touchpad movement events and communicate the events to an attached computing device).
- a custom driver can be provided allowing a computing device (e.g., a computer) to process events from the input device (e.g., events for custom functions, other than standard key and mouse events).
- the interface maps can be attached to the input device at a specific location. For example, indicators (e.g., raised bumps, imprinted marks, etched marks, etc.) can be present on the input device interface panel for determining where to attach the interface map. The indicators can be located, for example, at or near one or more corners of the input device interface panel corresponding to one or more corners of the interface map.
- the interface map can be a transparent or semi-transparent film (e.g., a transparent or semi-transparent polycarbonate film).
- the interface map can also be colored (e.g., a transparent film with a colored tint), such as with colors that fluoresce under UV light.
- the layout that is printed on the interface map can also be printed in a variety of colors, including colors that fluoresce under UV light.
- the layout that is printed on the interface map can be printed in an outline format showing the outline of keys and/or touchpad zones, in addition to symbols indicating the function or action performed by the key and/or touchpad zone (e.g., a letter, such as "S,” indicating that the key will type the letter "S" when touched).
- the interface map can be attached or applied (e.g., removably attached or applied) to the top or bottom surface of the interface panel.
- the interface map can be a polycarbonate film coated with a transparent adhesive (e.g., a "static film") allowing the interface map to be attached, and later removed if needed (e.g., to replace the interface map with a new interface map having the same layout or a different layout).
- the interface map can be permanently attached to the top or bottom surface of the interface panel.
- the interface map can be integrated with the interface panel (e.g., printed directly on the surface of the interface panel or etched into the interface panel).
- the interface map may be changed by a user, and a corresponding configuration file provided to load a new interface map.
- an interface map comprising a standard QWERTY keyboard printed on a static film may be applied to an interface panel with a corresponding configuration file, allowing the input device to be used as a Western-alphabet keyboard.
- the user could then remove and replace the interface map with a different interface map having a Chinese-character keyboard layout, load the corresponding configuration file (e.g., as a firmware update to the input device), and use the input device as a Chinese character keyboard.
- the user could also create her own customized interface map and configuration file, for a desired specific functionality, print the interface map on static film, and apply it to the interface panel.
- the interface panel may be made from extremely rugged materials, such as tempered glass, GORILLA GLASS™, acrylic, or other materials which can easily transmit IR wavelengths.
- FIG. 8 is a flowchart of an exemplary method 800 for creating a custom interface map for a frustrated total internal reflection (FTIR) multi-touch input device.
- layout information is received for a custom interface map.
- the layout information defines a plurality of touch areas.
- the touch areas can be keys and/or touchpad zones.
- the layout information can be received from a user using a local software application or from a user using a remote service (e.g., via a web site).
- a layout is generated for the custom interface map according to the received layout information 810.
- the layout can depict outlines and/or symbols for the various touch areas (e.g., keys and touchpad zones) defined by the layout information.
- a configuration file is generated from the received layout information 810.
- the configuration file is loadable on the FTIR multi-touch input device to configure the FTIR multi-touch input device to use the custom interface map.
- a custom driver can be created to support input device functions (e.g., functions that use custom processing) on an associated computing device.
- the configuration file is output.
- the configuration file can be stored on a server computer or on a local computer.
- the configuration file can be delivered (e.g., separately or with the custom interface map) to a user to install on the multi-touch input device (e.g., via a USB or wireless firmware update of the multi-touch input device).
- a custom driver can be provided for storage and/or installation.
- the custom interface map can be created by printing the generated layout 820 on a transparent film.
- the interface map can be printed, for example, at a manufacturer, third party, or locally by a user.
- an online web service can provide a design tool for receiving the layout information 810 from a user.
- the online web service can generate the layout 820, generate the configuration file 830, and output the configuration file (e.g., store the configuration file and/or send the configuration file to the user).
- the online web service can provide the layout for printing on a transparent film (e.g., at a manufacturer of the multi-touch input device or a third party printing service) and delivery to the user.
- a user can also print the custom interface map locally.
- the user can use local software (e.g., running on the user's computer) or a remote service to design the custom interface map and print the layout on transparent film using a local printer.
- the user can also generate or download a corresponding configuration file for the custom interface map and install it on the multi-touch input device along with the transparent film with the printed custom layout (the custom interface map).
- the user can also generate or download a corresponding custom driver to install on the user's computing device to support the input device with the custom layout.
- Sterilization technology can be used with any of the multi-touch input devices described herein.
- specific wavelengths of UV light can be used to sterilize the surface of the multi-touch input device (e.g., to sterilize the interface panel, including the interface map, of the multi-touch input device).
- the ultraviolet light is emitted at wavelengths of approximately 265 nm to 280 nm.
- UV light can be applied to the interface panel of a multi-touch input device.
- UV LEDs can be used to direct UV light into the edge of the interface panel.
- the UV light can provide a sterilization effect to the multi-touch input device from inside the interface panel.
- UV light can be directed to the outside (e.g., top or bottom) surface of the interface panel (e.g., separately or in combination with internally-directed UV light).
- the light source (e.g., of multi-touch input devices depicted in Figs. 1 through 5) can comprise UV LEDs (e.g., in addition to IR LEDs).
- the UV LEDs can be enabled when a UV sterilization effect is desired.
- the UV LEDs can be enabled at pre-defined times (e.g., on a schedule) or at user-defined times (e.g., on a user-defined schedule or manually enabled).
- the UV LEDs can be enabled for a specific duration sufficient to provide a sterilization effect to the multi-touch input device (e.g., a number of seconds or minutes, such as 2-3 minutes).
- the UV LEDs can be enabled during a cleaning cycle of the multi-touch input device.
- the cleaning cycle can be enabled according to a schedule (e.g., once a day when the input device is not in use, such as during the night).
- the cleaning cycle can be enabled when the input device is in a sleep mode (e.g., when the input device has not been used for an amount of time, such as a number of minutes or hours and/or when a computing device connected to the input device has not been used for an amount of time).
- the cleaning cycle can be enabled during a period of inactivity according to an inactivity timer (e.g., when the input device has been inactive for a number of minutes or hours).
- the interface panel of the multi-touch input device can be made from a material (e.g., specific types of glass or glass blend) that does not attenuate (or does not significantly attenuate) the specific UV wavelengths used.
- the UV cleaning cycle can be user-initiated.
- a user can manually initiate a UV cleaning (e.g., a 2-3 minute UV cleaning cycle).
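The cleaning-cycle triggers described above (a schedule, a sleep mode, an inactivity timer, or manual initiation) can be sketched as a small scheduler. This is an illustrative sketch only; the 30-minute inactivity threshold is an assumption, while the 2-3 minute cycle duration follows the example given above.

```python
import time

class CleaningScheduler:
    """Decides when to run a UV sterilization cycle based on inactivity.

    The inactivity threshold is an illustrative assumption; the cycle
    duration reflects the 2-3 minute example mentioned in the text.
    """

    def __init__(self, inactivity_threshold_s=30 * 60, cycle_duration_s=150):
        self.inactivity_threshold_s = inactivity_threshold_s
        self.cycle_duration_s = cycle_duration_s
        self.last_activity = time.monotonic()

    def record_touch(self):
        # Any touch event resets the inactivity timer.
        self.last_activity = time.monotonic()

    def should_start_cycle(self, now=None):
        """True when the device has been idle long enough to run UV LEDs."""
        now = time.monotonic() if now is None else now
        return (now - self.last_activity) >= self.inactivity_threshold_s

# Demonstration with an explicit clock so the behavior is deterministic.
sched = CleaningScheduler(inactivity_threshold_s=10)
sched.last_activity = 0.0
```

A manual "clean now" button would simply bypass `should_start_cycle` and enable the UV LEDs for `cycle_duration_s` directly.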
- Infrared interference can be removed from digital images captured using any of the multi-touch input devices described herein.
- a first digital image can be captured with an infrared light source of a multi-touch input device turned off, and a second digital image can be captured with the infrared light source turned on. Because the first digital image is captured with the infrared light source turned off, any infrared light detected in the first digital image will be the result of an infrared light source other than the infrared light source of the multi-touch input device.
- Such infrared interference present in the first digital image can be removed from the second digital image.
- various image processing techniques can be applied to filter out or remove (e.g., subtract) the infrared interference during image processing (e.g., to produce a processed image with the infrared interference removed).
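The subtraction technique described above (an IR-off frame capturing only interference, subtracted from an IR-on frame capturing interference plus touches) can be sketched with NumPy. The frame size and pixel values are illustrative assumptions.

```python
import numpy as np

def remove_ir_interference(ir_on, ir_off):
    """Subtract the IR-off frame (interference only) from the IR-on frame.

    Both frames are grayscale arrays from the same camera; widening to a
    signed type and clipping at zero prevents underflow where a pixel is
    brighter in the off frame than in the on frame.
    """
    diff = ir_on.astype(np.int16) - ir_off.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)

# IR-off frame: interference at two spots; IR-on frame: same spots plus a touch.
ir_off = np.zeros((8, 8), dtype=np.uint8)
ir_off[1, 1] = 200
ir_off[2, 6] = 180
ir_on = ir_off.copy()
ir_on[5, 4] = 220  # scattered IR light from an actual touch
clean = remove_ir_interference(ir_on, ir_off)
```

In `clean`, only the touch at (5, 4) remains bright; the two interference spots are removed and cannot be misread as touch events.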
- Infrared interference can interfere with the proper determination of touch events on an FTIR multi-touch input device.
- for example, an external infrared light source, such as an IR remote control, could cause the input device to incorrectly determine that a touch event has occurred and activate a corresponding keyboard key.
- Figs. 9 A, 9B, and 9C are diagrams depicting example digital images, from which infrared interference can be detected and removed.
- the first digital image 900 in Fig. 9A represents a digital image that is captured by a digital camera of a multi- touch input device when an infrared light source of the multi-touch input device is turned off. Because the infrared light source of the input device is turned off, any detected infrared light will be from a source other than the infrared light source of the input device. In addition, even if an interface panel of the input device is being touched (e.g., by a person's finger), the touch location will not be detected because the infrared light source of the input device is off.
- the first digital image 900 depicts two locations where infrared light is detected, 902 and 904.
- the two locations of infrared light, 902 and 904, detected when the infrared light source is turned off are two locations of infrared interference.
- the infrared interference could be generated by an IR remote control.
- the two locations of infrared light, 902 and 904, are considered to be infrared interference because they could be incorrectly interpreted as touch events.
- the second digital image 920 in Fig. 9B represents a digital image that is captured by a digital camera of a multi-touch input device when an infrared light source of the multi-touch input device is turned on. Because the infrared light source of the input device is turned on, infrared light will be detected from touch events (e.g., scattered infrared light from a person touching an interface panel of the input device) as well as any from any other infrared light source. Therefore, the second digital image 920 will depict actual touch events in addition to any infrared interference.
- the second digital image 920 depicts three locations where infrared light is detected, 922, 924, and 926.
- the two locations of infrared light, 922 and 924, correspond to the two locations of infrared light, 902 and 904, detected in image 900.
- An additional location of infrared light 926 is also depicted in the second digital image 920.
- the third digital image 940 in Fig. 9C represents a digital image that can be generated using the first digital image 900 and the second digital image 920.
- the third digital image 940 depicts detected infrared light with infrared interference removed.
- the third digital image 940 depicts one location of detected infrared light 946, which can be determined to be a touch event (e.g., a valid touch event).
- the third digital image 940 can be a new digital image that is created by taking the second digital image 920 and subtracting the first digital image 900.
- the third digital image 940 can also represent a modified version of the second digital image 920 (e.g., modified by removing the locations of detected infrared light present in the first digital image 900 from the second digital image 920).
- in a first image processing technique, the second digital image is modified to filter out (e.g., subtract) the infrared interference present in the first digital image.
- the result of the first image processing technique is a modified second digital image.
- in a second image processing technique, a new digital image is created by subtracting the first digital image from the second digital image.
- the result of the second image processing technique is the new digital image, which depicts detected infrared light present in the second digital image, but not the first digital image.
- in a third image processing technique, locations of detected infrared light are determined from the first digital image and from the second digital image. For example, the locations can be determined based on size, shape, position, and/or intensity. Specific locations of detected infrared light that are present in both the first and second digital images can then be removed (e.g., they can be discarded from consideration as valid touch events).
- the result of the third image processing technique is the set of specific locations of detected infrared light that are present in the second digital image and that do not have any corresponding locations in the first digital image (e.g., corresponding locations 902 and 922, and 904 and 924, can be removed, leaving location 926 as a valid touch event).
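The location-matching technique described above (discard any IR-on location that has a counterpart in the IR-off image) can be sketched as follows. The coordinates mirror the Figs. 9A-9C example; the matching tolerance is an illustrative assumption.

```python
def filter_interference_locations(on_locs, off_locs, tolerance=2.0):
    """Keep IR-on locations that have no counterpart in the IR-off image.

    Each location is an (x, y) center point of detected infrared light.
    An on-image location is discarded as interference when any off-image
    location lies within `tolerance` pixels of it.
    """
    valid = []
    for ox, oy in on_locs:
        matched = any((ox - fx) ** 2 + (oy - fy) ** 2 <= tolerance ** 2
                      for fx, fy in off_locs)
        if not matched:
            valid.append((ox, oy))
    return valid

# Mirroring Figs. 9A-9C: 902/904 match 922/924, leaving 926 as the touch.
off_locs = [(10, 10), (40, 12)]            # interference (902, 904)
on_locs = [(10, 11), (40, 12), (25, 30)]   # 922, 924, and the touch 926
```

The tolerance allows for small frame-to-frame drift in an interference source's apparent position between the IR-off and IR-on captures.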
- some digital images can be captured by an FTIR multi-touch input device when the infrared light source (e.g., one or more IR LEDs) of the FTIR multi-touch input device is on, and other digital images can be captured when the infrared light source is off.
- digital images can be captured at a first rate (e.g., a predetermined number of images per second) and the infrared light source can be configured to flicker at a second rate. By setting the first rate to be greater than the second rate, some of the digital images will be captured when the infrared light source is off.
- the digital images that are captured when the infrared light source is off can show infrared interference and can be used to process digital images taken when the infrared light source is on to remove the infrared interference.
- the infrared light source is configured to switch between on and off between each successive digital image. For example, a first digital image can be captured with the IR light source on, the next digital image can be captured with the IR light source off, the next digital image can be captured with the IR light source on, and so on. For example, if the rate of digital image capture is 30 images per second, then the cycling rate of the IR light source can be set to 15 cycles per second, such that every other image is captured with the IR light source on. Alternatively, other rates of capture and IR cycling can be used (e.g., image capture at 50 images per second with IR light cycling at 25 cycles per second).
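The alternating capture scheme described above (e.g., 30 images per second with the IR source cycling at 15 cycles per second, so every other image is captured with the source off) yields interleaved off/on frames. A sketch of pairing them for interference removal, assuming capture begins with the source off:

```python
def deinterleave_frames(frames):
    """Pair alternating frames: even indices IR-off, odd indices IR-on.

    Assumes the capture sequence begins with the IR light source off and
    the source toggles between every successive frame (e.g., 30 frames/s
    with the source cycling at 15 cycles/s). Yields (ir_off, ir_on) pairs
    that can be fed to an interference-removal step.
    """
    for i in range(0, len(frames) - 1, 2):
        yield frames[i], frames[i + 1]

# Placeholder frame labels stand in for captured images.
frames = ["off0", "on0", "off1", "on1", "off2", "on2"]
pairs = list(deinterleave_frames(frames))
```

Each yielded pair provides a nearly simultaneous interference-only frame and touch frame, which is what the subtraction and location-matching techniques above require.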
- Fig. 10 depicts an example method 1000 for removing infrared interference.
- a first digital image is captured with an infrared light source turned off.
- a second digital image is captured with the infrared light source turned on.
- the second digital image can be the next image captured after the first digital image.
- the first and second digital images can be captured by a digital camera of a FTIR multi-touch input device.
- the infrared light source can be configured to emit infrared light within an interface panel of the FTIR multi-touch input device.
- the first and second digital images can be processed to remove infrared interference.
- infrared interference from the first captured digital image 1010 can be subtracted from the second captured digital image 1020.
- the second digital image is filtered, based at least in part upon the first digital image, to subtract infrared interference present in the first digital image from the second digital image.
- the first digital image is analyzed to determine whether infrared interference is present (e.g., whether the size, shape, and/or position of detected infrared light is sufficient to be capable of interfering with the detection of touch events).
- the second digital image can be filtered to remove the infrared interference.
- a new digital image can be created by removing (e.g., subtracting) the infrared interference present in the first digital image from corresponding infrared interference present in the second digital image.
- infrared interference is detected in digital images captured when the infrared light source is off. Compensation for the infrared interference is then performed for digital images captured when the infrared light source is on. For example, the digital images captured when the infrared light source is on can be filtered to remove the infrared interference, filtering can be performed when (e.g., only when) infrared interference is present in a corresponding digital image captured when the infrared light source is off, and/or processed digital images can be generated by subtracting the infrared interference.
- Contamination can be removed from digital images captured using any of the multi-touch input devices described herein.
- a digital image can be captured with an infrared light source of a multi-touch input device turned on and when an interface panel of the multi-touch input device is not being touched.
- the digital image can be captured, for example, when the input device is turned on (e.g., as part of an initialization or wake-up process).
- the digital image can be processed to determine whether any contamination is present. For example, any infrared light detected in the digital image can be treated as contamination. Alternatively, the digital image can be compared to a reference image (e.g., a digital image captured when the input device was manufactured or during a setup process) and any difference can be treated as contamination. Contamination information can be determined from the digital image (e.g., information indicating size, shape, intensity, and/or position of infrared light corresponding to instances of contamination). One or more subsequent images can then be filtered to remove detected infrared light resulting from the contamination (e.g., to subtract instances of contamination, using the contamination information, from the subsequent images).
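The reference-image comparison above can be sketched as follows, assuming 8-bit grayscale frames. The function names and threshold are illustrative assumptions:

```python
import numpy as np

# Assumed minimum brightness difference from the reference image
# for a pixel to be treated as contamination (illustrative value).
DETECTION_THRESHOLD = 10

def contamination_mask(no_touch_image, reference_image):
    """Compare a no-touch frame against a reference image; pixels that are
    brighter by more than the threshold are treated as contamination
    (infrared light scattered by debris)."""
    diff = no_touch_image.astype(np.int32) - reference_image.astype(np.int32)
    return diff > DETECTION_THRESHOLD

def remove_contamination(frame, no_touch_image, mask):
    """Subtract the recorded contamination intensity, at masked positions
    only, from a subsequent frame."""
    out = frame.astype(np.int32).copy()
    out[mask] -= no_touch_image.astype(np.int32)[mask]
    return np.clip(out, 0, 255).astype(np.uint8)
```

The mask plays the role of the contamination information (size, shape, and position), while the no-touch image supplies the intensity to subtract.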
- a first digital image can be captured with an infrared light source of a multi-touch input device turned on and when an interface panel of the multi-touch input device is not being touched.
- the first image can represent a "default" or "reference" image (e.g., an image captured when the device is manufactured, first activated, or at a later time such as during a setup process).
- a second digital image can be captured at a later time with an infrared light source of a multi-touch input device turned on and when the interface panel of the multi-touch input device is not being touched.
- the second digital image can be captured, for example, when the input device is turned on or during an automatic or user-initiated contamination detection operation.
- the first and second digital images can be processed to determine whether any contamination is present in the second digital image in comparison to the first digital image.
- contamination information (e.g., a contamination mask) can be determined that indicates the size, shape, intensity, and/or position of infrared light created by the contamination.
- One or more subsequent images can then be filtered, using the contamination information, to remove detected infrared light resulting from the contamination.
- contamination (e.g., dust, dirt, grease, food crumbs, coffee, and other types of debris) can interfere with the proper determination of touch events on an FTIR multi-touch input device.
- debris can scatter infrared light being reflected within the interface panel of the input device, and the scattered light can be detected by the digital cameras.
- the detected infrared light from such contamination may be difficult to distinguish from touch events.
- the input device could determine that a touch event has occurred, and activate a corresponding keyboard key.
- the first digital image 1100 in Fig. 11A represents a digital image that is captured by a digital camera of a multi-touch input device when an infrared light source of the multi-touch input device is turned on, when the multi-touch input device is not being touched, and when an interface map is applied to an interface panel of the multi-touch input device.
- the first digital image 1100 can be captured, for example, during a startup process of the multi-touch input device (e.g., when the multi-touch input device is being powered on or waking from a sleep mode).
- the first digital image 1100 can also be captured, for example, during a contamination detection operation (e.g., an automatic or manual contamination detection operation).
- the first digital image 1100 depicts infrared light that is scattered from any contamination present on the interface panel of the multi-touch input device. Specifically, there are three locations of scattered infrared light, 1102, 1104, and 1106, from contamination depicted in the first digital image 1100.
- the first digital image 1100 also depicts a faint outline of scattered infrared light from the interface map. Depending on the type of interface map used, infrared light may or may not be scattered by the interface map. Depending on implementation details, scattered infrared light from the interface map may or may not be processed as contamination and removed from subsequent images.
- the second digital image 1120 in Fig. 11B represents a digital image that is captured after the first digital image 1100.
- the second digital image 1120 can be one of a number of digital images captured during use of the multi-touch input device (e.g., when the multi-touch input device is being touched).
- the second digital image 1120 depicts infrared light scattered from three locations of contamination (1122, 1124, and 1126) which correspond to the three locations detected in the first digital image (1102, 1104, and 1106).
- the second digital image also depicts an additional location of scattered infrared light 1128.
- Contamination information can be determined, at least in part, using the first digital image 1100. For example, the three locations of contamination (1102, 1104, and 1106) can be identified. The contamination information can be used to filter subsequent images. For example, the corresponding locations (1122, 1124, and 1126) in the second digital image 1120 can be filtered (e.g., removed or subtracted). Remaining locations (e.g., location 1128) can be identified as touch locations (e.g., as valid touch locations).
- the third digital image 1140 in Fig. 11C represents a processed image in which the contamination present in the first digital image 1100, in addition to the interface map outline, has been removed from the second digital image 1120.
- the processed digital image 1140 depicts a touch location 1128 that remains after the contamination (including the interface map outline) has been removed.
- the remaining touch location 1128 can be identified as a valid touch location and processing of the touch location can be performed (e.g., a button press, touchpad movement, etc.).
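The Fig. 11 example above can be sketched as follows: after known contamination locations are masked out of a captured frame, any remaining bright region is identified as a valid touch location. Names and the threshold are illustrative assumptions:

```python
import numpy as np

# Assumed minimum pixel intensity for scattered IR light to count
# as a candidate touch (illustrative value).
TOUCH_THRESHOLD = 50

def valid_touch_locations(frame, contamination_mask):
    """Return (row, col) coordinates of bright pixels that do not
    coincide with known contamination; these are treated as valid
    touch locations (like location 1128 in Fig. 11C)."""
    bright = frame > TOUCH_THRESHOLD
    remaining = bright & ~contamination_mask
    return [(int(r), int(c)) for r, c in np.argwhere(remaining)]
```

Bright pixels at masked positions (like locations 1122, 1124, and 1126) are discarded; only the remainder is passed on for touch processing.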
- Fig. 12 depicts a flowchart for an example method 1200 for removing contamination from digital images.
- a first digital image is captured when an infrared light source of an FTIR multi-touch input device is turned on and when an interface panel of the input device is not being touched.
- contamination information is determined based at least in part upon the first digital image.
- a reference digital image is captured prior to capturing the first digital image 1210 (e.g., during manufacturing). The reference digital image can be used, for example, in determining contamination information 1220 (e.g., the reference digital image can be used to distinguish between scattered infrared light from an interface map and scattered infrared light from other sources).
- a plurality of additional digital images are captured after the first digital image.
- the plurality of additional digital images are captured when the infrared light source is turned on.
- the plurality of additional digital images can be captured when the input device is being touched (e.g., when the input device is being used).
- the plurality of additional digital images are processed based at least in part upon the contamination information to remove contamination from the plurality of additional digital images.
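The steps of example method 1200 can be sketched end to end as follows. All names and the threshold are illustrative assumptions, not from the source:

```python
import numpy as np

# Assumed intensity above which infrared light in the no-touch image
# is treated as contamination (illustrative value).
THRESHOLD = 10

def determine_contamination_info(no_touch_image):
    """Step 1220: treat any IR light in the no-touch image as contamination,
    recording where it is and how intense it is."""
    mask = no_touch_image > THRESHOLD
    return {"mask": mask, "intensity": np.where(mask, no_touch_image, 0)}

def process_frames(frames, info):
    """Steps 1230-1240: subtract the recorded contamination from each of
    the additional digital images."""
    cleaned = []
    for frame in frames:
        out = frame.astype(np.int32) - info["intensity"].astype(np.int32)
        cleaned.append(np.clip(out, 0, 255).astype(np.uint8))
    return cleaned
```

A reference image captured at manufacturing time could be subtracted from the no-touch image first, so that scattered light from an interface map is not misclassified as contamination.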
- touch events can be determined using an FTIR multi-touch input device.
- it can be desirable to distinguish between different types of events including touch (single, multiple, and/or simultaneous multiple), rest, and/or movement events.
- during a rest event, for example, users can rest their fingers on a keyboard input device in the home position in preparation for the next keystroke.
- touch events can be distinguished from rest events.
- a touch is distinguished from a rest based at least in part upon duration.
- a touch that lasts for more than one-half second can be determined to be a rest (e.g., and no key activation or touchpad movement events can be performed when a rest is detected). Additional criteria can be used to distinguish a rest from a touch. For example, movement criteria can be used in combination with duration to distinguish between a touch and a rest. In a specific implementation, a touch that lasts for more than one-half second and that moves less than 2 mm (e.g., as determined by movement of the center of the touch area) is considered to be a rest.
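The duration-plus-movement criteria above can be sketched as a small classifier. The function name is an assumption; the 0.5 s and 2 mm values come from the specific implementation mentioned in the text:

```python
import math

REST_MIN_DURATION_S = 0.5   # contacts longer than this may be rests
REST_MAX_MOVEMENT_MM = 2.0  # a rest's center must move less than this

def classify(duration_s, start_center_mm, end_center_mm):
    """Classify a contact as 'rest' or 'touch' using its duration and
    the movement of the center of the touch area."""
    movement = math.dist(start_center_mm, end_center_mm)
    if duration_s > REST_MIN_DURATION_S and movement < REST_MAX_MOVEMENT_MM:
        return "rest"
    return "touch"
```

A contact classified as a rest would trigger no key activation or touchpad movement events.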
- FIG. 13 depicts a generalized example of a suitable computing system 1300 in which the described innovations may be implemented.
- the computing system 1300 is not intended to suggest any limitation as to scope of use or functionality, as the innovations may be implemented in diverse general-purpose or special-purpose computing systems.
- the computing system 1300 includes one or more processing units 1310, 1315 and memory 1320, 1325.
- this basic configuration 1330 (the processing units 1310, 1315 and memory 1320, 1325) is included within a dashed line.
- a processing unit can be a general-purpose central processing unit (CPU), a processor in an application-specific integrated circuit (ASIC), or any other type of processor.
- FIG. 13 shows a central processing unit 1310 as well as a graphics processing unit or co-processing unit 1315.
- the tangible memory 1320, 1325 may be volatile memory (e.g., registers, cache, RAM), nonvolatile memory (e.g., ROM, EEPROM, flash memory, etc.), or some combination of the two, accessible by the processing unit(s).
- the memory 1320, 1325 stores software 1380 implementing one or more innovations described herein, in the form of computer-executable instructions suitable for execution by the processing unit(s).
- a computing system may have additional features.
- the computing system 1300 includes storage 1340, one or more input devices 1350, one or more output devices 1360, and one or more communication connections 1370.
- An interconnection mechanism such as a bus, controller, or network interconnects the components of the computing system 1300.
- operating system software provides an operating environment for other software executing in the computing system 1300, and coordinates activities of the components of the computing system 1300.
- the tangible storage 1340 may be removable or non-removable, and includes magnetic disks, magnetic tapes or cassettes, CD-ROMs, DVDs, or any other medium which can be used to store information in a non-transitory way and which can be accessed within the computing system 1300.
- the storage 1340 stores instructions for the software 1380 implementing one or more innovations described herein.
- the input device(s) 1350 may be a touch input device such as a keyboard, mouse, pen, or trackball, a voice input device, a scanning device, or another device that provides input to the computing system 1300.
- the input device(s) 1350 may be a camera, video card, TV tuner card, or similar device that accepts video input in analog or digital form, or a CD-ROM or CD-RW that reads video samples into the computing system 1300.
- the output device(s) 1360 may be a display, printer, speaker, CD-writer, or another device that provides output from the computing system 1300.
- the communication connection(s) 1370 enable communication over a communication medium to another computing entity.
- the communication medium conveys information such as computer-executable instructions, audio or video input or output, or other data in a modulated data signal.
- a modulated data signal is a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media can use an electrical, optical, RF, or other carrier.
- program modules include routines, programs, libraries, objects, classes, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
- the functionality of the program modules may be combined or split between program modules as desired in various embodiments.
- Computer-executable instructions for program modules may be executed within a local or distributed computing system.
- The terms "system" and "device" are used interchangeably herein. Unless the context clearly indicates otherwise, neither term implies any limitation on a type of computing system or computing device. In general, a computing system or computing device can be local or distributed, and can include any combination of special-purpose hardware and/or general-purpose hardware with software implementing the functionality described herein. [0127] For the sake of presentation, the detailed description uses terms like
- Computer-readable storage media are any available tangible media that can be accessed within a computing environment (e.g., non-transitory computer-readable media, such as one or more optical media discs such as DVD or CD, volatile memory components (such as DRAM or SRAM), or nonvolatile memory components (such as flash memory or hard drives)).
- computer-readable storage media include memory 1320 and 1325, and storage 1340.
- the term computer-readable storage media does not include communication connections (e.g., 1370) such as modulated data signals.
- Any of the computer-executable instructions for implementing the disclosed techniques as well as any data created and used during implementation of the disclosed embodiments can be stored on one or more computer-readable storage media (e.g., non-transitory computer-readable media).
- the computer-executable instructions can be part of, for example, a dedicated software application or a software application that is accessed or downloaded via a web browser or other software application (such as a remote computing application).
- Such software can be executed, for example, on a single local computer (e.g., any suitable
- any of the software-based embodiments (comprising, for example, computer-executable instructions for causing a computer to perform any of the disclosed methods) can be uploaded, downloaded, or remotely accessed through a suitable communication means.
- suitable communication means include, for example, the Internet, the World Wide Web, an intranet, software applications, cable (including fiber optic cable), magnetic communications, electromagnetic communications (including RF, microwave, and infrared communications), electronic communications, or other such communication means.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Position Input By Displaying (AREA)
Abstract
According to the invention, compensation can be performed for interference and/or contamination for multi-touch input devices that use frustrated total internal reflection (FTIR) techniques. Interference, such as infrared interference, can be compensated for by capturing a first digital image with an infrared light source of the input device turned off, capturing a second digital image with the infrared light source turned on, and processing the first and second digital images to compensate for (e.g., remove, filter, and/or subtract) the infrared interference. Contamination, such as dirt, dust, and other types of debris, can be compensated for by capturing a first digital image when the interface panel of the input device is not being touched, and capturing additional digital images during use of the input device. The additional digital images can be processed to compensate for contamination using the first digital image.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201161565494P | 2011-11-30 | 2011-11-30 | |
| US61/565,494 | 2011-11-30 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2013081671A1 true WO2013081671A1 (fr) | 2013-06-06 |
Family
ID=48535928
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2012/044056 Ceased WO2013081672A1 (fr) | 2011-11-30 | 2012-06-25 | Dispositif d'entrée tactile multipoint |
| PCT/US2012/044055 Ceased WO2013081671A1 (fr) | 2011-11-30 | 2012-06-25 | Compensation de brouillage et de contamination pour des dispositifs d'entrée multipoint |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2012/044056 Ceased WO2013081672A1 (fr) | 2011-11-30 | 2012-06-25 | Dispositif d'entrée tactile multipoint |
Country Status (1)
| Country | Link |
|---|---|
| WO (2) | WO2013081672A1 (fr) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2016018416A1 (fr) * | 2014-07-31 | 2016-02-04 | Hewlett-Packard Development Company, L.P. | Détermination de l'emplacement d'un dispositif d'entrée d'utilisateur |
| US9553579B2 (en) | 2014-12-10 | 2017-01-24 | Pr Electronics A/S | Optical keypad for explosive locations |
| CN117422854A (zh) * | 2023-05-18 | 2024-01-19 | 北京科加触控技术有限公司 | 光学触摸图像优化方法、装置、光学触摸设备和存储介质 |
| US12488552B2 (en) | 2023-05-18 | 2025-12-02 | Koga Touch Co., Ltd | Method, apparatus, optical touch device, and storage medium for optical touch image optimization |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9233179B2 (en) | 2013-10-01 | 2016-01-12 | Vioguard LLC | Touchscreen sanitizing system |
| KR102730129B1 (ko) * | 2019-08-22 | 2024-11-15 | 삼성전자주식회사 | 디스플레이 장치 및 그 제어 방법 |
| GB2602336B (en) * | 2020-12-23 | 2024-08-14 | Uniphy Ltd | Optical touch screen |
| US11679171B2 (en) | 2021-06-08 | 2023-06-20 | Steribin, LLC | Apparatus and method for disinfecting substances as they pass through a pipe |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5196835A (en) * | 1988-09-30 | 1993-03-23 | International Business Machines Corporation | Laser touch panel reflective surface aberration cancelling |
| US20110109594A1 (en) * | 2009-11-06 | 2011-05-12 | Beth Marcus | Touch screen overlay for mobile devices to facilitate accuracy and speed of data entry |
| US8004502B2 (en) * | 2007-10-05 | 2011-08-23 | Microsoft Corporation | Correcting for ambient light in an optical touch-sensitive device |
| US20110216042A1 (en) * | 2008-11-12 | 2011-09-08 | Flatfrog Laboratories Ab | Integrated touch-sensing display apparatus and method of operating the same |
| US8026904B2 (en) * | 2007-01-03 | 2011-09-27 | Apple Inc. | Periodic sensor panel baseline adjustment |
Family Cites Families (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH07117868B2 (ja) * | 1991-04-30 | 1995-12-18 | インターナショナル・ビジネス・マシーンズ・コーポレイション | タツチ型作動キーボード定義方法及び装置 |
| US6776546B2 (en) * | 2002-06-21 | 2004-08-17 | Microsoft Corporation | Method and system for using a keyboard overlay with a touch-sensitive display screen |
| US7176905B2 (en) * | 2003-02-19 | 2007-02-13 | Agilent Technologies, Inc. | Electronic device having an image-based data input system |
| US7403191B2 (en) * | 2004-01-28 | 2008-07-22 | Microsoft Corporation | Tactile overlay for an imaging display |
| US8109981B2 (en) * | 2005-01-25 | 2012-02-07 | Valam Corporation | Optical therapies and devices |
| US20100066690A1 (en) * | 2008-05-17 | 2010-03-18 | Darin Beamish | Digitizing tablet devices, methods and systems |
| US9524047B2 (en) * | 2009-01-27 | 2016-12-20 | Disney Enterprises, Inc. | Multi-touch detection system using a touch pane and light receiver |
| US8597569B2 (en) * | 2010-04-19 | 2013-12-03 | Microsoft Corporation, LLC | Self-sterilizing user input device |
2012
- 2012-06-25 WO PCT/US2012/044056 patent/WO2013081672A1/fr not_active Ceased
- 2012-06-25 WO PCT/US2012/044055 patent/WO2013081671A1/fr not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5196835A (en) * | 1988-09-30 | 1993-03-23 | International Business Machines Corporation | Laser touch panel reflective surface aberration cancelling |
| US8026904B2 (en) * | 2007-01-03 | 2011-09-27 | Apple Inc. | Periodic sensor panel baseline adjustment |
| US8004502B2 (en) * | 2007-10-05 | 2011-08-23 | Microsoft Corporation | Correcting for ambient light in an optical touch-sensitive device |
| US20110216042A1 (en) * | 2008-11-12 | 2011-09-08 | Flatfrog Laboratories Ab | Integrated touch-sensing display apparatus and method of operating the same |
| US20110109594A1 (en) * | 2009-11-06 | 2011-05-12 | Beth Marcus | Touch screen overlay for mobile devices to facilitate accuracy and speed of data entry |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2016018416A1 (fr) * | 2014-07-31 | 2016-02-04 | Hewlett-Packard Development Company, L.P. | Détermination de l'emplacement d'un dispositif d'entrée d'utilisateur |
| US11460956B2 (en) | 2014-07-31 | 2022-10-04 | Hewlett-Packard Development Company, L.P. | Determining the location of a user input device |
| US9553579B2 (en) | 2014-12-10 | 2017-01-24 | Pr Electronics A/S | Optical keypad for explosive locations |
| CN117422854A (zh) * | 2023-05-18 | 2024-01-19 | 北京科加触控技术有限公司 | 光学触摸图像优化方法、装置、光学触摸设备和存储介质 |
| US12488552B2 (en) | 2023-05-18 | 2025-12-02 | Koga Touch Co., Ltd | Method, apparatus, optical touch device, and storage medium for optical touch image optimization |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2013081672A1 (fr) | 2013-06-06 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5411265B2 (ja) | ペントラッキングを組み込んだマルチタッチ式タッチスクリーン | |
| JP5346081B2 (ja) | ペントラッキングを組み込んだマルチタッチ式タッチスクリーン | |
| WO2013081671A1 (fr) | Compensation de brouillage et de contamination pour des dispositifs d'entrée multipoint | |
| US8432372B2 (en) | User input using proximity sensing | |
| US8581852B2 (en) | Fingertip detection for camera based multi-touch systems | |
| JP5154446B2 (ja) | 対話型入力システム | |
| Hodges et al. | ThinSight: versatile multi-touch sensing for thin form-factor displays | |
| US20090231281A1 (en) | Multi-touch virtual keyboard | |
| US8775971B2 (en) | Touch display scroll control | |
| US8619027B2 (en) | Interactive input system and tool tray therefor | |
| EP2107446A1 (fr) | Système et procédé de suivi des dispositifs de saisie sur des affichages LC | |
| US20100245263A1 (en) | Digital picture frame having near-touch and true-touch | |
| AU2010218345A1 (en) | Dynamic rear-projected user interface | |
| US10042464B2 (en) | Display apparatus including touchscreen device for detecting proximity touch and method for controlling the same | |
| TW200846996A (en) | Touch sensing using shadow and reflective modes | |
| WO2007024163A1 (fr) | Pointage et écriture manuscrite d'espace libre | |
| WO2021158167A1 (fr) | Système d'interaction de réunion | |
| US20090096751A1 (en) | Projected cleanable keyboard | |
| US20110242005A1 (en) | Interactive input device with palm reject capabilities | |
| JP2009134444A (ja) | 光学式タッチパネル入力装置 | |
| US9557905B2 (en) | System and method for user input | |
| KR102101565B1 (ko) | 미디어 안내장치 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12852763 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 12852763 Country of ref document: EP Kind code of ref document: A1 |