US20120327035A1 - Optical touch system and image processing method thereof - Google Patents
Optical touch system and image processing method thereof
- Publication number: US20120327035A1
- Application number: US13/495,712
- Authority
- US
- United States
- Prior art keywords
- intensity
- pixel groups
- value
- pixel group
- pixel
- Prior art date: 2011-06-21
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
Abstract
An optical touch system includes an image sensor module and a processor. The image sensor module includes a plurality of image sensing elements that can be independently controlled to achieve different exposure times. The plurality of image sensing elements can produce a picture including a plurality of pixel groups. The processor is configured to extract an intensity value of each pixel group and to select a set of successive pixel groups as an object image according to the intensity values of the pixel groups.
Description
- The present application is based on, and claims priority from, Taiwan Patent Application Serial Number 100121547, filed on Jun. 21, 2011, the disclosure of which is hereby incorporated by reference herein in its entirety.
- 1. Technical Field
- The present invention relates to an optical touch system and an image processing method thereof.
- 2. Related Art
- An optical touch system generally comprises an imaging device, an illuminating device, and a computing device for determining the touch position of an object. Typically, the object may be a finger, a stylus, or the like. The imaging device is configured to view a touch zone above a touch surface. The illuminating device is configured such that when an object is in the touch zone, the illuminating device can make the object generate an identifiable contrast image on a picture produced by the imaging device. The computing device is configured to calculate the coordinates of the object according to the brightness variation of the picture produced by the imaging device.
- The optical touch system can be designed as a system in which the object entering the touch zone blocks the light projected from the illuminating device so as to form a dark area on the imaging device, or as a system in which the object reflects the light projected from the illuminating device to form a bright area. Regardless of which design is employed, the current coordinate calculating method requires two pictures: a background picture and a picture taken when there is an object in the touch zone. Normally, the background picture is generated and stored before operation. The optical touch system can identify the region having obviously different intensity by comparing the captured picture with the background picture. When the region having obviously different intensity is taken as the image formed by an object, the optical touch system can calculate the coordinates of the object according to some features of that region.
- In addition to the region having obviously different intensity, the intensity levels of a portion of the background area of the picture may change because the object alters the light path or the way light is reflected from the touch surface, so that a difference may occur between the background area of the picture and the corresponding portion of the background picture. Such a difference may make it impossible to correctly calculate or properly identify the region having obviously different intensity. As a result, the coordinates of the object cannot be accurately calculated.
- One embodiment provides an image processing method and an optical touch system using the same. The image processing method can use a single picture to determine the coordinates of an object such that incorrect coordinates will not be obtained when there is a difference between a background area of a picture and the corresponding portion of a background picture.
- In one embodiment, an optical touch system comprises an image sensor module and a processor. The image sensor module comprises a plurality of image sensing elements. The image sensing elements are configured to be independently controlled to achieve different exposure times. The plurality of image sensing elements may generate a picture comprising a plurality of pixel groups. The processor is configured to extract an intensity value representing each pixel group. The processor is also configured to select a portion of the pixel groups as an object image according to the intensity values of the pixel groups.
- In one embodiment, an image processing method of an optical touch system comprises obtaining a picture comprising a plurality of first pixel groups, determining a plurality of first difference values each determined by subtracting intensity values of two of the plurality of first pixel groups, and selecting a set of successive pixel groups as an object image according to the first difference values.
- To better understand the above-described objectives, characteristics and advantages of the present invention, embodiments, with reference to the drawings, are provided for detailed explanations.
- The invention will be described according to the appended drawings in which:
- FIG. 1 is a schematic view showing an optical touch system according to one embodiment of the present invention;
- FIG. 2 is an illustration of a picture according to one embodiment of the present invention;
- FIG. 3 schematically depicts an intensity fluctuating pattern according to one embodiment of the present invention;
- FIG. 4 schematically depicts a portion of sensing elements and the circuit of an image sensor module according to one embodiment of the present invention;
- FIG. 5 is a flow chart related to an image processing method according to one embodiment of the present invention;
- FIG. 6 schematically depicts an intensity fluctuating pattern according to another embodiment of the present invention;
- FIG. 7 schematically depicts a difference fluctuating pattern according to one embodiment of the present invention;
- FIG. 8 is a flow chart related to an image processing method according to another embodiment of the present invention;
- FIG. 9 schematically depicts an intensity fluctuating pattern according to another embodiment of the present invention;
- FIG. 10 schematically depicts a difference fluctuating pattern according to another embodiment of the present invention;
- FIG. 11 is a flow chart related to an image processing method according to another embodiment of the present invention;
- FIG. 12 schematically depicts an intensity fluctuating pattern according to another embodiment of the present invention;
- FIG. 13 schematically depicts a difference fluctuating pattern according to another embodiment of the present invention;
- FIG. 14 is a flow chart related to an image processing method according to another embodiment of the present invention;
- FIG. 15 schematically depicts an intensity fluctuating pattern according to another embodiment of the present invention;
- FIG. 16 schematically depicts a difference fluctuating pattern according to another embodiment of the present invention;
- FIG. 17 schematically depicts an intensity fluctuating pattern according to another embodiment of the present invention; and
- FIG. 18 schematically depicts a difference fluctuating pattern according to another embodiment of the present invention.
- The following description is presented to enable any person skilled in the art to make and use the disclosed embodiments, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the disclosed embodiments. Thus, the disclosed embodiments are not limited to the embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein.
- FIG. 1 is a schematic view showing an optical touch system 1 according to one embodiment of the present invention. One embodiment of the present invention is related to an image processing method that can calculate the coordinate data using a single picture. The image processing method is applicable to the optical touch system 1 shown in FIG. 1. Referring to FIG. 1, the optical touch system 1 comprises an image sensor module 11 and a processor 12. The processor 12 is coupled with the image sensor module 11 to analyze an object image from the picture generated by the image sensor module 11. The image sensor module 11 is configured to monitor a touch area 13. The illuminating devices 14 and 15 are disposed adjacent to the touch area 13 to provide illumination such that the object on the touch area 13 can generate an identifiable image on a picture. The illuminating device 14 or 15 can be an active light source including, for example, a light tube, a plurality of light emitting diodes, or a combination of a light emitting diode and a light guide member. The illuminating device 14 or 15 may also be a passive light source such as a mirror.
- The image processing method of one embodiment of the present invention can be applied to an illumination-compensated picture, whose image intensity variation is compensated for, and to a normally captured picture, whose image intensity variation is not compensated for. The description below begins by describing application of the image processing method to an illumination-compensated picture.
- The method of compensating for image intensity variation can be implemented as software or hardware means and is applicable to a picture to increase the intensity of the portion that originally has lower intensity and to reduce the intensity of the portion that originally has higher intensity, so as to obtain a new picture with uniform intensity. For example, as shown in FIG. 2, the image sensor module 11 may produce a picture 2. The picture 2 may comprise a plurality of pixel groups 21 that may be arranged along a direction. Each pixel group 21 may comprise a plurality of pixels 211. In one embodiment, each pixel group 21 may be a pixel column, and the plurality of pixel groups 21 are arranged along a row direction. In another embodiment, the pixel group 21 may be a pixel row, and the plurality of pixel groups 21 are arranged along a column direction. The processor 12 is configured to extract an intensity value representing each pixel group 21 from the captured picture 2. In one embodiment, the intensity value representing a pixel group 21 can be a sum of the intensity values of the pixels 211 of the pixel group 21. In another embodiment, it can be an average of the intensity values of the pixels 211 of the pixel group 21. After the intensity values of all pixel groups 21 have been calculated, an intensity fluctuating pattern 3 as shown in FIG. 3 can be obtained.
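To make the pixel-group reduction concrete, the following is a minimal Python/NumPy sketch (not part of the patent); it assumes the picture is a 2D grayscale array whose columns are the pixel groups, and the function name is illustrative only.

```python
import numpy as np

def intensity_pattern(picture: np.ndarray, use_average: bool = False) -> np.ndarray:
    """Reduce each pixel group (here: each column) of a grayscale picture
    to one representative intensity value, giving the intensity
    fluctuating pattern I(p)."""
    # axis=0 collapses the rows, leaving one value per column (pixel group).
    return picture.mean(axis=0) if use_average else picture.sum(axis=0)

# Example: a 4x6 picture whose six columns are the pixel groups.
picture = np.random.randint(0, 256, size=(4, 6)).astype(float)
pattern_sum = intensity_pattern(picture)                    # sums
pattern_avg = intensity_pattern(picture, use_average=True)  # averages
```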
- As illustrated in FIG. 3, the intensity fluctuating pattern 3 exhibits significant variation. The intensity fluctuating pattern 3 may be compensated for in advance. In one embodiment, an adjustment value for each pixel group 21 is determined. Next, each adjustment value is multiplied by the intensity value of the corresponding pixel group 21 to obtain a new intensity fluctuating pattern 4 exhibiting less variation. In one embodiment, the adjusted intensity fluctuating pattern 4 can vary within an intensity range 5.
- The adjustment value is used to compensate for the variation of an intensity fluctuating pattern. The adjustment value can be determined through many methods, one of which is illustratively demonstrated herein. In one embodiment, an adjustment value can be obtained by the following steps: a background picture is generated using a fixed exposure time; next, an intensity value (Ip) of each pixel group is determined; finally, the ratio of a target intensity value (IT) to the intensity value (Ip) of each pixel group is calculated, wherein the ratio (IT/Ip) can be used as the adjustment value.
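A short sketch of that software compensation (illustrative only; the patent leaves the choice of target level open, so defaulting to the mean of the background pattern is an assumption):

```python
import numpy as np

def adjustment_values(background_pattern, target=None):
    """Per-pixel-group gain I_T / I_p, computed from a background
    picture captured with a fixed exposure time."""
    bg = np.asarray(background_pattern, dtype=float)
    if target is None:
        target = float(bg.mean())          # assumed default for target I_T
    return target / np.clip(bg, 1e-6, None)  # clip avoids division by zero

def compensate(pattern, adj):
    """Multiply each intensity value by its adjustment value, yielding
    the flattened intensity fluctuating pattern 4."""
    return np.asarray(pattern, dtype=float) * adj
```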
- The compensation for the intensity variation of the picture can also be achieved through hardware. As illustrated in FIG. 4, the image sensor module 11 comprises a plurality of image sensing elements 41a and 41b, each comprising an electronic shutter 411 and a photo detector 412. The electronic shutter 411 is coupled with the photo detector 412 to control the exposure time of the photo detector 412. The photo detector 412 generates charge in response to received light. The transistor 413 controls the transfer of the charge from the photo detector 412 to a floating diffusion (FD) output node. The transistor 414 and the constant current source 415 form a source follower, which can amplify the photovoltaic voltage produced by the photo detector 412. The transistor 416 is activated when the signal WL1 is at a high level, and at this moment, data can be output to the bit line 417 that is coupled to a readout circuit 220. When the signals RST1 and TG1 go high, the transistors 418 and 413 are activated such that the voltage supply VDDAY can reset the photo detector 412 to a photo-electric conversion initiation state. When the transistor 413 is turned on, the charge flows from the photo detector 412 to the FD output node. When the signal RST1 goes high, the voltage source VDDAY resets the FD output node.
FIG. 4 , theelectronic shutters 411 of the 41 a and 41 b are respectively coupled to differentimage sensing elements 419 a and 419 b. As such, different signals AB1 and AB2 can be applied to theshutter control lines electronic shutters 411 of the 41 a and 41 b to operate theimage sensing elements 41 a and 41 b for different exposure times such that the intensity values of the corresponding pixels of a picture can be independently manipulated. During operation, theimage sensing elements 41 a or 41 b, which is exposed to stronger light intensity, is assigned shorter exposure time and theimage sensing element 41 a or 41 b which is exposed to weaker light intensity is assigned longer exposure time. As a result, a captured picture can exhibit a more uniform background intensity level.image sensing element - The exposure time for controlling the
electronic shutter 411 of each 41 a or 41 b can be obtained using the following method; however, the present invention is not limited to such method. The method obtains a background picture by a fixed exposure time. Next, the intensity value of each pixel of the background picture is extracted. Thereafter, the exposure time that is needed for operating the correspondingimage sensing element electronic shutter 411 of an 41 a or 41 b and can make the corresponding pixel achieve a target intensity level is computed using the target intensity level and the intensity value of the pixel.image sensing element -
- FIG. 5 is a flow chart related to an image processing method according to one embodiment of the present invention. Referring to FIG. 5, at Step S51, a picture is captured, wherein the picture comprises a plurality of pixel groups that can be arranged along a direction. At Step S52, the intensity value I(pi) of each pixel group of the picture is computed to obtain an intensity fluctuating pattern I(p) as shown in FIG. 6, where pi represents the i-th pixel group. The intensity value of each pixel group can be the sum or the average of the intensities of the pixels of the pixel group. At Step S53, a target intensity value T1 is decided. In one embodiment, the target intensity value T1 can be the average of the intensity fluctuating pattern I(p), that is, the average of the intensity values I(pi). In another embodiment, the target intensity value T1 can be a predetermined value. At Step S54, a difference fluctuating pattern D1(p), formed by a plurality of difference values D1(pi) and shown in FIG. 7, is obtained; each difference value is calculated, as shown in equation (1), by subtracting the target intensity value T1 from the corresponding intensity value I(pi) of the intensity fluctuating pattern I(p):
D1(pi) = I(pi) − T1   (1)
- At Step S55, a threshold Th1 is used to determine, as an object image, a section R1 of the intensity fluctuating pattern I(p) that includes the pixel groups whose ratios of the difference values D1(pi) to the target intensity value T1 are less than or greater than the threshold Th1.
- FIG. 8 is a flow chart related to an image processing method according to another embodiment of the present invention. Referring to FIG. 8, at Step S81, a picture is obtained. The picture comprises a plurality of pixel groups that may be arranged along a direction. At Step S82, the intensity value I(pi) representing each pixel group pi of the picture is extracted to obtain an intensity fluctuating pattern I(p) as shown in FIG. 9, where pi represents the i-th pixel group. The intensity value representing each pixel group can be either the sum or the average of the intensity values of the pixels of the pixel group. At Step S83, a difference fluctuating pattern ICD(p) as shown in FIG. 10 is determined. The difference fluctuating pattern ICD(p) has a plurality of difference values ICD(pi) sequentially calculated along the arrangement direction of the pixel groups, each ICD(pi) determined by subtracting the intensity values of two adjacent pixel groups (pi and pi+1). At Step S84, on the difference fluctuating pattern ICD(p), the point Pt1 whose difference value is less than a threshold Th2 and the point Pt2 whose difference value is greater than a threshold Th3 are determined. One of the pixel groups corresponding to the point Pt1 is used as a left boundary pixel group, and one of the pixel groups corresponding to the point Pt2 is used as a right boundary pixel group. Accordingly, the section of the intensity fluctuating pattern constituted by the set of successive pixel groups between the left and right boundary pixel groups is obtained, and that section can then be selected as an object image.
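The boundary search of Steps S83 and S84 can be sketched as follows (illustrative; the same routine also covers the spaced differences of FIG. 11 via the parameter q, and a usage example follows the comparison paragraph below). Taking the first low crossing and the last high crossing as the boundaries is an assumption, since the patent only says "one of the pixel groups".

```python
import numpy as np

def object_section_by_diff(pattern, th_low, th_high, q=0):
    """Difference of pixel groups spaced q+1 apart (q=0 gives adjacent
    groups, i.e. ICD(p_i) = I(p_{i+1}) - I(p_i)); threshold crossings
    mark the left and right boundary pixel groups."""
    p = np.asarray(pattern, dtype=float)
    step = q + 1
    diff = p[step:] - p[:-step]
    left = np.flatnonzero(diff < th_low)    # e.g. point P_t1, below Th2
    right = np.flatnonzero(diff > th_high)  # e.g. point P_t2, above Th3
    if left.size == 0 or right.size == 0:
        return None
    return int(left[0]), int(right[-1]) + step
```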
- FIG. 11 is a flow chart related to an image processing method according to another embodiment of the present invention. Referring to FIG. 11, at Step S111, a picture comprising a plurality of pixel groups arranged along a direction is generated. At Step S112, the intensity value I(pi) representing each pixel group pi of the picture is extracted to obtain an intensity fluctuating pattern I(p) as shown in FIG. 12, where pi represents the i-th pixel group. The intensity value representing each pixel group can be either the sum or the average of the intensity values of the pixels of the pixel group. At Step S113, the difference value ICD(pi) between a pixel group pi and a pixel group pi+q+1 that are spaced at an interval of a predetermined number q is calculated sequentially along the arrangement of the pixel groups of the picture to obtain a difference fluctuating pattern ICD(p) as shown in FIG. 13. At Step S114, on the difference fluctuating pattern ICD(p), the point Po whose difference value is less than a threshold Th4 and the point Pt4 whose difference value is greater than a threshold Th5 are determined. One of the pixel groups corresponding to the point Po is used as a left boundary pixel group, and one of the pixel groups corresponding to the point Pt4 is used as a right boundary pixel group. Accordingly, the section constituted by the set of successive pixel groups between the left and right boundary pixel groups is obtained, and the section can be selected as an object image 121 as shown in FIG. 12.
object image 121. As shown inFIG. 12 , the difference value between two 1212 and 1213 on theadjacent points left side edge 1211 of theobject image 121 is about 50, while the difference value between two 1212 and 1213 is about 90. Moreover, due to the existence of noise, the method of calculating the difference value between a pixel group pi and a pixel group pi+q+1 that are spaced at a predetermined number q is not easily affected by noise.separated points - The predetermined number q can be a user set number, or a value determined by the modulation transfer function (MTF) of the
image sensor module 11, wherein the MTF is a measure of the transfer of modulation (or contrast) from an object to an image. An MTF value can be a point on a MTF curve or an average of points on an MTF curve. From a background picture, a pixel group is selected and an initial intensity value Mo representing the pixel group is extracted. The following equation (2), an MTF value, and the initial intensity value Mo are used to determine the to number of iterations needed when a last iterated result is less than a predetermined value, wherein the number of iterations can be used as the number q. -
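Equation (2) itself does not survive in this text; claim 14 states only that Mi is the intensity value on the i-th iteration and Ki+1 the iterated result. The sketch below assumes the recurrence Ki+1 = Mi × MTF with Mi+1 = Ki+1, which is a guess at the missing formula, not the patent's stated equation:

```python
def iterations_for_q(m0: float, mtf: float, limit: float) -> int:
    """Count iterations until the iterated result falls below `limit`;
    the count serves as the pixel-group spacing q. Assumes each
    iteration attenuates the intensity by the MTF value (hypothetical)."""
    assert 0.0 < mtf < 1.0, "MTF value expected as a contrast ratio"
    m, q = float(m0), 0
    while m >= limit:
        m *= mtf        # K_{i+1} = M_i * MTF (assumed form)
        q += 1
    return q

# e.g. iterations_for_q(m0=200.0, mtf=0.6, limit=20.0) returns 5
```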
-
- FIG. 14 is a flow chart related to an image processing method according to another embodiment of the present invention. Referring to FIG. 14, at Step S141, a background picture comprising a plurality of pixel groups arranged along a direction is generated. At Step S142, intensity values IB(pi) of the pixel groups of the background picture are extracted to obtain an intensity fluctuating pattern IB(p) as shown in FIG. 15, where pi represents the i-th pixel group. The intensity value IB(pi) of a pixel group can be the sum or the average of the intensity values of the pixels of the pixel group. At Step S143, the difference value BSD(pi) between a pixel group pi and a pixel group pi+q+1 that are spaced at an interval of a predetermined number q is calculated sequentially along the arrangement of the pixel groups of the background picture to obtain a difference fluctuating pattern BSD(p) as shown in FIG. 16. In another embodiment, the difference value between two adjacent pixel groups (pi and pi+1) is calculated sequentially along the arrangement of the pixel groups of the background picture to obtain the difference fluctuating pattern BSD(p).
FIG. 17 , where pi represents the i-th pixel group. The intensity value I(pi) of the pixel group can be the sum or the average of the intensity values of the pixels of the pixel group. At Step S147, the sequential calculation of the difference value ISD(pi) between a pixel group pi and a pixel group pi+1 that are spaced at an interval of a predetermined number q is performed one after another along the arrangement of the pixel group pi, to the pixel groups of the picture to obtain a difference fluctuating pattern ISD(p) as shown inFIG. 18 . In another embodiment, the sequential calculation of the difference value between two adjacent pixel groups (pi and pi+1) is performed one after another along the arrangement of the pixel group pi, to the pixel groups of the picture to obtain a difference fluctuating pattern ISD(p). At Step S418, on the difference fluctuating pattern ISD(p), thepoint 811 whose difference value is less than the threshold Th6 and thepoint 812 whose difference value is greater than a threshold Th7 are determined. One of the pixel groups corresponding to thepoint 811 is used as a left boundary pixel group, and one of the pixel groups corresponding to thepoint 812 is used as a right boundary pixel group. Accordingly, the section R2 constituted by a set of successive pixel groups between the left and right boundary pixel groups can be obtained. The section R2 can then be selected as an object image. - The data structures and code described in this detailed description are typically stored on a non-transitory computer-readable storage medium, which may be any device or medium that can store code and/or data for use by a computer system. The non-transitory computer-readable storage medium includes, but is not limited to, volatile memory, non-volatile memory, magnetic and optical storage devices such as disk drives, magnetic tape, CDs (compact discs), DVDs (digital versatile discs or digital video discs), or other media capable of storing code and/or data now known or later developed.
- The methods and processes described in the detailed description section can be embodied as code and/or data, which can be stored in a non-transitory computer-readable storage medium as described above. When a computer system reads and executes the code and/or data stored on the non-transitory computer-readable storage medium, the computer system performs the methods and processes embodied as data structures and code and stored within the non-transitory computer-readable storage medium. Furthermore, the methods and processes described below can be included in hardware modules. For example, the hardware modules can include, but are not limited to, application-specific integrated circuit (ASIC) chips, field-programmable gate arrays (FPGAs), and other programmable-logic devices now known or later developed. When the hardware modules are activated, the hardware modules perform the methods and processes included within the hardware modules.
- It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with the true scope of the disclosure being indicated by the following claims and their equivalent.
Claims (18)
1. An optical touch system comprising:
an image sensor module comprising a plurality of image sensing elements configured to be independently controlled to achieve different exposure times, the plurality of image sensing elements generating a picture comprising a plurality of pixel groups; and
a processor configured to extract an intensity value representing each pixel group and select a portion of the pixel groups as an object image according to the intensity values of the pixel groups.
2. The optical touch system of claim 1 , wherein the picture is an illumination-compensated picture.
3. The optical touch system of claim 1 , wherein each pixel group comprises a plurality of pixels, wherein the intensity value representing each pixel group is a sum or an average of intensity values of the pixels of the pixel group.
4. The optical touch system of claim 1 , wherein each pixel group is a line of pixels.
5. The optical touch system of claim 1 , wherein the processor is configured to subtract a threshold value from the intensity value representing each pixel group to obtain a plurality of difference values and to select the portion of the pixel groups according to ratios of the difference values to the threshold value.
6. The optical touch system of claim 1 , wherein the portion of the pixel groups is greater or smaller than a threshold value.
7. The optical touch system of claim 5 , wherein the threshold value is an average of the intensity values representing the pixel groups or a predetermined value.
8. The optical touch system of claim 1 , wherein the image sensor module is configured to generate a background picture using a fixed exposure time and use an intensity value of the background picture corresponding to each pixel group to adjust the exposure times of corresponding image sensing elements.
9. The optical touch system of claim 1 , wherein the image sensor module is configured to adjust the exposure time for each image sensing element to allow the image sensor module to generate a new background picture comprising a plurality of pixel groups having intensity values within an intensity range.
10. An image processing method of an optical touch system, comprising the steps of:
obtaining a picture comprising a plurality of first pixel groups;
determining a plurality of first difference values each determined by subtracting intensity values of two of the plurality of first pixel groups; and
selecting a set of successive pixel groups as an object image according to the first difference values.
11. The method of claim 10 , wherein the picture is an illumination-compensated picture.
12. The method of claim 10 , wherein the step of determining a plurality of first difference values comprises a step of determining a first difference value between the intensity values of each two adjacent ones of the plurality of first pixel groups.
13. The method of claim 10 , wherein the step of determining a plurality of first difference values comprises a step of determining a first difference value between the intensity values of a pair of first pixel groups separated by an interval of a predetermined number.
14. The method of claim 13 , wherein the predetermined number is determined by the steps of:
obtaining an MTF value of an image sensor;
selecting the intensity value of one of the first pixel groups as an initial intensity value;
using the following equation, the MTF value and the initial intensity value to determine a number of iterations needed when an iterated result is less than a predetermined value:
where Mi represents an intensity value on the i-th iteration and Ki+1 represents an iterated result on the i-th iteration; and
selecting the number of iterations as the predetermined number.
15. The method of claim 10 , further comprising the steps of:
determining, from the plurality of first pixel groups, a first boundary pixel group corresponding to the first difference value less than a first threshold value;
determining, from the plurality of first pixel groups, a second boundary pixel group corresponding to the first difference value greater than a second threshold value; and
selecting a plurality of ones of the pixel groups between the first and second boundary pixel groups as the object image.
16. The method of claim 15 , further comprising the steps of:
obtaining a background picture that comprises a plurality of second pixel groups;
determining a plurality of second difference values each determined by subtracting intensity values of two of the plurality of second pixel groups;
selecting a minimum from the second difference values as the first threshold value; and
selecting a maximum from the second difference values as the second threshold value.
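A sketch combining claims 15 and 16: the background picture's difference profile supplies the two thresholds, and the object image is taken between the falling and rising edges they detect. The function name and the (start, end) return convention are assumptions:

```python
import numpy as np

def find_object_image(picture_diffs, background_diffs):
    """Locate the object image between two boundary pixel groups.

    picture_diffs:    first difference values of the captured picture.
    background_diffs: second difference values of the background picture; their
                      minimum and maximum supply the two thresholds (claim 16).
    Returns (start, end) pixel-group indices, or None if no object is found.
    """
    picture_diffs = np.asarray(picture_diffs, dtype=float)
    first_threshold = float(np.min(background_diffs))
    second_threshold = float(np.max(background_diffs))
    falling = np.flatnonzero(picture_diffs < first_threshold)   # claim 15: first boundary
    rising = np.flatnonzero(picture_diffs > second_threshold)   # claim 15: second boundary
    if falling.size == 0 or rising.size == 0:
        return None
    start, end = int(falling[0]), int(rising[-1])
    return (start, end) if start < end else None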
17. The method of claim 10, wherein each first pixel group comprises a plurality of pixels, and the intensity value representing each pixel group is a sum of intensity values of the pixels of the pixel group.
18. The method of claim 10, wherein the pixel group is a line of pixels.
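Under claims 17 and 18, a pixel group is a line of pixels represented by the sum of its pixel intensities. Taking one column of the picture per pixel group as a concrete (assumed) reading of "line of pixels", this reduces to a column sum:

```python
import numpy as np

def pixel_group_intensities(picture):
    """Represent each pixel group by the sum of its pixels' intensities
    (claim 17), with one column per pixel group (one reading of the
    'line of pixels' in claim 18)."""
    return np.asarray(picture, dtype=float).sum(axis=0)  # one value per column
```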
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/346,082 US10282036B2 (en) | 2011-06-21 | 2016-11-08 | Optical touch system and image processing method thereof |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW100121547 | 2011-06-21 | | |
| TW100121547A TWI441062B (en) | 2011-06-21 | 2011-06-21 | Optical touch system and image processing method thereof |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/346,082 Division US10282036B2 (en) | 2011-06-21 | 2016-11-08 | Optical touch system and image processing method thereof |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20120327035A1 (en) | 2012-12-27 |
Family
ID=47361393
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/495,712 Abandoned US20120327035A1 (en) | 2011-06-21 | 2012-06-13 | Optical touch system and image processing method thereof |
| US15/346,082 Active 2033-02-22 US10282036B2 (en) | 2011-06-21 | 2016-11-08 | Optical touch system and image processing method thereof |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/346,082 Active 2033-02-22 US10282036B2 (en) | 2011-06-21 | 2016-11-08 | Optical touch system and image processing method thereof |
Country Status (2)
| Country | Link |
|---|---|
| US (2) | US20120327035A1 (en) |
| TW (1) | TWI441062B (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| TWI507947B (en) * | 2013-07-12 | 2015-11-11 | Wistron Corp | Apparatus and system for correcting touch signal and method thereof |
| TWI622142B (en) * | 2016-11-07 | 2018-04-21 | 財團法人工業技術研究院 | Chip package and chip packaging method |
Family Cites Families (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN100511115C (en) | 2002-03-13 | 2009-07-08 | 平蛙实验室股份公司 | A touch pad, a stylus for use with the touch pad, and a method of operating the touch pad |
| AU2003220595A1 (en) * | 2002-03-27 | 2003-10-13 | The Trustees Of Columbia University In The City Of New York | Imaging method and system |
| AU2003274484A1 (en) | 2002-11-19 | 2004-06-15 | Koninklijke Philips Electronics N.V. | Image segmentation using template prediction |
| US7173688B2 (en) * | 2004-12-28 | 2007-02-06 | Asml Holding N.V. | Method for calculating an intensity integral for use in lithography systems |
| CN100555179C (en) | 2006-10-13 | 2009-10-28 | 广东威创视讯科技股份有限公司 | A kind of based on cmos digital imageing sensor locating device and localization method |
| US20090174674A1 (en) | 2008-01-09 | 2009-07-09 | Qualcomm Incorporated | Apparatus and methods for a touch user interface using an image sensor |
| TWI356332B (en) | 2008-04-30 | 2012-01-11 | Raydium Semiconductor Corp | Optical sensing system and optical sensing device |
| CN101808178B (en) | 2009-02-12 | 2012-06-13 | 亚泰影像科技股份有限公司 | Method for adjusting the size of the light source used by the contact image sensor module |
- 2011-06-21: TW TW100121547A patent/TWI441062B/en not_active IP Right Cessation
- 2012-06-13: US US13/495,712 patent/US20120327035A1/en not_active Abandoned
- 2016-11-08: US US15/346,082 patent/US10282036B2/en active Active
Patent Citations (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6665010B1 (en) * | 1998-07-21 | 2003-12-16 | Intel Corporation | Controlling integration times of pixel sensors |
| US6671422B1 (en) * | 1999-06-22 | 2003-12-30 | International Business Machine Corporation | Apparatus and method for detecting rough position of two-dimensional code |
| US20050134698A1 (en) * | 2003-12-18 | 2005-06-23 | Schroeder Dale W. | Color image sensor with imaging elements imaging on respective regions of sensor elements |
| US20100283875A1 (en) * | 2006-09-28 | 2010-11-11 | Nokia Corporation | Read out method for a cmos imager with reduced dark current |
| US20090085881A1 (en) * | 2007-09-28 | 2009-04-02 | Microsoft Corporation | Detecting finger orientation on a touch-sensitive device |
| US20090091554A1 (en) * | 2007-10-05 | 2009-04-09 | Microsoft Corporation | Correcting for ambient light in an optical touch-sensitive device |
| US8004502B2 (en) * | 2007-10-05 | 2011-08-23 | Microsoft Corporation | Correcting for ambient light in an optical touch-sensitive device |
| US20100123873A1 (en) * | 2008-11-14 | 2010-05-20 | Amo Wavefront Sciences, Llc | Method of qualifying light spots for optical measurements and measurement instrument employing method of qualifying light spots |
| US20100177062A1 (en) * | 2009-01-13 | 2010-07-15 | Quanta Computer Inc. | Light compensation method |
| US20100225617A1 (en) * | 2009-03-06 | 2010-09-09 | Yoshimoto Yoshiharu | Position detection device |
| US20110012866A1 (en) * | 2009-07-17 | 2011-01-20 | Microsoft Corporation | Ambient correction in rolling image capture system |
| US8289300B2 (en) * | 2009-07-17 | 2012-10-16 | Microsoft Corporation | Ambient correction in rolling image capture system |
| US20110193969A1 (en) * | 2010-02-09 | 2011-08-11 | Qisda Corporation | Object-detecting system and method by use of non-coincident fields of light |
| US20120212639A1 (en) * | 2011-02-23 | 2012-08-23 | Pixart Imaging Inc. | Image Sensor |
| US20130063402A1 (en) * | 2011-09-09 | 2013-03-14 | Pixart Imaging Inc. | Optical touch system |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160378266A1 (en) * | 2015-06-25 | 2016-12-29 | Wistron Corporation | Optical touch apparatus and width detecting method thereof |
| US10719174B2 (en) * | 2015-06-25 | 2020-07-21 | Wistron Corporation | Optical touch apparatus and width detecting method thereof |
| TWI888248B (en) * | 2024-08-06 | 2025-06-21 | 大陸商業泓科技(成都)有限公司 | Optical tactile sensing device |
Also Published As
| Publication number | Publication date |
|---|---|
| US20170052647A1 (en) | 2017-02-23 |
| TW201301102A (en) | 2013-01-01 |
| TWI441062B (en) | 2014-06-11 |
| US10282036B2 (en) | 2019-05-07 |
Similar Documents
| Publication | Title |
|---|---|
| TWI787472B (en) | Apparatus and method for determining light intensity |
| US9380202B2 (en) | Focus detection apparatus, image pickup apparatus, image pickup system, focus detection method, and non-transitory computer-readable storage medium |
| US10282036B2 (en) | Optical touch system and image processing method thereof |
| US9989656B2 (en) | Radiation imaging apparatus |
| US8311385B2 (en) | Method and device for controlling video recordation property of camera module according to velocity of object |
| KR101697519B1 (en) | Apparatus and method for depth sending |
| US20120177252A1 (en) | Distance measuring apparatus, distance measuring method, and program |
| US9875390B2 (en) | Method and apparatus for recognizing object |
| US20120287242A1 (en) | Adaptive high dynamic range camera |
| WO2017141957A1 (en) | Distance measuring device |
| US10477100B2 (en) | Distance calculation apparatus, imaging apparatus, and distance calculation method that include confidence calculation of distance information |
| JP2011501841A5 (en) | |
| KR20180053333A (en) | Imaging devices with autofocus control |
| WO2015118973A1 (en) | Image capturing apparatus and method of controlling the same |
| JP2014115264A (en) | Three-dimensional shape measuring device and control method therefor |
| KR102025928B1 (en) | Imaging apparatus and controlling method thereof |
| US20150263710A1 (en) | Sampling period control circuit capable of controlling sampling period |
| US20170295310A1 (en) | Image processing apparatus for detecting flicker, method of controlling the same, and non-transitory storage medium |
| US20120327310A1 (en) | Object detecting device and information acquiring device |
| US9544508B2 (en) | Image sensor which can adjust brightness information to fall in a predetermined range |
| US8836831B2 (en) | Image sensor |
| US9229578B2 (en) | Image sensor and optical touch system including the same |
| CN104270582B (en) | Image sensor with a plurality of pixels |
| JP7604907B2 (en) | Distance image capturing device and distance image capturing method |
| CN114846355B (en) | Distance measuring device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: PIXART IMAGING INC., TAIWAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SU, TZUNG MIN;GU, REN HAU;LIN, CHIH HSIN;AND OTHERS;REEL/FRAME:028369/0717. Effective date: 20120525 |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |