
WO2012103982A1 - Method and system for determining motion parameters associated with an object - Google Patents


Info

Publication number
WO2012103982A1
Authority
WO
WIPO (PCT)
Prior art keywords
pattern
distinct
optical
image
patterns
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2011/072725
Other languages
French (fr)
Inventor
Varun Akur Venkatesan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Siemens Corp
Original Assignee
Siemens AG
Siemens Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG, Siemens Corp
Publication of WO2012103982A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01D: MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D 5/00: Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable
    • G01D 5/26: Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable characterised by optical transfer means, i.e. using infrared, visible, or ultraviolet light
    • G01D 5/32: Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable characterised by optical transfer means, i.e. using infrared, visible, or ultraviolet light with attenuation or whole or partial obturation of beams of light
    • G01D 5/34: Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable characterised by optical transfer means, i.e. using infrared, visible, or ultraviolet light with attenuation or whole or partial obturation of beams of light the beams of light being detected by photocells
    • G01D 5/347: Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable characterised by optical transfer means, i.e. using infrared, visible, or ultraviolet light with attenuation or whole or partial obturation of beams of light the beams of light being detected by photocells using displacement encoding scales
    • G01D 5/34707: Scales; Discs, e.g. fixation, fabrication, compensation
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/248: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches

Definitions

  • The pattern matching module 204 may use any suitable match-measuring metric known in the art. Examples of match-measuring metrics that may be used in the present invention are described hereinafter.
  • the feature-based approach is based on matching one or more features present in the sample and the reference patterns.
  • the spatial features that may be used for generating a feature-based metric include corners, edges, blobs, and ridges.
  • The template-based approach uses metrics based on sum values such as Sum of Absolute Differences (SAD), Sum of Absolute Transformed Differences (SATD), and Sum of Squared Deviations (SSD). The corresponding mean-value metrics, namely Mean Absolute Difference (MAD), Mean of Absolute Transformed Differences (MATD), and Mean Squared Deviations (MSD), may also be used.
  • Alternatively, the template-based approach may use a correlation-based metric. These metrics are well known in the art and a detailed description is omitted for the sake of brevity.
  • Further, one or more match-measuring metrics may be combined. In that case, the reference pattern that provides the closest match based on the combined match-measuring metrics is selected for subsequent processing, as in the sketch below.
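  • As a minimal sketch of template matching with the SAD metric named above (the array shapes, the reference dictionary layout, and all function names are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def sad(a: np.ndarray, b: np.ndarray) -> float:
    """Sum of Absolute Differences: lower values mean a closer match."""
    return float(np.sum(np.abs(a.astype(np.int32) - b.astype(np.int32))))

def best_match(sample: np.ndarray, references: dict) -> int:
    """Return the position index whose reference pattern minimises SAD."""
    return min(references, key=lambda idx: sad(sample, references[idx]))

# Toy data: 100 reference patterns of 8x8 pixels, one per distinct position.
rng = np.random.default_rng(1)
refs = {i: rng.integers(0, 256, size=(8, 8), dtype=np.uint8) for i in range(100)}
sample = refs[42].copy()         # a noiseless sample taken at position 42
print(best_match(sample, refs))  # -> 42
```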
  • the motion computation module 210 receives the position of the object as well as the corresponding time instants from the pattern matching module 204 and computes various motion parameters associated with the object based on the position and time information.
  • the distance and displacement of the object are computed based on a first position of the object and a second position of the object.
  • The speed and the velocity are computed based on a ratio of the distance to the corresponding time period, and a ratio of the displacement to the corresponding time period, respectively, as in the sketch below.
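  • For example, a short sketch of this motion computation from two matched (position, time) pairs; the numeric values and units are assumed for illustration:

```python
def motion_parameters(p1: float, t1: float, p2: float, t2: float):
    """Derive basic motion parameters from two matched (position, time) pairs."""
    displacement = p2 - p1            # signed change of position
    distance = abs(displacement)      # unsigned path length (rectilinear case)
    dt = t2 - t1                      # elapsed time between the two samples
    return displacement, distance, displacement / dt, distance / dt

# Two samples 1 ms apart, e.g. from the red and the green colour channel.
d, s, velocity, speed = motion_parameters(p1=12.0, t1=0.000, p2=13.0, t2=0.001)
print(d, s, velocity, speed)  # -> 1.0 1.0 1000.0 1000.0 (position units, units/s)
```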
  • a set of optical patterns is defined such that each optical pattern is distinct and the set of optical patterns is disposed on the surface of the object such that each optical pattern is associated with a distinct portion of the surface of the object.
  • Each reference pattern generated during the calibration mode of operation includes at least one optical pattern.
  • each sample pattern generated during the tracking mode of operation includes at least one optical pattern.
  • FIG 3 illustrates a schematic representation of the image processor 106 based on pattern decoding.
  • the image processor 106 includes a colour-plane slicing module 302 to isolate individual colour channels in an image generated by the optical sensor 104, a pattern decoding module 304 to decode each sample pattern retrieved from the distinct colour channels in accordance with a decoding criterion, a memory module 306 to store the decoding criterion, and a motion computation module 308 to compute the motion parameters associated with the object.
  • a set of optical patterns is defined such that each optical pattern is a distinct encoded symbol generated based on an encoding criterion.
  • the set of optical patterns is disposed on the surface of the object such that each optical pattern is associated with a distinct portion of the surface of the object.
  • each encoded symbol is associated with a distinct position of the object.
  • the optical patterns are such that each sample pattern includes at least one optical pattern.
  • the decoding criterion corresponding to the encoding criterion used for generation of encoded symbols is stored in the memory module 306.
  • the light source 102 and the optical sensor 104 operate as explained in conjunction with FIG 1.
  • Thus, the image generated by the optical sensor 104 includes at least two distinct colour channels.
  • The colour-plane slicing module 302 is similar to the colour-plane slicing module 202, and processes the image including two or more colour channels to isolate the individual colour channels present in the image.
  • each optical pattern disposed on the surface of the object is an encoded symbol.
  • the pattern decoding module 304 retrieves a sample pattern from each individual colour channel and decodes an encoded symbol included therein, based on the decoding criterion stored in the memory module 306 to obtain information related to the position of the object.
  • The encoding and decoding are based on any suitable encoding-decoding technique known in the art.
  • Various examples of such encoding-decoding techniques include, but are not limited to, various types of linear bar-codes and matrix bar-codes.
  • Each bar-code is a unique encoded symbol and is associated with a distinct position of the object.
  • the pattern decoding module 304 determines the information related to the position of the object.
  • the information related to the position of the object as well as the corresponding time instants are provided to the motion computation module 308.
  • the motion computation module 308 is similar to motion computation module 210, and receives the position of the object as well as the corresponding time instants from the pattern decoding module 304 and computes various motion parameters associated with the object based on the position and time information in a manner described in conjunction with FIG 2.
  • FIG 4 illustrates an exemplary timing diagram for illuminating a surface of the object with the light of distinct colours and capturing the light reflected from the surface of the object.
  • FIG 4 shows a timing signal 402 which provides a reference time, a timing signal 404 which triggers the light source 102 to produce light of a first colour, a timing signal 406 which triggers the light source 102 to produce light of a second colour, a timing signal 408 which triggers the light source 102 to produce light of a third colour, and a timing signal 410 which triggers the optical sensor 104 to capture light reflected from the surface.
  • the timing diagram illustrated is for the light source 102 producing light of three distinct colours.
  • the timing signal 402 provides a reference time to control the remaining timing signals shown in FIG 4.
  • The timing signals 404, 406, and 408 are such that the light source is triggered to produce light of the first, the second and the third colours at a time delay of T1, T2, and T3 respectively from the reference time provided by the timing signal 402.
  • The sampling periods during which light of the first, the second and the third colours is produced correspond to on-times S1, S2 and S3 respectively.
  • the timing signal 410 is such that the optical sensor 104 is triggered to capture light reflected from the surface starting from the reference time provided by the timing signal 402.
  • The sampling period during which light reflected from the surface of the object is captured corresponds to on-time E.
  • the timing signal 410 regulates an exposure time of the optical sensor 104.
  • The timing signals 404 through 410 are controlled such that the on-times S1, S2 and S3 are necessarily within the on-time E. This ensures that the light reflected from the surface in response to illumination by light of distinct colours produced by the light source 102 is captured within one exposure time of the optical sensor 104, such that the image includes distinct colour channels that comprise spatial and temporal information related to the moving surface at varying time instants.
  • the duration of illumination by light source 102 during each sampling period is sufficiently short such that the surface features, as captured by the optical sensor 104, are not blurred in the resulting image.
  • The timing signals 404, 406, and 408 are such that the light source is triggered to produce light of the first and the second colours at an intervening time delay of D1, and light of the second and the third colours at an intervening time delay of D2.
  • In lieu of the time delays T1, T2, and T3 for a specified exposure time of the optical sensor 104 (on-time E), the time delays T1, T2, and T3, or equivalently the intervening time delays D1 and D2, are adjusted based on a desired operable range of speeds of the object. For determining the motion parameters associated with relatively faster moving surfaces, the intervening delays are decreased, and for relatively slower moving surfaces, the delays are increased, as in the sketch below.
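  • The following sketch derives flash trigger delays from a desired operable speed. The linear scaling rule and all parameter names are assumptions for illustration; the patent only states that the intervening delays shrink as the target speed grows:

```python
def timing_schedule(exposure: float, flash_on: float, max_speed: float,
                    resolution: float):
    """Derive the flash trigger delays T1..T3 within one exposure.

    exposure   : optical sensor on-time E in seconds
    flash_on   : flash on-time S for each colour in seconds
    max_speed  : fastest object speed to be tracked (position units per second)
    resolution : desired change of position between successive flashes
    """
    d = resolution / max_speed       # intervening delays D1 = D2
    t1, t2, t3 = 0.0, d, 2 * d       # delays relative to the reference time
    # All three flash on-times must lie within the single exposure window E.
    assert t3 + flash_on <= exposure, "flashes must fit inside on-time E"
    return t1, t2, t3

# A faster target speed yields smaller intervening delays, as in FIG 4.
print(timing_schedule(exposure=0.010, flash_on=0.0001,
                      max_speed=2000.0, resolution=1.0))  # -> (0.0, 0.0005, 0.001)
```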
  • FIGS 5A to 5C illustrate various exemplary schemes for disposing a set of optical patterns on a surface of an object 500 undergoing rotary motion.
  • FIGS 5A to 5C show the object 500 undergoing rotary motion about an axis A-A'.
  • The object 500, as shown in FIGS 5A to 5C, is a cylindrical shaft.
  • FIG 5A shows an optical pattern 502 imprinted on the cylindrical surface of the object 500.
  • FIG 5B shows an optical pattern 504 imprinted on the planar end surface of the object 500 perpendicular to the axis A-A'.
  • The light source 102 and the optical sensor 104 are suitably positioned to image the portion of the surface of the object 500 where the optical patterns 502 and 504 are disposed.
  • FIG 5C shows both the optical patterns 502 and 504 disposed on the cylindrical surface and the planar end surface of the object 500 respectively.
  • two sets of the light source 102 and the optical sensor 104 are positioned to image both the optical patterns 502 and 504.
  • The results obtained through the two sets of the light source 102 and the optical sensor 104 may be triangulated for improved accuracy.
  • FIGS 6A to 6C illustrate exemplary schemes for disposing optical patterns on a surface of an object 600 undergoing rotary motion.
  • FIGS 6A to 6C show an object 600 undergoing rotary motion about an axis A-A'.
  • The object 600, as shown in FIGS 6A to 6C, is a cylindrical shaft.
  • FIGS 6A to 6C also show a cylindrical sleeve 606 mounted on the object 600. It is also possible to manufacture the object 600 such that the cylindrical sleeve 606 is an intrinsic part of the object design instead of being externally mounted on the object.
  • FIG 6A shows an optical pattern 602 imprinted on the cylindrical surface of the cylindrical sleeve 606.
  • FIG 6B shows an optical pattern 604 imprinted on the planar end surface of the cylindrical sleeve 606.
  • In FIG 6A and FIG 6B, the light source 102 and the optical sensor 104 are suitably positioned to image the portion of the surface of the cylindrical sleeve 606 where the optical patterns 602 and 604 are disposed.
  • FIG 6C shows both the optical patterns 602 and 604 disposed on the cylindrical surface and the planar end surface of the cylindrical sleeve 606 respectively.
  • two sets of the light source 102 and the optical sensor 104 are positioned to image both the optical patterns 602 and 604.
  • FIG 7 illustrates an exemplary scheme for disposing optical patterns on a surface of an object 700 undergoing linear motion.
  • The object 700, as shown in FIG 7, is in the form of a cuboid. However, the object 700 may have any geometrical form factor so long as there is at least one surface on the object 700 that is amenable to disposing an optical pattern.
  • An optical pattern 702 is imprinted on one of the surfaces of the object 700.
  • FIG 8 illustrates an exemplary scheme for disposing optical patterns on a surface of an object 800 undergoing linear motion.
  • FIG 8 also shows a planar strip 804 mounted on the object 800.
  • An optical pattern 802 is imprinted on the planar strip.
  • The light source 102 and the optical sensor 104 are suitably positioned to image the portion of the surface of the planar strip 804 where the optical pattern 802 is disposed.
  • FIGS 9A to 9E illustrate a schematic representation of various optical patterns 902a, 902b, and 902c, a colour channel 904, a sample pattern 906, and a set of reference patterns 908.
  • FIG 9A shows the optical pattern 902a suitable for deposition on a surface of an object in accordance with exemplary schemes shown in FIGS 5A and 6A.
  • FIG 9B shows the optical pattern 902b suitable for deposition on a surface of an object in accordance with exemplary schemes shown in FIGS 5B and 6B.
  • FIG 9C shows the optical pattern 902c suitable for deposition on a surface of an object in accordance with exemplary schemes shown in FIGS 7 and 8.
  • Each optical pattern 902, as used herein, is a set of optical patterns P1, P2, P3, and so on. Each optical pattern P1 through Pn is distinct from the other optical patterns in the set.
  • FIG 9D shows the colour channel 904 and the sample pattern 906 retrieved from the colour channel 904, as explained in conjunction with FIGS 1, 2, and 3.
  • The sample pattern 906 includes the optical pattern P3.
  • The size of the optical patterns P1 through Pn and the size of the colour channel 904 are so adjusted that each colour channel 904 contains at least one optical pattern.
  • The size of the optical patterns P1 through Pn is the same as the size of the colour channel 904. One possible construction of such a set of distinct patterns is sketched below.
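  • One way to construct a set of mutually distinct patterns, offered purely as an illustration (the patent does not prescribe any particular construction), is to encode each pattern's index into its cells:

```python
import numpy as np

def make_patterns(n: int, size: int = 8):
    """Generate n mutually distinct binary optical patterns of size x size cells."""
    assert n <= 2 ** size, "the index must fit into the first row of cells"
    patterns = []
    for i in range(n):
        p = np.zeros((size, size), dtype=np.uint8)
        bits = [(i >> b) & 1 for b in reversed(range(size))]
        p[0, :] = np.array(bits, dtype=np.uint8) * 255        # index row: unique per i
        p[1:, :] = 255 * ((i + np.arange(size - 1)[:, None]) % 2)  # filler texture
        patterns.append(p)
    return patterns

pats = make_patterns(16)
# The index row guarantees that no two patterns in the set are identical.
assert all(not np.array_equal(a, b)
           for i, a in enumerate(pats) for b in pats[i + 1:])
print(len(pats), "distinct patterns")
```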
  • FIG 9E shows the set of reference patterns 908 that are stored in the memory module 208 as described in conjunction with FIG 2.
  • FIG 10 illustrates a high-level flow diagram corresponding to a method for determining one or more motion parameters associated with an object.
  • a surface of the object is illuminated with light of at least two distinct colours during distinct sampling periods.
  • the distinct sampling periods are such that they lie within an exposure time of an optical sensor.
  • the light reflected from the surface of the object during the distinct sampling periods is captured to generate an image of at least a portion of the surface of the object such that the image includes at least two distinct colour channels capable of distinguishing the at least two distinct colours of the light.
  • a sample pattern is retrieved from each of the at least two distinct colour channels. Each sample pattern is indicative of a position of the object.
  • The one or more motion parameters associated with the object are computed based on the sample patterns. A high-level sketch of these steps follows.
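  • Putting the four steps together, here is a high-level sketch of one tracking cycle; every callable below is a placeholder for the corresponding step described above, not an API defined by the patent:

```python
def determine_motion_parameters(illuminate, capture, split_channels,
                                retrieve_pattern, locate, compute):
    """One cycle of the method of FIG 10, expressed with placeholder callables."""
    flash_times = illuminate()            # step 1: illuminate during sampling periods
    image = capture()                     # step 2: capture one image
    channels = split_channels(image)      # step 3a: isolate the colour channels
    positions = [(t, locate(retrieve_pattern(ch)))   # step 3b: sample patterns
                 for t, ch in zip(flash_times, channels)]
    return compute(positions)             # step 4: compute motion parameters

# Toy demonstration with stand-in callables.
velocity = determine_motion_parameters(
    illuminate=lambda: [0.000, 0.001, 0.002],
    capture=lambda: ["chR", "chG", "chB"],
    split_channels=lambda img: img,
    retrieve_pattern=lambda ch: ch,
    locate=lambda pat: {"chR": 10.0, "chG": 11.0, "chB": 12.0}[pat],
    compute=lambda ps: (ps[-1][1] - ps[0][1]) / (ps[-1][0] - ps[0][0]),
)
print(velocity)  # -> 1000.0 position units per second
```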
  • FIG 11 illustrates a detailed flow diagram corresponding to a method for determining one or more motion parameters associated with the object based on pattern matching.
  • the surface is illuminated with light of at least two distinct colours during distinct sampling periods.
  • the distinct sampling periods are such that they lie within an exposure time of an optical sensor.
  • The light reflected from the surface during the distinct sampling periods is captured to generate an image of at least a portion of the surface of the object such that the image includes at least two distinct colour channels capable of distinguishing the at least two distinct colours of the light.
  • The image is processed to isolate the individual colour channels present in the image.
  • a sample pattern is retrieved from each of the at least two distinct colour channels. Each sample pattern is indicative of a position of the object.
  • the sample pattern is matched to at least a sub-set of the reference patterns selected from a set of reference patterns, wherein each reference pattern uniquely corresponds to a distinct position of the object.
  • the sub-set of reference patterns is selected from the set of reference patterns based on a predictive motion analysis.
  • The one or more motion parameters associated with the object are computed based on the pattern matching between the sample pattern and the sub-set of reference patterns.
  • FIG 12 illustrates a detailed flow diagram corresponding to a method for determining one or more motion parameters associated with the object based on pattern decoding.
  • Steps 1202 through 1208 are the same as steps 1102 through 1108, which were described in conjunction with FIG 11.
  • the sample pattern retrieved from the at least two distinct colour channels includes an encoded symbol which is encoded using an encoding criterion.
  • Each encoded symbol is associated with a distinct position of the object. The encoded symbol is decoded using a decoding criterion and the information, thus obtained, is used to ascertain the position of the object.
  • The one or more motion parameters associated with the object are computed based on the sample patterns.
  • the system and method of the present invention eliminate the requirement of high speed cameras, which are bulky and need external cooling systems, for capturing images of the objects moving at high speeds.
  • the present invention utilizes light of at least two distinct colours to track motion parameters of a fast moving object.
  • As the present invention facilitates the use of low-frame-rate image acquisition systems, the desired levels of image resolution may be achieved without any significant cost overheads.
  • The present invention facilitates use of high-resolution cameras. As the image resolution increases, the accuracy and the precision of the optical encoder are also enhanced.
  • the present invention provides a robust system and method suitable for determining a large range of motion parameters associated with the object.
  • the present invention will be useful for encoder applications in industrial automation and other similar fields.
  • The present invention is equally applicable to implementation as an incremental encoder, an absolute encoder, and a multi-turn encoder. While the present invention has been described in detail with reference to certain embodiments, it should be appreciated that the present invention is not limited to those

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention provides a system and method to illuminate a surface of an object with light of at least two distinct colours during distinct sampling periods within an exposure time of an optical sensor and capture light reflected from the surface of the object to generate an image of at least a portion of the surface of the object such that the image includes at least two distinct colour channels capable of distinguishing the at least two distinct colours of the light. Subsequently, a sample pattern is retrieved from each distinct colour channel. Each sample pattern is indicative of a distinct position of the object. The present invention computes the one or more motion parameters associated with the object based on the information embodied in the sample pattern.

Description

Method and system for determining motion parameters associated with an object
The present invention relates to a method and a system for determining motion parameters associated with an object. Various systems are inherently dependent on technology that is able to determine the position of one or more movable objects. Some examples of such systems include various industrial automation and control systems such as precision instrumentation, positioning tables, robotics, printers, plotters, tape drives, machine automation, medical devices, and so on.
One such method and system is known from US 7,710,553. The aforementioned reference relates to an apparatus that includes an illumination source that illuminates a reference surface including a coded sequence of reference markings, a detector that generates an output indicative of the coded sequence of reference markings from light reflected from the reference surface, and an electronics module receiving output indicative of the coded sequence of reference markings from the detector, processing the received signal to determine a width and a centroid location of individual reference markings in the coded sequence of reference markings, and deriving angle or position information based on the widths and centroid locations of the individual reference markings.
The functioning of the detector, among other factors, depends on the frame rate for capturing the sample frames. This factor, in turn, influences the threshold speed of the object below which the object may be satisfactorily tracked to determine the motion parameters. In order to be able to satisfactorily track fast motion of the object, a high-frame-rate image acquisition system is desirable. Thus, the system and method for determining the motion parameters of an object using optical means necessitate availability of high-frame-rate image acquisition systems for satisfactory performance. This factor considerably increases the cost of such systems.
It is an object of the invention to determine one or more motion parameters associated with an object while obviating the need for expensive high-frame-rate image acquisition systems.
The object of the present invention is achieved by a method according to claim 1 and a system according to claim 10.
Further embodiments of the invention are addressed in the dependent claims.
In lieu of directly increasing the image acquisition frame rate, the method and system first capture and subsequently separate at least two distinct colour channels in each image of an object, corresponding to at least two distinct sampling periods during which a surface of the object is illuminated with light of at least two distinct colours, and thereby effectively multiply the frame rate. Each colour channel, thus separated, includes a sample pattern that is indicative of a position of the object corresponding to the respective sampling period and hence facilitates determination of one or more motion parameters associated with the object. Thus, the present invention facilitates determination of one or more motion parameters associated with an object moving at a high speed through use of light of distinct colours for illuminating the surface.
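As a minimal sketch of this idea (not the patent's implementation; the array shapes, flash timing, and function names are assumptions), the following Python snippet splits one RGB exposure into three colour channels and assigns each channel its own timestamp, turning one physical frame into three effective samples:

```python
import numpy as np

def split_exposure(image_rgb: np.ndarray, flash_times):
    """Split one RGB exposure into per-colour samples with their own timestamps.

    image_rgb   : H x W x 3 array captured during a single exposure in which
                  each channel was illuminated by its own colour flash
    flash_times : time in seconds at which each colour flash fired, relative
                  to the start of the exposure
    """
    assert image_rgb.ndim == 3 and image_rgb.shape[2] == len(flash_times)
    return [(t, image_rgb[:, :, c]) for c, t in enumerate(flash_times)]

# One synthetic 64x64 exposure with red, green and blue flashes 2 ms apart.
rng = np.random.default_rng(0)
exposure = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
for t, frame in split_exposure(exposure, flash_times=[0.000, 0.002, 0.004]):
    print(f"effective frame at t = {t * 1000:.1f} ms, shape {frame.shape}")
```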
In accordance with an embodiment of the present invention, the distinct sampling periods are intervened by an intermediate time delay. The intermediate time delay acts as a separator between the individual sampling periods such that the distinct sampling periods capture distinct portions of the object at varying time instants during its motion in distinct colour channels.
In accordance with another embodiment of the present invention, the one or more motion parameters are computed based on a pattern matching between each sample pattern and at least a sub-set of reference patterns selected from a set of reference patterns, wherein each reference pattern uniquely corresponds to a distinct position of the object. This technical feature enables identification of a reference pattern matching a sample pattern, and thereby facilitates determination of a position of the object.
In accordance with another embodiment of the present invention, the sub-set of reference patterns is selected from the set of reference patterns based on a predictive motion analysis, wherein the predictive motion analysis predicts a range of likely positions of the object corresponding to the image based on one or more motion parameters determined at a preceding time instant. Thus, during the real-time processing, a sample pattern is compared with the sub-set of reference patterns, thus selected, instead of the whole set of reference patterns, and therefore the real-time processing speed of the system and method is improved. These technical features further ensure that the system and method are implementable without requiring an undue amount of computational resources.
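A sketch of how such a predictive pre-selection might look; the constant-velocity model and all names below are illustrative assumptions rather than the patent's prescription:

```python
def candidate_positions(last_pos, last_vel, dt, tolerance, n_positions):
    """Predict a window of plausible position indices for the next sample.

    last_pos    : position index matched at the preceding time instant
    last_vel    : estimated velocity in position indices per second
    dt          : time elapsed since the preceding sample in seconds
    tolerance   : half-width of the search window in indices
    n_positions : total number of reference patterns (distinct positions)
    """
    predicted = last_pos + last_vel * dt               # constant-velocity model
    lo = max(0, int(predicted - tolerance))
    hi = min(n_positions - 1, int(predicted + tolerance))
    return range(lo, hi + 1)        # indices of the sub-set of reference patterns

print(list(candidate_positions(last_pos=120, last_vel=5000.0, dt=0.001,
                               tolerance=3, n_positions=3600)))
# -> [122, 123, 124, 125, 126, 127, 128]
```

Because the sample pattern is subsequently compared only against the few indices in this window, the per-frame matching cost is bounded by the window width rather than by the total number of reference patterns.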
In accordance with another embodiment of the present invention, a set of optical patterns is defined such that each optical pattern is distinct, and the set of optical patterns is disposed on the surface of the object such that each optical pattern is associated with a distinct portion of the surface of the object, and such that each sample pattern comprises at least one optical pattern. The individual optical patterns, thus disposed, are easily identifiable and thus the pattern matching between a sample pattern and a reference pattern is greatly facilitated. Thus, these technical features improve the efficiency of the system and method of the present invention.
In accordance with another embodiment of the present invention, each reference pattern is an image of at least a portion of the surface of the object uniquely corresponding to a distinct position of the object. As each reference pattern is associated with a distinct position of the object, it facilitates retrieving the position information from the sample pattern through pattern matching.
In accordance with another embodiment of the present invention, each optical pattern is an encoded symbol generated based on an encoding criterion, and the one or more motion parameters associated with the object are computed through decoding the encoded symbol extracted from each sample pattern based on a decoding criterion. Thus, the position information may be determined from the sample pattern through pattern decoding, without the necessity of creating reference patterns and pattern matching between a sample pattern and multiple reference patterns.
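As an illustrative sketch of this decoding variant (the patent mentions, e.g., linear and matrix bar-codes; the simple binary strip used below is an assumption), each position index can be encoded into the pattern and read back directly, with no reference-pattern search:

```python
import numpy as np

N_BITS = 10  # enough for 1024 distinct positions (assumed resolution)

def encode_position(index: int) -> np.ndarray:
    """Encode a position index as a one-dimensional strip of black/white cells."""
    bits = [(index >> b) & 1 for b in reversed(range(N_BITS))]
    return np.array(bits, dtype=np.uint8) * 255

def decode_position(cells: np.ndarray) -> int:
    """Decode the strip back to a position index, thresholding at mid-grey."""
    bits = (cells > 127).astype(int)
    return int("".join(map(str, bits)), 2)

symbol = encode_position(613)
print(decode_position(symbol))  # -> 613, recovered without any reference search
```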
The present invention is further described hereinafter with reference to illustrated embodiments shown in the accompanying drawings, in which:

FIG 1 illustrates a system for determining one or more motion parameters associated with an object,
FIG 2 illustrates a schematic representation of an image processor based on pattern matching,
FIG 3 illustrates a schematic representation of an image processor based on pattern decoding,
FIG 4 illustrates an exemplary timing diagram for illuminating a surface of the object with the light of distinct colours and capturing the light reflected from the surface of the object,
FIGS 5A-5C illustrate various exemplary schemes for disposing a set of optical patterns on a surface of an object undergoing rotary motion,
FIGS 6A-6C illustrate various exemplary schemes for disposing a set of optical patterns on a surface of an object undergoing rotary motion,
FIG 7 illustrates various exemplary schemes for disposing a set of optical patterns on a surface of an object undergoing linear motion,
FIG 8 illustrates various exemplary schemes for disposing a set of optical patterns on a surface of an object undergoing linear motion,
FIGS 9A-9E illustrate a schematic representation of various sets of optical patterns, a colour channel, a sample pattern, and a set of reference patterns,
FIG 10 illustrates a high-level flow diagram corresponding to a method for determining one or more motion parameters associated with an object,
FIG 11 illustrates a detailed flow diagram corresponding to a method for determining one or more motion parameters associated with the object based on pattern matching, and
FIG 12 illustrates a detailed flow diagram corresponding to a method for determining one or more motion parameters associated with the object based on pattern decoding.

Various embodiments are described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purpose of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more embodiments. It may be evident that such embodiments may be practiced without these specific details.
FIG 1 illustrates a schematic representation of a system 100 for determining one or more motion parameters associated with an object. The system 100 comprises a light source 102 to illuminate at least a portion of a surface of the object, an optical sensor 104 to capture light reflected from the surface and generate an image, and an image processor 106 to compute one or more motion parameters associated with the object.
The light source 102 illuminates the surface with light of at least two distinct colours during distinct sampling periods. The optical sensor 104 captures light reflected from the surface to generate an image of the surface. The image includes at least two distinct colour channels capable of distinguishing the at least two distinct colours of the light. The image processor 106 is adapted for retrieving a sample pattern from each of the at least two distinct colour channels. Each sample pattern is indicative of a position of the object. The image processor 106 is further adapted for computing the one or more motion parameters based on the sample patterns.
The light source 102 is capable of producing light of at least two distinct colours. In various embodiments of the present invention, any suitable light source known in the art may be used. Various examples include, but are not limited to, gas-discharge based devices such as xenon flash devices, and light emitting diodes (LEDs) such as high-current flash LEDs. In accordance with an exemplary embodiment of the present invention, the light source 102 includes multiple coloured LEDs to produce light of distinct colours. In accordance with an alternative exemplary embodiment of the present invention, the light source 102 includes a white-coloured LED in combination with multiple wavelength conversion materials, such as phosphors with different excitation and emission wavelengths, such that the colour of light produced from the light source 102 may be adjusted.
The light source 102 is such that each colour channel corresponding to the respective distinct colour light is easily distinguishable from the remaining colour channels in the image. This is achieved by selecting a light source in which each distinct colour is one of a set of primary colours including, but not limited to, well-known additive primary colour sets such as the RGB (Red, Green, Blue) set and subtractive primary colour sets such as the RYB (Red, Yellow, Blue) set. Thus, in one example, the at least two distinct colours produced by the light source 102 are selected from the set of primary colours including red, green and blue colours. The present invention shall be explained hereinafter with reference to the light source 102 producing red, blue and green colours. However, it should be noted that the aforementioned colour sets have been specified by way of example and should not be construed to limit the present invention. It will be apparent that the present invention shall work satisfactorily so long as the colour channels in the image are easily distinguishable. Moreover, it is possible to implement the present invention using a light source producing light of two distinct colours instead of three distinct colours.
The light of each colour is produced during distinct sampling periods. The distinct sampling periods are intervened by a predefined time delay such that the sampling periods capture the surface at varying time instants in distinct colour channels. In accordance with an embodiment of the present invention, the distinct sampling periods are initiated based on predefined time delays with respect to a reference time. This facilitates accurate timing of the distinct sampling periods during which the surface of the object is illuminated using light of distinct colours and the light reflected from the surface is captured.
The surface of the object reflects the light projected by the light source 102. For example, a surface illuminated using the light source 102 producing light of three distinct colours (red, green, and blue) during distinct sampling periods will reflect red light, green light, and blue light during the respective sampling periods.
The optical sensor 104 captures the light reflected from the surface during each of the sampling periods in which the light source 102 produces light of distinct colours to generate one image of at least a portion of the surface of the object. The image, thus generated, has at least two distinct colour channels capable of distinguishing the at least two distinct colours of the light. The number of colour channels in an image is equivalent to the number of distinct colours illuminating the surface. Thus, illuminating the surface with light of distinct colours and exposing the optical sensor 104 for the entire duration enable capturing the surface at varying time instants in distinct colour channels in one image.
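To make the point that one exposure carries several time instants concrete, here is a toy simulation under assumed parameters (a one-dimensional sensor and an idealised moving bright feature, not the patent's optics). Each colour flash freezes the feature at a different location, and the single accumulated image holds all three locations in separate channels:

```python
import numpy as np

WIDTH, SPEED = 100, 20000              # sensor width in px, feature speed in px/s
flash_times = [0.000, 0.001, 0.002]    # red, green, blue flashes in one exposure

exposure = np.zeros((WIDTH, 3), dtype=np.uint8)  # 1-D "image" with 3 channels
for channel, t in enumerate(flash_times):
    x = int(SPEED * t) % WIDTH         # where the moving feature is at flash time
    exposure[x, channel] = 255         # this flash freezes it in this channel only

# One readout, three time instants: each channel holds the feature elsewhere.
for channel, t in enumerate(flash_times):
    print(f"channel {channel}: feature at x = {int(np.argmax(exposure[:, channel]))}"
          f" (flash at {t * 1000:.0f} ms)")
```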
It is in order to mention that although the sampling period refers to a time interval during which the surface of the object is illuminated with light of a distinct colour and the light reflected from the surface of the object is captured, corresponding to one colour channel in an image, the information embodied in the colour channel itself is representative of a time instant during the motion of the object.

The image processor 106 is adapted for retrieving the sample pattern, which is indicative of a position of the object, from each of the at least two distinct colour channels, and computing the one or more motion parameters based on the sample patterns. The image processor 106 processes the image to isolate individual colour channels in the image.
Subsequently, the image processor 106 processes the individual colour channels to retrieve a sample pattern from each of the individual colour channels. The image processor 106 uses the sample patterns, thus retrieved, to compute the motion parameters associated with the object. The technical features and functioning of the image processor 106 will be explained in more detail in conjunction with FIGS 2 and 3.
The present invention, as described herein, is suitable for determining the motion parameters of an object that is in relative motion with respect to the system 100, in particular, the optical sensor 104. In various exemplary embodiments of the present invention, the system 100 is static while the object is in motion. Alternatively, the object is static while the system 100 is in motion.
It is imperative to mention that the present invention is equally applicable to various possible kinds of motion of the object relative to the system 100 for determining motion parameters including, but not limited to, rectilinear motion, rotational motion, and a combination thereof. Accordingly, the present invention facilitates implementation of linear encoders used for computation of motion parameters related to linear motion, such as linear position, linear displacement, linear velocity, and linear acceleration, as well as rotary encoders used for computation of motion parameters related to rotary motion, such as rotational position, rotational displacement, rotational velocity, and rotational acceleration; and a combination thereof.

FIG 2 illustrates a schematic representation of the image processor 106 based on pattern matching. The image processor 106 includes a colour-plane slicing module 202 to isolate individual colour channels in an image generated by the optical sensor 104, a pattern matching module 204 to match each sample pattern retrieved from the distinct colour channels with at least a sub-set of the reference patterns selected from a set of reference patterns, a predictive motion analysis module 206 to select the sub-set of reference patterns from the set of reference patterns, a memory module 208 to store the set of reference patterns, and a motion computation module 210 to compute the motion parameters associated with the object. In accordance with an embodiment of the present invention, the system 100 can operate in two distinct modes of operation, namely, a first mode of operation, which is referred to as a calibration mode of operation, and a second mode of operation, which is referred to as a tracking mode of operation.
During the calibration mode of operation, the object is static. The light source 102 illuminates a surface of the object during a sampling period. The optical sensor 104 captures the light reflected from the surface of the object during the sampling period to generate an image of at least a portion of the surface of the object. The image processor 106 retrieves a reference pattern from the image, associates the reference pattern with the current position of the object, and stores the reference pattern along with information corresponding to the current position of the object. Thus, a reference pattern corresponding to the current position of the object is generated. The object is then subjected to a predefined quantum of displacement, linear or angular as the case may be, and the steps for generating the reference pattern are repeated; thus, a reference pattern for the new position is generated. This process of subjecting the object to the predefined quantum of displacement and generating the reference pattern corresponding to the current position is iteratively performed for the entire range of distinct positions of the object. Thus, a set of reference patterns is created during the calibration mode of operation. The set of reference patterns is stored in the memory module 208. A sketch of this loop follows.
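The following sketch shows the calibration loop, with the stage-control and image-capture functions left as placeholders for whatever hardware interface is actually used (all names here are assumptions):

```python
import numpy as np

def calibrate(n_positions: int, step: float, move_to, capture):
    """Record one reference pattern per distinct position of the object.

    n_positions : number of distinct positions to record
    step        : predefined quantum of displacement, linear or angular
    move_to     : callable that drives the object to an absolute position
    capture     : callable that returns an image of the illuminated surface
    """
    references = {}
    for i in range(n_positions):
        move_to(i * step)             # displace by the predefined quantum
        references[i] = capture()     # reference pattern for this position
    return references                 # to be stored in the memory module

# Toy stand-ins for the hardware: a fake stage and a fake camera.
state = {"pos": 0.0}
refs = calibrate(n_positions=8, step=0.5,
                 move_to=lambda p: state.update(pos=p),
                 capture=lambda: np.full((4, 4), state["pos"]))
print(len(refs), "reference patterns recorded")
```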
It should be noted that the size of the predefined quantum of displacement is based on the desired level of resolution with which the position of the object may be determined. Further, several variations are possible with regard to the light used for illuminating the object. For example, it is possible to use white light for the generation of the reference patterns.
Alternatively, it is possible to use light of one distinct colour. Further, although the calibration mode of operation is intended to generate only one set of reference patterns, it is possible to create separate sets of reference patterns for each colour channel. All such variations are intended to be well within the scope of the present invention.

During the tracking mode of operation, the object is dynamic and undergoes motion within a specified range of operable speeds depending on the function of the object. The light source 102 and the optical sensor 104 operate as explained in conjunction with FIG 1. Thus, the image generated by the optical sensor 104 includes at least two distinct colour channels.
The colour-plane slicing module 202 processes the image including two or more colour channels to isolate the
individual colour channels present in the image.
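For illustration, a colour-plane slicing step of the kind performed by the module 202 can be written in a few lines of NumPy; the H x W x 3 RGB layout of the image array is an assumption made for the sketch, since the disclosure only requires at least two distinct colour channels.

```python
import numpy as np

def slice_colour_planes(image):
    """Isolate the individual colour channels of an H x W x C image.

    Each returned plane carries the scene as it was illuminated during
    one sampling period, i.e. by one colour of the light source 102.
    """
    # Assumption: channels are stacked along the last axis (e.g. RGB).
    return [image[:, :, c] for c in range(image.shape[2])]

# Usage on a dummy 8-bit image with three colour channels:
frame = np.zeros((480, 640, 3), dtype=np.uint8)
red_plane, green_plane, blue_plane = slice_colour_planes(frame)
```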
As explained earlier, during the calibration mode of
operation, a set of reference patterns is generated and stored in the memory module 208.
The predictive motion analysis module 206 maintains motion-related data corresponding to expected values of one or more motion parameters associated with the object. In accordance with various embodiments of the present invention, the motion-related data may be pre-recorded in the predictive motion analysis module 206, or it may be fetched from the motion computation module 210 during the initial stages of the tracking mode of operation. Further, the motion-related data stored in the predictive motion analysis module 206 may be updated at regular intervals during the tracking mode of operation. The predictive motion analysis module 206 predicts a range of likely positions of the object during a given sampling period in accordance with the motion-related data corresponding to one or more preceding sampling periods. For example, the information related to the position and velocity of the object during a preceding sampling period, together with the time elapsed between the preceding sampling period and the current sampling period, may be used to predict the range of likely positions of the object during the current sampling period. In accordance with the predicted range of likely positions of the object, a sub-set of reference patterns is selected from the set of reference patterns in the memory module 208. It is evident that the reference patterns included in the sub-set of reference patterns are continually changed during the motion of the object.

The pattern matching module 204 retrieves a sample pattern from each individual colour channel and matches the sample pattern against the sub-set of reference patterns provided by the predictive motion analysis module 206. The reference pattern providing the closest match to the sample pattern is
identified, and the position of the object corresponding to that reference pattern is taken as the position of the object during the sampling period corresponding to the sample pattern. The pattern matching module 204 provides the information related to the position of the object, as well as the corresponding time instants, to the motion computation module 210. The pattern matching module 204 matches the sample pattern with one or more reference patterns in the sub-set of reference patterns by generating a match-measuring metric between the sample pattern and the reference pattern. The match-measuring metric provides an indication of the extent of match between the sample and the reference patterns.
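The interplay of the modules 204 and 206 may be sketched as follows, reusing the position-keyed dictionary of reference patterns from the calibration sketch above. The constant-velocity prediction and the tolerance window are illustrative assumptions; the disclosure leaves the exact prediction model open.

```python
def select_subset(reference_patterns, prev_position, prev_velocity,
                  elapsed_time, window):
    """Module 206 (sketch): predict likely positions and pick candidates.

    Assumption: a constant-velocity model over one sampling interval.
    """
    predicted = prev_position + prev_velocity * elapsed_time
    return {pos: pat for pos, pat in reference_patterns.items()
            if abs(pos - predicted) <= window}

def best_match(sample_pattern, subset, metric):
    """Module 204 (sketch): position whose reference pattern is closest.

    metric(sample, reference) must return a value that is smaller for
    better matches, e.g. the SAD metric sketched further below.
    """
    return min(subset, key=lambda pos: metric(sample_pattern, subset[pos]))
```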
The pattern matching module 204 may use any suitable match-measuring metric known in the art. The match-measuring metrics that may be used in the present invention broadly correspond to two distinct approaches, namely, a feature-based approach and a template-based approach.
The feature-based approach is based on matching one or more features present in the sample and the reference patterns. The spatial features that may be used for generating a feature-based metric include corners, edges, blobs, and ridges.
The template-based approach uses metrics based on sum values, such as the Sum of Absolute Differences (SAD), the Sum of Absolute Transformed Differences (SATD), and the Sum of Squared Deviations (SSD). The corresponding mean-value-based metrics, namely, the Mean Absolute Difference (MAD), the Mean of Absolute Transformed Differences (MATD), and the Mean Squared Deviation (MSD), may also be used. In addition, the template-based approach may use a correlation-based metric. These metrics are well known in the art and a detailed description of them is not included for the sake of brevity.
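Three of the named template-based metrics are simple enough to state directly; the NumPy formulation below is one possible implementation, with a signed-integer cast added only to avoid wraparound on unsigned image data.

```python
import numpy as np

def sad(sample, reference):
    """Sum of Absolute Differences; smaller values mean closer matches."""
    diff = sample.astype(np.int64) - reference.astype(np.int64)
    return np.abs(diff).sum()

def mad(sample, reference):
    """Mean Absolute Difference: SAD normalised by the pattern size."""
    return sad(sample, reference) / sample.size

def ssd(sample, reference):
    """Sum of Squared Deviations."""
    diff = sample.astype(np.int64) - reference.astype(np.int64)
    return (diff * diff).sum()
```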
In order to improve accuracy, two or more match-measuring metrics may be combined. The reference pattern that provides the closest match based on the combination of match-measuring metrics is then selected for subsequent processing. The motion computation module 210 receives the position of the object, as well as the corresponding time instants, from the pattern matching module 204 and computes the various motion parameters associated with the object based on the position and time information.
The distance and the displacement of the object are computed based on a first position of the object and a second position of the object. The speed and the velocity are computed based on a ratio of the distance and the corresponding time period, and a ratio of the displacement and the corresponding time period, respectively.
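The computations performed by the module 210 thus reduce to differences and ratios of the position and time values delivered by the pattern matching stage; a minimal sketch, assuming scalar (linear or angular) positions and timestamps in seconds:

```python
def motion_parameters(p1, t1, p2, t2):
    """Displacement and velocity from two (position, time) pairs.

    Distance and speed are the magnitudes of the returned values; a
    third pair would likewise yield acceleration from two velocities.
    """
    displacement = p2 - p1
    velocity = displacement / (t2 - t1)
    return displacement, velocity

# Example: positions 1.0 mm and 1.5 mm observed 2 ms apart -> 250 mm/s
d, v = motion_parameters(1.0, 0.000, 1.5, 0.002)
```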
In accordance with an embodiment of the present invention, a set of optical patterns is defined such that each optical pattern is distinct and the set of optical patterns is disposed on the surface of the object such that each optical pattern is associated with a distinct portion of the surface of the object. Each reference pattern generated during the calibration mode of operation includes at least one optical pattern. Similarly, each sample pattern generated during the tracking mode of operation includes at least one optical pattern. Thus, the ease, efficiency, and accuracy of pattern matching between the sample and the reference patterns are significantly improved.
FIG 3 illustrates a schematic representation of the image processor 106 based on pattern decoding. The image processor 106 includes a colour-plane slicing module 302 to isolate individual colour channels in an image generated by the optical sensor 104, a pattern decoding module 304 to decode each sample pattern retrieved from the distinct colour channels in accordance with a decoding criterion, a memory module 306 to store the decoding criterion, and a motion computation module 308 to compute the motion parameters associated with the object. A set of optical patterns is defined such that each optical pattern is a distinct encoded symbol generated based on an encoding criterion. The set of optical patterns is disposed on the surface of the object such that each optical pattern is associated with a distinct portion of the surface of the object. Thus, each encoded symbol is associated with a distinct position of the object. The optical patterns are such that each sample pattern includes at least one optical pattern. The decoding criterion corresponding to the encoding criterion used for generation of encoded symbols is stored in the memory module 306.
The light source 102 and the optical sensor 104 operate as explained in conjunction with FIG 1. Thus, the image
generated by the optical sensor 104 includes at least two distinct colour channels.
The colour-plane slicing module 302 is similar to the colour-plane slicing module 202, and processes the image including two or more colour channels to isolate the individual colour channels present in the image.
As stated earlier, each optical pattern disposed on the surface of the object is an encoded symbol. The pattern decoding module 304 retrieves a sample pattern from each individual colour channel and decodes an encoded symbol included therein, based on the decoding criterion stored in the memory module 306 to obtain information related to the position of the object.
The encoding and decoding are based on any suitable encoding-decoding technique known in the art. Examples of such encoding-decoding techniques include, but are not limited to, various types of linear bar-codes and matrix bar-codes.
Each bar-code is a unique encoded symbol and is associated with a distinct position of the object. Thus, the pattern decoding module 304 determines the information related to the position of the object. The information related to the position of the object, as well as the corresponding time instants, is provided to the motion computation module 308. The motion computation module 308 is similar to the motion computation module 210; it receives the position of the object, as well as the corresponding time instants, from the pattern decoding module 304 and computes the various motion parameters associated with the object based on the position and time information in the manner described in conjunction with FIG 2.
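Since each encoded symbol maps one-to-one to a position, the module 304 reduces, in sketch form, to a decode followed by a table look-up. The decode_symbol routine below is a hypothetical stand-in for whichever bar-code symbology is chosen; no specific decoder is mandated by the disclosure.

```python
def decode_position(sample_pattern, decode_symbol, symbol_to_position):
    """Sketch of module 304: decode a symbol and map it to a position.

    decode_symbol      -- hypothetical decoder for the chosen symbology
    symbol_to_position -- decoding criterion held in the memory module 306
    """
    symbol = decode_symbol(sample_pattern)
    return symbol_to_position[symbol]
```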
FIG 4 illustrates an exemplary timing diagram for
illuminating a surface of the object with light of distinct colours and capturing the light reflected from the surface of the object. FIG 4 shows a timing signal 402, which provides a reference time; a timing signal 404, which triggers the light source 102 to produce light of a first colour; a timing signal 406, which triggers the light source 102 to produce light of a second colour; a timing signal 408, which triggers the light source 102 to produce light of a third colour; and a timing signal 410, which triggers the optical sensor 104 to capture light reflected from the surface. The timing diagram illustrated is for the light source 102 producing light of three distinct colours.
As mentioned earlier, the timing signal 402 provides a reference time to control the remaining timing signals shown in FIG 4. The timing signals 404, 406, and 408 are such that the light source is triggered to produce light of the first, the second, and the third colours at time delays of T1, T2, and T3, respectively, from the reference time provided by the timing signal 402. The sampling periods during which light of the first, the second, and the third colours is produced correspond to on-times S1, S2, and S3, respectively.
The timing signal 410 is such that the optical sensor 104 is triggered to capture light reflected from the surface starting from the reference time provided by the timing signal 402. The sampling period during which light reflected from the surface of the object is captured corresponds to on-time E. Thus, in effect, the timing signal 410 regulates the exposure time of the optical sensor 104.
As evident from the figure, the timing signals 404 through 410 are controlled such that the on-times S1, S2, and S3 lie necessarily within the on-time E. This ensures that the light reflected from the surface in response to illumination by light of distinct colours produced by the light source 102 is captured within one exposure time of the optical sensor 104, such that the image includes distinct colour channels that comprise spatial and temporal information related to the moving surface at varying time instants.
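The containment constraint on the timing signals can be stated mechanically; in the sketch below each flash is represented, by assumption, as a (delay, on-time) pair measured from the reference time of the signal 402.

```python
def flashes_within_exposure(flashes, exposure_on_time):
    """Check that every flash (T_i, S_i) lies inside the exposure E."""
    return all(delay >= 0.0 and delay + on_time <= exposure_on_time
               for delay, on_time in flashes)

# Three flashes at T1, T2, T3 with on-times S1, S2, S3 (milliseconds),
# all captured within a single 10 ms exposure:
assert flashes_within_exposure([(1.0, 0.2), (4.0, 0.2), (7.0, 0.2)], 10.0)
```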
It should be noted that the duration of illumination by the light source 102 during each sampling period is sufficiently short that the surface features, as captured by the optical sensor 104, are not blurred in the resulting image. Thus, in effect, the illumination by the light source 102 is equivalent to a light flash used in optical imaging systems. In order to ensure improved performance of the present invention, it is preferable to prevent light from any other source from illuminating the parts of the surface being sampled by the system of the present invention, lest it distort the image being captured by the optical sensor 104.
In accordance with another embodiment of the present
invention, the timing signals 404, 406, and 408 are such that the light source is triggered to produce light of the first and the second colours at an intervening time delay of D1, and light of the second and the third colours at an intervening time delay of D2, independent of any reference time.
In accordance with various embodiments of the present
invention, for a specified exposure time of the optical sensor 104 (on-time E), the time delays T1, T2, and T3, or the intervening time delays D1 and D2, are adjusted based on the desired operable range of speeds of the object for determining motion parameters in accordance with the field of application. As the maximum value of the velocity to be determined increases, the time delays T1, T2, and T3 are adjusted such that the intervening time delays D1 and D2 are reduced, and vice versa. Thus, for determining motion parameters associated with relatively faster moving surfaces, the intervening delays are decreased, and for determining motion parameters associated with relatively slower moving surfaces, the delays are increased.
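One simple way to translate this trade-off into numbers, assuming the surface should traverse no more than a fraction of one optical-pattern pitch between consecutive flashes, is sketched below; the pitch-based bound is an illustrative assumption, not a rule stated in the disclosure.

```python
def intervening_delay(max_speed, pattern_pitch, fraction=0.5):
    """Pick D1 (and D2) so the surface moves at most `fraction` of one
    optical-pattern pitch between consecutive flashes.

    max_speed     -- highest operable speed of the surface (units/s)
    pattern_pitch -- spacing of the optical patterns (same length units)
    """
    return fraction * pattern_pitch / max_speed

# Faster surfaces get shorter delays:
# 1 mm pitch at 2000 mm/s -> 0.00025 s, i.e. 0.25 ms
delay = intervening_delay(max_speed=2000.0, pattern_pitch=1.0)
```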
FIGS 5A to 5C illustrate various exemplary schemes for disposing a set of optical patterns on a surface of an object 500 undergoing rotary motion. FIGS 5A to 5C show the object 500 undergoing rotary motion about an axis A-A'. The object 500, as shown in FIGS 5A to 5C, is a cylindrical shaft. FIG 5A shows an optical pattern 502 imprinted on the cylindrical surface of the object 500. Similarly, FIG 5B shows an optical pattern 504 imprinted on the planar end surface of the object 500 perpendicular to the axis A-A'. In the embodiments shown in FIG 5A and FIG 5B, the light source 102 and the optical sensor 104 are suitably positioned to image the portion of the surface of the object 500 where the optical patterns 502 and 504 are disposed. FIG 5C shows both optical patterns 502 and 504 disposed on the cylindrical surface and the planar end surface of the object 500, respectively. In this embodiment, two sets of the light source 102 and the optical sensor 104 are positioned to image both optical patterns 502 and 504. The results obtained through the two sets of the light source 102 and the optical sensor 104 may be triangulated for improved accuracy.
Alternatively, one set may be used as a fail-safe provision against the possible failure of the other set.

FIGS 6A to 6C illustrate exemplary schemes for disposing optical patterns on a surface of an object 600 undergoing rotary motion. FIGS 6A to 6C show the object 600 undergoing rotary motion about an axis A-A'. The object 600, as shown in FIGS 6A to 6C, is a cylindrical shaft. FIGS 6A to 6C also show a cylindrical sleeve 606 mounted on the object 600. It is also possible to manufacture the object 600 such that the cylindrical sleeve 606 is an intrinsic part of the object design instead of being externally mounted on the object. FIG 6A shows an optical pattern 602 imprinted on the cylindrical surface of the cylindrical sleeve 606. Similarly, FIG 6B shows an optical pattern 604 imprinted on the planar end surface of the cylindrical sleeve 606 perpendicular to the axis A-A'. In the embodiments shown in FIG 6A and FIG 6B, the light source 102 and the optical sensor 104 are suitably positioned to image the portion of the surface of the cylindrical sleeve 606 where the optical patterns 602 and 604 are disposed. FIG 6C shows both optical patterns 602 and 604 disposed on the cylindrical surface and the planar end surface of the cylindrical sleeve 606, respectively. In this embodiment, two sets of the light source 102 and the optical sensor 104 are positioned to image both optical patterns 602 and 604.

FIG 7 illustrates an exemplary scheme for disposing optical patterns on a surface of an object 700 undergoing linear motion. The object 700, as shown in FIG 7, is in the form of a cuboid. However, the object 700 may have any geometrical form factor so long as there is at least one surface on the object 700 that is amenable to disposing an optical pattern. An optical pattern 702 is imprinted on one of the surfaces of the object 700.
FIG 8 illustrates an exemplary scheme for disposing optical patterns on a surface of an object 800 undergoing linear motion. FIG 8 also shows a planar strip 804 mounted on the object 800. An optical pattern 802 is imprinted on the planar strip 804. The light source 102 and the optical sensor 104 are suitably positioned to image the portion of the surface of the planar strip 804 where the optical pattern 802 is disposed.
FIGS 9A to 9E illustrate a schematic representation of various optical patterns 902a, 902b, and 902c, a colour channel 904, a sample pattern 906, and a set of reference patterns 908. FIG 9A shows the optical pattern 902a suitable for deposition on a surface of an object in accordance with the exemplary schemes shown in FIGS 5A and 6A. FIG 9B shows the optical pattern 902b suitable for deposition on a surface of an object in accordance with the exemplary schemes shown in FIGS 5B and 6B. FIG 9C shows the optical pattern 902c suitable for deposition on a surface of an object in accordance with the exemplary schemes shown in FIGS 7 and 8. Each optical pattern 902, as used herein, is a set of optical patterns P1, P2, P3, and so on. Each optical pattern P1 through Pn is distinct from the other optical patterns in the set.
FIG 9D shows the colour channel 904 and the sample pattern 906 retrieved from the colour channel 904, as explained in conjunction with FIGS 1, 2, and 3. As shown in FIG 9D, the sample pattern 906 includes the optical pattern P3. The size of the optical patterns P1 through Pn and the size of the colour channel 904 are so adjusted that each colour channel 904 contains at least one optical pattern. In a limiting case, the size of the optical patterns P1 through Pn is the same as the size of the colour channel 904.
FIG 9E shows the set of reference patterns 908 that are stored in the memory module 208 as described in conjunction with FIG 2.
FIG 10 illustrates a high-level flow diagram corresponding to a method for determining one or more motion parameters associated with an object.
At step 1002, a surface of the object is illuminated with light of at least two distinct colours during distinct sampling periods. The distinct sampling periods are such that they lie within an exposure time of an optical sensor.
At step 1004, the light reflected from the surface of the object during the distinct sampling periods is captured to generate an image of at least a portion of the surface of the object such that the image includes at least two distinct colour channels capable of distinguishing the at least two distinct colours of the light.
At step 1006, a sample pattern is retrieved from each of the at least two distinct colour channels. Each sample pattern is indicative of a position of the object. At step 1008, the one or more motion parameters associated with the object are computed based on the sample pattern.
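Steps 1002 through 1008 compose into a single tracking-mode iteration. The sketch below reuses the hypothetical helpers introduced earlier and is only one of many possible arrangements; in particular, estimate_position may be realised either by pattern matching (FIG 11) or by pattern decoding (FIG 12).

```python
def tracking_step(capture_image, slice_colour_planes, retrieve_pattern,
                  estimate_position, timestamps):
    """One iteration of steps 1002 to 1008, returning a velocity.

    timestamps -- the sampling instants of the colour channels, i.e. the
                  flash times within the single exposure (assumption).
    """
    image = capture_image()                       # steps 1002 and 1004
    planes = slice_colour_planes(image)           # isolate colour channels
    positions = [estimate_position(retrieve_pattern(plane))
                 for plane in planes]             # step 1006
    # Each plane corresponds to one sampling period, so pairing positions
    # with their timestamps yields the motion parameters (step 1008).
    (p1, t1), (p2, t2) = list(zip(positions, timestamps))[:2]
    return (p2 - p1) / (t2 - t1)
```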
FIG 11 illustrates a detailed flow diagram corresponding to a method for determining one or more motion parameters
associated with the object based on pattern matching.
At step 1102, the surface is illuminated with light of at least two distinct colours during distinct sampling periods. The distinct sampling periods are such that they lie within an exposure time of an optical sensor.
At step 1104, the light reflected from the surface during the distinct sampling periods is captured to generate an image of at least a portion of the surface of the object such that the image includes at least two distinct colour channels capable of distinguishing the at least two distinct colours of the light.
At step 1106, the image is processed to isolate the
individual colour channels in the image. At step 1108, a sample pattern is retrieved from each of the at least two distinct colour channels. Each sample pattern is indicative of a position of the object. At step 1110, the sample pattern is matched to at least a sub-set of the reference patterns selected from a set of reference patterns, wherein each reference pattern uniquely corresponds to a distinct position of the object. The sub-set of reference patterns is selected from the set of reference patterns based on a predictive motion analysis. The
predictive motion analysis provides a range of likely
positions of the object corresponding to the image based on one or more motion parameters determined at a preceding time instant. The technical features related to pattern matching between the sample pattern and the sub-set of reference patterns have been explained in more detail in conjunction with FIG 2.
At step 1112, the one or more motion parameters associated with the object are computed based on the pattern matching between the sample pattern and the sub-set of reference patterns.
FIG 12 illustrates a detailed flow diagram corresponding to a method for determining one or more motion parameters
associated with the object based on pattern decoding.
Steps 1202 through 1208 are the same as steps 1102 through 1108, described in conjunction with FIG 11.
The sample pattern retrieved from the at least two distinct colour channels includes an encoded symbol which is encoded using an encoding criterion. Each encoded symbol is
associated with a distinct position of the object.
At step 1210, the encoded symbol is decoded using a decoding criterion and the information thus obtained is used to ascertain the position of the object.

At step 1212, the one or more motion parameters associated with the object are computed based on the sample pattern decoding.
Thus, the embodiments of the present invention, as described herein, enable determining motion parameters such as
displacement, velocity, and acceleration associated with an object using a relatively inexpensive system.
The system and method of the present invention eliminate the requirement of high-speed cameras, which are bulky and need external cooling systems, for capturing images of objects moving at high speeds. In lieu of increasing the image acquisition frame rate, the present invention utilizes light of at least two distinct colours to track the motion parameters of a fast moving object.
Owing to the fact that the present invention facilitates the use of low-frame-rate image acquisition systems, the desired levels of image resolution may be achieved without any significant cost overheads. Thus, the present invention facilitates the use of high-resolution cameras. As the image resolution increases, the accuracy and the precision of the optical encoder are also enhanced.
As the intervening delays between sampling periods
corresponding to an image are easily regulated, the present invention provides a robust system and method suitable for determining a large range of motion parameters associated with the object.
The present invention will be useful for encoder applications in industrial automation and other similar fields. The present invention is equally applicable to implementation as an incremental encoder, an absolute encoder, and a multi-turn encoder.

While the present invention has been described in detail with reference to certain embodiments, it should be appreciated that the present invention is not limited to those embodiments. In view of the present disclosure, many modifications and variations would present themselves to those of skill in the art without departing from the scope and spirit of this invention. The scope of the present invention is, therefore, indicated by the following claims rather than by the foregoing description. All changes, modifications, and variations coming within the meaning and range of equivalency of the claims are to be considered within their scope.

Claims
1. A method for determining one or more motion parameters associated with an object comprising:
- illuminating a surface of the object with light of at least two distinct colours during distinct sampling periods,
- capturing light reflected from the surface to generate an image of at least a portion of the surface of the object, the image comprising at least two distinct colour channels capable of distinguishing the at least two distinct colours of the light,
- retrieving a sample pattern from each of the at least two distinct colour channels, wherein each sample pattern is indicative of a position of the object, and
- computing the one or more motion parameters based on the sample patterns.
2. The method according to claim 1, wherein the distinct sampling periods are intervened by an intermediate time delay, wherein the intermediate time delay is determined based on a specified range of speeds of the object.
3. The method according to claim 1 or 2, wherein the one or more motion parameters are computed based on a pattern matching between each sample pattern and at least a sub-set of reference patterns selected from a set of reference patterns, wherein each reference pattern uniquely corresponds to a distinct position of the object.
4. The method according to claim 3, wherein the sub-set of reference patterns is selected from the set of reference patterns based on a predictive motion analysis, wherein the predictive motion analysis predicts a range of likely
positions of the object corresponding to the image based on one or more motion parameters determined at a preceding time instant.
5. The method according to claim 3 or 4, further comprising
- defining a set of optical patterns, wherein each optical pattern is distinct; and
- disposing the set of optical patterns on the surface of the object such that each optical pattern is associated with a distinct portion of the surface of the object, and such that each reference pattern and each sample pattern comprise at least one optical pattern.
6. The method according to any of claims 3 to 5 further comprising recording the set of reference patterns, wherein each reference pattern is an image of at least a portion of the surface of the object uniquely corresponding to a
distinct position of the object.
7. The method according to claim 5, wherein each optical pattern is an encoded symbol generated based on an encoding criterion, and further wherein the one or more motion
parameters associated with the object are computed through decoding the encoded symbol extracted from each sample pattern based on a decoding criterion.
8. The method according to any of claims 1 to 7, wherein the object is undergoing linear motion, whereby the one or more motion parameters comprise at least one of a linear position, a linear displacement, a linear velocity, and a linear acceleration of the object.
9. The method according to any of claims 1 to 7, wherein the object is undergoing rotary motion, whereby the one or more motion parameters comprise at least one of an angular
position, an angular displacement, an angular velocity, and an angular acceleration of the object.
10. A system for determining one or more motion parameters associated with an object comprising:
- a light source for illuminating a surface of the object with light of at least two distinct colours during distinct sampling periods,
- an optical sensor for capturing light reflected from the surface of the object to generate an image of at least a portion of the surface of the object, the image comprising at least two distinct colour channels capable of distinguishing the at least two distinct colours of the light, and
- an image processor adapted for retrieving a sample pattern from each of the at least two distinct colour channels, wherein each sample pattern is indicative of a position of the object, and computing the one or more motion parameters based on the sample patterns.
11. The system according to claim 10, wherein the image processor further comprises a pattern matching module for pattern matching between each sample pattern and at least a sub-set of the reference patterns selected from a set of reference patterns, wherein each reference pattern uniquely corresponds to a distinct position of the object.
12. The system according to claim 11, wherein the image processor further comprises a predictive motion analysis module for selecting the sub-set of reference patterns from the set of reference patterns based on a predictive motion analysis, wherein the predictive motion analysis predicts a range of likely positions of the object corresponding to the image based on one or more motion parameters determined at a preceding time instant.
13. The system according to any of claims 10 to 12 further comprising
- an optical pattern generator for defining a set of optical patterns, wherein each optical pattern is distinct; and
- means for disposing the set of optical patterns on the surface of the object such that each optical pattern is associated with a distinct portion of the surface of the object, and such that each sample pattern comprises at least one optical pattern.
14. The system according to any of claims 11 to 13, wherein the image processor comprises a memory module storing a set of reference patterns, wherein each reference pattern is an image of at least a portion of the surface of the object uniquely corresponding to a distinct position of the object.
15. The system according to claim 13, wherein each optical pattern is an encoded symbol generated based on an encoding criterion, and further wherein the image processor comprises a pattern decoding module for decoding the encoded symbol extracted from each sample pattern based on a decoding criterion.
16. The system according to any of the claims 10 to 15, wherein the system is adapted for performing the method according to any of the claims 1 to 9.