EP2165289A2 - Optical reader system for extracting information in a digital image - Google Patents
Info
- Publication number
- EP2165289A2 EP08770756A
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- data
- tape
- housing
- indicia
- Prior art date
- Legal status: Withdrawn (status is an assumption, not a legal conclusion)
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10544—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
- G06K7/10712—Fixed beam scanning
- G06K7/10722—Photodetector array or CCD scanning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10544—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
- G06K7/10792—Special measures in relation to the object to be scanned
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10544—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
- G06K7/10821—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
- G06K7/10851—Circuits for pulse shaping, amplifying, eliminating noise signals, checking the function of the sensing device
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10544—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
- G06K7/10821—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
- G06K7/10861—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices sensing of data fields affixed to objects or articles, e.g. coded labels
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10544—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
- G06K7/10821—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
- G06K7/10881—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices constructional details of hand-held scanners
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
- G06T2207/30208—Marker matrix
Definitions
- the present invention relates to indicia reading devices, and more particularly to a method of extracting information from an image provided by an indicia reading device including one or more parameters of an object bearing such image.
- Indicia reading devices typically read data represented by printed indicia (also referred to as symbols, symbologies, bar codes, graphic symbols, etc.). For instance, one type of symbol is an array of rectangular bars and spaces arranged in a specific way to represent elements of data in machine readable form.
- Optical indicia reading devices typically transmit light onto a symbol and receive light scattered and/or reflected back from a bar code symbol or indicia. The received light is interpreted by an image processor to extract the data represented by the symbol.
- Laser indicia reading devices typically utilize transmitted laser light.
- One-dimensional (1D) optical bar code readers are characterized by reading data that is encoded along a single axis, in the widths of bars and spaces, so that such symbols can be read from a single scan along that axis, provided that the symbol is imaged with a sufficiently high resolution along that axis.
- A class of bar code symbologies known as two-dimensional (2D) matrix symbologies has been developed which offers orientation-free scanning and greater data densities and capacities than 1D symbologies.
- 2D matrix codes encode data as dark or light data elements within a regular polygonal matrix, accompanied by graphical finder, orientation and reference structures.
- an optical reader may be portable and wireless in nature, thereby providing added flexibility. In these circumstances, such readers form part of a wireless network in which data collected within the terminals is communicated to a host computer situated on a hardwired backbone via a wireless link.
- the readers may include a radio or optical transceiver for communicating with a network computer.
- a reader may include a central processor which directly controls the operations of the various electrical components housed within the bar code reader.
- the central processor controls detection of keyboard entries, display features, wireless communication functions, trigger detection, and bar code read and decode functionality.
- the invention has a number of embodiments for solving a problem of limited information gathered by reader devices. While reader devices are useful to decode information in one or more graphic codes, a user often requires more data to handle a package bearing such graphic codes. For example, a user often desires knowledge of salient features on a form or the size and weight of the package that bears the graphic code.
- the invention provides one or more embodiments that read conventional information bearing indicia (e.g. bar codes and Aztec codes) and use image analysis, in particular software of the kind used in machine vision technology, to extract information from an object using a hand held instrument.
- the information includes but is not limited to measurements of dimensions and weight of packages.
- Some embodiments measure one parameter and other embodiments measure two or more parameters.
- Some embodiments provide stand alone apparatus for acquiring data about one or more parameters. Some embodiments operate independently of networks and others rely upon client/server networks for full operation.
- Fig. 1 is a perspective view of an exemplary PDA in accordance with the present invention.
- Fig. 2 is a fragmentary partially cutaway side view of an exemplary PDA in accordance with the present invention.
- Fig. 3 is a block schematic diagram of an exemplary PDA in accordance with the present invention.
- Fig. 4 is a flowchart of an exemplary method of operating a PDA system in accordance with the present invention.
- Fig. 5.1 is a block schematic diagram of an exemplary PDA system in accordance with the present invention.
- Fig. 5.2 shows a reference prop and a table with details of the reference marks used for the prop.
- Fig. 5.3 shows further details of the reference marks.
- Figs. 5.4 and 5.5 show a form and a distorted form image, respectively.
- Figs. 6A and 6B are exemplary diagrams of a measuring tape and a wireless measuring tape holder in accordance with the present invention.
- Fig. 7 is a partially broken away view of an exemplary PDA having a tape measure in accordance with the present invention.
- Fig. 8 is a partially broken away view of the back of a PDA having a ball-type computer mouse in accordance with the present invention.
- Fig. 9 is a partially broken away view of the back of a PDA having an optical-type computer mouse in accordance with the present invention.
- Figs. 10A-10D are views of a light box measurement apparatus in accordance with the present invention.
- Fig. 11 is a view of a networked system with a scale in accordance with the present invention.
- Fig. 12 is a view of a stand alone system with a scale in accordance with the present invention.
- a personal data acquisition device or reader device such as a personal digital assistant (PDA) 112, portable data terminal (PDT), hand held scanner or image reader, mobile phone, cellular phone, or other device may be a platform for an image reading assembly 114 having the capability for capturing and reading images, some of which may have symbol indicia provided therein.
- PDAs are typically defined as handheld devices used as a personal organizer, and having many uses such as reading information bearing indicia, calculating, use as a clock and calendar, playing computer games, accessing the Internet, sending and receiving E-mails, use as a radio or stereo, video recording, recording notes, use as an address book, and use as a spreadsheet.
- a plurality of buttons or keys 115 may be used to control operation of the PDA and the imaging reader assembly 114.
- a display 116 may be utilized to provide a graphical user interface (GUI).
- PDAs may be equipped with the ability to query and receive and transmit data, such as information extracted from an image, via a communication link, such as by radio link or wired link.
- a PDT is typically an electronic device that is used to enter or retrieve data via wireless transmission (WLAN or WWAN) and may also serve as an indicia reader used in a store, warehouse, hospital, or in the field to access a database from a remote location.
- the PDA 112 may be a Hand Held Products Dolphin® series or the like and may include a cradle connected to a computer by a cable or wireless connection to provide two-way data communication therebetween.
- the computer may be replaced with a different processing device, such as a data processor, a laptop computer, a modem or other connection to a network computer server, an internet connection, or the like.
- the PDA may include a display and keys mounted in a case to activate and control various features on the PDA.
- the display may be a touch screen LCD that allows the display of various icons representative of different programs available on the PDA which may be activated by finger pressure or the touch of a stylus.
- the display may also be used to show indicia, graphs, tabular data, animation, or the like.
- imaging reader assembly 114 may have an aiming pattern generator 130, illumination assembly 142, and imaging assembly 150.
- the aiming pattern generator is part of a reader.
- One or more LEDs generate light.
- Known optics shape the LED output into a cross that is projected onto the surface of a target package or form. The optics also form a corner to indicate on the target the limit of the image that can be captured by the reader. The projected aiming pattern remains on the target while the user captures the image.
- the user programs the reader to store reference features and salient features of a form.
- Reference features may be cross hairs or other registration indicia printed on the form.
- the reference forms are known and data corresponding to the reference features are stored in a memory in the reader.
- the reader has software that compares the image data of the form to the stored reference features to locate the features.
- Once the reference features are located, the salient features can be found using simple geometry because the salient features are always in the same relative location on the form with respect to the reference features.
- Such reference features are more easily located by machine vision software because they are so distinctive. Once located, the reference features are used by the software to orient the image and find the salient features.
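- As a minimal sketch of the geometric step described above, the following fragment fits an affine transform from stored reference-mark coordinates to the same marks located in a captured image, then maps a stored salient-feature location (here a signature block) into the image. The coordinate values and names are illustrative only, not taken from the patent.

```python
import numpy as np

def fit_affine(stored_pts, image_pts):
    """Least-squares 2x3 affine transform mapping stored reference-mark
    coordinates to their located positions in the captured image
    (requires at least three point pairs)."""
    stored = np.asarray(stored_pts, dtype=float)
    image = np.asarray(image_pts, dtype=float)
    design = np.hstack([stored, np.ones((len(stored), 1))])   # rows of [x, y, 1]
    params, *_ = np.linalg.lstsq(design, image, rcond=None)   # shape (3, 2)
    return params.T                                           # shape (2, 3)

def map_point(affine, pt):
    """Map a stored salient-feature location into the captured image."""
    x, y = pt
    return tuple(affine @ np.array([x, y, 1.0]))

# Illustrative coordinates only: reference marks on the blank form, the same
# marks as located in the captured image, and a stored signature-block origin.
stored_refs = [(10, 10), (190, 10), (10, 270)]
found_refs = [(42, 35), (401, 52), (28, 557)]
affine = fit_affine(stored_refs, found_refs)
print(map_point(affine, (120, 240)))   # expected signature-block position
```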
- salient features include image bearing indicia or a graphic code to distinguish one form from another otherwise identical form.
- the type of unique code, bar code or Aztec code, can be identified as a salient feature and software in the reader can analyze images to identify the graphic code.
- Other salient features include check boxes. The user may record an image of the boxes and whether or not the boxes are checked. Another salient feature is a signature block. The user often wants to record an image of the signature.
- the pattern of the graphic code, the check boxes, the signature block and other salient form features are prestored in the memory of the reader for later identification on the target forms. Once the reader captures an image of the form, known software routines for pattern recognition analyze the captured image to identify the salient features in the form.
- Those salient features can be forwarded via a wired or wireless network link to a network server.
- the server may record the salient features in a network database that stores only the salient features relevant to the form. Since the form has large quantities of preprinted, known data, the user does not need to record the known, standard data for each form. Instead, the user has an efficient reader that recognizes salient features and records those and discards the rest of the image.
- With reference to Figs. 5.4 and 5.5 there is shown a form 550. It has a number of reference markings, including squares 561, crosses 562, information bearing indicia (graphic symbols) 563, and an "X" for a signature line 564.
- a salient feature includes signature block 565.
- the form 550 is shown with perspective distortion. The invention uses the reference markings 561-564 to orient the image and thereby locate the salient feature signature block 565. Illumination and aiming light sources with different colors may be employed.
- the image reader may include white and red LEDs, red and green LEDs, white, red, and green LEDs, or some other combination chosen in response to, for example, the color of the symbols most commonly imaged by the image reader.
- Different colored LEDs may be each alternatively pulsed at a level in accordance with an overall power budget.
- Aiming pattern generator 130 may include a power supply 131, light source 132, aperture 133 and optics 136 to create an aiming light pattern projected on or near the target which spans a portion of the receive optical system 150 operational field of view with the intent of assisting the operator to properly aim the reader at the bar code pattern that is to be read.
- a number of representative generated aiming patterns are possible and not limited to any particular pattern or type of pattern, such as any combination of rectilinear, linear, circular, elliptical, etc. figures, whether continuous or discontinuous, i.e., defined by sets of discrete dots, dashes and the like.
- the aiming light source may comprise any light source which is sufficiently small or concise and bright to provide a desired illumination pattern at the target.
- light source 132 for aiming generator 130 may comprise one or more LEDs 134, such as part number NSPG300A made by Nichia Corporation.
- the light beam from the LEDs 132 may be directed towards an aperture 133 located in close proximity to the LEDs. An image of this back illuminated aperture 133 may then be projected out towards the target location with a lens 136.
- Lens 136 may be a spherically symmetric lens, an aspheric lens, a cylindrical lens or an anamorphic lens with two different radii of curvature on their orthogonal lens axis.
- the aimer pattern generator may be a laser pattern generator.
- the light sources 132 may also be comprised of one or more laser diodes such as those available from Rohm. In this case a laser collimation lens (not shown in these drawings) will focus the laser light to a spot generally forward of the scanning head and approximately at the plane of the target T.
- This beam may then be imaged through a diffractive interference pattern generating element, such as a holographic element fabricated with the desired pattern in mind.
- these types of elements are known, commercially available items and may be purchased, for example, from Digital Optics Corp. of Charlotte, N.C., among others. Elements of some of these types and methods for making them are also described in U.S. Pat. Nos. 4,895,790 (Swanson); 5,170,269 (Lin et al.) and 5,202,775 (Feldman et al.), which are hereby incorporated herein by reference.
- Illumination assembly 142 for illuminating target area T may include one or more power supplies 144, illumination sources 146 and illumination optics 148.
- Imaging assembly may have receive optics 152 and an image sensor 154.
- the receive optics 152 has a focal point wherein parallel rays of light coming from infinity converge at the focal point. If the focal point is coincident with the image sensor, the target (at infinity) is "in focus". A target T is said to be in focus if light from target points is converged about as well as desirable at the image sensor. Conversely, it is out of focus if the light is not well converged. "Focusing" is the procedure of adjusting the distance between the receive optics and the image sensor to cause the target T to be approximately in focus.
- the target may be any object or substrate and may bear a 1 D or 2D bar code symbol or text or other machine readable indicia.
- a trigger 115 may be used for controlling full or partial operation of the PDA 112.
- Image sensor 154 may be a two-dimensional array of pixels adapted to operate in a global shutter or full frame operating mode which is a color or monochrome 2D CCD, CMOS, NMOS, PMOS, CID, CMD, etc. solid state image sensor. This sensor contains an array of light sensitive photodiodes (or pixels) that convert incident light energy into electric charge. Solid state image sensors allow regions of a full frame of image data to be addressed.
- An exemplary CMOS sensor is model number MT9V022 from Micron Technology Inc. or model number VC5602V036 36CLCC from STMicroelectronics.
- the entire imager is reset before integration to remove any residual signal in the photodiodes.
- the photodiodes then accumulate charge for some period of time (exposure period), with the light collection starting and ending at about the same time for all pixels.
- all charges are simultaneously transferred to light shielded areas of the sensor.
- the light shield prevents further accumulation of charge during the readout process.
- the signals are then shifted out of the light shielded areas of the sensor and read out.
- the image sensor 154 may read images with illumination from a source other than illumination source 146, such as by illumination from a source located remote from the PDA.
- the output of the image sensor may be processed utilizing one or more functions or algorithms to condition the signal appropriately for use in further processing downstream, including being digitized to provide a digitized image of target T.
- a microcontroller 160 may be utilized to control one or more functions and devices of the image reader assembly 114 wherein the particulars of the functionality of microcontroller 160 may be determined by or based upon certain parameters which may be stored in memory or firmware.
- One such function may be controlling the amount of illumination provided by illumination source 146 by controlling the output power provided by illumination source power supply 144.
- An exemplary microcontroller 160 is a CY8C24223A made by Cypress Semiconductor Corporation, which is a mixed-signal array with on-chip controller devices designed to replace multiple traditional MCU-based system components with one single-chip programmable device. It may include configurable blocks of analog and digital logic, as well as programmable interconnects.
- Microcontroller 160 may include a predetermined amount of memory 162 for storing firmware and data.
- the firmware may be a software program or set of instructions embedded in or programmed on the microcontroller which provides the necessary instructions for how the microcontroller operates and communicates with other hardware.
- the firmware may be stored in the flash ROM of the microcontroller as a binary image file and may be erased and rewritten.
- the firmware may be considered "semi-permanent" since it remains the same unless it is updated. This firmware update or load may be handled by a device driver.
- the components in reader 112 may be connected by one or more buses 168 or data lines, such as an Inter-IC (I²C) bus, which is a control bus that provides a communications link between integrated circuits in a system.
- I²C is a two-wire serial bus with a software-defined protocol and may be used to link such diverse components as the image sensor 154, temperature sensors, voltage level translators, EEPROMs, general-purpose I/O, A/D and D/A converters, CODECs, and microprocessors/microcontrollers.
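- Purely as an illustration of reading a device register over such a two-wire control bus, the sketch below uses the smbus2 Python package. The bus index, device address, and register offset are placeholders, not values from the patent or any sensor datasheet.

```python
from smbus2 import SMBus

I2C_BUS = 1          # placeholder bus index
SENSOR_ADDR = 0x48   # placeholder 7-bit device address
CHIP_ID_REG = 0x00   # placeholder register offset

# Read one 16-bit register from the imaging device on the I2C control bus.
with SMBus(I2C_BUS) as bus:
    raw = bus.read_word_data(SENSOR_ADDR, CHIP_ID_REG)
    # SMBus word reads are little-endian; many imagers return big-endian
    # register values, so the two bytes may need to be swapped.
    value = ((raw & 0xFF) << 8) | (raw >> 8)
    print(f"register 0x{CHIP_ID_REG:02X} = 0x{value:04X}")
```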
- the functional operation of the host processor or local server 118 may involve the performance of a number of related steps, the particulars of which may be determined by or based upon certain parameters stored in memory 166, which may be any one of a number of memory types such as RAM, ROM, EEPROM, etc. In addition, some memory functions may be stored in memory 162 provided as part of the microcontroller 160.
- One of the functions of the host processor 118 may be to decode machine readable symbols provided within the target or captured image.
- One-dimensional symbologies, from very large to ultra-small, may include Code 128, Interleaved 2 of 5, Codabar, Code 93, Code 11, Code 39, UPC, EAN, and MSI.
- Stacked 1D symbols may include PDF, Code 16K and Code 49.
- 2D symbols may include Aztec, Datamatrix, Maxicode, and QR-code.
- UPC/EAN bar codes are standard codes used to mark retail products throughout North America, Europe and several other countries throughout the world. Decoding is a term used to describe the interpretation of a machine readable code contained in an image projected on the image sensor 154. The code has data or information encoded therein.
- information from the indicia may be preliminarily reviewed or analyzed utilizing software provided in on-board memory (i.e. 162 or other) on the reader 112 and processed by an on-board device such as microcontroller 160. The preliminary review would identify whether upgrade software is available and perhaps where to access it.
- a communications module 180 provides a communication link from imaging reader 114 to other imaging readers or to other remote systems such as host processor 118, memory 166, communication network 120, or network computer 124.
- the information bearing indicia with upgrade data may be considered sensitive information; it may therefore be required that the data be encrypted, wherein the information bearing indicia can be read but the data in the information bearing indicia is encrypted.
- Encryption is the conversion of data into a form that cannot be easily understood by unauthorized people.
- a decrypting algorithm would be required to decrypt such data.
- Decryption is the process of converting encrypted data back into its original form, so it can be understood. Operation of the decrypting algorithm requires the use of a "key".
- Encryption key(s) may be secret keys, private keys, or public keys.
- This encryption key may be provided in the scanner firmware, the host device, in the encrypted barcode or in a separate barcode, which allows the user to decide whether to separate the encryption key from the data or combine them.
- Encryption keys may be associated by mathematical derivation, symmetry, or other relationship. Encryption keys may be updated by pushing the key to the scanner from the host device, or by scanner to scanner communication as discussed hereinbefore.
- the scanner may be able to recognize the information bearing indicia as an encrypted information bearing indicia by recognizing a unique unencrypted piece of a data string provided within the information bearing indicia. That same piece of data may also instruct the scanner where to look for the encryption key.
- the information bearing indicia may be partially encrypted, which may allow the user to read only an unencrypted part of the information bearing indicia with any scanner.
- a data formatter may be utilized to strip encrypted data portions before further processing. If the encryption key matches the encrypted information bearing indicia and decoding is completed, the scanner will successfully "read" the data in the information bearing indicia.
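- The sketch below illustrates the idea of a partially encrypted payload using the Python cryptography package's Fernet recipe. The payload layout (a clear-text marker followed by ciphertext) and the marker string are assumptions for illustration, not a format defined by the patent.

```python
from cryptography.fernet import Fernet

MARKER = "ENC1:"   # assumed unencrypted marker identifying an encrypted payload

def read_indicia(decoded_text: str, key: bytes) -> str:
    """Return the clear text of a decoded symbol, decrypting it if the
    payload carries the (assumed) encryption marker."""
    if not decoded_text.startswith(MARKER):
        return decoded_text                      # ordinary, unencrypted symbol
    token = decoded_text[len(MARKER):].encode()  # strip marker, keep ciphertext
    return Fernet(key).decrypt(token).decode()   # raises if the key does not match

# Illustrative round trip: the key could live in scanner firmware, on the host,
# or in a separate barcode, as described above.
key = Fernet.generate_key()
secret = MARKER + Fernet(key).encrypt(b"serial=12345;lot=A7").decode()
print(read_indicia(secret, key))
```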
- the scanner may have an "encryption protected" routine with a different sequence of LED blinking/beeps, different from an unsuccessful scanner read situation.
- Fig. 5 illustrates an exemplary scanning system configuration in accordance with one embodiment of the present invention, wherein a plurality of readers 112A, 112B are being operated or utilized in a remote location, such as in a warehouse or on a delivery truck.
- Each reader may be in communication (wired or wireless) with a communication network 120.
- the communication network 120 may be in communication with a remote/web server 134 through a wired or wireless connection for the transfer of information over a distance without the use of electrical conductors or "wires".
- the distances involved may be short (a few meters as in television remote control) or very long (thousands or even millions of kilometers for radio communications).
- Wireless communication may involve radio frequency communication.
- Applications may involve point-to-point communication, point-to-multipoint communication, broadcasting, cellular networks and other wireless networks.
- cordless telephony such as DECT (Digital Enhanced Cordless Telecommunications); Cellular systems such as 0G, 1G, 2G, 3G or 4G; Short-range point-to-point communication such as IrDA or RFID (Radio Frequency Identification), Wireless USB, DSRC (Dedicated Short Range Communications); Wireless sensor networks such as ZigBee; Personal area networks such as Bluetooth or Ultra-wideband (UWB from WiMedia Alliance); Wireless computer networks such as Wireless Local Area Networks (WLAN), IEEE 802.11 branded as WiFi or HIPERLAN; or Wireless Metropolitan Area Networks (WMAN) and Broadband Fixed Access (BWA) such as LMDS, WiMAX or HIPERMAN.
- the Internet is the worldwide, publicly accessible network of interconnected computer networks that transmit data by packet switching using the standard Internet Protocol (IP). It is a "network of networks" that consists of millions of smaller domestic, academic, business, and government networks, which together carry various information and services, such as electronic mail, online chat, file transfer, and the interlinked Web pages and other documents of the World Wide Web.
- the IP is a data-oriented protocol used for communicating data across a packet-switched internetwork, and may be a network layer protocol in the internet protocol suite and encapsulated in a data link layer protocol (e.g., Ethernet).
- the IP provides the service of communicable unique global addressing amongst computers to provide a service not necessarily available with a data link layer.
- Ethernet provides globally unique addresses and may not be globally communicable (i.e., two arbitrarily chosen Ethernet devices will only be able to communicate if they are on the same bus).
- IP provides final destinations with data packets whereas Ethernet may only be concerned with the next device (computer, router, etc.) in the chain. The final destination and next device could be one and the same (if they are on the same bus) but the final destination could be remotely located.
- IP can be used over a heterogeneous network (i.e., a network connecting two computers can be any mix of Ethernet, ATM, FDDI, Wi-Fi, token ring, etc.) and does not necessarily affect upper layer protocols.
- One or more PDAs may be outfitted with a communication module configured to communicate with other PDAs that have an appropriate type of communication module.
- One or more PDAs may be configured to communicate with a base unit 138 configured to interface between the PDA and the communication network.
- the link between the PDA and base unit may be fixed and permanent. In the case of a wireless mobile hand held optical PDA that communicates wirelessly with its individual base unit, this link can be made by programming the PDA with information identifying the particular base unit so that the PDA directs its transmitted information to that base unit, or vice versa.
- scan refers to reading or extracting data from information bearing indicia or symbol.
- a target is imaged in a step 310.
- the target may take on many forms, such as a package, box, container, etc.
- the image reader 112 and the target may be positioned such that three surfaces of the target appear in the captured image.
- a processor may then identify or determine (314) from the image outer edges of the package. For instance, the processor may identify three or more outer edges 304a-g.
- the processor may then identify or determine (318) from the image corners 306a-g of the target, corners being representative of the intersection of edges.
- the processor may then calculate or determine (320) from the edges and corners dimensions of the package, such as height, width or depth. This determination may be made by translating (324) image sensor pixels into true distance.
- An exemplary method of performing this translation is to image the target on a surface or in an environment that has a pattern or other marks which includes indicators of known dimensions or known distances with respect to each other.
- the indicators are black bars of known dimensions, square or rectangular, on a white surface.
- the indicators are one or more concentric black rings or a central black dot surrounded by one or more concentric black rings. See, for example, Figs. 5.2 and 5.3.
- a prop 500 has three walls 510, 511, 512 at right angles to each other to form an inside corner 505 at the intersection of the planes of the three walls. The edges of the walls have one or more reference marks designated as Type I 501 and Type II 502.
- a rectangular package has one corner placed in inside corner 505 with the package end and side against walls 513 and 510. Now the package is oriented and the location of the package with respect to the markings 501, 502 can be determined by analysis of an image of the package and the prop.
- the table in Fig. 5.2 shows the type, number and locations of the reference markings. Details of the markings are shown in Fig. 5.3.
- the image of the object and the indicators is processed to identify the edges of the object using well-known edge detection image analysis techniques.
- the indicators provide standards for measuring the size of the pixels in the image. Oblique images suffer from perspective foreshortening, so portions of objects farther from the camera appear smaller than they really are. These errors are well understood, and there are conventional software techniques that use geometry and trigonometry to rescale the oblique image in order to measure the length of an edge of the object. In general, the average height of the camera is known and the distance between the camera and the object can be calculated using known parametric values of the camera.
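- A minimal sketch of this pixel-to-distance translation for a straight-on (non-oblique) view, assuming the reference indicator has already been measured in pixels; the numbers are illustrative only.

```python
def pixels_per_unit(indicator_pixel_len, indicator_true_len):
    """Scale factor from an indicator of known physical size in the image."""
    return indicator_pixel_len / indicator_true_len

def edge_length(edge_pixel_len, scale):
    """Convert a measured edge length in pixels to physical units."""
    return edge_pixel_len / scale

# Illustrative numbers: a 50 mm reference bar spans 200 pixels in the image,
# so a package edge spanning 940 pixels is about 235 mm long.
scale = pixels_per_unit(200, 50.0)      # 4 pixels per mm
print(edge_length(940, scale))          # ~235.0 mm
```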
- Another exemplary method of performing this translation is to determine the distance from the target to the sensor and calculate using the known focal length of the lens of the imager.
- Another exemplary method of performing this translation is to utilize two cameras with slightly different optical axis angles to the target and then calculate by triangulating the image in a manner analogous to human sight utilizing two eyes.
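- For the two-camera approach, the usual triangulation relation is Z = f·B/d, with the focal length in pixels, the baseline between the imagers, and the disparity of the same point between the two images. A hedged sketch with illustrative numbers:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Distance to a target point seen by two cameras with parallel optical
    axes: Z = f * B / d (focal length in pixels, baseline in millimetres)."""
    if disparity_px <= 0:
        raise ValueError("target must appear shifted between the two images")
    return focal_px * baseline_mm / disparity_px

# Illustrative: 800-pixel focal length, 60 mm between the two imagers,
# 24-pixel shift of the same corner between the images -> 2000 mm range.
print(depth_from_disparity(800, 60.0, 24.0))
```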
- Another exemplary method of performing this translation is to project an aiming pattern on the target at a known angle.
- the processor may calculate dimensions by triangulating the length of the known dimension of the aiming pattern at specific distances with the imager optical axis.
- Another exemplary method of performing this translation is to project an aiming pattern on the target and measure the time until the aiming pattern reflection reaches the imager, similarly to a radar system.
- Another exemplary method of performing this translation is to project two approximately parallel aiming patterns onto the target with a known distance between them, which would be the same distance as reflected from the target.
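- Because two parallel beams keep their physical separation at any range, their pixel separation gives the image scale at the target directly. A minimal sketch, with an assumed 40 mm beam spacing used purely for illustration:

```python
def scale_from_parallel_aimers(spot_separation_px, spot_separation_mm):
    """mm per pixel at the target plane, from two parallel aiming spots
    whose physical separation is fixed by the reader geometry."""
    return spot_separation_mm / spot_separation_px

def measure(feature_px, spot_separation_px, spot_separation_mm=40.0):
    """Convert a pixel measurement on the target into millimetres.
    The 40 mm default is an illustrative beam spacing, not a device value."""
    return feature_px * scale_from_parallel_aimers(spot_separation_px,
                                                   spot_separation_mm)

print(measure(feature_px=600, spot_separation_px=80))  # 300.0 mm
```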
- An exemplary embodiment for an image reader system is to take an image of a vehicular license plate as discussed above in relation to Fig. 5.
- the image may be analyzed with optical character recognition to determine the alphanumeric characters of the plate. Once the letters and numbers are recognized, they may be stored in memory, and compared to a lookup table to determine the vehicle owner.
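- A toy sketch of that lookup step; the recognize_plate helper and the in-memory table are hypothetical stand-ins for an OCR engine and a motor-vehicle database.

```python
# Hypothetical OCR result and lookup table; a real system would run an OCR
# engine on the plate region and query a motor-vehicle database instead.
def recognize_plate(image) -> str:
    """Placeholder for the optical character recognition step."""
    return "ABC1234"

OWNER_LOOKUP = {          # illustrative records only
    "ABC1234": {"owner": "J. Smith", "registration": "valid"},
}

plate = recognize_plate(image=None)
record = OWNER_LOOKUP.get(plate)
print(plate, record or "no record found")
```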
- a reader scans a vehicle's state license plate.
- the reader output signal of information contained in the alphanumeric characters is wirelessly communicated to a local host which may decode the data message for further processing.
- the local host may be located in relatively close proximity, such as a toll booth or another vehicle, such as a police or other governmental agency vehicle.
- the local host may communicate the license data message to a remote server.
- the remote server may perform a variety of functions and responsibilities, such as decoding, or accessing information to compare the indicia information against information in government databases such as motor vehicle departments, customs, the justice department, the BATF, police departments, etc.
- the information retrieved from the databases may be vehicle or operator registration information, driving or other records.
- the remote server may be linked to another remote server or computer so that another person may provide remote help or service.
- the remote server may reference a third party database, cull information, make comparisons and determinations, alert establishment personnel and security, etc. and send back a result.
- the remote server may also record the information to another database for record keeping purposes.
- if the reader is an optical reader, it may take an archival picture of the vehicle, operator, or passengers which may be saved by the remote server. Information read from indicia or the picture taken may be used to electronically complete various types of forms, such as traffic tickets, statutorily required forms, etc.
- the process of extracting the information from the picture might include OCR, 2D barcode decoder such as PDF417 decoder, or matrix decoder such as Datamatrix, Aztec, QR code decoder, etc.
- indicia may be provided on the license plate in order to provide the capacity to read more information than is allowable in alphanumeric characters or singular indicia.
- the scanner might read the vehicle operator's information from the indicia (such as a PDF417 bar code) on the operator's driver's license. The information may then be compared with information associated with the indicia on the vehicle license plate. Information from the operator's driver license may also be utilized to populate forms, such as traffic tickets, statutorily required forms, etc. Such a system would be more convenient while at the same time saving time and reducing the application error rate caused by incorrectly transcribed information.
- the scanner may be automatically changed to a picture taking mode, signal the operator to aim the scanner at the applicant, the driver's license, etc. and then take a picture.
- This picture could then also be automatically added to or associated with a roadside transaction or stop, at toll booths, customs checkpoints, military checkpoints, airports, etc.
- the reader may include a wireless transceiver, such as, for example, a wireless Bluetooth, IEEE 802.11b, ZigBee, or other standardized or proprietary RF device which may be configured to provide communications between the reader and the local host 118.
- the wireless transceiver may consist of an RF module and antenna (not shown) and is configured to engage in two-way communication with at least one other wireless transceiver.
- Another wireless transceiver may be located in the local host, which may be a stand-alone unit or physically incorporated into another host device such as a computer or similar device.
- the wireless transceiver may include a RF module and an antenna.
- the wireless transceiver may transmit decoded information to a wireless transceiver in the local host for secure transactions.
- the wireless communication protocol may be according to a secure protocol, such as the FIPS 140-2 standard.
- the wireless devices may be configured for operation in a hostile environment and may be hermetically sealed units.
- Information bearing indicia or alphanumeric characters may contain sensitive information such as component specifications, recipes or process data in a production environment, personal records, medical information in healthcare, social security numbers, biometrics, entrance and access keys, ticketing applications, vouchers for discount in retail or the information bearing devices may be involved in transactions involving financial or private information.
- the data is generally at risk of being misused and/or used to perform criminal activity.
- a scanning system with security features may reduce such risks.
- Another exemplary embodiment here is to have a measuring tape which knows the measurement distance, and can communicate that information to a wirelessly connected device such as a PDT. This would allow a user to have a program on their PDT where a distance measure input is required; the PDT would connect (for instance using Bluetooth) to this tape measure.
- the user would then pull out the tape so that one end was touching one end of the item to be measured, and the base of the tape measure would be at the other end of the item to be measured.
- the tape measure would "know" the distance based on how far the tape is stretched out.
- the user would then indicate (possibly with a button press on the tape measure) that the tape is in position, at which time the tape measure would send the measurement data, using the wireless connection, to the PDT, and the PDT program would automatically accept that data from the tape measure as its input.
- Tape measures often use a stiff, curved metallic ribbon that can remain stiff and straight when extended, but retracts into a coil for convenient storage. This type of tape measure will have a floating tang on the end to aid measuring. The tang will float a distance equal to its thickness, to provide both inside and outside measurements that are accurate. The tape extends from point to point placing the end-clip at the location one wants to measure from. Most tape measures have a clip (tang) that attaches to a fixed object to measure spans easily. Many steel blade tapes have tension-control brakes that lock the blade in place for measuring spans.
- the tape itself could be segmented by lengths of black and white areas of a predetermined distance (say 1/8 inch or other desired distance resolution) repeating all the way up the tape. Fluctuations from black to white on the tape itself could be monitored optically as the reflectance changes, and counted by a processing unit on the tape measure itself. By counting the number of transitions and multiplying that by the length of a specific color, the distance could be measured by the unit.
- triggering or wireless communication of the data this may be done by a mechanical switch triggering a port pin on the processor, and a radio such as Bluetooth could be used respectively.
- Figs. 6A and 6B show a stand alone tape measure 600.
- the measuring tape 610 has sequential, alternating light and dark regions 614, 616.
- the length X 612 of each region is the same. As such, counting the sequential light and dark regions and multiplying by the known length X gives a distance measurement.
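- A minimal sketch of that counting scheme, assuming a stream of Boolean reflectance samples from the optical monitor; the sample stream and the 1/8 inch segment length are illustrative.

```python
def tape_extension(samples, segment_length_in=0.125):
    """Distance the tape has been pulled out, from a stream of reflectance
    samples (True = light region, False = dark region). Each change of
    region corresponds to one segment of known length X."""
    transitions = sum(1 for a, b in zip(samples, samples[1:]) if a != b)
    return transitions * segment_length_in

# Illustrative sensor stream: 3 region changes at 1/8 inch per segment = 0.375 in.
stream = [True] * 3 + [False] * 2 + [True] * 4 + [False] * 3
print(tape_extension(stream))
```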
- the tape measure may be a stand alone apparatus 618 that has a coiled tape on a roller (not shown).
- the tape 610 has a tang 609 on one end to prevent the tape from traveling entirely into the holder housing.
- the tang 609 also provides a stop against the wall of a measured object for withdrawing the tape 610 through opening 630 in the housing.
- An optical monitor 622 senses the passage of the alternate light and dark areas of the tape 610.
- the holder has a wireless transmitter 620 for broadcasting data signals representative of each passage of a light or dark area.
- a processor onboard the holder may count the pulses sensed by the optical monitor and broadcast the distance the tape is extended from the housing.
- In Fig. 7 there is another embodiment using a tape measure.
- the tape 716 is wound in a coil on a roller disposed inside the PDA.
- the tang 709 prevents the tape from retracting entirely within the housing of the PDA 712. Since the PDA 712 already has a light source and onboard processor, they can be used to perform the sensing and counting functions described above in connection with the stand alone tape measure 600.
- Mechanical computer mice track their own movement by the user. As such, they record velocities and X and Y positions. Mechanical computer mice are well known. Early mouse patents include opposing track wheels, U.S. Patent 3,541,541, ball and wheel devices, U.S. Patent 3,835,464, and ball and two rollers with spring, U.S. Patent 3,987,685.
- the ball mouse utilizes two rollers rolling against two sides of the ball. One roller detects the horizontal motion of the mouse and the other the vertical motion. The motion of these two rollers causes two disc-like encoder wheels to rotate, interrupting optical beams to generate electrical signals.
- the mouse sends these signals to the computer system by means of connecting wires.
- the driver software in the system converts the signals into motion of the mouse pointer along X and Y axes on the screen.
- the operating features of a mechanical mouse are also well known. They include moving the mouse to turn a ball located on the bottom of the mouse.
- X and Y rollers grip the ball and transfer movement to optical encoding disks that include light holes. Infrared LEDs shine through the disks and sensors gather light pulses passing through the holes to convert to X and Y velocities.
- Conventional computer programs such as Microsoft Paint and Microsoft Power Point permit users to click the mouse once to set a start point and a second time to set a finish point. Such operations draw lines of known lengths that are proportional to the distance between the points.
- a ball-type mouse has a ball 810 held in a cradle (not shown) on the bottom surface of a PDA 812.
- Optical wheels 815, 825 are oriented at right angles to each other. The wheels turn about their respective axes 814, 824 as they follow the motion of the ball 810. As such, the wheels turn clockwise and counter clockwise as indicated by the arrows 816, 826.
- a guide rail 860 extends from the bottom surface to keep the PDA aligned as it transits the length of an object, such as an edge of a rectangular package.
- a guide roller 862 keeps the PDA traveling in a straight line along an edge of the package.
- the wheels 815, 825 turn and generate signals that are received by the controller 160, which has a known program to calculate the distance traveled by the PDA 812.
- the portable data acquisition device may have a cradle that holds one wheel which protrudes beyond the housing.
- the wheel is mounted on a shaft, and an encoder proximate the shaft generates signals representative of the motion of the wheel in a direction perpendicular to the shaft.
- Modern surface-independent optical mice work by using an optoelectronic sensor to take successive pictures of the surface on which the mouse operates.
- the optical mouse has powerful special-purpose image-processing chips embedded in the mouse itself. This enables the mouse to detect relative motion on a wide variety of surfaces, translating the movement of the mouse into the movement of the pointer and eliminating the need for a special mouse-pad.
- Optical mice illuminate the surface that they track over, using an LED or a laser diode. Changes between one frame and the next are processed by the image processing part of the chip and translated into movement on the two axes using an optical flow estimation algorithm.
- the Avago Technologies ADNS-2610 optical mouse sensor processes 1512 frames per second: each frame consisting of a rectangular array of 18x18 pixels and each pixel can sense 64 different levels of gray.
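- As a hedged sketch of the frame-to-frame motion estimate (not the sensor's actual on-chip algorithm), the following brute-force search finds the shift that best aligns two small grayscale frames like those just described:

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=4):
    """Brute-force estimate of the (dy, dx) translation from `prev` to `curr`
    by minimising the mean squared difference over the overlapping region
    for each candidate shift."""
    h, w = prev.shape
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            a = prev[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
            b = curr[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
            err = np.mean((a.astype(float) - b.astype(float)) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

rng = np.random.default_rng(0)
frame = rng.integers(0, 64, size=(18, 18))          # 18x18 pixels, 64 grey levels
moved = np.roll(frame, shift=(1, 2), axis=(0, 1))   # content shifted down 1, right 2
print(estimate_shift(frame, moved))                 # -> (1, 2)
```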
- Optical mice work equally well with drawing programs such as Paint and Power Point.
- PDA 912 has a light source, such as a light emitting diode, which illuminates the surface.
- An image capture device 922 typically a CMOS imager, follows the motion of the mouse. Suitable programming known by those skilled in the art computes the travel of the mouse over the surface.
- a guide rail 960 extends from the bottom surface to keep the PDA aligned as it transits the length of an object, such as an edge of a rectangular package.
- a guide roller 962 keeps the PDA traveling in a straight line along an edge of the package.
- the CMOS imager provides data signals representative of the distance traveled by the PDA 912 to processor 160.
- the processor has a known program to calculate the distance traveled by the PDA 912.
- the computer mice can be refined to add more precise shaft encoders and imagers.
- the mice can be tested against a reference standard and their readings can be normalized to such standard. For example, if the mouse was moved a standard 100 inches and measured only 90 inches, the encoder could be normalized to add 0.1 inch to each one inch measurement.
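- A small sketch of that normalization, which amounts to scaling readings by reference/measured (100/90 ≈ 1.11, i.e. roughly the 0.1 inch per measured inch correction mentioned above); the numbers are illustrative.

```python
def calibration_factor(reference_in, measured_in):
    """Scale factor that normalises an encoder reading against a known
    reference travel (e.g. 100 inches true vs. 90 inches reported)."""
    return reference_in / measured_in

def corrected(reading_in, factor):
    """Apply the calibration factor to a raw distance reading."""
    return reading_in * factor

factor = calibration_factor(100.0, 90.0)   # ~1.11
print(corrected(45.0, factor))             # ~50.0 inches
```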
- Modern accelerometers are often small micro electro-mechanical systems (MEMS), and are indeed the simplest MEMS devices possible, consisting of little more than a cantilever beam with a proof mass (also known as seismic mass) and some type of deflection sensing circuitry. Under the influence of gravity or acceleration, the proof mass deflects from its neutral position. The deflection is measured in an analog or digital manner.
- One MEMS-based accelerometer design contains a small heater at the bottom of a very small dome, which heats the air inside the dome to cause it to rise. A thermocouple on the dome determines where the heated air reaches the dome, and the deflection off the center is a measure of the acceleration applied to the sensor.
- MEMS accelerometers are available in a wide variety of measuring ranges, reaching up to thousands of g's.
- Accelerometers can measure velocity and distance. Velocity is the first integral of acceleration and distance is the second integral. So long as one knows the time the mass undergoes motion, the distance between the start and stop of motion is a relatively simple calculation and can be done automatically by a processor.
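- A minimal sketch of that double integration using the trapezoidal rule over uniformly sampled acceleration, assuming the device starts at rest; the sample values are illustrative.

```python
def distance_from_acceleration(accel_samples, dt):
    """Double trapezoidal integration of acceleration samples (m/s^2) taken
    at a fixed interval dt (s): acceleration -> velocity -> distance."""
    velocity, distance, prev_a = 0.0, 0.0, accel_samples[0]
    for a in accel_samples[1:]:
        new_v = velocity + 0.5 * (prev_a + a) * dt    # integrate a -> v
        distance += 0.5 * (velocity + new_v) * dt     # integrate v -> s
        prev_a, velocity = a, new_v
    return distance

# Illustrative check: a constant 1 m/s^2 for 1 s should give about 0.5 m.
print(distance_from_acceleration([1.0] * 101, dt=0.01))
```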
- the proportion of time that the switch is closed in each cycle is proportional to the square root of the acceleration value and distance measuring comprises counting the clock pulses to give a signal proportional to the square root of distance traveled. In both cases, this signal may be compared with a stored predetermined value to provide a control function.
- the accelerometer 950 is mounted inside the housing of the PDA. In operation, it senses the motion and provides data signals for the processor 160. It has a known program to calculate the distance the PDA moves.
- Edge detection is a term used in image processing and computer vision, particularly within the areas of feature detection and feature extraction, to refer to algorithms which aim at identifying points in a digital image at which the image brightness changes sharply or, more formally, has discontinuities.
- the edges extracted from a two-dimensional image of a three-dimensional scene can be classified as either viewpoint dependent or viewpoint independent.
- a viewpoint independent edge typically reflects inherent properties of the three-dimensional objects, such as surface markings and surface shape.
- a viewpoint dependent edge may change as the viewpoint changes, and typically reflects the geometry of the scene, such as objects occluding one another.
- a typical edge might for instance be the border between a package and substrate supporting the package.
- a line (as can be extracted by a ridge detector) can be a small number of pixels of a different color on an otherwise unchanging background. For a line, there may therefore usually be one edge on each side of the line. Edges play quite an important role in many applications of image processing, in particular for machine vision systems that analyze scenes of man-made objects under controlled illumination conditions. During recent years, however, substantial (and successful) research has also been made on computer vision methods that do not explicitly rely on edge detection as a preprocessing step.
- Edge detection methods often differ in the types of smoothing filters that are applied and the way the measures of edge strength are computed. As many edge detection methods rely on the computation of image gradients, they also differ in the types of filters used for computing gradient estimates in the x- and y- directions.
- a measure of edge strength typically the gradient magnitude
- the next stage is to apply a threshold, to decide whether edges are present or not at an image point. The lower the threshold, the more edges will be detected, and the result will be increasingly susceptible to noise, and also to picking out irrelevant features from the image. Conversely, a high threshold may miss subtle edges, or result in fragmented edges.
- if edge thresholding is applied to just the gradient magnitude image, the resulting edges will in general be thick, and some type of edge thinning post-processing is necessary.
- edge curves are thin by definition, and the edge pixels can be linked into an edge polygon by an edge linking (edge tracking) procedure.
- the non-maximum suppression stage can be implemented by estimating the gradient direction using first-order derivatives, then rounding off the gradient direction to multiples of 45 degrees, and finally comparing the values of the gradient magnitude in the estimated gradient direction.
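- The following sketch illustrates that non-maximum suppression step under the 45-degree rounding just described; it is a straightforward, unoptimized reference loop rather than a prescribed implementation:

```python
import numpy as np

def non_maximum_suppression(magnitude, gx, gy):
    """Keep a pixel only if its gradient magnitude is a local maximum along
    the gradient direction, rounded to the nearest multiple of 45 degrees."""
    h, w = magnitude.shape
    angle = (np.rad2deg(np.arctan2(gy, gx)) + 180.0) % 180.0
    out = np.zeros_like(magnitude)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            a = angle[y, x]
            if a < 22.5 or a >= 157.5:      # ~0 deg: compare left/right
                n1, n2 = magnitude[y, x - 1], magnitude[y, x + 1]
            elif a < 67.5:                  # ~45 deg: compare diagonals
                n1, n2 = magnitude[y - 1, x + 1], magnitude[y + 1, x - 1]
            elif a < 112.5:                 # ~90 deg: compare up/down
                n1, n2 = magnitude[y - 1, x], magnitude[y + 1, x]
            else:                           # ~135 deg: compare diagonals
                n1, n2 = magnitude[y - 1, x - 1], magnitude[y + 1, x + 1]
            if magnitude[y, x] >= n1 and magnitude[y, x] >= n2:
                out[y, x] = magnitude[y, x]
    return out
```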
- a commonly used approach to the problem of choosing appropriate thresholds is thresholding with hysteresis.
- This method uses multiple thresholds to find edges. We begin by using the upper threshold to find the start of an edge. Once we have a start point, we then trace the path of the edge through the image pixel by pixel, marking an edge whenever we are above the lower threshold. We stop marking our edge only when the value falls below our lower threshold.
- This approach makes the assumption that edges are likely to be in continuous curves, and allows us to follow a faint section of an edge we have previously seen, without meaning that every noisy pixel in the image is marked down as an edge. Still, however, we have the problem of choosing appropriate thresholding parameters, and suitable thresholding values may vary over the image.
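- One way to realize this hysteresis step (a sketch only, using connected components rather than explicit pixel-by-pixel tracing, and assuming SciPy is available) is to keep every weak-edge region that touches at least one strong-edge pixel:

```python
import numpy as np
from scipy import ndimage

def hysteresis_threshold(magnitude, low, high):
    """Mark strong edges (>= high), then keep weak edges (>= low) only
    where their connected region contains at least one strong pixel."""
    strong = magnitude >= high
    weak = magnitude >= low
    labels, n = ndimage.label(weak)          # connected regions of the weak map
    keep = np.zeros(n + 1, dtype=bool)
    keep[np.unique(labels[strong])] = True   # regions touched by strong pixels
    keep[0] = False                          # label 0 is background
    return keep[labels]
```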
- Machine vision systems for identifying edges and processing images of objects to measure the lengths of edges are well known. See, for example, United States Patent 6,621,928, Inagaki et al., September 16, 2003, for "Image edge detection method, inspection system, and recording medium." That patent shows an inspection system which includes a memory for storing image data provided by picking up an image of a workpiece, a monitor for displaying the image data stored in the memory on a display screen with pixels arranged in an X-axis direction and a Y-axis direction perpendicular to the X-axis direction, a control panel for setting a window with four sides along the X- or Y-axis direction on the display screen, and an edge detection section for integrating the lightness values of the pixels with respect to each pixel string arranged in the Y- or X-axis direction in the set-up window.
- the system detects as an edge the position in the X- or Y-axis direction corresponding to the maximum value of the portion where the absolute value of the result of differentiating the integration result in the X- or Y-axis direction is equal to or greater than a threshold value.
- the length of the edge can be calculated by counting the number of pixels that define the length of the edge and scaling the pixel count to provide a distance.
- Scale may be provided by indicia embedded in the surface defined by the edge or by other means, such as a light table with a grid pattern.
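- A compact sketch of this projection-and-differentiation style of edge location, together with the pixel-count-to-length scaling, might look as follows; the pixels-per-inch figure is an assumed calibration value, not one taken from the disclosure:

```python
import numpy as np

PIXELS_PER_INCH = 200.0   # assumed scale, e.g. calibrated from grid indicia

def edge_position_x(gray_window, threshold):
    """Within a rectangular window, sum lightness down each pixel column,
    differentiate the profile along X, and report the column where the
    absolute difference peaks, provided it meets the threshold."""
    profile = gray_window.astype(float).sum(axis=0)   # integrate along Y
    diff = np.abs(np.diff(profile))                   # differentiate along X
    x = int(np.argmax(diff))
    return x if diff[x] >= threshold else None

def edge_length_inches(edge_pixel_count):
    """Scale a count of pixels along an edge to a physical length."""
    return edge_pixel_count / PIXELS_PER_INCH
```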
- Another exemplary embodiment is to take an image of a target, such as a paper form (e.g. a shipping label or shipping form).
- Various types of data may automatically be collected by an image reader.
- the collection process involves the human operator placing a target, such as a form, in the field of view of an image reader.
- the operator may actuate a trigger on an image reader for any data type to be read or the reader may automatically image the target.
- the data shown may include typed text, an IBI, such as a two-dimensional barcode encoding a label number, a signature, hand-written text, etc.
- An image reader may be placed on a stand for viewing a document which may be placed on a surface or platen or the image reader may be pointed at the document.
- An image reader may be used as a document scanner or camera, as well as an IBI reader for use in certain exemplary situations, such as a shipping label, wherein a shipping company may desire to keep electronic records of packages or documents or forms.
- Forms may be of many different sizes and shapes, which may result in different image file sizes, some of which may be undesirably large. For example, a large document may take up the entire field of view of the image reader; however, a very small document may only take up a small portion of the imager field of view.
- a method to minimize image file size may be to binarize the image (i.e. turn it into a 1 bit-per-pixel image so that each dot or pixel is either black or white instead of grayscale), and then compress the data using a lossless algorithm such as CCITT T.6 Group 4 compression.
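- A minimal sketch of that binarize-and-compress step, assuming a Pillow build with libtiff (Group 4) support; the threshold value is illustrative:

```python
import numpy as np
from PIL import Image

def binarize_and_compress(path_in, path_out, threshold=128):
    """Turn a grayscale image into a 1 bit-per-pixel image and save it as a
    TIFF using lossless CCITT Group 4 compression."""
    gray = np.asarray(Image.open(path_in).convert("L"))
    bilevel = Image.fromarray((gray >= threshold).astype(np.uint8) * 255)
    bilevel.convert("1").save(path_out, format="TIFF", compression="group4")
```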
- an image cropping process is taking an image, looking at that image to determine a region or regions of interest, and cropping the image so that the resulting image only includes the region(s) of interest. The unwanted portions of the image are cut or cropped out of the image.
- Other exemplary embodiments of the invention may include correction of angular distortion or incorrect rotational orientation caused by improper location of the imager relative to the object being imaged.
- other image processing may be utilized on the image taken for different effects, such as flattening of the image (i.e. adjusting to make the dark/light contrast uniform across the image).
- An exemplary embodiment for cropping an image may be to search at least two digitized images, one image taken at full or high resolution and one taken with reduced or lower resolution, for nominally straight edges within the image(s). These nominally straight edges may then be characterized in terms of length and direction (i.e. vectors). By a histogram of the directions, a determination may be made as to which edge orientation predominates. All edges not nominally parallel or perpendicular to the predominant orientation may be discarded. A group of edges that comprise a form may then be chosen by their proximity to the center of the image and then their proximity to other remaining edge positions. The process may then transmute a rectangle bounding those edges into a rectified image.
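- The orientation-histogram step may be sketched as below, assuming each candidate edge has already been reduced to a (length, angle) pair; bin width and tolerance are illustrative parameters:

```python
import numpy as np

def predominant_orientation(edges, bin_degrees=5.0, tolerance=10.0):
    """Build a length-weighted histogram of edge directions, pick the
    predominant bin, and keep only edges nominally parallel or
    perpendicular to that orientation."""
    lengths = np.array([length for length, _ in edges], dtype=float)
    angles = np.array([angle for _, angle in edges], dtype=float) % 180.0
    bins = np.arange(0.0, 180.0 + bin_degrees, bin_degrees)
    hist, _ = np.histogram(angles, bins=bins, weights=lengths)
    dominant = bins[int(np.argmax(hist))] + bin_degrees / 2.0

    def near(a, b):
        d = abs((a - b) % 180.0)
        return min(d, 180.0 - d) <= tolerance

    kept = [e for e in edges
            if near(e[1] % 180.0, dominant)
            or near(e[1] % 180.0, (dominant + 90.0) % 180.0)]
    return dominant, kept
```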
- the digitized images may be binarized if they are captured as grayscale images. Binarization may be described as turning the pixels of an image from grayscale or pixels having multibit values to binary value pixels, so that each dot or pixel is either black (e.g. 1) or white (e.g. 0).
- the higher and lower resolution images may be derived or obtained from a single image capture taken by the image reader, viewed both at high resolution and at reduced or lower resolution. Full resolution may be considered the highest resolution.
- the searches for nominally straight edges may be done in succession. Both sets of straight edges may contribute to the same pool of candidate edges. Some edges may appear in both images, and contribute twice to the search.
- An exemplary histogram analysis may consist of a series of one-dimensional slices along horizontal and vertical directions defined relative to the orientation of edges.
- the value for each one-dimensional slice corresponds to the number of zero valued pixels along a pixel slice.
- An exemplary histogram analysis may provide a two-dimensional plot of the density of data element pixels in the image data. Edges may be determined with respect to a minimum density threshold for a certain number of sequential slices.
- a histogram analysis searches inwardly along both horizontal and vertical directions until the pixel density rises above a predefined cut-off threshold.
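- A sketch of that inward scan over slice densities, assuming the image has been binarized with dark data pixels coded as 1; the cut-off value is an illustrative parameter:

```python
import numpy as np

def crop_bounds(binary, cutoff):
    """Scan inward from each side of a binarized image, stopping when the
    count of dark pixels in a row or column slice rises above the cut-off;
    return the resulting bounding box (top, bottom, left, right)."""
    row_density = binary.sum(axis=1)     # dark pixels per horizontal slice
    col_density = binary.sum(axis=0)     # dark pixels per vertical slice

    def first_above(densities):
        hits = np.nonzero(densities > cutoff)[0]
        return int(hits[0]) if hits.size else 0

    top = first_above(row_density)
    bottom = len(row_density) - 1 - first_above(row_density[::-1])
    left = first_above(col_density)
    right = len(col_density) - 1 - first_above(col_density[::-1])
    return top, bottom, left, right
```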
- An exemplary embodiment for determining the region or regions of interest may be implemented depending on the complexity desired by a user.
- the region or regions of interest may be determined by having a known template on the surface where the document or form to be imaged is placed.
- the exemplary template may have a known pattern, such as evenly spaced dots or a grid of some type, so that placing a document on the grid breaks the pattern and reveals where the document is.
- the image may be electronically rotated if, for example, an operator does not place a form squarely with respect to the image reader when imaging the form.
- a processor looks at an image to determine a region or regions of interest and then crops the image (i.e. cutting portions of the image out) so that the resulting image only includes the region(s) of interest.
- An exemplary embodiment may comprise correction of angular distortion or rotational orientation caused by improper location or positioning of the imager relative to the object being imaged.
- An exemplary embodiment may comprise other image processing effects such as flattening of the image (i.e. adjusting to make the dark/light contrast uniform across the cropped image), and/or other filtering techniques which may help to make the resulting image look better to the human eye.
- An exemplary embodiment may comprise determining congruent lines of activity for rotational adjustment.
- a transformation matrix may be utilized to orient the image.
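- As an illustrative sketch, a rotation about the image center expressed as a 2x3 affine transformation matrix could be used for this orientation step; the helper names are hypothetical:

```python
import numpy as np

def rotation_matrix(degrees, center_xy):
    """2x3 affine matrix that rotates image coordinates by the given angle
    about a center point, e.g. to square up a skewed form."""
    theta = np.deg2rad(degrees)
    c, s = np.cos(theta), np.sin(theta)
    cx, cy = center_xy
    # Rotate about the origin, then translate so the pivot is the center.
    return np.array([[c, -s, cx - c * cx + s * cy],
                     [s,  c, cy - s * cx - c * cy]])

def apply_affine(points_xy, matrix):
    """Map an (N, 2) array of (x, y) points through the 2x3 affine matrix."""
    pts = np.hstack([points_xy, np.ones((len(points_xy), 1))])
    return pts @ matrix.T
```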
- Another exemplary embodiment is to use a reader stand relative to the area to be imaged for angular distortion correction.
- a programmed processor searches two binarized images, one full resolution and one with reduced resolution, for nominally straight edges, characterizing them in terms of length and direction. By a histogram of those directions it determines which orientation predominates, discarding all edges not nominally parallel or perpendicular to it. By proximity to the center, and then to other remaining edge positions, it chooses a group of edges that may comprise a form. It then transmutes the rectangle bounding those edges into a rectified image.
- the two binarized images may be derived from a single captured image, viewed both at full resolution and at reduced resolution. The searches may be done in succession. Both sets of nominally straight edges may contribute to the same pool of candidate edges. Some edges may appear in both images, and contribute twice.
- Another exemplary embodiment for finding the region, or regions, of interest is by mapping the energy of the image.
- High energy areas (i.e. areas with large pixel value variation in relatively close proximity, thus representing high contrast areas) may be treated as regions of interest.
- an area within the image may be determined by including all of these regions of interest, and the image is cropped to that new area.
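- One plausible way to map image energy is local variance of pixel values, sketched below with SciPy's uniform filter; the window size and threshold are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def energy_map(gray, window=15):
    """Local variance of pixel values: neighborhoods with large variation
    in close proximity (high contrast) receive high energy scores."""
    gray = gray.astype(float)
    mean = ndimage.uniform_filter(gray, size=window)
    mean_sq = ndimage.uniform_filter(gray * gray, size=window)
    return np.clip(mean_sq - mean * mean, 0.0, None)   # E[x^2] - E[x]^2

def bounding_box_of_interest(energy, threshold):
    """Smallest box containing every high-energy pixel; crop to this box."""
    ys, xs = np.nonzero(energy > threshold)
    if ys.size == 0:
        return None
    return ys.min(), ys.max(), xs.min(), xs.max()
```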
- Another exemplary technique may include determining congruent lines of activity in the image for rotational adjustment.
- Another exemplary embodiment is to provide the location of a stand relative to the area to be imaged, wherein angular distortion may be determined and a correction applied.
- Edge data may be utilized to establish predominant orthogonal horizontal and/or vertical directions, which may be interpreted as representing features of a form if the presence of a form has not been assumed. If the predominant horizontal and/or vertical edges have been established, horizontal and/or vertical outer form boundaries may be established.
- predominant horizontal edges may be established first, and then predominant vertical edges.
- FIGs. 10A-10C show another exemplary embodiment to aid the process of software-driven package dimensioning.
- This embodiment 1010 has a light emitting surface 1014 that receives objects (packages) so that imager-based products (or any package dimensioning product) can use the surface for determining package dimensions.
- This embodiment may or may not have a translucent film (either permanent or removable) to control the amount of reflection.
- This film may or may not be textured to aid in the desired contrast. This will have the ability to control the amount and brightness of the light.
- This surface may or may not have a grid pattern applied to it to aid in the package dimensioning software edge detection techniques.
- the device may or may not have the ability to exchange different types of light sources.
- the apparatus 1010 has a light source or sources 1041 disposed in the lower portion of a housing.
- a surface 1014, usually translucent or transparent, covers the light source or sources 1041.
- a rectangular package 1030 is placed on the surface of cover 1014.
- the light source 1041 is turned on to generate light.
- Light striking the bottom surface of the package 1030 is reflected back toward the bottom of the apparatus 1010.
- a reader obtains an image of the illuminated package.
- a suitable processor and suitable software are used to process the data signals and provide detection of the edges of the bottom surface of the package 1030. By counting the number of pixels in the edges and knowing the scale of the pixels to the cover surface, one may calculate the length of the edges.
- the apparatus 1010 has indicia 1012 on the edges of the cover 1014.
- the indicia are evenly spaced, for example, one-eighth of an inch apart, and are in the same plane as the bottom of the package.
- an image that captures the bottom surface of the package and the indicia of the cover can be processed using the indicia to measure the edges.
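- A short sketch of how the evenly spaced indicia can yield the pixel scale used in that measurement; the spacing constant follows the one-eighth-inch example above, while the function name and sample positions are illustrative:

```python
import numpy as np

INDICIA_SPACING_INCHES = 0.125   # per the example above: marks 1/8 inch apart

def pixels_per_inch_from_indicia(indicia_pixel_positions):
    """Derive pixels-per-inch from the pixel coordinates of evenly spaced
    indicia detected along one edge of the cover."""
    positions = np.sort(np.asarray(indicia_pixel_positions, dtype=float))
    mean_gap_px = float(np.mean(np.diff(positions)))
    return mean_gap_px / INDICIA_SPACING_INCHES

# Example: indicia found every ~25 pixels -> 25 / 0.125 = 200 pixels per inch
print(pixels_per_inch_from_indicia([10, 35, 60, 85, 110]))
```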
- the surface can be coated with a retro-reflective material and light striking the surface at a critical angle will be reflected back, leaving the package dark.
- the edge of any color box in relation to its surroundings will be enhanced making package dimensioning detection easier and more accurate.
- This embodiment can be used for any package dimensioning system.
- Another exemplary embodiment is to image a form, and recognize or determine certain check boxes within a target for a person to check or mark regarding the answers to certain questions or other certain information.
- the check boxes may be found or determined, and then a determination may be made as to whether those boxes have been checked or marked.
- the boxes which are marked may be correlated with a lookup table or other information to determine certain information, such as requirements, instructions, insurance information, time, etc.
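- A sketch of finding marked boxes and correlating them with a lookup table; the box coordinates, fill threshold, and table entries are hypothetical placeholders:

```python
import numpy as np

# Hypothetical layout: box name -> (top, left, height, width) in pixels,
# and a lookup table mapping checked boxes to handling information.
CHECK_BOXES = {"fragile": (120, 40, 18, 18), "insured": (150, 40, 18, 18)}
LOOKUP = {"fragile": "Handle with care", "insured": "Record declared value"}

def checked_boxes(binary_form, fill_threshold=0.25):
    """Return the names of boxes whose region contains enough dark pixels
    (coded as 1) to be considered marked."""
    marked = []
    for name, (top, left, h, w) in CHECK_BOXES.items():
        region = binary_form[top:top + h, left:left + w]
        if region.mean() >= fill_threshold:
            marked.append(name)
    return marked

def information_for(binary_form):
    """Correlate the marked boxes with the lookup table."""
    return [LOOKUP[name] for name in checked_boxes(binary_form)]
```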
- Another exemplary embodiment is to image a form, and recognize or determine known features on a target, such as lines, text, logos, etc. and then locate check boxes based on the known or predetermined location.
- Another exemplary embodiment is to image a form, and recognize, determine or locate places which can't be automatically read, and image that part for further image processing.
- An exemplary embodiment uses a PDT to take an image of a package placed on a reference grid. Once the image is captured, image processing software may be used to determine if the image is of acceptable quality.
- Another exemplary embodiment is to accept a user input via the touch screen or the keyboard of the PDT to indicate the quality of the image or input some other data for inclusion in dimensioning calculations.
- Another exemplary embodiment is to have the corners of the package in the image marked or noted utilizing a touch screen on the PDT.
- One embodiment is a client-server architecture where a client application on the PDT is used to acquire an image and possibly provide additional input into the system. The client application can then wirelessly transmit the image and any additional data to a server application. The server application can then perform the dimensioning logic and provide feedback to the client application, for instance wirelessly, such as by Bluetooth, 802.11, or another wireless communication methodology.
- Another embodiment is to have all the image processing and dimensioning software logic resident in the application running on the PDT, thereby eliminating the need to have a server application for this function.
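- A very small sketch of the client side of such an exchange, assuming an HTTP transport and a hypothetical server endpoint; the host name, port, and message fields are illustrative only:

```python
import http.client
import json

SERVER_HOST = "dimensioning-server.local"   # hypothetical server address

def send_capture(image_bytes, extra_data):
    """Client side: post the captured image and any operator input to the
    dimensioning server and return its response (e.g. computed dimensions)."""
    conn = http.client.HTTPConnection(SERVER_HOST, 8080, timeout=10)
    body = json.dumps({"image_hex": image_bytes.hex(), "data": extra_data})
    conn.request("POST", "/dimension", body,
                 headers={"Content-Type": "application/json"})
    result = json.loads(conn.getresponse().read())
    conn.close()
    return result   # e.g. {"length_in": ..., "width_in": ..., "height_in": ...}
```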
- FIGs. 11 and 12 show exemplary embodiments of an apparatus and a system for detecting the weight of a package in addition to taking its external dimensions.
- a package 1130 is placed on top of an RF enabled scale 1120.
- a PDT device 1112 running a client application takes an image of the box on the scale while the scale calculates the weight of the package.
- the client application may initiate the beginning of the weighing programmatically.
- Both the scale 1120 and the PDT 1112 will communicate either by 802.11a/g/b or by some other standard wireless communication technology to the server application on the PC.
- the PDT will send the appropriate package dimensioning data to the server while the scale will send the weight of the package to the server.
- the server 1140 will calculate the dimensions of the package from one or more of the other measurement techniques disclosed above. Abiding by the business logic of any given company, the server 1140 will determine if the package's shipping price will be calculated by the weight or by the volume. The server will calculate the appropriate rate and will either send the data to a printer 1142 to print the shipping label or it will send the shipping label to the PDT so that the PDT can send the shipping label information to the printer. (This depends on the environment and physical setup of the infrastructure / equipment.)
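- A toy sketch of such weight-versus-volume business logic, using an illustrative rate and dimensional-weight divisor rather than any particular company's values:

```python
def shipping_price(weight_lb, length_in, width_in, height_in,
                   rate_per_lb=0.55, dim_divisor=139.0):
    """Price by actual weight or by dimensional (volume-based) weight,
    whichever is greater -- one common form of carrier business logic."""
    dim_weight_lb = (length_in * width_in * height_in) / dim_divisor
    billable_lb = max(weight_lb, dim_weight_lb)
    return round(billable_lb * rate_per_lb, 2), billable_lb

# Example: a 20 x 12 x 10 in box weighing 4 lb -> dimensional weight ~17.3 lb,
# so the volume-based figure governs the price.
price, billed = shipping_price(weight_lb=4.0, length_in=20, width_in=12, height_in=10)
```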
- FIG. 12 shows another exemplary embodiment, which provides a stand-alone PDT implementation.
- a package 1230 is placed on top of an RF enabled scale 1220.
- a PDT 1212 running a software server application takes an image of the box 1230 on the scale 1220 while the scale calculates the weight of the package.
- the PDT application may initiate the beginning of the weighing programmatically.
- Both the scale 1220 and the PDT 1212 will communicate either by 802.11a/g/b or by some other standard wireless communication technology to the server application on the PC.
- the PDT 1212 will calculate the dimensions of the package 1230 while the scale 1220 will send the weight of the package to the PDT 1212.
- the server running on the PDT will abide by the business logic of the given company.
- the server will determine if the package's shipping price will be calculated by the weight or by the volume.
- the server will calculate the appropriate rate and will send the data to a printer 1242 to print the shipping label.
- the printer is incorporated into the PDT 1212.
- the embodiments disclosed and claimed herein add more functions to the PDT, including measuring one or more parameters of packages, such as their dimensions and weight.
- Many functions of electrical and electronic apparatus may be implemented in hardware (for example, hard-wired logic), in software (for example, logic encoded in a program operating on a general purpose processor), and in firmware (for example, logic encoded in a non-volatile memory that is invoked for operation on a processor as required). Substitution of one implementation in hardware, firmware or software for another implementation of the equivalent functionality using a different one of hardware, firmware and software may be considered.
- any implementation of the transfer function including any combination of hardware, firmware and software implementations of portions or segments of the transfer function may be considered.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Electromagnetism (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Toxicology (AREA)
- Health & Medical Sciences (AREA)
- Optics & Photonics (AREA)
- Image Input (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US93412007P | 2007-06-11 | 2007-06-11 | |
| PCT/US2008/066615 WO2008154611A2 (en) | 2007-06-11 | 2008-06-11 | Optical reader system for extracting information in a digital image |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| EP2165289A2 true EP2165289A2 (en) | 2010-03-24 |
| EP2165289A4 EP2165289A4 (en) | 2012-07-04 |
Family
ID=40130488
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP08770756A Withdrawn EP2165289A4 (en) | 2007-06-11 | 2008-06-11 | Optical reader system for extracting information in a digital image |
Country Status (2)
| Country | Link |
|---|---|
| EP (1) | EP2165289A4 (en) |
| WO (1) | WO2008154611A2 (en) |
Families Citing this family (60)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9741134B2 (en) | 2013-12-16 | 2017-08-22 | Symbol Technologies, Llc | Method and apparatus for dimensioning box object |
| US9627159B2 (en) | 2014-10-21 | 2017-04-18 | Motorola Solutions, Inc. | Method and apparatus for providing slide actuation on a device |
| US9396554B2 (en) | 2014-12-05 | 2016-07-19 | Symbol Technologies, Llc | Apparatus for and method of estimating dimensions of an object associated with a code in automatic response to reading the code |
| US10352689B2 (en) | 2016-01-28 | 2019-07-16 | Symbol Technologies, Llc | Methods and systems for high precision locationing with depth values |
| US10145955B2 (en) | 2016-02-04 | 2018-12-04 | Symbol Technologies, Llc | Methods and systems for processing point-cloud data with a line scanner |
| US10721451B2 (en) | 2016-03-23 | 2020-07-21 | Symbol Technologies, Llc | Arrangement for, and method of, loading freight into a shipping container |
| US9805240B1 (en) | 2016-04-18 | 2017-10-31 | Symbol Technologies, Llc | Barcode scanning and dimensioning |
| US10776661B2 (en) | 2016-08-19 | 2020-09-15 | Symbol Technologies, Llc | Methods, systems and apparatus for segmenting and dimensioning objects |
| US11042161B2 (en) | 2016-11-16 | 2021-06-22 | Symbol Technologies, Llc | Navigation control method and apparatus in a mobile automation system |
| US10451405B2 (en) | 2016-11-22 | 2019-10-22 | Symbol Technologies, Llc | Dimensioning system for, and method of, dimensioning freight in motion along an unconstrained path in a venue |
| US10354411B2 (en) | 2016-12-20 | 2019-07-16 | Symbol Technologies, Llc | Methods, systems and apparatus for segmenting objects |
| US10591918B2 (en) | 2017-05-01 | 2020-03-17 | Symbol Technologies, Llc | Fixed segmented lattice planning for a mobile automation apparatus |
| US10949798B2 (en) | 2017-05-01 | 2021-03-16 | Symbol Technologies, Llc | Multimodal localization and mapping for a mobile automation apparatus |
| US10726273B2 (en) | 2017-05-01 | 2020-07-28 | Symbol Technologies, Llc | Method and apparatus for shelf feature and object placement detection from shelf images |
| US11449059B2 (en) | 2017-05-01 | 2022-09-20 | Symbol Technologies, Llc | Obstacle detection for a mobile automation apparatus |
| US10663590B2 (en) | 2017-05-01 | 2020-05-26 | Symbol Technologies, Llc | Device and method for merging lidar data |
| WO2018204342A1 (en) | 2017-05-01 | 2018-11-08 | Symbol Technologies, Llc | Product status detection system |
| US11978011B2 (en) | 2017-05-01 | 2024-05-07 | Symbol Technologies, Llc | Method and apparatus for object status detection |
| US11367092B2 (en) | 2017-05-01 | 2022-06-21 | Symbol Technologies, Llc | Method and apparatus for extracting and processing price text from an image set |
| US11600084B2 (en) | 2017-05-05 | 2023-03-07 | Symbol Technologies, Llc | Method and apparatus for detecting and interpreting price label text |
| US10572763B2 (en) | 2017-09-07 | 2020-02-25 | Symbol Technologies, Llc | Method and apparatus for support surface edge detection |
| US10521914B2 (en) | 2017-09-07 | 2019-12-31 | Symbol Technologies, Llc | Multi-sensor object recognition system and method |
| US10859363B2 (en) | 2017-09-27 | 2020-12-08 | Stanley Black & Decker, Inc. | Tape rule assembly with linear optical encoder for sensing human-readable graduations of length |
| US10823572B2 (en) | 2018-04-05 | 2020-11-03 | Symbol Technologies, Llc | Method, system and apparatus for generating navigational data |
| US11327504B2 (en) | 2018-04-05 | 2022-05-10 | Symbol Technologies, Llc | Method, system and apparatus for mobile automation apparatus localization |
| US10809078B2 (en) | 2018-04-05 | 2020-10-20 | Symbol Technologies, Llc | Method, system and apparatus for dynamic path generation |
| US10740911B2 (en) | 2018-04-05 | 2020-08-11 | Symbol Technologies, Llc | Method, system and apparatus for correcting translucency artifacts in data representing a support structure |
| US10832436B2 (en) | 2018-04-05 | 2020-11-10 | Symbol Technologies, Llc | Method, system and apparatus for recovering label positions |
| US11506483B2 (en) | 2018-10-05 | 2022-11-22 | Zebra Technologies Corporation | Method, system and apparatus for support structure depth determination |
| US11010920B2 (en) | 2018-10-05 | 2021-05-18 | Zebra Technologies Corporation | Method, system and apparatus for object detection in point clouds |
| US12243001B1 (en) | 2018-10-09 | 2025-03-04 | Fida, Llc | Multilayered method and apparatus to facilitate the accurate calculation of freight density, area, and classification and provide recommendations to optimize shipping efficiency |
| US11379788B1 (en) | 2018-10-09 | 2022-07-05 | Fida, Llc | Multilayered method and apparatus to facilitate the accurate calculation of freight density, area, and classification and provide recommendations to optimize shipping efficiency |
| US11090811B2 (en) | 2018-11-13 | 2021-08-17 | Zebra Technologies Corporation | Method and apparatus for labeling of support structures |
| US11003188B2 (en) | 2018-11-13 | 2021-05-11 | Zebra Technologies Corporation | Method, system and apparatus for obstacle handling in navigational path generation |
| US11416000B2 (en) | 2018-12-07 | 2022-08-16 | Zebra Technologies Corporation | Method and apparatus for navigational ray tracing |
| US11079240B2 (en) | 2018-12-07 | 2021-08-03 | Zebra Technologies Corporation | Method, system and apparatus for adaptive particle filter localization |
| US11100303B2 (en) | 2018-12-10 | 2021-08-24 | Zebra Technologies Corporation | Method, system and apparatus for auxiliary label detection and association |
| US11015938B2 (en) | 2018-12-12 | 2021-05-25 | Zebra Technologies Corporation | Method, system and apparatus for navigational assistance |
| US10731970B2 (en) | 2018-12-13 | 2020-08-04 | Zebra Technologies Corporation | Method, system and apparatus for support structure detection |
| CA3028708A1 (en) | 2018-12-28 | 2020-06-28 | Zih Corp. | Method, system and apparatus for dynamic loop closure in mapping trajectories |
| US11662739B2 (en) | 2019-06-03 | 2023-05-30 | Zebra Technologies Corporation | Method, system and apparatus for adaptive ceiling-based localization |
| US11080566B2 (en) | 2019-06-03 | 2021-08-03 | Zebra Technologies Corporation | Method, system and apparatus for gap detection in support structures with peg regions |
| US11151743B2 (en) | 2019-06-03 | 2021-10-19 | Zebra Technologies Corporation | Method, system and apparatus for end of aisle detection |
| US11341663B2 (en) | 2019-06-03 | 2022-05-24 | Zebra Technologies Corporation | Method, system and apparatus for detecting support structure obstructions |
| US11200677B2 (en) | 2019-06-03 | 2021-12-14 | Zebra Technologies Corporation | Method, system and apparatus for shelf edge detection |
| US11960286B2 (en) | 2019-06-03 | 2024-04-16 | Zebra Technologies Corporation | Method, system and apparatus for dynamic task sequencing |
| US11402846B2 (en) | 2019-06-03 | 2022-08-02 | Zebra Technologies Corporation | Method, system and apparatus for mitigating data capture light leakage |
| FR3099271B1 (en) * | 2019-07-25 | 2021-07-02 | Spirtech | Method of identifying and coupling a target terminal close to a portable object among a plurality of terminals within the range of wireless communication with the portable object |
| TWI709913B (en) | 2019-10-28 | 2020-11-11 | 阿丹電子企業股份有限公司 | Multifunctional handheld scanner |
| US11507103B2 (en) | 2019-12-04 | 2022-11-22 | Zebra Technologies Corporation | Method, system and apparatus for localization-based historical obstacle handling |
| US11107238B2 (en) | 2019-12-13 | 2021-08-31 | Zebra Technologies Corporation | Method, system and apparatus for detecting item facings |
| US11822333B2 (en) | 2020-03-30 | 2023-11-21 | Zebra Technologies Corporation | Method, system and apparatus for data capture illumination control |
| US11450024B2 (en) | 2020-07-17 | 2022-09-20 | Zebra Technologies Corporation | Mixed depth object detection |
| CN114079761B (en) * | 2020-08-21 | 2024-05-17 | 深圳市环球数码科技有限公司 | Integrated movie player and DLP projection device conforming to DCI standard |
| US11593915B2 (en) | 2020-10-21 | 2023-02-28 | Zebra Technologies Corporation | Parallax-tolerant panoramic image generation |
| US11392891B2 (en) | 2020-11-03 | 2022-07-19 | Zebra Technologies Corporation | Item placement detection and optimization in material handling systems |
| US11847832B2 (en) | 2020-11-11 | 2023-12-19 | Zebra Technologies Corporation | Object classification for autonomous navigation systems |
| CN113435240B (en) * | 2021-04-13 | 2024-06-14 | 北京易道博识科技有限公司 | End-to-end form detection and structure identification method and system |
| US11954882B2 (en) | 2021-06-17 | 2024-04-09 | Zebra Technologies Corporation | Feature-based georegistration for mobile computing devices |
| CN118071978B (en) * | 2024-04-19 | 2024-07-09 | 陕西正浩电力科技有限公司 | Goods shelf positioning method and system based on data processing |
Family Cites Families (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5001658A (en) * | 1989-06-19 | 1991-03-19 | Scan Systems, Inc. | Pipe length tally system |
| US7387253B1 (en) * | 1996-09-03 | 2008-06-17 | Hand Held Products, Inc. | Optical reader system comprising local host processor and optical reader |
| ATE462164T1 (en) * | 1998-04-30 | 2010-04-15 | Anoto Group Ab | APPARATUS AND METHOD FOR RECORDING HANDWRITTEN INFORMATION |
| US6336587B1 (en) * | 1998-10-19 | 2002-01-08 | Symbol Technologies, Inc. | Optical code reader for producing video displays and measuring physical parameters of objects |
| KR20020016322A (en) * | 2000-08-25 | 2002-03-04 | 박종규 | A portable multimedia wireless personal computer |
| US6995762B1 (en) * | 2001-09-13 | 2006-02-07 | Symbol Technologies, Inc. | Measurement of dimensions of solid objects from two-dimensional image(s) |
| US7289230B2 (en) * | 2002-02-06 | 2007-10-30 | Cyberoptics Semiconductors, Inc. | Wireless substrate-like sensor |
| KR20030082128A (en) * | 2002-04-16 | 2003-10-22 | 엘지전자 주식회사 | System of mouse include iris recognition of pc |
- 2008
- 2008-06-11 WO PCT/US2008/066615 patent/WO2008154611A2/en not_active Ceased
- 2008-06-11 EP EP08770756A patent/EP2165289A4/en not_active Withdrawn
Also Published As
| Publication number | Publication date |
|---|---|
| EP2165289A4 (en) | 2012-07-04 |
| WO2008154611A3 (en) | 2009-02-05 |
| WO2008154611A2 (en) | 2008-12-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2008154611A2 (en) | Optical reader system for extracting information in a digital image | |
| AU2003297246B2 (en) | System and method for verifying optical code reads and RFID reads | |
| US8645216B2 (en) | Enrollment apparatus, system, and method | |
| JP6499386B2 (en) | Collection of vehicle performance using PDT | |
| US6076731A (en) | Magnetic stripe reader with signature scanner | |
| US7066388B2 (en) | System and method for verifying RFID reads | |
| US9773142B2 (en) | System and method for selectively reading code symbols | |
| US20140351073A1 (en) | Enrollment apparatus, system, and method featuring three dimensional camera | |
| WO2015021473A1 (en) | Apparatus, systems and methods for enrollment of irregular shaped objects | |
| US20150003673A1 (en) | Dimensioning system | |
| EP3163497A1 (en) | Image transformation for indicia reading | |
| EP1708118A2 (en) | Combination RFID/image reader | |
| US8994513B2 (en) | Adaptive multi-sensor handheld computing device | |
| JP2012066807A5 (en) | ||
| WO2013170260A1 (en) | Hand held dimension capture apparatus, system, and method | |
| EP2079036A2 (en) | System and method for logo identification and verification | |
| EP2320350B1 (en) | Annotation of optical images on a mobile device | |
| GB2595775A (en) | Using barcodes to determine item dimensions | |
| US20110261203A1 (en) | Imaging scanner utilized as a cctv camera | |
| EP1755065B1 (en) | System and method for verifying optical code reads and RFID reads | |
| JP2009129269A (en) | Information reader and information reading method | |
| EP2397966A1 (en) | Apparatus for detecting and processing data in cash desk | |
| AU2007226813A1 (en) | System and method for verifying optical code reads and RFID reads | |
| Banta et al. | Automated tag reading system for material tracking |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
| 17P | Request for examination filed |
Effective date: 20091209 |
|
| AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR |
|
| AX | Request for extension of the european patent |
Extension state: AL BA MK RS |
|
| A4 | Supplementary search report drawn up and despatched |
Effective date: 20120604 |
|
| RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06K 7/10 20060101AFI20120529BHEP Ipc: G06T 7/00 20060101ALI20120529BHEP |
|
| DAX | Request for extension of the european patent (deleted) | ||
| 17Q | First examination report despatched |
Effective date: 20121120 |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
| 18D | Application deemed to be withdrawn |
Effective date: 20130403 |