
WO2025024080A1 - System and methods for determining glass ribbon curvature - Google Patents


Info

Publication number
WO2025024080A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixels
curvature
glass sheet
pixel value
image
Prior art date
Legal status
Pending
Application number
PCT/US2024/035367
Other languages
French (fr)
Inventor
Chih-Hua Cheng
Hsiang-Rong CHU
Po-Keng CHUNG
Current Assignee
Corning Inc
Original Assignee
Corning Inc
Priority date
Filing date
Publication date
Application filed by Corning Inc
Publication of WO2025024080A1
Legal status: Pending


Classifications

    • G06T 7/0004: Industrial image inspection (Image analysis; Inspection of images, e.g. flaw detection)
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G06T 7/13: Edge detection (Segmentation; Edge detection)
    • G06T 7/136: Segmentation; Edge detection involving thresholding
    • G06T 2207/10048: Infrared image (Image acquisition modality)
    • G06T 2207/30108: Industrial image inspection (Subject of image; Context of image processing)
    • G06T 2207/30116: Casting (Subject of image; Context of image processing)

Definitions

  • the glass sheet curvature computing device 160 can receive IR images from each of the first IR camera 130 and the second IR camera 140 over communication network 116.
  • the first IR camera 130 may capture an IR image of the first edge 151 of the glass sheet 150, and may transmit the IR image to the glass sheet curvature computing device 160 using communication network 116.
  • the second IR camera 140 may capture an IR image of the second edge 153 of the glass sheet 150, and may transmit the IR image to the glass sheet curvature computing device 160 using communication network 116.
  • the glass sheet curvature computing device 160 can transmit a message to any of the first IR camera 130 and the second IR camera 140 to adjust their corresponding fields-of-view.
  • the glass sheet curvature computing device 160 may transmit a message to the first IR camera 130 to cause the first IR camera 130 to adjust a pitch or yaw of its corresponding sensor.
  • the glass sheet curvature computing device 160 may transmit a message to the second IR camera 140 to cause the second IR camera 140 to adjust a pitch or yaw of its corresponding sensor.
  • the glass sheet curvature computing device 160 can transmit a message to any of the first IR camera 130 and the second IR camera 140 to adjust a focus of a corresponding lens that focuses light to the corresponding sensor.
  • the glass sheet curvature computing device 160 may transmit a message to the first IR camera 130 to increase or decrease a focal length of the first IR camera 130.
  • the glass sheet curvature computing device 160 may transmit a message to the second IR camera 140 to increase or decrease a focal length of the second IR camera 140.
  • the video camera 180 is configured to capture images (e.g., still images, video) of the glass sheet 150 along a longitudinal length of a front side 159 of the glass sheet 150.
  • the glass sheet curvature computing device 160 may transmit a signal to the video camera 180 to cause the video camera to capture an image.
  • the video camera 180 can transmit the captured images to the glass sheet curvature computing device 160.
  • the glass sheet curvature computing device 160 can receive the images, and can store the images within the database 170.
  • the glass sheet curvature computing device 160 can transmit one or more messages to the video camera 180 to adjust its corresponding field-of-view, as well as to adjust its focal length.
  • the first IR camera 130 captures multiple images.
  • the glass sheet curvature computing device 160 may receive a signal from the glass forming apparatus 120 (or other source) indicating that glass drawdown is beginning, and based on the signal, the glass sheet curvature computing device 160 may transmit a message to the first IR camera 130 to cause the first IR camera 130 to capture an IR image of the first edge 151 of the glass sheet 150 as it is being drawn down.
  • the glass sheet curvature computing device 160 delays a predetermined amount of time after receiving the signal and, once the predetermined amount of time has expired, transmits the message to the first IR camera 130 to capture the IR image. Similarly, the glass sheet curvature computing device 160 may transmit a message to the second IR camera 140 to capture an IR image of the second edge 153 of the glass sheet 150 as it is being drawn down.
  • the first IR camera 130 may transmit the captured IR image to the glass sheet curvature computing device 160.
  • the glass sheet curvature computing device 160 may determine a subset of the plurality of pixels of the IR image based on their pixel values and a pixel value threshold. Further, the glass sheet curvature computing device 160 may generate curvature data characterizing a curvature of the glass sheet based on a position of the subset of the plurality of pixels within the infrared image.
  • the glass sheet curvature computing device 160 may apply a color filtering process to the IR image to generate a color filtered image that identifies pixels within a corresponding color range.
  • the color filtering process may “filter out” all pixels not within the range.
  • the corresponding color range may correspond to the color of pixels that IR images indicate are within a range of heat values.
  • the color range may correspond to blue and/or white colors.
  • the color range includes a range for each of multiple color channels, such as red, green, and blue (RGB) color channels or luminance and chrominance (YCbCr) color channels.
  • the glass sheet curvature computing device 160 may apply a region-of-interest (ROI) mask to the color filtered image to mask out unwanted pixels, and may adjust the values of the remaining pixels based on a binary threshold to generate a binary image. For instance, the glass sheet curvature computing device 160 may compare each pixel value of the remaining pixels to the binary threshold and replace pixel values that are at or above the binary threshold with a first value (e.g., 355), and pixel values that are below the binary threshold with a second value (e.g., 0). The glass sheet curvature computing device 160 may also perform operations to generate multiple windows within the binary image, and determine the coordinates of all pixels with pixel values of the first value.
  • the glass sheet curvature computing device 160 may generate line data characterizing a line based on the determined pixel coordinates. For instance, the glass sheet curvature computing device 160 may apply a line algorithm to fit a line to the determined coordinates of those pixels with pixel values of the first value. Based on the line data, the glass sheet curvature computing device 160 may determine the curvature of the glass sheet 150. For example, the glass sheet curvature computing device 160 may generate curvature data characterizing the curvature of the glass sheet 150 based on applying a curvature algorithm to the line data. The glass sheet curvature computing device 160 may store the curvature data within the database 170. In some instances, the glass sheet curvature computing device 160 may generate curvature data similarly based on an IR image received from the second IR camera 140.
  • the glass sheet curvature computing device 160 receives one or more images from the video camera 180 (e.g., in response to the glass sheet curvature computing device 160 transmitting a signal to the video camera 180 to cause the video camera to capture an image).
  • the one or more images include the front side 159 of the glass sheet 150.
  • the glass sheet curvature computing device 160 may generate a 3D map characterizing the glass sheet 150. For instance, the glass sheet 150 may be drawn down by one or more pulling rolls.
  • the glass sheet curvature computing device 160 may determine a curvature of a portion of the glass sheet 150 (e.g., where the glass sheet bows in a Z direction) that is below a last pulling roll (e.g., the “bottom of the draw” portion of the glass sheet 150). The glass sheet curvature computing device 160 may further determine a position of this portion of the glass sheet 150 based on the images from the video camera (e.g., an X, Y position). Based on combining the curvature and the position of the portion of the glass sheet 150 that is below the last pulling roll, the glass sheet curvature computing device 160 may compute and generate the 3D map. As such, the 3D map may include a 3D image of the glass sheet 150 with the determined curvature (e.g., bow). The glass sheet curvature computing device 160 may store the 3D map in the database 170.
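  • As a hedged illustration of combining the bow curvature with (x, y) positions taken from the front-side image, the Python sketch below builds an (N, 3) point set that could back such a 3D map. The function name, the quadratic bow model, and the sample grid are assumptions for illustration only.

```python
import numpy as np

def build_bow_map(front_xy: np.ndarray, quad_coeffs: tuple, y_origin: float = 0.0):
    """Sketch: attach a Z-direction bow, derived from the fitted edge quadratic,
    to (x, y) positions sampled from the front-side image."""
    a, b, c = quad_coeffs
    y = front_xy[:, 1] - y_origin
    z = (a * y ** 2 + b * y + c) - c   # deviation from the fit's value at y_origin
    return np.column_stack([front_xy[:, 0], front_xy[:, 1], z])

# Hypothetical usage: a coarse grid of (x, y) samples from the bottom-of-draw region.
xs, ys = np.meshgrid(np.linspace(0, 640, 9), np.linspace(300, 480, 7))
points_3d = build_bow_map(np.column_stack([xs.ravel(), ys.ravel()]), (2e-4, 0.0, 400.0))
print(points_3d.shape)  # (63, 3)
```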
  • FIG. 2 illustrates an exemplary glass sheet curvature computing device 200, such as the glass sheet curvature computing device 160 of FIG. 1.
  • glass sheet curvature computing device 200 can include one or more processors 201, working memory 202, one or more input/output devices 203, instruction memory 207, a transceiver 204, and a display 206, all operatively coupled to one or more data buses 208.
  • Data buses 208 allow for communication among the various devices.
  • Data buses 208 can include wired, or wireless, communication channels.
  • Processors 201 can include one or more distinct processors, each having one or more cores. Each of the distinct processors can have the same or different structure.
  • Processors 201 can include one or more central processing units (CPUs), one or more graphics processing units (GPUs), application specific integrated circuits (ASICs), digital signal processors (DSPs), and the like.
  • Processors 201 can be configured to perform a certain function or operation by executing code, stored on instruction memory 207, embodying the function or operation.
  • processors 201 can be configured to perform one or more of any function, method, or operation disclosed herein.
  • Instruction memory 207 can store instructions that can be accessed (e.g., read) and executed by processors 201.
  • instruction memory 207 can be a non-transitory, computer-readable storage medium such as a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), flash memory, a removable disk, CD-ROM, any non-volatile memory, or any other suitable memory.
  • processors 201 can store data to, and read data from, working memory 202.
  • processors 201 can store a working set of instructions to working memory 202, such as instructions loaded from instruction memory 207.
  • Processors 201 can also use working memory 202 to store dynamic data created during the operation of glass sheet curvature computing device 200.
  • Working memory 202 can be a random access memory (RAM) such as a static random access memory (SRAM) or dynamic random access memory (DRAM), or any other suitable memory.
  • Input-output devices 203 can include any suitable device that allows for data input or output.
  • input-output devices 203 can include one or more of a keyboard, a touchpad, a mouse, a stylus, a touchscreen, a physical button, a speaker, a microphone, or any other suitable input or output device.
  • Input-output device 203 may allow a user to provide input selecting or characterizing preferred operational values, such as IR camera positions, for instance.
  • Display 209 can display user interface 205.
  • User interface 205 can enable user interaction with the glass sheet curvature computing device 200.
  • user interface 205 can be a user interface for an application that allows a user to enter preferred operational values to update a position of an IR camera, such as a position of the first IR camera 130, the second IR camera 140, and the video camera 180, for example.
  • a user can interact with user interface 205 by engaging input-output devices 203.
  • display 209 can be a touchscreen, where user interface 205 is displayed on the touchscreen.
  • Transceiver 204 allows for communication with a network, such as the communication network 116.
  • transceiver 204 may connect to a Wi-Fi®, Bluetooth®, cellular, or any other suitable wireless network, and may send signals (e.g., data) to, and receive signals from, other devices coupled to the wireless network.
  • Processor(s) 201 is operable to receive data from, or send data to, the wireless network via transceiver 204.
  • FIG. 3 illustrates exemplary functions of the glass sheet curvature computing device 200.
  • glass sheet curvature computing device 200 includes IR image generation engine 302, color filter engine 304, region-of-interest (ROI) application engine 306, binary threshold engine 308, slide window application engine 310, line generation engine 312, and curvature determination engine 314.
  • one or more of the IR image generation engine 302, color filter engine 304, ROI application engine 306, binary threshold engine 308, slide window application engine 310, line generation engine 312, and curvature determination engine 314 include instructions stored in instruction memory 207 that, when executed by the one or more processors 201, cause the one or more processors 201 to carry out the corresponding operations.
  • one or more of the IR image generation engine 302, color filter engine 304, ROI application engine 306, binary threshold engine 308, slide window application engine 310, line generation engine 312, and curvature determination engine 314 can be implemented in logic, such as digital logic, state machines, FPGAs, ASICs, or any other suitable logic.
  • IR image generation engine 302 may receive IR data 301.
  • IR image generation engine 302 may receive IR data 301 from the first IR camera 130.
  • the IR data 301 characterizes an IR image, such as an IR image of the first edge 151 of the glass sheet 150.
  • IR image generation engine 302 may process the IR data 301 to generate an IR image 303.
  • the IR data 301 may include pixel values for each of a plurality of pixels at corresponding pixel coordinate locations, as well as metadata characterizing information about the first IR camera 130 and/or the capturing of the IR data 301 (e.g., date of capture, measured temperatures, object distance, camera settings, etc.).
  • the IR image generation engine 302 may remove the metadata, and may generate the IR image 303 to include the one or more pixel values (e.g., depending on the number of color channels) for each of the plurality of pixels at corresponding pixel coordinate locations.
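  • As a small, hedged sketch of separating metadata from pixel data, the fragment below assumes a dict-like IR payload; real IR cameras expose pixel values and metadata through their own SDK or streaming format, so the keys shown here are placeholders.

```python
import numpy as np

def ir_data_to_image(ir_payload: dict) -> np.ndarray:
    """Sketch: drop metadata and reshape raw pixel values into an image array."""
    pixels = np.asarray(ir_payload["pixel_values"], dtype=np.float32)
    # Metadata (date of capture, measured temperatures, object distance, camera
    # settings, etc.) is simply not carried over into the image array.
    return pixels.reshape(ir_payload["height"], ir_payload["width"])

payload = {"height": 4, "width": 3, "pixel_values": list(range(12)),
           "capture_date": "2024-01-01", "object_distance_m": 1.5}  # hypothetical payload
print(ir_data_to_image(payload).shape)  # (4, 3)
```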
  • Color filter engine 304 may receive the IR image 303, and may apply a color filtering process to the IR image to generate filtered image 305 characterizing a color filtered image that identifies pixels within a corresponding color range. For example, color filter engine 304 may determine whether each pixel value for a pixel falls within a predetermined color range (e.g., a color range that corresponds to white colors indicating higher heat). If the pixel value falls outside the predetermined color range, the corresponding pixel is adjusted (e.g., filtered out). For instance, the color filter engine 304 may replace the pixel value with a predetermined value, such as zero. If, however, the pixel value falls within the range, the corresponding pixel value is left without adjustment.
  • a predetermined color range e.g., a color range that corresponds to white colors indicating higher heat.
  • FIG. 4A illustrates an exemplary IR image 403, while FIG. 4B illustrates a resulting filtered image 405.
  • the predetermined color range may include pixel values corresponding to expected heat intensities of an edge 402 of a glass sheet.
  • in the filtered image 405, pixels of the IR image 403 with pixel values within the predetermined range were not adjusted (i.e., they remain the same), while pixels with pixel values outside the predetermined range were replaced with a predetermined value, such as zero.
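  • A minimal sketch of such a color filtering step, assuming an RGB rendering of the IR image and an illustrative near-white range (the actual blue/white range would be tuned to the heat signature of the glass edge):

```python
import numpy as np

def color_filter(image: np.ndarray, lower: tuple, upper: tuple) -> np.ndarray:
    """Sketch: keep pixels whose channel values all fall inside [lower, upper];
    every other pixel is replaced with zero (filtered out)."""
    in_range = np.all((image >= np.asarray(lower)) & (image <= np.asarray(upper)), axis=-1)
    filtered = np.zeros_like(image)
    filtered[in_range] = image[in_range]
    return filtered

rgb = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)  # stand-in IR rendering
hot = color_filter(rgb, lower=(200, 200, 200), upper=(255, 255, 255))
```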
  • ROI application engine 306 receives the filtered image 305, and applies a mask to the filtered image 305 over a region of interest to generate an ROI image 307.
  • the mask may include a range of pixels along an axis of the corresponding pixel coordinates, such as the “x” axis.
  • the mask may include pixels with an “x” coordinate between 300 and 500 when the IR image 303 is of a resolution of 640 x 480 (i.e., 640 pixels along the “x” axis and 480 pixels along the “y” axis).
  • the ROI application engine 306 may replace pixel values for all pixels with an “x” coordinate that is less than 300 or greater than 500 with a predetermined value, such as zero. Otherwise, if a pixel has an “x” coordinate that is equal to or greater than 300 and equal to or less than 500, the pixel value is not adjusted.
  • FIG. 4C illustrates an ROI image 407.
  • a mask has been applied to the filtered image 405 of FIG. 4B to generate the ROI image 407.
  • the mask may define a region-of-interest that includes pixels with an “x” axis pixel coordinate of between 300 and 500, inclusive, and may mask out pixels with an “x” axis pixel coordinate less than 300 or greater than 500.
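  • A minimal sketch of the masking step, using the 300 to 500 column range from the example above; pixels outside the region of interest are set to zero, and pixels inside it are left unchanged:

```python
import numpy as np

def apply_roi_mask(image: np.ndarray, x_min: int = 300, x_max: int = 500) -> np.ndarray:
    """Sketch: zero out every pixel whose column ("x") coordinate is outside [x_min, x_max]."""
    masked = image.copy()
    masked[:, :x_min] = 0        # columns left of the region of interest
    masked[:, x_max + 1:] = 0    # columns right of the region of interest
    return masked
```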
  • binary threshold engine 308 may receive the ROI image 307, and may apply a binary threshold process to the ROI image 307 to generate a binary image 309. For example, the binary threshold engine 308 may adjust the pixel values of the pixels within the region-of-interest, as characterized by the ROI image 307, based on a pixel value threshold to generate the binary image 309. For instance, binary threshold engine 308 may determine whether each pixel value is at or above the pixel value threshold. If a pixel value is at or above the pixel value threshold, the binary threshold engine 308 may replace the pixel value with a first value (e.g., 355). If the pixel value is below the pixel value threshold, the binary threshold engine 308 may replace the pixel value with a second value (e.g., 0).
  • FIG. 4D illustrates a binary image 409 generated based on the ROI image 407 of FIG. 4C.
  • the pixel values for the edge 402 of the glass sheet have been replaced with a first value (i.e., because their original pixel value was the same as or greater than the pixel value threshold), and other pixels, such as pixels within area 410, have been replaced with a second value (i.e., because their original pixel value was less than the pixel value threshold).
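  • A minimal sketch of the binary threshold step. Note that the first value of 355 quoted in the text exceeds the 8-bit maximum, so this sketch assumes an 8-bit image and uses 255; the threshold and replacement values are illustrative only.

```python
import numpy as np

def binarize(roi_image: np.ndarray, pixel_value_threshold: float,
             first_value: int = 255, second_value: int = 0) -> np.ndarray:
    """Sketch: pixels at/above the threshold become first_value; all others become second_value."""
    return np.where(roi_image >= pixel_value_threshold, first_value, second_value).astype(np.uint8)
```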
  • slide window application engine 310 may receive the binary image 309 from the binary threshold engine 308, and may generate a slide window image 311 that includes multiple windows, where each window includes a portion of the pixels defining the edge 402 of the glass sheet. For example, slide window application engine 310 may generate a predetermined number of windows (e.g., based on a user-provided configuration setting stored in memory), each window including pixels with a corresponding range of “y” coordinates and a corresponding range of “x” pixel coordinates.
  • a first window may include pixels with a “y” coordinate between 0 and 37
  • a second window may include pixels with a “y” coordinate between 38 and 75, and so on.
  • the first window may have a top edge at “y” coordinate 0 and a bottom edge at “y” coordinate 37.
  • the second window may have a top edge at “y” coordinate 38 and a bottom edge at “y” coordinate 75, and so on.
  • each window may define a range along the “x” axis that expands at least enough to include all pixels with pixel values of the first value (e.g., 355).
  • slide window application engine 310 may generate each window to include a left edge that, for every “y” coordinate position, is a predetermined distance away (e.g., 5 pixels) from a pixel with the first value that has a left-most “x” coordinate at the same “y” coordinate position.
  • Slide window application engine 310 may also generate each window to include a right edge that, for every “y” coordinate position, is a predetermined distance away (e.g., 5 pixels) from a pixel with the first value that has a right-most “x” coordinate at the same “y” coordinate position.
  • FIG. 4E illustrates a slide window image 411 generated based on the binary image 409 of FIG. 4D.
  • slide window image 411 includes multiple windows 415.
  • each window 415 includes a left edge 417, a right edge 419, a top edge 421, and a bottom edge 423.
  • each window 415 includes corresponding pixels with pixel values of the first value.
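  • A hedged sketch of the slide window step: the image rows are split into a configurable number of bands, and each band records a window box padded a few pixels beyond its left-most and right-most non-zero pixels, along with the (x, y) coordinates of those pixels. The window count and margin are illustrative configuration values, not disclosed settings.

```python
import numpy as np

def slide_windows(binary_image: np.ndarray, num_windows: int = 12, margin: int = 5):
    """Sketch: per-band window boxes plus the (x, y) coordinates of non-zero pixels."""
    band_edges = np.linspace(0, binary_image.shape[0], num_windows + 1, dtype=int)
    windows, coords = [], []
    for top, bottom in zip(band_edges[:-1], band_edges[1:]):
        ys, xs = np.nonzero(binary_image[top:bottom])
        if xs.size == 0:
            continue
        windows.append((top, bottom - 1, int(xs.min()) - margin, int(xs.max()) + margin))
        coords.append(np.column_stack([xs, ys + top]))   # (x, y) pixel coordinates
    return windows, (np.vstack(coords) if coords else np.empty((0, 2), dtype=int))
```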
  • line generation engine 312 receives slide window image 311 from slide window application engine 310, and applies one or more line function generation processes to pixel coordinates of the pixels with pixel values of the first value included within the windows to generate line data 313 characterizing a line.
  • line generation engine 312 may apply any suitable algorithm to the pixel coordinates to fit (e.g., estimate) a line to the corresponding pixels.
  • line generation engine 312 may determine a quadratic equation, such as the one below, based on the pixel coordinates of the pixels with the first value.
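  • The quadratic equation referenced above is not reproduced in this text. As a stand-in, a common choice is to model the column position as a quadratic function of the row position, x = a·y² + b·y + c, and fit it by least squares, for example:

```python
import numpy as np

def fit_edge_quadratic(edge_xy: np.ndarray) -> np.ndarray:
    """Sketch: least-squares fit of x = a*y**2 + b*y + c to the (x, y) pixel
    coordinates gathered from the slide windows; returns (a, b, c)."""
    return np.polyfit(edge_xy[:, 1], edge_xy[:, 0], deg=2)
```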
  • curvature determination engine 314 may receive the line data 313 from the line generation engine 312, and may generate curvature data 315 characterizing a curvature of the line characterized by the line data 313.
  • curvature determination engine 314 may apply to the line data 313 any suitable curvature algorithm, such as the one below, to generate the curvature data 315.
  • the curvature data 315 further identifies a radius of the curvature.
  • the curvature determination engine 314 may compute the radius using any suitable algorithm, such as the ones defined below.
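  • The curvature and radius equations referenced above are likewise not reproduced here. As hedged stand-ins, the standard formulas for a fitted quadratic x(y) = a y² + b y + c are:

```latex
% Standard curvature and radius-of-curvature formulas for x(y) = a y^2 + b y + c;
% stand-ins for the equations referenced in the text, not the disclosed ones.
\kappa(y) = \frac{\lvert x''(y)\rvert}{\bigl(1 + x'(y)^2\bigr)^{3/2}}
          = \frac{\lvert 2a\rvert}{\bigl(1 + (2ay + b)^2\bigr)^{3/2}},
\qquad
R(y) = \frac{1}{\kappa(y)}
```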
  • the curvature determination engine 314 may store the curvature data 315 in a memory device, such as working memory 202 or database 170.
  • processor 201 compares the curvature and/or radius characterized by the curvature data 315 to corresponding threshold values. If the curvature is greater than a curvature threshold, the processor 201 may generate and transmit a warning signal indicating a glass sheet breakage warning. Similarly, if the radius is less than a radius threshold, the processor 201 may generate and transmit the warning signal.
  • the warning signal may cause an indication, such as a pop-up message or flashing icon, to be displayed (e.g., on display 209).
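  • A small sketch of the warning check, with placeholder threshold values and a print statement standing in for the display indication:

```python
def check_breakage_warning(curvature: float, radius: float,
                           curvature_threshold: float = 5e-4,
                           radius_threshold: float = 2000.0) -> bool:
    """Sketch: warn when curvature is too high or the radius of curvature is too small;
    the threshold values here are placeholders, not disclosed values."""
    if curvature > curvature_threshold or radius < radius_threshold:
        print("WARNING: glass sheet breakage risk")   # stand-in for a pop-up or flashing icon
        return True
    return False
```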
  • FIG. 5A illustrates the determination of a curvature for a quadratic equation characterizing a line generated from a slide window image 502.
  • FIG. 5B illustrates the computation of a radius R of the curvature of a line C measured from a point P on the line C.
  • FIG. 6 illustrates an exemplary method 600 that may be performed by one or more computing devices, such as the glass sheet curvature computing device 160.
  • the exemplary method 600 may be carried out by processors distributed across one or more networks, such as processors within one or more cloud-based servers.
  • infrared data is received from a camera positioned to capture a vertical bow of a glass sheet.
  • first IR camera 130 may be positioned to capture IR images of a first edge 151 of the glass sheet 150.
  • the glass sheet curvature computing device 160 may receive the IR images (e.g., IR data 301) from the first IR camera 130.
  • a filtered image is generated based on applying a color filtering process to the infrared data. For example, to generate a filtered image such as filtered image 305, glass sheet curvature computing device 160 may determine whether each pixel value for each pixel characterized by the infrared data falls within a predetermined color range. If the pixel value falls outside the predetermined color range, the corresponding pixel is adjusted (e.g., filtered out). If, however, the pixel value falls within the range, the corresponding pixel value is left without adjustment.
  • a region-of-interest (ROI) image is generated based on applying a ROI mask to the filtered image.
  • glass sheet curvature computing device 160 may apply a mask to the filtered image (e.g., filtered image 305) over a region of interest to generate the ROI image, such as ROI image 307.
  • the mask may define “x” axis pixel coordinates across a range, as described herein. As such, pixel values for pixels with “x” axis pixel coordinates within the range are not adjusted, and pixel values for pixels with “x” axis pixel coordinates outside of the range are adjusted (e.g., to zero).
  • a binary image is generated based on applying a binary threshold process to the ROI image.
  • glass sheet curvature computing device 160 may generate the binary image by adjusting the pixel values of the pixels within the region-of-interest, as characterized by the ROI image, based on a pixel value threshold.
  • glass sheet curvature computing device 160 may determine whether each pixel value is at or above the pixel value threshold. If a pixel value is at or above the pixel value threshold, the glass sheet curvature computing device 160 may replace the pixel value with a first value (e.g., 355). If the pixel value is below the pixel value threshold, the glass sheet curvature computing device 160 may replace the pixel value with a second value (e.g., 0).
  • a slide window image that includes non-zero pixels of the binary image within each of multiple windows is generated.
  • the glass sheet curvature computing device 160 may generate a predetermined number of windows where each window includes binary image pixels with non-zero pixel values (e.g., 355) and that have a “y” coordinate within a corresponding range of “y” coordinates and an “x” coordinate within a corresponding range of “x” pixel coordinates.
  • line data characterizing a line is generated.
  • line generation engine 312 may apply any suitable algorithm to the pixel coordinates to fit (e.g., estimate) a line to the corresponding pixels.
  • the glass sheet curvature computing device 160 may determine a quadratic equation based on the pixel coordinates of the pixels with the non-zero values. The quadratic equation may characterize a best fit line to these corresponding pixel locations.
  • curvature data is generated based on the line data.
  • the curvature data characterizes a curvature of the glass sheet captured within the infrared data received at step 602.
  • the glass sheet curvature computing device 160 may apply any suitable curvature algorithm to the line data (e.g., line data 313) to generate the curvature data (e.g., curvature data 315).
  • the glass sheet curvature computing device 160 further determines a radius of the curvature, and generates the curvature data to, additionally or alternatively, include the computed radius.
  • the curvature data is stored in a database.
  • the glass sheet curvature computing device 160 may store the curvature data within database 170.
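  • Tying the step sketches above together (color_filter, apply_roi_mask, binarize, slide_windows, fit_edge_quadratic), a hedged end-to-end sketch of the FIG. 6 flow might look like the following; the helper names and parameter values come from the earlier sketches, not from the disclosure.

```python
def method_600_sketch(ir_rgb_image, pixel_value_threshold=200):
    """Sketch of the FIG. 6 flow using the helper sketches defined earlier."""
    filtered = color_filter(ir_rgb_image, lower=(200, 200, 200), upper=(255, 255, 255))
    roi = apply_roi_mask(filtered.max(axis=-1))        # collapse channels to one plane
    binary = binarize(roi, pixel_value_threshold)
    _, edge_xy = slide_windows(binary)
    a, b, c = fit_edge_quadratic(edge_xy)
    y_mid = edge_xy[:, 1].mean()
    curvature = abs(2 * a) / (1 + (2 * a * y_mid + b) ** 2) ** 1.5
    radius = float("inf") if curvature == 0 else 1.0 / curvature
    return {"coefficients": (a, b, c), "curvature": curvature, "radius": radius}
```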
  • FIG. 7 illustrates an exemplary method 700 that may be performed by one or more computing devices, such as the glass sheet curvature computing device 160.
  • the exemplary method 700 may be carried out by processors distributed across one or more networks, such as processors within one or more cloud-based servers.
  • a first signal is transmitted to an infrared camera to cause the infrared camera to position its lens to include a side view of a glass forming apparatus within its field-of-view.
  • the glass sheet curvature computing device 160 transmits a message to first IR camera 130 that, when received by the first IR camera 130, causes the first IR camera to adjust a pitch or yaw of its corresponding sensor to capture a side view of the glass forming apparatus 120. Further, at step 704, a second signal is received. The second signal indicates the start of a fusion drawdown process to produce a glass sheet. For example, the glass sheet curvature computing device 160 may receive a signal from the glass forming apparatus 120 indicating that glass drawdown is beginning.
  • a third signal is transmitted to the infrared camera to cause the infrared camera to capture an infrared image of a side view (e.g., edge view) of the glass sheet.
  • the glass sheet curvature computing device 160 may transmit a message to the first IR camera 130 to cause the first IR camera 130 to capture an IR image of the first edge 151 of the glass sheet 150 as it is being drawn down.
  • curvature data is generated characterizing a curvature of the glass sheet based on the infrared image.
  • the glass sheet curvature computing device 160 may determine a subset of the plurality of pixels of the infrared image based on their pixel values and a pixel value threshold. Further, the glass sheet curvature computing device 160 may generate the curvature data characterizing a curvature of the glass sheet based on a position of the subset of the plurality of pixels within the infrared image. In some examples, the glass sheet curvature computing device 160 may generate the curvature data as described with respect to the method of FIG. 6.
  • a fourth signal is transmitted to an additional camera to cause the additional camera to capture a front side image of the glass sheet.
  • the glass sheet curvature computing device 160 may transmit a signal to the video camera 180 to capture an image of the glass sheet 150 along a longitudinal length of a front side 159 of the glass sheet 150.
  • the video camera 180 can then transmit the captured image to the glass sheet curvature computing device 160.
  • a 3D map is generated based on the curvature data and the front side image of the glass sheet.
  • the glass sheet curvature computing device 160 may apply any map generation process described herein to generate the 3D map.
  • the 3D map includes a 3D image of a bow of the glass sheet.
  • the 3D map is stored in a database.
  • the glass sheet curvature computing device 160 may store the 3D map in database 170.
  • the methods and system described herein can be at least partially embodied in the form of computer-implemented processes and apparatus for practicing those processes.
  • the disclosed methods may also be at least partially embodied in the form of tangible, non-transitory machine-readable storage media encoded with computer program code.
  • the steps of the methods can be embodied in hardware, in executable instructions executed by a processor (e.g., software), or a combination of the two.
  • the media may include, for example, RAMs, ROMs, CD-ROMs, DVD-ROMs, BD- ROMs, hard disk drives, flash memories, or any other non-transitory machine-readable storage medium.
  • When the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the method.
  • the methods may also be at least partially embodied in the form of a computer into which computer program code is loaded or executed, such that the computer becomes a special purpose computer for practicing the methods.
  • When implemented on a general-purpose processor, the computer program code segments configure the processor to create specific logic circuits.
  • the methods may alternatively be at least partially embodied in application specific integrated circuits for performing the methods.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Apparatuses and methods are described for determining the curvature of glass during glass sheet production. In some examples, a glass forming apparatus draws down molten glass to produce a glass sheet, and an infrared camera captures infrared images of an edge of the glass sheet. Further, a computing device receives the infrared images from the infrared camera and determines, for each infrared image, a subset of the plurality of pixels based on pixel values for the plurality of pixels and a pixel value threshold. For example, the subset of the plurality of pixels may include those pixels with pixel values at or above the pixel value threshold. In addition, the computing device determines a line based on the positions of the subset of the plurality of pixels within the infrared image, and generates curvature data characterizing a curvature of the glass sheet based on the line.

Description

SYSTEM AND METHODS FOR DETERMINING GLASS RIBBON CURVATURE
Inventors:
Chih-Hua Cheng, Hsiang-Rong Chu, Po-Keng Chung
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of priority under 35 U.S.C. § 119 of U.S. Provisional Application Serial No. 63/529024 filed on July 26, 2023, the content of which is relied upon and incorporated herein by reference in its entirety.
FIELD OF THE DISCLOSURE
[0002] The present disclosure relates to the production of glass sheets and, more particularly, to apparatus and methods for determining the curvature of glass sheets during glass sheet production.
BACKGROUND
[0003] Glass sheets are used in a variety of applications. For example, they may be used in glass display panels such as in mobile devices, laptops, tablets, computer monitors, and television displays. Glass sheets may be manufactured by a slot drawdown process whereby molten glass is drawn through a slot to form a glass sheet. In some instances, however, the glass sheet breaks during the drawdown process. When glass sheets break, additional, unexpected costs are incurred. For example, new glass sheets need to be produced to replace the broken glass sheets, and further production of additional glass sheets is delayed due to having to produce the replacement glass sheets. As such, there are opportunities to improve the production of glass sheets.
SUMMARY
[0004] The embodiments disclosed herein are directed to apparatus and methods for imaging a glass sheet during a drawdown process, and determining a curvature of the glass sheet based on the images. The frequency at which glass sheets with various curvatures are produced can then be determined to improve the glass formation process. In some examples, glass sheet maps, such as 3-dimensional (3D) maps, may be generated based on the determined curvatures. The glass sheet maps can also be used to analyze broken and unbroken glass sheets, and thus improve the glass formation process.
[0005] To determine the curvature of a glass sheet, the embodiments may capture images, such as infrared (IR) images, of an edge of a glass sheet as the glass sheet is being formed. For example, an IR camera may be positioned such that it captures IR images of one edge of a glass sheet as the glass sheet is being drawn down to formation. In some examples, another IR camera may be positioned to capture IR images of a second edge of a glass sheet as the glass sheet is being drawn down. The one or more IR cameras may capture one or more IR images, and may transmit the IR images to a computing device. Each IR image may include one or more pixel values for each of a plurality of pixels. Each pixel corresponds to a pixel location of an IR image. The pixel values may characterize heat intensity values, for instance.
[0006] The computing device may receive an IR image from an IR camera, and may determine a subset of the plurality of pixels for the IR image based on the pixel values and a pixel value threshold. For example, in the example of a single channel grayscale IR image, the computing device may compare each pixel value to the pixel value threshold, and keep within the subset of the plurality of pixels those pixels that satisfy the pixel value threshold, such as those pixels with a pixel value that is at or above the pixel value threshold. The computing device may then generate curvature data characterizing a curvature (e.g., bow) of the glass sheet based on a position of each of the subset of the plurality of pixels within the IR image (e.g., a 2D position). For instance, the computing device may determine a line based on the positions of the subset of the plurality of pixels, and may then determine the curvature of the line to generate the curvature data.
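The following Python fragment is a minimal sketch of the thresholding-and-fit flow described above for a single-channel IR image: pixels at or above a threshold are kept, a quadratic line is fitted to their positions, and the curvature of that line is evaluated. The threshold value, the quadratic model, and the NumPy-based fit are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def estimate_edge_curvature(ir_image: np.ndarray, pixel_value_threshold: float):
    """Sketch: keep pixels at/above a threshold, fit x = a*y**2 + b*y + c to their
    positions, and return the curvature of the fitted line at the mid row."""
    # Subset of the plurality of pixels that satisfy the pixel value threshold.
    ys, xs = np.nonzero(ir_image >= pixel_value_threshold)
    if ys.size < 3:
        raise ValueError("not enough pixels above threshold to fit a line")
    a, b, c = np.polyfit(ys, xs, deg=2)          # column position as a function of row
    slope = 2 * a * ys.mean() + b
    curvature = abs(2 * a) / (1 + slope ** 2) ** 1.5
    return curvature, (a, b, c)

# Synthetic example: a gently bowed bright edge on a dark background.
image = np.zeros((480, 640), dtype=np.float32)
rows = np.arange(480)
image[rows, (400 + 0.0002 * (rows - 240) ** 2).astype(int)] = 255.0
k, _ = estimate_edge_curvature(image, pixel_value_threshold=200.0)
print(f"curvature ~ {k:.6f} 1/pixel, radius ~ {1 / k:.0f} pixels")
```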
[0007] In some embodiments, a glass forming system includes a glass forming apparatus, an infrared camera, and a computing device communicatively coupled to the glass forming apparatus and the infrared camera. The glass forming apparatus is configured to drawdown molten glass to produce a glass sheet. The infrared camera is configured to capture infrared images of the glass sheet. Further, the computing device is configured to receive an infrared image from the infrared camera, the infrared image comprising a pixel value for each of a plurality of pixels. The computing device is also configured to determine a subset of the plurality of pixels based on the pixel values for the plurality of pixels and a pixel value threshold. Further, the computing device is configured to generate curvature data characterizing a curvature of the glass sheet based on a position of the subset of the plurality of pixels within the infrared image. The computing device is also configured to store the curvature data in a database.
[0008] In some embodiments, an apparatus includes a memory storing instructions and a processor communicatively coupled to the memory. The processor is configured to execute the instructions to receive an infrared image from an infrared camera with a field-of-view that includes an edge of a glass sheet being drawn down by a glass forming apparatus, the infrared image comprising a pixel value for each of a plurality of pixels. The processor is also configured to execute the instructions to determine a subset of the plurality of pixels based on the pixel values for the plurality of pixels and a pixel value threshold. The processor is further configured to execute the instructions to generate curvature data characterizing a curvature of the glass sheet based on a position of the subset of the plurality of pixels within the infrared image. The processor is also configured to execute the instructions to store the curvature data in a database.
[0009] In some embodiments, a method by a processor includes receiving an infrared image from an infrared camera with a field-of-view that includes an edge of a glass sheet being drawn down by a glass forming apparatus, the infrared image comprising a pixel value for each of a plurality of pixels. The method also includes determining a subset of the plurality of pixels based on the pixel values for the plurality of pixels and a pixel value threshold. The method further includes generating curvature data characterizing a curvature of the glass sheet based on a position of the subset of the plurality of pixels within the infrared image. The method also includes storing the curvature data in a database.
[0010] In some embodiments, a non-transitory computer readable medium stores instructions that, when executed by one or more processors, cause the one or more processors to perform a method that includes receiving an infrared image from an infrared camera with a field-of-view that includes an edge of a glass sheet being drawn down by a glass forming apparatus, the infrared image comprising a pixel value for each of a plurality of pixels. The method also includes determining a subset of the plurality of pixels based on the pixel values for the plurality of pixels and a pixel value threshold. The method further includes generating curvature data characterizing a curvature of the glass sheet based on a position of the subset of the plurality of pixels within the infrared image. The method also includes storing the curvature data in a database.
BRIEF DESCRIPTION OF DRAWINGS
[0011] The above summary and the below detailed description of illustrative embodiments may be read in conjunction with the appended Figures. The Figures show some of the illustrative embodiments discussed herein. As further explained below, the claims are not limited to the illustrative embodiments. For clarity and ease of reading, Figures may omit views of certain features.
[0012] FIG. 1 illustrates a glass sheet curvature detection system in accordance with some examples.
[0013] FIG. 2 is a block diagram of a glass sheet curvature computing device in accordance with some examples.
[0014] FIG. 3 illustrates exemplary functions of a glass sheet curvature computing device in accordance with some examples.
[0015] FIGS. 4A, 4B, 4C, 4D, and 4E illustrate infrared images in accordance with some examples.
[0016] FIG. 5A illustrates an infrared image with filtered pixel values used to compute a line in accordance with some examples.
[0017] FIG. 5B illustrates the curvature of a line in accordance with some examples.
[0018] FIG. 6 illustrates an exemplary method for computing a curvature of a glass sheet based on an infrared image in accordance with some examples.
[0019] FIG. 7 illustrates an exemplary method for generating a map based on a computed curvature of a glass sheet in accordance with some examples.
DETAILED DESCRIPTION
[0020] The present application discloses illustrative (i.e., example) embodiments. The disclosure is not limited to the illustrative embodiments. Therefore, many implementations of the claims will be different than the illustrative embodiments. Various modifications can be made to the claims without departing from the spirit and scope of the disclosure. The claims are intended to cover implementations with such modifications.
[0021] At times, the present application uses directional terms (e.g., front, back, top, bottom, left, right, side, edge, etc.) to give the reader context when viewing the Figures. The claims, however, are not limited to the orientations shown in the Figures. Any absolute term (e.g., high, low, etc.) can be understood as disclosing a corresponding relative term (e.g., higher, lower, etc.). In addition, any singular term (e.g., processor) can be understood as disclosing a corresponding plural term (e.g., processors), and any plural term can be understood as disclosing a corresponding singular term.
[0022] Referring now to the Figures, FIG. 1 illustrates a glass sheet curvature detection system 100 that includes a glass forming apparatus 120, a first infrared (IR) camera 130, a second IR camera 140, a video camera 180, a glass sheet curvature computing device 160, and a database 170. The glass forming apparatus 120 (e.g., a fusion glass forming apparatus) includes a cavity 121 that is bounded on its longitudinal sides by walls 123 and 124. The walls 123 and 124 are integral with each of opposed and downwardly inclined glass forming surfaces 133 and 134, respectively. The pair of opposed and downwardly inclined glass forming surfaces 133 and 134 terminate and meet at a joining position 135.
[0023] Molten glass (e.g., high-purity molten glass) is delivered into cavity 121 until the molten glass overflows each of the walls 123 and 124, and then flows along the glass forming surfaces 133 and 134 to joining position 135. The molten glass flowing over the glass forming surfaces 133 and 134 meets at the joining position 135 and begins to form a glass sheet 150. In other words, the molten glass begins to draw down from the joining position 135 until a glass sheet 150 is formed. The glass sheet 150 may be formed to any suitable thickness. For example, the glass sheet 150 may be anywhere from 500 micrometers to 3 millimeters (e.g., 2 millimeters) thick in some instances. In other instances, the glass sheet 150 may be thicker than 3 millimeters, or less than 500 micrometers thick.
[0024] Each of the first IR camera 130 and second IR camera 140 may be any suitable IR camera. For instance, the first IR camera 130 and second IR camera 140 may each be an IR camera that can measure heat intensities that include the range of temperatures of the glass sheet 150 as it is being formed. In some examples, the IR camera 130, 140 may include a 320 x 240 pixel thermal detector that can measure temperatures up to 650° Celsius (C) (i.e., 1202° Fahrenheit (F)). In some examples, the IR camera 130, 140 may include a 640 x 480 pixel thermal detector, and may have an object temperature range of 0° C to 650° C (i.e., 32° F to 1202° F). Each of the IR cameras 130, 140 may include sensors that can capture IR images of scenes within their fields-of-view. For example, first IR camera 130 may be positioned to capture IR images of a first edge 151 of the glass sheet 150. Similarly, second IR camera 140 may be positioned to capture IR images of a second edge 153 of the glass sheet 150. The first edge 151 of the glass sheet 150 may be opposite to and longitudinally across from the second edge 153 of the glass sheet 150. In addition, the first edge 151 of the glass sheet 150 may have a first width 155, and the second edge 153 of the glass sheet may have a second width 157. The first width 155 and the second width 157 may differ along a vertical direction of each of the respective first edge 151 and second edge 153. In some instances, at a particular distance from the joining position 135, the first width 155 and the second width 157 are the same or nearly the same.
[0025] The glass sheet curvature computing device 160 may be communicatively coupled to each of the first IR camera 130, the second IR camera 140, and, in some examples, the video camera 180 and/or the glass forming apparatus 120. For example, the glass sheet curvature computing device 160 may be configured to communicate over a communication network 116 with each of the first IR camera 130, the second IR camera 140, and the video camera 180. Communication network 116 may be a wired or wireless communication network. For example, in some examples, communication network 116 may include one or more Universal Serial Bus® (USB) (e.g., USB 2.0) or Ethernet® cables establishing communication channels between the glass sheet curvature computing device 160 and each of the first IR camera 130, the second IR camera 140, and the video camera 180. In some examples, communication network 116 may include a cellular network, a Wi-Fi® network, a Bluetooth® network, or any other suitable wireless network. In some instances, the glass sheet curvature computing device 160 communicates with one or more of the first IR camera 130, the second IR camera 140, and the video camera 180 using one or more protocols, such as the real-time streaming protocol (RTSP).
[0026] The glass sheet curvature computing device 160 can receive IR images from each of the first IR camera 130 and the second IR camera 140 over communication network 116. For example, the first IR camera 130 may capture an IR image of the first edge 151 of the glass sheet 150, and may transmit the IR image to the glass sheet curvature computing device 160 using communication network 116. Similarly, the second IR camera 140 may capture an IR image of the second edge 153 of the glass sheet 150, and may transmit the IR image to the glass sheet curvature computing device 160 using communication network 116. In some examples, the glass sheet curvature computing device 160 can transmit a message to any of the first IR camera 130 and the second IR camera 140 to adjust their corresponding fields-of-view. For example, the glass sheet curvature computing device 160 may transmit a message to the first IR camera 130 to cause the first IR camera 130 to adjust a pitch or yaw of its corresponding sensor. Similarly, in some examples, the glass sheet curvature computing device 160 may transmit a message to the second IR camera 140 to cause the second IR camera 140 to adjust a pitch or yaw of its corresponding sensor. In some instances, the glass sheet curvature computing device 160 can transmit a message to any of the first IR camera 130 and the second IR camera 140 to adjust a focus of a corresponding lens that focuses light to the corresponding sensor. For instance, the glass sheet curvature computing device 160 may transmit a message to the first IR camera 130 to increase or decrease a focal length of the first IR camera 130. Similarly, the glass sheet curvature computing device 160 may transmit a message to the second IR camera 140 to increase or decrease a focal length of the second IR camera 140.
[0027] Additionally, the video camera 180 is configured to capture images (e.g., still images, video) of the glass sheet 150 along a longitudinal length of a front side 159 of the glass sheet 150. For instance, the glass sheet curvature computing device 160 may transmit a signal to the video camera 180 to cause the video camera to capture an image. The video camera 180 can transmit the captured images to the glass sheet curvature computing device 160. The glass sheet curvature computing device 160 can receive the images, and can store the images within the database 170. In some examples, the glass sheet curvature computing device 160 can transmit one or more messages to the video camera 180 to adjust its corresponding field-of-view, as well as to adjust its focal length.
[0028] In some examples, while molten glass is being drawn down from the joining position 135 of the glass forming apparatus 120 to form a glass sheet 150, the first IR camera 130 captures multiple images. For example, the glass sheet curvature computing device 160 may receive a signal from the glass forming apparatus 120 (or other source) indicating that glass drawdown is beginning, and based on the signal, the glass sheet curvature computing device 160 may transmit a message to the first IR camera 130 to cause the first IR camera 130 to capture an IR image of the first edge 151 of the glass sheet 150 as it is being drawn down. In some examples, the glass sheet curvature computing device 160 delays a predetermined amount of time after receiving the signal and, once the predetermined amount of time has expired, transmits the message to the first IR camera 130 to capture the IR image. Similarly, the glass sheet curvature computing device 160 may transmit a message to the second IR camera 140 to capture an IR image of the second edge 153 of the glass sheet 150 as it is being drawn down.
[0029] Further, the first IR camera 130 may transmit the captured IR image to the glass sheet curvature computing device 160. The glass sheet curvature computing device 160 may determine a subset of the plurality of pixels of the IR image based on their pixel values and a pixel value threshold. Further, the glass sheet curvature computing device 160 may generate curvature data characterizing a curvature of the glass sheet based on a position of the subset of the plurality of pixels within the infrared image.
[0030] For example, to generate the curvature data, the glass sheet curvature computing device 160 may apply a color filtering process to the IR image to generate a color filtered image that identifies pixels within a corresponding color range. The color filtering process may “filter out” all pixels not within the range. The corresponding color range may correspond to the color of pixels that IR images indicate are within a range of heat values. For instance, the color range may correspond to blue and/or white colors. In some examples, the color range includes a range for each of multiple color channels, such as red, green, and blue (RGB) color channels or luminance and chrominance (YCbCr) color channels.
[0031] Further, the glass sheet curvature computing device 160 may apply a region-of-interest (ROI) mask to the color filtered image to mask out unwanted pixels, and may adjust the values of the remaining pixels based on a binary threshold to generate a binary image. For instance, the glass sheet curvature computing device 160 may compare each pixel value of the remaining pixels to the binary threshold and replace, from the remaining pixels, pixel values that are at or above the binary threshold with a first value (e.g., 255), and pixel values that are below the binary threshold with a second value (e.g., 0). The glass sheet curvature computing device 160 may also perform operations to generate multiple windows within the binary image, and determine the coordinates of all pixels with pixel values of the first value.
[0032] Further, the glass sheet curvature computing device 160 may generate line data characterizing a line based on the determined pixel coordinates. For instance, the glass sheet curvature computing device 160 may apply a line algorithm to fit a line to the determined coordinates of those pixels with pixel values of the first value. Based on the line data, the glass sheet curvature computing device 160 may determine the curvature of the glass sheet 150. For example, the glass sheet curvature computing device 160 may generate curvature data characterizing the curvature of the glass sheet 150 based on applying a curvature algorithm to the line data. The glass sheet curvature computing device 160 may store the curvature data within the database 170. In some instances, the glass sheet curvature computing device 160 may generate curvature data similarly based on an IR image received from the second IR camera 140.
[0033] In some examples, the glass sheet curvature computing device 160 receives one or more images from the video camera 180 (e.g., in response to the glass sheet curvature computing device 160 transmitting a signal to the video camera 180 to cause the video camera to capture an image). The one or more images include the front side 159 of the glass sheet 150. Based on the one or more images from the video camera and the curvature data, the glass sheet curvature computing device 160 may generate a 3D map characterizing the glass sheet 150. For instance, the glass sheet 150 may be drawn down by one or more pulling rolls. Based on the curvature data, the glass sheet curvature computing device 160 may determine a curvature of a portion of the glass sheet 150 (e.g., where the glass sheet bows in a Z direction) that is below a last pulling roll (e.g., the "bottom of the draw" portion of the glass sheet 150). The glass sheet curvature computing device 160 may further determine a position of this portion of the glass sheet 150 based on the images from the video camera (e.g., an X, Y position). Based on combining the curvature and the position of the portion of the glass sheet 150 that is below the last pulling roll, the glass sheet curvature computing device 160 may compute and generate the 3D map. As such, the 3D map may include a 3D image of the glass sheet 150 with the determined curvature (e.g., bow). The glass sheet curvature computing device 160 may store the 3D map in the database 170.
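By way of a non-limiting illustration only, one way such a combination could be performed is sketched in Python below. The sketch assumes (a) that the side-view quadratic fit gives the sheet's Z offset (bow) as a function of vertical position, (b) that the front-view (X, Y) points are already calibrated to physical coordinates, and (c) that the vertical position of the last pulling roll is known; the function and parameter names are hypothetical and are not taken from the glass sheet curvature computing device 160 itself.

```python
import numpy as np

def build_3d_map(bow_coeffs, front_xy, y_reference=0.0):
    """Combine a side-view bow profile with front-view (X, Y) positions.

    bow_coeffs : (a, b, c) from the side-view quadratic fit, interpreted here
                 as giving the Z offset (bow) at vertical position Y.
    front_xy   : N x 2 array of (X, Y) points taken from the front-view image.
    y_reference: vertical position treated as having zero bow (e.g., the last
                 pulling roll), an assumption made for this sketch only.
    Returns an N x 3 array of (X, Y, Z) points describing the bowed sheet.
    """
    a, b, c = bow_coeffs
    x = front_xy[:, 0]
    y = front_xy[:, 1]
    dy = y - y_reference
    z = a * dy**2 + b * dy + c  # bow evaluated from the fitted edge profile
    return np.column_stack([x, y, z])
```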
[0034] FIG. 2 illustrates an exemplary glass sheet curvature computing device 200, such as the glass sheet curvature computing device 160 of FIG. 1. As illustrated, glass sheet curvature computing device 200 can include one or more processors 201, working memory 202, one or more input/output devices 203, instruction memory 207, a transceiver 204, and a display 206, all operatively coupled to one or more data buses 208. Data buses 208 allow for communication among the various devices. Data buses 208 can include wired, or wireless, communication channels.
[0035] Processors 201 can include one or more distinct processors, each having one or more cores. Each of the distinct processors can have the same or different structure. Processors 201 can include one or more central processing units (CPUs), one or more graphics processing units (GPUs), application specific integrated circuits (ASICs), digital signal processors (DSPs), and the like. Processors 201 can be configured to perform a certain function or operation by executing code, stored on instruction memory 207, embodying the function or operation. For example, processors 201 can be configured to perform one or more of any function, method, or operation disclosed herein.
[0036] Instruction memory 207 can store instructions that can be accessed (e.g., read) and executed by processors 201. For example, instruction memory 207 can be a non-transitory, computer-readable storage medium such as a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), flash memory, a removable disk, CD-ROM, any non-volatile memory, or any other suitable memory.
[0037] Further, processors 201 can store data to, and read data from, working memory 202. For example, processors 201 can store a working set of instructions to working memory 202, such as instructions loaded from instruction memory 207. Processors 201 can also use working memory 202 to store dynamic data created during the operation of glass sheet curvature computing device 200. Working memory 202 can be a random access memory (RAM) such as a static random access memory (SRAM) or dynamic random access memory (DRAM), or any other suitable memory.
[0038] Input-output devices 203 can include any suitable device that allows for data input or output. For example, input-output devices 203 can include one or more of a keyboard, a touchpad, a mouse, a stylus, a touchscreen, a physical button, a speaker, a microphone, or any other suitable input or output device. Input-output device 203 may allow a user to provide input selecting or characterizing preferred operational values, such as IR camera positions, for instance.
[0039] Display 206 can display user interface 205. User interface 205 can enable user interaction with the glass sheet curvature computing device 200. For example, user interface 205 can be a user interface for an application that allows a user to enter preferred operational values to update a position of an IR camera, such as a position of the first IR camera 130, the second IR camera 140, and the video camera 180, for example. In some examples, a user can interact with user interface 205 by engaging input-output devices 203. In some examples, display 206 can be a touchscreen, where user interface 205 is displayed on the touchscreen.
[0040] Transceiver 204 allows for communication with a network, such as the communication network 116. For example, transceiver 204 may connect to a Wi-Fi®, Bluetooth®, cellular, or any other suitable wireless network, and may send signals (e.g., data) to, and receive signals from, other devices coupled to the wireless network. Processor(s) 201 is operable to receive data from, or send data to, the wireless network via transceiver 204.
[0041] FIG. 3 illustrates exemplary functions of the glass sheet curvature computing device 200. As illustrated, glass sheet curvature computing device 200 includes IR image generation engine 302, color filter engine 304, region-of-interest (ROI) application engine 306, binary threshold engine 308, slide window application engine 310, line generation engine 312, and curvature determination engine 314. In some instances, one or more of the IR image generation engine 302, color filter engine 304, ROI application engine 306, binary threshold engine 308, slide window application engine 310, line generation engine 312, and curvature determination engine 314 include instructions stored in instruction memory 207 that, when executed by the one or more processors 201, cause the one or more processors 201 to carry out the corresponding operations. In some examples, one or more of the IR image generation engine 302, color filter engine 304, ROI application engine 306, binary threshold engine 308, slide window application engine 310, line generation engine 312, and curvature determination engine 314 can be implemented in logic, such as digital logic, state machines, FPGAs, ASICs, or any other suitable logic.
[0042] IR image generation engine 302 may receive IR data 301. For example, IR image generation engine 302 may receive IR data 301 from the first IR camera 130. The IR data 301 characterizes an IR image, such as an IR image of the first edge 151 of the glass sheet 150. IR image generation engine 302 may process the IR data 301 to generate an IR image 303. For example, the IR data 301 may include pixel values for each of a plurality of pixels at corresponding pixel coordinate locations, as well as metadata characterizing information about the first IR camera 130 and/or the capturing of the IR data 301 (e.g., date of capture, measured temperatures, object distance, camera settings, etc.). The IR image generation engine 302 may remove the metadata, and may generate the IR image 303 to include the one or more pixel values (e.g., depending on the number of color channels) for each of the plurality of pixels at corresponding pixel coordinate locations.
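As a minimal, non-limiting sketch of this step, the IR data 301 may be treated as a record containing a row-major pixel payload plus metadata fields; the key names and array shape below are assumptions for illustration rather than the camera's actual data format.

```python
import numpy as np

def build_ir_image(ir_data, height=480, width=640, channels=3):
    """Discard metadata and arrange the raw pixel payload into an H x W x C array.

    ir_data is assumed to be a dict containing a "pixels" entry (a row-major
    list of per-pixel channel values) alongside metadata entries such as
    "capture_date" or "object_distance", which are simply ignored here.
    """
    pixels = np.asarray(ir_data["pixels"], dtype=np.uint8)
    return pixels.reshape(height, width, channels)
```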
[0043] Color filter engine 304 may receive the IR image 303, and may apply a color filtering process to the IR image to generate filtered image 305 characterizing a color filtered image that identifies pixels within a corresponding color range. For example, color filter engine 304 may determine whether each pixel value for a pixel falls within a predetermined color range (e.g., a color range that corresponds to white colors indicating higher heat). If the pixel value falls outside the predetermined color range, the corresponding pixel is adjusted (e.g., filtered out). For instance, the color filter engine 304 may replace the pixel value with a predetermined value, such as zero. If, however, the pixel value falls within the range, the corresponding pixel value is left without adjustment.
[0044] As an example, FIG. 4A illustrates an exemplary IR image 403, while FIG. 4B illustrates a resulting filtered image 405. The predetermined color range may include pixel values corresponding to expected heat intensities of an edge 402 of a glass sheet. In FIG. 4B, to generate a filtered image 405, IR image 403 pixels with pixel values within the predetermined range were not adjusted (i.e., they remain the same), while pixels with pixel values not within the predetermined range were replaced with a predetermined value, such as zero.
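A non-limiting Python sketch of such a color filtering process is shown below. It assumes the IR image is a false-color H x W x 3 (e.g., RGB) array, and the example bounds, which pass only near-white pixels, are illustrative rather than calibrated values.

```python
import numpy as np

def color_filter(ir_image, lower=(200, 200, 200), upper=(255, 255, 255)):
    """Keep pixels whose channel values all lie within [lower, upper]; zero the rest.

    The example bounds pass only near-white pixels, which in the false-color IR
    image correspond to the hottest regions (e.g., the glass sheet edge).
    """
    lower = np.asarray(lower, dtype=np.uint8)
    upper = np.asarray(upper, dtype=np.uint8)
    in_range = np.all((ir_image >= lower) & (ir_image <= upper), axis=-1)
    filtered = ir_image.copy()
    filtered[~in_range] = 0  # pixels outside the color range are filtered out
    return filtered
```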
[0045] Referring back to FIG. 3, ROI application engine 306 receives the filtered image 305, and applies a mask to the filtered image 305 over a region of interest to generate an ROI image 307. For example, the mask may include a range of pixels along an axis of the corresponding pixel coordinates, such as the "x" axis. For instance, the mask may include pixels with an "x" coordinate between 300 and 500 when the IR image 303 is of a resolution of 640 x 480 (i.e., 640 pixels along the "x" axis and 480 pixels along the "y" axis). In this example, the ROI application engine 306 may replace pixel values for all pixels with an "x" coordinate that is less than 300 or greater than 500 with a predetermined value, such as zero. Otherwise, if a pixel has an "x" coordinate that is equal to or greater than 300 and equal to or less than 500, the pixel value is not adjusted.
[0046] FIG. 4C, for example, illustrates an ROI image 407. In this example, a mask has been applied to the filtered image 405 of FIG. 4B to generate the ROI image 407. The mask may define a region-of-interest that includes pixels with an “x” axis pixel coordinate of between 300 and 500, inclusive, and may mask out pixels with an “x” axis pixel coordinate less than 300 or greater than 500.
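A non-limiting sketch of this masking step, using the 300 to 500 column range of the example above, is shown below; the column bounds are parameters rather than fixed values.

```python
def apply_roi_mask(filtered_image, x_min=300, x_max=500):
    """Zero out every pixel whose "x" (column) coordinate falls outside [x_min, x_max]."""
    roi = filtered_image.copy()
    roi[:, :x_min] = 0       # columns to the left of the region of interest
    roi[:, x_max + 1:] = 0   # columns to the right of the region of interest
    return roi
```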
[0047] With reference back to FIG. 3, binary threshold engine 308 may receive the ROI image 307, and may apply a binary threshold process to the ROI image 307 to generate a binary image 309. For example, the binary threshold engine 308 may adjust the pixel values of the pixels within the region-of-interest, as characterized by the ROI image 307, based on a pixel value threshold to generate the binary image 309. For example, binary threshold engine 308 may determine whether each pixel value is at or above the pixel value threshold. If a pixel value is at or above the pixel value threshold, the binary threshold engine 308 may replace the pixel value with a first value (e.g., 255). If the pixel value is below the pixel value threshold, the binary threshold engine 308 may replace the pixel value with a second value (e.g., 0).
[0048] For example, FIG. 4D illustrates a binary image 409 generated based on the ROI image 407 of FIG. 4C. In this example, the pixel values for the edge 402 of the glass sheet have been replaced with a first value (i.e., because their original pixel value was the same or greater than the pixel value threshold), and other pixels, such as pixels within area 410, have been replaced with a second value (i.e., because their original pixel value was less than the pixel value threshold).
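A non-limiting sketch of the binary threshold process is shown below; it assumes the ROI image has been reduced to a single intensity channel (or is collapsed to one here), and it uses 255 and 0 as example first and second values, respectively.

```python
import numpy as np

def binarize(roi_image, pixel_value_threshold, first_value=255, second_value=0):
    """Replace pixels at or above the threshold with first_value and all others with second_value."""
    intensity = roi_image
    if roi_image.ndim == 3:                  # collapse color channels to a single intensity
        intensity = roi_image.mean(axis=-1)
    return np.where(intensity >= pixel_value_threshold,
                    first_value, second_value).astype(np.uint8)
```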
[0049] Referring back to FIG. 3, slide window application engine 310 may receive the binary image 309 from the binary threshold engine 308, and may generate a slide window image 311 that includes multiple windows, where each window includes a portion of the pixels defining the edge 402 of the glass sheet. For example, slide window application engine 310 may generate a predetermined number of windows (e.g., based on a user-provided configuration setting stored in memory), each window including pixels with a corresponding range of "y" coordinates and a corresponding range of "x" pixel coordinates. For instance, assuming an original resolution of 640 x 380, and a configured number of ten windows, slide window application engine 310 may generate each window to include pixels with a "y" pixel coordinate falling within a corresponding range of thirty-eight pixels (e.g., 380 / 10 = 38). Thus, a first window may include pixels with a "y" coordinate between 0 and 37, a second window may include pixels with a "y" coordinate between 38 and 75, and so on. The first window may have a top edge at "y" coordinate 0 and a bottom edge at "y" coordinate 37. Similarly, the second window may have a top edge at "y" coordinate 38 and a bottom edge at "y" coordinate 75, and so on.
[0050] Furthermore, each window may define a range along the "x" axis that extends at least enough to include all pixels with pixel values of the first value (e.g., 255). For example, slide window application engine 310 may generate each window to include a left edge that, for every "y" coordinate position, is a predetermined distance away (e.g., 5 pixels) from a pixel with the first value that has a left-most "x" coordinate at the same "y" coordinate position. Slide window application engine 310 may also generate each window to include a right edge that, for every "y" coordinate position, is a predetermined distance away (e.g., 5 pixels) from a pixel with the first value that has a right-most "x" coordinate at the same "y" coordinate position.

[0051] For example, FIG. 4E illustrates a slide window image 411 generated based on the binary image 409 of FIG. 4D. As illustrated, slide window image 411 includes multiple windows 415. In addition, each window 415 includes a left edge 417, a right edge 419, a top edge 421, and a bottom edge 423. For each "y" coordinate within a window, the left edge 417 may be a predetermined distance 425 from a left-most pixel 427 with a pixel value that is the first value, and the right edge 419 may be a predetermined distance 435 from a right-most pixel 437 with a pixel value that is the first value. As such, each window 415 includes corresponding pixels with pixel values of the first value.
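The window construction may be sketched as follows. The sketch assumes ten horizontal bands and the 5-pixel margin of the example above, and returns window bounds rather than an annotated image; it is one possible realization for illustration only, not necessarily that of slide window application engine 310.

```python
import numpy as np

def build_slide_windows(binary_image, num_windows=10, margin=5, first_value=255):
    """Split the image into horizontal bands and fit a window around the "on" pixels in each band.

    Each window is returned as (y_top, y_bottom, x_left, x_right); bands that
    contain no pixels equal to first_value are skipped.
    """
    height, width = binary_image.shape[:2]
    band_height = height // num_windows          # e.g., 380 / 10 = 38 rows per band
    windows = []
    for i in range(num_windows):
        y_top = i * band_height
        y_bottom = min((i + 1) * band_height - 1, height - 1)
        ys, xs = np.nonzero(binary_image[y_top:y_bottom + 1, :] == first_value)
        if xs.size == 0:
            continue
        x_left = max(int(xs.min()) - margin, 0)
        x_right = min(int(xs.max()) + margin, width - 1)
        windows.append((y_top, y_bottom, x_left, x_right))
    return windows
```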
[0052] Referring back to FIG. 3, line generation engine 312 receives slide window image 311 from slide window application engine 310, and applies one or more line function generation processes to pixel coordinates of the pixels with pixel values of the first value included within the windows to generate line data 313 characterizing a line.
For example, line generation engine 312 may apply any suitable algorithm to the pixel coordinates to fit (e.g., estimate) a line to the corresponding pixels. As an example, line generation engine 312 may determine a quadratic equation, such as the one below, based on the pixel coordinates of the pixels with the first value.
(Eq. 1) y = ax² + bx + c
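One possible fitting algorithm is an ordinary least-squares fit of Eq. 1 to the pixel coordinates, sketched below with NumPy; this is an illustration only, not necessarily the algorithm used by line generation engine 312.

```python
import numpy as np

def fit_edge_quadratic(binary_image, first_value=255):
    """Fit y = a*x**2 + b*x + c to the coordinates of pixels equal to first_value.

    Returns the coefficients (a, b, c) of Eq. 1.
    """
    ys, xs = np.nonzero(binary_image == first_value)
    a, b, c = np.polyfit(xs, ys, deg=2)  # least-squares fit, highest power first
    return a, b, c
```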
[0053] Further, curvature determination engine 314 may receive the line data 313 from the line generation engine 312, and may generate curvature data 315 characterizing a curvature of the line characterized by the line data 313. For example, curvature determination engine 314 may apply to the line data 313 any suitable curvature algorithm, such as the one below, to generate the curvature data 315.
(Eq. 2) k = |y''| / (1 + (y')²)^(3/2)
[0054] In some examples, the curvature data 315 further identifies a radius of the curvature. For example, the curvature determination engine 314 may compute the radius using any suitable algorithm, such as the ones defined below.
(Eq. 3) R = (1 + (y')²)^(3/2) / |y''|
(Eq. 4) Radius = 1/ Curvature = 1/ k
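Assuming Eq. 2 and Eq. 3 are the standard plane-curve curvature and radius expressions reproduced above, their evaluation for the fitted quadratic of Eq. 1 reduces to the following non-limiting sketch.

```python
def curvature_and_radius(a, b, x):
    """Evaluate the curvature k and radius R of y = a*x**2 + b*x + c at position x.

    For the quadratic of Eq. 1: y' = 2*a*x + b and y'' = 2*a, so
    k = |2*a| / (1 + (2*a*x + b)**2) ** 1.5 and R = 1 / k (Eq. 4).
    """
    first_derivative = 2.0 * a * x + b
    second_derivative = 2.0 * a
    k = abs(second_derivative) / (1.0 + first_derivative**2) ** 1.5
    radius = float("inf") if k == 0.0 else 1.0 / k
    return k, radius
```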
[0055] The curvature determination engine 314 may store the curvature data 315 in a memory device, such as working memory 202 or database 170. In some examples, processor 201 compares the curvature and/or radius characterized by the curvature data 315 to corresponding threshold values. If the curvature is greater than a curvature threshold, the processor 201 may generate and transmit a warning signal indicating a glass sheet breakage warning. Similarly, if the radius is less than a radius threshold, the processor 201 may generate and transmit the warning signal. The warning signal may cause an indication, such as a pop-up message or flashing icon, to be displayed (e.g., on display 206).
[0056] FIG. 5A, for instance, illustrates the determination of a curvature for a quadratic equation characterizing a line generated from a slide window image 502. FIG. 5B illustrates the computation of a radius R of the curvature of a line C measured from a point P on the line C.
[0057] FIG. 6 illustrates an exemplary method 600 that may be performed by one or more computing devices, such as the glass sheet curvature computing device 160. In some examples, the exemplary method 600 may be carried out by processors distributed across one or more networks, such as processors within one or more cloud-based servers.
[0058] Beginning at step 602, infrared data is received from a camera positioned to capture a vertical bow of a glass sheet. For example, first IR camera 130 may be positioned to capture IR images of a first edge 151 of the glass sheet 150. The glass sheet curvature computing device 160 may receive the IR images (e.g., IR data 301) from the first IR camera 130. At step 604, a filtered image is generated based on applying a color filtering process to the infrared data. For example, to generate a filtered image such as filtered image 305, glass sheet curvature computing device 160 may determine whether each pixel value for each pixel characterized by the infrared data falls within a predetermined color range. If the pixel value falls outside the predetermined color range, the corresponding pixel is adjusted (e.g., filtered out). If, however, the pixel value falls within the range, the corresponding pixel value is left without adjustment.
[0059] Proceeding to step 606, a region-of-interest (ROI) image is generated based on applying a ROI mask to the filtered image. For example, glass sheet curvature computing device 160 may apply a mask to the filtered image (e.g., filtered image 305) over a region of interest to generate the ROI image, such as ROI image 307. The mask may define “x” axis pixel coordinates across a range, as described herein. As such, pixel values for pixels with “x” axis pixel coordinates within the range are not adjusted, and pixel values for pixels with “x” axis pixel coordinates outside of the range are adjusted (e.g., to zero).
[0060] At step 608, a binary image is generated based on applying a binary threshold process to the ROI image. For instance, glass sheet curvature computing device 160 may generate the binary image by adjusting the pixel values of the pixels within the region-of-interest, as characterized by the ROI image, based on a pixel value threshold. For example, glass sheet curvature computing device 160 may determine whether each pixel value is at or above the pixel value threshold. If a pixel value is at or above the pixel value threshold, the glass sheet curvature computing device 160 may replace the pixel value with a first value (e.g., 255). If the pixel value is below the pixel value threshold, the glass sheet curvature computing device 160 may replace the pixel value with a second value (e.g., 0).
[0061] Further, at step 610, a slide window image that includes non-zero pixels of the binary image within each of multiple windows is generated. For example, the glass sheet curvature computing device 160 may generate a predetermined number of windows where each window includes binary image pixels with non-zero pixel values (e.g., 255) and that have a "y" coordinate within a corresponding range of "y" coordinates and an "x" coordinate within a corresponding range of "x" pixel coordinates. Based on the slide window image, at step 612, line data characterizing a line is generated. For example, line generation engine 312 may apply any suitable algorithm to the pixel coordinates to fit (e.g., estimate) a line to the corresponding pixels. As an example, the glass sheet curvature computing device 160 may determine a quadratic equation based on the pixel coordinates of the pixels with the non-zero values. The quadratic equation may characterize a best-fit line to these corresponding pixel locations.
[0062] Further, at step 614, curvature data is generated based on the line data. The curvature data characterizes a curvature of the glass sheet captured within the infrared data received at step 602.
[0063] For example, the glass sheet curvature computing device 160 may apply any suitable curvature algorithm to the line data (e.g., line data 313) to generate the curvature data (e.g., curvature data 315). In some examples, the glass sheet curvature computing device 160 further determines a radius of the curvature, and generates the curvature data to, additionally or alternatively, include the computed radius. Further, and at step 616, the curvature data is stored in a database. For example, the glass sheet curvature computing device 160 may store the curvature data within database 170.
[0064] FIG. 7 illustrates an exemplary method 700 that may be performed by one or more computing devices, such as the glass sheet curvature computing device 160. In some examples, the exemplary method 700 may be carried out by processors distributed across one or more networks, such as processors within one or more cloud-based servers.

[0065] Beginning at step 702, a first signal is transmitted to an infrared camera to cause the infrared camera to position its lens to include a side view of a glass forming apparatus within its field-of-view. For instance, and as described herein, the glass sheet curvature computing device 160 transmits a message to the first IR camera 130 that, when received by the first IR camera 130, causes the first IR camera to adjust a pitch or yaw of its corresponding sensor to capture a side view of the glass forming apparatus 120. Further, at step 704, a second signal is received. The second signal indicates the start of a fusion drawdown process to produce a glass sheet. For example, the glass sheet curvature computing device 160 may receive a signal from the glass forming apparatus 120 indicating that glass drawdown is beginning. At step 706, in response to the second signal, a third signal is transmitted to the infrared camera to cause the infrared camera to capture an infrared image of a side view (e.g., edge view) of the glass sheet. For example, based on receiving the signal from the glass forming apparatus 120, the glass sheet curvature computing device 160 may transmit a message to the first IR camera 130 to cause the first IR camera 130 to capture an IR image of the first edge 151 of the glass sheet 150 as it is being drawn down.
[0066] Proceeding to step 708, curvature data is generated characterizing a curvature of the glass sheet based on the infrared image. For instance, the glass sheet curvature computing device 160 may determine a subset of the plurality of pixels of the infrared image based on their pixel values and a pixel value threshold. Further, the glass sheet curvature computing device 160 may generate the curvature data characterizing a curvature of the glass sheet based on a position of the subset of the plurality of pixels within the infrared image. In some examples, the glass sheet curvature computing device 160 may generate the curvature data as described with respect to the method of FIG. 6.
[0067] At step 710, a fourth signal is transmitted to an additional camera to cause the additional camera to capture a front side image of the glass sheet. For instance, the glass sheet curvature computing device 160 may transmit a signal to the video camera 180 to capture an image of the glass sheet 150 along a longitudinal length of a front side 159 of the glass sheet 150. The video camera 180 can then transmit the captured image to the glass sheet curvature computing device 160.
[0068] Further, at step 712, a 3D map is generated based on the curvature data and the front side image of the glass sheet. For example, the glass sheet curvature computing device 160 may apply any map generation process described herein to generate the 3D map. The 3D map includes a 3D image of a bow of the glass sheet. Proceeding to step 714, the 3D map is stored in a database. For example, the glass sheet curvature computing device 160 may store the 3D map in database 170.
[0069] Although the methods described above are with reference to the illustrated flowcharts, it will be appreciated that many other ways of performing the acts associated with the methods can be used. For example, the order of some operations may be changed, and some of the operations described may be optional.
[0070] In addition, the methods and system described herein can be at least partially embodied in the form of computer-implemented processes and apparatus for practicing those processes. The disclosed methods may also be at least partially embodied in the form of tangible, non-transitory machine-readable storage media encoded with computer program code. For example, the steps of the methods can be embodied in hardware, in executable instructions executed by a processor (e.g., software), or a combination of the two. The media may include, for example, RAMs, ROMs, CD-ROMs, DVD-ROMs, BD-ROMs, hard disk drives, flash memories, or any other non-transitory machine-readable storage medium. When the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the method. The methods may also be at least partially embodied in the form of a computer into which computer program code is loaded or executed, such that the computer becomes a special purpose computer for practicing the methods. When implemented on a general-purpose processor, the computer program code segments configure the processor to create specific logic circuits. The methods may alternatively be at least partially embodied in application specific integrated circuits for performing the methods.
[0071] The foregoing is provided for purposes of illustrating, explaining, and describing embodiments of this disclosure. Modifications and adaptations to these embodiments will be apparent to those skilled in the art and may be made without departing from the scope or spirit of this disclosure.

Claims

CLAIMS What is claimed is:
1. A glass forming system comprising: a glass forming apparatus configured to draw down molten glass to produce a glass sheet; an infrared camera configured to capture infrared images of the glass sheet; and a computing device communicatively coupled to the glass forming apparatus and the infrared camera, the computing device configured to: receive an infrared image from the infrared camera, the infrared image comprising a pixel value for each of a plurality of pixels; determine a subset of the plurality of pixels based on the pixel values for the plurality of pixels and a pixel value threshold; generate curvature data characterizing a curvature of the glass sheet based on a position of the subset of the plurality of pixels within the infrared image; and store the curvature data in a database.
2. The glass forming system of claim 1, wherein the computing device is configured to apply a color filtering process to the infrared image to generate a color filtered image that identifies pixels of the plurality of pixels within a corresponding color range, wherein the subset of the plurality of pixels comprise the pixels within the corresponding color range.
3. The glass forming system of claim 2, wherein the computing device is configured to apply a region-of-interest mask to the color filtered image to determine the subset of the plurality of pixels.
4. The glass forming system of claim 3, wherein the computing device is configured to: compare the pixel value for each of the subset of the plurality of pixels to a binary threshold; and adjust the pixel value for each of the subset of the plurality of pixels based on the comparison.
5. The glass forming system of claim 4, wherein the computing device is configured to: adjust the pixel value to a first value when the pixel value is at or above the binary threshold; and adjust the pixel value to a second value when the pixel value is below the binary threshold.
6. The glass forming system of claim 1, wherein the computing device is configured to: compare the pixel value for each of the plurality of pixels to the pixel value threshold; and determine the subset of the plurality of pixels based on satisfying the pixel value threshold.
7. The glass forming system of claim 1, wherein the computing device is configured to transmit a signal to the infrared camera to cause the infrared camera to position a sensor of the infrared camera to capture an edge of the glass forming apparatus within a field-of-view of the sensor.
8. The glass forming system of claim 1 comprising a video camera configured to capture images of a front side of the glass sheet, wherein the computing device is configured to: receive a front image of the glass sheet from the video camera; and generate a map of the glass sheet based on the front image and the curvature data.
9. The glass forming system of claim 1, wherein the computing device is configured to: compare the curvature to a curvature threshold; and transmit a warning signal based on satisfying the curvature threshold.
10. The glass forming system of claim 1, wherein the computing device is configured to: determine a radius based on the curvature; and generate the curvature data to include the radius.
11. The glass forming system of claim 1, wherein the infrared camera is configured to capture the infrared images of a first edge of the glass sheet, the glass forming apparatus comprising a second infrared camera configured to capture additional infrared images of a second edge of the glass sheet, wherein the computing device is configured to: receive an additional infrared image from the second infrared camera, the additional infrared image comprising an additional pixel value for each of an additional plurality of pixels; determine a subset of the additional plurality of pixels based on the additional pixel values for the additional plurality of pixels and the pixel value threshold; and generate the curvature data characterizing the curvature of the glass sheet based on a position of the subset of the additional plurality of pixels within the additional infrared image.
12. An apparatus comprising: a memory storing instructions; and a processor communicatively coupled to the memory, wherein the processor is configured to execute the instructions to: receive an infrared image from an infrared camera with a field-of-view that includes an edge of a glass sheet being drawn down by a glass forming apparatus, the infrared image comprising a pixel value for each of a plurality of pixels; determine a subset of the plurality of pixels based on the pixel values for the plurality of pixels and a pixel value threshold; generate curvature data characterizing a curvature of the glass sheet based on a position of the subset of the plurality of pixels within the infrared image; and store the curvature data in a database.
13. The apparatus of claim 12, wherein the processor is configured to execute the instructions to apply a color filtering process to the infrared image to generate a color filtered image that identifies pixels of the plurality of pixels within a corresponding color range, wherein the subset of the plurality of pixels comprise the pixels within the corresponding color range.
14. The apparatus of claim 13, wherein the processor is configured to execute the instructions to apply a region-of-interest mask to the color filtered image to determine the subset of the plurality of pixels.
15. The apparatus of claim 12, wherein the processor is configured to execute the instructions to: compare the pixel value for each of the plurality of pixels to the pixel value threshold; and determine the subset of the plurality of pixels based on satisfying the pixel value threshold.
16. The apparatus of claim 12, wherein the processor is configured to execute the instructions to transmit a signal to the infrared camera to cause the infrared camera to position a sensor of the infrared camera to capture an edge of the glass forming apparatus within a field-of-view of the sensor.
17. The apparatus of claim 12, wherein the processor is configured to execute the instructions to: receive a front image of the glass sheet from a video camera; and generate a map of the glass sheet based on the front image and the curvature data.
18. The apparatus of claim 12, wherein the processor is configured to execute the instructions to: compare the curvature to a curvature threshold; and transmit a warning signal based on satisfying the curvature threshold.
19. A method by at least one processor comprises: receiving an infrared image from an infrared camera with a field-of-view that includes an edge of a glass sheet being drawn down by a glass forming apparatus, the infrared image comprising a pixel value for each of a plurality of pixels; determining a subset of the plurality of pixels based on the pixel values for the plurality of pixels and a pixel value threshold; generating curvature data characterizing a curvature of the glass sheet based on a position of the subset of the plurality of pixels within the infrared image; and storing the curvature data in a database.
20. The method of claim 19, comprising applying a color filtering process to the infrared image to generate a color filtered image that identifies pixels of the plurality of pixels within a corresponding color range, wherein the subset of the plurality of pixels comprise the pixels within the corresponding color range.
PCT/US2024/035367 2023-07-26 2024-06-25 System and methods for determining glass ribbon curvature Pending WO2025024080A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363529024P 2023-07-26 2023-07-26
US63/529,024 2023-07-26

Publications (1)

Publication Number Publication Date
WO2025024080A1 true WO2025024080A1 (en) 2025-01-30

Family

ID=91950171

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/035367 Pending WO2025024080A1 (en) 2023-07-26 2024-06-25 System and methods for determining glass ribbon curvature

Country Status (3)

Country Link
CN (1) CN119374517A (en)
TW (1) TW202521925A (en)
WO (1) WO2025024080A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005114610A (en) * 2003-10-09 2005-04-28 Ishikawajima Harima Heavy Ind Co Ltd Flowing glass abnormality monitoring device
JP2016098137A (en) * 2014-11-20 2016-05-30 日本電気硝子株式会社 Method for monitoring shape of glass ribbon, and method and apparatus for manufacturing glass article
US20180297884A1 (en) * 2015-09-24 2018-10-18 Corning Incorporated Methods and apparatus for manufacturing glass


Also Published As

Publication number Publication date
TW202521925A (en) 2025-06-01
CN119374517A (en) 2025-01-28


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24743169

Country of ref document: EP

Kind code of ref document: A1