
WO2023043434A1 - Temperature detections - Google Patents

Temperature detections

Info

Publication number
WO2023043434A1
WO2023043434A1 (Application PCT/US2021/050313)
Authority
WO
WIPO (PCT)
Prior art keywords
temperature
examples
thermal
thermochromic dye
optical image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2021/050313
Other languages
French (fr)
Inventor
Jacob Tyler WRIGHT
Maria Fabiola LEYVA MENDIVIL
Sunil KOTHARI
Lei Chen
Kyle Douglas WYCOFF
Juan Carlos CATANA SALAZAR
Jun Zeng
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett Packard Development Co LP
Priority to PCT/US2021/050313
Publication of WO2023043434A1
Anticipated expiration
Current legal status: Ceased


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B29 WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29C SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C64/00 Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C64/30 Auxiliary operations or equipment
    • B29C64/386 Data acquisition or data processing for additive manufacturing
    • B29C64/393 Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B22 CASTING; POWDER METALLURGY
    • B22F WORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F12/00 Apparatus or devices specially adapted for additive manufacturing; Auxiliary means for additive manufacturing; Combinations of additive manufacturing apparatus or devices with other processing apparatus or devices
    • B22F12/90 Means for process control, e.g. cameras or sensors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B33 ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y30/00 Apparatus for additive manufacturing; Details thereof or accessories therefor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B33 ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y50/00 Data acquisition or data processing for additive manufacturing
    • B33Y50/02 Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01K MEASURING TEMPERATURE; MEASURING QUANTITY OF HEAT; THERMALLY-SENSITIVE ELEMENTS NOT OTHERWISE PROVIDED FOR
    • G01K11/00 Measuring temperature based upon physical or chemical changes not covered by groups G01K3/00, G01K5/00, G01K7/00 or G01K9/00
    • G01K11/12 Measuring temperature based upon physical or chemical changes not covered by groups G01K3/00, G01K5/00, G01K7/00 or G01K9/00 using changes in colour, translucency or reflectance
    • G01K11/16 Measuring temperature based upon physical or chemical changes not covered by groups G01K3/00, G01K5/00, G01K7/00 or G01K9/00 using changes in colour, translucency or reflectance of organic materials
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01K MEASURING TEMPERATURE; MEASURING QUANTITY OF HEAT; THERMALLY-SENSITIVE ELEMENTS NOT OTHERWISE PROVIDED FOR
    • G01K15/00 Testing or calibrating of thermometers
    • G01K15/005 Calibration
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B22 CASTING; POWDER METALLURGY
    • B22F WORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F2998/00 Supplementary information concerning processes or compositions relating to powder metallurgy
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B22 CASTING; POWDER METALLURGY
    • B22F WORKING METALLIC POWDER; MANUFACTURE OF ARTICLES FROM METALLIC POWDER; MAKING METALLIC POWDER; APPARATUS OR DEVICES SPECIALLY ADAPTED FOR METALLIC POWDER
    • B22F2999/00 Aspects linked to processes or compositions used in powder metallurgy
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B29 WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29C SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C64/00 Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C64/10 Processes of additive manufacturing
    • B29C64/165 Processes of additive manufacturing using a combination of solid and fluid materials, e.g. a powder selectively bound by a liquid binder, catalyst, inhibitor or energy absorber
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B33 ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y10/00 Processes of additive manufacturing
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/09 Supervised learning

Definitions

  • Three-dimensional (3D) solid parts may be produced from a digital model using additive manufacturing.
  • Additive manufacturing may be used in rapid prototyping, mold generation, mold master generation, and short-run manufacturing.
  • Additive manufacturing involves the application of successive layers of build material.
  • the build material may be cured or fused.
  • Figure 1 is a flow diagram illustrating an example of a method for temperature detection
  • Figure 2A is a block diagram illustrating an example of engines that may be utilized in accordance with some examples of the techniques described herein;
  • Figure 2B is a block diagram illustrating an example of engines that may be utilized in accordance with some examples of the techniques described herein;
  • Figure 3 is a block diagram of an example of an apparatus that may be used in detecting temperatures
  • Figure 4 is a block diagram illustrating an example of a computer-readable medium for temperature detection
  • Figure 5 is a diagram illustrating an example of optical image processing calibration
  • Figure 6 is a diagram illustrating an example of a build volume including a thermal cage in accordance with some examples of the techniques described herein;
  • Figure 7 is a diagram illustrating an example of a printed object with a defect.
  • Additive manufacturing may be used to manufacture three-dimensional (3D) objects.
  • 3D printing is an example of additive manufacturing.
  • Some examples of 3D printing may selectively deposit an agent or agents (e.g., droplets) at a pixel level to enable control over voxel-level energy deposition. For instance, thermal energy may be projected over material in a build area, where a phase change (for example, melting and solidification) in the material may occur depending on the voxels where the agents are deposited.
  • a voxel is a representation of a location in a 3D space.
  • a voxel may represent a volume or component of a 3D space.
  • a voxel may represent a volume that is a subset of the 3D space.
  • voxels may be arranged on a 3D grid.
  • a voxel may be rectangular or cubic in shape.
  • voxels may be arranged along axes.
  • An example of three-dimensional (3D) axes includes an x dimension, a y dimension, and a z dimension.
  • a quantity in the x dimension may be referred to as a width
  • a quantity in the y dimension may be referred to as a length
  • a quantity in the z dimension may be referred to as a height
  • the x and/or y axes may be referred to as horizontal axes
  • the z axis may be referred to as a vertical axis.
  • Other orientations of the 3D axes may be utilized in some examples, and/or other definitions of 3D axes may be utilized in some examples.
  • Examples of a voxel size dimension may include 25.4 millimeters (mm)/150 ≈ 170 microns for 150 dots per inch (dpi), 490 microns for 50 dpi, 2 mm, etc.
  • the term “voxel level” and variations thereof may refer to a resolution, scale, or density corresponding to voxel size.
  • the term “voxel” and variations thereof may refer to a “thermal voxel.”
  • the size of a thermal voxel may be defined as a minimum that is thermally meaningful (e.g., greater than or equal to 42 microns or 600 dots per inch (dpi)).
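As a worked example of the resolution-to-voxel-size arithmetic quoted above, the following sketch (plain Python; the dpi values are those mentioned in this description) converts print resolutions to voxel edge lengths:

```python
# Voxel edge length from print resolution: 25.4 mm per inch / dots per inch.
MM_PER_INCH = 25.4

def voxel_size_microns(dpi: int) -> float:
    """Return the voxel edge length in microns for a print resolution in dpi."""
    return MM_PER_INCH / dpi * 1000.0  # convert mm to microns

for dpi in (150, 50, 600):
    print(f"{dpi} dpi -> {voxel_size_microns(dpi):.0f} microns")
# 150 dpi -> 169 microns (~170 as quoted); 600 dpi -> 42 microns (the
# thermally meaningful minimum); 50 dpi -> 508 microns (quoted as ~490).
```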
  • a set of voxels may be utilized to represent a build volume.
  • a build volume is a volume in which an object or objects may be manufactured.
  • a “build” may refer to an instance of 3D manufacturing.
  • a layer is a portion of a build volume (e.g., a build in the build volume).
  • a layer may be a cross section (e.g., two-dimensional (2D) cross section) or 3D portion (e.g., rectangular prism) of a build volume.
  • a layer may refer to a horizontal portion (e.g., plane) of a build volume.
  • an “object” may refer to an area and/or volume in a layer and/or build volume indicated for forming a physical object.
  • each voxel in the build volume may undergo a thermal procedure (approximately 15 hours of build time (e.g., time for layer-by-layer printing) and approximately 35 hours of additional cooling).
  • the thermal procedure of voxels that include an object may affect the manufacturing quality (e.g., functional quality) of the object.
  • Factors affecting production yield in 3D printing may include packing density, powder recyclability, and manufacturing accuracy (e.g., degree of object defect(s)). Tighter packing may allow printing more objects in a single build. However, if the objects are packed too closely, the objects and/or powder may overheat, which may result in a reduction in reusable powder quality and increased object defects. Accordingly, objects may be packed to keep the build volume within a range of temperatures to preserve powder quality and/or reduce object defects.
  • Some thermal information may provide a limited observation of temperatures occurring in a build volume. It may be difficult to directly observe peak temperature in the build volume, which may indicate which powder will undergo higher thermal stress. For example, a temperature of a buried layer may be different from a measured temperature (observed with a thermal sensor, for instance). In areas around objects (e.g., aural regions), peak temperatures may occur due to thermal diffusion after a layer is buried.
  • each powder voxel may provide data to calculate the thermal stress each build exerts on the powder, and to predict potential temperature-related defects in objects.
  • Each voxel may have an individual thermal history. It may be difficult to physically sense the temperature of every voxel during the thermal procedure.
  • peak temperature may drive the cooling history of the voxels (given thermal diffusion and wall conditions, for instance).
  • Accurate peak temperature data may enable calibrating thermal prediction procedures, may increase accuracy in powder degradation prediction, and/or may enable the prediction of defects such as clogged holes, thermal bleed, and/or hot-spot thermal bleed at a top (e.g., outer) surface.
  • Some examples of the techniques described herein may provide approaches for in-situ sensing of peak temperature for powder voxels using thermochromic dyes.
  • peak temperature data may be utilized to validate and/or increase the accuracy of voxel-level thermal simulation and/or prediction for different builds and/or printers.
  • some of the techniques described herein may help to determine a peak temperature that each voxel in a build volume will experience as a result of the geometry being printed and the thermal signature of the printer in question.
  • Some examples of the techniques described herein may provide accurate measurement and/or prediction of the thermal behavior in the build volume during printing.
  • In some examples, thermochromic dye may be utilized to evaluate peak powder temperature (e.g., for a portion or portions of the build volume and/or across the full build volume). Some examples of the techniques described herein may utilize thermochromic dye data to calibrate a 3D printing simulation and/or prediction.
  • In some examples, an agent or agents (e.g., fusing agent, detailing agent, and/or other thermally relevant fluids) may be dispensed at a print resolution (e.g., 75 dpi). An example of print resolution is 42 microns in x-y dimensions and 80 microns in a z dimension.
  • thermal information or thermal behavior may be mapped as a thermal image.
  • a thermal image is a set of data indicating temperature(s) (or thermal energy) in an area.
  • a thermal image may be sensed, captured, simulated, and/or predicted.
  • Some of the techniques described herein may be utilized in various examples of additive manufacturing. For instance, some examples may be utilized for plastics (e.g., polymers), semi-crystalline materials, metals, etc.
  • Some additive manufacturing techniques may be powder-based and driven by powder fusion.
  • Some examples of the approaches described herein may be applied to area-based powder bed fusion-based additive manufacturing, such as Stereolithography (SLA), Multi Jet Fusion (MJF), Metal Jet Fusion, Selective Laser Melting (SLM), Selective Laser Sintering (SLS), liquid resin-based printing, etc.
  • Some examples of the approaches described herein may be applied to additive manufacturing where agents carried by droplets are utilized for voxel-level thermal modulation.
  • “powder” may indicate or correspond to particles.
  • an object may indicate or correspond to a location (e.g., area, space, etc.) where particles are to be sintered, melted, and/or solidified.
  • an object may be formed from sintered or melted powder.
  • Machine learning is a technique where a machine learning model is trained to perform a task or tasks based on a set of examples (e.g., data). Training a machine learning model may include determining weights corresponding to structures of the machine learning model.
  • Artificial neural networks are a kind of machine learning model that are structured with nodes, layers, and/or connections. Deep learning is a kind of machine learning that utilizes multiple layers.
  • a deep neural network is a neural network that utilizes deep learning.
  • neural networks include regression networks (e.g., isotonic regression models), convolutional neural networks (CNNs) (e.g., basic CNN, deconvolutional neural network, inception module, residual neural network, etc.) and recurrent neural networks (RNNs) (e.g., basic RNN, multilayer RNN, bi-directional RNN, fused RNN, clockwork RNN, etc.).
  • Different depths of a neural network or neural networks may be utilized in accordance with some examples of the techniques described herein.
  • Figure 1 is a flow diagram illustrating an example of a method 100 for temperature detection.
  • the method 100 and/or an element or elements of the method 100 may be performed by an electronic device.
  • the method 100 may be performed by the apparatus 324 described in relation to Figure 3.
  • the apparatus may dispense 102, during a printing procedure, a thermochromic dye on a layer of material in a build volume.
  • Thermochromic dye is a substance that changes color based on heat exposure.
  • the color of thermochromic dye may correspond to a peak temperature reached by the thermochromic dye.
  • examples of thermochromic dye may include K200-NH or Acid Blue 9.
  • dispensing 102 the thermochromic dye may include controlling a print head(s) and/or sending instructions to a printer to print (e.g., eject, extrude, etc.) the thermochromic dye onto a layer of a build volume.
  • thermochromic dye may be ejected from a first print head that is separate from a second print head to eject agent (e.g., fusing agent, detailing agent, etc.).
  • the thermochromic dye may be dispensed over a whole layer or over a region (e.g., subset) of the layer.
  • a fusing layer is an exposed layer, a top layer, or a layer undergoing fusing of material.
  • a fusing layer may be a top layer of material in a build volume that is exposed to a print head and/or thermal projector for sintering.
  • a buried layer is a covered layer.
  • a buried layer may be a layer beneath or under the fusing layer.
  • a layer (e.g., fusing layer) may become a buried layer when a next layer of material is placed in the print bed.
  • thermochromic, water-based dye such as K200-NH may be ejected through a print head to any region of the layer (e.g., aural region(s) and/or region(s) where high temperatures may occur).
  • a thermochromic dye may have an operating range of 60-200° Celsius (C) and may change color from white to magenta.
  • some regions may also include agent (e.g., fusing agent and/or detailing agent), where the thermochromic dye may indicate the approximate temperature.
  • a calibration procedure may be performed to compensate for a cooling and/or heating effect the thermochromic dye may have on the powder.
  • thermochromic dye such as Acid Blue 9 may be utilized to indicate build volume temperatures.
  • Acid Blue 9 may be ejected to any region of the layer such that the thermal degradation occurring over the course of the build may be captured.
  • the color of the thermochromic dye may indicate a peak temperature experienced by the thermochromic dye.
  • the thermochromic dye may change color according to the peak temperature experienced by the thermochromic dye during and/or after printing.
  • dispensing 102 the thermochromic dye may include dispensing the thermochromic dye in an aural region relative to an object in the build volume.
  • An aural region is a region next to (e.g., adjacent to, bordering, and/or abutting, etc.) an object.
  • an aural region may be an area next to an object that extends up to a distance from the object (e.g., a distance from the object surface in a normal or perpendicular direction from the object surface).
  • the aural region may extend to less than the distance in a case that another object is less than the distance from the object’s surface.
  • thermochromic dye in the aural region may experience heat due to thermal diffusion from an object being printed and/or a printed object.
  • the thermochromic dye in the aural region may be heated by thermal energy diffusing from a sintering region (e.g., the object).
  • the method 100 may include printing a thermal cage in the build volume.
  • the thermochromic dye may be dispensed in and/or on a volume of the thermal cage (e.g., in a volume contained by the thermal cage).
  • a thermal cage (e.g., box, container, shape, etc.) is a structure that may partially or completely contain an object or objects and/or powder.
  • a thermal cage may have relatively low thermal mass. Having a relatively low thermal mass may mean that the thermal cage does not produce a significant amount of heat by diffusion and/or that the thermal cage has a nonsignificant impact on the temperature history of a region or regions around and/or within the thermal cage.
  • a thermal cage or thermal cages may be printed at a region or regions of interest in the build volume.
  • An example of a thermal cage is given in relation to Figure 6.
  • a thermal cage may be utilized to provide a stable volume where the thermochromic dye may be dispensed.
  • the thermochromic dye may be dispensed inside a thermal cage, which may allow observation of (e.g., image capture of) the color(s) of an aural region or regions around objects.
  • a thermal cage may be utilized to map the location of detected temperature (based on thermochromic dye color) to the build volume.
  • a thermal cage may be utilized to extract a section of the build volume for external analysis of the temperature distribution.
  • a thermal cage may be utilized to avoid losing dyed powder during unpacking (e.g., de-caking, extraction, reclamation, etc.).
  • a thermal cage may be extracted manually.
  • the apparatus may detect 104, after the printing procedure, a temperature in an aural region of an object in the build volume based on a color of the thermochromic dye.
  • the apparatus may capture and/or receive an optical image of the layer, aural region, object, and/or thermal cage.
  • the optical image may be captured by a camera with a view that includes the layer and/or aural region.
  • the camera may capture an optical image depicting a layer, aural region, object, and/or thermal cage.
  • the optical image may be captured during and/or after unpacking. Unpacking is a procedure where material is removed from the build volume and/or an object or objects are removed from the build volume.
  • material (e.g., a layer or layers of material) may be removed through vacuuming, scooping, shearing, blowing, and/or other removal methods.
  • unpacking may be performed manually by a user or technician. In some examples, unpacking may be performed automatically by a robot and/or other machine.
  • the camera may be included in a device, mounted on a device, and/or may be separate from a device.
  • the camera may be mounted in a printer or trolley above the build volume.
  • the camera may be manually operated by a user or technician.
  • the layer, aural region, object, and/or thermal cage may be illuminated with a white light source (e.g., white light-emitting diode (LED)).
  • the optical image may be a still image or video frame.
  • the optical image may depict a whole layer or a region of the layer.
  • the optical image may indicate a color or colors of the thermochromic dye on the layer, aural region, object, and/or thermal cage.
  • the apparatus may measure the temperature of the layer, aural region, object, and/or thermal cage by determining a temperature corresponding to a color in the optical image.
  • detecting 104 the temperature may include mapping a color of the thermochromic dye from a captured image to the temperature.
  • the apparatus may utilize a look-up table or function to map pixel color from the optical image to temperature. For instance, different pixel colors and/or shades may correspond to different temperatures (e.g., peak temperatures) experienced by the thermochromic dye.
  • the apparatus may measure the temperature of the fusing layer by mapping a color in the optical image to a corresponding temperature.
  • the apparatus may assign and/or record the temperature or temperatures corresponding to a pixel or sets of pixels (e.g., areas of the optical images with the same color or within a color range).
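A minimal sketch of the color-to-temperature look-up described above, assuming a hypothetical calibration table of dye colors and peak temperatures; a nearest-color search stands in for whatever interpolation an implementation might actually use:

```python
import numpy as np

# Hypothetical calibration: RGB colors of the thermochromic dye at known peak
# temperatures (degrees C). A real table would come from a calibration print.
LUT_COLORS = np.array([[255, 255, 255],   # white   -> low temperature
                       [255, 200, 230],
                       [255, 120, 190],
                       [200,  30, 120]])  # magenta -> high temperature
LUT_TEMPS = np.array([60.0, 110.0, 160.0, 200.0])

def pixels_to_temperatures(image: np.ndarray) -> np.ndarray:
    """Map each RGB pixel to the temperature of the nearest LUT color."""
    flat = image.reshape(-1, 1, 3).astype(float)
    dist = np.linalg.norm(flat - LUT_COLORS[None, :, :], axis=2)  # (N, 4)
    nearest = dist.argmin(axis=1)
    return LUT_TEMPS[nearest].reshape(image.shape[:2])

# Usage: a 2x2 toy "optical image" of dyed powder.
img = np.array([[[255, 255, 255], [255, 130, 195]],
                [[205,  40, 125], [250, 205, 228]]], dtype=np.uint8)
print(pixels_to_temperatures(img))
```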
  • the apparatus may utilize a spatial mapping between the optical image and voxels of the build volume. For example, the apparatus may apply a transformation or transformations (e.g., unprojection) to the pixels in the optical image to map the color(s) and/or corresponding temperature(s) to the voxels of the layer, aural region, object, and/or thermal cage (e.g., locations in 3D space).
  • the spatial mapping may be based on a marker or structure (e.g., text, number, pattern, geometry, etc.) of the layer, aural region, object, and/or thermal cage.
  • the spatial mapping may indicate the approximate original location(s) (e.g., voxels) of colors and/or temperatures of the layer, aural region, object, and/or thermal cage in the build volume.
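One plausible way to realize the spatial mapping described above is a planar homography from image pixels to layer coordinates, computed from markers visible in the optical image. The sketch below uses OpenCV; the marker and bed coordinates are illustrative only, and the z coordinate would come from the layer index:

```python
import numpy as np
import cv2

# Four fiducial markers (e.g., corners of a thermal cage) located both in the
# optical image (pixels) and in build-volume coordinates (mm). The specific
# coordinates here are made up for illustration.
img_pts = np.float32([[102, 88], [1830, 95], [1822, 1410], [110, 1402]])
bed_pts = np.float32([[0, 0], [380, 0], [380, 284], [0, 284]])  # mm

H = cv2.getPerspectiveTransform(img_pts, bed_pts)

def pixel_to_bed(u: float, v: float) -> tuple[float, float]:
    """Map an optical-image pixel to (x, y) mm on the layer plane."""
    pt = np.float32([[[u, v]]])
    x, y = cv2.perspectiveTransform(pt, H)[0, 0]
    return float(x), float(y)

print(pixel_to_bed(960.0, 750.0))  # approximate layer location of this pixel
```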
  • the apparatus may perform optical image processing.
  • Optical image processing is an operation or operations to process optical data (e.g., pixels) of the optical image.
  • Examples of optical image processing may include color compensation, white balance compensation, and/or lens distortion correction, etc.
  • the apparatus may adjust the optical image to increase color accuracy of the optical image.
  • the method 100 may include calibrating optical image processing based on color swatches associated with temperatures.
  • a color swatch is an item with a sample color. For instance, color swatches of the same color (and/or corresponding to the same temperature of the thermochromic dye) may be placed at different locations within a camera’s field of view (of a build volume and/or capture area, for example). Due to environmental variations (e.g., lighting, etc.) and/or sensing variations (e.g., lens distortion, etc.), swatches with the same physical color may vary in the optical image (e.g., color may vary in different locations of the optical image) captured by the camera.
  • a camera may inaccurately sense the color of a swatch as a slightly different color in the optical image.
  • the apparatus may calibrate (e.g., adjust) the optical image processing to reduce color sensing inaccuracy and/or spatial color variation between swatches of the same color.
  • a color swatch may have a designated color and/or corresponding temperature.
  • the apparatus may receive the designated color (e.g., red-green-blue (RGB) value) of a color swatch or swatches from an input device (e.g., keyboard, mouse, touchscreen, etc.).
  • the apparatus may capture and/or receive an optical image that depicts the swatch or swatches.
  • the apparatus may determine color compensation (e.g., a difference or bias) between the designated color and the captured color (and/or compensation between a temperature corresponding to the designated color and a temperature corresponding to the captured color).
  • the apparatus may determine color compensation for spatial variation in color sensing.
  • calibrating the optical image processing may be performed before printing.
  • the apparatus may capture and/or receive a calibration image of a color swatch or swatches and compensate the optical image processing before a build is printed, before thermochromic dye is dispensed, and/or before an optical image of a fusing layer with thermochromic dye is captured.
  • the calibration may be applied to an optical image processing pipeline, such that compensation is automatically applied during processing of an optical image.
  • In some examples, calibrating the optical image processing (e.g., compensation, post-print correction, etc.) may be performed after printing.
  • the apparatus may capture and/or receive a calibration image of a color swatch or swatches and compensate the optical image processing after a build is printed, after thermochromic dye is dispensed, and/or after an optical image of a fusing layer with thermochromic dye is captured.
  • the apparatus may apply the determined compensation (e.g., color compensation) to an optical image to increase color accuracy in the image, which may increase temperature measurement accuracy.
  • the apparatus may apply the determined temperature compensation to a temperature determined from the optical image to increase temperature measurement accuracy.
  • optical image processing calibration may be performed for multiple colors and/or color shades. In some examples, optical image processing calibration may be performed in accordance with the example described in relation to Figure 5.
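A minimal sketch of swatch-based color compensation, assuming hypothetical designated and captured swatch colors at a few calibrated locations; each pixel is corrected by the bias measured at the nearest location:

```python
import numpy as np

# Hypothetical calibration: identical physical swatches placed at several
# locations appear with slightly different RGB values in the captured image.
designated = np.array([255.0, 120.0, 190.0])          # known swatch color
captured = np.array([[247.0, 131.0, 181.0],           # per-location readings
                     [252.0, 118.0, 196.0],
                     [241.0, 126.0, 179.0]])
locations = np.array([[0.1, 0.1], [0.5, 0.5], [0.9, 0.8]])  # normalized (x, y)

# Per-location bias between the designated and captured colors.
bias = designated[None, :] - captured                  # (3 locations, RGB)

def compensate(pixel_rgb, xy):
    """Correct a pixel by the bias of the nearest calibrated location."""
    nearest = np.linalg.norm(locations - np.asarray(xy), axis=1).argmin()
    return np.clip(pixel_rgb + bias[nearest], 0, 255)

print(compensate(np.array([244.0, 129.0, 183.0]), (0.15, 0.12)))
```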
  • the method 100 may include training a machine learning model based on the temperature.
  • a machine learning model may be trained to predict the thermal behavior (e.g., temperature, a thermal image, etc.) of 3D additive manufacturing.
  • the machine learning model may be trained to predict the thermal behavior of a buried layer or layers.
  • the temperature detected from the thermochromic dye may be utilized as ground truth data during training. For instance, the detected temperature may be utilized to calibrate (e.g., compensate, adjust, etc.) predicted thermal behavior to increase accuracy.
  • the machine learning model may utilize geometrical data as input to predict thermal behavior.
  • Geometrical data is data indicating a geometrical model or models of an object or objects.
  • geometrical data may indicate the placement and/or model of an object or objects in a build volume.
  • a model may specify shape and/or size of a 3D object or objects.
  • a model may be expressed using polygon meshes and/or coordinate points.
  • a model may be defined using a format or formats such as a 3D manufacturing format (3MF) file format, an object (OBJ) file format, computer aided design (CAD) file, and/or a stereolithography (STL) file format, etc.
  • the geometrical data indicating a model or models may be received from another device and/or generated.
  • the apparatus may receive a file or files of geometrical data and/or may generate a file or files of geometrical data.
  • the apparatus may generate geometrical data with model(s) created on the apparatus from an input or inputs (e.g., scanned object input, user-specified input, etc.). Examples of geometrical data include model data, shape image(s), slice(s), contone map(s), etc.
  • the machine learning model may be trained to determine (e.g., predict, infer, etc.) a thermal image or images corresponding to a layer or layers (e.g., slices) of geometrical data. For instance, the machine learning model may be trained using layers (e.g., slices) of geometrical data as input data and detected temperature(s) (at corresponding locations, for example) as ground truth data. After training, the machine learning model may predict or infer the thermal behavior (e.g., thermal image(s), temperature(s), etc.) in a build volume.
  • the apparatus may utilize a training objective function to train the machine learning model.
  • a training objective function may be utilized that reduces (e.g., minimizes) a pixel-wise difference.
  • the training objective function may reduce (e.g., minimize) a pixel-wise temperature difference between the predicted temperatures and the temperatures measured from a corresponding region of the optical image.
  • the apparatus may train the machine learning model with an isotonic regression model, where the temperature distribution value for input and ground truth may be divided into temperature range blocks with probabilities of pixels falling into a temperature range.
  • the isotonic regression model may be utilized as a training objective function to reduce (e.g., minimize) pair-wise differences for each temperature range’s probability value.
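The sketch below illustrates the training setup described above with a small convolutional network and a pixel-wise mean-squared-error objective (PyTorch); the architecture and toy data are placeholders rather than the model of this disclosure, and an isotonic-regression objective over temperature-range blocks could be substituted for the loss:

```python
import torch
import torch.nn as nn

# A small CNN maps a binary slice (1 channel) to a predicted thermal image,
# trained with a pixel-wise MSE objective against temperatures measured from
# the thermochromic dye.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, kernel_size=3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()  # pixel-wise temperature difference

# Toy batch: 8 slices of 64x64 voxels and matching measured thermal images.
slices = torch.randint(0, 2, (8, 1, 64, 64)).float()
measured = torch.rand(8, 1, 64, 64) * 140.0 + 60.0  # 60-200 C placeholder

for _ in range(100):
    optimizer.zero_grad()
    predicted = model(slices)
    loss = loss_fn(predicted, measured)  # reduce pixel-wise difference
    loss.backward()
    optimizer.step()
```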
  • the method 100 may include predicting, using the machine learning model, a thermal image based on geometrical data.
  • the apparatus may predict, using the machine learning model, a thermal image based on geometrical data after training.
  • the method 100 may include determining a thermal bleed threshold based on the temperature.
  • a thermal bleed threshold is a temperature at which thermal bleed occurs.
  • Thermal bleed is a phenomenon where powder outside of an object (e.g., non-target powder) sinters to an object. For instance, if powder outside of an object reaches the thermal bleed threshold during printing, the powder will stick to the object, which may result in a defect. With the use of thermochromic dye, the defect may bear a color mark that indicates the peak temperature experienced by the powder. An example of a defect due to thermal bleed is given in relation to Figure 7.
  • the color of the thermochromic dye and/or the corresponding temperature may be utilized for determining the thermal bleed threshold (e.g., temperature threshold) to avoid defects.
  • the apparatus may detect a temperature of the defect (e.g., aural region) of an object as described herein based on an optical image. For instance, the apparatus may map a color or colors of thermochromic dye on a defect to a temperature or temperatures.
  • a temperature indicated by the color of the thermochromic dye on the defect may be determined as the thermal bleed threshold.
  • a minimum temperature on the defect (indicated by a lightest thermochromic dye color on the defect, for instance) may be determined as the thermal bleed threshold.
  • the lowest temperature experienced by the thermochromic dye on the defect may be a thermal bleed threshold at which powder (e.g., non-target powder, powder outside of an object, etc.) may sinter to the object.
  • the thermal bleed threshold may be referred to as a scrapped object threshold and/or may be utilized as a scrapped object threshold. For instance, if the thermal bleed threshold is met or exceeded, a defect may occur on the object, which may cause the object to be scrapped (e.g., discarded).
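A minimal sketch of deriving the thermal bleed (scrapped object) threshold from a defect, assuming per-pixel temperatures already mapped from dye color (see the look-up sketch above) and a hypothetical defect mask; the lowest peak temperature on the defect is taken as the threshold:

```python
import numpy as np

# temps: per-pixel temperatures mapped from dye color; defect_mask: pixels
# where non-target powder sintered to the object. Both are placeholders here.
temps = np.random.uniform(60.0, 200.0, size=(64, 64))
defect_mask = np.zeros((64, 64), dtype=bool)
defect_mask[30:34, 30:34] = True

# The lightest dye color on the defect marks its lowest peak temperature,
# which is taken as the thermal bleed (scrapped object) threshold.
thermal_bleed_threshold = temps[defect_mask].min()
print(f"thermal bleed threshold ~ {thermal_bleed_threshold:.1f} C")
```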
  • the method 100 may include determining a packing margin based on the temperature.
  • a packing margin is a distance (e.g., minimum distance) between objects in a build.
  • the detected temperature(s) may indicate temperature in the aural region due to thermal diffusion over distance.
  • a packing margin may be determined as a distance (e.g., minimum distance) that prevents heating aural region powder beyond a threshold temperature.
  • the threshold may be the thermal bleed threshold described previously.
  • the apparatus may determine a packing margin as a distance that avoids meeting or exceeding the thermal bleed threshold.
  • the thermal bleed threshold may provide useful information for determining the packing margin.
  • the apparatus may utilize the machine learning model that is trained based on the temperature to predict an aural region temperature for an object in a build.
  • a predicted temperature in an aural region that meets or exceeds the threshold (e.g., thermal bleed threshold) may indicate a packing where an object is packed too closely to avoid producing a defect.
  • the apparatus may iteratively predict aural region temperature and increase a packing margin until the predicted aural region temperature does not meet or exceed the threshold (e.g., thermal bleed threshold). The distance at which the aural region temperature does not meet or exceed the threshold may be determined as the packing margin.
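A sketch of the iterative packing-margin search just described; `predict_peak_aural_temp` is a hypothetical stand-in for the trained machine learning model, and the falloff curve and threshold are illustrative only:

```python
# Widen the packing margin until the predicted aural-region peak temperature
# stays below the thermal bleed threshold.
def predict_peak_aural_temp(margin_mm: float) -> float:
    # Placeholder: diffusion-like falloff of peak temperature with distance.
    return 60.0 + 130.0 * 2.0 ** (-margin_mm / 1.5)

THRESHOLD_C = 120.0  # e.g., a thermal bleed threshold measured from dye

margin_mm = 0.5
while predict_peak_aural_temp(margin_mm) >= THRESHOLD_C:
    margin_mm += 0.5  # widen spacing between objects and retry

print(f"packing margin: {margin_mm} mm")
```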
  • In some examples, the thermochromic dye may indicate, for an extracted object (e.g., unpacked object), a powder to reclaim (e.g., reclaim for recycling in a subsequent build) and/or powder to discard (e.g., scrap).
  • the thermochromic dye on the powder may exhibit a color gradient, where powder closer to the object may be darker than powder away from the object.
  • the color gradient may be utilized to teach users and/or technicians which powder to reclaim and which powder to scrap (to ensure powder quality for a subsequent build, for instance) for a given geometry.
  • Figure 2A is a block diagram illustrating an example of engines 204 that may be utilized in accordance with some examples of the techniques described herein.
  • Figure 2A illustrates examples of engines 204 that may be utilized to train a machine learning model.
  • an engine or engines of the engines 204 described in relation to Figure 2A may be included in the apparatus 324 described in relation to Figure 3.
  • a function or functions described in relation to any of Figures 1-7 may be performed by an engine or engines described in relation to Figure 2A.
  • An engine or engines described in relation to Figure 2A may be included in a device or devices, in hardware (e.g., circuitry) and/or in a combination of hardware and instructions (e.g., processor and instructions).
  • the engines 204 described in relation to Figure 2A include a slicing engine 210, a machine learning model engine 206, a region determination engine 212, a temperature measurement engine 218, a printing control engine 213, and an optical image capture engine 215.
  • the engines 204 may be disposed on one device.
  • the engines 204 may be included in a 3D printer.
  • an engine(s) of the engines 204 may be disposed on different devices.
  • the machine learning model engine 206, the region determination engine 212, and/or the temperature measurement engine 218 may be included in a computer, while the printing control engine 213 and the optical image capture engine 215 may be included in a 3D printer.
  • instructions for the machine learning model engine 206, the region determination engine 212, and/or the temperature measurement engine 218 may be stored in memory 326 and executed by a processor 328 of the apparatus 324 described in Figure 3 in some examples.
  • a function or functions of the printing control engine 213 and/or the optical image capture engine 215 may be performed by another apparatus.
  • Geometrical data 202 may be obtained.
  • the geometrical data 202 may be received from another device and/or generated as described in relation to Figure 1.
  • the geometrical data 202 may include training data from a training dataset.
  • the geometrical data 202 may be provided to the slicing engine 210.
  • the slicing engine 210 may perform slicing based on the geometrical data 202.
  • slicing may include generating a slice or slices (e.g., 2D slice(s)) corresponding to the geometrical data 202.
  • an apparatus may slice the geometrical data 202 representing a build.
  • slicing may include generating a set of 2D slices corresponding to the build.
  • a slice is a portion or cross-section.
  • a build may be traversed along an axis (e.g., a vertical axis, z-axis, or other axis), where each slice represents a 2D cross section of the build.
  • slicing the model may include identifying a z-coordinate of a slice plane.
  • the z-coordinate of the slice plane can be used to traverse the build to identify a portion or portions of the build intercepted by the slice plane.
  • a slice or slices may be expressed as a binary image or binary images.
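A minimal sketch of slicing, assuming the build is represented as a binary occupancy grid rather than a mesh; traversing the grid along the z axis yields one binary slice image per plane:

```python
import numpy as np

# Placeholder (z, y, x) occupancy grid: a small box "object" inside powder.
build = np.zeros((10, 32, 32), dtype=np.uint8)
build[2:8, 10:22, 12:20] = 1  # hypothetical object occupancy

def slice_at(z: int) -> np.ndarray:
    """Return the binary cross-section of the build intercepted at plane z."""
    return build[z]

slices = [slice_at(z) for z in range(build.shape[0])]
print(len(slices), slices[4].sum())  # 10 slices; object area at z = 4
```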
  • the slice(s) may be provided to the region determination engine 212.
  • the region determination engine 212 may determine an aural region or regions based on the slice(s) provided by the slicing engine 210. For instance, the region determination engine 212 may determine aural regions of the slice(s) as regions next to (e.g., abutting) and within a distance (e.g., 1 mm, 2 mm, 5 mm, 10 mm, 1 cm, etc.) from an outer surface of an object or objects indicated in the slice(s). The region determination engine 212 may provide an indicator of the aural region(s) to the printing control engine 213, to the temperature measurement engine 218, and/or to the machine learning model engine 206.
  • the indicator may indicate an aural region(s) for thermochromic dye dispensing and/or thermal cage location(s).
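One plausible way to compute an aural region on a binary slice is morphological dilation of the object mask out to the chosen distance, then subtracting the object itself; the sketch below uses SciPy with illustrative voxel pitch and distance values:

```python
import numpy as np
from scipy.ndimage import binary_dilation

# Hypothetical object mask for one slice.
object_mask = np.zeros((32, 32), dtype=bool)
object_mask[10:22, 12:20] = True

voxel_pitch_mm = 0.5
aural_distance_mm = 2.0
iterations = int(round(aural_distance_mm / voxel_pitch_mm))

# Dilate the object out to the aural distance, then remove the object itself.
dilated = binary_dilation(object_mask, iterations=iterations)
aural_region = dilated & ~object_mask  # powder next to the object, within 2 mm
print(aural_region.sum(), "aural voxels in this slice")
```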
  • the region determination engine 212 may determine size and/or location to print a thermal cage or cages. In some examples, the size and/or location of a thermal cage(s) may be determined based on a received input. In some examples, the region determination engine 212 may determine the size and/or location of the thermal cage(s) to meet a criterion or criteria (e.g., volume, object to powder ratio, including a closest packing margin, etc.).
  • the printing control engine 213 may control dispensing of a thermochromic dye and/or agent(s) on material in a build volume.
  • the printing control engine 213 may control a print head or print heads and/or a carrier of a 3D printer to dispense the thermochromic dye(s) on a layer of a build volume.
  • the printing control engine 213 may utilize the indicator of the aural region(s) to dispense thermochromic dye.
  • the printing control engine 213 may control dispensing of the thermochromic dye on an aural region next to an object and/or in a thermal cage.
  • multiple thermochromic dyes may be dispensed.
  • thermochromic dye may be dispensed and a second thermochromic dye may be dispensed.
  • a second thermochromic dye may be dispensed from another (e.g., third, separate, etc.) print head.
  • the second thermochromic dye may exhibit a second range of temperature sensitivity that is different from a first range of temperature sensitivity of the first thermochromic dye. For instance, different ranges of temperature sensitivities may occupy different temperature ranges.
  • a first thermochromic dye may be utilized for a lower temperature range and a second thermochromic dye may be utilized for a higher temperature range.
  • the first thermochromic dye may be dispensed on aural regions and the second thermochromic dye may be dispensed on object regions.
  • the printing control engine 213 may print a thermal cage or thermal cages.
  • the printing control engine 213 may utilize the indicator(s) provided by the region determination engine 212 to apply agent at the thermal cage location(s).
  • the agent may cause sintering in the powder when heated during printing to form the thermal cage(s).
  • the optical image capture engine 215 may control optical image capture. For instance, the optical image capture engine 215 may control an optical image sensor and/or camera to capture video and/or a still frame of a layer, aural region, object, and/or thermal cage (after printing, for example). In some examples, the optical image capture engine 215 may capture optical image(s) after printing is completed and/or during an unpacking procedure. In some examples, the optical image capture engine 215 may control a light source or sources (e.g., white LED(s)) to illuminate the build volume and/or capture area (e.g., separate table, trolley, etc.) during image capture. The optical image capture engine 215 may provide an optical image or images of a layer, aural region, object, and/or thermal cage to the temperature measurement engine 218.
  • the temperature measurement engine 218 may measure a temperature of a layer, aural region, object, and/or thermal cage indicated by an optical image of the thermochromic dye. In some examples, measuring the temperature of the fusing layer(s) may be performed as described in relation to Figure 1. For instance, the temperature measurement engine 218 may map a color appearing in an optical image to a temperature to measure the temperature. In some examples, the temperature measurement engine 218 may utilize the indicator of the thermal image region to measure the temperature. For instance, the temperature measurement engine 218 may determine a temperature based on a region of the optical image corresponding to an aural region and/or thermal cage. The measured temperature or temperatures may be provided to the machine learning model engine 206.
  • the machine learning model engine 206 may train a machine learning model based on the measured temperature and/or the aural region(s). In some examples, the machine learning model engine 206 may train the machine learning model as described in relation to Figure 1. For instance, the weights of the machine learning model may be adjusted to reduce (e.g., minimize) a difference between a predicted temperature of the aural region and the temperature (e.g., peak temperature) measured from a corresponding region of the optical image.
  • the machine learning model may be trained to determine (e.g., predict, infer, etc.) a thermal image or images corresponding to a layer or layers of geometrical data 202 (e.g., slice(s)) as described in relation to Figure 1. For instance, the machine learning model may determine a thermal image of a layer or layers (e.g., buried layer(s)).
  • Figure 2B is a block diagram illustrating an example of engines 214 that may be utilized in accordance with some examples of the techniques described herein.
  • Figure 2B illustrates examples of engines 214 that may be utilized after training (e.g., during an inferencing or prediction stage) to predict a temperature(s) 222.
  • an engine or engines of the engines 214 described in relation to Figure 2B may be included in the apparatus 324 described in relation to Figure 3.
  • a function or functions described in relation to any of Figures 1-7 may be performed by an engine or engines described in relation to Figure 2B.
  • An engine or engines described in relation to Figure 2B may be included in a device or devices, in hardware (e.g., circuitry) and/or in a combination of hardware and instructions (e.g., processor and instructions).
  • the engines 214 described in relation to Figure 2B include a slicing engine 210 and a machine learning model engine 206.
  • the machine learning model engine 206 described in relation to Figure 2B may be an example of the machine learning model engine 206 described in relation to Figure 2A after training.
  • Geometrical data 203 may be obtained.
  • the geometrical data 203 may be received from another device and/or generated as described in relation to Figure 1.
  • the geometrical data 203 may include data for use in an inferencing stage.
  • the geometrical data 203 may be provided to the slicing engine 210.
  • the slicing engine 210 may perform slicing based on the geometrical data 203. For example, slicing may be performed as similarly described in relation to Figure 2A.
  • the slice(s) may be provided to the machine learning model engine 206.
  • the machine learning model engine 206 may determine (e.g., predict, infer, etc.) a thermal image or thermal images based on the geometrical data 203 (e.g., slice(s)).
  • the machine learning model engine 206 may include a machine learning model that is trained to determine (e.g., predict, infer, etc.) a thermal image or images corresponding to a layer or layers of geometrical data 203 as described in relation to Figure 1.
  • the machine learning model may determine a thermal image of a layer or layers (e.g., buried layer(s)).
  • the thermal image(s) may indicate a temperature(s) 222.
  • the thermal image(s) may indicate a temperature(s) 222 in an aural region(s).
  • Figure 3 is a block diagram of an example of an apparatus 324 that may be used in detecting temperatures.
  • the apparatus 324 may be a computing device, such as a personal computer, a server computer, a printer, a 3D printer, a smartphone, a tablet computer, etc.
  • the apparatus 324 may include and/or may be coupled to a processor 328, a communication interface 330, and/or a memory 326.
  • the apparatus 324 may be in communication with (e.g., coupled to, have a communication link with) an additive manufacturing device (e.g., a 3D printer).
  • the apparatus 324 may be an example of a 3D printer.
  • the apparatus 324 may include additional components (not shown) and/or some of the components described herein may be removed and/or modified without departing from the scope of the disclosure.
  • the apparatus 324 may be a 3D printer that includes an optical image sensor(s) (e.g., optical camera(s)) (not shown in Figure 3).
  • the processor 328 may be any of a central processing unit (CPU), a semiconductor-based microprocessor, graphics processing unit (GPU), field- programmable gate array (FPGA), an application-specific integrated circuit (ASIC), and/or other hardware device suitable for retrieval and execution of instructions stored in the memory 326.
  • the processor 328 may fetch, decode, and/or execute instructions stored on the memory 326.
  • the processor 328 may include an electronic circuit or circuits that include electronic components for performing a functionality or functionalities of the instructions.
  • the processor 328 may perform one, some, or all of the aspects, elements, techniques, etc., described in relation to one, some, or all of Figures 1-7.
  • the memory 326 is an electronic, magnetic, optical, and/or other physical storage device that contains or stores electronic information (e.g., instructions and/or data).
  • the memory 326 may be, for example, Random Access Memory (RAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and/or the like.
  • the memory 326 may be volatile and/or non-volatile memory, such as Dynamic Random Access Memory (DRAM), EEPROM, magnetoresistive random-access memory (MRAM), phase change RAM (PCRAM), memristor, flash memory, and/or the like.
  • the memory 326 may be a non-transitory tangible machine-readable storage medium, where the term “non-transitory” does not encompass transitory propagating signals.
  • the memory 326 may include multiple devices (e.g., a RAM card and a solid-state drive (SSD)).
  • the apparatus 324 may further include a communication interface 330 through which the processor 328 may communicate with an external device or devices (not shown), for instance, to receive and store the information pertaining to a build or builds (e.g., data for training and/or object printing).
  • the communication interface 330 may include hardware and/or machine-readable instructions to enable the processor 328 to communicate with the external device or devices.
  • the communication interface 330 may enable a wired or wireless connection to the external device or devices.
  • the communication interface 330 may further include a network interface card and/or may also include hardware and/or machine-readable instructions to enable the processor 328 to communicate with various input and/or output devices, such as a keyboard, a mouse, a display, another apparatus, electronic device, computing device, printer, etc.
  • a user may input instructions into the apparatus 324 via an input device.
  • the memory 326 may store image data 336.
  • the image data 336 may be generated (e.g., simulated, predicted, and/or inferred) and/or may be obtained (e.g., received, captured, etc.) from an optical image sensor(s).
  • the processor 328 may execute instructions (not shown in Figure 3) to obtain an optical image(s) of thermochromic dye of a layer, aural region, object, and/or thermal cage.
  • the apparatus 324 may include an optical image sensor(s), may be coupled to a remote optical image sensor(s), and/or may receive image data 336 (e.g., optical image(s)) from an (integrated and/or remote) optical image sensor(s).
  • the image data 336 may include a captured optical image(s).
  • a captured optical image may be an optical image of thermochromic dye on a layer, aural region, object, and/or thermal cage of a build volume.
  • a printer may eject thermochromic dye on a layer, aural region, object, and/or thermal cage using a first print head.
  • the first print head may be separate from a second print head to eject agent.
  • the captured optical image may depict the color of the thermochromic dye on the layer, aural region, object, and/or thermal cage.
  • the optical image sensor(s) may undergo a procedure(s) to overcome distortion introduced by the sensor(s).
  • the captured optical image(s) may be received from a separate camera (e.g., a separate camera operated by a user or technician).
  • the memory 326 may store temperature determination instructions 334.
  • the processor 328 may execute the temperature determination instructions 334 to determine a peak temperature of an aural region of a layer occurring after the layer is buried based on an optical image of thermochromic dye of the layer.
  • determining a temperature of an aural region based on an optical image of thermochromic dye may be performed as described in relation to Figure 1 and/or Figure 2A.
  • a peak temperature of the aural region may occur after the layer is buried due to thermal diffusion.
  • heat utilized to sinter an object may diffuse into the aural region around the object, where a peak temperature experienced by the powder in the aural region may occur after the layer of the aural region is buried.
  • the optical image of the thermochromic dye may be captured and/or utilized to determine the peak temperature based on the color of the thermochromic dye.
  • the memory 326 may store geometrical data 340.
  • the geometrical data 340 may include and/or indicate a model or models (e.g., 3D object model(s)).
  • the apparatus 324 may generate the geometrical data 340 and/or may receive the geometrical data 340 from another device.
  • the memory 326 may include slicing instructions (not shown in Figure 3).
  • the processor 328 may execute the slicing instructions to perform slicing on the 3D model data to produce a stack of 2D vector slices.
  • the processor 328 may execute machine learning model instructions 341 to predict a thermal image (e.g., aural region temperature) based on the geometrical data 340 (e.g., slices).
  • the thermal image may be stored as part of image data 336.
  • the machine learning model may be an example of the machine learning model described in relation to Figure 1, Figure 2A, and/or Figure 2B.
  • the memory 326 may store training instructions 342.
  • the processor 328 may execute the training instructions 342 to train the machine learning model based on temperature (e.g., the temperature determined by executing the temperature determination instructions 334).
  • the machine learning model may be trained as described in relation to Figure 1 and/or Figure 2A.
  • the machine learning model may be trained to predict a thermal image (e.g., aural region temperature) based on geometrical data 340.
  • the processor 328 may execute the machine learning model instructions 341 to predict a thermal image based on geometrical data 340.
  • the machine learning model may be utilized to predict temperature based on geometrical data 340 during an inferencing stage.
  • the memory 326 may store operation instructions 346.
  • the processor 328 may execute the operation instructions 346 to perform an operation based on temperature(s) (e.g., temperature(s) in an aural region) determined from an optical image and/or a thermal image (e.g., temperature(s) in an aural region) predicted by the machine learning model.
  • the processor 328 may execute the operation instructions 346 to utilize the temperature(s) to serve another device (e.g., printer controller). For instance, the processor 328 may print (e.g., control amount and/or location of agent(s) for) a layer or layers based on the temperature(s).
  • the processor 328 may drive model setting (e.g., the size of the stride) based on the temperature(s). In some examples, the processor 328 may perform offline print model tuning based on the temperature(s). In some examples, the processor 328 may send a message (e.g., alert, alarm, progress report, quality rating, etc.) based on the temperature(s). For instance, the temperature may indicate a probability that a defect in a printed object may occur (e.g., clogged hole(s), thermal bleed, etc.) and/or that powder will be degraded beyond a threshold quantity. The apparatus 324 may present and/or send a message indicating the potential defect and/or powder degradation.
  • the processor 328 may halt printing in a case that the temperature(s) indicate or indicates an issue (e.g., more than a threshold temperature). In some examples, the processor 328 may feed the temperature(s) for an upcoming layer to a thermal feedback controller to online-compensate contone maps for the upcoming layer.
  • the operation instructions 346 may include 3D printing instructions.
  • the processor 328 may execute the 3D printing instructions to print a 3D object or objects.
  • the 3D printing instructions may include instructions for controlling a device or devices (e.g., rollers, print heads, thermal projectors, and/or fuse lamps, etc.).
  • the 3D printing instructions may use a contone map or contone maps (stored as contone map data, for instance) to control a print head or heads to print an agent or agents in a location or locations specified by the contone map or maps.
  • the processor 328 may execute the 3D printing instructions to print a layer or layers.
  • the printing may be based on the temperature (e.g., temperature determined from an optical image and/or predicted thermal map).
  • a thermal projector (e.g., lamp) intensity may be adjusted to avoid heating an aural region beyond a thermal bleed threshold (in a case that the predicted thermal map or temperature indicates temperature beyond a threshold, for instance) or may be adjusted to increase heat (in a case that the predicted thermal map or temperature indicates that heat may be increased without exceeding a thermal bleed threshold, for instance).
  • the processor 328 may execute the operation instructions 346 to present a visualization or visualizations of the temperature(s) on a display and/or send the temperature(s) to another device (e.g., computing device, monitor, etc.).
  • the processor 328 may execute the operation instructions 346 to produce a visualization of a determined temperature or temperatures.
  • the processor 328 may label an optical image with the determined temperature(s) (indicated by thermochromic dye color, for instance). The optical image may be labeled in an aural region(s), for example.
  • the processor 328 may execute the operation instructions 346 to produce and/or present a visualization of a defect (e.g., thermal bleed).
  • the processor 328 may label a defect depicted in an optical image. For instance, the processor 328 may identify a defect based on a color of a thermochromic dye indicated in the image. In some examples, the processor 328 may detect the defect by detecting an area of the optical image with a pixel color (or range of colors, for instance) corresponding to the thermochromic dye. The processor 328 may label and/or mark the area (e.g., add an outline around the area, add a temperature label to the area, etc.) of the detected defect; a minimal code sketch of this detection appears after this list.
  • Figure 4 is a block diagram illustrating an example of a computer-readable medium 448 for temperature detection.
  • the computer-readable medium 448 is a non-transitory, tangible computer-readable medium.
  • the computer-readable medium 448 may be, for example, RAM, EEPROM, a storage device, an optical disc, and the like.
  • the computer-readable medium 448 may be volatile and/or non-volatile memory, such as DRAM, EEPROM, MRAM, PCRAM, memristor, flash memory, and the like.
  • the memory 326 described in relation to Figure 3 may be an example of the computer-readable medium 448 described in relation to Figure 4.
  • the computer-readable medium 448 may include code, instructions, and/or data to cause a processor to perform one, some, or all of the operations, aspects, elements, etc., described in relation to one, some, or all of Figure 1, Figure 2, Figure 3, Figure 5, Figure 6, and/or Figure 7.
  • the computer-readable medium 448 may include data (e.g., information and/or instructions).
  • the computer-readable medium 448 may include optical image processing calibration instructions 450 and/or temperature measurement instructions 452.
  • the optical image processing calibration instructions 450 may be instructions that, when executed, cause a processor of an electronic device to calibrate optical image processing based on a first optical image of color swatches captured by a camera. In some examples, calibrating optical image processing may be performed as described in relation to Figure 1 and/or Figure 5.
  • the temperature measurement instructions 452 may be instructions that, when executed, cause a processor of an electronic device to measure a temperature of a previously buried layer indicated by a second optical image of thermochromic dye captured by the camera based on the calibrated optical image processing.
  • measuring a temperature of a previously buried layer (e.g., layer, aural region, object, and/or thermal cage) indicated by a second optical image of thermochromic dye based on the calibrated optical image processing may be performed as described in relation to Figure 1.
  • the computer-readable medium 448 may include thermal image prediction instructions (not shown in Figure 4).
  • the thermal image prediction instructions may be instructions that, when executed, cause a processor of an electronic device to predict a thermal image based on geometrical data.
  • predicting a thermal image based on geometrical data may be performed as described in relation to Figure 1, Figure 2B, and/or Figure 3.
  • a processor may execute a machine learning model and may predict a thermal image based on the geometrical data.
  • the machine learning model may be trained based on the measured temperature(s).
  • the computer-readable medium 448 may include thermal bleed threshold determination instructions (not shown in Figure 4).
  • the thermal bleed threshold determination instructions may be instructions that, when executed, cause a processor of an electronic device to determine a thermal bleed threshold based on the temperature.
  • thermal bleed threshold determination may be performed as described in relation to Figure 1.
  • the thermochromic dye may be utilized to determine the thermal bleed threshold (e.g., temperature level(s)) at which an object may be scrapped as a result of thermal bleed.
  • the computer-readable medium 448 may include packing margin determination instructions (not shown in Figure 4).
  • the packing margin determination instructions may be instructions that, when executed, cause a processor of an electronic device to determine a packing margin based on the temperature. In some examples, packing margin determination may be performed as described in relation to Figure 1.
  • Figure 5 is a diagram illustrating an example of optical image processing calibration 556. As illustrated in Figure 5, an image sensor 558 (e.g., color camera) with a white LED light source may be utilized to capture an optical image of color swatches 560a-h. The color swatches 560a-h may be swatches of different colors and/or color shades (e.g., shades of magenta).
  • color swatches (e.g., color swatches 560a-h and other identical color swatches) may be placed in different locations of a build volume (e.g., build bed).
  • An optical image of the color swatches may be captured from the build volume.
  • the optical image of the color swatches may be utilized to calibrate optical image processing to increase color accuracy.
  • optical image processing may be calibrated as described in relation to Figure 1.
  • the increased color accuracy may be utilized to increase temperature measurement accuracy based on optical images.
  • Figure 6 is a diagram illustrating an example of a build volume 660 including a thermal cage 662 in accordance with some examples of the techniques described herein.
  • the build volume 660 includes a plurality of objects 666 for printing.
  • a thermal cage 662 may be printed with the objects.
  • Thermochromic dye 664 may be dispensed on the thermal cage 662 (e.g., on a layer of a top wall of the thermal cage 662).
  • several of the objects 666 protrude through the top wall of the thermal cage 662.
  • the color of the thermochromic dye 664 in aural regions around objects protruding through the thermal cage 662 may be utilized to detect a temperature or temperatures in the aural regions.
  • an optical image of the thermal cage 662 may be captured after printing and unpacking (e.g., partial unpacking or complete unpacking).
  • an apparatus may utilize the optical image to detect the temperature(s) in the aural region(s).
  • the temperature(s) may be utilized to train a machine learning model to predict temperature (e.g., aural region temperature, a thermal image, etc.), utilized to determine a thermal bleed threshold, and/or utilized to determine a packing margin.
  • an apparatus may print an object or objects based on the trained machine learning model, the thermal bleed threshold, and/or the packing margin.
  • the packing margin may be utilized in a packing technique to pack objects with the packing margin between nearest points of neighboring objects.
  • Figure 7 is a diagram illustrating an example of a printed object 768 with a defect 770.
  • thermochromic dye may be dispensed in an aural region of an object.
  • the thermochromic dye may provide a color-coded indication of a peak temperature or temperatures experienced around an object.
  • a defect 770 of the object may include a color mark that indicates the peak temperature experienced by the powder where the defect 770 occurred.
  • an apparatus may utilize the color of the defect 770 to determine the temperature of the defect 770.
  • the temperature may be utilized to determine a thermal bleed threshold (and/or scrapping threshold) to avoid defects in a subsequent build(s). For instance, heat application may be reduced and/or agent (e.g., fusing agent and/or detailing agent) placement may be controlled to avoid meeting or exceeding the thermal bleed threshold.
  • the term “and/or” may mean an item or items.
  • the phrase “A, B, and/or C” may mean any of: A (without B and C), B (without A and C), C (without A and B), A and B (without C), B and C (without A), A and C (without B), or all of A, B, and C.
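As a concrete illustration of the color-based defect detection noted in the bullet on labeling defects above, the following is a minimal Python sketch. The function name, the per-channel RGB bounds, and the use of NumPy are illustrative assumptions, not part of this disclosure.

```python
import numpy as np

def detect_defect_area(optical_image, dye_rgb_min, dye_rgb_max):
    """Find pixels whose color falls within a thermochromic dye color range.

    optical_image: (H, W, 3) uint8 RGB array.
    dye_rgb_min, dye_rgb_max: per-channel color bounds for the dye (assumed).
    Returns a boolean mask and a bounding box (x0, y0, x1, y1) to outline/label.
    """
    mask = np.all(
        (optical_image >= np.asarray(dye_rgb_min)) &
        (optical_image <= np.asarray(dye_rgb_max)),
        axis=-1,
    )
    if not mask.any():
        return mask, None
    ys, xs = np.nonzero(mask)
    return mask, (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))
```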

Landscapes

  • Chemical & Material Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Materials Engineering (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Analytical Chemistry (AREA)
  • Automation & Control Theory (AREA)

Abstract

Examples of methods are described. In some examples, a method includes dispensing, during a printing procedure, a thermochromic dye on a layer of material in a build volume. In some examples, the method includes detecting, after the printing procedure, a temperature in an aural region of an object in the build volume based on a color of the thermochromic dye.

Description

TEMPERATURE DETECTIONS
BACKGROUND
[0001] Three-dimensional (3D) solid parts may be produced from a digital model using additive manufacturing. Additive manufacturing may be used in rapid prototyping, mold generation, mold master generation, and short-run manufacturing. Additive manufacturing involves the application of successive layers of build material. In some additive manufacturing techniques, the build material may be cured or fused.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Figure 1 is a flow diagram illustrating an example of a method for temperature detection;
[0003] Figure 2A is a block diagram illustrating an example of engines that may be utilized in accordance with some examples of the techniques described herein;
[0004] Figure 2B is a block diagram illustrating an example of engines that may be utilized in accordance with some examples of the techniques described herein;
[0005] Figure 3 is a block diagram of an example of an apparatus that may be used in detecting temperatures;
[0006] Figure 4 is a block diagram illustrating an example of a computer-readable medium for temperature detection;
[0007] Figure 5 is a diagram illustrating an example of optical image processing calibration;
[0008] Figure 6 is a diagram illustrating an example of a build volume including a thermal cage in accordance with some examples of the techniques described herein; and
[0009] Figure 7 is a diagram illustrating an example of a printed object with a defect.
DETAILED DESCRIPTION
[0010] Additive manufacturing may be used to manufacture three- dimensional (3D) objects. 3D printing is an example of additive manufacturing. Some examples of 3D printing may selectively deposit an agent or agents (e.g., droplets) at a pixel level to enable control over voxel-level energy deposition. For instance, thermal energy may be projected over material in a build area, where a phase change (for example, melting and solidification) in the material may occur depending on the voxels where the agents are deposited.
[0011] A voxel is a representation of a location in a 3D space. For example, a voxel may represent a volume or component of a 3D space. For instance, a voxel may represent a volume that is a subset of the 3D space. In some examples, voxels may be arranged on a 3D grid. For instance, a voxel may be rectangular or cubic in shape. In some examples, voxels may be arranged along axes. An example of three-dimensional (3D) axes includes an x dimension, a y dimension, and a z dimension. In some examples, a quantity in the x dimension may be referred to as a width, a quantity in the y dimension may be referred to as a length, and/or a quantity in the z dimension may be referred to as a height. The x and/or y axes may be referred to as horizontal axes, and the z axis may be referred to as a vertical axis. Other orientations of the 3D axes may be utilized in some examples, and/or other definitions of 3D axes may be utilized in some examples.
[0012] Examples of a voxel size dimension may include 25.4 millimeters (mm)/150 ≈ 170 microns for 150 dots per inch (dpi), 490 microns for 50 dpi, 2 mm, etc. The term “voxel level” and variations thereof may refer to a resolution, scale, or density corresponding to voxel size. In some examples, the term “voxel” and variations thereof may refer to a “thermal voxel.” In some examples, the size of a thermal voxel may be defined as a minimum that is thermally meaningful (e.g., greater than or equal to 42 microns or 600 dots per inch (dpi)). A set of voxels may be utilized to represent a build volume.
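The dpi-to-voxel-size arithmetic above can be reproduced directly. A minimal sketch follows; the function name is illustrative, not from this disclosure.

```python
MM_PER_INCH = 25.4

def voxel_size_microns(dpi):
    """Voxel edge length in microns for a given print resolution in dpi."""
    return MM_PER_INCH / dpi * 1000.0

print(voxel_size_microns(150))  # ~169.3, i.e., the "approximately 170 microns" figure
print(voxel_size_microns(600))  # ~42.3, matching the thermally meaningful minimum above
```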
[0013] A build volume is a volume in which an object or objects may be manufactured. A “build” may refer to an instance of 3D manufacturing. A layer is a portion of a build volume (e.g., a build in the build volume). For example, a layer may be a cross section (e.g., two-dimensional (2D) cross section) or 3D portion (e.g., rectangular prism) of a build volume. In some examples, a layer may refer to a horizontal portion (e.g., plane) of a build volume. In some examples, an “object” may refer to an area and/or volume in a layer and/or build volume indicated for forming a physical object.
[0014] In some examples of 3D manufacturing (e.g., Multi Jet Fusion (MJF)), each voxel in the build volume may undergo a thermal procedure (approximately 15 hours of build time (e.g., time for layer-by-layer printing) and approximately 35 hours of additional cooling). The thermal procedure of voxels that include an object may affect the manufacturing quality (e.g., functional quality) of the object.
[0015] Factors affecting production yield in 3D printing may include packing density, powder recyclability, and manufacturing accuracy (e.g., degree of object defect(s)). Tighter packing may allow printing more objects in a single build. However, if the objects are packed too closely, the objects and/or powder may overheat, which may result in a reduction in reusable powder quality and increased object defects. Accordingly, objects may be packed to keep the build volume within a range of temperatures to preserve powder quality and/or reduce object defects.
[0016] Some thermal information (e.g., build volume wall temperatures alone, thermal sensor data alone, etc.) may provide a limited observation of temperatures occurring in a build volume. It may be difficult to directly observe peak temperature in the build volume, which may indicate which powder will undergo higher thermal stress. For example, a temperature of a buried layer may be different from a measured temperature (observed with a thermal sensor, for instance). In areas around objects (e.g., aural regions), peak temperatures may occur due to thermal diffusion after a layer is buried.
[0017] The thermal journey of each powder voxel may provide data to calculate the thermal stress each build exerts on the powder, and to predict potential temperature-related defects in objects. Each voxel may have an individual thermal history. It may be difficult to physically sense the temperature of every voxel during the thermal procedure. In some approaches, peak temperature may drive the cooling history of the voxels (given thermal diffusion and wall conditions, for instance). Accurate peak temperature data may enable calibrating thermal prediction procedures, may increase accuracy in powder degradation prediction, and/or may enable the prediction of defects such as clogged holes, thermal bleed, and/or hot-spot thermal bleed at a top (e.g., outer) surface.
[0018] Some examples of the techniques described herein may provide approaches for in-situ sensing of peak temperature for powder voxels using thermochromic dyes. In some examples, peak temperature data may be utilized to validate and/or increase the accuracy of voxel-level thermal simulation and/or prediction for different builds and/or printers. For instance, some of the techniques described herein may help to determine a peak temperature that each voxel in a build volume will experience as a result of the geometry being printed and the thermal signature of the printer in question. Some examples of the techniques described herein may provide accurate measurement and/or prediction of the thermal behavior in the build volume during printing. Some examples of the techniques described herein may utilize thermochromic dye to evaluate peak powder temperature for powder (e.g., a portion or portions of the build volume and/or across the full build volume). Some examples of the techniques described herein may utilize thermochromic dye data to calibrate a 3D printing simulation and/or prediction.
[0019] It may be useful to provide thermal information at or near print resolution (e.g., 75 dpi) for guiding the placement of an agent or agents (e.g., fusing agent, detailing agent, and/or other thermally relevant fluids). An example of print resolution is 42 microns in x-y dimensions and 80 microns in a z dimension.
[0020] In some examples, thermal information or thermal behavior may be mapped as a thermal image. A thermal image is a set of data indicating temperature(s) (or thermal energy) in an area. A thermal image may be sensed, captured, simulated, and/or predicted.
[0021] While plastics (e.g., polymers) may be utilized as a way to illustrate some of the approaches described herein, some of the techniques described herein may be utilized in various examples of additive manufacturing. For instance, some examples may be utilized for plastics, polymers, semi-crystalline materials, metals, etc. Some additive manufacturing techniques may be powder-based and driven by powder fusion. Some examples of the approaches described herein may be applied to area-based powder bed fusion-based additive manufacturing, such as Stereolithography (SLA), Multi Jet Fusion (MJF), Metal Jet Fusion, Selective Laser Melting (SLM), Selective Laser Sintering (SLS), liquid resin-based printing, etc. Some examples of the approaches described herein may be applied to additive manufacturing where agents carried by droplets are utilized for voxel-level thermal modulation.
[0022] In some examples, “powder” may indicate or correspond to particles. In some examples, an object may indicate or correspond to a location (e.g., area, space, etc.) where particles are to be sintered, melted, and/or solidified. For example, an object may be formed from sintered or melted powder.
[0023] Some examples of the techniques described herein may include machine learning. Machine learning is a technique where a machine learning model is trained to perform a task or tasks based on a set of examples (e.g., data). Training a machine learning model may include determining weights corresponding to structures of the machine learning model. Artificial neural networks are a kind of machine learning model that are structured with nodes, layers, and/or connections. Deep learning is a kind of machine learning that utilizes multiple layers. A deep neural network is a neural network that utilizes deep learning.
[0024] Examples of neural networks include regression networks (e.g., isotonic regression models), convolutional neural networks (CNNs) (e.g., basic CNN, deconvolutional neural network, inception module, residual neural network, etc.) and recurrent neural networks (RNNs) (e.g., basic RNN, multilayer RNN, bi-directional RNN, fused RNN, clockwork RNN, etc.). Different depths of a neural network or neural networks may be utilized in accordance with some examples of the techniques described herein.
[0025] Throughout the drawings, similar reference numbers may designate similar or identical elements. When an element is referred to without a reference number, this may refer to the element generally, with and/or without limitation to any particular drawing or figure. In some examples, the drawings are not to scale and/or the size of some parts may be exaggerated to more clearly illustrate the example shown. Moreover, the drawings provide examples in accordance with the description. However, the description is not limited to the examples provided in the drawings.
[0026] Figure 1 is a flow diagram illustrating an example of a method 100 for temperature detection. The method 100 and/or an element or elements of the method 100 may be performed by an electronic device. For example, the method 100 may be performed by the apparatus 324 described in relation to Figure 3.
[0027] The apparatus may dispense 102, during a printing procedure, a thermochromic dye on a layer of material in a build volume. Thermochromic dye is a substance that changes color based on heat exposure. For instance, the color of thermochromic dye may correspond to a peak temperature reached by the thermochromic dye. Examples of thermochromic dye may include K200-NH or Acid Blue 9. In some examples, dispensing 102 the thermochromic dye may include controlling a print head(s) and/or sending instructions to a printer to print (e.g., eject, extrude, etc.) the thermochromic dye onto a layer of a build volume. In some examples, a print head or print heads (e.g., a print head with a reservoir or “pen”) may be utilized to eject thermochromic dye. In some examples, the thermochromic dye may be ejected from a first print head that is separate from a second print head to eject agent (e.g., fusing agent, detailing agent, etc.). The thermochromic dye may be dispensed over a whole layer or over a region (e.g., subset) of the layer.
[0028] A fusing layer is an exposed layer, a top layer, or a layer undergoing fusing of material. For example, a fusing layer may be a top layer of material in a build volume that is exposed to a print head and/or thermal projector for sintering. A buried layer is a covered layer. For instance, a buried layer may be a layer beneath or under the fusing layer. A layer (e.g., fusing layer) may become a buried layer when a next layer of material is placed in the print bed.
[0029] In some examples, a thermochromic, water-based dye such as K200-NH may be ejected through a print head to any region of the layer (e.g., aural region(s) and/or region(s) where high temperatures may occur). In some examples, a thermochromic dye may have an operating range of 60-200° Celsius (C) and may change color from white to magenta. In some examples, some regions may also include agent (e.g., fusing agent and/or detailing agent), where the thermochromic dye may indicate the approximate temperature. In some examples, a calibration procedure may be performed to compensate for a cooling and/or heating effect the thermochromic dye may have on the powder.
[0030] In some examples, a thermochromic dye such as Acid Blue 9 may be utilized to indicate build volume temperatures. For materials lacking chromophoric degradation pathways, Acid Blue 9 may be ejected to any region of the layer such that the thermal degradation occurring over the course of the build may be captured.
[0031] In some examples, the color of the thermochromic dye may indicate a peak temperature experienced by the thermochromic dye. For instance, the thermochromic dye may change color according to the peak temperature experienced by the thermochromic dye during and/or after printing.
[0032] In some examples, dispensing 102 the thermochromic dye may include dispensing the thermochromic dye in an aural region relative to an object in the build volume. An aural region is a region next to (e.g., adjacent to, bordering, and/or abutting, etc.) an object. For instance, an aural region may be an area next to an object that extends up to a distance from the object (e.g., a distance from the object surface in a normal or perpendicular direction from the object surface). In some examples, the aural region may extend to less than the distance in a case that another object is less than the distance from the object’s surface. In some examples, the thermochromic dye in the aural region may experience heat due to thermal diffusion from an object being printed and/or a printed object. For instance, the thermochromic dye in the aural region may be heated by thermal energy diffusing from a sintering region (e.g., the object).
[0033] In some examples, the method 100 may include printing a thermal cage in the build volume. The thermochromic dye may be dispensed in and/or on a volume of the thermal cage (e.g., in a volume contained by the thermal cage). A thermal cage is a structure that may partially or completely contain an object or objects and/or powder. For instance, a thermal cage (e.g., box, container, shape, etc.) may be an object that is printed in the build volume. In some examples, a thermal cage may have relatively low thermal mass. Having a relatively low thermal mass may mean that the thermal cage does not produce a significant amount of heat by diffusion and/or that the thermal cage has a nonsignificant impact on the temperature history of a region or regions around and/or within the thermal cage. A thermal cage or thermal cages may be printed at a region or regions of interest in the build volume. An example of a thermal cage is given in relation to Figure 6. In some examples, a thermal cage may be utilized to provide a stable volume where the thermochromic dye may be dispensed. For instance, the thermochromic dye may be dispensed inside a thermal cage, which may allow observation of (e.g., image capture of) the color(s) of an aural region or regions around objects. In some examples, a thermal cage may be utilized to map the location of detected temperature (based on thermochromic dye color) to the build volume. In some examples, a thermal cage may be utilized to extract a section of the build volume for external analysis of the temperature distribution. In some cases, a thermal cage may be utilized to avoid losing dyed powder during unpacking (e.g., de-caking, extraction, reclamation, etc.). In some examples, a thermal cage may be extracted manually.
[0034] The apparatus may detect 104, after the printing procedure, a temperature in an aural region of an object in the build volume based on a color of the thermochromic dye. For example, the apparatus may capture and/or receive an optical image of the layer, aural region, object, and/or thermal cage. The optical image may be captured by a camera with a view that includes the layer and/or aural region. For instance, the camera may capture an optical image depicting a layer, aural region, object, and/or thermal cage. In some examples, the optical image may be captured during and/or after unpacking. Unpacking is a procedure where material is removed from the build volume and/or an object or objects are removed from the build volume. For instance, material (e.g., a layer or layers of material) may be removed through vacuuming, scooping, shearing, blowing, and/or otherwise removing material. In some examples, unpacking may be performed manually by a user or technician. In some examples, unpacking may be performed automatically by a robot and/or other machine.
[0035] In some examples, the camera may be included in a device, mounted on a device, and/or may be separate from a device. For instance, the camera may be mounted in a printer or trolley above the build volume. In some examples, the camera may be manually operated by a user or technician.
[0036] In some examples, the layer, aural region, object, and/or thermal cage (and/or build volume, image capture area, etc.) may be illuminated with a white light source (e.g., white light-emitting diode (LED)). For instance, the camera and/or device (e.g., printer, trolley, etc.) may include a white LED that may illuminate the layer, aural region, object, and/or thermal cage when the optical image is captured. In some examples, the camera may capture a still image or images and/or video frames. In some examples, the optical image may be a still image or video frame. In some examples, the optical image may depict a whole layer or a region of the layer.
[0037] The optical image may indicate a color or colors of the thermochromic dye on the layer, aural region, object, and/or thermal cage. In some examples, the apparatus may measure the temperature of the layer, aural region, object, and/or thermal cage by determining a temperature corresponding to a color in the optical image. In some examples, detecting 104 the temperature may include mapping a color of the thermochromic dye from a captured image to the temperature. For instance, the apparatus may utilize a look-up table or function to map pixel color from the optical image to temperature. For instance, different pixel colors and/or shades may correspond to different temperatures (e.g., peak temperatures) experienced by the thermochromic dye. The apparatus may measure the temperature of the fusing layer by mapping a color in the optical image to a corresponding temperature. In some examples, the apparatus may assign and/or record the temperature or temperatures corresponding to a pixel or sets of pixels (e.g., areas of the optical images with the same color or within a color range).
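As an illustration of the look-up-table mapping from thermochromic dye color to temperature described above, here is a minimal Python sketch. The RGB/temperature pairs and the nearest-color strategy are hypothetical placeholders; a real table would come from calibration against dye swatches held at known temperatures.

```python
import numpy as np

# Hypothetical calibration entries: dye RGB color -> peak temperature in °C.
DYE_LUT = {
    (250, 245, 248): 140.0,  # near-white: lower peak temperature
    (235, 160, 200): 160.0,  # light magenta
    (210, 60, 150): 180.0,   # deep magenta: higher peak temperature
}

def pixel_to_temperature(rgb):
    """Map one pixel color to a temperature via the nearest LUT color."""
    colors = np.array(list(DYE_LUT), dtype=float)
    temps = np.array(list(DYE_LUT.values()))
    nearest = np.argmin(np.linalg.norm(colors - np.asarray(rgb, dtype=float), axis=1))
    return float(temps[nearest])

print(pixel_to_temperature((236, 158, 199)))  # -> 160.0
```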
[0038] In some examples, the apparatus may utilize a spatial mapping between the optical image and voxels of the build volume. For example, the apparatus may apply a transformation or transformations (e.g., unprojection) to the pixels in the optical image to map the color(s) and/or corresponding temperature(s) to the voxels of the layer, aural region, object, and/or thermal cage (e.g., locations in 3D space). In some examples, the spatial mapping may be based on a marker or structure of the layer, aural region, object, and/or thermal cage. For instance, a structure (e.g., text, number, pattern, geometry, etc.) of the layer, aural region, object, and/or thermal cage may be utilized to spatially map the layer, aural region, object, and/or thermal cage appearing in the image to a location (e.g., position and/or orientation) of the corresponding build. The spatial mapping may indicate the approximate original location(s) (e.g., voxels) of colors and/or temperatures of the layer, aural region, object, and/or thermal cage in the build volume.
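One way the unprojection from image pixels to build-volume locations could be realized is with a planar homography fitted from markers of known position. The sketch below assumes OpenCV is available; the fiducial correspondences and bed dimensions are purely illustrative values, not from this disclosure.

```python
import numpy as np
import cv2  # OpenCV (assumed available)

# Four fiducial points seen in the optical image (pixels) and their known
# positions on the build bed (mm). Values here are illustrative only.
img_pts = np.float32([[102, 88], [1820, 95], [1812, 1410], [110, 1402]])
bed_pts = np.float32([[0, 0], [380, 0], [380, 284], [0, 284]])

H = cv2.getPerspectiveTransform(img_pts, bed_pts)  # image -> bed homography

def pixel_to_bed(px, py):
    """Map an image pixel to (x, y) build-bed coordinates in mm."""
    pt = cv2.perspectiveTransform(np.float32([[[px, py]]]), H)
    return tuple(pt[0, 0])
```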
[0039] In some examples, the apparatus may perform optical image processing. Optical image processing is an operation or operations to process optical data (e.g., pixels) of the optical image. Examples of optical image processing may include color compensation, white balance compensation, and/or lens distortion correction, etc. For instance, the apparatus may adjust the optical image to increase color accuracy of the optical image.
[0040] In some examples, the method 100 may include calibrating optical image processing based on color swatches associated with temperatures. A color swatch is an item with a sample color. For instance, color swatches of the same color (and/or corresponding to the same temperature of the thermochromic dye) may be placed at different locations within a camera’s field of view (of a build volume and/or capture area, for example). Due to environmental variations (e.g., lighting, etc.) and/or sensing variations (e.g., lens distortion, etc.), swatches with the same physical color may vary in the optical image (e.g., color may vary in different locations of the optical image) captured by the camera. In some examples, a camera may inaccurately sense the color of a swatch as a slightly different color in the optical image. The apparatus may calibrate (e.g., adjust) the optical image processing to reduce color sensing inaccuracy and/or spatial color variation between swatches of the same color. For instance, a color swatch may have a designated color and/or corresponding temperature. For instance, the apparatus may receive the designated color (e.g., red-green-blue (RGB) value) of a color swatch or swatches from an input device (e.g., keyboard, mouse, touchscreen, etc.). The apparatus may capture and/or receive an optical image that depicts the swatch or swatches. The apparatus may determine color compensation (e.g., a difference or bias) between the designated color and the captured color (and/or compensation between a temperature corresponding to the designated color and a temperature corresponding to the captured color). The apparatus may determine color compensation for spatial variation in color sensing.
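A minimal sketch of the swatch-based color compensation described above, assuming a simple per-channel bias model; the function names and the bias model itself are assumptions, since the disclosure does not fix a particular compensation scheme.

```python
import numpy as np

def color_compensation(designated_rgb, captured_rgb):
    """Per-channel bias between a swatch's designated color and the color
    the camera actually captured; subtracting it compensates later images."""
    return np.asarray(captured_rgb, dtype=float) - np.asarray(designated_rgb, dtype=float)

def compensate(image, bias):
    """Apply the calibration bias to an optical image (uint8 RGB)."""
    return np.clip(image.astype(float) - bias, 0, 255).astype(np.uint8)
```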
[0041] In some examples, calibrating the optical image processing may be performed before printing. For instance, the calibration (e.g., compensation) may be performed before printing a build. In some examples, the apparatus may capture and/or receive a calibration image of a color swatch or swatches and compensate the optical image processing before a build is printed, before thermochromic dye is dispensed, and/or before an optical image of a fusing layer with thermochromic dye is captured. For instance, the calibration (e.g., compensation) may be applied to an optical image processing pipeline, such that compensation is automatically applied during processing of an optical image.
[0042] In some examples, calibrating the optical image processing may be performed after printing. For instance, the calibration (e.g., compensation, post-print correction, etc.) may be performed after printing a build. In some examples, the apparatus may capture and/or receive a calibration image of a color swatch or swatches and compensate the optical image processing after a build is printed, after thermochromic dye is dispensed, and/or after an optical image of a fusing layer with thermochromic dye is captured. For instance, the calibration (e.g., compensation) may be applied as compensation to a previously captured optical image and/or temperature. In some examples, the apparatus may apply the determined compensation (e.g., color compensation) to an optical image to increase color accuracy in the image, which may increase temperature measurement accuracy. In some examples, the apparatus may apply the determined temperature compensation to a temperature determined from the optical image to increase temperature measurement accuracy.
[0043] In some examples, optical image processing calibration may be performed for multiple colors and/or color shades. In some examples, optical image processing calibration may be performed in accordance with the example described in relation to Figure 5.
[0044] In some examples, the method 100 may include training a machine learning model based on the temperature. For example, a machine learning model may be trained to predict the thermal behavior (e.g., temperature, a thermal image, etc.) of 3D additive manufacturing. In some examples, the machine learning model may be trained to predict the thermal behavior of a buried layer or layers. In some examples, the temperature detected from the thermochromic dye may be utilized as ground truth data during training. For instance, the detected temperature may be utilized to calibrate (e.g., compensate, adjust, etc.) predicted thermal behavior to increase accuracy.
[0045] In some examples, the machine learning model may utilize geometrical data as input to predict thermal behavior. Geometrical data is data indicating a geometrical model or models of an object or objects. For example, geometrical data may indicate the placement and/or model of an object or objects in a build volume. A model may specify shape and/or size of a 3D object or objects. In some examples, a model may be expressed using polygon meshes and/or coordinate points. For example, a model may be defined using a format or formats such as a 3D manufacturing format (3MF) file format, an object (OBJ) file format, computer aided design (CAD) file, and/or a stereolithography (STL) file format, etc. In some examples, the geometrical data indicating a model or models may be received from another device and/or generated. For instance, the apparatus may receive a file or files of geometrical data and/or may generate a file or files of geometrical data. In some examples, the apparatus may generate geometrical data with model(s) created on the apparatus from an input or inputs (e.g., scanned object input, user-specified input, etc.). Examples of geometrical data include model data, shape image(s), slice(s), contone map(s), etc.
[0046] In some examples, the machine learning model may be trained to determine (e.g., predict, infer, etc.) a thermal image or images corresponding to a layer or layers of geometrical data. For instance, the machine learning model may be trained using layers (e.g., slices) of geometrical data as input data and detected temperature(s) (at corresponding locations, for example) as ground truth data. After training, the machine learning model may predict or infer the thermal behavior (e.g., thermal image(s), temperature(s), etc.) in a build volume.
[0047] The apparatus may utilize a training objective function to train the machine learning model. In some examples, a training objective function may be utilized that reduces (e.g., minimizes) a pixel-wise difference. For instance, the training objective function may reduce (e.g., minimize) a pixel-wise temperature difference between the predicted temperatures and the temperatures measured from a corresponding region of the optical image. In some examples, the apparatus may train the machine learning model with an isotonic regression model, where the temperature distribution value for input and ground truth may be divided into temperature range blocks with probabilities of pixels falling into a temperature range. For instance, the isotonic regression model may be utilized as a training objective function to reduce (e.g., minimize) pair-wise differences for each temperature range’s probability value.
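For the pixel-wise objective described above, a minimal training-step sketch in PyTorch might look like the following. The model, optimizer, and tensor shapes are assumptions, and the isotonic-regression variant is omitted; this is a sketch, not the disclosure's implementation.

```python
import torch
import torch.nn.functional as F

def training_step(model, optimizer, slices, measured_temps):
    """One update minimizing the pixel-wise difference between predicted
    thermal images and temperatures measured from the optical image.

    slices: (N, 1, H, W) geometry input; measured_temps: (N, 1, H, W) ground
    truth from the thermochromic dye (names and shapes are assumptions).
    """
    optimizer.zero_grad()
    predicted = model(slices)                     # predicted thermal image
    loss = F.mse_loss(predicted, measured_temps)  # pixel-wise objective
    loss.backward()
    optimizer.step()
    return loss.item()
```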
[0048] In some examples, the method 100 may include predicting, using the machine learning model, a thermal image based on geometrical data. For instance, the apparatus may predict, using the machine learning model, a thermal image based on geometrical data after training.
[0049] In some examples, the method 100 may include determining a thermal bleed threshold based on the temperature. A thermal bleed threshold is a temperature at which thermal bleed occurs. Thermal bleed is a phenomenon where powder outside of an object (e.g., non-target powder) sinters to an object. For instance, if powder outside of an object reaches the thermal bleed threshold during printing, the powder will stick to the object, which may result in a defect. With the use of thermochromic dye, the defect may exhibit a color mark that indicates the peak temperature experienced by the powder. An example of a defect due to thermal bleed is given in relation to Figure 7. The color of the thermochromic dye and/or the corresponding temperature may be utilized for determining the thermal bleed threshold (e.g., temperature threshold) to avoid defects. In some examples, the apparatus may detect a temperature of the defect (e.g., aural region) of an object as described herein based on an optical image. For instance, the apparatus may map a color or colors of thermochromic dye on a defect to a temperature or temperatures. In some examples, a temperature indicated by the color of the thermochromic dye on the defect may be determined as the thermal bleed threshold. In some examples, a minimum temperature on the defect (indicated by a lightest thermochromic dye color on the defect, for instance) may be determined as the thermal bleed threshold. For instance, the lowest temperature experienced by the thermochromic dye on the defect may be a thermal bleed threshold at which powder (e.g., non-target powder, powder outside of an object, etc.) may sinter to the object. In some examples, the thermal bleed threshold may be referred to as a scrapped object threshold and/or may be utilized as a scrapped object threshold. For instance, if the thermal bleed threshold is met or exceeded, a defect may occur on the object, which may cause the object to be scrapped (e.g., discarded).
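Given a per-pixel temperature map recovered from the dye colors and a mask of the defect area, the minimum-temperature rule above reduces to a short helper; the names here are assumptions for illustration.

```python
import numpy as np

def thermal_bleed_threshold(defect_mask, temperature_map):
    """Lowest peak temperature on the defect (its lightest dye color), taken
    as the temperature at which non-target powder sintered to the object."""
    return float(np.min(temperature_map[defect_mask]))
```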
[0050] In some examples, the method 100 may include determining a packing margin based on the temperature. A packing margin is a distance (e.g., minimum distance) between objects in a build. The detected temperature(s) may indicate temperature in the aural region due to thermal diffusion over distance. A packing margin may be determined as a distance (e.g., minimum distance) that prevents heating aural region powder beyond a threshold temperature. In some examples, the threshold may be the thermal bleed threshold described previously. For example, the apparatus may determine a packing margin as a distance that avoids meeting or exceeding the thermal bleed threshold. For instance, the thermal bleed threshold may provide useful information for determining the packing margin.
[0051] In some examples, the apparatus may utilize the machine learning model that is trained based on the temperature to predict an aural region temperature for an object in a build. A predicted temperature in an aural region that meets or exceeds the threshold (e.g., thermal bleed threshold) may indicate a packing where an object is packed too closely, which may produce a defect. In some examples, the apparatus may iteratively predict aural region temperature and increase a packing margin until the predicted aural region temperature does not meet or exceed the threshold (e.g., thermal bleed threshold). The distance at which the aural region temperature does not meet or exceed the threshold may be determined as the packing margin.
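The iterative margin search described above might be sketched as follows. The callable, step sizes, and bounds are assumptions; in practice the prediction would wrap the trained machine learning model.

```python
def find_packing_margin(predict_aural_temp, bleed_threshold,
                        start_mm=1.0, step_mm=0.5, max_mm=20.0):
    """Increase the margin until the predicted aural-region temperature no
    longer meets or exceeds the thermal bleed threshold.

    predict_aural_temp: callable margin_mm -> predicted peak temperature
    (assumed to wrap the trained machine learning model).
    """
    margin = start_mm
    while margin <= max_mm:
        if predict_aural_temp(margin) < bleed_threshold:
            return margin
        margin += step_mm
    raise ValueError("no margin within range keeps the aural region "
                     "below the thermal bleed threshold")
```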
[0052] In some examples, an extracted object (e.g., unpacked object) may be utilized (before sand blasting, for instance) to indicate powder to reclaim (e.g., reclaim to recycling in a subsequent build) and/or powder to discard (e.g., scrap). For instance, the thermochromic dye on the powder may exhibit a color gradient, where powder closer to the object may be darker than powder away from the object. In some examples, the color gradient may be utilized to teach users and/or technicians which powder to reclaim and which powder to scrap (to ensure powder quality for a subsequent build, for instance) for a given geometry.
[0053] Figure 2A is a block diagram illustrating an example of engines 204 that may be utilized in accordance with some examples of the techniques described herein. For instance, Figure 2A illustrates examples of engines 204 that may be utilized to train a machine learning model. In some examples, an engine or engines of the engines 204 described in relation to Figure 2A may be included in the apparatus 324 described in relation to Figure 3. In some examples, a function or functions described in relation to any of Figures 1-7 may be performed by an engine or engines described in relation to Figure 2A. An engine or engines described in relation to Figure 2A may be included in a device or devices, in hardware (e.g., circuitry) and/or in a combination of hardware and instructions (e.g., processor and instructions). The engines 204 described in relation to Figure 2A include a slicing engine 210, a machine learning model engine 206, a region determination engine 212, a temperature measurement engine 218, a printing control engine 213, and an optical image capture engine 215. In some examples, the engines 204 may be disposed on one device. For instance, the engines 204 may be included in a 3D printer.
[0054] In some examples, an engine(s) of the engines 204 may be disposed on different devices. For instance, the machine learning model engine 206, the region determination engine 212, and/or the temperature measurement engine 218 may be included in a computer, while the printing control engine 213 and the optical image capture engine 215 may be included in a 3D printer. For instance, instructions for the machine learning model engine 206, the region determination engine 212, and/or the temperature measurement engine 218 may be stored in memory 326 and executed by a processor 328 of the apparatus 324 described in Figure 3 in some examples. In some examples, a function or functions of the printing control engine 213 and/or the optical image capture engine 215 may be performed by another apparatus.
[0055] Geometrical data 202 may be obtained. For example, the geometrical data 202 may be received from another device and/or generated as described in relation to Figure 1. In some examples, the geometrical data 202 may include training data from a training dataset. The geometrical data 202 may be provided to the slicing engine 210.
[0056] In some examples, the slicing engine 210 may perform slicing based on the geometrical data 202. For example, slicing may include generating a slice or slices (e.g., 2D slice(s)) corresponding to the geometrical data 202. For instance, an apparatus may slice the geometrical data 202 representing a build. In some examples, slicing may include generating a set of 2D slices corresponding to the build. A slice is a portion or cross-section. In some approaches, a build may be traversed along an axis (e.g., a vertical axis, z-axis, or other axis), where each slice represents a 2D cross section of the build. For example, slicing the model may include identifying a z-coordinate of a slice plane. The z-coordinate of the slice plane can be used to traverse the build to identify a portion or portions of the build intercepted by the slice plane. In some examples, a slice or slices may be expressed as a binary image or binary images. In some examples, the slice(s) may be provided to the region determination engine 212.
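A voxelized stand-in for the slice-plane traversal described above is sketched below: the mesh-intersection step is replaced by indexing a boolean occupancy grid, which directly yields the binary images mentioned in the paragraph. The names are assumptions.

```python
import numpy as np

def slice_build(occupancy):
    """Traverse a voxelized build along z, yielding each layer as a binary
    image (1 where the slice plane intercepts an object).

    occupancy: (Z, Y, X) boolean array; a voxelized stand-in for slicing a
    mesh with z slice planes.
    """
    for z in range(occupancy.shape[0]):
        yield occupancy[z].astype(np.uint8)
```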
[0057] The region determination engine 212 may determine an aural region or regions based on the slice(s) provided by the slicing engine 210. For instance, the region determination engine 212 may determine aural regions of the slice(s) as regions next to (e.g., abutting) and within a distance (e.g., 1 mm, 2 mm, 5 mm, 10 mm, 1 cm, etc.) from an outer surface of an object or objects indicated in the slice(s). The region determination engine 212 may provide an indicator of the aural region(s) to the printing control engine 213, to the temperature measurement engine 218, and/or to the machine learning model engine 206. For instance, the indicator may indicate an aural region(s) for thermochromic dye dispensing and/or thermal cage location(s). In some examples, the region determination engine 212 may determine size and/or location to print a thermal cage or cages. In some examples, the size and/or location of a thermal cage(s) may be determined based on a received input. In some examples, the region determination engine 212 may determine the size and/or location of the thermal cage(s) to meet a criterion or criteria (e.g., volume, object to powder ratio, including a closest packing margin, etc.).
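One way to realize the “within a distance of an object surface” rule on a binary slice is a Euclidean distance transform. The sketch below assumes SciPy; the function name and the millimeters-per-pixel parameter are illustrative.

```python
import numpy as np
from scipy import ndimage

def aural_region(slice_binary, distance_mm, mm_per_pixel):
    """Pixels outside the object but within `distance_mm` of its surface."""
    outside = slice_binary == 0
    # Distance (in pixels) from each background pixel to the nearest object pixel.
    dist = ndimage.distance_transform_edt(outside)
    return outside & (dist * mm_per_pixel <= distance_mm)
```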
[0058] The printing control engine 213 may control dispensing of a thermochromic dye and/or agent(s) on material in a build volume. For example, the printing control engine 213 may control a print head or print heads and/or a carrier of a 3D printer to dispense the thermochromic dye(s) on a layer of a build volume. In some examples, the printing control engine 213 may utilize the indicator of the aural region(s) to dispense thermochromic dye. For instance, the printing control engine 213 may control dispensing of the thermochromic dye on an aural region next to an object and/or in a thermal cage. In some examples, multiple thermochromic dyes may be dispensed. For instance, a first thermochromic dye may be dispensed and a second thermochromic dye may be dispensed. In some examples, a second thermochromic dye may be dispensed from another (e.g., third, separate, etc.) print head. The second thermochromic dye may exhibit a second range of temperature sensitivity that is different from a first range of temperature sensitivity of the first thermochromic dye. For instance, different ranges of temperature sensitivities may occupy different temperature ranges. In some examples, a first thermochromic dye may be utilized for a lower temperature range and a second thermochromic dye may be utilized for a higher temperature range. For instance, the first thermochromic dye may be dispensed on aural regions and the second thermochromic dye may be dispensed on object regions.
[0059] In some examples, the printing control engine 213 may print a thermal cage or thermal cages. For instance, the printing control engine 213 may utilize the indicator(s) provided by the region determination engine 212 to apply agent at the thermal cage location(s). The agent may cause sintering in the powder when heated during printing to form the thermal cage(s).
[0060] The optical image capture engine 215 may control optical image capture. For instance, the optical image capture engine 215 may control an optical image sensor and/or camera to capture video and/or a still frame of a layer, aural region, object, and/or thermal cage (after printing, for example). In some examples, the optical image capture engine 215 may capture optical image(s) after printing is completed and/or during an unpacking procedure. In some examples, the optical image capture engine 215 may control a light source or sources (e.g., white LED(s)) to illuminate the build volume and/or capture area (e.g., separate table, trolley, etc.) during image capture. The optical image capture engine 215 may provide an optical image or images of a layer, aural region, object, and/or thermal cage to the temperature measurement engine 218.
[0061] The temperature measurement engine 218 may measure a temperature of a layer, aural region, object, and/or thermal cage indicated by an optical image of the thermochromic dye. In some examples, measuring the temperature of the fusing layer(s) may be performed as described in relation to Figure 1. For instance, the temperature measurement engine 218 may map a color appearing in an optical image to a temperature to measure the temperature. In some examples, the temperature measurement engine 218 may utilize the indicator of the thermal image region to measure the temperature. For instance, the temperature measurement engine 218 may determine a temperature based on a region of the optical image corresponding to an aural region and/or thermal cage. The measured temperature or temperatures may be provided to the machine learning model engine 206.
[0062] The machine learning model engine 206 may train a machine learning model based on the measured temperature and/or the aural region(s). In some examples, the machine learning model engine 206 may train the machine learning model as described in relation to Figure 1. For instance, the weights of the machine learning model may be adjusted to reduce (e.g., minimize) a difference between a predicted temperature of the aural region and the temperature (e.g., peak temperature) measured from a corresponding region of the optical image.
[0063] The machine learning model engine 206 may be trained to determine (e.g., predict, infer, etc.) a thermal image or thermal images based on the geometrical data 202 (e.g., slice(s)). In some examples, the machine learning model engine 206 may be trained to determine (e.g., predict, infer, etc.) a thermal image or images corresponding to a layer or layers of geometrical data 202 (e.g., slice(s)) as described in relation to Figure 1. For instance, the machine learning model may determine a thermal image of a layer or layers (e.g., buried layer(s)).
[0064] Figure 2B is a block diagram illustrating an example of engines 214 that may be utilized in accordance with some examples of the techniques described herein. For instance, Figure 2B illustrates examples of engines 214 that may be utilized after training (e.g., during an inferencing or prediction stage) to predict a temperature(s) 222. In some examples, an engine or engines of the engines 214 described in relation to Figure 2B may be included in the apparatus 324 described in relation to Figure 3. In some examples, a function or functions described in relation to any of Figures 1-7 may be performed by an engine or engines described in relation to Figure 2B. An engine or engines described in relation to Figure 2B may be included in a device or devices, in hardware (e.g., circuitry) and/or in a combination of hardware and instructions (e.g., processor and instructions). The engines 214 described in relation to Figure 2B include a slicing engine 210 and a machine learning model engine 206. For instance, the machine learning model engine 206 described in relation to Figure 2B may be an example of the machine learning model engine 206 described in relation to Figure 2A after training.
[0065] Geometrical data 203 may be obtained. For example, the geometrical data 203 may be received from another device and/or generated as described in relation to Figure 1. In some examples, the geometrical data 203 may include data for use in an inferencing stage. The geometrical data 203 may be provided to the slicing engine 210.
[0066] In some examples, the slicing engine 210 may perform slicing based on the geometrical data 203. For example, slicing may be performed as similarly described in relation to Figure 2A. In some examples, the slice(s) may be provided to the machine learning model engine 206.
[0067] The machine learning model engine 206 may determine (e.g., predict, infer, etc.) a thermal image or thermal images based on the geometrical data 203 (e.g., slice(s)). In some examples, the machine learning model engine 206 may include a first machine learning model that is trained to determine (e.g., predict, infer, etc.) a thermal image or images corresponding to a layer or layers of geometrical data 203 as described in relation to Figure 1. For instance, the first machine learning model may determine a thermal image of a layer or layers (e.g., buried layer(s)). The thermal image(s) may indicate a temperature(s) 222. For instance, the thermal image(s) may indicate a temperature(s) 222 in an aural region(s).
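A minimal inference sketch, under the same PyTorch and tensor-shape assumptions as the training sketch above, might look like the following; all names are illustrative.

```python
import torch

@torch.no_grad()
def predict_thermal_images(model, slice_stack):
    """Predict a thermal image for each layer in a stack of 2D slices.

    slice_stack: (num_layers, 1, H, W) tensor of rasterized slices.
    Returns a (num_layers, 1, H, W) tensor of predicted thermal images.
    """
    model.eval()
    return model(slice_stack)

def aural_region_temperature(thermal_image, mask):
    """Read a predicted aural-region temperature out of one thermal image."""
    return thermal_image[0][mask].mean().item()
```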
[0068] Figure 3 is a block diagram of an example of an apparatus 324 that may be used in detecting temperatures. The apparatus 324 may be a computing device, such as a personal computer, a server computer, a printer, a 3D printer, a smartphone, a tablet computer, etc. The apparatus 324 may include and/or may be coupled to a processor 328, a communication interface 330, and/or a memory 326. In some examples, the apparatus 324 may be in communication with (e.g., coupled to, have a communication link with) an additive manufacturing device (e.g., a 3D printer). In some examples, the apparatus 324 may be an example of a 3D printer. The apparatus 324 may include additional components (not shown) and/or some of the components described herein may be removed and/or modified without departing from the scope of the disclosure. For instance, the apparatus 324 may be a 3D printer that includes an optical image sensor(s) (e.g., optical camera(s)) (not shown in Figure 3).
[0069] The processor 328 may be any of a central processing unit (CPU), a semiconductor-based microprocessor, a graphics processing unit (GPU), a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), and/or another hardware device suitable for retrieval and execution of instructions stored in the memory 326. The processor 328 may fetch, decode, and/or execute instructions stored on the memory 326. In some examples, the processor 328 may include an electronic circuit or circuits that include electronic components for performing a functionality or functionalities of the instructions. In some examples, the processor 328 may perform one, some, or all of the aspects, elements, techniques, etc., described in relation to one, some, or all of Figures 1-7.
[0070] The memory 326 is an electronic, magnetic, optical, and/or other physical storage device that contains or stores electronic information (e.g., instructions and/or data). The memory 326 may be, for example, Random Access Memory (RAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and/or the like. In some examples, the memory 326 may be volatile and/or non-volatile memory, such as Dynamic Random Access Memory (DRAM), EEPROM, magnetoresistive random-access memory (MRAM), phase change RAM (PCRAM), memristor, flash memory, and/or the like. In some examples, the memory 326 may be a non-transitory tangible machine-readable storage medium, where the term “non-transitory” does not encompass transitory propagating signals. In some examples, the memory 326 may include multiple devices (e.g., a RAM card and a solid-state drive (SSD)).
[0071] The apparatus 324 may further include a communication interface 330 through which the processor 328 may communicate with an external device or devices (not shown), for instance, to receive and store the information pertaining to a build or builds (e.g., data for training and/or object printing). The communication interface 330 may include hardware and/or machine-readable instructions to enable the processor 328 to communicate with the external device or devices. The communication interface 330 may enable a wired or wireless connection to the external device or devices. The communication interface 330 may further include a network interface card and/or may also include hardware and/or machine-readable instructions to enable the processor 328 to communicate with various input and/or output devices, such as a keyboard, a mouse, a display, another apparatus, electronic device, computing device, printer, etc. In some examples, a user may input instructions into the apparatus 324 via an input device.
[0072] In some examples, the memory 326 may store image data 336. The image data 336 may be generated (e.g., simulated, predicted, and/or inferred) and/or may be obtained (e.g., received, captured, etc.) from an optical image sensor(s). For example, the processor 328 may execute instructions (not shown in Figure 3) to obtain an optical image(s) of thermochromic dye of a layer, aural region, object, and/or thermal cage. In some examples, the apparatus 324 may include an optical image sensor(s), may be coupled to a remote optical image sensor(s), and/or may receive image data 336 (e.g., optical image(s)) from an (integrated and/or remote) optical image sensor(s).
[0073] In some examples, the image data 336 may include a captured optical image(s). For example, a captured optical image may be an optical image of thermochromic dye on a layer, aural region, object, and/or thermal cage of a build volume. For instance, a printer may eject thermochromic dye on a layer, aural region, object, and/or thermal cage using a first print head. In some examples, the first print head may be separate from a second print head to eject agent. The captured optical image may depict the color of the thermochromic dye on the layer, aural region, object, and/or thermal cage. In some examples, the optical image sensor(s) may undergo a procedure(s) to overcome distortion introduced by the sensor(s). Different types of sensing devices may be used in different examples. In some examples, the captured optical image(s) may be received from a separate camera (e.g., a separate camera operated by a user or technician).
[0074] The memory 326 may store temperature determination instructions 334. In some examples, the processor 328 may execute the temperature determination instructions 334 to determine a peak temperature of an aural region of a layer occurring after the layer is buried based on an optical image of thermochromic dye of the layer. In some examples, determining a temperature of an aural region based on an optical image of thermochromic dye may be performed as described in relation to Figure 1 and/or Figure 2A. For example, a peak temperature of the aural region may occur after the layer is buried due to thermal diffusion. For example, heat utilized to sinter an object may diffuse into the aural region around the object, where a peak temperature experienced by the powder in the aural region may occur after the layer of the aural region is buried. During or after unpacking, the optical image of the thermochromic dye may be captured and/or utilized to determine the peak temperature based on the color of the thermochromic dye.
[0075] In some examples, the memory 326 may store geometrical data 340. The geometrical data 340 may include and/or indicate a model or models (e.g., 3D object model(s)). The apparatus 324 may generate the geometrical data 340 and/or may receive the geometrical data 340 from another device. In some examples, the memory 326 may include slicing instructions (not shown in Figure 3). For example, the processor 328 may execute the slicing instructions to perform slicing on the 3D model data to produce a stack of 2D vector slices. In some examples, the processor 328 may execute machine learning model instructions 341 to predict a thermal image (e.g., aural region temperature) based on the geometrical data 340 (e.g., slices). In some examples, the thermal image may be stored as part of image data 336. The machine learning model may be an example of the machine learning model described in relation to Figure 1, Figure 2A, and/or Figure 2B.
[0076] In some examples, the memory 326 may store training instructions 342. The processor 328 may execute the training instructions 342 to train the machine learning model based on temperature (e.g., the temperature determined by executing the temperature determination instructions 334). In some examples, the machine learning model may be trained as described in relation to Figure 1 and/or Figure 2A. The machine learning model may be trained to predict a thermal image (e.g., aural region temperature) based on geometrical data 340. For instance, during an inferencing stage (e.g., after training), the processor 328 may execute the machine learning model instructions 341 to predict a thermal image based on geometrical data 340. In some examples, the machine learning model may be utilized to predict temperature based on geometrical data 340 during an inferencing stage.
[0077] The memory 326 may store operation instructions 346. In some examples, the processor 328 may execute the operation instructions 346 to perform an operation based on temperature(s) (e.g., temperature(s) in an aural region) determined from an optical image and/or a thermal image (e.g., temperature(s) in an aural region) predicted by the machine learning model. In some examples, the processor 328 may execute the operation instructions 346 to utilize the temperature(s) to serve another device (e.g., printer controller). For instance, the processor 328 may print (e.g., control the amount and/or location of agent(s) for) a layer or layers based on the temperature(s). In some examples, the processor 328 may drive a model setting (e.g., the size of the stride) based on the temperature(s). In some examples, the processor 328 may perform offline print model tuning based on the temperature(s). In some examples, the processor 328 may send a message (e.g., alert, alarm, progress report, quality rating, etc.) based on the temperature(s). For instance, the temperature may indicate a probability that a defect in a printed object may occur (e.g., clogged hole(s), thermal bleed, etc.) and/or that powder will be degraded beyond a threshold quantity. The apparatus 324 may present and/or send a message indicating the potential defect and/or powder degradation. In some examples, the processor 328 may halt printing in a case that the temperature(s) indicates an issue (e.g., more than a threshold temperature). In some examples, the processor 328 may feed the temperature(s) for an upcoming layer to a thermal feedback controller to compensate contone maps online for the upcoming layer.
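As one hypothetical way to organize such operations, the sketch below dispatches on a detected or predicted temperature; the controller and notifier interfaces are invented for illustration and do not correspond to any real printer API.

```python
def act_on_temperature(peak_temp_c, bleed_threshold_c, controller, notifier):
    """Dispatch example operations keyed on a detected or predicted temperature.

    `controller` and `notifier` are hypothetical interfaces standing in for a
    printer controller and a messaging component.
    """
    if peak_temp_c >= bleed_threshold_c:
        # Temperature indicates a likely defect (e.g., thermal bleed)
        notifier.send(f"Thermal bleed risk: peak {peak_temp_c:.1f} C")
        controller.halt()  # halt printing when an issue is indicated
    else:
        # Otherwise feed the temperature forward for contone-map compensation
        controller.compensate_contone_maps(peak_temp_c)
```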
[0078] In some examples, the operation instructions 346 may include 3D printing instructions. For instance, the processor 328 may execute the 3D printing instructions to print a 3D object or objects. In some examples, the 3D printing instructions may include instructions for controlling a device or devices (e.g., rollers, print heads, thermal projectors, and/or fuse lamps, etc.). For example, the 3D printing instructions may use a contone map or contone maps (stored as contone map data, for instance) to control a print head or heads to print an agent or agents in a location or locations specified by the contone map or maps. In some examples, the processor 328 may execute the 3D printing instructions to print a layer or layers. The printing (e.g., thermal projector control) may be based on the temperature (e.g., a temperature determined from an optical image and/or a predicted thermal map). For instance, a thermal projector (e.g., lamp) intensity may be reduced to avoid heating an aural region beyond a thermal bleed threshold (in a case that the predicted thermal map or temperature indicates a temperature beyond the threshold, for instance), or may be increased (in a case that the predicted thermal map or temperature indicates that heat may be increased without exceeding the thermal bleed threshold, for instance).
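A proportional adjustment of lamp intensity against the bleed threshold could look like the following sketch; the gain and safety-margin values are arbitrary placeholders, not values taken from this disclosure.

```python
def adjust_lamp_intensity(current_intensity, predicted_temp_c,
                          bleed_threshold_c, gain=0.01, margin_c=5.0):
    """Nudge lamp intensity so the aural region stays below the bleed threshold.

    Positive error (headroom below the target) raises intensity; negative
    error (too hot) lowers it. The result is clamped to be non-negative.
    """
    target_c = bleed_threshold_c - margin_c  # hold a safety margin below bleed
    error_c = target_c - predicted_temp_c
    return max(0.0, current_intensity * (1.0 + gain * error_c))
```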
[0079] In some examples, the processor 328 may execute the operation instructions 346 to present a visualization or visualizations of the temperature(s) on a display and/or send the temperature(s) to another device (e.g., computing device, monitor, etc.). In some examples, the processor 328 may execute the operation instructions 346 to produce a visualization of a determined temperature or temperatures. For instance, the processor 328 may label an optical image with the determined temperature(s) (indicated by thermochromic dye color, for instance). The optical image may be labeled in an aural region(s), for example. In some examples, the processor 328 may execute the operation instructions 346 to produce and/or present a visualization of a defect (e.g., thermal bleed). For example, the processor 328 may label a defect depicted in an optical image. For instance, the processor 328 may identify a defect based on a color of a thermochromic dye indicated in the image. In some examples, the processor 328 may detect the defect by detecting an area of the optical image with a pixel color (or range of colors, for instance) corresponding to the thermochromic dye. The processor 328 may label and/or mark the area (e.g., add an outline around the area, add a temperature label to the area, etc.) of the detected defect.
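A color-threshold defect labeler in the spirit of this paragraph might be sketched with OpenCV as follows; the BGR bounds would come from the dye calibration and are shown here only as placeholders.

```python
import cv2
import numpy as np

def label_defects(optical_image_bgr, lower_bgr, upper_bgr, temp_label):
    """Outline areas whose dye color falls in a range tied to a defect temperature."""
    mask = cv2.inRange(optical_image_bgr,
                       np.array(lower_bgr, dtype=np.uint8),
                       np.array(upper_bgr, dtype=np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    annotated = optical_image_bgr.copy()
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        cv2.rectangle(annotated, (x, y), (x + w, y + h), (0, 0, 255), 2)  # outline
        cv2.putText(annotated, temp_label, (x, max(12, y - 5)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)  # temperature label
    return annotated
```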
[0080] Figure 4 is a block diagram illustrating an example of a computer-readable medium 448 for temperature detection. The computer-readable medium 448 is a non-transitory, tangible computer-readable medium. The computer-readable medium 448 may be, for example, RAM, EEPROM, a storage device, an optical disc, and the like. In some examples, the computer-readable medium 448 may be volatile and/or non-volatile memory, such as DRAM, EEPROM, MRAM, PCRAM, memristor, flash memory, and the like. In some examples, the memory 326 described in relation to Figure 3 may be an example of the computer-readable medium 448 described in relation to Figure 4. In some examples, the computer-readable medium 448 may include code, instructions, and/or data to cause a processor to perform one, some, or all of the operations, aspects, elements, etc., described in relation to one, some, or all of Figure 1, Figure 2, Figure 3, Figure 5, Figure 6, and/or Figure 7.
[0081] The computer-readable medium 448 may include data (e.g., information and/or instructions). For example, the computer-readable medium 448 may include optical image processing calibration instructions 450 and/or temperature measurement instructions 452.
[0082] The optical image processing calibration instructions 450 may be instructions that, when executed, cause a processor of an electronic device to calibrate optical image processing based on a first optical image of color swatches captured by a camera. In some examples, calibrating optical image processing may be performed as described in relation to Figure 1 and/or Figure 5.

[0083] The temperature measurement instructions 452 may be instructions that, when executed, cause a processor of an electronic device to measure a temperature of a previously buried layer indicated by a second optical image of thermochromic dye captured by the camera based on the calibrated optical image processing. In some examples, measuring a temperature of a previously buried layer (e.g., layer, aural region, object, and/or thermal cage) indicated by a second optical image of thermochromic dye based on the calibrated optical image processing may be performed as described in relation to Figure 1.
[0084] In some examples, the computer-readable medium 448 may include thermal image prediction instructions (not shown in Figure 4). The thermal image prediction instructions may be instructions that, when executed, cause a processor of an electronic device to predict a thermal image based on geometrical data. In some examples, predicting a thermal image based on geometrical data may be performed as described in relation to Figure 1, Figure 2B, and/or Figure 3. For instance, a processor may execute a machine learning model and may predict a thermal image based on the geometrical data. The machine learning model may be trained based on the measured temperature(s).

[0085] In some examples, the computer-readable medium 448 may include thermal bleed threshold determination instructions (not shown in Figure 4). The thermal bleed threshold determination instructions may be instructions that, when executed, cause a processor of an electronic device to determine a thermal bleed threshold based on the temperature. In some examples, thermal bleed threshold determination may be performed as described in relation to Figure 1. In some examples, the thermochromic dye may be utilized to determine the thermal bleed threshold (e.g., temperature level(s)) at which an object may be scrapped as a result of thermal bleed.
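One simple heuristic for placing such a threshold, sketched below, splits the gap between the hottest region that printed cleanly and the coolest region that produced a bleed defect; this heuristic is an illustrative assumption, not a method prescribed by the disclosure.

```python
def estimate_bleed_threshold(clean_temps_c, defect_temps_c):
    """Place a thermal bleed threshold between clean and defective observations.

    clean_temps_c: dye-derived temperatures from regions without defects.
    defect_temps_c: dye-derived temperatures where thermal bleed occurred.
    """
    if not defect_temps_c:
        return None  # no observed defects -> nothing to update
    # Midpoint between the hottest clean region and the coolest defective one
    return 0.5 * (max(clean_temps_c) + min(defect_temps_c))
```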
[0086] In some examples, the computer-readable medium 448 may include packing margin determination instructions (not shown in Figure 4). The packing margin determination instructions may be instructions that, when executed, cause a processor of an electronic device to determine a packing margin based on the temperature. In some examples, packing margin determination may be performed as described in relation to Figure 1.

[0087] Figure 5 is a diagram illustrating an example of optical image processing calibration 556. As illustrated in Figure 5, an image sensor 558 (e.g., color camera) with a white LED light source may be utilized to capture an optical image of color swatches 560a-h. The color swatches 560a-h may be swatches of different colors and/or color shades (e.g., shades of magenta). Approximately identical color swatches may be placed in different locations. For instance, color swatches (e.g., color swatches 560a-h and other identical color swatches) may be placed in different locations in a build volume (e.g., build bed). An optical image of the color swatches may be captured from the build volume. The optical image of the color swatches may be utilized to calibrate optical image processing to increase color accuracy. For instance, optical image processing may be calibrated as described in relation to Figure 1. The increased color accuracy may be utilized to increase temperature measurement accuracy based on optical images.
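One common way to perform such a calibration, sketched below, fits an affine color-correction transform from the captured swatch colors to their known reference colors by least squares; the affine model is an assumption made for illustration.

```python
import numpy as np

def fit_color_correction(measured_rgb, reference_rgb):
    """Fit a 4x3 affine transform mapping captured swatch colors to reference colors.

    measured_rgb, reference_rgb: (N, 3) arrays of corresponding swatch colors.
    """
    measured = np.asarray(measured_rgb, dtype=float)
    reference = np.asarray(reference_rgb, dtype=float)
    A = np.hstack([measured, np.ones((measured.shape[0], 1))])  # add bias column
    transform, *_ = np.linalg.lstsq(A, reference, rcond=None)   # least squares fit
    return transform  # shape (4, 3)

def correct_color(rgb, transform):
    """Apply the fitted correction to one captured color."""
    return np.append(np.asarray(rgb, dtype=float), 1.0) @ transform
```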
[0088] Figure 6 is a diagram illustrating an example of a build volume 660 including a thermal cage 662 in accordance with some examples of the techniques described herein. In this example, the build volume 660 includes a plurality of objects 666 for printing. A thermal cage 662 may be printed with the objects. Thermochromic dye 664 may be dispensed on the thermal cage 662 (e.g., on a layer of a top wall of the thermal cage 662). In this example, several of the objects 666 protrude through the top wall of the thermal cage 662. The color of the thermochromic dye 664 in aural regions around objects protruding through the thermal cage 662 may be utilized to detect a temperature or temperatures in the aural regions. For instance, darker colors may correspond to higher temperatures. In some examples, an optical image of the thermal cage 662 may be captured after printing and unpacking (e.g., partial unpacking or complete unpacking). As described herein, an apparatus may utilize the optical image to detect the temperature(s) in the aural region(s). In some examples, the temperature(s) may be utilized to train a machine learning model to predict temperature (e.g., aural region temperature, a thermal image, etc.), utilized to determine a thermal bleed threshold, and/or utilized to determine a packing margin. In some examples, an apparatus may print an object or objects based on the trained machine learning model, the thermal bleed threshold, and/or the packing margin. For instance, the packing margin may be utilized in a packing technique to pack objects with the packing margin between nearest points of neighboring objects.
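As a sketch of how a packing margin might be derived from such measurements (an assumption made for illustration, not a method prescribed by the disclosure), one could take the smallest object-to-powder distance at which the dye-derived temperature stays at a safe level:

```python
def packing_margin_mm(temps_by_distance_mm, safe_temp_c):
    """Smallest sampled distance from an object where the aural-region
    temperature stays at or below a safe level.

    temps_by_distance_mm: {distance_mm: dye-derived peak temperature (deg C)}.
    """
    for distance_mm in sorted(temps_by_distance_mm):
        if temps_by_distance_mm[distance_mm] <= safe_temp_c:
            return distance_mm
    return max(temps_by_distance_mm)  # fall back to the largest sampled distance
```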
[0089] Figure 7 is a diagram illustrating an example of a printed object 768 with a defect 770. In accordance with some examples of the techniques described herein, thermochromic dye may be dispensed in an aural region of an object. The thermochromic dye may provide a color-coded indication of a peak temperature or temperatures experienced around an object. In the example of Figure 7, if thermal bleed occurs, a defect 770 of the object may include a color mark that indicates the peak temperature experienced by the powder where the defect 770 occurred. In some examples of the techniques described herein, an apparatus may utilize the color of the defect 770 to determine the temperature of the defect 770. The temperature may be utilized to determine a thermal bleed threshold (and/or scrapping threshold) to avoid defects in a subsequent build(s). For instance, heat application may be reduced and/or agent (e.g., fusing agent and/or detailing agent) placement may be controlled to avoid meeting or exceeding the thermal bleed threshold.
[0090] As used herein, the term “and/or” may mean an item or items. For example, the phrase “A, B, and/or C” may mean any of: A (without B and C), B (without A and C), C (without A and B), A and B (without C), B and C (without A), A and C (without B), or all of A, B, and C.
[0091] While various examples are described herein, the disclosure is not limited to the examples. Variations of the examples described herein may be within the scope of the disclosure. For example, aspects or elements of the examples described herein may be omitted or combined.

Claims

1. A method, comprising:
dispensing, during a printing procedure, a thermochromic dye on a layer of material in a build volume; and
detecting, after the printing procedure, a temperature in an aural region of an object in the build volume based on a color of the thermochromic dye.
2. The method of claim 1, wherein the color of the thermochromic dye indicates a peak temperature experienced by the thermochromic dye.

3. The method of claim 1, wherein detecting the temperature comprises mapping the color of the thermochromic dye from a captured image to the temperature.

4. The method of claim 1, further comprising calibrating optical image processing based on color swatches associated with temperatures.

5. The method of claim 1, further comprising training a machine learning model based on the temperature.

6. The method of claim 1, further comprising determining a packing margin based on the temperature.

7. The method of claim 1, further comprising determining a thermal bleed threshold based on the temperature.

8. The method of claim 1, wherein dispensing the thermochromic dye comprises dispensing the thermochromic dye in the aural region relative to the object in the build volume.

9. The method of claim 1, further comprising printing a thermal cage in the build volume, wherein the thermochromic dye is dispensed in or on a volume of the thermal cage.
10. An apparatus, comprising:
a memory; and
a processor coupled to the memory, wherein the processor is to:
determine a peak temperature of an aural region of a layer occurring after the layer is buried based on an optical image of thermochromic dye of the layer.
11. The apparatus of claim 10, wherein the thermochromic dye is ejected from a first print head that is separate from a second print head to eject agent.

12. The apparatus of claim 11, wherein a second thermochromic dye is ejected from a third print head, wherein the second thermochromic dye exhibits a second range of temperature sensitivity that is different from a first range of temperature sensitivity of the thermochromic dye.
13. A non-transitory tangible computer-readable medium comprising instructions that, when executed, cause a processor of an electronic device to:
calibrate optical image processing based on a first optical image of color swatches captured by a camera; and
measure a temperature of a previously buried layer indicated by a second optical image of thermochromic dye captured by the camera based on the calibrated optical image processing.
14. The non-transitory tangible computer-readable medium of claim 13, further comprising instructions that, when executed, cause the processor to determine a thermal bleed threshold based on the temperature.

15. The non-transitory tangible computer-readable medium of claim 14, further comprising instructions that, when executed, cause the processor to determine a packing margin based on the temperature.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2021/050313 (WO2023043434A1) 2021-09-14 2021-09-14 Temperature detections



Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130292881A1 (en) * 2012-05-04 2013-11-07 Robert J. Steiner Thermochromic build materials
WO2018199955A1 (en) * 2017-04-27 2018-11-01 Hewlett-Packard Development Company, L.P. Three-dimensional (3d) printing
WO2021045772A1 (en) * 2019-09-06 2021-03-11 Hewlett-Packard Development Company, L.P. Three-dimensional printing with thermochromic additives
CN112693120A (en) * 2021-01-07 2021-04-23 北京工业大学 Visual monitoring method for surface exposure 3D printing process



Legal Events

121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21957675; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 21957675; Country of ref document: EP; Kind code of ref document: A1)