WO2025106630A1 - Automated detection of packed cell volume - Google Patents
- Publication number
- WO2025106630A1 (PCT/US2024/055858)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- container
- controller
- sensor
- sample
- top surface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2518—Projection by scanning of the object
- G01B11/2522—Projection by scanning of the object the position of the object changing and being recorded
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01F—MEASURING VOLUME, VOLUME FLOW, MASS FLOW OR LIQUID LEVEL; METERING BY VOLUME
- G01F22/00—Methods or apparatus for measuring volume of fluids or fluent solid material, not otherwise provided for
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01F—MEASURING VOLUME, VOLUME FLOW, MASS FLOW OR LIQUID LEVEL; METERING BY VOLUME
- G01F23/00—Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm
- G01F23/22—Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm by measuring physical variables, other than linear dimensions, pressure or weight, dependent on the level to be measured, e.g. by difference of heat transfer of steam or water
- G01F23/28—Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm by measuring physical variables, other than linear dimensions, pressure or weight, dependent on the level to be measured, e.g. by difference of heat transfer of steam or water by measuring the variations of parameters of electromagnetic or acoustic waves applied directly to the liquid or fluent solid material
- G01F23/284—Electromagnetic waves
- G01F23/292—Light, e.g. infrared or ultraviolet
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01F—MEASURING VOLUME, VOLUME FLOW, MASS FLOW OR LIQUID LEVEL; METERING BY VOLUME
- G01F23/00—Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm
- G01F23/22—Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm by measuring physical variables, other than linear dimensions, pressure or weight, dependent on the level to be measured, e.g. by difference of heat transfer of steam or water
- G01F23/28—Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm by measuring physical variables, other than linear dimensions, pressure or weight, dependent on the level to be measured, e.g. by difference of heat transfer of steam or water by measuring the variations of parameters of electromagnetic or acoustic waves applied directly to the liquid or fluent solid material
- G01F23/284—Electromagnetic waves
- G01F23/292—Light, e.g. infrared or ultraviolet
- G01F23/2921—Light, e.g. infrared or ultraviolet for discrete levels
- G01F23/2928—Light, e.g. infrared or ultraviolet for discrete levels using light reflected on the material surface
Definitions
- the present application generally relates to biopharmaceutical product manufacturing processes, and more specifically to the automatic detection of a packed cell volume (PCV) during a biopharmaceutical product manufacturing process and/or other centrifugal processes.
- bioreactors are used to culture cells prior to harvesting a desired drug product.
- the harvesting occurs via a centrifugal process wherein the biopharmaceutical product is agitated to form a liquid drug product
- a pellet of the biopharmaceutical product is disposed in a container (e.g., vials, cartridges, syringes, vessels, seals, etc.).
- the packed cell volume (PCV) of this pellet is measured to configure the centrifugal process.
- the PCV may be used to configure the centrifuge bowl discharge interval for stainless steel centrifuges, or to calculate the heavy phase (cell discharge/waste) and light phase (supernatant/product) split flowrate for single-use centrifuges. Accordingly, it is important to precisely determine the packed cell volume (PCV) of the pellet in the container, because errors in measuring the PCV of the pellet may directly impact the yield of the manufacturing process and/or the operation of any post-filtration step of the manufacturing process.
- the PCV of the pellet is determined by manually inspecting the container and comparing a height of the top surface of the pellet against measurement lines printed on a surface of the container.
- This manual process has a high degree of variability (as much as a 10-20% relative error) that impacts the accuracy of the PCV measurement.
- the manual dilution and visual verification process is inherently subject to measurement errors associated with parallax
- the top surface of the pellet is generally not flat or orthogonal to the container axis. This makes accurate determination of the height of the top surface difficult.
- the printed measurement lines on the side of the container have errors in their location (typically having a stated accuracy of 1%). Moreover, in many cases the printed measurement lines are not complete or have been worn off.
- systems and methods described herein generally use automated techniques for measuring the PCV of a pellet inside a container.
- systems may include a sensor (e.g., an image sensor or a laser sensor) capable of automatically detecting a height of a front edge of the pellet when inside the container.
- the techniques may include using mechanical means to rotate the container to capture a sequence of heights of the top surface of the pellet.
- the techniques may then apply a fit algorithm to the sequence to define a shape of the top surface of the pellet.
- the disclosed techniques are able to model the actual geometry of the top surface of the pellet when calculating the PCV.
- the inaccuracies in the conventional techniques that assume a uniform top surface at the maximum height of the pellet are corrected.
- the techniques may include comparing the sensor data to model data of the container (e.g., manufacturer specifications of the dimensions of the container) in combination with the derived shape of the top surface of the pellet to determine the PCV of the pellet.
- the accuracy of the PCV measurement is improved. Accordingly, when the more accurate PCV values obtained through the methods disclosed herein are used to configure the centrifugal harvesting process, the resulting process achieves a higher yield. In some scenarios, the ability to accurately measure the PCV of a pellet eliminates the need to perform a manual dilution altogether.
- the techniques described herein relate to an automated visual inspection system, including: a container holder configured to (i) hold a container that houses a sample and (ii) rotate the container axially; a sensor having a sensing axis that passes through the container; and a controller operatively coupled to the container holder and the sensor and configured to: (1) control the container holder to axially rotate the container; (2) control the sensor to capture sensor data of the container at a plurality of axial rotation angles; (3) analyze the captured sensor data to determine a shape of a top surface of the sample; and (4) determine a sample volume based on the shape of the top surface.
- the techniques described herein relate to a method of automated analysis of a sample housed in a container, the method including: (1) controlling, via a controller, a container holder that is holding the container to axially rotate the container; (2) controlling, via the controller, a sensor to capture sensor data of the container at a plurality of axial rotation angles; (3) analyzing, via the controller, the captured sensor data to determine a shape of a top surface of the sample; and (4) determining a sample volume based on the shape of the top surface.
- the techniques described herein relate to one or more non-transitory, computer-readable media storing instructions that, when executed by processing hardware of a controller, cause the controller to: (1) control a container holder that is holding the container to axially rotate the container; (2) control a sensor to capture sensor data of the container at a plurality of axial rotation angles; (3) analyze the captured sensor data to determine a shape of a top surface of the sample; and (4) determine a sample volume based on the shape of the top surface.
- FIGS. 1A and 1B are simplified block diagrams of example systems that may be used to automatically determine a PCV of a sample.
- FIG. 2A illustrates example image data of a container that houses a pellet that was captured by a sensor at a plurality of rotational angles.
- FIG. 2B shows a detailed view of the data extracted by the container evaluation application when analyzing the image data of FIG. 2A.
- FIG. 3 is a plot of the heights of leading edges of a top surface of a pellet.
- FIG. 4 depicts an example process by which the container evaluation application preprocesses a set of image data to align the image data.
- FIG. 5 depicts an example process by which the container evaluation application projects measurement lines on a container.
- FIG. 6 illustrates a flow chart of an example method for automated inspection of a container.
- examples described herein generally refer to determining the PCV of a pellet
- the techniques may be applied to other biopharmaceutical product manufacturing techniques.
- the techniques may be adapted to determine a PCV in a platelet counting process or a resin slurry concentration in a chromatography process.
- FIG. 1A is a simplified block diagram of an example system 100 that may be used to automatically determine the PCV of a sample.
- the system 100 includes a container 105, a sensor 110, a holder 115a, a controller 130, and, in some embodiments, a light source 120.
- the container 105 may be any suitable vessel, device or system that houses a pellet 107 (such as a sample or a biopharmaceutical product).
- the container may be a vial, a cartridge, a syringe, a vessel, etc.
- the container 105 is at least partially transparent such that the pellet 107 can be sensed by the sensor 110 when housed inside the container 105.
- the pellet 107 is a biopharmaceutical product that is about to undergo a centrifugal harvesting process.
- the container 105 may be the same container that is used during the centrifugal harvesting process.
- the sensor 110 is configured to have a sensing axis 111 oriented towards the container 105 and/or the pellet 107 housed therein.
- the sensor 110 is an image sensor or camera configured to capture image data of the container 105.
- the sensing axis 111 may also be referred to as the optical axis 111.
- the controller 130 may analyze the image data generated by the sensor 110 to detect a height of a top surface 108 of the pellet 107.
- the camera 110 includes a telecentric lens to avoid the optical aberrations associated with endocentric lenses when measuring the height of the top surface 108.
- the spatial uniformity associated with telecentric lenses substantially reduces the need to account for optical distortions when measuring the height of the top surface 108.
- the system 100 may also include the light source 120.
- the light source 120 may be a light emitting diode (LED) light source configured to have an illumination axis 121 that reflects off the top surface 108 of the pellet 107.
- the light source 120 provides uniform backlighting of the container 105 to improve scene uniformity when analyzing the captured image data.
- the light source 120 may emit light in the infrared band to improve the ability to distinguish a front edge of the top surface 108 from other portions of the top surface 108. This improves the ability of the controller 130 to accurately determine the height of the top of the top surface 108 when analyzing the image data.
- sensor 110 is a laser depth sensor.
- the laser sensor 110 may be a laser dot sensor configured to sense depth at a particular point along the sensing axis 111 or a laser line sensor configured to sense a plurality of depths along a plane defined by the sensing axis.
- while FIG. 1A depicts the sensor 110 as being horizontally disposed relative to the container 105, in some embodiments in which the sensor 110 is a laser sensor, the sensor 110 may be disposed at other angles with respect to the container 105.
- the system 100 also includes a holder 115a configured to hold a top portion of the container 105, a stopper inserted in the container 105, a cap placed over the container 105, etc.
- the holder 115a may include clamps, and/or other holding elements configured to securely hold the container 105 at a controllable pose.
- the holder 115a may be configured to axially rotate the container 105 about a vertical axis such that a full sweep of the pellet 107 passes through the sensing axis 111. Because the shape of the top surface 108 is typically not uniform, this enables the controller 130 to accurately model the actual shape of the top surface.
- the holder 115a and/or the sensor 110 are arranged such that the sensing axis 111 is offset from a transverse axis of the container 105.
- the holder 115a may be configured to hold the container 105 at an offset angle (e.g., 5°, 10°, 15°) with respect to the vertical axis.
- the sensor 110 may be configured to have the sensing axis 111 at an offset angle (e.g., 5°, 10°, 15°) with respect to the container 105, for example, by elevating the sensor 110 above the top surface 108 and tilting the sensing axis 111 downwards.
- in these configurations, the illumination axis 121 may be incident on a greater portion of the top surface 108, improving the contrast between the top surface 108 and the side portions of the pellet 107. This contrast improves the ability of the controller 130 to identify the height of the top surface 108, thereby improving the accuracy of the height measurement. While FIG. 1A depicts an embodiment where the holder 115a holds an upper portion of the container 105, in other embodiments, a holder 115a or 115b may instead hold a lower portion of the container 105.
- turning to FIG. 1B, illustrated is a simplified block diagram of an example system 150 in which the holder 115b holds a lower portion of the container 105.
- the holder 115b may be configured to receive the container 105 when an operator places the container 105 in a cavity.
- the operator may not need to operate a clamping mechanism that many types of holders 115a include. This may increase the speed at which the multiple containers 105 can be analyzed via the automated inspection techniques disclosed herein.
- the bottom of the container 105 is not visible to the sensor 110.
- the controller 130 may obtain a model of the container 105 that indicates the dimensions of the container 105 to determine a depth within the holder 115b at which the bottom of the container 105 is held. Accordingly, the controller 130 may utilize this depth information when calculating the height of the top surface 108.
- the controller 130 may be operatively coupled to the sensor 110, the holder 115a or 115b, and, in some embodiments, the light source 120.
- the controller 130 may be a server, a desktop computer, a laptop computer, a tablet device, a dedicated control device, or any other suitable type of computing device or devices.
- the controller 130 includes processing hardware 132, a network interface 134, a display device 136, a user input device 137, and a memory unit 138.
- the controller 130 includes two or more computers that are either co-located or remote from each other.
- the operations described herein relating to the processing hardware 132, the network interface 134, and/or the memory unit 138 may be divided among multiple processing units, network interfaces, and/or memory units, respectively.
- the processing hardware 132 includes one or more processors, each of which may be a programmable microprocessor that executes software instructions stored in the memory unit 138 to execute some or all of the functions of the controller 130 as described herein. Alternatively, some of the processors in the processing hardware 132 may be other types of processors (e.g., application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), etc.), and some of the functionality of the controller 130 as described herein may instead be implemented, in part or in whole, by such hardware.
- the memory unit 138 may include one or more physical memory devices or units containing volatile and/or non-volatile memory. Any suitable memory type or types may be used, such as read-only memory (ROM), solid-state drives (SSDs), hard disk drives (HDDs), and so on.
- the network interface 134 may include any suitable hardware (e.g., front-end transmitter and receiver hardware), firmware, and/or software configured to communicate via one or more communication networks and/or using one or more communication protocols.
- the network interface 134 may be or include an Ethernet interface and/or a serial interface via which the controller 130 controls operation of the sensor 110, the holder 115a or 115b, and/or the light source 120.
- the display device 136 may use any suitable display technology (e.g., LED, OLED, LCD, etc.) to present information to a user, and the user input device 137 may be a keyboard or other suitable input device.
- the display device 136 and the user input device 137 are integrated within a single device (e.g., a touchscreen display).
- the display device 136 and the user input device 137 may jointly enable a user to interact with graphical user interfaces (GUIs) provided by the controller 130, e.g., for purposes such as determining characteristics of the container 105 and/or pellet 107 housed therein (such as the packed cell volume (PCV) of the pellet 107).
- the controller 130 does not include the display device 136 and/or the user input device 137.
- the memory unit 138 stores non-transitory instructions of one or more software applications, including a container evaluation application (not depicted).
- the container evaluation application when executed by the processing hardware 132, is generally configured to communicate with the sensor 110, the holder 115a or 115b, and/or the light source 120 to analytically determine characteristics of the container 105 and/or the pellet 107 housed therein.
- the container evaluation application may be configured to cause the holder 115a or 115b to rotate the container 105 at a fixed rate.
- the container evaluation application may be configured to control the sensor 110 to capture sensor data associated with container 105 at fixed intervals (e.g., every 10° of rotation, every 20° of rotation, every 30° of rotation, every 60° of rotation).
- the container evaluation application may instead be configured to control the holder 115a or 115b to rotate the container 105 in discrete intervals (e.g., 10° intervals, 20° intervals, 30° intervals, 60° intervals).
- the container evaluation application may be configured to cause the sensor 110 to capture sensor data of the container 105 after the holder 115a or 115b has completed the rotation interval
- the container evaluation application may first control the light source 120 to emit illumination light before controlling the holder 115a or 115b and the sensor 110. Regardless, the container evaluation application may compile the sets of sensor data captured at each interval into a sequence of sensor data.
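The discrete rotate-and-capture loop described above can be sketched as follows. This is a minimal illustration, not the patented implementation: `rotate` and `capture` are hypothetical callables standing in for the holder and sensor interfaces, which the source does not specify. Capturing at both 0° and 360° at 60° intervals yields seven sets, matching the scenario of FIG. 2A.

```python
def capture_sequence(rotate, capture, interval_deg=60, total_deg=360):
    """Rotate the container in discrete intervals, capturing sensor data
    after each completed rotation step, and compile the captures into a
    sequence keyed by axial rotation angle."""
    sequence = []
    angle = 0
    while angle <= total_deg:
        sequence.append((angle, capture()))
        if angle < total_deg:
            rotate(interval_deg)  # command the holder to advance one interval
        angle += interval_deg
    return sequence
```

In a fixed-rate variant, the same compilation step applies, with captures triggered by angle rather than by discrete holder commands.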
- the memory unit 138 may be configured to store model data representative of the dimensions of the container 105.
- the model data may be obtained from manufacturer documentation provided by the manufacturer and/or provider of the container 105.
- the model data may be a three dimensional scan and/or model of the container that includes the dimensional information associated with the container 105.
- the container evaluation application may be configured to access the model data to determine the characteristics of the container 105 and/or the pellet 107. For example, after determining a shape of the top surface 108, the container evaluation application may utilize the model of the container 105 to determine an inner circumference of the lower portion of the pellet 107. It should be appreciated that many containers 105 are not cylindrical throughout their entire length. Accordingly, the model data may model the different dimensions of the container 105 throughout the length of the container 105.
- referring to FIGS. 2A and 2B, illustrated is example image data 212 of a container 205 (such as the container 105) that houses a pellet 207 (such as the pellet 107), as captured by a sensor (such as the sensor 110).
- the sensor may be configured to capture the image data 212 at the direction of a container evaluation application as the container evaluation application controls a holder (such as the holders 115a, 115b) to rotate the container 205.
- the container evaluation application configured the sensor to capture seven sets of image data 212a-g at intervals of 60° of axial rotation. As shown in FIG. 2A, the height of the front edge of the top surface 208 of the pellet 207 varies at different degrees of rotation.
- FIG. 2B shows a detailed view of the data extracted by the container evaluation application when analyzing the image data 212a
- FIG. 2B depicts the point 213 that represents the front edge of the top surface 208 of the pellet 207.
- the container evaluation application may determine the lateral position of the point 213 by defining a center axis along the lateral midpoint of the image data representative of the container 205.
- the container evaluation application may determine the vertical position of the point 213 by detecting a shift in intensity in image data along the center axis between the darker side of the pellet 207 and the illuminated top surface 208 of the pellet 207.
- the container evaluation application may then determine a height of the point 213.
- there are two options for determining the height of the point 213 based on image data: (1) analyzing the pixel height of the point 213, or (2) comparing the height of the point 213 to a measurement line on the container 205. It should be appreciated that the pixel height technique may remove error in PCV calculations introduced by the manufacturing tolerances of the container 205.
- the container evaluation application may calculate the height based on a number of pixels from the bottom of the container 205. In embodiments in which the holder holds a bottom portion of the container 205 (such as shown in FIG. 1B), the container evaluation application may calculate the height based on a number of pixels from the predetermined depth of the bottom of the container 205. For the measurement line techniques, the container evaluation application may compare the height of the point 213 to the bounding measurement lines of the container 205 to derive a height value. Regardless of technique, the container evaluation application may repeat this process for each of the sets of image data 212b-g and generate a sequence that associates the degrees of axial rotation and the corresponding heights of the top surface 208.
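The pixel-height technique, combined with the intensity-shift edge detection along the center axis described above, can be sketched roughly as follows. The `threshold` and `mm_per_pixel` parameters are assumptions for illustration; the source does not state how intensities are normalized or how pixels are scaled to physical units.

```python
import numpy as np

def front_edge_height(image, bottom_row, mm_per_pixel, threshold=0.5):
    """Locate the front edge of the pellet's top surface along the lateral
    midpoint of the image, then convert the pixel count from the container
    bottom into a physical height.  `image` is a 2D float array of
    intensities in [0, 1]; row indices increase downward."""
    center_col = image.shape[1] // 2
    column = image[:, center_col]
    # Scan upward from the container bottom: rows along the darker side of
    # the pellet stay below the threshold until the illuminated top
    # surface is reached.
    for row in range(bottom_row, -1, -1):
        if column[row] >= threshold:
            return (bottom_row - row) * mm_per_pixel
    return None  # no intensity shift found along the center axis
```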
- the container evaluation application may obtain a depth value from the sensor indicative of a distance from the sensor to the top surface 208.
- the depth of the bottom surface of the pellet may be known based on physical measurements of the system configuration and the model data of the container 205.
- the height of the pellet 207 may be determined by subtracting the obtained sensor depth value for a particular edge of the top surface 208 from the predetermined depth of the bottom surface of the pellet 207.
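The laser-depth variant of the height calculation described above reduces to a subtraction per sample; a minimal sketch, with hypothetical parameter names:

```python
def heights_from_depths(depth_by_angle_mm, bottom_depth_mm):
    """For a laser depth sensor: convert each measured distance to the
    top-surface edge into a pellet height by subtracting it from the
    known (predetermined) depth of the pellet's bottom surface."""
    return {angle: bottom_depth_mm - depth
            for angle, depth in depth_by_angle_mm.items()}
```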
- referring to FIG. 3, illustrated is a plot 300 of the heights of leading edges of a top surface (such as the top surfaces 108, 208) of a pellet (such as the pellets 107, 207). It should be appreciated that unlike the scenario illustrated in FIG. 2A, the image data was captured at 10° intervals instead of 60° intervals. Additionally, while FIG. 3 illustrates that the height is measured in terms of pixel height, in embodiments where laser depth or container measurement lines are used, the height may utilize other units of measurement. Regardless of the unit of measurement, the container evaluation application may apply interpolation techniques (such as linear interpolation) to connect the sampled height values included in the sequence.
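The linear interpolation step can be sketched with `numpy.interp`. Wrapping the profile periodically by repeating the 0° sample at 360° is an assumption for illustration; the source only specifies that sampled heights are connected by interpolation.

```python
import numpy as np

def dense_height_profile(angles_deg, heights, step=1.0):
    """Linearly interpolate edge heights sampled at discrete rotation
    angles (e.g. every 10 degrees) into a dense 0-360 degree profile."""
    a = np.asarray(angles_deg, float)
    h = np.asarray(heights, float)
    # Repeat the first sample at 360 degrees so the profile wraps around.
    if a[0] == 0.0 and a[-1] < 360.0:
        a = np.append(a, 360.0)
        h = np.append(h, h[0])
    query = np.arange(0.0, 360.0 + step, step)
    return query, np.interp(query, a, h)
```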
- the container evaluation application may then determine the PCV of the pellet by identifying the minimum of the plot 300 to segment the volume of the pellet into a body having a flat surface up to the height of the minimum and a body having a top surface of the defined shape of the plot 300.
- the container evaluation application may analyze the model and/or the measurement lines on the container to determine the volume associated with the minimum height.
- the container evaluation application may calculate an area under the curve of the plot 300 and combine the area and model data to determine a volume of the upper body.
- the container evaluation application then adds the two volumes to determine the PCV of the pellet. It should be appreciated that this technique may not resolve any contours along the radial axis of the top surface. Accordingly, this technique may be understood as defining a minimum energy surface that connects a shape defined by the outer edges measured by the sensor.
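The two-body split described above can be illustrated as follows. The geometric model for the upper body is an assumption made for this sketch: the source only says the area under the height curve is combined with model data, so here the surface is taken to slope linearly from each measured wall edge down to the minimum height at the container axis (a cone-like surface), and the container is treated as a simple cylinder of known inner radius.

```python
import math

def packed_cell_volume(edge_heights_mm, inner_radius_mm):
    """Estimate the PCV from edge heights sampled at uniform rotation
    intervals: a cylinder up to the minimum measured height, plus an
    upper body whose surface is assumed to slope linearly from each
    measured wall-edge height down to the minimum at the axis."""
    area = math.pi * inner_radius_mm ** 2
    h_min = min(edge_heights_mm)
    lower = area * h_min  # cylindrical body below the minimum height
    # For a surface h(theta, r) = h_min + (h(theta) - h_min) * r / R,
    # integrating over the cross section gives (2/3) * area * mean excess.
    excess = [h - h_min for h in edge_heights_mm]
    upper = (2.0 / 3.0) * area * sum(excess) / len(excess)
    return lower + upper
```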
- the container evaluation application may alternatively assume that the top surface of the pellet is generally shaped like a tilted oval. Additionally, based on experimental testing, the tilted oval shape is the most common shape of a pellet housed in a container. Accordingly, in these embodiments, the container evaluation application may define the shape of the tilted oval (and thus the upper surface of the pellet) by identifying the maximum and minimum of the plot 300 and the inner circumference of the container.
- the holder 115a may be configured to hold the container 105 at an offset angle to improve the illumination of the top surface 108 of the pellet 107. Additionally, in some systems, the coupling between the holder 115a and the container 105 may not result in a perfectly vertical alignment. Accordingly, as the holder 115a axially rotates the container 105, the position of the container 105 shifts between sets of image data. In these embodiments, prior to performing the analyses as illustrated in FIGS. 2A, 2B, and 3, the container evaluation application may first digitally align the image data to ensure that the height calculation is consistent across each set of image data.
- FIG. 4 depicts an example process 400 by which the container evaluation application preprocesses a set of image data 412 (such as the image data 212) to align the image data 412 with a vertical axis prior to analyzing the image data to determine the height of a pellet 407 (such as the pellets 107, 207) in a container 405 (such as the containers 105, 205). While FIG.
- FIG. 4 illustrates the process for aligning image data in a system in which a holder 415 (such as the holder 115a) holds the container 405 via a top portion
- similar techniques may be applied to align image data in a system in which a container is slanted while resting in a holder that holds a bottom portion of the container 405 (such as the holder 115b).
- image (a) represented is the image data 412 as it was captured by an image sensor or camera (such as the sensor 110).
- a center line 414 is superimposed on the image data 412 to illustrate that the holder 415 is holding the container 405 at an angle.
- an edge finding technique is applied to identify an edge 416 of the container 405. More particularly, the edge finding technique is applied to identify a pair of edges 416a, 416b that are vertically aligned and equidistant from the center line 414 when the image data 412 is properly aligned.
- a rake edge finding technique is implemented. That said, in other embodiments, other edge finding techniques may be applied.
- the edges 416 are used to define the lateral and rotational deviation of the container 405 in the image data 412.
- the container evaluation application may be able to define the actual center line 417 of the container 405 using the detected edges 416 and the knowledge that (i) the edges 416a, 416b are equidistant to the center line 417 and (ii) the center line 417 should be vertically aligned.
- the container evaluation application may determine the rotational and lateral deviation of the container in the image data 412. Using the rotational and lateral deviation values, the container evaluation application is able to derive an offset vector by which the pixels of the image data 412 are translated to produce the aligned image data 422.
- because the aligned image data 422 is laterally centered and vertically aligned, when the container evaluation application applies similar techniques to each set of image data obtained from the image sensor, the height values of the pellet 407 are all defined with respect to the same perspective. This ensures that any deviations in how the container 405 is coupled to the holder 415 do not impact the accuracy of the PCV calculation.
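The deviation calculation behind the alignment step can be sketched as below. This derives only the rotational and lateral deviation from a pair of detected edges (each given as hypothetical top/bottom point pairs); the subsequent pixel translation that produces the aligned image is omitted, and the source does not specify the edge representation.

```python
import math

def alignment_correction(edge_a, edge_b, image_width):
    """From two detected container edges (each a (top_xy, bottom_xy) pair
    of pixel coordinates), derive the container's actual center line, then
    the tilt (degrees from vertical) and lateral shift (pixels from the
    image midline) needed to align it."""
    (ax1, ay1), (ax2, ay2) = edge_a
    (bx1, by1), (bx2, by2) = edge_b
    # Center line: midpoint of the two edges at the top and at the bottom.
    top = ((ax1 + bx1) / 2.0, (ay1 + by1) / 2.0)
    bot = ((ax2 + bx2) / 2.0, (ay2 + by2) / 2.0)
    # Rotational deviation from vertical (0 for a perfectly upright container).
    tilt = math.degrees(math.atan2(bot[0] - top[0], bot[1] - top[1]))
    # Lateral deviation of the center line from the image midline.
    shift = (top[0] + bot[0]) / 2.0 - image_width / 2.0
    return tilt, shift
```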
- the measurement lines on containers fade over time.
- the measurement lines might not be visible to the image sensor across all degrees of rotation.
- the container evaluation application may apply techniques to enhance and superimpose the measurement lines onto image data to facilitate image analysis techniques that rely upon said measurement lines.
- referring to FIG. 5, illustrated is an example process 500 by which the container evaluation application virtually superimposes measurement lines on a container 505 (such as the containers 105, 205, 405).
- the measurement lines 519 of the container have faded, making them difficult to use as a baseline for measuring volume.
- while image (a) depicts the container 505 as having a pellet housed therein, in other embodiments, the process 500 may be applied before the pellet is housed inside the container 505.
- in image (b), illustrated is the extracted region of interest where the measurement lines have faded.
- the container evaluation application captures images (a) of the container 505 at a plurality of different rotation angles and extracts a corresponding image (b) of the region of interest from each image (a).
- the container evaluation application combines the extracted regions of interest to produce a minimum intensity projection (MIP) of the region of interest.
- the MIP process assigns each pixel and/or row of the image data the minimum value (i.e., the darkest value) such that any gaps in the printed measurement lines are filled in.
- as shown in image (d), when the MIP techniques are applied to fill in the faded measurement lines 519, the container evaluation application can readily identify the measurement lines for use in determining the PCV of the pellet.
- the container evaluation application may apply optical character recognition or other similar techniques to detect the presence of the numbers on the container 505 and exclude those pixels from the MIP analysis to prevent the presence of the numbers from generating wide bands for each measurement line.
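A minimal sketch of the MIP combination step, assuming the regions of interest are same-sized grayscale arrays (the disclosure does not fix a data format):

```python
def minimum_intensity_projection(rois):
    """Combine region-of-interest crops captured at several rotation
    angles into a minimum intensity projection (MIP).

    rois: list of same-sized 2D grayscale arrays (lists of rows of
    pixel values, 0 = black). Each output pixel takes the darkest
    (minimum) value seen at that position across all rotations, so a
    gap in a faded measurement line at one angle is filled in by the
    ink visible at another angle.
    """
    rows, cols = len(rois[0]), len(rois[0][0])
    return [[min(roi[r][c] for roi in rois) for c in range(cols)]
            for r in range(rows)]
```

Pixels identified as printed numbers (e.g., via optical character recognition) would be excluded from the input arrays before this step, consistent with the exclusion described above.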
- the pellet is the same color as the measurement lines 519, making the lower bounding measurement line difficult to detect.
- the container evaluation application may extrapolate and/or project the measurement lines downwards into the region obscured by the pellet. For example, the container evaluation application may determine a pixelwise spacing between identified measurement lines 529a and 529b and extrapolate an additional measurement line below measurement line 529b at a distance matching the pixelwise spacing.
- the container 505 is a pseudo-cylindrical container in which the inner diameter varies at different heights along a length of the container 505.
- the measurement lines on the container 505 may be non-uniformly spaced due to variations in inner diameter along the length.
- the container evaluation application may analyze the model of the container 505 to account for the non-uniform spacing between the measurement lines.
- the container evaluation application is able to still rely on the measurement lines 519 to determine the PCV of the pellet despite the pellet obscuring the measurement lines 519 in the image data.
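The extrapolation of a measurement line beneath the pellet can be illustrated as follows; the `taper_ratio` parameter is a hypothetical hook for the container-model spacing adjustment mentioned above:

```python
def extrapolate_next_line(line_a_px, line_b_px, taper_ratio=1.0):
    """Project one additional measurement line below the lowest visible
    line, for cases where the pellet obscures the lines beneath it.

    line_a_px, line_b_px: pixel heights of two adjacent visible lines
    (e.g., lines 529a and 529b), with larger values lower in the image.
    taper_ratio: hypothetical per-interval scaling that a container
    model could supply for a pseudo-cylindrical container whose inner
    diameter (and hence line spacing) varies along its length; 1.0
    assumes uniform spacing.
    """
    spacing = line_b_px - line_a_px
    return line_b_px + spacing * taper_ratio
```

Repeated application projects the full scale downward through the obscured region.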
- the container evaluation application may extract the correspondence between pixel height and measurement line for the container 505 using an empty container to store a pixel height associated with each of the printed measurement lines in a memory of a controller (such as the controller 130). Accordingly, when the container evaluation application analyzes image data where the printed measurement lines 519 on the container 505 are obscured by the pellet, the container evaluation application can compare the pixel height of the pellet to the stored correspondences when determining the PCV of the pellet. In these embodiments, the container evaluation application may identify a particular feature of the container 505 (e.g., a height of the top of the container) to align the stored pixel heights associated with the printed measurement lines with the image data being analyzed. In some embodiments where the container evaluation application is configured to display the sensor data representative of the container 505, the container evaluation application may overlay indications of the extrapolated and/or stored measurement lines onto the displayed image of the container for manual verification.
- the container evaluation application may be configured to detect a meniscus height of the liquid in the container. Accordingly, using the disclosed pixelwise or measurement line volume determination techniques, the container evaluation application is able to determine a volume associated with the height of a bottom edge of the meniscus. In some embodiments, the container evaluation application may determine the volume of liquid in the meniscus region based on a surface tension of the liquid and a contact angle between the liquid and a wall of the container.
- the container evaluation application may then subtract the PCV of the pellet from the volume associated with the bottom edge of the meniscus to determine the volume of the liquid in the container.
- the container evaluation application may additionally determine the volume of liquid in the meniscus region based on a surface tension of the liquid and a contact angle between the liquid and a wall of the container and add the meniscus region volume to the volume associated with the bottom edge of the meniscus before subtracting the PCV of the pellet.
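One way the meniscus-corrected liquid volume might be computed, under a spherical-cap meniscus approximation that we supply for illustration (the disclosure names surface tension and contact angle as inputs but does not specify a formula):

```python
import math

def liquid_volume(meniscus_bottom_px, pcv_ml, px_to_ml, inner_radius_cm,
                  contact_angle_deg):
    """Sketch of the liquid-volume determination described above.

    meniscus_bottom_px: pixel height of the meniscus bottom edge.
    px_to_ml: calibration from pixel height to milliliters (assumed
    available from the container model or measurement scale).
    The meniscus correction assumes the liquid surface is a spherical
    cap meeting the wall at contact_angle_deg -- a geometry-only
    simplification, not a detail from the disclosure.
    """
    # Volume up to the bottom edge of the meniscus.
    v_bottom = meniscus_bottom_px * px_to_ml
    # Spherical-cap meniscus: cap height a from the contact angle.
    theta = math.radians(contact_angle_deg)
    r = inner_radius_cm
    a = r * (1.0 - math.sin(theta)) / math.cos(theta)
    # Liquid held in the meniscus region above the bottom edge:
    # cylinder slab minus the air-side spherical cap (cm^3 == mL).
    v_meniscus = math.pi * r * r * a - (math.pi * a / 6.0) * (3 * r * r + a * a)
    # Total liquid = bottom-edge volume + meniscus correction - pellet PCV.
    return v_bottom + v_meniscus - pcv_ml
```

For a contact angle of 0° (fully wetting), the correction reduces to the familiar one-third-cylinder result for a hemispherical meniscus.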
- turning to FIG. 6, illustrated is a flow chart of an example method 600 for automated inspection of a sample (such as the pellets 107, 207, 407) in a container (such as the containers 105, 205, 405, 505).
- the method 600 may be performed by one or more processors (such as the processing hardware 132) of a controller (such as the controller 130) executing a container evaluation application stored in a memory unit (such as the memory unit 138).
- the controller may be operatively coupled to a container holder (such as the holders 115a, 115b, 415) that holds the container and is configured to be able to rotate the container axially at the direction of the controller.
- the controller may also be connected to a sensor (such as the sensor 110) configured to have a sensing axis (such as the sensing axis 111) that passes through the container.
- the sensor is one of a laser line profiler sensor and a laser displacement sensor configured to sense a distance between the sensor and the top surface (such as the top surfaces 108, 208, 408) of the sample.
- the sensor is an image sensor which, in some embodiments, includes a telecentric lens.
- the controller may also be coupled to a light source oriented to emit illumination toward the image sensor such that the emitted illumination reflects off the top surface of the sample.
- the container holder and the sensor are configured such that the sensing axis is offset from a transverse axis of the container by an offset angle.
- the method begins at block 602 when the controller controls the container holder to axially rotate the container.
- the controller controls the sensor to capture sensor data of the container at a plurality of axial rotation angles.
- the controller analyzes the captured sensor data to determine a shape of a top surface of the sample.
- the controller may preprocess the captured sensor data by vertically and/or laterally aligning the captured sensor data (such as by applying the techniques associated with the process 400).
- the controller may determine a height of a front edge of the top surface of the sample at the plurality of axial rotation angles and define the shape by applying a fit algorithm to the determined heights.
- the fit algorithm may be a minimum energy surface fit (such as by applying the techniques described with respect to the plot 300) or a fit algorithm that fits a tilted oval shape to the determined heights.
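One way to realize the tilted-oval fit is a least-squares sinusoid in rotation angle, since a flat surface tilted relative to the container axis traces h(θ) = h0 + A·cos(θ − φ) as the container rotates; this specific parameterization is our illustration, not a method fixed by the disclosure:

```python
import math

def fit_tilted_surface(angles_deg, heights):
    """Fit h(theta) = h0 + A*cos(theta - phi) to front-edge heights
    measured at several axial rotation angles.

    Returns (h0, A, phi_deg): the mean surface height, the tilt
    amplitude, and the rotation angle at which the front edge is
    highest.
    """
    n = len(heights)
    thetas = [math.radians(a) for a in angles_deg]
    h0 = sum(heights) / n
    # Least-squares Fourier coefficients at the fundamental frequency
    # (exact for equally spaced angles covering a full rotation).
    c = sum(h * math.cos(t) for h, t in zip(heights, thetas)) * 2.0 / n
    s = sum(h * math.sin(t) for h, t in zip(heights, thetas)) * 2.0 / n
    amp = math.hypot(c, s)
    phi = math.degrees(math.atan2(s, c))
    return h0, amp, phi
```

The fitted h0 can then stand in for the effective uniform surface height when computing volume, correcting the bias of simply taking the maximum height.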
- the controller determines a sample volume (such as a PCV).
- the controller compares the sensor data and the shape of the top surface to a model of the container. For example, the controller may identify a pixel height of the top surface and use the pixel heights and inner diameter and/or circumference data to determine the volume of the sample.
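The model-based volume determination can be sketched as a slice-wise integration of the container model's cross-section up to the detected surface height; the callable `inner_diameter_mm_at` interface is a hypothetical stand-in for the stored model data:

```python
import math

def sample_volume_ml(surface_px, inner_diameter_mm_at, px_per_mm):
    """Integrate the container model's cross-sectional area from the
    container bottom (pixel 0) up to the detected top-surface height.

    inner_diameter_mm_at: callable giving the model inner diameter at
    a given height in mm (a lookup table derived from manufacturer
    model data would serve equally well). Handles non-cylindrical
    containers whose inner diameter varies along their length.
    """
    step_mm = 1.0 / px_per_mm
    volume_mm3 = 0.0
    for px in range(surface_px):
        h_mm = px * step_mm
        r_mm = inner_diameter_mm_at(h_mm) / 2.0
        volume_mm3 += math.pi * r_mm * r_mm * step_mm  # one pixel-row slice
    return volume_mm3 / 1000.0  # mm^3 -> mL
```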
- the controller may compare the sensor data and the shape of the top surface to a measurement scale (such as the measurement lines 519) included on a surface of the container.
- the controller may be configured to (i) control the sensor to capture sensor data of the container at a plurality of axial rotation angles; (ii) generate a minimum intensity projection (MIP) of the measurement scale based on the captured sensor data; and (iii) extrapolate the measurement scale into a region associated with the sample (such as by applying the techniques described with respect to the process 500).
- the controller may be configured to (i) obtain a stored correspondence between printed measurement lines of the measurement scale and corresponding pixel heights and (ii) align the measurement scale with a region associated with the sample.
- the sample volume may then be used for other processes in the biopharmaceutical manufacturing process.
- the sample volume may be used to determine a volume of a liquid included in the container.
- the sample volume may be used to configure a harvesting process for the sample (such as a centrifugal harvesting process).
- the controller is operatively coupled to a controller of the centrifuge and provides the sample volume thereto to assist in automatic configuration of the centrifugal process.
- the controller outputs the sample volume for a clinician to configure the centrifugal process.
- the sample volume may be used to configure the centrifuge bowl discharge interval for stainless steel centrifuges, or the heavy phase (cell discharge/waste) and light phase (supernatant/product) split flowrate for single-use centrifuges.
- the disclosed techniques may be applied to improve other centrifugal processes.
- the disclosed techniques may be applied to measure the volume of blood components in a centrifugal blood component analysis process.
- Embodiments of the disclosure relate to a non-transitory computer-readable storage medium having computer code thereon for performing various computer-implemented operations.
- the term “computer-readable storage medium” is used herein to include any medium that is capable of storing or encoding a sequence of instructions or computer codes for performing the operations, methodologies, and techniques described herein.
- the media and computer code may be those specially designed and constructed for the purposes of the embodiments of the disclosure, or they may be of the kind well known and available to those having skill in the computer software arts.
- Examples of computer-readable storage media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and execute program code, such as ASICs, programmable logic devices (“PLDs”), and ROM and RAM devices.
- Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter or a compiler.
- an embodiment of the disclosure may be implemented using Java, C++, or other object-oriented programming language and development tools. Additional examples of computer code include encrypted code and compressed code.
- an embodiment of the disclosure may be downloaded as a computer program product, which may be transferred from a remote computer (e.g., a server computer) to a requesting computer (e.g., a client computer or a different server computer) via a transmission channel.
- Another embodiment of the disclosure may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions.
- the terms “approximately,” “substantially,” “substantial” and “about” are used to describe and account for small variations. When used in conjunction with an event or circumstance, the terms can refer to instances in which the event or circumstance occurs precisely as well as instances in which the event or circumstance occurs to a close approximation.
- the terms can refer to a range of variation less than or equal to ± 10% of that numerical value, such as less than or equal to ± 5%, less than or equal to ± 4%, less than or equal to ± 3%, less than or equal to ± 2%, less than or equal to ± 1%, less than or equal to ± 0.5%, less than or equal to ± 0.1%, or less than or equal to ± 0.05%.
- two numerical values can be deemed to be “substantially” the same if a difference between the values is less than or equal to ⁇ 10% of an average of the values, such as less than or equal to ⁇ 5%, less than or equal to ⁇ 4%, less than or equal to ⁇ 3%, less than or equal to ⁇ 2%, less than or equal to ⁇ 1%, less than or equal to ⁇ 0.5%, less than or equal to ⁇ 0.1%, or less than or equal to ⁇ 0.05%.
Abstract
A system may include a container holder configured to (i) hold a container that houses a sample and (ii) rotate the container axially. A system may include a sensor having a sensing axis that passes through the container. A system may include a controller operatively coupled to the container holder and the sensor and configured to: control the container holder to axially rotate the container, control the sensor to capture sensor data of the container at a plurality of axial rotation angles, analyze the captured sensor data to determine a shape of a top surface of the sample; and based on the shape of the top surface, determine a sample volume.
Description
AUTOMATED DETECTION OF PACKED CELL VOLUME
FIELD OF THE DISCLOSURE
[0001] The present application generally relates to biopharmaceutical product manufacturing processes, and more specifically to the automatic detection of a packed cell volume (PCV) during a biopharmaceutical product manufacturing process and/or other centrifugal processes.
BACKGROUND
[0002] In the manufacture of certain biopharmaceutical products (e.g., biotherapeutic proteins), bioreactors are used to culture cells prior to harvesting a desired drug product. For some biopharmaceutical products, the harvesting occurs via a centrifugal process wherein the biopharmaceutical product is agitated to form a liquid drug product. Prior to performing the centrifugal harvesting process, a pellet of the biopharmaceutical product is disposed in a container (e.g., vials, cartridges, syringes, vessels, seals, etc.). The packed cell volume (PCV) of this pellet is measured to configure the centrifugal process. For example, the PCV may be used to configure the centrifuge bowl discharge interval for stainless steel centrifuges, or to calculate heavy phase (cell discharge/waste) and light phase (supernatant/product) split flowrate for single-use centrifuges. Accordingly, it is important to precisely determine the packed cell volume (PCV) of the pellet in the container because errors in measuring the PCV of the pellet may directly impact the yield of the manufacturing process and/or the operation of any post-filtration step of the manufacturing process.
[0003] Conventionally, the PCV of the pellet is determined by manually inspecting the container and comparing a height of the top surface of the pellet against measurement lines printed on a surface of the container. This manual process has a high degree of variability (as much as a 10-20% relative error) that impacts the accuracy of the PCV measurement. There are several sources for this error. First, the manual dilution and visual verification process is inherently subject to measurement errors associated with parallax. Second, the top surface of the pellet is generally not flat or orthogonal to the container axis. This makes accurate determination of the height of the top surface difficult. Third, the printed measurement lines on the side of the container have errors in their location (typically having a stated accuracy of 1%). Moreover, in many cases the printed measurement lines are not complete or have been worn off.
BRIEF SUMMARY
[0004] Systems and methods described herein generally use automated techniques for measuring the PCV of a pellet inside a container. For example, systems may include a sensor (e.g., an image sensor or a laser sensor) capable of automatically detecting a height of a front edge of the pellet when inside the container. The techniques may include using mechanical means to rotate the container to capture a sequence of heights of the top surface of the pellet. The techniques may then apply a fit algorithm to the sequence to define a shape of the top surface of the pellet. As a result, the disclosed techniques are able to model the actual geometry of the top surface of the pellet when calculating the PCV. Thus, the inaccuracies in the conventional techniques that assume a uniform top surface at the maximum height of the pellet are corrected. Additionally, in some embodiments, to reduce the reliance on container measurement lines, the techniques may include comparing the sensor data to model data of the container (e.g., manufacturer specifications of the dimensions of the container) in combination with the derived shape of the top surface of the pellet to determine the PCV of the pellet. As a result, by applying the PCV measurement
techniques disclosed herein, the accuracy of the PCV measurement is improved. Accordingly, when the more accurate PCV values obtained through the methods disclosed herein are used to configure the centrifugal harvesting process, the resulting process results in a higher yield. In some scenarios, the ability to accurately measure the PCV of a pellet eliminates the need to perform a manual dilution altogether.
[0005] In some aspects, the techniques described herein relate to an automated visual inspection system, including: a container holder configured to (i) hold a container that houses a sample and (ii) rotate the container axially; a sensor having a sensing axis that passes through the container; and a controller operatively coupled to the container holder and the sensor and configured to: (1) control the container holder to axially rotate the container; (2) control the sensor to capture sensor data of the container at a plurality of axial rotation angles; (3) analyze the captured sensor data to determine a shape of a top surface of the sample; and (4) determine a sample volume based on the shape of the top surface.
[0006] In some aspects, the techniques described herein relate to a method of automated analysis of a sample housed in a container, the method including: (1) controlling, via a controller, a container holder that is holding the container to axially rotate the container; (2) controlling, via the controller, a sensor to capture sensor data of the container at a plurality of axial rotation angles; (3) analyzing, via the controller, the captured sensor data to determine a shape of a top surface of the sample; and (4) determining a sample volume based on the shape of the top surface.
[0007] In some aspects, the techniques described herein relate to one or more non-transitory, computer-readable media storing instructions that, when executed by processing hardware of a controller, cause the controller to: (1) control a container holder that is holding the container to axially rotate the container; (2) control a sensor to capture sensor data of the container at a plurality of axial rotation angles; (3) analyze the captured sensor data to determine a shape of a top surface of the sample; and (4) determine a sample volume based on the shape of the top surface.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] The skilled artisan will understand that the figures described herein are included for purposes of illustration and are not limiting on the present disclosure. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the present disclosure. It is to be understood that, in some instances, various aspects of the described implementations may be shown exaggerated or enlarged to facilitate an understanding of the described implementations. In the drawings, like reference characters throughout the various drawings generally refer to functionally similar and/or structurally similar components.
[0009] FIGS. 1A and 1B are simplified block diagrams of example systems that may be used to automatically determine a PCV of a sample.
[0010] FIG. 2A illustrates example image data of a container that houses a pellet that was captured by a sensor at a plurality of rotational angles.
[0011] FIG. 2B shows a detailed view of the data extracted by the container evaluation application when analyzing the image data of FIG. 2A.
[0012] FIG. 3 is a plot of the heights of leading edges of a top surface of a pellet.
[0013] FIG. 4 depicts an example process by which the container evaluation application preprocesses a set of image data to align the image data.
[0014] FIG. 5 depicts an example process by which the container evaluation application projects measurement lines on a container.
[0015] FIG. 6 illustrates a flow chart of an example method for automated inspection of a container.
DETAILED DESCRIPTION
[0016] The various concepts introduced above and discussed in greater detail below may be implemented in any of numerous ways, and the described concepts are not limited to any particular manner of implementation. Examples of implementations are provided for illustrative purposes.
[0017] Although examples described herein generally refer to determining the PCV of a pellet, the techniques may be applied to other biopharmaceutical product manufacturing techniques. For example, in other embodiments the techniques may be adapted to determine a PCV in a platelet counting process or in a resin slurry concentration in a chromatography process.
[0018] FIG. 1A is a simplified block diagram of an example system 100 that may be used to automatically determine the PCV of a sample. The system 100 includes a container 105, a sensor 110, a holder 115a, a controller 130, and, in some embodiments, a light source 120.
[0019] The container 105 may be any suitable vessel, device or system that houses a pellet 107 (such as a sample or a biopharmaceutical product). For example, the container may be a vial, a cartridge, a syringe, a vessel, etc. The container 105 is at least partially transparent such that the pellet 107 can be sensed by the sensor 110 when housed inside the container 105. In some embodiments, the pellet 107 is a biopharmaceutical product that is about to undergo a centrifugal harvesting process. In these embodiments, the container 105 may be the same container that is used during the centrifugal harvesting process.
[0020] As illustrated, the sensor 110 is configured to have a sensing axis 111 oriented towards the container 105 and/or the pellet 107 housed therein. In some embodiments, the sensor 110 is an image sensor or camera configured to capture image data of the container 105. In these embodiments, the sensing axis 111 may also be referred to as the optical axis 111. As will be described in more detail below, the controller 130 may analyze the image data generated by the sensor 110 to detect a height of a top surface 108 of the pellet 107. Accordingly, it is preferable for the camera 110 to include a telecentric lens to avoid the optical aberrations associated with endocentric lenses when measuring the height of the top surface 108. To this end, the spatial uniformity associated with telecentric lenses substantially reduces the need to account for optical distortions when measuring the height of the top surface 108.
[0021] In embodiments in which the sensor 110 is an image sensor or camera, the system 100 may also include the light source 120. The light source 120 may be a light emitting diode (LED) light source configured to have an illumination axis 121 that reflects off the top surface 108 of the pellet 107. In some embodiments, the light source 120 provides uniform backlighting of the container 105 to improve scene uniformity when analyzing the captured image data. Additionally, in some embodiments, the light source 120 may emit light in the infrared band to improve the ability to distinguish a front edge of the top surface 108 from other portions of the top surface 108. This improves the ability of the controller 130 to accurately determine the height of the top of the top surface 108 when analyzing the image data.
[0022] In other embodiments, sensor 110 is a laser depth sensor. For example, the laser sensor 110 may be a laser dot sensor configured to sense depth at a particular point along the sensing axis 111 or a laser line sensor configured to sense a plurality of depths along a plane defined by the sensing axis. Although FIG. 1A depicts the sensor 110 as being horizontally
disposed relative to the container 105, in some embodiments in which the sensor 110 is a laser sensor the sensor 110 may be disposed at other angles with respect to the container 105.
[0023] The system 100 also includes a holder 115a configured to hold a top portion of the container 105, a stopper inserted in the container 105, a cap placed over the container 105, etc. The holder 115a may include clamps and/or other holding elements configured to securely hold the container 105 at a controllable pose. For example, the holder 115a may be configured to axially rotate the container 105 about a vertical axis such that a full sweep of the pellet 107 passes through the sensing axis 111. Because the shape of the top surface 108 is typically not uniform, this enables the controller 130 to accurately model the actual shape of the top surface.
[0024] In some embodiments, the holder 115a and/or the sensor 110 are arranged such that the sensing axis 111 is offset from a transverse axis of the container 105. For example, the holder 115a may be configured to hold the container 105 at an offset angle (e.g., 5°, 10°, 15°) with respect to the vertical axis. Similarly, the sensor 110 may be configured to have the sensing axis 111 at an offset angle (e.g., 5°, 10°, 15°) with respect to the container 105, for example, by elevating the sensor 110 above the top surface 108 and tilting the sensing axis 111 downwards. By holding the container 105 at an offset angle, the illumination axis 121 may be incident on a greater portion of the top surface 108, improving the contrast between the top surface 108 and the side portions of the pellet 107. This contrast improves the ability of the controller 130 to identify the height of the top surface 108, thereby improving the accuracy of the height measurement. While FIG. 1A depicts an embodiment where the holder 115a holds an upper portion of the container 105, in other embodiments, a holder 115a or 115b may instead hold a lower portion of the container 105.
[0025] With reference to FIG. 1B, illustrated is a simplified block diagram of an example system 150 in which the holder 115b holds a lower portion of the container 105. In the example system 150, the holder 115b may be configured to receive the container 105 when an operator places the container 105 in a cavity. As a result, the operator may not need to operate a clamping mechanism that many types of holders 115a include. This may increase the speed at which the multiple containers 105 can be analyzed via the automated inspection techniques disclosed herein. It should be appreciated that, in this embodiment, the bottom of the container 105 is not visible to the sensor 110. Accordingly, in these embodiments, the controller 130 may obtain a model of the container 105 that indicates the dimensions of the container 105 to determine a depth within the holder 115b at which the bottom of the container 105 is held. Accordingly, the controller 130 may utilize this depth information when calculating the height of the top surface 108.
[0026] In either embodiment, the controller 130 may be operatively coupled to the sensor 110, the holder 115a or 115b, and, in some embodiments, the light source 120. The controller 130 may be a server, a desktop computer, a laptop computer, a tablet device, a dedicated control device, or any other suitable type of computing device or devices. In the example embodiment shown in FIGS. 1A and 1B, the controller 130 includes processing hardware 132, a network interface 134, a display device 136, a user input device 137, and a memory unit 138. In some embodiments, however, the controller 130 includes two or more computers that are either co-located or remote from each other. In these distributed embodiments, the operations described herein relating to the processing hardware 132, the network interface 134, and/or the memory unit 138 may be divided among multiple processing units, network interfaces, and/or memory units, respectively.
[0027] The processing hardware 132 includes one or more processors, each of which may be a programmable microprocessor that executes software instructions stored in the memory unit 138 to execute some or all of the functions of the
controller 130 as described herein. Alternatively, some of the processors in the processing hardware 132 may be other types of processors (e.g., application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), etc.), and some of the functionality of the controller 130 as described herein may instead be implemented, in part or in whole, by such hardware. The memory unit 138 may include one or more physical memory devices or units containing volatile and/or non-volatile memory. Any suitable memory type or types may be used, such as read-only memory (ROM), solid-state drives (SSDs), hard disk drives (HDDs), and so on.
[0028] The network interface 134 may include any suitable hardware (e.g., front-end transmitter and receiver hardware), firmware, and/or software configured to communicate via one or more communication networks and/or using one or more communication protocols. For example, the network interface 134 may be or include an Ethernet interface and/or a serial interface via which the controller 130 controls operation of the sensor 110, the holder 115a or 115b, and/or the light source 120.
[0029] The display device 136 may use any suitable display technology (e.g., LED, OLED, LCD, etc.) to present information to a user, and the user input device 137 may be a keyboard or other suitable input device. In some embodiments, the display device 136 and the user input device 137 are integrated within a single device (e.g., a touchscreen display). Generally, the display device 136 and the user input device 137 may jointly enable a user to interact with graphical user interfaces (GUIs) provided by the controller 130, e.g., for purposes such as determining characteristics of the container 105 and/or pellet 107 housed therein (such as the packed cell volume (PCV) of the pellet 107). In some embodiments, however, the controller 130 does not include the display device 136 and/or the user input device 137.
[0030] The memory unit 138 stores non-transitory instructions of one or more software applications, including a container evaluation application (not depicted). The container evaluation application, when executed by the processing hardware 132, is generally configured to communicate with the sensor 110, the holder 115a or 115b, and/or the light source 120 to analytically determine characteristics of the container 105 and/or the pellet 107 housed therein. For example, in some embodiments, the container evaluation application may be configured to cause the holder 115a or 115b to rotate the container 105 at a fixed rate. In this example, the container evaluation application may be configured to control the sensor 110 to capture sensor data associated with container 105 at fixed intervals (e.g., every 10° of rotation, every 20° of rotation, every 30° of rotation, every 60° of rotation). As another example, to reduce motion blur, the container evaluation application may instead be configured to control the holder 115a or 115b to rotate the container 105 in discrete intervals (e.g., 10° intervals, 20° intervals, 30° intervals, 60° intervals). In this example, the container evaluation application may be configured to cause the sensor 110 to capture sensor data of the container 105 after the holder 115a or 115b has completed the rotation interval. In embodiments where the sensor 110 is an image sensor or camera, the container evaluation application may first control the light source 120 to emit illumination light before controlling the holder 115a or 115b and the sensor 110. Regardless, the container evaluation application may compile the sets of sensor data captured at each interval into a sequence of sensor data.
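The discrete-interval rotate-then-capture loop described above might look like the following; the `rotate_to` and `capture` method names are hypothetical placeholders for the holder and sensor interfaces, which the disclosure does not specify:

```python
def capture_rotation_sweep(holder, sensor, interval_deg=60):
    """Rotate the container in discrete intervals and capture one set
    of sensor data after each rotation completes (reducing motion blur
    relative to capturing during continuous rotation).

    holder: object with a rotate_to(angle_deg) method (hypothetical).
    sensor: object with a capture() method (hypothetical).
    Returns a sequence of (angle, sensor_data) pairs covering one full
    axial rotation.
    """
    frames = []
    for angle in range(0, 360, interval_deg):
        holder.rotate_to(angle)          # wait for the rotation to complete
        frames.append((angle, sensor.capture()))
    return frames
```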
[0031] In addition to the container evaluation application, the memory unit 138 may be configured to store model data representative of the dimensions of the container 105. For example, the model data may be obtained from manufacturer documentation provided by the manufacturer and/or provider of the container 105. As another example, the model data may be a three-dimensional scan and/or model of the container that includes the dimensional information associated with the container 105. The container evaluation application may be configured to access the model data to determine the characteristics of the
container 105 and/or the pellet 107. For example, after determining a shape of the top surface 108, the container evaluation application may utilize the model of the container 105 to determine an inner circumference of the lower portion of the pellet 107. It should be appreciated that many containers 105 are not cylindrical throughout their entire length. Accordingly, the model data may model the different dimensions of the container 105 throughout the length of the container 105.
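Since many containers are not cylindrical throughout their length, a model-data lookup may be sketched as a piecewise-linear table of inner diameter versus height. The `(height, diameter)` pair format and the function name are assumptions for illustration only.

```python
import bisect

def inner_diameter_at(model, height_mm):
    """Look up the container's inner diameter at a given height using
    piecewise model data: a list of (height_mm, diameter_mm) pairs
    sorted by height, linearly interpolated between entries and
    clamped at the ends.
    """
    heights = [h for h, _ in model]
    i = bisect.bisect_right(heights, height_mm)
    if i == 0:
        return model[0][1]   # below the modeled range
    if i == len(model):
        return model[-1][1]  # above the modeled range
    (h0, d0), (h1, d1) = model[i - 1], model[i]
    t = (height_mm - h0) / (h1 - h0)
    return d0 + t * (d1 - d0)
```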
[0032] Turning to FIGS. 2A and 2B, illustrated are example image data 212, captured by a sensor (such as the sensor 110), of a container 205 (such as the container 105) that houses a pellet 207 (such as the pellet 107). As described above, the sensor may be configured to capture the image data 212 at the direction of a container evaluation application as the container evaluation application controls a holder (such as the holders 115a, 115b) to rotate the container 205. In the illustrated example, the container evaluation application configured the sensor to capture seven sets of image data 212a-g at intervals of 60° of axial rotation. As shown in FIG. 2A, the height of the front edge of the top surface 208 of the pellet 207 varies at different degrees of rotation.
[0033] FIG. 2B shows a detailed view of the data extracted by the container evaluation application when analyzing the image data 212a. In particular, FIG. 2B depicts the point 213 that represents the front edge of the top surface 208 of the pellet 207. The container evaluation application may determine the lateral position of the point 213 by defining a center axis along the lateral midpoint of the image data representative of the container 205. The container evaluation application may determine the vertical position of the point 213 by detecting a shift in intensity in the image data along the center axis between the darker side of the pellet 207 and the illuminated top surface 208 of the pellet 207.
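A minimal sketch of the vertical-position detection, assuming a grayscale image: the center-axis column is scanned for the row with the largest intensity change, which marks the transition between the darker pellet side and the illuminated top surface. The function name is hypothetical.

```python
import numpy as np

def find_front_edge(image):
    """Locate the front-edge point of the pellet's top surface along
    the container's center axis. Takes the lateral-midpoint column and
    returns the row index of the largest intensity shift, i.e., the
    transition between the dark pellet side and the lit top surface.
    """
    col = image[:, image.shape[1] // 2].astype(float)  # center-axis column
    jumps = np.abs(np.diff(col))                       # intensity shift per row
    return int(np.argmax(jumps))                       # row of the sharpest shift
```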
[0034] The container evaluation application may then determine a height of the point 213. Generally, there are two options for determining the height of the point 213 based on image data: (1) analyzing the pixel height of the point 213, or (2) comparing the height of the point 213 to a measurement line on the container 205. It should be appreciated that the pixel height techniques may remove error in PCV calculations introduced by the manufacturing tolerances of the container 205.
[0035] For the pixel height techniques, in embodiments in which the holder holds a top portion of the container 205 (such as shown in FIG. 1A), the container evaluation application may calculate the height based on a number of pixels from the bottom of the container 205. In embodiments in which the holder holds a bottom portion of the container 205 (such as shown in FIG. 1B), the container evaluation application may calculate the height based on a number of pixels from the predetermined depth of the bottom of the container 205. For the measurement line techniques, the container evaluation application may compare the height of the point 213 to the bounding measurement lines of the container 205 to derive a height value. Regardless of technique, the container evaluation application may repeat this process for each of the sets of image data 212b-g and generate a sequence that associates the degrees of axial rotation and the corresponding heights of the top surface 208.
[0036] It should be appreciated that in alternate embodiments where the sensor is a laser sensor vertically disposed above the container 205, the container evaluation application may obtain a depth value from the sensor indicative of a distance from the sensor to the top surface 208. In these embodiments, the depth of the bottom surface of the pellet may be known based on physical measurements of the system configuration and the model data of the container 205. Accordingly, in these embodiments, the height of the pellet 207 may be determined by subtracting the obtained sensor depth value for a particular edge of the top surface 208 from the predetermined depth of the bottom surface of the pellet 207.
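In the laser-sensor embodiments, the height computation reduces to a subtraction; a sketch with hypothetical names, where both distances are in the sensor's units:

```python
def pellet_height_from_depth(sensor_to_surface, sensor_to_bottom):
    """With a laser sensor disposed vertically above the container,
    the pellet height at a measured edge is the known sensor-to-bottom
    distance (from the system geometry and the container model) minus
    the measured sensor-to-top-surface distance.
    """
    return sensor_to_bottom - sensor_to_surface
```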
[0037] Turning to FIG. 3, illustrated is a plot 300 of the heights of leading edges of a top surface (such as the top surfaces 108, 208) of a pellet (such as the pellets 107, 207). It should be appreciated that unlike the scenario illustrated in FIG. 2A, the image data was captured at 10° intervals instead of 60° intervals. Additionally, while FIG. 3 illustrates that the height is measured in terms of pixel height, in embodiments where laser depth or container measurement lines are used, the height may utilize other units of measurement. Regardless of the unit of measurement, the container evaluation application may apply interpolation techniques (such as linear interpolation) to connect the sampled height values included in the sequence.
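The interpolation step may be sketched with NumPy's `np.interp`, wrapping the samples at 360° so the profile closes on itself; the function and parameter names are illustrative only.

```python
import numpy as np

def interpolated_height_profile(angles_deg, heights, step=1.0):
    """Linearly interpolate sampled (angle, height) pairs into a dense
    height-versus-rotation profile, appending the first sample at 360°
    so the interpolation wraps around the full rotation.
    """
    a = np.asarray(angles_deg, dtype=float)
    h = np.asarray(heights, dtype=float)
    a = np.append(a, a[0] + 360.0)  # close the curve at a full turn
    h = np.append(h, h[0])
    dense = np.arange(0.0, 360.0, step)
    return dense, np.interp(dense, a, h)
```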
[0038] The container evaluation application may then determine the PCV of the pellet by identifying the minimum of the plot 300 to segment the volume of the pellet into a body having a flat surface up to the height of the minimum and a body having a top surface of the defined shape of the plot 300. To determine the volume of the bottom body, the container evaluation application may analyze the model and/or the measurement lines on the container to determine the volume associated with the minimum height. For the upper body, the container evaluation application may calculate an area under the curve of the plot 300 and combine the area and model data to determine a volume of the upper body. The container evaluation application then adds the two volumes to determine the PCV of the pellet. It should be appreciated that this technique may not resolve any contours along the radial axis of the top surface. Accordingly, this technique may be understood as defining a minimum energy surface that connects a shape defined by the outer edges measured by the sensor.
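A simplified sketch of the volume split described above, under the assumption of a cylindrical region with constant cross-section (the full calculation would consult the container model for varying dimensions): the lower body is a flat-topped cylinder up to the minimum edge height, and the upper body's volume is proportional to the mean excess height over that minimum, which corresponds to the area under the curve of plot 300. The names are hypothetical.

```python
import numpy as np

def pellet_volume(heights, cross_section_area):
    """Estimate pellet volume by splitting at the minimum edge height:
    lower body = flat-topped cylinder up to the minimum; upper body =
    cross-section area times the mean excess height over the minimum
    (exact for a planar, tilted top surface in a cylinder).
    """
    h = np.asarray(heights, dtype=float)
    h_min = h.min()
    lower = cross_section_area * h_min                 # flat-topped lower body
    upper = cross_section_area * float(np.mean(h - h_min))  # area-under-curve term
    return lower + upper
```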
[0039] To simplify the calculations, the container evaluation application may alternatively assume that the top surface of the pellet is generally shaped like a tilted oval. Additionally, based on experimental testing, the tilted oval shape is the most common shape of a pellet housed in a container. Accordingly, in these embodiments, the container evaluation application may define the shape of the tilted oval (and thus the upper surface of the pellet) by identifying the maximum and minimum of the plot 300 and the inner circumference of the container.
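Under the tilted-oval assumption, the top surface is a plane cutting the cylindrical container, so its parameters follow directly from the extremes of the plot and the inner diameter. A sketch with hypothetical names:

```python
import math

def tilted_oval_params(h_max, h_min, inner_diameter):
    """Define the tilted-oval top surface from the maximum and minimum
    edge heights and the container's inner diameter: the mean height
    is the center of the oval, and the tilt angle follows from the
    height difference across the diameter.
    """
    center_height = (h_max + h_min) / 2.0
    tilt_rad = math.atan2(h_max - h_min, inner_diameter)  # plane tilt vs horizontal
    return center_height, tilt_rad
```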
[0040] As described with respect to FIG. 1A, in some embodiments, the holder 115a may be configured to hold the container 105 at an offset angle to improve the illumination of the top surface 108 of the pellet 107. Additionally, in some systems, the coupling between the holder 115a and the container 105 may not result in a perfectly vertical alignment. Accordingly, as the holder 115a axially rotates the container 105, the position of the container 105 shifts between sets of image data. In these embodiments, prior to performing the analyses as illustrated in FIGS. 2A, 2B, and 3, the container evaluation application may first digitally align the image data to ensure that the height calculation is consistent across each set of image data.
[0041] FIG. 4 depicts an example process 400 by which the container evaluation application preprocesses a set of image data 412 (such as the image data 212) to align the image data 412 with a vertical axis prior to analyzing the image data to determine the height of a pellet 407 (such as the pellets 107, 207) in a container 405 (such as the containers 105, 205). While FIG. 4 illustrates the process for aligning image data in a system in which a holder 415 (such as the holder 115a) holds the container 405 via a top portion, similar techniques may be applied to align image data in a system in which a container is slanted while resting in a holder that holds a bottom portion of the container 405 (such as the holder 115b).
[0042] Starting with image (a), represented is the image data 412 as it was captured by an image sensor or camera (such as the sensor 110). A center line 414 is superimposed on the image data 412 to illustrate that the holder 415 is holding the container 405 at an angle. Turning to image (b), an edge finding technique is applied to identify an edge 416 of the container 405. More particularly, the edge finding technique is applied to identify a pair of edges 416a, 416b that are vertically aligned and
equidistant from the center line 414 when the image data 412 is properly aligned. In the illustrated example, a rake edge finding technique is implemented. That said, in other embodiments, other edge finding techniques may be applied.
[0043] Turning to image (c), the edges 416 are used to define the lateral and rotational deviation of the container 405 in the image data 412. For example, the container evaluation application may be able to define the actual center line 417 of the container 405 using the detected edges 416 and the knowledge that (i) the edges 416a, 416b are equidistant to the center line 417 and (ii) the center line 417 should be vertically aligned. Based on the difference between the actual center line 417 and the superimposed center line 414, the container evaluation application may determine the rotational and lateral deviation of the container in the image data 412. Using the rotational and lateral deviation values, the container evaluation application is able to derive an offset vector by which the pixels of the image data 412 are translated to produce the aligned image data 422. Because the aligned image data 422 is laterally centered and vertically aligned, when the container evaluation application applies similar techniques to each set of image data obtained from the image sensor, the height values of the pellet 407 are all defined with respect to the same perspective. This ensures that any deviations in how the container 405 is coupled to the holder 415 do not impact the accuracy of the PCV calculation.
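The deviation computation may be sketched from two midpoints of the detected edge pair (one near the top of the image, one near the bottom), each midpoint lying on the actual center line 417. The `(x, y)` point format and sign conventions are assumptions for illustration.

```python
import math

def alignment_offsets(center_top, center_bottom, image_width):
    """Derive the rotational and lateral deviation of the container
    from two points on its detected center line. Returns the rotation
    from vertical (radians) and the lateral shift (pixels) needed to
    move the center line onto the image midline.
    """
    (x0, y0), (x1, y1) = center_top, center_bottom
    rotation = math.atan2(x1 - x0, y1 - y0)         # tilt of center line vs vertical
    lateral = image_width / 2.0 - (x0 + x1) / 2.0   # shift to the image midline
    return rotation, lateral
```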
[0044] As described above, the measurement lines on containers fade over time. Thus, in embodiments that rely on the measurement lines in determining the PCV of the pellet, the measurement lines might not be visible to the image sensor across all degrees of rotation. Accordingly, in some embodiments, the container evaluation application may apply techniques to enhance and superimpose the measurement lines onto image data to facilitate image analysis techniques that rely upon said measurement lines.
[0045] Turning to FIG. 5, illustrated is an example process 500 by which the container evaluation application virtually superimposes measurement lines on a container 505 (such as the containers 105, 205, 405). As shown in image (a), the measurement lines 519 of the container have faded, making them difficult to use as a baseline for measuring volume. It should be appreciated that while image (a) depicts the container 505 as having a pellet housed therein, in other embodiments, the process 500 may be applied before the pellet is housed inside the container 505. Turning to image (b), illustrated is the extracted region of interest where the measurement lines have faded. It should be appreciated that in some embodiments, the container evaluation application captures images (a) of the container 505 at a plurality of different rotation angles and extracts a corresponding image (b) of the region of interest from each image (a).
[0046] Turning to image (c), the container evaluation application combines the extracted regions of interest to produce a minimum intensity projection (MIP) of the region of interest. In particular, the MIP process assigns each pixel and/or row of the image data the minimum value (i.e., the darkest value) such that any gaps in the printed measurement lines are filled in. Accordingly, as shown in image (d), when the MIP techniques are applied to fill in the faded measurement lines 519, the container evaluation application can readily identify the measurement lines for use in determining the PCV of the pellet. It should be appreciated that the container evaluation application may apply optical character recognition or other similar techniques to detect the presence of the numbers on the container 505 and exclude those pixels from the MIP analysis to prevent the presence of the numbers from generating wide bands for each measurement line.
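The per-pixel minimum intensity projection may be sketched with NumPy; the function name is illustrative.

```python
import numpy as np

def minimum_intensity_projection(rois):
    """Combine regions of interest extracted at different rotation
    angles into a minimum intensity projection: each pixel keeps its
    darkest value across the stack, so a gap in the faded printed
    lines at one angle is filled by intact ink at another angle.
    """
    stack = np.stack([np.asarray(r) for r in rois])
    return stack.min(axis=0)
```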
[0047] It should be appreciated that in some embodiments, the pellet is the same color as the measurement lines 519, making the lower bounding measurement line difficult to detect. In these embodiments, the container evaluation application may extrapolate
and/or project the measurement lines downwards into the region obscured by the pellet. For example, the container evaluation application may determine a pixelwise spacing between identified measurement lines 529a and 529b and extrapolate an additional measurement line below measurement line 529b at a distance matching the pixelwise spacing. In some embodiments, the container 505 is a pseudo-cylindrical container in which the inner diameter varies at different heights along a length of the container 505. In these embodiments, the measurement lines on the container 505 may be non-uniformly spaced due to variations in inner diameter along the length. Accordingly, the container evaluation application may analyze the model of the container 505 to account for the non-uniform spacing between the measurement lines. As a result, the container evaluation application is still able to rely on the measurement lines 519 to determine the PCV of the pellet despite the pellet obscuring the measurement lines 519 in the image data.
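The downward extrapolation may be sketched as follows. The optional diameter arguments illustrate one way to account for non-uniform spacing from the container model (an assumption, not a method specified in this disclosure): equal-volume graduations are spaced inversely to the cross-sectional area, i.e., inversely to the inner diameter squared.

```python
def extrapolate_line_below(y_a, y_b, d_ab=None, d_bc=None):
    """Project the next measurement line below line b from the pixel
    spacing between identified lines a and b (pixel y grows downward).
    If mean inner diameters are given for the a-b span (d_ab) and the
    span below b (d_bc), scale the spacing by (d_ab / d_bc)**2, since
    a narrower bore spreads equal volumes over a taller span.
    """
    spacing = y_b - y_a  # pixelwise spacing between known lines
    if d_ab is not None and d_bc is not None:
        spacing *= (d_ab / d_bc) ** 2  # correct for varying inner diameter
    return y_b + spacing
```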
[0048] Alternatively, the container evaluation application may extract the correspondence between pixel height and measurement line for the container 505 using an empty container to store a pixel height associated with each of the printed measurement lines in a memory of a controller (such as the controller 130). Accordingly, when the container evaluation application analyzes image data where the printed measurement lines 519 on the container 505 are obscured by the pellet, the container evaluation application can compare the pixel height of the pellet to the stored correspondences when determining the PCV of the pellet. In these embodiments, the container evaluation application may identify a particular feature of the container 505 (e.g., a height of the top of the container) to align the stored pixel heights associated with the printed measurement lines with the image data being analyzed. In some embodiments where the container evaluation application is configured to display the sensor data representative of the container 505, the container evaluation application may overlay indications of the extrapolated and/or stored measurement lines onto the displayed image of the container for manual verification.
[0049] While the foregoing techniques describe the process of determining the PCV of a pellet in a container, similar techniques can be used to determine a volume of a liquid also included in the container. To this end, the container evaluation application may be configured to detect a meniscus height of the liquid in the container. Accordingly, using the disclosed pixelwise or measurement line volume determination techniques, the container evaluation application is able to determine a volume associated with the height of a bottom edge of the meniscus. The container evaluation application may then subtract the PCV of the pellet from the volume associated with the bottom edge of the meniscus to determine the volume of the liquid in the container. In some embodiments, the container evaluation application may additionally determine the volume of liquid in the meniscus region based on a surface tension of the liquid and a contact angle between the liquid and a wall of the container and add the meniscus region volume to the volume associated with the bottom edge of the meniscus before subtracting the PCV of the pellet.
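The liquid-volume arithmetic described above reduces to a simple combination; a sketch with hypothetical names, where the meniscus-region term defaults to zero when it is not separately estimated:

```python
def liquid_volume(volume_at_meniscus_base, pellet_pcv, meniscus_volume=0.0):
    """Liquid volume = container volume at the bottom edge of the
    meniscus, plus any separately estimated volume held in the
    meniscus region (from surface tension and contact angle), minus
    the packed cell volume of the pellet.
    """
    return volume_at_meniscus_base + meniscus_volume - pellet_pcv
```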
[0050] Turning to FIG. 6, illustrated is a flow chart of an example method 600 for automated inspection of a sample (such as the pellets 107, 207, 407) in a container (such as the containers 105, 205, 405, 505). The method 600 may be performed by one or more processors (such as the processing hardware 132) of a controller (such as the controller 130) executing a container evaluation application stored in a memory unit (such as the memory unit 138). The controller may be operatively coupled to a container holder (such as the holders 115a, 115b, 415) that holds the container and is configured to be able to rotate the container axially at the direction of the controller.
[0051] The controller may also be connected to a sensor (such as the sensor 110) configured to have a sensing axis (such as the sensing axis 111) that passes through the container. In some embodiments, the sensor is one of a laser line profiler sensor and a laser displacement sensor configured to sense a distance between the sensor and the top surface (such as the top surfaces 108, 208, 408) of the sample. In other embodiments, the sensor is an image sensor which, in some embodiments, includes a telecentric lens. In these embodiments, the controller may also be coupled to a light source oriented to emit illumination toward the image sensor such that the emitted illumination reflects off the top surface of the sample. To improve the reflectivity of the illumination, in some embodiments, the container holder and the sensor are configured such that the sensing axis is offset from a transverse axis of the container by an offset angle.
[0052] The method begins at block 602 when the controller controls the container holder to axially rotate the container. At block 604, the controller controls the sensor to capture sensor data of the container at a plurality of axial rotation angles.
[0053] At block 606, the controller analyzes the captured sensor data to determine a shape of a top surface of the sample. In embodiments where the container is held at an offset angle, the controller may preprocess the captured sensor data by vertically and/or laterally aligning the captured sensor data (such as by applying the techniques associated with the process 400). To determine the shape of the top surface, the controller may determine a height of a front edge of the top surface of the sample at the plurality of axial rotation angles and define the shape by applying a fit algorithm to the determined heights. For example, the fit algorithm may be a minimum energy surface algorithm (such as by applying the techniques described with respect to the plot 300) or a fit algorithm that fits a tilted oval shape to the determined heights.
[0054] At block 608, based on the shape of the top surface, the controller determines a sample volume (such as a PCV). In some embodiments, the controller compares the sensor data and the shape of the top surface to a model of the container. For example, the controller may identify a pixel height of the top surface and use the pixel heights and inner diameter and/or circumference data to determine the volume of the sample. In other embodiments, the controller may compare the sensor data and the shape of the top surface to a measurement scale (such as the measurement lines 519) included on a surface of the container. In these embodiments, to improve the ability to detect the measurement lines, the controller may be configured to (i) control the sensor to capture sensor data of the container at a plurality of axial rotation angles; (ii) generate a minimum intensity projection (MIP) of the measurement scale based on the captured sensor data; and (iii) extrapolate the measurement scale into a region associated with the sample (such as by applying the techniques described with respect to the process 500). Alternatively, the controller may be configured to (i) obtain a stored correspondence between printed measurement lines of the measurement scale and corresponding pixel heights and (ii) align the measurement scale with a region associated with the sample.
[0055] The sample volume may then be used for other processes in the biopharmaceutical manufacturing process. For example, the sample volume may be used to determine a volume of a liquid included in the container. As another example, the sample volume may be used to configure a harvesting process for the sample (such as a centrifugal harvesting process). In some embodiments, the controller is operatively coupled to a controller of the centrifuge and provides the sample volume thereto to assist in automatic configuration of the centrifugal process. In other embodiments, the controller outputs the sample volume for a clinician to configure the centrifugal process. In particular, the sample volume may be used to configure the centrifuge bowl discharge interval for stainless steel centrifuges, or the heavy phase (cell discharge/waste) and light phase (supernatant/product) split flow rate for single-use centrifuges.
[0056] It should be appreciated that while the foregoing describes the application of the automated PCV measurement to centrifugal biopharmaceutical product manufacturing processes, the disclosed techniques may be applied to improve other centrifugal processes. For example, the disclosed techniques may be applied to measure the volume of blood components in a centrifugal blood component analysis process.
[0057] Additional considerations pertaining to this disclosure will now be addressed.
[0058] Some of the figures described herein illustrate example block diagrams having one or more functional components. It will be understood that such block diagrams are for illustrative purposes and the devices described and shown may have additional, fewer, or alternate components than those illustrated. Additionally, in various embodiments, the components (as well as the functionality provided by the respective components) may be associated with or otherwise integrated as part of any suitable components.
[0059] Embodiments of the disclosure relate to a non-transitory computer-readable storage medium having computer code thereon for performing various computer-implemented operations. The term “computer-readable storage medium” is used herein to include any medium that is capable of storing or encoding a sequence of instructions or computer codes for performing the operations, methodologies, and techniques described herein. The media and computer code may be those specially designed and constructed for the purposes of the embodiments of the disclosure, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable storage media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and execute program code, such as ASICs, programmable logic devices (“PLDs”), and ROM and RAM devices.
[0060] Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter or a compiler. For example, an embodiment of the disclosure may be implemented using Java, C++, or other object-oriented programming language and development tools. Additional examples of computer code include encrypted code and compressed code. Moreover, an embodiment of the disclosure may be downloaded as a computer program product, which may be transferred from a remote computer (e.g., a server computer) to a requesting computer (e.g., a client computer or a different server computer) via a transmission channel. Another embodiment of the disclosure may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions.
[0061] As used herein, the singular terms “a,” “an,” and “the” may include plural referents, unless the context clearly dictates otherwise.
[0062] As used herein, the terms “approximately,” “substantially,” “substantial” and “about” are used to describe and account for small variations. When used in conjunction with an event or circumstance, the terms can refer to instances in which the event or circumstance occurs precisely as well as instances in which the event or circumstance occurs to a close approximation. For example, when used in conjunction with a numerical value, the terms can refer to a range of variation less than or equal to ±10% of that numerical value, such as less than or equal to ±5%, less than or equal to ±4%, less than or equal to ±3%, less than or equal to ±2%, less than or equal to ±1%, less than or equal to ±0.5%, less than or equal to ±0.1%, or less than or equal to ±0.05%. For example, two numerical values can be deemed to be “substantially” the same if a difference between the values is less than or equal to ±10% of an average of the values, such as less than or equal to ±5%, less than or equal to ±4%, less than or equal to ±3%, less than or equal to ±2%, less than or equal to ±1%, less than or equal to ±0.5%, less than or equal to ±0.1%, or less than or equal to ±0.05%.
[0063] Additionally, amounts, ratios, and other numerical values are sometimes presented herein in a range format. It is to be understood that such range format is used for convenience and brevity and should be understood flexibly to include not only the numerical values explicitly specified as the limits of a range, but also all individual numerical values and sub-ranges encompassed within that range as if each numerical value and sub-range were explicitly specified.
[0064] While the present disclosure has been described and illustrated with reference to specific embodiments thereof, these descriptions and illustrations do not limit the present disclosure. It should be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the true spirit and scope of the present disclosure as defined by the appended claims. The illustrations are not necessarily drawn to scale. There may be distinctions between the artistic renditions in the present disclosure and the actual apparatus due to manufacturing processes, tolerances and/or other reasons. There may be other embodiments of the present disclosure which are not specifically illustrated. The specification (other than the claims) and drawings are to be regarded as illustrative rather than restrictive. Modifications may be made to adapt a particular situation, material, composition of matter, technique, or process to the objective, spirit and scope of the present disclosure. All such modifications are intended to be within the scope of the claims appended hereto. While the techniques disclosed herein have been described with reference to particular operations performed in a particular order, it will be understood that these operations may be combined, sub-divided, or re-ordered to form an equivalent technique without departing from the teachings of the present disclosure. Accordingly, unless specifically indicated herein, the order and grouping of the operations are not limitations of the present disclosure.
Claims
1. An automated visual inspection system, comprising: a container holder configured to (i) hold a container that houses a sample and (ii) rotate the container axially; a sensor having a sensing axis that passes through the container; and a controller operatively coupled to the container holder and the sensor and configured to: control the container holder to axially rotate the container; control the sensor to capture sensor data of the container at a plurality of axial rotation angles; analyze the captured sensor data to determine a shape of a top surface of the sample; and determine a sample volume based on the shape of the top surface.
2. The automated visual inspection system of claim 1, wherein the sensor is one of a laser line profiler sensor and a laser displacement sensor configured to sense a distance between the sensor and the top surface of the sample.
3. The automated visual inspection system of claim 1, wherein the sensor is an image sensor, and the automated visual inspection system further comprises: a light source oriented to emit illumination toward the image sensor such that the emitted illumination reflects off the top surface of the sample.
4. The automated visual inspection system of claim 3, wherein the image sensor further comprises a telecentric lens.
5. The automated visual inspection system of any one of claims 3 or 4, wherein the container holder and the sensor are configured such that the sensing axis is offset from a transverse axis of the container.
6. The automated visual inspection system of claim 5, wherein the controller is configured to: process the captured sensor data to vertically align the captured sensor data.
7. The automated visual inspection system of any one of claims 5 or 6, wherein the controller is configured to: process the captured sensor data to laterally align the captured sensor data.
8. The automated visual inspection system of any one of claims 1-7, wherein to determine a shape of a top surface of the sample, the controller is configured to: determine a height of a front edge of the top surface of the sample at the plurality of axial rotation angles; and define the shape by applying a fit algorithm to the determined heights.
9. The automated visual inspection system of claim 8, wherein the fit algorithm fits a tilted oval shape to the determined heights.
10. The automated visual inspection system of claim 8, wherein the fit algorithm is a minimum energy surface algorithm.
11. The automated visual inspection system of any one of claims 1-10, wherein to determine the sample volume, the controller is configured to: compare the sensor data and the shape of the top surface to a model of the container.
12. The automated visual inspection system of any one of claims 1-10, wherein to determine the sample volume, the controller is configured to: compare the sensor data and the shape of the top surface to a measurement scale included on a surface of the container.
13. The automated visual inspection system of claim 12, wherein the controller is configured to: control the sensor to capture sensor data of the container at a plurality of axial rotation angles; generate a minimum intensity projection (MIP) of the measurement scale based on the captured sensor data; and extrapolate the measurement scale into a region associated with the sample.
14. The automated visual inspection system of claim 12, wherein the controller is configured to: obtain a stored correspondence between printed measurement lines of the measurement scale and corresponding pixel heights; and align the measurement scale with a region associated with the sample.
15. The automated visual inspection system of any one of claims 1-14, wherein the sample volume is applied to configure a harvesting process for the sample.
16. A method of automated analysis of a sample housed in a container, the method comprising: controlling, via a controller, a container holder that is holding the container to axially rotate the container; controlling, via the controller, a sensor to capture sensor data of the container at a plurality of axial rotation angles; analyzing, via the controller, the captured sensor data to determine a shape of a top surface of the sample; and determining, via the controller, a sample volume based on the shape of the top surface.
17 The method of claim 16, wherein the sensor is an image sensor, and the method further comprises: controlling, via the controller, a light source to emit illumination toward the image sensor such that the emitted illumination reflects off the top surface of the sample.
18 The method of any one of claims 16 or 17, further comprising:
configuring the container holder to hold the container at an offset angle with respect to an axis orthogonal to a sensing axis of the sensor.
19. The method of claim 18, further comprising: processing, via the controller, the captured sensor data to vertically align the captured sensor data.
20. The method of any one of claims 18 or 19, further comprising: processing, via the controller, the captured sensor data to laterally align the captured sensor data.
21. The method of any one of claims 16-20, wherein determining a shape of a top surface of the sample comprises: determining, by the controller, a height of a front edge of the top surface of the sample at the plurality of axial rotation angles; and defining, by the controller, the shape by applying a fit algorithm to the determined heights.
22. The method of claim 21, wherein the fit algorithm fits a tilted oval shape or a minimum energy surface to the determined heights.
23. The method of any one of claims 16-22, wherein determining the sample volume comprises: comparing, via the controller, the sensor data and the shape of the top surface to a model of the container.
24. The method of any one of claims 16-23, wherein determining the sample volume comprises: comparing the sensor data and the shape of the top surface to a measurement scale included on a surface of the container.
25. The method of claim 24, further comprising: controlling, via the controller, the sensor to capture sensor data of the container at a plurality of axial rotation angles; generating, via the controller, a minimum intensity projection (MIP) of the measurement scale based on the captured sensor data; and projecting, via the controller, the measurement scale into a region associated with the sample.
26. The method of claim 24, further comprising: obtaining, via the controller, a stored correspondence between printed measurement lines of the measurement scale and corresponding pixel heights; and aligning, via the controller, the measurement scale with a region associated with the sample.
27. The method of any one of claims 16-26, further comprising: configuring a harvesting process for the sample using the sample volume.
28. One or more non-transitory, computer-readable media storing instructions that, when executed by processing hardware of a controller, cause the controller to perform the method of any one of claims 16-27.
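The "tilted oval" fit of claims 8-10 and 21-22 can be illustrated with a short sketch. This is not the patent's actual implementation: the sinusoidal rim model, the function names, and the cylindrical-container volume formula are all illustrative assumptions. For a flat but tilted liquid surface inside an axially rotating cylindrical container, the visible front-edge height traces one sinusoid per revolution, so a three-parameter least-squares fit recovers the tilted-oval shape of the top surface:

```python
import math

def fit_surface_rim(angles_deg, heights_mm):
    # Model h(theta) = h0 + a*cos(theta) + b*sin(theta): a flat tilted
    # surface in a rotating cylinder makes the front-edge height vary
    # sinusoidally over one revolution. With evenly spaced angles the
    # least-squares fit reduces to discrete Fourier coefficients.
    n = len(angles_deg)
    thetas = [math.radians(a) for a in angles_deg]
    h0 = sum(heights_mm) / n
    a = 2.0 / n * sum(h * math.cos(t) for h, t in zip(heights_mm, thetas))
    b = 2.0 / n * sum(h * math.sin(t) for h, t in zip(heights_mm, thetas))
    return h0, a, b

def volume_ml(h0_mm, radius_mm):
    # Mean fill height times cross-section; a flat tilted top surface
    # leaves the mean height (and hence the volume) unchanged.
    return math.pi * radius_mm ** 2 * h0_mm / 1000.0  # mm^3 -> mL

# Synthetic scan: 12 rotation angles, rim tilted +/-3 mm about 40 mm.
angles = [i * 30 for i in range(12)]
heights = [40.0 + 3.0 * math.cos(math.radians(a - 60)) for a in angles]

h0, a, b = fit_surface_rim(angles, heights)
tilt_amp = math.hypot(a, b)                  # recovered tilt amplitude
tilt_phase = math.degrees(math.atan2(b, a))  # recovered tilt direction
```

The fit recovers the 40 mm mean height, 3 mm tilt amplitude, and 60-degree tilt direction from the synthetic data; the mean height then gives the packed cell volume for an assumed cylindrical geometry.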
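The minimum intensity projection of claims 13 and 25 can likewise be sketched under stated assumptions; the grayscale convention (dark graduations on a bright background), the threshold value, and the linear row-to-volume mapping are illustrative choices, not taken from the disclosure:

```python
def min_intensity_projection(frames):
    # Pixelwise minimum across frames captured at different axial
    # rotation angles: printed graduations are dark in every frame,
    # so they survive the projection while rotating content does not.
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[min(f[r][c] for f in frames) for c in range(cols)]
            for r in range(rows)]

def graduation_rows(mip, threshold=64):
    # Rows whose mean intensity stays below `threshold` are treated
    # as printed measurement lines of the scale.
    return [r for r, row in enumerate(mip)
            if sum(row) / len(row) < threshold]

def row_to_ml(marks, ml_per_mark, row):
    # Linear pixel-row -> volume mapping fitted to the detected marks,
    # extrapolated upward into the unmarked sample region.
    spacing = (marks[-1] - marks[0]) / (len(marks) - 1)
    return (marks[-1] - row) / spacing * ml_per_mark

# Synthetic 10x8 frames: graduations at rows 2, 5, 8 in every frame;
# everything else bright and varying as the container rotates.
frames = [[[10 if r in (2, 5, 8) else 150 + (r * 8 + c + k) % 100
            for c in range(8)] for r in range(10)] for k in range(6)]

mip = min_intensity_projection(frames)
marks = graduation_rows(mip)  # detected graduation rows
```

With 5 mL between graduations, extrapolating the fitted scale to row 0 (above the top printed mark) assigns a volume to the sample meniscus even where no graduation is printed.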
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363600504P | 2023-11-17 | 2023-11-17 | |
| US63/600,504 | 2023-11-17 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025106630A1 (en) | 2025-05-22 |
Family
ID=93840743
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/055858 (WO2025106630A1, pending) | Automated detection of packed cell volume | 2023-11-17 | 2024-11-14 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025106630A1 (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2133668A1 (en) * | 2008-06-12 | 2009-12-16 | CSL Behring GmbH | Destruction-free measuring of the fill volume of a container filled with fluid |
| US20160018427A1 (en) * | 2014-07-21 | 2016-01-21 | Beckman Coulter, Inc. | Methods and systems for tube inspection and liquid level detection |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24820501; Country of ref document: EP; Kind code of ref document: A1 |