EP4268120A1 - Procédé de caractérisation de spermatozoïdes - Google Patents
Procédé de caractérisation de spermatozoïdes (Method for characterizing spermatozoa)
- Publication number
- EP4268120A1 (application EP21839575.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- sample
- plane
- spermatozoon
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Optical investigation techniques, e.g. flow cytometry
- G01N15/1434—Optical arrangements
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Optical investigation techniques, e.g. flow cytometry
- G01N15/1429—Signal processing
- G01N15/1433—Signal processing using image recognition
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
- G06V10/449—Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
- G06V10/451—Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
- G06V10/454—Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N2015/1006—Investigating individual particles for cytology
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N2015/1027—Determining speed or velocity of a particle
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Optical investigation techniques, e.g. flow cytometry
- G01N15/1434—Optical arrangements
- G01N2015/144—Imaging characterised by its optical setup
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N15/00—Investigating characteristics of particles; Investigating permeability, pore-volume or surface-area of porous materials
- G01N15/10—Investigating individual particles
- G01N15/14—Optical investigation techniques, e.g. flow cytometry
- G01N2015/1497—Particle shape
Definitions
- the technical field of the invention is the observation of mobile or motile microscopic particles in a sample, with a view to characterizing them.
- a targeted application is the characterization of spermatozoa.
- the observation of motile cellular particles, such as spermatozoa, in a sample is usually carried out using a microscope.
- the microscope comprises a lens defining an object plane, extending into the sample, as well as an image plane, coinciding with a detection plane of an image sensor.
- the microscope images the sperm cells in a focused configuration.
- the choice of such a modality supposes a compromise between the spatial resolution, the observed field and the depth of field.
- a high numerical aperture decreases the depth of field.
- the observation of motile particles supposes an optimization of the objective, knowing that, in such a configuration, it is not possible to obtain at the same time a good spatial resolution, a wide observed field and a great depth of field.
- these properties are particularly important when observing microscopic motile particles, in particular when the latter are numerous: the size of the particles requires good spatial resolution; their number justifies an extended observed field, so as to be able to maximize the number of particles observed on the same image; their movement requires a significant depth of field, so that the particles appear clearly on the image.
- it is usual to use a high magnification microscope.
- the small size of the observed field is compensated by the use of a translation stage.
- the latter allows image acquisition by moving the lens relative to the sample, parallel to the latter.
- the shallow depth of field is compensated by limiting the thickness of the sample: the latter is for example placed in a thin fluidic chamber, typically less than 20 µm thick, so as to limit the movement of the particles in a direction perpendicular to the object plane.
- the objective can be brought closer to or further from the sample, so as to move the object plane within the sample, depending on its thickness. This results in a complex and expensive device, requiring precise displacement of the objective.
- the document WO2019/125583 describes general principles relating to the analysis of the motility or the morphology of spermatozoa, by mentioning the use of an artificial intelligence algorithm with supervised learning.
- An alternative to conventional microscopy, as described in the aforementioned publication, is lensless imaging. It is known that lensless imaging, coupled with holographic reconstruction algorithms, allows cell observation while maintaining a high field of view as well as a large depth of field. Patents US9588037 and US8842901 describe, for example, the use of lensless imaging for the observation of spermatozoa. Patents US10481076 and US10379027 also describe the use of lensless imaging, coupled with reconstruction algorithms, to characterize cells.
- a first object of the invention is a method for characterizing at least one mobile particle in a sample, the method comprising: a) acquisition of at least one image of the sample during an acquisition period, using an image sensor, and formation of a series of images from the acquired images; b) use of each image of the series of images resulting from a) as an input image of a detection convolutional neural network, the detection convolutional neural network being configured to detect the particles and to produce, from each image, an output image on which each detected particle is assigned an intensity distribution centered on the particle and extending around it; c) for each detected particle, from each output image resulting from b), estimation of a position of the particle in each image of the series of images; d) characterization of each detected particle from the position estimates resulting from c), established from each image of the series of images.
- steps a) to d) can be performed with a single image.
- the series of images comprises a single image.
- each particle can be represented in the form of a point.
- the particle (or each particle) can in particular be a spermatozoon.
- Step d) can comprise a characterization, in particular morphological, of each spermatozoon detected.
- Step d) then comprises: for each detected spermatozoon, from each image resulting from a) and from the positions resulting from c), extraction of a thumbnail comprising the detected spermatozoon, the position of the spermatozoon in the thumbnail being predetermined, so as to obtain, for each detected spermatozoon, a series of thumbnails, the size of each thumbnail being smaller than the size of each image acquired during step a); for each detected spermatozoon, use of the series of thumbnails as input data for a classification neural network, the classification neural network being configured to classify the spermatozoon among predetermined classes. These may in particular be morphological classes.
- the method can be such that each detected sperm is centered with respect to each thumbnail.
- Step d) may comprise a characterization of the motility of the spermatozoon, from the positions of the spermatozoon resulting from step c).
- the method may be such that:
- the sample extends along a plane of the sample
- the image sensor extends along a detection plane
- an optical system extends between the sample and the image sensor, the optical system defining an object plane and an image plane;
- the object plane is offset from the sample plane by an object defocusing distance and/or the image plane is offset from the detection plane by an image defocusing distance.
- the method can be such that: the sample is placed on a sample support, resting on at least one spring, the spring being configured to push the sample support towards the optical system;
- the optical system is connected to at least one stop, extending from the optical system towards the sample support, so that, during step a), under the effect of the spring, the sample support rests on the stop.
- no imaging optics extend between the sample and the image sensor.
- step a) comprises a normalization of each image acquired by an average of said image or by an average of images of the series of images.
- the series of images is made from each acquired image, after normalization.
- step a) comprises an application of a high-pass filter to each acquired image.
- the series of images is performed from each image acquired, after application of the high-pass filter.
- the intensity distribution assigned to each particle is decreasing, such that the intensity decreases as a function of the distance relative to the particle.
- the intensity distribution assigned to each particle can be a two-dimensional parametric statistical distribution.
- a second object of the invention is a device for observing a sample, the sample comprising mobile particles, the device comprising:
- - a light source configured to illuminate the sample
- an image sensor configured to form an image of the sample
- a holding structure configured to hold the sample between the light source and the image sensor
- a processing unit connected to the image sensor, and configured to implement steps a) to d) of a method according to the first object of the invention from at least one image acquired by the image sensor.
- no imaging optics extend between the image sensor and the sample.
- the holding structure can be configured to maintain a fixed distance between the image sensor and the sample.
- the image sensor extends along a detection plane;
- the device comprises an optical system, extending between the image sensor and the sample support, the optical system defining an image plane and an object plane, the device being such that the image plane is offset from the detection plane by an image defocusing distance and/or the object plane is offset from the sample plane by an object defocusing distance.
- FIG. 1A represents a first embodiment of a device enabling the invention to be implemented.
- Figure 1B is a three-dimensional view of the device shown schematically in Figure 1A.
- Figure 1C shows an assembly making it possible to maintain the sample at a fixed distance from an optical system.
- Figure 1D shows the assembly of Figure 1C in an observation position.
- FIG. 2 represents a second embodiment of a device enabling the invention to be implemented.
- Figure 3 shows the main steps of a method for characterizing mobile particles in the sample.
- FIGS. 4A, 4B and 4C respectively represent an acquired image, a reference image and an image resulting from the detection neural network.
- FIG. 5A is an example of an image of a sample obtained by implementing a device according to the first embodiment of the invention.
- Figure 5B shows the image depicted in Figure 5A after processing by a detection neural network.
- FIG. 5C shows an example of obtaining trajectories of particles detected on image 5B.
- Figure 6 shows thumbnails centered on a single spermatozoon, the thumbnails being extracted from a series of nine images acquired using a defocused imaging modality.
- FIGS. 7A, 7B and 7C respectively represent an acquired image, a reference image and an image resulting from the detection neural network.
- FIGS. 8A, 8B and 8C respectively represent an acquired image, a reference image and an image resulting from the detection neural network.
- FIG. 8A has been processed by a high-pass filter.
- FIGS. 9A and 9B represent performance of particle detection by the detection neural network, under different conditions.
- the performances are the sensitivity (FIG. 9A) and the specificity (FIG. 9B).
- FIGS. 10A to 10C represent confusion matrices, expressing the classification performance of spermatozoa, using images acquired according to the defocused modality, the classification being carried out respectively: by implementing a classification neural network fed with an image obtained by holographic reconstruction; by implementing a classification neural network whose input layer comprises a single image acquired in a defocused configuration, without holographic reconstruction; and by implementing a classification neural network whose input layer comprises a series of images of a particle, each image being acquired in a defocused configuration, without holographic reconstruction.
- FIG. 1A represents a first embodiment of a device 1 for implementing the invention.
- the device allows the observation of a sample 10 interposed between a light source 11 and an image sensor 20.
- the light source 11 is configured to emit an incident light wave 12 propagating up to the sample, parallel to a propagation axis Z.
- the device includes a sample holder 10s configured to receive the sample 10, such that the sample is held on the holder 10s.
- the sample thus maintained extends along a plane, called the sample plane P10.
- the plane of the sample corresponds for example to a mean plane around which the sample 10 extends.
- the sample support can be a glass slide, for example 1 mm thick.
- the sample notably comprises a liquid medium 10m in which mobile and possibly motile particles 10i are immersed.
- the medium 10m can be a biological liquid or a buffer liquid. It may for example comprise a bodily fluid, in the pure or diluted state.
- by bodily fluid is meant a fluid generated by a living body. It may in particular be, without limitation, blood, urine, cerebrospinal fluid, semen, or lymph.
- the sample 10 is preferably contained in a fluidic chamber 10c.
- the fluidic chamber is for example a fluidic chamber with a thickness of between 20 µm and 100 µm.
- the thickness of the fluidic chamber, and therefore of the sample 10, along the axis of propagation Z typically varies between 10 µm and 200 µm, and is preferably between 20 µm and 50 µm.
- the mobile particles are spermatozoa.
- the sample contains sperm, possibly diluted.
- the fluidic chamber 10c can be a counting chamber dedicated to the analysis of the mobility or the concentration of cells. It may for example be a counting chamber marketed by Leja, with a thickness of between 20 µm and 100 µm.
- the sample comprises mobile particles, for example microorganisms, for example microalgae or plankton, or cells, for example cells in the process of sedimentation.
- the distance D between the light source 11 and the sample 10 is preferably greater than 1 cm. It is preferably between 2 and 30 cm.
- the light source 11, as seen by the sample, is considered to be point-like. This means that its diameter (or its diagonal) is preferably less than one tenth, better still one hundredth, of the distance between the sample and the light source.
- the light source 11 is for example a light-emitting diode. It is preferably associated with a diaphragm 14, or spatial filter.
- the aperture of the diaphragm is typically between 5 µm and 1 mm, preferably between 50 µm and 1 mm. In this example, the diaphragm has a diameter of 400 µm.
- the diaphragm can be replaced by an optical fiber, a first end of which is placed facing the light source and a second end of which is placed facing the sample 10.
- the device can also comprise a diffuser 13, placed between the light source 11 and the diaphragm 14. The use of a diffuser/diaphragm assembly is for example described in US10418399.
- the image sensor 20 is configured to form an image of the sample according to a detection plane P20.
- the image sensor 20 comprises a matrix of pixels, of CCD or CMOS type.
- the detection plane P20 preferably extends perpendicular to the axis of propagation Z.
- the image sensor has a large sensitive surface, typically greater than 10 mm².
- the image sensor is an IDS-UI-3160CP-M-GL sensor comprising pixels of 4.8 × 4.8 µm², the sensitive surface being 9.2 mm × 5.76 mm, i.e. 53 mm².
- the image sensor 20 is optically coupled to the sample 10 by an optical system 15.
- the optical system includes an objective 15₁ and a tube lens 15₂. The latter is intended to project the formed image onto the sensitive surface of the image sensor 20 (surface of 53 mm²).
- the objective 15₁ is a Motic CCIS EF-N Plan Achromat 10× objective, with a numerical aperture of 0.25;
- the lens 15₂ is a Thorlabs LBF254-075-A lens, of 75 mm focal length.
- such an assembly provides an observation field of 3 mm², with a spatial resolution of 1 µm.
- the short focal length of the lens makes it possible to limit the overall size of the device 1 and to adjust the magnification to the size of the image sensor.
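- as a worked check of these figures, assuming the Motic objective is referenced to a standard 180 mm tube length (a value not stated in this description): the effective magnification is about 10 × 75/180 ≈ 4.2, so the observed field is about 53 mm² / 4.2² ≈ 3.0 mm², consistent with the 3 mm² quoted above.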
- the image sensor is configured to acquire images of the sample, according to an acquisition frequency of a few tens of images per second, for example 60 images per second.
- the sampling frequency is typically between 5 and 100 frames per second.
- the optical system 15 defines an object plane Po and an image plane Pi.
- the image sensor 20 is configured to acquire an image according to a defocused configuration.
- the image plane Pi coincides with the detection plane P20, while the object plane Po is offset from the sample by an object defocusing distance δ of between 10 µm and 500 µm.
- the object defocusing distance is preferably between 50 µm and 100 µm, for example 70 µm.
- in this case, the object plane Po extends outside the sample 10. According to another possibility, the object plane extends into the sample, while the image plane is offset with respect to the detection plane by an image defocusing distance.
- the image defocusing distance is preferably between 50 µm and 100 µm, for example 70 µm.
- according to another possibility, the object plane Po and the image plane Pi are both offset, respectively with respect to the plane of the sample and with respect to the detection plane.
- the defocusing distance is preferably greater than 10 µm and less than 1 mm, or even 500 µm, and preferably between 50 µm and 150 µm.
- the image sensor 20 is exposed to a light wave, called exposure light wave.
- the image acquired by the image sensor comprises interference figures, which can also be designated by the term “diffraction patterns”, formed by: a part of the light wave 12 emitted by the light source 11 and having passed through the sample without interacting with it; and diffraction waves, formed by the diffraction of part of the light wave 12 in the sample, in particular by the particles.
- a processing unit 30, comprising for example a microprocessor, is capable of processing each image acquired by the image sensor 20.
- the processing unit comprises a programmable memory 31 in which a sequence of instructions is stored to perform the image processing and calculation operations described in this description.
- the processing unit 30 can be coupled to a screen 32 allowing the display of images acquired by the image sensor 20 or resulting from the processing carried out by the processing unit 30.
- the image acquired by the image sensor 20, according to a defocused imaging modality, is a diffraction figure of the sample, sometimes called a hologram. It does not make it possible to obtain an accurate representation of the observed sample.
- the acquired image can be processed by applying a holographic reconstruction operator, so as to calculate a complex expression representative of the light wave to which the image sensor is exposed, at any point of coordinates (x, y, z) in space, and in particular in a reconstruction plane corresponding to the plane of the sample.
- the complex expression gives access to the intensity or the phase of the exposure light wave.
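- the reconstruction operator is not specified here; the following is a minimal sketch of one standard choice, angular-spectrum propagation, with illustrative wavelength and pixel pitch (not values taken from this description):

```python
import numpy as np

def propagate(field, z, wavelength=550e-9, pitch=4.8e-6):
    """Angular-spectrum propagation of a complex field over a distance z (m)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)           # spatial frequencies (1/m)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2  # squared longitudinal frequency
    kernel = np.exp(2j * np.pi * z * np.sqrt(np.maximum(arg, 0.0)))
    kernel[arg < 0] = 0.0                      # discard evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * kernel)
```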
- Figure 1B is a representation of an example of a device as shown schematically in Figure 1A.
- Figures 1C and 1D represent a detail of the arrangement of the sample 10 facing the optical system 15, and more precisely facing the objective 15₁.
- the sample support 10s is connected to elastic return means 16a, for example springs. The springs tend to bring the sample support 10s closer to the objective 15₁.
- the device also includes stops 16b, integral with the objective 15₁.
- the objective 15₁ and the stops 16b are fixed relative to the image sensor 20.
- the stops 16b and the return means 16a form a holding structure 16, intended to hold the sample between the light source 11 and the image sensor 20.
- Figures 1C and 1D represent the maintenance of the sample on the sample support respectively during the positioning of the sample and during its observation.
- a rigid link 16c connects the objective 15₁ to the stops 16b.
- the springs 16a are compressed.
- the sample support 10s comes to bear against the stops 16b. This makes it possible to control the distance Δ between the objective 15₁ and the sample 10, independently of the thickness of the sample support 10s, and thus to overcome variations in the thickness of the sample support.
- the thickness can fluctuate over a relatively wide range, for example ±100 µm.
- the assembly described in connection with FIGS. 1C and 1D makes it possible to control the distance Δ between the objective 15₁ and the sample 10 to within ±5 µm, independently of such fluctuations in the thickness of the support 10s.
- the inventors believe that such an assembly makes it possible to avoid having to resort to an autofocus system to produce images.
- Figures 1C and 1D also show a heating resistor 19 connected to a temperature controller 18. The function of these elements is to maintain the temperature of the fluidic chamber 10c at 37°C.
- FIG. 2 represents a second embodiment of device 1' suitable for implementing the invention.
- the device 1' comprises a light source 11, a diffuser 13, a diaphragm 14, an image sensor 20, a holding structure 17 and a processing unit 30 as described in connection with the first embodiment.
- the holding structure 17 is configured to define a fixed distance between the sample and the image sensor.
- the device does not include an image forming lens between the image sensor 20 and the sample 10.
- the image sensor 20 is preferably brought close to the sample, the distance between the image sensor 20 and the sample 10 being typically between 100 µm and 3 mm.
- the image sensor acquires images according to a lensless imaging modality.
- the sample is preferably contained in a fluidic chamber 10c, for example a “Leja” chamber as described in connection with the first embodiment.
- the advantage of such an embodiment is that it does not require precise positioning of an optical system 15 with respect to the sample 10, and that it confers a high field of observation.
- the disadvantage is obtaining images of lower quality, but which remain usable.
- the holding structure is arranged so that the distance between the sample, when the sample 10 is placed on the holding structure 17, and the image sensor 20, is constant.
- the holding structure 17 is connected to a base 20', on which the image sensor 20 extends.
- the base 20' is, for example, a printed circuit on which the image sensor 20 is placed.
- the sample 10, contained in the fluidic chamber 10c, is held by the sample holder 10s on the holding structure 17.
- the sample support is for example a transparent slide. The sample extends between the sample holder 10s and the image sensor. Thus, the distance between the sample 10 and the image sensor 20 is not affected by a fluctuation in the thickness of the support 10s.
- FIG. 3 schematizes the main steps of a method for processing several images acquired by an image sensor according to the defocused imaging modality (first embodiment) or the lensless imaging modality (second embodiment).
- the method is described in connection with the observation of spermatozoa, it being understood that it can be applied to the observation of other types of motile particles.
- Step 100 Acquisition of a series of images I0,n
- n is a natural number designating the rank of each acquired image, with 1 ≤ n ≤ N, N being the total number of images acquired.
- the images acquired are images acquired either according to a defocused imaging modality or according to a lensless imaging modality.
- the number N of images acquired can be between 5 and 50.
- the images can be acquired according to an acquisition frequency of 60 Hz.
- Step 110 Preprocessing
- the objective is to perform a pre-processing of each image acquired, so as to limit the effects of a fluctuation in the intensity of the incident light wave or in the sensitivity of the camera.
- the pre-processing consists in normalizing each image acquired by an average of the intensity of at least one image acquired, and preferably of all the images acquired.
- the pre-processing can include an application of a high-pass filter to each image, possibly normalized.
- the high-pass filter eliminates the low spatial frequencies of the image.
- for this purpose, a Gaussian filter is applied, by carrying out a convolution product of the image In with a Gaussian kernel K.
- the full width at half maximum of the Gaussian kernel is for example 20 pixels.
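- a minimal sketch of this pre-processing, assuming the normalization divides by a mean intensity and the high-pass result is obtained by subtracting the Gaussian-blurred (low-pass) image, which the text does not state explicitly:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def preprocess(image, mean_intensity, fwhm=20.0):
    """Normalize an acquired image I0,n, then remove its low spatial frequencies."""
    normalized = image / mean_intensity            # normalization by a mean (image In)
    sigma = fwhm / 2.355                           # FWHM of ~20 pixels -> sigma
    low_pass = gaussian_filter(normalized, sigma)  # convolution with Gaussian kernel K
    return normalized - low_pass                   # assumed high-pass result I'n
```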
- Step 120 Particle detection
- This step consists in detecting and precisely positioning the spermatozoa, in each image I0,n resulting from step 100 or in each image I'n preprocessed during step 110.
- This step is carried out using a detection convolutional neural network CNNd.
- the CNNd neural network comprises an input layer, in the form of an input image Iin,n.
- the input image Iin,n is either an acquired image I0,n, or a preprocessed image In or I'n.
- from the input image Iin,n, the detection neural network CNNd generates an output image Iout,n.
- the output image Iout,n is such that each spermatozoon 10i detected on the input image Iin,n, at a position (xi, yi), appears as an intensity distribution centered on said position. That is, from an input image, the CNNd neural network allows: a detection of the spermatozoa; an estimation of the position of each detected spermatozoon 10i; a generation of an output image Iout,n comprising a predetermined intensity distribution around each position.
- Each position (xi, yi) is a two-dimensional position, in the detection plane P20.
- the distribution is such that the intensity is maximum at each position (xi, yi) and is considered negligible beyond a neighborhood Vi of each position.
- by neighborhood is meant a region extending over a predetermined number of pixels, for example between 5 and 20 pixels, around each position (xi, yi).
- the distribution Di can be of the top-hat type, in which case each pixel in the neighborhood has a constant, high intensity, and each pixel beyond the neighborhood has zero intensity.
- preferably, the distribution Di is centered on each position (xi, yi) and is strictly decreasing around the latter. It may for example be a two-dimensional Gaussian intensity distribution, centered on each position (xi, yi).
- its full width at half maximum is for example less than 20 pixels, and preferably less than 10 or 5 pixels. Any other form of parametric distribution can be considered, knowing that it is preferable that the distribution be symmetrical around each position, and preferably strictly decreasing from the position (xi, yi). Assigning a distribution to each position (xi, yi) makes it possible to obtain an output image Iout,n in which each spermatozoon is simple to detect. In this sense, the output image Iout,n is a detection image, on the basis of which each spermatozoon can be detected.
- the output image of the neural network is formed by the resultant of the intensity distributions Di defined respectively around each position (xi, yi).
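- a minimal sketch of how such a detection image can be built from a list of positions, here with a Gaussian distribution whose full width at half maximum is an example value:

```python
import numpy as np

def detection_image(shape, positions, fwhm=5.0):
    """Stamp a 2D Gaussian at each particle position (x, y), given in pixels."""
    sigma = fwhm / 2.355                          # FWHM -> standard deviation
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    out = np.zeros(shape)
    for x, y in positions:                        # resultant of all distributions Di
        out += np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * sigma ** 2))
    return out
```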
- the detection convolutional neural network comprised 20 layers, each comprising either 10 or 32 characteristics (more usually designated by the term “features”). The number of features was determined empirically. The transition from one layer to the next is performed by applying a convolution kernel of size 3 × 3. The output image is obtained by combining the features of the last convolution layer.
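- a sketch of such an architecture, written here in PyTorch rather than the Matlab environment mentioned below; the arrangement of the 10- and 32-feature layers is not specified, so the split used here is an assumption:

```python
import torch.nn as nn

class DetectionCNN(nn.Module):
    """20 convolution layers of 3x3 kernels; the channel split is illustrative."""
    def __init__(self, n_layers=20):
        super().__init__()
        layers, in_ch = [], 1                          # single-channel input image
        for i in range(n_layers):
            out_ch = 32 if i < n_layers // 2 else 10   # assumed 32/10 split
            layers += [nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU()]
            in_ch = out_ch
        self.features = nn.Sequential(*layers)
        self.head = nn.Conv2d(in_ch, 1, 1)             # combine last-layer features

    def forward(self, x):                              # x: (batch, 1, H, W)
        return self.head(self.features(x))
```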
- the neural network was programmed in a Matlab environment (publisher: The MathWorks). The neural network was previously trained (step 80), so as to parameterize the convolution filters. During training, learning sets were used, each set comprising an input image, obtained by the image sensor using animal semen and preprocessed by normalization and possibly filtering, together with a corresponding reference image.
- FIGS. 4A, 4B and 4C are a set of test images, comprising respectively: an image of a sample of bovine semen acquired by the image sensor and having undergone normalization and high-pass filtering according to step 110; a manually annotated image, which corresponds to a reference image, usually designated by the term “ground truth”; and an image resulting from the application of the detection convolutional neural network.
- The image of FIG. 4C is consistent with the reference image (FIG. 4B), which attests to the detection performance of the CNNd neural network.
- the inventors have noted that FIG. 4C shows a position not present on the reference image: a check showed that it was an annotation omission on the reference image.
- the application of the detection convolutional neural network CNNd thus allows precise detection and positioning of the spermatozoa, without resorting to an image reconstruction implementing a holographic propagation operator, as suggested in the prior art.
- the input image of the neural network is an image formed in the detection plane, and not in a reconstruction plane remote from the detection plane.
- the detection neural network generates low-noise output images: the signal-to-noise ratio associated with each sperm detection is high, which facilitates subsequent operations.
- Step 120 is repeated for the different images of the same series, so as to obtain a series of output images Iout,1 … Iout,N.
- at the end of step 120, a local maximum detection algorithm is applied to each output image Iout,1 … Iout,N, so as to obtain, for each image, a list of 2D coordinates, each coordinate corresponding to a position of a spermatozoon.
- lists Lout,1 … Lout,N are thus established. Each list Lout,n corresponds to the 2D positions, in the detection plane, of the spermatozoa detected in an image Iout,n.
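- a minimal sketch of extracting a list Lout,n from an output image Iout,n; the window size and threshold are assumptions:

```python
import numpy as np
from scipy.ndimage import maximum_filter

def local_maxima(out_image, window=7, threshold=0.5):
    """Return the 2D positions (x, y) of local maxima above a threshold."""
    peaks = out_image == maximum_filter(out_image, size=window)
    peaks &= out_image > threshold
    ys, xs = np.nonzero(peaks)
    return list(zip(xs, ys))        # positions in the detection plane
```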
- FIG. 5A represents an image acquired during step 100. More precisely, it is a detail of an image acquired according to a defocused mode. This is an image of a sample containing bovine semen.
- Figure 5B shows an image resulting from the application of the detection neural network in Figure 5A, after normalization and application of a high-pass filter.
- the comparison between Figures 5A and 5B shows the gain provided by the detection convolutional neural network in terms of signal-to-noise ratio.
- Step 130 Position tracking
- a position tracking algorithm is applied, usually designated by the term “tracking algorithm”.
- an algorithm is implemented allowing tracking of the position of the spermatozoon, parallel to the detection plane, between the different images of the series of images.
- the implementation of the position tracking algorithm is efficient because it is performed from the lists Lout,1 … Lout,N resulting from step 120. As previously indicated, the output images have a high signal-to-noise ratio, which facilitates the implementation of the position tracking algorithm.
- the position tracking algorithm may be a “nearest neighbour” type algorithm. Step 130 allows the trajectory of each spermatozoon, parallel to the detection plane, to be determined.
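- a minimal sketch of such a “nearest neighbour” linking between consecutive lists of positions; the greedy strategy and the maximum linking distance are simplifying assumptions:

```python
import numpy as np

def link_nearest_neighbour(lists, max_dist=15.0):
    """lists: one list of (x, y) positions per image; returns one track per particle."""
    tracks = [[p] for p in lists[0]]
    for positions in lists[1:]:
        candidates = list(positions)
        for track in tracks:
            if not candidates:
                break
            last = np.asarray(track[-1], dtype=float)
            dists = [np.hypot(*(np.asarray(c, dtype=float) - last)) for c in candidates]
            j = int(np.argmin(dists))
            if dists[j] <= max_dist:         # extend the track with the closest point
                track.append(candidates.pop(j))
    return tracks
```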
- in FIG. 5C, output images Iout,1 … Iout,N have been superimposed by considering a stack of 30 images. It is observed that the trajectory, parallel to the detection plane, of each spermatozoon can be determined precisely.
- Step 140 Characterization of the motility
- each trajectory can be characterized on the basis of metrics applied to the trajectories resulting from step 130. Knowing the acquisition frequency of the images, it is possible to quantify the displacement speeds of each detected spermatozoon, and in particular: the straight-line velocity VSL (“velocity straight-line path”), which corresponds to the speed calculated on the basis of the straight-line distance between the first and last points of the trajectory (the first point being determined from the first acquired image of the series, and the last point from the last acquired image); the curvilinear velocity VCL (“velocity curvilinear path”), a speed established by summing the distances traveled between each image and dividing by the duration of the acquisition period; and the average-path velocity VAP (“velocity average path”), a speed established after smoothing the trajectory of a particle.
- other indicators, known to those skilled in the art, make it possible to characterize the motility of the spermatozoa.
- for example: the straightness indicator STR, obtained by the ratio VSL/VAP, is all the higher as the spermatozoon moves in a straight line; the linearity indicator LIN, obtained by the ratio VSL/VCL, is also all the higher as the spermatozoon moves in a straight line.
- the oscillation indicator WOB (“wobble”) is obtained by the ratio VAP/VCL. A computational sketch of these metrics is given below.
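- a minimal sketch of the metrics; the moving-average smoothing used for VAP is an assumption, the text only stating that the trajectory is smoothed:

```python
import numpy as np

def motility_metrics(track, dt, smooth=5):
    """track: list of (x, y) positions of one spermatozoon; dt: time between images.
    Assumes the track is longer than the smoothing window."""
    track = np.asarray(track, dtype=float)
    duration = (len(track) - 1) * dt
    steps = np.linalg.norm(np.diff(track, axis=0), axis=1)
    vcl = steps.sum() / duration                           # curvilinear velocity
    vsl = np.linalg.norm(track[-1] - track[0]) / duration  # straight-line velocity
    kernel = np.ones(smooth) / smooth                      # assumed moving average
    avg = np.column_stack([np.convolve(track[:, k], kernel, mode="valid")
                           for k in (0, 1)])
    vap = np.linalg.norm(np.diff(avg, axis=0), axis=1).sum() / ((len(avg) - 1) * dt)
    return {"VSL": vsl, "VCL": vcl, "VAP": vap,
            "STR": vsl / vap, "LIN": vsl / vcl, "WOB": vap / vcl}
```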
- a spermatozoon is considered to be: motile if the length of the trajectory is greater than a first threshold, for example 10 pixels, and its displacement along the average trajectory (VAP × Δt, Δt being the acquisition period) is greater than a predefined length, corresponding for example to the length of a sperm head; progressive if the length of the trajectory is greater than the first threshold, and if the straightness STR and the average-path velocity VAP are respectively greater than two threshold values STRth and VAPth1; slow if the length of the trajectory is greater than the first threshold and if the straight-line velocity VSL and the average-path velocity VAP are respectively less than two threshold values VSLth2 and VAPth2; static if the length of the trajectory is greater than the first threshold and if the straight-line velocity VSL and the average-path velocity VAP are respectively less than two threshold values VSLth3 and VAPth3.
- the threshold values STRth, VSLth2 and VSLth3 are determined beforehand, with VSLth2 > VSLth3. The same applies to the values VAPth1, VAPth2 and VAPth3, with VAPth1 > VAPth2 > VAPth3.
- Step 150 Extraction of thumbnails for each spermatozoon
- for each detected spermatozoon, a thumbnail Vi,n is extracted from each image I0,n acquired by the image sensor during step 100.
- Each thumbnail Vi,n is a portion of an image I0,n acquired by the image sensor.
- a thumbnail Vi,n is extracted around the position assigned to the spermatozoon.
- the position of the spermatozoon 10i in each image I0,n is obtained following the implementation of the position tracking algorithm (step 130).
- the size of each thumbnail Vi,n is predetermined. Relative to each thumbnail Vi,n, the position (xi, yi) of the spermatozoon 10i considered is fixed: preferably, the position (xi, yi) of the spermatozoon 10i is centered in each thumbnail Vi,n.
- a thumbnail Vi,n may for example comprise a few tens or even a few hundreds of pixels per side, typically between 50 and 500 pixels.
- a thumbnail is, for example, 64 × 64 pixels. Due to the size and concentration of spermatozoa in the sample, a thumbnail Vi,n may include several spermatozoa to be characterized. However, only the spermatozoon 10i occupying a predetermined position in the thumbnail Vi,n, for example at its center, is characterized using said thumbnail.
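- a minimal sketch of the thumbnail extraction, with edge padding added as an assumption to handle spermatozoa near the image border:

```python
import numpy as np

def extract_thumbnail(image, position, size=64):
    """Return a size x size crop of the image, centered on position (x, y)."""
    x, y = (int(round(c)) for c in position)
    half = size // 2
    padded = np.pad(image, half, mode="edge")   # tolerate positions near edges
    return padded[y:y + size, x:x + size]       # the pad offset recenters the crop
```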
- Figure 6 shows nine thumbnails Vi,n extracted from the images I0,n resulting from step 100, using the positions (xi, yi) resulting from the trajectory tracking of a spermatozoon in step 130.
- Step 160 Classification of the morphology of each spermatozoon
- a classification neural network CNNc is used, so as to classify each spermatozoon 10i previously detected by the detection convolutional neural network CNNd.
- the neural network is fed with the thumbnails Vi,1 … Vi,N extracted during step 150.
- the classification convolutional neural network CNNc can for example comprise 6 convolution layers, comprising between 16 and 64 features: 16 features for each of the first four layers, 32 features for the fifth layer, and 64 features for the sixth layer.
- the passage from one layer to the next is carried out by applying a convolution kernel of size 3 × 3.
- the output layer comprises nodes, each node corresponding to a probability of belonging to a morphological class.
- Each morphological class corresponds to a morphology of the analyzed spermatozoon. It can for example be a known classification, the classes including, for example:
- flagellum abnormality: curved or coiled flagellum;
- the output layer can thus comprise 11 classes.
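- a sketch of such a classification network, written in PyTorch rather than the Matlab environment mentioned below; stacking the series of thumbnails along the channel axis, and the pooling before the output layer, are assumptions:

```python
import torch.nn as nn

class ClassificationCNN(nn.Module):
    """6 convolution layers (16, 16, 16, 16, 32, 64 features) of 3x3 kernels."""
    def __init__(self, n_thumbnails=5, n_classes=11):
        super().__init__()
        widths = [16, 16, 16, 16, 32, 64]
        layers, in_ch = [], n_thumbnails       # one channel per thumbnail of the series
        for out_ch in widths:
            layers += [nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU()]
            in_ch = out_ch
        self.features = nn.Sequential(*layers)
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(64, n_classes), nn.Softmax(dim=-1))

    def forward(self, x):                      # x: (batch, n_thumbnails, 64, 64)
        return self.head(self.features(x))    # probability per morphological class
```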
- the neural network is programmed in a Matlab environment (publisher: The MathWorks).
- the neural network was previously trained (step 90), so as to parameterize the convolution filters.
- learning sets were used, each set comprising: thumbnails extracted from images acquired by the image sensor; and a manual annotation of the morphology of each analyzed spermatozoon.
- the annotation was carried out on the basis of thumbnails having undergone a holographic reconstruction.
- FIGS. 7A, 7B and 7C respectively represent an image acquired by the image sensor, a manually annotated reference image, and an image resulting from the detection neural network. Image 7C was obtained without implementing the high-pass filter during step 110.
- FIGS. 8A, 8B and 8C respectively represent an image acquired by the image sensor, a reference image, annotated manually, and an image resulting from the detection neural network.
- Image 8C was obtained by implementing a high pass filter during step 110.
- image 8C includes detected spermatozoa, marked by arrows, which do not appear on image 7C. Applying the filter thus improves the performance of the detection neural network.
- FIGS. 9A and 9B respectively represent the detection sensitivity and the detection specificity of the CNNd detection neural network (curves a, b and c), in comparison with a conventional algorithm (curve d).
- the classical algorithm was based on morphological image-processing operations of the erosion/dilation type.
- curves a, b and c correspond respectively: to the neural network trained with 10,000 annotations; to the neural network trained with 20,000 annotations; and to the neural network trained with 20,000 annotations, each image acquired by the sensor having been subjected to high-pass filtering.
- Sensitivity and specificity were determined on 14 different samples (abscissa axis) of bovine semen diluted by a factor of 10. It is observed that the performances of the neural network, whatever the configuration (curves a, b and c), are superior to those of the classical algorithm, which is remarkable. Furthermore, the best performance is obtained in configuration c, in which the neural network is trained with 20,000 annotations and is fed with an image preprocessed with a high-pass filter. It is recalled that the sensitivity corresponds to the rate of true positives over the sum of the rates of true positives and false negatives, and that the specificity corresponds to the rate of true positives over the sum of the rates of true and false positives.
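- as a sketch, the rates as defined in the preceding sentence reduce to:

```python
def sensitivity(tp, fn):
    # true positives / (true positives + false negatives)
    return tp / (tp + fn)

def specificity(tp, fp):
    # as defined above: true positives / (true positives + false positives)
    return tp / (tp + fp)
```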
- Figure 10A is a confusion matrix representing the classification performance of a classification neural network CNNc parameterized to receive, as an input image, an image resulting from a holographic reconstruction.
- the neural network is a convolutional neural network, as previously described, trained using images acquired with a defocused device, as previously described. For this configuration, each acquired image undergoes a holographic reconstruction, in a sample plane extending through the sample, using a holographic reconstruction algorithm as described in US20190101484. From the reconstructed image, a thumbnail centered on the spermatozoon to be characterized is extracted. The thumbnail forms the input image of the neural network.
- Figure 10B is a confusion matrix representing the classification performance of a classification neural network parameterized to receive, as an input image, a single thumbnail, centered on the spermatozoon to be characterized, and extracted from an image acquired in the defocused configuration.
- the neural network is a convolutional neural network, structured as described in step 160, except that it is fed with a single thumbnail.
- Figure 10C is a confusion matrix representing the classification performance of a classification neural network as described in step 160, fed with a series of 5 thumbnails extracted from images acquired in the defocused configuration.
- each confusion matrix corresponds to classes 1 to 10 previously described.
- a single input image makes it possible to obtain a satisfactory classification performance.
- for morphological classification, it is thus not necessary to have a series comprising several images: a single image may suffice. However, the classification performance is better when several images are used.
- the invention allows an analysis of a sample comprising spermatozoa without resorting to a translation stage. Furthermore, it allows a characterization directly from the image acquired by the image sensor, that is to say on the basis of the diffraction figures of the different spermatozoa, without requiring the use of digital reconstruction algorithms. Currently, the use of such an algorithm results in a processing time of 10 seconds per image, i.e. 300 seconds for a series of 30 images. Furthermore, as represented in FIGS. 10A to 10C, the invention allows a more precise classification than a classification based on a neural network of identical structure fed with reconstructed images.
- Another advantage of the invention is its tolerance to defocusing.
- the inventors estimate that the invention tolerates shifts of ⁇ 25 ⁇ m between the optical system and the sample.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| FR2013979A FR3118169B1 (fr) | 2020-12-22 | 2020-12-22 | Procédé de caractérisation de spermatozoïdes |
| PCT/EP2021/086909 WO2022136325A1 (fr) | 2020-12-22 | 2021-12-20 | Procédé de caractérisation de spermatozoïdes |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP4268120A1 (fr) | 2023-11-01 |
Family
ID=74592263
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP21839575.4A (pending) | Procédé de caractérisation de spermatozoïdes | 2020-12-22 | 2021-12-20 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US20240044771A1 (en) |
| EP (1) | EP4268120A1 (fr) |
| FR (1) | FR3118169B1 (fr) |
| WO (1) | WO2022136325A1 (fr) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12475564B2 (en) * | 2022-02-16 | 2025-11-18 | Proscia Inc. | Digital pathology artificial intelligence quality check |
| FR3154495B1 (fr) * | 2023-10-23 | 2025-09-05 | Commissariat Energie Atomique | Procédé de détection de particules présentes dans un liquide |
Family Cites Families (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE10146902A1 (de) * | 2000-09-25 | 2002-09-19 | Sensovation Ag | Bildsensor, Vorrichtung und Verfahren für optische Messungen |
| US8842901B2 (en) | 2010-12-14 | 2014-09-23 | The Regents Of The University Of California | Compact automated semen analysis platform using lens-free on-chip microscopy |
| WO2014012031A1 (fr) | 2012-07-13 | 2014-01-16 | The Regents Of The University Of California | Suivi de sperme tridimensionnel (3d) exempt de lentille à haut débit |
| FR3028951B1 (fr) | 2014-11-21 | 2017-01-06 | Commissariat Energie Atomique | Systeme d'imagerie sans lentille comprenant une diode, un diaphragme et un diffuseur entre la diode et le diaphragme |
| FR3030749B1 (fr) * | 2014-12-19 | 2020-01-03 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Methode d'identification de particules biologiques par piles d'images holographiques defocalisees |
| FR3034196B1 (fr) | 2015-03-24 | 2019-05-31 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Procede d'analyse de particules |
| FR3034197B1 (fr) | 2015-03-24 | 2020-05-01 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Procede de determination de l'etat d'une cellule |
| FR3036800B1 (fr) | 2015-05-28 | 2020-02-28 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Procede d’observation d’un echantillon |
| FR3049347B1 (fr) | 2016-03-23 | 2018-04-27 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Procede d’observation d’un echantillon par calcul d’une image complexe |
| FR3075372B1 (fr) | 2017-12-18 | 2020-08-28 | Commissariat Energie Atomique | Dispositif et procede d'observation d'un echantillon avec un systeme optique chromatique |
| US10094759B1 (en) * | 2017-12-22 | 2018-10-09 | Hillel Llc | Imaging device for measuring sperm motility |
| FR3081552B1 (fr) * | 2018-05-23 | 2020-05-29 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Dispositif et procede d'observation d'un echantillon fluorescent par imagerie defocalisee |
| FR3087538B1 (fr) | 2018-10-17 | 2020-10-09 | Commissariat Energie Atomique | Procede d'observation d'un echantillon |
- 2020-12-22: FR application FR2013979A filed; published as FR3118169B1 (fr), active
- 2021-12-20: PCT application PCT/EP2021/086909 filed; published as WO2022136325A1 (fr), ceased
- 2021-12-20: US application US 18/258,726 filed; published as US20240044771A1 (en), pending
- 2021-12-20: EP application EP21839575.4A filed; published as EP4268120A1 (fr), pending
Also Published As
| Publication number | Publication date |
|---|---|
| FR3118169B1 (fr) | 2025-10-17 |
| US20240044771A1 (en) | 2024-02-08 |
| FR3118169A1 (fr) | 2022-06-24 |
| WO2022136325A1 (fr) | 2022-06-30 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: unknown |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: the international publication has been made |
| | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Original code: 0009012 |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: request for examination was made |
| 2023-06-15 | 17P | Request for examination filed | Effective date: 20230615 |
| | AK | Designated contracting states | Kind code of ref document: A1; designated states: AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | DAV | Request for validation of the European patent (deleted) | |
| | DAX | Request for extension of the European patent (deleted) | |
| | RAP3 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: COMMISSARIAT A L'ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES |
| | STAA | Information on the status of an EP patent application or granted EP patent | Status: examination is in progress |
| 2025-03-06 | 17Q | First examination report despatched | Effective date: 20250306 |