
WO1992001994A1 - Methods and apparatus relating to micropropagation - Google Patents

Methods and apparatus relating to micropropagation

Info

Publication number
WO1992001994A1
Authority
WO
WIPO (PCT)
Prior art keywords
plant
image
stem
plant material
image signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/GB1991/001266
Other languages
English (en)
Inventor
Nigel James Bruce McFarlane
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BTG International Ltd
National Research Development Corp UK
Original Assignee
National Research Development Corp UK
British Technology Group Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Research Development Corp UK, British Technology Group Ltd filed Critical National Research Development Corp UK
Publication of WO1992001994A1
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01H NEW PLANTS OR NON-TRANSGENIC PROCESSES FOR OBTAINING THEM; PLANT REPRODUCTION BY TISSUE CULTURE TECHNIQUES
    • A01H4/00 Plant reproduction by tissue culture techniques; Tissue culture techniques therefor
    • A01H4/003 Cutting apparatus specially adapted for tissue culture
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/0014 Image feed-back for automatic industrial control, e.g. robot with camera

Definitions

  • The present invention relates to methods and apparatus for use in micropropagation.
  • The invention is concerned in particular with a method of locating a stem of a plant during micropropagation, a method of harvesting a plant, and apparatus for putting the methods into effect.
  • Micropropagation is an increasingly important technique for the rapid production of genetically identical plants. It is a labour-intensive industry in which the potential gains in speed, sterility, and cost saving make automation an attractive prospect.
  • Plants are grown from small pieces of plant tissue in an agar-based medium. After several weeks of growth, the microplants are removed from their containers for dissection, and the pieces which have the potential to grow into new plants are placed in fresh containers of agar, to develop into the next generation.
  • One of the tasks which must be carried out by an automatic micropropagation system is that of harvesting the plants for dissection, i.e. removing the plants, or the wanted parts thereof, from the soft nutrient medium in which they are growing. At present this is done manually by the operator using forceps. In an automated system, the removal will be carried out by a robotic end effector.
  • Vision processing is a method of sensory control which has been applied to similar problems in other areas of agriculture, such as tomato sorting, fruit harvesting, and plant identification. However, these techniques are difficult to apply in micropropagation because of the confused mass of foliage which often occurs.
  • According to the invention there is provided a method of locating a stem of a plant during micropropagation, comprising the steps of generating an image signal representing an image of the plant, and processing the image signal to locate a portion of plant material in the image likely to constitute a stem, by locating a portion of plant material having the following criteria, namely: (i) having a horizontal width lying in a predetermined range of horizontal distance, and (ii) having a length greater than a predetermined value; and having at least one of the following criteria, namely: (iii) having an angle of inclination to the vertical less than a predetermined value, and (iv) having a region of plant material above and/or below the plant portion fulfilling one or more further predetermined criteria.
  • Where references are made to horizontal and vertical, and top and bottom, these are to be taken to be references to horizontal and vertical directions on a normal presentation of the image of the plant on a monitor screen, with the stem substantially upright leading upwardly from the nutrient medium in which it is planted. It is possible however that the plant may be imaged by a camera when in a position other than the normal vertical position. Thus the terms vertical and horizontal, when used with regard to the image signal, do not necessarily relate to the orientation of the plant itself at the actual work station.
  • The method includes processing the image signal to locate a portion of plant material having at least the following criterion, namely: having a region of plant material above and/or below the plant portion which provides a continuous or substantially continuous path of predetermined characteristics leading through plant material from the top and/or bottom of the plant portion.
  • The method includes processing the image signal to locate a portion of plant material having at least the following criterion, namely: having a region of plant material below the plant portion which provides a continuous or substantially continuous downward path through plant material from the bottom of the plant portion to a base level related to the level of a material in which the plant is growing.
  • The image signal is processed first to locate a plurality of portions of plant material in the image which have the first two stated criteria of horizontal width and length, and the processing procedure then selects portions which fulfil the criteria of the angle of inclination and the path through plant material to the growing medium in which the plant is growing.
  • Portions can be selected by reference to the criterion that the portion of plant material has a region of plant material at the top and/or bottom of the plant portion fulfilling one or more predetermined size criteria.
  • True stem portions can be distinguished from false stem portions, such as pieces of leaves, in that stems are normally more nearly vertical than other thin plant portions.
  • True stems can be distinguished from leaves and other parts of the plant by reference to the property, possessed by most genuine stem segments, of not continuing into empty space at either end of the stem portion.
  • The possible stem portions are tested for inclination of the plant portion to the vertical, and for a path through plant material to the surface of the medium in which the plant is growing or, as an alternative, the presence of a mass of plant material of predetermined size above and below the end of the plant portion.
  • The method includes processing the image signal to test for the criterion of a downward path, by the steps of: examining pixels below the bottom of a possible stem portion to determine if there exists a pixel representing plant material which is immediately below, or to one side or the other by a predetermined number of pixels, of a selected pixel in the bottom of the possible stem portion; and repeating the examination step sequentially below each located pixel of plant material to locate subsequent lower pixels which are immediately below, or to one side or the other by a predetermined number of pixels, of previously located pixels in the said downward path.
  • The method may be carried out so that if, in one of the said examination steps, a pixel of plant material is located for which in a subsequent examination step there is found no succeeding lower pixel of plant material, then the said pixel for which there is no succeeding lower pixel of plant material is replaced in the image signal by a pixel which does not represent plant material.
  • The, or one of the, predetermined size criteria may be that, at the top and/or bottom of the plant portion, the horizontal width of the plant portion must be greater than a predetermined value, which is itself set to be greater than the lower end of the said range of horizontal width distance specified for the plant portion.
  • The, or one of the, said predetermined size criteria may be that, above the top and/or below the bottom of the plant portion, there should be in the image an area of plant material of more than a predetermined value.
  • The said criterion of angle of inclination is that the angle to the vertical of the most nearly vertical side of the plant portion should be less than a predetermined value.
  • The said predetermined value is chosen from the range 15° to 30° to the vertical, preferably 20° to 25°.
  • The criterion for the horizontal width of the plant portion is that the horizontal width must lie within the said predetermined range throughout the length of the plant portion.
  • The invention finds particular application where it is required to insert into a group of plants growing in a container a robotic end effector, for example a pair of gripping fingers which will close together and grip a stem of a plant.
  • In such a case it may be desirable to have depth information indicating the distance of a selected stem from a fixed point of observation.
  • The method may include generating two or more image signals representing images of the plant when observed from different directions, comparing the image signals and deriving depth information relating to the distance of different parts of the plant from a fixed point of observation of the plant, and storing depth information in association with located plant portions of a single image of the plant which are likely to represent stems.
  • The depth information may be used when setting criteria for selecting likely portions of plant which constitute suitable stem portions for grasping.
  • The method may include deriving from the depth information for a likely stem portion a measure of the accuracy of the depth information associated with that stem portion, and selecting a preferred stem portion by criteria including the accuracy of the depth information associated with the stem portion.
  • The two or more image signals are generated by maintaining the direction of observation of the plant constant, and rotating the plant relative to the direction of observation of the plant.
  • The method includes the further step of processing the image signal to extrapolate the image of the selected portion of plant material to represent an extension of the selected plant portion in the downward direction.
  • The method includes the step of processing the image signal to locate the intersection of the said extension of the plant portion with a base level related to the level of a material in which the plant is growing.
  • The image signal is adapted to present an image of the plant by a horizontally scanned image raster, the method including the steps of: scanning across the image to locate a plant portion identified by a transition from background to plant material followed by a transition from plant material to background, the transitions being separated by a horizontal distance which lies in the said predetermined range of distance; scanning along the next line of the image raster to test for the presence of a subsequent adjacent scan containing a pair of transitions separated from each other by a horizontal distance in the same range, and positioned laterally within a predetermined relationship with the previous pair of transitions; and continuing with subsequent scans to test for the presence of at least a predetermined number of successive pairs of transitions fulfilling the above requirements, and thereby locating a portion of the plant image likely to constitute a stem.
  • The invention also provides a method of harvesting a plant during micropropagation comprising the steps of locating a stem of the plant in accordance with the method of any of the preceding paragraphs, generating a control signal related to the location of a stem, and removing the plant from a growing medium at the said location by robotic means under the control of the said control signal.
  • The method includes the step of gripping the plant at the said location by robotic gripping means under the control of the said control signal.
  • The said robotic gripping means is directed to grip the plant at a location related to the said extension of the plant portion.
  • The invention also provides apparatus for use in micropropagation comprising means for generating an image signal representing an image of the plant; and signal processing means for processing the image signal to locate a required portion of plant material and to generate an output signal containing information as to the location of the plant portion, the processing means operating to locate the portion of plant material in the image as having the following criteria, namely: (i) having a horizontal width lying in a predetermined range of horizontal distance, and (ii) having a length greater than a predetermined value; and having at least one of the following criteria, namely: (iii) having an angle of inclination to the vertical less than a predetermined value, and (iv) having a region of plant material above and/or below the plant portion fulfilling one or more further predetermined criteria.
  • The apparatus may further comprise means for presenting a plurality of plants at a work station, and robotic means for removing a plant or a required portion thereof from a growing medium at a selected location under the control of the said output signal.
  • Figure 1 is a diagrammatic representation of apparatus embodying the invention for locating and selecting stems of plants and for harvesting plants, during micropropagation;
  • Figures 2a to 2d show a flow chart of a routine for processing a visual image signal in an embodiment of the invention;
  • Figure 3 shows a visual image, thresholded into two grey levels, of a microplant in a container;
  • Figures 3a, 3b, and 3c show the image of Figure 3 after processing to various stages in the flow chart of Figures 2a to 2d, including an additional step of signal processing which may be carried out;
  • Figure 4 is a representation of a close-up of a stem segment in an image of a plant, found by use of a method embodying the invention;
  • Figures 5a to 5e are diagrammatic representations of pixels illustrating tests carried out at step 241 in the flow chart of Figure 2c;
  • Figures 6a to 6b show a flow chart of an alternative routine for processing a visual image signal in an alternative embodiment of the invention.
  • A typical configuration of microplants at this stage in the process is shown in Figure 1.
  • A container 11 has been cut down from the normal shape of a margarine tub, to provide a shallow tray of agar 12 in which grow three plants 13, nominally arranged in a row as shown.
  • A typical example of plant is chrysanthemum.
  • The tray may measure 80mm by 50mm, having a depth of 10mm of agar, and typically the plants are 75mm high.
  • A harvesting tool 19 constituting a robotic end effector has a pair of gripper rods 20 and 21 extending towards the plants.
  • The end effector 19 may be driven by a control device 17 in three orthogonal directions indicated at X, Y and Z in Figure 1.
  • The plant can be grasped by the gripper rods 20 and 21 moving together to grip the stem 14, and the plant can be removed from the container 11.
  • The stems are preferentially grasped close to the bottom because it is assumed that the stem is least likely to break at that point when gripped.
  • The plants may be removed for example by cutting at the base, for example by a laser or by a blade mounted on a robotic tool, or by pulling the plant vertically from the agar.
  • The robotic harvesting tool 19 can be automatically guided.
  • Movement of the end effector 19 in the Z direction can either be avoided, by growing the plants in a single row, or, where for example six plants are grown in two rows, the necessary depth information can be obtained from a second image of the container, slightly rotated from its original position.
  • Figure 1 illustrates in diagrammatic form an arrangement for generating an image signal representing the plants by means of a solid state video camera 15 directed to the plants 13 to capture monochrome images of the microplants in the container.
  • The effect of back lighting can be achieved by using an opaque tunnel (not shown) to shield the plants from overhead and transverse light, with an inclined sheet of white card placed so as to reflect lighting from behind the plants, imaging them in silhouette.
  • The video signal from the camera 15 is transmitted along a line 23 to a main microcomputer 24.
  • The microcomputer 24 constitutes signal processing means for processing the image signal to locate a portion (also referred to as a segment) of plant material in the image which is identified as a stem.
  • The microcomputer 24 then produces a control signal containing information relating to the location of the stem, and the control signal is passed along a further line 25 to operate the control device 17 to effect movement of the end effector 19.
  • There will now be described the algorithm used, in accordance with a preferred embodiment of the invention, to locate, using computer vision, stems or parts of stems suitable for the harvesting robotic end effector to grasp.
  • The video image is digitised and placed in RAM by a Frame Grabber Card and is continuously displayed from RAM on a monitor screen 26.
  • Software is used to contract the image into one quarter of the monitor screen, after which the dimensions of the image are 256x256 pixels with a resolution of 128 grey levels.
  • The image produced by the camera 15 is thresholded into two grey levels: grey for the plants and white for the background.
  • Figure 3 shows a typical thresholded image, in which the locations of the stems are clearly visible to the human eye. In this image, six plants in two rows are presented. It is unusual for all six stems to be visible at the same time in the crowded container. However this is not important to the practical performance of the algorithm, because hidden stems are gradually revealed by the removal of occlusions as the more readily-visible plants are harvested.
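The thresholding step described above can be sketched in a few lines of Python. This is a minimal illustration only: the cut-off value of 64 and the function name are assumptions, since the patent states only that the 128-grey-level image is reduced to two levels, with plants dark against a back-lit white background.

```python
# Minimal sketch of the thresholding step: every pixel darker than a chosen
# cut-off becomes "plant" (1), everything else "background" (0).  The cut-off
# of 64 is an illustrative assumption; the text only says the 128-grey-level
# image is reduced to two levels.
def threshold_image(grey, cutoff=64):
    """Return a binary image: 1 = plant material, 0 = background."""
    return [[1 if pixel < cutoff else 0 for pixel in row] for row in grey]

# Back-lit plants appear dark against the white card, so low grey values
# correspond to plant material.
example = [
    [120, 120, 30, 120],
    [120, 40, 35, 120],
]
binary = threshold_image(example)
```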
  • The algorithm works by searching the image for features corresponding to the definition of a stem summarised in the following table.
  • The algorithm can be summarised as the steps of selecting a plant portion which satisfies the criteria that:
  • (i) the horizontal width of the plant portion, throughout its length, lies in the range two to eight pixels; and (ii) the length of the plant portion is greater than three pixels; and (iii) the angle to the vertical of the average of the two vertical sides of the plant portion is less than 22°; and (iv) there exists in the line of the scanning raster below the bottom of the stem portion a pixel representing plant material which is immediately below the centre of the bottom of the stem portion, or is positioned to one side or the other by one pixel; and that in succeeding lines of the scanning raster, down to the level of the material in which the plant portion is growing, there is in each succeeding scanning line a pixel of plant material which is either immediately below a pixel of plant material located in the previous line, or is to one side or the other of that pixel by the width of one pixel.
  • The processing starts at step 220 by the video camera 15 observing the plant from a first direction.
  • The processing means 24 captures a first image from the camera 15 and stores this first image.
  • A motor (not shown) rotates the container 11 by a predetermined amount, in this case about 4°, and a second visual image of the plant is taken by the camera 15.
  • The second video image is captured by the microcomputer 24 and stored.
  • Both the first image and the second image are thresholded to give an image of two grey levels, such as is shown in Figure 3.
  • The two images are compared in the processing means 24 to derive depth information from the stereo images, and the processing means calculates the distances of edge pixels from the camera in the second image.
  • At step 226 the first image is discarded, and from this step onwards the processing is carried out with regard to the second image, shown for example in Figure 3.
  • The flow chart terminates at A, and then continues in Figure 2b, starting at A.
  • The processing continues at step 230, and the algorithm scans the image row by row until a pair of left and right hand edge points are found, at steps 231 and 232, which indicate a white-to-grey transition followed by a grey-to-white transition.
  • The width between the transitions is then tested at step 233 to locate a pair of left and right hand edge points which are 2 to 8 pixels apart.
  • The signal processing means then tracks the stem downwards in step 234 until a stopping condition is reached.
  • The tracking of the stem downwards can be said to comprise the steps of: scanning across the image to locate a plant portion identified by a transition from background to plant material followed by a transition from plant material to background, the transitions being separated by a horizontal distance which lies in the said predetermined range of distance; scanning along the next line of the image raster to test for the presence of a subsequent adjacent scan containing a pair of transitions separated from each other by a horizontal distance in the same range, and positioned laterally within a predetermined relationship with the previous pair of transitions; and continuing with subsequent scans to test for the presence of at least a predetermined number of successive pairs of transitions fulfilling the above requirements, and thereby locating a portion of the plant image likely to constitute a stem.
  • The tracking is stopped if (a) the stem thickness becomes thinner than 2 pixels or thicker than 8, or (b) either edge deviates suddenly by more than 2 pixels to the right or left.
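The edge-pair search and downward tracking of steps 230 to 234 might be sketched as follows. The function names and the list-of-lists image representation are assumptions; only the 2 to 8 pixel width range and the 2-pixel jump limit come from the text.

```python
# Sketch of the downward tracking at step 234 (illustrative names).  Starting
# from a left/right edge pair on one raster line, the tracker moves down one
# line at a time, accepting the segment only while the width stays in the
# 2-8 pixel range and neither edge jumps sideways by more than 2 pixels.
MIN_W, MAX_W, MAX_JUMP = 2, 8, 2

def edge_pair(row, start=0):
    """First background-to-plant / plant-to-background transition pair at or
    after column `start`.  1 = plant, 0 = background.  Returns (left, right)
    pixel columns, or None if no pair exists."""
    x = start
    while x < len(row) and row[x] == 0:
        x += 1
    if x == len(row):
        return None
    left = x
    while x < len(row) and row[x] == 1:
        x += 1
    return (left, x - 1)

def track_down(image, top, left, right):
    """Follow the pair of edges downwards; return the bottom row reached."""
    y = top
    while y + 1 < len(image):
        nxt = edge_pair(image[y + 1], max(left - MAX_JUMP, 0))
        if nxt is None:
            break
        nl, nr = nxt
        width = nr - nl + 1
        if not (MIN_W <= width <= MAX_W):        # stopping condition (a)
            break
        if abs(nl - left) > MAX_JUMP or abs(nr - right) > MAX_JUMP:
            break                                # stopping condition (b)
        y, left, right = y + 1, nl, nr
    return y
```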
  • The image signal at the output of step 234 is indicated as image I, which is shown in Figure 3a.
  • Figure 4 shows an enlargement of part of an image of a plant. The portion shown in Figure 4 appears in the top left hand corner of the plant image of Figure 3a, and is shown to illustrate how the tracking proceeds.
  • Figure 4 shows a stem segment after tracking, with the pixels visited marked in heavy black. In the case shown, the segment was followed downwards for 24 pixels until tracking was stopped by the thickness expanding to more than 8 pixels, and by the right hand edge moving more than 2 pixels to the right.
  • The algorithm rejects stem segments having a length less than 3 pixels.
  • The position of the stem is recorded and added to a list of possible stem segments. After tracking a segment, the coordinates of the sides of the segment are recorded, and the search for more stems is resumed from the point of entry into the segment. This continues until the end of the screen is reached, at step 237.
  • The flow chart is continued in Figure 2c, being linked to the part shown in Figure 2b by the flow line B.
  • The candidate stem segments are extracted from the list for examination against further criteria.
  • The set of candidate stem segments found after scanning the entire image includes leaf-stems, noise and some thin, vertical pieces of leaves amongst the genuine stems.
  • Stems are readily distinguishable from the leaf-stems by their angle to the vertical.
  • The angle of a stem segment to the vertical is calculated from the co-ordinates of its corners; the angles of the two straight lines drawn from the bottom left to top left and from bottom right to top right are both calculated, and the average angle to the vertical of the two sides is taken to be that of the stem.
  • The selection by the criterion of angle is carried out at step 239.
  • Stems are also distinguished from noise and many of the leaf pieces by their length. Noise and leaf pieces rarely yield a coherent pair of edges over more than a few pixels. Length is measured as the vertical difference between the top and bottom coordinates of the segment. This step has already been dealt with in step 235. Thus stems are initially defined as a pair of edges, 2 to 8 pixels thick, with a length greater than 3 pixels and an angle to the vertical of not more than 22°.
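The length and angle tests just described can be sketched as follows. The corner layout and function names are illustrative assumptions; the greater-than-3-pixel length and the 22° limit are from the text, as is the averaging of the two side angles.

```python
import math

# Sketch of the angle test at step 239 (illustrative names).  The angle of
# each side is taken from the straight line joining its bottom corner to its
# top corner, and the two side angles are averaged, as the text describes.
MAX_ANGLE_DEG = 22
MIN_LENGTH = 3

def side_angle(bottom_xy, top_xy):
    """Angle to the vertical, in degrees, of the line bottom -> top."""
    dx = top_xy[0] - bottom_xy[0]
    dy = bottom_xy[1] - top_xy[1]          # rows increase downwards
    return math.degrees(math.atan2(abs(dx), abs(dy)))

def is_stem_candidate(corners):
    """corners = (top_left, top_right, bottom_left, bottom_right), each (x, y)."""
    tl, tr, bl, br = corners
    length = bl[1] - tl[1]                 # vertical extent in pixels
    angle = (side_angle(bl, tl) + side_angle(br, tr)) / 2.0
    return length > MIN_LENGTH and angle < MAX_ANGLE_DEG
```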
  • An image typically contains some 50 remaining segments, including perhaps about 3 errors where a feature is falsely classified as a stem. Many of the errors are removed by use of the property possessed by genuine stem segments of continuing downwardly through plant material to the surface of the growing medium such as agar.
  • The algorithm effects the test for a continuous downward path through plant material to the agar, as follows.
  • The method is illustrated in Figure 3b, where the completed paths are indicated at 30, 31, 32, 33, 34, and 35, and in Figures 5a to 5e, where the testing of individual pixels is illustrated.
  • The method carried out in step 241 of the flow chart begins by examining the middle pixel of the bottom row of a stem candidate.
  • The central pixel is indicated by way of example in Figure 5a at 510.
  • The algorithm looks one row down in the scanning raster, and examines three pixels indicated at 511, 512 and 513.
  • The pixels 511 and 512 are both found to be pixels representing plant material (hereinafter referred to as plant pixels).
  • The algorithm is arranged first to examine pixel 512, the pixel immediately below the previous pixel 510.
  • The algorithm next examines the three pixels in the next row down, comprising a pixel immediately below pixel 512, indicated at pixel 516, and the pixels on either side, indicated at 515 and 517. In this case, none of the three pixels represents plant material, as shown in Figure 5c.
  • The algorithm then returns to consideration of pixel 510, and takes two further steps. Firstly the algorithm replaces the plant pixel 512 by a blank, or non-plant, pixel, to avoid any further attempts to track down a path through pixel 512. Next the algorithm examines the two pixels on either side of 512, and tries firstly the pixel to the left in Figure 5d, namely pixel 511.
  • Pixel 511 is found to be a plant pixel, so that the path, for example path 31, now continues downwardly, having moved through an angle of 45° to one side, as shown in Figure 5e.
  • The next step is to test the three pixels immediately below pixel 511, indicated at pixels 514, 515 and 516.
  • The routine established will again first test the central pixel 515, and will preferably follow the vertically downward path through pixel 515. Only if this path is stopped by meeting a non-plant pixel in a lower raster scan will the algorithm withdraw back up to pixel 511 and then move sideways to examine pixels 514 and 516.
  • The outcome of step 241 in Figure 2c is that a series of paths 30 to 35 are traced downwardly through the plant material to the level of the agar at 12. If any stem segment fails to meet the test of the routine in step 241, that stem segment is deleted from the list of candidate stems, at step 242 in the flow chart.
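The downward-path test of step 241, including the blanking of dead-end pixels described for pixel 512, can be sketched as a small recursive routine. The names are illustrative; the centre-first, then one-pixel-either-side search order follows the text.

```python
# Sketch of the downward-path test of step 241 (illustrative names).  From
# the middle pixel of a candidate's bottom row, the routine steps down one
# raster line at a time, trying the pixel directly below first and then one
# pixel to either side.  A plant pixel with no plant pixel below it is
# blanked out (set to 0), as described for pixel 512, so that failed
# branches are not revisited.
def path_to_base(image, x, y, base_row):
    """True if a connected downward path of plant pixels reaches base_row."""
    if y >= base_row:
        return True
    below = y + 1
    for nx in (x, x - 1, x + 1):           # centre first, then the sides
        if 0 <= nx < len(image[below]) and image[below][nx] == 1:
            if path_to_base(image, nx, below, base_row):
                return True
            image[below][nx] = 0           # dead end: blank the pixel
    return False
```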
  • The algorithm makes use of various depth information stored at previous step 225 in Figure 2a, to remove less likely stem segments.
  • The algorithm examines stem segments taken from the remaining list of candidate stems.
  • The processing means looks at each pixel of a stem segment, and averages the stereo, or depth, information from each pixel over the whole length of the stem segment. This averaged depth information for the whole stem segment is stored for future use.
  • The processing means calculates a measure of the accuracy of the stereo information for each entire stem segment and again stores the measure.
  • The algorithm averages stereo information of the individual pixels, to give overall information as to the overall movement of that stem portion transversely during the rotation of the container 11 between the two stereo visual images.
  • The estimate of accuracy (or inverse of error) made at step 246 takes the form of an inverse of standard deviation, and gives an estimate of the accuracy of that signal.
  • Where the stereo information is poor, the error could be 5mm or 6mm, or even up to 10mm; where it is good, the error will be within about 2mm. This is used later in the flow chart by a decision to select for grasping a stem segment which has the best stereo information available for the positioning of the gripping elements.
  • At step 247 a determination is made as to when the last stem segment in the list has been dealt with, and the flow chart moves through connection C, to the remainder of the chart shown in Figure 2d.
  • The processing means examines the first candidate segment from the remaining list and begins to construct a score for each candidate segment, comprising three components.
  • The components may be summarised as follows:
  • One hundred units are added to the score for the segment if at least some stereo information is available for that stem segment.
  • The stereo information available is considered again to determine whether there is sufficient clearance for the gripping rods 20 and 21 to be inserted into the mass of plants, to remove a stem portion.
  • Pixels are examined on either side of a candidate stem segment to check if there are any edge pixels in front of the stem-portion space which will get in the way of the gripping rods 20 and 21.
  • A higher score is given to stem portions which do not have any obstructions in the space on either side of the stem segment in the two-dimensional visual image of Figure 3, but positioned in front of the relevant stem segment area, so far as the stereo information is concerned.
  • At step 254 there is added to the score for a stem segment a number from one to nine representing the accuracy of the stereo information available.
  • At step 255 a check is made as to when the last stem segment has been dealt with, and the routine then passes to step 256.
  • At step 256 the stem segments are arranged in a list in order of the score for each stem segment.
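The scoring and sorting of steps 250 to 256 might be sketched as follows. The 100-unit stereo bonus and the 1-to-9 accuracy component are from the text; the value of the clearance bonus and the dictionary field names are assumptions.

```python
# Sketch of the candidate scoring (field names and the clearance bonus value
# are assumptions; the text gives three components: 100 units if any stereo
# information exists, a bonus for an unobstructed approach for the gripper
# rods, and 1-9 units for the accuracy of the stereo information).
STEREO_BONUS = 100
CLEARANCE_BONUS = 10     # illustrative value; the patent gives no figure

def score(segment):
    s = 0
    if segment.get("has_stereo"):
        s += STEREO_BONUS
    if segment.get("clear_approach"):
        s += CLEARANCE_BONUS
    s += segment.get("stereo_accuracy", 0)   # 1 (poor) to 9 (good)
    return s

def rank(segments):
    """Best candidate first, as in the sorted list of step 256."""
    return sorted(segments, key=score, reverse=True)
```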
  • The robotic end effector 19 is directed by the control device 17 to the centre of the best stem selected, i.e. the stem portion at the head of the list sorted at step 256.
  • As the algorithm proceeds down the list of stems, the accuracy of the stereo information decreases, so that the doubt as to the depth position of a plantlet increases. For this reason, it is then wise to advance the gripper rods 20 and 21 to a greater extent into the container 11, relative to the depth position expected, to ensure that the stem is gripped. If for example an early plantlet on the list has a deviation or error of 2mm to 3mm in its depth position, it is sufficient to advance the grippers beyond the expected position by say 5mm. As the container is emptied, and candidates lower down the list are selected, the error may be in the range 5mm to 6mm, so that it is wise to advance the rods 20 and 21 by, say, 10mm beyond the expected position.
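The overshoot rule above can be reduced to a tiny helper. Only the two worked cases from the text are encoded (2-3mm error giving a 5mm advance, 5-6mm error giving a 10mm advance); the 3mm boundary between them is an assumption.

```python
# Sketch of the depth-overshoot rule (only the two worked figures from the
# text are encoded; the 3mm boundary between the two cases is an assumption).
def gripper_advance_mm(depth_error_mm):
    """How far beyond the expected depth to push the gripper rods 20 and 21."""
    return 5.0 if depth_error_mm <= 3.0 else 10.0
```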
  • the plantlet selected is either cut and removed, or is pulled upwardly and removed.
  • the routine ends at step 259.
  • the use of stereo information in the algorithm will now be commented on.
  • the need for stereo information is to avoid the end effector 19 proceeding into the container 11 and removing two or more plants at once which are positioned one in front of the other as far as the camera is concerned.
  • the rotation of the container avoids the need to calibrate two cameras.
  • a camera is calibrated by setting up a grid at the back of the container 11, and arranging for the robotic end effector 19 to draw a series of dots on the grid.
  • the position of the dots on a monitor screen 26 is then compared with the actual position of the dots in the grid, so that a correlation can be produced between position on the screen, and position of the tool 19 in real space.
  • Another advantage of rotation of the container 11, is that if it occurs that no suitable stem portions are located by the processing means 24, from one or two views presented by the camera 15, then the entire container 11 can be rotated through a much greater angle, and the sequence can be tried again.
  • the container 11 can be moved through a number of large angles, until a suitable pair of visual images is obtained, showing suitable stem segments.
  • the algorithm used to obtain stereo information is a simple, known algorithm. Initially edges are located in the first image, and the second image is then examined. The algorithm looks at six pixels on each side of the computed position for the same edge in the second image and, having located the edge, calculates the difference in distance between the two edges.
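The six-pixel edge search can be sketched as follows. This is a Python illustration; the boolean edge-map representation and the function name are assumptions, not from the patent.

```python
def match_edge(edge_row, predicted_col, window=6):
    """Locate the corresponding edge in the second image of a stereo pair.

    edge_row is one raster row of the second image's edge map (True where
    an edge pixel was detected).  The search covers up to `window` pixels
    on each side of the column predicted from the first image, nearest
    offsets first.  Returns the disparity (offset from the prediction),
    or None if no edge lies within the search window.
    """
    for offset in sorted(range(-window, window + 1), key=abs):
        col = predicted_col + offset
        if 0 <= col < len(edge_row) and edge_row[col]:
            return offset
    return None
```

The returned disparity is what encodes the depth of the stem relative to the camera.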
  • thickness of top (or bottom) end of stem: greater than 2 pixels.
  • plant pixels in the adjacent row: greater than 1. 'Adjacent row' is defined as the pixels immediately above the top of the stem (or below the bottom) in a horizontal span equal to the thickness of the stem end.
  • the algorithm can be summarised as the steps of selecting a plant portion which satisfies the criteria that:
  • (i) the horizontal width of the plant portion, throughout its length, lies in the range two to eight pixels; and (ii) the length of the plant portion is greater than five pixels; and (iii) the angle to the vertical of the most nearly vertical side of the plant portion is less than 22°; and (iv) the horizontal widths of the top and bottom ends of the plant portion are each greater than two pixels; and
  • (v) the adjacent lines of the scanning raster, above the top of the plant portion and below the bottom of the plant portion, each contain, in the region respectively immediately above and below the plant portion, more than one pixel representing plant material.
  • the processing starts at step 630 and only a single visual image is used so that no stereo information is available.
  • the algorithm scans the image row by row until a pair of left and right hand edge points are found, at steps 631 and 632, which indicate a white-to-grey transition followed by a grey-to-white transition.
  • the width between the transitions is then tested at step 633 to locate a pair of left and right hand edge points which are 2 to 8 pixels apart.
  • the signal processing means then tracks the stem downwards in step 634 until a stopping condition is reached. The tracking is stopped if (a) the stem thickness becomes thinner than 2 pixels or thicker than 8, or (b) if either edge deviates suddenly by more than 2 pixels to the right or left.
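Steps 631 to 634 can be sketched as a row scan followed by a downward track. This Python illustration assumes a boolean plant-material image, and the tracker is simplified in that it re-scans each new row from the left rather than searching only near the current edges:

```python
def find_edge_pair(row_pixels):
    """Steps 631-633: scan one raster row for a white-to-grey transition
    followed by a grey-to-white transition, i.e. a run of plant pixels,
    and accept it if it is 2 to 8 pixels wide.  row_pixels is a list of
    booleans (True = plant material).  Returns (left, right) columns of
    the first acceptable run, or None."""
    col, n = 0, len(row_pixels)
    while col < n:
        if row_pixels[col]:                  # white-to-grey transition
            left = col
            while col < n and row_pixels[col]:
                col += 1
            right = col - 1                  # grey-to-white transition
            if 2 <= right - left + 1 <= 8:   # width test of step 633
                return (left, right)
        else:
            col += 1
    return None


def track_down(image, row, left, right):
    """Step 634: follow the edge pair downwards until the width leaves the
    2-8 pixel band or either edge jumps sideways by more than 2 pixels.
    Returns (row, left, right) of the last row successfully tracked."""
    while row + 1 < len(image):
        pair = find_edge_pair(image[row + 1])
        if pair is None:
            break                            # stem too thin or too thick
        new_left, new_right = pair
        if abs(new_left - left) > 2 or abs(new_right - right) > 2:
            break                            # sudden sideways deviation
        row, left, right = row + 1, new_left, new_right
    return row, left, right
```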
  • image I is generally the same as that shown in Figure 3a for the previous embodiment.
  • all possible stem portions are shown, even if these are very short, for example as indicated at 29.
  • the algorithm rejects stem segments having a length less than 5 pixels.
  • the position of the stem is recorded and added to a list of possible stem segments.
  • the coordinates of the corners, that is to say the tops and bottoms of both edges, are recorded, and the search for more stems is resumed from the point of entry into the segment. This continues until the end of the screen is reached, at step 637.
  • the candidate stem segments are extracted from the list for examination against further criteria.
  • the set of candidate stem segments found after scanning the entire image includes leaf-stems, noise and some thin, vertical pieces of leaves amongst the genuine stems.
  • the stems are readily distinguishable from the leaf-stems by their angle to the vertical.
  • the angle of a stem segment to the vertical is calculated from the co-ordinates of its corners; the angles of the two straight lines drawn from the bottom left to top left and from bottom right to top right are both calculated, and the angle closest to the vertical is taken to be that of the stem. Taking the most vertical side of the stem segment as a measure of its angle allows short segments, which are more prone to errors in this quantity, to be more easily recognised as stems.
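The angle calculation from the corner coordinates can be sketched as follows (Python; the corner ordering and the image convention of y increasing downwards are assumptions):

```python
import math

def stem_angle(top_left, bottom_left, top_right, bottom_right):
    """Angle to the vertical of a stem segment, taken as the more nearly
    vertical of its two sides, as the text describes.  Corners are (x, y)
    pairs with y increasing down the image; the result is in degrees,
    0 meaning exactly vertical."""
    def side_angle(top, bottom):
        dx = abs(top[0] - bottom[0])        # horizontal drift of the side
        dy = abs(top[1] - bottom[1])        # vertical extent of the side
        return math.degrees(math.atan2(dx, dy))
    return min(side_angle(top_left, bottom_left),
               side_angle(top_right, bottom_right))
```

Taking the minimum of the two side angles is what lets a short segment with one ragged edge still be recognised as near-vertical.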
  • the selection by the criterion of angle is carried out at step 639.
  • Stems are also distinguished from noise and many of the leaf pieces by their length. Noise and leaf pieces rarely yield a coherent pair of edges over more than a few pixels. Length is measured as the vertical difference between the top and bottom coordinates of the segment.
  • stems are initially defined as a pair of edges, 2 to 8 pixels thick, with a length greater than 5 pixels and an angle to the vertical of not more than 22°.
  • an image typically contains some 20 remaining segments, including perhaps about 3 errors where a feature is falsely classified as a stem.
  • Many of the errors are removed by use of the property possessed by genuine stem segments of not continuing into empty space at either end; stem segments always end in a leaf or another stem segment at the top, or in leaf, stem or agar at the bottom. Segments are rejected which are thinner than 3 pixels at either end (step 640), or which do not continue into at least two plant pixels immediately above and below the ends (step 641). Approximately half the errors are removed by this criterion.
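The continuation test of step 641 can be sketched as follows (Python; a boolean plant-material image and the function name are assumptions):

```python
def continues_into_plant(image, row, left, right, min_pixels=2):
    """Step 641 check: count plant pixels in the raster line at `row`
    over the horizontal span [left, right] (the thickness of the stem
    end) and require at least `min_pixels` of them.  image[r][c] is True
    where the pixel is plant material; a row outside the image fails,
    since a genuine stem never ends in empty space."""
    if not 0 <= row < len(image):
        return False
    return sum(1 for c in range(left, right + 1)
               if 0 <= c < len(image[row]) and image[row][c]) >= min_pixels
```

The test would be applied twice per segment: once to the raster line just above the top end, and once to the line just below the bottom end.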
  • the output image signal is indicated at II representing an image generally similar to that shown in Figure 3c for the previous embodiment.
  • the algorithm locates the longest stem of the stems shown in Figure 3c, and at step 645 the algorithm extrapolates the longest stem downwardly towards the container 11. Also at step 645, the algorithm locates a region at the intersection of the extrapolated longest stem with the general level of the agar 12 in the container 11.
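The extrapolation of step 645 reduces to projecting the line through the stem's end points down to the agar level. A Python sketch (the point format and names are assumptions):

```python
def agar_intersection(top, bottom, agar_y):
    """Step 645: extrapolate the line through the stem's top and bottom
    points down to the agar surface at image row agar_y.  Points are
    (x, y) with y increasing downwards; returns the x coordinate of the
    intersection, i.e. the region towards which the gripper is directed."""
    (x0, y0), (x1, y1) = top, bottom
    if y1 == y0:
        return x1                       # degenerate: no vertical extent
    slope = (x1 - x0) / (y1 - y0)       # horizontal drift per image row
    return x1 + slope * (agar_y - y1)
```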
  • the output signal after step 645 is indicated at III, and represents a visual image generally the same as that shown in Figure 3d for the previous embodiment.
  • the longest stem section is indicated at 28, and the intersection of the extrapolated stem and the surface of the agar 12, is indicated at 29.
  • the final steps of the flow chart are that, at step 646 the robotic end effector is directed to grasp the selected stem in the region of the location 29 in Figure 3d, and at step 647 the selected stem is cut and the plant removed from the container 11. The algorithm is then stopped at step 648.

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biotechnology (AREA)
  • Developmental Biology & Embryology (AREA)
  • Robotics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Cell Biology (AREA)
  • Botany (AREA)
  • Environmental Sciences (AREA)
  • Image Analysis (AREA)

Abstract

A method of locating the stem of a plant during a micropropagation process comprises the steps of generating, by means of a video camera, an image signal representing an image of the plants, and processing the image signal to locate plant portions having widths and lengths capable of constituting stem portions. Stems are then selected by choosing plant portions which satisfy the following criteria: the angle to the vertical of the most nearly vertical side of the plant portion is less than a predetermined angle, and a continuous downward path can be traced from the base of the plant to, and through, the surface of the material in which the plant is growing. A further criterion which may be used is that the plant portion terminates, at its top and at its bottom, in a region of plant material whose size exceeds respective predetermined criteria. The location signals are then processed to produce a control signal for directing a robotic end effector to move to the desired location, grip the selected stem, and remove the plant, or the desired part of it, from the nutrient medium in which it is growing. Stereo information can be calculated by viewing the plant from slightly different viewpoints.
PCT/GB1991/001266 1990-07-26 1991-07-26 Procedes et dispositif concernant la micropropagation Ceased WO1992001994A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB9016443.5 1990-07-26
GB909016443A GB9016443D0 (en) 1990-07-26 1990-07-26 Methods and apparatus relating to micropropagation

Publications (1)

Publication Number Publication Date
WO1992001994A1 true WO1992001994A1 (fr) 1992-02-06

Family

ID=10679710

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1991/001266 Ceased WO1992001994A1 (fr) 1990-07-26 1991-07-26 Procedes et dispositif concernant la micropropagation

Country Status (4)

Country Link
EP (1) EP0540627A1 (fr)
AU (1) AU8304891A (fr)
GB (2) GB9016443D0 (fr)
WO (1) WO1992001994A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0500886A4 (en) * 1990-09-07 1993-01-07 The Commonwealth Industrial Gases Limited Automatic plant dividing system
FR2725812A1 (fr) * 1994-10-17 1996-04-19 Cirad Coop Int Rech Agro Dev Procede d'identification d'objets, especes ou individus divers et applications de ce procede
DE19845883B4 (de) * 1997-10-15 2007-06-06 LemnaTec GmbH Labor für elektronische und maschinelle Naturanalytik Verfahren zur Bestimmung der Phytotoxizität einer Testsubstanz

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI401412B (zh) * 2009-11-13 2013-07-11 Inst Information Industry Automatic measurement system, method and recording medium for plant characteristics

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4613269A (en) * 1984-02-28 1986-09-23 Object Recognition Systems, Inc. Robotic acquisition of objects by means including histogram techniques
WO1986006576A1 (fr) * 1985-05-15 1986-11-20 The Commonwealth Industrial Gases Limited Procede et appareil de division de materiaux vegetaux


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Patent Abstracts of Japan, volume 13, no. 157 (p.857) [3505] 17 April 1989, & JP-A-63316277 (TOSHIBA CORP.) 23 December 1988 *
Patent Abstracts of Japan, volume 13, no. 157 (P-857)[3505] 17 April 1989, & JP, A, 63-316277 (TOSHIBA CORP.) 23 December 1988, see the whole abstract *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0500886A4 (en) * 1990-09-07 1993-01-07 The Commonwealth Industrial Gases Limited Automatic plant dividing system
US5370713A (en) * 1990-09-07 1994-12-06 The Commonwealth Industrial Gases Limited Automatic plant dividing system
FR2725812A1 (fr) * 1994-10-17 1996-04-19 Cirad Coop Int Rech Agro Dev Procede d'identification d'objets, especes ou individus divers et applications de ce procede
WO1996012237A1 (fr) * 1994-10-17 1996-04-25 C.I.R.A.D. (Centre De Cooperation Internationale En Recherche Agronomique Pour Le Developpement) Procede d'identification d'objets, especes ou individus divers et applications de ce procede
AU702189B2 (en) * 1994-10-17 1999-02-18 C.I.R.A.D. (Centre De Cooperation Internationale En Recherche Agronomique Pour Le Developpement Method for identifying miscellaneous objects, species or items, and uses thereof
DE19845883B4 (de) * 1997-10-15 2007-06-06 LemnaTec GmbH Labor für elektronische und maschinelle Naturanalytik Verfahren zur Bestimmung der Phytotoxizität einer Testsubstanz

Also Published As

Publication number Publication date
AU8304891A (en) 1992-02-18
GB2247948A (en) 1992-03-18
GB9016443D0 (en) 1990-09-12
GB9116159D0 (en) 1991-09-11
EP0540627A1 (fr) 1993-05-12

Similar Documents

Publication Publication Date Title
Yamamoto et al. Development of a stationary robotic strawberry harvester with a picking mechanism that approaches the target fruit from below
Sarig Robotics of fruit harvesting: A state-of-the-art review
JP2008206438A (ja) 果実収穫ロボット
CN101493313B (zh) 成熟果实识别和定位的图像处理方法
Huang et al. An automatic machine vision-guided grasping system for Phalaenopsis tissue culture plantlets
CN112990103B (zh) 一种基于机器视觉的串采二次定位方法
CA3111952A1 (fr) Systeme visionique de cueillette de champignons et methode de cueillette de champignons au moyen du systeme
CN109146866A (zh) 机器人对焊缝处理的方法及装置
JP3277529B2 (ja) 果実収穫ロボット
CN114175927A (zh) 一种圣女果采摘方法及圣女果采摘机械手
WO1992001994A1 (fr) Procedes et dispositif concernant la micropropagation
CN110741790B (zh) 一种基于深度相机的穴盘苗多爪移栽-分选的处理方法
Feng et al. Fruit Location And Stem Detection Method For Strawbery Harvesting Robot
JP7174217B2 (ja) 果菜収穫装置
Lefebvre et al. Computer vision and agricultural robotics for disease control: the Potato operation
Alam et al. Automated harvesting of green chile peppers with a deep learning-based vision-enabled robotic arm
Tarrío et al. A harvesting robot for small fruit in bunches based on 3-D stereoscopic vision
JP2003052251A (ja) 苗選別装置
EP3641569A1 (fr) Procédé et appareil de reconnaissance d'orientation de fruit
Hayashi et al. Gentle handling of strawberries using a suction device
McFarlane Image-guidance for robotic harvesting of micropropagated plants
McFarlane A computer-vision algorithm for automatic guidance of microplant harvesting
GB2245810A (en) Methods and apparatus relating to micropropagation.
Schaufler et al. Micropropagated sugarcane shoot identification using machine vision
Zhao et al. Locating the Fruit to Be Harvested and Estimating Cut Positions from RGBD Images Acquired by a Camera Moved along Fixed Paths Using a Mask-R-CNN Based Method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU CA JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IT LU NL SE

WWE Wipo information: entry into national phase

Ref document number: 1991913926

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1991913926

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1991913926

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: CA