
WO2024207052A1 - Waste sorting method and apparatus - Google Patents


Info

Publication number
WO2024207052A1
Authority
WO
WIPO (PCT)
Prior art keywords
waste
waste item
sensors
sensor data
item
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/AU2024/050296
Other languages
French (fr)
Inventor
Ren Ping LIU
Xu Wang
Wei Ni
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Commonwealth Scientific and Industrial Research Organization CSIRO
Original Assignee
Commonwealth Scientific and Industrial Research Organization CSIRO
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2023901001A external-priority patent/AU2023901001A0/en
Application filed by Commonwealth Scientific and Industrial Research Organization CSIRO filed Critical Commonwealth Scientific and Industrial Research Organization CSIRO
Publication of WO2024207052A1 publication Critical patent/WO2024207052A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/30 Administration of product recycling or disposal
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F1/00 Refuse receptacles; Accessories therefor
    • B65F1/0033 Refuse receptacles; Accessories therefor specially adapted for segregated refuse collecting, e.g. receptacles with several compartments; Combination of receptacles
    • B65F1/0053 Combination of several receptacles
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/251 Fusion techniques of input or preprocessed data
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B07 SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C2501/00 Sorting according to a characteristic or feature of the articles or material to be sorted
    • B07C2501/0045 Return vending of articles, e.g. bottles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B07 SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C5/00 Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C5/34 Sorting according to other particular properties
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B07 SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C5/00 Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C5/34 Sorting according to other particular properties
    • B07C5/342 Sorting according to other particular properties according to optical properties, e.g. colour
    • B07C5/3422 Sorting according to other particular properties according to optical properties, e.g. colour using video scanning devices, e.g. TV-cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B07 SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C5/00 Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C5/36 Sorting apparatus characterised by the means used for distribution
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F1/00 Refuse receptacles; Accessories therefor
    • B65F1/0033 Refuse receptacles; Accessories therefor specially adapted for segregated refuse collecting, e.g. receptacles with several compartments; Combination of receptacles
    • B65F2001/008 Means for automatically selecting the receptacle in which refuse should be placed
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F2210/00 Equipment of refuse receptacles
    • B65F2210/168 Sensing means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F2210/00 Equipment of refuse receptacles
    • B65F2210/176 Sorting means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F2210/00 Equipment of refuse receptacles
    • B65F2210/184 Weighing means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F2240/00 Types of refuse collected
    • B65F2240/112 Bottles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65F GATHERING OR REMOVAL OF DOMESTIC OR LIKE REFUSE
    • B65F2240/00 Types of refuse collected
    • B65F2240/12 Cans
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N2021/1765 Method using an image detector and processing of image signal
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/55 Specular reflectivity
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07F COIN-FREED OR LIKE APPARATUS
    • G07F7/00 Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus
    • G07F7/06 Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus by returnable containers, i.e. reverse vending systems in which a user is rewarded for returning a container that serves as a token of value, e.g. bottles

Definitions

  • the present invention relates to a waste sorting method and apparatus and in one particular example, to a method and apparatus for sorting waste items for recycling by distinguishing between different types of bottles and cans.
  • Bin-e (https://bine.world) uses an AI-based recognition system to classify recycling wastes into glass, plastic, paper and metal.
  • the smart waste bin uses a single camera in the bin to capture waste images for a recognition algorithm and achieves up to 92% accuracy.
  • the Bin-e smart waste bins can recognise plastic, they cannot differentiate different types of plastics, such as PET (polyethylene terephthalate) and HDPE (high-density polyethylene) plastic bottles, each of which follows an individual recycling process, thereby limiting its value.
  • the present invention seeks to provide a method for sorting waste items, the method including: receiving a waste item; acquiring sensor data from a plurality of sensors, each sensor being used to measure at least one waste item characteristic; in one or more processing devices, determining a waste item category using the sensor data and at least one computational model, the at least one computational model being at least partially indicative of different waste item categories and being obtained by applying machine learning to sensor data acquired from a plurality of sensors used to measure at least one reference waste item characteristic from reference waste items in different known waste item categories; and, sorting the waste item based on the determined waste item category.
  • the present invention seeks to provide a system for sorting waste items, the system including: a plurality of sensors, each sensor being used to measure at least one waste item characteristic of a received waste item; and, one or more processing devices configured to: determine a waste item category using the sensor data and at least one computational model, the at least one computational model being at least partially indicative of different waste item categories and being obtained by applying machine learning to sensor data acquired from a plurality of sensors used to measure at least one reference waste item characteristic from reference waste items in different known waste item categories; and, sort the waste item based on the determined waste item category.
  • the at least one computational model includes a neural network.
  • the method includes, in the one or more processing devices, applying at least some of the sensor data to the computational model.
  • the method includes, in the one or more processing devices: determining one or more metrics from the sensor data; and applying the metrics to the at least one computational model.
  • the method includes, in the one or more processing devices, determining a waste item category using a single computational model and sensor data from each of the plurality of sensors.
  • the method includes, in the one or more processing devices: determining a number of categorisation indications, each categorisation indication being determined using sensor data from one or more of the plurality of sensors; and, determining the waste item categorisation using the number of categorisation indications.
  • the waste item categorisation indications are probabilities and wherein the method includes, in the one or more processing devices, at least one of: determining a waste item category based on the categorisation indication having a highest probability; and determining a waste item category for a categorisation indication exceeding a threshold.
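To make the two decision rules concrete, the following is a minimal sketch in Python; the category labels and the 0.8 threshold are illustrative assumptions, not values from the specification.

```python
# Minimal sketch of the two decision rules described above.
# Category labels and the 0.8 threshold are illustrative assumptions.

def categorise_by_max(probabilities: dict[str, float]) -> str:
    """Return the category whose categorisation indication is highest."""
    return max(probabilities, key=probabilities.get)

def categorise_by_threshold(probabilities: dict[str, float],
                            threshold: float = 0.8) -> str | None:
    """Return a category whose indication exceeds the threshold, if any."""
    for category, p in probabilities.items():
        if p > threshold:
            return category
    return None  # no confident categorisation; assess further sensors

# Example: indications derived from sensor data for one waste item.
indications = {"PET": 0.91, "HDPE": 0.05, "glass": 0.03, "metal": 0.01}
print(categorise_by_max(indications))        # PET
print(categorise_by_threshold(indications))  # PET
```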
  • the method includes: assessing sensor data from a first sensor to determine if the waste item can be categorised; and, depending on the result of the determination at least one of: determining the waste item category; and assessing sensor data from at least one further sensor to determine the waste item category.
  • the method includes assessing sensor data from a number of sensors in sequence to determine the waste item category.
  • the plurality of sensors include: a visible-light camera; and, at least one of: a metal sensor; and, a weight sensor.
  • the waste item characteristic is indicative of at least one of: a quantity of metal in the waste item; a weight of the waste item; optical properties of the waste item; a reflectivity of the waste item; and, dimensions of the waste item.
  • the metrics include: text displayed on the waste item; dimensions of the waste item; and, a waste item reflectivity.
  • the method includes, in one or more processing devices: acquiring sensor data from each of a visible-light camera, a metal sensor and a weight sensor; and, applying the sensor data to a neural network indicative of different waste item categories to determine a waste item category, the neural network being trained on sensor data acquired from sensors used to measure reference waste items in different known waste item categories.
  • the plurality of sensors includes one or more ultrasonic sensors configured to detect the presence of waste items to at least one of: trigger classification of waste items; and, detect a waste bin fill level.
  • the plurality of sensors include at least one of: accelerometers; Radio Frequency (RF) sensors; infrared cameras; and, ultraviolet-light cameras.
  • the waste item characteristics include: RF signatures of materials present in the waste item; data read from RF tags associated with the waste item; and, optical properties of the waste item.
  • the method includes: receiving the waste item in a receptacle; determining the waste item category of the waste item; and, moving the receptacle to transfer the waste item to one of a plurality of destinations in accordance with the waste item category.
  • the method includes transferring the waste item to one of a plurality of waste bins depending on the waste item category.
  • the method includes, in the one or more processing devices, controlling one or more actuators to move the receptacle.
  • the receptacle is a chute.
  • the method includes, in the one or more processing devices: controlling a first actuator to rotate the chute and thereby align the chute with a selected one of a plurality of waste bins; and, controlling a second actuator to release the waste item from the chute and thereby transfer the waste item into the selected waste bin.
  • the method includes, in the one or more processing devices, controlling the first and second actuators to align the chute with an opening to allow a waste item to be placed therein.
  • the method includes, in the one or more processing devices, detecting placement of a waste item on the chute using an ultrasonic sensor.
  • Figure 1A is a schematic side view of an example of an apparatus for sorting waste items
  • Figure 1B is a schematic plan view of the apparatus of Figure 1A;
  • Figure 1C is a schematic diagram of an example of a processing device for performing classification and/or controlling the apparatus of Figure 1A;
  • Figure 2 is a flow chart of an example of a process for sorting waste items
  • Figure 3 is a flow chart of a first specific example of a process for sorting waste items
  • Figure 4 is a flow chart of a second specific example of a process for sorting waste items
  • Figure 5 is a flow chart of a third specific example of a process for sorting waste items
  • Figure 6A is an image of a specific example of an apparatus for sorting waste items
  • Figure 6B is an image of the chute of the apparatus of Figure 6A;
  • Figure 7 is a flow chart of an example of operation of the apparatus of Figure 6A;
  • Figures 8A and 8B are a flow chart of an example of a control process for the apparatus of Figure 6A; and,
  • Figures 9A to 9P are example images of waste items captured using the apparatus of Figure 6A.
  • the apparatus 100 includes a receptacle 111, such as a chute or tray, which is configured to receive a waste item.
  • the receptacle 111 is supported by a post 112, although it will be appreciated that this is not essential and other suitable arrangements could be used.
  • One or more actuators are provided for moving the receptacle 111 to allow waste items to be selectively dispensed therefrom.
  • the apparatus 100 includes two actuators 121, 122 allowing the receptacle 111 to be rotated and tilted, as shown by the arrows 123, 124, respectively.
  • a plurality of sensors 131, 132, 133 are provided, which are configured to measure characteristics of a waste item positioned in the receptacle 111.
  • the nature of the sensors will vary depending on the preferred implementation, although these typically include at least a visible light camera 131, and more typically a visible light camera in combination with one or both of a metal detector 132 and a weight sensor 133.
  • the apparatus 100 also includes one or more processing devices 141, which typically form part of one or more processing systems 140.
  • the processing device could be of any suitable form and could include a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
  • the waste sorting process can be performed using multiple processing devices, with processing being distributed between one or more of the devices as needed, for example using one or more processing devices to perform classification and one or more other processing devices to control the physical apparatus. Nevertheless, for the purpose of ease of illustration, the following examples will refer to a single processing device, but it will be appreciated that reference to a singular processing device should be understood to encompass multiple processing devices and vice versa, with processing being distributed between the devices as appropriate.
  • a processing system 140 can include a processing device 141, a memory 142, an optional input/output device 143, such as a keyboard and/or display, and an external interface 144, interconnected via a bus 145 as shown.
  • the external interface 144 can be utilised for connecting the processing system 140 to the sensors 131, 132, 133 and the actuators 121, 122, and optionally to other peripheral devices, such as the communications networks, databases, storage devices, or the like.
  • a single external interface 144 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless or the like) may be provided.
  • the processing device 141 executes instructions in the form of applications software stored in the memory 142 to allow the required processes to be performed.
  • the applications software may include one or more software modules, and may be executed in a suitable execution environment, such as an operating system environment, or the like.
  • the processing system 140 may be formed from any suitable processing system, such as a suitably programmed computing device, or the like.
  • the processing device is configured to determine a waste item category using sensor data from one or more of the sensors 131, 132, 133 and control the one or more actuators 121, 122 to move the receptacle 111 and thereby transfer the waste item to one of a plurality of destinations, such as waste bins 101, in accordance with the waste item category.
  • a waste item is received, for example, by having the waste item placed in the receptacle 111.
  • Sensor data can then be collected by the sensors 131, 132, 133, with this being acquired by the processing device 141 at step 210, allowing the processing device 141 to determine a waste item category at step 220.
  • the processing device 141 can sort the waste item at step 230.
  • this process includes controlling the actuator 121 to rotate the receptacle 111 as shown by the arrow 123 to align the receptacle with a waste bin 101 selected based on the determined category.
  • the actuator 122 is then controlled to thereby tilt the receptacle 111 as shown by the arrow 124 so the waste item is dispensed from the receptacle into the selected waste bin 101.
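The rotate-tilt-return sequence can be sketched as follows; the Servo class is a placeholder for whatever actuator driver the hardware actually uses, and the angles follow the four-bin example described later (45 or 135 degrees either way, with a 50 degree tilt), so both are assumptions for illustration.

```python
# Sketch of the sort sequence of step 230: rotate the chute to the
# selected bin, tilt to dispense, then return to the opening.

import time

class Servo:
    """Placeholder actuator interface, not an API from the patent."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.angle = 0

    def move_by(self, degrees: int) -> None:
        self.angle += degrees
        print(f"{self.name} servo -> {self.angle} degrees")

BIN_ANGLES = {"metal": 45, "glass": 135, "HDPE": -45, "PET": -135}
TILT_DEGREES = 50

rotation = Servo("rotation")
tilt = Servo("tilt")

def sort_item(category: str) -> None:
    angle = BIN_ANGLES[category]
    rotation.move_by(angle)       # align the chute with the selected bin
    tilt.move_by(-TILT_DEGREES)   # tilt down so the item slides out
    time.sleep(1.0)               # allow the item to leave the chute
    tilt.move_by(TILT_DEGREES)    # tilt back up
    rotation.move_by(-angle)      # return to the opening for the next item

sort_item("PET")
```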
  • the above described arrangement provides a compact apparatus that is capable of using data from multiple sensors to sense characteristics of waste items and use these in order to categorise and subsequently sort waste items.
  • This can be implemented using basic off-the-shelf components, avoiding the need for expensive equipment, such as multi-spectral cameras.
  • the use of the receptacle and actuator arrangement ensures sensing can be performed at a single location, with the waste items being easily transported to a destination, such as a waste bin, avoiding the need for complex item transporting mechanisms, such as conveyor belts, or similar.
  • the receptacle 111 is a chute, and in particular a curved chute having a closed end.
  • This allows waste items, and particularly generally cylindrical waste items, such as bottles or cans, to be placed in the chute, and rest against the closed end of the chute, preventing the waste item accidentally falling from the chute, and hence entering the wrong waste bin.
  • this can orientate waste items such as bottles and cans within the chute, to ensure these are presented consistently to the sensors, which in turn helps more accurately interpret resulting sensor data.
  • this arrangement allows for easy sorting and dispensing of waste items, allowing the first actuator 121 to be used to rotate the chute 111 to align the open end of the chute with a bin 101, with the second actuator 122 being used to tip the chute 111, allowing the waste item to slide into the waste bin 101.
  • the apparatus can include a frame, which is configured to support the sorting apparatus relative to the waste bins.
  • the actuator 121 can be a rotation actuator attached to the frame, allowing the chute 111 to be aligned with the bins.
  • the post 112 extends upwardly from the rotation actuator 121, with a bracket (not shown) being rotatably mounted to an upper end of the post 112, and with the chute 111 being supported by the bracket so that the actuator 122 acts as a tilting actuator to allow the chute to be tilted, thereby dispensing the waste item.
  • the apparatus can further include a cover (not shown) having an opening allowing a waste item to be inserted therein and positioned on the chute. This can be used to protect the apparatus, and prevent ingress of objects that could otherwise interfere with the sorting process.
  • the processing devices can be configured to control the one or more actuators 121, 122 to align the chute with the opening to allow a waste item to be placed therein.
  • the sensors include a visible-light camera 131, and a metal sensor 132 and/or a weight sensor 133, although in other examples, additional and/or alternative sensors may be used.
  • the visible-light camera 131 and an optional illumination source are positioned above the receptacle 111, so that the camera faces downwards to capture images of the waste item.
  • This has a number of benefits, including ensuring an uninterrupted view of the waste items, whilst allowing the waste item to rest within the receptacle 111.
  • the receptacle 111 can be coloured, for example using a black colouring, to ensure more consistent imaging, which can be assisted through the use of an illumination source to thereby counteract changes in ambient radiation. This can help ensure consistent measurement of optical properties of the waste items, and hence improve categorisation.
  • the metal sensor 132 is attached to the receptacle, thereby ensuring the metal content of the waste item is accurately measured, whilst the weight sensor 133 can support the receptacle, for example by positioning this between the rotation actuator and post, so that an accurate weight of the waste item can be captured.
  • waste item characteristics to be measured that are indicative of a quantity of metal in the waste item, a weight of the waste item, optical properties of the waste item, such as a reflectivity of the waste item, dimensions of the waste item, text or other visual information provided on the waste item, coded data, such as bar codes, QR codes, or the like.
  • sensors could also be employed, including for example, one or more ultrasonic sensors, which can be configured to detect a presence of a waste item in the receptacle, for example to trigger the sorting process, and/or a fill level of a waste bin, for example to trigger emptying of the apparatus.
  • Other sensors for sensing waste item characteristics could include accelerometers, Radio Frequency (RF) sensors, infrared cameras and/or ultraviolet-light cameras, which can be used to measure characteristics such as the presence of RFID tags, waste material reflectivity, or the like.
  • waste items could be provided on a conveyor belt and transported past sensors, to allow categorization to be performed, with processing of sensor data again being performed by one or more processing devices as appropriate.
  • the classification of the waste items involves using a computational model, and specifically, using the sensor data and at least one computational model that is at least partially indicative of different waste item categories.
  • the computational model is typically obtained by applying machine learning to sensor data acquired from a plurality of sensors used to measure at least one reference waste item characteristic from reference waste items in different known waste item categories.
  • this process uses a computational model and sensor data from multiple sensors, typically an imaging camera and one or more of a weight sensor and metal sensor, which increases the accuracy compared to using a single sensor only.
  • using multiple sensors and a computational model allows a high degree of accuracy to be achieved when categorizing waste items, allowing the apparatus to accurately sort waste items.
  • the nature of the computational model will vary depending on the preferred implementation, but could include, for example, a neural network such as a YOLO algorithm, which employs convolutional neural networks (CNNs) to detect objects in real-time.
  • the nature of the model can be of any appropriate form and could include any one or more of decision tree learning, random forest, logistic regression, association rule learning, artificial neural networks, deep learning, inductive logic programming, support vector machines, clustering, Bayesian networks, reinforcement learning, representation learning, similarity and metric learning, genetic algorithms, rule-based machine learning, learning classifier systems, or the like. As such schemes are known, these will not be described in any further detail.
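As a concrete illustration of the neural network option, the following sketch applies a YOLO-style detector to a camera frame using the ultralytics package; the package choice, the file names and the custom weights file are assumptions, since the specification does not name a particular implementation.

```python
# Sketch of applying a YOLO-style detector to a frame from the chute
# camera. "waste_yolo.pt" is a hypothetical weights file trained on
# labelled waste images, as described later in the specification.

import cv2
from ultralytics import YOLO

model = YOLO("waste_yolo.pt")          # custom-trained detection model
frame = cv2.imread("chute.jpg")        # single frame from the chute camera

results = model(frame)[0]              # run detection on the frame
for box in results.boxes:
    category = results.names[int(box.cls)]    # e.g. "PET", "HDPE"
    confidence = float(box.conf)
    x1, y1, x2, y2 = map(float, box.xyxy[0])  # bounding box in pixels
    print(category, round(confidence, 2), (x1, y1, x2, y2))
```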
  • the processing device applies at least some of the sensor data to the computational model.
  • An example of this, for the specific embodiment of neural networks will now be described with reference to Figure 3.
  • a waste item is received, with sensor data being collected by the sensors, with this being acquired by the processing device at step 310.
  • the processing device applies the sensor data to the neural network computational model, with this providing an indication of a categorisation at step 330.
  • the method may include determining one or more metrics from the sensor data and then applying the metrics to the at least one computational model.
  • the image data could be analysed, for example by performing edge detection to determine dimensions of the waste item, with the dimensions then being applied to the computational model.
  • image data could be analysed to determine other metrics, such as the reflectivity of the waste item material, the presence and content of any text or images on the waste item, the presence or content of coded data, such as barcodes, QR codes, or the like.
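A minimal sketch of deriving such metrics with OpenCV follows; the pixels-per-millimetre calibration constant and the file name are assumptions, and mean brightness is used here only as a crude stand-in for reflectivity.

```python
# Sketch of deriving metrics from a chute image: item dimensions via
# edge detection, plus mean brightness as a rough reflectivity proxy.

import cv2
import numpy as np

PIXELS_PER_MM = 2.0                      # assumed camera calibration

image = cv2.imread("chute.jpg")          # frame from the downward camera
grey = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(grey, 50, 150)         # edge map of the scene

contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
if contours:
    item = max(contours, key=cv2.contourArea)   # largest object found
    x, y, w, h = cv2.boundingRect(item)
    length_mm = max(w, h) / PIXELS_PER_MM
    diameter_mm = min(w, h) / PIXELS_PER_MM
    reflectivity = float(np.mean(grey[y:y + h, x:x + w])) / 255.0
    print(length_mm, diameter_mm, reflectivity)
```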
  • a single computational model is used, and sensor data from each of the plurality of sensors is applied to the one model.
  • a respective computational model could be used for each sensor, so that sensor data and/or metrics derived from the sensor data associated with one sensor can be applied to a respective computational model.
  • image data could be applied to an image computational model, weight sensor data applied to a weight computational model, and so on.
  • each model can then be used to provide an indication of a possible categorisation, with these being combined in some manner to determine a waste item category.
  • outputs from the sensors could be assessed sequentially, which can streamline the analysis in some instances.
  • the processing device could assess sensor data from a first sensor to determine if the waste item can be categorised. If so, the waste item is categorised; if not, sensor data from at least one further sensor is assessed, with this process continuing until a category can be determined.
  • An example of this is shown in more detail in Figure 4.
  • a waste item is received, with sensor data being collected by the sensors and acquired by the processing device at step 410.
  • the processing device selects a next sensor, and attempts to perform categorisation at step 430. This can involve the use of a computational model, although this may not be required, depending on the nature of the sensor and the waste items. For example, when sorting bottles and cans, typically only cans are metallic, and so if more than a certain amount of metal is detected, this can be used to determine the waste item is a can.
  • if a categorisation cannot be determined, the process returns to step 420 to select a next sensor, with this process being repeated until a categorisation has been determined.
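A sketch of this sequential assessment is shown below; the metal and weight cut-offs, and the order of the checks, are illustrative assumptions.

```python
# Sketch of the sequential assessment of Figure 4: each sensor is
# tried in turn until one yields a confident categorisation.

from typing import Callable, Optional

def metal_check(reading: dict) -> Optional[str]:
    # When sorting bottles and cans, only cans are metallic.
    return "metal" if reading["metal_signal"] > 0.5 else None

def weight_check(reading: dict) -> Optional[str]:
    # Glass bottles are markedly heavier than plastic ones (assumed cut-off).
    return "glass" if reading["weight_g"] > 300 else None

def vision_check(reading: dict) -> Optional[str]:
    return reading.get("vision_label")   # e.g. output of a detection model

SENSOR_CHECKS: list[Callable[[dict], Optional[str]]] = [
    metal_check, weight_check, vision_check,
]

def categorise(reading: dict) -> Optional[str]:
    for check in SENSOR_CHECKS:          # steps 420-430, repeated
        category = check(reading)
        if category is not None:
            return category
    return None                          # could not be categorised

print(categorise({"metal_signal": 0.1, "weight_g": 40,
                  "vision_label": "PET"}))   # -> PET
```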
  • sensor data from each sensor could be considered individually, or, once data from multiple sensors has been analysed, the data could be considered collectively.
  • the processing device can determine a number of categorisation indications, such as categorisation probabilities, with each categorisation indication being determined using sensor data from one or more of the plurality of sensors. The categorisation is then performed using the categorisation indications. An example of this will now be described with reference to Figure 5.
  • a waste item is received, with sensor data being collected by the sensors and acquired by the processing device at step 510.
  • the processing device analyses sensor data, and then determines a number of category probabilities at step 530. This can be performed for individual sensors, for example, determining separate probabilities using metal and weight sensors. Additionally, and/or alternatively, this could be performed using combinations of sensors, for example determining one probability using a combination of image data and metal sensor data and another probability using the combination of image data and weight sensor data.
  • the probabilities can be used to determine a waste item category at step 540. This can be achieved in different manners depending on the preferred implementation, for example by determining a waste item category based on the categorisation indication having a highest probability or by selecting a categorisation indication exceeding a threshold.
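For example, indications from two sensor combinations might be fused by averaging their probability vectors before applying either decision rule; the numbers and the 0.6 threshold below are illustrative only.

```python
# Sketch of combining categorisation indications from two sensor
# combinations (image+metal and image+weight) by averaging them.

import numpy as np

CATEGORIES = ["metal", "glass", "HDPE", "PET"]

# Probability vectors produced from two sensor combinations.
p_image_metal = np.array([0.02, 0.08, 0.20, 0.70])
p_image_weight = np.array([0.01, 0.04, 0.15, 0.80])

fused = (p_image_metal + p_image_weight) / 2

best = int(np.argmax(fused))
if fused[best] > 0.6:                      # assumed confidence threshold
    print("category:", CATEGORIES[best])   # -> PET
else:
    print("no confident categorisation; assess further sensors")
```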
  • the plurality of sensors can include a visible-light camera and one or more of a metal sensor or weight sensor, which can measure waste item characteristics such as a quantity of metal in the waste item, a weight of the waste item, optical properties of the waste item, a reflectivity of the waste item, dimensions of the waste item, or the like.
  • the system can also extract metrics from the sensor data, such as text displayed on the waste item, dimensions of the waste item, a waste item reflectivity, or the like, applying the metrics to the computational model.
  • the approach includes acquiring sensor data from each of a visible-light camera, a metal sensor and a weight sensor and applying the sensor data to a neural network indicative of different waste item categories to determine a waste item category, the neural network being trained on sensor data acquired from sensors used to measure reference waste items in different known waste item categories.
  • sensors can also be used including ultrasonic sensors configured to detect the presence of waste items to trigger classification of waste items and/or detect a waste bin fill level.
  • Other sensor examples include accelerometers (e.g., Bosch Sensortec BMA150 sensors), Radio Frequency (RF) sensors, infrared cameras, and ultraviolet-light cameras.
  • the apparatus 600 includes a chute 611 attached via a rotatable bracket 613 to a post 612.
  • the post is attached to a rotary actuator 621 via a weight scale 633, whilst a tilting actuator 622 is attached to the bracket 613, allowing the chute 611 to be rotated and tilted.
  • the rotary actuator 621 is mounted on a frame 602 to support the arrangement above four waste bins 601.
  • a pyramidal cover 603 is also removably attached to the frame 602, with the cover 603 including an opening 604.
  • the chute 611 can be rotated to align with the opening 604 so that waste items inserted into the opening 604 are received on the chute 611. Following categorisation of the waste item, the chute 611 can be rotated to align with one of the bins 601, before being tilted so that waste items can be deposited therein.
  • the apparatus 600 includes a camera system 631 positioned above the chute, allowing the camera to image waste items positioned on the chute 611, and example captured images are shown in Figures 9A to 9P.
  • a light 631.1 is positioned proximate the camera 631, which in this example is in the form of a ring light surrounding the camera 631, which helps ensure even illumination of the waste item. This minimises the impact of variable ambient light, and thus ensures the waste sample is consistently illuminated, irrespective of external illumination sources. It will be appreciated that this can be further assisted by the use of an opaque cover.
  • one or more metal sensors 632 are mounted on an underside of the chute 611, allowing metal content within waste items positioned on the chute to be detected.
  • the apparatus 600 also includes an ultrasonic sensor 634 mounted on the chute, allowing the presence of waste items on the chute to be detected, whilst ultrasonic sensors 635 are mounted on the frame, allowing the fill volume of each bin 601 to be detected.
  • the processing system 640 is mounted above the camera 631 and light 631.1, with the apparatus 600 including one or more display screens 643, allowing information regarding the sorting process to be displayed.
  • the vision part of the apparatus includes the ring light 631.1, the camera 631, and an AI processing system 640 (Jetson Nano).
  • the ring light 631.1 provides stable illumination of the chute 611, whilst the camera 631 is installed directly above the chute 611, with the distance between the camera 631 and the chute 611 being adjusted according to the focal length of the camera 631 such that the chute 611 substantially fills the video image frame.
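The required mounting height follows from the pinhole-camera relation; in the sketch below, L is the chute length and θ the camera's field-of-view angle (our symbols, not the patent's):

```latex
h \approx \frac{L}{2\tan(\theta/2)}
```

For example, a 0.40 m chute and a 62 degree horizontal field of view give h of roughly 0.33 m; both values are illustrative assumptions.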
  • the camera 631 is connected to the AI processing system 640 and provides video streams of the recycle chamber to the AI processing system 640.
  • the AI processing system 640 runs a pre-trained classification algorithm to detect objects in the video streams from the camera and to classify the object as PET or HDPE.
  • the AI processing system 640 then combines the object detection result and other sensing signals to make a final classification result.
  • the AI processing system 640 sends the result to a separate microcontroller (not shown) for sorting control.
  • the AI processing system 640 can also output the video stream to one of the display screens 643.
  • the processor system 640 uses an NVIDIA Jetson device to perform AI-based plastic classification, while the camera module uses a Raspberry Pi camera to capture video of waste items.
  • the light module uses a ring light to provide additional illumination, and the apparatus uses an electronic control device to control the sensors and servos.
  • An object and fullness detector module uses the ultrasonic sensors 634, 635 to detect waste items and measure the fullness of the bins, while a scale module uses the weight sensor 633 to measure the weight of waste items.
  • a metal detector module uses the metal sensor 632 to detect metal waste items, and a movement module uses servos 621, 622 to move the chute 611.
  • a rotation servo 621 is used to turn the chute 611 to the corresponding bin 601.
  • the waste bins 601 are evenly located on the four corners of the apparatus 600 such that the rotation angles to the bins are fixed and pre-calculated.
  • the apparatus 600 can control the rotation servo to rotate to a specific angle according to the precalculated bin-angle mapping.
  • the automatic smart bin controls the rotation servo to rotate the same angle in the opposite direction such that the chute 611 can point to the opening 604 for the next recycling process.
  • the tilting servo 622 is used to tip the recycle item into the designated bin.
  • the tilting servo 622 is by default at a slightly uplifted position so that the chute can accept a waste item deposited through the opening 604.
  • the uplifting angle assures that waste bottles can stay at the bottom of the chute 611 such that the bottles are within the detection range of the metal detector 632 and at the centre of the camera field of view.
  • the tilting servo 622 tilts down the chute 611 by 50 degrees, so that the waste item in the chute 611 can slide into the bin 601.
  • the tilting servo 622 tilts up the chute 611 by 50 degrees in the opposite direction for the next recycling process.
  • the bins are evenly distributed around the chute, with a top view similar to that shown in Figure IB.
  • the chamber can rotate towards the bin direction by the corresponding number of degrees and then tilt down to dump the bottle.
  • bins 1, 2, 3 and 4 are for metal, glass, HDPE and PET bottles, respectively.
  • the chamber can rotate 45 and 135 degrees clockwise to face the direction of bin 1 and bin 2, respectively, and then tilt down 50 degrees to deposit the bottle in the chamber.
  • the chamber can rotate 45 and 135 degrees counter-clockwise to dump bottles into bin 3 and bin 4, respectively.
  • the rotation-tilt-based sorting mechanism is more compact and suitable for confined spaces.
  • the rotation-tilt-based sorting can also keep deposited waste bottles at a fixed point, making it easy to design the locations of the sensors.
  • the rotation-tilt-based in-situ sorting mechanism can also be extended in a few ways. For example, more bins can be added to accept more types of wastes, with the rotation angles being updated accordingly. Additional sets of bins can be provided, with the chute 611 being moved horizontally to a different set of bins. A chute or splash guard can be added to assure bottles are always dumped into the bin.
  • the automatic smart bin continuously monitors the chute 611 with the ultrasonic sensor 634 to detect when a waste item has been received at step 700. Once the processing system 640 detects a waste item being deposited onto the chute 611, the processing system 640 will start the sensing, classification and sorting process.
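The detection step can be sketched as a simple distance-threshold loop; the read_distance_cm driver and both thresholds are assumptions, with canned readings used here so the sketch runs as written.

```python
# Sketch of the presence-detection loop (step 700): the ultrasonic
# sensor reports the distance to the chute floor, and a sustained drop
# in that distance indicates an item has been deposited.

import time

EMPTY_DISTANCE_CM = 25.0     # distance to the empty chute floor (assumed)
ITEM_MARGIN_CM = 3.0         # drop that counts as an item being present

readings = iter([25.1, 24.9, 18.2])    # canned readings so the sketch runs

def read_distance_cm() -> float:
    """Stand-in for the real ultrasonic sensor driver."""
    return next(readings)

def wait_for_item(poll_s: float = 0.1) -> None:
    """Block until a waste item is detected on the chute."""
    while True:
        if read_distance_cm() < EMPTY_DISTANCE_CM - ITEM_MARGIN_CM:
            return               # item detected; start sensing and sorting
        time.sleep(poll_s)

wait_for_item()
print("item detected on chute")
```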
  • the apparatus senses the waste with the metal sensor(s) 632 installed on the chute 611, and the weight sensor 633 under the chute 611 at step 730, as well as using the camera 631 on top of the chamber to image the waste item, capturing multimedia sensor data at step 710.
  • Sensor data is sent to the AI classification module implemented by the processing system 640, which analyses the video stream to perform object detection at step 720, and then combines this with other sensor data to classify the waste.
  • the AI classification module analyses the sensor data from the sensing module(s), including the multimedia data and non-multimedia data.
  • the vision-based object detection model can identify PET bottles in the video stream, and output the size of the bottles and the confidence of the detection result.
  • the multimedia-based object detection module can use off-the-shelf vision-based object detection models. This allows the apparatus to accurately recognise waste types, including the fine recognition of plastic types, with multiple sensing data.
  • the apparatus can also employ computer vision technologies, including object detection, image classification, Optical Character Recognition (OCR), and bar/QR code scanners, to recognise plastic types.
  • the detection module can employ the OCR technology to extract the brand and other information on the bottle label, and could also include a bar/QR code scanner to read the product code on the bottle if there is any code appearing in the image.
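A sketch of the OCR and code-scanning step follows; pytesseract and pyzbar are assumed third-party choices (each needs its native library installed) and are not named in the specification, and the file name is illustrative.

```python
# Sketch of extracting label text and product codes from the camera
# image; OCR recovers brand/product text, and ZBar decodes any
# bar/QR codes visible in the frame.

import cv2
import pytesseract
from pyzbar.pyzbar import decode

image = cv2.imread("chute.jpg")

label_text = pytesseract.image_to_string(image)          # label text, if any
codes = [c.data.decode("utf-8") for c in decode(image)]  # bar/QR codes

print("label text:", label_text.strip())
print("product codes:", codes)
```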
  • the automatic smart bin further refines recognition accuracy with multiple sensing data, including visible light signals, ultraviolet-light signals, near infrared light signals, weight signals, ultrasonic signals, and metal signals.
  • AI vision models have been used to classify images according to image content and to identify objects in the image, but have yet to be used to classify different types of plastics with similar appearances.
  • To classify different plastics, a large data set with a variety of plastics is created.
  • large photo data sets of different types of plastics are captured at different angles and labelled with the corresponding plastic types, as shown for example in Figures 9A to 9P.
  • the photos are taken using the automatic smart bin with the same background and lighting condition.
  • the dataset is then used to train the object detection model such that the object detection model can classify different plastics.
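Training such a detector on the labelled photo dataset might look like the following sketch using the ultralytics package; the dataset file, base weights and hyperparameters are illustrative assumptions.

```python
# Sketch of training the detector on the labelled photo dataset.

from ultralytics import YOLO

model = YOLO("yolov8n.pt")   # small pretrained base model (assumed)
model.train(
    data="plastics.yaml",    # dataset listing images and PET/HDPE/glass/metal labels
    epochs=100,
    imgsz=640,
)
metrics = model.val()        # evaluate on the held-out split
```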
  • the output of the multimedia-based object detection and other non-multimedia data are fed into the classification algorithm to get the final waste type detection result at step 740.
  • the data will also be sent to a waste item status algorithm at step 750 to determine a bottle status.
  • An example of classifying a PET bottle is as follows.
  • the object detection model suggests the object could be PET or glass from the video stream; the classification algorithm can calculate the density of the bottle using the size information from the object detection model and the weight from the weight sensor, and then confirm the bottle is type PET.
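The density check can be sketched as follows; the cylinder approximation and the 0.15 g/ml cut-off between empty plastic and glass bottles are illustrative assumptions.

```python
# Sketch of the density check: bounding-box size from the detector
# gives an approximate outer volume, the scale gives the weight, and
# the implied density separates PET from glass.

import math

def approx_density_check(length_mm: float, diameter_mm: float,
                         weight_g: float) -> str:
    # Treat the bottle as a cylinder to approximate its outer volume.
    volume_ml = math.pi * (diameter_mm / 2) ** 2 * length_mm / 1000.0
    grams_per_ml = weight_g / volume_ml
    # An empty PET bottle is far lighter for its size than a glass one.
    return "PET" if grams_per_ml < 0.15 else "glass"

# A 220 mm x 65 mm bottle weighing 25 g is well inside the PET regime.
print(approx_density_check(220, 65, 25))   # -> PET
```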
  • the detection results and non-multimedia data are also sent to the status description algorithm for recycle analysis.
  • the algorithm can describe the bottle as a Brand-A product-P bottle that is half full of water.
  • the brand and product information can be deduced from the multimedia-based object detection algorithm, while the fill level is deduced from the weight and the acceleration data.
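A sketch of that deduction follows; the product database entry and the weights in it are hypothetical.

```python
# Sketch of the status description step: comparing the measured weight
# against the known empty and full weights of the recognised product
# to estimate how full the bottle is.

PRODUCT_DB = {  # hypothetical lookup keyed on detected brand/product
    ("Brand-A", "product-P"): {"empty_g": 28, "full_g": 628},
}

def fill_fraction(brand: str, product: str, measured_g: float) -> float:
    entry = PRODUCT_DB[(brand, product)]
    content_g = max(0.0, measured_g - entry["empty_g"])
    capacity_g = entry["full_g"] - entry["empty_g"]
    return min(1.0, content_g / capacity_g)

# 328 g measured -> (328 - 28) / 600 = 0.5, i.e. about half full of water.
print(fill_fraction("Brand-A", "product-P", 328))   # 0.5
```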
  • the control algorithm receives the classification at step 800. If it is determined the waste is classified to be metal at step 805, the control algorithm controls the chute 611 to rotate to the metal bin direction, at step 810 and then tilts the chute 611 to dump the waste at step 840. Meanwhile, the control algorithm reads the fullness of the metal bin with the ultrasonic sensor on top of the bin at step 845 and optionally updates the bin fullness on the screen 643, or uploads an indication to a remote location, such as a monitoring system that coordinates bin emptying. Next, the control algorithm tilts up the chute 611 and rotates the chute 611 to the initial position at step 850, and resets the sensors for the next task at step 855.
  • the control algorithm determines whether the waste is glass at step 815, or HDPE or PET at step 825 (in each case rotating the chute 611), and then sorts the waste to appropriate bins at steps 820, 830 or 835, respectively, before performing steps 840 to 855 as required.
  • the focus has been on classification of glass, metal cans, PET bottles and HDPE bottles.
  • the techniques can be adapted to recognize and classify a wider range of waste items, including food containers, coffee cups, paper, cardboard, and other types of plastics.
  • the classification algorithm can be trained with a large number of samples at a variety of statuses, to help ensure that the module is able to accurately and reliably classify waste items, regardless of their condition or appearance. This can help improve the overall performance and effectiveness of the automatic smart bin technology.
  • the current version of the apparatus only contains four bins (glass, metal, PET and HDPE) centred around the chute 611. This is due to the limited size of the automatic smart bin and the fact that the rotation-based sorting mechanism can only handle this number of bins. As a result, the current version of the automatic smart bin is limited in its ability to sort waste items into the four categories. However, it will be appreciated that this can be expanded to enable sorting of a greater variety of waste items.
  • the above arrangement seeks to provide an apparatus, and in one example a smart bin, with the capabilities set out below.
  • the apparatus can use one or more sensing modules to extract features of the waste items.
  • Typical sensing module(s) include a variety of sensors, such as ultrasonic sensors, metal sensors, weight sensors, accelerometers, Radio Frequency (RF) sensors, visible-light cameras, infrared cameras, and ultraviolet-light cameras.
  • Metal sensors can sense metal objects within the detection range, and weight sensors can measure the weight of the waste.
  • RF sensors can detect the presence of certain materials through their unique RF signatures and interact with RF tags within the communication range. Visible-light cameras, infrared cameras, and ultraviolet-light cameras can capture images of waste in different spectrums.
  • the sensors are located inside of the apparatus around the waste receiving chute to eliminate or reduce interference and noise.
  • the sensing module(s) may also include other supportive components.
  • the sensing module may have a power supply to provide electricity to the sensors and other components, fans to keep the sensors cool, and displays to show the results of the sensing. It may also have shields to block electromagnetic waves that could interfere with the sensors, and lights to illuminate the waste in order to improve the accuracy of the sensing.
  • the sensing module(s) should be able to locate deposited waste regardless of how the waste is deposited into the bin, such that the sensing module(s) can control physical variables during sensing.
  • the apparatus integrates sensors, including weight sensors, ultrasonic sensors, and metal sensors, to detect and recognise glass and metal bottles that have been dropped into the bin.
  • the apparatus can employ computer vision-based object detection technologies to recognise PET (polyethylene terephthalate) and HDPE (high-density polyethylene) plastic bottles. After the classification, the smart bin controls robotic arms to physically sort waste bottles into the appropriate bins for recycling without the need for human intervention.
  • the apparatus can automatically classify and sort a variety of types of waste bottles without human intervention.
  • the apparatus provides a drop-and-go waste classification and sorting on-field solution. Individuals can simply deposit their waste bottles and other items into the apparatus, and it will automatically classify and sort the items into the appropriate containers or bins for recycling or disposal.
  • the apparatus can also process bottles without labels, while product code-based smart bins can only rely on easily peeled bottle labels. This is because product code-based smart bins use the product codes, in the form of a bar code, QR code or Near-Field Communication (NFC) label, on the labels of waste bottles to identify and classify them. Without the labels, traditional arrangements are unable to identify the bottles and cannot sort them correctly. In contrast, the proposed apparatus can process waste items without labels using advanced sensing and AI technology. The developed apparatus can identify and classify the bottles based on their material, shape, colour, size, and other characteristics, allowing it to accurately and reliably sort waste bottles, even if they do not have labels.
  • Waste management: Smart bins can be used in waste management facilities to automatically classify and sort waste bottles and other items into the appropriate containers or bins for recycling or disposal. This can help improve the efficiency and accuracy of waste sorting and recycling processes.
  • Smart bins can be installed in public spaces, such as parks, malls, and schools, to provide individuals with a convenient way to dispose of waste bottles and other items.
  • the automatic classification and sorting capabilities of the smart bin can help reduce litter and improve waste management in these areas.
  • Smart bins can be used in manufacturing facilities to automatically classify and sort wastes and other items generated during the production process. This can help improve the efficiency and sustainability of manufacturing operations.
  • the developed apparatus can classify PET and HDPE plastic bottles in an affordable way, while conventional arrangements cannot identify the different types of plastics.
  • the apparatus uses advanced AI-based object detection algorithms to identify and accurately classify waste bottles based on their visual characteristics, such as their shape, size, and colour, from the video stream of inexpensive visible-light devices.
  • Other traditional arrangements either cannot identify the types of plastics or use expensive infrared cameras to test plastic samples.
  • the apparatus can play an important role in implementing effective waste reduction and recycling strategies.
  • the automatic smart bin technology contributes to reducing waste, improving the recycling rate, reducing recycling costs, and improving sustainability in the waste management industry.
  • the classification and sorting technology in the automatic smart bin can also be used in other fields, e.g., recycling leftover material during manufacturing.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Business, Economics & Management (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Medical Informatics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Operations Research (AREA)
  • Molecular Biology (AREA)
  • Databases & Information Systems (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Strategic Management (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Sustainable Development (AREA)
  • Economics (AREA)
  • Biomedical Technology (AREA)
  • Marketing (AREA)
  • Mechanical Engineering (AREA)
  • Quality & Reliability (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Processing Of Solid Wastes (AREA)

Abstract

A method for sorting waste items, the method including: receiving a waste item; acquiring sensor data from a plurality of sensors, each sensor being used to measure at least one waste item characteristic; in one or more processing devices, determining a waste item category using the sensor data and at least one computational model, the at least one computational model being at least partially indicative of different waste item categories and being obtained by applying machine learning to sensor data acquired from a plurality of sensors used to measure at least one reference waste item characteristic from reference waste items in different known waste item categories; and, sorting the waste item based on the determined waste item category.

Description

WASTE SORTING METHOD AND APPARATUS
Background of the Invention
[0001] The present invention relates to a waste sorting method and apparatus and in one particular example, to a method and apparatus for sorting waste items for recycling by distinguishing between different types of bottles and cans.
Description of the Prior Art
[0002] The reference in this specification to any prior publication (or information derived from it), or to any matter which is known, is not, and should not be taken as an acknowledgement or admission or any form of suggestion that the prior publication (or information derived from it) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.
[0003] There have been many studies and research classifying and sorting recycling wastes onsite. Often such systems rely on detecting specific information on waste items, such as a barcode or similar, and using a database to then identify the item, with this being used for sorting. However, such systems are not very robust and fail in situations where the specific information cannot be detected, for example if this is obscured or removed.
[0004] Bin-e (https://bine.world) uses an AI-based recognition system to classify recycling wastes into glass, plastic, paper and metal. The smart waste bin uses a single camera in the bin to capture waste images for a recognition algorithm and achieves up to 92% accuracy. Although the Bin-e smart waste bins can recognise plastic, they cannot differentiate different types of plastics, such as PET (polyethylene terephthalate) and HDPE (high-density polyethylene) plastic bottles, each of which follows an individual recycling process, thereby limiting its value.
[0005] Another approach that has been used includes using specific equipment, such as hyperspectral cameras, that are more capable of distinguishing materials. However, these are expensive and cannot be widely adopted for on-site classification.
Summary of the Present Invention
[0006] In one broad form, the present invention seeks to provide a method for sorting waste items, the method including: receiving a waste item; acquiring sensor data from a plurality of sensors, each sensor being used to measure at least one waste item characteristic; in one or more processing devices, determining a waste item category using the sensor data and at least one computational model, the at least one computational model being at least partially indicative of different waste item categories and being obtained by applying machine learning to sensor data acquired from a plurality of sensors used to measure at least one reference waste item characteristic from reference waste items in different known waste item categories; and, sorting the waste item based on the determined waste item category.
[0007] In one broad form, the present invention seeks to provide a system for sorting waste items, the system including: a plurality of sensors, each sensor being used to measure at least one waste item characteristic of a received waste item; and, one or more processing devices configured to: determine a waste item category using the sensor data and at least one computational model, the at least one computational model being at least partially indicative of different waste item categories and being obtained by applying machine learning to sensor data acquired from a plurality of sensors used to measure at least one reference waste item characteristic from reference waste items in different known waste item categories; and, sort the waste item based on the determined waste item category.
[0008] In one embodiment the at least one computational model includes a neural network.
[0009] In one embodiment the method includes, in the one or more processing devices, applying at least some of the sensor data to the computational model.
[0010] In one embodiment the method includes, in the one or more processing devices: determining one or more metrics from the sensor data; and applying the metrics to the at least one computational model.
[0011] In one embodiment the method includes, in the one or more processing devices, determining a waste item category using a single computational model and sensor data from each of the plurality of sensors.
[0012] In one embodiment the method includes, in the one or more processing devices: determining a number of categorisation indications, each categorisation indication being determined using sensor data from one or more of the plurality of sensors; and, determining the waste item categorisation using the number of categorisation indications.
[0013] In one embodiment the waste item categorisation indications are probabilities and wherein the method includes, in the one or more processing devices, at least one of: determining a waste item category based on the categorisation indication having a highest probability; and determining a waste item category for a categorisation indication exceeding a threshold.
[0014] In one embodiment the method includes: assessing sensor data from a first sensor to determine if the waste item can be categorised; and, depending on the result of the determination at least one of: determining the waste item category; and assessing sensor data from at least one further sensor to determine the waste item category.
[0015] In one embodiment the method includes assessing sensor data from a number of sensors in sequence to determine the waste item category.
[0016] In one embodiment the plurality of sensors include: a visible-light camera; and, at least one of: a metal sensor; and, a weight sensor.
[0017] In one embodiment the waste item characteristic is indicative of at least one of: a quantity of metal in the waste item; a weight of the waste item; optical properties of the waste item; a reflectivity of the waste item; and, dimensions of the waste item.
[0018] In one embodiment the metrics include: text displayed on the waste item; dimensions of the waste item; and, a waste item reflectivity.
[0019] In one embodiment the method includes, in one or more processing devices: acquiring sensor data from each of a visible-light camera, a metal sensor and a weight sensor; and, applying the sensor data to a neural network indicative of different waste item categories to determine a waste item category, the neural network being trained on sensor data acquired from sensors used to measure reference waste items in different known waste item categories.
[0020] In one embodiment the plurality of sensors includes one or more ultrasonic sensors configured to detect the presence of waste items to at least one of: trigger classification of waste items; and, detect a waste bin fill level.
[0021] In one embodiment the plurality of sensors include at least one of: accelerometers; Radio Frequency (RF) sensors; infrared cameras; and, ultraviolet-light cameras.
[0022] In one embodiment the waste item characteristics include: RF signatures of materials present in the waste item; data read from RF tags associated with the waste item; and, optical properties of the waste item.
[0023] In one embodiment the method includes: receiving the waste item in a receptacle; determining the waste item category of the waste item; and, moving the receptacle to transfer the waste item to one of a plurality of destinations in accordance with the waste item category.
[0024] In one embodiment the method includes transferring the waste item to one of a plurality of waste bins depending on the waste item category.
[0025] In one embodiment the method includes, in the one or more processing devices, controlling one or more actuators to move the receptacle.
[0026] In one embodiment the receptacle is a chute, and wherein the method includes, in the one or more processing devices: controlling a first actuator to rotate the chute and thereby align the chute with a selected one of a plurality of waste bins; and, controlling a second actuator to release the waste item from the chute and thereby transfer the waste item into the selected waste bin.
[0027] In one embodiment the method includes, in the one or more processing devices, controlling the first and second actuators to align the chute with an opening to allow a waste item to be placed therein.
[0028] In one embodiment the method includes, in the one or more processing devices, detecting placement of a waste item on the chute using an ultrasonic sensor.
[0029] It will be appreciated that the broad forms of the invention and their respective features can be used in conjunction and/or independently, and reference to separate broad forms is not intended to be limiting. Furthermore, it will be appreciated that features of the method can be performed using the system or apparatus and that features of the system or apparatus can be implemented using the method.
Brief Description of the Drawings
[0030] Various examples and embodiments of the present invention will now be described with reference to the accompanying drawings, in which: -
[0031] Figure 1A is a schematic side view of an example of an apparatus for sorting waste items;
[0032] Figure 1B is a schematic plan view of the apparatus of Figure 1A;
[0033] Figure 1C is a schematic diagram of an example of a processing device for performing classification and/or controlling the apparatus of Figure 1A;
[0034] Figure 2 is a flow chart of an example of a process for sorting waste items;
[0035] Figure 3 is a flow chart of a first specific example of a process for sorting waste items;
[0036] Figure 4 is a flow chart of a second specific example of a process for sorting waste items;
[0037] Figure 5 is a flow chart of a third specific example of a process for sorting waste items;
[0038] Figure 6A is an image of a specific example of an apparatus for sorting waste items;
[0039] Figure 6B is an image of the chute of the apparatus of Figure 6A;
[0040] Figure 7 is a flow chart of an example of operation of the apparatus of Figure 6A;
[0041] Figures 8A and 8B are a flow chart of an example of a control process for the apparatus of Figure 6A; and,
[0042] Figures 9A to 9P are example images of waste items captured using the apparatus of Figure 6A.
Detailed Description of the Preferred Embodiments
[0043] An example of an apparatus for use in sorting waste items will now be described with reference to Figures 1A to 1C.
[0044] In this example, the apparatus 100 includes a receptacle 111, such as a chute or tray, which is configured to receive a waste item. In this example, the receptacle 111 is supported by a post 112, although it will be appreciated that this is not essential and other suitable arrangements could be used.
[0045] One or more actuators are provided for moving the receptacle 111 to allow waste items to be selectively dispensed therefrom. In this example, the apparatus 100 includes two actuators 121, 122 allowing the receptacle 111 to be rotated and tilted, as shown by the arrows 123, 124, respectively.
[0046] A plurality of sensors 131, 132, 133 are provided, which are configured to measure characteristics of a waste item positioned in the receptacle 111. The nature of the sensors will vary depending on the preferred implementation, although these typically include at least a visible light camera 131, and more typically a visible light camera in combination with one or both of a metal detector 132 and a weight sensor 133.
[0047] The apparatus 100 also includes one or more processing devices 141, which typically form part of one or more processing systems 140. The processing device could be of any suitable form and could include a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement. The waste sorting process can be performed using multiple processing devices, with processing being distributed between one or more of the devices as needed, for example using one or more processing devices to perform classification and one or more other processing devices to control the physical apparatus. Nevertheless, for the purpose of ease of illustration, the following examples will refer to a single processing device, but it will be appreciated that reference to a singular processing device should be understood to encompass multiple processing devices and vice versa, with processing being distributed between the devices as appropriate.
[0048] In one example, a processing system 140 can include a processing device 141, a memory 142, an optional input/output device 143, such as a keyboard and/or display, and an external interface 144, interconnected via a bus 145 as shown. In this example the external interface 144 can be utilised for connecting the processing system 140 to the sensors 131, 132, 133 and the actuators 121, 122, and optionally to other peripheral devices, such as the communications networks, databases, storage devices, or the like. Although a single external interface 144 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless or the like) may be provided.
[0049] In use, the processing device 141 executes instructions in the form of applications software stored in the memory 142 to allow the required processes to be performed. The applications software may include one or more software modules, and may be executed in a suitable execution environment, such as an operating system environment, or the like. Accordingly, it will be appreciated that the processing system 140 may be formed from any suitable processing system, such as a suitably programmed computing device, or the like.
[0050] In use, the processing device is configured to determine a waste item category using sensor data from one or more of the sensors 131, 132, 133 and control the one or more actuators 121, 122 to move the receptacle 111 and thereby transfer the waste item to one of a plurality of destinations, such as waste bins 101, in accordance with the waste item category.
[0051] An example of this process will now be described with reference to Figure 2.
[0052] In this example, at step 200 a waste item is received, for example, by having the waste item placed in the receptacle 111. Sensor data can then be collected by the sensors 131, 132, 133, with this being acquired by the processing device 141 at step 210, allowing the processing device 141 to determine a waste item category at step 220. Once this has been performed, the processing device 141 can sort the waste item at step 230. For example, with the above described arrangement, this process includes controlling the actuator 121 to rotate the receptacle 111 as shown by the arrow 123 to align the receptacle with a waste bin 101 selected based on the determined category. The actuator 122 is then controlled to thereby tilt the receptacle 111 as shown by the arrow 124 so the waste item is dispensed from the receptacle into the selected waste bin 101.
[0053] Accordingly, the above described arrangement provides a compact apparatus that is capable of using data from multiple sensors to sense characteristics of waste items and use these in order to categorise and subsequently sort waste items. This can be implemented using basic off-the-shelf components, avoiding the need for expensive equipment, such as multi-spectral cameras. Furthermore, the use of the receptacle and actuator arrangement ensures sensing can be performed at a single location, with the waste items being easily transported to a destination, such as a waste bin, avoiding the need for complex item transporting mechanisms, such as conveyor belts, or similar.
[0054] A number of further features will now be described.
[0055] In one example, the receptacle 111 is a chute, and in particular a curved chute having a closed end. This allows waste items, and particularly generally cylindrical waste items, such as bottles or cans, to be placed in the chute, and rest against the closed end of the chute, preventing the waste item accidentally falling from the chute, and hence entering the wrong waste bin. Furthermore, this can orientate waste items such as bottles and cans within the chute, to ensure these are presented consistently to the sensors, which in turn helps more accurately interpret resulting sensor data.
[0056] Furthermore, this arrangement allows for easy sorting and dispensing of waste items, allowing the first actuator 121 to be used to rotate the chute 111 to align the open end of the chute with a bin 101, with the second actuator 122 being used to tip the chute 111, allowing the waste item to slide into the waste bin 101.
[0057] In addition to the above described features, the apparatus can include a frame, which is configured to support the sorting apparatus relative to the waste bins. The actuator 121 can be a rotation actuator attached to the frame, allowing the chute 111 to be aligned with the bins. The post 112 extends upwardly from the rotation actuator 121, with a bracket (not shown) being rotatably mounted to an upper end of the post 112, with the chute 111 being supported by the bracket so that the actuator 122 acts as a tilting actuator to allow the chute to be tilted, thereby dispensing the waste item.
[0058] The apparatus can further include a cover (not shown) having an opening allowing a waste item to be inserted therein and positioned on the chute. This can be used to protect the apparatus, and prevent ingress of objects that could otherwise interfere with the sorting process. In this example, the processing devices can be configured to control the one or more actuators 121, 122 to align the chute with the opening to allow a waste item to be placed therein.
[0059] As mentioned above, multiple sensors can be provided. In this example, the sensors include a visible-light camera 131, and a metal sensor 132 and/or a weight sensor 133, although in other examples, additional and/or alternative sensors may be used.
[0060] In one example, the visible-light camera 131 and an optional illumination source are positioned above the receptacle 111, so that the camera faces downwards to capture images of the waste item. This has a number of benefits, including ensuring an uninterrupted view of the waste items, whilst allowing the waste item to rest within the receptacle 111. The receptacle 111 can be coloured, for example using a black colouring, to ensure more consistent imaging, which can be assisted through the use of an illumination source to thereby counteract changes in ambient radiation. This can help ensure consistent measurement of optical properties of the waste items, and hence improve categorisation.
[0061] In one example, the metal sensor 132 is attached to the receptacle, thereby ensuring the metal content of the waste item is accurately measured, whilst the weight sensor 133 can support the receptacle, for example by positioning this between the rotation actuator and post, so that an accurate weight of the waste item can be captured.
[0062] These sensors in combination allow waste item characteristics to be measured that are indicative of a quantity of metal in the waste item, a weight of the waste item, optical properties of the waste item, such as a reflectivity of the waste item, dimensions of the waste item, text or other visual information provided on the waste item, and coded data, such as bar codes, QR codes, or the like.
[0063] Additionally other sensors could also be employed, including for example, one or more ultrasonic sensors, which can be configured to detect a presence of a waste item in the receptacle, for example to trigger the sorting process, and/or a fill level of a waste bin, for example to trigger emptying of the apparatus.
[0064] Other sensors for sensing waste item characteristics could include accelerometers, Radio Frequency (RF) sensors, infrared cameras and/or ultraviolet-light cameras, which can be used to measure characteristics such as the presence of RFID tags, waste material reflectivity, or the like.
[0065] A further example of the process of classifying waste items to allow sorting will now be described.
[0066] For the purpose of explanation it is assumed that this process is performed using one or more processing devices, and again it will be appreciated that reference to a singular processing device should be understood to encompass multiple processing devices and vice versa, with processing being distributed between the devices as appropriate.
[0067] As will be apparent from the foregoing, this process could be performed using the apparatus described above with respect to Figures 1A to 1C, although this is not essential and other suitable apparatus configurations could be used. For example, waste items could be provided on a conveyor belt and transported past sensors, to allow categorization to be performed, with processing of sensor data again being performed by one or more processing devices as appropriate.
[0068] In this example, the classification of the waste items involves using a computational model, and specifically, using the sensor data and at least one computational model that is at least partially indicative of different waste item categories. The computational model is typically obtained by applying machine learning to sensor data acquired from a plurality of sensors used to measure at least one reference waste item characteristic from reference waste items in different known waste item categories.
[0069] Accordingly, this process uses a computational model and sensor data from multiple sensors, typically an imaging camera and one or more of a weight sensor and metal sensor, which increases the accuracy compared to using a single sensor only. In particular, using multiple sensors and a computational model allows a high degree of accuracy to be achieved when categorizing waste items, allowing the apparatus to accurately sort waste items.
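By way of illustration only, the following Python sketch shows one way such a computational model could be obtained from reference waste items, here using a random forest, one of the model types contemplated in this description; the feature layout, feature values and category labels are illustrative assumptions rather than part of any described embodiment.

# Sketch: training a classifier on fused sensor features measured from
# reference waste items of known category (all values are hypothetical).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each row: [weight_g, metal_reading, mean_reflectivity, length_mm, diameter_mm]
X_reference = np.array([
    [15.0, 0.9, 0.80, 115, 66],   # aluminium can
    [350.0, 0.0, 0.60, 230, 60],  # glass bottle
    [22.0, 0.0, 0.75, 210, 65],   # PET bottle
    [45.0, 0.0, 0.30, 240, 90],   # HDPE bottle
])
y_reference = ["metal", "glass", "PET", "HDPE"]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_reference, y_reference)

# Categorise a newly sensed item from its fused sensor readings.
item_features = np.array([[20.0, 0.0, 0.70, 205, 64]])
print(model.predict(item_features))  # e.g. ['PET']

In practice the reference set would contain many samples per category, but the structure of the training step is the same.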
[0070] A number of further features will now be described.
[0071] The nature of the computational model will vary depending on the preferred implementation, but could include for example a neural network such as a YOLO algorithm, which employs convolutional neural networks (CNN) to detect objects in real-time. However, it will be appreciated that the nature of the model can be of any appropriate form and could include any one or more of decision tree learning, random forest, logistic regression, association rule learning, artificial neural networks, deep learning, inductive logic programming, support vector machines, clustering, Bayesian networks, reinforcement learning, representation learning, similarity and metric learning, genetic algorithms, rule-based machine learning, learning classifier systems, or the like. As such schemes are known, these will not be described in any further detail.
[0072] In one example, the processing device applies at least some of the sensor data to the computational model. An example of this, for the specific embodiment of neural networks will now be described with reference to Figure 3.
[0073] In this example, at step 300 a waste item is received, with sensor data being collected by the sensors, with this being acquired by the processing device at step 310. At step 320, the processing device applies the sensor data to the neural network computational model, with this providing an indication of a categorisation at step 330.
[0074] Thus, in this example, sensor data, such as image data from an imaging device, is applied directly to the computational model. However, it will be appreciated that as an alternative to applying the sensor data directly to the computational model, in other examples, the method may include determining one or more metrics from the sensor data and then applying the metrics to the at least one computational model. For example, instead of applying image data directly to the model, the image data could be analysed, for example by performing edge detection to determine dimensions of the waste item, with the dimensions then being applied to the computational model. Similarly, image data could be analysed to determine other metrics, such as the reflectivity of the waste item material, the presence and content of any text or images on the waste item, or the presence and content of coded data, such as barcodes, QR codes, or the like.
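By way of illustration only, the following Python sketch shows how dimension metrics might be derived from image data by edge detection, assuming the OpenCV library; the image file name, Canny thresholds and calibration constant are illustrative assumptions.

# Sketch: deriving dimension metrics from an image via edge detection.
import cv2

frame = cv2.imread("waste_item.jpg")  # hypothetical captured frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

if contours:
    # The bounding box of the largest contour approximates the item outline.
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    MM_PER_PIXEL = 0.5  # assumed camera calibration constant
    length_mm, width_mm = h * MM_PER_PIXEL, w * MM_PER_PIXEL
    # The dimension metrics, rather than the raw image, are then applied
    # to the computational model.
    print(length_mm, width_mm)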
[0075] In the above specific example, a single computational model is used, and sensor data from each of the plurality of sensors is applied to the one model. However, this isn't essential and other approaches could be used. For example, a respective computational model could be used for each sensor, so that sensor data and/or metrics derived from the sensor data associated with one sensor can be applied to a respective computational model. Thus, image data could be applied to an image computational model, weight sensor data applied to a weight computational model, and so on. In this example, each model can then be used to provide an indication of a possible categorisation, with these being combined in some manner to determine a waste item category.
[0076] In one example, outputs from the sensors could be assessed sequentially, which can streamline the analysis in some instances. Thus, in this instance, the processing device could assess sensor data from a first sensor to determine if the waste item can be categorised. If so, the waste item is categorised, and if not sensor data from at least one further sensor is assessed, with this process continuing until a category can be determined. An example of this is shown in more detail in Figure 4.
[0077] In this example, at step 400 a waste item is received, with sensor data being collected by the sensors and acquired by the processing device at step 410. At step 420, the processing device selects a next sensor, and attempts to perform categorisation at step 430. This can involve the use of a computational model, although this may not be required, depending on the nature of the sensor and the waste items. For example, when sorting bottles and cans, typically only cans are metallic, and so if more than a certain amount of metal is detected, this can be used to determine the waste item is a can.
[0078] At step 440 it is determined whether a categorisation has been made, and if so, the process ends at step 450. Otherwise, the process returns to step 420 to select a next sensor, with this process being repeated until a categorisation has been determined. In this regard, it will be appreciated that sensor data from each sensor could be considered individually, or once multiple sensor data has been analysed, these could be considered collectively.
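By way of illustration only, the following Python sketch shows a sequential assessment of this kind, in which a strong metal reading categorises the item immediately and further sensors are consulted otherwise; the readings, thresholds and confidence values are illustrative assumptions.

# Sketch: assessing sensors in sequence, stopping at a confident result.
def metal_stage(readings):
    # A strong metal signal alone is enough to categorise the item as a can.
    if readings["metal"] > 0.8:
        return "metal", 0.99
    return None, 0.0

def weight_stage(readings):
    # In this simplified example, heavy items are most likely glass.
    if readings["weight_g"] > 200:
        return "glass", 0.90
    return None, 0.0

def categorise_sequentially(readings, stages, threshold=0.9):
    for stage in stages:
        category, confidence = stage(readings)
        if category is not None and confidence >= threshold:
            return category  # corresponds to ending at step 450
    return "unclassified"    # defer to further analysis, e.g. the camera

readings = {"metal": 0.05, "weight_g": 350.0}
print(categorise_sequentially(readings, [metal_stage, weight_stage]))  # glass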
[0079] In one example, the processing device can determine a number of categorisation indications, such as categorisation probabilities, with each categorisation indication being determined using sensor data from one or more of the plurality of sensors. The categorisation is then performed using the categorisation indications. An example of this will now be described with reference to Figure 5.
[0080] In this example, at step 500 a waste item is received, with sensor data being collected by the sensors and acquired by the processing device at step 510. At step 520, the processing device analyses sensor data, and then determines a number of category probabilities at step 530. This can be performed for individual sensors, for example, determining separate probabilities using metal and weight sensors. Additionally, and/or alternatively, this could be performed using combinations of sensors, for example determining one probability using a combination of image data and metal sensor data and another probability using the combination of image data and weight sensor data.
[0081] Once the probabilities have been determined, these can be used to determine a waste item category at step 540. This can be achieved in different manners depending on the preferred implementation, for example by determining a waste item category based on the categorisation indication having a highest probability or by selecting a categorisation indication exceeding a threshold.
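By way of illustration only, the following Python sketch shows one way categorisation probabilities from different sensor combinations could be combined, supporting both the highest-probability and threshold-based selection just described; the category names and probability values are illustrative assumptions.

# Sketch: combining per-sensor categorisation probabilities (step 540).
def fuse_probabilities(indications, threshold=None):
    combined = {}
    for indication in indications:  # one dict per sensor (or combination)
        for category, p in indication.items():
            combined[category] = combined.get(category, 0.0) + p
    total = sum(combined.values())
    combined = {c: p / total for c, p in combined.items()}  # normalise
    if threshold is not None:
        # Option (b): accept a category whose probability exceeds the threshold.
        for category, p in combined.items():
            if p >= threshold:
                return category
        return None
    # Option (a): take the categorisation indication with highest probability.
    return max(combined, key=combined.get)

image_and_metal = {"metal": 0.1, "PET": 0.6, "glass": 0.3}
image_and_weight = {"PET": 0.7, "HDPE": 0.2, "glass": 0.1}
print(fuse_probabilities([image_and_metal, image_and_weight]))  # PET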
[0082] It will be appreciated that these approaches could also be used in conjunction. For example, categorisation probabilities could be used in the approach of Figure 4, with a categorisation probability being determined for each sensor in turn, with a categorisation being determined when any one of a combination of probabilities exceeds a threshold.
[0083] As in the example of the apparatus of Figures 1A to 1C, the plurality of sensors can include a visible-light camera and one or more of a metal sensor or weight sensor, which can measure waste item characteristics such as a quantity of metal in the waste item, a weight of the waste item, optical properties of the waste item, a reflectivity of the waste item, dimensions of the waste item, or the like.
[0084] The system can also extract metrics from the sensor data, such as text displayed on the waste item, dimensions of the waste item, a waste item reflectivity, or the like, applying the metrics to the computational model.
[0085] In one preferred example, the approach includes acquiring sensor data from each of a visible-light camera, a metal sensor and a weight sensor and applying the sensor data to a neural network indicative of different waste item categories to determine a waste item category, the neural network being trained on sensor data acquired from sensors used to measure reference waste items in different known waste item categories.
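By way of illustration only, the following Python sketch outlines one possible neural network structure for fusing camera, metal sensor and weight sensor data, assuming the PyTorch library; the layer sizes, input shapes and category count are illustrative assumptions rather than a definitive implementation of the trained network.

# Sketch: a small network fusing an image branch with scalar sensor inputs.
import torch
import torch.nn as nn

class FusionClassifier(nn.Module):
    def __init__(self, num_categories=4):
        super().__init__()
        # Convolutional branch for the camera image.
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Head fusing image features with the metal and weight readings.
        self.head = nn.Sequential(
            nn.Linear(32 + 2, 64), nn.ReLU(),
            nn.Linear(64, num_categories),
        )

    def forward(self, image, metal_weight):
        features = self.vision(image)
        return self.head(torch.cat([features, metal_weight], dim=1))

model = FusionClassifier()
image = torch.randn(1, 3, 224, 224)         # camera frame (batch of one)
metal_weight = torch.tensor([[0.0, 22.5]])  # metal reading, weight in grams
category_logits = model(image, metal_weight)  # one logit per waste category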
[0086] Other sensors can also be used including ultrasonic sensors configured to detect the presence of waste items to trigger classification of waste items and/or detect a waste bin fill level. Similarly, accelerometers, Radio Frequency (RF) sensors, infrared cameras and/or ultraviolet-light cameras, could also be used.
[0087] A specific implementation of an automatic smart bin for sorting recyclable items will now be described with reference to Figures 6A and 6B.
[0088] In this example, the apparatus 600 includes a chute 611 attached via a rotatable bracket 613 to a post 612. The post is attached to a rotary actuator 621 via a weight scale 633, whilst a tilting actuator 622 is attached to the bracket 613, allowing the chute 611 to be rotated and tilted. The rotary actuator 621 is mounted on a frame 602 to support the arrangement above four waste bins 601. A pyramidal cover 603 is also removably attached to the frame 602, with the cover 603 including an opening 604.
[0089] In use, the chute 611 can be rotated to align with the opening 604 so that waste items inserted into the opening 604 are received on the chute 611. Following categorisation of the waste item, the chute 611 can be rotated to align with one of the bins 601, before being tilted so that waste items can be deposited therein.
[0090] The apparatus 600 includes a camera system 631 positioned above the chute, allowing the camera to image waste items positioned on the chute 611, and example captured images are shown in Figures 9A to 9P. A light 631.1 is positioned proximate the camera 631, which in this example is in the form of a ring light surrounding the camera 631, which helps ensure even illumination of the waste item. This minimises the impact of variable ambient light, and thus ensures the waste sample is consistently illuminated, irrespective of external illumination sources. It will be appreciated that this can be further assisted by the use of an opaque cover.
[0091] In this example, one or more metal sensors 632 are mounted on an underside of the chute 611, allowing metal content within waste items positioned on the chute to be detected. The apparatus 600 also includes an ultrasonic sensor 634 mounted on the chute, allowing the presence of waste items on the chute to be detected, whilst ultrasonic sensors 635 are mounted on the frame, allowing the fill volume of each bin 601 to be detected.
[0092] In this example, the processing system 640 is mounted above the camera 631 and light 631.1, with the apparatus 600 including one or more display screens 643, allowing information regarding the sorting process to be displayed.
[0093] In more detail, the vision part of the apparatus includes the ring light 631.1, the camera 631, and an AI processing system 640 (Jetson Nano). The ring light 631.1 provides stable illumination for the chute 611, whilst the camera 631 is installed directly above the chute 611, with the distance between the camera 631 and the chute 611 being adjusted according to the focal length of the camera 631 such that the chute 611 substantially fills the video image frame. The camera 631 is connected to the AI processing system 640 and provides video streams of the recycle chamber to the AI processing system 640. The AI processing system 640 runs a pre-trained classification algorithm to detect objects in the video streams from the camera and to classify the object as PET or HDPE. The AI processing system 640 then combines the object detection result and other sensing signals to make a final classification result. The AI processing system 640 sends the result to a separate microcontroller (not shown) for sorting control. The AI processing system 640 can also output the video stream to one of the display screens 643.
[0094] In one specific example, the processor system 640 uses an NVIDIA Jetson device to perform AI-based plastic classification, while the camera module uses a Raspberry Pi camera to capture video of waste items. The light module uses a ring light to provide additional illumination, and the apparatus uses an Arduino device to control the sensors and servos. An object and fullness detector module uses the ultrasonic sensors 634, 635 to detect waste items and measure the fullness of the bins, while a scale module uses the weight sensor 633 to measure the weight of waste items. A metal detector module uses the metal sensor 632 to detect metal waste items, and a movement module uses servos 621, 622 to move the chute 611. These modules and devices work together to enable the automatic smart bin to classify and sort waste items into the appropriate containers or bins for recycling or disposal.
[0095] The design of the chute 611 is shown in more detail in Figure 6B.
[0096] In this example, a rotation servo 621 is used to turn the chute 611 to the corresponding bin 601. The waste bins 601 are evenly located at the four corners of the apparatus 600 such that the rotation angles to the bins are fixed and pre-calculated. When a recycling task arrives, the apparatus 600 can control the rotation servo to rotate to a specific angle according to the pre-calculated bin-angle mapping. After dumping the waste items, the automatic smart bin controls the rotation servo to rotate the same angle in the opposite direction such that the chute 611 can point to the opening 604 for the next recycling process.
[0097] The tilting servo 622 is used to tip the recycle item into the designated bin. The tilting servo 622 is by default at a position pointing towards the opening 604 to accept the deposited waste item. The uplifting angle ensures that waste bottles stay at the bottom of the chute 611 such that the bottles are within the detection range of the metal detector 632 and at the centre of the camera field of view. When dumping a waste item into a waste bin 601, the tilting servo 622 tilts down the chute 611 by 50 degrees, so that the waste item in the chute 611 can slide into the bin 601. After dumping the waste item, the tilting servo 622 tilts up the chute 611 by 50 degrees in the opposite direction for the next recycling process.
[0098] Thus, in this example, the bins are evenly distributed around the chute, with a top view similar to that shown in Figure 1B. To dump a deposited bottle in the chute 611 into a specific bin, the chamber can rotate to the bin direction by the corresponding degrees and then tilt down to dump the bottle. For example, bins 1, 2, 3 and 4 are for metal, glass, HDPE and PET bottles. The chamber can rotate 45 and 135 degrees clockwise to face the direction of bin 1 and bin 2, respectively, and then tilt down 50 degrees to deposit the bottle from the chamber. Likewise, the chamber can rotate 45 and 135 degrees counter-clockwise to dump bottles into bin 3 and bin 4, respectively.
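By way of illustration only, the following Python sketch captures the pre-calculated bin-angle mapping and rotate-tilt-return sequence just described; the servo driver functions are hypothetical stubs standing in for whatever motor interface is used.

# Sketch: rotate-tilt-return sorting using the pre-calculated angle mapping.
import time

BIN_ANGLES = {"metal": 45, "glass": 135, "HDPE": -45, "PET": -135}  # degrees
TILT_ANGLE = 50  # degrees, enough for the item to slide into the bin

def rotate_chute(degrees):
    print(f"rotating chute by {degrees} degrees")  # stand-in for servo command

def tilt_chute(degrees):
    print(f"tilting chute by {degrees} degrees")   # stand-in for servo command

def dump_into_bin(category):
    angle = BIN_ANGLES[category]
    rotate_chute(angle)      # face the selected bin
    tilt_chute(-TILT_ANGLE)  # tilt down so the item slides out
    time.sleep(1.0)          # allow the item to leave the chute
    tilt_chute(TILT_ANGLE)   # tilt back up
    rotate_chute(-angle)     # return to the opening for the next item

dump_into_bin("PET")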
[0099] Compared with other sorting mechanisms, such as those based on conveyor belts and robotic arms, the rotation-tilt-based sorting mechanism is more compact and suitable for confined spaces. The rotation-tilt-based sorting can also keep deposited waste bottles at a fixed point, making it easy to design the locations of the sensors.
[0100] The rotation-tilt-based in-situ sorting mechanism can also be extended in a few ways. For example, more bins can be added to accept more types of wastes, with the rotation angles being updated accordingly. Additional sets of bins can be provided, with the chute 611 being moved horizontally to a different set of bins. A chute or splash guard can be added to ensure bottles are always dumped into the bin.
[0101] The classification process will now be described in more detail with reference to Figure 7.
[0102] The automatic smart bin continuously monitors the chute 611 with the ultrasonic sensor 634 to detect when a waste item has been received at step 700. Once the processing system 640 detects a waste item being deposited onto the chute 611, the processing system 640 will start the sensing, classification and sorting process.
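By way of illustration only, the following Python sketch shows one way deposits could be detected from the ultrasonic sensor 634, by watching for a drop in the measured distance to the chute floor; the distance values, margin and driver function are illustrative assumptions.

# Sketch: triggering the sorting process from an ultrasonic distance reading.
import time

EMPTY_CHUTE_CM = 25.0     # assumed distance to the empty chute floor
PRESENCE_MARGIN_CM = 3.0  # item present when the echo returns this much sooner

def read_distance_cm():
    return 18.0  # hypothetical stub; replace with the actual sensor driver

def wait_for_waste_item(poll_s=0.1):
    while True:
        if read_distance_cm() < EMPTY_CHUTE_CM - PRESENCE_MARGIN_CM:
            return True  # start the sensing, classification and sorting process
        time.sleep(poll_s)

print(wait_for_waste_item())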
[0103] In this regard, the apparatus senses the waste with the metal sensor(s) 632 installed on the chute 611 and the weight sensor 633 under the chute 611 at step 730, as well as using the camera 631 on top of the chamber to image the waste item, capturing multimedia sensor data at step 710. Sensor data is sent to the AI classification module implemented by the processing system 640, which analyses the video stream to perform object detection at step 720, and then combines this with other sensor data to classify the waste.
[0104] In this regard, the AI classification module analyses the sensor data from the sensing module(s), including the multimedia data and non-multimedia data. The multimedia data, such as the video stream, is first fed into the vision-based object detection model to identify the objects and their sizes in the video. For example, the object detection model can identify PET bottles in the video stream, along with the size of the bottles and the confidence of the detection result. The multimedia-based object detection module can use off-the-shelf vision-based object detection models. This allows the apparatus to accurately recognise waste types, including fine-grained recognition of plastic types, using multiple sensing data.
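By way of illustration only, the following Python sketch shows how an off-the-shelf detector could be run over the camera stream, assuming the ultralytics YOLO package and a hypothetical custom-trained weights file; neither the package nor the file name is mandated by this description.

# Sketch: off-the-shelf object detection over the camera stream.
import cv2
from ultralytics import YOLO

model = YOLO("plastics.pt")    # hypothetical weights trained on bottle images
capture = cv2.VideoCapture(0)  # camera positioned above the chute

ok, frame = capture.read()
if ok:
    result = model(frame)[0]
    for box in result.boxes:
        label = result.names[int(box.cls)]  # e.g. "PET" or "HDPE"
        confidence = float(box.conf)
        x, y, w, h = box.xywh[0].tolist()   # size later feeds the density check
        print(label, confidence, w, h)
capture.release()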
[0105] The apparatus can also employ computer vision technologies, including object detection, image classification, Optical Character Recognition (OCR), and bar/QR code scanners, to recognise plastic types. Thus, the detection module can employ OCR technology to extract the brand and other information on the bottle label, and could also include a bar/QR code scanner to read the product code on the bottle if any code appears in the image. In other embodiments, the automatic smart bin further refines recognition accuracy with multiple sensing data, including visible light signals, ultraviolet-light signals, near infrared light signals, weight signals, ultrasonic signals, and metal signals.
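By way of illustration only, the following Python sketch shows how label text and product codes might be extracted from a captured frame, assuming the pytesseract and pyzbar packages; the image file name is an illustrative assumption.

# Sketch: OCR and bar/QR code extraction from the captured image.
import cv2
import pytesseract
from pyzbar.pyzbar import decode

frame = cv2.imread("waste_item.jpg")  # hypothetical captured frame

label_text = pytesseract.image_to_string(frame)          # brand/product text
codes = [c.data.decode("utf-8") for c in decode(frame)]  # bar/QR codes, if any

# Either result can be used as an additional metric for classification.
print(label_text.strip(), codes)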
[0106] AI vision models have been used to classify images according to image content and identify objects in the image, but have yet to be used to classify different types of plastics with similar appearance. To classify different plastics, a large data set with a variety of plastics is created. To be specific, massive photo data sets of different types of plastics are taken at different angles and labelled with the corresponding plastic types, as shown for example in Figures 9A to 9P. To achieve high classification accuracy, the photos are taken using the automatic smart bin with the same background and lighting condition. The dataset is then used to train the object detection model such that the object detection model can classify different plastics.
[0107] The output of the multimedia-based object detection and other non-multimedia data are fed into the classification algorithm to get the final waste type detection result at step 740. The data will also be sent to a waste item status algorithm at step 750 to determine a bottle status.
[0108] An example of classifying a PET bottle is as follows. The object detection model suggests the object could be PET or glass from the video stream; the classification algorithm can calculate the density of the bottle using the size information from the object detection model and the weight from the weight sensor, and then confirm the bottle is of type PET.
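By way of illustration only, the following Python sketch shows the density check just described, approximating the bottle as a cylinder using the detector's size estimate and the weight sensor reading; the measurements and the decision threshold are illustrative assumptions.

# Sketch: confirming PET versus glass by effective density.
import math

def effective_density_g_per_cm3(weight_g, length_cm, diameter_cm):
    # Approximate the bottle as a cylinder to obtain an effective volume.
    volume_cm3 = math.pi * (diameter_cm / 2) ** 2 * length_cm
    return weight_g / volume_cm3

density = effective_density_g_per_cm3(weight_g=22.0, length_cm=21.0,
                                      diameter_cm=6.5)
# An intact PET bottle has a very low effective density; a glass bottle of
# similar size is an order of magnitude denser.
category = "PET" if density < 0.2 else "glass"
print(category, round(density, 3))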
[0109] The detection results and non-multimedia data are also sent to the status description algorithm for recycle analysis. For example, the algorithm can describe the bottle as a Brand-A product-P bottle half full of water. The brand and product information can be deduced from the multimedia-based object detection algorithm, while the fill level is deduced from the weight and the acceleration data.
[0110] The classification and status information are then passed to a control algorithm at step 760, and an example of operation of the control algorithm will now be described with reference to Figures 8 A and 8B.
[0111] The control algorithm receives the classification at step 800. If it is determined that the waste is classified as metal at step 805, the control algorithm controls the chute 611 to rotate to the metal bin direction at step 810 and then tilts the chute 611 to dump the waste at step 840. Meanwhile, the control algorithm reads the fullness of the metal bin with the ultrasonic sensor on top of the bin at step 845 and optionally updates the bin fullness on the screen 643, or uploads an indication to a remote location, such as a monitoring system that coordinates bin emptying. Next, the control algorithm tilts up the chute 611 and rotates the chute 611 to the initial position at step 850, and resets the sensors for the next task at step 855.
[0112] If the waste is not metal, the control algorithm determines whether the waste is glass at step 815, or HDPE or PET at step 825 (in each case rotating the chute 611), and then sorts the waste to appropriate bins at steps 820, 830 or 835, respectively, before performing steps 840 to 855 as required.
[0113] In the above examples, the focus has been on classification of glass, metal cans, PET bottles and HDPE bottles. However, it will be appreciated that the techniques can be adapted to recognize and classify a wider range of waste items, including food containers, coffee cups, paper, cardboard, and other types of plastics.
[0114] The classification algorithm can be trained with a large number of samples in a variety of conditions, to help ensure that the module is able to accurately and reliably classify waste items, regardless of their condition or appearance. This can help improve the overall performance and effectiveness of the automatic smart bin technology.
[0115] The current version of the apparatus only contains four bins (glass, metal, PET and HDPE) centred around the chute 611. This is due to the limited size of the automatic smart bin and the fact that the rotation-based sorting mechanism can only handle this number of bins. As a result, the current version of the automatic smart bin is limited in its ability to sort waste items into the four categories. However, it will be appreciated that this can be expanded to enable sorting of a greater variety of waste items.
[0116] Accordingly, the above arrangement seeks to provide an apparatus, and in one example, a smart bin, that:
1. Can automatically classify a variety of types of wastes, including different types of plastics, with high classification accuracy;
2. Can automatically sort waste into appropriate bins after classification; and
3. Is compact and has an affordable design for on-site waste classification and sorting.
[0117] The apparatus can use one or more sensing modules to extract features of the waste items. Typical sensing module(s) include a variety of sensors, such as ultrasonic sensors, metal sensors, weight sensors, accelerometers, Radio Frequency (RF) sensors, visible-light cameras, infrared cameras, and ultraviolet-light cameras. Metal sensors can sense metal objects within the detection range and weight sensors can measure the weight of the waste. RF sensors can detect the presence of certain materials through their unique RF signatures and interact with RF tags within the communication range. Visible-light cameras, infrared cameras, and ultraviolet-light cameras can capture images of waste in different spectrums. The sensors are located inside of the apparatus around the waste receiving chute to eliminate or reduce interference and noise.
[0118] The sensing module(s) may also include other supportive components. For example, the sensing module may have a power supply to provide electricity to the sensors and other components, fans to keep the sensors cool, and displays to show the results of the sensing. It may also have shields to block electromagnetic waves that could interfere with the sensors, and lights to illuminate the waste in order to improve the accuracy of the sensing.
[0119] The sensing module(s) should be able to locate deposited waste regardless of how the waste is deposited into the bin, such that the sensing module(s) can control physical variables during sensing.
[0120] Physical variables, such as the distance between the waste and the sensors and the lighting arrangements, can be fine-tuned such that the sensors work properly and accurate sensor data is captured. Sensors should sense the deposited waste in a proper order to reduce interference between sensors, and redundant sensors can be employed to obtain comprehensive sensing results for the waste, such as from different angles, distances and locations.
[0121] Accordingly, the above described arrangements use sensors, Artificial Intelligence (AI) and robotics to classify and sort waste items automatically.
[0122] In one example, the apparatus integrates sensors, including weight sensors, ultrasonic sensors, and metal sensors, to detect and recognise glass and metal bottles that have been dropped into the bin. The apparatus can employ computer vision-based object detection technologies to recognise PET (polyethylene terephthalate) and HDPE (high-density polyethylene) plastic bottles. After the classification, the smart bin controls robotic arms to physically sort waste bottles into the appropriate bins for recycling without the need for human intervention.
[0123] The apparatus can automatically classify and sort a variety of types of waste bottles without human intervention. The apparatus provides a drop-and-go waste classification and sorting on-field solution. Individuals can simply deposit their waste bottles and other items into the apparatus, and it will automatically classify and sort the items into the appropriate containers or bins for recycling or disposal.
[0124] Some conventional waste sorting arrangements can only scan waste bottles and give suggestions on the types of bottles but cannot sort them. Users still need to manually drop bottles into the right bins following the suggestions. Some other traditional systems rely solely on the product code on waste bottles and require users to scan bottles manually and then deposit the bottles. The drop-and-go nature of the above described apparatus makes it a convenient and effective solution for waste classification and sorting on-field.
[0125] The apparatus can also process bottles without labels, while product code-based smart bins can only rely on easily peeled bottle labels. This is because product code-based smart bins use the product codes, in the form of a bar code, QR code or Near-Field Communication (NFC) label, on the labels of waste bottles to identify and classify them. Without the labels, traditional arrangements are unable to identify the bottles and cannot sort them correctly. In contrast, the proposed apparatus can process waste items without labels using advanced sensing and AI technology. The developed apparatus can identify and classify the bottles based on their material, shape, colour, size, and other characteristics, allowing it to accurately and reliably sort waste bottles, even if they do not have labels.
[0126] There are many potential applications and fields of use for the developed automatic smart bin technology. Some examples of these fields include:
• Waste management: Smart bins can be used in waste management facilities to automatically classify and sort waste bottles and other items into the appropriate containers or bins for recycling or disposal. This can help improve the efficiency and accuracy of waste sorting and recycling processes.
• Public spaces: Smart bins can be installed in public spaces, such as parks, malls, and schools, to provide individuals with a convenient way to dispose of waste bottles and other items. The automatic classification and sorting capabilities of the smart bin can help reduce litter and improve waste management in these areas.
• Manufacturing: Smart bins can be used in manufacturing facilities to automatically classify and sort wastes and other items generated during the production process. This can help improve the efficiency and sustainability of manufacturing operations.
[0127] The developed apparatus can classify PET and HDPE plastic bottles in an affordable way, while conventional arrangements cannot identify the different types of plastics. The apparatus uses advanced AI-based object detection algorithms to identify and accurately classify waste bottles based on their visual characteristics, such as their shape, size, and colour, from the video stream of inexpensive visible-light devices. Other traditional arrangements either cannot identify the types of plastics or use expensive infrared cameras to test plastic samples.
[0128] The apparatus can play an important role in implementing effective waste reduction and recycling strategies. The automatic smart bin technology contributes to reducing waste, improving the recycling rate, reducing recycling costs, and improving sustainability in the waste management industry. The classification and sorting technology in the automatic smart bin can also be used in other fields, e.g., recycling leftover material during manufacturing.
[0129] Throughout this specification and claims which follow, unless the context requires otherwise, the word “comprise”, and variations such as “comprises” or “comprising”, will be understood to imply the inclusion of a stated integer or group of integers or steps but not the exclusion of any other integer or group of integers. As used herein and unless otherwise stated, the term "approximately" means ± 20%.
[0130] Persons skilled in the art will appreciate that numerous variations and modifications will become apparent. All such variations and modifications which become apparent to persons skilled in the art should be considered to fall within the spirit and scope of the invention as broadly described hereinbefore.

Claims

THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:
1) A method for sorting waste items, the method including: a) receiving a waste item; b) acquiring sensor data from a plurality of sensors, each sensor being used to measure at least one waste item characteristic; c) in one or more processing devices, determining a waste item category using the sensor data and at least one computational model, the at least one computational model being at least partially indicative of different waste item categories and being obtained by applying machine learning to sensor data acquired from a plurality of sensors used to measure at least one reference waste item characteristic from reference waste items in different known waste item categories; and, d) sorting the waste item based on the determined waste item category.
2) A method according to claim 1, wherein the at least one computational model includes a neural network.
3) A method according to claim 1 or claim 2, wherein the method includes, in the one or more processing devices, applying at least some of the sensor data to the computational model.
4) A method according to any one of the claims 1 to 3, wherein the method includes, in the one or more processing devices: a) determining one or more metrics from the sensor data; and b) applying the metrics to the at least one computational model.
5) A method according to any one of the claims 1 to 4, wherein the method includes, in the one or more processing devices, determining a waste item category using a single computational model and sensor data from each of the plurality of sensors.
6) A method according to any one of the claims 1 to 5, wherein the method includes, in the one or more processing devices: a) determining a number of categorisation indications, each categorisation indication being determined using sensor data from one or more of the plurality of sensors; and, b) determining the waste item categorisation using the number of categorisation indications.
7) A method according to claim 6, wherein the waste item categorisation indications are probabilities and wherein the method includes, in the one or more processing devices, at least one of: a) determining a waste item category based on the categorisation indication having a highest probability; and b) determining a waste item category for a categorisation indication exceeding a threshold.
8) A method according to any one of the claims 1 to 7, wherein the method includes: a) assessing sensor data from a first sensor to determine if the waste item can be categorised; and, b) depending on the result of the determination at least one of: i) determining the waste item category; and ii) assessing sensor data from at least one further sensor to determine the waste item category.
9) A method according to claim 8, wherein the method includes assessing sensor data from a number of sensors in sequence to determine the waste item category.
10) A method according to any one of the claims 1 to 9, wherein the plurality of sensors include: a) a visible-light camera; and, b) at least one of: i) a metal sensor; and, ii) a weight sensor.
11) A method according to claim 10, wherein the waste item characteristic is indicative of at least one of: a) a quantity of metal in the waste item; b) a weight of the waste item; c) optical properties of the waste item; d) a reflectivity of the waste item; and, e) dimensions of the waste item.
12) A method according to claim 4, wherein the metrics include: a) text displayed on the waste item; b) dimensions of the waste item; and, c) a waste item reflectivity.
13) A method according to any one of the claims 1 to 12, wherein the method includes, in one or more processing devices: a) acquiring sensor data from each of a visible-light camera, a metal sensor and a weight sensor; and, b) applying the sensor data to a neural network indicative of different waste item categories to determine a waste item category, the neural network being trained on sensor data acquired from sensors used to measure reference waste items in different known waste item categories.
14) A method according to any one of the claims 1 to 13, wherein the plurality of sensors includes one or more ultrasonic sensors configured to detect the presence of waste items to at least one of: a) trigger classification of waste items; and, b) detect a waste bin fill level.
15) A method according to any one of the claims 1 to 14, wherein the plurality of sensors include at least one of: a) accelerometers; b) Radio Frequency (RF) sensors; c) infrared cameras; and, d) ultraviolet-light cameras.
16) A method according to claim 15, wherein the waste item characteristics include: a) RF signatures of materials present in the waste item; b) data read from RF tags associated with the waste item; and, c) optical properties of the waste item.
17) A method according to any one of the claims 1 to 16, wherein the method includes: a) receiving the waste item in a receptacle; b) determining the waste item category of the waste item; and, c) moving the receptacle to transfer the waste item to one of a plurality of destinations in accordance with the waste item category.
18) A method according to any one of the claims 1 to 17, wherein the method includes transferring the waste item to one of a plurality of waste bins depending on the waste item category.
19) A method according to any one of the claims 1 to 18, wherein the method includes, in the one or more processing devices, controlling one or more actuators to move the receptacle.
20) A method according to any one of the claims 1 to 19, wherein the receptacle is a chute, and wherein the method includes, in the one or more processing devices: a) controlling a first actuator to rotate the chute and thereby align the chute with a selected one of a plurality of waste bins; and, b) controlling a second actuator to release the waste item from the chute and thereby transfer the waste item into the selected waste bin.
21) A method according to claim 20, wherein the method includes, in the one or more processing devices, controlling the first and second actuators to align the chute with an opening to allow a waste item to be placed therein.
22) A method according to claim 20 or claim 21, wherein the method includes, in the one or more processing devices, detecting placement of a waste item on the chute using an ultrasonic sensor.
23) A system for sorting waste items, the system including: a) a plurality of sensors, each sensor being used to measure at least one waste item characteristic of a received waste item; and, b) one or more processing devices configured to: i) determine a waste item category using the sensor data and at least one computational model, the at least one computational model being at least partially indicative of different waste item categories and being obtained by applying machine learning to sensor data acquired from a plurality of sensors used to measure at least one reference waste item characteristic from reference waste items in different known waste item categories; and, ii) sort the waste item based on the determined waste item category.
PCT/AU2024/050296 2023-04-05 2024-03-28 Waste sorting method and apparatus Pending WO2024207052A1 (en)

Applications Claiming Priority (2)

Application Number                  Priority Date   Title
AU2023901001A (AU2023901001A0, en)  2023-04-05      Waste sorting method and apparatus
AU2023901001                        2023-04-05

Publications (1)

Publication Number Publication Date
WO2024207052A1 (en)   2024-10-10

Family

Family ID: 92970645

Family Applications (1)

Application Number                           Title                                Priority Date   Filing Date
PCT/AU2024/050296 (WO2024207052A1, Pending)  Waste sorting method and apparatus   2023-04-05      2024-03-28

Country Status (1)

Country Link
WO (1) WO2024207052A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180016096A1 (en) * 2016-07-15 2018-01-18 CleanRobotics, Inc. Automatic sorting of waste
US20200222949A1 (en) * 2017-09-19 2020-07-16 Intuitive Robotics, Inc. Systems and methods for waste item detection and recognition
CN112009899A (en) * 2020-07-13 2020-12-01 西安建筑科技大学 Garbage classification box constructed based on convolutional neural network
CN113023158A (en) * 2021-03-25 2021-06-25 武汉工程大学 Household garbage classification device and method
CN112960313A (en) * 2021-04-20 2021-06-15 杭州恒集智能科技有限公司 Garbage classification device and method
CN114067488A (en) * 2021-11-03 2022-02-18 深圳黑蚂蚁环保科技有限公司 Recovery system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FRIEDRICH KARL, KOINIG GERALD, FRITZ THERESA, POMBERGER ROLAND AND VOLLPRECHT DANIEL: "Sensor-based and Robot Sorting Processes and their Role in Achieving European Recycling Goals - A Review", ACADEMIC JOURNAL OF POLYMER SCIENCE, vol. 5, no. 4, 1 January 2022 (2022-01-01), pages 1 - 18, XP093223742, ISSN: 2641-8282, DOI: 10.19080/AJOP.2022.05.555668 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN120133188A (en) * 2025-05-15 2025-06-13 北京霍里思特科技有限公司 Material sorting method and material sorting equipment

Similar Documents

Publication Title
US12371254B2 (en) Systems and methods for detecting waste receptacles
US11986859B2 (en) Perception systems and methods for identifying and processing a variety of objects
US20210371196A1 (en) Automatic sorting of waste
CN108351637B (en) Robotic systems and methods for recognizing and handling various objects
US11685598B2 (en) Refuse collection system
JP7514413B2 (en) Waste sorting equipment
CA2803510C (en) A checkout counter
US20160078414A1 (en) Solid waste identification and segregation system
WO2024207052A1 (en) Waste sorting method and apparatus
RU2731052C1 (en) Robot automatic system for sorting solid municipal waste based on neural networks
Moirogiorgou et al. Intelligent robotic system for urban waste recycling
EP3273393B1 (en) Package inspection system
WO2021205721A1 (en) Data collection method, data collection system, data collection device, data provision method, and computer program
WO2024207048A1 (en) Waste sorting method and apparatus
US12350713B2 (en) Perception systems and methods for identifying and processing a variety of objects
WO2012005661A1 (en) A checkout counter
Kokoulin et al. The development of a system for objects recognition in reverse vending machine using machine learning technologies
US20250278972A1 (en) Container recognition and identification system and method
KR102816892B1 (en) Waste sorting system including waste sorting robot
Athari et al. Design and implementation of a parcel sorter using deep learning
KR102815284B1 (en) The Standard Compliance Monitoring System of Waste Disposal, using Artificial Intelligence
Ríos-Zapata et al. Can the attributes of a waste bin improve recycling? A literature review for sensors and actuators to define product design objectives
WO2024009118A1 (en) Sorting container, sorting system, sorting container arrangement and use of sorting container
Tan et al. Bin There Dump That
BR102023025030A2 (en) DEVICE AND METHOD FOR RECORDING STORAGE LOCATION BY SIZE FOR HARVESTED TUBERS

Legal Events

Code   Description
121    EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 24783866; Country of ref document: EP; Kind code of ref document: A1)
NENP   Non-entry into the national phase (Ref country code: DE)