
WO2022166505A1 - Robot apparatus for remote and autonomous experiment, and management system and method - Google Patents


Info

Publication number
WO2022166505A1
WO2022166505A1 (application PCT/CN2022/000018)
Authority
WO
WIPO (PCT)
Prior art keywords
experimental
robot
main control
control system
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2022/000018
Other languages
French (fr)
Chinese (zh)
Inventor
谈斯聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to AU2022217204A priority Critical patent/AU2022217204A1/en
Priority to CN202280018415.4A priority patent/CN117616497A/en
Publication of WO2022166505A1 publication Critical patent/WO2022166505A1/en
Anticipated expiration
Current legal status: Ceased

Classifications

    • G10L15/22: Speech recognition; procedures used during a speech recognition process, e.g. man-machine dialogue
    • B25J11/00: Manipulators not otherwise provided for
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; safety devices combined with or specially adapted for use in connection with manipulators
    • B25J9/16: Programme-controlled manipulators; programme controls
    • B25J9/1664: Programme controls characterised by programming, planning systems for manipulators, characterised by motion, path, trajectory planning
    • B25J9/1666: Avoiding collision or forbidden zones
    • B25J9/1679: Programme controls characterised by the tasks executed
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems
    • G10L13/02: Speech synthesis; methods for producing synthetic speech; speech synthesisers
    • G10L2015/223: Execution procedure of a spoken command

Definitions

  • The invention relates to the field of artificial-intelligence robots, and in particular to multi-scenario experiments such as biological, physical, medical and chemical experiments.
  • Artificial-intelligence robot technology combining biology, physics, chemistry and medicine is widely used in biological, physical, medical, chemical and other multi-scenario experiments and inspections.
  • A remote and autonomous experimental robot device offering voice control, remote control, autonomous operation and added experiment-management functions has become a market demand.
  • An administrator remotely controls the experimental machine through remote voice commands, remotely monitors the experimental environment, and efficiently manages experimental steps, experimental personnel and experimental consumables.
  • Using robot arms and cameras, machine vision and various intelligent identification methods, the device realizes remote and autonomous experiments, monitoring, intelligent data identification, data analysis, long-distance experiments and isolation experiments.
  • The invention adds integrated functions of filtration, oscillator stirring, heating, cooling, drying, cell disruption and biosensor-monitored experiments, together with remote-control and autonomous-operation experiments.
  • The purpose of the present invention is to overcome the above-mentioned shortcomings and deficiencies of the prior art and to provide voice interaction, remote and autonomous control, intelligent image recognition of experimental samples under a microscope, and auxiliary recognition of microorganisms such as bacteria and viruses and of various cell types.
  • The robot arm, moved on tracks or wheels, grabs objects, scans codes and places items.
  • The integrated robot device is connected to the experiment management system to realize voice interaction and voice commands between the remote user, the robot and the robot-side user; the mobile robot arm picks up experimental utensils and scans codes, and the device integrates filtration, oscillator stirring, heating, cooling, drying, cell disruption, biosensor-monitored experiments, remote control and autonomous operation.
  • The invention provides a visual identification method that can identify microorganisms such as bacteria and viruses, as well as various cell structures, in the experimental scene.
  • An information-collection device collects and manages face data, biological information and RFID information, and the vision camera identifies the experimental machine by recognizing digital codes, text, two-dimensional codes, color labels, special markers and so on.
  • The invention provides a dual control method combining remote control and robot autonomy.
  • The present invention further provides remote and autonomous control and a robot-arm action-planning method.
  • The present invention further provides speech methods including remote user-robot-local user voice interaction, voice commands, speech recognition and speech synthesis.
  • The invention provides a code-scanning device for scanning codes and for identifying and managing experimental consumables, experimental personnel and experimental devices.
  • The invention provides an experiment management system used for queries, experiment reservation, real-time experiment supervision and observation, management of experiment utensils, experiment logs, consumable samples and experiment personnel, remote robotic-arm control, voice interaction, instructions and calls.
  • The present invention solves the above problems; the technical solutions adopted to realize these functions are as follows.
  • A remote and autonomous experimental robot device, management system and method, characterized in that the remote and autonomous experimental robot device includes:
  • a robot main control system, which is used to control the robot.
  • The robot main control system controls the communication of the robot nodes and drives and actuates each connected hardware device.
  • The robot node communication module realizes publisher-receiver communication between nodes through messages, services, actions and other communication methods.
  • the microscope visual acquisition and recognition module, connected with the robot main control system, used for visual acquisition of experimental samples; the pictures under the microscope assist the intelligent identification of microorganisms and cells.
  • the incubator device, which includes a dispenser, test tubes, beakers and other devices and is used for the culture, filtration, separation, centrifugation, cell disruption, extraction, biosynthesis, precipitation and drying of cells, microorganisms and other experimental samples.
  • the oscillator device is connected with the main control system of the robot, and the oscillator device includes: a stirring rod and an oscillatory device, which are used for shaking and stirring in the experiment of the mobile robot arm.
  • the heating device is connected with the main control system of the robot for heating the experimental sample.
  • the cooling device is connected to the main control system of the robot for cooling the experimental samples.
  • the drying device is connected with the main control system of the robot, and the drying device includes one or more of: a microwave drying device, a freeze drying device, a fluidized-bed drying device, an infrared drying device, an airflow drying device, a spray drying device, and a chamber drying device.
  • the stainer device is connected to the main control system of the robot and is used for staining experimental samples.
  • the filter device includes one or more of vacuum filtration device, centrifugal filtration device, tubular centrifugal device, disc centrifugal device and ultracentrifugation device.
  • the cell crushing device, connected with the main control system of the robot, is used for various crushing methods, including: the chemical method, mechanical method, enzymatic hydrolysis method, alkali treatment method, osmotic shock method, and lipolysis method.
  • Mechanical methods include: ultrasonic, high pressure homogenization, grinding, bead grinding.
  • the extractor is connected to the main control system of the robot, and the extractor device includes: an extractor, a separation tank, an expansion valve, and a compressor.
  • the crystallization device is connected with the main control system of the robot and is used for crystallization.
  • the multi-sensor device is connected to the main control system of the robot.
  • the multi-sensor device includes: nano biosensors, enzyme biosensors, biological pseudo-array chips, microfluidic chips, DNA sensors, immune biosensors, gas sensors, ion sensors, photoelectric sensors, strain and piezoresistive sensors, inductive sensors, capacitive sensors, Hall sensors, piezoelectric sensors, impedance sensors, semiconductor sensors, acoustic sensors, thermal sensors, electrochemical sensors, and photosensitive sensors.
  • the 360-degree rotating table and scale are connected to the main control system of the robot for 360-degree observation of experimental details, and the scale is used to measure the experimental objects.
  • the vision device and magnifying device are connected with the mobile robot arm and the robot main control system, and are used to identify numbers and color labels and, according to the shape and position of the vessels, to identify the steps in the experimental process, the vessels in the process, the location of the reactor, and so on.
  • the mobile device is connected with the main control system of the robot, the visual camera, and the obstacle avoidance device.
  • the mobile device includes wheel-type movement and crawler-type movement, which are used to move the robot arm.
  • The mobile robot arm and the wheel-type or crawler-type base can be detached, so the mobile base and the robot body can be used independently.
  • the robot arm, connected with the main control system of the robot and the camera, is used for grasping, picking up, taking and placing target items, scanning codes, and organizing and placing items.
  • the improved neural network method is used to adaptively learn and adjust the parameters of the robotic arm to achieve autonomous robotic arm action planning, and to use the robot body control and remote user control to adjust the robotic arm planning parameters.
  • the voice device is connected with the main control system of the robot, and the voice module includes: a directional voice recognition device and a microphone. It is used for voice interaction between remote users and experimental devices, experimental guidance, voice commands, voice inquiry of experimental steps, and knowledge quiz.
  • the multimedia touch screen is connected with the main control system of the robot, and the multimedia touch screen is used for experimental steps, experimental process display, demonstration, experimental learning guidance, etc.
  • the scanning code information collection device is connected with the main control system of the robot, and the scanning code information collection device includes: barcode, two-dimensional code, biological information collector, and RFID information collector. It is used to manage experimental consumables, experimental samples, experimental personnel, and manage experimental steps and related information using barcodes and QR codes.
  • the visual recognition module is connected with the main control system of the robot, the camera, and the visual recognition amplifier, and the visual recognition module includes a camera and an amplifier. It is used to collect and publish image information, to identify people's face information, experimental biological reaction devices, color labels, experimental vessel information, to locate target objects, target people, and their locations, by identifying color, digital code, text, two-dimensional code, special identification and other comprehensive information.
  • the main control unit can manage the experimental personnel, experimental consumables and experimental items under the video camera in each experimental scene.
  • the microscope device is connected with the main control system of the robot; through the improved machine learning method and the improved deep learning method, features such as the contour, shape, structure, color and texture of sample images of cells and microorganisms are extracted to intelligently identify the types of microorganisms such as bacteria and viruses, and image parameters are learned and trained to assist in identifying experimental samples; pictures collected under the microscope assist the intelligent identification of microorganisms and cells.
  • the robotic arm is connected with the main system and the visual camera; controlled by the action planning module of the main controller, it recognizes targets through vision and uses multiple robotic arms to grab, pick and place target items and to scan codes on experimental samples and experimental consumables.
  • For action planning, the robot arm action planning module configures the position and angle parameters of the arm, wrist and claw to plan grabbing, moving and placing; the robot arm then moves, grabs, places, scans codes, and sorts and places items.
  • Configuring the action parameters of the robotic arm includes adaptive learning and adjustment of parameters as well as remote control and adjustment of the parameters of the robotic arm.
  • the oscillator device is connected with the main control system of the robot, and the oscillator device includes: a stirring rod and a shaking device, which are used for the shaking and stirring of the mobile robot arm experiment.
  • Shaking and stirring are completed by the shaking device.
  • the parameters for setting the shaking device include: the number of times of shaking, the time of shaking, the strength of shaking, and the method of shaking.
  • the heating device is connected with the main control system of the robot and is used for heating the experimental sample.
  • The parameters of the heating device include: heating temperature, heating time, heating position, range, etc.
  • the cooling device is connected with the main control system of the robot and is used for cooling the experimental sample. Cooling device parameters include: cooling temperature, cooling time, etc.
  • the drying device is connected with the main control system of the robot and is used for drying the experimental samples.
  • the drying device includes one or more of microwave drying device, freeze drying device, fluidized bed drying device, infrared drying device, airflow drying device, spray drying device, and chamber drying device. Parameters for drying samples include: drying time, drying method, drying intensity, etc.
  • the filtering device is connected with the main control system of the robot for filtering.
  • the filter device includes one or more of a vacuum suction filtration device, a centrifugal filter device, a tubular centrifugal device, a disc-type centrifugal device, and an ultracentrifugation device.
  • the cell crushing device is connected with the main control system of the robot and is used for cell crushing.
  • the various crushing devices correspond to various crushing methods including: chemical method, mechanical method, enzymatic hydrolysis method, alkali treatment method, osmotic shock method, and fat dissolving method.
  • Mechanical methods include: ultrasonic, high pressure homogenization, grinding, bead grinding.
  • the voice module is connected with the main control system of the robot, and the voice module includes a directional voice recognition device and a microphone.
  • By configuring the directional voice recognition device, the microphone and other parameters, and using voice recognition, voice wake-up and speech-to-text conversion technology, a language library is configured for remote users to enable remote-user communication: voice interaction between remote users and the robot, voice commands, voice inquiries, and voice knowledge quizzes.
  • the scanning code information collection device is connected with the main control system of the robot, and the scanning code information collection device includes: scanning code information collection, scanning and reading devices.
  • The scan-code information collection, scanning and reading device connects the robot main system with the camera, scanner, reader and information collection/reading devices; through an improved machine learning algorithm and an improved neural network method, it intelligently identifies two-dimensional codes, digital codes, biological information, RFID information and other information for managing personnel, items, equipment and so on.
  • the 360-degree rotating table, scale and visual amplifier are connected to the main control system of the robot for 360-degree observation of the details of the experimental items.
  • the 360-degree rotating table and scale are used for rotation, assisting the visual recognition of the camera, and the amplifier device observes the details of the experiment in 360 degrees.
  • the dyer device is connected with the main control system of the robot and is used for dyeing experimental samples.
  • the parameters in the configuration file of each experimental device node include: frequency, maximum and minimum linear velocity, maximum and minimum rotational speed, maximum linear acceleration in the x and y directions, maximum angular velocity, error from the target direction, error from the target position, the weight of reaching the target position, the weight of avoiding obstacles, and other parameters.
  • S14: initialize placement and grasping, the position of the object and the grasping posture, and generate the grasping pose (initialize the object to be grasped and create the opening and closing posture of the gripper).
  • S16: build the grasp pose list, change the posture and generate the grasping action (set the grasping posture and grasp ID number, set the objects that are allowed to be touched, and set the grasp list).
  • the improved machine learning method for classifying and analyzing abnormal data of microorganisms and cells includes the following steps (a minimal sketch is given after this list):
  • S2. Extract the characteristics of microorganism and cell specimens, including features such as color, shape, outline and size.
  • The characteristic values of the image, such as color, shape and outline size, are entered as the characteristic values of the detection item.
  • S4. Classify and identify microorganisms and cell types (neutrophils, eosinophils, basophils, lymphocytes, monocytes), calculate and analyze their proportions, and identify the microorganisms and cells.
  • An improved neural-network-based method for identifying microorganism and cell samples comprises the following steps:
  • the experiment management system is connected with the robot main control system, the voice module and the robot arm, and is used for browsing and querying, experiment reservation, real-time experiment supervision and observation, management of experiment utensils, experiment logs, consumable samples, experimenters, and so on.
  • An experiment management system includes: a browsing module, a query module, an experiment reservation module, a real-time experiment supervision and observation module, a module for managing experiment utensils, a module for managing experiment logs, a module for managing consumable samples, a module for managing experimenters, a module for controlling a remote robotic arm, a visual display module, and a voice call module.
  • Fig. 1 is the schematic diagram of the robot modules in this application, in which:
  • 101 - robot main control system module; 102 - robot arm action planning module; 103 - camera vision module;
  • 104 - mobile module; 105 - voice module; 106 - multimedia touch screen module;
  • 107 - scan-code information module; 108 - oscillation device; 109 - incubator module;
  • 110 - heating/cooling/drying module; 111 - extraction/filtration/crystallization module; 112 - microscope module;
  • Fig. 2 is the structural composition diagram of the robot of the present application, in which:
  • 201 - main control system; 202 - multimedia touch screen; 203 - vision module; 204 - amplifier;
  • 217 - microscope device; 218 - remote client; 219 - staining device; 220 - voice device;
  • This solution mainly realizes human-robot voice interaction through the parameter settings of the directional voice recognition device and the microphone module, and uses voice recognition, speech-to-text conversion, voice wake-up and other methods to provide voice interaction, voice commands and voice queries of item information.
  • This solution mainly uses the camera, together with the improved machine learning method and the deep neural network method, to identify the color, shape, outline and other comprehensive characteristics of objects, classify the experimental reactors, intelligently identify color, number, letter and text experiment labels, and return experimenter and experiment-utensil information, thereby resolving the experiment information.
  • The robot uses information acquisition and reading devices such as code scanners to manage experimenters and experimental samples.
  • This scheme mainly uses the position information returned by the robotic-arm module to plan actions such as grabbing, scanning codes, placing, and operating the experimental reactor, realizing autonomous grasping, scanning, moving, placing and operation of the experimental reactor. Robots replace people in completing repetitive tasks, which improves efficiency, saves labor costs, and greatly reduces the pressure of human work.
  • an embodiment of a remote and autonomous experimental robot device, management system and method includes:
  • the robot main control system 201 is used to control the robot.
  • The robot main control system 201 controls the communication of the robot nodes and drives and operates each connected hardware device.
  • the communication module of robot nodes realizes inter-node publishing and receiver communication through messages, services, actions and other communication methods.
  • the microscope 217 is connected to the robot main control system 201 and is used for visual collection of experimental samples, and pictures under the microscope assist in the intelligent identification of microorganisms and cells.
  • the incubator device includes: a dispenser, test tubes, beakers and other devices, used for the culture, filtration, separation, centrifugation, cell disruption, extraction, biosynthesis, precipitation and drying of cells, microorganisms and other experimental samples.
  • the oscillator device includes: a stirring rod, a shaking device, which is used for shaking and stirring in the experiment of the moving robot arm.
  • the heating device 214 is connected to the robot main control system 201 for heating the experimental sample.
  • the cooling device 215 is connected to the robot main control system 201 and is used for cooling the experimental sample.
  • the drying device 216 includes one or more of: a microwave drying device, a freeze drying device, a fluidized-bed drying device, an infrared drying device, an airflow drying device, a spray drying device, and a chamber drying device, and is used for drying experimental samples.
  • the stainer device 219 is used to stain the sample.
  • the filter device 208 is connected to the robot main control system 201; the filter device 208 includes one or more of: a vacuum suction filter device, a centrifugal filter device, a tubular centrifugal device, a disc-type centrifugal device, and an ultracentrifugation device, and is used for filtering experimental samples.
  • the cell crushing device 212 is connected to the robot main control system 201 and is used for various crushing methods, including chemical method, mechanical method, enzymatic hydrolysis method, alkali treatment method, osmotic shock method, and lipolysis method.
  • Mechanical methods include: ultrasonic, high pressure homogenization, grinding, bead grinding.
  • the biosensor device 213 is connected to the robot main control system 201.
  • the biosensor device 213 includes: nano biosensors, enzyme biosensors, biological pseudo-array chips, microfluidic chips, DNA sensors, immune biosensors, gas sensors, and ion sensors.
  • the 360-degree rotating table and scale are connected to the main control system 201 of the robot for 360-degree observation of experimental details and scales to measure the experimental objects.
  • the vision device 203 and the magnifying vision device 204 are connected with the mobile robot arm 206 and the robot main control system 201, and are used to identify numbers and color labels and, according to the shape and position of the vessels, to identify the steps in the experimental process, the utensils in the process, the location of the reactor, and so on.
  • the mobile device 205 is connected with the robot main control system 201, the visual camera 202, and the obstacle avoidance device 205.
  • the mobile device 205 includes wheel-type movement and crawler-type movement for moving the robot arm; the mobile robot arm and the wheel-type or crawler-type base can be detached.
  • the robot arm 206 is connected to the robot main control system 201 and the camera 202, and is used for the robot arm 206 to grasp, pick up, take and place target items, scan codes, and organize and place items.
  • the action planning method of the robotic arm 206 includes: using the improved neural network method, adaptively learning and adjusting the parameters of the robotic arm to achieve autonomous robotic arm action planning, and using the robot body control and remote user control to adjust the planning parameters of the robotic arm.
  • the voice device 220 is connected to the robot main control system 201, and the voice module 220 includes: a directional voice recognition device and a microphone. It is used for voice interaction between remote users and experimental devices, experimental guidance, voice commands, voice inquiry of experimental steps, and knowledge quiz.
  • the multimedia touch screen 202 is connected to the robot main control system 201, and the multimedia touch screen 202 is used for experimental steps, experimental process display, demonstration, experimental learning guidance, and the like.
  • the scanning code information collecting device 207 includes: a barcode, a two-dimensional code, a biological information collector, and an RFID information collector. It is used to manage experimental consumables, experimental samples, experimental personnel, and manage experimental steps and related information using barcodes and QR codes.
  • the extractor device 209 includes: an extractor, a separation tank, an expansion valve, and a compressor.
  • the X-ray crystallizing device 210 is connected to the robot main control system 201 and is used for crystallization.
  • the experimental robot device visually recognizes the experimental devices and the colors of the experiment labels, and the robotic arm moves to grab the experimental devices; the steps are as follows:
  • the robotic arm 206 grabs the object from the 360-degree rotating observation table 207, moves to the position of the biological experiment reactor under the located experiment label, grabs the target, the experimental sample and the experimental consumables, operates and presses the specially marked button of the biological experiment reactor, and operates the bioreactor; at set time intervals, the visual amplifier 203 is used to observe the 360-degree rotating observation table and the experimental sample in the incubator 218, record the experimental process, and assist in recording the dynamic real-time experimental data and its changes, so as to achieve real-time data classification and analysis and the identification of microorganisms, bacteria and other experimental samples in the microscope pictures.
  • the robotic arm 206 grabs the experimental sample from the 360-degree rotating observation platform 207 and, according to the experimental steps, moves it to the biosensor device 213 (including nano biosensors, enzyme biosensors, biological pseudo-array chips, microfluidic chips, DNA sensors, immune biosensors, gas sensors and ion sensors), where the experimental data are detected according to the planned actions.
  • the robot main control system 201 module and the visual recognition module 203 interact with the robot arm 206 for target setting, recognition and positioning; the robot arm 206 then plans its actions to grab, move, scan codes, place the experimental vessels and experimental samples, and press buttons.
  • the embodiment of the robot arm 206 of the present invention is not limited to this, and the specific implementation steps are as follows:
  • Through the management system and the voice module 220 of the robot main control system 201, the user can call, issue voice commands, interact by voice, and browse and query the experimental data.
  • Use the experiment reservation module to reserve an experiment and remind the experimenter according to the time of the appointment.
  • The experimental steps, the experimental process, remote experimental guidance, remote communication with the experimental user, contact guidance, experimental detection and monitoring of the experiment are displayed.
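
As a loose illustration of the classification step described above (S2-S4), the following is a minimal sketch, not the patented method itself: it assumes that per-cell feature vectors (e.g. color, shape, outline and size statistics) have already been extracted from microscope images, and it uses an off-the-shelf random-forest classifier from scikit-learn in place of the "improved machine learning method". All names and feature choices are illustrative assumptions.

```python
# Hypothetical sketch: classify cell types from pre-extracted image features
# and report the proportion of each class, as in steps S2-S4 above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier  # stand-in for the "improved" method

CELL_TYPES = ["neutrophil", "eosinophil", "basophil", "lymphocyte", "monocyte"]

def train_classifier(features: np.ndarray, labels: np.ndarray) -> RandomForestClassifier:
    """features: (n_samples, n_features) array of color/shape/outline/size values."""
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(features, labels)
    return clf

def classify_and_count(clf: RandomForestClassifier, features: np.ndarray) -> dict:
    """Predict a cell type for each detected cell and return the class proportions."""
    predictions = clf.predict(features)
    total = len(predictions)
    return {name: float(np.sum(predictions == i)) / total
            for i, name in enumerate(CELL_TYPES)}

if __name__ == "__main__":
    # Toy data standing in for extracted features (labels 0..4 index CELL_TYPES).
    rng = np.random.default_rng(0)
    X_train, y_train = rng.normal(size=(100, 6)), rng.integers(0, 5, size=100)
    X_new = rng.normal(size=(20, 6))
    model = train_classifier(X_train, y_train)
    print(classify_and_count(model, X_new))
```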

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Apparatus Associated With Microorganisms And Enzymes (AREA)
  • Manipulator (AREA)

Abstract

A robot apparatus for a remote and autonomous experiment, and a management system and method. The apparatus comprises: a robot master control system (101), a robot arm action planning module (102), a camera visual module (103), a mobile module (104), a speech module (105), a multimedia touch screen module (106), a code scanning information module (107), an oscillator module (108), an incubator module (109), a heating/cooling/drying module (110), an extraction, filtration and crystallization module (111), a microscope module (112), a 360-degree rotary table scale/magnification module (113) and a multi-sensor module (114). By means of the apparatus, remote and autonomous experiments, monitoring, intelligent data identification and data analysis, experiments at a distance and isolated experiments can be realized, and the apparatus can be widely applied to experiments and inspections of multiple scenarios, such as biological experiments, physical experiments, medical experiments and chemical experiments.

Description

A remote and autonomous experimental robot device, management system and method

Technical field

The invention relates to the field of artificial-intelligence robots, and in particular to multi-scenario experiments such as biological, physical, medical and chemical experiments. The experimental robot uses remote and autonomous positioning and movement, voice, visual recognition under a microscope, robot-arm action planning and other artificial-intelligence robot technologies drawn from biology, physics, chemistry and medicine, and is widely applicable to biological, physical, medical, chemical and other multi-scenario experiments and inspections.

Background art

With the promotion of artificial-intelligence robots in the fields of biology, physics, medicine and chemistry, biological, physical, medical and chemical experiments and monitoring remain time-consuming, the precision of experimental operations is low, and misoperations lead to experimental failures. Because of various human factors, accuracy is poor, and experimental operations vary considerably in their steps, details, completion and efficiency depending on the professional ability of the personnel. Standardized, precise and highly efficient experimental machine devices have therefore become an important topic.

The design and development of a remote and autonomous experimental robot device with voice control, remote control, autonomous operation and added experiment-management functions has become a market demand. With such a robot device and management platform, an administrator remotely controls the experimental machine, issues remote voice commands, remotely monitors the experimental environment, and efficiently manages experimental steps, experimental personnel, experimental consumables and experimental samples; the platform draws on robot theory, artificial intelligence, and practical techniques from biology, chemistry, medicine and physics. Using the robot arms and cameras carried by the robot, machine vision and various intelligent identification methods, remote and autonomous experiments, monitoring, intelligent data identification, data analysis, long-distance experiments and isolation experiments are realized.

Most products currently on the market are single biological reaction machines; there are as yet no devices or experiment management systems providing remote control, autonomous operation, remote supervision of the experimental environment and experimental steps, or management of experimental personnel and experimental samples, nor experimental robots and management systems implementing voice commands, voice interaction and robot-arm operation of experiments. The invention adds integrated functions of filtration, oscillator stirring, heating, cooling, drying, cell disruption and biosensor-monitored experiments, together with remote-control and autonomous-operation experiments.

The invention solves remote control and autonomous operation of experiments with a mobile robot arm, realizing voice commands, voice interaction, visual recognition under a microscope, filtration, oscillator stirring, heating, cooling, drying, cell disruption, multi-biosensor monitoring and so on. It addresses the poor accuracy caused by various human factors and the large differences in experimental detail, completion and efficiency caused by differences in personnel expertise. By realizing the integrated functions of filtration, oscillator stirring, heating, cooling, drying, cell disruption and biosensor-monitored experiments under remote control and autonomous operation, it improves the usefulness of intelligent robots in biological, physical, medical and chemical experiments.

Summary of the invention

The purpose of the present invention is to overcome the above-mentioned shortcomings and deficiencies of the prior art and to provide voice interaction, remote and autonomous control, intelligent image recognition of experimental samples under a microscope, and auxiliary recognition of microorganisms such as bacteria and viruses and of various cell types. An integrated robot device in which the robot arm, moved on tracks and wheels, grabs objects, scans codes and places items is connected to the experiment management system, realizing voice interaction and voice commands between the remote user, the robot and the robot-side user; the mobile robot arm picks up experimental utensils and scans codes, and the device integrates filtration, oscillator stirring, heating, cooling, drying, cell disruption, biosensor-monitored experiments, remote control and autonomous operation.

The invention provides a visual identification method that can identify microorganisms such as bacteria and viruses, as well as various cell structures, in the experimental scene. An information-collection device collects and manages face data, biological information and RFID information, and the vision camera identifies the experimental machine by recognizing digital codes, text, two-dimensional codes, color labels, special markers and so on.

The invention provides a dual control method combining remote control and robot autonomy.

Further, the present invention provides remote and autonomous control and a robot-arm action-planning method.

Still further, the present invention provides speech methods including remote user-robot-local user voice interaction, voice commands, speech recognition and speech synthesis. The invention provides a code-scanning device for scanning codes and for identifying and managing experimental consumables, experimental personnel and experimental devices.

The invention provides an experiment management system used for queries, experiment reservation, real-time experiment supervision and observation, management of experiment utensils, experiment logs, consumable samples and experiment personnel, remote robotic-arm control, voice interaction, instructions and calls.

The present invention solves the above problems; the technical solutions adopted to realize these functions are as follows.

A remote and autonomous experimental robot device, management system and method, characterized in that the remote and autonomous experimental robot device includes:

a robot main control system, which is used to control the robot. The robot main control system controls the communication of the robot nodes and drives and actuates each connected hardware device. The robot node communication module realizes publisher-receiver communication between nodes through messages, services, actions and other communication methods.

a microscope visual acquisition and recognition module, connected with the robot main control system, used for visual acquisition of experimental samples; the pictures under the microscope assist the intelligent identification of microorganisms and cells.

an incubator device, which includes a dispenser, test tubes, beakers and other devices and is used for the culture, filtration, separation, centrifugation, cell disruption, extraction, biosynthesis, precipitation and drying of cells, microorganisms and other experimental samples.

an oscillator device, connected with the robot main control system, which includes a stirring rod and a shaking device and is used for shaking and stirring in experiments with the mobile robot arm.

a heating device, connected with the robot main control system, used for heating the experimental sample.

a cooling device, connected with the robot main control system, used for cooling the experimental sample.

a drying device, connected with the robot main control system, which includes one or more of: a microwave drying device, a freeze drying device, a fluidized-bed drying device, an infrared drying device, an airflow drying device, a spray drying device, and a chamber drying device.

a stainer device, connected with the robot main control system, used for staining experimental samples.

a filter device, which includes one or more of: a vacuum suction filtration device, a centrifugal filtration device, a tubular centrifugal device, a disc-type centrifugal device, and an ultracentrifugation device.

a cell crushing device, connected with the robot main control system, used for various crushing methods, including: the chemical method, mechanical method, enzymatic hydrolysis method, alkali treatment method, osmotic shock method, and lipolysis method. Mechanical methods include: ultrasonic, high-pressure homogenization, grinding, and bead milling.

an extractor, connected with the robot main control system; the extractor device includes: an extractor, a separation tank, an expansion valve, and a compressor.

a crystallization device, connected with the robot main control system, used for crystallization.

a multi-sensor device, connected with the robot main control system, which includes: nano biosensors, enzyme biosensors, biological pseudo-array chips, microfluidic chips, DNA sensors, immune biosensors, gas sensors, ion sensors, photoelectric sensors, strain and piezoresistive sensors, inductive sensors, capacitive sensors, Hall sensors, piezoelectric sensors, impedance sensors, semiconductor sensors, acoustic sensors, thermal sensors, electrochemical sensors, and photosensitive sensors.

a 360-degree rotating table and scale, connected with the robot main control system, used for 360-degree observation of experimental details; the scale is used to measure the experimental objects.

a vision device and magnifying device, connected with the mobile robot arm and the robot main control system, used to identify numbers and color labels and, according to the shape and position of the vessels, to identify the steps in the experimental process, the vessels in the process, the location of the reactor, and so on.

a mobile device, connected with the robot main control system, the vision camera and the obstacle-avoidance device, which includes wheel-type movement and crawler-type movement and is used to move the robot arm. The mobile robot arm and the wheel-type or crawler-type base can be detached, so the mobile base and the robot body can be used independently.

a robot arm, connected with the robot main control system and the camera, used for grasping, picking up, taking and placing target items, scanning codes, and organizing and placing items. An improved neural network method is used to adaptively learn and adjust the parameters of the robot arm to achieve autonomous robot-arm action planning, and robot-body control and remote user control are used to adjust the robot-arm planning parameters.

a voice device, connected with the robot main control system; the voice module includes a directional voice recognition device and a microphone, and is used for voice interaction between remote users and the experimental device, experimental guidance, voice commands, voice inquiry of experimental steps, and knowledge quizzes.

a multimedia touch screen, connected with the robot main control system, used for showing experimental steps, displaying and demonstrating the experimental process, guiding experimental learning, and so on.

a scan-code information collection device, connected with the robot main control system, which includes: barcode and two-dimensional-code readers, a biological information collector, and an RFID information collector, and is used to manage experimental consumables, experimental samples, experimental personnel, and experimental steps and related information by means of barcodes and QR codes.

a visual recognition module, connected with the robot main control system, the camera and the visual recognition amplifier; the visual recognition module includes a camera and an amplifier and is used to collect and publish image information, to recognize people's faces, experimental biological reaction devices, color labels and experimental vessel information, and to locate target objects and target people and their positions by recognizing colors, digital codes, text, two-dimensional codes, special markers and other comprehensive information. Through the robot system, the main control unit manages the experimental personnel, experimental consumables and experimental items under the video camera in each experimental scene.
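
As a minimal sketch of the color-label recognition described above, and not the patented recognition method, the following uses OpenCV's standard HSV thresholding and contour extraction to locate a colored experiment label in a camera frame. It assumes OpenCV 4.x (where findContours returns two values); the HSV bounds and function names are illustrative.

```python
# Hypothetical sketch: locate a colored experiment label in a camera frame with OpenCV.
import cv2
import numpy as np

def find_color_label(frame_bgr: np.ndarray, lower_hsv, upper_hsv):
    """Return the bounding box (x, y, w, h) of the largest region within the given HSV
    range, or None if nothing matches. The HSV bounds must be calibrated per label color."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv, dtype=np.uint8),
                       np.array(upper_hsv, dtype=np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)

# Example: look for a red label in one frame from the robot's camera.
# cap = cv2.VideoCapture(0)
# ok, frame = cap.read()
# if ok:
#     print(find_color_label(frame, (0, 120, 70), (10, 255, 255)))
```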

The microscope device is connected with the robot main control system; through an improved machine learning method and an improved deep learning method, features such as the contour, shape, structure, color and texture of sample images of cells and microorganisms are extracted to intelligently identify the types of microorganisms such as bacteria and viruses, and image parameters are learned and trained to assist in identifying experimental samples; pictures collected under the microscope assist the intelligent identification of microorganisms and cells.

The robot arm is connected with the main system and the vision camera; controlled by the action planning module of the main controller, it recognizes targets through vision and uses multiple robot arms to grab, pick and place target items and to scan codes on experimental samples and experimental consumables. For action planning, the robot-arm action planning module configures the position and angle parameters of the arm, wrist and claw to plan grabbing, moving and placing; the robot arm then moves, grabs, places, scans codes, and sorts and places items. Configuring the action parameters of the robot arm includes adaptive learning and adjustment of parameters as well as remote control and adjustment of the robot-arm parameters.
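
The following is a minimal sketch of how the arm, wrist and gripper parameters mentioned above might be grouped and turned into a simple pick-and-place waypoint sequence. The class and field names (ArmPose, grip_open, etc.) are illustrative assumptions and do not correspond to any specific robot or library API.

```python
# Hypothetical sketch: configuring position and angle parameters for a pick-and-place action.
from dataclasses import dataclass
from typing import List

@dataclass
class ArmPose:
    x: float          # target end-effector position (metres)
    y: float
    z: float
    wrist_deg: float  # wrist rotation angle
    grip_open: float  # gripper opening, 0.0 (closed) .. 1.0 (fully open)

def plan_pick_and_place(pick: ArmPose, place: ArmPose, clearance: float = 0.05) -> List[ArmPose]:
    """Return a simple waypoint sequence: approach above the object, close the gripper,
    lift, move above the target location, lower, and release."""
    above_pick = ArmPose(pick.x, pick.y, pick.z + clearance, pick.wrist_deg, 1.0)
    grasp      = ArmPose(pick.x, pick.y, pick.z, pick.wrist_deg, 0.0)
    lift       = ArmPose(pick.x, pick.y, pick.z + clearance, pick.wrist_deg, 0.0)
    above_drop = ArmPose(place.x, place.y, place.z + clearance, place.wrist_deg, 0.0)
    release    = ArmPose(place.x, place.y, place.z, place.wrist_deg, 1.0)
    return [above_pick, grasp, lift, above_drop, release]

# waypoints = plan_pick_and_place(ArmPose(0.30, 0.10, 0.05, 90.0, 1.0),
#                                 ArmPose(0.10, 0.40, 0.05, 90.0, 1.0))
```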

The oscillator device is connected with the robot main control system; it includes a stirring rod and a shaking device and is used for shaking and stirring in experiments with the mobile robot arm, the shaking and stirring being completed by the shaking device. The parameters for setting the shaking device include: the number of oscillation/stirring cycles, the oscillation/stirring time, the oscillation/stirring intensity, and the oscillation/stirring method.

The heating device is connected with the robot main control system and is used for heating the experimental sample. The parameters of the heating device include: heating temperature, heating time, heating position, range, etc.

The cooling device is connected with the robot main control system and is used for cooling the experimental sample. Cooling device parameters include: cooling temperature, cooling time, etc.

The drying device is connected with the robot main control system and is used for drying the experimental samples. The drying device includes one or more of: a microwave drying device, a freeze drying device, a fluidized-bed drying device, an infrared drying device, an airflow drying device, a spray drying device, and a chamber drying device. Parameters for drying samples include: drying time, drying method, drying intensity, etc.
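
A minimal sketch of one way to group the shaking, heating, cooling and drying parameters listed above into a single experiment-step configuration. The field names, default values and ranges are assumptions chosen only to mirror the parameter lists in the text.

```python
# Hypothetical sketch: grouping the device parameters named above into one step configuration.
from dataclasses import dataclass, field

@dataclass
class ShakingParams:
    cycles: int = 10          # number of oscillation/stirring cycles
    duration_s: float = 60.0  # oscillation time in seconds
    intensity: int = 3        # 1 (gentle) .. 5 (vigorous)
    method: str = "orbital"

@dataclass
class HeatingParams:
    temperature_c: float = 37.0
    duration_s: float = 600.0

@dataclass
class CoolingParams:
    temperature_c: float = 4.0
    duration_s: float = 300.0

@dataclass
class DryingParams:
    method: str = "freeze"    # e.g. microwave, freeze, fluidized-bed, infrared, spray, chamber
    duration_s: float = 1800.0
    intensity: int = 2

@dataclass
class StepConfig:
    shaking: ShakingParams = field(default_factory=ShakingParams)
    heating: HeatingParams = field(default_factory=HeatingParams)
    cooling: CoolingParams = field(default_factory=CoolingParams)
    drying: DryingParams = field(default_factory=DryingParams)

# Example: a step that only overrides the heating parameters.
config = StepConfig(heating=HeatingParams(temperature_c=42.0, duration_s=900.0))
```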

The filtering device is connected with the robot main control system and is used for filtering. The filter device includes one or more of: a vacuum suction filtration device, a centrifugal filtration device, a tubular centrifugal device, a disc-type centrifugal device, and an ultracentrifugation device.

The cell crushing device is connected with the robot main control system and is used for cell disruption. The various crushing devices correspond to various crushing methods, including: the chemical method, mechanical method, enzymatic hydrolysis method, alkali treatment method, osmotic shock method, and lipolysis method. Mechanical methods include: ultrasonic, high-pressure homogenization, grinding, and bead milling.

The voice module is connected with the robot main control system and includes a directional voice recognition device and a microphone. By configuring the directional voice recognition device, the microphone and other parameters, and using voice recognition, voice wake-up and speech-to-text conversion technology, a language library is configured for remote users, enabling voice interaction between remote users and the robot, voice commands, voice inquiries, and voice knowledge quizzes.
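
A minimal sketch of mapping recognised speech to robot commands, standing in for the voice-command handling described above. The recognise_speech() stub is a placeholder for whatever speech-to-text engine the voice module actually uses; the command names and phrases are illustrative assumptions.

```python
# Hypothetical sketch: dispatching transcribed speech to robot commands.
from typing import Callable, Dict, Optional

COMMANDS: Dict[str, str] = {
    "start experiment": "START_EXPERIMENT",
    "stop experiment": "STOP_EXPERIMENT",
    "query step": "QUERY_CURRENT_STEP",
    "heat sample": "HEAT_SAMPLE",
    "scan code": "SCAN_CODE",
}

def recognise_speech(audio_bytes: bytes) -> str:
    """Stub: a real system would call its speech-to-text engine here."""
    raise NotImplementedError

def dispatch(text: str, send_to_robot: Callable[[str], None]) -> Optional[str]:
    """Find the first known phrase in the transcribed text and forward the command."""
    lowered = text.lower()
    for phrase, command in COMMANDS.items():
        if phrase in lowered:
            send_to_robot(command)
            return command
    return None  # unrecognised utterance; could fall back to a knowledge-quiz handler

# dispatch("please heat sample number three", print)  # prints "HEAT_SAMPLE"
```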

The scan-code information collection device is connected with the robot main control system and includes scan-code information collection, scanning and reading devices. The scan-code information collection, scanning and reading device connects the robot main system with the camera, scanner, reader and information collection/reading devices; through an improved machine learning algorithm and an improved neural network method, it intelligently identifies two-dimensional codes, digital codes, biological information, RFID information and other information for managing personnel, items, equipment and so on.
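
A minimal sketch of the QR-code part of the scanning described above, using OpenCV's built-in detector rather than the improved algorithms named in the text. The consumables lookup table and its field names are illustrative assumptions.

```python
# Hypothetical sketch: reading a QR code on a consumable and looking it up in a table.
from typing import Optional
import cv2

CONSUMABLES = {"TUBE-0001": {"name": "15 ml test tube", "lot": "A-42"}}

def read_consumable(frame_bgr) -> Optional[dict]:
    """Decode the first QR code found in the frame and return the matching record, if any."""
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame_bgr)
    if not data:
        return None
    return CONSUMABLES.get(data)
```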

The 360-degree rotating table, scale and visual amplifier are connected with the robot main control system and are used for 360-degree observation of the details of the experimental items. The 360-degree rotating table and scale are used for rotation and assist the camera's visual recognition, while the amplifier device observes the details of the experimental items in 360 degrees.

The stainer device is connected with the robot main control system and is used for staining experimental samples.

所述的一种视觉识别实验装置,实验标签颜色,数字,字母,文字,特殊标识的方法,机器臂运动规划方法,所述的方法包括以下步骤:Described a visual recognition experiment device, experiment label color, number, letter, text, method for special identification, robot arm motion planning method, the method includes the following steps:

S1、设置实验场景的对应实验装置参数及其对应的位置参数。S1. Set the corresponding experimental device parameters of the experimental scene and their corresponding position parameters.

S2、输入实验台对应的实验装置,实验标签颜色,数字,字母,文字,特殊标识的数学模型。S2. Input the experimental device corresponding to the experimental bench, the color of the experimental label, the number, the letter, the text, and the mathematical model of the special logo.

S3、抽取实验场景下,器皿的形状,轮廓,结构,颜色,数字,字母,文字,特殊特标识图像,对应的图像特征作为输入值。S3. Extract the shape, outline, structure, color, number, letter, text, and special identification image of the vessel under the experimental scene, and use the corresponding image feature as the input value.

S4. Use an improved weight optimizer to train on the images quickly and obtain output values.
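The disclosure does not name the "improved weight optimizer" of step S4; purely as a sketch of what fast image training with an adaptive optimizer can look like, the snippet below trains a small classifier with Adam in PyTorch. The network size, input resolution, and number of label classes are assumptions.

```python
import torch
import torch.nn as nn

# Assumed: 64x64 RGB crops of vessels/labels, 8 label classes.
NUM_CLASSES = 8

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, NUM_CLASSES),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # adaptive per-weight step sizes
loss_fn = nn.CrossEntropyLoss()

def train_epoch(loader):
    """One pass over the training set; `loader` yields (image_batch, label_batch)
    with images shaped (N, 3, 64, 64) as float tensors."""
    model.train()
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```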

S5. Based on the output shape, outline, structure, color, number, letter, text, and special-marking results, accurately recognize the target, designate it, and locate its position.

S6. Define the experimental steps for the experiment, plan robot arm motions for each step, and, following the motion plan, move the arm to the apparatus and position required by each step; under the main system, the arm is positioned and moved to the designated target apparatus.

S7. The configuration file of each experimental-apparatus node includes parameters such as frequency, maximum and minimum linear velocity, maximum and minimum rotational velocity, maximum linear acceleration in the x and y directions, maximum angular velocity, heading error relative to the target, position error relative to the target, the weight for reaching the target position, and the weight for avoiding obstacles.
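The parameter list in S7 closely resembles the tuning options of a typical local-planner configuration file. A plain-Python sketch of such a node configuration (key names and values are illustrative, not the actual file of this disclosure) might look like this:

```python
# Illustrative node configuration; all keys and values are assumptions.
arm_node_config = {
    "controller_frequency": 10.0,           # Hz
    "max_vel_x": 0.5, "min_vel_x": 0.05,    # m/s, linear velocity limits
    "max_rot_vel": 1.0, "min_rot_vel": 0.2, # rad/s, rotational velocity limits
    "acc_lim_x": 2.5, "acc_lim_y": 2.5,     # m/s^2, max linear acceleration per axis
    "max_angular_vel": 1.0,                 # rad/s, maximum angular velocity
    "yaw_goal_tolerance": 0.05,             # rad, allowed heading error at the target
    "xy_goal_tolerance": 0.01,              # m, allowed position error at the target
    "goal_distance_weight": 0.8,            # weight for reaching the target position
    "obstacle_weight": 0.2,                 # weight for avoiding obstacles
}
```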

S8. In each experimental-apparatus node, configure the robot arm parameters, obstacle positions and sizes, update frequency, publishing frequency, apparatus positions, the icons, colors, and parameters of each device on the bench, and the maximum allowed delay for transforms between coordinate frames.

S9. Set the robot's initial parameters, including the robot ID, the target ID, and the target's position and orientation (pose) message.

S10. Set up motion planning: select joint angles and joint limits, move the arm to the specified joint configuration, and set joint constraints, joint trajectory positions, velocity components, and joint velocities. Set motion constraints, the target trajectory, and velocity settings; execute the planned trajectory; and set joint positions and joint angles.

S11. Set the Cartesian path on the arm and set the robot pose parameters for the objects that can be picked up at the target pose.
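Steps S10 and S11 map naturally onto a motion-planning interface such as the MoveIt Python API; assuming a ROS/MoveIt stack (which this disclosure does not name), a joint-space goal followed by a short Cartesian segment could be sketched as follows. The planning-group name and the numeric targets are assumptions.

```python
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import Pose

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("arm_planning_demo")
arm = moveit_commander.MoveGroupCommander("arm")   # assumed planning-group name

# S10: joint-space goal -- move the arm to a specified joint configuration (radians).
arm.set_joint_value_target([0.0, -0.8, 1.2, 0.0, 0.6, 0.0])  # illustrative values
arm.go(wait=True)

# S11: short Cartesian path toward an assumed vessel position.
start = arm.get_current_pose().pose
waypoint = Pose()
waypoint.position.x = start.position.x + 0.10   # 10 cm forward, illustrative
waypoint.position.y = start.position.y
waypoint.position.z = start.position.z - 0.05   # 5 cm down toward the vessel
waypoint.orientation = start.orientation
plan, fraction = arm.compute_cartesian_path([waypoint], eef_step=0.01, jump_threshold=0.0)
if fraction > 0.95:                              # accept only near-complete paths
    arm.execute(plan, wait=True)
arm.stop()
```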

S12. Set the arm's collision-avoidance matrix and configure the collision-detection module (self-collision checking against other parts of the robot and detection of obstacles in the scene).

S13. Set the arm and gripper parameters for grasping and pick-and-place, and set and match the grasp pose parameters to the target pose.

S14. Initialize the place-and-grasp operation, the object position, and the grasp-pose object, and generate grasp poses (initialize the grasp object and create the open and closed gripper postures). Set the desired parameters for the gripper's approach to and retreat from the target, and set the grasp pose.

S15. Build the list of candidate pose variations to attempt.

S16. Build the grasp-pose list: vary the pose and generate grasp actions (set the grasp pose and grasp ID, specify the objects that may be contacted, and set the grasp list).
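A library-agnostic sketch of S14-S16 could represent the grasp candidates as plain records and enumerate them by varying the wrist orientation; all field names and values below are assumptions.

```python
from dataclasses import dataclass, field
from typing import List
import math

@dataclass
class GraspCandidate:
    grasp_id: int
    approach_distance: float      # m, how far the gripper approaches before closing
    retreat_distance: float       # m, how far it withdraws after grasping
    yaw: float                    # rad, wrist rotation about the vertical axis
    allowed_contacts: List[str] = field(default_factory=lambda: ["target_vessel"])

def generate_grasp_list(num_orientations: int = 8) -> List[GraspCandidate]:
    """S15/S16: enumerate candidate grasp poses by varying the wrist yaw."""
    grasps = []
    for i in range(num_orientations):
        grasps.append(GraspCandidate(
            grasp_id=i,
            approach_distance=0.10,
            retreat_distance=0.10,
            yaw=2 * math.pi * i / num_orientations,
        ))
    return grasps

for g in generate_grasp_list():
    print(g.grasp_id, round(g.yaw, 2))
```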

An improved machine learning method for classifying and analyzing abnormal microorganism and cell data comprises the following steps:

S1. Build mathematical models of the microorganism and cell specimens.

S2. Extract cell features of the microorganism and cell specimens, such as shape, color, outline, and size.

S3. Extract the feature values of the microorganism and cell specimen images (color, shape, outline, size, etc.) and input the feature values of the test items.
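One possible way (among many) to obtain the shape, outline, and color feature values of S2-S3 from a microscope image is classical contour analysis with OpenCV 4; the noise threshold and feature choices below are assumptions.

```python
import cv2
import numpy as np

def extract_cell_features(image_path: str):
    """Return per-cell feature vectors: [area, perimeter, circularity, mean B, G, R]."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    features = []
    for c in contours:
        area = cv2.contourArea(c)
        if area < 50:                      # assumed noise threshold, in pixels
            continue
        perimeter = cv2.arcLength(c, True)
        circularity = 4 * np.pi * area / (perimeter ** 2 + 1e-9)
        cell_mask = np.zeros(gray.shape, dtype=np.uint8)
        cv2.drawContours(cell_mask, [c], -1, 255, -1)   # filled mask of this cell
        mean_bgr = cv2.mean(img, mask=cell_mask)[:3]
        features.append([area, perimeter, circularity, *mean_bgr])
    return features
```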

S4. Classify and identify microorganisms and cell types (neutrophils, eosinophils, basophils, lymphocytes, monocytes), compute and analyze their proportions, and identify the microorganisms and cells.
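To illustrate the classification and proportion analysis of S4 without presuming the disclosure's actual classifier, a small scikit-learn sketch over feature vectors like those from the previous example might look like this; the training data are placeholders and the class list follows S4.

```python
from collections import Counter
import numpy as np
from sklearn.ensemble import RandomForestClassifier

CELL_TYPES = ["neutrophil", "eosinophil", "basophil", "lymphocyte", "monocyte"]

# Placeholder training data: rows are feature vectors (area, perimeter, circularity, mean B/G/R).
X_train = np.random.rand(500, 6)
y_train = np.random.randint(0, len(CELL_TYPES), size=500)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

def classify_and_report(features):
    """Predict a cell type per feature vector and report the proportion of each type."""
    labels = clf.predict(np.asarray(features))
    counts = Counter(labels)
    total = len(labels)
    for idx, n in sorted(counts.items()):
        print(f"{CELL_TYPES[idx]}: {n / total:.1%}")
```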

An improved neural network algorithm for identifying microorganism and cell samples comprises the following steps:

S1. Input the mathematical models of the corresponding microorganisms and cells.

S2. Extract, from microscope images taken before and after the experiment, the specimens' morphology, outline, color under staining, structure, size, state features (granular, rod-shaped, foamy), irregularities, and nuclear left or right shift, for image recognition.

S3. Build a mathematical model of the specimen image features and input the feature values of the test items.

S4. Use an improved weight optimizer to train on the images quickly and obtain output values.

S5. Based on the output results, assist in identifying the microorganisms and cells in the microscope images and, at each time interval and experimental step, their respective morphology, outline, color under staining, structure, size, state features (granular, rod-shaped, foamy), irregularities, and changes such as nuclear left shift and right shift.

S6. Assist in recording dynamic real-time experimental data and its changes, classify and analyze the data in real time, and identify the microorganisms and bacteria in the microscope images.

The experiment management system is connected to the robot main control system, the voice module, and the robot arm, and is used for browsing and querying items, reserving experiments, supervising and observing experiments in real time, managing experimental vessels, keeping experiment logs, managing consumables and samples, and managing experimenters. The experiment management system includes a browsing module, a query module, an experiment reservation module, a real-time supervision and observation module, a vessel management module, an experiment log module, a consumables and sample management module, an experimenter management module, a remote robot arm control module, a visual display module, and a voice call module.
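As a rough sketch only, the modules listed above could be organized around a management-system facade like the one below; the class and method names are assumptions, and persistent storage and robot hooks are omitted.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

@dataclass
class Reservation:
    experiment_id: str
    user: str
    start_time: datetime

class ExperimentManagementSystem:
    """Minimal sketch of the reservation, log, and consumables modules."""

    def __init__(self):
        self.reservations: List[Reservation] = []
        self.experiment_log: List[str] = []
        self.consumables: Dict[str, int] = {}

    def reserve(self, experiment_id: str, user: str, start_time: datetime) -> None:
        self.reservations.append(Reservation(experiment_id, user, start_time))

    def log_event(self, message: str) -> None:
        self.experiment_log.append(f"{datetime.now().isoformat()} {message}")

    def update_consumable(self, name: str, delta: int) -> None:
        self.consumables[name] = self.consumables.get(name, 0) + delta
```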

Description of the drawings

Fig. 1 is a schematic diagram of the robot modules in this application. Reference numerals in Fig. 1:

101 - robot main control system module; 102 - robot arm motion planning module; 103 - camera vision module;

104 - mobile module; 105 - voice module; 106 - multimedia touch screen module;

107 - code-scanning information module; 108 - oscillation device; 109 - incubator module;

110 - heating/cooling/drying module; 111 - extraction/filtration/crystallization module; 112 - microscope module;

113 - 360-degree rotating table and magnification module; 114 - biosensor module.

Fig. 2 shows the structural composition of the robot in this application. Reference numerals in Fig. 2:

201 - main control system; 202 - multimedia touch screen; 203 - vision module; 204 - magnifier;

205 - mobile device; 206 - robot arm; 207 - code-scanning payment device; 208 - filtering device;

209 - extraction device; 210 - X-ray crystallization device; 211 - oscillator device; 212 - cell disruption device;

213 - biosensor device; 214 - heating device; 215 - cooling device; 216 - drying device;

217 - microscope device; 218 - remote client; 219 - staining device; 220 - voice device;

221 - remote client.

Specific embodiments

This solution mainly uses parameter configuration of the directional sound-recognition device and the microphone module, together with speech recognition, speech-to-text conversion, and voice wake-up, to realize human-robot voice interaction and to handle voice interaction, voice commands, and voice queries of item information.

This solution mainly uses the camera, together with improved machine learning and deep neural network methods, to recognize combined features such as the color, shape, and outline of items, to classify experimental reactors, to intelligently recognize experimental label information consisting of colors, numbers, letters, and text, and to return information about experimenters and experimental vessels, thereby resolving the experimental information. The robot uses code scanners and other information collection and reading devices to manage experimenters, experimental samples, and related records.

This solution mainly uses the robot arm module and the returned position information to plan arm motions such as grasping, code scanning, placing, and operating the experimental reactor, so that the robot autonomously grasps, scans, moves, places, and operates the reactor. Using robots instead of people for repetitive work improves efficiency, saves labor costs, greatly reduces the burden of manual work, and increases productivity.

The general idea of the technical solution in this application for solving the above technical problems is as follows:

For a better understanding of the above technical solution, the invention is further described in detail below with reference to the embodiments and the accompanying drawings; however, the embodiments of the invention are not limited thereto.

Embodiment 1

As shown in Fig. 1 and Fig. 2, an embodiment of the remote and autonomous experimental robot apparatus, management system, and method includes:

The robot main control system 201 is used to control the robot. Through the main system, it controls the communication of each robot node and drives and actuates each connected hardware device. The robot node communication module publishes between nodes and communicates with receivers through messages, services, actions, and other communication mechanisms.
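The description of inter-node communication through messages, services, and actions matches the ROS communication model; assuming a ROS 1 stack (not stated in this disclosure), a minimal publisher/subscriber pair for the main control node could look like this. Topic names and the command payload are assumptions.

```python
import rospy
from std_msgs.msg import String

def experiment_status_callback(msg):
    """Receiver side: react to status messages published by an experiment node."""
    rospy.loginfo("status received: %s", msg.data)

rospy.init_node("main_control_demo")
pub = rospy.Publisher("/experiment/commands", String, queue_size=10)   # assumed topic name
rospy.Subscriber("/experiment/status", String, experiment_status_callback)

rate = rospy.Rate(1)  # 1 Hz publishing loop
while not rospy.is_shutdown():
    pub.publish(String(data="start_incubation"))   # assumed command payload
    rate.sleep()
```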

The microscope 217 is connected to the robot main control system 201 and is used for visual acquisition of experimental samples; images taken under the microscope assist the intelligent identification of microorganisms, cells, and the like.

The incubator device 218 includes dispensers, test tubes, beakers, and similar apparatus, and is used in cell and microorganism experiments for culture, filtration, separation, centrifugation, cell disruption, extraction, biosynthesis, precipitation, and drying.

The oscillator device 211 includes a stirring rod and a shaking unit, and is used for shaking and stirring in experiments performed by the mobile robot arm.

The heating device 214 is connected to the robot main control system 201 and is used to heat experimental samples.

The cooling device 215 is connected to the robot main control system 201 and is used to cool experimental samples.

The drying device 216 includes one or more of a microwave dryer, a freeze dryer, a fluidized-bed dryer, an infrared dryer, an air-stream dryer, a spray dryer, and a chamber dryer, and is used to dry experimental samples.

The staining device 219 is used to stain samples.

The filtering device 208 is connected to the robot main control system 201, includes one or more of a vacuum filtration unit, a centrifugal filter, a tubular centrifuge, a disc-stack centrifuge, and an ultracentrifuge, and is used to filter experimental samples.

The cell disruption device 212 is connected to the robot main control system 201 and supports multiple disruption methods, including chemical treatment, mechanical disruption, enzymatic lysis, alkali treatment, osmotic shock, and lipid dissolution; mechanical methods include ultrasonication, high-pressure homogenization, grinding, and bead milling.

The biosensor device 213 is connected to the robot main control system 201 and includes nano-biosensors, enzyme biosensors, biological pseudo-array chips, microfluidic chips, DNA sensors, immunosensors, gas sensors, and ion sensors.

The 360-degree rotating table and scale are connected to the robot main control system 201 and are used to observe experimental details from all angles; the scale weighs the experimental items.

The visual magnification device and vision devices 203, 204: the vision device 203 and the magnified vision device are connected to the mobile robot arm 206 and the robot main control system 201, and are used to recognize numbers and color labels and, based on vessel shape and position, to identify the steps of the experimental procedure and the positions of vessels and reactors in the workflow.
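For the color-label recognition mentioned here, one simple non-learned alternative is HSV thresholding; the sketch below locates the largest red region in a camera frame. The HSV ranges are illustrative, not calibrated values from this disclosure.

```python
import cv2
import numpy as np

def find_red_label(frame):
    """Return the pixel center (x, y) of the largest red region in a BGR frame, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    lower = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))      # red wraps around hue 0
    upper = cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    mask = cv2.bitwise_or(lower, upper)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```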

The mobile device 205 is connected to the robot main control system 201, the vision camera 202, and the obstacle-avoidance device; it includes wheeled and tracked movement and is used to move the robot arm. The mobile robot arm can be detached from the wheeled or tracked base.

The robot arm 206 is connected to the robot main control system 201 and the camera 202 and is used to grasp, pick, take, and place target items, scan codes, and tidy and arrange items. Its motion planning methods include adaptive learning with an improved neural network to adjust the arm parameters for autonomous motion planning, and adjustment of the planning parameters through on-robot control and remote user control.

The voice device 220 is connected to the robot main control system 201 and includes a directional sound-recognition device and a microphone. It is used for voice interaction between remote users and the experimental apparatus, experiment guidance, voice commands, voice queries about experimental steps, and question answering.

The multimedia touch screen 202 is connected to the robot main control system 201 and is used to display experimental steps and the experimental process, for demonstrations, and for guided experimental learning.

The code-scanning information collection device 207 includes barcode and QR-code readers, a biometric information collector, and an RFID collector. It uses barcodes and QR codes to manage experimental consumables, experimental samples, and experimenters, and to manage experimental steps and related information.

The extractor 209 includes an extractor, a separation tank, an expansion valve, and a compressor.

The X-ray crystallization device 210 is connected to the robot main control system 201 and is used for crystallization.

Embodiment 2

As shown in Fig. 1 and Fig. 2, the experimental robot visually recognizes the experimental apparatus and the color markings of the experimental labels, and the robot arm moves to grasp the experimental apparatus; the steps are as follows:

Set the experimental apparatus parameters for the experimental scene and their corresponding position parameters, and input the mathematical models of the apparatus on the bench and of the experimental label colors, numbers, letters, text, and special markings. In the experimental scene, extract images of vessel shape, outline, color, numbers, letters, text, and special markings, and use the corresponding image features as input values. Use the improved weight optimizer to train on the images quickly and obtain output values.

Based on the output results for shape, outline, structure, color, numbers, letters, text, and special markings, accurately recognize the target, designate it, and locate its position.

Define the experimental steps, plan the motions of the robot arm 206, and move the arm to the apparatus and position of each step according to the plan; under the main system, the arm is positioned and moved to the designated target apparatus. Configure the frequency, maximum and minimum linear velocity, maximum and minimum rotational velocity, maximum linear acceleration in the x and y directions, maximum angular velocity, heading and position errors relative to the target, the weight for reaching the target position, and the weight for avoiding obstacles. Configure the robot arm parameters, obstacle positions and sizes, update frequency, publishing frequency, apparatus positions, the icons, colors, and parameters of each device on the bench, and the maximum allowed delay for transforms between coordinate frames.

Set the robot initialization parameters, the target ID, and the target's position and orientation message; set up motion planning; select joint angles and joint limits; move the arm to the specified joint configuration; and set joint constraints, joint trajectory positions, velocity components, and joint velocities. Set motion constraints, the target trajectory, and velocity settings; execute the planned trajectory; and set joint positions and joint angles. Set the Cartesian path on the arm and the robot pose parameters for the objects that can be picked up at the target pose. Set the arm's collision-avoidance matrix and configure the collision-detection module (self-collision checking against other parts of the robot and detection of obstacles in the scene). The experimental actions are as follows:

Initialize the robot arm 206, the grasp object, and the gripper; set the parameters for grasping, pick-and-place, and the grasp pose; and grasp the experimental target.

Use the code-scanning device 207 to scan the experimental samples, experimental consumables, and the experimenter's work badge; place the experimental sample on the 360-degree rotating table and scale 207; and use the visual magnifier 203 to observe its details.

The robot arm 206 picks up the object from the 360-degree rotating observation table 207, moves it to the bio-reactor position located under the identified experimental label, grasps the target, the experimental sample, and the consumables, and uses the arm to press the reactor's marked buttons and operate the bioreactor. At set time intervals, the visual magnifier 203 is used to observe the 360-degree rotating observation table and the experimental sample in the incubator 218, record the experimental process, and assist in recording dynamic real-time experimental data and its changes, enabling real-time data classification and analysis and identification of microorganisms, bacteria, and other experimental samples in microscope images.
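The time-interval observation and logging described here can be sketched as a simple loop; the camera-capture and analysis functions are placeholders standing in for the vision and classification steps above, and the interval and count are assumptions.

```python
import time
from datetime import datetime

OBSERVATION_INTERVAL_S = 600   # assumed 10-minute interval
NUM_OBSERVATIONS = 12          # assumed number of observations per run

def capture_image():
    """Placeholder for grabbing a frame from the magnifier/microscope camera."""
    return None  # a real implementation would return an image array

def analyze(image):
    """Placeholder for the counting/classification step (see the sketches above)."""
    return {"cells_detected": 0 if image is None else len(image)}

experiment_log = []
for step in range(NUM_OBSERVATIONS):
    frame = capture_image()
    record = {"time": datetime.now().isoformat(), "step": step, **analyze(frame)}
    experiment_log.append(record)
    print(record)
    time.sleep(OBSERVATION_INTERVAL_S)
```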

The robot arm 206 picks up the experimental sample from the 360-degree rotating observation table 207 and, following the experimental steps, moves it to the biosensor device 213 (including nano-biosensors, enzyme biosensors, biological pseudo-array chips, microfluidic chips, DNA sensors, immunosensors, gas sensors, and ion sensors); the arm performs the planned actions and acquires the measurement data.

Embodiment 3

Building on Embodiments 1 and 2, the robot main control system 201 and the visual recognition module 203 interact with the robot arm 206 for target setting, recognition, and localization; the arm 206 plans and performs grasping, moving, code scanning, placing of experimental vessels and samples, and pressing actions. The embodiments of the robot arm 206 of the present invention are not limited thereto; the specific implementation steps are as follows:

Through the management system and the voice module 220 of the robot main control system 201, users can place calls, issue voice commands, interact by voice, and browse and query experimental data. The experiment reservation module is used to book experiments and remind experimenters according to the time and the reservation. The camera and vision module 203 are used for real-time supervision, observation, and magnified observation of experimental vessels and samples, and for recording and managing experiment logs, consumables and samples, and experimenters.

Through the multimedia touch screen 202, the experimental steps and process are displayed, remote experiment guidance is provided, the experimental user communicates remotely, and the experiment is guided, tested, and monitored.

Claims (10)

1. A remote and autonomous experimental robot apparatus, management system, and method, characterized in that the remote and autonomous experimental robot apparatus comprises:
a robot main control system for controlling the robot, wherein the main system controls the communication of each robot node and drives and actuates each connected hardware device, and the robot node communication module publishes between nodes and communicates with receivers through messages, services, actions, and other communication mechanisms;
a microscope visual acquisition and recognition module, connected to the robot main control system, for visual acquisition of experimental samples, the microscope images assisting the intelligent identification of microorganisms and cells;
an incubator device comprising dispensers, test tubes, beakers, and similar apparatus, for culture, filtration, separation, centrifugation, cell disruption, extraction, biosynthesis, precipitation, and drying in cell and microorganism experiments;
an oscillator device, connected to the robot main control system, comprising a stirring rod and a shaking unit, for shaking and stirring in experiments performed by the mobile robot arm;
a heating device, connected to the robot main control system, for heating experimental samples;
a cooling device, connected to the robot main control system, for cooling experimental samples;
a drying device, connected to the robot main control system, comprising one or more of a microwave dryer, a freeze dryer, a fluidized-bed dryer, an infrared dryer, an air-stream dryer, a spray dryer, and a chamber dryer;
a staining device, connected to the robot main control system, for staining experimental samples;
a filtering device comprising one or more of a vacuum filtration unit, a centrifugal filter, a tubular centrifuge, a disc-stack centrifuge, and an ultracentrifuge;
a cell disruption device, connected to the robot main control system, supporting multiple disruption methods including chemical treatment, mechanical disruption, enzymatic lysis, alkali treatment, osmotic shock, and lipid dissolution, the mechanical methods including ultrasonication, high-pressure homogenization, grinding, and bead milling;
an extractor, connected to the robot main control system, comprising an extractor, a separation tank, an expansion valve, and a compressor;
a crystallization device, connected to the robot main control system, for crystallization;
a multi-sensor device, connected to the robot main control system, comprising one or more of nano-biosensors, enzyme biosensors, biological pseudo-array chips, microfluidic chips, DNA sensors, immunosensors, gas sensors, ion sensors, photoelectric sensors, strain and piezoresistive sensors, inductive sensors, capacitive sensors, Hall sensors, piezoelectric sensors, impedance sensors, semiconductor sensors, acoustic-wave sensors, thermal sensors, electrochemical sensors, and photosensitive sensors;
a 360-degree rotating table and scale, connected to the robot main control system, for all-round observation of experimental details, the scale measuring the experimental items;
a vision device and a magnification device, connected to the mobile robot arm and the robot main control system, for recognizing numbers and color labels and, based on vessel shape and position, identifying the steps of the experimental procedure and the positions of vessels and reactors in the workflow;
a mobile device, connected to the robot main control system, the vision camera, and the obstacle-avoidance device, comprising wheeled and tracked movement for moving the robot arm, wherein the mobile robot arm is detachable from the wheeled or tracked base and the mobile base and robot body can be used independently;
a mobile robot arm, connected to the robot main control system, the mobile device, and the camera, for grasping, picking, taking, and placing target items, scanning codes, tidying and arranging items, pressing, and operating experimental reactors, wherein the arm motion planning method comprises adaptive learning with an improved neural network to adjust the arm parameters for autonomous motion planning, and adjustment of the planning parameters through on-robot control and remote user control;
a voice device, connected to the robot main control system, comprising a directional sound-recognition device and a microphone, for voice interaction between remote users and the experimental apparatus, experiment guidance, voice commands, voice queries about experimental steps, and question answering;
a multimedia touch screen, connected to the robot main control system, for displaying experimental steps and processes, demonstrations, and guided experimental learning; and
a code-scanning information collection device comprising barcode and QR-code readers, a biometric information collector, and an RFID collector, for using barcodes and QR codes to manage experimental consumables, experimental samples, and experimenters, and to manage experimental steps and related information.

2. A remote and autonomous experimental robot apparatus, characterized in that the visual recognition module is connected to the robot main control system, the camera, and the visual recognition magnifier, and comprises a camera and a magnifier; it collects and publishes image information and is used to recognize personnel face information, experimental bio-reaction devices, color labels, and experimental vessel information, and to locate target items, target persons, and their positions by recognizing colors, digital codes, text, QR codes, special markings, and other combined information; through the robot system, the main control unit manages experimenters, experimental consumables, and experimental items under the video cameras of each experimental scene; the 360-degree rotating table, scale, and visual magnifier are connected to the robot main control system for all-round observation of experimental item details, the rotating table and scale rotating the item and assisting the camera's visual recognition, while the magnifier observes sample details from all angles.

3. A remote and autonomous experimental robot apparatus, characterized in that the microscope visual acquisition and recognition module is connected to the robot main control system and, through improved machine learning and improved deep learning methods, extracts the outline, shape, structure, color, and other features of sample images of cells and microorganisms, intelligently identifies the types of microorganisms such as bacteria and viruses, and learns and trains the image parameters, so as to assist the identification of experimental samples; the visually acquired microscope images assist the intelligent identification of microorganisms and cells.

4. A remote and autonomous experimental robot apparatus, characterized in that the mobile robot arm is connected to the main system and the vision camera; the arm is driven by the motion planning module of the main controller, recognizes targets visually, and uses multiple arms to grasp, take, and place target items and to scan experimental samples and consumables; in motion planning, the arm motion planning module configures the position and angle parameters of the arm, wrist, and gripper and plans the grasp, take, and place parameters, so that the arm moves, grasps, places, scans, tidies, arranges, presses, and operates the experimental reactor; configuring the arm's action parameters includes adaptive learning adjustment of the parameters and remote-control adjustment of the arm parameters.

5. A remote and autonomous experimental robot apparatus, characterized in that the experimental reactor devices comprise an oscillator device, a heating device, a cooling device, a drying device, a filtering device, a cell disruption device, and a staining device;
further, the oscillator device is connected to the robot main control system and comprises a stirring rod and a shaking unit for shaking and stirring in experiments performed by the mobile robot arm, with parameters including the number of shaking/stirring cycles, the shaking/stirring time, the shaking/stirring intensity, and the shaking/stirring method;
further, the heating device is connected to the robot main control system for heating experimental samples, with parameters including heating temperature, heating time, heating position, and heating range;
further, the cooling device is connected to the robot main control system for cooling experimental samples, with parameters including cooling temperature and cooling time;
further, the drying device is connected to the robot main control system for drying experimental samples and comprises one or more of a microwave dryer, a freeze dryer, a fluidized-bed dryer, an infrared dryer, an air-stream dryer, a spray dryer, and a chamber dryer, with parameters including drying time, drying method, and drying intensity;
further, the filtering device is connected to the robot main control system for filtration and comprises one or more of a vacuum filtration unit, a centrifugal filter, a tubular centrifuge, a disc-stack centrifuge, and an ultracentrifuge;
further, the cell disruption device is connected to the robot main control system for cell disruption, the disruption devices corresponding to multiple disruption methods including chemical treatment, mechanical disruption, enzymatic lysis, alkali treatment, osmotic shock, and lipid dissolution, the mechanical methods including ultrasonication, high-pressure homogenization, grinding, and bead milling;
further, the staining device is connected to the robot main control system for staining experimental samples.

6. A remote and autonomous experimental robot apparatus, characterized in that the voice module is connected to the robot main control system and comprises a directional sound-recognition device and a microphone; by configuring the directional sound-recognition device, the microphone, and related parameters, and by applying speech recognition, voice wake-up, and speech-to-text conversion together with a configurable language library, it communicates with remote users and supports voice interaction among remote users, the robot, and experimenters, voice commands, voice queries, and spoken question answering.

7. A remote and autonomous experimental robot apparatus, characterized in that the code-scanning information collection device is connected to the robot main control system and comprises code-scanning, scanning, and reading units; the robot main system connects these units to cameras, scanners, readers, and other information collection and reading devices to intelligently collect and recognize QR codes, digital codes, biometric information, RFID information, and other data for managing experimenters, experimental items, and experimental equipment.

8. A remote and autonomous experimental robot apparatus, management system, and method, characterized in that the experiment management system is connected to the robot main control system, the voice module, and the robot arm and is used for browsing and querying items, reserving experiments, supervising and observing experiments in real time, managing experimental vessels, keeping experiment logs, managing consumables and samples, and managing experimenters, and comprises a remote robot arm control module, a visual display module, and a voice call module, among other functions.

9. A remote and autonomous experimental robot apparatus, management system, and method, characterized by a method for visually recognizing experimental apparatus and experimental labels and for driving robot arm actions, the method comprising the following steps:
S1. Set the experimental apparatus parameters for the experimental scene and their corresponding position parameters.
S2. Input the mathematical models of the experimental apparatus on the bench and of the experimental label colors, numbers, letters, text, and special markings.
S3. In the experimental scene, extract images of vessel shape, outline, structure, color, numbers, letters, text, and special markings, and use the corresponding image features as input values.
S4. Use an improved weight optimizer to train on the images quickly and obtain output values.
S5. Based on the output shape, outline, structure, color, number, letter, text, and special-marking results, accurately recognize the target, designate it, and locate its position.
S6. Define the experimental steps, plan robot arm motions for each step, and, following the motion plan, move the arm to the apparatus and position of each step; under the main system, the arm is positioned and moved to the designated target apparatus.
S7. The configuration file of each experimental-apparatus node includes parameters such as frequency, maximum and minimum linear velocity, maximum and minimum rotational velocity, maximum linear acceleration in the x and y directions, maximum angular velocity, heading and position errors relative to the target, the weight for reaching the target position, and the weight for avoiding obstacles.
S8. In each experimental-apparatus node, configure the robot arm parameters, obstacle positions and sizes, update frequency, publishing frequency, apparatus positions, the icons, colors, and parameters of each device on the bench, and the maximum allowed delay for transforms between coordinate frames.
S9. Set the robot's initial parameters, including the robot and target identifiers and the target's position and orientation (pose) message.
S10. Set up motion planning: select joint angles and joint limits, move the arm to the specified joint configuration, and set joint constraints, joint trajectory positions, velocity components, and joint velocities; set motion constraints, the target trajectory, and velocity settings; execute the planned trajectory; and set joint positions and joint angles.
S11. Set the Cartesian path on the arm and the robot pose parameters for the objects that can be picked up at the target pose.
S12. Set the arm's collision-avoidance matrix and configure the collision-detection module (self-collision checking against other parts of the robot and detection of obstacles in the scene).
S13. Set the arm and gripper parameters for grasping and pick-and-place, and set and match the grasp pose parameters to the target pose.
S14. Initialize the place-and-grasp operation, the object position, and the grasp-pose object; generate grasp poses (initialize the grasp object and create the open and closed gripper postures); set the desired parameters for the gripper's approach to and retreat from the target; and set the grasp pose.
S15. Build the list of candidate pose variations to attempt.
S16. Build the grasp-pose list: vary the pose and generate grasp actions (set the grasp pose and grasp ID, specify the objects that may be contacted, and set the grasp list).

10. A remote and autonomous experimental robot apparatus, management system, and method, characterized by an experimental method that records experimental data in real time and analyzes and classifies the data and its changes, and by an improved neural network method for identifying microorganisms and cells in real time, comprising:
an improved machine learning method for classifying abnormal microorganism and cell data, comprising the following steps:
S1. Build mathematical models of the microorganism and cell specimens and set the observation time interval.
S2. Extract cell features of the specimens such as shape, color, outline, and size; extract the feature values of the specimen images (color, shape, outline); input the feature values of the test items; and record the sample information and its changes within each time interval.
S3. Assist in recording dynamic real-time experimental data and its changes; classify and analyze the data in real time to identify microorganisms and bacteria in microscope images; classify and identify microorganism and cell types; and compute and analyze their proportions;
further, an improved neural network method for analyzing and identifying experimental specimens and intelligently identifying microorganisms and cells, characterized in that the method comprises the following steps:
S1. Input the mathematical models of the corresponding microorganisms and cells.
S2. Extract, from microscope images taken before and after the experiment, the specimens' morphology, outline, color under staining, structure, size, state features (granular, rod-shaped, foamy), irregularities, and nuclear left or right shift, for image recognition.
S3. Build a mathematical model of the specimen image features and input the feature values of the test items.
S4. Use an improved weight optimizer to train on the images quickly and obtain output values.
S5. Based on the output results, assist in identifying the microorganisms and cells in the microscope images and, at each time interval and experimental step, their respective morphology, outline, color under staining, structure, size, state features (granular, rod-shaped, foamy), irregularities, and changes such as nuclear left shift and right shift.
S6. Assist in recording dynamic real-time experimental data and its changes, and classify and analyze the data in real time to identify microorganisms and bacteria in microscope images.
PCT/CN2022/000018 2021-02-08 2022-02-07 Robot apparatus for remote and autonomous experiment, and management system and method Ceased WO2022166505A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2022217204A AU2022217204A1 (en) 2021-02-08 2022-02-07 Robot apparatus for remote and autonomous experiment, and management system and method
CN202280018415.4A CN117616497A (en) 2021-02-08 2022-02-07 A remote and autonomous experimental robot device, management system and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110180722.X 2021-02-08
CN202110180722.XA CN112951230A (en) 2021-02-08 2021-02-08 Remote and autonomous experimental robot device, management system and method

Publications (1)

Publication Number Publication Date
WO2022166505A1 true WO2022166505A1 (en) 2022-08-11

Family

ID=76245108

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/000018 Ceased WO2022166505A1 (en) 2021-02-08 2022-02-07 Robot apparatus for remote and autonomous experiment, and management system and method

Country Status (3)

Country Link
CN (2) CN112951230A (en)
AU (1) AU2022217204A1 (en)
WO (1) WO2022166505A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116141303A (en) * 2022-09-07 2023-05-23 颖态智能技术(上海)有限公司 A self-decision composite robot control system
CN118404587A (en) * 2024-06-19 2024-07-30 中国海洋大学 Dual-machine following system and method based on combination of optical identification and electromagnetic induction
CN120816543A (en) * 2025-09-17 2025-10-21 杭州默迅智能科技有限公司 Laboratory Automation System Based on Artificial Intelligence Technology

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112951230A (en) * 2021-02-08 2021-06-11 谈斯聪 Remote and autonomous experimental robot device, management system and method
CN113110325A (en) * 2021-04-12 2021-07-13 谈斯聪 Multi-arm sorting operation mobile delivery device, and optimized management system and method
WO2023164811A1 (en) * 2022-03-01 2023-09-07 深圳先进技术研究院 Robot scientist-aided crystal material digital manufacturing method, and system
CN114995467B (en) * 2022-08-08 2022-12-30 中国科学技术大学 Chemical robot management and planning scheduling system, method and equipment
CN116423471B (en) * 2023-06-13 2023-08-15 中国农业科学院蔬菜花卉研究所 Intelligent cooperative robot for flux experiment operation
CN116859788A (en) * 2023-08-04 2023-10-10 北京三维天地科技股份有限公司 Multi-equipment task scheduling central control management platform
CN120170713B (en) * 2025-05-19 2025-07-18 海南大学三亚南繁研究院 Experimental robot control method and device, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160144358A1 (en) * 2016-01-28 2016-05-26 Nilesh Patel Dynamic Lab on a Chip Based Point-Of-Care Device For Analysis of Pluripotent Stem Cells, Tumor Cells, Drug Metabolites, Immunological Response, Glucose Monitoring, Hospital Based Infectious Diseases, and Drone Delivery Point-of-Care Systems
CN110275037A (en) * 2019-06-14 2019-09-24 齐鲁工业大学 An unmanned production line and method for making cytological pathological test samples
CN111496770A (en) * 2020-04-09 2020-08-07 上海电机学院 Intelligent handling robotic arm system and using method based on 3D vision and deep learning
CN111704996A (en) * 2020-06-29 2020-09-25 杭州医学院 Intelligent automatic biosafety pathogenic microorganism research system
CN111906785A (en) * 2020-07-23 2020-11-10 谈斯聪 Multi-mode comprehensive information identification mobile double-arm robot device system and method
CN112205982A (en) * 2020-06-19 2021-01-12 谈斯聪 Blood data acquisition and analysis intelligent recognition diagnosis robot platform
CN112951230A (en) * 2021-02-08 2021-06-11 谈斯聪 Remote and autonomous experimental robot device, management system and method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050008720A (en) * 2002-05-17 2005-01-21 벡톤 디킨슨 앤드 컴퍼니 Automated system for isolating, amplifying and detecting a target nucleic acid sequence
CN111916195A (en) * 2020-08-05 2020-11-10 谈斯聪 Medical robot device, system and method

Also Published As

Publication number Publication date
CN112951230A (en) 2021-06-11
CN117616497A (en) 2024-02-27
AU2022217204A1 (en) 2023-09-28

Similar Documents

Publication Publication Date Title
WO2022166505A1 (en) Robot apparatus for remote and autonomous experiment, and management system and method
CN116600947A (en) Multi-mode comprehensive information identification mobile double-arm robot device, system and method
Che et al. Intelligent robotic control system based on computer vision technology
CN111590611B (en) Article classification and recovery method based on multi-mode active perception
Gurin et al. MobileNetv2 Neural Network Model for Human Recognition and Identification in the Working Area of a Collaborative Robot
CN108563995B (en) Gesture recognition control method for human-machine collaboration system based on deep learning
CN111347411B (en) Three-dimensional visual recognition and grasping method of dual-arm collaborative robot based on deep learning
CN100352623C (en) Control device and method for intelligent mobile robot capable of picking up articles automatically
CN112205982A (en) Intelligent recognition and diagnosis robot platform for blood data acquisition and analysis
AU2021291903A1 (en) Integrated device, system and method for blood collection and analysis as well as intelligent image identification and diagnosis
Jang et al. Developing a cooking robot system for raw food processing based on instance segmentation
Shukla et al. A framework for improving information content of human demonstrations for enabling robots to acquire complex tool manipulation skills
Roudbari et al. Autonomous vision-based robotic grasping of household objects: A practical case study
Bhardwaj et al. Robotics and Automation: Revolutionizing Research and Development
Chandan et al. Intelligent Robotic Arm for Industry Applications
Eshaghi et al. Automated real time image based visual servo control of single cell surgery
Sawant et al. Implementation of faster RCNN algorithm for smart robotic arm based on computer vision
Saadati et al. Deep learning-based imitation of human actions for autonomous pick-and-place tasks
Zhang et al. [Retracted] Multifunctional Robot Grasping System Based on Deep Learning and Image Processing
Guo et al. Design of Household Robotic Arm System to sort Recyclable Resources based on Deep Learning
CN120170713B (en) Experimental robot control method and device, electronic equipment and storage medium
Shashank et al. Vision Guided Sorting of Medical Dissection Tools using a 3DOF SCARA Robot with Deep Learning Integration
Feiten et al. Modeling and control for mobile manipulation in everyday environments
US20240326243A1 (en) Automation system for performing lab protocols
Liu et al. Algorithm analysis of finger pressing action for robotic arm with dexterous hand

Legal Events

Date Code Title Description

121 EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22748784; Country of ref document: EP; Kind code of ref document: A1)

WWE WIPO information: entry into national phase (Ref document number: 2023550255; Country of ref document: JP)

WWE WIPO information: entry into national phase (Ref document number: 202280018415.4; Country of ref document: CN)

WWE WIPO information: entry into national phase (Ref document number: 2022217204; Country of ref document: AU. Ref document number: 2022748784; Country of ref document: EP)

NENP Non-entry into the national phase (Ref country code: DE)

ENP Entry into the national phase (Ref document number: 2022217204; Country of ref document: AU; Date of ref document: 20220207; Kind code of ref document: A)

ENP Entry into the national phase (Ref document number: 2022748784; Country of ref document: EP; Effective date: 20230908)

NENP Non-entry into the national phase (Ref country code: JP)

122 EP: PCT application non-entry in European phase (Ref document number: 22748784; Country of ref document: EP; Kind code of ref document: A1)

32PN EP: Public notification in the EP bulletin as address of the addressee cannot be established (Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23-10-2023))