WO2022047720A1 - Systems and methods for assisting in placing a surgical instrument into a subject
Systems and methods for assisting in placing a surgical instrument into a subject
- Publication number
- WO2022047720A1 (PCT application PCT/CN2020/113413)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- surgical instrument
- subject
- laser beam
- trajectory information
- placing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION (common hierarchy for all classes below)
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations (under A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery)
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B90/13—Instruments, implements or accessories for stereotaxic surgery, e.g. frame-based stereotaxis, with guides for needles or instruments guided by light, e.g. laser pointers (under A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges; A61B90/10—for stereotaxic surgery; A61B90/11—with guides for needles or instruments, e.g. arcuate slides or ball joints)
- A61B2034/107—Visualisation of planned trajectories or target regions (under A61B34/10)
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body: augmented reality, i.e. correlating a live optical image with another image (under A61B90/36—Image-producing devices or illumination devices not otherwise provided for)
- A61B2090/3937—Visible markers (under A61B90/39—Markers, e.g. radio-opaque or breast lesions markers)
Definitions
- the present disclosure generally relates to a medical system, and in particular, to systems and methods for assisting in placing a surgical instrument into a subject.
- Commonly, before a surgical instrument (e.g., a needle) is placed into a subject, trajectory information (e.g., a planned entry point, a planned target point, a planned entry angle) is determined based on at least one image of the subject.
- the trajectory information may be transmitted to a control device and the control device may emit a laser beam used for assisting in placing the surgical instrument.
- When the control device is independent from the imaging device, their coordinate systems are not consistent; accordingly, the trajectory information cannot be directly used to assist in placing the surgical instrument.
- In addition, determining the trajectory information manually is time-consuming. Thus, it is desirable to provide systems and methods for assisting in placing a surgical instrument into a subject accurately and efficiently.
- a system for assisting in placing a surgical instrument may be provided.
- the system may include an imaging component, a trajectory determination component, and a control component.
- the imaging component and the control component may correspond to a same coordinate system.
- the imaging component may be configured to acquire at least one image of a subject.
- the trajectory determination component may be configured to determine trajectory information for placing a surgical instrument into the subject based on a reference entry point and a reference target point in the at least one image.
- the control component may include a laser emitting unit configured to emit a laser beam towards the subject based on the trajectory information. An orientation of the laser beam may coincide with an orientation associated with the trajectory information and the laser beam may be used as a reference for placing the surgical instrument into the subject.
- the trajectory information may include at least one of a planned entry point for placing the surgical instrument on a surface of the subject, a planned target point for placing the surgical instrument in the subject, a planned trajectory between the planned entry point and the planned target point, a planned entry angle between the surgical instrument and a reference plane, or a planned entry depth of the surgical instrument in the subject.
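- For illustration only (not part of the disclosure), the trajectory information above can be represented as a simple record; the sketch below is a minimal Python version with hypothetical field names, deriving the planned trajectory as a unit vector from the planned entry and target points.

```python
from dataclasses import dataclass
from typing import Tuple

import numpy as np


@dataclass
class TrajectoryInfo:
    """Hypothetical container for the trajectory information listed above."""
    entry_point: Tuple[float, float, float]   # planned entry point on the subject's surface
    target_point: Tuple[float, float, float]  # planned target point inside the subject
    entry_angle_deg: float                    # planned entry angle relative to a reference plane
    entry_depth_mm: float                     # planned entry depth of the instrument

    def planned_trajectory(self) -> np.ndarray:
        """Unit vector pointing from the planned entry point to the planned target point."""
        v = np.asarray(self.target_point, float) - np.asarray(self.entry_point, float)
        return v / np.linalg.norm(v)
```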
- the control component may further include a motion unit configured to adjust position information of the laser emitting unit based on the trajectory information, and a fastener configured to fasten the laser emitting unit.
- the motion unit may have six degrees of freedom.
- the motion unit may be configured to make an intersection of the laser beam and the subject coincide with a planned entry point in the trajectory information, and make an angle between the laser beam and a reference plane equal to a planned entry angle in the trajectory information.
- the control component may further include a guiding unit configured to guide the surgical instrument to be placed into the subject based at least in part on the laser beam or the trajectory information.
- the guiding unit may include a hole configured to accommodate the surgical instrument, one or more rails configured to adjust a position of the hole to make a center axis of the hole coincide with a center axis of the laser beam, and one or more fasteners configured to fasten the surgical instrument.
- the system may further include an automation component configured to automatically place the surgical instrument into the subject based at least in part on the laser beam or the trajectory information.
- the automation component may be configured to make a start point of the surgical instrument coincide with an intersection of the laser beam and the subject, make an orientation of the surgical instrument coincide with the orientation of the laser beam, and place the surgical instrument into the subject based on a planned entry depth in the trajectory information.
- the surgical instrument may include at least one of a needle, a nail, a screw, or a drill.
- the trajectory information may be determined in real-time during a surgical treatment.
- a method for assisting in placing a surgical instrument may be provided.
- the method may include acquiring, by an imaging component, at least one image of a subject, determining, by a trajectory determination component, trajectory information for placing a surgical instrument into the subject based on a reference entry point and a reference target point in the at least one image, emitting, by a control component, a laser beam towards the subject based on the trajectory information.
- An orientation of the laser beam may coincide with an orientation associated with the trajectory information.
- the laser beam may be used as a reference for placing the surgical instrument into the subject.
- the imaging component and the control component may correspond to a same coordinate system.
- the trajectory information may include at least one of a planned entry point for placing the surgical instrument on a surface of the subject, a planned target point for placing the surgical instrument in the subject, a planned trajectory between the planned entry point and the planned target point, a planned entry angle between the surgical instrument and a reference plane, or a planned entry depth of the surgical instrument in the subject.
- the method may further include making an intersection of the laser beam and the subject coincide with a planned entry point in the trajectory information, and making an angle between the laser beam and the reference plane equal to the planned entry angle.
- the method may further include automatically placing the surgical instrument into the subject based at least in part on the laser beam or the trajectory information.
- the placing the surgical instrument into the subject may include making a start point of the surgical instrument coincide with an intersection of the laser beam and the subject, making an orientation of the surgical instrument coincide with the orientation of the laser beam, and placing the surgical instrument into the subject based on a planned entry depth in the trajectory information.
- the reference entry point and the reference target point in the at least one image may be determined automatically.
- the surgical instrument may include at least one of a needle, a nail, a screw, or a drill.
- the method may further include determining the trajectory information for placing the surgical instrument in the subject in real-time during a surgical treatment.
- a non-transitory computer readable medium may include executable instructions that, when executed by at least one processor, direct the at least one processor to perform a method.
- the method may include acquiring, by an imaging component, at least one image of a subject, determining, by a trajectory determination component, trajectory information for placing a surgical instrument into the subject based on a reference entry point and a reference target point in the at least one image, emitting, by a control component, a laser beam towards the subject based on the trajectory information.
- An orientation of the laser beam may coincide with an orientation associated with the trajectory information.
- the laser beam may be used as a reference for placing the surgical instrument into the subject.
- the imaging component and the control component may correspond to a same coordinate system.
- FIG. 1 is a schematic diagram illustrating an exemplary medical system according to some embodiments of the present disclosure.
- FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of a computing device according to some embodiments of the present disclosure.
- FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device according to some embodiments of the present disclosure.
- FIG. 4 is a schematic diagram illustrating an exemplary medical device according to some embodiments of the present disclosure.
- FIG. 5A is a schematic diagram illustrating an exemplary control component according to some embodiments of the present disclosure.
- FIG. 5B is an enlarged view illustrating a part of the control component in FIG. 5A according to some embodiments of the present disclosure.
- FIG. 5C is a schematic diagram illustrating a guiding unit of the control component in FIG. 5A according to some embodiments of the present disclosure.
- FIG. 5D is a side view of the guiding unit in FIG. 5C according to some embodiments of the present disclosure.
- FIG. 5E is a top view of the guiding unit in FIG. 5C according to some embodiments of the present disclosure.
- FIG. 6 is a block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure.
- FIG. 7 is a flowchart illustrating an exemplary process for placing a surgical instrument into a subject according to some embodiments of the present disclosure.
- FIG. 8 is a schematic diagram illustrating an image of a subject acquired by performing an intraoperative imaging on the subject according to some embodiments of the present disclosure.
- The terms “system,” “unit,” “module,” and/or “block” used herein are one method to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, the terms may be replaced by other expressions if they achieve the same purpose.
- the modules (or units, blocks, units) described in the present disclosure may be implemented as software and/or hardware modules and may be stored in any type of non-transitory computer-readable medium or other storage devices.
- a software module may be compiled and linked into an executable program. It will be appreciated that software modules can be callable from other modules or from themselves, and/or can be invoked in response to detected events or interrupts.
- Software modules configured for execution on computing devices can be provided on a computer-readable medium, such as a compact disc, a digital video disc, a flash drive, a magnetic disc, or any other tangible medium, or as a digital download (and can be originally stored in a compressed or installable format that requires installation, decompression, or decryption prior to execution) .
- Such software code can be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device.
- Software instructions can be embedded in firmware, such as an EPROM.
- Hardware modules (e.g., circuits) can be composed of connected or coupled logic units, such as gates and flip-flops, and/or of programmable units, such as programmable gate arrays or processors.
- the modules or computing device functionality described herein are preferably implemented as hardware modules, but can be software modules as well. In general, the modules described herein refer to logical modules that can be combined with other modules or divided into units despite their physical organization or storage.
- the flowcharts used in the present disclosure illustrate operations that systems implement according to some embodiments of the present disclosure. It is to be expressly understood that the operations of the flowcharts need not be implemented in order. Conversely, the operations may be implemented in inverted order, or simultaneously. Moreover, one or more other operations may be added to the flowcharts, and one or more operations may be removed from the flowcharts.
- a subject may include a biological object and/or a non-biological object.
- the biological subject may include a human being, an animal, a plant, or a specific portion, organ, and/or tissue thereof, such as the head, the neck, the thorax, the heart, the stomach, a blood vessel, soft tissue, a tumor, a nodule, or the like, or any combination thereof.
- The terms “object” and “subject” are used interchangeably in the present disclosure.
- An aspect of the present disclosure relates to systems and methods for assisting in placing a surgical instrument (e.g., a needle) into a subject (e.g., a patient) .
- the system may include an imaging component, a trajectory determination component, and a control component.
- the imaging component may be configured to acquire at least one image of the subject.
- a reference entry point and a reference target point may be determined manually or automatically in the at least one image.
- the trajectory determination component may be configured to determine trajectory information for placing the surgical instrument into the subject based on the reference entry point and the reference target point.
- the control component may be configured to control a laser emitting unit thereof to emit a laser beam towards the subject based on the trajectory information.
- an orientation of the laser beam may coincide with an orientation associated with the trajectory information. Accordingly, the surgical instrument can be placed into the subject based at least in part on the laser beam and/or the trajectory information.
- the imaging component and the control component may correspond to a same coordinate system; accordingly, without a calibration operation (or a registration operation), the trajectory information can be directly determined based on the reference entry point and the reference target point in the at least one image. Besides, in this manner, the trajectory information can be determined in a relatively short time (e.g., substantially in real-time) and an error that may occur in the calibration operation (or the registration operation) can be avoided. Therefore, the surgical instrument can be accurately and efficiently placed into the subject based on the trajectory information.
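- As an illustration of why the shared coordinate system removes the registration step, the sketch below (an assumption, not the disclosed implementation) uses a single fixed voxel-to-device affine transform, known once from installation, to map a point picked in the image directly into coordinates the control component can act on.

```python
import numpy as np

# Minimal sketch (assumption, not the patent's implementation): because the
# imaging component and the control component share one coordinate system, a
# single fixed voxel-to-device affine (known from installation) maps a point
# picked in the image directly to device coordinates; no intraoperative
# registration step is required. Spacings and offsets below are hypothetical.
VOXEL_TO_DEVICE = np.array([
    [0.7, 0.0, 0.0, -180.0],
    [0.0, 0.7, 0.0, -180.0],
    [0.0, 0.0, 1.0, -300.0],
    [0.0, 0.0, 0.0,    1.0],
])

def voxel_to_device(ijk):
    """Map an (i, j, k) voxel index to (x, y, z) device coordinates in mm."""
    p = VOXEL_TO_DEVICE @ np.array([*ijk, 1.0])
    return p[:3]

reference_entry_voxel = (120, 96, 40)  # e.g., a reference point picked in the image
print(voxel_to_device(reference_entry_voxel))
```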
- FIG. 1 is a schematic diagram illustrating an exemplary medical system according to some embodiments of the present disclosure.
- the medical system 100 may assist in placing a surgical instrument (e.g., a needle) into a subject (e.g., a patient) during a surgical treatment.
- the medical system 100 may include a medical device 110, a network 120, a terminal device 130, a processing device 140, and a storage device 150.
- the components of the medical system 100 may be connected in one or more of various ways.
- the medical device 110 may be connected to the processing device 140 through the network 120.
- the medical device 110 may be connected to the processing device 140 directly as indicated by the bi-directional arrow in dotted lines linking the medical device 110 and the processing device 140.
- the storage device 150 may be connected to the processing device 140 directly or through the network 120.
- the terminal device 130 may be connected to the processing device 140 directly (as indicated by the bi-directional arrow in dotted lines linking the terminal device 130 and the processing device 140) or through the network 120.
- the medical device 110 may be configured to acquire image data of a subject (e.g., a patient) and/or assist in performing a treatment on the subject based on the image data.
- the medical device 110 may include an imaging component 112 and a control component 114.
- the imaging component 112 may be configured to acquire the image data (e.g., at least one image) of the subject.
- the image data may reflect an anatomical structure of the subject and be used to determine trajectory information (e.g., a planned entry point, a planned target point, a planned entry angle) for assisting in placing a surgical instrument (e.g., a needle) into the subject.
- the control component 114 may be configured to control a laser emitting unit thereof to emit a laser beam towards the subject based on the trajectory information, wherein an orientation of the laser beam may coincide with an orientation (e.g., an orientation of the planned trajectory) associated with the trajectory information such that the laser beam can visualize at least a part of the trajectory information.
- the surgical instrument may be placed into the subject based at least in part on the laser beam and/or the trajectory information.
- the trajectory information may be determined by the processing device 140 based on the image data.
- the medical device 110 may include a trajectory determination component (not shown in FIG. 1) configured to determine the trajectory information based on the image data. More descriptions regarding the medical device 110 may be found elsewhere in the present disclosure (e.g., FIGs. 4-5E and the descriptions thereof) .
- the network 120 may include any suitable network that can facilitate the exchange of information and/or data for the medical system 100.
- one or more components (e.g., the medical device 110, the terminal device 130, the processing device 140, or the storage device 150) of the medical system 100 may communicate information and/or data with one or more other components of the medical system 100 via the network 120.
- the processing device 140 may transmit an instruction for acquiring at least one image of the subject to the medical device 110 (e.g., the imaging component 112) via the network 120.
- the processing device 140 may obtain at least one image of the subject acquired by the medical device 110 (e.g., the imaging component 112) via the network 120.
- the processing device 140 may transmit an instruction for emitting a laser beam based on trajectory information to the medical device 110 (e.g., the control component 114) via the network 120.
- the network 120 may be any type of wired or wireless network, or a combination thereof.
- the network 120 may be and/or include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN), a wide area network (WAN)), a wired network (e.g., an Ethernet network), a wireless network (e.g., an 802.11 network, a Wi-Fi network), a cellular network (e.g., a long term evolution (LTE) network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, routers, hubs, switches, server computers, and/or any combination thereof.
- the network 120 may include a cable network, a wireline network, a fiber-optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, or the like, or any combination thereof.
- the network 120 may include one or more network access points.
- the network 120 may include wired and/or wireless network access points such as base stations and/or internet exchange points through which one or more components of the medical system 100 may be connected to the network 120 to exchange data and/or information.
- the terminal device 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, or the like, or any combination thereof.
- the mobile device 131 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof.
- the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof.
- the wearable device may include a smart bracelet, a smart footgear, a pair of smart glasses, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof.
- the smart mobile device may include a smartphone, a personal digital assistant (PDA) , a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof.
- the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, an augmented reality glass, an augmented reality patch, or the like, or any combination thereof.
- the virtual reality device and/or the augmented reality device may include a Google™ Glass, an Oculus Rift, a HoloLens, a Gear VR, etc.
- the terminal device 130 may remotely operate the medical device 110 and/or the processing device 140.
- the terminal device 130 may operate the medical device 110 and/or the processing device 140 via a wireless connection.
- the terminal device 130 may receive information and/or instructions inputted by a user, and send the received information and/or instructions to the medical device 110 or to the processing device 140 via the network 120.
- the terminal device 130 may receive data and/or information from the processing device 140.
- the terminal device 130 may be part of the processing device 140.
- the terminal device 130 may be omitted.
- the processing device 140 may process data and/or information obtained from the medical device 110, the terminal device 130, and/or the storage device 150. For example, the processing device 140 may determine a reference entry point and a reference target point in at least one image of a subject acquired by the medical device 110. As another example, the processing device 140 may determine trajectory information for assisting in placing a surgical instrument into the subject based at least in part on the reference entry point and the reference target point in the at least one image. In some embodiments, the processing device 140 may be a single server or a server group. The server group may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote.
- the processing device 140 may access information and/or data stored in or acquired by the medical device 110, the terminal device 130, and/or the storage device 150 via the network 120.
- the processing device 140 may be directly connected to the medical device 110 (as illustrated by the bidirectional arrow in dashed lines connecting the processing device 140 and the medical device 110 in FIG. 1) , the terminal device 130 (as illustrated by the bidirectional arrow in dashed lines connecting the processing device 140 and the terminal device 130 in FIG. 1) , and/or the storage device 150 to access stored or acquired information and/or data.
- the processing device 140 may be implemented on a cloud platform.
- the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
- the storage device 150 may store data and/or instructions.
- the storage device 150 may store data obtained from the medical device 110, the terminal device 130, and/or the processing device 140.
- the processing device 140 may determine trajectory information for assisting in placing a surgical instrument into a subject based on at least one image of the subject acquired by the medical device 110, and then the trajectory information may be stored in the storage device 150 for further use or processing.
- the storage device 150 may store data obtained from the terminal device 130 and/or the processing device 140.
- the storage device 150 may store data and/or instructions that the processing device 140 may execute or use to perform exemplary methods described in the present disclosure.
- the storage device 150 may store instructions that the processing device 140 may execute or use to determine a reference entry point and a reference target point in at least one image of the subject acquired by the medical device 110.
- the storage device 150 may store instructions that the processing device 140 may execute or use to determine trajectory information based at least in part on the reference entry point and the reference target point in the at least one image.
- the storage device 150 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
- Exemplary mass storage may include a magnetic disk, an optical disk, a solid-state drive, etc.
- Exemplary removable storage may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
- Exemplary volatile read-and-write memory may include a random access memory (RAM) .
- Exemplary RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitor RAM (Z-RAM), etc.
- Exemplary ROM may include a mask ROM (MROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a compact disk ROM (CD-ROM), a digital versatile disk ROM, etc.
- the storage device 150 may be implemented on a cloud platform.
- the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
- the storage device 150 may be connected to the network 120 to communicate with one or more components (e.g., the medical device 110, the terminal device 130, the processing device 140) of the medical system 100.
- One or more components of the medical system 100 may access the data or instructions stored in the storage device 150 via the network 120.
- the storage device 150 may be directly connected to or communicate with one or more components (e.g., the medical device 110, the processing device 140, the terminal device 130) of the medical system 100.
- the storage device 150 may be part of the processing device 140.
- the medical system 100 may further include one or more power supplies (not shown in FIG. 1) connected to one or more components (e.g., the medical device 110, the processing device 140, the terminal device 130, the storage device 150) of the medical system 100.
- FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of a computing device according to some embodiments of the present disclosure.
- the processing device 140 may be implemented on the computing device 200.
- the computing device 200 may include a processor 210, a storage 220, an input/output (I/O) 230, and a communication port 240.
- the processor 210 may execute computer instructions (program code) and perform functions of the processing device 140 in accordance with techniques described herein.
- the computer instructions may include routines, programs, objects, components, signals, data structures, procedures, modules, and functions, which perform particular functions described herein.
- the processor 210 may include a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device (PLD), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof.
- the computing device 200 in the present disclosure may also include multiple processors, and thus operations of a method that are performed by one processor as described in the present disclosure may also be jointly or separately performed by the multiple processors.
- For example, if in the present disclosure the processor of the computing device 200 executes both operations A and B, it should be understood that operations A and B may also be performed by two different processors jointly or separately in the computing device 200 (e.g., a first processor executes operation A and a second processor executes operation B, or the first and second processors jointly execute operations A and B).
- the storage 220 may store data/information obtained from the medical device 110, the terminal device 130, the storage device 150, or any other component of the medical system 100.
- the storage 220 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
- the mass storage device may include a magnetic disk, an optical disk, a solid-state drive, etc.
- the removable storage device may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
- the volatile read-and-write memory may include a random access memory (RAM) .
- the RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitor RAM (Z-RAM), etc.
- the ROM may include a mask ROM (MROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a compact disk ROM (CD-ROM), a digital versatile disk ROM, etc.
- the storage 220 may store one or more programs and/or instructions to perform exemplary methods described in the present disclosure.
- the storage 220 may store a program for the processing device 140 for determining trajectory information for assisting in placing a surgical instrument into a subject based on at least one image of the subject.
- the I/O 230 may input or output signals, data, or information. In some embodiments, the I/O 230 may enable user interaction with the processing device 140. In some embodiments, the I/O 230 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, a trackball, or the like, or a combination thereof. Exemplary output devices may include a display device, a loudspeaker, a printer, a projector, or the like, or a combination thereof.
- Exemplary display devices may include a liquid crystal display (LCD) , a light-emitting diode (LED) -based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT) , or the like, or a combination thereof.
- a user (e.g., an operator) of the processing device 140 may input data related to a subject (e.g., a patient) that is being/to be imaged/scanned through the I/O 230.
- the data related to the subject may include identification information (e.g., the name, age, gender, medical history, contact information, physical examination result) and/or test information including the nature of the scan that must be performed.
- the user may also input parameters needed for the operation of the medical device 110, such as image contrast and/or ratio, a region of interest (ROI) , or the like, or any combination thereof.
- the I/O 230 may also display an image (or videos) generated based on the imaging/scan data.
- the communication port 240 may be connected to a network (e.g., the network 120) to facilitate data communications.
- the communication port 240 may establish connections between the processing device 140 and the medical device 110, the terminal device 130, or the storage device 150.
- the connection may be a wired connection, a wireless connection, or a combination of both that enables data transmission and reception.
- the wired connection may include an electrical cable, an optical cable, a telephone wire, or the like, or any combination thereof.
- the wireless connection may include Bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, mobile network (e.g., 3G, 4G, 5G) , or the like, or a combination thereof.
- the communication port 240 may be a standardized communication port, such as RS232, RS485, etc. In some embodiments, the communication port 240 may be a specially designed communication port. For example, the communication port 240 may be designed in accordance with the digital imaging and communications in medicine (DICOM) protocol.
- FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device according to some embodiments of the present disclosure.
- the terminal device 130 may be implemented on the mobile device 300.
- the mobile device 300 may include a communication platform 310, a display 320, a graphic processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and a storage 390.
- any other suitable component, including but not limited to a system bus or a controller (not shown), may also be included in the mobile device 300.
- a mobile operating system 370 (e.g., iOS, Android, Windows Phone) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340.
- the applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to surgical instrument placement or other information from the processing device 140.
- User interactions with the information stream may be achieved via the I/O 350 and provided to the processing device 140 and/or other components of the medical system 100 via the network 120.
- computer hardware platforms may be used as the hardware platform(s) for one or more of the elements described herein.
- the hardware elements, operating systems and programming languages of such computers are conventional in nature, and it is presumed that those skilled in the art are adequately familiar therewith to adapt those technologies to the surgical instrument placement as described herein.
- a computer with user interface elements may be used to implement a personal computer (PC) or another type of work station or terminal device, although a computer may also act as a server if appropriately programmed. It is believed that those skilled in the art are familiar with the structure, programming and general operation of such computer equipment and as a result the drawings should be self-explanatory.
- FIG. 4 is a schematic diagram illustrating an exemplary medical device according to some embodiments of the present disclosure.
- the medical device 400 may be an example of the medical device 110 illustrated in FIG. 1.
- the medical device 400 may include an imaging component 410, a trajectory determination component 420, and a control component 430.
- the imaging component 410 and the control component 430 may be examples of the imaging component 112 and the control component 114 illustrated in FIG. 1 respectively.
- the imaging component 410 may be configured to acquire at least one image of a subject 440.
- the at least one image may reflect an anatomical structure of the subject 440.
- the imaging component 410 may include a computed tomography (CT) device, an X-ray machine, a positron emission tomography (PET) device, a single photon emission computed tomography (SPECT) device, a magnetic resonance (MR) device, or the like, or any combination thereof.
- the at least one image may include a two-dimensional (2D) image, a three-dimensional (3D) image, a four-dimensional (4D) image, or the like, or any combination thereof.
- the imaging component 410 may include a frame 412, a radiation source (e.g., an X-ray source) (not shown in FIG. 4) , a detection component (not shown in FIG. 4) , and a platform 414.
- the radiation source and the detection component may be mounted on the frame 412; the platform 414 may be used to hold or support the subject 440.
- the radiation source may emit a radiation beam (e.g., an X-ray beam) towards the subject 440 and the radiation beam may attenuate when passing through the subject 440.
- the detection component may receive the attenuated radiation beam and generate imaging data (also can be considered as “image” for brevity) corresponding to the received radiation beam.
- the imaging component 410 may perform a preoperative imaging on the subject 440 and acquire the at least one image corresponding to the preoperative imaging. Accordingly, the at least one image may be used to assist in placing a surgical instrument into the subject 440 during a surgical treatment (e.g., a biopsy, a resection operation).
- the surgical instrument may include a needle, a nail, a screw, a drill, a catheter, a guidewire, a debrider, an aspirator, a handle, a guide, an artificial disk, a shunt, a plate, a rod, etc.
- the surgical instrument may be made of a biocompatible material, for example, a metallic material, a polymeric material, a ceramic material, a bone material, or the like, or any combination thereof.
- the trajectory determination component 420 may be configured to determine trajectory information for placing the surgical instrument into the subject 440 based on a reference entry point and a reference target point in the at least one image.
- the reference entry point in the at least one image may indicate a point where the surgical instrument starts to enter the subject 440;
- the reference target point in the at least one image may indicate a point (e.g., a point in a tumor) inside the subject 440 where the surgical instrument stops.
- the reference entry point and/or the reference target point may be determined manually or automatically. For example, an operator (e.g., a doctor) may manually mark the reference entry point and/or the reference target point in the at least one image.
- the operator may determine the reference entry point and/or the reference target point using a virtual reality device (e.g., 3D glasses) .
- the trajectory determination component 420 may automatically determine the reference entry point and/or the reference target point, for example, using a machine learning model.
- the machine learning model may include a convolutional neural network model, an adaptive boosting model, a gradient boosting decision tree model, or the like, or any combination thereof.
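- As a hedged illustration of automatic point determination (the disclosure names model families but no architecture), one common landmark-detection pattern is to have a trained model emit one heatmap per reference point and take the argmax voxel as the point location; the random arrays below merely stand in for a model's output.

```python
import numpy as np

# Sketch only: the disclosure names model families (e.g., CNNs) but not an
# architecture. A common landmark-detection pattern is one heatmap per point,
# with the highest-scoring voxel taken as the point location.
def heatmap_argmax(heatmap: np.ndarray) -> tuple:
    """Return the (i, j, k) index of the highest-scoring voxel."""
    return np.unravel_index(np.argmax(heatmap), heatmap.shape)

# Hypothetical model output: two heatmaps over a 64^3 image volume.
entry_heatmap = np.random.rand(64, 64, 64)
target_heatmap = np.random.rand(64, 64, 64)

reference_entry_point = heatmap_argmax(entry_heatmap)
reference_target_point = heatmap_argmax(target_heatmap)
```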
- the trajectory determination component 420 may include an interface via which the operator may mark the reference entry point and/or the reference target point in the at least one image.
- the trajectory information may include a planned entry point for placing the surgical instrument on a surface of the subject 440, a planned target point for placing the surgical instrument in the subject 440, a planned trajectory between the planned entry point and the planned target point, a planned entry depth of the surgical instrument in the subject 440, a planned entry angle between the surgical instrument and a reference plane (e.g., the x-z plane illustrated in FIG. 1) or a reference axis (e.g., the x-axis, the y-axis, the z-axis illustrated in FIG. 1) , or the like, or any combination thereof.
- the planned entry point refers to a point on the surface of the subject 440 where the surgical instrument starts to enter the subject 440; the planned target point refers to a point inside the subject 440 where the surgical instrument stops; the planned entry depth refers to a distance (e.g., a linear distance, a horizontal distance, a vertical distance) between the planned entry point and the planned target point or a length of the planned trajectory.
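- Given these definitions, the planned entry depth and the planned entry angle follow directly from the planned entry and target points; a minimal sketch, assuming the x-z plane of FIG. 1 as the reference plane:

```python
import numpy as np

# Sketch of the geometry defined above: entry depth as the linear distance
# between planned entry and target points, and entry angle as the angle
# between the planned trajectory and a reference plane (here the x-z plane,
# whose unit normal is the y-axis; this choice mirrors FIG. 1).
def entry_depth(entry, target):
    return float(np.linalg.norm(np.asarray(target, float) - np.asarray(entry, float)))

def entry_angle_deg(entry, target, plane_normal=(0.0, 1.0, 0.0)):
    v = np.asarray(target, float) - np.asarray(entry, float)
    v /= np.linalg.norm(v)
    n = np.asarray(plane_normal, float)
    n /= np.linalg.norm(n)
    # The angle to the plane is 90 degrees minus the angle to the plane's normal.
    return float(np.degrees(np.arcsin(abs(np.dot(v, n)))))

entry, target = (10.0, 0.0, 5.0), (18.0, -42.0, 11.0)
print(entry_depth(entry, target), entry_angle_deg(entry, target))
```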
- the planned entry point may be the same as or substantially the same as (e.g., a distance between which is less than a threshold) the reference entry point.
- the planned target point may be the same as or substantially the same as the reference target point.
- the trajectory determination component 420 may be integrated into the processing device 140 or may be implemented by the processing device 140. In some embodiments, the trajectory determination component 420 may be a computing device connected to or in communication with the processing device 140. In some embodiments, the trajectory determination component 420 may be a computing device independent from the processing device 140.
- the control component 430 may be configured to provide guidance for placing the surgical instrument into the subject 440 based on the trajectory information.
- the control component 430 may include a laser emitting unit, a motion unit, a fastener, and/or a guiding unit.
- the laser emitting unit may be configured to emit a laser beam towards the subject 440 based on the trajectory information, which may be used as a reference for placing the surgical instrument into the subject 440.
- an orientation of the laser beam coincides with an orientation associated with the trajectory information.
- the laser emitting unit may include a crisscross laser generator, a T laser generator, a dot laser generator, or the like, or a combination thereof.
- the motion unit may be configured to adjust position information of the laser emitting unit based on the trajectory information. For example, the motion unit may adjust the position information of the laser emitting unit to make the orientation of the emitted laser beam coincide with the orientation associated with the trajectory information. In some embodiments, the motion unit may adjust the position information of the laser emitting unit based on the planned entry point and/or the planned entry angle in the trajectory information. For example, the motion unit may make an intersection of the laser beam and the subject 440 coincide with the planned entry point and make an angle between the laser beam and the reference plane or the reference axis equal to the planned entry angle. In some embodiments, a preliminary position of the laser emitting unit may be determined by the trajectory determination component 420 based on the trajectory information.
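- One way to read these constraints is that the beam origin must lie on the planned trajectory line, upstream of the planned entry point, aimed along the trajectory; the sketch below is an assumption about how such a pose could be computed, not the disclosed control law.

```python
import numpy as np

# Sketch (assumption, not the disclosed control law): to make the beam's
# intersection with the subject coincide with the planned entry point and its
# orientation coincide with the planned trajectory, the emitter can be placed
# on the trajectory line at a standoff distance behind the entry point,
# aimed along the trajectory direction.
def laser_pose(entry, target, standoff_mm=250.0):
    entry = np.asarray(entry, float)
    direction = np.asarray(target, float) - entry
    direction /= np.linalg.norm(direction)
    emitter_position = entry - standoff_mm * direction
    return emitter_position, direction  # the pose the motion unit must realize

position, beam_dir = laser_pose((10.0, 0.0, 5.0), (18.0, -42.0, 11.0))
```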
- the motion unit may adjust the preliminary position based on the trajectory information.
- the motion unit may implement a rotational motion and/or a translational motion for adjusting the position information of the laser emitting unit and/or the orientation of the laser beam.
- the motion unit may have six degrees of freedom such that the position information of the laser emitting unit and/or the orientation of the laser beam may be adjusted arbitrarily.
- the motion unit may be an automatic unit, for example, a robotic arm, a robot, etc. More descriptions regarding the motion unit may be found elsewhere in the present disclosure (e.g., FIGs. 5A-5E and the description thereof) .
- the fastener may be configured to fasten the laser emitting unit on the control component 430.
- the fastener may include a clasp, a screw, a nut, a bolt, a gasket, an airtight glue, an airtight adhesive tape, or the like, or any combination thereof.
- the guiding unit may be configured to guide the surgical instrument to be placed into the subject 440 based at least in part on the laser beam and/or the trajectory information described above.
- the guiding unit may include a hole, one or more rails, and one or more fasteners.
- the hole may be configured to accommodate the surgical instrument.
- a diameter of the hole may be the same as or slightly larger than a diameter of the surgical instrument.
- the one or more rails may be configured to adjust a position of the hole to make a center axis of the hole coincide with a center axis of the laser beam. Then the surgical instrument may be placed into the hole.
- the one or more fasteners may be configured to fasten the guiding unit and/or the surgical instrument.
- the one or more fasteners may include a clasp, a screw, a nut, a bolt, a gasket, an airtight glue, an airtight adhesive tape, or the like, or any combination thereof. More descriptions regarding the guiding unit may be found elsewhere in the present disclosure (e.g., FIGs. 5A-5E and the description thereof) .
- the medical device 400 may also include an automatic component (e.g., a robotic arm, a robot) (not shown in FIG. 4) configured to automatically place the surgical instrument into the subject 440 based at least in part on the laser beam and/or the trajectory information.
- the automatic component may place the surgical instrument into the subject 440 along the orientation of the laser beam until the surgical instrument reaches the planned target point inside the subject 440.
- the automatic component may make a start point of the surgical instrument coincide with an intersection of the laser beam and the subject 440 and make an orientation of the surgical instrument coincide with the orientation of the laser beam.
- the automatic component may place the surgical instrument into the subject 440 based on the planned entry depth in the trajectory information.
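- The three alignment steps above can be summarized as a short sequence; the robot interface below (move_tip_to, align_axis_with, advance_along_axis) is entirely hypothetical and only mirrors the described order of operations.

```python
# Sketch of the three-step sequence above, against a hypothetical robot API;
# none of these calls come from the disclosure.
def place_instrument(robot, beam_entry_point, beam_direction, planned_depth_mm):
    robot.move_tip_to(beam_entry_point)          # 1. start point at the beam/subject intersection
    robot.align_axis_with(beam_direction)        # 2. instrument orientation along the beam
    robot.advance_along_axis(planned_depth_mm)   # 3. insert to the planned entry depth
```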
- the surgical instrument may be placed into the subject 440 manually by an operator (e.g., a doctor) .
- the trajectory information determined by the trajectory determination component 420 may be inaccurate due to some unexpected situations, for example, a system failure.
- the operator may examine whether an adjustment of the trajectory information is needed.
- the operator may instruct the trajectory determination component 420 to re-determine new trajectory information or the operator may manually modify the trajectory information.
- the laser beam may be emitted based on the new trajectory information or the modified trajectory information, and the surgical instrument may be placed into the subject 440 based at least in part on the laser beam and/or the new trajectory information or the modified trajectory information.
- the imaging component 410 and the control component 430 may correspond to a same coordinate system; accordingly, without a calibration operation (or a registration operation), the trajectory information determined based on the at least one image acquired by the imaging component 410 can be directly used to assist in placing the surgical instrument into the subject 440.
- the trajectory information can be directly determined based on the reference entry point and the reference target point in the at least one image without a calibration operation (or a registration operation) .
- the trajectory information can be determined in a relatively short time (e.g., substantially in real-time) and an error that may occur in the calibration operation (or the registration operation) can be avoided.
- a pre-calibration operation (or a pre-registration operation) may be performed when the imaging component 410 and the control component 430 are installed or assembled. As illustrated in FIG. 4, the control component 430 may be installed on the imaging component 410.
- the imaging component 410 may perform an intraoperative imaging on the subject 440 during the surgical treatment and acquire at least one second image.
- the at least one second image may reflect an anatomical structure of at least a portion (e.g., an organ, a tissue) of the subject 440 and a part of the surgical instrument in the subject 440.
- the at least one second image may be used to monitor position information of the surgical instrument in the subject 440 during the surgical treatment. In some embodiments, whether the position information of the surgical instrument deviates from the trajectory information may be determined. In response to a determination that the position information of the surgical instrument deviates from the trajectory information, the surgical instrument may be adjusted immediately.
- whether the deviation (e.g., an angle difference between the planned entry angle and an actual entry angle of the surgical instrument) of the surgical instrument is within a threshold may be determined.
- the position information of the surgical instrument may be adjusted based on the deviation, the trajectory information, and/or the laser beam. For example, if the actual entry angle is shifted one degree (which is within the threshold) to the left, the position information of the surgical instrument may be adjusted by shifting the surgical instrument one degree to the right.
- the surgical instrument may be pulled out of the subject 440 and re-placed into the subject 440 based at least in part on the trajectory information and/or the laser beam.
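- A minimal sketch of this correction logic, with an assumed 1.0-degree threshold (the disclosure does not fix a value): small angular deviations are compensated in place, while larger ones trigger withdrawal and re-placement.

```python
# Sketch of the deviation-handling logic described above. The threshold value
# is an assumption for illustration only.
ANGLE_THRESHOLD_DEG = 1.0

def correct_deviation(planned_angle_deg, actual_angle_deg):
    deviation = actual_angle_deg - planned_angle_deg
    if abs(deviation) <= ANGLE_THRESHOLD_DEG:
        return ("adjust_in_place", -deviation)   # shift back by the deviation
    return ("withdraw_and_replace", None)        # re-place along the laser beam
```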
- a position of at least a portion of the subject 440 may change during the surgical treatment due to a motion of the subject 440 or a motion of an inner organ or tissue of the subject 440, such as a cardiac motion, motion of other organs or tissues affected by the cardiac motion, respiratory motion (e.g., motion of the lungs and/or the diaphragm) , motion of other organs or tissues affected by the respiratory motion, blood flow motion, motion induced by vascular pulsation, a muscle contraction, a muscle relaxation, a secretory activity of the pancreas, or the like, or any combination thereof.
- the trajectory determination component 420 may determine the trajectory information with the motion information into consideration. For example, during the surgical treatment, the motion information of the subject 440 may be monitored based on the at least one second image acquired by the imaging component 410 and/or other sensing information detected by a sensing device (e.g., a motion sensor, a distance sensor) , then the trajectory information may be adjusted in real-time or substantially in real-time based on the motion information.
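- As a hedged illustration of one simple form such an adjustment could take (an assumption; the disclosure does not specify the update rule): if the monitored motion is summarized as a rigid displacement of the target region, the planned points can be translated by it in real time.

```python
import numpy as np

# Sketch (an assumption about one simple form of the adjustment): translate
# the planned entry and target points by a measured rigid displacement of the
# target region, derived from intraoperative images or sensor readings.
def adjust_trajectory(entry, target, displacement_mm):
    d = np.asarray(displacement_mm, float)
    return np.asarray(entry, float) + d, np.asarray(target, float) + d
```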
- the trajectory information can be determined based on the reference entry point and the reference target point within a relatively short time (e.g., substantially in real-time) , which can improve the processing efficiency.
- the laser beam can visualize at least a part of the trajectory information; accordingly, the surgical instrument can be placed into the subject accurately based on the trajectory information with the laser beam used as a reference. For example, a difference between the planned entry angle in the trajectory information and an actual entry angle of the surgical instrument can be controlled within a range of less than one degree.
- the medical device 400 can automatically determine the trajectory information and emit the laser beam based on the trajectory information, which can efficiently reduce manual operations.
- the components of the medical device 400 may be connected with each other in any suitable way.
- the imaging component 410 may be connected to the trajectory determination component 420 directly or through a network (e.g., the network 120 illustrated in FIG. 1)
- the control component 430 may be connected to the trajectory determination component 420 directly or through the network
- the trajectory determination component 420 may be connected to the control component 430 directly or through the network, etc.
- the components of the medical device 400 may be mutually communicated, thereby facilitating the control of the surgical instrument placement.
- the medical device 400 is merely provided for the purposes of illustration, and is not intended to limit the scope of the present disclosure.
- multiple variations or modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure.
- FIG. 5A is a schematic diagram illustrating an exemplary control component according to some embodiments of the present disclosure.
- FIG. 5B is an enlarged view illustrating a part of the control component in FIG. 5A according to some embodiments of the present disclosure.
- FIG. 5C is a schematic diagram illustrating a guiding unit of the control component in FIG. 5A according to some embodiments of the present disclosure.
- FIG. 5D is a side view of the guiding unit in FIG. 5C according to some embodiments of the present disclosure.
- FIG. 5E is a top view of the guiding unit in FIG. 5C according to some embodiments of the present disclosure.
- the control component 500 illustrated in FIGs. 5A-5E may be an example of the control component 430 illustrated in FIG. 4 or the control component 114 illustrated in FIG. 1.
- the control component 500 may include a laser emitting unit 510, a fastener 520, a motion unit 530, and a guiding unit 540.
- the laser emitting unit 510 may be configured to emit a laser beam 550.
- the fastener 520 may be configured to fasten the laser emitting unit 510 on the control component 500.
- the motion unit 530 may be configured to adjust position information of the laser emitting unit 510 to make an orientation of the laser beam 550 coincide with an orientation (e.g., an orientation of a planned trajectory) associated with trajectory information for placing a surgical instrument 570 into a subject 560.
- the motion unit 530 may be a robotic arm having six degrees of freedom, that is, three translational degrees of freedom along three coordinate axes (i.e., the x-axis, y-axis, and z-axis) and three rotational degrees of freedom around the three coordinate axes.
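- As an illustration (not the patent's own control algorithm; the function and vector names below are hypothetical), the following is a minimal sketch of the geometry such a six-degree-of-freedom motion unit would have to solve: computing, via Rodrigues' rotation formula, the rotation that brings the laser's current axis onto a planned trajectory direction:

```python
import numpy as np

def rotation_between(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Return the 3x3 rotation matrix that rotates unit vector a onto unit
    vector b (Rodrigues' rotation formula)."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)                 # rotation axis scaled by sin(theta)
    c = float(np.dot(a, b))            # cos(theta)
    if np.isclose(c, 1.0):             # already aligned
        return np.eye(3)
    if np.isclose(c, -1.0):            # anti-parallel: rotate pi about any
        axis = np.cross(a, [1.0, 0.0, 0.0])   # axis orthogonal to a
        if np.linalg.norm(axis) < 1e-9:
            axis = np.cross(a, [0.0, 1.0, 0.0])
        axis = axis / np.linalg.norm(axis)
        K = np.array([[0.0, -axis[2], axis[1]],
                      [axis[2], 0.0, -axis[0]],
                      [-axis[1], axis[0], 0.0]])
        return np.eye(3) + 2.0 * (K @ K)      # I + (1 - cos(pi)) * K^2
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + (K @ K) / (1.0 + c)

# Example: re-orient a laser currently pointing along +z onto a planned
# trajectory direction (both expressed in the shared coordinate system).
laser_axis = np.array([0.0, 0.0, 1.0])
planned_dir = np.array([0.3, -0.2, -0.9])
R = rotation_between(laser_axis, planned_dir)
```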
- the guiding unit 540 may be configured to guide the surgical instrument 570 to be placed into the subject 560 based at least in part on the laser beam 550 or the trajectory information.
- the guiding unit 540 may include a hole 541, one or more rails 542 (e.g., a rail 5421, a rail 5422, a rail 5423), and one or more fasteners 543 (e.g., a fastener 5431, a fastener 5432, and a fastener 5433).
- the hole 541 may be configured to accommodate the surgical instrument 570.
- the one or more rails 542 may be configured to adjust a position of the hole 541 to make a center axis of the hole 541 coincide with a center axis of the laser beam 550.
- the position of the hole 541 may be adjusted by sliding the rail 5421 and the rail 5422 through the rail 5423.
- the one or more fasteners 543 may be configured to fasten the guiding unit 540 and/or the surgical instrument 570.
- the fastener 5432 and the fastener 5433 may be configured to fasten the surgical instrument 570 in the hole 541; the fastener 5431 may be configured to fasten the guiding unit 540 on the subject 560.
- FIG. 6 is a block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure.
- the processing device 140 may include an image obtainment module 610, a trajectory determination module 620, and a beam emitting module 630.
- the image obtainment module 610 may be configured to obtain at least one image of a subject (e.g., the subject 440 illustrated in FIG. 4) from an imaging component (e.g., the imaging component 410 illustrated in FIG. 4) .
- the image obtainment module 610 may transmit an instruction for acquiring the at least one image to the imaging component, and the imaging component may acquire the at least one image accordingly.
- the at least one image may reflect an anatomical structure of the subject. More description regarding the at least one image may be found elsewhere in the present disclosure (e.g., FIG. 4 and FIG. 8 and the descriptions thereof) .
- the trajectory determination module 620 may be configured to determine trajectory information for placing a surgical instrument into the subject based on a reference entry point and a reference target point in the at least one image. For example, the trajectory determination module 620 may transmit an instruction for determining the trajectory information to a trajectory determination component (e.g., the trajectory determination component 420 illustrated in FIG. 4) , and the trajectory determination component may determine the trajectory information accordingly.
- the reference entry point in the at least one image may indicate a point where the surgical instrument starts to enter the subject; the reference target point in the at least one image may indicate a point (e.g., a point in a tumor) inside the subject where the surgical instrument stops.
- the reference entry point and/or the reference target point may be determined by the trajectory determination module 620.
- the trajectory information may include a planned entry point for placing the surgical instrument on a surface of the subject, a planned target point for placing the surgical instrument in the subject, a planned trajectory between the planned entry point and the planned target point, a planned entry depth of the surgical instrument in the subject, a planned entry angle between the surgical instrument and a reference plane (e.g., the x-z plane illustrated in FIG. 1) or a reference axis (e.g., the x-axis, the y-axis, the z-axis illustrated in FIG. 1) , or the like, or any combination thereof. More description regarding the reference entry point, the reference target point, and/or the trajectory information may be found elsewhere in the present disclosure (e.g., FIG. 4 and FIG. 7 and the descriptions thereof) .
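- For concreteness, the following is a minimal sketch (all names are hypothetical, and the points are assumed to already be expressed in the shared coordinate system) of how such trajectory information can be derived from a reference entry point and a reference target point:

```python
import numpy as np

def plan_trajectory(entry_point, target_point, reference_axis=(0.0, 1.0, 0.0)):
    """Derive a planned trajectory direction, a planned entry depth (linear
    distance), and a planned entry angle (in degrees, measured against a
    reference axis) from two 3-D reference points."""
    entry = np.asarray(entry_point, dtype=float)
    target = np.asarray(target_point, dtype=float)
    vec = target - entry                        # planned trajectory vector
    depth = float(np.linalg.norm(vec))          # planned entry depth
    direction = vec / depth                     # unit direction of the trajectory
    axis = np.asarray(reference_axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    cos_angle = np.clip(np.dot(direction, axis), -1.0, 1.0)
    entry_angle = float(np.degrees(np.arccos(cos_angle)))  # planned entry angle
    return direction, depth, entry_angle

# Example: entry on the skin surface, target inside a lesion (coordinates in mm).
direction, depth, angle = plan_trajectory((10.0, 50.0, 0.0), (18.0, 22.0, 6.0))
```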
- the beam emitting module 630 may be configured to cause a control component (e.g., the control component 430 described in FIG. 4) to emit a laser beam towards the subject based on the trajectory information, which may be used as a reference for placing the surgical instrument into the subject.
- the beam emitting module 630 may transmit an instruction for emitting the laser beam to the control component, and the control component may emit the laser beam accordingly.
- an orientation of the laser beam may coincide with an orientation associated with the trajectory information.
- the orientation of the laser beam may coincide with an orientation of the planned trajectory.
- the laser beam may be emitted based on the planned entry point and/or the planned entry angle in the trajectory information. For example, an intersection of the laser beam and the subject may be made to coincide with the planned entry point, and an angle between the laser beam and the reference plane or the reference axis may be made equal to the planned entry angle.
- the modules in the processing device 140 may be connected to or communicate with each other via a wired connection or a wireless connection.
- the wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof.
- the wireless connection may include a Local Area Network (LAN) , a Wide Area Network (WAN) , a Bluetooth, a ZigBee, a Near Field Communication (NFC) , or the like, or any combination thereof.
- the processing device 140 may include a placement module (not shown in FIG. 6) configured to place the surgical instrument into the subject based at least in part on the laser beam and/or the trajectory information.
- the processing device 140 may include a storage module (not shown) configured to store data generated by the above-mentioned modules.
- two or more of the above-mentioned modules may be combined into a single module or any one of the above-mentioned modules may be divided into two or more units.
- FIG. 7 is a flowchart illustrating an exemplary process for placing a surgical instrument into a subject according to some embodiments of the present disclosure.
- the process 700 may be implemented as a set of instructions stored in the storage 220.
- the processor 210 and/or the modules in FIG. 6 may execute the set of instructions, and when executing the instructions, the processor 210 and/or the modules may be configured to perform the process 700.
- the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 700 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed herein. Additionally, the order in which the operations of the process are performed, as illustrated in FIG. 7 and described below, is not intended to be limiting.
- in 710, the processing device 140 may obtain, from an imaging component (e.g., the imaging component 410 illustrated in FIG. 4), at least one image of a subject (e.g., the subject 440 illustrated in FIG. 4).
- the processing device 140 may transmit an instruction for acquiring the at least one image to the imaging component, and the imaging component may acquire the at least one image accordingly.
- the at least one image may reflect an anatomical structure of the subject. More description regarding the at least one image may be found elsewhere in the present disclosure (e.g., FIG. 4 and the descriptions thereof) .
- before the at least one image is acquired, the subject should be registered to the medical system 100.
- registration information may include the subject's name, age, gender, medical history, contact information, physical examination results, etc.
- in 720, the processing device 140 may determine trajectory information for placing a surgical instrument into the subject based on a reference entry point and a reference target point in the at least one image. For example, the processing device 140 may transmit an instruction for determining the trajectory information to a trajectory determination component (e.g., the trajectory determination component 420 illustrated in FIG. 4), and the trajectory determination component may determine the trajectory information accordingly.
- the reference entry point in the at least one image may indicate a point where the surgical instrument starts to enter the subject;
- the reference target point in the at least one image may indicate a point (e.g., a point in a tumor) inside the subject where the surgical instrument stops.
- the reference entry point and/or the reference target point may be determined manually or automatically.
- the trajectory information may include a planned entry point for placing the surgical instrument on a surface of the subject, a planned target point for placing the surgical instrument in the subject, a planned trajectory between the planned entry point and the planned target point, a planned entry depth of the surgical instrument in the subject, a planned entry angle between the surgical instrument and a reference plane (e.g., the x-z plane illustrated in FIG. 1) or a reference axis (e.g., the x-axis, the y-axis, the z-axis illustrated in FIG. 1) , or the like, or any combination thereof.
- the planned entry point refers to a point on the surface of the subject where the surgical instrument starts to enter the subject; the planned target point refers to a point inside the subject where the surgical instrument stops; the planned entry depth refers to a distance (e.g., a linear distance, a horizontal distance, a vertical distance) between the planned entry point and the planned target point or a length of the planned trajectory.
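- In these terms, for a straight planned trajectory the linear entry depth and the planned entry angle relative to a reference axis u can be written as follows (an illustrative formulation, not quoted from the disclosure, with p_e the planned entry point and p_t the planned target point):

```latex
d_{\mathrm{linear}} = \lVert p_t - p_e \rVert_2
  = \sqrt{(x_t - x_e)^2 + (y_t - y_e)^2 + (z_t - z_e)^2},
\qquad
\theta = \arccos\!\left( \frac{(p_t - p_e) \cdot u}{\lVert p_t - p_e \rVert_2 \, \lVert u \rVert_2} \right)
```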
- the planned entry point may be the same as or substantially the same as (e.g., with a distance between them less than a threshold) the reference entry point.
- the planned target point may be the same as or substantially the same as the reference target point. More description regarding the reference entry point, the reference target point, and/or the trajectory information may be found elsewhere in the present disclosure (e.g., FIG. 4 and the descriptions thereof) .
- in 730, the processing device 140 may cause a control component (e.g., the control component 430 described in FIG. 4) to emit a laser beam towards the subject based on the trajectory information, which may be used as a reference for placing the surgical instrument into the subject.
- the processing device 140 may transmit an instruction for emitting the laser beam to the control component, and the control component may emit the laser beam accordingly.
- an orientation of the laser beam may coincide with an orientation associated with the trajectory information.
- the orientation of the laser beam may coincide with an orientation of the planned trajectory.
- the control component may emit the laser beam based on the planned entry point and/or the planned entry angle in the trajectory information. For example, the control component may make an intersection of the laser beam and the subject coincide with the planned entry point and make an angle between the laser beam and the reference plane or the reference axis equal to the planned entry angle.
- the surgical instrument may be placed into the subject based at least in part on the laser beam and/or the trajectory information manually or automatically.
- the surgical instrument may be placed by making a start point of the surgical instrument coincide with an intersection of the laser beam and the subject and making an orientation of the surgical instrument coincide with the orientation of the laser beam. Further, the surgical instrument may be placed into the subject based on the planned entry depth in the trajectory information.
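- As a minimal geometric sketch of this placement step (the names below are hypothetical; the laser direction and its intersection with the subject are assumed known in the shared coordinate system), the planned resting position of the instrument tip follows from advancing along the laser direction by the planned entry depth:

```python
import numpy as np

def planned_tip_position(entry_point, laser_direction, entry_depth):
    """Advance from the laser/skin intersection (the planned entry point)
    along the laser direction by the planned entry depth to obtain where
    the instrument tip should come to rest."""
    d = np.asarray(laser_direction, dtype=float)
    d = d / np.linalg.norm(d)                    # unit insertion direction
    return np.asarray(entry_point, dtype=float) + entry_depth * d

# Example: a 30 mm deep insertion along the laser beam.
tip = planned_tip_position((10.0, 50.0, 0.0), (0.27, -0.94, 0.20), 30.0)
```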
- the trajectory information determined in 720 may be inaccurate due to some unexpected situations, for example, a system failure.
- the operator may examine whether an adjustment of the trajectory information is needed.
- the operator may instruct the processing device 140 to re-determine new trajectory information or the operator may modify the trajectory information.
- the laser beam may be emitted based on the new trajectory information or the modified trajectory information in 730, and the surgical instrument may be placed based at least in part on the laser beam and/or the new or modified trajectory information.
- the processing device 140 may control the imaging component to perform an intraoperative imaging on the subject during the surgical treatment and acquire at least one second image.
- the at least one second image may reflect an anatomical structure of at least a portion (e.g., an organ, a tissue) of the subject and a part of the surgical instrument in the subject.
- the at least one second image may be used to monitor position information of the surgical instrument in the subject during the surgical treatment.
- the processing device 140 may determine whether the position information of the surgical instrument deviates from the trajectory information. In response to a determination that the position information of the surgical instrument deviates from the trajectory information, the surgical instrument may be adjusted immediately.
- the processing device 140 may determine whether the deviation (e.g., an angle difference between the planned entry angle and an actual entry angle of the surgical instrument) of the surgical instrument is within a threshold. In response to a determination that the deviation is within the threshold, the position information of the surgical instrument may be adjusted based on the deviation, the trajectory information and/or the laser beam. For example, if the actual entry angle is shifted one degree (which is within the threshold) to the left, the position information of the surgical instrument may be adjusted by shifting the surgical instrument one degree to the right. In response to a determination that the deviation exceeds the threshold, the surgical instrument may be pulled out of the subject and re-placed into the subject based at least in part on the trajectory information and/or the laser beam.
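- A minimal sketch of this branching logic (the function name and the threshold value are assumptions, not values from the disclosure):

```python
def correct_deviation(planned_angle_deg: float,
                      actual_angle_deg: float,
                      threshold_deg: float = 2.0) -> str:
    """Mirror the two branches described above: a small angular deviation is
    corrected in place by the opposite shift; a large one triggers
    withdrawal and re-placement."""
    deviation = actual_angle_deg - planned_angle_deg
    if abs(deviation) <= threshold_deg:
        # e.g. one degree to the left is corrected by one degree to the right
        return f"adjust in place by {-deviation:+.1f} degrees"
    return "pull out and re-place based on the trajectory information and laser beam"

print(correct_deviation(30.0, 31.0))   # within threshold: adjust by -1.0 degrees
print(correct_deviation(30.0, 35.5))   # exceeds threshold: re-place
```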
- a position of at least a portion of the subject may change during the surgical treatment due to a motion of the subject or a motion of an inner organ or tissue of the subject, such as a cardiac motion, motion of other organs or tissues affected by the cardiac motion, respiratory motion (e.g., motion of the lungs and/or the diaphragm) , motion of other organs or tissues affected by the respiratory motion, blood flow motion, motion induced by vascular pulsation, a muscle contraction, a muscle relaxation, a secretory activity of the pancreas, or the like, or any combination thereof.
- the processing device 140 may determine the trajectory information taking the motion information into consideration.
- the motion information of the subject may be monitored based on the at least one second image acquired by the imaging component and/or other sensing information detected by a sensing device (e.g., a motion sensor, a distance sensor), and the trajectory information may then be adjusted in real-time or substantially in real-time based on the motion information.
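- A minimal sketch of such a monitoring loop (all callables are hypothetical placeholders for the image- or sensor-based motion source, the re-planning step, and the laser control):

```python
import time

def monitor_and_adjust(get_motion_offset, replan, update_laser, period_s=0.1):
    """Poll the motion source, re-derive the trajectory whenever the subject
    has moved, and re-aim the laser, at a substantially real-time cadence."""
    while True:
        offset = get_motion_offset()     # e.g. displacement of a tracked region
        if offset is None:               # sentinel: monitoring has ended
            break
        trajectory = replan(offset)      # adjust the trajectory for the motion
        update_laser(trajectory)         # re-orient the laser beam accordingly
        time.sleep(period_s)
```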
- an imaging component may acquire at least one image of a subject, a trajectory determination component may determine trajectory information for placing a surgical instrument into the subject based on a reference entry point and a reference target point in the at least one image, and a control component may emit a laser beam towards the subject based on the trajectory information. Further, the surgical instrument may be placed into the subject based on the laser beam and/or the trajectory information.
- FIG. 8 is a schematic diagram illustrating an image of a subject acquired by performing an intraoperative imaging on the subject according to some embodiments of the present disclosure.
- trajectory information for placing a surgical instrument into a subject may be determined based on at least one image (which may be acquired by performing a preoperative imaging on the subject) of the subject, a laser beam may be emitted to visualize at least a part of the trajectory information, and then the surgical instrument may be placed into the subject based at least in part on the laser beam and/or the trajectory information.
- at least one second image may be acquired by performing an intraoperative imaging on the subject for monitoring position information of the surgical instrument in the subject.
- a point 820 represents a planned entry point in the trajectory information
- a point 830 represents a planned target point in the trajectory information
- a line segment 840 connecting the point 820 and the point 830 represents a planned trajectory in the trajectory information
- an angle α between the line segment 840 and the y-axis represents a planned entry angle in the trajectory information.
- the surgical instrument may be placed into the subject based on the point 820, the point 830, the line segment 840, and the angle α.
- a start point (e.g., an end of the surgical instrument near the subject) of the surgical instrument is placed at the point 820, an orientation of the surgical instrument is made to coincide with an orientation of the laser beam (i.e., the other end of the surgical instrument distal to the subject is placed along the laser beam), and then the surgical instrument enters the subject.
- a line segment 850 represents the position information of the surgical instrument in the subject during the surgical treatment. It can be seen that the position information of the surgical instrument deviates from the trajectory information; for example, an actual entry angle β of the surgical instrument deviates from the planned entry angle α. In order to accurately place the surgical instrument, the position information of the surgical instrument may be adjusted immediately to make the actual entry angle equal to the planned entry angle.
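- A minimal sketch of this in-image angle comparison (the pixel coordinates below are hypothetical; the angle is measured against the y-axis, matching the convention of FIG. 8 described above):

```python
import math

def entry_angle_deg(entry_xy, inner_xy):
    """Angle, in degrees, between the segment from the entry point to the
    inner point and the y-axis of the image."""
    dx = inner_xy[0] - entry_xy[0]
    dy = inner_xy[1] - entry_xy[1]
    return math.degrees(math.atan2(abs(dx), abs(dy)))

alpha = entry_angle_deg((120, 40), (150, 160))  # planned segment 840
beta = entry_angle_deg((120, 40), (165, 158))   # actual segment 850
correction_deg = alpha - beta                   # rotate the instrument by this
```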
- aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented as entirely hardware, entirely software (including firmware, resident software, and micro-code), or a combination of software and hardware, which may all generally be referred to herein as a “unit,” “module,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.
- a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof.
- a computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
- Program code embodied on a computer-readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
- Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, or Python, conventional procedural programming languages such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP, dynamic programming languages such as Python, Ruby, and Groovy, or other programming languages.
- the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS) .
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Robotics (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Pathology (AREA)
- Medical Treatment And Welfare Office Work (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Abstract
A system for assisting in placing a surgical instrument may include an imaging component (410), a trajectory determination component (420), and a control component (430). The imaging component (410) and the control component (430) may correspond to a same coordinate system. The imaging component (410) may acquire at least one image of a subject (440). The trajectory determination component (420) may determine trajectory information for placing a surgical instrument into the subject (440) based on a reference entry point and a reference target point in the at least one image. The control component (430) may include a laser emitting unit configured to emit a laser beam towards the subject (440) based on the trajectory information. An orientation of the laser beam may coincide with an orientation associated with the trajectory information. The laser beam may be used as a reference for placing the surgical instrument into the subject (440).
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2020/113413 WO2022047720A1 (fr) | 2020-09-04 | 2020-09-04 | Systèmes et procédés d'aide à la mise en place d'un instrument chirurgical dans un sujet |
| CN202080103792.9A CN116096322A (zh) | 2020-09-04 | 2020-09-04 | 用于辅助将手术器械放置到对象中的系统和方法 |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2020/113413 WO2022047720A1 (fr) | 2020-09-04 | 2020-09-04 | Systèmes et procédés d'aide à la mise en place d'un instrument chirurgical dans un sujet |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022047720A1 true WO2022047720A1 (fr) | 2022-03-10 |
Family
ID: 80492393
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2020/113413 Ceased WO2022047720A1 (fr) | 2020-09-04 | 2020-09-04 | Systèmes et procédés d'aide à la mise en place d'un instrument chirurgical dans un sujet |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN116096322A (fr) |
| WO (1) | WO2022047720A1 (fr) |
Family Cites Families (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5792215A (en) * | 1995-09-18 | 1998-08-11 | The University Of Virginia Patent Foundation | Laser integrated targeting and entry system and method |
| US6041249A (en) * | 1997-03-13 | 2000-03-21 | Siemens Aktiengesellschaft | Device for making a guide path for an instrument on a patient |
| US6021342A (en) * | 1997-06-30 | 2000-02-01 | Neorad A/S | Apparatus for assisting percutaneous computed tomography-guided surgical activity |
| EP2004083A2 (fr) * | 2006-03-31 | 2008-12-24 | Koninklijke Philips Electronics N.V. | Systeme de chirurgie guide par l'image |
| US20090281452A1 (en) * | 2008-05-02 | 2009-11-12 | Marcus Pfister | System and method for a medical procedure using computed tomography |
| US20170296273A9 (en) * | 2014-03-17 | 2017-10-19 | Roy Anthony Brown | Surgical Targeting Systems and Methods |
| US20200246085A1 (en) * | 2015-09-28 | 2020-08-06 | Koninklijke Philips N.V. | Optical registation of a remote center of motion robot |
| US20170354387A1 (en) * | 2016-06-08 | 2017-12-14 | General Electric Company | Fluoroscopic Guidance System With Offset Light Source and Method Of Use |
| CN108135563B (zh) * | 2016-09-20 | 2021-12-03 | 桑托沙姆·罗伊 | 光和阴影引导的针定位系统和方法 |
| CN206612845U (zh) * | 2016-12-12 | 2017-11-07 | 张昊 | 非血管经皮介入诊疗手术机器人 |
| EP3760127B1 (fr) * | 2017-09-25 | 2024-03-20 | Shanghai United Imaging Healthcare Co., Ltd. | Système et procédé d'emplacement d'un sujet cible |
| CN109223176B (zh) * | 2018-10-26 | 2021-06-25 | 中南大学湘雅三医院 | 一种手术规划系统 |
| CN114469343B (zh) * | 2019-10-31 | 2023-06-23 | 武汉联影智融医疗科技有限公司 | 标定件、手术导航坐标系配准系统、方法、设备和介质 |
2020
- 2020-09-04: CN application CN202080103792.9A filed (published as CN116096322A, status: pending)
- 2020-09-04: PCT application PCT/CN2020/113413 filed (published as WO2022047720A1, status: ceased)
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106413621A (zh) * | 2013-09-18 | 2017-02-15 | 伊美格医药公司 | 光学靶向和轨迹可视化 |
| US20180286135A1 (en) * | 2014-03-14 | 2018-10-04 | Synaptive Medical (Barbados) Inc | System and method for projected tool trajectories for surgical navigation systems |
| CN108472096A (zh) * | 2015-12-31 | 2018-08-31 | 史赛克公司 | 用于在由虚拟对象限定的目标部位处对患者执行手术的系统和方法 |
| CN111329553A (zh) * | 2016-03-12 | 2020-06-26 | P·K·朗 | 用于手术的装置与方法 |
| US20180333208A1 (en) * | 2017-05-17 | 2018-11-22 | General Electric Company | Guidance system for needle procedures |
| CN109481018A (zh) * | 2018-12-29 | 2019-03-19 | 上海联影医疗科技有限公司 | 一种应用在医疗操作中的导航设备及方法 |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2023247444A1 (fr) * | 2022-06-24 | 2023-12-28 | B. Braun New Ventures GmbH | Robot de guidage laser servant à projeter visuellement un guide sur un plan de chirurgie, procédé de projection et système de robot de guidage laser |
Also Published As
| Publication number | Publication date |
|---|---|
| CN116096322A (zh) | 2023-05-09 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20951967; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20951967; Country of ref document: EP; Kind code of ref document: A1 |