WO2025120063A1 - Model of a joint of a human in a mixed reality environment - Google Patents
Model of a joint of a human in a mixed reality environment
- Publication number
- WO2025120063A1 (application PCT/EP2024/084880)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- apparatus member
- orientation
- head
- data processing
- processing device
- Prior art date
- Legal status
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
- G09B23/30—Anatomical models
- G09B23/32—Anatomical models with moving parts
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
Definitions
- the invention relates to a method for determining a position and orientation of an apparatus relative to a head-worn device, a data processing device, a model of a joint of a human, and a system.
- Mixed reality refers to environments or systems that mix a user's natural perception with a computer-generated (virtual) perception.
- Mixed reality systems can comprise a head-worn device and an apparatus, wherein the user can haptically perceive and modify the apparatus, and the head-worn device comprises a display for displaying virtual information related to the apparatus.
- the head-worn device may be transparent, so that the user can still see the apparatus and the environment, but real visual perceptions of the user may be augmented with virtual information displayed by the head-worn device.
- the head-worn device may be nontransparent and configured to display real information as well as virtual information, wherein the real information may be obtained by means of a camera that may be integrated in the head-worn device.
- the apparatus may be, for example, an industrial robotic system, a machine, or a vehicle.
- the method for determining the position and orientation of the apparatus relative to the head-worn device, the data processing device, and the mixed reality system may be used in various other applications, including medical applications.
- mixed reality systems can be used for the training of physicians and medical staff to physically examine and/or treat pathologies.
- knee joints comprise complex ligamentous structures that have a decisive influence on the biomechanics of the joint.
- a realistic knee joint model was developed that allows a user to haptically detect pathologies such as anterior cruciate ligament rupture, posterior cruciate ligament rupture, and medial ligament rupture by means of an internal ligament mechanism.
- the knee joint model is embedded in a mixed reality system, which displays virtual information to a user so that the user can observe internal pathology mechanisms in any joint position.
- the mixed reality system may provide an improved learning experience for a user such as a physician or medical staff.
- the mixed reality system requires an accurate determination of the position and orientation of the knee joint model relative to the head-worn device of the mixed reality system, so that virtual information of the patient can be displayed correctly by the head-worn device.
- a method for determining a position and orientation of an apparatus relative to a head-worn device is provided.
- the head-worn device has a camera, and a positioning signal receiver for receiving a positioning signal.
- the method comprises receiving, by a data processing device, first image data from the camera of the head-worn device, wherein the first image data shows a marker that is arranged on a surface of a first apparatus member of the apparatus.
- the data processing device determines a position and orientation of the first apparatus member relative to the head-worn device based on the first image data and based on geometry information of the marker. Further, the data processing device receives first data from the positioning signal receiver of the head-worn device, wherein the first data received from the positioning signal receiver indicates a first position and orientation of the head-worn device at a first time.
- the data processing device also receives second data from the positioning signal receiver of the head-worn device, wherein the second data received from the positioning signal receiver indicates a second position and orientation of the head-worn device at a second time that is after the first time.
- the data processing device then updates the position and orientation of the first apparatus member relative to the head-worn device based on the first data indicating the first position and orientation of the head-worn device at the first time and based on the second data indicating the second position and orientation of the head-worn device at the second time.
- the head-worn device is configured to be worn on the head of a user.
- the head-worn device comprises a display, and the head-worn device may be an augmented or virtual reality headset such as augmented or virtual reality glasses.
- the data processing device comprises one or more processors and one or more memories.
- the data processing device may be integrated in the head-worn device, or the data processing device may be separate from the head-worn device.
- the data processing device may be implemented in a distributed manner comprising a plurality of data processing subsystems. One or more of these data processing subsystems may be integrated in the head-worn device.
- the data processing device is communicatively coupled to the display, the camera, and the positioning signal receiver of the head-worn device, wherein communication connections between the data processing device and the display, the camera, and the positioning signal receiver of the head-worn device may be wireless and/or wireline.
- the apparatus may comprise a model of a part of a human being such as a joint or the heart. However, the disclosure of this patent application is not limited to medical models. For example, the apparatus may alternatively be an industrial robot, a machine, or a vehicle.
- the data processing device receives the first image data from the camera of the head-worn device, wherein the first image data shows a marker that is arranged on the surface of the first apparatus member of the apparatus.
- a plurality of markers may be arranged on the surface of the first apparatus member.
- the first image data may show one or more markers of the plurality of markers that are arranged on the surface of the first apparatus member, whereas other markers of the plurality of markers that are arranged on the surface of the first apparatus member may not be depicted by the first image data.
- the data processing device may determine the position and orientation of the first apparatus member relative to the head-worn device based on the first image data and based on geometry information of the one or more markers that are shown on the first image data, wherein the geometry information may comprise information about shapes, colors, and/or distances of the one or more markers, for example.
- Some markers may be QR codes.
- positions and orientations relative to the head-worn device may be positions and orientations relative to the display of the head-worn device.
- the position and orientation of the first apparatus member relative to the head-worn device may be determined by the data processing device in a first coordinate system, which may be centered on or fixed relative to the head-worn device, so that the first coordinate system moves and/or rotates together with the head-worn device.
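- A minimal sketch of this marker-based step is given below, under the assumption of a single square marker of known side length and a calibrated camera; the marker side length and the calibration inputs (camera_matrix, dist_coeffs) are illustrative assumptions, not values from this disclosure:

```python
# Illustrative sketch (not part of the disclosure): pose of a square marker
# in the camera's (first) coordinate system via OpenCV's solvePnP.
import cv2
import numpy as np

MARKER_SIDE = 0.05  # assumed side length of the square marker in metres

# Marker corners in the marker's own frame (z = 0 plane), acting as the
# geometry information of the marker.
OBJECT_POINTS = np.array([
    [-MARKER_SIDE / 2,  MARKER_SIDE / 2, 0.0],
    [ MARKER_SIDE / 2,  MARKER_SIDE / 2, 0.0],
    [ MARKER_SIDE / 2, -MARKER_SIDE / 2, 0.0],
    [-MARKER_SIDE / 2, -MARKER_SIDE / 2, 0.0],
])

def marker_pose_in_camera(corners_px, camera_matrix, dist_coeffs):
    """corners_px: 4x2 pixel coordinates of the marker corners detected in
    the first image data; returns a 4x4 homogeneous transform."""
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS, corners_px,
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("marker pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 matrix
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = tvec.ravel()
    return pose  # marker (and thus first-member) pose in the camera frame
```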
- the head-worn device comprises a positioning signal receiver for receiving positioning signals.
- the positioning signals may be transmitted by a plurality of base stations.
- the positioning signal receiver may determine the first data based on positioning signals received at or close to the first time, and the positioning signal receiver may determine the second data based on positioning signals received at or close to the second time.
- the positioning signal receiver may determine parameters of the positioning signals such as times of arrival, angles of arrival, etc.
- the base stations may for example be base stations of the Valve lighthouse system as described in “DronOS: A flexible open-source prototyping framework for interactive drone routines” by M. Hoppe et al., Int. Conf. on Mobile and Ubiquitous Multimedia (MUM), ACM, 2019, pp. 15:1–15:7.
- the base stations may also be referred to as lighthouses.
- the method is equally applicable to other positioning systems and associated positioning signal receivers including positioning systems based on times of arrival, directions of arrival, etc.
- the first data provided by the positioning signal receiver indicates a first position and orientation of the head-worn device at a first time.
- the first position and orientation may be a position and orientation in a base coordinate system, wherein the base coordinate system may be fixed relative to the positions of the base stations that transmit the positioning signals.
- the second position and orientation may be a position and orientation of the head-worn device in the base coordinate system.
- the first image data is preferably captured by the camera at or close to the first time.
- the position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system and the first position and orientation of the head-worn device in the base coordinate system may be determined for approximately the same time instant, that is, the first time. This makes it possible to determine the position and orientation of the first apparatus member in the base coordinate system.
- the position and orientation of the first apparatus member in the base coordinate system may be determined based on the first data indicating the position and orientation of the head-worn device in the base coordinate system at the first time and based on the position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system as determined based on the first image data.
- the position and orientation of the first apparatus member in the base coordinate system does not change, or changes only infrequently. Then, the position and orientation of the first apparatus member relative to the head-worn device can be updated based on the first data indicating the first position and orientation of the head-worn device at the first time and based on the second data indicating the second position and orientation of the head-worn device at the second time.
- the updating of the position and orientation of the first apparatus member relative to the head-worn device can be implemented based on the position and orientation of the first apparatus member in the base coordinate system and based on the second data indicating the second position and orientation of the head-worn device at the second time in the base coordinate system.
- the position and orientation of the first apparatus member in the base coordinate system is assumed to be static or semi-static.
- the updating of the position and orientation of the first apparatus member relative to the head-worn device can be implemented in different ways.
- the present patent application is not limited to the above manner for updating, by the data processing device, the position and orientation of the first apparatus member relative to the head-worn device.
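- One possible implementation of the update is sketched below, under the assumption that all poses are available as 4x4 homogeneous transforms: the first-member pose is anchored in the base coordinate system at the first time and then re-expressed relative to the head-worn device at the second time. The function and variable names are illustrative:

```python
# Illustrative sketch (one possible implementation): update the pose of the
# first apparatus member relative to the head-worn device using only the
# positioning data at the first and second times. All inputs are 4x4
# homogeneous transforms.
import numpy as np

def update_member_pose(member_in_hwd_t1, hwd_in_base_t1, hwd_in_base_t2):
    # Anchor the first apparatus member in the base coordinate system,
    # combining the first positioning data with the image-based pose.
    member_in_base = hwd_in_base_t1 @ member_in_hwd_t1
    # Assuming the member is static (or semi-static) in the base frame,
    # re-express it relative to the head-worn device at the second time.
    return np.linalg.inv(hwd_in_base_t2) @ member_in_base
```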
- the updating of the position and orientation of the first apparatus member relative to the head-worn device does not require that the data processing device receives image data, captured by the camera of the head-worn device at the second time, that depicts a marker on the surface of the first apparatus member. This is relevant because images captured by the camera may not always show a marker that is arranged on the surface of the first apparatus member.
- if the camera captures further image data showing a marker at or close to the second time, the data processing device may determine a second position and orientation of the first apparatus member relative to the head-worn device based on this further image data. Further, the data processing device may update the position and orientation of the first apparatus member in the base coordinate system based on the second data indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the determined second position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system.
- the apparatus further comprises a second apparatus member that is movably mounted on the first apparatus member.
- the apparatus is configured such that a movement of the second apparatus member relative to the first apparatus member is limited to a curve or a surface.
- the second apparatus member comprises a first sensor.
- the method further comprises receiving, by the data processing device, data from the first sensor, wherein the data received from the first sensor indicates a first position or a first orientation of the second apparatus member.
- the data processing device determines a position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device, based on geometry information about the apparatus, and based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member.
- the first sensor may be disposed on or inside the second apparatus member.
- the first sensor may be an accelerometer, a gyroscope, or a magnetometer.
- the term ‘gyroscope’ is to be understood here and in the sequel as any sensor configured to measure an orientation, an angular rate, or an angular acceleration.
- the apparatus is configured to limit the movement of the second apparatus member relative to the first apparatus member to a manifold, wherein the manifold is a curve or a surface.
- the curve may be a section of a line or a section of a circle, for example.
- the surface may be a section of a plane or a section of a sphere, for example. Due to the limitation of the movement of the second apparatus member relative to the first apparatus member, the data processing device can efficiently and/or accurately determine the position and orientation of the second apparatus member relative to the first apparatus member based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member and based on the geometry information about the apparatus.
- the data processing device may determine the position and orientation of the second apparatus member relative to the first apparatus member in a second coordinate system that is centered on or fixed relative to the first apparatus member.
- the geometry information about the apparatus may be or may include information about the curve or the surface. That is, the geometry information about the apparatus includes information related to the limitation of the movement of the second apparatus member relative to the first apparatus member.
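- As an illustration of how such a constraint can be exploited, the sketch below (an assumption-laden example, not taken from this disclosure) projects a noisy sensor-derived position onto a constraint circle of known centre, radius, and plane normal in the first member's coordinate system:

```python
# Illustrative sketch (assumed constraint handling): project a noisy
# position estimate onto a circle with known centre, radius, and unit
# plane normal, all expressed in the first member's coordinate system.
import numpy as np

def project_onto_circle(point, centre, radius, normal):
    v = point - centre
    v_in_plane = v - np.dot(v, normal) * normal  # remove out-of-plane part
    norm = np.linalg.norm(v_in_plane)
    if norm < 1e-9:
        raise ValueError("projection is ambiguous on the circle axis")
    return centre + radius * v_in_plane / norm
```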
- the data processing device determines the position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device, based on the geometry information about the apparatus, and based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member. This may be implemented by determining the position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device and based on the position and orientation of the second apparatus member relative to the first apparatus member.
- the data processing device preferably determines the position and orientation of the second apparatus member relative to the head-worn device in the first coordinate system.
- the position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the second apparatus member relative to the display of the head-worn device.
- the data processing device may further determine the position and orientation of the second apparatus member in the base coordinate system based on the position and orientation of the first apparatus member in the base coordinate system, based on the geometry information about the apparatus, and based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member. This may be implemented by determining the position and orientation of the second apparatus member in the base coordinate system based on the position and orientation of the first apparatus member in the base coordinate system and based on the position and orientation of the second apparatus member relative to the first apparatus member.
- the data processing device determines the position and orientation of the second apparatus member relative to the head-worn device in the first coordinate system based on the second data received from the positioning signal receiver indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the position and orientation of the second apparatus member in the base coordinate system.
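- For the common special case of a hinge-like mounting, the following sketch composes the second-member pose from the updated first-member pose and a rotation angle derived from the first sensor; the hinge pivot and axis stand in for the geometry information about the apparatus, and all names are illustrative:

```python
# Illustrative sketch (hinge assumption): the second member rotates about a
# fixed axis through a pivot point in the first member's frame; the hinge
# angle comes from the first sensor (e.g. a gyroscope).
import numpy as np

def rotation_about_axis(axis, angle):
    """Rodrigues' formula for a rotation of `angle` about unit `axis`."""
    axis = axis / np.linalg.norm(axis)
    k = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * k + (1.0 - np.cos(angle)) * (k @ k)

def second_member_in_hwd(member1_in_hwd, pivot, axis, hinge_angle):
    # Pose of the second member relative to the first member: a rotation of
    # hinge_angle about the hinge axis through the pivot point.
    rel = np.eye(4)
    rel[:3, :3] = rotation_about_axis(axis, hinge_angle)
    rel[:3, 3] = pivot - rel[:3, :3] @ pivot
    # Chain with the updated first-member pose relative to the headset.
    return member1_in_hwd @ rel
```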
- the second apparatus member may comprise a plurality of first sensors comprising one or more gyroscopes, one or more accelerometers, and/or one or more magnetometers.
- the plurality of first sensors may be disposed on or inside the second apparatus member.
- the data processing device may receive data from the plurality of first sensors, wherein the data received from the plurality of first sensors indicates the first position and/or the first orientation of the second apparatus member.
- in an implementation, the second apparatus member is rotatably mounted on the first apparatus member, the first sensor is a first gyroscope, and the data received by the data processing device from the first sensor indicates the first orientation of the second apparatus member.
- the determining, by the data processing device, a position and orientation of the second apparatus member relative to the head-worn device comprises: determining, by the data processing device, the position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device, based on geometry information about the apparatus, and based on the data received from the first gyroscope indicating the first orientation of the second apparatus member.
- the apparatus may be configured to limit the movement of the second apparatus member relative to the first apparatus member to at least a section of a circle or a section of a sphere.
- the geometry information about the apparatus may be or may include information about the circle, the sphere, or the section thereof.
- the apparatus further comprises a third apparatus member, the third apparatus member is attached to the second apparatus member, and the third apparatus member comprises a second gyroscope and a second accelerometer.
- the data processing device receives data from the second gyroscope, wherein the data received from the second gyroscope indicates an orientation of the third apparatus member.
- the data processing device further receives data from the second accelerometer, wherein the data received from the second accelerometer indicates an acceleration of the third apparatus member.
- the data processing device determines a position and orientation of the third apparatus member relative to the second apparatus member based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member.
- the data processing device determines a position and orientation of the third apparatus member relative to the head-worn device based on the position and orientation of the second apparatus member relative to the head-worn device and based on the position and orientation of the third apparatus member relative to the second apparatus member.
- the first gyroscope of the second apparatus member, the second gyroscope of the third apparatus member, and the second accelerometer of the third apparatus member can enable an accurate determination of the position and orientation of the third apparatus member relative to the second apparatus member.
- the data processing device may determine the position and orientation of the third apparatus member relative to the second apparatus member in a third coordinate system that is centered on or fixed relative to the second apparatus member. Further, the data processing device may determine the position and orientation of the third apparatus member relative to the head-worn device in the first coordinate system. The position and orientation of the third apparatus member relative to the head-worn device may be the position and orientation of the third apparatus member relative to the display of the head-worn device.
- the data processing device may further determine the position and orientation of the third apparatus member in the base coordinate system based on the position and orientation of the second apparatus member in the base coordinate system, based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member.
- the position and orientation of the third apparatus member in the base coordinate system may be determined, by the data processing device, based on the position and orientation of the second apparatus member in the base coordinate system and based on the position and orientation of the third apparatus member relative to the second apparatus member.
- the data processing device may determine the position and orientation of the third apparatus member relative to the head-worn device in the first coordinate system based on the second data received from the positioning signal receiver indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the position and orientation of the third apparatus member in the base coordinate system.
- the second apparatus member further comprises a first accelerometer.
- the data processing device receives data from the first accelerometer, wherein the data received from the first accelerometer indicates an acceleration of the second apparatus member.
- the data processing device determines the position and orientation of the third apparatus member relative to the second apparatus member further based on the data received from the first accelerometer indicating the acceleration of the second apparatus member.
- the data processing device determines the position and orientation of the third apparatus member relative to the second apparatus member based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the first accelerometer indicating the acceleration of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member, thereby providing an improved accuracy of the determined position and orientation of the third apparatus member relative to the second apparatus member.
- the position and orientation of the third apparatus member relative to the second apparatus member may be determined in the third coordinate system, and the determined position and orientation of the third apparatus member relative to the second apparatus member may be used, as explained above, to determine the position and orientation of the third apparatus member relative to the head-worn device, and/or to determine the position and orientation of the third apparatus member in the base coordinate system.
- the first accelerometer may be disposed on or inside the second apparatus member.
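- A minimal sketch of one way to combine these sensor readings is given below, assuming each gyroscope has been integrated to a tilt estimate, each accelerometer measures gravity while the members are quasi-static, and both member orientations are expressed in a common reference frame; the complementary-filter weight is an assumed value:

```python
# Illustrative sketch (assumed fusion scheme, not prescribed by the
# disclosure): accelerometer-derived tilt curbs the drift of the
# gyroscope-derived tilt, and the two member orientations then yield the
# relative orientation of the third member with respect to the second.
import numpy as np

ALPHA = 0.98  # assumed complementary-filter weight for the gyroscope path

def tilt_from_accel(accel):
    """Roll and pitch implied by the measured gravity vector."""
    ax, ay, az = accel / np.linalg.norm(accel)
    return np.arctan2(ay, az), np.arctan2(-ax, np.hypot(ay, az))

def fused_tilt(gyro_tilt, accel):
    """Blend integrated-gyroscope tilt with accelerometer tilt."""
    return tuple(ALPHA * g + (1.0 - ALPHA) * a
                 for g, a in zip(gyro_tilt, tilt_from_accel(accel)))

def relative_orientation(r_second, r_third):
    """3x3 rotation of the third member relative to the second member,
    given both orientations in a common reference frame."""
    return r_second.T @ r_third
```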
- a translation sensor is disposed on the second apparatus member or the third apparatus member, wherein the translation sensor is a flex sensor or an optical distance sensor.
- the data processing device receives data from the translation sensor, wherein the data received from the translation sensor indicates a translation of the third apparatus member relative to the second apparatus member.
- the data processing device may then determine the position and orientation of the third apparatus member relative to the second apparatus member further based on the data received from the translation sensor indicating the translation of the third apparatus member relative to the second apparatus member.
- the data processing device determines the position and orientation of the third apparatus member relative to the second apparatus member based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the first accelerometer indicating the acceleration of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, based on the data received from the second accelerometer indicating the acceleration of the third apparatus member, and based on the data received from the translation sensor indicating the translation of the third apparatus member relative to the second apparatus member, thereby providing a further improved accuracy of the determined position and orientation of the third apparatus member relative to the second apparatus member.
- the translation sensor may be any mechanical sensor configured to measure a displacement.
- the translation sensor is a strain sensor, configured such that a deflection of the sensor causes a change in its electrical resistance.
- the translation sensor may be configured to detect translational movements between the second and third apparatus members.
- the translation sensor may be mounted in a position such that rotations of the third apparatus member relative to the second apparatus member cause no or small deflections of the translation sensor, whereas translations of the third apparatus member relative to the second apparatus member cause large deflections of the translation sensor.
- the translation sensor may be used to resolve ambiguities in the determination of the position and orientation of the third apparatus member relative to the second apparatus member.
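- The sketch below illustrates one way to read such a flex sensor through a voltage divider; all electrical values and the linear calibration constants are assumptions for illustration:

```python
# Illustrative sketch: read a flex (strain) translation sensor through a
# voltage divider. All electrical values and the linear calibration
# constants below are assumptions.
R_FIXED = 10_000.0     # fixed divider resistor in ohms (assumed)
V_SUPPLY = 3.3         # supply voltage in volts (assumed)
R_REST = 25_000.0      # assumed sensor resistance with no deflection
K_MM_PER_OHM = 0.004   # assumed calibration: millimetres per ohm

def translation_mm(adc_voltage):
    """Map the measured divider voltage to a translation of the third
    member relative to the second member (resistance rises on deflection)."""
    r_flex = R_FIXED * (V_SUPPLY - adc_voltage) / adc_voltage
    return (r_flex - R_REST) * K_MM_PER_OHM
```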
- the apparatus comprises a model of a joint of a human.
- the second apparatus member comprises a first bone model, wherein the first bone model is a model of a first bone of the joint.
- the third apparatus member comprises a second bone model, and the second bone model is a model of a second bone of the joint.
- the model of the joint may be a model of a knee joint.
- for example, the second apparatus member may comprise a femur model, and the third apparatus member may comprise a tibia model and, optionally, a fibula model.
- alternatively, the second apparatus member may comprise a tibia model and, optionally, a fibula model, and the third apparatus member may comprise a femur model.
- the model of the joint may be adapted for the training of physicians or medical staff in the examination or treatment of human joints.
- the determined position and orientation of the second apparatus member relative to the head-worn device is preferably a position and orientation of the first bone model relative to the head-worn device. More specifically, the determined position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the first bone model relative to the display of the head-worn device.
- the determined position and orientation of the third apparatus member relative to the head-worn device is preferably a position and orientation of the second bone model relative to the head-worn device or relative to the display of the head-worn device.
- the first apparatus member may be a frame, and the model of the joint may be mounted on the frame.
- the first bone model may be mounted rotatably on the frame.
- the data processing device receives second image data from the camera of the head-worn device.
- the data processing device augments the second image data with virtual representations of anatomical structures of a virtual patient, wherein the augmenting of the second image data is based on the determined position and orientation of the second apparatus member relative to the head-worn device and based on the determined position and orientation of the third apparatus member relative to the head-worn device.
- the data processing device further sends the augmented second image data to the head-worn device for being displayed.
- the head-worn device may not be transparent.
- the anatomical structures of the virtual patient may comprise portions of skin, one or more bones, one or more ligaments, and/or one or more muscles of the virtual patient.
- the one or more bones may comprise the first and second bones mentioned before.
- the anatomical structures of the virtual patient are rendered by using the determined position and orientation of the second apparatus member relative to the head-worn device and the determined position and orientation of the third apparatus member relative to the head-worn device.
- the determined position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the first bone model relative to the head-worn device
- the determined position and orientation of the third apparatus member relative to the head-worn device may be the position and orientation of the second bone model relative to the head-worn device
- the augmentation of the second image data with virtual representations of anatomical structures of the virtual patient can improve a learning experience of a physician or medical staff, who uses the model of the joint to train the examination and/or treatment of pathologies, because the physician or medical staff can not only haptically perceive a pathology using the model of the joint, but can also observe internal anatomical structures such as bones, ligaments, and/or muscles of the virtual patient.
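- A minimal sketch of the geometric core of this augmentation is given below, assuming each anatomical structure is stored as a vertex mesh in the frame of the bone model it follows; the determined member poses (4x4 transforms) move the vertices into the head-worn-device frame before rendering. All names are illustrative:

```python
# Illustrative sketch: move anatomy meshes into the head-worn-device frame
# using the determined member poses before rendering over the image data.
import numpy as np

def place_structure(vertices, member_pose_in_hwd):
    """vertices: Nx3 points in a bone-model frame; member_pose_in_hwd:
    4x4 pose of that member relative to the head-worn device.
    Returns Nx3 points in the head-worn-device (display) frame."""
    homogeneous = np.hstack([vertices, np.ones((len(vertices), 1))])
    return (homogeneous @ member_pose_in_hwd.T)[:, :3]

# e.g. a virtual femur can follow the second apparatus member and a
# virtual tibia the third apparatus member (poses as determined above):
# femur_hwd = place_structure(femur_vertices, pose_member2_in_hwd)
# tibia_hwd = place_structure(tibia_vertices, pose_member3_in_hwd)
```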
- the data processing device may further augment the second image data with a menu.
- the menu may allow a user to configure the augmentation of the second image data.
- the menu may allow selecting the anatomical structures of the virtual patient that are to be augmented on the second image data.
- the menu may allow selecting between different pathology settings, which may be, in the example of a knee joint, anterior cruciate ligament rupture, posterior cruciate ligament rupture, or medial ligament rupture.
- the menu may allow selecting an exam mode with restricted functions and choices for the user.
- the second image data may be augmented with anamneses instead of anatomical structures of the virtual patient.
- the menu may further allow selecting between different modes such as a mode for the training of beginners and another mode for the training of advanced learners, wherein the second image data may be augmented with different subsets of anatomical structures of the virtual patient in these different modes.
- a user input may be determined by tracking the hands of the user. This may be implemented, for example, by means of the Ultraleap technology provided by LeapMotion Inc., San Francisco, USA.
- the second image data may also be augmented with virtual representations of the hands of the user.
- the menu navigation may be implemented by means of gesture recognition.
- the present disclosure is not limited to user inputs based on an augmented menu and the tracking of the hands of the user.
- Various other input manners may be utilized, such as touch screens, voice recognition, keyboards, mice, etc.
- the data processing device computes third image data by rendering virtual representations of anatomical structures of a virtual patient, wherein the anatomical structures of the virtual patient are rendered based on the determined position and orientation of the second apparatus member relative to the head-worn device and based on the determined position and orientation of the third apparatus member relative to the head-worn device.
- the data processing device further sends the third image data to the head-worn device for being displayed.
- the head-worn device may be at least partially transparent.
- real direct visual perceptions of the user may be augmented with the third image data to improve the learning experience of the user.
- the anatomical structures of the virtual patient may comprise portions of skin, one or more bones, one or more ligaments, and/or one or more muscles of the virtual patient, wherein the one or more bones may comprise the first and second bones mentioned before.
- the anatomical structures of the virtual patient are rendered by using the determined position and orientation of the second apparatus member relative to the head-worn device and the determined position and orientation of the third apparatus member relative to the head-worn device.
- the determined position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the first bone model relative to the head-worn device
- the determined position and orientation of the third apparatus member relative to the head-worn device may be the position and orientation of the second bone model relative to the head-worn device.
- the third image data may further comprise a visualization of a menu similar to the one described before for the augmented second image data. Details are not repeated here.
- a user input may be determined by tracking the hands of the user while the third image data including the menu is displayed by the display of the head-worn device.
- the menu navigation may be implemented using gesture recognition.
- the present disclosure is not limited to the user inputs based on displaying a menu and tracking the hands of the user.
- Various other input manners may be utilized, such as touch screens, voice recognition, keyboards, mice, etc.
- the positioning signal receiver receives a first positioning signal from a first base station and a second positioning signal from a second base station, and the positioning signal receiver determines the first data based on the first positioning signal and the second positioning signal. Further, the positioning signal receiver sends the first data to the data processing device.
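- For illustration only, the sketch below reduces the receiver-side computation to two dimensions and to angle-of-arrival measurements: each base station at a known position reports the bearing at which it sees the receiver (in a lighthouse-style system, such bearings follow from sweep timings), and the receiver position is the intersection of the two bearing rays. All names are illustrative:

```python
# Illustrative 2D sketch: intersect two bearing rays from base stations at
# known positions to obtain the receiver position.
import numpy as np

def receiver_position_2d(p1, theta1, p2, theta2):
    """p1, p2: 2D base-station positions; theta1, theta2: bearings (rad)
    from each base station towards the receiver."""
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 and t2.
    a = np.column_stack([d1, -d2])
    t = np.linalg.solve(a, np.asarray(p2) - np.asarray(p1))
    return np.asarray(p1) + t[0] * d1
```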
- a data processing device is provided, wherein the data processing device is adapted to perform the method of the first aspect or any possible implementation of this method.
- a computer program comprises instructions for causing the data processing device of the second aspect to carry out the method of the first aspect or any of the possible implementation of this method.
- a computer-readable medium is presented, wherein the computer-readable medium stores the computer program of the third aspect.
- a model of a joint of a human comprises a first bone model, wherein the first bone model is a model of a first bone of the joint.
- the model of the joint further comprises a second bone model, wherein the second bone model is a model of a second bone of the joint.
- the model of the joint further comprises one or more first ligament models, wherein the one or more first ligament models are models of first ligaments connecting the first bone and the second bone of the joint.
- the model of the joint further comprises an apparatus for tensioning the one or more first ligament models such that a first end section of the second bone model is pulled towards a first end section of the first bone model.
- the apparatus for tensioning the one or more first ligament models is attached to the first bone model.
- the one or more first ligament models are attached to the first end section of the second bone model.
- the first bone model has one or more first apertures at its first end section, and the one or more first ligament models extend from the first end section of the second bone model through the one or more first apertures to the apparatus for tensioning the one or more first ligament models.
- the model of the joint may be a model of a knee joint.
- the first bone model may be a femur model, and the second bone model may be a tibia model, wherein the second bone model may further comprise a fibula model.
- alternatively, the first bone model may be a tibia model, and the second bone model may be a femur model, wherein the first bone model may further comprise a fibula model.
- the first bone model and the second bone model may each comprise a bone base structure, which may be reinforced by means of reinforcement ribs and/or fiberglass mesh. Additionally, for each of the first and second bone models, a tube such as an aluminum tube may be used to improve a mechanical strength, wherein the tube may be laminated using fiberglass mesh.
- the one or more first ligament models may be implemented using cotton textiles.
- the cotton textiles can be flat woven.
- the cotton textiles may be laid in a sheath, similar to a Bowden cable sheath.
- the first ligament models may be bendable and compressible.
- the first ligament models are preferably not elastic.
- the joint model may be embedded in layers of a flexible polyurethane foam and silicone, wherein these layers may be configured to provide realistic haptic representations of soft tissues such as muscles and skin structures.
- the tensions of the one or more first ligament models may be set manually using the apparatus for tensioning the one or more first ligament models.
- the apparatus for tensioning the one or more first ligament models may comprise one or more actuators for tensioning the one or more first ligament models.
- the apparatus for tensioning the one or more first ligament models may be attached to a part of the first bone model that is remote from the first end section of the first bone model.
- the first end section of the first bone model may comprise a separate first aperture for each first ligament model of the one or more first ligament models so that the one or more first ligament models extend through separate first apertures at the first end section of the first bone model.
- several first ligament models of the one or more first ligament models may extend through a single first aperture of the one or more first apertures.
- the model of the joint further comprises one or more second ligament models and an apparatus for tensioning the one or more second ligament models.
- the one or more second ligament models are models of second ligaments connecting the first bone and the second bone of the joint.
- the one or more second ligament models are attached to the first end section of the first bone model, the second bone model has one or more second apertures at its first end section, the apparatus for tensioning the one or more second ligament models is attached to the second bone model, and the one or more second ligament models extend from the first end section of the first bone model through the one or more second apertures to the apparatus for tensioning the one or more second ligament models.
- the first end section of the second bone model may comprise a separate second aperture for each second ligament model of the one or more second ligament models so that the one or more second ligament models extend through separate second apertures at the first end section of the second bone model.
- several second ligament models of the one or more second ligament models may extend through one second aperture of the one or more second apertures.
- the tensions of the one or more second ligament models may be set manually using the apparatus for tensioning the one or more second ligament models.
- the apparatus for tensioning the one or more second ligament models may comprise one or more actuators for tensioning the one or more second ligament models.
- the apparatus for tensioning the one or more second ligament models may be attached to a part of the second bone model that is remote from the first end section of the second bone model.
- the first and second ligament models may be implemented similarly. Further, the apparatus for tensioning the one or more first ligament models may be implemented similarly to the apparatus for tensioning the one or more second ligament models. In some implementations, the model of the joint comprises no first ligament models, but one or more second ligament models.
- the model of the joint comprises a controller that is connected to the apparatus for tensioning the one or more first ligament models, the apparatus for tensioning the one or more first ligament models comprises one or more actuators for tensioning the one or more first ligament models, and the controller is configured to control tensions of the one or more first ligament models based on a pathology setting using the one or more actuators.
- the model of the joint may be a model of a knee joint.
- the controller may be configured to obtain the pathology setting.
- the controller may be configured to receive the pathology setting from a data processing device, or the controller may be a unit of the data processing device, wherein the data processing device may obtain the pathology setting based on user input.
- the user input may be implemented by tracking the hands of the user and by displaying a menu by the head-worn device as described above, wherein the menu allows a user to select the pathology setting.
- the user input may be implemented by means of a touch screen, by means of voice input, or by means of another user input technique.
- the controller may be configured to send signals to the actuators of the apparatus for tensioning the one or more first ligament models to control these actuators. Based on the signals received from the controller, the actuators of the apparatus for tensioning the one or more first ligament models may control tensions of the one or more first ligament models.
- the controller may further be configured to send signals to the actuators of the apparatus for tensioning the one or more second ligament models to control these actuators. Based on the signals received from the controller, the actuators of the apparatus for tensioning the one or more second ligament models may control tensions of the one or more second ligament models.
- the actuators of the apparatus for tensioning the one or more first ligament models and/or the actuators of the apparatus for tensioning the one or more second ligament models may control tensions of the one or more first and/or second ligament models to represent an anterior cruciate ligament rupture, a posterior cruciate ligament rupture, or a medial ligament rupture.
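- A minimal sketch of such controller logic is given below; the pathology settings follow the description, while the ligament labels, the normalised tension values, and the set_tension() actuator interface are hypothetical:

```python
# Illustrative sketch of the controller logic: map a pathology setting to
# per-ligament tension commands. Labels, values, and the actuator
# interface are hypothetical.
SLACK, NORMAL = 0.0, 1.0  # assumed normalised tension commands

PATHOLOGY_TENSIONS = {
    "healthy":                    {"acl": NORMAL, "pcl": NORMAL, "mcl": NORMAL},
    "anterior cruciate rupture":  {"acl": SLACK,  "pcl": NORMAL, "mcl": NORMAL},
    "posterior cruciate rupture": {"acl": NORMAL, "pcl": SLACK,  "mcl": NORMAL},
    "medial ligament rupture":    {"acl": NORMAL, "pcl": NORMAL, "mcl": SLACK},
}

def apply_pathology(setting, actuators):
    """actuators: mapping from ligament label to an actuator object with a
    (hypothetical) set_tension() method."""
    for ligament, tension in PATHOLOGY_TENSIONS[setting].items():
        actuators[ligament].set_tension(tension)
```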
- the model of the joint further comprises a first gyroscope disposed on the first bone model, a second gyroscope disposed on the second bone model, and a second accelerometer disposed on the second bone model.
- the first gyroscope, the second gyroscope, and the second accelerometer may correspond to the first gyroscope, the second gyroscope, and the second accelerometer of the first aspect. Details are not described again.
- the model of the joint further comprises a first accelerometer disposed on the first bone model.
- the first accelerometer may correspond to the first accelerometer of the first aspect. Details are not described again.
- the model of the joint further comprises a translation sensor disposed on the first bone model or the second bone model.
- the translation sensor is a flex sensor or an optical distance sensor, and the translation sensor is configured to measure a translation of the second bone model relative to the first bone model.
- the translation sensor may correspond to the translation sensor of the first aspect. Details are not described again.
- a system comprising the data processing device of the second aspect.
- the system further comprises an apparatus comprising the model of a joint of the fifth aspect, wherein the model of the joint is disposed on a first apparatus member of the apparatus.
- the system further comprises a head-worn device, the head-worn device comprising a display, a camera, and a positioning signal receiver.
- the system further comprises a first base station and a second base station for transmitting positioning signals.
- the data processing device, the apparatus, the head-worn device, the first base station, and the second base station are adapted to perform the method according to the first aspect or any of the possible implementations of this method.
- the first apparatus member may be a frame.
- Fig. 1 shows schematically and exemplarily a mixed reality system according to an embodiment of the present application
- Fig. 2 shows schematically and exemplarily a method for determining a position and orientation of an apparatus relative to a head-worn device according to an embodiment of the present application
- Figs. 3A, 3B, and 3C show schematically and exemplarily views of a knee joint model according to an embodiment of the present application.
- Fig. 1 shows schematically and exemplarily a mixed reality system according to an embodiment of the present application.
- the mixed reality system comprises an apparatus, wherein the apparatus comprises a first apparatus member 110 and a model of a human knee joint.
- the model of the knee joint comprises a second apparatus member 120 and a third apparatus member 130.
- the second apparatus member 120 comprises a first bone model, wherein the first bone model is a model of a first bone of the knee joint.
- the third apparatus member 130 comprises a second bone model, wherein the second bone model is a model of a second bone of the knee joint.
- in one example, the first bone model is a femur model, and the second bone model is a tibia model, wherein the second bone model may further comprise a fibula model.
- in another example, the first bone model is a tibia model, and the second bone model is a femur model, wherein the first bone model may further comprise a fibula model.
- the second apparatus member 120 comprises one or more first sensors 121.
- the one or more first sensors are disposed on or inside the second apparatus member 120.
- the third apparatus member 130 comprises one or more second sensors 131.
- the one or more second sensors are disposed on or inside the third apparatus member.
- the knee joint model is rotatably mounted on the first apparatus member 110, which may be a frame.
- One or more markers 111 are arranged on the surface of the first apparatus member 110.
- the mixed reality system further comprises a plurality of base stations 101 for transmitting positioning signals.
- the base stations 101 are configured to transmit positioning signals.
- the base stations may for example be base stations of the Valve lighthouse system as described in “DronOS: A flexible open-source prototyping framework for interactive drone routines” by M. Hoppe et al., Int. Conf. on Mobile and Ubiquitous Multimedia (MUM), ACM, 2019, pp. 15:1–15:7, “Performance bounds in positioning with the VIVE lighthouse system” by M. Greiff et al., IEEE Int. Conf. on Information Fusion (FUSION), 2019, pp. 1-8, or “Automated testing of industrial robots using HTC Vive for motion tracking” by K. Sletten, M.Sc. thesis, University of Stavanger, Norway, 2017.
- the base stations may also be referred to as lighthouses.
- other base stations 101 may be used to transmit positioning signals.
- the mixed reality system further comprises a head-worn device 140, which comprises a display 142, a camera 144, and a positioning signal receiver 146 for receiving positioning signals.
- the head-worn device may be worn on the head of a user and may be an augmented or virtual reality headset or augmented or virtual reality glasses.
- the positioning signal receiver 146 is configured to receive positioning signals transmitted by the plurality of base stations 101 and to provide data to a data processing device 150, wherein the data provided to the data processing device 150 indicates the position and orientation of the head-worn device 140 in a base coordinate system that is fixed relative to the positions of the base stations 101.
- the mixed reality system further comprises a data processing device 150.
- the data processing device is depicted in Fig. 1 as separate from the head-worn device 140. In other examples, the data processing device 150 may be integrated in the head-worn device 140, or the data processing device may be implemented in a distributed manner comprising a plurality of data processing subsystems, wherein one or more of these data processing subsystems may be integrated in the head-worn device 140.
- the data processing device comprises one or more memories 152, one or more processors 154, and one or more transceiver units 156.
- the data processing device 150 is communicatively coupled to the display 142, the camera 144, and the positioning signal receiver 146 of the head- worn device 140 via the one or more transceiver units 156 and via one or more cables 148.
- the data processing device 150 may be wirelessly connected to the display 142, the camera 144, and/or the positioning signal receiver 146 of the head-worn device 140.
- the data processing device 150 is configured to receive image data from the camera 144 of the head-worn device 140.
- the image data received by the data processing device 150 from the camera 144 of the head-worn device 140 may show a marker 111 that is arranged on the surface of the first apparatus member 110.
- the data processing device 150 is configured to determine a position and orientation of the first apparatus member 110 relative to the head-worn device 140 based on the image data received from the camera 144 and based on geometry information about the marker 111.
- the data processing device 150 is configured to receive data from the positioning signal receiver 146 of the head-worn device 140, wherein the data received from the positioning signal receiver 146 indicates the position and orientation of the head-worn device in the base coordinate system. Further, the data processing device is configured to update the position and orientation of the first apparatus member 110 relative to the head-worn device 140 based on data received from the positioning signal receiver 146 indicating the position and orientation of the head-worn device in the base coordinate system.
- the second apparatus member 120 is rotatably mounted on the first apparatus member 110.
- the data processing device 150 is configured to receive data from the one or more first sensors 121, and the data processing device 150 is configured to determine the position and orientation of the second apparatus member 120 relative to the head-worn device 140 based on the determined position and orientation of the first apparatus member 110 relative to the head-worn device, based on geometry information about the first and second apparatus members, and based on the data received from the one or more first sensors 121.
- the one or more first sensors 121 comprise a first gyroscope
- the data processing device 150 is configured to receive data from the first gyroscope, wherein the data received from the first gyroscope indicates an orientation of the second apparatus member 120.
- the data processing device 150 may be configured to determine the position and orientation of the second apparatus member 120 relative to the head-worn device 140 based on the determined position and orientation of the first apparatus member 110 relative to the head-worn device, based on geometry information about the first and second apparatus members, and based on the data received from the first gyroscope indicating the orientation of the second apparatus member 120.
- the data processing device 150 is further configured to receive data from the one or more second sensors 131 of the third apparatus member 130.
- the data processing device 150 is configured to determine the position and orientation of the third apparatus member 130 relative to the head-worn device 140 based on the determined position and orientation of the first apparatus member 110 relative to the head-worn device, based on the geometry information about the first and second apparatus members, based on the data received from the one or more first sensors 121, and based on the data received from the one or more second sensors 131.
- the second sensors 131 may comprise a second gyroscope and a second accelerometer.
- the data processing device 150 may be configured to receive data from the second gyroscope, wherein the data received from the second gyroscope indicates an orientation of the third apparatus member 130.
- the data processing device 150 may further be configured to receive data from the second accelerometer, wherein the data received from the second accelerometer indicates an acceleration of the third apparatus member 130.
- the data processing device may further be configured to determine a position and orientation of the third apparatus member 130 relative to the second apparatus member 120 based on the data received from the first gyroscope indicating the orientation of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member.
- the data processing device 150 may further be configured to augment second image data received from the camera 144 of the head-worn device 140 with virtual representations of anatomical structures of a virtual patient, wherein the augmenting of the second image data is based on the determined position and orientation of the second apparatus member 120 relative to the head-worn device and based on the determined position and orientation of the third apparatus member 130 relative to the head-worn device.
- the data processing device 150 may send the augmented second image data or the computed third image data to the head-worn device 140 for being displayed.
- the anatomical structures of the virtual patient may comprise portions of skin, one or more bones, one or more ligaments, and/or one or more muscles of the virtual patient.
- the displaying of the augmented second image data comprising the virtual representations of anatomical structures of the virtual patient can improve a learning experience of a physician or medical staff, who uses the knee joint model to train the examination and/or the treatment of pathologies, because the physician or medical staff can not only haptically perceive a pathology using the knee joint model, but can also observe internal anatomical structures such as bones, ligaments, and/or muscles of the virtual patient.
- Fig. 2 shows schematically and exemplarily a method for determining a position and orientation of an apparatus relative to a head-worn device according to an embodiment of the present application.
- the method may be performed by the data processing device 150 of Fig. 1.
- the apparatus comprises the first apparatus member 110 of Fig. 1.
- the apparatus may further comprise the second apparatus member 120 and the third apparatus member 130 of Fig. 1.
- the head-worn device may correspond to the head-worn device 140 of Fig. 1.
- the method of Fig. 2 is not limited to apparatuses comprising a knee joint model or a medical model.
- the apparatus may alternatively be an industrial robot, a machine, or a vehicle, for example. The present disclosure is not limited in this respect.
- the method of Fig. 2 includes the following steps:
- Step S210: The data processing device receives first image data from the camera of the head-worn device, wherein the first image data shows a marker that is arranged on a surface of a first apparatus member of the apparatus.
- the camera of the head-worn device may correspond to the camera 144 of the head-worn device 140 of Fig. 1.
- the marker may correspond to one of the markers 111 of Fig. 1.
- a plurality of markers may be arranged on the surface of the first apparatus member.
- the first image data may show one or more markers of the plurality of markers that are arranged on the surface of the first apparatus member, whereas other markers of the plurality of markers that are arranged on the surface of the first apparatus member may not be depicted by the first image data.
- the data processing device may determine the position and orientation of the first apparatus member relative to the head-worn device based on the first image data and based on geometry information about the one or more markers that are shown on the first image data, wherein the geometry information may comprise information about shapes, colors, and/or distances of the one or more markers, for example.
- Some markers may be QR codes.
- positions and orientations relative to the head-worn device may be positions and orientations relative to the display of the head-worn device.
- Step S220: The data processing device determines a position and orientation of the first apparatus member relative to the head-worn device based on the first image data and based on geometry information of the marker.
- the position and orientation of the first apparatus member relative to the head-worn device may be determined by the data processing device in a first coordinate system, which may be centered on or fixed relative to the head-worn device, so that the first coordinate system moves and/or rotates together with the head-worn device.
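As an illustration of steps S210 and S220, the following sketch estimates the pose of the first apparatus member relative to the head-worn camera from a single image. It assumes an OpenCV-style pipeline with an ArUco fiducial standing in for the marker 111; the detector choice, corner coordinates, and camera intrinsics are illustrative assumptions, not part of the disclosure.

```python
import numpy as np
import cv2

def estimate_member_pose(image, marker_corners_3d, camera_matrix, dist_coeffs):
    """Steps S210/S220 (sketch): pose of the first apparatus member relative
    to the head-worn camera, from one image showing a known marker."""
    # Assumed detector: an ArUco dictionary stands in for whatever fiducial
    # (e.g. a QR code) is actually arranged on the first apparatus member.
    detector = cv2.aruco.ArucoDetector(
        cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))
    corners, ids, _ = detector.detectMarkers(image)
    if ids is None:
        return None  # this frame shows no marker (see the update discussion)
    # solvePnP fits the marker's known 3D corner coordinates (expressed in
    # the member's own frame) to the detected pixel corners.
    ok, rvec, tvec = cv2.solvePnP(marker_corners_3d,
                                  corners[0].reshape(-1, 2),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    T = np.eye(4)            # 4x4 homogeneous transform: first apparatus
    T[:3, :3] = R            # member -> camera, i.e. the pose in the
    T[:3, 3] = tvec.ravel()  # "first coordinate system" described above
    return T
```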
- Step S230: The data processing device receives first data from a positioning signal receiver of the head-worn device, wherein the first data received from the positioning signal receiver indicates a first position and orientation of the head-worn device at a first time. Step S230 may be performed before, between, or after steps S210 and S220.
- the positioning signal receiver may correspond to the positioning signal receiver 146 of the head-worn device 140 of Fig. 1.
- the positioning signals may be transmitted by a plurality of base stations, wherein the base stations may correspond to the base stations 101 of Fig. 1.
- the positioning signal receiver may determine the first data based on positioning signals received at or close to the first time. To determine the first and/or second data, the positioning signal receiver may determine parameters of the positioning signals such as times of arrival, angles of arrival, etc.
- the first data provided by the positioning signal receiver indicates a first position and orientation of the head-worn device at a first time.
- the first position and orientation may be a position and orientation in a base coordinate system, wherein the base coordinate system may be fixed relative to the positions of the base stations that transmit the positioning signals.
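The disclosure leaves the receiver's internal computation open. Purely as an illustration, a time-of-arrival receiver could recover its position in the base coordinate system by nonlinear least squares over ranges to the base stations; the station layout, synchronized clocks, and SciPy usage below are assumptions, and orientation estimation is omitted.

```python
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # assumed propagation speed of the positioning signals (m/s)

def position_from_toa(stations, toas, x0=np.zeros(3)):
    """Sketch: receiver position in the base coordinate system from measured
    times of arrival to base stations at known positions."""
    def residuals(x):
        predicted_ranges = np.linalg.norm(stations - x, axis=1)
        return predicted_ranges - C * np.asarray(toas)
    return least_squares(residuals, x0).x

# Assumed base-station positions in the base coordinate system (metres):
stations = np.array([[0., 0., 3.], [5., 0., 3.], [0., 5., 3.], [5., 5., 3.]])
true_pos = np.array([2.0, 1.5, 1.7])
toas = np.linalg.norm(stations - true_pos, axis=1) / C  # ideal measurements
print(position_from_toa(stations, toas))  # ~[2.0, 1.5, 1.7]
```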
- Step S240: The data processing device also receives second data from the positioning signal receiver of the head-worn device, wherein the second data received from the positioning signal receiver indicates a second position and orientation of the head-worn device at a second time that is after the first time.
- Step S240 is similar to S230, but the positioning signal receiver may determine the second data based on positioning signals received at or close to the second time.
- the second position and orientation may be a position and orientation of the head-worn device in the base coordinate system.
- Step S250: The data processing device updates the position and orientation of the first apparatus member relative to the head-worn device based on the first data indicating the first position and orientation of the head-worn device at the first time and based on the second data indicating the second position and orientation of the head-worn device at the second time.
- the first image data is preferably captured by the camera at or close to the first time.
- the position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system and the first position and orientation of the head-worn device in the base coordinate system may be determined based on measurements at approximately the same time instant. Assuming that the head-worn device does not move, or hardly moves, relative to the base stations between the capturing of the first image data and the first time, the position and orientation of the first apparatus member in the base coordinate system may be determined.
- the position and orientation of the first apparatus member in the base coordinate system may be determined based on the first data indicating the position and orientation of the head-worn device in the base coordinate system at the first time and based on the position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system as determined based on the first image data.
- the position and orientation of the first apparatus member in the base coordinate system does not change or does not frequently change.
- the position and orientation of the first apparatus member in the base coordinate system does not change, or hardly changes, between the first and second times. Based on this assumption, the position and orientation of the first apparatus member relative to the head-worn device can be updated based on the first data indicating the first position and orientation of the head-worn device at the first time and based on the second data indicating the second position and orientation of the head-worn device at the second time.
- the updating of the position and orientation of the first apparatus member relative to the head-worn device can be implemented based on the position and orientation of the first apparatus member in the base coordinate system and based on the second data indicating the second position and orientation of the head-worn device at the second time in the base coordinate system, wherein the position and orientation of the first apparatus member in the base coordinate system is assumed to be static or semi-static. It is to be noted that the updating of the position and orientation of the first apparatus member relative to the head-worn device does not require that the data processing device receives, for the second time, image data from the camera of the head-worn device depicting a marker on the surface of the first apparatus member. This is relevant, because images captured by the camera may not always show a marker that is arranged on the surface of the first apparatus member.
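The update of step S250 can be written as a short chain of rigid-body transforms. The sketch below assumes 4x4 homogeneous matrices, with T_base_hmd denoting the pose of the head-worn device in the base coordinate system; the static-member assumption is the one stated above.

```python
import numpy as np

def update_member_pose(T_hmd_member_t1, T_base_hmd_t1, T_base_hmd_t2):
    """Step S250 (sketch).

    T_hmd_member_t1: first apparatus member in the head-worn device frame
                     at the first time (from the marker, steps S210/S220).
    T_base_hmd_t1:   head-worn device in the base frame at the first time
                     (first data, step S230).
    T_base_hmd_t2:   head-worn device in the base frame at the second time
                     (second data, step S240).
    """
    # Member pose in the base frame, assumed static between the two times.
    T_base_member = T_base_hmd_t1 @ T_hmd_member_t1
    # Re-express relative to the moved head-worn device. Note that no new
    # marker observation enters this update.
    return np.linalg.inv(T_base_hmd_t2) @ T_base_member
```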
- if the camera provides further image data that shows at least one marker arranged on the surface of the first apparatus member and that was captured at or close to the second time, the data processing device may determine a second position and orientation of the first apparatus member relative to the head-worn device based on this further image data. Further, the data processing device may update the position and orientation of the first apparatus member in the base coordinate system based on the second data indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the determined second position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system.
- the apparatus may further comprise a second apparatus member that is movably mounted on the first apparatus member.
- the second apparatus member may correspond to the second apparatus member of Fig. 1 .
- the first and second apparatus members may be configured such that a movement of the second apparatus member relative to the first apparatus member is limited to a curve or a surface.
- the second apparatus member may comprise a first sensor, which may be disposed on or inside the second apparatus member.
- the first sensor may correspond to the one or more first sensors 121 of Fig. 1.
- the method optionally further comprises step S260:
- the data processing device receives data from the first sensor, wherein the data received from the first sensor indicates a first position or a first orientation of the second apparatus member.
- the data processing device determines a position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device, based on geometry information about the first and second apparatus members, and based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member.
- the first sensor may be an accelerometer, a gyroscope, or a magnetometer.
- the apparatus may be configured to limit the movement of the second apparatus member relative to the first apparatus member to a manifold, wherein the manifold is a curve or a surface.
- the curve may be a section of a line or a section of a circle, for example.
- the surface may be a section of a plane or a section of a sphere, for example.
- the data processing device may efficiently and/or accurately determine the position and orientation of the second apparatus member relative to the first apparatus member based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member and based on the geometry information about the apparatus.
- the data processing device may determine the position and orientation of the second apparatus member relative to the first apparatus member in a second coordinate system that is centered on or fixed relative to the first apparatus member.
- the geometry information about the apparatus may be or may include information about the curve or the surface.
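One way to exploit such geometry information is to project a noisy, sensor-derived position estimate of the second apparatus member onto the permitted manifold. The spherical-section manifold and the clamping strategy below are illustrative assumptions; cone_axis is a unit vector and max_angle the half-angle of the allowed section.

```python
import numpy as np

def project_to_sphere_section(p, center, radius, cone_axis, max_angle):
    """Snap a noisy position estimate onto an assumed constraint surface:
    a section of a sphere of the given radius around `center`, limited to
    a cone of half-angle `max_angle` about the unit vector `cone_axis`."""
    v = p - center
    n = np.linalg.norm(v)
    if n == 0.0:
        v, n = cone_axis.copy(), 1.0  # degenerate input: fall back to the axis
    d = v / n
    angle = np.arccos(np.clip(d @ cone_axis, -1.0, 1.0))
    if angle > max_angle:
        # Rotate d toward the axis until it lies on the cone boundary
        # (assumes d is not exactly opposite to cone_axis).
        w = d - (d @ cone_axis) * cone_axis
        w /= np.linalg.norm(w)
        d = np.cos(max_angle) * cone_axis + np.sin(max_angle) * w
    return center + radius * d
```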
- the data processing device may determine the position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device and based on the determined position and orientation of the second apparatus member relative to the first apparatus member.
- the data processing device may preferably determine the position and orientation of the second apparatus member relative to the head-worn device in the first coordinate system.
- the position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the second apparatus member relative to the display of the head-worn device, wherein the display of the head-worn device may correspond to the display 142 of the head-worn device 140 of Fig. 1.
- the data processing device may further determine the position and orientation of the second apparatus member in the base coordinate system based on the position and orientation of the first apparatus member in the base coordinate system, based on the geometry information about the first and second apparatus members, and based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member. This may be implemented by determining the position and orientation of the second apparatus member in the base coordinate system based on the position and orientation of the first apparatus member in the base coordinate system and based on the position and orientation of the second apparatus member relative to the first apparatus member.
- the data processing device may determine the position and orientation of the second apparatus member relative to the head-worn device in the first coordinate system based on the second data received from the positioning signal receiver indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the position and orientation of the second apparatus member in the base coordinate system.
- the second apparatus member may be rotatably mounted on the first apparatus member.
- the first sensor may be a first gyroscope.
- the data received by the data processing device from the first sensor may indicate the first orientation of the second apparatus member.
- the data processing device may determine the position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device, based on the geometry information about the apparatus, and based on the data received from the first gyroscope indicating the first orientation of the second apparatus member.
- the apparatus may be configured to limit the movement of the second apparatus member relative to the first apparatus member to at least a section of a circle or a section of a sphere.
- the geometry information about the apparatus may be or may include information about the circle, the sphere, or the section thereof.
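For the rotatable (hinge-like) mounting just described, a single angle derived from the first gyroscope fixes the full pose of the second member relative to the first. The hinge axis, the pivot point, and the Rodrigues helper below are illustrative assumptions.

```python
import numpy as np

def axis_angle(axis, theta):
    """Rotation matrix for angle theta about a unit axis (Rodrigues formula)."""
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def second_member_pose(theta, hinge_axis, hinge_pivot):
    """Pose of the second apparatus member relative to the first, given the
    hinge angle theta. Because motion is limited to a circle about the
    hinge, this one angle determines the whole 4x4 transform."""
    R = axis_angle(hinge_axis, theta)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = hinge_pivot - R @ hinge_pivot  # rotation about the pivot point
    return T
```

Chaining this transform with the updated pose of the first apparatus member relative to the head-worn device (step S250) then yields the pose of the second apparatus member in the first coordinate system.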
- the apparatus may further comprise a third apparatus member, wherein the third apparatus member may be attached to the second apparatus member.
- the third apparatus member may correspond to the third apparatus member 130 of Fig. 1.
- the third apparatus member may comprise a second gyroscope and a second accelerometer.
- the second gyroscope and the second accelerometer may be disposed on or inside the third apparatus member.
- the second gyroscope and the second accelerometer may correspond to the one or more second sensors 131 of Fig. 1.
- the method optionally further comprises step S270:
- the data processing device receives data from the second gyroscope, wherein the data received from the second gyroscope indicates an orientation of the third apparatus member.
- the data processing device further receives data from the second accelerometer, wherein the data received from the second accelerometer indicates an acceleration of the third apparatus member.
- the data processing device determines a position and orientation of the third apparatus member relative to the second apparatus member based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member.
- the data processing device determines a position and orientation of the third apparatus member relative to the head-worn device based on the position and orientation of the second apparatus member relative to the head-worn device and based on the position and orientation of the third apparatus member relative to the second apparatus member.
- the data processing device may determine the position and orientation of the third apparatus member relative to the second apparatus member in a third coordinate system that is centered on or fixed relative to the second apparatus member. Further, the data processing device may determine the position and orientation of the third apparatus member relative to the head-worn device in the first coordinate system. The position and orientation of the third apparatus member relative to the head-worn device may be the position and orientation of the third apparatus member relative to the display of the head-worn device.
- the data processing device may further determine the position and orientation of the third apparatus member in the base coordinate system based on the position and orientation of the second apparatus member in the base coordinate system, based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member.
- the position and orientation of the third apparatus member in the base coordinate system may be determined, by the data processing device, based on the position and orientation of the second apparatus member in the base coordinate system and based on the position and orientation of the third apparatus member relative to the second apparatus member.
- the data processing device may determine the position and orientation of the third apparatus member relative to the head-worn device in the first coordinate system based on the second data received from the positioning signal receiver indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the position and orientation of the third apparatus member in the base coordinate system.
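A minimal sketch of the orientation part of step S270, assuming each gyroscope (or the sensor fusion on its SOC) reports its member's orientation in a common world frame as a rotation matrix: the relative orientation follows by composition, and the second accelerometer can serve as a gravity-based consistency check when the third member is momentarily at rest. The tolerance and sign conventions are assumptions; the translational part would additionally use the joint geometry and the translation sensor discussed below.

```python
import numpy as np

def relative_orientation(R_world_m2, R_world_m3):
    """Orientation of the third apparatus member expressed in the second
    member's frame, from the two gyroscope-derived world orientations."""
    return R_world_m2.T @ R_world_m3

def at_rest_consistent(R_world_m3, accel_m3, tol=0.5):
    """Optional drift check: at rest an accelerometer measures the specific
    force -g, so its reading rotated into the world frame should point
    straight up with magnitude ~9.81 m/s^2. `tol` is an assumption."""
    up = np.array([0.0, 0.0, 9.81])
    return np.linalg.norm(R_world_m3 @ np.asarray(accel_m3) - up) < tol
```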
- the second apparatus member comprises a first accelerometer, which may be disposed on or inside the second apparatus member.
- the first accelerometer may correspond to the one or more first sensors 121 of Fig. 1.
- the data processing device may receive data from the first accelerometer, wherein the data received from the first accelerometer indicates an acceleration of the second apparatus member.
- the data processing device may then determine the position and orientation of the third apparatus member relative to the second apparatus member further based on the data received from the first accelerometer indicating the acceleration of the second apparatus member.
- the position and orientation of the third apparatus member relative to the second apparatus member may be determined in the third coordinate system, and the determined position and orientation of the third apparatus member relative to the second apparatus member may be used to determine the position and orientation of the third apparatus member relative to the head-worn device, and/or to determine the position and orientation of the third apparatus member in the base coordinate system.
- a translation sensor is disposed on the second apparatus member or the third apparatus member, wherein the translation sensor is a flex sensor or an optical distance sensor.
- the data processing device may receive data from the translation sensor, wherein the data received from the translation sensor indicates a translation of the third apparatus member relative to the second apparatus member.
- the data processing device may determine the position and orientation of the third apparatus member relative to the second apparatus member further based on the data received from the translation sensor indicating the translation of the third apparatus member relative to the second apparatus member.
- the translation sensor may be any mechanical sensor configured to measure a displacement.
- the translation sensor may be a strain sensor, configured such that a deflection of the sensor causes a change in electrical resistance.
- the translation sensor may be configured to detect translational movements between the second and third apparatus members.
- the translation sensor may be mounted in a position such that rotations of the third apparatus member relative to the second apparatus member cause no or small deflections of the translation sensor, whereas translations of the third apparatus member relative to the second apparatus member cause large deflections of the translation sensor.
- the translation sensor may be used to resolve ambiguities in the determination of the position and orientation of the third apparatus member relative to the second apparatus member.
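As a sketch of how the translation sensor's raw reading might enter this computation: a flex or strain sensor changes resistance with deflection, and a calibration maps that change to millimetres of translation. The linear model and all constants below are calibration assumptions, not values from the disclosure.

```python
def translation_from_flex(resistance_ohm,
                          r_rest=10_000.0,    # assumed resistance at rest (ohm)
                          ohm_per_mm=250.0,   # assumed calibration slope
                          noise_floor_mm=0.5):
    """Map a flex/strain sensor reading to a translation of the third
    apparatus member relative to the second apparatus member (mm)."""
    mm = (resistance_ohm - r_rest) / ohm_per_mm
    # Per the mounting described above, small deflections are attributed to
    # rotation and treated as zero translation, resolving the ambiguity.
    return 0.0 if abs(mm) < noise_floor_mm else mm
```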
- where the second apparatus member includes the first bone model, the determined position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the first bone model relative to the head-worn device. More specifically, the determined position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the first bone model relative to the display of the head-worn device.
- the first gyroscope, the first accelerometer, and/or the translation sensor may be disposed on the first bone model.
- the third apparatus member includes the second bone model.
- the determined position and orientation of the third apparatus member relative to the head-worn device may be the position and orientation of the second bone model relative to the head-worn device or relative to the display of the head-worn device.
- the head-worn device may be a virtual reality headset or virtual reality glasses.
- the data processing device may receive second image data from the camera of the head-worn device.
- the data processing device may augment the second image data with virtual representations of anatomical structures of a virtual patient, wherein the augmenting of the second image data may be based on the determined position and orientation of the second apparatus member relative to the head-worn device and based on the determined position and orientation of the third apparatus member relative to the head-worn device.
- the data processing device may further send the augmented second image data to the head-worn device for being displayed.
- the anatomical structures of the virtual patient may comprise portions of skin, one or more bones, one or more ligaments, and/or one or more muscles of the virtual patient.
- the anatomical structures of the virtual patient may be rendered by using the determined position and orientation of the second apparatus member relative to the head-worn device and the determined position and orientation of the third apparatus member relative to the head-worn device.
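As an illustration of the augmenting step, the sketch below projects vertices of a virtual anatomical structure into the second image data using the determined member pose and the camera intrinsics. The use of cv2.projectPoints and the point-cloud overlay are simplifying assumptions; a real renderer would draw shaded meshes of the anatomical structures.

```python
import numpy as np
import cv2

def overlay_structure(image, vertices, T_hmd_member, camera_matrix,
                      dist_coeffs, color=(0, 255, 0)):
    """Draw a virtual anatomical structure onto camera image data. The
    vertices are given in the apparatus member's own frame; T_hmd_member is
    the member's determined pose relative to the head-worn device camera."""
    R, t = T_hmd_member[:3, :3], T_hmd_member[:3, 3]
    rvec, _ = cv2.Rodrigues(R)
    pts, _ = cv2.projectPoints(np.asarray(vertices, dtype=np.float64),
                               rvec, t, camera_matrix, dist_coeffs)
    for u, v in pts.reshape(-1, 2).astype(int):
        if 0 <= u < image.shape[1] and 0 <= v < image.shape[0]:
            cv2.circle(image, (u, v), 1, color, -1)
    return image
```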
- the augmentation of the second image data with virtual representations of anatomical structures of the virtual patient can improve a learning experience of a physician or medical staff, who uses the model of the joint to train the examination and/or treatment of pathologies, because the physician or medical staff can not only haptically perceive a pathology using the model of the joint, but can also observe internal anatomical structures such as bones, ligaments, and/or muscles of the virtual patient.
- the data processing device may further augment the second image data with a menu.
- the menu may allow a user to configure the augmentation of the second image data.
- the menu may allow selecting the anatomical structures of the virtual patient that are to be augmented on the second image data.
- the menu may allow selecting between different pathology settings, such as anterior cruciate ligament rupture, posterior cruciate ligament rupture, or medial ligament rupture of a knee joint.
- the menu may allow selecting an exam mode with restricted functions and choices for the user.
- the second image data may be augmented with anamneses (patient histories) instead of anatomical structures of the virtual patient.
- the menu may further allow selecting between different modes such as a mode for the training of beginners and another mode for the training of advanced learners, wherein the second image data may be augmented with different anatomical structures of the virtual patient in these different modes.
- a user input may be determined by tracking the hands of the user.
- the second image data may also be augmented with virtual representations of the hands of the user.
- the menu navigation may be implemented by means of gesture recognition.
- the present disclosure is not limited to user inputs that are based on the augmentation of image data with a menu and the tracking of the hands of the user.
- Various other input manners are known in the art, such as touch screens, voice recognition, keyboards, computer mice, etc.
- the head-worn device may be an augmented reality headset or augmented reality glasses.
- the data processing device may compute third image data by rendering virtual representations of anatomical structures of a virtual patient, wherein the anatomical structures of the virtual patient are rendered based on the determined position and orientation of the second apparatus member relative to the head-worn device and based on the determined position and orientation of the third apparatus member relative to the head-worn device.
- the data processing device further sends the third image data to the head-worn device for being displayed by the display of the head-worn device.
- real visual perceptions of the user may be augmented with the third image data to improve the learning experience of the user.
- the anatomical structures of the virtual patient may again comprise portions of skin, one or more bones, one or more ligaments, and/or one or more muscles of the virtual patient.
- the anatomical structures of the virtual patient are rendered by using the determined position and orientation of the second apparatus member relative to the head-worn device and the determined position and orientation of the third apparatus member relative to the head-worn device.
- the positions and orientations of the second and third apparatus members may be the positions and orientations of the first and second bone models, respectively.
- the third image data may comprise a visualization of a menu.
- a user input may be determined by tracking the hands of the user.
- the menu navigation may be implemented using gesture recognition.
- the present disclosure is not limited to user inputs based on the displaying of a menu and the tracking of the hands of the user.
- Various other input manners are known in the art, such as touch screens, voice recognition, keyboards, computer mice, etc.
- Figures 3A to 3C show schematically and exemplarily a model of a knee joint of a human.
- the knee joint model comprises a first bone model 320, wherein the first bone model is a model of a first bone of the joint.
- the model of the joint further comprises a second bone model 330, wherein the second bone model is a model of a second bone of the joint.
- the model of the joint further comprises one or more first ligament models 323, 324, wherein the one or more first ligament models 323, 324 are models of first ligaments connecting the first bone and the second bone of the joint.
- the model of the joint further comprises an apparatus 322 for tensioning the one or more first ligament models such that a first end section of the second bone model 330 is pulled towards a first end section of the first bone model 320.
- the apparatus 322 for tensioning the one or more first ligament models is attached to the first bone model 320.
- the one or more first ligament models 323, 324 are attached to the first end section of the second bone model 330.
- the first bone model 320 has one or more first apertures at its first end section, and the one or more first ligament models 323, 324 extend from the first end section of the second bone model 330 through the one or more first apertures to the apparatus 322 for tensioning the one or more first ligament models.
- the model of the knee joint further comprises a portion 328 for guiding the one or more first ligament models 323, 324 to the apparatus 322 for tensioning the one or more first ligament models.
- the apparatus 322 for tensioning the one or more first ligament models further comprises an aperture 329 that is configured to rotatably mount the knee joint model on a frame, wherein the frame may correspond to the first apparatus member 110 of Fig. 1.
- the knee joint model may further comprise a model 360 of the patella.
- the first bone model 320 may be a model of a femur.
- the second bone model 330 may be a model of a tibia.
- the second bone model may further comprise a model 339 of a fibula.
- alternatively, the first bone model may be a model of a tibia.
- the second bone model may then be a model of a femur.
- in this case, the first bone model may further comprise a model 339 of a fibula.
- the first bone model 320 may comprise a bone base structure, which may be reinforced by means of reinforcement ribs and/or fiberglass mesh. Additionally, a tube such as an aluminium tube may be used to improve a mechanical strength of the first bone model 320, wherein the tube may be laminated using fiberglass mesh.
- the second apparatus member 120 of Fig. 1 may comprise the first bone model 320, wherein the first bone model may be embedded in layers of a flexible polyurethane foam and silicone, wherein these layers may be configured to provide realistic haptic representations of soft tissues such as muscles and skin structures.
- a first gyroscope and a first accelerometer may be disposed on the first bone model.
- the apparatus 322 for tensioning the one or more first ligament models may be configured to enable manual settings of the tensions of the one or more first ligament models 323, 324.
- the apparatus 322 for tensioning the one or more first ligament models may comprise one or more actuators for tensioning the one or more first ligament models 323, 324.
- the apparatus 322 for tensioning the one or more first ligament models may be attached to a part of the first bone model 320 that is remote from the first end section of the first bone model.
- the apparatus 322 for tensioning the one or more first ligament models may be attached to the second end section of the first bone model 320.
- the first end section of the first bone model 320 may comprise a separate first aperture for each first ligament model of the one or more first ligament models 323, 324 so that the one or more first ligament models extend through separate first apertures at the first end section of the first bone model.
- several first ligament models 323, 324 may extend through a single first aperture of the one or more first apertures.
- the knee joint model further comprises one or more second ligament models 333 and an apparatus 332 for tensioning the one or more second ligament models.
- the one or more second ligament models 333 are models of second ligaments connecting the first bone and the second bone of the joint.
- the one or more second ligament models are attached to the first end section of the first bone model 320, the second bone model 330 has one or more second apertures at its first end section, the apparatus 332 for tensioning the one or more second ligament models is attached to the second bone model, and the one or more second ligament models 333 extend from the first end section of the first bone model 320 through the one or more second apertures to the apparatus 332 for tensioning the one or more second ligament models.
- the first end section of the second bone model 330 may comprise a separate second aperture for each second ligament model of the one or more second ligament models 333 so that the one or more second ligament models extend through separate second apertures at the first end section of the second bone model.
- several second ligament models 333 may extend through one second aperture of the one or more second apertures.
- the tensions of the one or more second ligament models may be set manually using the apparatus 332 for tensioning the one or more second ligament models.
- the apparatus 332 for tensioning the one or more second ligament models may comprise one or more actuators for tensioning the one or more second ligament models 333.
- the apparatus 332 for tensioning the one or more second ligament models may be attached to a part of the second bone model 330 that is remote from the first end section of the second bone model.
- the first and second ligament models 323, 324, 333 may be implemented using cotton textiles.
- the cotton textiles can be flat woven.
- the cotton textiles may be laid in a sheath, similar to a Bowden cable sheath.
- the first and second ligament models 323, 324, 333 may be bendable and compressible.
- the first and second ligament models 323, 324, 333 are preferably not elastic.
- the second bone model 330 may comprise a bone base structure, which may be reinforced by means of reinforcement ribs and/or fiberglass mesh. Additionally, a tube such as an aluminium tube may be used to improve a mechanical strength of the second bone model 330, wherein the tube may be laminated using fiberglass mesh.
- the third apparatus member 130 of Fig. 1 may comprise the second bone model 330, wherein the second bone model may be embedded in layers of a flexible polyurethane foam and silicone, wherein these layers may be configured to provide realistic haptic representations of soft tissues such as muscles and skin structures.
- a second gyroscope and a second accelerometer may be disposed on the second bone model 330.
- in some embodiments, the model of the joint comprises no first ligament models, but one or more second ligament models.
- the knee joint model comprises a controller, which is connected to the apparatus 322 for tensioning the one or more first ligament models and/or to the apparatus 332 for tensioning the one or more second ligament models.
- the controller is configured to control tensions of the one or more first ligament models 323, 324 and/or to control tensions of the one or more second ligament models 333.
- the controller may be configured to obtain a pathology setting.
- the controller may be configured to receive the pathology setting from a data processing device, or the controller may be a unit of the data processing device, wherein the data processing device may obtain the pathology setting based on user input.
- the user input may be implemented by displaying image data that visualizes a menu, wherein the menu allows a user to select the pathology setting, and by tracking the hands of the user.
- the user input may be implemented by means of a touch screen, by means of voice input, or by means of another input technique.
- the controller may be configured to send one or more signals to actuators of the apparatus 322 for tensioning the one or more first ligament models to control these actuators. Based on the signals received from the controller, the actuators of the apparatus 322 for tensioning the one or more first ligament models may control tensions of the one or more first ligament models 323, 324. Further, the controller may be configured to send one or more signals to actuators of the apparatus 332 for tensioning the one or more second ligament models to control these actuators. Based on the signals received from the controller, the actuators of the apparatus 332 for tensioning the one or more second ligament models may control tensions of the one or more second ligament models 333.
- the actuators of the apparatuses 322, 332 for tensioning the first and second ligament models may control tensions of the first and second ligament models 323, 324, 333 to represent an anterior cruciate ligament rupture, a posterior cruciate ligament rupture, or a medial ligament rupture.
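A sketch of the controller logic for the pathology settings named above, assuming named ligament actuators with a hypothetical set_tension() interface; the ligament names and tension fractions are illustrative, not taken from the disclosure.

```python
# Tension commands as fractions of nominal tension, per ligament model.
# The mapping is an assumption chosen to mimic the three pathology settings.
PATHOLOGY_TENSIONS = {
    "healthy":                 {"acl": 1.0, "pcl": 1.0, "mcl": 1.0},
    "acl_rupture":             {"acl": 0.0, "pcl": 1.0, "mcl": 1.0},
    "pcl_rupture":             {"acl": 1.0, "pcl": 0.0, "mcl": 1.0},
    "medial_ligament_rupture": {"acl": 1.0, "pcl": 1.0, "mcl": 0.0},
}

def apply_pathology(setting, actuators):
    """Send one tension command per ligament actuator. `actuators` maps a
    ligament name to an object with a hypothetical set_tension(fraction)
    method driving the tensioning apparatus."""
    for ligament, tension in PATHOLOGY_TENSIONS[setting].items():
        actuators[ligament].set_tension(tension)
```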
- the model of the knee joint further comprises one or more first sensors 321 disposed on the first bone model 320.
- the one or more first sensors 321 are configured to provide data that indicates a first position and/or a first orientation of the first bone model 320.
- the one or more first sensors 321 may correspond to the one or more first sensors 121 of Fig. 1.
- the one or more first sensors 321 comprise a first gyroscope. Further, the one or more first sensors may comprise a first accelerometer and/or a first magnetometer.
- the one or more first sensors 321 may be integrated in one system on chip (SOC).
- the model of the knee joint further comprises one or more second sensors 331 disposed on the second bone model 330.
- the one or more second sensors 331 are configured to provide data that indicates a position and an orientation of the second bone model 330.
- the one or more second sensors may correspond to the one or more second sensors 131 of Fig. 1.
- the one or more second sensors 331 comprise a second gyroscope and a second accelerometer. Further, the one or more second sensors 331 may comprise a second magnetometer.
- the one or more second sensors may be integrated on one SOC.
- the knee joint model further comprises a translation sensor 335.
- the translation sensor 335 is a flex sensor or a strain sensor configured to provide data that indicates a translation of the second bone model 330 relative to the first bone model 320.
- the translation sensor is preferably mounted in a position such that rotations of the second bone model 330 relative to the first bone model 320 cause no or small deflections of the translation sensor, whereas translations of the second bone model relative to the first bone model cause large deflections of the translation sensor.
- the data provided by the translation sensor may be used to resolve ambiguities in the determination of the position and orientation of the second bone model 330 relative to the first bone model 320.
- the translation sensor may be any mechanical sensor configured to measure a displacement between the first and second bone models. In other embodiments, the translation sensor is an optical distance sensor.
Abstract
Disclosed are a method for determining a position and orientation of an apparatus relative to a head-worn device (140) of a mixed reality system and a data processing device (150) for performing the method. The apparatus may be a model of a joint of a human, comprising bone models (320, 330), ligament models (323, 324, 333), and an apparatus (322) for tensioning ligament models. Also disclosed is a system comprising the data processing device (150), the model of the joint, the head-worn device (140), and a plurality of base stations (101) for transmitting positioning signals.
Description
MODEL OF A JOINT OF A HUMAN IN A MIXED REALITY ENVIRONMENT
FIELD OF THE INVENTION
The invention relates to a method for determining a position and orientation of an apparatus relative to a head-worn device, a data processing device, a model of a joint of a human, and a system.
BACKGROUND OF THE INVENTION
Mixed reality refers to environments or systems that mix a user's natural perception with a computer-generated (virtual) perception. Mixed reality systems can comprise a head-worn device and an apparatus, wherein the user can haptically perceive and modify the apparatus, and the head-worn device comprises a display for displaying virtual information related to the apparatus. The head-worn device may be transparent, so that the user can still see the apparatus and the environment, but real visual perceptions of the user may be augmented with virtual information displayed by the head-worn device. Alternatively, the head-worn device may be nontransparent and configured to display real information as well as virtual information, wherein the real information may be obtained by means of a camera that may be integrated in the head-worn device. For both kinds of mixed reality systems, it is important to accurately determine the position and orientation of the apparatus relative to the head-worn device so that virtual information related to the apparatus can be displayed correctly.
SUMMARY OF THE INVENTION
Hence, it may be desirable to provide an improved method for determining a position and orientation of an apparatus relative to a head-worn device, a data processing device configured to perform this method, and a mixed reality system comprising the head-worn device and the data processing device. The apparatus may be, for example, an industrial robotic system, a machine, or a vehicle. The method for determining the position and orientation of the apparatus relative to the head-worn device, the data processing device, and the mixed reality system may be used in various other applications, including medical applications.
In particular, mixed reality systems can be used for the training of physicians and medical staff to physically examine and/or treat pathologies. For example, knee joints comprise complex ligamentous structures that have a decisive influence on the biomechanics of the joint. For the correct diagnosis of pathologies, especially in the case of injuries to ligamentous structures such as cruciate ligament ruptures or collateral ligament damage, physical examination is of great importance, as these injuries are hardly detectable using other diagnostic tools such as X-ray imaging. Detecting injuries to ligamentous structures by means of physical examination can also save the cost- and time-intensive resources of X-ray or magnetic resonance imaging (MRI) systems. However, the physical examination and treatment of pathologies requires a high degree of anatomical as well as pathophysiological understanding and experience in "feeling" and classifying a possible pathology. Further, the examinations necessary for learning are usually painful for patients and often cannot be performed by means of dummy patients.
For these reasons, a realistic knee joint model was developed that makes it possible to haptically detect pathologies of anterior cruciate ligament rupture, posterior cruciate ligament rupture, and medial ligament rupture by means of an internal ligament mechanism. The knee joint model is embedded in a mixed reality system, which displays virtual information to a user so that the user can observe internal pathology mechanisms in any joint position. In this way, the mixed reality system may provide an improved learning experience for a user such as a physician or medical staff. The mixed reality system requires an accurate determination of the position and orientation of the knee joint model relative to the head-worn device of the mixed reality system, so that virtual information of the patient can be displayed correctly by the head-worn device.
Hence, it may further be desirable to provide an improved model for a pathology of a knee joint, and a mixed reality system for presenting virtual information related to the knee joint model.
The invention is defined by the independent claims. Further embodiments are provided by the dependent claims and the following description. It should be noted that any feature, element, and/or function of the method for determining the position and orientation of the apparatus relative to the head-worn device, as described in the following, equally applies to the data processing device, and the system, as described in the following, and vice versa.
According to a first aspect of the present disclosure, a method for determining a position and orientation of an apparatus relative to a head-worn device is provided. The head-worn device has a camera, and a positioning signal receiver for receiving a positioning signal. The method comprises receiving, by a data processing device, first image data from the camera of the head-worn device, wherein the first image data shows a marker that is arranged on a surface of a first apparatus member of the apparatus. The data processing device determines a position and orientation of the first apparatus member relative to the head-worn device based on the first image data and based on geometry information of the marker. Further, the data processing device receives first data from the positioning signal receiver of the head-worn device, wherein the first data received from the positioning signal receiver indicates a first position and orientation of the head-worn device at a first time. The data processing device also receives second data from the positioning signal receiver of the head-worn device, wherein the second data received from the positioning signal receiver indicates a second position and orientation of the head-worn device at a second time that is after the first time. The data processing device then updates the position and orientation of the first apparatus member relative to the head-worn device based on the first data indicating the first position and orientation of the head-worn device at the first time and based on the second data indicating the second position and orientation of the head-worn device at the second time.
The head-worn device is configured to be worn on the head of a user. The head-worn device comprises a display, and the head-worn device may be an augmented or virtual reality headset such as augmented or virtual reality glasses.
The data processing device comprises one or more processors and one or more memories. The data processing device may be integrated in the head-worn device, or the data processing device may be separate from the head-worn device. The data processing device may be implemented in a distributed manner comprising a plurality of data processing subsystems. One or more of these data processing subsystems may be integrated in the head-worn device. The data processing device is communicatively coupled to the display, the camera, and the positioning signal receiver of the head-worn device, wherein communication connections between the data processing device and the display, the camera, and the positioning signal receiver of the head-worn device may be wireless and/or wireline.
The apparatus may comprise a model of a part of a human being such as a joint or the heart. However, the disclosure of this patent application is not limited to medical models. For example, the apparatus may alternatively be an industrial robot, a machine, or a vehicle.
The data processing device receives the first image data from the camera of the head-worn device, wherein the first image data shows a marker that is arranged on the surface of the first apparatus member of the apparatus. A plurality of markers may be arranged on the surface of the first apparatus member. The first image data may show one or more markers of the plurality of markers that are arranged on the surface of the first apparatus member, whereas other markers of the plurality of markers that are arranged on the surface of the first apparatus member may not be depicted by the first image data. The data processing device may determine the position and orientation of the first apparatus member relative to the head-worn device based on the first image data and based on geometry information of the one or more markers that are shown on the first image data, wherein the geometry information may comprise information about shapes, colors, and/or distances of the one or more markers, for example. Some markers may be QR codes. Here and in the sequel, positions and orientations relative to the head-worn device may be positions and orientations relative to the display of the head-worn device.
The position and orientation of the first apparatus member relative to the head-worn device may be determined by the data processing device in a first coordinate system, which may be centered on or fixed relative to the head-worn device, so that the first coordinate system moves and/or rotates together with the head-worn device.
The head-worn device comprises a positioning signal receiver for receiving positioning signals. The positioning signals may be transmitted by a plurality of base stations. The positioning signal receiver may determine the first data based on positioning signals received at or close to the first time, and the positioning signal receiver may determine the second data based on positioning signals received at or close to the second time. To determine the first and/or second data, the positioning signal receiver may determine parameters of the positioning signals such as times of arrival, angles of arrival, etc. The base stations may for example be base stations of the Valve lighthouse system as described in "DronOS: A flexible open-source prototyping framework for interactive drone routines" by M. Hoppe et al., Int. Conf. on Mobile and Ubiquitous Multimedia (MUM), ACM, 2019, 15:1-15:7, "Performance bounds in positioning with the vive lighthouse system" by M. Greiff et al., IEEE Int. Conf. on Information Fusion (FUSION), 2019, pp. 1-8, or "Automated testing of industrial robots using HTC vive for motion tracking" by K. Sletten, M.Sc. thesis, University of Stavanger, Norway, 2017. Hence, the base stations may also be referred to as lighthouses. The method is equally applicable to other positioning systems and associated positioning signal receivers, including positioning systems based on times of arrival, directions of arrival, etc.
The first data provided by the positioning signal receiver indicates a first position and orientation of the head-worn device at a first time. The first position and orientation may be a position and orientation in a base coordinate system, wherein the base coordinate system may be fixed relative to the positions of the base stations that transmit the positioning signals. Similarly, the second position and orientation may be a position and orientation of the head-worn device in the base coordinate system.
The first image data is preferably captured by the camera at or close to the first time. Hence, the position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system and the first position and orientation of the head-worn device in the base coordinate system may be determined for approximately the same time instant, that is, the first time. This makes it possible to determine the position and orientation of the first apparatus member in the base coordinate system. In particular, the position and orientation of the first apparatus member in the base coordinate system may be determined based on the first data indicating the position and orientation of the head-worn device in the base coordinate system at the first time and based on the position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system as determined based on the first image data.
It may be assumed that the position and orientation of the first apparatus member in the base coordinate system does not change or does not frequently change. Then, the position and orientation of the first apparatus member relative to the head-worn device can be updated based on the first data indicating the first position and orientation of the head-worn device at the first time and based on the second data indicating the second position and orientation of the head-worn device at the second time.
The updating of the position and orientation of the first apparatus member relative to the head-worn device can be implemented based on the position and orientation of the first apparatus member in the base coordinate system and based on the second data indicating the second position and orientation of the head-worn device at the second time in the base coordinate system. Thereby, the position and orientation of the first apparatus member in the base coordinate system is assumed to be static or semi-static.
The updating of the position and orientation of the first apparatus member relative to the head-worn device can be implemented in different ways. The present patent application is not limited to the above manner for updating, by the data processing device, the position and orientation of the first apparatus member relative to the head-worn device.
It is to be noted that the updating of the position and orientation of the first apparatus member relative to the head-worn device does not require that the data processing device receives, for the second time, image data from the camera of the head-worn device depicting a marker on the surface of the first apparatus member. This is relevant, because images captured by the camera may not always show a marker that is arranged on the surface of the first apparatus member.
However, if the camera can provide further image data to the data processing device that shows at least one marker arranged on the surface of the first apparatus member and that has been captured by the camera at or close to the second time, the data processing device may determine a second position and orientation of the first apparatus member relative to the head-worn device based on this further image data. Further, the data processing device may update the position and orientation of the first apparatus member in the base coordinate system based on the second data indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the determined second position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system.
In a possible implementation of the first aspect, the apparatus further comprises a second apparatus member that is movably mounted on the first apparatus member. The apparatus is configured such that a movement of the second apparatus member relative to the first apparatus member is limited to a curve or a surface. The second apparatus member comprises a first sensor. The method further comprises receiving, by the data processing device, data from the first sensor, wherein the data received from the first sensor indicates a first position or a first orientation of the second apparatus member. The data processing device determines a position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device, based on geometry information about the apparatus, and based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member.
The first sensor may be disposed on or inside the second apparatus member. The first sensor may be an accelerometer, a gyroscope, or a magnetometer. The term ‘gyroscope’ is to be understood here and in the sequel as any sensor configured to measure an orientation, an angular rate, or an angular acceleration.
The apparatus is configured to limit the movement of the second apparatus member relative to the first apparatus member to a manifold, wherein the manifold is a curve or a surface. The curve may be a section of a line or a section of a circle, for example. The surface may be a section of a plane or a section of a sphere, for example. Due to the limitation of the movement of the second apparatus member relative to the first apparatus member, the data processing device can efficiently and/or accurately determine the position and orientation of the second apparatus member relative to the first apparatus member based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member and based on the geometry information about the apparatus. The data processing device may determine the position and orientation of the second apparatus member relative to the first apparatus member in a second coordinate system that is centered on or fixed relative to the first apparatus member. The geometry information about the apparatus may be or may include information about the curve or the surface. That is, the geometry information about the apparatus includes information related to the limitation of the movement of the second apparatus member relative to the first apparatus member.
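As an illustration of how such a motion constraint simplifies the computation, the following Python sketch determines the pose of a hinge-limited second apparatus member relative to the first apparatus member from a single measured angle. The axis and pivot values are illustrative assumptions standing in for the geometry information about the apparatus.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Illustrative geometry information: the hinge axis and the pivot point
# are fixed in the first apparatus member's frame (the second coordinate
# system of the text); both values are assumptions.
HINGE_AXIS = np.array([0.0, 1.0, 0.0])    # unit axis direction
HINGE_PIVOT = np.array([0.0, 0.0, 0.30])  # pivot position in metres

def second_member_in_first(angle_rad):
    """Pose of the second apparatus member relative to the first one
    when its movement is limited to a circle about a known hinge."""
    rot = R.from_rotvec(angle_rad * HINGE_AXIS).as_matrix()
    T = np.eye(4)
    T[:3, :3] = rot
    T[:3, 3] = HINGE_PIVOT - rot @ HINGE_PIVOT  # rotation about the pivot
    return T
```

Because the motion has a single degree of freedom, one scalar reading from the first sensor suffices to fix the full six-degree-of-freedom pose.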
The data processing device determines the position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device, based on the geometry information about the apparatus, and based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member. This may be implemented by determining the position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device and based on the position and orientation of the second apparatus member relative to the first apparatus member. The data processing device preferably determines the position and orientation of the second apparatus member relative to the head-worn device in the first coordinate system. The position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the second apparatus member relative to the display of the head-worn device.
The data processing device may further determine the position and orientation of the second apparatus member in the base coordinate system based on the position and orientation of the first apparatus member in the base coordinate system, based on the geometry information about the apparatus, and based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member. This may be implemented by determining the position and orientation of the second apparatus member in the base coordinate system based on the position and orientation of the first apparatus member in the base coordinate system and based on the position and orientation of the second apparatus member relative to the first apparatus member.
In some implementations, the data processing device determines the position and orientation of the second apparatus member relative to the head-worn device in the first coordinate system based on the second data received from the positioning signal receiver indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the position and orientation of the second apparatus member in the base coordinate system.
The second apparatus member may comprise a plurality of first sensors comprising one or more gyroscopes, one or more accelerometers, and/or one or more magnetometers. The plurality of first sensors may be disposed on or inside the second apparatus member. The data processing device may receive data from the plurality of first sensors, wherein the data received from the plurality of first sensors indicates the first position and/or the first orientation of the second apparatus member.
In another possible implementation, the second apparatus member is rotatably mounted on the first apparatus member, the first sensor is a first gyroscope, the data received by the data processing device from the first sensor indicates the first orientation of the second apparatus member, and the determining, by the data processing device, a position and orientation of the second apparatus member relative to the head-worn device comprises: determining, by the data processing device, the position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device, based on geometry information about the apparatus, and based on the data received from the first gyroscope indicating the first orientation of the second apparatus member.
Hence, the apparatus may be configured to limit the movement of the second apparatus member relative to the first apparatus member to at least a section of a circle or a section of a sphere. The geometry information about the apparatus may be or may include information about the circle, the sphere, or the section thereof.
In another possible implementation of the first aspect, the apparatus further comprises a third apparatus member, the third apparatus member is attached to the second apparatus member, and the third apparatus member comprises a second gyroscope and a second accelerometer. The data processing device receives data from the second gyroscope, wherein the data received from the second gyroscope indicates an orientation of the third apparatus member. The data processing device further receives data from the second accelerometer, wherein the data received from the second accelerometer indicates an acceleration of the third apparatus member. The data processing device determines a position and orientation of the third apparatus member relative to the second apparatus member based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member. The data processing device then determines a position and orientation of the third apparatus member relative to the head-worn device based on the position and orientation of the second apparatus member relative to the head-worn device and based on the position and orientation of the third apparatus member relative to the second apparatus member.
The first gyroscope of the second apparatus member, the second gyroscope of the third apparatus member, and the second accelerometer of the third apparatus member can enable an accurate determination of the position and orientation of the third apparatus member relative to the second apparatus member.
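By way of a hedged illustration, the core of this determination can be written as a relative rotation between the two sensor-derived orientations, with the accelerometer supplying a drift-free tilt reference. The sketch below uses scipy's quaternion conventions; it is a simplification, not the disclosed method.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def relative_orientation(q_second, q_third):
    """Orientation of the third apparatus member relative to the second
    one, from the two gyroscope-derived orientations (quaternions in
    scipy's x, y, z, w order, both referenced to the same frame)."""
    return R.from_quat(q_second).inv() * R.from_quat(q_third)

def gravity_tilt(accel):
    """Roll and pitch of a member from its accelerometer while it is
    (quasi-)static, using the measured gravity direction."""
    ax, ay, az = accel / np.linalg.norm(accel)
    return np.arctan2(ay, az), np.arctan2(-ax, np.hypot(ay, az))
```

Given the relative orientation and the attachment geometry of the two members, the relative position follows from the constraint that the third apparatus member is attached to the second apparatus member.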
The data processing device may determine the position and orientation of the third apparatus member relative to the second apparatus member in a third coordinate system that is centered on or fixed relative to the second apparatus member. Further, the data processing device may determine the position and orientation of the third apparatus member relative to the head-worn device in the first coordinate system. The position and orientation of the third apparatus member relative to the head-worn device may be the position and orientation of the third apparatus member relative to the display of the head-worn device.
The data processing device may further determine the position and orientation of the third apparatus member in the base coordinate system based on the position and orientation of the second apparatus member in the base coordinate system, based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member. The position and orientation of the third apparatus member in the base coordinate system may be determined, by the data processing device, based on the position and orientation of the second apparatus member in the base coordinate system and based on the position and orientation of the third apparatus member relative to the second apparatus member.
In some implementations, the data processing device may determine the position and orientation of the third apparatus member relative to the head-worn device in the first coordinate system based on the second data received from the positioning signal receiver indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the position and orientation of the third apparatus member in the base coordinate system.
In another possible implementation, the second apparatus member further comprises a first accelerometer. The data processing device receives data from the first accelerometer, wherein the data received from the first accelerometer indicates an acceleration of the second apparatus member. The data processing device then determines the position and orientation of the third apparatus member relative to the second apparatus member further based on the data received from the first accelerometer indicating the acceleration of the second apparatus member.
In other words, the data processing device determines the position and orientation of the third apparatus member relative to the second apparatus member based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the first accelerometer indicating the acceleration of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member, thereby providing an improved accuracy of the determined position and orientation of the third apparatus member relative to the second apparatus member. Thereby, the position and orientation of the third apparatus member relative to the second apparatus member may be determined in the third coordinate system, and the determined position and orientation of the third apparatus member relative to the second apparatus member may be used, as explained above, to determine the position and orientation of the third apparatus member relative to the head-worn device, and/or to determine the position and orientation of the third apparatus member in the base coordinate system. The first accelerometer may be disposed on or inside the second apparatus member.
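One common way to combine a gyroscope with an accelerometer in this manner is a complementary filter, sketched below for a single joint angle; this is a generic technique, offered here only as an assumption of how the fusion might be implemented.

```python
def complementary_step(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """One filter step: the integrated gyroscope rate is accurate over
    short horizons but drifts, while the accelerometer's gravity-based
    angle is noisy but drift-free; blending the two stabilises the
    estimate of the angle between the members."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

The blending factor alpha trades short-term gyroscope accuracy against long-term accelerometer stability and would be tuned for the sensors actually used.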
In another possible implementation, a translation sensor is disposed on the second apparatus member or the third apparatus member, wherein the translation sensor is a flex sensor or an optical distance sensor. The data processing device receives data from the translation sensor, wherein the data received from the translation sensor indicates a translation of the third apparatus member relative to the second apparatus member. The data processing device may then determine the position and orientation of the third apparatus member relative to the second apparatus member further based on the data received from the translation sensor indicating the translation of the third apparatus member relative to the second apparatus member.
In other words, the data processing device determines the position and orientation of the third apparatus member relative to the second apparatus member based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the first accelerometer indicating the acceleration of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, based on the data received from the second accelerometer indicating the acceleration of the third apparatus member, and based on the data received from the translation sensor indicating the translation of the third apparatus member relative to the second apparatus member, thereby providing a further improved accuracy of the determined position and orientation of the third apparatus member relative to the second apparatus member.
The translation sensor may be any mechanical sensor configured to measure a displacement. In one example, the translation sensor is a strain sensor configured such that a deflection of the sensor changes its electrical resistance. The translation sensor may be configured to detect translational movements between the second and third apparatus members. In particular, the translation sensor may be mounted in a position such that rotations of the third apparatus member relative to the second apparatus member cause no or only small deflections of the translation sensor, whereas translations of the third apparatus member relative to the second apparatus member cause large deflections of the translation sensor. The translation sensor may be used to resolve ambiguities in the determination of the position and orientation of the third apparatus member relative to the second apparatus member.
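As a hypothetical example of reading such a sensor, a flex sensor may be wired as one leg of a voltage divider so that deflection changes the measured voltage. All constants below are illustrative calibration values, not taken from the disclosure.

```python
# Illustrative calibration constants for a flex sensor in a voltage
# divider; none of these values come from the disclosure.
V_SUPPLY = 3.3        # supply voltage in V
R_FIXED = 10_000.0    # fixed divider resistor in ohm
R_FLAT = 25_000.0     # sensor resistance when undeflected, in ohm
MM_PER_OHM = 2.0e-4   # translation per ohm of change, from calibration

def translation_mm(v_measured):
    """Estimate the translation of the third apparatus member relative
    to the second one from the divider voltage at the flex sensor."""
    r_sensor = R_FIXED * (V_SUPPLY - v_measured) / v_measured
    return MM_PER_OHM * (r_sensor - R_FLAT)
```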
In another possible implementation, the apparatus comprises a model of a joint of a human. The second apparatus member comprises a first bone model, wherein the first bone model is a model of a first bone of the joint. The third apparatus member comprises a second bone model, and the second bone model is a model of a second bone of the joint.
In particular, the model of the joint may be a model of a knee joint. The second apparatus member may comprise a femur model, and the third apparatus member may comprise a tibia model. Further, the third apparatus member may comprise a fibula model.
Alternatively, the second apparatus member may comprise a tibia model, and the third apparatus member may comprise a femur model. Further, the second apparatus member may comprise a fibula model.
The model of the joint may be adapted for the training of physicians or medical staff in the examination or treatment of human joints.
The determined position and orientation of the second apparatus member relative to the head-worn device is preferably a position and orientation of the first bone model relative to the head-worn device. More specifically, the determined position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the first bone model relative to the display of the head-worn device.
Similarly, the determined position and orientation of the third apparatus member relative to the head-worn device is preferably a position and orientation of the second bone model relative to the head-worn device or relative to the display of the head-worn device.
The first apparatus member may be a frame, and the model of the joint may be mounted on the frame. In particular, the first bone model may be mounted rotatably on the frame.
In another possible implementation, the data processing device receives second image data from the camera of the head-worn device. The data processing device augments the second image data with virtual representations of anatomical structures of a virtual patient, wherein the augmenting of the second image data is based on the determined position and orientation of the second apparatus member relative to the head-worn device and based on the determined position and orientation of the third apparatus member relative to the head-worn device. The data processing device further sends the augmented second image data to the head-worn device for being displayed.
The head-worn device may not be transparent. The anatomical structures of the virtual patient may comprise portions of skin, one or more bones, one or more ligaments, and/or one or more muscles of the virtual patient. The one or more bones may comprise the first and second bones mentioned before. To obtain the augmented second image data, the anatomical structures of the virtual patient are rendered by using the determined position and orientation of the second apparatus member relative to the head-worn device and the determined position and orientation of the third apparatus member relative to the head-worn device. Thereby, the determined position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the first bone model relative to the head-worn device, and the determined position and orientation of the third apparatus member relative to the head-worn device may be the position and orientation of the second bone model relative to the head-worn device.
The augmentation of the second image data with virtual representations of anatomical structures of the virtual patient can improve a learning experience of a physician or medical staff, who uses the model of the joint to train the examination and/or treatment of pathologies, because the physician or medical staff can not only haptically perceive a pathology using the model of the joint, but can also observe internal anatomical structures such as bones, ligaments, and/or muscles of the virtual patient.
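For illustration, the geometric core of such an augmentation is a projection of virtual structures, expressed in a bone-model frame, into the camera image using the tracked poses. The following Python sketch assumes a pinhole camera model and uses OpenCV only for drawing; it is a simplified stand-in for a full renderer.

```python
import cv2
import numpy as np

def project_points(pts_member, T_member_in_hmd, K):
    """Project 3-D points given in a bone-model frame into the image of
    the head-worn device's camera (pinhole model with intrinsics K; the
    camera frame is taken as the first coordinate system)."""
    pts_h = np.c_[pts_member, np.ones(len(pts_member))]
    pts_cam = (T_member_in_hmd @ pts_h.T)[:3]
    uv = (K @ pts_cam)[:2] / pts_cam[2]
    return uv.T

def overlay_structure(frame, pts_member, T_member_in_hmd, K, color):
    """Draw a virtual anatomical structure, given as a polyline of 3-D
    points rigidly attached to a tracked bone model, over the frame."""
    uv = project_points(pts_member, T_member_in_hmd, K)
    pts = uv.reshape(-1, 1, 2).astype(np.int32)
    cv2.polylines(frame, [pts], False, color, thickness=2)
    return frame
```

A ligament model attached to the second apparatus member would be drawn with the pose of the second apparatus member, and one attached to the third apparatus member with the pose of the third apparatus member.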
The data processing device may further augment the second image data with a menu. The menu may allow a user to configure the augmentation of the second image data. In particular, the menu may allow selecting the anatomical structures of the virtual patient that are to be augmented on the second image data. Further, the menu may allow selecting between different pathology settings, which may be, in the example of a knee joint, anterior cruciate ligament rupture, posterior cruciate ligament rupture, or medial ligament rupture. Further, the menu may allow selecting an exam mode with restricted functions and choices for the user. In the exam mode, the second image data may be augmented with anamneses instead of anatomical structures of the virtual patient. The menu may further allow selecting between different modes such as a mode for the training of beginners and another mode for the training of advanced learners, wherein the second image data may be augmented with different subsets of anatomical structures of the virtual patient in these different modes.
Based on the augmentation of the second image data with the menu, a user input may be determined by tracking the hands of the user. This may be implemented, for example, by means of the Ultraleap technology provided by LeapMotion Inc., San Francisco, USA. The second image data may also be augmented with virtual representations of the hands of the user. The menu navigation may be implemented by means of gesture recognition.
The present disclosure is not limited to user inputs based on an augmented menu and the tracking of the hands of the user. Various other input means may be used, such as touch screens, voice recognition, keyboards, mice, etc.
In another possible implementation, the data processing device computes third image data by rendering virtual representations of anatomical structures of a virtual patient, wherein the anatomical structures of the virtual patient are rendered based on the determined position and orientation of the second apparatus member relative to the head-worn device and based on the determined position and orientation of the third apparatus member relative to the head-worn device. The data processing device further sends the third image data to the head-worn device for being displayed.
The head-worn device may be at least partially transparent. Hence, the direct visual perception of the user may be augmented with the third image data to improve the learning experience of the user. The anatomical structures of the virtual patient may comprise portions of skin, one or more bones, one or more ligaments, and/or one or more muscles of the virtual patient, wherein the one or more bones may comprise the first and second bones mentioned before. To obtain the third image data, the anatomical structures of the virtual patient are rendered by using the determined position and orientation of the second apparatus member relative to the head-worn device and the determined position and orientation of the third apparatus member relative to the head-worn device. Thereby, the determined position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the first bone model relative to the head-worn device, and the determined position and orientation of the third apparatus member relative to the head-worn device may be the position and orientation of the second bone model relative to the head-worn device.
The third image data may further comprise a visualization of a menu similar to the menu described above for the augmented second image data. Details are not repeated here.
A user input may be determined by tracking the hands of the user while the third image data including the menu is displayed by the display of the head-worn device. The menu navigation may be implemented using gesture recognition.
The present disclosure is not limited to user inputs based on displaying a menu and tracking the hands of the user. Various other input means may be used, such as touch screens, voice recognition, keyboards, mice, etc.
In another possible implementation, the positioning signal receiver receives a first positioning signal from a first base station and a second positioning signal from a second base station, and the positioning signal receiver determines the first data based on the first positioning signal and the second positioning signal. Further, the positioning signal receiver sends the first data to the data processing device.
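Purely as an illustration of one way such a receiver could compute a position from two base stations, the sketch below treats each base station measurement as a bearing ray and intersects the two rays in a least-squares sense; the sweep-angle parameterisation is an assumption modelled on lighthouse-style systems.

```python
import numpy as np

def ray_from_sweeps(azimuth, elevation):
    """Unit ray direction in a base station's frame from its horizontal
    and vertical sweep angles (lighthouse-style measurements)."""
    d = np.array([np.tan(azimuth), np.tan(elevation), 1.0])
    return d / np.linalg.norm(d)

def triangulate(p1, d1, p2, d2):
    """Least-squares intersection of two rays p_i + t_i * d_i: the
    midpoint of the shortest segment between them approximates the
    receiver position in the base coordinate system."""
    A = np.stack([d1, -d2], axis=1)            # 3x2 system in (t1, t2)
    t1, t2 = np.linalg.lstsq(A, p2 - p1, rcond=None)[0]
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
```

Orientation would additionally require bearings to several sensor points at known positions on the head-worn device, which is why such receivers typically carry multiple photodiodes.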
According to a second aspect of the present disclosure, a data processing device is provided, wherein the data processing device is adapted to perform the method of the first aspect or any possible implementation of this method.
According to a third aspect of the present disclosure, a computer program is provided, wherein the computer program comprises instructions for causing the data processing device of the second aspect to carry out the method of the first aspect or any of the possible implementations of this method.
According to a fourth aspect of the present disclosure, a computer-readable medium is presented, wherein the computer-readable medium stores the computer program of the third aspect.
According to a fifth aspect of the present disclosure, a model of a joint of a human is provided. The model of the joint comprises a first bone model, wherein the first bone model is a model of a first bone of the joint. The model of the joint further comprises a second bone model, wherein the second bone model is a model of a second bone of the joint. The model of the joint further comprises one or more first ligament models, wherein the one or more first ligament models are models of first ligaments connecting the first bone and the second bone of the joint. The model of the joint further comprises an apparatus for tensioning the one or more first ligament models such that a first end section of the second bone model is pulled towards a first end section of the first bone model. The apparatus for tensioning the one or more first ligament models is attached to the first bone model. The one or more first ligament models are attached to the first end section of the second bone model. The first bone model has one or more first apertures at its first end section, and the one or more first ligament models extend from the first end section of the second bone model through the one or more first apertures to the apparatus for tensioning the one or more first ligament models.
The model of the joint may be a model of a knee joint. The first bone model may be a femur model, and the second bone model may be a tibia model. The second bone model may further comprise a fibula model.
Alternatively, the first bone model may be a tibia model, and the second bone model may be a femur model, wherein the first bone model may further comprise a fibula model.
The first bone model and the second bone model may each comprise a bone base structure, which may be reinforced by means of reinforcement ribs and/or fiberglass mesh. Additionally, for each of the first and second bone models, a tube such as an aluminum tube may be used to improve a mechanical strength, wherein the tube may be laminated using fiberglass mesh.
The one or more first ligament models may be implemented using cotton textiles. The cotton textiles can be flat woven. The cotton textiles may be laid in a sheath, similar to a Bowden cable sheath. The first ligament models may be bendable and compressible. The first ligament models are preferably not elastic.
The joint model may be embedded in layers of a flexible polyurethane foam and silicone, wherein these layers may be configured to provide realistic haptic representations of soft tissues such as muscles and skin structures.
The tensions of the one or more first ligament models may be set manually using the apparatus for tensioning the one or more first ligament models. Alternatively, the apparatus for tensioning the one or more first ligament models may comprise one or more actuators for tensioning the one or more first ligament models. The apparatus for tensioning the one or more first ligament models may be attached to a part of the first bone model that is remote from the first end section of the first bone model.
The first end section of the first bone model may comprise a separate first aperture for each first ligament model of the one or more first ligament models so that the one or more first ligament models extend through separate first apertures at the first end section of the first bone model. Alternatively, several first ligament models of the one or more first ligament models may extend through a single first aperture of the one or more first apertures.
In another possible implementation of the fifth aspect, the model of the joint further comprises one or more second ligament models and an apparatus for tensioning the one or more second ligament models. The one or more second ligament models are models of second ligaments connecting the first bone and the second bone of the joint. The one or more second ligament models are attached to the first end section of the first bone model, the second bone model has one or more second apertures at its first end section, the apparatus for tensioning the one or more second ligament models is attached to the second bone model, and the one or more second ligament models extend from the first end section of the first bone model through the one or more second apertures to the apparatus for tensioning the one or more second ligament models.
The first end section of the second bone model may comprise a separate second aperture for each second ligament model of the one or more second ligament models so that the one or more second ligament models extend through separate second apertures at the first end section of the second bone model. Alternatively, several second ligament models of the one or more second ligament models may extend through one second aperture of the one or more second apertures.
The tensions of the one or more second ligament models may be set manually using the apparatus for tensioning the one or more second ligament models. Alternatively, the apparatus for tensioning the one or more second ligament models may comprise one or more actuators for tensioning the one or more second ligament models. The apparatus for tensioning the one or more second ligament models may be attached to a part of the second bone model that is remote from the first end section of the second bone model.
The first and second ligament models may be implemented similarly. Further, the apparatus for tensioning the one or more first ligament models may be implemented similarly to the apparatus for tensioning the one or more second ligament models. In some implementations, the model of the joint comprises no first ligament models, but one or more second ligament models.
In another possible implementation of the fifth aspect, the model of the joint comprises a controller that is connected to the apparatus for tensioning the one or more first ligament models, the apparatus for tensioning the one or more first ligament models comprises one or more actuators for tensioning the one or more first ligament models, and the controller is configured to control tensions of the one or more first ligament models based on a pathology setting using the one or more actuators.
The model of the joint may be a model of a knee joint.
The controller may be configured to obtain the pathology setting. In particular, the controller may be configured to receive the pathology setting from a data processing device, or the controller may be a unit of the data processing device, wherein the data processing device may obtain the pathology setting based on user input. The user input may be implemented by tracking the hands of the user and by displaying a menu by the head-worn device as described above, wherein the menu allows a user to select the pathology setting. Alternatively, the user input may be implemented by means of a touch screen, by means of voice input, or by means of another user input technique.
The controller may be configured to send signals to the actuators of the apparatus for tensioning the one or more first ligament models to control these actuators. Based on the signals received from the controller, the actuators of the apparatus for tensioning the one or more first ligament models may control tensions of the one or more first ligament models.
The controller may further be configured to send signals to the actuators of the apparatus for tensioning the one or more second ligament models to control these actuators. Based on the signals received from the controller, the actuators of the apparatus for tensioning the one or more second ligament models may control tensions of the one or more second ligament models.
In the example of a knee joint, based on the signals received from the controller, the actuators of the apparatus for tensioning the one or more first ligament models and/or the actuators of the apparatus for tensioning the one or more second ligament models may control tensions of the one or more first and/or second ligament models to represent an anterior cruciate ligament rupture, a posterior cruciate ligament rupture, or a medial ligament rupture.
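A controller implementation could, for example, map each pathology setting to per-ligament tension commands. The sketch below is hypothetical throughout: the setting names, scale factors, and actuator interface are assumptions, not taken from the disclosure.

```python
# Hypothetical mapping from a pathology setting to per-ligament tension
# scale factors (1.0 = intact, 0.0 = ruptured).
PATHOLOGY_TENSIONS = {
    "healthy":        {"acl": 1.0, "pcl": 1.0, "mcl": 1.0},
    "acl_rupture":    {"acl": 0.0, "pcl": 1.0, "mcl": 1.0},
    "pcl_rupture":    {"acl": 1.0, "pcl": 0.0, "mcl": 1.0},
    "medial_rupture": {"acl": 1.0, "pcl": 1.0, "mcl": 0.0},
}

def apply_pathology(setting, actuators, base_tension_n=40.0):
    """Command each tensioning actuator for the selected pathology;
    `actuators` maps a ligament name to an object exposing an assumed
    set_tension(newtons) method."""
    for ligament, scale in PATHOLOGY_TENSIONS[setting].items():
        actuators[ligament].set_tension(scale * base_tension_n)
```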
In another possible implementation, the model of the joint further comprises a first gyroscope disposed on the first bone model, a second gyroscope disposed on the second bone model, and a second accelerometer disposed on the second bone model.
The first gyroscope, the second gyroscope, and the second accelerometer may correspond to the first gyroscope, the second gyroscope, and the second accelerometer of the first aspect. Details are not described again.
In another possible implementation, the model of the joint further comprises a first accelerometer disposed on the first bone model.
The first accelerometer may correspond to the first accelerometer of the first aspect. Details are not described again.
In another possible implementation, the model of the joint further comprises a translation sensor disposed on the first bone model or the second bone model. The translation sensor is a flex sensor or an optical distance sensor, and the translation sensor is configured to measure a translation of the second bone model relative to the first bone model. The translation sensor may correspond to the translation sensor of the first aspect. Details are not described again.
According to a sixth aspect of the present disclosure, a system is provided, wherein the system comprises the data processing device of the second aspect. The system further comprises an apparatus comprising the model of a joint of the fifth aspect, wherein the model of the joint is disposed on a first apparatus member of the apparatus. The system further comprises a head-worn device, the head-worn device comprising a display, a camera, and a positioning signal receiver. The system further comprises a first base station and a second base station for transmitting positioning signals. The data processing device, the apparatus, the head-worn device, the first base station, and the second base station are adapted to perform the method according to the first aspect or any of the possible implementations of this method.
The first apparatus member may be a frame.
It shall be understood that the method for determining the position and orientation of the apparatus, the data processing device, the computer program, the (non-transitory) computer-readable medium, the model of a joint of a human, and the system have similar and/or identical embodiments.
These and other aspects of the present disclosure will become apparent from and be elucidated with reference to the embodiments described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
Exemplary embodiments of the invention will be described in the following with reference to the accompanying drawings:
Fig. 1 shows schematically and exemplarily a mixed reality system according to an embodiment of the present application;
Fig. 2 shows schematically and exemplarily a method for determining a position and orientation of an apparatus relative to a head-worn device according to an embodiment of the present application;
Figs. 3A, 3B, and 3C show schematically and exemplarily views of a knee joint model according to an embodiment of the present application.
DETAILED DESCRIPTION OF EMBODIMENTS
Fig. 1 shows schematically and exemplarily a mixed reality system according to an embodiment of the present application.
The mixed reality system comprises an apparatus, wherein the apparatus comprises a first apparatus member 110 and a model of a human knee joint. The model of the knee joint comprises a second apparatus member 120 and a third apparatus member 130. The second apparatus member 120 comprises a first bone model, wherein the first bone model is a model of a first bone of the knee joint. The third apparatus member 130 comprises a second bone model, wherein the second bone model is a model of a second bone of the knee joint. The first bone model is a femur model, and the second bone model is a tibia model, wherein the second bone model may further comprise a fibula model. In other embodiments not depicted by the figures, the first bone model is a tibia model, and the second bone model is a femur model, wherein the first bone model may further comprise a fibula model. The second apparatus member 120 comprises one or more first sensors 121. In some examples, the one or more first sensors are disposed on or inside the second apparatus member 120. The third apparatus member 130 comprises one or more second sensors 131. In some examples, the one or more second sensors are disposed on or inside the third apparatus member. The knee joint model is rotatably mounted on the first apparatus member 110, which may be a frame. One or more markers 111 are arranged on the surface of the first apparatus member 110.
The mixed reality system further comprises a plurality of base stations 101 configured to transmit positioning signals. The base stations may for example be base stations of the Valve lighthouse system as described in "DronOS: A flexible open-source prototyping framework for interactive drone routines" by M. Hoppe et al., Int. Conf. on Mobile and Ubiquitous Multimedia (MUM), ACM, 2019, pp. 15:1-15:7; "Performance bounds in positioning with the Vive lighthouse system" by M. Greiff et al., IEEE Int. Conf. on Information Fusion (FUSION), 2019, pp. 1-8; or "Automated testing of industrial robots using HTC Vive for motion tracking" by K. Sletten, M.Sc. thesis, University of Stavanger, Norway, 2017. Hence, the base stations may also be referred to as lighthouses. In other examples, other base stations 101 may be used to transmit positioning signals.
The mixed reality system further comprises a head-worn device 140, which comprises a display 142, a camera 144, and a positioning signal receiver 146 for receiving positioning signals. The head-worn device may be worn on the head of a user and may be an augmented or virtual reality headset or augmented or virtual reality glasses. The positioning signal receiver 146 is configured to receive positioning signals transmitted by the plurality of base stations 101 and to provide data to a data processing device 150, wherein the data provided to the data processing device 150 indicates the position and orientation of the head-worn device 140 in a base coordinate system that is fixed relative to the positions of the base stations 101.
The mixed reality system further comprises a data processing device 150. The data processing device is depicted in Fig. 1 as separate from the head-worn device 140. In other examples, the data processing device 150 may be integrated in the head-worn device 140, or the data processing device may be implemented in a distributed manner comprising a plurality of data processing subsystems, wherein one or more of these data processing subsystems may be integrated in the head-worn device 140. The data processing device comprises one or more memories 152, one or more processors 154, and one or more transceiver units 156. The data processing device 150 is communicatively coupled to the display 142, the camera 144, and the positioning signal receiver 146 of the head-worn device 140 via the one or more transceiver units 156 and via one or more cables 148. In other examples, the data processing device 150 may be wirelessly connected to the display 142, the camera 144, and/or the positioning signal receiver 146 of the head-worn device 140.
The data processing device 150 is configured to receive image data from the camera 144 of the head-worn device 140. The image data received by the data processing device 150 from the camera 144 may show a marker 111 that is arranged on the surface of the first apparatus member 110. The data processing device 150 is configured to determine a position and orientation of the first apparatus member 110 relative to the head-worn device 140 based on the image data received from the camera 144 and based on geometry information about the marker 111.
Further, the data processing device 150 is configured to receive data from the positioning signal receiver 146 of the head-worn device 140, wherein the data received from the positioning signal receiver 146 indicates the position and orientation of the head-worn device in the base coordinate system. Further, the data processing device is configured to update the position and orientation of the first apparatus member 110 relative to the head-worn device 140 based on data received from the positioning signal receiver 146 indicating the position and orientation of the head-worn device in the base coordinate system.
In the embodiment of Fig. 1, the second apparatus member 120 is rotatably mounted on the first apparatus member 110. The data processing device 150 is configured to receive data from the one or more first sensors 121, and the data processing device 150 is configured to determine the position and orientation of the second apparatus member 120 relative to the head-worn device 140 based on the determined position and orientation of the first apparatus member 110 relative to the head-worn device, based on geometry information about the first and second apparatus members, and based on the data received from the one or more first sensors 121. For example, the one or more first sensors 121 comprise a first gyroscope, and the data processing device 150 is configured to receive data from the first gyroscope, wherein the data received from the first gyroscope indicates an orientation of the second apparatus member 120. The data processing device 150 may be configured to determine the position and orientation of the second apparatus member 120 relative to the head-worn device 140 based on the determined position and orientation of the first apparatus member 110 relative to the head-worn device, based on geometry information about the first and second apparatus members, and based on the data received from the first gyroscope indicating the orientation of the second apparatus member 120.
The data processing device 150 is further configured to receive data from the one or more second sensors 131 of the third apparatus member 130. The data processing device 150 is configured to determine the position and orientation of the third apparatus member 130 relative to the head-worn device 140 based on the determined position and orientation of the first apparatus member 110 relative to the head-worn device, based on the geometry information about the first and second apparatus members, based on the data received from the one or more first sensors 121, and based on the data received from the one or more second sensors 131. For example, the second sensors 131 may comprise a second gyroscope and a second accelerometer. The data processing device 150 may be configured to receive data from the second gyroscope, wherein the data received from the second gyroscope indicates an orientation of the third apparatus member 130. The data processing device 150 may further be configured to receive data from the second accelerometer, wherein the data received from the second accelerometer indicates an acceleration of the third apparatus member 130. The data processing device may further be configured to determine a position and orientation of the third apparatus member 130 relative to the second apparatus member 120 based on the data received from the first gyroscope indicating the orientation of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member.
The data processing device 150 may further be configured to augment second image data received from the camera 144 of the head-worn device 140 with virtual representations of anatomical structures of a virtual patient, wherein the augmenting of the second image data is based on the determined position and orientation of the second apparatus member 120 relative to the head-worn device and based on the determined position and orientation of the third apparatus member 130 relative to the head-worn device. The data processing device 150 may send the augmented second image data or the computed third image data to the head-worn device 140 for being displayed.
The anatomical structures of the virtual patient may comprise portions of skin, one or more bones, one or more ligaments, and/or one or more muscles of the virtual patient. The displaying of the augmented second image data comprising the virtual representations of anatomical structures of the virtual patient can improve a learning experience of a physician or medical staff, who uses the knee joint model to train the examination and/or the treatment of pathologies, because the physician or medical staff can not only haptically perceive a pathology using the knee joint model, but can also observe internal anatomical structures such as bones, ligaments, and/or muscles of the virtual patient.
Fig. 2 shows schematically and exemplarily a method for determining a position and orientation of an apparatus relative to a head-worn device according to an embodiment of the present application. The method may be performed by the data processing device 150 of Fig. 1. The apparatus comprises the first apparatus member 110 of Fig. 1. The apparatus may further comprise the second apparatus member 120 and the third apparatus member 130 of Fig. 1. The head-worn device may correspond to the head-worn device 140 of Fig. 1. However, the method of Fig. 2 is not limited to apparatuses comprising a knee joint model or a medical model. The apparatus may alternatively be an industrial robot, a machine, or a vehicle, for example. The present disclosure is not limited in this respect.
The method of Fig. 2 includes the following steps:
S210: The data processing device receives first image data from the camera of the head-worn device, wherein the first image data shows a marker that is arranged on a surface of a first apparatus member of the apparatus.
The camera of the head-worn device may correspond to the camera 144 of the head-worn device 140 of Fig. 1, and the marker may correspond to one of the markers 111 of Fig. 1.
A plurality of markers may be arranged on the surface of the first apparatus member. The first image data may show one or more markers of the plurality of markers, whereas other markers of the plurality of markers may not be depicted by the first image data. The data processing device may determine the position and orientation of the first apparatus member relative to the head-worn device based on the first image data and based on geometry information about the one or more markers shown in the first image data, wherein the geometry information may comprise information about shapes, colors, and/or distances of the one or more markers, for example. Some markers may be QR codes. Here and in the sequel, positions and orientations relative to the head-worn device may be positions and orientations relative to the display of the head-worn device.
S220: The data processing device determines a position and orientation of the first apparatus member relative to the head-worn device based on the first image data and based on geometry information of the marker.
The position and orientation of the first apparatus member relative to the head-worn device may be determined by the data processing device in a first coordinate system, which may be centered on or fixed relative to the head-worn device, so that the first coordinate system moves and/or rotates together with the head-worn device.
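By way of illustration, one standard way to perform this determination for a square planar marker is a perspective-n-point solution, for example with OpenCV. The sketch below assumes the four marker corners have already been detected in pixel coordinates and that the camera intrinsics are known; it is one possible implementation, not necessarily the disclosed one.

```python
import cv2
import numpy as np

def marker_pose(corners_px, marker_side_m, K, dist_coeffs):
    """Pose of a square planar marker relative to the camera, from its
    four detected corner pixels; the geometry information used here is
    simply the marker's physical side length."""
    s = marker_side_m / 2.0
    object_pts = np.array([[-s,  s, 0.0], [ s,  s, 0.0],
                           [ s, -s, 0.0], [-s, -s, 0.0]])
    ok, rvec, tvec = cv2.solvePnP(object_pts, corners_px, K, dist_coeffs)
    if not ok:
        raise ValueError("PnP failed for the detected marker corners")
    Rm, _ = cv2.Rodrigues(rvec)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = Rm, tvec.ravel()
    return T  # marker pose in the camera frame; invert for the camera pose
```

With the marker's mounting pose on the first apparatus member known, this camera-frame marker pose directly yields the position and orientation of the first apparatus member in the first coordinate system.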
S230: The data processing device receives first data from a positioning signal receiver of the head-worn device, wherein the first data received from the positioning signal receiver indicates a first position and orientation of the head-worn device at a first time. Step S230 may be performed before, between, or after S210 and S220.
The positioning signal receiver may correspond to the positioning signal receiver 146 of the head-worn device 140 of Fig. 1. The positioning signals may be transmitted by a plurality of base stations, wherein the base stations may correspond to the base stations 101 of Fig. 1. The positioning signal receiver may determine the first data based on positioning signals received at or close to the first time. To determine the first and/or second data, the positioning signal receiver may determine parameters of the positioning signals such as times of arrival, angles of arrival, etc.
The first data provided by the positioning signal receiver indicates a first position and orientation of the head-worn device at a first time. The first position and orientation may be a position and orientation in a base coordinate system, wherein the base coordinate system may be fixed relative to the positions of the base stations that transmit the positioning signals.
S240: The data processing device also receives second data from the positioning signal receiver of the head-worn device, wherein the second data received from the positioning signal receiver indicates a second position and orientation of the head-worn device at a second time that is after the first time.
Step S240 is similar to S230, but the positioning signal receiver may determine the second data based on positioning signals received at or close to the second time. The second position and orientation may be a position and orientation of the head-worn device in the base coordinate system.
S250: The data processing device updates the position and orientation of the first apparatus member relative to the head-worn device based on the first data indicating the first position and orientation of the head-worn device at the first time and based on the second data indicating the second position and orientation of the head-worn device at the second time.
The first image data is preferably captured by the camera at or close to the first time. Hence, the position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system and the first position and orientation of the head-worn device in the base coordinate system may be determined based on measurements at approximately the same time instant. Assuming that the head-worn device does not move, or hardly moves, relative to the base stations between the capture of the first image data and the first time, the position and orientation of the first apparatus member in the base coordinate system may be determined. In particular, the position and orientation of the first apparatus member in the base coordinate system may be determined based on the first data indicating the position and orientation of the head-worn device in the base coordinate system at the first time and based on the position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system as determined based on the first image data.
It may further be assumed that the position and orientation of the first apparatus member in the base coordinate system does not change or changes only infrequently. In particular, it may be assumed that the position and orientation of the first apparatus member in the base coordinate system does not change, or hardly changes, between the first and second times. Based on this assumption, the position and orientation of the first apparatus member relative to the head-worn device can be updated based on the first data indicating the first position and orientation of the head-worn device at the first time and based on the second data indicating the second position and orientation of the head-worn device at the second time.
The updating of the position and orientation of the first apparatus member relative to the head-worn device can be implemented based on the position and orientation of the first apparatus member in the base coordinate system and based on the second data indicating the second position and orientation of the head-worn device at the second time in the base coordinate system, wherein the position and orientation of the first apparatus member in the base coordinate system is assumed to be static or semi-static.
It is to be noted that the updating of the position and orientation of the first apparatus member relative to the head-worn device does not require that the data processing device receives image data from the camera of the head-worn device for the second time depicting a marker on the surface of the first apparatus member. This is relevant, because images captured by the camera may not always show a marker that is arranged on the surface of the first apparatus member. However, if the camera can provide further image data to the data processing device, which shows one or more third markers that are arranged on the surface of the first apparatus member, wherein the further image data has been captured by the camera at or close to the second time, the data processing device may determine a second position and orientation of the first apparatus member relative to the head-worn device based on the further image data captured at or close to the second time. Further, the data processing device may update the position and orientation of the first apparatus member in the base coordinate system based on the second data indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the determined second position and orientation of the first apparatus member relative to the head-worn device in the first coordinate system.
In an embodiment, the apparatus may further comprise a second apparatus member that is movably mounted on the first apparatus member. The second apparatus member may correspond to the second apparatus member of Fig. 1. The first and second apparatus members may be configured such that a movement of the second apparatus member relative to the first apparatus member is limited to a curve or a surface. The second apparatus member may comprise a first sensor, which may be disposed on or inside the second apparatus member. The first sensor may correspond to the one or more first sensors 121 of Fig. 1.
The method optionally further comprises step S260: The data processing device receives data from the first sensor, wherein the data received from the first sensor indicates a first position or a first orientation of the second apparatus member. The data processing device determines a position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device, based on geometry information about the first and second apparatus members, and based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member.
The first sensor may be an accelerometer, a gyroscope, or a magnetometer. The apparatus may be configured to limit the movement of the second apparatus member relative to the first apparatus member to a manifold, wherein the manifold is a curve or a surface. The curve may be a section of a line or a section of a circle, for example. The surface may be a section of a plane or a section of a sphere, for example. Due to the limitation of the movement of the second apparatus member relative to the first apparatus member, the data processing device may efficiently and/or accurately determine the position and orientation of the second apparatus member relative to the first apparatus member based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member and based on the geometry information about the apparatus. The data processing device may determine the position and orientation of the second apparatus member relative to the first apparatus member in a second coordinate system that is centered on or fixed relative to the first apparatus member. The geometry information about the apparatus may be or may include information about the curve or the surface.
In an embodiment, the data processing device may determine the position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device and based on the determined position and orientation of the second apparatus member relative to the first apparatus member. The data processing device preferably determines the position and orientation of the second apparatus member relative to the head-worn device in the first coordinate system. The position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the second apparatus member relative to the display of the head-worn device, wherein the display of the head-worn device may correspond to the display 142 of the head-worn device 140 of Fig. 1.
The data processing device may further determine the position and orientation of the second apparatus member in the base coordinate system based on the position and orientation of the first apparatus member in the base coordinate system, based on the geometry information about the first and second apparatus
members, and based on the data received from the first sensor indicating the first position or the first orientation of the second apparatus member. This may be implemented by determining the position and orientation of the second apparatus member in the base coordinate system based on the position and orientation of the first apparatus member in the base coordinate system and based on the position and orientation of the second apparatus member relative to the first apparatus member.
In some embodiments, the data processing device may determine the position and orientation of the second apparatus member relative to the head-worn device in the first coordinate system based on the second data received from the positioning signal receiver indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the position and orientation of the second apparatus member in the base coordinate system.
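A minimal sketch of this change of coordinate system, assuming both poses are available as 4x4 homogeneous transforms in the base coordinate system (all variable names and values are hypothetical):

```python
import numpy as np

def invert_pose(T: np.ndarray) -> np.ndarray:
    """Invert a rigid 4x4 transform using the transpose of its rotation block."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# T_base_hwd: head-worn device pose at the second time (from the positioning
# signal receiver); T_base_second: second apparatus member pose in the base
# coordinate system. The values here are hypothetical placeholders.
T_base_hwd = np.eye(4)
T_base_hwd[:3, 3] = [0.0, 0.0, 1.7]
T_base_second = np.eye(4)
T_base_second[:3, 3] = [1.0, 0.3, 0.8]

# Pose of the second apparatus member in the first coordinate system,
# i.e. relative to the head-worn device.
T_hwd_second = invert_pose(T_base_hwd) @ T_base_second
```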
In an implementation of S260, the second apparatus member may be rotatably mounted on the first apparatus member, the first sensor may be a first gyroscope, the data received by the data processing device from the first sensor may indicate the first orientation of the second apparatus member, and the data processing device may determine the position and orientation of the second apparatus member relative to the head-worn device based on the updated position and orientation of the first apparatus member relative to the head-worn device, based on the geometry information about the apparatus, and based on the data received from the first gyroscope indicating the first orientation of the second apparatus member.
Hence, the apparatus may be configured to limit the movement of the second apparatus member relative to the first apparatus member to at least a section of a circle or a section of a sphere. The geometry information about the apparatus may be or may include information about the circle, the sphere, or the section thereof.
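As an illustration of the rotatable case, the sketch below parameterizes the constrained movement by a single hinge angle obtained from the first gyroscope; the hinge axis and pivot point stand in for the geometry information about the apparatus, and all names and values are assumptions rather than the disclosed implementation.

```python
import numpy as np

def hinge_pose(angle_rad: float, axis: np.ndarray, pivot: np.ndarray) -> np.ndarray:
    """Relative pose of a member rotating about a fixed hinge axis through a pivot."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    # Rodrigues' rotation formula for a rotation about the hinge axis.
    R = np.eye(3) + np.sin(angle_rad) * K + (1.0 - np.cos(angle_rad)) * (K @ K)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = pivot - R @ pivot  # rotation about the pivot, not the origin
    return T

# Hypothetical geometry information: hinge axis and pivot in the first
# member's coordinate system; the angle comes from the first gyroscope.
T_first_second = hinge_pose(np.deg2rad(30.0),
                            np.array([0.0, 1.0, 0.0]),
                            np.array([0.0, 0.0, 0.2]))
```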
In another embodiment, the apparatus may further comprise a third apparatus member, wherein the third apparatus member may be attached to the second apparatus member. The third apparatus member may correspond to the third apparatus member 130 of Fig. 1. The third apparatus member may comprise a second gyroscope and a second accelerometer. The second gyroscope and the second accelerometer may be disposed on or inside the third apparatus member.
The second gyroscope and the second accelerometer may correspond to the one or more second sensors 131 of Fig. 1. The method optionally further comprises step S270: The data processing device receives data from the second gyroscope, wherein the data received from the second gyroscope indicates an orientation of the third apparatus member. The data processing device further receives data from the second accelerometer, wherein the data received from the second accelerometer indicates an acceleration of the third apparatus member. The data processing device determines a position and orientation of the third apparatus member relative to the second apparatus member based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member. The data processing device determines a position and orientation of the third apparatus member relative to the head-worn device based on the position and orientation of the second apparatus member relative to the head-worn device and based on the position and orientation of the third apparatus member relative to the second apparatus member.
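One way to obtain the relative orientation from the two gyroscopes, sketched below, is to express both absolute orientation estimates in a common frame and take their relative rotation; the accelerometer data can additionally anchor the tilt components against gravity. Names and values are illustrative assumptions.

```python
import numpy as np

# Orientation estimates of the second and third apparatus members from the
# first and second gyroscopes (in practice obtained by integrating angular
# rates; the example values below are hypothetical).
R_world_second = np.eye(3)
c, s = np.cos(np.deg2rad(20.0)), np.sin(np.deg2rad(20.0))
R_world_third = np.array([[c, -s, 0.0],
                          [s,  c, 0.0],
                          [0.0, 0.0, 1.0]])

# Orientation of the third member expressed in the second member's frame
# (the third coordinate system); the second accelerometer can be used to
# correct the drift of the integrated gyroscope estimates.
R_second_third = R_world_second.T @ R_world_third
```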
The data processing device may determine the position and orientation of the third apparatus member relative to the second apparatus member in a third coordinate system that is centered on or fixed relative to the second apparatus member. Further, the data processing device may determine the position and orientation of the third apparatus member relative to the head-worn device in the first coordinate system. The position and orientation of the third apparatus member relative to the head-worn device may be the position and orientation of the third apparatus member relative to the display of the head-worn device.
The data processing device may further determine the position and orientation of the third apparatus member in the base coordinate system based on the position and orientation of the second apparatus member in the base coordinate system, based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the second gyroscope indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer indicating the acceleration of the third apparatus member. The position and orientation of the third apparatus member in the base coordinate system may be determined, by the data
processing device, based on the position and orientation of the second apparatus member in the base coordinate system and based on the position and orientation of the third apparatus member relative to the second apparatus member.
In some embodiments, the data processing device may determine the position and orientation of the third apparatus member relative to the head-worn device in the first coordinate system based on the second data received from the positioning signal receiver indicating the second position and orientation of the head-worn device at the second time in the base coordinate system and based on the position and orientation of the third apparatus member in the base coordinate system.
In another embodiment, the second apparatus member comprises a first accelerometer, which may be disposed on or inside the second apparatus member. The first accelerometer may correspond to the one or more first sensors 121 of Fig. 1. The data processing device may receive data from the first accelerometer, wherein the data received from the first accelerometer indicates an acceleration of the second apparatus member. The data processing device may then determine the position and orientation of the third apparatus member relative to the second apparatus member further based on the data received from the first accelerometer indicating the acceleration of the second apparatus member.
Thereby, the position and orientation of the third apparatus member relative to the second apparatus member may be determined in the third coordinate system, and the determined position and orientation of the third apparatus member relative to the second apparatus member may be used to determine the position and orientation of the third apparatus member relative to the head-worn device, and/or to determine the position and orientation of the third apparatus member in the base coordinate system.
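A common way to combine a gyroscope and an accelerometer for such an orientation estimate is a complementary filter; the sketch below shows a one-axis version as an assumption of how the fusion could be implemented, not as the method prescribed by the disclosure.

```python
import numpy as np

def complementary_tilt(angle_prev: float, gyro_rate: float,
                       accel: np.ndarray, dt: float,
                       alpha: float = 0.98) -> float:
    """One-axis complementary filter fusing gyroscope and accelerometer data.

    The integrated gyroscope rate tracks fast motion but drifts; the
    accelerometer indicates the tilt against gravity when the member moves
    slowly, but is noisy. Blending the two gives a stable tilt estimate.
    """
    angle_gyro = angle_prev + gyro_rate * dt      # short-term estimate
    angle_accel = np.arctan2(accel[0], accel[2])  # gravity-based estimate
    return alpha * angle_gyro + (1.0 - alpha) * angle_accel
```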
In another embodiment, a translation sensor is disposed on the second apparatus member or the third apparatus member, wherein the translation sensor is a flex sensor or an optical distance sensor. The data processing device may receive data from the translation sensor, wherein the data received from the translation sensor indicates a translation of the third apparatus member relative to the second apparatus member. The data processing device may determine the position and orientation of the third apparatus member relative to the second apparatus member
further based on the data received from the translation sensor indicating the translation of the third apparatus member relative to the second apparatus member.
The translation sensor may be any mechanical sensor configured to measure a displacement. In one example, the translation sensor is a strain sensor configured such that a deflection of the sensor causes a change of an electrical resistance. The translation sensor may be configured to detect translational movements between the second and third apparatus members. In particular, the translation sensor may be mounted in a position such that rotations of the third apparatus member relative to the second apparatus member cause no or only small deflections of the translation sensor, whereas translations of the third apparatus member relative to the second apparatus member cause large deflections of the translation sensor. The translation sensor may be used to resolve ambiguities in the determination of the position and orientation of the third apparatus member relative to the second apparatus member.
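For illustration, a flex or strain sensor of this kind is typically read out through a voltage divider, with the measured resistance change mapped to a displacement via a per-device calibration. Every constant in the sketch below is a hypothetical assumption.

```python
def flex_displacement_mm(adc_counts: int,
                         adc_max: int = 4095,
                         v_ref: float = 3.3,
                         r_fixed: float = 10_000.0,
                         r_flat: float = 25_000.0,
                         mm_per_ohm: float = 0.002) -> float:
    """Estimate a translation from a flex sensor read through a voltage divider.

    The sensor and a fixed resistor r_fixed form a divider; deflecting the
    sensor raises its resistance above its flat value r_flat, and mm_per_ohm
    converts the resistance change into millimetres of displacement.
    """
    v_out = max(adc_counts / adc_max * v_ref, 1e-6)  # avoid division by zero
    r_sensor = r_fixed * (v_ref - v_out) / v_out     # divider solved for sensor
    return (r_sensor - r_flat) * mm_per_ohm
```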
If the second apparatus member includes the first bone model, the determined position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the first bone model relative to the head-worn device. More specifically, the determined position and orientation of the second apparatus member relative to the head-worn device may be the position and orientation of the first bone model relative to the display of the head-worn device. The first gyroscope, the first accelerometer, and/or the translation sensor may be disposed on the first bone model.
Similarly, if the third apparatus member includes the second bone model, the determined position and orientation of the third apparatus member relative to the head-worn device may be the position and orientation of the second bone model relative to the head-worn device or relative to the display of the head-worn device.
The head-worn device may be a virtual reality headset or virtual reality glasses. The data processing device may receive second image data from the camera of the head-worn device. The data processing device may augment the second image data with virtual representations of anatomical structures of a virtual patient, wherein the augmenting of the second image data may be based on the determined position and orientation of the second apparatus member relative to the head-worn device and based on the determined position and orientation of the third apparatus member relative to the head-worn device. The data processing device
may further send the augmented second image data to the head-worn device for being displayed.
The anatomical structures of the virtual patient may comprise portions of skin, one or more bones, one or more ligaments, and/or one or more muscles of the virtual patient. To obtain the augmented second image data, the anatomical structures of the virtual patient may be rendered by using the determined position and orientation of the second apparatus member relative to the head-worn device and the determined position and orientation of the third apparatus member relative to the head-worn device.
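In rendering terms, each determined pose can serve directly as the model matrix of the corresponding virtual structure. The short sketch below assumes 4x4 homogeneous transforms and hypothetical mesh and offset names; it is an illustration, not the disclosed rendering pipeline.

```python
import numpy as np

def model_matrix(T_hwd_member: np.ndarray,
                 T_member_anatomy: np.ndarray) -> np.ndarray:
    """Model matrix placing a virtual anatomical structure over a tracked member.

    T_hwd_member:     determined pose of the bone model relative to the
                      head-worn device (the first coordinate system)
    T_member_anatomy: fixed offset of the rendered mesh (e.g. a ligament)
                      relative to the bone model, known from registration
    """
    return T_hwd_member @ T_member_anatomy

# A renderer would then draw each mesh with projection @ model, e.g.:
# draw(ligament_mesh, projection @ model_matrix(T_hwd_second, T_offset))
```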
The augmentation of the second image data with virtual representations of anatomical structures of the virtual patient can improve the learning experience of a physician or medical staff who use the model of the joint to train the examination and/or treatment of pathologies, because they can not only haptically perceive a pathology using the model of the joint but can also observe internal anatomical structures such as bones, ligaments, and/or muscles of the virtual patient.
The data processing device may further augment the second image data with a menu. The menu may allow a user to configure the augmentation of the second image data. In particular, the menu may allow selecting the anatomical structures of the virtual patient that are to be augmented on the second image data. Further, the menu may allow selecting between different pathology settings, which may be anterior cruciate ligament rupture, posterior cruciate ligament rupture, or medial ligament rupture of a knee joint. Further, the menu may allow selecting an exam mode with restricted functions and choices for the user. In the exam mode, the second image data may be augmented with anamneses instead of anatomical structures of the virtual patient. The menu may further allow selecting between different modes such as a mode for the training of beginners and another mode for the training of advanced learners, wherein the second image data may be augmented with different anatomical structures of the virtual patient in these different modes.
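One possible, purely illustrative representation of such a menu state is a small configuration structure; none of the field names or values below are taken from the disclosure.

```python
# Illustrative menu state for the augmentation settings described above.
menu_config = {
    "pathology": "acl_rupture",            # or "pcl_rupture", "medial_rupture"
    "mode": "beginner",                    # "beginner", "advanced", or "exam"
    "structures": ["bones", "ligaments"],  # anatomy to augment; ignored in exam mode
}

def structures_to_render(cfg: dict) -> list:
    """Select what to overlay based on the chosen mode."""
    if cfg["mode"] == "exam":
        return ["anamnesis"]  # exam mode shows anamneses instead of anatomy
    return cfg["structures"]
```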
Based on the augmentation of the second image data with the menu, a user input may be determined by tracking the hands of the user. The second image data may also be augmented with virtual representations of the hands of the user. The menu navigation may be implemented by means of gesture recognition.
The present disclosure is not limited to user inputs that are based on the augmentation of image data with a menu and the tracking of the hands of the user. Various other input techniques are known in the art, such as touch screens, voice recognition, keyboards, and mice.
In another embodiment, the head-worn device may be an augmented reality headset or augmented reality glasses. The data processing device may compute third image data by rendering virtual representations of anatomical structures of a virtual patient, wherein the anatomical structures of the virtual patient are rendered based on the determined position and orientation of the second apparatus member relative to the head-worn device and based on the determined position and orientation of the third apparatus member relative to the head-worn device. The data processing device further sends the third image data to the head-worn device for being displayed by the display of the head-worn device.
Hence, real visual perceptions of the user may be augmented with the third image data to improve the learning experience of the user. The anatomical structures of the virtual patient may again comprise portions of skin, one or more bones, one or more ligaments, and/or one or more muscles of the virtual patient. To obtain the third image data, the anatomical structures of the virtual patient are rendered by using the determined position and orientation of the second apparatus member relative to the head-worn device and the determined position and orientation of the third apparatus member relative to the head-worn device. Thereby, the positions and orientations of the second and third apparatus members may be the positions and orientations of the first and second bone models, respectively.
Similar to the augmented second image data, the third image data may comprise a visualization of a menu. Based on the displayed third image data, a user input may be determined by tracking the hands of the user. The menu navigation may be implemented using gesture recognition.
The present disclosure is not limited to user inputs based on the displaying of a menu and the tracking of the hands of the user. Various other input techniques are known in the art, such as touch screens, voice recognition, keyboards, and mice.
Figures 3A to 3C show schematically and exemplarily a model of a knee joint of a human. The knee joint model comprises a first bone model 320, wherein the first bone model is a model of a first bone of the joint. The model of the
joint further comprises a second bone model 330, wherein the second bone model is a model of a second bone of the joint. The model of the joint further comprises one or more first ligament models 323, 324, wherein the one or more first ligament models 323, 324 are models of first ligaments connecting the first bone and the second bone of the joint. The model of the joint further comprises an apparatus 322 for tensioning the one or more first ligament models such that a first end section of the second bone model 330 is pulled towards a first end section of the first bone model 320. The apparatus 322 for tensioning the one or more first ligament models is attached to the first bone model 320. The one or more first ligament models 323, 324 are attached to the first end section of the second bone model 330. The first bone model 320 has one or more first apertures at its first end section, and the one or more first ligament models 323, 324 extend from the first end section of the second bone model 330 through the one or more first apertures to the apparatus 322 for tensioning the one or more first ligament models. The model of the knee joint further comprises a portion 328 for guiding the one or more first ligament models 323, 324 to the apparatus 322 for tensioning the one or more first ligament models. The apparatus 322 for tensioning the one or more first ligament models further comprises an aperture 329 that is configured to rotatably mount the knee joint model on a frame, wherein the frame may correspond to the first apparatus member 110 of Fig. 1. The knee joint model may further comprise a model 360 of the patella.
The first bone model 320 may be a model of a femur, and the second bone model 330 may be a model of a tibia, wherein the second bone model may further comprise a model 339 of a fibula. In other embodiments not depicted in the figures, the first bone model may be a model of a tibia, and the second bone model may be a model of a femur, wherein the first bone model may further comprise a model of a fibula.
The first bone model 320 may comprise a bone base structure, which may be reinforced by means of reinforcement ribs and/or fiberglass mesh. Additionally, a tube such as an aluminium tube may be used to improve a mechanical strength of the first bone model 320, wherein the tube may be laminated using fiberglass mesh.
The second apparatus member 120 of Fig. 1 may comprise the first bone model 320, wherein the first bone model may be embedded in layers of a flexible polyurethane foam and silicone, wherein these layers may be configured to
provide realistic haptic representations of soft tissues such as muscles and skin structures. A first gyroscope and a first accelerometer may be disposed on the first bone model.
The apparatus 322 for tensioning the one or more first ligament models may be configured to enable manual settings of the tensions of the one or more first ligament models 323, 324. Alternatively, the apparatus 322 for tensioning the one or more first ligament models may comprise one or more actuators for tensioning the one or more first ligament models 323, 324. The apparatus 322 for tensioning the one or more first ligament models may be attached to a part of the first bone model 320 that is remote from the first end section of the first bone model. In particular, the apparatus 322 for tensioning the one or more first ligament models may be attached to the second end section of the first bone model 320.
The first end section of the first bone model 320 may comprise a separate first aperture for each first ligament model of the one or more first ligament models 323, 324 so that the one or more first ligament models extend through separate first apertures at the first end section of the first bone model. Alternatively, several first ligament models 323, 324 may extend through a single first aperture of the one or more first apertures.
The knee joint model further comprises one or more second ligament models 333 and an apparatus 332 for tensioning the one or more second ligament models. The one or more second ligament models 333 are models of second ligaments connecting the first bone and the second bone of the joint. The one or more second ligament models are attached to the first end section of the first bone model 320, the second bone model 330 has one or more second apertures at its first end section, the apparatus 332 for tensioning the one or more second ligament models is attached to the second bone model, and the one or more second ligament models 333 extend from the first end section of the first bone model 320 through the one or more second apertures to the apparatus 332 for tensioning the one or more second ligament models.
The first end section of the second bone model 330 may comprise a separate second aperture for each second ligament model of the one or more second ligament models 333 so that the one or more second ligament models extend through separate second apertures at the first end section of the second
bone model. Alternatively, several second ligament models 333 may extend through one second aperture of the one or more second apertures.
The tensions of the one or more second ligament models may be set manually using the apparatus 332 for tensioning the one or more second ligament models. Alternatively, the apparatus 332 for tensioning the one or more second ligament models may comprise one or more actuators for tensioning the one or more second ligament models 333. The apparatus 332 for tensioning the one or more second ligament models may be attached to a part of the second bone model 330 that is remote from the first end section of the second bone model.
The first and second ligament models 323, 324, 333 may be implemented using cotton textiles. The cotton textiles can be flat woven. The cotton textiles may be laid in a sheath, similar to a Bowden cable sheath. The first and second ligament models 323, 324, 333 may be bendable and compressible. The first and second ligament models 323, 324, 333 are preferably not elastic.
The second bone model 330 may comprise a bone base structure, which may be reinforced by means of reinforcement ribs and/or fiberglass mesh. Additionally, a tube such as an aluminium tube may be used to improve a mechanical strength of the second bone model 330, wherein the tube may be laminated using fiberglass mesh.
The third apparatus member 130 of Fig. 1 may comprise the second bone model 330, wherein the second bone model may be embedded in layers of a flexible polyurethane foam and silicone, wherein these layers may be configured to provide realistic haptic representations of soft tissues such as muscles and skin structures. A second gyroscope and a second accelerometer may be disposed on the second bone model 330.
In other embodiments not depicted by the figures, the model of the joint comprises no first ligament models, but one or more second ligament models.
In another embodiment, the knee joint model comprises a controller, which is connected to the apparatus 322 for tensioning the one or more first ligament models and/or to the apparatus 332 for tensioning the one or more second ligament models. The controller is configured to control tensions of the one or more first ligament models 323, 324 and/or to control tensions of the one or more second ligament models 333.
The controller may be configured to obtain a pathology setting. In particular, the controller may be configured to receive the pathology setting from a data processing device, or the controller may be a unit of the data processing device, wherein the data processing device may obtain the pathology setting based on user input. The user input may be implemented by displaying image data that visualizes a menu, wherein the menu allows a user to select the pathology setting, and by tracking the hands of the user. Alternatively, the user input may be implemented by means of a touch screen, by means of voice input, or by means of another input technique.
The controller may be configured to send one or more signals to actuators of the apparatus 322 for tensioning the one or more first ligament models to control these actuators. Based on the signals received from the controller, the actuators of the apparatus 322 for tensioning the one or more first ligament models may control tensions of the one or more first ligament models 323, 324. Further, the controller may be configured to send one or more signals to actuators of the apparatus 332 for tensioning the one or more second ligament models to control these actuators. Based on the signals received from the controller, the actuators of the apparatus 332 for tensioning the one or more second ligament models may control tensions of the one or more second ligament models 333. In particular, the actuators of the apparatuses 322, 332 for tensioning the first and second ligament models may control tensions of the first and second ligament models 323, 324, 333 to represent an anterior cruciate ligament rupture, a posterior cruciate ligament rupture, or a medial ligament rupture.
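For illustration only, this controller logic can be sketched as a lookup from the pathology setting to per-ligament tension commands; the ligament keys, tension values, and actuator interface below are all assumptions, not part of the disclosure.

```python
# Hypothetical mapping from a pathology setting to normalized tensions
# (1.0 = intact ligament, 0.0 = ruptured ligament).
PATHOLOGY_TENSIONS = {
    "healthy":        {"acl": 1.0, "pcl": 1.0, "mcl": 1.0},
    "acl_rupture":    {"acl": 0.0, "pcl": 1.0, "mcl": 1.0},
    "pcl_rupture":    {"acl": 1.0, "pcl": 0.0, "mcl": 1.0},
    "medial_rupture": {"acl": 1.0, "pcl": 1.0, "mcl": 0.0},
}

def apply_pathology(setting: str, send_actuator_command) -> None:
    """Send one tension command per ligament model actuator."""
    for ligament, tension in PATHOLOGY_TENSIONS[setting].items():
        send_actuator_command(ligament, tension)
```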
The model of the knee joint further comprises one or more first sensors 321 disposed on the first bone model 320. The one or more first sensors 321 are configured to provide data that indicates a first position and/or a first orientation of the first bone model 320. The one or more first sensors 321 may correspond to the one or more first sensors 121 of Fig. 1. The one or more first sensors 321 comprise a first gyroscope. Further, the one or more first sensors may comprise a first accelerometer and/or a first magnetometer. The one or more first sensors 321 may be integrated in one system on chip, SOC.
The model of the knee joint further comprises one or more second sensors 331 disposed on the second bone model 330. The one or more second sensors 331 are configured to provide data that indicates a position and an
orientation of the second bone model 330. The one or more second sensors may correspond to the one or more second sensors 131 of Fig. 1. The one or more second sensors 331 comprise a second gyroscope and a second accelerometer. Further, the one or more second sensors 331 may comprise a second magnetometer. The one or more second sensors may be integrated on one SOC.
The knee joint model further comprises a translation sensor 335. The translation sensor 335 is a flex sensor or a strain sensor configured to provide data that indicates a translation of the second bone model 330 relative to the first bone model 320. The translation sensor is preferably mounted in a position such that rotations of the second bone model 330 relative to the first bone model 320 cause no or small deflections of the translation sensor, whereas translations of the second bone model relative to the first bone model cause large deflections of the translation sensor. The data provided by the translation sensor may be used to resolve ambiguities in the determination of the position and orientation of the second bone model 330 relative to the first bone model 320. Generally, the translation sensor may be any mechanical sensor configured to measure a displacement between the first and second bone models. In other embodiments, the translation sensor is an optical distance sensor.
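As a sketch of how the translation sensor could resolve the remaining ambiguity, the measured displacement can be injected along the joint's drawer axis into an inertial-only relative pose; the axis, units, and names below are assumptions for illustration.

```python
import numpy as np

def fuse_translation(T_rel_imu: np.ndarray, drawer_mm: float,
                     drawer_axis: np.ndarray) -> np.ndarray:
    """Add the measured drawer translation to an IMU-only relative pose.

    The gyroscopes and accelerometers constrain the relative orientation
    well, but a slow translation along the joint's drawer axis is hard to
    observe from inertial data alone; the translation sensor measures
    exactly that component (axis given in the first bone model's frame).
    """
    T = T_rel_imu.copy()
    T[:3, 3] += (drawer_mm / 1000.0) * drawer_axis / np.linalg.norm(drawer_axis)
    return T
```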
It has to be noted that embodiments of the invention are described with reference to different subject matters. However, a person skilled in the art will gather that, unless otherwise notified, in addition to any combination of features belonging to one type of subject matter, any combination between features relating to different subject matters is also considered to be disclosed with this application. Moreover, all features can be combined, providing synergetic effects that are more than the simple summation of the features.
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art from a study of the drawings, the disclosure, and the dependent claims.
In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single processor or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent
claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
Claims
1. A method for determining a position and orientation of an apparatus relative to a head-worn device (140), the head-worn device having a camera (144) and a positioning signal receiver (146), the method comprising:
receiving (S210), by a data processing device (150), first image data from the camera (144) of the head-worn device (140), wherein the first image data shows a marker (111) that is arranged on a surface of a first apparatus member (110) of the apparatus;
determining (S220), by the data processing device (150), a position and orientation of the first apparatus member (110) relative to the head-worn device (140) based on the first image data and based on geometry information about the marker (111);
receiving (S230), by the data processing device (150), first data from the positioning signal receiver (146) of the head-worn device (140), wherein the first data received from the positioning signal receiver indicates a first position and orientation of the head-worn device at a first time;
receiving (S240), by the data processing device (150), second data from the positioning signal receiver (146) of the head-worn device (140), wherein the second data received from the positioning signal receiver indicates a second position and orientation of the head-worn device at a second time that is after the first time; and
updating (S250), by the data processing device (150), the position and orientation of the first apparatus member (110) relative to the head-worn device (140) based on the first data indicating the first position and orientation of the head-worn device at the first time and based on the second data indicating the second position and orientation of the head-worn device at the second time.
2. The method of claim 1, wherein the apparatus further comprises a second apparatus member (120) movably mounted on the first apparatus member (110), wherein the apparatus is configured such that a movement of the second apparatus member relative to the first apparatus member (110) is limited to a curve or surface, the second apparatus member comprises a first sensor (121, 321), and the method further comprises:
receiving, by the data processing device (150), data from the first sensor (121, 321), wherein the data received from the first sensor indicates a first position or a first orientation of the second apparatus member (120);
determining, by the data processing device (150), a position and orientation of the second apparatus member (120) relative to the head-worn device (140) based on the updated position and orientation of the first apparatus member (110) relative to the head-worn device, based on geometry information about the apparatus, and based on the data received from the first sensor (121, 321) indicating the first position or the first orientation of the second apparatus member.
3. The method of claim 2, wherein: the second apparatus member (120) is rotatably mounted on the first apparatus member (110), the first sensor (121, 321) is a first gyroscope, the data received by the data processing device (150) from the first sensor (121, 321) indicates the first orientation of the second apparatus member (120), and the determining, by the data processing device (150), a position and orientation of the second apparatus member (120) relative to the head-worn device (140) comprises:
determining, by the data processing device, the position and orientation of the second apparatus member (120) relative to the head-worn device based on the updated position and orientation of the first apparatus member (110) relative to the head-worn device, based on the geometry information about the apparatus, and based on the data received from the first gyroscope indicating the first orientation of the second apparatus member.
4. The method of claim 3, wherein the apparatus further comprises a third apparatus member (130), the third apparatus member is attached to the second apparatus member (120), the third apparatus member comprises a second gyroscope (131, 331) and a second accelerometer (131, 331), and the method further comprises:
receiving, by the data processing device (150), data from the second gyroscope (131, 331), wherein the data received from the second gyroscope indicates an orientation of the third apparatus member (130);
receiving, by the data processing device (150), data from the second accelerometer (131, 331), wherein the data received from the second accelerometer indicates an acceleration of the third apparatus member (130);
determining, by the data processing device (150), a position and orientation of the third apparatus member (130) relative to the second apparatus member (120) based on the data received from the first gyroscope indicating the first orientation of the second apparatus member, based on the data received from the second gyroscope (131, 331) indicating the orientation of the third apparatus member, and based on the data received from the second accelerometer (131, 331) indicating the acceleration of the third apparatus member; and
determining, by the data processing device (150), a position and orientation of the third apparatus member (130) relative to the head-worn device (140) based on the position and orientation of the second apparatus member (120) relative to the head-worn device and based on the position and orientation of the third apparatus member relative to the second apparatus member.
5. The method of claim 4, wherein the second apparatus member (120) comprises a first accelerometer, and the method further comprises: receiving, by the data processing device (150), data from the first accelerometer, wherein the data received from the first accelerometer indicates an acceleration of the second apparatus member (120); and wherein the determining, by the data processing device (150), the position and orientation of the third apparatus member (130) relative to the second apparatus member (120) is further based on the data received from the first accelerometer indicating the acceleration of the second apparatus member.
6. The method of claim 4 or 5, wherein a translation sensor (335) is disposed on the second apparatus member (120) or the third apparatus member (130), the translation sensor is a flex sensor or an optical distance sensor, and the method further comprises:
receiving, by the data processing device (150), data from the translation sensor (335), wherein the data received from the translation sensor indicates a translation of the third apparatus member (130) relative to the second apparatus member (120); and wherein the determining, by the data processing device (150), the position and orientation of the third apparatus member (130) relative to the second apparatus member (120) is further based on the data received from the translation sensor (335) indicating the translation of the third apparatus member relative to the second apparatus member.
7. The method according to any one of claims 4 to 6, wherein: the apparatus comprises a model of a joint of a human, the second apparatus member (120) comprises a first bone model (320), the first bone model being a model of a first bone of the joint; and the third apparatus member (130) comprises a second bone model (330), and the second bone model is a model of a second bone of the joint.
8. The method according to claim 7, wherein the method further comprises: receiving, by the data processing device (150), second image data from the camera (144); augmenting, by the data processing device (150), the second image data with virtual representations of portions of skin, one or more bones, one or more ligaments, and/or one or more muscles of a virtual patient, wherein the augmenting of the second image data is based on the determined position and orientation of the second apparatus member (120) relative to the head-worn device (140) and based on the determined position and orientation of the third apparatus member (130) relative to the head-worn device; and sending, by the data processing device (150), the augmented second image data to the head-worn device (140) for being displayed.
9. The method according to any one of the preceding claims, further comprising:
receiving, by the positioning signal receiver (146), a first positioning signal from a first base station (101) and a second positioning signal from a second base station (101);
determining, by the positioning signal receiver (146), the first data based on the first positioning signal and the second positioning signal; and
sending, by the positioning signal receiver (146), the first data to the data processing device (150).
10. A data processing device (150) comprising a processor (154), a memory (152), and a transceiver, wherein the data processing device is adapted to perform the method of any one of claims 1 to 8.
11. A model of a joint of a human, comprising:
a first bone model (320), wherein the first bone model is a model of a first bone of the joint;
a second bone model (330), wherein the second bone model is a model of a second bone of the joint;
one or more first ligament models (323, 324), wherein the one or more first ligament models are models of one or more first ligaments connecting the first bone and the second bone of the joint; and
an apparatus (322) for tensioning the one or more first ligament models such that a first end section of the second bone model (330) is pulled towards a first end section of the first bone model (320); wherein:
the apparatus (322) for tensioning the one or more first ligament models is attached to the first bone model (320);
the one or more first ligament models (323, 324) are attached to the first end section of the second bone model (330);
the first bone model (320) has one or more first apertures at its first end section; and
the one or more first ligament models (323, 324) extend from the first end section of the second bone model (330) through the one or more first apertures to the apparatus (322) for tensioning the one or more first ligament models.
12. The model of the joint according to claim 11, further comprising one or more second ligament models (333) and an apparatus (332) for tensioning the one or more second ligament models, wherein the one or more second ligament models are models of one or more second ligaments connecting the first bone and the second bone of the joint, the one or more second ligament models are attached to the first end section of the first bone model (320), the second bone model (330) has one or more second apertures at its first end section, the apparatus for tensioning the one or more second ligament models is attached to the second bone model, and the one or more second ligament models extend from the first end section of the first bone model through the one or more second apertures to the apparatus for tensioning the one or more second ligament models.
13. The model of the joint according to claim 11 or 12, wherein the model of the joint comprises a controller that is connected to the apparatus (322) for tensioning the one or more first ligament models, the apparatus (322) for tensioning the one or more first ligament models comprises one or more actuators for tensioning the one or more first ligament models (323, 324), and the controller is configured to control tensions of the one or more first ligament models (323, 324) based on a pathology setting by using the one or more actuators.
14. The model of the joint according to any one of claims 11 to 13, further comprising:
a first gyroscope (121, 321) disposed on the first bone model (320);
a second gyroscope (131, 331) disposed on the second bone model (330);
a first accelerometer (121, 321) disposed on the first bone model (320);
a second accelerometer (131, 331) disposed on the second bone model (330); and
a translation sensor (335) disposed on the first bone model (320) or the second bone model (330), wherein the translation sensor is a flex sensor or an optical distance sensor, and the translation sensor is configured to measure a translation of the second bone model relative to the first bone model.
15. A system, comprising:
the data processing device (150) of claim 10;
an apparatus comprising the model of a joint of any one of claims 11 to 14, wherein the model of the joint is mounted on a first apparatus member (110) of the apparatus;
a head-worn device (140) comprising a display, a camera (144), and a positioning signal receiver (146); and
a first base station (101) and a second base station (101) for transmitting positioning signals,
wherein the data processing device (150), the apparatus, the head-worn device (140), the first base station (101), and the second base station (101) are adapted to perform a method according to any one of claims 1 to 9.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP23215258.7 | 2023-12-08 | ||
| EP23215258 | 2023-12-08 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025120063A1 true WO2025120063A1 (en) | 2025-06-12 |
Family
ID=89164214
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2024/084880 Pending WO2025120063A1 (en) | 2023-12-08 | 2024-12-05 | Model of a joint of a human in a mixed reality environment |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025120063A1 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4605373A (en) * | 1985-01-10 | 1986-08-12 | Rosen Bernard A | Training device for setting broken limbs |
| WO2004064010A2 (en) * | 2003-01-13 | 2004-07-29 | Oliver Browne-Wilkinson | Orthopaedic demonstration aid |
| US20120123592A1 (en) * | 2010-11-15 | 2012-05-17 | Advanced Mechanical Technology | Method and apparatus for joint motion simulation |
| US20190057620A1 (en) * | 2017-08-16 | 2019-02-21 | Gaumard Scientific Company, Inc. | Augmented reality system for teaching patient care |
| US10410542B1 (en) * | 2018-07-18 | 2019-09-10 | Simulated Inanimate Models, LLC | Surgical training apparatus, methods and systems |
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4605373A (en) * | 1985-01-10 | 1986-08-12 | Rosen Bernard A | Training device for setting broken limbs |
| WO2004064010A2 (en) * | 2003-01-13 | 2004-07-29 | Oliver Browne-Wilkinson | Orthopaedic demonstration aid |
| US20120123592A1 (en) * | 2010-11-15 | 2012-05-17 | Advanced Mechanical Technology | Method and apparatus for joint motion simulation |
| US20190057620A1 (en) * | 2017-08-16 | 2019-02-21 | Gaumard Scientific Company, Inc. | Augmented reality system for teaching patient care |
| US10410542B1 (en) * | 2018-07-18 | 2019-09-10 | Simulated Inanimate Models, LLC | Surgical training apparatus, methods and systems |
Non-Patent Citations (4)
| Title |
|---|
| K. SLETTEN: "Automated testing of industrial robots using HTC vive for motion tracking", M.SC. THESIS, UNIVERSITY OF STAVANGER, 2017 |
| M. GREIFF ET AL.: "Performance bounds in positioning with the vive lighthouse system", IEEE INT. CONF. ON INFORMATION FUSION (FUSION, 2019, pages 1 - 8, XP033725213 |
| M. HOPPE ET AL.: "DronOS: A flexible open-source prototyping framework for interactive drone routines", INT. CONF. ON MOBILE AND UBIQUITOUS MULTIMEDIA (MUM), ACM, vol. 15, 2019, pages 1 15 - 15 |
| SLETTEN KRISTIAN: "Automated testing of industrial robots using HTC vive for motion tracking", 15 June 2017 (2017-06-15), pages 1 - 62, XP093145504, Retrieved from the Internet <URL:https://uis.brage.unit.no/uis-xmlui/bitstream/handle/11250/2455902/Sletten_Kristian.pdf?sequence=1> [retrieved on 20240326] * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20230346481A1 (en) | System and method for medical device placement | |
| KR102144671B1 (en) | Position correction apparatus of ultrasound scanner for ai ultrasound self-diagnosis using ar glasses, and remote medical-diagnosis method using the same | |
| US9142145B2 (en) | Medical training systems and methods | |
| US10154823B2 (en) | Guiding system for positioning a patient for medical imaging | |
| US20020168618A1 (en) | Simulation system for image-guided medical procedures | |
| US9424761B2 (en) | Medical simulation system and method with configurable anatomy model manufacturing | |
| CN112022201A (en) | Machine guided imaging techniques | |
| WO2019036524A1 (en) | System and method using augmented reality with shape alignment for medical device placement in bone | |
| CN106909771A (en) | Method and system for outputting augmented reality information | |
| CN112331049B (en) | An ultrasonic simulation training method, device, storage medium and ultrasonic equipment | |
| US20210315545A1 (en) | Ultrasonic diagnostic apparatus and ultrasonic diagnostic system | |
| Villard et al. | A prototype percutaneous transhepatic cholangiography training simulator with real-time breathing motion | |
| EP3094249B1 (en) | Method and system for the visual representation of the kinematics of a patient's joint and associated parameters | |
| WO2025120063A1 (en) | Model of a joint of a human in a mixed reality environment | |
| KR102301863B1 (en) | A method for verifying a spatial registration of a surgical target object, the apparatus therof and the system comprising the same | |
| WO2022225847A1 (en) | Mixed reality combination system | |
| KR20040084243A (en) | Virtual surgical simulation system for total hip arthroplasty | |
| CN112397189A (en) | Medical guiding device and using method thereof | |
| JP2024037139A (en) | Medical imaging system and medical imaging method | |
| CN113870636A (en) | Ultrasound simulation training method, ultrasound apparatus, and storage medium | |
| EP4181789B1 (en) | One-dimensional position indicator | |
| EP4364668A1 (en) | A device, computer program product and method for assisting in positioning at least one body part of a patent for an x-ray acquisition | |
| KR102345827B1 (en) | The method of simulation for taking medical imaging | |
| JP2024176659A (en) | Medical imaging diagnostic device and method for operating the same | |
| WO2024006348A1 (en) | Systems and methods for clinical procedure training using mixed environment technology |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24817647; Country of ref document: EP; Kind code of ref document: A1 |