WO2011010644A1 - Medical Image Display Apparatus and Medical Image Display Method
- Publication number: WO2011010644A1
- Application: PCT/JP2010/062198
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- medical image
- image
- image display
- organ
- virtual liquid
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS; G06—COMPUTING OR CALCULATING; COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
- G06T15/00—3D [Three Dimensional] image rendering; G06T15/08—Volume rendering
- G06T2210/00—Indexing scheme for image generation or computer graphics; G06T2210/41—Medical
Definitions
- the present invention relates to a medical image display apparatus and a medical image display method for displaying a medical image obtained from a medical image diagnostic apparatus such as an X-ray CT apparatus, an MRI apparatus, or an ultrasonic apparatus, while retaining the pixel values acquired at the time of imaging.
- the present invention also relates to a technique for displaying a projection image having depth information.
- the virtual endoscopy display method creates and displays, from image data acquired by a medical image diagnostic apparatus, an image that looks as if the inside of a hollow organ of a subject were being examined with an endoscope; an image created by this method is called a virtual endoscopic image. Since a virtual endoscopic image is imaginary, it permits diagnosis from directions that are impossible in reality. On the other hand, a virtual endoscopic image differs from an actual endoscopic image (for example, the colors inside the lumen are not displayed), so it is difficult to handle for an operator accustomed to actual endoscopic images.
- Patent Document 1 discloses displaying a virtual endoscopic image with light and dark.
- the texture here refers to the freshness unique to a living body and the luster that accompanies that freshness.
- this freshness and the accompanying luster are attributable to the water contained in the mucous membrane present on the surface of the organ and to the mucus secreted by the mucous membrane.
- a virtual endoscopic image without such a texture is difficult to handle.
- an object of the present invention is to provide a medical image display apparatus and a medical image display method for creating and displaying a medical image having a texture closer to that of an actual endoscopic image or an image obtained by directly viewing an organ.
- the present invention adds a virtual liquid corresponding to water contained in a mucous membrane present on the surface of an organ or mucus secreted by the mucous membrane to a projection image created using a medical image.
- a projected image with a texture is created.
- specifically, the present invention provides a medical image display apparatus and a medical image display method characterized by comprising: a medical image reading unit that reads a medical image acquired by a medical image diagnostic apparatus; a projection image creation unit that projects the medical image onto a projection surface to create a projection image; a projection image display unit that displays the projection image; and a virtual liquid adding unit that adds the virtual liquid to the projection image, wherein the projection image display unit displays the projection image to which the virtual liquid has been added.
- FIG. 1 shows the hardware configuration of the medical image display apparatus of the present invention.
- FIG. 2 shows the processing flow of the first embodiment of the present invention.
- FIG. 3 shows a first example of the processing flow of S102.
- FIG. 4 is a supplementary explanatory diagram for the first example of the processing flow of S102.
- FIGS. 5A to 5G illustrate liquid surface shapes of the virtual liquid.
- FIG. 6 shows an example of a GUI for the processing of the first embodiment of the present invention.
- FIG. 1 is a diagram showing a hardware configuration of the medical image display apparatus 1.
- the medical image display device 1 comprises a CPU (Central Processing Unit) 2, a main memory 3, a storage device 4, a display memory 5, a display device 6, a controller 7 to which a mouse 8 is connected, a keyboard 9, and a network adapter 10, all connected via a system bus 11 so as to be able to send and receive signals.
- the medical image display device 1 is connected to a medical image photographing device 13 and a medical image database 14 via a network 12 so as to be able to send and receive signals.
- "connected so as to be able to send and receive signals" denotes a state in which signals can be exchanged mutually, or from one side to the other, whether the connection is electrical or optical, wired or wireless.
- the CPU 2 is a device that controls the operation of each component.
- the CPU 2 loads a program stored in the storage device 4 and data necessary for program execution into the main memory 3 and executes it.
- the storage device 4 is a device that stores medical image information captured by the medical image capturing device 13, and is specifically a hard disk or the like.
- the storage device 4 may be a device that exchanges data with a portable recording medium such as a flexible disk, an optical (magnetic) disk, a ZIP memory, and a USB memory.
- the medical image information is acquired from the medical image capturing device 13 and the medical image database 14 via a network 12 such as a LAN (Local Area Network).
- the storage device 4 stores a program executed by the CPU 2 and data necessary for program execution.
- the main memory 3 stores programs executed by the CPU 2 and the progress of arithmetic processing.
- the display memory 5 temporarily stores display data to be displayed on the display device 6 such as a liquid crystal display or a CRT (Cathode Ray Tube).
- the mouse 8 and the keyboard 9 are operation devices for an operator to give an operation instruction to the medical image display device 1.
- the mouse 8 may be another pointing device such as a trackpad or a trackball.
- the controller 7 detects the state of the mouse 8, acquires the position of the mouse pointer on the display device 6, and outputs the acquired position information and the like to the CPU 2.
- the network adapter 10 is for connecting the medical image display apparatus 1 to a network 12 such as a LAN, a telephone line, or the Internet.
- the medical image photographing device 13 is a device that acquires medical image information such as a tomographic image of a subject.
- the medical imaging apparatus 13 is, for example, an MRI apparatus, an X-ray CT apparatus, an ultrasonic diagnostic apparatus, a scintillation camera apparatus, a PET apparatus, a SPECT apparatus, or the like.
- the medical image database 14 is a database system that stores medical image information captured by the medical image capturing device 13.
- in the present embodiment, a medical image having a texture closer to that of an actual endoscopic image or of an organ viewed directly is created, and the created medical image is displayed on the display device 6.
- FIG. 2 is a diagram showing a processing flow of the first embodiment of the present invention. Hereinafter, each step of FIG. 2 will be described in detail.
- Step S101 The CPU 2 acquires the three-dimensional image data of the subject selected by the operator by operating the mouse 8 or the keyboard 9 from the medical image photographing device 13 or the medical image database 14 via the network 12.
- the 3D image data consists of several to several hundred tomographic images of the subject arranged consecutively in a certain direction, for example a direction perpendicular to the tomographic plane.
- Step S102 The CPU 2 creates a medical image to which the virtual liquid is added using the 3D image data acquired in S101.
- by adding a virtual liquid corresponding to the water contained in the mucous membrane present on the surface of the organ and to the mucus secreted by the mucous membrane, the medical image becomes closer, for example, to an actual endoscopic image.
- FIG. 3 shows a first example of the flow of processing for creating a medical image to which a virtual liquid is added, and each step will be described below with reference to FIG. 4 which is a supplementary explanatory diagram.
- Step S201 The CPU 2 creates 3D shape data of the organ using the 3D image data acquired in S101.
- the 3D image data 500 includes an organ 501 that is a diagnosis target. Therefore, the CPU 2 performs region determination in the three-dimensional image data 500 using a threshold value corresponding to the organ 501, thereby extracting the organ 501 and creating the shape data 510 of the organ 501.
- the CPU 2 may extract the organ 501 and create the shape data 510 by performing region shape determination based on the anatomical shape characteristics of the organ 501.
- the shape data 510 of the organ 501 is held in the main memory 3 or the storage device 4 as three-dimensional shape data.
- the three-dimensional shape data may be three-dimensional surface shape data by polygon representation or the like.
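The region determination in S201 can be sketched as simple thresholding of the volume. This is a minimal, hypothetical illustration: the patent only states that a threshold corresponding to the organ is used, so the function name and the threshold bounds here are assumptions, not the patent's method.

```python
import numpy as np

def extract_organ(volume, lower, upper):
    """Region determination by thresholding (sketch of S201).
    Voxels whose values fall inside [lower, upper] are kept as
    the organ; the bounds are illustrative, not from the patent."""
    return (volume >= lower) & (volume <= upper)

# Toy 4x4x4 "CT volume" with an 8-voxel organ of value 100
volume = np.zeros((4, 4, 4))
volume[1:3, 1:3, 1:3] = 100.0
mask = extract_organ(volume, 50.0, 150.0)
```

In practice the binary mask would then be converted to surface shape data (for example a polygon mesh) as described above; anatomical shape constraints could be applied to the mask before that conversion.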
- Step S202 The CPU 2 sets characteristic parameters for the organ 501 extracted in S201.
- the organ characteristic parameter include reflectance and refractive index representing the optical characteristics of the organ.
- the optical characteristic of the organ set in this step may be a physical property value of the target organ or a physical property value of an arbitrary substance similar to the target organ.
- the color of the organ which is one of the organ characteristic parameters, may reflect the anatomical color of the target organ, or may be set to any color.
- Step S203 The CPU 2 creates the three-dimensional shape data of the virtual liquid.
- the three-dimensional shape data of the virtual liquid is created so as to cover the surface of the organ 501 with an arbitrary thickness using the surface of the organ shape data 510 created in S201 as a reference plane.
- the surface of the organ shape data 510 is a boundary surface of the organ shape data 510, which is a boundary surface close to the viewpoint position set when a projection image described later is created.
- FIG. 5A shows an example of the liquid surface shape of the virtual liquid 1001: the shape of the surface 2000 of the organ 501 and the liquid surface 1000 of the virtual liquid 1001 are the same, and the thickness 1002 of the virtual liquid 1001 is constant regardless of location.
- since the liquid surface 1000 of the virtual liquid 1001 can then be created simply by translating the shape data of the surface 2000 of the organ 501, the load on the CPU 2 when creating the shape data of the virtual liquid 1001 can be reduced.
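On voxel data, the constant-thickness liquid of FIG. 5A can be sketched as a morphological dilation of the organ mask, keeping only the grown shell. This is an assumption for illustration (the patent describes the operation on surface shape data, not voxels):

```python
import numpy as np

def liquid_shell(organ_mask, thickness=1):
    """Grow a constant-thickness virtual-liquid shell around a binary
    organ mask by repeated 6-neighbour dilation, then subtract the
    organ so only the liquid voxels remain (sketch of FIG. 5A)."""
    grown = organ_mask.copy()
    for _ in range(thickness):
        dilated = grown.copy()
        for axis in range(3):
            for shift in (1, -1):
                dilated |= np.roll(grown, shift, axis=axis)
        grown = dilated
    return grown & ~organ_mask

organ = np.zeros((7, 7, 7), dtype=bool)
organ[3, 3, 3] = True                    # single-voxel toy organ
liquid = liquid_shell(organ, thickness=1)
```

A single interior voxel gains its six face neighbours as liquid; larger `thickness` values produce a correspondingly thicker coating.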
- FIG. 5B shows an example in which the shape of the liquid level 1000 of the virtual liquid 1001 is a constant liquid level regardless of the shape of the surface 2000 of the organ 501.
- the liquid surface 1000 of the virtual liquid 1001 has a simple shape, the load on the CPU 2 when creating the shape data of the virtual liquid 1001 can be reduced.
- FIG. 5C shows an example in which the shape of the liquid surface 1000 of the virtual liquid 1001 is a liquid surface having a certain curvature.
- the curvature of the liquid surface 1000 may be determined by a physical property value such as the viscosity of the virtual liquid, or may be determined according to the shape of the surface 2000 of the organ 501.
- FIG. 5D and FIG. 5E are examples in which the shape of the liquid surface 1000 of the virtual liquid 1001 is obtained assuming that the flow 1003 and the vortex 1004 are generated in the virtual liquid 1001. That is, the shape of the liquid surface 1000 of the virtual liquid 1001 may be locally deformed in consideration of the influence of the flow 1003 and the vortex 1004.
- the flow 1003 and the vortex 1004 may be determined based on the shape and temperature of the organ surface 2000, the surface tension and temperature of mucus, and the like.
- FIG. 5F is an example in which the shape of the liquid surface 1000 is determined so that the virtual liquid 1001 covers the surface 2000 of the organ 501 only partially rather than entirely; it is assumed that the amount of mucus is insufficient to cover the entire surface of the organ 501 and that mucus exists only in the concave portions of the organ 501.
- FIG. 5G is an example in which the shape of the liquid surface 1000 is obtained assuming that the virtual liquid 1001 is not in close contact with the surface 2000 of the organ 501 and a gap 1003 exists between the organ 501 and the virtual liquid 1001; it is assumed that mucus with a large surface tension cannot enter the concave portions of the organ 501.
- the liquid surface shapes of the virtual liquid shown in FIG. 5 are examples; the invention is not limited to these, and the shapes may be combined as appropriate. For example, by combining FIG. 5C and FIG. 5G, bubbles generated on the surface of the organ 501 can be simulated.
- Step S204 The CPU 2 sets characteristic parameters for the virtual liquid created in S203.
- the characteristic parameter of the virtual liquid includes a reflectance, a refractive index, an absorptivity, and the like that represent the optical characteristics of the virtual liquid.
- the optical property of the virtual liquid set in this step may be a physical property value of mucus, or an arbitrary substance similar to mucus, for example, a physical property value of water.
- the color of the virtual liquid, which is one of the characteristic parameters of the virtual liquid, may be colorless, may reflect the anatomical color of the target organ, or may be any other color.
- the characteristic parameters described above physically depend on the wavelength of light; this wavelength dependence may or may not be taken into account when creating the medical image in S102. However, the light transmittance of the virtual liquid must not be zero.
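The parameter set of S204 could be held in a small structure like the following. The field names and the water-like default values are illustrative assumptions; the patent names reflectance, refractive index, absorptivity, and color but prescribes no data structure:

```python
from dataclasses import dataclass

@dataclass
class OpticalProperties:
    """Characteristic parameters of the virtual liquid (sketch of S204).
    Defaults approximate water, a stand-in for mucus as the text allows."""
    reflectance: float = 0.02
    refractive_index: float = 1.33   # water
    transmittance: float = 0.9
    color: tuple = (1.0, 1.0, 1.0)   # colorless / white

    def __post_init__(self):
        # per the text, the light transmittance must not be zero
        if self.transmittance <= 0.0:
            raise ValueError("virtual liquid must transmit some light")

mucus = OpticalProperties()
```

The `__post_init__` check enforces the non-zero transmittance constraint stated above at construction time.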
- Step S205 The CPU 2 arranges the virtual liquid created so as to cover the organ to be diagnosed and the organ in the virtual space.
- the CPU 2 uses the three-dimensional shape data of the organ created in S201 and the three-dimensional shape data of the virtual liquid created in S203 when arranging the organ and the virtual liquid in the virtual space.
- organ data to which virtual liquid data is added is created in the virtual space.
- Step S206 The CPU 2 creates a three-dimensional projection image using the organ data to which the virtual liquid data created in the virtual space is added.
- the CPU 2 sets a light source, a viewpoint, a line-of-sight direction, and a projection plane when creating a three-dimensional projection image.
- FIG. 4 shows an example of the organ surface 2000 and the virtual liquid level 1000 arranged in the virtual space 600, and the set light source 601, viewpoint 602, line-of-sight direction 603, and projection plane 604.
- a ray tracing method, which is a known technique, that takes into account direct light and indirect light (diffuse reflected light, specular reflected light, refracted light, and ambient light) from the light source 601 is used; by further taking the optical properties of the virtual liquid and the organ into account in the calculation of the indirect light, a more realistic three-dimensional projection image can be created.
- the light source 601 may be positioned in the vicinity of the viewpoint 602 to match the conditions of actual endoscopic imaging, or may be made a surface light source to approximate the conditions of actual surgery.
- since the processing of this step creates a three-dimensional projection image of the organ to which the virtual liquid is added, a medical image having a texture closer to that of an actual endoscopic image or of an organ viewed directly can be obtained.
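One reason a liquid layer reads as "wet" under a light source near the viewpoint is the angular behaviour of Fresnel reflectance at the liquid surface. A common sketch of this is Schlick's approximation; it is offered here only as an illustration of the kind of specular term a ray tracer could use, not as the patent's formula, and n2 = 1.33 (water) is an assumed stand-in for mucus:

```python
def schlick_reflectance(cos_theta, n1=1.0, n2=1.33):
    """Schlick's approximation of Fresnel reflectance at an interface
    from medium n1 (air) to n2 (water-like liquid). cos_theta is the
    cosine of the angle between the incident ray and the surface normal:
    grazing rays (cos_theta near 0) reflect far more light than head-on
    rays, producing the glossy highlight that conveys wetness."""
    r0 = ((n1 - n2) / (n1 + n2)) ** 2
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

head_on = schlick_reflectance(1.0)   # small: light mostly enters the liquid
grazing = schlick_reflectance(0.0)   # ~1: mirror-like at grazing angles
```

With the light source near the viewpoint, surface regions tilted away from the camera approach the grazing case and light up as highlights.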
- Step S103 The CPU 2 causes the display device 6 to display the medical image created in S102.
- the display device 6 may display an interface for setting various parameters used when creating a medical image together with the medical image.
- FIG. 6 shows an example of a display screen displayed on the display device 6.
- the display screen 700 of FIG. 6 includes a medical image display area 701, a medical image creation mode switching interface 711, a virtual liquid property setting interface 710, a light source setting interface 721, a viewpoint setting interface 720, and a projection plane setting interface 722.
- the medical image display area 701 is an area where the medical image created in S102 is displayed, and the region of interest 702 can be set by operating the mouse 8 by the operator.
- the medical image creation mode switching interface 711 is an interface for switching whether or not to add a virtual liquid to the medical image created in S102.
- in FIG. 6, "texture enhancement mode" is selected, which indicates that a medical image to which a virtual liquid is added is created.
- the virtual liquid may be added only to the region of interest 702 set by the operator's mouse 8 operation, yielding a medical image that has a texture only partially.
- the virtual liquid characteristic setting interface 710 is an interface for setting the characteristic parameter of the virtual liquid added to the organ.
- thickness, liquid surface shape, transparency, reflectance, refractive index, and liquid color are displayed as settable characteristic parameters.
- the desired liquid surface shape is selected with a button; the buttons A to G shown in FIG. 6 correspond, for example, to the liquid surface shapes shown in FIGS. 5A to 5G. Parameters other than the liquid surface shape are entered with sliders. The influence of each characteristic parameter on the medical image with the virtual liquid added is described below.
- hereinafter, a medical image closer to an actual endoscopic image or to an organ viewed directly is called an image with a strong texture enhancement effect, and the color-engineering terms hue, lightness, saturation, object color, and so on are used as necessary.
- the thickness of the virtual liquid affects the strength of the texture enhancement effect. The thicker the virtual liquid, the stronger the texture enhancement effect, and the thinner the virtual liquid, the weaker the texture enhancement effect. If the thickness is zero, a conventional medical image is obtained.
- the liquid surface shape of the virtual liquid affects the area where the texture is emphasized.
- examples using the liquid surface shapes shown in FIG. 5 are described below.
- FIG. 5A: the texture of the organ surface on the image is uniformly enhanced.
- FIG. 5B: the organ surface on the image is texture-enhanced according to its surface shape.
- FIG. 5C: the organ surface on the image is texture-enhanced according to the curvature of the virtual liquid surface; in areas where the curvature is positive, the area near the center of the region is enhanced, and in areas where the curvature is negative, the area near the region boundary is enhanced.
- FIG. 5D: the organ surface on the image is globally texture-enhanced, and is further texture-enhanced in a wave pattern with a local direction.
- FIG. 5E: the organ surface on the image is globally texture-enhanced, and is further texture-enhanced locally in a vortex wavefront pattern.
- FIG. 5F: the texture of the convex parts of the organ surface on the image is enhanced.
- FIG. 5G: the texture of the concave portions of the organ surface on the image is further enhanced.
- the liquid surface shape of the virtual liquid is not limited to the shapes shown in FIG. 5.
- the transparency of the virtual liquid affects the brightness and saturation of the organ surface on the image. Increasing the transparency increases the brightness and saturation, and decreasing the transparency decreases the brightness and saturation.
- the reflectance of the virtual liquid affects the brightness and saturation of the organ surface on the image.
- when the reflectance is raised, the virtual liquid on the image takes on the hue of the light source color, so the brightness of the organ surface increases and its saturation decreases; for example, when the light source color is white, the virtual liquid appears whitish and the organ surface is also whitened.
- when the reflectance is lowered, the organ surface on the image takes on the hue of the object color of the virtual liquid, so the brightness of the organ surface decreases and the saturation changes according to the object color of the virtual liquid. For example, if the object color of the organ surface is red and the object color of the virtual liquid is colorless, the organ surface appears reddish.
- the refractive index affects the distortion of the organ surface on the image: increasing the refractive index increases the distortion, and decreasing it decreases the distortion.
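The dependence of distortion on refractive index follows directly from Snell's law. The following sketch is illustrative only (the patent does not give a formula); n2 = 1.33 (water) is an assumed value:

```python
import math

def refraction_angle(theta1_deg, n1=1.0, n2=1.33):
    """Snell's law, n1*sin(theta1) = n2*sin(theta2), for a ray entering
    the virtual liquid from medium n1. A larger n2 bends the ray more,
    so raising the refractive-index slider distorts the organ surface
    seen through the liquid more strongly."""
    sin_theta2 = (n1 / n2) * math.sin(math.radians(theta1_deg))
    return math.degrees(math.asin(sin_theta2))
```

A ray arriving at 30 degrees is bent to roughly 22 degrees inside a water-like liquid, and more strongly for higher n2, which is what shifts the apparent position of surface features.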
- the liquid color of the virtual liquid changes the object color of the virtual liquid on the image. Since the organ surface on the image then takes on the hue of the virtual liquid, the saturation of the organ surface changes according to that hue.
- the viewpoint setting interface 720 is for moving the position of the viewpoint 602 in the virtual space 600. Each time the position of the viewpoint 602 is moved, the medical image displayed in the medical image display area 701 may be updated.
- the light source setting interface 721 is for moving the position of the light source 601 in the virtual space 600. Each time the position of the light source 601 is moved, the medical image displayed in the medical image display area 701 may be updated. Further, by positioning the light source 601 in the vicinity of the viewpoint 602, it may be adapted to the actual situation of endoscopic photography. Further, by operating the interface 721, the light source 601 may be switched from a point light source to a surface light source.
- the projection plane setting interface 722 is for moving the position of the projection plane 604 within the projection space 605. Each time the position of the projection plane 604 is moved, the medical image displayed in the medical image display area 701 may be updated.
- a medical image having a texture closer to that of an actual endoscopic image or an image obtained by directly viewing an organ is created and displayed.
- the processing flow of the second embodiment of the present invention is the same as that of the first embodiment except for the medical image creation processing in S102: whereas the organ in the medical image is handled as three-dimensional shape data in the first embodiment, it is handled as surface data in the second embodiment. Therefore, in the description of this embodiment, steps other than S102 are omitted. FIG. 7 shows a second example of the processing flow for creating a medical image, and each step is described below with reference to FIG. 8, which is a supplementary explanatory diagram.
- Step S301 The CPU 2 creates organ surface data using the 3D image data acquired in S101.
- the 3D image data 500 includes an organ 501 that is a diagnosis target. Therefore, the CPU 2 performs region determination in the three-dimensional image data 500 using a threshold value corresponding to the organ 501, thereby extracting the organ 501 and creating surface data 511 of the organ 501.
- the CPU 2 may extract the organ 501 by determining the region shape based on the anatomical shape characteristics of the organ 501.
- the surface data 511 of the organ 501 is held in the main memory 3 or the storage device 4 as two-dimensional image data, that is, pixel data.
- Step S302 The CPU 2 sets characteristic parameters for the surface 2000 of the organ 501 extracted in S301.
- the organ characteristic parameter include reflectance and refractive index representing the optical characteristics of the organ.
- the optical characteristic of the organ set in this step may be a physical property value of the target organ or a physical property value of an arbitrary substance similar to the target organ.
- the color of the organ which is one of the organ characteristic parameters, may reflect the anatomical color of the target organ, or an arbitrary color may be set.
- Step S303 The CPU 2 arranges the surface data 511 of the organ to be diagnosed in the virtual space 600.
- for example, the organ surface data 511 may be pasted onto a reference plane 610 arbitrarily set in the virtual space 600 by using texture mapping, which is a known technique.
- the reference surface 610 set in the virtual space 600 may be a flat surface, a curved surface, or a surface having irregularities.
- Step S304 The CPU 2 creates the three-dimensional shape data of the virtual liquid.
- the three-dimensional shape data of the virtual liquid has an arbitrary thickness and is arranged closer to the viewpoint position than the reference plane 610 set in S303.
- the liquid surface shape of the virtual liquid may be set in the same manner as in the first embodiment.
- Step S305 The CPU 2 sets characteristic parameters for the virtual liquid created in S304.
- the characteristic parameter of the virtual liquid may be the same as in the first embodiment.
- Step S306 The CPU 2 creates a three-dimensional projection image using the surface data 511 of the organ 501 arranged in the virtual space 600 and the created virtual liquid.
- the CPU 2 sets a light source, a viewpoint, a line-of-sight direction, and a projection plane when creating a three-dimensional projection image.
- FIG. 8 shows an example of the organ surface 2000 and the virtual liquid level 1000 arranged in the virtual space 600, and the set light source 601, viewpoint 602, line-of-sight direction 603, and projection plane 604.
- the method for creating a three-dimensional projection image may be the same as in the first embodiment.
- in this embodiment, the shape of the organ is handled not as three-dimensional shape data but as surface data, that is, two-dimensional image data, which is suitable for high-speed processing.
- the processing flow of the third embodiment of the present invention is the same as that of the first and second embodiments except for the medical image creation processing in S102.
- whereas the organ in the medical image is handled as three-dimensional shape data in the first embodiment and as surface data in the second embodiment, it is handled as three-dimensional image data in the third embodiment. Therefore, in the description of this embodiment, steps other than S102 are omitted. FIG. 9 shows a third example of the processing flow for creating a medical image, and each step is described below with reference to FIG. 10, which is a supplementary explanatory diagram.
- Step S401 The CPU 2 creates 3D image data of the organ using the 3D image data acquired in S101.
- the three-dimensional image data 500 includes an organ 501 that is a diagnosis target. Therefore, the CPU 2 performs region determination in the 3D image data 500 using a threshold corresponding to the organ 501, thereby extracting the organ 501 and creating 3D image data 512 of the organ 501. Note that the CPU 2 may extract the organ 501 by determining the region shape based on the anatomical shape characteristics of the organ 501.
- the three-dimensional image data 512 of the organ 501, that is, voxel data is held in the main memory 3 or the storage device 4.
- Step S402 The CPU 2 creates 3D image data of the virtual liquid.
- the virtual liquid 3D image data 522 has a shape obtained by enlarging the shape of the organ 501, with the center of gravity of the organ 501 extracted in S401 as the reference point, by an amount corresponding to the thickness of the virtual liquid 1001, and is created as transparent data.
- the shape of the created virtual liquid 1001 may be corrected based on, for example, the liquid surface shape set using the virtual liquid property setting interface 710 shown in FIG.
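The enlargement about the center of gravity can be sketched as a uniform scaling of the organ's surface points. How the patent maps liquid thickness to a scale factor is not specified, so the uniform factor here is an assumption:

```python
import numpy as np

def inflate_about_centroid(points, scale):
    """Enlarge a set of organ surface points about the organ's centre
    of gravity (sketch of S402). Each point is moved radially away from
    the centroid by the given uniform scale factor."""
    centroid = points.mean(axis=0)
    return centroid + scale * (points - centroid)

organ_pts = np.array([[0.0, 0.0, 0.0], [2.0, 2.0, 2.0]])
shell_pts = inflate_about_centroid(organ_pts, 2.0)  # centroid is (1, 1, 1)
```

The region between the original points and the inflated points would then be filled with transparent liquid voxels, and its shape corrected per the liquid surface settings as described above.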
- Step S403 The CPU 2 corrects the density values of the three-dimensional image data of the virtual liquid.
- for the correction of the density values, the transparency, reflectance, refractive index, liquid color, and so on set using the property setting interface 710 of the virtual liquid 1001 shown in FIG. 6 may be used, for example.
- the density values of all voxels of the virtual liquid 1001 may be corrected according to the set transparency and liquid color.
- the density values of the voxels on the surface of the virtual liquid may be corrected according to the level of the reflectance, and the density values of the virtual liquid voxels in contact with the organ may be corrected according to the level of the refractive index.
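One simple way to apply the transparency and liquid-color settings to the liquid voxels is a linear blend. The patent does not give a correction formula, so this blend is an assumption for illustration:

```python
import numpy as np

def correct_liquid_density(values, transparency, liquid_color):
    """Density-value correction sketch for S403: higher transparency
    attenuates each liquid voxel's own value and pulls it toward the
    liquid-color value."""
    return (1.0 - transparency) * values + transparency * liquid_color

voxels = np.full((2, 2, 2), 100.0)               # toy liquid voxel values
corrected = correct_liquid_density(voxels, transparency=0.75,
                                   liquid_color=0.0)
```

Surface voxels and organ-contact voxels could be corrected separately with reflectance- and refractive-index-dependent factors in the same fashion.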
- Step S404 The CPU 2 arranges the organ to be diagnosed and the virtual liquid added to the organ in the virtual space.
- when arranging them, the CPU 2 uses the three-dimensional image data 512 of the organ created in S401 and the three-dimensional image data 522 of the virtual liquid created in S402 and corrected in S403.
- organ data to which virtual liquid data is added is created in the virtual space 600.
- Step S405 The CPU 2 creates a three-dimensional projection image using the organ data to which the virtual liquid data created in the virtual space is added.
- the CPU 2 sets a light source, a viewpoint, a line-of-sight direction, and a projection plane when creating a three-dimensional projection image.
- FIG. 10C shows an example of the organ 2001 and the virtual liquid 1001 arranged in the virtual space 600, and the set light source 601, viewpoint 602, line-of-sight direction 603, and projection plane 604.
- a volume rendering method, which is a well-known technique, is applied to the organ data to which the virtual liquid data created in the virtual space has been added.
- the optical properties of the virtual liquid, such as transparency, reflectance, refractive index, and liquid color, may be used to correct the density values of the virtual liquid as described in S403, or may be reflected in the opacity used when creating the three-dimensional projection image by the volume rendering method.
- the three-dimensional projection image of the organ to which the virtual liquid is added is created by the processing in this step, it is possible to obtain a medical image having a texture closer to that of an actual endoscopic image or an image when the organ is directly viewed. it can.
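The compositing at the heart of the volume rendering step can be sketched as follows, assuming an axis-aligned viewing direction and a per-voxel opacity already derived (e.g., from the corrected concentration values). This is a generic front-to-back compositing sketch, not the patent's specific renderer; the function name and toy data are illustrative.

```python
import numpy as np

def composite_rays(volume, opacity, axis=0):
    # Front-to-back compositing along one axis: each slice contributes its
    # intensity weighted by its opacity and by the light transmitted through
    # all nearer slices.
    vol = np.moveaxis(volume, axis, 0).astype(float)
    alp = np.moveaxis(opacity, axis, 0).astype(float)
    image = np.zeros(vol.shape[1:])
    remaining = np.ones(vol.shape[1:])        # transmittance accumulated so far
    for intensity, a in zip(vol, alp):
        image += remaining * a * intensity
        remaining *= (1.0 - a)                # a semi-transparent liquid layer
    return image                              # never fully blocks what lies behind

# Toy volume: a semi-transparent "liquid" slice in front of an opaque "organ".
vol = np.zeros((3, 2, 2)); vol[0] = 10.0; vol[1] = 100.0
opa = np.zeros((3, 2, 2)); opa[0] = 0.5;  opa[1] = 1.0
img = composite_rays(vol, opa)
```

Because the liquid's opacity is below 1 (its light transmittance is not zero, as the claims require), the organ behind it remains visible in the composited image.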
- A medical image display apparatus may also be configured by appropriately combining these embodiments.
- The difference between an image created by the present invention and an image created by the surface rendering method, a known technique, is described below.
- An image created by the surface rendering method can also change the gloss of the organ surface, but it is difficult to change the gloss partially, because the degree of gloss depends on the uneven shape of the organ.
- In the present invention, the gloss of the organ surface can be changed partially by adding a virtual liquid.
- Furthermore, because the mucus secreted by the mucous membrane on the organ surface may be unevenly distributed, the unevenness of the organ surface seen in the image may be partially distorted.
- Reference numerals: 1 medical image display device, 2 CPU, 3 main memory, 4 storage device, 5 display memory, 6 display device, 7 controller, 8 mouse, 9 keyboard, 10 network adapter, 11 system bus, 12 network, 13 medical imaging device, 14 medical image database, 500 three-dimensional image data, 501 organ to be diagnosed, 510 shape data of organ 501, 511 surface data of organ 501, 512 three-dimensional image data of organ 501, 522 three-dimensional image data of virtual liquid, 600 virtual space, 601 light source, 602 viewpoint, 603 line-of-sight direction, 604 projection plane, 605 projection space, 610 reference plane, 700 display screen, 701 medical image display area, 702 region of interest, 710 virtual liquid property setting interface, 711 medical image creation mode switching interface, 720 viewpoint setting interface, 721 light source setting interface, 722 projection plane setting interface, 1000 virtual liquid surface, 1001 virtual liquid, 1002 virtual liquid thickness, 1003 flow, 1004 vortex flow, 2000 organ surface
Abstract
Description
The CPU 2 acquires, via the network 12, the three-dimensional image data of the subject selected by the operator with the mouse 8 and keyboard 9, from the medical imaging device 13 or the medical image database 14. Here, the three-dimensional image data consists of several to several hundred tomographic images obtained by imaging the subject, arranged consecutively in one direction, for example the direction perpendicular to the tomographic plane.
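The arrangement of tomographic slices into three-dimensional image data can be sketched as a simple stacking operation. The slice contents below are synthetic stand-ins; in practice each slice would be a tomographic image received over the network.

```python
import numpy as np

# Hypothetical stand-ins for tomographic slices: each 2-D image is stacked
# along the axis perpendicular to the tomographic plane to form the volume.
slices = [np.full((8, 8), z, dtype=np.int16) for z in range(5)]  # 5 fake tomograms
volume = np.stack(slices, axis=0)   # shape: (num_slices, rows, cols)
```

The resulting array plays the role of the three-dimensional image data 500 that the later segmentation and rendering steps operate on.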
Using the three-dimensional image data acquired in S101, the CPU 2 creates a medical image to which a virtual liquid has been added. The medical image created in this step has a virtual liquid added that corresponds to the moisture contained in the mucous membrane on the organ surface and in the mucus it secretes; for example, a virtual endoscopic image with a texture closer to an actual endoscopic image is created. FIG. 3 shows a first example of the processing flow for creating a medical image with an added virtual liquid; each step is described below with reference to the supplementary explanatory diagram, FIG. 4.
Using the three-dimensional image data acquired in S101, the CPU 2 creates three-dimensional shape data of the organ. As shown in FIG. 4, the three-dimensional image data 500 contains the organ 501 to be diagnosed. The CPU 2 extracts the organ 501 by performing region determination within the three-dimensional image data 500 using a threshold corresponding to the organ 501, and creates the shape data 510 of the organ 501. Alternatively, the CPU 2 may extract the organ 501 and create the shape data 510 by determining the region shape based on the anatomical shape features of the organ 501. The shape data 510 of the organ 501 is held in the main memory 3 or the storage device 4 as three-dimensional shape data. The three-dimensional shape data may be three-dimensional surface shape data such as a polygon representation.
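The threshold-based region determination can be sketched as a simple intensity-band test. The band limits below are arbitrary; a real system would pick them to match the target organ's intensity range in the modality at hand.

```python
import numpy as np

def extract_organ(volume, lo, hi):
    # Region determination by thresholding: keep voxels whose values fall
    # inside the intensity band assumed to correspond to the target organ.
    return (volume >= lo) & (volume <= hi)

vol = np.array([[[0, 50], [120, 200]],
                [[60, 90], [10, 130]]])
mask = extract_organ(vol, 50, 130)   # band chosen for illustration only
```

The boolean mask plays the role of the extracted organ region from which the shape data 510 would be derived; the anatomical-feature-based alternative mentioned above would replace this test with a shape criterion.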
The CPU 2 sets characteristic parameters for the organ 501 extracted in S201. The characteristic parameters of the organ include the reflectance and refractive index representing its optical properties. The optical properties set in this step may be the physical property values of the target organ, or those of any substance similar to the target organ. The organ color, one of the characteristic parameters, may reflect the anatomical color of the target organ or may be set to an arbitrary color.
The CPU 2 creates three-dimensional shape data of the virtual liquid. The three-dimensional shape data of the virtual liquid is created so as to cover the surface of the organ 501 with an arbitrary thickness, using the surface of the organ shape data 510 created in S201 as a reference surface. Here, the surface of the organ shape data 510 means its boundary surface on the side closer to the viewpoint position that is set when creating the projection image described later.
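One simple way to realize a liquid layer of fixed thickness over an organ mask is to dilate the mask and keep only the grown rim. This is an assumed voxel-based construction for illustration; the patent works with shape data and leaves the construction method open.

```python
import numpy as np

def liquid_shell(organ_mask, thickness=1):
    # Grow the organ mask by `thickness` voxels along each axis
    # (6-neighbourhood dilation via shifts) and keep only the grown rim:
    # a liquid layer covering the organ surface.
    # Note: np.roll wraps at the array border, so the toy organ below is
    # kept away from the edges.
    grown = organ_mask.copy()
    for _ in range(thickness):
        g = grown.copy()
        for ax in range(grown.ndim):
            g |= np.roll(grown, 1, axis=ax) | np.roll(grown, -1, axis=ax)
        grown = g
    return grown & ~organ_mask

organ = np.zeros((7, 7, 7), dtype=bool)
organ[2:5, 2:5, 2:5] = True             # a 3x3x3 toy "organ"
shell = liquid_shell(organ, thickness=1)
```

Increasing `thickness` produces a thicker liquid layer, mirroring the arbitrary-thickness parameter of S203.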
The CPU 2 sets characteristic parameters for the virtual liquid created in S203. The characteristic parameters of the virtual liquid include the reflectance, refractive index, and absorptance representing its optical properties. The optical properties set in this step may be the physical property values of mucus, or those of any substance similar to mucus, for example water. The color of the virtual liquid, one of its characteristic parameters, may be colorless, may reflect the anatomical color of the target organ, or may be any other color. The characteristic parameters described above physically depend on the wavelength of light, but the medical image creation in S102 may or may not take that dependence into account. However, the light transmittance of the virtual liquid must not be zero.
The CPU 2 arranges the organ to be diagnosed and the virtual liquid created so as to cover it in the virtual space. When arranging them, the CPU 2 uses the three-dimensional shape data of the organ created in S201 and the three-dimensional shape data of the virtual liquid created in S203. Through this step, organ data to which the virtual liquid data has been added is created in the virtual space.
The CPU 2 creates a three-dimensional projection image using the organ data, with the added virtual liquid data, created in the virtual space. When creating the projection image, the CPU 2 sets a light source, a viewpoint, a line-of-sight direction, and a projection plane. FIG. 4 shows an example of the organ surface 2000 and the virtual liquid surface 1000 arranged in the virtual space 600, together with the set light source 601, viewpoint 602, line-of-sight direction 603, and projection plane 604.
The CPU 2 displays the medical image created in S102 on the display device 6. The display device 6 may display, together with the medical image, interfaces for setting the various parameters used when creating the medical image. FIG. 6 shows an example of the display screen shown on the display device 6.
The viewpoint setting interface 720 is for moving the position of the viewpoint 602 in the virtual space 600. Each time the viewpoint 602 is moved, the medical image displayed in the medical image display area 701 may be updated.
Using the three-dimensional image data acquired in S101, the CPU 2 creates surface data of the organ. As shown in FIG. 8, the three-dimensional image data 500 contains the organ 501 to be diagnosed. The CPU 2 extracts the organ 501 by performing region determination within the three-dimensional image data 500 using a threshold corresponding to the organ 501, and creates the surface data 511 of the organ 501. Alternatively, the CPU 2 may extract the organ 501 by determining the region shape based on its anatomical shape features. The surface data 511 of the organ 501 is held in the main memory 3 or the storage device 4 as two-dimensional image data, that is, pixel data.
The CPU 2 sets characteristic parameters for the surface 2000 of the organ 501 extracted in S301. The characteristic parameters of the organ include the reflectance and refractive index representing its optical properties. The optical properties set in this step may be the physical property values of the target organ, or those of any substance similar to it. The organ color, one of the characteristic parameters, may reflect the anatomical color of the target organ or may be set to an arbitrary color.
The CPU 2 arranges the surface data 511 of the organ to be diagnosed in the virtual space 600. To do so, for example, the surface data 511 may be pasted onto a reference surface 610 set arbitrarily in the virtual space 600, using the texture mapping method, which is a known technique. The reference surface 610 set in the virtual space 600 may be a flat surface, a curved surface, or a surface with unevenness.
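The texture-mapping step can be sketched as a coordinate lookup that samples the organ's pixel data at points on the reference surface. This is a minimal nearest-neighbour sketch under assumed normalized (u, v) coordinates; real texture mapping would typically interpolate and handle the surface parameterization.

```python
import numpy as np

def texture_map(texture, u, v):
    # Nearest-neighbour texture lookup: map normalized (u, v) coordinates on
    # the reference surface to pixels of the organ surface data.
    h, w = texture.shape[:2]
    iu = np.clip((u * (w - 1)).round().astype(int), 0, w - 1)
    iv = np.clip((v * (h - 1)).round().astype(int), 0, h - 1)
    return texture[iv, iu]

tex = np.arange(16).reshape(4, 4)                 # toy 4x4 surface data
u = np.array([0.0, 1.0]); v = np.array([0.0, 1.0])  # two sample points
sampled = texture_map(tex, u, v)
```

Sampling at the corners of the (u, v) square returns the corner pixels of the surface data, i.e., the pasted image follows the reference surface wherever its coordinates lead.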
The CPU 2 creates three-dimensional shape data of the virtual liquid. The shape data of the virtual liquid has an arbitrary thickness and is placed on the side of the reference surface 610, set in S303, that is closer to the viewpoint position. The surface shape of the virtual liquid may be set in the same way as in the first embodiment.
The CPU 2 sets characteristic parameters for the virtual liquid created in S304. The characteristic parameters of the virtual liquid may be the same as in the first embodiment.
The CPU 2 creates a three-dimensional projection image using the surface data 511 of the organ 501 arranged in the virtual space 600 and the created virtual liquid. When creating the projection image, the CPU 2 sets a light source, a viewpoint, a line-of-sight direction, and a projection plane. FIG. 8 shows an example of the organ surface 2000 and the virtual liquid surface 1000 arranged in the virtual space 600, together with the set light source 601, viewpoint 602, line-of-sight direction 603, and projection plane 604. The three-dimensional projection image may be created in the same way as in the first embodiment.
Using the three-dimensional image data acquired in S101, the CPU 2 creates three-dimensional image data of the organ. As shown in FIG. 10, the three-dimensional image data 500 contains the organ 501 to be diagnosed. The CPU 2 extracts the organ 501 by performing region determination within the three-dimensional image data 500 using a threshold corresponding to the organ 501, and creates the three-dimensional image data 512 of the organ 501. Alternatively, the CPU 2 may extract the organ 501 by determining the region shape based on its anatomical shape features. The three-dimensional image data 512 of the organ 501, that is, voxel data, is held in the main memory 3 or the storage device 4.
The CPU 2 creates three-dimensional image data of the virtual liquid. The three-dimensional image data 522 of the virtual liquid has a shape obtained by enlarging the shape of the organ 501 according to the thickness of the virtual liquid 1001, taking the centroid of the organ 501 extracted in S401 as the reference point, and is created as transparent. The shape of the created virtual liquid 1001 may then be modified based on the liquid surface shape set, for example, via the virtual liquid property setting interface 710 shown in FIG. 6.
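The enlargement about the centroid can be sketched on a set of surface points. The function name, point set, and scale factor are illustrative; the patent only specifies that the organ shape is enlarged according to the liquid thickness with the centroid as the reference point.

```python
import numpy as np

def enlarge_about_centroid(points, scale):
    # Scale surface points away from their centroid; the gap between the
    # original and enlarged shape corresponds to the liquid thickness.
    centroid = points.mean(axis=0)
    return centroid + (points - centroid) * scale

# Corners of a toy cubic "organ", centroid at (1, 1, 1).
cube = np.array([[0, 0, 0], [2, 0, 0], [0, 2, 0], [0, 0, 2],
                 [2, 2, 0], [2, 0, 2], [0, 2, 2], [2, 2, 2]], dtype=float)
bigger = enlarge_about_centroid(cube, 1.5)
```

The centroid is unchanged by the scaling, so the enlarged shape stays centred on the organ, and the scale factor controls the resulting liquid thickness.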
The CPU 2 corrects the concentration values of the three-dimensional image data of the virtual liquid. For the correction, the transparency, reflectance, refractive index, liquid color, and so on set, for example, via the property setting interface 710 of the virtual liquid 1001 shown in FIG. 6 may be used. For example, the concentration values of all voxels of the virtual liquid 1001 may be corrected according to the set transparency and liquid color. Also, the concentration values of the voxels on the surface of the virtual liquid may be corrected according to the reflectance, or the concentration values of the voxels of the virtual liquid in contact with the organ may be corrected according to the refractive index.
The CPU 2 arranges the organ to be diagnosed and the virtual liquid added to it in the virtual space. When arranging the organ 501 and the virtual liquid 1001 in the virtual space 600, the CPU 2 uses the three-dimensional image data 512 of the organ created in S401 and the three-dimensional image data 522 of the virtual liquid created in S402 and modified in S403. Through this step, organ data to which the virtual liquid data has been added is created in the virtual space 600.
The CPU 2 creates a three-dimensional projection image using the organ data, with the added virtual liquid data, created in the virtual space. When creating the projection image, the CPU 2 sets a light source, a viewpoint, a line-of-sight direction, and a projection plane. FIG. 10(c) shows an example of the organ 2001 and the virtual liquid 1001 arranged in the virtual space 600, together with the set light source 601, viewpoint 602, line-of-sight direction 603, and projection plane 604.
Claims (8)
- A medical image display apparatus comprising: a medical image reading unit that reads a medical image acquired by a medical image diagnostic apparatus; a projection image creation unit that projects the medical image onto a projection plane to create a projection image; and a projection image display unit that displays the projection image, wherein the projection image creation unit includes a virtual liquid generation unit that generates a virtual liquid whose light transmittance is not zero, and a virtual liquid addition unit that adds the virtual liquid to an organ surface in the medical image, and creates a projection image of the medical image to which the virtual liquid has been added.
- The medical image display apparatus according to claim 1, further comprising a parameter setting unit that sets a characteristic parameter of the virtual liquid, wherein the virtual liquid generation unit generates the virtual liquid based on the characteristic parameter set by the parameter setting unit.
- The medical image display apparatus according to claim 2, wherein the characteristic parameter set by the parameter setting unit includes at least one of thickness, liquid surface shape, transparency, reflectance, refractive index, and liquid color.
- The medical image display apparatus according to claim 1, further comprising: a target organ extraction unit that extracts a target organ to be diagnosed from the medical image; and a target organ surface data creation unit that creates surface data of the target organ, wherein the virtual liquid addition unit adds the virtual liquid to the surface data.
- The medical image display apparatus according to claim 2, further comprising a virtual liquid concentration value correction unit that corrects the concentration values of the virtual liquid based on the characteristic parameter.
- A medical image display method comprising: a medical image reading step of reading a medical image acquired by a medical image diagnostic apparatus; a projection image creation step of projecting the medical image onto a projection plane to create a projection image; and a projection image display step of displaying the projection image, wherein the projection image creation step includes a virtual liquid generation step of generating a virtual liquid whose light transmittance is not zero, and a virtual liquid addition step of adding the virtual liquid to an organ surface in the medical image, and creates a projection image of the medical image to which the virtual liquid has been added.
- A medical image display apparatus comprising: a projection image creation unit that projects a medical image obtained from a medical image diagnostic apparatus onto a projection plane to create a projection image; and a projection image display unit that displays the projection image, the apparatus further comprising: a texture enhancement switching unit that switches whether texture enhancement processing is applied to the projection image; and a texture enhancement processing unit that partially changes the gloss in the projection image, wherein, relative to the gloss in an image without texture enhancement processing, an image in which the gloss has been partially changed by the texture enhancement processing is displayed on the projection image display unit.
- A medical image display apparatus comprising: a projection image creation unit that projects a medical image obtained from a medical image diagnostic apparatus onto a projection plane to create a projection image; and a projection image display unit that displays the projection image, the apparatus further comprising: a texture enhancement switching unit that switches whether texture enhancement processing is applied to the projection image; and a texture enhancement processing unit that causes partial distortion in the unevenness in the projection image, wherein, relative to the unevenness in an image without texture enhancement processing, an image in which the unevenness has been partially distorted by the texture enhancement processing is displayed on the projection image display unit.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2011523661A JP5562339B2 (ja) | 2009-07-24 | 2010-07-21 | 医用画像表示装置及び医用画像表示方法 |
| US13/383,759 US8830263B2 (en) | 2009-07-24 | 2010-07-21 | Medical image display device and medical image display method |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2009172602 | 2009-07-24 | ||
| JP2009-172602 | 2009-07-24 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2011010644A1 true WO2011010644A1 (ja) | 2011-01-27 |
Family
ID=43499116
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2010/062198 Ceased WO2011010644A1 (ja) | 2009-07-24 | 2010-07-21 | 医用画像表示装置及び医用画像表示方法 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US8830263B2 (ja) |
| JP (1) | JP5562339B2 (ja) |
| WO (1) | WO2011010644A1 (ja) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20130076761A (ko) * | 2011-12-28 | 2013-07-08 | 제너럴 일렉트릭 캄파니 | 볼륨 렌더링된 이미지에 대한 광의 방향을 지시하는 방법 및 시스템 |
| JP2018102912A (ja) * | 2016-11-29 | 2018-07-05 | バイオセンス・ウエブスター・(イスラエル)・リミテッドBiosense Webster (Israel), Ltd. | 解剖学的空腔の改善された視覚化 |
| JP2018182353A (ja) * | 2017-04-03 | 2018-11-15 | 日本電信電話株式会社 | 映像生成装置、映像生成方法、およびプログラム |
| JP2020516003A (ja) * | 2017-03-30 | 2020-05-28 | ノバラッド コーポレイションNovarad Corporation | 三次元データによる患者のリアルタイムビューの拡張 |
| JP2020151268A (ja) * | 2019-03-20 | 2020-09-24 | ソニー・オリンパスメディカルソリューションズ株式会社 | 医療用画像処理装置及び医療用観察システム |
| JPWO2022168925A1 (ja) * | 2021-02-04 | 2022-08-11 |
Families Citing this family (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6271145B2 (ja) * | 2013-04-09 | 2018-01-31 | 株式会社日立製作所 | 医用画像誘導装置及び医用画像誘導方法 |
| KR102111626B1 (ko) * | 2013-09-10 | 2020-05-15 | 삼성전자주식회사 | 영상 처리 장치 및 영상 처리 방법 |
| KR102347038B1 (ko) * | 2014-11-06 | 2022-01-04 | 삼성메디슨 주식회사 | 초음파 진단 장치 및 초음파 진단 방법 |
| US9384548B1 (en) * | 2015-01-23 | 2016-07-05 | Kabushiki Kaisha Toshiba | Image processing method and apparatus |
| GB2542114B (en) * | 2015-09-03 | 2018-06-27 | Heartfelt Tech Limited | Method and apparatus for determining volumetric data of a predetermined anatomical feature |
| CN106682424A (zh) | 2016-12-28 | 2017-05-17 | 上海联影医疗科技有限公司 | 医学图像的调节方法及其系统 |
| CN110313940B (zh) * | 2019-08-01 | 2021-06-01 | 无锡海斯凯尔医学技术有限公司 | 信号衰减计算方法、装置、设备及计算机可读存储介质 |
| CN112581596A (zh) * | 2019-09-29 | 2021-03-30 | 深圳迈瑞生物医疗电子股份有限公司 | 超声图像绘制方法、超声图像绘制设备及存储介质 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004219731A (ja) * | 2003-01-15 | 2004-08-05 | Matsushita Electric Ind Co Ltd | 情報表示装置及び情報処理装置 |
| WO2005117712A1 (ja) * | 2004-06-03 | 2005-12-15 | Hitachi Medical Corporation | 画像診断支援方法及び画像診断支援装置 |
| JP2006018606A (ja) * | 2004-07-01 | 2006-01-19 | Ziosoft Inc | 展開画像投影方法、展開画像投影プログラム、展開画像投影装置 |
| JP2007503060A (ja) * | 2003-08-18 | 2007-02-15 | フォヴィア インコーポレイテッド | 適応直接ボリュームレンダリングの方法及びシステム |
| JP2008100107A (ja) * | 2007-12-27 | 2008-05-01 | Ziosoft Inc | 展開画像投影方法、展開画像投影プログラム、展開画像投影装置 |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3483929B2 (ja) * | 1994-04-05 | 2004-01-06 | 株式会社日立製作所 | 3次元画像生成方法 |
| US8165358B2 (en) * | 2004-01-12 | 2012-04-24 | Ge Medical Systems Global Technology Co., Llc | System and method for overlaying color cues on a virtual representation of an anatomical structure |
| DE102007021035A1 (de) * | 2007-05-04 | 2008-11-13 | Siemens Ag | Bildverarbeitungs-, Bildvisualisierungs- und Bildarchivierungssystem zur kontrasterhaltenden Fusionierung und Visualisierung koregistrierter Bilddaten |
| US20090067027A1 (en) * | 2007-09-07 | 2009-03-12 | Michael Ross Hennigan | Liquid space telescope |
2010
- 2010-07-21 JP JP2011523661A patent/JP5562339B2/ja active Active
- 2010-07-21 US US13/383,759 patent/US8830263B2/en not_active Expired - Fee Related
- 2010-07-21 WO PCT/JP2010/062198 patent/WO2011010644A1/ja not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004219731A (ja) * | 2003-01-15 | 2004-08-05 | Matsushita Electric Ind Co Ltd | 情報表示装置及び情報処理装置 |
| JP2007503060A (ja) * | 2003-08-18 | 2007-02-15 | フォヴィア インコーポレイテッド | 適応直接ボリュームレンダリングの方法及びシステム |
| WO2005117712A1 (ja) * | 2004-06-03 | 2005-12-15 | Hitachi Medical Corporation | 画像診断支援方法及び画像診断支援装置 |
| JP2006018606A (ja) * | 2004-07-01 | 2006-01-19 | Ziosoft Inc | 展開画像投影方法、展開画像投影プログラム、展開画像投影装置 |
| JP2008100107A (ja) * | 2007-12-27 | 2008-05-01 | Ziosoft Inc | 展開画像投影方法、展開画像投影プログラム、展開画像投影装置 |
Cited By (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102018386B1 (ko) * | 2011-12-28 | 2019-09-04 | 제너럴 일렉트릭 캄파니 | 볼륨 렌더링된 이미지에 대한 광 방향을 지시하는 방법 및 시스템 |
| JP2013140581A (ja) * | 2011-12-28 | 2013-07-18 | General Electric Co <Ge> | ボリューム・レンダリング画像について光方向を指示するための方法及びシステム |
| JP2017199422A (ja) * | 2011-12-28 | 2017-11-02 | ゼネラル・エレクトリック・カンパニイ | ボリューム・レンダリング画像について光方向を指示するための方法及びシステム |
| US9818220B2 (en) | 2011-12-28 | 2017-11-14 | General Electric Company | Method and system for indicating light direction for a volume-rendered image |
| KR20130076761A (ko) * | 2011-12-28 | 2013-07-08 | 제너럴 일렉트릭 캄파니 | 볼륨 렌더링된 이미지에 대한 광의 방향을 지시하는 방법 및 시스템 |
| US10380787B2 (en) | 2011-12-28 | 2019-08-13 | General Electric Company | Method and system for indicating light direction for a volume-rendered image |
| JP2018102912A (ja) * | 2016-11-29 | 2018-07-05 | バイオセンス・ウエブスター・(イスラエル)・リミテッドBiosense Webster (Israel), Ltd. | 解剖学的空腔の改善された視覚化 |
| JP2020516003A (ja) * | 2017-03-30 | 2020-05-28 | ノバラッド コーポレイションNovarad Corporation | 三次元データによる患者のリアルタイムビューの拡張 |
| US11004271B2 (en) | 2017-03-30 | 2021-05-11 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
| US11481987B2 (en) | 2017-03-30 | 2022-10-25 | Novarad Corporation | Augmenting real-time views of a patient with three-dimensional data |
| JP2018182353A (ja) * | 2017-04-03 | 2018-11-15 | 日本電信電話株式会社 | 映像生成装置、映像生成方法、およびプログラム |
| JP2020151268A (ja) * | 2019-03-20 | 2020-09-24 | ソニー・オリンパスメディカルソリューションズ株式会社 | 医療用画像処理装置及び医療用観察システム |
| JP7239362B2 (ja) | 2019-03-20 | 2023-03-14 | ソニー・オリンパスメディカルソリューションズ株式会社 | 医療用画像処理装置及び医療用観察システム |
| JPWO2022168925A1 (ja) * | 2021-02-04 | 2022-08-11 | ||
| WO2022168925A1 (ja) * | 2021-02-04 | 2022-08-11 | 株式会社Kompath | 情報処理システム、プログラムおよび情報処理方法 |
| JP7741498B2 (ja) | 2021-02-04 | 2025-09-18 | 株式会社Kompath | 情報処理システム、プログラムおよび情報処理方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP5562339B2 (ja) | 2014-07-30 |
| US20120127200A1 (en) | 2012-05-24 |
| JPWO2011010644A1 (ja) | 2013-01-07 |
| US8830263B2 (en) | 2014-09-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5562339B2 (ja) | 医用画像表示装置及び医用画像表示方法 | |
| CN106981098B (zh) | 虚拟场景组分的视角表示 | |
| CN104008568B (zh) | 医用图像处理装置以及医用图像处理方法 | |
| US9595088B2 (en) | Method of, and apparatus for, visualizing medical image data | |
| US10832420B2 (en) | Dynamic local registration system and method | |
| CN102024271A (zh) | 借助体绘制有效地可视化对象特征 | |
| CN103702613A (zh) | 切割模拟装置以及切割模拟程序 | |
| JP6215057B2 (ja) | 可視化装置、可視化プログラムおよび可視化方法 | |
| JP2017189460A (ja) | 医用画像処理装置、医用画像処理方法、及び医用画像処理プログラム | |
| US7576741B2 (en) | Method, computer program product, and device for processing projection images | |
| Ahmad et al. | 3D reconstruction of gastrointestinal regions using single-view methods | |
| JP2006000127A (ja) | 画像処理方法および装置並びにプログラム | |
| Chung et al. | Patient-specific bronchoscopy visualization through BRDF estimation and disocclusion correction | |
| KR102111707B1 (ko) | 렌더링을 수행하기 위한 방법 및 시스템 | |
| JP2010131315A (ja) | 医用画像処理装置及び医用画像処理プログラム | |
| Chung et al. | Enhancement of visual realism with BRDF for patient specific bronchoscopy simulation | |
| JP7095409B2 (ja) | 医用画像処理装置、医用画像処理方法、プログラム、及びmpr像生成方法 | |
| JP5908241B2 (ja) | 医用画像表示装置 | |
| WO2011062108A1 (ja) | 画像処理装置及び画像処理方法 | |
| JP6035057B2 (ja) | 三次元画像表示装置及び三次元画像表示方法 | |
| US12340458B2 (en) | Image processing device, image processing method, and storage medium for controlling the display of the shape of the surface represented by surface data | |
| Khare et al. | Optimization of CT-video registration for image-guided bronchoscopy | |
| CN119832135A (zh) | 图像绘制方法、装置及计算机设备 | |
| JP2025160835A (ja) | 画像処理装置、画像処理方法及びプログラム | |
| JP2013022086A (ja) | 医用画像生成装置および医用画像生成プログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10802264 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2011523661 Country of ref document: JP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 13383759 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 10802264 Country of ref document: EP Kind code of ref document: A1 |