
CN109313818A - System and method for lighting in rendered images - Google Patents


Info

Publication number
CN109313818A
CN109313818A (application CN201780035916.2A)
Authority
CN
China
Prior art keywords
light source
user
data set
image
analog light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201780035916.2A
Other languages
Chinese (zh)
Other versions
CN109313818B (en)
Inventor
B. J-D. B. M. Mory
E. M. S. Attia
J-M. Rouet
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV filed Critical Koninklijke Philips Electronics NV
Publication of CN109313818A
Application granted
Publication of CN109313818B
Legal status: Active
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50Lighting effects
    • G06T15/506Illumination models
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/08Volume rendering
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461Displaying means of special interest
    • A61B8/466Displaying means of special interest adapted to display 3D data
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/41Medical

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract


This disclosure describes an image rendering technique that provides a simulated light source positioned within a three-dimensional (3D) dataset to render a two-dimensional projected image of the 3D dataset. The simulated light source can be positioned anywhere inside or outside the 3D dataset, including within a region of interest. The simulated light source can be a multi-directional light source. The user can select the X-Y-Z coordinates of the simulated light source via a user interface.

Description

System and method for lighting in rendered images
Background
In medical imaging, images can be rendered in real time or after a dataset has been acquired. An image can be a two-dimensional (2D) slice or plane acquired within a volume, or it can be a three-dimensional (3D) volume. 3D volume rendering techniques may involve casting virtual rays into an imaged 3D volume to obtain a 2D projection of the data, which can be displayed as the final rendered image. The data in the imaged volume may include anatomical structures. When rays are cast from the position of a hypothetical observer toward a region of interest in the imaged volume, various anatomical structures may be intersected along the line of sight. The direction of the incident light drives the appearance of shadows and reflections on the surfaces of the anatomical structures. Using a simulated light source in the rendered image can give the user a sense of depth and of how the various anatomical structures are arranged in the 3D volume. One or more anatomical structures may prevent or otherwise interfere with acquiring a clear image of the region of interest. The user can rotate the 3D volume, which changes the position of the hypothetical observer and/or the simulated light source relative to the volume, and a new 2D projection of the data can then be rendered. Shadows and other lighting effects from the simulated light source shift with the rotation of the 3D volume, providing the user with additional information about the depth and arrangement of anatomical features.
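As a rough illustration of the ray-casting projection described above, the sketch below casts one virtual ray per pixel straight through a voxel volume's depth axis and keeps the brightest sample along each ray (a maximum-intensity projection). This is a minimal stand-in for illustration only, not the renderer the patent describes; the function name and projection rule are assumptions.

```python
import numpy as np

def project_volume(volume):
    """Project a 3D dataset onto a 2D image plane.

    Casts one virtual ray per (x, y) pixel through the volume's depth
    (Z) axis and keeps the maximum sample along each ray -- a
    maximum-intensity projection, the simplest form of 2D projection.
    """
    return volume.max(axis=2)

# A toy 4x4x4 volume with one bright "structure" at depth z=2.
vol = np.zeros((4, 4, 4))
vol[1, 2, 2] = 0.9
img = project_volume(vol)
print(img[1, 2])  # 0.9 -- the structure is visible in the projection
```

A real renderer would composite samples front-to-back with opacity and shading rather than taking a simple maximum, but the ray-per-pixel structure is the same.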
For a given 3D image dataset, image rendering techniques generate a 2D image from a given viewpoint by making assumptions about a light source of predefined color and intensity and about the optical properties of the tissue being imaged. Currently, image rendering techniques for ultrasound imaging systems rely on a directional light source positioned at a fixed or infinite distance. The incident light direction may be presented to the user as an arrow on a dedicated sphere widget controlled by a trackball. In addition to rotating the 3D volume, the user can change the direction of the incident light from the simulated light source.
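A conventional directional light of this kind only needs an incident light *direction* (e.g., from the trackball widget), not a position. A minimal sketch of the resulting per-point brightness, assuming simple Lambertian shading (an assumption; the patent does not specify the shading model):

```python
import numpy as np

def directional_shade(normal, light_dir):
    """Lambertian brightness of a surface point under a directional light.

    Brightness is the clamped dot product of the unit surface normal and
    the unit vector pointing toward the light; the light's position is
    irrelevant, only its direction matters.
    """
    n = np.asarray(normal, dtype=float)
    l = np.asarray(light_dir, dtype=float)
    n /= np.linalg.norm(n)
    l /= np.linalg.norm(l)
    return max(0.0, float(n @ l))

# A surface facing the light is fully lit; one facing away is dark.
print(directional_shade([0, 0, 1], [0, 0, 1]))   # 1.0
print(directional_shade([0, 0, -1], [0, 0, 1]))  # 0.0
```

Because every point sees the same incident direction, surfaces facing away from that direction are always in shadow — the limitation the disclosure addresses with a movable multi-directional source.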
Fig. 1 is a schematic illustration of an example of a conventional image rendering technique 100. A 3D dataset 130 may be acquired via an ultrasound probe or another imaging technique. The 3D dataset 130 may include data corresponding to a 3D volume within a body, and may include a region of interest 135. The region of interest 135 may be part of an object (e.g., the wall of a blood vessel, a heart valve) or an entire object (e.g., a tumor, a fetus). When rendering an image of the 3D dataset 130 that includes the region of interest 135, a virtual light source can be used to provide shadows and reflections on one or more surfaces in the 3D dataset 130 (e.g., the surface 136 of the region of interest 135), which can give the user a sense of depth. The simulated light source may be a directional light source 105, which emits light only in the direction indicated by arrow 115. The user may be allowed to select the position of the directional light source 105 at a fixed distance from the 3D dataset 130. A 2D projection of the 3D dataset 130 can be rendered relative to a display image plane 120, based on a hypothetical observer viewing the 3D dataset 130 from the direction indicated by arrow 125. The display image plane 120 may be aligned with the X-Y plane of the 3D dataset 130, and arrow 125 may be perpendicular to the image plane 120. That is, the hypothetical observer can be thought of as "looking" through the image plane 120 into the depth of the 3D dataset 130 indicated by the Z axis. The 2D projection of the 3D dataset 130 at the display image plane 120 can be provided to the user as an image on a display.
Although the user can move the directional light source 105 relative to the 3D dataset 130, positioning the directional light source 105 outside the rendered volume can cause objects to self-shadow, making it difficult to find a configuration that illuminates the region of interest 135. Details of the volume and/or of the region of interest 135 may be obscured. Without shearing or other significant manipulation of the 3D dataset 130, anatomical details within a concave cavity may be invisible.
Fig. 2 is an example of an image 200 rendered from a 3D dataset using an external directional light source. Image 200 shows a fetus 205 in a uterus 210. Many anatomical structures of the fetus 205 are obscured by the shadows cast by the uterus 210 under a rendering technique that uses a directional light source positioned outside the uterus 210. This can hinder the user (who may be a sonographer, obstetrician, or other clinician) in making a diagnosis or navigating the volume defined by the 3D dataset.
Summary of the invention
An ultrasound imaging system according to at least one embodiment of the present disclosure may include: an ultrasound probe, which may be configured to receive ultrasound echoes from a subject to image a volume of the subject; a scan converter, which may be configured to generate a three-dimensional (3D) dataset from the ultrasound echoes; a volume renderer, which may be configured to compute surface shading information for a surface of the 3D dataset based at least in part on the position of a simulated light source relative to the 3D dataset, and to render a two-dimensional (2D) projection image of the 3D dataset that includes the shading information; and a user interface, which may include a display configured to show the 2D projection image, and an input device that may include a user input element configured to receive user input for positioning the simulated light source at a location behind the surface of the 3D dataset. In some embodiments, the simulated light source can be a multi-directional light source.
Method according at least one embodiment of the disclosure may include: to receive to the selection of analog light source to draw 3D The 2D projected image of data set, wherein the 3D data set can be constructed according to the ultrasonic echo received from the volume of object; It inputs and is received in the plane of the analog light source in the plane for corresponding to the projection plane of the 2D projected image in response to user The instruction of position;It is determined perpendicular to the depth location of the analog light source on the axis of the projection plane;At least partly ground The surface shading information on the surface of the 3D data set is calculated in position in plane and depth location;And drawing includes display The 2D projected image of shadow information on device.
Brief description of the drawings
Fig. 1 is a schematic illustration of an image rendering technique using an external directional light source;
Fig. 2 is an example of an image rendered using the image rendering technique shown in Fig. 1;
Fig. 3 is a block diagram of an imaging system according to an embodiment of the present disclosure;
Fig. 4 is a schematic illustration of an image rendering technique using a simulated light source according to an embodiment of the present disclosure;
Fig. 5 is an example of an image rendered using the image rendering technique shown in Fig. 4;
Fig. 6 is a schematic illustration of the image rendering technique shown in Fig. 4;
Figs. 7A-C are examples of images rendered using the image rendering technique shown in Fig. 6;
Fig. 8 is an illustration of a user interface according to an embodiment of the present disclosure;
Fig. 9 is a schematic illustration of a display of a user interface according to an embodiment of the present disclosure;
Fig. 10 is a schematic illustration of a display of a user interface according to an embodiment of the present disclosure;
Figs. 11A-C are examples of rendered images according to an embodiment of the present disclosure; and
Fig. 12 is a flow chart of a method according to an embodiment of the present disclosure.
Detailed description
The following description of certain exemplary embodiments is merely exemplary in nature and is in no way intended to limit the invention, its applications, or its uses. In the following detailed description of embodiments of the present systems and methods, reference is made to the accompanying drawings, which form a part hereof and in which are shown, by way of illustration, specific embodiments in which the described systems and methods may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the presently disclosed systems and methods, and it is to be understood that other embodiments may be utilized and that structural and logical changes may be made without departing from the spirit and scope of the present system. Moreover, for the purpose of clarity, detailed descriptions of certain features will not be discussed when they would be apparent to those skilled in the art, so as not to obscure the description of the present system. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present system is defined only by the appended claims.
In some applications, it can be desirable to render an image from 3D data using a simulated light source positioned within the 3D dataset. In some applications, it can be desirable to render an image from 3D data using a simulated light source within a region of interest of the 3D dataset. In some applications, it can be desirable for the simulated light source to be a multi-directional light source. For example, the simulated light source may be modeled as a sphere that projects light in all directions from its entire surface. In another example, the simulated light source may be modeled as a point source that projects light in all directions. Allowing the user to place a simulated light source within the 3D dataset can provide rendered images that are less obscured by the shadows and/or other artifacts generated when the image is rendered with a simulated directional light source positioned outside the 3D dataset. Compared with illumination from an external light source, close-range illumination can provide better local depth perception of an object's shape and curvature. Images rendered with a simulated light source within the 3D dataset may be easier for a clinician or other user to interpret, which can improve the clinician's or other user's ability to make diagnoses and/or navigate within the 3D dataset.
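The key property of the multi-directional (point or sphere) source described above is that each surface point sees its own incident direction, namely the vector from the point toward the source. A minimal sketch under assumed Lambertian shading with an inverse-square-style falloff (both assumptions, not the patent's model):

```python
import numpy as np

def point_light_shade(point, normal, light_pos):
    """Brightness at a surface point from a multi-directional point source.

    Unlike a directional light, the incident direction depends on the
    point's position: light radiates from light_pos in all directions,
    so every surface point sees the vector from itself toward the source.
    """
    to_light = np.asarray(light_pos, float) - np.asarray(point, float)
    d = np.linalg.norm(to_light)
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    lambert = max(0.0, float(n @ (to_light / d)))
    return lambert / (1.0 + d * d)  # illustrative falloff with distance

# The same source placed between two opposite-facing walls lights both,
# which is impossible with a single directional light.
left = point_light_shade([0, 0, 0], [1, 0, 0], [2, 0, 0])
right = point_light_shade([4, 0, 0], [-1, 0, 0], [2, 0, 0])
print(left > 0 and right > 0)  # True
```

This is why a source dropped inside a cavity can illuminate all of the cavity's walls at once, where an external directional light would leave most of them in shadow.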
In an illustrative example, a clinician may perform an ultrasound exam on a patient and acquire a 3D dataset from the patient (e.g., a fetus in the uterus). The imaging system may render an image of a 2D projection of the 3D dataset using a simulated multi-directional light source. The clinician may move the light source within the 3D dataset, and the imaging system may adjust the rendered image based in part on the new position of the light source. For example, the clinician may touch a visual indication of the light source (e.g., a sphere, square, or X) on a touch screen displaying the rendered image and "drag" the light source to a different position in the image. The clinician may move the light source to study different regions of interest. Continuing the example, the clinician may move the light source to highlight the contour of the fetus's face and check for a cleft palate. The clinician may then move the light source to illuminate the spine and check for deformities. The clinician may control the position of the light source in the image plane (e.g., an in-plane, X-Y position) and the depth of the light source in the 3D dataset (Z axis). The clinician may control the light source during the ultrasound exam or during post-exam review of a stored image.
Fig. 3 shows a block diagram of an ultrasound imaging system 10 constructed according to the principles of the present disclosure. Although an ultrasound imaging system is shown in the illustrative examples of embodiments of the invention, embodiments of the invention may be practiced with other medical imaging modalities, including but not limited to magnetic resonance imaging and computed tomography. The ultrasound imaging system 10 in Fig. 3 includes an ultrasound probe 12, which includes a transducer array 14 for transmitting ultrasound waves and receiving echo information. A variety of transducer arrays are well known in the art, such as linear arrays, convex arrays, or phased arrays. The transducer array 14 may, for example, include a two-dimensional array of transducer elements capable of scanning in both the elevation and azimuth dimensions for 2D and/or 3D imaging (as shown). The transducer elements 14 are coupled to a microbeamformer 16 in the transducer probe 12, which controls the transmission and reception of signals by the transducer elements of the array. In this example, the microbeamformer 16 is coupled by the probe cable to a transmit/receive (T/R) switch 18, which switches between transmission and reception and protects the main beamformer 22 from high-energy transmit signals. In some embodiments, such as in portable ultrasound systems, the T/R switch 18 and other elements of the system can be included in the ultrasound probe rather than in a separate ultrasound system base. The transmission of ultrasound beams from the transducer array 14 under control of the microbeamformer 16 is directed by a transmit controller 20 coupled to the T/R switch 18 and the beamformer 22, which receives input from the user's operation of a user interface or control panel 24. One of the functions controlled by the transmit controller 20 is the direction in which beams are steered. Beams may be steered straight ahead from the transducer array (orthogonal to it), or at different angles for a wider field of view. The partially beamformed signals produced by the microbeamformer 16 are coupled to the main beamformer 22, where the partially beamformed signals from individual patches of transducer elements are combined into a fully beamformed signal.
The beamformed signals are coupled to a signal processor 26. The signal processor 26 can process the received echo signals in various ways, such as bandpass filtering, decimation, I and Q component separation, and harmonic signal separation. The signal processor 26 can also perform additional signal enhancement, such as speckle reduction, signal compounding, and noise elimination. The processed signals can be coupled to a B-mode processor 28, which can employ amplitude detection for imaging structures in the body. The signals produced by the B-mode processor 28 are coupled to a scan converter 30 and a multiplanar reformatter 32. The scan converter 30 arranges the echo signals in the spatial relationship from which they were received, in a desired image format. For example, the scan converter 30 may arrange the echo signals into a two-dimensional (2D) sector format or a pyramidal three-dimensional (3D) image. In some embodiments, the scan converter 30 can form a 3D dataset from the echo signals. The multiplanar reformatter 32 can convert echoes received from points in a common plane in a volumetric region of the body into an ultrasound image of that plane, as described in U.S. Patent No. 6443896 (Detmer). A volume renderer 34 converts the echo signals of a 3D dataset into a projected 3D image as viewed from a given reference point, e.g., as described in U.S. Patent No. 6530885 (Entrekin et al.). In some embodiments, the volume renderer 34 can receive input from the user interface 24. The input may include the given reference point (e.g., the viewpoint of the hypothetical observer), the position of the simulated light source, and/or properties of the simulated light source for the rendered projection image. In some embodiments, the volume renderer 34 can compute surface shading information for one or more surfaces in the 3D dataset based at least in part on the position and/or properties of the simulated light source. The 2D or 3D images from the scan converter 30, the multiplanar reformatter 32, and the volume renderer 34 are coupled to an image processor 36 for further enhancement, buffering, and temporary storage for display on an image display 38. In some embodiments, the image processor 36 can render a visual cue for the simulated light source (e.g., a sphere, a halo). In some embodiments, the visual cue can be rendered by the volume renderer 34. A graphics processor 40 can generate graphic overlays for display together with the ultrasound images. These graphic overlays can contain, for example, standard identifying information such as the patient's name, the date and time of the image, and imaging parameters. For these purposes, the graphics processor receives input from the user interface 24, such as a typed patient name. The user interface can also be coupled to the multiplanar reformatter 32 for selection and control of the display of multiple multiplanar reformatted (MPR) images.
According to embodiments of the present disclosure, the ultrasound probe 12 can be configured to receive ultrasound echoes from a subject to image a volume of the subject. The scan converter 30 can receive the ultrasound echoes and generate a 3D dataset. As described above, the ultrasound echoes may be preprocessed by the beamformer 22, the signal processor 26, and/or the B-mode processor 28 before being received by the scan converter 30. The 3D dataset may include a value for each point (e.g., voxel) in the imaged volume. The values may correspond to echo intensity, tissue density, flow rate, and/or material composition. Based on the values in the 3D dataset, the scan converter 30 and/or the volume renderer 34 can define one or more surfaces in the imaged volume. A surface can represent a boundary between two different objects (e.g., a fetus and the uterus), materials (e.g., bone and muscle), or regions (e.g., different flow rates in a blood vessel) in the imaged volume. In some embodiments, the surface can be an isosurface.
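Defining a surface from voxel values can be sketched very simply: walk each depth ray and record the first voxel whose value crosses a cutoff (a crude stand-in for the boundary/isosurface extraction described above; the thresholding rule and function name are illustrative assumptions, not the system's actual surface detection).

```python
import numpy as np

def find_front_surface(volume, threshold):
    """For each (x, y) ray, locate the first voxel along depth whose
    value crosses the threshold (e.g., echo intensity vs. a tissue
    cutoff). Returns a depth map; -1 marks rays that never cross it.
    """
    above = volume >= threshold
    hit = above.any(axis=2)
    depth = above.argmax(axis=2)  # index of the first True along z
    return np.where(hit, depth, -1)

vol = np.zeros((3, 3, 5))
vol[1, 1, 2:] = 1.0               # a "tissue" slab starting at z=2
print(find_front_surface(vol, 0.5)[1, 1])  # 2
```

An isosurface generalizes this idea: instead of the first threshold crossing per ray, it is the full set of points where the value equals the cutoff.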
When rendering a 2D projection image of the 3D dataset, the volume renderer 34 can receive a position of the simulated light source relative to the 3D dataset. In some embodiments, the position of the simulated light source can be preprogrammed by the imaging system 10. The simulated light source may default to a preprogrammed position (e.g., when a volume rendering mode is enabled), and in some cases the light source may be movable by the user while in the volume rendering mode. In some embodiments, the position of the simulated light source can be received via the user interface 24, which may include an input device with one or more input elements configured to receive user input. For example, the user interface 24 may include a touch screen with a graphical user interface (GUI) that allows the user to set the position of the simulated light source anywhere inside and/or near the 3D dataset. As an example, the GUI may provide one or more GUI elements that enable the user to set the position of the simulated light source. In some examples, a GUI element (e.g., a glowing sphere) may additionally provide a visual cue of the light source's position relative to the volume. In other examples, the GUI element may be an input widget in which the user can specify the position of the light source (e.g., specify X, Y, Z coordinates). Other examples of GUI elements can be used. In another example, user input can be received via a mechanical control (e.g., a trackball or rotary encoder on the control panel), which may be specifically associated with, and configured to generate, steering commands for moving the light source while in the volume rendering mode.
The volume renderer 34 can compute surface shading information for one or more surfaces in the 3D dataset based at least in part on the position of the simulated light source relative to the 3D dataset. The surface shading information may include information about the brightness of any given pixel representing a surface of the 3D dataset in the rendered 2D projection image, which can lend three-dimensionality to an otherwise flat 2D rendered image. In addition to the position of the light source relative to the surface, the surface shading information can be based on properties of the volume near the surface (e.g., the values of the voxels interposed between the light source and the surface). For example, when computing the shading information for a given surface, the volume renderer 34 may take into account the density of the tissue between the simulated light source and the rendered outer surface. When the simulated light source is positioned in front of a surface of the imaged volume, only zero-valued voxels may be interposed between the light source and the surface, and the illuminated region of the surface can have a high radiance or brightness compared with an example in which the simulated light source is behind the surface and is therefore separated from it by nonzero-valued voxels. Light transmission through the zero-valued voxels in the region surrounding the rendered 3D dataset can be approximated by known light simulation techniques as similar to light transmission through air, whereas light transmission through nonzero-valued voxels can be reduced to approximate transmission through tissue denser than air. Accordingly, the surface shading information computed by the volume renderer 34 when the simulated light source is positioned behind a surface of a volume denser than its surroundings in the 3D dataset can differ from the shading information computed when the simulated light source is positioned in front of the surface. For example, the surface shading information may include less reflection, so that the surface appears to "glow" from within when the simulated light source is positioned behind it, while the surface shading information can make the surface appear more opaque when the simulated light source is positioned in front of it. As will be appreciated, the density and other properties of an object positioned in front of the light source will affect the transmission of light through the object, so the volume renderer 34 is configured to take into account the density of the material interposed between the light source and the rendered surface.
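The attenuation idea — zero-valued voxels transmit like air, denser voxels dim the light — can be sketched with a Beer-Lambert-style exponential along the path between the source and the surface. The exponential model and the absorption coefficient are illustrative assumptions, not the renderer's actual light simulation:

```python
import numpy as np

def transmitted_light(densities, absorption=0.5):
    """Fraction of light surviving the voxels between the simulated
    source and a surface: exp(-absorption * accumulated density).

    Zero-valued voxels behave like air (no attenuation), so a light
    placed directly in front of a surface reaches it at full strength,
    while the same light behind dense tissue is dimmed.
    """
    densities = np.asarray(densities, dtype=float)
    return float(np.exp(-absorption * densities.sum()))

in_front = transmitted_light([0.0, 0.0, 0.0])  # only "air" in between
behind = transmitted_light([0.8, 0.9, 0.7])    # dense tissue in the path
print(in_front)           # 1.0
print(behind < in_front)  # True
```

This is the mechanism behind the "glow" effect described above: a source behind a thin, moderately dense surface still transmits some light through it, so the surface appears lit from within rather than reflectively lit.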
Although surface shading is referred to, the volume renderer 34 may or may not explicitly extract a surface from the 3D dataset in order to compute surface shading information. For example, the volume renderer 34 may compute shading information for each individual voxel in the 3D dataset (e.g., volumetric shading). As noted above, the shading information for each voxel can be based at least in part on the voxel's distance from the simulated light source, the density of the voxel, and/or the density of the surrounding voxels. The shading information obtained for the 3D dataset can give the user the appearance of 3D surfaces within the 3D dataset. For simplicity, the shading information for the surfaces of objects and/or regions of interest in the 3D dataset will be referred to as surface shading information, regardless of the manner in which the volume renderer 34 computes it.
The surface shading information can be used by the volume renderer 34 to render the 2D projection image. In some embodiments, the rendered 2D projection image can be provided by the volume renderer 34 to the image processor 36. The rendered 2D projection image can be provided to the display 38 for viewing by a user (e.g., a clinician). In some examples, the resulting 2D projection image rendered by the volume renderer 34 and provided on the display 38 can be updated in response to user input via the user interface 24, e.g., to reflect movement (e.g., translation or rotation) of the volume, movement of the simulated light source relative to the volume, and/or other changes to parameters associated with the various rendering constructs.
Fig. 4 is a schematic illustration of an image rendering technique 400 according to an embodiment of the present disclosure. In some embodiments, the image rendering technique 400 can be performed by an imaging system, such as the ultrasound imaging system 10. A 3D dataset 430 may be acquired via an ultrasound imaging system, such as the ultrasound imaging system 10 shown in Fig. 3, or another imaging system (e.g., a magnetic resonance imaging (MRI) machine). The 3D dataset 430 may include data corresponding to a 3D volume within a body, and may include a region of interest 435. The region of interest 435 may be part of an object (e.g., the wall of a blood vessel, a heart valve) or an entire object (e.g., a tumor, a fetus). In some embodiments, the 3D dataset 430 may include multiple regions of interest 435. A 2D projection of the 3D dataset 430 can be rendered relative to a display image plane 420, based on a hypothetical observer viewing the 3D dataset 430 from the direction indicated by arrow 425. The display image plane 420 may be aligned with the X-Y plane. The vector indicated by arrow 425 may pass through the image plane 420. That is, the hypothetical observer can be thought of as "looking" through the image plane 420 into the depth of the 3D dataset 430 indicated by the Z axis orthogonal to the X-Y plane. Although shown perpendicular to the image plane 420, arrow 425 can be at other angles relative to the image plane 420 (e.g., 10, 30, or 45 degrees). The 2D projection image of the 3D dataset 430 at the display image plane 420 can be provided to the user as an image on a display (such as the display 38 shown in Fig. 3). When rendering an image of the 3D dataset 430 that includes the region of interest 435, a simulated light source 405 can be used to compute surface shading information for rendering shadows and reflections on one or more surfaces in the 3D dataset 430 (e.g., the surface 436 of the region of interest 435), providing the user with a sense of depth. The surface shading information can be based at least in part on the position of the simulated light source 405 relative to the 3D dataset 430 and/or the region of interest 435. In some embodiments, the simulated light source 405 can be a multi-directional light source. The light source 405 can emit light in all directions, as indicated by arrows 415. Unlike the light source 105 shown in Fig. 1, the user may be allowed to select a position for the light source 405 anywhere outside or inside the 3D dataset 430. In the embodiment illustrated in Fig. 4, the light source 405 is within the 3D dataset 430 at a depth less than the depth of the region of interest 435. That is, the light source 405 is at a Z-axis depth between the region of interest 435 and the hypothetical observer viewing from the direction indicated by arrow 425.
Fig. 5 is an example of an image 500 rendered using the image rendering technique 400 shown in Fig. 4. Image 500 was rendered from the same 3D dataset as image 200 shown in Fig. 2 (a fetus 505 in a uterus 510). In some embodiments, the simulated light source can be rendered as a light-emitting object in the image. In the example shown in image 500, the simulated light source is rendered as a glowing sphere 515. The glowing sphere 515 is rendered within the 3D dataset, inside the uterus 510. As a result, the uterus 510 does not cast shadows that obscure the fetus 505. In contrast with the fetus 205 in Fig. 2, the left arm, right shoulder, and torso of the fetus 505 can be identified. These same features are obscured by the shadow of the uterus in image 200 shown in Fig. 2.
As previously mentioned, the light source 405 is not limited to a set distance from the 3D data set 430. FIG. 6 is a schematic illustration of various example possible positions of light sources 405a-e according to embodiments of the disclosure. As shown in FIG. 6, the light source 405 can be rendered at different positions in the image plane 420 (for example, different positions in the X-Y plane) and at different depths in the 3D data set 430 (for example, along the Z axis). For example, light source 405a is at the position shown in FIG. 4, and light source 405b is at the same depth as light source 405a but at a different point in the image plane 420 within the 3D data set 430. Light source 405c is at both a different point in the image plane 420 and a different depth in the 3D data set 430. As shown in FIG. 6, light source 405c is at a depth deeper than the region of interest 435 relative to the image plane 420. The light source 405 can even be placed inside the region of interest 435, as shown by light source 405d. The position of the light source 405 is not limited to the 3D data set 430. Light source 405e shows an example of a position outside the 3D data set 430. The example positions are shown merely for purposes of explanation, and the light source 405 is not limited to the positions shown in FIG. 6.
Although not shown in FIG. 6, the simulated light source 405 can be a directional light source rather than a multidirectional light source. In some embodiments, the user can switch between a multidirectional mode and a directional mode. In some applications, a directional light source within the 3D data set 430 may be desirable. For example, the user may wish to highlight a given region in the 3D data set while minimizing the illumination of other regions, which can reduce distraction (for example, a "spotlight" effect).
Fig. 7 a-c is two point of example that the analog light source according to an embodiment of the present disclosure using different depths is drawn Valve image 700a-c.As shown in Figure 7 a, the bicuspid valve that analog light source 705 is plotted in image 700a from the visual angle of observer is " preceding Face ".Light source 705, which is plotted in, can permit clinician and identifies mitral surface and around the heart of valve before bicuspid valve Structural feature.In fig.7b, analog light source 705 is plotted in the bicuspid valve in image 700b.When light source 705 is drawn When system is in bicuspid valve, clinician can make the subtleer profile and Depth Blur of mitral different piece.Figure 7c is the example using the drawing image for being positioned in the subsequent light source 705 of bicuspid valve.Light source is placed on behind bicuspid valve can To be advantageous for at least qualitative determination of mitral thickness in different parts.Clinician can have it His reason and/or the additional advantage there may be the different location of light source 705.For example, clinician can determine light source 705 Position is in depth to avoid from other anatomical structure cast shadows on area-of-interest.
FIG. 8 is an illustration of a portion of an ultrasound system 800 that can be used to implement embodiments of the disclosure. The ultrasound system 800 may include a user interface 805 and a display 810. In some embodiments, the user interface 805 can be used to implement the user interface 24 shown in FIG. 3. In some embodiments, the display 810 can be used to implement the display 38 shown in FIG. 3. The user interface 805 may include one or more input devices, the one or more input devices including one or more user input elements. For example, the user interface 805 may include a touch screen 815, one or more rotary controls 820, a trackball 825, and buttons 830. In some embodiments, the buttons 830 may include arrow keys and/or a QWERTY keyboard. In some embodiments, the display 810 may also be part of the user interface 805. For example, the display 810 may be implemented using a touch screen. The user may have the option of using the display 810, the touch screen 815, and/or other controls included in the user interface 805 to position the simulated light source in the rendered image and/or to control other properties of the simulated light source (for example, directional vs. multidirectional, intensity, color).
The user can control the position of the simulated light source in the rendered image via a user interface (such as the user interface 805 shown in FIG. 8). In some embodiments, the user can use the trackball 825 and the rotary control 820. The user can use the trackball 825 to select an in-plane position in the image plane (for example, an X-Y coordinate) and use the rotary control 820 to select a depth position (for example, a coordinate on the Z axis) to set the position of the simulated light source. In some embodiments, a separate rotary control is provided for each degree of freedom (for example, an X-axis control, a Y-axis control, and a Z-axis control) to set the position of the simulated light source. In some embodiments, the user can use the buttons 830 (such as the arrow keys) to select the position of the simulated light source (for example, an X-Y-Z coordinate).
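A minimal sketch of how the trackball/rotary mapping above might be wired up, assuming a simple additive mapping with clamping to the extent of the data set; the function name, the bounds convention, and the input format are hypothetical, not taken from the disclosure:

```python
def update_light_position(pos, trackball_dxdy=(0.0, 0.0), rotary_dz=0.0,
                          bounds=((-1, 1), (-1, 1), (0, 2))):
    """Hypothetical mapping of user input to the simulated light position:
    trackball motion pans the light in the image plane (X-Y) and a rotary
    control moves it along the Z axis (depth). The result is clamped to a
    bounding box around the 3D data set."""
    x, y, z = pos
    x += trackball_dxdy[0]
    y += trackball_dxdy[1]
    z += rotary_dz
    clamp = lambda v, lo, hi: min(max(v, lo), hi)
    return (clamp(x, *bounds[0]), clamp(y, *bounds[1]), clamp(z, *bounds[2]))
```

Per-axis rotary controls (X, Y, and Z) would simply call the same update with a delta on a single coordinate.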
In some embodiments, the user interface 805, or a user input element of the user interface, includes a graphical user interface (GUI). For example, the display 810 and/or the touch screen 815 may include a GUI. In some embodiments, the user can use the touch screen 815 to position the simulated light source. Various gestures on the touch screen 815 can be used to select the position of the simulated light source. For example, the user can tap the touch screen 815 at a position to set the in-plane position, and/or touch the rendered glowing sphere in the image displayed on the touch screen 815 and "drag" it to a position by moving a finger along the touch screen 815. Each point on the touch screen 815 may correspond to a point in the image plane. The user can press and hold the touch screen 815 to set the depth position of the light source, and/or use "pinch" and "spread" gestures with two or more fingers. In other words, the user can place two fingers close together on the touch screen 815 and slide them apart along the touch screen 815 to increase the depth of the light source within the 3D data set relative to the image plane. To decrease the depth, the user can place two fingers apart on the touch screen 815 and drag them together. These gestures are provided as examples, and other gestures can be used to set the position of the simulated light source in the 3D data set (for example, via control buttons provided on the touch screen). In some embodiments, the user can use one or a combination of the user input methods to position the simulated light source. For example, the user can use the touch screen to set the position of the simulated light source and then use the trackball and/or rotary control to "fine tune" the position. In some embodiments, the user interface 805 may include additional and/or alternative user input controls for positioning the simulated light source (for example, slider controls, motion sensors, a light pen). In some embodiments, the user can use the user interface 805 to control properties of the simulated light source. For example, the user can set the intensity and/or color of the light source.
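The pinch/spread gesture mapping described above could be sketched as follows; the linear gain and the clamping range are assumptions for illustration, not values specified by the disclosure:

```python
def pinch_to_depth(z, d_prev, d_now, gain=0.01, z_min=0.0, z_max=2.0):
    """Hypothetical pinch-gesture mapping: spreading two fingers apart
    (d_now > d_prev, distances in pixels) pushes the light deeper into
    the 3D data set; pinching them together pulls it back toward the
    image plane. Depth is clamped to the volume's Z extent."""
    z += gain * (d_now - d_prev)  # depth change proportional to spread change
    return min(max(z, z_min), z_max)
```

Calling this on every touch-move event, with `d_prev`/`d_now` taken from successive finger-distance samples, yields continuous depth adjustment.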
FIG. 9 is an illustration of a rendered image 910 on a display 905 according to embodiments of the disclosure. In some embodiments, the display 38 of FIG. 3 or the display 810 of FIG. 8 can be used to implement the display 905. In some embodiments, the display 905 may include a GUI, and the simulated light source 915 can be rendered with a visual cue to assist the user in positioning the light source. As shown in FIG. 9, when the light source is positioned further from the image plane within the 3D data set, the simulated light source 915 can be rendered smaller in size in the image 910. In some embodiments, the image plane is aligned with the display 905. As shown in FIG. 9, the light source 915 will then appear to move "into the page." In this example, light source 915a is closest to the image plane and light source 915c is furthest from the image plane. Changing the size of the light source 915 in the image 910 can provide a visual cue indicating the depth of the light source 915 along the Z axis of the 3D data set, and can assist the user in positioning the light source within the 3D data set.
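The size-scales-with-depth cue can be sketched as a simple linear shrink between the image plane and the back of the volume; the specific mapping and constants below are assumptions for illustration only:

```python
def cue_radius(depth, base_radius=20.0, near=0.0, far=2.0, min_radius=4.0):
    """Hypothetical perspective-style cue: the sphere marking the light
    is drawn smaller as the light moves deeper into the data set, giving
    the user a depth cue along the Z axis."""
    # Normalize depth to [0, 1] between the image plane (near) and the
    # back of the volume (far), then shrink the drawn radius linearly.
    t = min(max((depth - near) / (far - near), 0.0), 1.0)
    return max(min_radius, base_radius * (1.0 - t))
```

A nonlinear (e.g. true perspective 1/z) mapping would work just as well; the essential point is that radius decreases monotonically with depth.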
FIG. 10 is an illustration of a rendered image 1010 on a display 1005 according to embodiments of the disclosure. In some embodiments, the display 38 of FIG. 3 or the display 810 of FIG. 8 can be used to implement the display 1005. In some embodiments, the display 1005 may include a GUI, and the simulated light source 1015 can be rendered in the image 1010 with a halo 1020. The halo 1020 can allow the user to visually locate the light source 1015 in the image 1010. In some embodiments, the halo 1020 can allow the user to locate the light source 1015 even when the light source 1015 is positioned outside the field of view of the image 1010. In some embodiments, the user can turn the halo 1020 on and off. That is, the user can control whether the halo 1020 is rendered around the light source 1015 in the image 1010. In some embodiments, the halo 1020 can disappear automatically after the light source 1015 has been fixed for a period of time (for example, half a second, two seconds, ten seconds). In some embodiments, the user can turn off the visual cue of the light source 1015. By turning it off, the user is not choosing to remove the illumination rendered from the light source from the image 1010; rather, the user turns off the rendering of the visual cue (for example, the sphere) of the light source 1015 in the image 1010. In some embodiments, the rendering of the visual cue of the light source 1015 can disappear automatically after the light source 1015 has been fixed for a period of time (for example, half a second, two seconds, ten seconds). Turning the halo 1020 and/or the rendering of the light source 1015 on and off can allow the user to observe the image 1010 without interference from the visual cues used for positioning the light source 1015. The visual cues (such as the sphere and/or the halo) can be rendered by the volume renderer and/or image processor of the imaging system. For example, the volume renderer 34 and the image processor 36 of the ultrasound imaging system 10 shown in FIG. 1 can be used to implement embodiments of the disclosure.
FIGS. 11a-c are example rendered images 1100a-c with a light source 1115 having a halo 1120. FIG. 11a shows an image 1100a with a simulated light source 1115 rendered as a sphere. FIG. 11b shows the light source 1115 rendered with a halo 1120. Some users may find it easier to locate the light source 1115 in FIG. 11b than in FIG. 11a. The halo 1120 can also indicate to the user that the light source 1115 is multidirectional rather than directional. FIG. 11c shows an image 1100c in which the light source 1115 has been positioned outside the user's field of view. However, the user can still locate the light source 1115 because the halo 1120 remains within the user's field of view. In some embodiments, the halo 1120 can make it easier for the user to locate and place the light source 1115.
FIG. 12 is a flow chart of a method 1200 for positioning a simulated light source within a 3D data set for rendering a 2D projection from the perspective of a hypothetical observer of the 3D data set, according to embodiments of the disclosure. In some embodiments, the method 1200 can be implemented using the image rendering technique 400 illustrated in FIG. 4 and the ultrasound imaging system shown in FIG. 3. In some embodiments, the user can select the position of the simulated light source in the 3D data set before the 2D projected image of the 3D data set is rendered. In some embodiments, the imaging system can render the 2D projected image from the 3D data set using an initial default light source at a default position. The default light source and position can be preprogrammed into the imaging system and/or can be set by the user. In some embodiments, the default light source can be an external directional light source at a fixed distance from the data set. In some embodiments, the default light source can be a multidirectional light source positioned within or near the 3D data set. At step 1205, the imaging system can receive a selection of a simulated light source for rendering the 2D projected image of the 3D data set. In some embodiments, the user can select the simulated light source. The user can select the light source via a user interface (such as the user interface 24 in FIG. 1 or the user interface 805 in FIG. 8). In some embodiments, the user can navigate to a lighting control mode of the imaging system through the user interface. In some embodiments, the user can tap a button or touch screen to select the light source. Optionally, at step 1210, the user and/or the imaging system can enable a visual cue for the light source. That is, the user can choose to have the light source rendered as an object in the image, for example, a sphere. In some embodiments, the light source can be rendered in the image by default. Optionally, at step 1215, the user and/or the imaging system can enable a halo surrounding the light source. In some embodiments, the light source can be rendered with a halo by default. In some embodiments, the user may prefer the image to be rendered without a halo.
At step 1220, the imaging system can receive, in response to a user input, an indication of an in-plane position of the simulated light source in a plane corresponding to the projection plane of the 2D projected image (for example, the image plane 420 of FIG. 4). The user can select the in-plane position for the light source. In some embodiments, the in-plane position can correspond to a position in the image plane. At step 1225, a depth position of the simulated light source along an axis perpendicular to the projection plane (for example, the Z axis) can be determined. In some embodiments, the user can select the depth position for the light source. The depth position can correspond to a depth within the 3D data set relative to the image plane. In some embodiments, steps 1225 and 1220 can be performed in reverse order. In some embodiments, steps 1220 and 1225 can be performed simultaneously. The user can select the in-plane position and the depth position using a trackball, a touch screen, and/or another method and/or user interface (such as those described above with reference to FIG. 8). At step 1230, the imaging system can then calculate surface shading information for one or more surfaces of the 3D data set based on the in-plane position and the depth position. At step 1235, the imaging system can render the 2D projected image, including the shading information, on a display. In some embodiments, the imaging system can re-render the image as the position of the light source is moved by the user. That is, the light and shadows of the image can change dynamically as the position of the light source is changed (for example, the surface shading information can be recalculated). This can allow the user to quickly compare potential positions for the light source and/or to study features of the image by illuminating parts of the image in sequence. For example, the user can move the light source along a spine to examine each vertebra.
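Steps 1220-1235 can be sketched as a single relighting routine; the `renderer` object and its `shade()`/`draw()` methods below are hypothetical stand-ins for the volume renderer and image processor, not an interface from the disclosure:

```python
def render_with_light(volume_surfaces, in_plane_xy, depth_z, renderer):
    """Sketch of steps 1220-1235: combine the in-plane position and depth
    into a 3D light position, recompute the surface shading, and redraw
    the 2D projection. Re-invoking this routine whenever the user drags
    the light yields the dynamic relighting described above."""
    light_pos = (in_plane_xy[0], in_plane_xy[1], depth_z)  # steps 1220/1225
    shading = renderer.shade(volume_surfaces, light_pos)   # step 1230
    return renderer.draw(volume_surfaces, shading)         # step 1235
```

Because the whole pipeline is a pure function of the light position, moving the light along a structure (e.g. a spine) is just a sequence of calls with updated coordinates.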
At step 1240, once the light source is in place, the halo, if rendered, can be deactivated. In some embodiments, the user can choose to deactivate it (for example, via the user interface). In some embodiments, the imaging system can automatically stop rendering the halo when the light source has been fixed for a period of time. Alternatively, the halo can continue to be rendered. This may be desirable when the user has selected a position for the light source outside the field of view. Optionally, at step 1245, the visual cue for the light source can be deactivated. That is, the object rendered as the light source in the image can be removed from the image. The imaging system can automatically deactivate the visual cue for the light source, or the user can choose to deactivate the visual cue for the light source. Deactivating the visual cue for the light source can be advantageous when the user wishes to observe fine features of the image illuminated near the light source.
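The automatic hide-after-idle behavior for the halo and visual cue (steps 1240-1245) could be sketched with a small timer; the class and its interface are illustrative assumptions only:

```python
import time

class CueAutoHide:
    """Hypothetical auto-hide timer: once the light source has not moved
    for `hold_s` seconds, the visual cue (sphere and/or halo) stops being
    drawn. Any movement of the light makes the cue visible again."""
    def __init__(self, hold_s=2.0, now=time.monotonic):
        self.hold_s = hold_s
        self._now = now          # injectable clock, eases testing
        self._last_move = now()

    def light_moved(self):
        self._last_move = self._now()

    def cue_visible(self):
        return (self._now() - self._last_move) < self.hold_s
```

The `hold_s` values named in the text (half a second, two seconds, ten seconds) would simply be different constructor arguments.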
In some embodiments, the method 1200 can be performed during image acquisition. For example, the imaging system can render images from a 3D data set acquired by a matrix array ultrasound transducer during an ultrasound examination. The method 1200 can also be performed on a 3D data set stored in the imaging system or on another computing device (for example, a computer, a hospital mainframe, a cloud service). For example, a radiologist can review images rendered from a 3D data set acquired during a previous examination.
Although the method 1200 has been described with reference to a single light source, all or part of the method 1200 can be performed and/or repeated for multiple light sources. For example, the user can set a first light source at a shallow depth (for example, near the image plane), which can provide general illumination to the rendered volume in the image. Continuing the example, the user can set a second light source at a deeper depth and/or close to the region of interest. This can allow the user to highlight features of the region of interest while still providing visibility of the features surrounding the region of interest.
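For the two-light setup above, the contributions of multiple simulated sources could simply be accumulated per surface point and clipped; this is one plausible combination rule for illustration, not the disclosed implementation:

```python
import math

def multi_light_brightness(point, normal, lights):
    """Hypothetical accumulation over multiple simulated light sources:
    each (position, intensity) pair contributes a diffuse term, and the
    total is clipped to 1.0. A shallow low-intensity 'fill' light plus a
    deeper light near the region of interest reproduces the two-light
    example described above."""
    total = 0.0
    for light_pos, intensity in lights:
        d = [l - p for l, p in zip(light_pos, point)]
        dist = math.sqrt(sum(c * c for c in d)) or 1e-9
        # Diffuse term: cosine between the normal and the light direction.
        cos_theta = max(0.0, sum(n * c / dist for n, c in zip(normal, d)))
        total += intensity * cos_theta
    return min(1.0, total)
```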
As described herein, a simulated light source that can be placed anywhere within and/or around a 3D data set can provide additional illumination options for images rendered from the 3D data set. In some embodiments, the simulated light source can be a multidirectional light source. These additional options can allow the rendering of images with better-defined surfaces and/or tissue thicknesses that are less prone to being obscured by the shadows of other anatomical features.
In various embodiments in which components, systems, and/or methods are implemented using a programmable device (such as a computer-based system or programmable logic), it should be appreciated that the systems and methods described above can be implemented using any of various known or later-developed programming languages (such as "C", "C++", "FORTRAN", "Pascal", "VHDL", and the like). Accordingly, various storage media, such as magnetic computer disks, optical disks, electronic memories, and the like, can be prepared that contain information capable of directing a device, such as a computer, to implement the systems and/or methods described above. Once an appropriate device has access to the information and programs contained on the storage media, the storage media can provide the information and programs to the device, thereby enabling the device to perform the functions of the systems and/or methods described herein. For example, if a computer disk containing appropriate materials (such as source files, object files, executable files) is provided to a computer, the computer can receive the information, configure itself appropriately, and perform the functions of the various systems and methods outlined in the diagrams and flow charts above. That is, the computer can receive from the disk various portions of information relating to different elements of the systems and/or methods described above, implement the individual systems and/or methods, and coordinate the functions of the individual systems and/or methods.
In view of the disclosure, it is noted that the various methods and devices described herein can be implemented in hardware, software, and firmware. Further, the various methods and parameters are included by way of example only and not in any limiting sense. In view of the disclosure, those of ordinary skill in the art can implement the present teachings in determining their own techniques and the equipment needed to realize those techniques, while remaining within the scope of the invention.
Although the present system has been described with particular reference to an ultrasound imaging system, it is also envisioned that the present system can be extended to other medical imaging systems in which one or more images are obtained in a systematic manner. Accordingly, the present system may be used to obtain and/or record image information related to, but not limited to, the kidneys, testes, breasts, ovaries, uterus, thyroid, liver, lungs, musculoskeletal system, spleen, heart, arteries, and vascular system, as well as other imaging applications related to ultrasound-guided interventions. Further, the present system may also include one or more programs that can be used with conventional imaging systems so that they can provide the features and advantages of the present system. Certain additional advantages and features of the disclosure may become apparent to those skilled in the art upon studying the disclosure, or may be experienced by persons employing the novel systems and methods of the disclosure. Another advantage of the present systems and methods may be that conventional medical imaging systems can be easily upgraded to incorporate the features and advantages of the present systems, devices, and methods.
Of course, it is to be appreciated that any one of the examples, embodiments, or processes described herein may be combined with one or more other examples, embodiments, and/or processes, or may be separated and/or performed among separate devices or device portions in accordance with the present systems, devices, and methods.
Finally, the above discussion is intended to be merely illustrative of the present system and should not be construed as limiting the claims to any particular embodiment or group of embodiments. Thus, while the present system has been described in particular detail with reference to exemplary embodiments, it should also be appreciated that numerous modifications and alternative embodiments may be devised by those having ordinary skill in the art without departing from the broader and intended spirit and scope of the present system as set forth in the claims that follow. Accordingly, the specification and drawings are to be regarded in an illustrative manner and are not intended to limit the scope of the claims.

Claims (15)

1. An ultrasound imaging system, comprising:
an ultrasound probe configured to receive ultrasound echoes from a subject to image a volume of the subject;
a scan converter configured to generate a three-dimensional (3D) data set from the ultrasound echoes;
a volume renderer configured to calculate surface shading information for a surface of the 3D data set based at least in part on a position of a simulated light source relative to the 3D data set, and to render a two-dimensional (2D) projected image of the 3D data set, the 2D projected image including the shading information; and
a user interface comprising:
a display configured to display the 2D projected image; and
an input device comprising a user input element, the user input element configured to receive a user input to position the simulated light source at a position behind the surface of the 3D data set.
2. The imaging system of claim 1, wherein the simulated light source is a multidirectional light source.
3. The imaging system of claim 1, wherein the surface represents a boundary between two different materials of the imaged volume.
4. The imaging system of claim 1, wherein the volume renderer is configured to calculate first shading information in response to an indication that the simulated light source is positioned at a given distance in front of the surface as perceived by a hypothetical observer, and to calculate second shading information, different from the first shading information, in response to an indication that the simulated light source is positioned at the given distance behind the surface as perceived by the hypothetical observer.
5. The imaging system of claim 1, wherein the user interface comprises a trackball, a touchpad, a touch screen, or a combination thereof, and wherein the user input element is provided via the trackball, the touchpad, or the touch screen.
6. The imaging system of claim 1, wherein the user input element comprises a GUI displayed on a touch screen of the ultrasound system, wherein the GUI comprises a visual cue of the simulated light source displayed in the 2D projected image together with the rendered 3D data set, and wherein the visual cue is movable in response to a user input to allow the user to dynamically change the position of the simulated light source relative to the rendered 3D data set.
7. The imaging system of claim 1, wherein the volume renderer is configured to render a visual cue of the simulated light source in the 2D projected image.
8. The imaging system of claim 7, wherein the visual cue comprises a sphere.
9. The imaging system of claim 7, wherein a size of the visual cue is based at least in part on a depth of the simulated light source within the 3D data set.
10. The imaging system of claim 1, wherein the simulated light source comprises a plurality of simulated light sources.
11. The imaging system of claim 1, wherein the user input element is a first user input element, and wherein the input device comprises a second user input element configured to receive a user input to set an intensity of the simulated light source.
12. A method, comprising:
receiving a selection of a simulated light source for rendering a 2D projected image of a 3D data set, wherein the 3D data set is constructed from ultrasound echoes received from a volume of a subject;
receiving, in response to a user input, an indication of an in-plane position of the simulated light source in a plane corresponding to a projection plane of the 2D projected image;
determining a depth position of the simulated light source along an axis perpendicular to the projection plane;
calculating surface shading information for a surface of the 3D data set based at least in part on the in-plane position and the depth position; and
rendering the 2D projected image, including the shading information, on a display.
13. The method of claim 12, further comprising, after receiving the selection of the simulated light source, enabling a visual cue of the simulated light source in the rendered 2D projected image.
14. The method of claim 13, wherein the visual cue comprises a halo.
15. The method of claim 14, further comprising deactivating the visual cue and the halo after the in-plane position has been received and the depth position has been determined.
CN201780035916.2A 2016-06-10 2017-05-31 Systems and methods for lighting in rendered images Active CN109313818B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201662348272P 2016-06-10 2016-06-10
US62/348,272 2016-06-10
EP16306454 2016-11-07
EP16306454.6 2016-11-07
PCT/EP2017/063080 WO2017211626A1 (en) 2016-06-10 2017-05-31 Systems and methods for lighting in rendered images

Publications (2)

Publication Number Publication Date
CN109313818A true CN109313818A (en) 2019-02-05
CN109313818B CN109313818B (en) 2023-10-20

Family

ID=57286429

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780035916.2A Active CN109313818B (en) 2016-06-10 2017-05-31 Systems and methods for lighting in rendered images

Country Status (2)

Country Link
CN (1) CN109313818B (en)
WO (1) WO2017211626A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112515705A (en) * 2019-09-18 2021-03-19 通用电气精准医疗有限责任公司 Method and system for projection contour enabled Computer Aided Detection (CAD)

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
JP2021525632A (en) 2018-05-31 2021-09-27 マット・マクグラス・デザイン・アンド・カンパニー,リミテッド・ライアビリティー・カンパニーMatt Mcgrath Design & Co, Llc Medical imaging method using multiple sequences

Citations (4)

Publication number Priority date Publication date Assignee Title
US20040125103A1 (en) * 2000-02-25 2004-07-01 Kaufman Arie E. Apparatus and method for volume processing and rendering
CN101357077A (en) * 2007-08-03 2009-02-04 美国西门子医疗解决公司 Multi-volume rendering of single mode data in medical diagnostic imaging
CN104885126A (en) * 2012-12-27 2015-09-02 皇家飞利浦有限公司 Computer-aided identification of a tissue of interest
US20160063758A1 (en) * 2014-08-26 2016-03-03 General Electric Company Method, system, and medical imaging device for shading volume-rendered images with multiple light sources

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US6530885B1 (en) 2000-03-17 2003-03-11 Atl Ultrasound, Inc. Spatially compounded three dimensional ultrasonic images
US6443896B1 (en) 2000-08-17 2002-09-03 Koninklijke Philips Electronics N.V. Method for creating multiplanar ultrasonic images of a three dimensional object



Also Published As

Publication number Publication date
WO2017211626A1 (en) 2017-12-14
CN109313818B (en) 2023-10-20

Similar Documents

Publication Publication Date Title
CN109937435B (en) Systems and methods for simulated light source positioning in rendered images
CN109601018B (en) Systems and methods for generating B-mode images from 3D ultrasound data
Mohamed et al. A survey on 3D ultrasound reconstruction techniques
JP6887449B2 (en) Systems and methods for illuminating rendered images
KR101524085B1 (en) Apparatus and method for providing medical image
US8819591B2 (en) Treatment planning in a virtual environment
JP5495357B2 (en) Image display method and medical image diagnostic system
US20070046661A1 (en) Three or four-dimensional medical imaging navigation methods and systems
WO2015105619A9 (en) Apparatus and method for distributed ultrasound diagnostics
US20130328874A1 (en) Clip Surface for Volume Rendering in Three-Dimensional Medical Imaging
US7567701B2 (en) Input system for orientation in a three-dimensional visualization and method for visualization of three-dimensional data sets
CN115136200A (en) Rendering three-dimensional overlays on two-dimensional images
US6616618B2 (en) Method of and device for visualizing the orientation of therapeutic sound waves onto an area to be treated or processed
JP2020014551A (en) Medical image processing apparatus, medical image processing method, and medical image processing program
US11941765B2 (en) Representation apparatus for displaying a graphical representation of an augmented reality
CN109313818A (en) System and method for lighting in rendered images
CN112241996B (en) Method and system for coloring a volume rendered image
Gaurav et al. Integrating images from a moveable tracked display of three-dimensional data
Øye et al. Illustrative couinaud segmentation for ultrasound liver examinations

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant