US7812850B1 - Editing control for spatial deformations - Google Patents


Info

Publication number
US7812850B1
US7812850B1
Authority
US
United States
Prior art keywords
unmodified
deformed
displayed
parameter
window
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US11/771,726
Inventor
John Nelson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Adobe Inc
Original Assignee
Adobe Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Adobe Systems Inc filed Critical Adobe Systems Inc
Priority to US11/771,726
Assigned to ADOBE SYSTEMS INCORPORATED. Assignment of assignors interest (see document for details). Assignors: NELSON, JOHN
Application granted
Publication of US7812850B1
Assigned to ADOBE INC. Change of name (see document for details). Assignors: ADOBE SYSTEMS INCORPORATED

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 Indexing scheme for editing of 3D models
    • G06T2219/2021 Shape modification

Definitions

  • the present invention generally relates to editing graphical images, and more specifically relates to spatial deformation of objects in images.
  • Two-dimensional and three-dimensional electronic images may be edited using conventional computer-based image editing tools.
  • a two-dimensional object in an image may be deformed to create successive image frames for use in an animated video.
  • Conventional deformation tools use multiple windows to allow a user to deform a portion of an original image and to view the results of the deformation.
  • a conventional deformation tool for two-dimensional images allows a user to deform an object or objects that make up the image in one window, and to view the resulting deformed objects in the image in a second window.
  • a user may be presented with four or more windows: one each for deforming the unmodified image in the X, Y, and Z axes, and a fourth in which the deformed image is shown.
  • a user must repeatedly shift their focus between multiple windows and multiple representations of the image, a problem referred to as “edit this, look at that” or ETLAT.
  • the use of multiple windows can significantly reduce the available screen space for the display of the deformed image, or other representations of interest to the user.
  • a method for deforming an image comprises causing an unmodified image to be displayed in a window, receiving a deformation of the unmodified image, generating a deformed image based at least in part on the deformation and the unmodified image, and causing the deformed image to be displayed in the window.
  • This illustrative method further comprises causing a representation of the unmodified image to be displayed in the window with the deformed image, receiving a selection of a point within the representation of the unmodified image, receiving a modification of a first parameter associated with the unmodified image at the point, regenerating the deformed image based at least in part on the deformation, the unmodified image, and the first parameter, and causing the regenerated deformed image to be displayed in the window.
  • a computer-readable medium comprises program code for carrying out such a method.
  • FIG. 1 shows a system for editing an image according to one embodiment of the present invention
  • FIG. 2 shows a flowchart illustrating a method for editing an image according to one embodiment of the present invention
  • FIGS. 3-10 show an editing window and an object according to one embodiment of the present invention.
  • FIGS. 11-18 show an editing window and an object according to one embodiment of the present invention.
  • Embodiments of the present invention provide methods, systems and products for editing controls for spatial deformations. Methods, systems and products according to the present invention may be embodied in a number of ways. Certain embodiments of the present invention may, for example, reduce or eliminate ETLAT, streamline a graphical artist's workflow when editing an image, and/or make more efficient use of available space on a display screen, and allow the user to work at a higher resolution.
  • a computer system running computer software for editing two-dimensional images displays an editing window.
  • An unmodified, two-dimensional image containing an object, such as a gingerbread man, may be loaded and displayed in the editing window.
  • the editing tool analyzes the image and generates a standard tessellated mesh to define the object, and to provide parameters that affect the response of the object to a deformation.
  • a user may deform the object by selecting a point on the image, such as a point on the gingerbread man's leg, and dragging that point to another location in the window, which causes the leg to move and the image to deform.
  • the deformation engine calculates the effect of the movement of the leg on the rest of the image. For example, by moving the leg, the editing tool may calculate that the leg stretches somewhat, the torso is pulled in the direction of the stretched leg, and still other parts of the gingerbread man may be stretched or compressed.
  • the editing tool uses a rendering engine to generate the visual representation of the image, and then displays the deformed image.
  • the editing tool may not display the desired part of the object in the foreground. For example, if the left leg of the gingerbread man is deformed so that it overlaps with the gingerbread man's right leg, the rendering engine may simply randomly determine which leg should be displayed in the foreground.
  • one illustrative embodiment of the present invention provides the user with a representation, such as an outline, of the undeformed object overlaid on the deformed object.
  • the user is able to adjust parameters in the outline of the unmodified object that result in changes to the display of the deformed object. For example, the user may adjust an “overlap” parameter associated with the left leg in the outline to specify that it should be displayed in the foreground.
  • the deformed image in which the right and left legs overlap, can then be immediately updated to reflect the change made to the unmodified image.
  • FIG. 1 shows a system 100 for editing an image according to one embodiment of the present invention.
  • System 100 comprises a computer 101 , a display device 102 , and an input device 103 .
  • Computer 101 comprises a processor 110 and a computer-readable medium, such as a memory 111 , and is in communication with the display device 102 and the input device 103 .
  • Memory 111 comprises an application 120 for editing an image, and an image 130 .
  • Application 120 comprises a deformation engine 121 and a rendering engine 122 .
  • the processor 110 executes the application 120 to allow a user to edit an image.
  • a user of system 100 may use the application 120 to retrieve image 130 from storage, load it into memory 111 , and display it on the display device 102 .
  • An image, such as image 130 , includes one or more deformable objects, which, when initially displayed, are in an undeformed state.
  • a user of system 100 may then deform one or more of the objects by manipulating it with the input device 103 .
  • Deformation engine 121 then deforms the object based on the deformation, and passes the deformed object to the rendering engine 122 .
  • the rendering engine 122 renders the deformed object, and the system 100 displays the modified image, including the deformed object, on the display device 102 .
  • a user may further edit the object by modifying parameters associated with the deformed object. For example, if the deformation caused one part of the object to overlap with another part, the user may wish to specify which of the two overlapping areas should be displayed in the foreground.
  • System 100 can display a representation, such as an outline, of the undeformed object overlaid on the deformed object to help the user select a part of the object to modify. The user may then select a point within the outline of the unmodified object, rather than within the deformed object, and change a parameter that affects which of the overlapping parts should be displayed in the foreground, and which should be obscured.
  • the change made to the parameter within the outline causes system 100 to execute deformation engine 121 to deform the object again; this time the deformation engine 121 deforms the object based at least in part on the changed parameter.
  • the deformation engine 121 then passes the deformed object to the rendering engine 122 , which renders the object.
  • System 100 displays the deformed object with the desired area in the foreground.
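  • The deform-then-render loop described above can be sketched in a few lines. This is not code from the patent: the function names, the two-vertex "leg", and the distance-based falloff are illustrative assumptions standing in for deformation engine 121 and rendering engine 122.

```python
def deform(vertices, drag_point, drag_vector, falloff=2.0):
    """Move each vertex by the drag vector, attenuated by its distance
    from the dragged point (a stand-in for a deformation engine)."""
    out = []
    for (x, y) in vertices:
        d = ((x - drag_point[0]) ** 2 + (y - drag_point[1]) ** 2) ** 0.5
        w = 1.0 / (1.0 + d / falloff)  # nearer vertices move more
        out.append((x + drag_vector[0] * w, y + drag_vector[1] * w))
    return out

def render(vertices):
    """Stand-in for a rendering engine: produce drawable coordinates."""
    return [(round(x, 3), round(y, 3)) for (x, y) in vertices]

# A toy "leg" with two vertices; the vertex at (0, 0) is dragged two
# units to the left, and geometry farther away follows only partially.
leg = [(0.0, 0.0), (0.0, 4.0)]
deformed = deform(leg, drag_point=(0.0, 0.0), drag_vector=(-2.0, 0.0))
frame = render(deformed)
```

Re-running the same two steps after a parameter change is what regenerates the displayed image.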
  • FIG. 2 shows a flowchart illustrating a method 200 for editing an image according to one embodiment of the present invention, which will be described with reference to the system 100 discussed above.
  • FIGS. 3-10 show a gingerbread man in various states of deformation.
  • the method 200 begins in block 201 , where the system 100 displays an unmodified object.
  • a user may execute an image editing application 120 on the system 100 to display an image 130 , such as a gingerbread man 300 as shown in FIG. 3 , on a display device 102 in an editing window 123 .
  • the user may start the application 120 and load the gingerbread man 300 from storage, such as from a hard drive, into memory 111 .
  • the system 100 can then display 201 the image 130 on the display device 102 .
  • the gingerbread man 300 is the only object in the image 130 , though other objects could be included in the image 130 as well. Initially, when the image 130 is displayed, it has zero or more unmodified objects. Each of the objects within the image 130 may be subsequently deformed, but the initial state of each object when loaded into the application 120 is the object's unmodified state.
  • In some cases, such as when a previously deformed image is saved and later reloaded, the unmodified state of the objects in the image 130 may differ from their displayed state.
  • Various embodiments of the invention may handle this situation differently.
  • one embodiment of the present invention may determine an unmodified state for each object, e.g., the state of the object when the object is created.
  • the initial unmodified state of each object within the image may be stored with the deformed image.
  • the initial unmodified state of each object is loaded along with the deformed state of each object.
  • Other embodiments may allow a user to select a new unmodified state for an object. For example, the user may create an image having an object, in which the object has an initial unmodified state. The user may edit the object, and reach a configuration of the object that is more desirable. The user may then use the application 120 to store the new, desirable state as the new unmodified state. This may provide the user with a new reference point for future edits of the object. Still further embodiments would be apparent to one of ordinary skill in the art.
  • an image editing application 120 may generate a tessellated mesh for each object in an image, or such a mesh may be saved as a part of the image 130 .
  • For an object such as a gingerbread man 300 , the shape of the mesh generally conforms to the shape of the object, and provides data to allow the deformation engine 121 and the rendering engine 122 to process the object.
  • the deformation engine 121 may use the mesh to determine how a deformation causes the shape of an object to change.
  • the rendering engine 122 may use the mesh to apply textures, shadows, shading, or other features to the rendered object.
  • deformation engine 121 and/or rendering engine 122 may comprise specialized hardware configured to efficiently process a tessellated mesh.
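  • One plausible representation of such a tessellated mesh is a shared vertex list plus triangles that index into it; the dictionary layout and the edge-length helper below are illustrative assumptions, not the patent's data structures. A deformation engine could compare edge lengths before and after a drag to measure stretch or compression.

```python
# A toy tessellated mesh: shared vertices plus indexed triangles.
mesh = {
    "vertices": [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)],
    "triangles": [(0, 1, 2), (1, 3, 2)],
}

def edge_lengths(mesh):
    """Length of every unique triangle edge, keyed by vertex index pair."""
    edges = set()
    for (a, b, c) in mesh["triangles"]:
        for e in ((a, b), (b, c), (a, c)):
            edges.add(tuple(sorted(e)))
    verts = mesh["vertices"]
    out = {}
    for (i, j) in sorted(edges):
        (x1, y1), (x2, y2) = verts[i], verts[j]
        out[(i, j)] = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
    return out

lengths = edge_lengths(mesh)
```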
  • a user interacts with the application 120 to indicate a deformation of the object. For example, a user may specify that the right leg 301 of the gingerbread man 300 should be moved to the left.
  • FIG. 4 shows this deformation as an arrow 400 .
  • a user may deform an object in a variety of ways. For example, in one embodiment of the present invention, a user uses a mouse to click on a point on the object, and drag the point to a new location. In such an embodiment, the application 120 interprets this input as a deformation of the object. In another embodiment, a user may add one or more control points to the object.
  • a control point in one embodiment of the present invention, may be added to an object to provide a user with a point at which to manipulate the object.
  • the control point may be associated with the unmodified object, and allow the user to use the control point to deform the object.
  • one or more control points may be associated with one or more vertices within a mesh corresponding to the object. In such an embodiment, a user may not be able to deform the object from any arbitrary point, but only by manipulating one of the control points.
  • control points may have one or more associated parameters.
  • a control point may have an overlap parameter and a starchiness parameter associated with it.
  • a user may manipulate the one or more parameters to cause a change in the appearance of the object.
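  • A control point with named parameters might be modeled as below. The parameter names "overlap" and "starchiness" come from the patent text; the class layout and the value/extent pairing are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Parameter:
    value: float   # e.g. the overlap or starchiness level
    extent: float  # size of the region the parameter affects

@dataclass
class ControlPoint:
    x: float       # position on the unmodified object
    y: float
    params: dict = field(default_factory=dict)

# A hypothetical control point on a leg, carrying both parameters.
cp = ControlPoint(x=3.0, y=7.0)
cp.params["overlap"] = Parameter(value=1.0, extent=2.5)
cp.params["starchiness"] = Parameter(value=0.8, extent=4.0)
```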
  • a user specifies a control point 310 and a movement path.
  • the movement path specifies how the control point 310 should move over one or more frames of animation.
  • For each frame in which the control point 310 moves along the movement path, the object may be deformed.
  • the arrow 400 in FIG. 4 may specify a movement path for several frames of animation.
  • In each frame, the leg moves a distance along the path.
  • Each frame may comprise a deformed gingerbread man with its leg at different states of travel along the movement path.
  • Such an embodiment may allow an animator to more easily animate an object, rather than by manually creating multiple, successive deformations.
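  • The per-frame movement of a control point along a path could be computed by simple interpolation, as in this sketch; linear interpolation and the function name are assumptions, since the patent does not specify how intermediate positions are derived.

```python
def path_positions(start, end, frames):
    """Positions of a control point over `frames` frames, moving
    linearly from `start` to `end` (both endpoints included)."""
    (x0, y0), (x1, y1) = start, end
    out = []
    for f in range(frames):
        t = f / (frames - 1) if frames > 1 else 1.0
        out.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
    return out

# A leg control point moving six units to the left over four frames;
# each position would drive one deformation of the object.
frames = path_positions(start=(5.0, 0.0), end=(-1.0, 0.0), frames=4)
```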
  • the method 200 continues to block 203 , where the unmodified object is deformed.
  • the system 100 executes deformation engine 121 to determine a new shape for the object based at least in part on the unmodified object and the deformation.
  • the image editing application 120 may generate a tessellated mesh corresponding to the gingerbread man 300 .
  • the deformation engine 121 may determine how one of the gingerbread man's 300 legs should change shape based on the deformation received from the user. In addition, the deformation of the leg may also cause other parts of the gingerbread man 300 to change shape, as is discussed in greater detail below.
  • System 100 then displays the deformed gingerbread man 300 in the editing window 123 , as shown in block 204 and in FIG. 5 .
  • the gingerbread man's 300 right leg 301 has crossed underneath its left leg 302 based on the user's specified deformation 400 .
  • the user may command the system 100 to display a representation of the undeformed object in the same window as the deformed object.
  • A representation, such as an outline 600 of the undeformed object, may then be displayed.
  • the user may select an option from a context-sensitive menu to display the outline in the editing window 123 .
  • the outline may automatically appear. This allows the user to view in the same window the outline 600 of the unmodified object and the deformed object. The user can then make a modification to the undeformed object, represented by the outline 600 , and view the resulting change to the deformed object without shifting focus to another window.
  • While the representation is shown as an outline 600 , other representations are possible, and are within the scope of this invention.
  • the outline may be optionally filled with a partially-transparent color.
  • the outline 600 may comprise a partially transparent, fully-rendered image of the unmodified object.
  • the representation may comprise a wire-mesh representation of the unmodified object.
  • Still further embodiments may comprise other representations of the unmodified object that would be apparent to one of ordinary skill in the art.
  • the outline 600 of the undeformed object is overlaid on the deformed object; however, in one embodiment, the outline 600 may be offset from the deformed object, though still displayed in the same window 123 .
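  • Drawing a partially transparent representation over the deformed object amounts to per-pixel alpha blending. The blend formula below is the standard "over" operation; the colors and 50% opacity are illustrative assumptions.

```python
def blend(under, over, alpha):
    """Alpha-blend an RGB `over` color onto an `under` color."""
    return tuple(round(o * alpha + u * (1 - alpha))
                 for o, u in zip(over, under))

# A deformed-object pixel (brown) with the unmodified-object outline
# (white) composited over it at 50% opacity in the same window.
deformed_pixel = (139, 69, 19)
outline_color = (255, 255, 255)
shown = blend(deformed_pixel, outline_color, alpha=0.5)
```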
  • the user may wish to change how the deformed object looks.
  • the outline 600 provides the user with a visual depiction of how the object has deformed from its unmodified state, and may be useful for providing an alternate means for editing the deformed object.
  • a user designates a control point within the outline 600 of the undeformed object. For example, the user may want the right leg 301 to appear in front of the left leg 302 .
  • a user designates a control point 700 within the outline 600 of the unmodified gingerbread man.
  • FIG. 7 shows a control point 700 added within the outline 600 of the unmodified gingerbread man. While the control point 700 was added within the outline 600 , it is not necessary that control points be added within a representation of the unmodified image.
  • a control point may be added to an image and associated with an unmodified object, such as by associating the control point with a representation of the unmodified object. This may allow a user greater flexibility when adding control points to an object.
  • Also visible in FIG. 7 is the position the control point 700 ′ would have in the deformed gingerbread man 300 .
  • Because the right leg 301 is located behind the left leg 302 , it may be impossible for a user to spatially select the right leg 301 of the deformed gingerbread man 300 to adjust a parameter associated with the right leg 301 .
  • the user specifies a location in the outline 600 , and sees the corresponding control point 700 ′ in the deformed object. This allows a user to easily interact with an obscured part of a deformed object.
  • the overlaid outline of the unmodified image allows the user to easily shift their focus between modifications made within the outline 600 and the result in the deformed image. For example, a user may adjust a parameter in the outline 600 and view the resulting change to the deformed object.
  • the immediate feedback offered by displaying the two representations of the object in the same window, as well as the reduced need to move the user's field of vision between different windows may aid a user when deforming an object.
  • more room on the display is available for the single window, rather than in a multi-windowed display that may significantly reduce the available screen space to show both the unmodified object and the deformed object.
  • the user may modify a parameter associated with the control point 700 , shown in FIGS. 7 and 8 , to cause a change at the corresponding control point 700 ′ in the deformed object. For example, if the user wishes to display the right leg 301 in front of the left leg 302 , the user may modify a parameter associated with control point 700 , and view the resulting change in the deformed gingerbread man 300 .
  • the overlap parameter includes two values, an overlap value and an extent value.
  • the overlap value specifies a level of “in-frontness” or overlap for a part of an object, while the extent value specifies the region affected by the control point 700 .
  • the overlap parameter may include other values.
  • Because gingerbread man 300 is a two-dimensional object, when two parts of the gingerbread man overlap, the rendering engine 122 may not have a clear indication of which of the overlapping regions should be in the foreground.
  • the overlap value of an overlap parameter may be used by the rendering engine 122 as a pseudo third dimension. For example, before the user adjusts the overlap parameter associated with the control point 700 , the right leg 301 and the left leg 302 may have the same overlap value, or no overlap value, in which case the rendering engine may randomly select which leg to display in the foreground. But if the user increases the overlap value of the right leg's 301 overlap parameter so that it is greater than the overlap value associated with the left leg 302 , the rendering engine 122 displays the right leg 301 in the foreground.
  • the user may decrease the overlap value associated with the right leg 301 to a value less than the left leg's 302 overlap value to ensure that the right leg 301 is obscured by the left leg 302 .
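  • Using the overlap value as a pseudo third dimension could reduce to a back-to-front sort at render time, as sketched here; the part names and the assumption that the renderer paints in ascending overlap order are illustrative.

```python
def draw_order(parts):
    """Return part names sorted back-to-front by overlap value;
    the last name drawn ends up in the foreground."""
    return [name for name, overlap
            in sorted(parts.items(), key=lambda kv: kv[1])]

# After the user raises the right leg's overlap value above the
# left leg's, the right leg is painted last, i.e. in front.
parts = {"left_leg": 0.0, "right_leg": 1.0, "torso": 0.5}
order = draw_order(parts)
```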
  • Other parameters such as starchiness, may be associated with the control point 700 as well, and are discussed in more detail below.
  • a shaded wireframe mesh region surrounds control point 700 .
  • the shading provides the user with a graphical indication of the overlap value. For example, brighter shading may indicate a higher overlap value, while darker shading may indicate a lower overlap value.
  • the graphical indication may provide an easy reference for a user regarding the overlap value without requiring the user to view a numerical value, or perform additional actions to display the overlap value for a region.
  • the relative brightness of the various overlap parameter extents may provide the user with an indication of which elements have greater overlap values with respect to other overlap values. This too may aid a user editing the object.
  • the extent value of the overlap parameter indicates how large of an area is affected by the overlap parameter.
  • FIG. 8 shows a shaded wireframe mesh region surrounding control point 700 .
  • the shaded wireframe mesh region is a graphical representation of the extent value of the overlap parameter.
  • The extent value, as shown in FIG. 8 , is sized to include substantially all of the gingerbread man's 300 right leg 301 ′. This causes the overlap value to be applied to each point within the region defined by the extent value. Because substantially all of the right leg 301 ′ is within the region, the overlap value is applied to substantially all of the right leg 301 . Thus, the right leg 301 is shown in front of the left leg 302 .
  • the extent value defines a radius around the control point 700 ; however, the extent value may not always define a radius. In some embodiments the extent value may define the length of a side of a shape, such as a square or polygon, or a region of arbitrary shape and size.
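  • For the radius case, applying a parameter within an extent is a distance test against each mesh vertex. The helper below is a minimal sketch under that assumption; the vertex list and mapping of vertex index to parameter value are hypothetical.

```python
def apply_in_extent(vertices, center, extent, value, values):
    """Assign `value` to each vertex index whose distance from
    `center` is within `extent`; `values` maps index -> value."""
    cx, cy = center
    for i, (x, y) in enumerate(vertices):
        if ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 <= extent:
            values[i] = value
    return values

# Two of three toy leg vertices fall inside a 1.5-unit extent around
# the control point, so only they receive the overlap value.
leg_vertices = [(0.0, 0.0), (0.5, 1.0), (3.0, 3.0)]
overlap = apply_in_extent(leg_vertices, center=(0.0, 0.0),
                          extent=1.5, value=1.0, values={})
```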
  • control over which leg is displayed in the foreground may be accomplished in other manners as well.
  • a user may add control point 700 to unmodified right leg 301 ′, and increase the overlap value so that the modified right leg 301 is displayed in front of the left leg 302 . If the user later decides it would be better to have the left leg 302 in the foreground, the user may accomplish this in different ways.
  • One method was described above: the user may simply decrease the overlap value associated with the control point 700 until the right leg's 301 ′ overlap value is less than the left leg's overlap value.
  • the user instead may cause the same result by moving control point 700 , with the increased overlap value, over to the left leg 302 ′ in the outline 600 . Because the overlap value is associated with the control point 700 , when the control point 700 is relocated to the left leg 302 ′, the corresponding overlap value is applied to the left leg 302 . This may cause the left leg 302 to be displayed in the foreground. Alternatively, the user may add a new selected point to the left leg 302 ′ and increase an overlap value associated with the new control point until it is greater than the overlap value of the control point 700 in the right leg 301 . These techniques for modifying parameter values, though discussed in relation to an overlap parameter, are equally applicable to modifying other parameters associated with a selected point.
  • the system 100 regenerates the deformed object based at least in part on the deformation and the modified parameter. For example, after the user has modified a parameter associated with the control point 700 , the application 120 executes the deformation engine 121 , if necessary, and the rendering engine 122 to regenerate the deformed object.
  • the deformation engine 121 may not be executed in all cases as all parameters may not be associated with the physical deformation of the object.
  • the overlap parameter may generally indicate which of two or more overlapping areas in the deformed object should be displayed in the foreground, but may not indicate how the object should react to a deformation.
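  • The conditional regeneration step might look like the sketch below: render-only parameters such as overlap skip the deformation engine, while shape-affecting parameters such as starchiness re-run it. The parameter split and all names are illustrative assumptions.

```python
SHAPE_PARAMS = {"starchiness"}  # parameters that change the shape

def regenerate(changed_param, deform_engine, render_engine, state):
    """Re-run the deformation engine only if the changed parameter
    affects the shape; always re-render."""
    if changed_param in SHAPE_PARAMS:
        state = deform_engine(state)
    return render_engine(state)

calls = []
def log(tag):
    return lambda s: calls.append(tag) or s

regenerate("overlap", log("deform"), log("render"), "mesh")
overlap_calls = list(calls)        # overlap: render only
calls.clear()
regenerate("starchiness", log("deform"), log("render"), "mesh")
starchiness_calls = list(calls)    # starchiness: deform, then render
```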
  • the rendering engine determined that the left leg should be displayed in the foreground.
  • the overlap parameter gives the user the ability to specify that the right leg should be displayed in the foreground.
  • While the overlap parameter may affect the appearance of the deformed object, it may not have any effect on the deformation.
  • In other embodiments, however, the deformation engine may deform the object differently when an overlap parameter is modified. Other parameters may affect how the deformation engine 121 deforms the object and are discussed below.
  • FIG. 9 shows the regenerated deformed object based on the modification to the overlap parameter at control point 700 . This shows that a user who has manipulated the overlap parameter may be able to view the result of the modification while the overlaid editing controls are still visible.
  • In FIG. 10 , only the regenerated deformed image is shown. The user may then return to block 202 to deform the object again, or may return to block 206 to select another point to modify.
  • FIGS. 11-18 show a deformation of an object in an image according to one embodiment of the present invention. The figures will be described in relation to the method 200 shown in FIG. 2 and the system shown in FIG. 1 . As discussed previously, a user may deform an object, such as gingerbread man 1100 shown in FIG. 11 , by using a system 100 executing an application 120 for editing an image.
  • the system 100 displays the unmodified object.
  • the user may interact with the application 120 to deform the gingerbread man 1100 , such as by using the input device 103 as described above.
  • system 100 receives a deformation of the object from the user.
  • the user has specified deformation 1200 to move the gingerbread man's 1100 right leg 1101 to the left, similar to the deformation shown in FIG. 4 .
  • Also shown in FIG. 12 is deformation 1201 .
  • Deformation 1201 was not specified by the user, but rather is an indirect deformation caused by the movement of the gingerbread man's 1100 right leg 1101 .
  • Indirect deformations may be caused by the deformation engine's 121 calculations when deforming an object based on a user's deformation, such as deformation 1200 .
  • the indirect deformation will cause the gingerbread man's 1100 head 1103 to tilt to his left, as indicated by arrow 1201 .
  • arrows 1200 , 1201 are not visible to a user in the embodiment shown, but rather are to aid in understanding the deformation of the gingerbread man 1100 . However, such indicators may be visible in one embodiment of the present invention.
  • the deformation engine 121 deforms the gingerbread man 1100 based on the deformation 1200 .
  • deformation engine 121 determines that the right leg 1101 will deform by moving to the left, and that the head 1103 will rotate to the left as well.
  • an application 120 may generate a mesh corresponding to an object. Further, this mesh may be used by the deformation engine 121 to determine how the object will deform based on the input deformation.
  • a mesh comprises a plurality of vertices and a plurality of lines (called edges) connecting the vertices to create a plurality of adjacent polygons.
  • the edges of the polygons may stretch or compress as the object deforms. If this stretching or compression propagates to other areas of the object, those areas may deform as well. Therefore, to reduce or eliminate the propagation of a deformation into a region, a user may modify a starchiness parameter associated with the region.
  • a starchiness parameter comprises two values: a starchiness value, and an extent value.
  • the starchiness value describes the resilience of a location in an object. A greater starchiness value in a location may reduce the amount of indirect deformation caused to the location, while a lower starchiness value may make a location more susceptible to deformation.
  • the extent value defines the size of the area affected by the starchiness parameter, and may be a radius extending from a selected point, or as otherwise described above with respect to the overlap parameter.
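  • One simple way a starchiness value could attenuate indirect deformation is to scale the propagated displacement by (1 - starchiness), so a fully starched region does not move at all. This attenuation formula is an illustrative assumption; the patent does not specify the exact computation.

```python
def propagate(displacement, starchiness):
    """Scale an indirect displacement by a location's starchiness
    in [0, 1]; higher starchiness means less deformation."""
    s = max(0.0, min(1.0, starchiness))
    return (displacement[0] * (1 - s), displacement[1] * (1 - s))

# The head would tilt by (-1.0, 0.2) indirectly; with starchiness 0.9
# it barely moves, and with starchiness 0.0 it moves fully.
stiff_head = propagate((-1.0, 0.2), starchiness=0.9)
limp_head = propagate((-1.0, 0.2), starchiness=0.0)
```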
  • the deformed object is displayed.
  • the deformed gingerbread man 1100 can be seen in FIG. 13 .
  • the right leg 1101 has crossed under the left leg 1102 , and the head has tilted to the left.
  • a user may desire that the head 1103 not deform from its original location, or deform to a lesser degree than is shown in FIG. 13 .
  • the user may adjust a parameter associated with the head 1103 .
  • the user causes the application 120 to display a representation of the unmodified object in the same window.
  • the representation comprises an outline 1400 , which is overlaid on the deformed object in the same window.
  • a user may maintain focus on the deformed object while interacting with the unmodified object. This may provide a more streamlined and efficient workflow for the user.
  • the user may add a control point 1500 within the outline 1400 of the unmodified object as shown in FIG. 15 .
  • a corresponding control point 1500 ′ is displayed on the head 1103 of the deformed gingerbread man 1100 . This may provide a visual cue that the user is editing the correct location on the deformed object. However, in one embodiment, the corresponding control point 1500 ′ is not displayed.
  • the user modifies a starchiness parameter associated with the control point 1500 .
  • the user has increased the starchiness value for the starchiness parameter associated with control point 1500 .
  • the user has selected an extent value such that the starchiness parameter affects substantially the entire head 1103 ′ of the gingerbread man.
  • the gingerbread man's head 1103 may become less susceptible to the indirect deformation caused by the deformation of the right leg 1101 .
  • the deformed object is regenerated.
  • the application 120 executes the deformation engine 121 to deform the gingerbread man 1100 based at least in part on the deformation and the modified parameter.
  • the deformation engine passes the deformed gingerbread man 1100 to the rendering engine, which renders the deformed gingerbread man 1100 .
  • FIG. 17 shows the outline 1400 of the unmodified gingerbread man overlaid on the regenerated deformed gingerbread man 1100 .
  • the head 1103 of the deformed gingerbread man 1100 has not undergone the same amount of indirect deformation as can be seen in FIG. 13, which is apparent relative to the outline 1400 of the unmodified gingerbread man.
  • FIG. 18 shows the deformed gingerbread man 1100 without the outline 1400 .
  • the increased starchiness value associated with the head 1103 has caused the deformation engine to substantially reduce the amount of indirect deformation propagated to the head 1103 from the deformation of the right leg 1101 .
  • the right leg 1101 is displayed behind the left leg 1102.
  • a user may wish to move the right leg 1101 into the foreground.
  • a user may add a second selected point within the outline 1400 corresponding to the right leg 1101′.
  • the user may then modify an overlap parameter associated with the second selected point to cause the right leg 1101 to be displayed in front of left leg 1102 .
  • the user may use a method, such as the embodiment described with respect to FIGS. 6-10, to move the right leg 1101 into the foreground.
  • Still further selected points may be added to outline 1400 to adjust additional parameters.
  • An application for editing an image may allow a user to edit three-dimensional images. For example, in one embodiment, the application displays a three-dimensional object in an editing window. A user may deform the three-dimensional object. The application may execute a deformation engine and a rendering engine to generate a deformed object based at least in part on the deformation and the unmodified object. The user may then cause the application to display an outline of the unmodified object in the editing window, which may be overlaid on the deformed object.
  • an outline of an unmodified object may comprise an outline of one cross section of the unmodified object along one of the X, Y, or Z axes.
  • a user may select one of the three separate cross sections to be displayed.
  • the outline may be a three-dimensional outline, and may be rotatable to provide different perspectives of the unmodified object.
  • the outline may comprise a partially-transparent rendering of the unmodified object, which may allow a user to more easily adjust a perspective of the unmodified object.
  • a user may be able to view an outline of the unmodified object at varying depths. For example, a user may wish to add a control point or a selected point at a location inside the volume of the three-dimensional object. A user may be allowed to navigate into the interior of the object to add a control point or selected point.
  • a computer 101 may comprise a processor 110 or processors.
  • the computer 101 comprises a computer-readable medium, such as a random access memory (RAM), coupled to the processor 110.
  • the processor 110 executes computer-executable program instructions stored in memory 111 , such as executing one or more computer programs for editing an image.
  • processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines.
  • Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
  • Such processors may comprise, or may be in communication with, media, for example computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor.
  • Embodiments of computer-readable media may comprise, but are not limited to, an electronic, optical, magnetic, or other storage or transmission device capable of providing a processor, such as the processor in a web server, with computer-readable instructions.
  • Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read.
  • various other forms of computer-readable media may transmit or carry instructions to a computer, such as a router, private or public network, or other transmission device or channel.
  • the processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures.
  • the processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
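The per-axis cross-section outline described above (an outline of one cross section of a three-dimensional unmodified object along the X, Y, or Z axis) can be sketched as a plane-mesh intersection. This is a minimal illustration under assumed data structures (triangles as tuples of 3-D points); it is not the patent's actual implementation.

```python
def cross_section(triangles, axis, value):
    """Intersect a triangle mesh with the plane axis == value.

    triangles: list of 3-tuples of (x, y, z) vertices.
    axis: 0, 1, or 2 to slice along the X, Y, or Z axis.
    Returns a list of 2-point segments tracing the section outline.
    """
    segments = []
    for tri in triangles:
        points = []
        for i in range(3):
            a, b = tri[i], tri[(i + 1) % 3]
            da, db = a[axis] - value, b[axis] - value
            if da * db < 0:  # this edge crosses the slicing plane
                t = da / (da - db)
                points.append(tuple(a[k] + t * (b[k] - a[k]) for k in range(3)))
        if len(points) == 2:
            segments.append((points[0], points[1]))
    return segments
```

A viewer could then draw the returned segments as the outline at a chosen depth, with one call per axis for the three selectable cross sections.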

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Systems, methods, and computer readable media for editing controls for spatial deformations are described. One embodiment includes a method including the steps of causing an image having at least one unmodified object to be displayed in a window, receiving a deformation of the unmodified object, generating a deformed object based at least in part on the deformation and the unmodified object, and causing the deformed object to be displayed in the window. The method further includes the steps of causing a representation of the unmodified object to be displayed in the window with the deformed object, receiving a selection of a point within the representation of the unmodified object, receiving a modification of a first parameter associated with the unmodified object at the point, regenerating the deformed object based at least in part on the deformation, the unmodified object, and the modified first parameter, and causing the regenerated deformed object to be displayed in the window.

Description

FIELD OF THE INVENTION
The present invention generally relates to editing graphical images, and more specifically relates to spatial deformation of objects in images.
BACKGROUND
Two-dimensional and three-dimensional electronic images may be edited using conventional computer-based image editing tools. For example, a two-dimensional object in an image may be deformed to create successive image frames for use in an animated video. Conventional deformation tools use multiple windows to allow a user to deform a portion of an original image and to view the results of the deformation. For example, a conventional deformation tool for two-dimensional images allows a user to deform an object or objects that make up the image in one window, and to view the resulting deformed objects in the image in a second window. In a conventional deformation tool for three-dimensional images, a user may be presented with four or more windows: one each for deforming the unmodified image in the X, Y, and Z axes, and a fourth in which the deformed image is shown. In such conventional deformation tools, a user must repeatedly shift their focus between multiple windows and multiple representations of the image, a problem referred to as “edit this, look at that” or ETLAT. In addition, the use of multiple windows can significantly reduce the available screen space for the display of the deformed image, or other representations of interest to the user.
SUMMARY
Embodiments of the present invention provide methods and systems for editing controls for spatial deformations. For example, in one illustrative embodiment, a method for deforming an image comprises causing an unmodified image to be displayed in a window, receiving a deformation of the unmodified image, generating a deformed image based at least in part on the deformation and the unmodified image, and causing the deformed image to be displayed in the window. This illustrative method further comprises causing a representation of the unmodified image to be displayed in the window with the deformed image, receiving a selection of a point within the representation of the unmodified image, receiving a modification of a first parameter associated with the unmodified image at the point, regenerating the deformed image based at least in part on the deformation, the unmodified image, and the first parameter, and causing the regenerated deformed image to be displayed in the window. In another embodiment, a computer-readable medium comprises program code for carrying out such a method.
These illustrative embodiments are mentioned not to limit or define the invention, but to provide examples to aid understanding thereof. Illustrative embodiments are discussed in the Detailed Description, and further description of the invention is provided there. Advantages offered by various embodiments of this invention may be further understood by examining this specification.
BRIEF DESCRIPTION OF THE DRAWINGS
These and other features, aspects, and advantages of the present invention are better understood when the following Detailed Description is read with reference to the accompanying drawings, wherein:
FIG. 1 shows a system for editing an image according to one embodiment of the present invention;
FIG. 2 shows a flowchart illustrating a method for editing an image according to one embodiment of the present invention;
FIGS. 3-10 show an editing window and an object according to one embodiment of the present invention; and
FIGS. 11-18 show an editing window and an object according to one embodiment of the present invention.
DETAILED DESCRIPTION
Embodiments of the present invention provide methods, systems and products for editing controls for spatial deformations. Methods, systems and products according to the present invention may be embodied in a number of ways. Certain embodiments of the present invention may, for example, reduce or eliminate ETLAT, streamline a graphical artist's workflow when editing an image, and/or make more efficient use of available space on a display screen, allowing the user to work at a higher resolution.
Illustrative Editing Tool
In one illustrative embodiment of the present invention, a computer system running computer software for editing two-dimensional images displays an editing window. Within the window, an unmodified, two-dimensional image containing an object, such as a gingerbread man, is displayed. The editing tool analyzes the image and generates a standard tessellated mesh to define the object, and to provide parameters that affect the response of the object to a deformation.
A user may deform the object by selecting a point on the image, such as a point on the gingerbread man's leg, and dragging that point to another location in the window, which causes the leg to move and the image to deform. The deformation engine calculates the effect of the movement of the leg on the rest of the image. For example, by moving the leg, the editing tool may calculate that the leg stretches somewhat, the torso is pulled in the direction of the stretched leg, and still other parts of the gingerbread man may be stretched or compressed. The editing tool uses a rendering engine to generate the visual representation of the image, and then displays the deformed image. However, since the image is two-dimensional, if a part of the gingerbread man is moved so that it overlaps with another part of the gingerbread man, the editing tool may not display the desired part of the object in the foreground. For example, if the left leg of the gingerbread man is deformed so that it overlaps with the gingerbread man's right leg, the rendering engine may simply randomly determine which leg should be displayed in the foreground.
To allow a user to easily modify the deformed object, one illustrative embodiment of the present invention provides the user with a representation, such as an outline, of the undeformed object overlaid on the deformed object. The user is able to adjust parameters in the outline of the unmodified object that result in changes to the display of the deformed object. For example, the user may adjust an “overlap” parameter associated with the left leg in the outline to specify that it should be displayed in the foreground. The deformed image, in which the right and left legs overlap, can then be immediately updated to reflect the change made to the unmodified image.
This example is given to introduce the reader to the general subject matter discussed herein. The invention is not limited to this example. Further details regarding various embodiments of products and methods for controlling aspects of a deformation of two-dimensional and three-dimensional images are described below.
Illustrative System
Referring now to the drawings in which like numerals refer to like elements throughout the several figures, FIG. 1 shows a system 100 for editing an image according to one embodiment of the present invention. System 100 comprises a computer 101, a display device 102, and an input device 103. Computer 101 comprises a processor 110 and a computer-readable medium, such as a memory 111, and is in communication with the display device 102 and the input device 103. Memory 111 comprises an application 120 for editing an image, and an image 130. Application 120 comprises a deformation engine 121 and a rendering engine 122. The processor 110 executes the application 120 to allow a user to edit an image.
For example, a user of system 100 may use the application 120 to retrieve image 130 from storage, load it into memory 111, and display it on the display device 102. In general, an image, such as image 130, includes one or more deformable objects, which, when initially displayed, are in an undeformed state. A user of system 100 may then deform one or more of the objects by manipulating it with the input device 103. Deformation engine 121 then deforms the object based on the deformation, and passes the deformed object to the rendering engine 122. The rendering engine 122 renders the deformed object, and the system 100 displays the modified image, including the deformed object, on the display device 102.
After the deformed object has been displayed, a user may further edit the object by modifying parameters associated with the deformed object. For example, if the deformation caused one part of the object to overlap with another part, the user may wish to specify which of the two overlapping areas should be displayed in the foreground. System 100 can display a representation, such as an outline, of the undeformed object overlaid on the deformed object to help the user select a part of the object to modify. The user may then select a point within the outline of the unmodified object, rather than within the deformed object, and change a parameter that affects which of the overlapping parts should be displayed in the foreground, and which should be obscured. The change made to the parameter within the outline causes system 100 to execute deformation engine 121 to deform the object again; this time the deformation engine 121 deforms the object based at least in part on the changed parameter. The deformation engine 121 then passes the deformed object to the rendering engine 122, which renders the object. System 100 displays the deformed object with the desired area in the foreground.
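The regenerate-on-edit cycle just described can be summarized in a small sketch. The class and callback names below are hypothetical stand-ins for the application 120, deformation engine 121, and rendering engine 122; the point is that a parameter change on the unmodified object triggers a full regeneration of the deformed object in the same window.

```python
class DeformationEditor:
    """Sketch of the single-window edit cycle: any change to the deformation
    or to a parameter re-runs the deformation and rendering steps."""

    def __init__(self, unmodified, deform_fn, render_fn, display_fn):
        self.unmodified = unmodified   # the object's undeformed state
        self.deform_fn = deform_fn     # stands in for deformation engine 121
        self.render_fn = render_fn     # stands in for rendering engine 122
        self.display_fn = display_fn   # draws into the single editing window
        self.deformation = None
        self.parameters = {}           # (point, name) -> value

    def apply_deformation(self, deformation):
        self.deformation = deformation
        self._regenerate()

    def set_parameter(self, point, name, value):
        # Parameters attach to the unmodified object; the deformed object
        # is regenerated rather than edited directly.
        self.parameters[(point, name)] = value
        self._regenerate()

    def _regenerate(self):
        deformed = self.deform_fn(self.unmodified, self.deformation,
                                  self.parameters)
        self.display_fn(self.render_fn(deformed))
```

Because both representations live in one window, each `set_parameter` call gives immediate visual feedback without the focus shifts of a multi-window ETLAT workflow.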
Illustrative Methods
Further embodiments of the present invention will be described in more detail below with respect to FIGS. 2-18. For example, FIG. 2 shows a flowchart illustrating a method 200 for editing an image according to one embodiment of the present invention, which will be described with reference to the system 100 discussed above. FIGS. 3-9 show a gingerbread man in various states of deformation.
The method 200 begins in block 201, where the system 100 displays an unmodified object. For example, a user may execute an image editing application 120 on the system 100 to display an image 130, such as a gingerbread man 300 as shown in FIG. 3, on a display device 102 in an editing window 123. To display the gingerbread man 300, the user may start the application 120 and load the gingerbread man 300 from storage, such as from a hard drive, into memory 111. The system 100 can then display 201 the image 130 on the display device 102.
In the embodiment shown in FIG. 3, the gingerbread man 300 is the only object in the image 130, though other objects could be included in the image 130 as well. Initially, when the image 130 is displayed, it has zero or more unmodified objects. Each of the objects within the image 130 may be subsequently deformed, but the initial state of each object when loaded into the application 120 is the object's unmodified state.
However, a user may edit an image 130 for a time, then save the image 130 to storage for later use and close the application 120. The next time the user loads the image from storage, the unmodified state of the objects in the image 130 may be different. Various embodiments of the invention may handle this situation differently. For example, one embodiment of the present invention may determine an unmodified state for each object, e.g., the state of the object when the object is created. As the user repeatedly edits, saves, and reloads the image 130, the initial unmodified state of each object within the image may be stored with the deformed image. Thus, the next time the user loads the image 130 from storage, even after many iterations of editing, the initial unmodified state of each object is loaded along with the deformed state of each object.
Other embodiments may allow a user to select a new unmodified state for an object. For example, the user may create an image having an object, in which the object has an initial unmodified state. The user may edit the object, and reach a configuration of the object that is more desirable. The user may then use the application 120 to store the new, desirable state as the new unmodified state. This may provide the user with a new reference point for future edits of the object. Still further embodiments would be apparent to one of ordinary skill in the art.
Some embodiments of the present invention may also perform additional steps when an image 130 is loaded from storage. For example, in one embodiment of the present invention, an image editing application 120 may generate a tessellated mesh for each object in an image, or such a mesh may be saved as a part of the image 130. An object, such as a gingerbread man 300, may be represented by the application 120 as a grouping of adjoining polygons, such as triangles, that make up a mesh. The shape of the mesh generally conforms to the shape of the object, and provides data to allow the deformation engine 121 and the rendering engine 122 to process the object. For example, the deformation engine 121 may use the mesh to determine how a deformation causes the shape of an object to change. The rendering engine 122 may use the mesh to apply textures, shadows, shading, or other features to the rendered object.
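A minimal tessellated-mesh structure along the lines described above might look like the following. The `Mesh` class and `edge_lengths` helper are illustrative assumptions, not the patent's data model; they show how a triangle mesh can expose the edge measurements a deformation engine would use to detect stretch or compression.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Mesh:
    """Sketch of a tessellated mesh: 2-D vertices plus triangles that index them."""
    vertices: list = field(default_factory=list)   # (x, y) pairs
    triangles: list = field(default_factory=list)  # (i, j, k) vertex indices

    def add_vertex(self, x, y):
        self.vertices.append((x, y))
        return len(self.vertices) - 1

    def add_triangle(self, i, j, k):
        self.triangles.append((i, j, k))

def edge_lengths(mesh):
    """Collect each unique edge's length; comparing these before and after a
    deformation tells an engine how much each edge stretched or compressed."""
    lengths = {}
    for i, j, k in mesh.triangles:
        for a, b in ((i, j), (j, k), (k, i)):
            key = (min(a, b), max(a, b))  # undirected edge
            (x1, y1), (x2, y2) = mesh.vertices[a], mesh.vertices[b]
            lengths[key] = math.hypot(x2 - x1, y2 - y1)
    return lengths
```

A renderer could likewise walk `mesh.triangles` to apply textures or shading per polygon.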
While the embodiments discussed herein have referred to software deformation engines and rendering engines, other embodiments may comprise software and/or hardware deformation engines and rendering engines. For example, in some embodiments, deformation engine 121 and/or rendering engine 122 may comprise specialized hardware configured to efficiently process a tessellated mesh.
In block 202, a user interacts with the application 120 to indicate a deformation of the object. For example, a user may specify that the right leg 301 of the gingerbread man 300 should be moved to the left. FIG. 4 shows this deformation as an arrow 400.
A user may deform an object in a variety of ways. For example, in one embodiment of the present invention, a user uses a mouse to click on a point on the object, and drag the point to a new location. In such an embodiment, the application 120 interprets this input as a deformation of the object. In another embodiment, a user may add one or more control points to the object.
A control point, in one embodiment of the present invention, may be added to an object to provide a user with a point at which to manipulate the object. The control point may be associated with the unmodified object, and allow the user to use the control point to deform the object. In one embodiment, one or more control points may be associated with one or more vertices within a mesh corresponding to the object. In such an embodiment, a user may not be able to deform the object from any arbitrary point, but only by manipulating one of the control points.
Further, control points may have one or more associated parameters. For example, in one embodiment, a control point may have an overlap parameter and a starchiness parameter associated with it. A user may manipulate the one or more parameters to cause a change in the appearance of the object. A more detailed description of types of parameters and their effect on an object is described below.
For example, in one embodiment of the present invention, a user specifies a control point 310 and a movement path. The movement path specifies how the control point 310 should move over one or more frames of animation. For each frame in which the control point 310 moves along the movement path, the object may be deformed. For example, the arrow 400 in FIG. 4 may specify a movement path for several frames of animation. For each frame of animation, the leg moves a distance along the path. Each frame may comprise a deformed gingerbread man with its leg at different states of travel along the movement path. Such an embodiment may allow an animator to more easily animate an object, rather than by manually creating multiple, successive deformations.
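The per-frame travel of a control point along a movement path can be sketched with simple linear interpolation. `frames_along_path` is a hypothetical helper, not the patent's API; a real tool might support curved paths or easing, but the idea of one position per frame is the same.

```python
def frames_along_path(start, end, n_frames):
    """Return one control-point position per animation frame, moving
    linearly from start to end over n_frames frames."""
    positions = []
    for f in range(n_frames):
        # t runs from 0.0 (start) to 1.0 (end of the movement path)
        t = f / (n_frames - 1) if n_frames > 1 else 1.0
        positions.append((start[0] + t * (end[0] - start[0]),
                          start[1] + t * (end[1] - start[1])))
    return positions
```

Each returned position would drive one deformation, so the leg in the example travels a little further along the arrow 400 in every successive frame.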
After receiving the deformation, the method 200 continues to block 203, where the unmodified object is deformed. Once the user has specified a deformation for an object, the system 100 executes deformation engine 121 to determine a new shape for the object based at least in part on the unmodified object and the deformation. As discussed above, the image editing application 120 may generate a tessellated mesh corresponding to the gingerbread man 300. The deformation engine 121, according to one embodiment of the present invention, may determine how one of the gingerbread man's 300 legs should change shape based on the deformation received from the user. In addition, the deformation of the leg may also cause other parts of the gingerbread man 300 to change shape, as is discussed in greater detail below. Once the deformation engine 121 has finished deforming the object, it passes the deformed object to the rendering engine 122, which renders the image.
System 100 then displays the deformed gingerbread man 300 in the editing window 123, as shown in block 204 and in FIG. 5. As can be seen in FIG. 5, the gingerbread man's 300 right leg 301 has crossed underneath its left leg 302 based on the user's specified deformation 400.
In block 205, the user may command the system 100 to display a representation of the undeformed object in the same window as the deformed object. As shown in FIG. 6, for example, a representation, such as an outline 600 of the undeformed object, is displayed in the same window 123 as the deformed object. For example, the user may select an option from a context-sensitive menu to display the outline in the editing window 123. In another embodiment, the outline may automatically appear. This allows the user to view in the same window the outline 600 of the unmodified object and the deformed object. The user can then make a modification to the undeformed object, represented by the outline 600, and view the resulting change to the deformed object without shifting focus to another window.
While the representation is shown as an outline 600, other representations are possible, and are within the scope of this invention. For example, in one embodiment of the present invention, the outline may be optionally filled with a partially-transparent color. In one embodiment of the present invention, the outline 600 may comprise a partially transparent, fully-rendered image of the unmodified object. In another embodiment, the representation may comprise a wire-mesh representation of the unmodified object. Still further embodiments may comprise other representations of the unmodified object that would be apparent to one of ordinary skill in the art.
In the embodiment shown in FIG. 2 and FIG. 6, the outline 600 of the undeformed object is overlaid on the deformed object; however, in one embodiment, the outline 600 may be offset from the deformed object, though still displayed in the same window 123. After the user has deformed the gingerbread man 300, the user may wish to change how the deformed object looks. The outline 600 provides the user with a visual depiction of how the object has deformed from its unmodified state, and may be useful for providing an alternate means for editing the deformed object.
In block 206, a user designates a control point within the outline 600 of the undeformed object. For example, the user may want the right leg 301 to appear in front of the left leg 302. To accomplish this task, a user designates a control point 700 within the outline 600 of the unmodified gingerbread man. FIG. 7 shows a control point 700 added within the outline 600 of the unmodified gingerbread man. While the control point 700 was added within the outline 600, it is not necessary that control points be added within a representation of the unmodified image. A control point may be added to an image and associated with an unmodified object, such as by associating the control point with a representation of the unmodified object. This may allow a user greater flexibility when adding control points to an object.
Also visible in FIG. 7 is the position the control point 700′ would have in the deformed gingerbread man 300. As can be seen, because the right leg 301 is located behind the left leg 302, it may be impossible for a user to spatially select the right leg 301 of the deformed gingerbread man 300 to adjust a parameter associated with the right leg 301. Thus, it may be necessary for a user to interact with the corresponding location in the outline 600. By adding control point 700, the user specifies a location in the outline 600, and sees the corresponding control point 700′ in the deformed object. This allows a user to easily interact with an obscured part of a deformed object.
In addition, the overlaid outline of the unmodified image allows the user to easily shift their focus between modifications made within the outline 600 and the result in the deformed image. For example, a user may adjust a parameter in the outline 600 and view the resulting change to the deformed object. The immediate feedback offered by displaying the two representations of the object in the same window, as well as the reduced need to move the user's field of vision between different windows, may aid a user when deforming an object. Further, by displaying the unmodified object and the deformed object in the same window, more room on the display is available for the single window, rather than in a multi-windowed display that may significantly reduce the available screen space to show both the unmodified object and the deformed object.
In block 207, the user may modify a parameter associated with the control point 700, shown in FIGS. 7 and 8, to cause a change at the corresponding control point 700′ in the deformed object. For example, if the user wishes to display the right leg 301 in front of the left leg 302, the user may modify a parameter associated with control point 700, and view the resulting change in the deformed gingerbread man 300.
One of the parameters that may be associated with a control point 700 is an overlap parameter. In the embodiment shown, the overlap parameter includes two values, an overlap value and an extent value. The overlap value specifies a level of “in-frontness” or overlap for a part of an object, while the extent value specifies the region affected by the control point 700. In other embodiments, the overlap parameter may include other values.
Because gingerbread man 300 is a two-dimensional object, when two parts of the gingerbread man overlap, the rendering engine 122 may not have a clear indication of which of the overlapping regions should be in the foreground. The overlap value of an overlap parameter may be used by the rendering engine 122 as a pseudo third dimension. For example, before the user adjusts the overlap parameter associated with the control point 700, the right leg 301 and the left leg 302 may have the same overlap value, or no overlap value. In that case, the rendering engine may randomly select which leg to display in the foreground. But if the user increases the overlap value of the right leg's 301 overlap parameter so that it is greater than the overlap value associated with the left leg 302, the rendering engine 122 displays the right leg 301 in the foreground. Conversely, the user may decrease the overlap value associated with the right leg 301 to a value less than the left leg's 302 overlap value to ensure that the right leg 301 is obscured by the left leg 302. Other parameters, such as starchiness, may be associated with the control point 700 as well, and are discussed in more detail below.
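The overlap value acting as a pseudo third dimension amounts to a back-to-front sort at render time. The sketch below assumes a simple painter's-algorithm renderer; `draw_order` is an illustrative helper, not the patent's rendering engine.

```python
def draw_order(parts):
    """Return part names back-to-front: the lowest overlap value is drawn
    first, so the highest-overlap part ends up in the foreground.

    parts: list of (name, overlap_value) pairs. Python's sort is stable,
    so parts with equal overlap keep their original order, standing in for
    the renderer's otherwise unspecified choice when values tie."""
    return [name for name, overlap in sorted(parts, key=lambda p: p[1])]
```

For example, raising the right leg's overlap value above the left leg's moves it to the end of the draw order, i.e. into the foreground:

```python
draw_order([("right_leg", 2.0), ("left_leg", 1.0), ("torso", 0.0)])
```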
As can be seen in FIG. 8, a shaded wireframe mesh region surrounds control point 700. In the embodiment shown, the shading provides the user with a graphical indication of the overlap value. For example, brighter shading may indicate a higher overlap value, while darker shading may indicate a lower overlap value. The graphical indication may provide an easy reference for a user regarding the overlap value without requiring the user to view a numerical value, or perform additional actions to display the overlap value for a region. In addition, if multiple overlap values are shown within the outline 600, the relative brightness of the various overlap parameter extents may provide the user with an indication of which elements have greater overlap values relative to the others. This too may aid a user editing the object.
The extent value of the overlap parameter indicates how large of an area is affected by the overlap parameter. FIG. 8 shows a shaded wireframe mesh region surrounding control point 700. The shaded wireframe mesh region is a graphical representation of the extent value of the overlap parameter. The extent value, as shown in FIG. 8, is sized to include substantially all of the gingerbread man's 300 right leg 301′. This causes the overlap value to be applied to each point within the region defined by the extent value. Because substantially all of the right leg 301′ is within the region, the overlap value is applied to substantially all of the right leg 301. Thus, the right leg 301 is shown in front of the left leg 302. However, if the extent value were reduced so that only part of the right leg 301′ were affected by the overlap parameter, only the portions of the right leg 301 affected by the overlap parameter would be displayed in front of the left leg 302, which may have an undesirable result in the displayed gingerbread man.
As shown in FIG. 8, the extent value defines a radius around the control point 700; however, the extent value may not always define a radius. In some embodiments, the extent value may define the length of a side of a shape, such as a square or polygon, or a region of arbitrary shape and size.
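As a sketch of how an extent value might select the affected region under either a radial or a square interpretation, the following helper may be considered; the name `points_in_extent` and its `shape` argument are hypothetical choices for illustration, not part of the disclosure.

```python
import math

def points_in_extent(points, control_point, extent, shape="radius"):
    # Return the points affected by a parameter. The extent is interpreted
    # either as a radius around the control point or, for a square region,
    # as half the side length of an axis-aligned square.
    cx, cy = control_point
    affected = []
    for x, y in points:
        if shape == "radius":
            inside = math.hypot(x - cx, y - cy) <= extent
        else:
            inside = abs(x - cx) <= extent and abs(y - cy) <= extent
        if inside:
            affected.append((x, y))
    return affected
```

A parameter's value (such as an overlap value) would then be applied to every point this membership test returns.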
While methods for modifying an overlap parameter associated with a control point were described above, control over which leg is displayed in the foreground may be accomplished in other manners as well. A user may add control point 700 to unmodified right leg 301′, and increase the overlap value so that the modified right leg 301 is displayed in front of the left leg 302. If the user later decides it would be better to have the left leg 302 in the foreground, the user may accomplish this in different ways. One method was described above: the user may simply decrease the overlap value associated with the control point 700 until the right leg's 301′ overlap value is less than the left leg's overlap value. But the user may instead achieve the same result by moving control point 700, with the increased overlap value, over to the left leg 302′ in the outline 600. Because the overlap value is associated with the control point 700, when the control point 700 is relocated to the left leg 302′, the corresponding overlap value is applied to the left leg 302. This may cause the left leg 302 to be displayed in the foreground. Alternatively, the user may add a new control point to the left leg 302′ and increase an overlap value associated with the new control point until it is greater than the overlap value of the control point 700 in the right leg 301. These techniques for modifying parameter values, though discussed in relation to an overlap parameter, are equally applicable to modifying other parameters associated with a selected point.
In block 208, the system 100 regenerates the deformed object based at least in part on the deformation and the modified parameter. For example, after the user has modified a parameter associated with the control point 700, the application 120 executes the deformation engine 121, if necessary, and the rendering engine 122 to regenerate the deformed object. The deformation engine 121 may not be executed in all cases as all parameters may not be associated with the physical deformation of the object. The overlap parameter may generally indicate which of two or more overlapping areas in the deformed object should be displayed in the foreground, but may not indicate how the object should react to a deformation.
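The conditional regeneration described in block 208 may be sketched as follows, assuming dictionary-based parameters with a hypothetical `affects_deformation` flag and the deformation and rendering engines passed in as plain functions; none of these names appear in the patent itself.

```python
def regenerate(unmodified, deformation, parameters, last_deformed,
               deformation_engine, rendering_engine):
    # Re-run the deformation engine only when a modified parameter (such as
    # starchiness) affects the geometry; a render-only parameter (such as
    # overlap) reuses the previously deformed geometry.
    if any(p.get("affects_deformation") for p in parameters):
        deformed = deformation_engine(unmodified, deformation, parameters)
    else:
        deformed = last_deformed
    return rendering_engine(deformed, parameters)
```

This mirrors the observation above that modifying an overlap parameter need not re-execute the deformation engine, while a geometry-affecting parameter does.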
For example, in FIG. 7, the rendering engine determined that the left leg should be displayed in the foreground. However, the overlap parameter gives the user the ability to specify that the right leg should be displayed in the foreground. Thus, while the overlap parameter may affect the appearance of the deformed object, it may not have any effect on the deformation. In some embodiments, though, the deformation engine may deform the object differently when an overlap parameter is modified. Other parameters may affect how the deformation engine 121 deforms the object and are discussed below.
In block 209, after the deformed image has been regenerated, the system 100 displays the regenerated deformed object on a display device 102 as shown in FIGS. 9 and 10. FIG. 9 shows the regenerated deformed object based on the modification to the overlap parameter at control point 700. This shows that a user who has manipulated the overlap parameter may be able to view the result of the modification while the overlaid editing controls are still visible. In FIG. 10, only the regenerated deformed image is shown. The user may then return to block 202 to deform the object again, or may return to block 206 to select another point to modify.
Further permutations are contemplated in other embodiments of the present invention, and the blocks described above may be performed in different orders. In addition, additional parameters may be edited by a user according to other embodiments of the present invention. For example, FIGS. 11-18 show a deformation of an object in an image according to one embodiment of the present invention. The figures will be described in relation to the method 200 shown in FIG. 2 and the system shown in FIG. 1. As discussed previously, a user may deform an object, such as gingerbread man 1100 shown in FIG. 11, by using a system 100 executing an application 120 for editing an image.
In block 201, the system 100 displays the unmodified object. The user may interact with the application 120 to deform the gingerbread man 1100, such as by using the input device 103 as described above.
In block 202, system 100 receives a deformation of the object from the user. In the embodiment shown in FIG. 12, the user has specified deformation 1200 to move the gingerbread man's 1100 right leg 1101 to the left, similar to the deformation shown in FIG. 4. But FIG. 12 also shows deformation 1201. Deformation 1201 was not specified by the user, but rather is an indirect deformation caused by the movement of the gingerbread man's 1100 right leg 1101. Indirect deformations may be caused by the deformation engine's 121 calculations when deforming an object based on a user's deformation, such as deformation 1200. In the embodiment shown in FIG. 12, the indirect deformation will cause the gingerbread man's 1100 head 1103 to tilt to his left, as indicated by arrow 1201. It should be noted that arrows 1200, 1201 are not visible to a user in the embodiment shown, but rather are included to aid in understanding the deformation of the gingerbread man 1100. However, such indicators may be visible in one embodiment of the present invention.
In block 203, the deformation engine 121 deforms the gingerbread man 1100 based on the deformation 1200. In the embodiment shown, deformation engine 121 determines that the right leg 1101 will deform by moving to the left, and that the head 1103 will rotate to the left as well.
As was discussed earlier, in some embodiments of the present invention, an application 120 may generate a mesh corresponding to an object. Further, this mesh may be used by the deformation engine 121 to determine how the object will deform based on the input deformation.
A mesh comprises a plurality of vertices and a plurality of lines (called edges) connecting the vertices to create a plurality of adjacent polygons. The edges of the polygons may stretch or compress as the object deforms. If this stretching or compression propagates to other areas of the object, those areas may deform as well. Therefore, to reduce or eliminate the propagation of a deformation into a region, a user may modify a starchiness parameter associated with the region.
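A minimal illustration of such a mesh follows, together with a helper that measures how much an edge stretches or compresses under a deformation. The `edge_strain` function and the specific triangle are assumptions made for this sketch; the patent does not prescribe any particular strain measure.

```python
import math

# A minimal mesh for illustration: three vertices joined by edges into one triangle.
vertices = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
edges = [(0, 1), (1, 2), (2, 0)]

def edge_strain(original, deformed, edge):
    # Relative change in an edge's length: > 0 stretched, < 0 compressed,
    # 0 when the edge is undisturbed by the deformation.
    a, b = edge
    def length(vs):
        (ax, ay), (bx, by) = vs[a], vs[b]
        return math.hypot(bx - ax, by - ay)
    return length(deformed) / length(original) - 1.0
```

Strain propagating along shared edges is what carries a deformation into neighboring regions of the mesh, which is exactly what a starchiness parameter is intended to resist.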
A starchiness parameter comprises two values: a starchiness value, and an extent value. The starchiness value describes the resilience of a location in an object. A greater starchiness value in a location may reduce the amount of indirect deformation caused to the location, while a lower starchiness value may make a location more susceptible to deformation. The extent value defines the size of the area affected by the starchiness parameter, and may be a radius extending from a selected point, or as otherwise described above with respect the overlap parameter. Thus, by adjusting a starchiness parameter associated with a location, a user may affect how the object deforms.
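One plausible way a starchiness parameter could damp indirect deformation is to scale down displacement vectors for vertices inside the parameter's extent. The function `apply_starchiness`, the radial extent, and the linear scaling below are illustrative assumptions, not the disclosed implementation.

```python
import math

def apply_starchiness(vertices, displacements, selected_point, starchiness, extent):
    # Damp indirect displacements for vertices within the parameter's extent.
    # starchiness = 1.0 pins the region in place; 0.0 leaves it fully
    # susceptible to propagated deformation.
    cx, cy = selected_point
    damped = []
    for (x, y), (dx, dy) in zip(vertices, displacements):
        if math.hypot(x - cx, y - cy) <= extent:
            dx, dy = dx * (1.0 - starchiness), dy * (1.0 - starchiness)
        damped.append((dx, dy))
    return damped
```

Under this sketch, increasing the starchiness value around the head would shrink the head's indirect displacement while leaving the directly deformed leg untouched.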
In block 204, the deformed object is displayed. The deformed gingerbread man 1100 can be seen in FIG. 13. As suggested by arrows 1200, 1201, the right leg 1101 has crossed under the left leg 1102, and the head has tilted to the left. However, a user may desire that the head 1103 not deform from its original location, or deform to a lesser degree than is shown in FIG. 13. To address this unwanted deformation, the user may adjust a parameter associated with the head 1103.
In block 205, the user causes the application 120 to display a representation of the unmodified object in the same window. In the embodiment shown in FIG. 14, the representation comprises an outline 1400, which is overlaid on the deformed object in the same window. As discussed earlier, by displaying the outline 1400 of the unmodified image in the same window as the deformed image, a user may maintain focus on the deformed object while interacting with the unmodified object. This may provide a more streamlined and efficient workflow for the user.
In block 206, the user may add a control point 1500 within the outline 1400 of the unmodified object as shown in FIG. 15. As can be seen, a corresponding control point 1500′ is displayed on the head 1103 of the deformed gingerbread man 1100. This may provide a visual cue that the user is editing the correct location on the deformed object. However, in one embodiment, the corresponding control point 1500′ is not displayed.
In block 207, the user modifies a starchiness parameter associated with the control point 1500. In the embodiment shown in FIG. 16, the user has increased the starchiness value for the starchiness parameter associated with control point 1500. In addition, the user has selected an extent value such that the starchiness parameter affects substantially the entire head 1103′ of the gingerbread man. By increasing the starchiness value associated with control point 1500, the gingerbread man's head 1103 may become less susceptible to the indirect deformation caused by the deformation of the right leg 1101.
In block 208, the deformed object is regenerated. In one embodiment, the application 120 executes the deformation engine 121 to deform the gingerbread man 1100 based at least in part on the deformation and the modified parameter. The deformation engine passes the deformed gingerbread man 1100 to the rendering engine, which renders the deformed gingerbread man 1100.
In block 209, the application 120 displays the regenerated deformed object, as can be seen in FIGS. 17 and 18. FIG. 17 shows the outline 1400 of the unmodified gingerbread man overlaid on the regenerated deformed gingerbread man 1100. As can be seen, particularly relative to the outline 1400 of the unmodified gingerbread man, the head 1103 of the deformed gingerbread man 1100 has not undergone the same amount of indirect deformation that can be seen in FIG. 13. FIG. 18 shows the deformed gingerbread man 1100 without the outline 1400. As can be seen, the increased starchiness value associated with the head 1103 has caused the deformation engine to substantially reduce the amount of indirect deformation propagated to the head 1103 from the deformation of the right leg 1101.
In the embodiment shown in FIGS. 13-18, the right leg 1101 is displayed behind the left leg 1102. However, a user may wish to move the right leg 1101 into the foreground. Using one embodiment of the present invention, a user may add a second selected point within outline 1400 corresponding to the right leg 1101′. The user may then modify an overlap parameter associated with the second selected point to cause the right leg 1101 to be displayed in front of the left leg 1102. For example, the user may use a method, such as the embodiment described with respect to FIGS. 6-10, to move the right leg 1101 into the foreground. Still further selected points may be added to outline 1400 to adjust additional parameters.
While embodiments of the present invention have been described for editing controls for two-dimensional objects, embodiments of the invention also include corresponding editing controls for three-dimensional objects. An application for editing an image may allow a user to edit three-dimensional images. For example, in one embodiment, the application displays a three-dimensional object in an editing window. A user may deform the three-dimensional object. The application may execute a deformation engine and a rendering engine to generate a deformed object based at least in part on the deformation and the unmodified object. The user may then cause the application to display an outline of the unmodified object in the editing window, which may be overlaid on the deformed object.
In a three-dimensional editing tool, an outline of an unmodified object may comprise an outline of one cross section of the unmodified object along one of the X, Y, or Z axes. In such an embodiment, a user may select which of the three cross sections is to be displayed. In one embodiment, the outline may be a three-dimensional outline, and may be rotatable to provide different perspectives of the unmodified object. Further, the outline may comprise a partially-transparent rendered object of the unmodified object, which may allow a user to more easily adjust a perspective of the unmodified object. In one embodiment of the present invention for editing three-dimensional objects, a user may be able to view an outline of the unmodified object at varying depths. For example, a user may wish to add a control point or a selected point at a location inside the volume of the three-dimensional object. A user may be allowed to navigate into the interior of the object to add a control point or selected point.
Referring again to FIG. 1, embodiments of the present invention can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. In one embodiment, a computer 101 may comprise a processor 110 or processors. The processor 110 comprises a computer-readable medium, such as a random access memory (RAM) coupled to the processor. The processor 110 executes computer-executable program instructions stored in memory 111, such as executing one or more computer programs for editing an image. Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines. Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
Such processors may comprise, or may be in communication with, media, for example computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor. Embodiments of computer-readable media may comprise, but are not limited to, an electronic, optical, magnetic, or other storage or transmission device capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. Also, various other forms of computer-readable media may transmit or carry instructions to a computer, such as a router, private or public network, or other transmission device or channel. The processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
General
The foregoing description of the embodiments, including preferred embodiments, of the invention has been presented only for the purpose of illustration and description and is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Numerous modifications and adaptations thereof will be apparent to those skilled in the art without departing from the spirit and scope of the invention.

Claims (28)

1. A method for editing an image, comprising:
using a processor to cause an image comprising at least one unmodified object to be displayed in a window;
receiving, by a processor, a deformation of the unmodified object;
using a processor to generate a deformed object based at least in part on the deformation and the unmodified object;
using a processor to cause the deformed object to be displayed in the window;
using a processor to cause a representation of the unmodified object to be displayed in the window with the deformed object;
receiving, by a processor, a selection of a first point associated with the unmodified object while the representation of the unmodified object and the deformed object are displayed in the window;
receiving, by a processor, a modification of a first parameter at the first point while the representation of the unmodified object and the deformed object are displayed in the window;
using a processor to generate a first regenerated deformed object based at least in part on the deformation, the unmodified object, and the modification of the first parameter; and
using a processor to cause the first regenerated deformed object to be displayed in the window.
2. The method of claim 1, wherein receiving the selection of the first point further comprises receiving a selection of the first point within the representation of the unmodified object.
3. The method of claim 1, wherein the representation comprises an outline of the unmodified object and causing a representation of the unmodified object to be displayed in the window with the deformed object further comprises overlaying the outline of the unmodified object on the deformed object.
4. The method of claim 1, wherein the first parameter comprises an overlap parameter or a starchiness parameter.
5. The method of claim 4, wherein the overlap parameter indicates which of two or more overlapping areas of the deformed object should be visible, and the starchiness parameter indicates a resistance to deforming for an area of the unmodified object.
6. The method of claim 1, wherein the first parameter comprises a magnitude and an extent.
7. The method of claim 1, further comprising using a processor to cause a graphical indication of a magnitude and an extent of the first parameter to be displayed within the representation.
8. The method of claim 1, wherein the deformation comprises at least one of a stretching or a compression of the unmodified object.
9. The method of claim 1, further comprising:
receiving, by a processor, a selection of a second point associated with the unmodified object while the representation of the unmodified object and the deformed object are displayed in the window;
receiving, by a processor, a modification of a second parameter at the second point while the representation of the unmodified object and the deformed object are displayed in the window;
using a processor to generate a second regenerated deformed object based at least in part on the unmodified object, the deformation, the modification of the first parameter, and the modification of the second parameter; and
using a processor to cause the second regenerated deformed object to be displayed in the window.
10. The method of claim 1, further comprising:
using a processor to generate a tessellated mesh associated with the object, the tessellated mesh comprising a plurality of vertices and a plurality of edges,
wherein the first point is associated with a vertex in the tessellated mesh.
11. A non-transitory computer-readable medium on which is encoded program code configured to be executed by a processor, the program code comprising:
program code to cause an image comprising at least one unmodified object to be displayed in a window;
program code to receive a deformation of the unmodified object;
program code to generate a deformed object based at least in part on the deformation and the unmodified object;
program code to cause the deformed object to be displayed in the window;
program code to cause a representation of the unmodified object to be displayed in the window with the deformed object;
program code to receive a selection of a first point associated with the unmodified object while the representation of the unmodified object and the deformed object are displayed in the window;
program code to receive a modification of a first parameter at the first point while the representation of the unmodified object and the deformed object are displayed in the window;
program code to generate a first regenerated deformed object based at least in part on the deformation, the unmodified object, and the modification of the first parameter; and
program code to cause the first regenerated deformed object to be displayed in the window.
12. The computer-readable medium of claim 11, wherein the program code to receive the selection of the first point further comprises program code to receive a selection of the first point within the representation of the unmodified object.
13. The computer-readable medium of claim 11, wherein the representation comprises an outline of the unmodified object, and the program code to cause a representation of the unmodified object to be displayed in the window with the deformed object further comprises program code to overlay the outline of the unmodified object on the deformed object.
14. The computer-readable medium of claim 11, wherein the first parameter comprises an overlap parameter or a starchiness parameter.
15. The computer-readable medium of claim 14, wherein the overlap parameter indicates which of two or more overlapping areas of the deformed object should be visible, and the starchiness parameter indicates a resistance to deforming for an area of the unmodified object.
16. The computer-readable medium of claim 11, wherein the first parameter comprises a level and an extent.
17. The computer-readable medium of claim 11, further comprising program code to cause a graphical indication of a level and an extent of the first parameter to be displayed within the representation.
18. The computer-readable medium of claim 11, wherein the deformation comprises at least one of a stretching or a compression of the unmodified object.
19. The computer-readable medium of claim 11, further comprising:
program code to receive a selection of a second point associated with the unmodified object while the representation of the unmodified object and the deformed object are displayed in the window;
program code to receive a modification of a second parameter at the second point while the representation of the unmodified object and the deformed object are displayed in the window;
program code to generate a second regenerated deformed object based at least in part on the unmodified object, the deformation, the modification of the first parameter, and the modification of the second parameter; and
program code to cause the second regenerated deformed object to be displayed in the window.
20. The computer-readable medium of claim 11, further comprising:
program code to generate a triangular mesh associated with the object, the triangular mesh comprising a plurality of vertices and a plurality of edges,
wherein the first point is associated with a vertex in the triangular mesh.
21. A system for editing an image, comprising:
a deformation engine;
a rendering engine; and
a processor in communication with the deformation engine and the rendering engine, the processor configured to:
receive an image comprising an unmodified object;
cause the image to be displayed in a window;
receive a deformation to the unmodified object;
cause the deformation engine to generate a deformed object based at least in part on the unmodified object and the deformation;
cause the rendering engine to render the image and the deformed object;
cause the image and the deformed object to be displayed in a window;
cause a representation of the unmodified object to be displayed in the window with the deformed object;
receive a selection of a point associated with the unmodified object while the representation of the unmodified object and the deformed object are displayed in the window;
receive a modification of a first parameter at the point while the representation of the unmodified object and the deformed object are displayed in the window;
cause the deformation engine to generate a regenerated deformed object based at least in part on the deformation, the unmodified object, and the modified first parameter;
cause the rendering engine to render the regenerated deformed object, and
cause the regenerated deformed object to be displayed in the window.
22. The system of claim 21, further comprising a display.
23. The system of claim 21, further comprising an input device.
24. The system of claim 21, wherein the processor is further configured to receive the selection of the first point within the representation of the unmodified object.
25. The system of claim 21, wherein the processor is further configured to cause the representation of the unmodified object to be overlaid on the deformed object.
26. The system of claim 21, wherein the first parameter comprises an overlap parameter or a starchiness parameter.
27. The system of claim 26, wherein the rendering engine is configured to render the object based at least in part on the overlap parameter.
28. The system of claim 26, wherein the deformation engine is configured to generate the deformed object based at least in part on the starchiness parameter.
US11/771,726 2007-06-29 2007-06-29 Editing control for spatial deformations Active 2029-02-18 US7812850B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/771,726 US7812850B1 (en) 2007-06-29 2007-06-29 Editing control for spatial deformations

Publications (1)

Publication Number Publication Date
US7812850B1 true US7812850B1 (en) 2010-10-12

Family

ID=42830927

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/771,726 Active 2029-02-18 US7812850B1 (en) 2007-06-29 2007-06-29 Editing control for spatial deformations

Country Status (1)

Country Link
US (1) US7812850B1 (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080193013A1 (en) * 2007-02-13 2008-08-14 Thomas Schiwietz System and method for on-the-fly segmentations for image deformations
JP2013045295A (en) * 2011-08-24 2013-03-04 Casio Comput Co Ltd Image processor, image processing method, and program
JP2013045296A (en) * 2011-08-24 2013-03-04 Casio Comput Co Ltd Image processor, image processing method, and program
US20130305172A1 (en) * 2012-05-10 2013-11-14 Motorola Mobility, Inc. Pen Tool Editing Modes
GB2509369A (en) * 2012-11-02 2014-07-02 Imagination Tech Ltd 3-D rendering using geometry control points or an acceleration structure
US8849032B2 (en) 2011-03-08 2014-09-30 Canon Kabushiki Kaisha Shape parameterisation for editable document generation
US9367933B2 (en) 2012-06-26 2016-06-14 Google Technologies Holdings LLC Layering a line with multiple layers for rendering a soft brushstroke
US20170011549A1 (en) * 2015-07-09 2017-01-12 Disney Enterprises, Inc. Object Deformation Modeling
US9646410B2 (en) 2015-06-30 2017-05-09 Microsoft Technology Licensing, Llc Mixed three dimensional scene reconstruction from plural surface models
US9665978B2 (en) * 2015-07-20 2017-05-30 Microsoft Technology Licensing, Llc Consistent tessellation via topology-aware surface tracking
US20180130243A1 (en) * 2016-11-08 2018-05-10 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US10163247B2 (en) 2015-07-14 2018-12-25 Microsoft Technology Licensing, Llc Context-adaptive allocation of render model resources
US10297075B2 (en) * 2017-09-19 2019-05-21 Arm Limited Method of processing image data
US20200098087A1 (en) * 2018-09-25 2020-03-26 Adobe Inc. Generating enhanced digital content using piecewise parametric patch deformations
US10628918B2 (en) 2018-09-25 2020-04-21 Adobe Inc. Generating enhanced digital content using piecewise parametric patch deformations
WO2020220679A1 (en) * 2019-04-30 2020-11-05 北京市商汤科技开发有限公司 Method and device for image processing, and computer storage medium
US10832376B2 (en) 2018-09-25 2020-11-10 Adobe Inc. Generating enhanced digital content using piecewise parametric patch deformations
US11393135B1 (en) 2020-02-28 2022-07-19 Apple Inc. Modifying objects in a graphical environment
US11694414B2 (en) * 2019-08-19 2023-07-04 Clo Virtual Fashion Inc. Method and apparatus for providing guide for combining pattern pieces of clothing
US12190467B2 (en) 2022-08-11 2025-01-07 Adobe Inc. Modifying parametric continuity of digital image content in piecewise parametric patch deformations

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6573889B1 (en) 1999-02-08 2003-06-03 Adobe Systems Incorporated Analytic warping
US6608631B1 (en) * 2000-05-02 2003-08-19 Pixar Amination Studios Method, apparatus, and computer program product for geometric warps and deformations
US20060038832A1 (en) * 2004-08-03 2006-02-23 Smith Randall C System and method for morphable model design space definition
US7102652B2 (en) 2001-10-01 2006-09-05 Adobe Systems Incorporated Compositing two-dimensional and three-dimensional image layers
US7103236B2 (en) 2001-08-28 2006-09-05 Adobe Systems Incorporated Methods and apparatus for shifting perspective in a composite image
US7113187B1 (en) * 2000-05-11 2006-09-26 Dan Kikinis Method and system for localized advertising using localized 3-D templates
US7385612B1 (en) * 2002-05-30 2008-06-10 Adobe Systems Incorporated Distortion of raster and vector artwork
US20080150962A1 (en) * 2006-12-20 2008-06-26 Bezryadin Sergey N Image display using a computer system, including, but not limited to, display of a reference image for comparison with a current image in image editing

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Kai's Power Goo, Encyclopedia of Science Fiction and SimuWeb and SimuNet, Cyber News and Reviews, web page at http://www.cyber-reviews.com/oct96.html, as available via the Internet and printed Feb. 12, 2008.
Newcomb, M., Mini-Reviews, web page available at http://www.miken.com/winpost/sep96/minirev.htm, as available via the Internet and printed Feb. 12, 2008.

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7961945B2 (en) * 2007-02-13 2011-06-14 Technische Universität München System and method for on-the-fly segmentations for image deformations
US20080193013A1 (en) * 2007-02-13 2008-08-14 Thomas Schiwietz System and method for on-the-fly segmentations for image deformations
US8849032B2 (en) 2011-03-08 2014-09-30 Canon Kabushiki Kaisha Shape parameterisation for editable document generation
JP2013045295A (en) * 2011-08-24 2013-03-04 Casio Comput Co Ltd Image processor, image processing method, and program
JP2013045296A (en) * 2011-08-24 2013-03-04 Casio Comput Co Ltd Image processor, image processing method, and program
US20130305172A1 (en) * 2012-05-10 2013-11-14 Motorola Mobility, Inc. Pen Tool Editing Modes
US9367933B2 (en) 2012-06-26 2016-06-14 Google Technology Holdings LLC Layering a line with multiple layers for rendering a soft brushstroke
US11568592B2 (en) 2012-11-02 2023-01-31 Imagination Technologies Limited On demand geometry and acceleration structure creation with tile object lists
CN104885123A (en) * 2012-11-02 2015-09-02 想象技术有限公司 On-demand geometry and accelerated structure formation
GB2509369B (en) * 2012-11-02 2017-05-10 Imagination Tech Ltd On demand geometry processing for 3-d rendering
US12211136B2 (en) 2012-11-02 2025-01-28 Imagination Technologies Limited On demand geometry and acceleration structure creation with tile object lists
CN104885123B (en) * 2012-11-02 2018-01-09 想象技术有限公司 Geometry processing method and graphics rendering system for graphics rendering
US10339696B2 (en) 2012-11-02 2019-07-02 Imagination Technologies Limited On demand geometry and acceleration structure creation with discrete production scheduling
GB2509369A (en) * 2012-11-02 2014-07-02 Imagination Tech Ltd 3-D rendering using geometry control points or an acceleration structure
US10186070B2 (en) 2012-11-02 2019-01-22 Imagination Technologies Limited On demand geometry and acceleration structure creation
US10242487B2 (en) 2012-11-02 2019-03-26 Imagination Technologies Limited On demand geometry and acceleration structure creation
US10943386B2 (en) 2012-11-02 2021-03-09 Imagination Technologies Limited On demand geometry and acceleration structure creation with tile object lists
US9646410B2 (en) 2015-06-30 2017-05-09 Microsoft Technology Licensing, Llc Mixed three dimensional scene reconstruction from plural surface models
US20170011549A1 (en) * 2015-07-09 2017-01-12 Disney Enterprises, Inc. Object Deformation Modeling
US9799146B2 (en) * 2015-07-09 2017-10-24 Disney Enterprises, Inc. Object deformation modeling
US10163247B2 (en) 2015-07-14 2018-12-25 Microsoft Technology Licensing, Llc Context-adaptive allocation of render model resources
US9665978B2 (en) * 2015-07-20 2017-05-30 Microsoft Technology Licensing, Llc Consistent tessellation via topology-aware surface tracking
US20180130243A1 (en) * 2016-11-08 2018-05-10 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
US10297075B2 (en) * 2017-09-19 2019-05-21 Arm Limited Method of processing image data
US10832376B2 (en) 2018-09-25 2020-11-10 Adobe Inc. Generating enhanced digital content using piecewise parametric patch deformations
US20200098087A1 (en) * 2018-09-25 2020-03-26 Adobe Inc. Generating enhanced digital content using piecewise parametric patch deformations
US10706500B2 (en) * 2018-09-25 2020-07-07 Adobe Inc. Generating enhanced digital content using piecewise parametric patch deformations
US10628918B2 (en) 2018-09-25 2020-04-21 Adobe Inc. Generating enhanced digital content using piecewise parametric patch deformations
WO2020220679A1 (en) * 2019-04-30 2020-11-05 北京市商汤科技开发有限公司 Method and device for image processing, and computer storage medium
US20210035260A1 (en) * 2019-04-30 2021-02-04 Beijing Sensetime Technology Development Co., Ltd. Method and apparatus for image processing, and computer storage medium
US11501407B2 (en) * 2019-04-30 2022-11-15 Beijing Sensetime Technology Development Co., Ltd. Method and apparatus for image processing, and computer storage medium
US11694414B2 (en) * 2019-08-19 2023-07-04 Clo Virtual Fashion Inc. Method and apparatus for providing guide for combining pattern pieces of clothing
US12125162B2 (en) 2019-08-19 2024-10-22 Clo Virtual Fashion Inc. Method and apparatus for providing guide for combining pattern pieces of clothing
US11393135B1 (en) 2020-02-28 2022-07-19 Apple Inc. Modifying objects in a graphical environment
US12190467B2 (en) 2022-08-11 2025-01-07 Adobe Inc. Modifying parametric continuity of digital image content in piecewise parametric patch deformations

Similar Documents

Publication Publication Date Title
US7812850B1 (en) Editing control for spatial deformations
US6867770B2 (en) Systems and methods for voxel warping
KR101257849B1 (en) Method and Apparatus for rendering 3D graphic objects, and Method and Apparatus to minimize rendering objects for the same
US5977978A (en) Interactive authoring of 3D scenes and movies
US6417850B1 (en) Depth painting for 3-D rendering applications
EP1918881B1 (en) Techniques and workflows for computer graphics animation system
EP2419885B1 (en) Method for adding shadows to objects in computer graphics
US9153072B2 (en) Reducing the size of a model using visibility factors
US9471996B2 (en) Method for creating graphical materials for universal rendering framework
EP1550984A2 (en) Integrating particle rendering and three-dimensional geometry rendering
US8134551B2 (en) Frontend for universal rendering framework
EP2469474A1 (en) Creation of a playable scene with an authoring system
JP6333840B2 (en) Method for forming shell mesh based on optimized polygons
WO2012037157A2 (en) System and method for displaying data having spatial coordinates
WO2004107764A1 (en) Image display device and program
JP6445825B2 (en) Video processing apparatus and method
EP2797054B1 (en) Rendering of an indirect illumination data buffer
US9754406B2 (en) Multiple light source simulation in computer graphics
KR20120104071A (en) 3d image visual effect processing method
US7944443B1 (en) Sliding patch deformer
US20040012593A1 (en) Generating animation data with constrained parameters
CN113470153A (en) Rendering method and device of virtual scene and electronic equipment
US11915349B1 (en) Extrusion technique for curve rendering
WO2007130018A1 (en) Image-based occlusion culling
US20250156029A1 (en) Techniques for motion editing for character animations

Legal Events

Date Code Title Description
AS Assignment

Owner name: ADOBE SYSTEMS INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NELSON, JOHN;REEL/FRAME:019500/0201

Effective date: 20070629

AS Assignment

Owner name: ADOBE SYSTEMS INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NELSON, JOHN;REEL/FRAME:019505/0963

Effective date: 20070629

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552)

Year of fee payment: 8

AS Assignment

Owner name: ADOBE INC., CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:ADOBE SYSTEMS INCORPORATED;REEL/FRAME:048525/0042

Effective date: 20181008

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12