US7812850B1 - Editing control for spatial deformations - Google Patents
Editing control for spatial deformations
- Publication number: US7812850B1 (application Ser. No. 11/771,726)
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2021—Shape modification
Definitions
- the present invention generally relates to editing graphical images, and more specifically relates to spatial deformation of objects in images.
- Two-dimensional and three-dimensional electronic images may be edited using conventional computer-based image editing tools.
- a two-dimensional object in an image may be deformed to create successive image frames for use in an animated video.
- Conventional deformation tools use multiple windows to allow a user to deform a portion of an original image and to view the results of the deformation.
- a conventional deformation tool for two-dimensional images allows a user to deform an object or objects that make up the image in one window, and to view the resulting deformed objects in the image in a second window.
- a user may be presented with four or more windows: one each for deforming the unmodified image in the X, Y, and Z axes, and a fourth in which the deformed image is shown.
- a user must repeatedly shift their focus between multiple windows and multiple representations of the image, a problem referred to as “edit this, look at that” or ETLAT.
- the use of multiple windows can significantly reduce the available screen space for the display of the deformed image, or other representations of interest to the user.
- a method for deforming an image comprises causing an unmodified image to be displayed in a window, receiving a deformation of the unmodified image, generating a deformed image based at least in part on the deformation and the unmodified image, and causing the deformed image to be displayed in the window.
- This illustrative method further comprises causing a representation of the unmodified image to be displayed in the window with the deformed image, receiving a selection of a point within the representation of the unmodified image, receiving a modification of a first parameter associated with the unmodified image at the point, regenerating the deformed image based at least in part on the deformation, the unmodified image, and the first parameter, and causing the regenerated deformed image to be displayed in the window.
- a computer-readable medium comprises program code for carrying out such a method.
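The illustrative method above can be sketched as a minimal editing loop. This is a hypothetical illustration only; the class and method names (`DeformationEditor`, `receive_deformation`, and so on) are our assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the illustrative editing loop: display, receive a
# deformation, generate the deformed image, and modify a parameter at a
# point selected within the representation of the unmodified image.
class DeformationEditor:
    def __init__(self, unmodified_image):
        self.unmodified = unmodified_image   # point name -> (x, y)
        self.deformations = []               # list of (point, dx, dy)
        self.parameters = {}                 # point -> {parameter: value}

    def receive_deformation(self, point, dx, dy):
        self.deformations.append((point, dx, dy))

    def modify_parameter(self, point, name, value):
        # the point is selected within the representation of the
        # unmodified image, so it indexes the unmodified geometry
        self.parameters.setdefault(point, {})[name] = value

    def generate_deformed(self):
        """Regenerate the deformed image from the unmodified image plus
        all received deformations."""
        deformed = dict(self.unmodified)
        for point, dx, dy in self.deformations:
            x, y = deformed[point]
            deformed[point] = (x + dx, y + dy)
        return deformed

editor = DeformationEditor({"leg": (10.0, 0.0), "torso": (10.0, 20.0)})
editor.receive_deformation("leg", -4.0, 0.0)   # drag the leg to the left
editor.modify_parameter("leg", "overlap", 1.0)
deformed = editor.generate_deformed()
```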
- FIG. 1 shows a system for editing an image according to one embodiment of the present invention
- FIG. 2 shows a flowchart illustrating a method for editing an image according to one embodiment of the present invention
- FIGS. 3-10 show an editing window and an object according to one embodiment of the present invention.
- FIGS. 11-18 show an editing window and an object according to one embodiment of the present invention.
- Embodiments of the present invention provide methods, systems and products for editing controls for spatial deformations. Methods, systems and products according to the present invention may be embodied in a number of ways. Certain embodiments of the present invention may, for example, reduce or eliminate ETLAT, streamline a graphical artist's workflow when editing an image, and/or make more efficient use of available space on a display screen, and allow the user to work at a higher resolution.
- a computer system running computer software for editing two-dimensional images displays an editing window.
- an unmodified, two-dimensional image containing an object such as a gingerbread man
- the editing tool analyzes the image and generates a standard tessellated mesh to define the object, and to provide parameters that affect the response of the object to a deformation.
- a user may deform the object by selecting a point on the image, such as a point on the gingerbread man's leg, and dragging that point to another location in the window, which causes the leg to move and the image to deform.
- the deformation engine calculates the effect of the movement of the leg on the rest of the image. For example, by moving the leg, the editing tool may calculate that the leg stretches somewhat, the torso is pulled in the direction of the stretched leg, and still other parts of the gingerbread man may be stretched or compressed.
- the editing tool uses a rendering engine to generate the visual representation of the image, and then displays the deformed image.
- the editing tool may not display the desired part of the object in the foreground. For example, if the left leg of the gingerbread man is deformed so that it overlaps with the gingerbread man's right leg, the rendering engine may simply randomly determine which leg should be displayed in the foreground.
- one illustrative embodiment of the present invention provides the user with a representation, such as an outline, of the undeformed object overlaid on the deformed object.
- the user is able to adjust parameters in the outline of the unmodified object that result in changes to the display of the deformed object. For example, the user may adjust an “overlap” parameter associated with the left leg in the outline to specify that it should be displayed in the foreground.
- the deformed image in which the right and left legs overlap, can then be immediately updated to reflect the change made to the unmodified image.
- FIG. 1 shows a system 100 for editing an image according to one embodiment of the present invention.
- System 100 comprises a computer 101 , a display device 102 , and an input device 103 .
- Computer 101 comprises a processor 110 and a computer-readable medium, such as a memory 111 , and is in communication with the display device 102 and the input device 103 .
- Memory 111 comprises an application 120 for editing an image, and an image 130 .
- Application 120 comprises a deformation engine 121 and a rendering engine 122 .
- the processor 110 executes the application 120 to allow a user to edit an image.
- a user of system 100 may use the application 120 to retrieve image 130 from storage, load it into memory 111 , and display it on the display device 102 .
- an image, such as image 130 , includes one or more deformable objects, which, when initially displayed, are in an undeformed state.
- a user of system 100 may then deform one or more of the objects by manipulating it with the input device 103 .
- Deformation engine 121 then deforms the object based on the deformation, and passes the deformed object to the rendering engine 122 .
- the rendering engine 122 renders the deformed object, and the system 100 displays the modified image, including the deformed object, on the display device 102 .
- a user may further edit the object by modifying parameters associated with the deformed object. For example, if the deformation caused one part of the object to overlap with another part, the user may wish to specify which of the two overlapping areas should be displayed in the foreground.
- System 100 can display a representation, such as an outline, of the undeformed object overlaid on the deformed object to help the user select a part of the object to modify. The user may then select a point within the outline of the unmodified object, rather than within the deformed object, and change a parameter that affects which of the overlapping parts should be displayed in the foreground, and which should be obscured.
- the change made to the parameter within the outline causes system 100 to execute deformation engine 121 to deform the object again; this time the deformation engine 121 deforms the object based at least in part on the changed parameter.
- the deformation engine 121 then passes the deformed object to the rendering engine 122 , which renders the object.
- System 100 displays the deformed object with the desired area in the foreground.
- FIG. 2 shows a flowchart illustrating a method 200 for editing an image according to one embodiment of the present invention, which will be described with reference to the system 100 discussed above.
- FIGS. 3-9 show a gingerbread man in various states of deformation.
- the method 200 begins in block 201 , where the system 100 displays an unmodified object.
- a user may execute an image editing application 120 on the system 100 to display an image 130 , such as a gingerbread man 300 as shown in FIG. 3 , on a display device 102 in an editing window 123 .
- the user may start the application 120 and load the gingerbread man 300 from storage, such as from a hard drive, into memory 111 .
- the system 100 can then display 201 the image 130 on the display device 102 .
- the gingerbread man 300 is the only object in the image 130 , though other objects could be included in the image 130 as well. Initially, when the image 130 is displayed, it has zero or more unmodified objects. Each of the objects within the image 130 may be subsequently deformed, but the initial state of each object when loaded into the application 120 is the object's unmodified state.
- the unmodified state of the objects in the image 130 may be different.
- Various embodiments of the invention may handle this situation differently.
- one embodiment of the present invention may determine an unmodified state for each object, e.g., the state of the object when the object is created.
- the initial unmodified state of each object within the image may be stored with the deformed image.
- the initial unmodified state of each object is loaded along with the deformed state of each object.
- Other embodiments may allow a user to select a new unmodified state for an object. For example, the user may create an image having an object, in which the object has an initial unmodified state. The user may edit the object, and reach a configuration of the object that is more desirable. The user may then use the application 120 to store the new, desirable state as the new unmodified state. This may provide the user with a new reference point for future edits of the object. Still further embodiments would be apparent to one of ordinary skill in the art.
- an image editing application 120 may generate a tessellated mesh for each object in an image, or such a mesh may be saved as a part of the image 130 .
- An object, such as the gingerbread man 300 , may be defined by such a tessellated mesh. The shape of the mesh generally conforms to the shape of the object, and provides data to allow the deformation engine 121 and the rendering engine 122 to process the object.
- the deformation engine 121 may use the mesh to determine how a deformation causes the shape of an object to change.
- the rendering engine 122 may use the mesh to apply textures, shadows, shading, or other features to the rendered object.
- deformation engine 121 and/or rendering engine 122 may comprise specialized hardware configured to efficiently process a tessellated mesh.
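The tessellated mesh described above can be modeled minimally as vertices plus index triples. This sketch, with assumed names and a toy two-triangle mesh, shows how the unique edges (whose stretching and compression a deformation engine would track) might be collected:

```python
# Minimal tessellated-mesh sketch: 2D vertices and triangles as index
# triples. This representation is an assumption; the patent does not
# specify a mesh format.
import math

vertices = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
triangles = [(0, 1, 2), (1, 3, 2)]

def edges(triangles):
    """Collect the unique undirected edges of a triangle mesh."""
    out = set()
    for a, b, c in triangles:
        for e in ((a, b), (b, c), (a, c)):
            out.add(tuple(sorted(e)))
    return out

def edge_length(verts, edge):
    """Length of one edge; comparing this before and after a deformation
    reveals stretching or compression."""
    (x1, y1), (x2, y2) = verts[edge[0]], verts[edge[1]]
    return math.hypot(x2 - x1, y2 - y1)

mesh_edges = edges(triangles)   # two adjacent triangles share one edge
```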
- a user interacts with the application 120 to indicate a deformation of the object. For example, a user may specify that the right leg 301 of the gingerbread man 300 should be moved to the left.
- FIG. 4 shows this deformation as an arrow 400 .
- a user may deform an object in a variety of ways. For example, in one embodiment of the present invention, a user uses a mouse to click on a point on the object, and drag the point to a new location. In such an embodiment, the application 120 interprets this input as a deformation of the object. In another embodiment, a user may add one or more control points to the object.
- a control point in one embodiment of the present invention, may be added to an object to provide a user with a point at which to manipulate the object.
- the control point may be associated with the unmodified object, and allow the user to use the control point to deform the object.
- one or more control points may be associated with one or more vertices within a mesh corresponding to the object. In such an embodiment, a user may not be able to deform the object from any arbitrary point, but only by manipulating one of the control points.
- control points may have one or more associated parameters.
- a control point may have an overlap parameter and a starchiness parameter associated with it.
- a user may manipulate the one or more parameters to cause a change in the appearance of the object.
- a user specifies a control point 310 and a movement path.
- the movement path specifies how the control point 310 should move over one or more frames of animation.
- For each frame in which the control point 310 moves along the movement path, the object may be deformed.
- the arrow 400 in FIG. 4 may specify a movement path for several frames of animation.
- the leg moves a distance along the path.
- Each frame may comprise a deformed gingerbread man with its leg at different states of travel along the movement path.
- Such an embodiment may allow an animator to more easily animate an object, rather than by manually creating multiple, successive deformations.
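A movement path driving per-frame deformation might be sampled as below. Linear interpolation and the function name `sample_path` are our assumptions; the patent does not specify a path representation.

```python
# Sketch of sampling a control point's movement path once per animation
# frame, so each frame can show the object deformed with the point at a
# different state of travel along the path.
def sample_path(start, end, num_frames):
    """Return the control point's position for each frame, linearly
    interpolated from start to end (an assumed interpolation scheme)."""
    (x0, y0), (x1, y1) = start, end
    positions = []
    for frame in range(num_frames):
        t = frame / (num_frames - 1) if num_frames > 1 else 1.0
        positions.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return positions

# e.g. the leg's control point travels left over five frames
frames = sample_path((10.0, 0.0), (6.0, 0.0), 5)
```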
- the method 200 continues to block 203 , where the unmodified object is deformed.
- the system 100 executes deformation engine 121 to determine a new shape for the object based at least in part on the unmodified object and the deformation.
- the image editing application 120 may generate a tessellated mesh corresponding to the gingerbread man 300 .
- the deformation engine 121 may determine how one of the gingerbread man's 300 legs should change shape based on the deformation received from the user. In addition, the deformation of the leg may also cause other parts of the gingerbread man 300 to change shape, as is discussed in greater detail below.
- System 100 then displays the deformed gingerbread man 300 in the editing window 123 , as shown in block 204 and in FIG. 5 .
- the gingerbread man's 300 right leg 301 has crossed underneath its left leg 302 based on the user's specified deformation 400 .
- the user may command the system 100 to display a representation, such as an outline 600 , of the undeformed object in the same window as the deformed object.
- the user may select an option from a context-sensitive menu to display the outline in the editing window 123 .
- the outline may automatically appear. This allows the user to view in the same window the outline 600 of the unmodified object and the deformed object. The user can then make a modification to the undeformed object, represented by the outline 600 , and view the resulting change to the deformed object without shifting focus to another window.
- the representation is shown as an outline 600 , other representations are possible, and are within the scope of this invention.
- the outline may be optionally filled with a partially-transparent color.
- the outline 600 may comprise a partially transparent, fully-rendered image of the unmodified object.
- the representation may comprise a wire-mesh representation of the unmodified object.
- Still further embodiments may comprise other representations of the unmodified object that would be apparent to one of ordinary skill in the art.
- the outline 600 of the undeformed object is overlaid on the deformed object; however, in one embodiment, the outline 600 may be offset from the deformed object, though still displayed in the same window 123 .
- the user may wish to change how the deformed object looks.
- the outline 600 provides the user with a visual depiction of how the object has deformed from its unmodified state, and may be useful for providing an alternate means for editing the deformed object.
- a user designates a control point within the outline 600 of the undeformed object. For example, the user may want the right leg 301 to appear in front of the left leg 302 .
- a user designates a control point 700 within the outline 600 of the unmodified gingerbread man.
- FIG. 7 shows a control point 700 added within the outline 600 of the unmodified gingerbread man. While the control point 700 was added within the outline 600 , it is not necessary that control points be added within a representation of the unmodified image.
- a control point may be added to an image and associated with an unmodified object, such as by associating the control point with a representation of the unmodified object. This may allow a user greater flexibility when adding control points to an object.
- Also visible in FIG. 7 is the position the control point 700 ′ would have in the deformed gingerbread man 300 .
- because the right leg 301 is located behind the left leg 302 , it may be impossible for a user to spatially select the right leg 301 of the deformed gingerbread man 300 to adjust a parameter associated with the right leg 301 .
- the user specifies a location in the outline 600 , and sees the corresponding control point 700 ′ in the deformed object. This allows a user to easily interact with an obscured part of a deformed object.
- the overlaid outline of the unmodified image allows the user to easily shift their focus between modifications made within the outline 600 and the result in the deformed image. For example, a user may adjust a parameter in the outline 600 and view the resulting change to the deformed object.
- the immediate feedback offered by displaying the two representations of the object in the same window, as well as the reduced need to move the user's field of vision between different windows may aid a user when deforming an object.
- more room on the display is available for the single window, rather than in a multi-windowed display that may significantly reduce the screen space available to show both the unmodified object and the deformed object.
- the user may modify a parameter associated with the control point 700 , shown in FIGS. 7 and 8 , to cause a change at the corresponding control point 700 ′ in the deformed object. For example, if the user wishes to display the right leg 301 in front of the left leg 302 , the user may modify a parameter associated with control point 700 , and view the resulting change in the deformed gingerbread man 300 .
- the overlap parameter includes two values, an overlap value and an extent value.
- the overlap value specifies a level of “in-frontness” or overlap for a part of an object, while the extent value specifies the region affected by the control point 700 .
- the overlap parameter may include other values.
- gingerbread man 300 is a two-dimensional object, when two parts of the gingerbread man overlap, the rendering engine 122 may not have a clear indication of which of the overlapping regions should be in the foreground.
- the overlap value of an overlap parameter may be used by the rendering engine 122 as a pseudo third dimension. For example, before the user adjusts the overlap parameter associated with the control point 700 , the right leg 301 and the left leg 302 may have the same overlap value, or no overlap value. In that case, the rendering engine may randomly select which leg to display in the foreground. But if the user increases the overlap value of the right leg's 301 overlap parameter so that it is greater than the overlap value associated with the left leg 302 , the rendering engine 122 displays the right leg 301 in the foreground.
- the user may decrease the overlap value associated with the right leg 301 to a value less than the left leg's 302 overlap value to ensure that the right leg 301 is obscured by the left leg 302 .
- Other parameters such as starchiness, may be associated with the control point 700 as well, and are discussed in more detail below.
- a shaded wireframe mesh region surrounds control point 700 .
- the shading provides the user with a graphical indication of the overlap value. For example, brighter shading may indicate a higher overlap value, while darker shading may indicate a lower overlap value.
- the graphical indication may provide an easy reference for a user regarding the overlap value without requiring the user to view a numerical value, or perform additional actions to display the overlap value for a region.
- the relative brightness of the various overlap parameter extents may provide the user with an indication of which elements have greater overlap values with respect to other overlap values. This too may aid a user editing the object.
- the extent value of the overlap parameter indicates how large of an area is affected by the overlap parameter.
- FIG. 8 shows a shaded wireframe mesh region surrounding control point 700 .
- the shaded wireframe mesh region is a graphical representation of the extent value of the overlap parameter.
- the extent value as shown in FIG. 8 , is sized to include substantially all of the gingerbread man's 300 right leg 301 ′. This causes the overlap value to be applied to each point within the region defined by the extent value. Because substantially all of the right leg 301 ′ is within the region, the overlap value is applied to substantially all of the right leg 301 . Thus, the right leg 301 is shown in front of the left leg 302 .
- the extent value defines a radius around the control point 700 ; however, the extent value may not always define a radius. In some embodiments the extent value may define the length of a side of a shape, such as square or polygon, or a region of arbitrary shape and size.
- control over which leg is displayed in the foreground may be accomplished in other manners as well.
- a user may add control point 700 to unmodified right leg 301 ′, and increase the overlap value so that the modified right leg 301 is displayed in front of the left leg 302 . If the user later decides it would be better to have the left leg 302 in the foreground, the user may accomplish this in different ways.
- One method was described above: the user may simply decrease the overlap value associated with the control point 700 until the right leg's 301 ′ overlap value is less than the left leg's overlap value.
- the user instead may cause the same result by moving control point 700 , with the increased overlap value, over to the left leg 302 ′ in the outline 600 . Because the overlap value is associated with the control point 700 , when the control point 700 is relocated to the left leg 302 ′, the corresponding overlap value is applied to the left leg 302 . This may cause the left leg 302 to be displayed in the foreground. Alternatively, the user may add a new selected point to the left leg 302 ′ and increase an overlap value associated with the new control point until it is greater than the overlap value of the control point 700 in the right leg 301 . These techniques for modifying parameter values, though discussed in relation to an overlap parameter, are equally applicable to modifying other parameters associated with a selected point.
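The behavior described above, where a relocated control point carries its parameter values along with it, can be sketched with a simple record; the field names are assumptions:

```python
# Sketch: parameter values are associated with the control point itself,
# so moving the point moves its overlap value to the new region.
control_points = {"cp1": {"location": "right_leg", "overlap": 1.0}}

def move_control_point(points, name, new_location):
    """Relocate a control point; its parameters travel with it."""
    points[name]["location"] = new_location

move_control_point(control_points, "cp1", "left_leg")
# the left leg now carries the increased overlap value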
- the system 100 regenerates the deformed object based at least in part on the deformation and the modified parameter. For example, after the user has modified a parameter associated with the control point 700 , the application 120 executes the deformation engine 121 , if necessary, and the rendering engine 122 to regenerate the deformed object.
- the deformation engine 121 may not be executed in all cases as all parameters may not be associated with the physical deformation of the object.
- the overlap parameter may generally indicate which of two or more overlapping areas in the deformed object should be displayed in the foreground, but may not indicate how the object should react to a deformation.
- the rendering engine determined that the left leg should be displayed in the foreground.
- the overlap parameter gives the user the ability to specify that the right leg should be displayed in the foreground.
- the overlap parameter may affect the appearance of the deformed object, it may not have any effect on the deformation.
- the deformation engine may deform the object differently when an overlap parameter is modified. Other parameters may affect how the deformation engine 121 deforms the object and are discussed below.
- FIG. 9 shows the regenerated deformed object based on the modification to the overlap parameter at control point 700 . This shows that a user who has manipulated the overlap parameter may be able to view the result of the modification while the overlaid editing controls are still visible.
- In FIG. 10 , only the regenerated deformed image is shown. The user may then return to block 202 to deform the object again, or may return to block 206 to select another point to modify.
- FIGS. 11-18 show a deformation of an object in an image according to one embodiment of the present invention. The figures will be described in relation to the method 200 shown in FIG. 2 and the system shown in FIG. 1 . As discussed previously, a user may deform an object, such as gingerbread man 1100 shown in FIG. 11 , by using a system 100 executing an application 120 for editing an image.
- the system 100 displays the unmodified object.
- the user may interact with the application 120 to deform the gingerbread man 1100 , such as by using the input device 103 as described above.
- system 100 receives a deformation of the object from the user.
- the user has specified deformation 1200 to move the gingerbread man's 1100 right leg 1101 to the left, similar to the deformation shown in FIG. 4 .
- Also shown in FIG. 12 is deformation 1201 .
- Deformation 1201 was not specified by the user, but rather is an indirect deformation caused by the movement of the gingerbread man's 1100 right leg 1101 .
- Indirect deformations may be caused by the deformation engine's 121 calculations when deforming an object based on a user's deformation, such as deformation 1200 .
- the indirect deformation will cause the gingerbread man's 1100 head 1103 to tilt to his left, as indicated by arrow 1201 .
- arrows 1200 , 1201 are not visible to a user in the embodiment shown, but rather are to aid in understanding the deformation of the gingerbread man 1100 . However, such indicators may be visible in one embodiment of the present invention.
- the deformation engine 121 deforms the gingerbread man 1100 based on the deformation 1200 .
- deformation engine 121 determines that the right leg 1101 will deform by moving to the left, and that the head 1103 will rotate to the left as well.
- an application 120 may generate a mesh corresponding to an object. Further, this mesh may be used by the deformation engine 121 to determine how the object will deform based on the input deformation.
- a mesh comprises a plurality of vertices and a plurality of lines (called edges) connecting the vertices to create a plurality of adjacent polygons.
- the edges of the polygons may stretch or compress as the object deforms. If this stretching or compression propagates to other areas of the object, those areas may deform as well. Therefore, to reduce or eliminate the propagation of a deformation into a region, a user may modify a starchiness parameter associated with the region.
- a starchiness parameter comprises two values: a starchiness value, and an extent value.
- the starchiness value describes the resilience of a location in an object. A greater starchiness value in a location may reduce the amount of indirect deformation caused to the location, while a lower starchiness value may make a location more susceptible to deformation.
- the extent value defines the size of the area affected by the starchiness parameter, and may be a radius extending from a selected point, or as otherwise described above with respect to the overlap parameter.
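One plausible way a starchiness parameter could damp indirect deformation is to scale the displacements of vertices inside the extent by the starchiness value. The patent does not give a formula, so this linear scaling (0 = fully pliable, 1 = rigid) is purely an assumption:

```python
# Assumed sketch of starchiness damping indirect deformation: vertex
# displacements inside the extent are scaled by (1 - starchiness).
import math

def apply_starchiness(displacements, positions, center, radius, starchiness):
    """Scale each displacement whose vertex lies inside the circular
    extent; vertices outside the extent deform unchanged."""
    cx, cy = center
    out = []
    for (dx, dy), (x, y) in zip(displacements, positions):
        if math.hypot(x - cx, y - cy) <= radius:
            out.append((dx * (1 - starchiness), dy * (1 - starchiness)))
        else:
            out.append((dx, dy))
    return out

positions = [(0.0, 10.0), (0.0, 0.0)]       # a head vertex and a leg vertex
displacements = [(2.0, 0.0), (-4.0, 0.0)]   # indirect tilt, direct drag
# high starchiness on the head leaves the leg's direct deformation intact
damped = apply_starchiness(displacements, positions, (0.0, 10.0), 1.0, 0.75)
```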
- the deformed object is displayed.
- the deformed gingerbread man 1100 can be seen in FIG. 13 .
- the right leg 1101 has crossed under the left leg 1102 , and the head has tilted to the left.
- a user may desire that the head 1103 not deform from its original location, or deform to a lesser degree than is shown in FIG. 13 .
- the user may adjust a parameter associated with the head 1103 .
- the user causes the application 120 to display a representation of the unmodified object in the same window.
- the representation comprises an outline 1400 , which is overlaid on the deformed object in the same window.
- a user may maintain focus on the deformed object while interacting with the unmodified object. This may provide a more streamlined and efficient workflow for the user.
- the user may add a control point 1500 within the outline 1400 of the unmodified object as shown in FIG. 15 .
- a corresponding control point 1500 ′ is displayed on the head 1103 of the deformed gingerbread man 1100 . This may provide a visual cue that the user is editing the correct location on the deformed object. However, in one embodiment, the corresponding control point 1500 ′ is not displayed.
- the user modifies a starchiness parameter associated with the control point 1500 .
- the user has increased the starchiness value for the starchiness parameter associated with control point 1500 .
- the user has selected an extent value such that the starchiness parameter affects substantially the entire head 1103 ′ of the gingerbread man.
- the gingerbread man's head 1103 may become less susceptible to the indirect deformation caused by the deformation of the right leg 1101 .
- the deformed object is regenerated.
- the application 120 executes the deformation engine 121 to deform the gingerbread man 1100 based at least in part on the deformation and the modified parameter.
- the deformation engine passes the deformed gingerbread man 1100 to the rendering engine, which renders the deformed gingerbread man 1100 .
- FIG. 17 shows the outline 1400 of the unmodified gingerbread man overlaid on the regenerated deformed gingerbread man 1100 .
- the head 1103 of the deformed gingerbread man 1100 has not undergone the same amount of indirect deformation that can be seen in FIG. 13 , and relative to the outline 1400 of the unmodified gingerbread man.
- FIG. 18 shows the deformed gingerbread man 1100 without the outline 1400 .
- the increased starchiness value associated with the head 1103 has caused the deformation engine to substantially reduce the amount of indirect deformation propagated to the head 1103 from the deformation of the right leg 1101 .
- the right leg 1101 is displayed behind the left leg 1102 .
- a user may wish to move the right leg 1101 into the foreground.
- the user may add a second selected point within outline 1400 corresponding to the right leg 1101 ′.
- the user may then modify an overlap parameter associated with the second selected point to cause the right leg 1101 to be displayed in front of left leg 1102 .
- the user may use a method, such as the embodiment described with respect to FIGS. 6-10, to move the right leg 1101 into the foreground.
- Still further selected points may be added to outline 1400 to adjust additional parameters.
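The overlap parameter described above can be thought of as a per-part draw-order key. As a hypothetical sketch (none of these names appear in the patent), a renderer might sort the object's parts by their overlap values and paint them back to front:

```python
def draw_order(parts):
    """Return part names sorted back-to-front by their overlap parameter.

    parts: dict mapping part name -> overlap value; a part with a larger
           value is drawn later, i.e. in front of parts with smaller values.
    """
    return [name for name, _ in sorted(parts.items(), key=lambda kv: kv[1])]
```

In this sketch, raising the right leg's overlap value above the left leg's moves it later in the draw order, so it is painted in front, as in the example above.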
- An application for editing an image may allow a user to edit three-dimensional images. For example, in one embodiment, the application displays a three-dimensional object in an editing window. A user may deform the three-dimensional object. The application may execute a deformation engine and a rendering engine to generate a deformed object based at least in part on the deformation and the unmodified object. The user may then cause the application to display an outline of the unmodified object in the editing window, which may be overlaid on the deformed object.
- an outline of an unmodified object may comprise an outline of one cross section of the unmodified object along one of the X, Y, or Z axes.
- a user may select one of the three separate cross sections to be displayed.
- the outline may be a three-dimensional outline, and may be rotatable to provide different perspectives of the unmodified object.
- the outline may comprise a partially-transparent rendering of the unmodified object, which may allow a user to more easily adjust a perspective of the unmodified object.
- a user may be able to view an outline of the unmodified object at varying depths. For example, a user may wish to add a control point or a selected point at a location inside the volume of the three-dimensional object. A user may be allowed to navigate into the interior of the object to add a control point or selected point.
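A cross-section outline of the kind described above can be sketched by intersecting a triangle mesh with an axis-aligned plane. The following is a minimal, hypothetical Python illustration, not the patent's implementation; the function name and data layout are assumptions:

```python
def cross_section(triangles, axis, value):
    """Intersect a triangle mesh with an axis-aligned cutting plane.

    triangles: list of triangles, each a tuple of three (x, y, z) vertices
    axis:      0, 1, or 2, selecting the X, Y, or Z axis
    value:     position of the cutting plane along that axis
    Returns a list of line segments (pairs of 3-D points) forming the
    cross-section outline of the mesh.
    """
    segments = []
    for tri in triangles:
        hits = []
        for a, b in ((tri[0], tri[1]), (tri[1], tri[2]), (tri[2], tri[0])):
            da, db = a[axis] - value, b[axis] - value
            if da * db < 0:  # this edge crosses the plane
                t = da / (da - db)  # interpolation parameter at the crossing
                hits.append(tuple(a[i] + t * (b[i] - a[i]) for i in range(3)))
        if len(hits) == 2:
            segments.append((hits[0], hits[1]))
    return segments
```

Calling this with `axis` set to 0, 1, or 2 would produce the three separate cross sections along the X, Y, or Z axis among which a user might choose.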
- a computer 101 may comprise a processor 110 or processors.
- the processor 110 comprises a computer-readable medium, such as a random access memory (RAM) coupled to the processor.
- the processor 110 executes computer-executable program instructions stored in memory 111 , such as executing one or more computer programs for editing an image.
- processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines.
- Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
- Such processors may comprise, or may be in communication with, media, for example computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor.
- Embodiments of computer-readable media may comprise, but are not limited to, an electronic, optical, magnetic, or other storage or transmission device capable of providing a processor, such as the processor in a web server, with computer-readable instructions.
- Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read.
- various other forms of computer-readable media may transmit or carry instructions to a computer, such as a router, private or public network, or other transmission device or channel.
- the processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures.
- the processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Architecture (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims (28)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US11/771,726 US7812850B1 (en) | 2007-06-29 | 2007-06-29 | Editing control for spatial deformations |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US11/771,726 US7812850B1 (en) | 2007-06-29 | 2007-06-29 | Editing control for spatial deformations |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US7812850B1 true US7812850B1 (en) | 2010-10-12 |
Family
ID=42830927
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US11/771,726 Active 2029-02-18 US7812850B1 (en) | 2007-06-29 | 2007-06-29 | Editing control for spatial deformations |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US7812850B1 (en) |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6573889B1 (en) | 1999-02-08 | 2003-06-03 | Adobe Systems Incorporated | Analytic warping |
| US6608631B1 * | 2000-05-02 | 2003-08-19 | Pixar Animation Studios | Method, apparatus, and computer program product for geometric warps and deformations |
| US20060038832A1 (en) * | 2004-08-03 | 2006-02-23 | Smith Randall C | System and method for morphable model design space definition |
| US7102652B2 (en) | 2001-10-01 | 2006-09-05 | Adobe Systems Incorporated | Compositing two-dimensional and three-dimensional image layers |
| US7103236B2 (en) | 2001-08-28 | 2006-09-05 | Adobe Systems Incorporated | Methods and apparatus for shifting perspective in a composite image |
| US7113187B1 (en) * | 2000-05-11 | 2006-09-26 | Dan Kikinis | Method and system for localized advertising using localized 3-D templates |
| US7385612B1 (en) * | 2002-05-30 | 2008-06-10 | Adobe Systems Incorporated | Distortion of raster and vector artwork |
| US20080150962A1 (en) * | 2006-12-20 | 2008-06-26 | Bezryadin Sergey N | Image display using a computer system, including, but not limited to, display of a reference image for comparison with a current image in image editing |
Non-Patent Citations (2)
| Title |
|---|
| Kai's Power Goo, Encyclopedia of Science Fiction and SimuWeb and SimuNet, Cyber News and Reviews, web page at http://www.cyber-reviews.com/oct96.html, as available via the Internet and printed Feb. 12, 2008. |
| Newcomb, M., Mini-Reviews, web page available at http://www.miken.com/winpost/sep96/minirev.htm, as available via the Internet and printed Feb. 12, 2008. |
Cited By (35)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7961945B2 (en) * | 2007-02-13 | 2011-06-14 | Technische Universität München | System and method for on-the-fly segmentations for image deformations |
| US20080193013A1 (en) * | 2007-02-13 | 2008-08-14 | Thomas Schiwietz | System and method for on-the-fly segmentations for image deformations |
| US8849032B2 (en) | 2011-03-08 | 2014-09-30 | Canon Kabushiki Kaisha | Shape parameterisation for editable document generation |
| JP2013045295A (en) * | 2011-08-24 | 2013-03-04 | Casio Comput Co Ltd | Image processor, image processing method, and program |
| JP2013045296A (en) * | 2011-08-24 | 2013-03-04 | Casio Comput Co Ltd | Image processor, image processing method, and program |
| US20130305172A1 (en) * | 2012-05-10 | 2013-11-14 | Motorola Mobility, Inc. | Pen Tool Editing Modes |
| US9367933B2 (en) | 2012-06-26 | 2016-06-14 | Google Technologies Holdings LLC | Layering a line with multiple layers for rendering a soft brushstroke |
| US11568592B2 (en) | 2012-11-02 | 2023-01-31 | Imagination Technologies Limited | On demand geometry and acceleration structure creation with tile object lists |
| CN104885123A (en) * | 2012-11-02 | 2015-09-02 | 想象技术有限公司 | On-demand geometry and accelerated structure formation |
| GB2509369B (en) * | 2012-11-02 | 2017-05-10 | Imagination Tech Ltd | On demand geometry processing for 3-d rendering |
| US12211136B2 (en) | 2012-11-02 | 2025-01-28 | Imagination Technologies Limited | On demand geometry and acceleration structure creation with tile object lists |
| CN104885123B (en) * | 2012-11-02 | 2018-01-09 | 想象技术有限公司 | Geometry processing method and graphics rendering system for graphics rendering |
| US10339696B2 (en) | 2012-11-02 | 2019-07-02 | Imagination Technologies Limited | On demand geometry and acceleration structure creation with discrete production scheduling |
| GB2509369A (en) * | 2012-11-02 | 2014-07-02 | Imagination Tech Ltd | 3-D rendering using geometry control points or an acceleration structure |
| US10186070B2 (en) | 2012-11-02 | 2019-01-22 | Imagination Technologies Limited | On demand geometry and acceleration structure creation |
| US10242487B2 (en) | 2012-11-02 | 2019-03-26 | Imagination Technologies Limited | On demand geometry and acceleration structure creation |
| US10943386B2 (en) | 2012-11-02 | 2021-03-09 | Imagination Technologies Limited | On demand geometry and acceleration structure creation with tile object lists |
| US9646410B2 (en) | 2015-06-30 | 2017-05-09 | Microsoft Technology Licensing, Llc | Mixed three dimensional scene reconstruction from plural surface models |
| US20170011549A1 (en) * | 2015-07-09 | 2017-01-12 | Disney Enterprises, Inc. | Object Deformation Modeling |
| US9799146B2 (en) * | 2015-07-09 | 2017-10-24 | Disney Enterprises, Inc. | Object deformation modeling |
| US10163247B2 (en) | 2015-07-14 | 2018-12-25 | Microsoft Technology Licensing, Llc | Context-adaptive allocation of render model resources |
| US9665978B2 (en) * | 2015-07-20 | 2017-05-30 | Microsoft Technology Licensing, Llc | Consistent tessellation via topology-aware surface tracking |
| US20180130243A1 (en) * | 2016-11-08 | 2018-05-10 | Samsung Electronics Co., Ltd. | Display apparatus and control method thereof |
| US10297075B2 (en) * | 2017-09-19 | 2019-05-21 | Arm Limited | Method of processing image data |
| US10832376B2 (en) | 2018-09-25 | 2020-11-10 | Adobe Inc. | Generating enhanced digital content using piecewise parametric patch deformations |
| US20200098087A1 (en) * | 2018-09-25 | 2020-03-26 | Adobe Inc. | Generating enhanced digital content using piecewise parametric patch deformations |
| US10706500B2 (en) * | 2018-09-25 | 2020-07-07 | Adobe Inc. | Generating enhanced digital content using piecewise parametric patch deformations |
| US10628918B2 (en) | 2018-09-25 | 2020-04-21 | Adobe Inc. | Generating enhanced digital content using piecewise parametric patch deformations |
| WO2020220679A1 (en) * | 2019-04-30 | 2020-11-05 | 北京市商汤科技开发有限公司 | Method and device for image processing, and computer storage medium |
| US20210035260A1 (en) * | 2019-04-30 | 2021-02-04 | Beijing Sensetime Technology Development Co., Ltd. | Method and apparatus for image processing, and computer storage medium |
| US11501407B2 (en) * | 2019-04-30 | 2022-11-15 | Beijing Sensetime Technology Development Co., Ltd. | Method and apparatus for image processing, and computer storage medium |
| US11694414B2 (en) * | 2019-08-19 | 2023-07-04 | Clo Virtual Fashion Inc. | Method and apparatus for providing guide for combining pattern pieces of clothing |
| US12125162B2 (en) | 2019-08-19 | 2024-10-22 | Clo Virtual Fashion Inc. | Method and apparatus for providing guide for combining pattern pieces of clothing |
| US11393135B1 (en) | 2020-02-28 | 2022-07-19 | Apple Inc. | Modifying objects in a graphical environment |
| US12190467B2 (en) | 2022-08-11 | 2025-01-07 | Adobe Inc. | Modifying parametric continuity of digital image content in piecewise parametric patch deformations |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US7812850B1 (en) | Editing control for spatial deformations | |
| US6867770B2 (en) | Systems and methods for voxel warping | |
| KR101257849B1 (en) | Method and Apparatus for rendering 3D graphic objects, and Method and Apparatus to minimize rendering objects for the same | |
| US5977978A (en) | Interactive authoring of 3D scenes and movies | |
| US6417850B1 (en) | Depth painting for 3-D rendering applications | |
| EP1918881B1 (en) | Techniques and workflows for computer graphics animation system | |
| EP2419885B1 (en) | Method for adding shadows to objects in computer graphics | |
| US9153072B2 (en) | Reducing the size of a model using visibility factors | |
| US9471996B2 (en) | Method for creating graphical materials for universal rendering framework | |
| EP1550984A2 (en) | Integrating particle rendering and three-dimensional geometry rendering | |
| US8134551B2 (en) | Frontend for universal rendering framework | |
| EP2469474A1 (en) | Creation of a playable scene with an authoring system | |
| JP6333840B2 (en) | Method for forming shell mesh based on optimized polygons | |
| WO2012037157A2 (en) | System and method for displaying data having spatial coordinates | |
| WO2004107764A1 (en) | Image display device and program | |
| JP6445825B2 (en) | Video processing apparatus and method | |
| EP2797054B1 (en) | Rendering of an indirect illumination data buffer | |
| US9754406B2 (en) | Multiple light source simulation in computer graphics | |
| KR20120104071A (en) | 3d image visual effect processing method | |
| US7944443B1 (en) | Sliding patch deformer | |
| US20040012593A1 (en) | Generating animation data with constrained parameters | |
| CN113470153A (en) | Rendering method and device of virtual scene and electronic equipment | |
| US11915349B1 (en) | Extrusion technique for curve rendering | |
| WO2007130018A1 (en) | Image-based occlusion culling | |
| US20250156029A1 (en) | Techniques for motion editing for character animations |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: ADOBE SYSTEMS INCORPORATED, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NELSON, JOHN;REEL/FRAME:019500/0201 Effective date: 20070629 |
|
| AS | Assignment |
Owner name: ADOBE SYSTEMS INCORPORATED, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NELSON, JOHN;REEL/FRAME:019505/0963 Effective date: 20070629 |
|
| FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| FPAY | Fee payment |
Year of fee payment: 4 |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552) Year of fee payment: 8 |
|
| AS | Assignment |
Owner name: ADOBE INC., CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:ADOBE SYSTEMS INCORPORATED;REEL/FRAME:048525/0042 Effective date: 20181008 |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 12 |