US20180329597A1 - Ink Anchoring - Google Patents
- Publication number
- US20180329597A1 (application US15/638,067)
- Authority
- US
- United States
- Prior art keywords
- ink
- computing device
- ink object
- anchor
- anchoring
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS; G06—COMPUTING OR CALCULATING; COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING; G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/0486—Drag-and-drop
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/04883—Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
- G06F40/166—Handling natural language data; Text processing; Editing, e.g. inserting or deleting
- G06F40/171—Editing, e.g. inserting or deleting by use of digital ink
- G06V30/333—Character recognition; Digital ink; Preprocessing; Feature extraction
- G06F17/24, G06F17/242, G06K9/00409 (legacy classification codes)
Definitions
- users provide input to touch display devices using a stylus.
- users may draw on the display device using a stylus in order to generate an ink object that is intended to be associated with an object.
- a user may annotate a picture by writing on or proximate the picture.
- the user may draw a line between multiple objects in order to show a relationship between the multiple objects.
- user input intended to manipulate an object that is associated with an ink object causes only the object to be modified, thereby destroying the intended relationship between the ink object and the one or more objects.
- an interactive canvas is displayed on one or more display devices of a computing device.
- An ink object is generated by digitizing ink input received to the interactive canvas.
- the ink object is anchored to the object such that a spatial relationship between the ink object and the object is maintained if the ink object or the object is manipulated.
- an interactive canvas is displayed on one or more display devices of a computing device.
- An ink object is generated by digitizing ink input received to the interactive canvas.
- anchor points on the ink object that are aligned with a corresponding anchor position within each of the multiple objects are determined.
- the anchor points are anchored on the ink object to the respective anchor positions within each of the multiple objects.
- the ink object is adjusted to maintain the alignment of the anchor points with the corresponding anchor positions within the multiple objects.
- an interactive canvas is displayed on one or more display devices of a computing device.
- An ink object is generated by digitizing ink input received to the interactive canvas.
- anchor points on the ink object are determined.
- the anchor points are aligned with a corresponding anchor position within each of the three or more objects.
- the anchor points on the ink object are anchored to the respective anchor positions within each of the three or more objects.
- the ink object is then divided into multiple segments such that each of the multiple segments overlaps two of the three or more objects.
- a respective one of the multiple segments of the ink object which overlaps the manipulated object is adjusted.
- FIG. 1 is an illustration of an environment in an example implementation that is operable to support techniques for ink anchoring discussed herein.
- FIG. 2 illustrates a system showing the ink anchoring module of FIG. 1 in more detail.
- FIGS. 3A-3C illustrate various examples of ink anchoring in accordance with one or more implementations.
- FIG. 4 illustrates a system in which the ink anchoring module 134 is configured to anchor an ink object to multiple objects.
- FIGS. 5A and 5B illustrate an example of ink anchoring for multiple objects in accordance with one or more implementations.
- FIGS. 6A and 6B illustrate an example of ink anchoring for three or more objects in accordance with one or more implementations.
- FIG. 7 is a flow diagram that describes steps in a method for ink anchoring in accordance with one or more implementations.
- FIG. 8 is a flow diagram that describes steps in an additional method for ink anchoring in accordance with one or more implementations.
- FIG. 9 is a flow diagram that describes steps in an additional method for ink anchoring in accordance with one or more implementations.
- FIG. 10 illustrates an example system that includes an example computing device that is representative of one or more computing systems and/or devices that may implement the various techniques described herein.
- the techniques described herein anchor an ink object corresponding to ink input that is received in “free form” to one or more objects that the ink object overlaps or intersects such that a spatial relationship between the ink object and the object is maintained if the ink object or the object is manipulated, such as by moving or re-sizing the object.
- the anchoring causes the ink object and the object to be maintained as separate objects by creating a “non-destructive link” between the ink object and the object, such that the object is not permanently edited by the overlapping ink object.
- the ink anchoring techniques are configured to anchor an ink object to multiple objects that the ink object overlaps or intersects.
- the complexity of representing and maintaining the relationship between the ink object and the multiple objects increases.
- an anchor point on the ink object that is aligned with an anchor position (e.g., an x,y position) within each of the multiple objects is determined.
- the ink object is adjusted in order to maintain the alignment of the anchor points with each respective anchor position within the objects.
- a geometric transform can be applied to the ink object in order to deform the ink object such that the anchor points are aligned with the corresponding anchor positions.
- the ink anchoring techniques discussed throughout can be applied to scenarios in which the ink object overlaps three or more objects.
- the ink object is divided into multiple segments such that each segment overlaps two of the three or more objects. For example, if the ink object overlaps a first, second, and third object, the ink object can be divided into a first segment of the ink object which overlaps the first and second objects, and a second segment of the ink object that overlaps the second and third objects.
- if the third object is moved, then the second segment between the second and third objects can be adjusted or deformed by a greater amount than the first segment, which is less affected by movement of the third object.
- the described techniques reduce user frustration which often occurs in conventional solutions when user input intended to manipulate an object that is associated with an ink object causes only the object to be modified, thereby destroying the intended relationship between the ink object and the one or more objects.
- the ink anchoring techniques improve the user experience by reducing the number of steps required to link an ink object with one or more objects.
- the user does not need to select both the ink object and the one or more objects, and then initiate a grouping command, in order to link the ink object with the one or more objects.
- the ink object is automatically anchored to the one or more objects in response to detecting an overlap, or a close proximity, between the ink object and the one or more objects which is indicative of user intent to anchor the ink object to the one or more objects.
- FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ techniques for ink anchoring discussed herein.
- Environment 100 includes a client device 102 which can be configured for mobile use, such as a mobile phone, a tablet computer, a wearable device, a handheld gaming device, a media player, and so on.
- the client device 102 is implemented as a “dual-display” device, and includes a display device 104 and a display device 106 that are connected to one another by a hinge 108 .
- the display device 104 includes a touch surface 110
- the display device 106 includes a touch surface 112 .
- the client device 102 also includes an input module 114 configured to process input received via one of the touch surfaces 110 , 112 and/or via the hinge 108 . While some of the techniques discussed herein will be described with reference to a dual-display device, it is to be appreciated that in some cases the techniques may also be implemented on a single-screen device, such as a mobile phone, tablet computer, media player, laptop computer, desktop computer, and so forth.
- the hinge 108 may allow the display devices 104 and 106 to fold back on each other to provide a “single display” device. As such, the techniques described herein may be designed to function whether the user is operating in a two-display mode or a single-display mode.
- while the dual-display device is illustrated with a hinge in this example, it is to be appreciated that in some cases the techniques may be implemented in single-display, dual-display, or multi-display devices without the hinge.
- the hinge 108 is configured to rotationally move about a longitudinal axis 116 of the hinge 108 to allow an angle between the display devices 104 , 106 to change. In this way, the hinge 108 allows the display devices 104 , 106 to be connected to one another yet be oriented at different angles and/or planar orientations relative to each other.
- the touch surfaces 110 , 112 may represent different portions of a single integrated and continuous display surface that can be bent along the hinge 108 .
- the client device 102 may range from a full-resource device with substantial memory and processor resources to a low-resource device with limited memory and/or processing resources. An example implementation of the client device 102 is discussed below with reference to FIG. 10.
- the client device 102 includes a variety of different functionalities that enable various activities and tasks to be performed.
- the client device 102 includes an operating system 118 , applications 120 , and a communication module 122 .
- the operating system 118 is representative of functionality for abstracting various system components of the client device 102 , such as hardware, kernel-level modules and services, and so forth.
- the operating system 118 can abstract various components (e.g., hardware, software, and firmware) of the client device 102 to enable interaction between the components and applications running on the client device 102 .
- the applications 120 are representative of functionality for performing different tasks via the client device 102 .
- the applications 120 represent a web browser, web platform, or other application that can be leveraged to browse websites over a network.
- the communication module 122 is representative of functionality for enabling the client device 102 to communicate over wired and/or wireless connections.
- the communication module 122 represents hardware and logic for communicating data via a variety of different wired and/or wireless technologies and protocols.
- the display devices 104 , 106 generally represent functionality for visual output for the client device 102 . Additionally, the display devices 104 , 106 represent functionality for receiving various types of input, such as touch input, stylus input, touchless proximity input, and so forth via one or more of the touch surfaces 110 , 112 , which can be used as visual output portions of the display devices 104 , 106 .
- the input module 114 is representative of functionality to enable the client device 102 to receive input (e.g., via input mechanisms 124 ) and to process and route the input in various ways.
- the input mechanisms 124 generally represent different functionalities for receiving input to the client device 102 , and include a digitizer 126 , touch input devices 128 , and analog input devices 130 .
- Examples of the input mechanisms 124 include gesture-sensitive sensors and devices (e.g., such as touch-based sensors), a stylus, a touch pad, accelerometers, a microphone with accompanying voice recognition software, and so forth.
- the input mechanisms 124 may be separate or integral with the display devices 104 , 106 ; integral examples include gesture-sensitive displays with integrated touch-sensitive sensors.
- the digitizer 126 represents functionality for converting various types of input to the display devices 104 , 106 , the touch input devices 128 , and the analog input devices 130 into digital data that can be used by the client device 102 in various ways.
- the analog input devices 130 represent hardware mechanisms (e.g., the hinge 108 ) that are usable to generate different physical quantities that represent data.
- the hinge 108 represents a mechanism that can be leveraged to generate input data by measurement of a physical variable, such as hinge angle of the hinge 108 .
- One or more sensors 132 can measure the hinge angle, and the digitizer 126 can convert such measurements into digital data usable by the client device 102 to perform operations on content displayed via the display devices 104, 106.
- the sensors 132 represent functionality for detecting different input signals received by the client device 102 .
- the sensors 132 can include one or more hinge sensors configured to detect a hinge angle between the display devices 104 , 106 .
- the sensors 132 can include grip sensors, such as touch sensors, configured to detect how a user is holding the client device 102 . Accordingly, a variety of different sensors 132 can be implemented to detect various different types of digital and/or analog input. These and other aspects are discussed in further detail below.
- the applications 120 represent a journal application which provides an interactive canvas representative of pages of a journal.
- a first page of the journal application can be presented on touch surface 110 of display device 104 while a second page of the journal application is presented on touch surface 112 of display device 106 .
- the user can then write and draw on the interactive canvas, as well as insert and/or manipulate various different objects.
- the applications 120 include or otherwise make use of an ink anchoring module 134 .
- the ink anchoring module 134 represents a standalone application. In other implementations, the ink anchoring module 134 is included as part of another application or system software, such as the operating system 118 .
- the ink anchoring module 134 is configured to anchor ink input that is received in “free form” (e.g., writing or drawing on a touch surface of a device using a stylus) to one or more objects that the ink object overlaps or intersects such that a spatial relationship between the ink object and the object is maintained if the ink object or the object is manipulated, such as by moving or re-sizing the object.
- the ink anchoring module 134 is configured to anchor an ink object to multiple objects that the ink object overlaps or intersects. In this scenario, when one or more of the objects is manipulated, the ink anchoring module 134 adjusts the ink object in order to maintain an alignment of anchor points on the ink object with respective anchor positions within the objects. Further discussion of this and other features is provided below.
- FIG. 2 illustrates a system 200 showing the ink anchoring module 134 in more detail.
- ink anchoring module 134 monitors ink input 202 to an interactive canvas 204 .
- a user can provide ink input 202 to an interactive canvas 204 by writing or drawing on the interactive canvas using a stylus or the user's finger.
- ink input 202 encompasses any type of input that can be provided to the interactive canvas using a stylus or the user's finger, including drawing strokes, drawing shapes, drawing lines, drawing pictures, writing, and so forth.
- the digitizer 126 creates an ink object 206 by digitizing and displaying a representation of the ink input 202 on the interactive canvas 204 .
- the interactive canvas 204 may also include one or more objects 208 .
- objects may include any type of content, such as images and photos, videos, audio files, text, symbols, drawings, and so forth.
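To make the later discussion concrete, the following is a minimal Python sketch of how an ink object (such as ink object 206) and a canvas object (such as object 208) might be represented. The class names, fields, and helpers are illustrative assumptions made for this write-up, not structures taken from the patent; the later sketches build on them.

```python
from dataclasses import dataclass, field
from typing import List, Tuple
import time
import uuid

Point = Tuple[float, float]  # an x,y position on the interactive canvas


@dataclass
class CanvasObject:
    """An object on the interactive canvas, e.g. an image, photo, video, or text."""
    object_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    x: float = 0.0            # top-left corner of the object on the canvas
    y: float = 0.0
    width: float = 0.0
    height: float = 0.0
    inserted_at: float = field(default_factory=time.time)

    def bounds(self) -> Tuple[float, float, float, float]:
        """Axis-aligned bounding box (x0, y0, x1, y1) of the object."""
        return (self.x, self.y, self.x + self.width, self.y + self.height)


@dataclass
class InkObject:
    """A digitized representation of free-form ink input (one or more strokes)."""
    object_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    strokes: List[List[Point]] = field(default_factory=list)
    created_at: float = field(default_factory=time.time)

    def points(self) -> List[Point]:
        """All stroke points in drawing order, flattened into one list."""
        return [p for stroke in self.strokes for p in stroke]
```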
- ink anchoring module 134 determines whether the ink object 206 overlaps one or more objects 208 displayed on the interactive canvas 204 . Generally, the ink anchoring module 134 determines that the ink object 206 overlaps an object 208 if at least a portion of the ink object 206 at least partially overlaps or intersects the object 208 .
- the ink anchoring module 134 anchors the ink object 206 to the object 208 such that the spatial relationship between the ink object 206 and the object 208 is maintained if either the ink object or object are manipulated.
- the ink anchoring module 134 may anchor the ink object to the object if the ink object 206 is within close spatial proximity to the object 208 without the ink object 206 actually overlapping the object 208.
- the ink anchoring module 134 may also factor in a time proximity between the ink object and the object in order to determine an overlap.
- the time proximity between inserting the object 208 and creating the ink object 206 combined with the close spatial proximity between the objects may cause the ink anchoring module 134 to anchor the ink object 206 to the object 208 .
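The overlap test described above, including the spatial-proximity and time-proximity relaxations, could plausibly be implemented as sketched below. It builds on the illustrative InkObject and CanvasObject classes from the earlier sketch; the pixel and time thresholds are arbitrary assumptions, not values from the patent.

```python
def _point_near_rect(point, rect, margin: float) -> bool:
    """True if the point lies inside the rectangle expanded by margin on every side."""
    x, y = point
    x0, y0, x1, y1 = rect
    return (x0 - margin) <= x <= (x1 + margin) and (y0 - margin) <= y <= (y1 + margin)


def should_anchor(ink: "InkObject", obj: "CanvasObject",
                  proximity_px: float = 20.0,
                  time_window_s: float = 30.0) -> bool:
    """Decide whether the ink object should be anchored to the canvas object.

    Anchoring is triggered if any ink point falls within the object's bounds,
    or if the ink merely comes close to the object and was drawn shortly after
    the object was inserted (spatial proximity combined with time proximity).
    """
    rect = obj.bounds()
    if any(_point_near_rect(p, rect, margin=0.0) for p in ink.points()):
        return True                                   # actual overlap / intersection
    near = any(_point_near_rect(p, rect, margin=proximity_px) for p in ink.points())
    recent = abs(ink.created_at - obj.inserted_at) <= time_window_s
    return near and recent                            # close in space and in time
```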
- FIGS. 3A to 3C which illustrate an example 300 of ink anchoring in accordance with one or more implementations.
- client device 102 displays an interactive canvas 302 on one or more displays.
- the interactive canvas 302 is presented on display devices 104 and 106 of a “dual-display” client device 102 , and is associated with a journal application.
- the interactive canvas 302 may be presented on a “single-display” device and/or associated with a different type of application.
- the journal application enables the user to take notes and/or draw on the interactive canvas 302 using an input device, such as a stylus 304 .
- the interactive canvas 302 includes an object 306 , which corresponds to a picture of a car.
- objects may include any type of content, such as images and photos, videos, audio files, text, symbols, drawings, and so forth.
- ink input is provided to interactive canvas 302 when the user draws a picture of the sun using stylus 304.
- the ink input is digitized and displayed on the interactive canvas 302 as an ink object 308 .
- ink anchoring module 134 determines that the ink object 308 overlaps the object 306, and thus anchors the ink object 308 to the object 306 such that the spatial relationship between the ink object 308 and the object 306 is maintained if either the ink object or object are manipulated, such as by moving or resizing the ink object or object.
- the user has moved object 306 to the upper right hand corner of interactive canvas 302 displayed on display device 106 .
- the user has selected object 306 using stylus 304 , and dragged the object 306 to the upper right corner of interactive canvas 302 .
- the user manipulation to move object 306 also causes the ink anchoring module 134 to maintain the spatial relationship between the objects by moving ink object 308 with the object 306 .
- anchoring the ink object to the object creates a non-destructive link 214 between the ink object 206 and the object 208 such that the object is not permanently edited by the overlapping ink object 206 .
- the user can remove the non-destructive link 214 at any time.
- the anchoring module 134 creates the non-destructive link 214 by maintaining separate object identifiers for each of the ink object 206 and the object 208. The separate object identifiers enable the ink object and the object to be accessed or searched for separately.
- For example, in FIG. 2 the non-destructive link is illustrated as including an ink object identifier 216 which identifies the ink object 206, and an object identifier 218 which identifies the object 208.
- the object identifiers each include a reference to each other.
- ink object identifier 216 is illustrated as including an object link 220 which links to the object 208.
- object identifier 218 is illustrated as including an ink object link 222 which links to the ink object 206.
- the links 220 and 222 enable the ink object 206 and object 208 to be treated as a single entity, while the separate identifiers 216 and 218 maintain independence between the objects.
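A minimal sketch of how such a non-destructive link might be stored follows, again building on the illustrative classes above: separate identifiers that cross-reference each other let the ink and the object be handled as a single entity while remaining independently addressable and removable. The registry and field names are assumptions made for illustration, not the patent's data model.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class NonDestructiveLink:
    """Separate, cross-referencing identifiers: the ink side holds links to the
    anchored objects, and each object holds a back-reference to the ink."""
    ink_object_id: str
    object_ids: List[str] = field(default_factory=list)


# Registries keyed by identifier, so the ink object and each object can still be
# looked up (or searched for) independently of one another.
ink_registry: Dict[str, "InkObject"] = {}
object_registry: Dict[str, "CanvasObject"] = {}
links_by_ink: Dict[str, NonDestructiveLink] = {}
ink_ids_by_object: Dict[str, List[str]] = {}     # back-references held per object


def anchor(ink: "InkObject", obj: "CanvasObject") -> None:
    """Create (or extend) the non-destructive link between the ink and the object."""
    link = links_by_ink.setdefault(ink.object_id, NonDestructiveLink(ink.object_id))
    if obj.object_id not in link.object_ids:
        link.object_ids.append(obj.object_id)
        ink_ids_by_object.setdefault(obj.object_id, []).append(ink.object_id)


def move_object(obj: "CanvasObject", dx: float, dy: float) -> None:
    """Move an object; in the single-object case, anchored ink simply moves with
    it so the spatial relationship is preserved (the multi-object case instead
    deforms the ink, as sketched later)."""
    obj.x += dx
    obj.y += dy
    for ink_id in ink_ids_by_object.get(obj.object_id, []):
        link = links_by_ink[ink_id]
        if len(link.object_ids) == 1:                # anchored to this object only
            ink = ink_registry[ink_id]
            ink.strokes = [[(x + dx, y + dy) for (x, y) in s] for s in ink.strokes]


def unlink(ink_object_id: str) -> None:
    """Remove the non-destructive link at any time; both sides remain intact."""
    link = links_by_ink.pop(ink_object_id, None)
    if link is not None:
        for oid in link.object_ids:
            if ink_object_id in ink_ids_by_object.get(oid, []):
                ink_ids_by_object[oid].remove(ink_object_id)
```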
- the ink anchoring module 134 is configured to anchor an ink object to multiple objects that the ink object overlaps or intersects. In this scenario, when one or more of the objects is manipulated, the ink anchoring module 134 adjusts the ink object in order to maintain an alignment of anchor points on the ink object with respective anchor positions within the objects.
- FIG. 4 which illustrates a system 400 in which the ink anchoring module 134 is configured to anchor an ink object to multiple objects.
- ink anchoring module 134 monitors ink input 402 to an interactive canvas 404 .
- a user can provide ink input 402 to an interactive canvas 404 by writing or drawing on the interactive canvas using a stylus or the user's finger.
- ink input 402 encompasses any type of input that can be provided to the interactive canvas using a stylus or the user's finger, including drawing strokes, drawing shapes, drawing lines, drawing pictures, writing, and so forth.
- the digitizer 126 creates an ink object 406 by digitizing and displaying a representation of the ink input 402 on the interactive canvas 404 .
- the interactive canvas 404 may also include one or more objects 408 .
- ink anchoring module 134 determines whether the ink object 406 overlaps multiple objects 408 on the interactive canvas 404 . Generally, the ink anchoring module 134 determines that the ink object 406 overlaps multiple objects if the ink object 406 overlaps or intersects two or more objects. However, as discussed above, an overlap or intersect may also be determined by the ink anchoring module 134 based on a spatial and/or time proximity between the ink object and the multiple objects.
- the ink anchoring module 134 anchors the ink object 406 to each of the multiple objects. To do so, the ink anchoring module determines anchor points on the ink object 406 that are aligned with corresponding anchor positions within the multiple objects 408. Subsequently, when one or more of the objects is manipulated, the ink anchoring module 134 adjusts the ink object 406 in order to maintain an alignment of anchor points on the ink object 406 with respective anchor positions within the objects 408.
- client device 102 displays an interactive canvas 502 on one or more displays.
- the interactive canvas 502 may be associated with a journal application and presented on display devices 104 and 106 of a “dual-display” client device 102 .
- the interactive canvas 502 may be presented on a “single-display” device and/or associated with a different type of application.
- the journal application enables the user to take notes and/or draw on the interactive canvas 502 using an input device, such as a stylus 504 .
- the interactive canvas 502 includes a first object 506 and a second object 508.
- objects may include any type of content, such as images and photos, videos, audio files, text, symbols, drawings, and so forth.
- Ink input is provided to interactive canvas 502 when the user draws a line with an arrow from object 506 to object 508 .
- the ink input is digitized and displayed on the interactive canvas 502 as an ink object 510 .
- ink anchoring module 134 determines that the ink object 510 overlaps both the first object 506 and the second object 508 .
- the ink anchoring module 134 determines a first anchor point 512 on the ink object 510 that is aligned with a first anchor position 514 on the first object 506 . Similarly, the ink anchoring module 134 determines a second anchor point 516 on the ink object 510 that is aligned with a second anchor position 518 on the second object 508 .
- the ink anchoring module 134 can determine the anchor points in a variety of different ways, such as based on a beginning or ending drawing stroke, the amount of ink within the object, and so forth.
- Ink anchoring module 134 then anchors the first anchor point 512 on the ink object 510 to the first anchor position 514 within the first object 506 and anchors the second anchor point 516 on the ink object 510 to the second anchor position 518 within the second object 508. Subsequently, in response to the first object 506 or the second object 508 being manipulated, the ink anchoring module 134 adjusts the ink object 510 to maintain the alignment of the first and second anchor points 512 and 516 with the first and second anchor positions 514 and 518, respectively.
- ink anchoring module 134 adjusts the ink object 510 in order to maintain the alignment of the first and second anchor points 512 and 516 on ink object 510 with the corresponding first and second anchor positions 514 and 518 within objects 506 and 508 , respectively.
- the ink anchoring module 134 has deformed the shape of the ink object 510 in order to maintain the alignment of anchor points 512 and 516 with anchor positions 514 and 518 .
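One plausible heuristic for choosing the anchor points, consistent with the "beginning or ending drawing stroke" criterion mentioned above, is sketched below; the patent does not mandate this particular rule, and the helper names are assumptions. The anchor position within each object is stored relative to that object's origin so it remains meaningful when the object is later moved or re-sized.

```python
from typing import Optional, Tuple

Point = Tuple[float, float]


def _inside(p: Point, rect) -> bool:
    x, y = p
    x0, y0, x1, y1 = rect
    return x0 <= x <= x1 and y0 <= y <= y1


def find_anchor_point(ink: "InkObject", obj: "CanvasObject") -> Optional[Point]:
    """Pick the anchor point on the ink object for one overlapped object.

    Heuristic: if the drawing stroke begins inside the object, anchor near the
    beginning of the stroke; otherwise anchor at the last ink point that lies
    inside the object (e.g. the arrowhead in FIG. 5A). Returns None if the ink
    never enters the object's bounds.
    """
    rect = obj.bounds()
    pts = ink.points()
    inside = [p for p in pts if _inside(p, rect)]
    if not inside:
        return None
    return inside[0] if _inside(pts[0], rect) else inside[-1]


def anchor_offset(anchor_point: Point, obj: "CanvasObject") -> Point:
    """Anchor position within the object, stored relative to the object's origin
    so that it stays valid when the object is later moved or re-sized."""
    return (anchor_point[0] - obj.x, anchor_point[1] - obj.y)
```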
- anchoring the ink object 406 to the multiple objects 408 creates a non-destructive link 414 between the ink object 406 and the multiple objects 408 such that the multiple objects 408 are not permanently edited by the overlapping ink object 406.
- the user can remove the non-destructive link 414 at any time.
- the anchoring module 134 creates the non-destructive link 414 by maintaining separate object identifiers for each of the ink object 406 and each of the multiple objects 408 . The separate object identifiers enable the ink object and each of the multiple objects to be accessed or searched for separately.
- the non-destructive link is illustrated as including an ink object identifier 416 which identifies the ink object 406, a first object identifier 418 which identifies a first object 408, and a second object identifier 420 which identifies a second object 408.
- if the ink object overlaps additional objects, additional object identifiers are created.
- the ink object identifier 416 provides default positioning information about the ink object 406 which can be understood by different types of applications and/or devices. This improves compatibility with different applications and devices which do not understand the ink anchoring scheme.
- the object identifiers each include a reference to each other.
- ink object identifier 416 is illustrated as including a first object link 422 and a second object link 424 which link to the first and second objects 408, respectively.
- first object identifier 418 and second object identifier 420 include ink object links 426 and 428 , respectively, which link to the ink object 406 .
- the links 422 , 424 , 426 , and 428 enable the ink object 406 and multiple objects 408 to be treated as a single entity, while the separate identifiers 416 , 418 , and 420 maintain independence between the ink object 406 and each object 408 .
- ink anchoring module 134 associates anchor points 430 and 432 of the ink object 406 with the ink object identifier 416, and associates anchor positions 434 and 436 (e.g., x,y position information) with the object identifiers 418 and 420, respectively.
- the ink anchoring module 134 looks up the object identifiers, and adjusts the ink object 406 in order to maintain an alignment of the anchor points 430 and 432 on the ink object 406 with respective anchor positions 434 and 436 .
- the ink anchoring module 134 changes the default positioning information of the ink object in order to align the anchor points with the x,y anchor positions within the objects.
- the ink anchoring module 134 adjusts the ink object 406 by applying a geometric transform to the ink object 406 in order to align the first and second anchor points with the corresponding first and second anchor positions.
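As one concrete choice of geometric transform for the two-object case, the sketch below computes a 2D similarity transform (rotation, uniform scale, and translation) that maps the two stored anchor points onto the two current anchor positions and applies it to every ink point. This is an illustrative assumption; the patent only requires that some geometric transform re-align the anchor points. It reuses the Point alias and InkObject class from the earlier sketches.

```python
def similarity_transform(a0, a1, b0, b1):
    """Return a mapping of canvas points such that a0 -> b0 and a1 -> b1.

    Points are treated as complex numbers; the result is a rotation, uniform
    scale, and translation (a 2D similarity). Assumes a0 != a1.
    """
    za0, za1 = complex(*a0), complex(*a1)
    zb0, zb1 = complex(*b0), complex(*b1)
    alpha = (zb1 - zb0) / (za1 - za0)     # rotation + uniform scaling
    beta = zb0 - alpha * za0              # translation

    def apply(p):
        z = alpha * complex(*p) + beta
        return (z.real, z.imag)

    return apply


def adjust_ink(ink: "InkObject", anchors_old, anchors_new) -> None:
    """Deform the ink object so that its two anchor points stay aligned with the
    (possibly moved) anchor positions within the two anchored objects."""
    (a0, a1), (b0, b1) = anchors_old, anchors_new
    transform = similarity_transform(a0, a1, b0, b1)
    ink.strokes = [[transform(p) for p in stroke] for stroke in ink.strokes]
```

In this sketch the current anchor positions would be recomputed from each object's stored anchor offset plus its present location whenever an object is moved or re-sized, which is what drags the ink deformation along with the object.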
- the ink anchoring techniques discussed throughout can be applied to scenarios in which the ink object overlaps three or more objects.
- the ink anchoring module 134 divides the ink object into segments, where each segment overlaps two of the three or more objects. For example, if the ink object overlaps a first, second, and third object, the ink anchoring module can divide the ink object into a first segment of the ink object which overlaps the first and second objects, and a second segment of the ink object that overlaps the second and third objects.
- if the third object is moved, then the second segment between the second and third objects can be adjusted or deformed by a greater amount than the first segment, which is less affected by movement of the third object.
- FIGS. 6A and 6B illustrate an example 600 of ink anchoring for three or more objects in accordance with one or more implementations.
- client device 102 displays an interactive canvas 602 on one or more display devices.
- the interactive canvas 602 may be associated with a journal application and presented on display devices 104 and 106 of a “dual-display” client device 102 .
- the interactive canvas 602 may be displayed on a “single-display” device and/or associated with a different type of application.
- the journal application enables the user to take notes and/or draw on the interactive canvas 602 using an input device, such as a stylus 604 .
- the interactive canvas 602 includes multiple objects, including a first object 606 , a second object 608 , and a third object 610 .
- objects may include any type of content, such as images and photos, videos, audio files, text, symbols, drawings, and so forth.
- Ink input is provided to interactive canvas 602 when the user draws a line with an arrow from first object 606 , through second object 608 , and ending at third object 610 .
- the ink input is digitized and displayed on the interactive canvas 602 as an ink object 612 .
- ink anchoring module 134 determines that the ink object 612 overlaps three or more objects, which in this example includes the first object 606 , the second object 608 , and the third object 610 .
- the ink anchoring module 134 determines anchor points on the ink object that are aligned with a corresponding anchor position within each of the three or more objects. For example, the ink anchoring module 134 determines a first anchor point 614 on the ink object 612 that is aligned with a first anchor position 616 on the first object 606. Similarly, the ink anchoring module 134 determines a second anchor point 618 on the ink object 612 that is aligned with a second anchor position 620 on the second object 608. Similarly, the ink anchoring module 134 determines a third anchor point 622 on the ink object 612 that is aligned with a third anchor position 624 on the third object 610. The ink anchoring module 134 can determine the anchor points in a variety of different ways, such as based on a beginning or ending drawing stroke, the amount of ink within the object, and so forth.
- Ink anchoring module 134 then anchors the anchor points on the ink object to the respective anchor positions within each of the three or more objects. For example, ink anchoring module 134 anchors the first anchor point 614 on the ink object 612 to the first anchor position 616 within the first object 606, anchors the second anchor point 618 on the ink object 612 to the second anchor position 620 within the second object 608, and anchors the third anchor point 622 on the ink object 612 to the third anchor position 624 within the third object 610.
- the ink anchoring module 134 divides the ink object 612 into multiple segments such that each of the multiple segments overlaps two of the three or more objects. For example, in FIG. 6A, ink anchoring module 134 divides ink object 612 into a first segment 626 which overlaps both first object 606 and second object 608, and a second segment 628 which overlaps both second object 608 and third object 610.
- the ink anchoring module 134 adjusts a respective segment of the ink object which overlaps the manipulated object. For example, in FIG. 6B the user has manipulated third object 610 by moving the third object 610 to the bottom right hand corner of interactive canvas 602 . To do so, the user has selected third object 610 using stylus 604 , and dragged the third object 610 to the lower right corner of interactive canvas 602 . In response to the manipulation of third object 610 , ink anchoring module 134 adjusts the second segment 628 of ink object 612 which overlaps the manipulated third object 610 .
- ink anchoring module 134 has adjusted the second segment 628 by deforming the shape of the second segment 628 in order to maintain the alignment of anchor points 618 and 622 with corresponding anchor positions 620 and 624 within the respective second and third objects 608 and 610 .
- ink anchoring module 134 adjusts the respective segment of the ink object which overlaps the manipulated object without adjusting at least one of the multiple segments. For example, in FIG. 6B, ink anchoring module 134 adjusts second segment 628 without adjusting the first segment 626. Alternately, in some cases the ink anchoring module may adjust the respective segment of the ink object by a greater amount than an adjustment of at least one of the multiple segments.
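A sketch of this segmentation behavior for three or more objects follows, reusing the similarity_transform helper from the previous sketch. The flattened ink points are split at the anchor points so that segment i overlaps objects i and i+1, and only segments whose end anchors belong to the manipulated object are adjusted; the splitting rule and parameter names are assumptions for illustration, not the patent's required algorithm.

```python
from typing import List


def split_at_anchors(points: List["Point"], anchor_indices: List[int]) -> List[List["Point"]]:
    """Divide the flattened ink points into segments so that segment i runs from
    anchor point i to anchor point i+1 and therefore overlaps objects i and i+1."""
    return [points[start:end + 1]
            for start, end in zip(anchor_indices, anchor_indices[1:])]


def adjust_for_moved_object(segments: List[List["Point"]],
                            anchors_old: List["Point"],
                            anchors_new: List["Point"],
                            moved_index: int) -> List[List["Point"]]:
    """Adjust only the segment(s) whose end anchors belong to the moved object,
    leaving the other segments untouched (e.g. in FIG. 6B only the second
    segment is deformed; the first segment is not adjusted)."""
    adjusted = []
    for i, segment in enumerate(segments):
        touches_moved = moved_index in (i, i + 1)    # segment i spans objects i and i+1
        if touches_moved:
            t = similarity_transform(anchors_old[i], anchors_old[i + 1],
                                     anchors_new[i], anchors_new[i + 1])
            adjusted.append([t(p) for p in segment])
        else:
            adjusted.append(segment)
    return adjusted
```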
- an ink object may be linked to an object being transferred to another container, as well as objects that are not being transferred.
- the ink anchoring module may be configured to break a link between the ink object and either the objects being transferred to the new container or the objects remaining in the original container.
- the ink anchoring module may be configured to maintain the link across container boundaries. In this instance, a robust reference system may be utilized to maintain links across containers.
- the following discussion describes some example procedures in accordance with one or more implementations.
- the example procedures may be employed in the environment 100 of FIG. 1 , the system 1000 of FIG. 10 , and/or any other suitable environment.
- the procedures, for instance, represent example procedures for implementation of the scenarios described above.
- the steps described for the various procedures can be implemented automatically and independently of user interaction.
- FIG. 7 is a flow diagram that describes steps in a method for ink anchoring in accordance with one or more implementations.
- an interactive canvas is displayed on the one or more display devices of the computing device.
- client device 102 displays an interactive canvas 302 on one or more displays.
- an ink object is generated by digitizing ink input received to the interactive canvas.
- ink input is provided to interactive canvas 302 when the user draws a picture of the sun using stylus 304.
- the ink input is digitized and displayed on the interactive canvas 302 as an ink object 308.
- ink anchoring module 134 determines that the ink object 308 overlaps the object 306, and thus anchors the ink object 308 to the object 306 such that the spatial relationship between the ink object 308 and the object 306 is maintained if either the ink object or object are manipulated, such as by moving or resizing the ink object or object.
- FIG. 8 is a flow diagram that describes steps in an additional method for ink anchoring in accordance with one or more implementations.
- an interactive canvas is displayed on the one or more display devices of the computing device.
- client device 102 displays an interactive canvas 502 on one or more displays.
- an ink object is generated by digitizing ink input received to the interactive canvas.
- ink input is provided to interactive canvas 502 when the user draws a line with an arrow from object 506 to object 508 .
- the ink input is digitized and displayed on the interactive canvas 502 as an ink object 510 .
- the ink anchoring module 134 determines that the ink object 510 overlaps both the first object 506 and the second object 508 .
- anchor points on the ink object that are aligned with a corresponding anchor position within each of the multiple objects are determined. For example, the ink anchoring module 134 determines a first anchor point 512 on the ink object 510 that is aligned with a first anchor position 514 on the first object 506. Similarly, the ink anchoring module 134 determines a second anchor point 516 on the ink object 510 that is aligned with a second anchor position 518 on the second object 508. The ink anchoring module 134 can determine the anchor points in a variety of different ways, such as based on a beginning or ending drawing stroke, the amount of ink within the object, and so forth.
- the anchor points on the ink object are anchored to the respective anchor positions within each of the multiple objects.
- ink anchoring module 134 anchors the first anchor point 512 on the ink object 510 to the first anchor position 514 within the first object 506 and anchors the second anchor point 516 on the ink object 510 to the second anchor position 518 within the second object 508 .
- the ink object is adjusted to maintain the alignment of the anchor points with the corresponding anchor positions within the multiple objects. For example, in FIG. 5B the user has manipulated second object 508 by moving the second object 508 to the upper right hand corner of interactive canvas 502 . To do so, the user has selected object 508 using stylus 504 , and dragged the second object 508 to the upper right corner of interactive canvas 502 .
- ink anchoring module 134 adjusts the ink object 510 in order to maintain the alignment of the first and second anchor points 512 and 516 on ink object 510 with the corresponding first and second anchor positions 514 and 518 within objects 506 and 508 , respectively.
- the ink anchoring module 134 has deformed the shape of the ink object 510 in order to maintain the alignment of anchor points 512 and 516 with anchor positions 514 and 518 .
- FIG. 9 is a flow diagram that describes steps in an additional method for ink anchoring in accordance with one or more implementations.
- an interactive canvas is displayed on the one or more display devices of a computing device.
- client device 102 displays an interactive canvas 602 on one or more display devices.
- an ink object is generated by digitizing ink input received to the interactive canvas.
- ink input is provided to interactive canvas 602 when the user draws a line with an arrow from a first object 606 , through a second object 608 , and ending at a third object 610 .
- the ink input is digitized and displayed on the interactive canvas 602 as an ink object 612 .
- the ink anchoring module 134 determines that the ink object 612 overlaps the first object 606 , the second object 608 , and the third object 610 .
- anchor points on the ink object that are aligned with a corresponding anchor position within each of the three or more objects are determined. For example, the ink anchoring module 134 determines a first anchor point 614 on the ink object 612 that is aligned with a first anchor position 616 on the first object 606. Similarly, the ink anchoring module 134 determines a second anchor point 618 on the ink object 612 that is aligned with a second anchor position 620 on the second object 608. Similarly, the ink anchoring module 134 determines a third anchor point 622 on the ink object 612 that is aligned with a third anchor position 624 on the third object 610. The ink anchoring module 134 can determine the anchor points in a variety of different ways, such as based on a beginning or ending drawing stroke, the amount of ink within the object, and so forth.
- the anchor points on the ink object are anchored to the respective anchor positions within each of the three or more objects.
- ink anchoring module 134 anchors the first anchor point 614 on the ink object 612 to the first anchor position 616 within the first object 606, anchors the second anchor point 618 on the ink object 612 to the second anchor position 620 within the second object 608, and anchors the third anchor point 622 on the ink object 612 to the third anchor position 624 within the third object 610.
- the ink object is divided into multiple segments such that each of the multiple segments is associated with two of the three or more objects.
- ink anchoring module 134 divides ink object 612 into a first segment 626 which overlaps both first object 606 and second object 608, and a second segment 628 which overlaps both second object 608 and third object 610.
- a respective one of the multiple segments of the ink object which overlaps the manipulated object is adjusted. For example, as illustrated in FIG. 6B, the user has manipulated third object 610 by moving the third object 610 to the bottom right hand corner of interactive canvas 602. To do so, the user has selected third object 610 using stylus 604, and dragged the third object 610 to the lower right corner of interactive canvas 602. In response to the manipulation of third object 610, ink anchoring module 134 adjusts the second segment 628 of ink object 612 which overlaps the manipulated third object 610.
- ink anchoring module 134 has adjusted the second segment 628 by deforming the shape of the second segment 628 in order to maintain the alignment of anchor points 618 and 622 with corresponding anchor positions 620 and 624 within the respective second and third objects 608 and 610 .
- ink anchoring module 134 adjusts the respective segment of the ink object which overlaps the manipulated object without adjusting at least one of the multiple segments. For example, as illustrated in FIG. 6B, ink anchoring module 134 adjusts second segment 628 without adjusting the first segment 626. Alternately, in some cases the ink anchoring module may adjust the respective segment of the ink object by a greater amount than an adjustment of at least one of the multiple segments.
- FIG. 10 illustrates an example system generally at 1000 that includes an example computing device 1002 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein.
- the computing device 1002 represents an implementation of the client device 102 discussed above, such as a dual-display device.
- the computing device 1002 may, for example, be configured to assume a mobile configuration through use of a housing formed and sized to be grasped and carried by one or more hands of a user, illustrated examples of which include a mobile phone, mobile game and music device, and tablet computer although other examples are also contemplated.
- the client device 102 may be implemented as a wearable device, such as a smart watch, smart glasses, a dual-surface gesture-input peripheral for a computing device, and so forth.
- the example computing device 1002 as illustrated includes a processing system 1004, one or more computer-readable media 1006, and one or more I/O interfaces 1008 that are communicatively coupled, one to another.
- the computing device 1002 may further include a system bus or other data and command transfer system that couples the various components, one to another.
- a system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
- a variety of other examples are also contemplated, such as control and data lines.
- the processing system 1004 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 1004 is illustrated as including hardware element 1010 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors.
- the hardware elements 1010 are not limited by the materials from which they are formed or the processing mechanisms employed therein.
- processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)).
- processor-executable instructions may be electronically-executable instructions.
- the computer-readable storage media 1006 is illustrated as including memory/storage 1012 .
- the memory/storage 1012 represents memory/storage capacity associated with one or more computer-readable media.
- the memory/storage component 1012 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth).
- the memory/storage component 1012 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth).
- the computer-readable media 1006 may be configured in a variety of other ways as further described below.
- Input/output interface(s) 1008 are representative of functionality to allow a user to enter commands and information to computing device 1002 , and also allow information to be presented to the user and/or other components or devices using various input/output devices.
- input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth.
- Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth.
- the computing device 1002 may be configured in a variety of ways to support user interaction.
- modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types.
- the modules generally represent software, firmware, hardware, or a combination thereof.
- the features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
- Computer-readable media may include a variety of media that may be accessed by the computing device 1002 .
- computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”
- Computer-readable storage media may refer to media and/or devices that enable persistent storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media and does not include signals per se.
- the computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data.
- Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.
- Computer-readable signal media may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 1002 , such as via a network.
- Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism.
- Signal media also include any information delivery media.
- modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.
- hardware elements 1010 and computer-readable media 1006 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some implementations to implement at least some aspects of the techniques described herein, such as to perform one or more instructions.
- Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware.
- ASIC application-specific integrated circuit
- FPGA field-programmable gate array
- CPLD complex programmable logic device
- hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware as well as a hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously.
- software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 1010 .
- the computing device 1002 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the computing device 1002 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 1010 of the processing system 1004 .
- the instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 1002 and/or processing systems 1004 ) to implement techniques, modules, and examples described herein.
- Example implementations described herein include, but are not limited to, one or any combinations of one or more of the following examples:
- a computing device comprises: one or more display devices; at least one processor; and at least one computer-readable storage media storing instructions that are executable by the at least one processor to: display an interactive canvas on the one or more display devices of the computing device; generate an ink object by digitizing ink input received to the interactive canvas; determine that the ink object overlaps an object in the interactive canvas; and anchor the ink object to the object such that a spatial relationship between the ink object and the object is maintained if the ink object or the object is manipulated.
- the one or more objects comprise one or more images, text, videos, or audio files.
- the computing device comprises a dual-display device comprising a first display device and a second display device.
- a method implemented by a computing device comprises: displaying an interactive canvas on the one or more display devices of the computing device; generating an ink object by digitizing ink input received to the interactive canvas; determining that the ink object overlaps an object in the interactive canvas; and anchoring the ink object to the object such that a spatial relationship between the ink object and the object is maintained if the ink object or the object is manipulated.
- a method implemented by a computing device comprises: displaying an interactive canvas on one or more display devices of a computing device; generating an ink object by digitizing ink input received to the interactive canvas; determining that the ink object overlaps multiple objects in the interactive canvas; determining anchor points on the ink object that are aligned with a corresponding anchor position within each of the multiple objects; anchoring the anchor points on the ink object to the respective anchor positions within each of the multiple objects; and in response to at least one of the multiple objects being manipulated, adjusting the ink object to maintain the alignment of the anchor points with the corresponding anchor positions within the multiple objects.
- the adjusting comprises applying a geometric transform to the ink object to maintain the alignment of the anchor points with the corresponding anchor positions within the multiple objects.
- A non-destructive link is created by maintaining separate object identifiers for the ink object and each of the multiple objects.
- the separate object identifiers include an ink object identifier which identifies the ink object and object identifiers which identify each of the multiple objects, wherein the ink object identifier includes links to each of the multiple objects, and wherein each object identifier includes a link to the ink object.
- The ink object identifier provides default positioning information about the ink object which can be understood by different types of applications or devices to improve compatibility with the different applications or devices.
- a computing device comprises: one or more display devices; at least one processor; and at least one computer-readable storage media storing instructions that are executable by the at least one processor to: display an interactive canvas on the one or more display devices of the computing device; generate an ink object by digitizing ink input received to the interactive canvas; determine that the ink object overlaps three or more objects in the interactive canvas; determine anchor points on the ink object that are aligned with a corresponding anchor position within each of the three or more objects; anchor the anchor points on the ink object to the respective anchor positions within each of the three or more objects; divide the ink object into multiple segments such that each of the multiple segments overlaps two of the three or more objects; and in response to at least one of the three or more objects being manipulated, adjust a respective one of the multiple segments of the ink object which overlaps the manipulated object.
- the adjusting comprises applying a geometric transform to the respective segment of the ink object to maintain the alignment of the anchor points with the corresponding anchor positions within the respective two objects of the respective segment.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- This application claims priority to U.S. Provisional Application No. 62/506,519, filed May 15, 2017, entitled “Ink Anchoring”, the disclosure of which is hereby incorporated by reference herein in its entirety.
- Increasingly, users provide input to touch display devices using a stylus. In some creative applications, such as a journal application or a drawing application, users may draw on the display device using a stylus in order to generate an ink object that is intended to be associated with an object. For example, a user may annotate a picture by writing on or proximate the picture. As another example, the user may draw a line between multiple objects in order to show a relationship between the multiple objects. In conventional solutions, user input intended to manipulate an object that is associated with an ink object causes only the object to be modified, thereby destroying the intended relationship between the ink object and the one or more objects.
- Techniques for ink anchoring are described. In one or more implementations, an interactive canvas is displayed on one or more display devices of a computing device. An ink object is generated by digitizing ink input received to the interactive canvas. In response to determining that the ink object overlaps an object in the interactive canvas, the ink object is anchored to the object such that a spatial relationship between the ink object and the object is maintained if the ink object or the object is manipulated.
- In one or more implementations, an interactive canvas is displayed on one or more display devices of a computing device. An ink object is generated by digitizing ink input received to the interactive canvas. In response to determining that the ink object overlaps multiple objects in the interactive canvas, anchor points on the ink object that are aligned with a corresponding anchor position within each of the multiple objects are determined. Next, the anchor points are anchored on the ink object to the respective anchor positions within each of the multiple objects. In response to at least one of the multiple objects being manipulated, the ink object is adjusted to maintain the alignment of the anchor points with the corresponding anchor positions within the multiple objects.
- In one or more implementations, an interactive canvas is displayed on one or more display devices of a computing device. An ink object is generated by digitizing ink input received to the interactive canvas. In response to determining that the ink object overlaps three or more objects in the interactive canvas, anchor points on the ink object are determined. The anchor points are aligned with a corresponding anchor position within each of the three or more objects. Next, the anchor points on the ink object are anchored to the respective anchor positions within each of the three or more objects. The ink object is then divided into multiple segments such that each of the multiple segments overlaps two of the three or more objects. In response to at least one of the three or more objects being manipulated, a respective one of the multiple segments of the ink object which overlaps the manipulated object is adjusted.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
-
FIG. 1 is an illustration of an environment in an example implementation that is operable to support techniques for ink anchoring discussed herein. -
FIG. 2 illustrates a system showing the ink anchoring module ofFIG. 1 in more detail. -
FIGS. 3A-3C illustrate various examples of ink anchoring in accordance with one or more implementations. -
FIG. 4 illustrates a system in which theink anchoring module 134 is configured to anchor an ink object to multiple objects. -
FIGS. 5A and 5B illustrate an example of ink anchoring for multiple objects in accordance with one or more implementations. -
FIGS. 6A and 6B illustrate an example of ink anchoring for three or more objects in accordance with one or more implementations. -
FIG. 7 is a flow diagram that describes steps in a method for ink anchoring in accordance with one or more implementations. -
FIG. 8 is a flow diagram that describes steps in an additional method for ink anchoring in accordance with one or more implementations. -
FIG. 9 is a flow diagram that describes steps in an additional method for ink anchoring in accordance with one or more implementations. -
FIG. 10 illustrates an example system that includes an example computing device that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. - Techniques for ink anchoring are described. Generally, the techniques described herein anchor an ink object corresponding to ink input that is received in “free form” to one or more objects that the ink object overlaps or intersects such that a spatial relationship between the ink object and the object is maintained if the ink object or the object is manipulated, such as by moving or re-sizing the object. The anchoring causes the ink object and the object to be maintained as separate objects by creating a “non-destructive link” between the ink object and the object, such that the object is not permanently edited by the overlapping ink object.
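- As a non-authoritative illustration of this single-object case, the following TypeScript sketch models the "non-destructive link" as a pair of cross-referencing identifiers and maintains the spatial relationship by re-applying an object's translation to the anchored ink. All names (InkObject, CanvasObject, anchorInk, moveObject) are assumptions introduced for this sketch and are not part of the described implementation.

```typescript
interface Point { x: number; y: number; }

// The ink and the object keep separate identifiers; the anchor is only a pair of links.
interface CanvasObject {
  id: string;
  position: Point;
  inkLinks: string[];            // ids of ink objects anchored to this object
}

interface InkObject {
  id: string;
  points: Point[];               // digitized ink strokes in canvas coordinates
  objectLinks: string[];         // ids of objects this ink is anchored to
}

// Create the non-destructive link: neither item is edited, each only references the other.
function anchorInk(ink: InkObject, obj: CanvasObject): void {
  if (!ink.objectLinks.includes(obj.id)) ink.objectLinks.push(obj.id);
  if (!obj.inkLinks.includes(ink.id)) obj.inkLinks.push(ink.id);
}

// The link can be removed at any time without altering the ink or the object.
function removeAnchor(ink: InkObject, obj: CanvasObject): void {
  ink.objectLinks = ink.objectLinks.filter(id => id !== obj.id);
  obj.inkLinks = obj.inkLinks.filter(id => id !== ink.id);
}

// Manipulating the object moves any anchored ink by the same delta, so the
// spatial relationship between the two is maintained.
function moveObject(obj: CanvasObject, dx: number, dy: number, inks: Map<string, InkObject>): void {
  obj.position = { x: obj.position.x + dx, y: obj.position.y + dy };
  for (const inkId of obj.inkLinks) {
    const ink = inks.get(inkId);
    if (!ink) continue;
    ink.points = ink.points.map(p => ({ x: p.x + dx, y: p.y + dy }));
  }
}
```

Because the ink and the object remain separate records in this sketch, either one can still be selected, searched for, or exported on its own, and deleting the link simply removes the references.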
- In one or more implementations, the ink anchoring techniques are configured to anchor an ink object to multiple objects that the ink object overlaps or intersects. Notably, as the number of objects that are intersected by the ink object increases, the complexity of representing and maintaining the relationship between the ink object and the multiple objects increases. Thus, for each object, an anchor point on the ink object that is aligned with an anchor position (e.g., an x,y position) within the object is determined. Subsequently, when one or more of the objects is manipulated, the ink object is adjusted in order to maintain the alignment of the anchor points with each respective anchor position within the objects. For example, to adjust the ink object, a geometric transform can be applied to the ink object in order to deform the ink object such that the anchor points are aligned with the corresponding anchor positions.
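- The disclosure leaves the exact geometric transform open. One concrete possibility, offered here only as a hedged sketch under assumed names, is to derive a similarity transform (translation, rotation, and uniform scale) from the two anchor-point correspondences and apply it to every point of the ink object.

```typescript
interface Point { x: number; y: number; }

// Map the two old anchor-point locations onto their new locations with a similarity
// transform, and apply that same transform to every ink point. Both anchor points end
// up exactly aligned with their corresponding anchor positions. Assumes the two old
// anchor points are distinct.
function deformInk(inkPoints: Point[],
                   oldA: Point, oldB: Point,
                   newA: Point, newB: Point): Point[] {
  const u = { x: oldB.x - oldA.x, y: oldB.y - oldA.y };   // old anchor-to-anchor vector
  const v = { x: newB.x - newA.x, y: newB.y - newA.y };   // new anchor-to-anchor vector
  const scale = Math.hypot(v.x, v.y) / Math.hypot(u.x, u.y);
  const angle = Math.atan2(v.y, v.x) - Math.atan2(u.y, u.x);
  const cos = Math.cos(angle) * scale;
  const sin = Math.sin(angle) * scale;
  return inkPoints.map(p => {
    const dx = p.x - oldA.x;
    const dy = p.y - oldA.y;
    return { x: newA.x + dx * cos - dy * sin,
             y: newA.y + dx * sin + dy * cos };
  });
}
```

With more than two anchor correspondences, an affine or least-squares fit could play the same role; the particular transform is an implementation choice.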
- In one or more implementations, the ink anchoring techniques discussed throughout can be applied to scenarios in which the ink object overlaps three or more objects. In this scenario, the ink object is divided into multiple segments such that each segment overlaps two of the three or more objects. For example, if the ink object overlaps a first, second, and third object, the ink object can be divided into a first segment of the ink object which overlaps the first and second objects, and a second segment of the ink object that overlaps the second and third objects. In this scenario, if the third object is moved, then the second segment between the second and third objects can be adjusted or deformed a greater amount than the adjustment of the first segment which is less affected by movement of the third object.
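- The three-or-more-objects behavior can be sketched, again with assumed names and data shapes rather than the disclosed implementation, by splitting the ink at its interior anchor points and then adjusting only the segments whose pair of objects includes the one that was manipulated.

```typescript
interface Point { x: number; y: number; }

interface InkSegment {
  points: Point[];
  objectIds: [string, string];     // the two objects this segment overlaps
}

// Split the ink stroke at consecutive anchor points (given as indices into the stroke),
// producing one segment per adjacent pair of overlapped objects.
function segmentInk(inkPoints: Point[],
                    anchors: { objectId: string; index: number }[]): InkSegment[] {
  const ordered = [...anchors].sort((a, b) => a.index - b.index);
  const segments: InkSegment[] = [];
  for (let k = 0; k + 1 < ordered.length; k++) {
    segments.push({
      // include both anchor points so adjacent segments stay connected
      points: inkPoints.slice(ordered[k].index, ordered[k + 1].index + 1),
      objectIds: [ordered[k].objectId, ordered[k + 1].objectId],
    });
  }
  return segments;
}

// When one object is manipulated, adjust only the segments that overlap it; segments
// between unmoved objects are left untouched (or could be deformed by a smaller amount).
function adjustSegments(segments: InkSegment[],
                        manipulatedId: string,
                        newAnchorFor: (objectId: string) => Point,
                        deform: (pts: Point[], oldA: Point, oldB: Point,
                                 newA: Point, newB: Point) => Point[]): void {
  for (const seg of segments) {
    if (!seg.objectIds.includes(manipulatedId)) continue;
    const [idA, idB] = seg.objectIds;
    const oldA = seg.points[0];
    const oldB = seg.points[seg.points.length - 1];
    seg.points = deform(seg.points, oldA, oldB, newAnchorFor(idA), newAnchorFor(idB));
  }
}
```

Here the deform callback could be the similarity-transform sketch shown earlier, applied per segment rather than to the whole ink object.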
- Thus, the described techniques reduce user frustration which often occurs in conventional solutions when user input intended to manipulate an object that is associated with an ink object causes only the object to be modified, thereby destroying the intended relationship between the ink object and the one or more objects. In addition, the ink anchoring techniques improve the user experience by reducing the number of steps required to link an ink object with one or more objects. For example, unlike conventional solutions, the user does not need to select both the ink object and the one or more objects, and then initiate a grouping command in order to link the ink object with the one or more objects. Instead, the ink object is automatically anchored to the one or more objects in response to detecting an overlap, or a close proximity, between the ink object and the one or more objects which is indicative of user intent to anchor the ink object to the one or more objects.
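- The automatic anchoring decision itself reduces to a small heuristic. The sketch below combines a bounding-box overlap test with spatial and temporal proximity thresholds; the threshold values, function names, and the use of bounding boxes are illustrative assumptions, not values taken from the disclosure.

```typescript
interface Rect { x: number; y: number; width: number; height: number; }

// Axis-aligned bounding-box intersection.
function intersects(a: Rect, b: Rect): boolean {
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}

// Smallest gap between two rectangles (0 when they overlap).
function gapBetween(a: Rect, b: Rect): number {
  const dx = Math.max(0, b.x - (a.x + a.width), a.x - (b.x + b.width));
  const dy = Math.max(0, b.y - (a.y + a.height), a.y - (b.y + b.height));
  return Math.hypot(dx, dy);
}

// Anchor automatically when the ink overlaps the object, or when it is drawn close to the
// object both in space and in time, which is taken as indicative of user intent.
function shouldAutoAnchor(inkBounds: Rect, objectBounds: Rect,
                          inkDrawnAt: number, objectInsertedAt: number,
                          maxGapPx = 24, maxDeltaMs = 5000): boolean {
  if (intersects(inkBounds, objectBounds)) return true;
  const nearInSpace = gapBetween(inkBounds, objectBounds) <= maxGapPx;
  const nearInTime = Math.abs(inkDrawnAt - objectInsertedAt) <= maxDeltaMs;
  return nearInSpace && nearInTime;
}
```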
-
FIG. 1 is an illustration of anenvironment 100 in an example implementation that is operable to employ techniques for ink anchoring discussed herein.Environment 100 includes aclient device 102 which can be configured for mobile use, such as a mobile phone, a tablet computer, a wearable device, a handheld gaming device, a media player, and so on. In this example, theclient device 102 is implemented as a “dual-display” device, and includes adisplay device 104 and adisplay device 106 that are connected to one another by ahinge 108. Thedisplay device 104 includes atouch surface 110, and thedisplay device 106 includes atouch surface 112. Theclient device 102 also includes aninput module 114 configured to process input received via one of the touch surfaces 110, 112 and/or via thehinge 108. While some of the techniques discussed herein will be described with reference to a dual-display device, it is to be appreciated that in some cases the techniques may also be implemented on a single-screen device, such as a mobile phone, tablet computer, media player, laptop computer, desktop computer, and so forth. In addition, thehinge 108 may allow the 104 and 106 to fold back on each other to provide a “single display” device. As such, the techniques described herein may be designed to function whether the user is operating in a two-display mode or a single-display mode. In addition, while the dual display device is illustrated with a hinge in this example, it is to be appreciated that in some cases the techniques may be implemented in single display, dual-display, or multi-display devices without the hinge.display devices - The
hinge 108 is configured to rotationally move about alongitudinal axis 116 of thehinge 108 to allow an angle between the 104, 106 to change. In this way, thedisplay devices hinge 108 allows the 104, 106 to be connected to one another yet be oriented at different angles and/or planar orientations relative to each other. In at least some implementations, the touch surfaces 110, 112 may represent different portions of a single integrated and continuous display surface that can be bent along thedisplay devices hinge 108. - While implementations presented herein are discussed in the context of a mobile device, it is to be appreciated that various other types and form factors of devices may be utilized in accordance with the claimed implementations. Thus, the
client device 102 may range from full resource devices with substantial memory and processor resources, to a low-resource device with limited memory and/or processing resources. An example implementation of theclient device 102 is discussed below with reference toFIG. 10 . - The
client device 102 includes a variety of different functionalities that enable various activities and tasks to be performed. For instance, theclient device 102 includes anoperating system 118,applications 120, and acommunication module 122. Generally, theoperating system 118 is representative of functionality for abstracting various system components of theclient device 102, such as hardware, kernel-level modules and services, and so forth. Theoperating system 118, for instance, can abstract various components (e.g., hardware, software, and firmware) of theclient device 102 to enable interaction between the components and applications running on theclient device 102. - The
applications 120 are representative of functionality for performing different tasks via theclient device 102. In one particular implementation, theapplications 120 represent a web browser, web platform, or other application that can be leveraged to browse websites over a network. - The
communication module 122 is representative of functionality for enabling theclient device 102 to communicate over wired and/or wireless connections. For instance, thecommunication module 122 represents hardware and logic for communicating data via a variety of different wired and/or wireless technologies and protocols. - According to various implementations, the
104, 106 generally represent functionality for visual output for thedisplay devices client device 102. Additionally, the 104, 106 represent functionality for receiving various types of input, such as touch input, stylus input, touchless proximity input, and so forth via one or more of the touch surfaces 110, 112, which can be used as visual output portions of thedisplay devices 104, 106. Thedisplay devices input module 114 is representative of functionality to enable theclient device 102 to receive input (e.g., via input mechanisms 124) and to process and route the input in various ways. - The
input mechanisms 124 generally represent different functionalities for receiving input to theclient device 102, and include adigitizer 126,touch input devices 128, andanalog input devices 130. Examples of theinput mechanisms 124 include gesture-sensitive sensors and devices (e.g., such as touch-based sensors), a stylus, a touch pad, accelerometers, a microphone with accompanying voice recognition software, and so forth. Theinput mechanisms 124 may be separate or integral with the 104, 106; integral examples include gesture-sensitive displays with integrated touch-sensitive sensors.display devices - The
digitizer 126 represents functionality for converting various types of input to the 104, 106, thedisplay devices touch input devices 128, and theanalog input devices 130 into digital data that can be used by theclient device 102 in various ways. Theanalog input devices 130 represent hardware mechanisms (e.g., the hinge 108) that are usable to generate different physical quantities that represent data. For instance, thehinge 108 represents a mechanism that can be leveraged to generate input data by measurement of a physical variable, such as hinge angle of thehinge 108. One ormore sensors 132, for example, can measure the hinge angle, and thedigitizer 126 can convert such measurements into digital data usable by theclient device 102 to perform operations to content displayed via the 104, 106.display devices - Generally, the
sensors 132 represent functionality for detecting different input signals received by theclient device 102. For example, thesensors 132 can include one or more hinge sensors configured to detect a hinge angle between the 104, 106. Additionally, thedisplay devices sensors 132 can include grip sensors, such as touch sensors, configured to detect how a user is holding theclient device 102. Accordingly, a variety ofdifferent sensors 132 can be implemented to detect various different types of digital and/or analog input. These and other aspects are discussed in further detail below. - In one particular implementation, the
applications 120 represent a journal application which provides an interactive canvas representative of pages of a journal. For example, a first page of the journal application can be presented ontouch surface 110 ofdisplay device 104 while a second page of the journal application is presented ontouch surface 112 ofdisplay device 106. The user can then write and draw on the interactive canvas, as well as insert and/or manipulate various different objects. - In at least some implementations, the
applications 120 include or otherwise make use of anink anchoring module 134. Theink anchoring module 134, for example, represents a standalone application. In other implementations, theink anchoring module 134 is included as part of another application or system software, such as theoperating system 118. Generally, theink anchoring module 134 is configured to anchor ink input that is received in “free form” (e.g., writing or drawing on a touch surface of a device using a stylus) to one or more objects that the ink objects overlaps or intersects such that a spatial relationship between the ink object and the object is maintained if the ink object or the object is manipulated, such as by moving or re-sizing the object. In one or more implementations, theink anchoring module 134 is configured to anchor an ink object to multiple objects that the ink object overlaps or intersects. In this scenario, when one or more of the objects is manipulated, theink anchoring module 134 adjusts the ink object in order to maintain an alignment of anchor points on the ink object with respective anchor positions within the objects. Further discussion of this and other features is provided below. -
FIG. 2 illustrates asystem 200 showing theink anchoring module 134 in more detail. - In this example,
ink anchoring module 134monitors ink input 202 to aninteractive canvas 204. For example, a user can provideink input 202 to aninteractive canvas 204 by writing or drawing on the interactive canvas using a stylus or the user's finger. Thus,ink input 202 encompasses any type of input that can be provided to the interactive canvas using a stylus or the user's finger, including drawings strokes, drawing shapes, drawing lines, drawing pictures, writing, and so forth. Thedigitizer 126 creates anink object 206 by digitizing and displaying a representation of theink input 202 on theinteractive canvas 204. Theinteractive canvas 204 may also include one ormore objects 208. As described herein, objects may include any type of content, such as images and photos, videos, audio files, text, symbols, drawings, and so forth. - At 210,
ink anchoring module 134 determines whether theink object 206 overlaps one ormore objects 208 displayed on theinteractive canvas 204. Generally, theink anchoring module 134 determines that theink object 206 overlaps anobject 208 if at least a portion of theink object 206 at least partially overlaps or intersects theobject 208. - If an overlap is detected, then at 212 the
ink anchoring module 134 anchors theink object 206 to theobject 208 such that the spatial relationship between theink object 206 and theobject 208 is maintained if either the ink object or object are manipulated. In some cases, theink anchoring module 134 may anchor the ink object to the object if theink object 206 is within close spatial proximity to theobject 208 without actually overlapping theink object 206. Theink anchoring module 134 may also factor in a time proximity between the ink object and the object in order to determine an overlap. For example, if an object is inserted into theinteractive canvas 204, and shortly thereafter the ink object is drawn within a close spatial proximity to the object, the time proximity between inserting theobject 208 and creating theink object 206 combined with the close spatial proximity between the objects, may cause theink anchoring module 134 to anchor theink object 206 to theobject 208. - Consider, for example,
FIGS. 3A to 3C which illustrate an example 300 of ink anchoring in accordance with one or more implementations. - In
FIG. 3A ,client device 102 displays aninteractive canvas 302 on one or more displays. In this example, theinteractive canvas 302 is presented on 104 and 106 of a “dual-display”display devices client device 102, and is associated with a journal application. However, as described throughout, in other cases theinteractive canvas 302 may be presented on a “single-display” device and/or associated with a different type of application. The journal application enables the user to take notes and/or draw on theinteractive canvas 302 using an input device, such as astylus 304. - In this example, the
interactive canvas 302 includes anobject 306, which corresponds to a picture of a car. As described herein, objects may include any type of content, such as images and photos, videos, audio files, text, symbols, drawings, and so forth. - Referring now to
FIG. 3B , ink input is provided tointeractive canvas 302 when the user draws a picture of the sun onstylus 304. The ink input is digitized and displayed on theinteractive canvas 302 as anink object 308. In this case,ink anchoring module 134 determines that theink object 308 overlaps theobject 306, and thus anchors theink object 308 to theobject 306 such that the spatial relationship between theink object 306 and theobject 308 is maintained if either the ink object or object are manipulated, such as by moving or resizing the ink object or object. - For example, in
FIG. 3C , the user has movedobject 306 to the upper right hand corner ofinteractive canvas 302 displayed ondisplay device 106. To do so, the user has selectedobject 306 usingstylus 304, and dragged theobject 306 to the upper right corner ofinteractive canvas 302. Notably, the user manipulation to moveobject 306 also causes theink anchoring module 134 to maintain the spatial relationship between the objects by movingink object 308 with theobject 306. - Referring back to
FIG. 2 , in one or more implementations anchoring the ink object to the object creates anon-destructive link 214 between theink object 206 and theobject 208 such that the object is not permanently edited by the overlappingink object 206. In fact, the user can remove thenon-destructive link 214 at any time. In one or more implementations, theanchoring module 134 creates thenon-destructive link 214 by maintaining separate object identifiers for each of theink object 206 and theobject 208. The separate object identifiers enable the ink object and the object to be accessed or searched for separately. For example, inFIG. 2 , the non-destructive link is illustrated as including anink object identifier 216 which identifies theink object 206, and anobject identifier 218 which identifies theobject 208. The object identifiers each include a references to each other. For example,ink object identifier 216 is illustrated as including anobject link 220 which links to the object, and objectidentifier 218 is illustrated as including a link object ID which links to the ink object. Thus, the 220 and 222 enable thelinks ink object 206 and object 208 to be treated as a single entity, while the 216 and 218 maintain independence between the objects.separate identifiers - In one or more implementations, the
ink anchoring module 134 is configured to anchor an ink object to multiple objects that the ink object overlaps or intersects. In this scenario, when one or more of the objects is manipulated, theink anchoring module 134 adjusts the ink object in order to maintain an alignment of anchor points on the ink object with respective anchor positions within the objects. - Consider, for example,
FIG. 4 which illustrates asystem 400 in which theink anchoring module 134 is configured to anchor an ink object to multiple objects. In this example,ink anchoring module 134monitors ink input 402 to aninteractive canvas 404. For example, a user can provideink input 402 to aninteractive canvas 404 by writing or drawing on the interactive canvas using a stylus or the user's finger. Thus,ink input 402 encompasses any type of input that can be provided to the interactive canvas using a stylus or the user's finger, including drawings strokes, drawing shapes, drawing lines, drawing pictures, writing, and so forth. Thedigitizer 126 creates anink object 406 by digitizing and displaying a representation of theink input 402 on theinteractive canvas 404. Theinteractive canvas 404 may also include one ormore objects 408. - At 410,
ink anchoring module 134 determines whether theink object 406 overlapsmultiple objects 408 on theinteractive canvas 404. Generally, theink anchoring module 134 determines that theink object 406 overlaps multiple objects if theink object 406 overlaps or intersects two or more objects. However, as discussed above, an overlap or intersect may also be determined by theink anchoring module 134 based on a spatial and/or time proximity between the ink object and the multiple objects. - If an overlap of multiple objects is detected, then at 412 the
ink anchoring module 134 anchors theink object 406 to each of the multiple objects. To do so, the ink anchoring module determines anchor points on theink object 406 that are aligned with corresponding anchor positions within themultiple objects 408. Subsequently, when one or more of the objects is manipulated, theink anchoring module 134 adjusts theink object 406 in order to maintain an alignment of anchor points on theink object 406 with respective anchor positions within theobjects 308. - As an example, consider
FIGS. 5A and 5B , which illustrate an example 500 of ink anchoring for multiple objects in accordance with one or more implementations. InFIG. 5A ,client device 102 displays aninteractive canvas 502 on one or more displays. As described throughout, theinteractive canvas 502 may be associated with a journal application and presented on 104 and 106 of a “dual-display”display devices client device 102. However, in other cases theinteractive canvas 502 may be presented on a “single-display” device and/or associated with a different type of application. The journal application enables the user to take notes and/or draw on theinteractive canvas 502 using an input device, such as astylus 504. - In this example, the
interactive canvas 502 includes afirst object 504 and asecond object 506. As described throughout, objects may include any type of content, such as images and photos, videos, audio files, text, symbols, drawings, and so forth. Ink input is provided tointeractive canvas 502 when the user draws a line with an arrow fromobject 506 to object 508. The ink input is digitized and displayed on theinteractive canvas 502 as anink object 510. In this case,ink anchoring module 134 determines that theink object 510 overlaps both thefirst object 506 and thesecond object 508. Theink anchoring module 134 then determines afirst anchor point 512 on theink object 510 that is aligned with afirst anchor position 514 on thefirst object 506. Similarly, theink anchoring module 134 determines asecond anchor point 516 on theink object 510 that is aligned with asecond anchor position 518 on thesecond object 508. Theink anchoring module 134 can determine the anchor points in a variety of different ways, such as based on the a beginning or ending drawing stroke, the amount of ink within the object, and so forth. -
Ink anchoring module 134 then anchors thefirst anchor point 512 on theink object 510 to thefirst anchor position 514 within thefirst object 506 and anchors thesecond anchor point 516 on theink object 510 to thesecond anchor position 518 within thesecond object 508. Subsequently, in response to thefirst object 506 or thesecond object 508 being manipulated, theink anchoring module 134 adjusts theink object 510 to maintain the alignment of the first and second anchor points 512 and 516 with the first and second anchor positions 514 and 516, respectively. - For example, in
FIG. 5B the user has manipulatedsecond object 508 by moving thesecond object 508 to the upper right hand corner ofinteractive canvas 502. To do so, the user has selectedobject 508 usingstylus 504, and dragged thesecond object 508 to the upper right corner ofinteractive canvas 502. In response to the manipulation ofobject 508,ink anchoring module 134 adjusts theink object 510 in order to maintain the alignment of the first and second anchor points 512 and 516 onink object 510 with the corresponding first and second anchor positions 514 and 518 within 506 and 508, respectively. For example, as depicted inobjects FIG. 5B , theink anchoring module 134 has deformed the shape of theink object 510 in order to maintain the alignment of anchor points 512 and 516 with 514 and 518.anchor positions - Referring back to
FIG. 4 , in one or more implementations anchoring theink object 406 to themultiple object 408 creates anon-destructive link 414 between theink object 406 and themultiple objects 408 such that themultiple objects 408 are not permanently edited by the overlappingink object 406. In fact, the user can remove thenon-destructive link 414 at any time. In one or more implementations, theanchoring module 134 creates thenon-destructive link 414 by maintaining separate object identifiers for each of theink object 406 and each of themultiple objects 408. The separate object identifiers enable the ink object and each of the multiple objects to be accessed or searched for separately. - For example, in
FIG. 4 , the non-destructive link is illustrated as including anink object identifier 416 which identifies theink object 406, afirst object identifier 418 with identifies afirst object 408, and asecond object identifier 420 which identifies asecond object 408. Of course, if more than two objects are intersected by the ink object, additional object identifiers are created. Theink object identifier 416 provides default positioning information about theink object 406 which can be understood by different types of applications and/or devices. This improves compatibility with different applications and devices which do not understand the ink anchoring scheme. - The object identifiers each include a references to each other. For example,
ink object ID 416 is illustrated as including afirst object link 422 and asecond object link 424 which links to the first andsecond objects 408, respectively. Similarly,first object identifier 418 andsecond object identifier 420 include ink object links 426 and 428, respectively, which link to theink object 406. Thus, the 422, 424, 426, and 428 enable thelinks ink object 406 andmultiple objects 408 to be treated as a single entity, while the 416, 418, and 420 maintain independence between theseparate identifiers ink object 406 and eachobject 408. - In addition,
ink anchoring module 134 associates anchor points 430 and 432 of theink object 406 with theink object identifier 416, and associates anchorpositions 434 and 436 (e.g., x,y position information) with the 418 and 420, respectively. Thus, when one or more of the objects is manipulated, theobject identifiers ink anchoring module 134 looks up the object identifiers, and adjusts theink object 406 in order to maintain an alignment of the anchor points 430 and 432 on theink object 406 with 434 and 436. For example, therespective anchor positions ink anchoring module 134 changes the default positioning information of the ink object in order to align the anchor points with the x,y anchor positions within the objects. In one or more implementations, theink anchoring module 134 adjusts theink object 406 by applying a geometric transform toe theink object 406 in order to align the first and second anchor points with the corresponding first and second anchor positions. - In one or more implementations, the ink anchoring techniques discussed throughout can be applied to scenarios in which the ink object overlaps three or more objects. In this scenario, the
ink anchoring module 134 divides the ink object into segments, where each segment overlaps two of the three or more objects. For example, if the ink object overlaps a first, second, and third object, the ink anchoring module can divide the ink object into a first segment of the ink object which overlaps the first and second objects, and a second segment of the ink object that overlaps the second and third objects. In this scenario, if the third object is moved, then the second segment between the second and third objects can be adjusted or deformed a greater amount than the adjustment of the first segment which is less affected by movement of the third object. - As an example, consider
FIGS. 6A and 6B , which illustrate an example 600 of ink anchoring for three or more objects in accordance with one or more implementations. - In
FIG. 6A ,client device 102 displays aninteractive canvas 602 on one or display devices. As described throughout, theinteractive canvas 602 may be associated with a journal application and presented on 104 and 106 of a “dual-display”display devices client device 102. However, in other cases theinteractive canvas 602 may be displayed on a “single-display” device and/or associated with a different type of application. The journal application enables the user to take notes and/or draw on theinteractive canvas 602 using an input device, such as astylus 604. - In this example, the
interactive canvas 602 includes multiple objects, including afirst object 606, asecond object 608, and athird object 610. As described throughout, objects may include any type of content, such as images and photos, videos, audio files, text, symbols, drawings, and so forth. - Ink input is provided to
interactive canvas 602 when the user draws a line with an arrow fromfirst object 606, throughsecond object 608, and ending atthird object 610. The ink input is digitized and displayed on theinteractive canvas 602 as anink object 612. - In this case,
ink anchoring module 134 determines that theink object 612 overlaps three or more objects, which in this example includes thefirst object 606, thesecond object 608, and thethird object 610. - The
ink anchoring module 134 then determines anchor points on the ink object that are aligned with a corresponding anchor position within each of the three or more objects. For example, theink anchoring module 134 determines afirst anchor point 614 on theink object 612 that is aligned with afirst anchor position 616 on thefirst object 606. Similarly, theink anchoring module 134 determines asecond anchor point 618 on theink object 612 that is aligned with asecond anchor position 620 on thesecond object 608. Similarly, theink anchoring module 134 determines athird anchor point 622 on theink object 612 that is aligned with athird anchor position 624 on thethird object 610. Theink anchoring module 134 can determine the anchor points in a variety of different ways, such as based on the a beginning or ending drawing stroke, the amount of ink within the object, and so forth. -
Ink anchoring module 134 then anchors the anchor points on the ink object to the respective anchor positions within each of the three or more objects. For example,ink anchoring module 134 anchors thefirst anchor point 614 on theink object 612 to thefirst anchor position 616 within thefirst object 606, anchors thesecond anchor point 618 on theink object 612 to thesecond anchor position 620 within thesecond object 608, and anchors thethird anchor point 612 on theink object 612 to thethird anchor position 624 within thethird object 610 - Next, the
ink anchoring module 134 divides theink object 612 into multiple segments such that each of the multiple segments overlaps two of the three or more objects. For example, inFIG. 6A ,ink anchoring module 134 dividesink object 602 into afirst segment 626 which overlaps bothfirst object 606 andsecond object 608, and asecond segment 628 which overlaps bothsecond object 608 andthird object 610. - Subsequently, in response to one of the three or more objects being manipulated, the
ink anchoring module 134 adjusts a respective segment of the ink object which overlaps the manipulated object. For example, inFIG. 6B the user has manipulatedthird object 610 by moving thethird object 610 to the bottom right hand corner ofinteractive canvas 602. To do so, the user has selectedthird object 610 usingstylus 604, and dragged thethird object 610 to the lower right corner ofinteractive canvas 602. In response to the manipulation ofthird object 610,ink anchoring module 134 adjusts thesecond segment 628 ofink object 612 which overlaps the manipulatedthird object 610. Doing so maintains alignment of the anchor points within the corresponding anchor positions within the respective two objects of the respective segment. For example, inFIG. 6B ,ink anchoring module 134 has adjusted thesecond segment 628 by deforming the shape of thesecond segment 628 in order to maintain the alignment of anchor points 618 and 622 with corresponding anchor positions 620 and 624 within the respective second and 608 and 610.third objects - In one or more implementations,
ink anchoring module 134 adjusts the respective segment of the ink object which overlaps the manipulated object without adjusting at least one of the multiple segments. For example, inFIG. 6B ,ink anchoring module 134 adjustssecond segment 628 without adjusting thefirst segment 626. Alternately, in some cases ink anchoring module may adjust the respective segment of the ink object a greater amount than an adjustment of at least one of the multiple segments. - In one or more implementations, an ink object may be linked to an object being transferred to another container, as well as objects that are not being transferred. In this case, the ink anchoring module may be configured to break a link between the ink object and either the objects being transferred to the new container or the objects remaining in the original container. Alternately, the ink anchoring module may be configured to maintain the link across container boundaries. In this instance, a robust reference system may be utilized to maintain links across containers.
- The following discussion describes some example procedures in accordance with one or more implementations. The example procedures may be employed in the
environment 100 ofFIG. 1 , thesystem 1000 ofFIG. 10 , and/or any other suitable environment. The procedures, for instance, represent example procedures for implementation of the scenarios described above. In at least some implementations, the steps described for the various procedures can be implemented automatically and independent of user interaction. -
FIG. 7 is a flow diagram that describes steps in a method for ink anchoring in accordance with one or more implementations. - At 702, an interactive canvas is displayed on the one or more display devices of the computing device. For example,
client device 102 displays aninteractive canvas 302 on one or more displays. - At 704, ink object is generated by digitizing ink input received to the interactive canvas. For example, in
FIG. 3B , ink input is provided tointeractive canvas 302 when the user draws a picture of the sun onstylus 304. The ink input is digitized and displayed on theinteractive canvas 302 as anink object 308 - At 706, it is determined that the ink object overlaps an object in the interactive canvas, and at 708, the ink object is anchored to the object such that a spatial relationship between the ink object and the object is maintained if the ink object or the object is manipulated. For example,
ink anchoring module 134 determines that theink object 308 overlaps theobject 306, and thus anchors theink object 308 to theobject 306 such that the spatial relationship between theink object 306 and theobject 308 is maintained if either the ink object or object are manipulated, such as by moving or resizing the ink object or object. -
FIG. 8 is a flow diagram that describes steps in an additional method for ink anchoring in accordance with one or more implementations. - At 802, an interactive canvas is displayed on the one or more display devices of the computing device. For example,
client device 102 displays aninteractive canvas 502 on one or more displays. - At 804, an ink object is generated by digitizing ink input received to the interactive canvas. For example, ink input is provided to
interactive canvas 502 when the user draws a line with an arrow fromobject 506 to object 508. The ink input is digitized and displayed on theinteractive canvas 502 as anink object 510. - At 806, it is determined that the ink object overlaps multiple objects in the interactive canvas. For example, the
ink anchoring module 134 determines that theink object 510 overlaps both thefirst object 506 and thesecond object 508. - At 808, anchor points on the ink object that are aligned with a corresponding anchor position within each of the multiple objects are determined. For example, the
ink anchoring module 134 determines afirst anchor point 512 on theink object 510 that is aligned with afirst anchor position 514 on thefirst object 506. Similarly, theink anchoring module 134 determines asecond anchor point 516 on theink object 510 that is aligned with asecond anchor position 518 on thesecond object 508. Theink anchoring module 134 can determine the anchor points in a variety of different ways, such as based on the a beginning or ending drawing stroke, the amount of ink within the object, and so forth. - At 810, the anchor points on the ink object are anchored to the respective anchor positions within each of the multiple objects. For example,
ink anchoring module 134 anchors thefirst anchor point 512 on theink object 510 to thefirst anchor position 514 within thefirst object 506 and anchors thesecond anchor point 516 on theink object 510 to thesecond anchor position 518 within thesecond object 508. - At 812, in response to at least one of the multiple objects being manipulated, the ink object is adjusted to maintain the alignment of the anchor points with the corresponding anchor positions within the multiple objects. For example, in
FIG. 5B the user has manipulatedsecond object 508 by moving thesecond object 508 to the upper right hand corner ofinteractive canvas 502. To do so, the user has selectedobject 508 usingstylus 504, and dragged thesecond object 508 to the upper right corner ofinteractive canvas 502. In response to the manipulation ofobject 508,ink anchoring module 134 adjusts theink object 510 in order to maintain the alignment of the first and second anchor points 512 and 516 onink object 510 with the corresponding first and second anchor positions 514 and 518 within 506 and 508, respectively. For example, as depicted inobjects FIG. 5B , theink anchoring module 134 has deformed the shape of theink object 510 in order to maintain the alignment of anchor points 512 and 516 with 514 and 518.anchor positions -
FIG. 9 is a flow diagram that describes steps in an additional method for ink anchoring in accordance with one or more implementations. - At 902, an interactive canvas is displayed on the one or more display devices of a computing device. For example,
client device 102 displays aninteractive canvas 602 on one or display devices. - At 904, an ink object is generated by digitizing ink input received to the interactive canvas. For example, ink input is provided to
interactive canvas 602 when the user draws a line with an arrow from afirst object 606, through asecond object 608, and ending at athird object 610. The ink input is digitized and displayed on theinteractive canvas 602 as anink object 612. - At 906, it is determined that the ink object overlaps three or more objects in the interactive canvas. For example, the
ink anchoring module 134 determines that theink object 612 overlaps thefirst object 606, thesecond object 608, and thethird object 610. - At 908, anchor points on the ink object that are aligned with a corresponding anchor position within each of the three or more objects are determined. For example, the
ink anchoring module 134 determines afirst anchor point 614 on theink object 612 that is aligned with afirst anchor position 616 on thefirst object 606. Similarly, theink anchoring module 134 determines asecond anchor point 618 on theink object 612 that is aligned with asecond anchor position 620 on thesecond object 608. Similarly, theink anchoring module 134 determines athird anchor point 622 on theink object 612 that is aligned with athird anchor position 624 on thethird object 610. Theink anchoring module 134 can determine the anchor points in a variety of different ways, such as based on the a beginning or ending drawing stroke, the amount of ink within the object, and so forth. - At 910, the anchor points on the ink object are anchored to the respective anchor positions within each of the three or more objects. For example,
ink anchoring module 134 anchors thefirst anchor point 614 on theink object 612 to thefirst anchor position 616 within thefirst object 606, anchors thesecond anchor point 618 on theink object 612 to thesecond anchor position 620 within thesecond object 608, and anchors thethird anchor point 612 on theink object 612 to thethird anchor position 624 within thethird object 610 - At 912, the ink object is divided into multiple segments such that each of the multiple segments are associated with two of the three or more objects. For example,
ink anchoring module 134 dividesink object 602 into afirst segment 626 which overlaps bothfirst object 606 andsecond object 608, and asecond segment 628 which overlaps bothsecond object 608 andthird object 610. - At 914, in response to at least one of the three or more objects being manipulated, a respective one of the multiple segments of the ink object which overlaps the manipulated object is adjusted. For example, as illustrated in in
FIG. 6B the user has manipulatedthird object 610 by moving thethird object 610 to the bottom right hand corner ofinteractive canvas 602. To do so, the user has selectedthird object 610 usingstylus 604, and dragged thethird object 610 to the lower right corner ofinteractive canvas 602. In response to the manipulation ofthird object 610,ink anchoring module 134 adjusts thesecond segment 628 ofink object 612 which overlaps the manipulatedthird object 610. Doing so maintains alignment of the anchor points within the corresponding anchor positions within the respective two objects of the respective segment. For example, inFIG. 6B ,ink anchoring module 134 has adjusted thesecond segment 628 by deforming the shape of thesecond segment 628 in order to maintain the alignment of anchor points 618 and 622 with corresponding anchor positions 620 and 624 within the respective second and 608 and 610.third objects - In one or more implementations,
ink anchoring module 134 adjusts the respective segment of the ink object which overlaps the manipulated object without adjusting at least one of the multiple segments. For example, as illustrated inFIG. 6B ,ink anchoring module 134 adjustssecond segment 628 without adjusting thefirst segment 626. Alternately, in some cases ink anchoring module may adjust the respective segment of the ink object a greater amount than an adjustment of at least one of the multiple segments. -
FIG. 10 illustrates an example system generally at 1000 that includes anexample computing device 1002 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. In at least some implementations, thecomputing device 1002 represents an implementation of theclient device 102 discussed above, such as a dual-display device. Thecomputing device 1002 may, for example, be configured to assume a mobile configuration through use of a housing formed and sized to be grasped and carried by one or more hands of a user, illustrated examples of which include a mobile phone, mobile game and music device, and tablet computer although other examples are also contemplated. In at least some implementations, theclient device 102 may be implemented as a wearable device, such as a smart watch, smart glasses, a dual-surface gesture-input peripheral for a computing device, and so forth. - The
example computing device 1002 as illustrated includes aprocessing system 1004, one or more computer-readable media 1006, and one or more I/O interface 1008 that are communicatively coupled, one to another. Although not shown, thecomputing device 1002 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines. - The
processing system 1004 is representative of functionality to perform one or more operations using hardware. Accordingly, theprocessing system 1004 is illustrated as including hardware element 1010 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 1010 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions. - The computer-
readable storage media 1006 is illustrated as including memory/storage 1012. The memory/storage 1012 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 1012 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 1012 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 1006 may be configured in a variety of other ways as further described below. - Input/output interface(s) 1008 are representative of functionality to allow a user to enter commands and information to
computing device 1002, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, thecomputing device 1002 may be configured in a variety of ways to support user interaction. - Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
- An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the
computing device 1002. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.” - “Computer-readable storage media” may refer to media and/or devices that enable persistent storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media and does not include signals per se. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.
- “Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the
computing device 1002, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. - As previously described, hardware elements 1010 and computer-
readable media 1006 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some implementations to implement at least some aspects of the techniques described herein, such as to perform one or more instructions. Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware. In this context, hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware, as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously. - Combinations of the foregoing may also be employed to implement various techniques described herein. Accordingly, software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 1010. The
computing device 1002 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the computing device 1002 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 1010 of the processing system 1004. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 1002 and/or processing systems 1004) to implement techniques, modules, and examples described herein. - Example implementations described herein include, but are not limited to, one or any combinations of one or more of the following examples:
- In one or more examples, a computing device comprises: one or more display devices; at least one processor; and at least one computer-readable storage media storing instructions that are executable by the at least one processor to: display an interactive canvas on the one or more display devices of the computing device; generate an ink object by digitizing ink input received to the interactive canvas; determine that the ink object overlaps an object in the interactive canvas; and anchor the ink object to the object such that a spatial relationship between the ink object and the object is maintained if the ink object or the object is manipulated.
- An example as described alone or in combination with any of the other examples described above or below, wherein the anchoring creates a non-destructive link between the ink object and the object such that the object is not permanently edited by the overlapping ink object.
- An example as described alone or in combination with any of the other examples described above or below, wherein the non-destructive link can be removed by a user.
- An example as described alone or in combination with any of the other examples described above or below, wherein the anchoring maintains separate object identifiers for the ink object and the object.
- An example as described alone or in combination with any of the other examples described above or below, wherein the separate object identifiers enable the ink object and the object to be accessed or searched for individually.
- An example as described alone or in combination with any of the other examples described above or below, wherein the instructions cause the at least one processor to anchor the ink object to the object based on the ink object being within close spatial proximity to the object without overlapping the object.
- An example as described alone or in combination with any of the other examples described above or below, wherein the instructions cause the at least one processor to anchor the ink object to the object based on the ink object being within close spatial proximity and time proximity to the object without overlapping the object.
- An example as described alone or in combination with any of the other examples described above or below, wherein the anchoring causes the ink object to move with the object in response to user input to move the object.
- An example as described alone or in combination with any of the other examples described above or below, wherein the anchoring causes the ink object to be re-sized with the object in response to user input to re-size the object.
- An example as described alone or in combination with any of the other examples described above or below, wherein the ink object comprises handwriting or drawing strokes.
- An example as described alone or in combination with any of the other examples described above or below, wherein the object comprises one or more images, text, videos, or audio files.
- An example as described alone or in combination with any of the other examples described above or below, wherein the computing device comprises a dual-display device comprising a first display device and a second display device.
- An example as described alone or in combination with any of the other examples described above or below, wherein the interactive canvas is displayed on both the first and second display devices of the dual-display device.
- An example as described alone or in combination with any of the other examples described above or below, wherein the interactive canvas is displayed as pages of a journal application.
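- The device examples above can be illustrated with a brief sketch. The following TypeScript is not the claimed implementation; the type names, fields, and functions are assumptions made only for illustration. It shows the basic pattern those examples describe: when a digitized ink object overlaps an object on the canvas, a non-destructive anchor record is created (the ink object and the object keep their own identifiers, and the ink's geometry is stored relative to the object), and the ink is reprojected from that record whenever the object is moved or re-sized, so the spatial relationship is maintained.

```typescript
// A minimal sketch, not the disclosed implementation: anchoring an ink object to
// an object it overlaps so that moving or re-sizing the object carries the ink
// with it. All names here are illustrative assumptions.

interface Bounds { x: number; y: number; width: number; height: number; }

interface CanvasItem {
  id: string;    // the ink object and the object keep separate identifiers
  bounds: Bounds;
}

interface InkAnchor {
  inkId: string;     // non-destructive link: references only, nothing is merged
  objectId: string;
  relX: number;      // ink geometry expressed relative to the object's bounds
  relY: number;
  relW: number;
  relH: number;
}

// Axis-aligned bounding-box overlap test used to decide whether to anchor.
function overlaps(a: Bounds, b: Bounds): boolean {
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}

// Record the ink's geometry relative to the object at the moment of anchoring.
function anchorInk(ink: CanvasItem, obj: CanvasItem): InkAnchor | undefined {
  if (!overlaps(ink.bounds, obj.bounds)) return undefined;
  const b = obj.bounds;
  return {
    inkId: ink.id,
    objectId: obj.id,
    relX: (ink.bounds.x - b.x) / b.width,
    relY: (ink.bounds.y - b.y) / b.height,
    relW: ink.bounds.width / b.width,
    relH: ink.bounds.height / b.height,
  };
}

// After the object is moved or re-sized, reproject the ink from the stored
// relative geometry so the spatial relationship is maintained.
function syncAnchoredInk(ink: CanvasItem, obj: CanvasItem, anchor: InkAnchor): void {
  const b = obj.bounds;
  ink.bounds = {
    x: b.x + anchor.relX * b.width,
    y: b.y + anchor.relY * b.height,
    width: anchor.relW * b.width,
    height: anchor.relH * b.height,
  };
}
```

- Because the link is a separate record and the identifiers are never merged, removing the anchor in this sketch is simply a matter of deleting the InkAnchor entry, and either item can still be selected or searched for individually, consistent with the examples above.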
- In one or more examples, a method implemented by a computing device comprises: displaying an interactive canvas on one or more display devices of the computing device; generating an ink object by digitizing ink input received to the interactive canvas; determining that the ink object overlaps an object in the interactive canvas; and anchoring the ink object to the object such that a spatial relationship between the ink object and the object is maintained if the ink object or the object is manipulated.
- An example as described alone or in combination with any of the other examples described above or below, wherein the anchoring creates a non-destructive link between the ink object and the object such that the object is not permanently edited by the overlapping ink object.
- An example as described alone or in combination with any of the other examples described above or below, wherein the non-destructive link can be removed by a user.
- An example as described alone or in combination with any of the other examples described above or below, wherein the anchoring maintains separate object identifiers for the ink object and the object.
- An example as described alone or in combination with any of the other examples described above or below, wherein the separate object identifiers enable the ink object and the object to be accessed or searched for individually.
- An example as described alone or in combination with any of the other examples described above or below, wherein the anchoring of the ink object to the object is based on the ink object being within close spatial proximity and time proximity to the object without overlapping the object.
- In one or more examples, a method implemented by a computing device comprises: displaying an interactive canvas on one or more display devices of a computing device; generating an ink object by digitizing ink input received to the interactive canvas; determining that the ink object overlaps multiple objects in the interactive canvas; determining anchor points on the ink object that are aligned with a corresponding anchor position within each of the multiple objects; anchoring the anchor points on the ink object to the respective anchor positions within each of the multiple objects; and in response to at least one of the multiple objects being manipulated, adjusting the ink object to maintain the alignment of the anchor points with the corresponding anchor positions within the multiple objects.
- An example as described alone or in combination with any of the other examples described above or below, wherein the adjusting comprises applying a geometric transform to the ink object to maintain the alignment of the anchor points with the corresponding anchor positions within the multiple objects.
- An example as described alone or in combination with any of the other examples described above or below, wherein the adjusting deforms the ink object in order to maintain the alignment of the anchor points with the corresponding anchor positions within the multiple objects.
- An example as described alone or in combination with any of the other examples described above or below, wherein the anchor positions correspond to an x,y location within each respective object.
- An example as described alone or in combination with any of the other examples described above or below, wherein the ink input corresponds to free-form ink input.
- An example as described alone or in combination with any of the other examples described above or below, wherein the manipulation comprises user input to move the at least one of the multiple objects.
- An example as described alone or in combination with any of the other examples described above or below, wherein anchoring the anchor points on the ink object to the respective anchor positions within each of the multiple objects creates a non-destructive link between the ink object and the multiple objects such that the multiple objects are not permanently edited by the overlapping ink object.
- An example as described alone or in combination with any of the other examples described above or below, wherein the non-destructive link can be removed by a user.
- An example as described alone or in combination with any of the other examples described above or below, wherein the non-destructive link is created by maintaining separate object identifiers for the ink object and each of the multiple objects.
- An example as described alone or in combination with any of the other examples described above or below, wherein the separate object identifiers enable the ink object and each of the multiple objects to be accessed or searched for individually.
- An example as described alone or in combination with any of the other examples described above or below, wherein the separate object identifiers include an ink object identifier which identifies the ink object and object identifiers which identify each of the multiple objects, wherein the ink object identifier includes links to each of the multiple objects, and wherein each object identifier includes a link to the ink object.
- An example as described alone or in combination with any of the other examples described above or below, wherein the ink object identifier provides default positioning information about the ink object which can be understood by different types of applications or devices to improve compatibility with the different applications or devices.
- An example as described alone or in combination with any of the other examples described above or below, further comprising associating the anchor points of the ink object with the ink object identifier and associating the anchor positions of each of the multiple objects with the respective object identifiers.
- An example as described alone or in combination with any of the other examples described above or below, wherein it is determined that the ink object overlaps multiple objects based on the ink object being within close spatial proximity to the at least one of the multiple objects without overlapping the object.
- An example as described alone or in combination with any of the other examples described above or below, wherein it is determined that the ink object overlaps multiple objects based on the ink object being within close spatial proximity and time proximity to at least one of the multiple objects without overlapping the object.
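- A sketch of the multi-object behavior described in the preceding examples, again using only assumed names: each anchor point on the ink stroke is bound to an x,y anchor position inside one of the overlapped objects, and when an object is manipulated every ink sample is displaced by a distance-weighted blend of the anchor displacements, which deforms the stroke so that each anchor point lands back on its corresponding anchor position. A real implementation might instead fit an affine or other geometric transform; the inverse-distance weighting below is just one simple way to realize the described adjustment.

```typescript
// A minimal sketch of anchor points and deformation; the names and the weighting
// scheme are assumptions, not the disclosed algorithm.

interface Point { x: number; y: number; }

interface AnchorBinding {
  inkPoint: Point;     // anchor point on the ink stroke, in canvas coordinates at anchoring time
  objectId: string;
  objectOffset: Point; // anchor position as an x,y offset inside the object
}

// Where the bound anchor position currently sits on the canvas.
function currentAnchorTarget(binding: AnchorBinding, objectOrigins: Map<string, Point>): Point {
  const origin = objectOrigins.get(binding.objectId);
  if (!origin) throw new Error(`unknown object ${binding.objectId}`);
  return { x: origin.x + binding.objectOffset.x, y: origin.y + binding.objectOffset.y };
}

// Deform the ink stroke (its anchoring-time geometry) so that each anchor point
// coincides with its possibly-moved anchor position. Non-anchor samples follow an
// inverse-distance-weighted blend of the anchor displacements.
function deformInk(
  stroke: Point[],                     // ink samples captured at anchoring time
  bindings: AnchorBinding[],
  objectOrigins: Map<string, Point>,   // object id -> current top-left origin
): Point[] {
  const displacements = bindings.map((b) => {
    const target = currentAnchorTarget(b, objectOrigins);
    return { at: b.inkPoint, dx: target.x - b.inkPoint.x, dy: target.y - b.inkPoint.y };
  });

  return stroke.map((p) => {
    let wSum = 0, dx = 0, dy = 0;
    for (const d of displacements) {
      const dist = Math.hypot(p.x - d.at.x, p.y - d.at.y);
      if (dist < 1e-6) return { x: p.x + d.dx, y: p.y + d.dy }; // the sample is an anchor point
      const w = 1 / (dist * dist);
      wSum += w; dx += w * d.dx; dy += w * d.dy;
    }
    return wSum === 0 ? p : { x: p.x + dx / wSum, y: p.y + dy / wSum };
  });
}
```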
- In one or more examples, a computing device comprises: one or more display devices; at least one processor; and at least one computer-readable storage media storing instructions that are executable by the at least one processor to: display an interactive canvas on the one or more display devices of the computing device; generate an ink object by digitizing ink input received to the interactive canvas; determine that the ink object overlaps three or more objects in the interactive canvas; determine anchor points on the ink object that are aligned with a corresponding anchor position within each of the three or more objects; anchor the anchor points on the ink object to the respective anchor positions within each of the three or more objects; divide the ink object into multiple segments such that each of the multiple segments overlaps two of the three or more objects; and in response to at least one of the three or more objects being manipulated, adjust a respective one of the multiple segments of the ink object which overlaps the manipulated object.
- An example as described alone or in combination with any of the other examples described above or below, further comprising instructions that are executable by the at least one processor to adjust the respective segment of the ink object which overlaps the manipulated object without adjusting at least one of the multiple segments.
- An example as described alone or in combination with any of the other examples described above or below, further comprising instructions that are executable by the at least one processor to adjust the respective segment of the ink object a greater amount than an adjustment of at least one of the multiple segments.
- An example as described alone or in combination with any of the other examples described above or below, wherein the adjusting comprises applying a geometric transform to the respective segment of the ink object to maintain the alignment of the anchor points with the corresponding anchor positions within the respective two objects of the respective segment.
- An example as described alone or in combination with any of the other examples described above or below, wherein the adjusting deforms the respective segment of the ink object in order to maintain the alignment of the anchor points with the corresponding anchor positions within the respective two objects of the respective segment.
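- Finally, a sketch of the segment-based variant from the last group of examples, continuing the assumed Point, AnchorBinding, and deformInk definitions from the previous sketch: the ink object is split at its anchor points so that each segment bridges exactly two of the three or more anchored objects, and manipulating one object only re-adjusts the segments that touch it while the remaining segments keep their current geometry.

```typescript
// A minimal sketch of segment-wise adjustment; reuses Point, AnchorBinding, and
// deformInk from the previous sketch. Segment boundaries are stroke indices,
// which is an assumption of this illustration.

interface InkSegment {
  basePoints: Point[];          // segment geometry captured at anchoring time
  points: Point[];              // current, possibly adjusted geometry
  objectIds: [string, string];  // the two objects this segment bridges
  bindings: AnchorBinding[];    // the anchor bindings at the segment's two ends
}

// Split the stroke at the stroke indices of its anchor points: segment i runs
// from anchor i to anchor i+1 and therefore overlaps objects i and i+1.
// anchorIndices[i] is assumed to be the index in `stroke` of bindings[i].inkPoint.
function segmentInk(stroke: Point[], bindings: AnchorBinding[], anchorIndices: number[]): InkSegment[] {
  const segments: InkSegment[] = [];
  for (let i = 0; i + 1 < anchorIndices.length; i++) {
    const base = stroke.slice(anchorIndices[i], anchorIndices[i + 1] + 1);
    segments.push({
      basePoints: base,
      points: [...base],
      objectIds: [bindings[i].objectId, bindings[i + 1].objectId],
      bindings: [bindings[i], bindings[i + 1]],
    });
  }
  return segments;
}

// Only the segments that bridge the manipulated object are re-deformed (from
// their anchoring-time geometry); the other segments are left untouched.
function adjustForMovedObject(
  segments: InkSegment[],
  movedObjectId: string,
  objectOrigins: Map<string, Point>,  // object id -> current top-left origin
): void {
  for (const seg of segments) {
    if (seg.objectIds.includes(movedObjectId)) {
      seg.points = deformInk(seg.basePoints, seg.bindings, objectOrigins);
    }
  }
}
```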
- Although the example implementations have been described in language specific to structural features and/or methodological acts, it is to be understood that the implementations defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed features.
Claims (20)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/638,067 US20180329597A1 (en) | 2017-05-15 | 2017-06-29 | Ink Anchoring |
| PCT/US2018/027412 WO2018212869A1 (en) | 2017-05-15 | 2018-04-13 | Ink anchoring |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201762506519P | 2017-05-15 | 2017-05-15 | |
| US15/638,067 US20180329597A1 (en) | 2017-05-15 | 2017-06-29 | Ink Anchoring |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20180329597A1 true US20180329597A1 (en) | 2018-11-15 |
Family
ID=64097727
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/638,067 Abandoned US20180329597A1 (en) | 2017-05-15 | 2017-06-29 | Ink Anchoring |
| US15/638,058 Active US10599320B2 (en) | 2017-05-15 | 2017-06-29 | Ink Anchoring |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/638,058 Active US10599320B2 (en) | 2017-05-15 | 2017-06-29 | Ink Anchoring |
Country Status (2)
| Country | Link |
|---|---|
| US (2) | US20180329597A1 (en) |
| WO (2) | WO2018212869A1 (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11941776B1 (en) | 2023-03-30 | 2024-03-26 | Illuscio, Inc. | Systems and methods for improved interactivity with three-dimensional objects |
Family Cites Families (110)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5539427A (en) | 1992-02-10 | 1996-07-23 | Compaq Computer Corporation | Graphic indexing system |
| US6493006B1 (en) | 1996-05-10 | 2002-12-10 | Apple Computer, Inc. | Graphical user interface having contextual menus |
| US6687876B1 (en) | 1998-12-30 | 2004-02-03 | Fuji Xerox Co., Ltd. | Method and system for maintaining freeform ink annotations on changing views |
| US6683600B1 (en) | 2000-04-19 | 2004-01-27 | Microsoft Corporation | Adaptive input pen mode selection |
| US20010037719A1 (en) | 2000-05-03 | 2001-11-08 | Gardner Gary L. | Digital sheet music display system |
| US6559869B1 (en) | 2000-05-04 | 2003-05-06 | Microsoft Corporation | Adaptive auto-scrolling merge for hand written input |
| US7259753B2 (en) | 2000-06-21 | 2007-08-21 | Microsoft Corporation | Classifying, anchoring, and transforming ink |
| US7036077B2 (en) | 2002-03-22 | 2006-04-25 | Xerox Corporation | Method for gestural interpretation in a system for selecting and arranging visible material in document images |
| US7259752B1 (en) | 2002-06-28 | 2007-08-21 | Microsoft Corporation | Method and system for editing electronic ink |
| US7058902B2 (en) | 2002-07-30 | 2006-06-06 | Microsoft Corporation | Enhanced on-object context menus |
| US7895536B2 (en) | 2003-01-08 | 2011-02-22 | Autodesk, Inc. | Layer editor system for a pen-based computer |
| US7373590B2 (en) | 2003-05-19 | 2008-05-13 | Microsoft Corporation | Shared electronic ink annotation method and system |
| US7218783B2 (en) * | 2003-06-13 | 2007-05-15 | Microsoft Corporation | Digital ink annotation process and system for recognizing, anchoring and reflowing digital ink annotations |
| US7324691B2 (en) | 2003-09-24 | 2008-01-29 | Microsoft Corporation | System and method for shape recognition of hand-drawn objects |
| US7185280B2 (en) | 2003-10-14 | 2007-02-27 | Papilia, Inc. | Personalized automatic publishing extensible layouts |
| GB2409541A (en) | 2003-12-23 | 2005-06-29 | Mandorla Technology Ltd | Editable information management system and method |
| US7237202B2 (en) | 2004-05-11 | 2007-06-26 | Cynthia Joanne Gage | Multiple document viewing apparatus and user interface |
| US20060001656A1 (en) | 2004-07-02 | 2006-01-05 | Laviola Joseph J Jr | Electronic ink system |
| US8117542B2 (en) | 2004-08-16 | 2012-02-14 | Microsoft Corporation | User interface for displaying selectable software functionality controls that are contextually relevant to a selected object |
| US8880597B1 (en) | 2004-09-07 | 2014-11-04 | Evernote Corporation | Electronic note management system and user-interface |
| US20060080616A1 (en) | 2004-10-13 | 2006-04-13 | Xerox Corporation | Systems, methods and user interfaces for document workflow construction |
| US7454717B2 (en) | 2004-10-20 | 2008-11-18 | Microsoft Corporation | Delimiters for selection-action pen gesture phrases |
| US8464175B2 (en) | 2004-12-09 | 2013-06-11 | Microsoft Corporation | Journal display having three dimensional appearance |
| US20060224952A1 (en) | 2005-03-30 | 2006-10-05 | Xiaofan Lin | Adaptive layout templates for generating electronic documents with variable content |
| US7730399B2 (en) | 2005-04-22 | 2010-06-01 | Microsoft Corporation | Journal file reader |
| US7536641B2 (en) | 2005-04-29 | 2009-05-19 | Google Inc. | Web page authoring tool for structured documents |
| US20060267967A1 (en) | 2005-05-24 | 2006-11-30 | Microsoft Corporation | Phrasing extensions and multiple modes in one spring-loaded control |
| US20070079260A1 (en) | 2005-10-04 | 2007-04-05 | Bhogal Kulvir S | Method and apparatus to transmit a calendar event in target calendaring system format |
| US20070088735A1 (en) | 2005-10-17 | 2007-04-19 | International Business Machines Corporation | Optimization-based visual context management |
| US8643605B2 (en) | 2005-11-21 | 2014-02-04 | Core Wireless Licensing S.A.R.L | Gesture based document editor |
| US20070180377A1 (en) | 2006-01-30 | 2007-08-02 | Microsoft Corporation | Self-translating template |
| US7748634B1 (en) | 2006-03-29 | 2010-07-06 | Amazon Technologies, Inc. | Handheld electronic book reader device having dual displays |
| JP5172169B2 (en) | 2007-02-16 | 2013-03-27 | シャープ株式会社 | Content display device, television receiver, content display method, content display control program, and recording medium |
| US8347206B2 (en) | 2007-03-15 | 2013-01-01 | Microsoft Corporation | Interactive image tagging |
| US8014607B2 (en) | 2007-03-23 | 2011-09-06 | Palo Alto Research Center Incorporated | Method and apparatus for creating and editing node-link diagrams in pen computing systems |
| US20080238887A1 (en) | 2007-03-28 | 2008-10-02 | Gateway Inc. | Method and apparatus for programming an interactive stylus button |
| US8116570B2 (en) | 2007-04-19 | 2012-02-14 | Microsoft Corporation | User interface for providing digital ink input and correcting recognition errors |
| WO2009018314A2 (en) | 2007-07-30 | 2009-02-05 | Perceptive Pixel, Inc. | Graphical user interface for large-scale, multi-user, multi-touch systems |
| US8464167B2 (en) | 2008-12-01 | 2013-06-11 | Palo Alto Research Center Incorporated | System and method for synchronized authoring and access of chat and graphics |
| KR20100065418A (en) | 2008-12-08 | 2010-06-17 | 삼성전자주식회사 | Flexible display device and data output method thereof |
| US20100171754A1 (en) * | 2009-01-07 | 2010-07-08 | Microsoft Corporation | Converting digital ink to shapes and text |
| US8645383B2 (en) | 2009-01-27 | 2014-02-04 | Stephen J. Brown | Content management system using sources of experience data and modules for quantification and visualization |
| US20100318916A1 (en) | 2009-06-11 | 2010-12-16 | David Wilkins | System and method for generating multimedia presentations |
| US8484027B1 (en) | 2009-06-12 | 2013-07-09 | Skyreader Media Inc. | Method for live remote narration of a digital book |
| US8451238B2 (en) | 2009-09-02 | 2013-05-28 | Amazon Technologies, Inc. | Touch-screen user interface |
| KR101104721B1 (en) | 2009-09-04 | 2012-01-10 | 임병근 | Portable multimedia device for displaying a document having multiple pages and a driving method thereof |
| US9092115B2 (en) | 2009-09-23 | 2015-07-28 | Microsoft Technology Licensing, Llc | Computing system with visual clipboard |
| US20110102314A1 (en) | 2009-10-30 | 2011-05-05 | Xerox Corporation | Dual-screen electronic reader with tilt detection for page navigation |
| US9081464B2 (en) | 2009-11-20 | 2015-07-14 | Adobe Systems Incorporated | Object selection |
| KR20110074166A (en) | 2009-12-24 | 2011-06-30 | 삼성전자주식회사 | How to create digital content |
| KR101642722B1 (en) | 2010-02-04 | 2016-07-27 | 삼성전자 주식회사 | Portable terminal having dual display unit and method for controlling display thereof |
| US9075522B2 (en) | 2010-02-25 | 2015-07-07 | Microsoft Technology Licensing, Llc | Multi-screen bookmark hold gesture |
| US8707174B2 (en) | 2010-02-25 | 2014-04-22 | Microsoft Corporation | Multi-screen hold and page-flip gesture |
| US9454304B2 (en) | 2010-02-25 | 2016-09-27 | Microsoft Technology Licensing, Llc | Multi-screen dual tap gesture |
| US9053553B2 (en) | 2010-02-26 | 2015-06-09 | Adobe Systems Incorporated | Methods and apparatus for manipulating images and objects within images |
| JP5557314B2 (en) | 2010-03-24 | 2014-07-23 | Necカシオモバイルコミュニケーションズ株式会社 | Terminal device and program |
| JP2011215909A (en) | 2010-03-31 | 2011-10-27 | Toshiba Corp | Electronic device and search control method, and search control program |
| EP2381347B1 (en) | 2010-04-26 | 2018-07-11 | Sony Mobile Communications Inc. | Method for displaying an object having a predetermined information content on a touch screen |
| EP2614448A1 (en) | 2010-09-09 | 2013-07-17 | Sony Ericsson Mobile Communications AB | Annotating e-books/e-magazines with application results |
| US9001149B2 (en) | 2010-10-01 | 2015-04-07 | Z124 | Max mode |
| US20120096345A1 (en) | 2010-10-19 | 2012-04-19 | Google Inc. | Resizing of gesture-created markings for different display sizes |
| US20120113019A1 (en) | 2010-11-10 | 2012-05-10 | Anderson Michelle B | Portable e-reader and method of use |
| US9015610B2 (en) | 2010-11-16 | 2015-04-21 | General Instrument Corporation | Display of controllable attributes for a controllable item based on context |
| US9292171B2 (en) | 2010-11-17 | 2016-03-22 | International Business Machines Corporation | Border menu for context dependent actions within a graphical user interface |
| US20120176416A1 (en) | 2011-01-10 | 2012-07-12 | King Fahd University Of Petroleum And Minerals | System and method for shape recognition and correction |
| KR101891803B1 (en) | 2011-05-23 | 2018-08-27 | 삼성전자주식회사 | Method and apparatus for editing screen of mobile terminal comprising touch screen |
| RU2014110394A (en) | 2011-08-19 | 2015-09-27 | Эппл Инк. | DEVELOPMENT OF CONTENT FOR DIGITAL BOOKS |
| US9977876B2 (en) | 2012-02-24 | 2018-05-22 | Perkinelmer Informatics, Inc. | Systems, methods, and apparatus for drawing chemical structures using touch and gestures |
| US20130238964A1 (en) | 2012-03-06 | 2013-09-12 | Apple Inc. | Application for designing journals |
| US9015581B2 (en) | 2012-03-26 | 2015-04-21 | Vistaprint Schweiz Gmbh | Self-adjusting document layouts using system optimization modeling |
| US9304656B2 (en) | 2012-03-30 | 2016-04-05 | Google Inc. | Systems and method for object selection on presence sensitive devices |
| US20140204014A1 (en) | 2012-03-30 | 2014-07-24 | Sony Mobile Communications Ab | Optimizing selection of a media object type in which to present content to a user of a device |
| US20130268848A1 (en) | 2012-04-05 | 2013-10-10 | Nokia Corporation | User event content, associated apparatus and methods |
| US9098192B2 (en) | 2012-05-11 | 2015-08-04 | Perceptive Pixel, Inc. | Overscan display device and method of using the same |
| US9170667B2 (en) | 2012-06-01 | 2015-10-27 | Microsoft Technology Licensing, Llc | Contextual user interface |
| CN102750104A (en) | 2012-06-29 | 2012-10-24 | 鸿富锦精密工业(深圳)有限公司 | Electronic device with touch input unit |
| US20140033027A1 (en) | 2012-07-12 | 2014-01-30 | Michael Joseph Polo | E-Book Application with Multi-Document Display |
| US20140047332A1 (en) | 2012-08-08 | 2014-02-13 | Microsoft Corporation | E-reader systems |
| US9113033B2 (en) | 2012-08-28 | 2015-08-18 | Microsoft Technology Licensing, Llc | Mobile video conferencing with digital annotation |
| KR102141044B1 (en) | 2012-12-03 | 2020-08-04 | 삼성전자주식회사 | Apparatus having a plurality of touch screens and method for sound output thereof |
| US20140173173A1 (en) | 2012-12-13 | 2014-06-19 | Elpida Memory, Inc. | Method, device, and system including configurable bit-per-cell capability |
| TWI563397B (en) | 2012-12-20 | 2016-12-21 | Chiun Mai Comm Systems Inc | Method and system for inserting image objects to a note software |
| US20140189593A1 (en) | 2012-12-27 | 2014-07-03 | Kabushiki Kaisha Toshiba | Electronic device and input method |
| US9652109B2 (en) | 2013-01-11 | 2017-05-16 | Microsoft Technology Licensing, Llc | Predictive contextual toolbar for productivity applications |
| EP2759921B1 (en) | 2013-01-25 | 2020-09-23 | Morpho, Inc. | Image display apparatus, image displaying method and program |
| US20140215341A1 (en) | 2013-01-31 | 2014-07-31 | Lsi Corporation | Transitioning between pages of content on a display of a user device |
| US20140298223A1 (en) | 2013-02-06 | 2014-10-02 | Peter Duong | Systems and methods for drawing shapes and issuing gesture-based control commands on the same draw grid |
| JP5851652B2 (en) | 2013-03-27 | 2016-02-03 | 株式会社東芝 | Electronic device, display method and program |
| KR20140134018A (en) | 2013-05-13 | 2014-11-21 | 삼성전자주식회사 | Apparatus, method and computer readable recording medium for fulfilling functions rerated to the user input on the screen |
| US9250786B2 (en) * | 2013-07-16 | 2016-02-02 | Adobe Systems Incorporated | Snapping of object features via dragging |
| US9811238B2 (en) | 2013-08-29 | 2017-11-07 | Sharp Laboratories Of America, Inc. | Methods and systems for interacting with a digital marking surface |
| US20150089389A1 (en) | 2013-09-24 | 2015-03-26 | Sap Ag | Multiple mode messaging |
| US20150121179A1 (en) | 2013-10-25 | 2015-04-30 | Palo Alto Research Center Incorporated | System and method for creating graphically rich messages incorporating shared docments |
| US9606664B2 (en) | 2013-11-13 | 2017-03-28 | Dell Products, Lp | Dynamic hover sensitivity and gesture adaptation in a dual display system |
| KR102311221B1 (en) | 2014-04-28 | 2021-10-13 | 삼성전자주식회사 | operating method and electronic device for object |
| KR101632008B1 (en) | 2014-04-30 | 2016-07-01 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
| US20150338940A1 (en) | 2014-05-23 | 2015-11-26 | Microsoft Technology Licensing, Llc | Pen Input Modes for Digital Ink |
| US20160048318A1 (en) | 2014-08-15 | 2016-02-18 | Microsoft Technology Licensing, Llc | Detecting selection of digital ink |
| WO2016035097A2 (en) | 2014-09-03 | 2016-03-10 | Vaidya Suyog Sharad | School script |
| US10509853B2 (en) | 2014-09-05 | 2019-12-17 | Microsoft Technology Licensing, Llc | Creating an annotation pane for a document by augmenting the document |
| US9400570B2 (en) | 2014-11-14 | 2016-07-26 | Apple Inc. | Stylus with inertial sensor |
| US9990107B2 (en) | 2015-03-08 | 2018-06-05 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
| US10884580B2 (en) | 2015-06-07 | 2021-01-05 | Apple Inc. | Devices and methods for displaying content in a note-taking application |
| US20160378291A1 (en) | 2015-06-26 | 2016-12-29 | Haworth, Inc. | Object group processing and selection gestures for grouping objects in a collaboration system |
| US9880735B2 (en) | 2015-08-10 | 2018-01-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
| US20180095653A1 (en) | 2015-08-14 | 2018-04-05 | Martin Hasek | Device, method and graphical user interface for handwritten interaction |
| US9965175B2 (en) | 2015-08-25 | 2018-05-08 | Myscript | System and method of digital note taking |
| US20170285932A1 (en) | 2016-03-29 | 2017-10-05 | Microsoft Technology Licensing, Llc | Ink Input for Browser Navigation |
| US10318253B2 (en) | 2016-05-13 | 2019-06-11 | Sap Se | Smart templates for use in multiple platforms |
| US10209789B2 (en) | 2017-02-16 | 2019-02-19 | Dell Products L.P. | Enabling a user to enter notes without authenticating the user |
- 2017
  - 2017-06-29 US US15/638,067 patent/US20180329597A1/en not_active Abandoned
  - 2017-06-29 US US15/638,058 patent/US10599320B2/en active Active
- 2018
  - 2018-04-13 WO PCT/US2018/027412 patent/WO2018212869A1/en not_active Ceased
  - 2018-04-16 WO PCT/US2018/027695 patent/WO2018212878A1/en not_active Ceased
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180239742A1 (en) * | 2017-02-21 | 2018-08-23 | Canon Kabushiki Kaisha | Information processing apparatus, control method thereof, and storage medium |
| US10599749B2 (en) * | 2017-02-21 | 2020-03-24 | Canon Kabushiki Kaisha | Information processing apparatus configured to blot out confidential information in a document, control method thereof, and storage medium |
| US20200160573A1 (en) * | 2018-11-16 | 2020-05-21 | Cimpress Schweiz Gmbh | Technology for enabling elastic graphic design |
| US10846897B2 (en) | 2018-11-16 | 2020-11-24 | Cimpress Schweiz Gmbh | Technology for managing graphic design using metadata relationships |
| US11138647B2 (en) | 2018-11-16 | 2021-10-05 | Cimpress Schweiz Gmbh | Method, device, and computer-readable storage medium for managing variations of a graphic design within a framework |
| US11380031B2 (en) * | 2018-11-16 | 2022-07-05 | Cimpress Schweiz Gmbh | Technology for enabling elastic graphic design |
| US11640685B2 (en) | 2018-11-16 | 2023-05-02 | Cimpress Schweiz Gmbh | Technology for managing graphic design using metadata relationships |
| US12277634B2 (en) | 2018-11-16 | 2025-04-15 | Cimpress Schweiz Gmbh | Technology for managing graphic design using metadata relationships |
| US12411589B2 (en) * | 2022-03-08 | 2025-09-09 | Casio Computer Co., Ltd. | Information processing apparatus, display control method and storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| US10599320B2 (en) | 2020-03-24 |
| WO2018212878A1 (en) | 2018-11-22 |
| WO2018212869A1 (en) | 2018-11-22 |
| US20180329596A1 (en) | 2018-11-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10599320B2 (en) | Ink Anchoring | |
| US20180329589A1 (en) | Contextual Object Manipulation | |
| US9830049B2 (en) | Apparatus and method for providing a visual transition between screens | |
| US10534524B2 (en) | Method and device for controlling reproduction speed of multimedia content | |
| TWI720595B (en) | Electronic apparatus and controlling method of electronic apparatus | |
| US11068074B2 (en) | Flexible device and interfacing method thereof | |
| US20180329621A1 (en) | Object Insertion | |
| CN106716493B (en) | Method for styling content and touch screen device for styling content | |
| US9747007B2 (en) | Resizing technique for display content | |
| KR20140046346A (en) | Multi display apparatus and method for contorlling thereof | |
| CN104114658A (en) | Flexible display device and display method thereof | |
| US8762840B1 (en) | Elastic canvas visual effects in user interface | |
| US10691880B2 (en) | Ink in an electronic document | |
| US20250173035A1 (en) | Window display method, electronic device, and computer-readable storage medium | |
| KR20140096780A (en) | Contents display method and mobile terminal implementing the same | |
| US20180329876A1 (en) | Smart Templates | |
| US20180329871A1 (en) | Page-Based Navigation for a Dual-Display Device | |
| US20180329610A1 (en) | Object Selection Mode | |
| US20240184443A1 (en) | Display control method, electronic device, and readable storage medium | |
| JP2013228908A (en) | Information processing device, display control method, and display control program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SONNINO, EDUARDO;DART, ANTHONY;PENDLAY, RYAN CHANDLER;AND OTHERS;SIGNING DATES FROM 20170524 TO 20170606;REEL/FRAME:042978/0116 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | AS | Assignment | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON; Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE TYPOGRAPHICAL ERROR IN AN APPLICATION DERIVATIVE PREVIOUSLY RECORDED ON REEL 042978 FRAME 0116. ASSIGNOR(S) HEREBY CONFIRMS THE FILING DATE OF APPLICATION NO. 15/638,067 IS JUNE 29, 2017;ASSIGNORS:SONNINO, EDUARDO;DART, ANTHONY;PENDLAY, RYAN CHANDLER;AND OTHERS;SIGNING DATES FROM 20170524 TO 20170606;REEL/FRAME:054701/0377 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |