
HK1181163B - Visual navigation of documents by object - Google Patents

Visual navigation of documents by object

Info

Publication number
HK1181163B
Authority
HK
Hong Kong
Prior art keywords
objects
interaction
display
spreadsheet
chart
Application number
HK13108430.6A
Other languages
Chinese (zh)
Other versions
HK1181163A (en)
Inventor
A. Lin
G. Berry
Original Assignee
Microsoft Technology Licensing, LLC
Application filed by Microsoft Technology Licensing, LLC
Publication of HK1181163A
Publication of HK1181163B

Description

Visual navigation of documents by object
Technical Field
The invention relates to visual navigation of documents by object.
Background
Some documents, such as spreadsheet workbooks, may include multiple sheets containing a large amount of data. Data in a spreadsheet is often represented as objects, such as tables and charts. These objects are often arranged in no particular order and may be spread over several pages. Locating data and objects in a spreadsheet can be difficult. For example, a user may need to search through each of the different sheets to locate desired information, which can be laborious and time consuming.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Navigable views of objects (e.g., charts, tables, graphs, data sources, pages of a workbook, …) obtained from one or more electronic object sources are displayed as graphical objects (e.g., thumbnails). Objects may be automatically/manually organized within the navigable display (e.g., by type, by page, by relationship, by data source, …). The user may navigate through the displayed objects using touch input and/or non-touch input. For example, a user may zoom in on an object to view a full screen version of the object. When an object is zoomed in, the navigable display can be panned (e.g., to the left, right, up, or down) to view nearby objects. The user can explore related elements from one object using the same shared data (e.g., exploring a table object also displays its data source elements). The user may also perform supported operations on the objects that affect the display of related objects (e.g., sort/filter/drill down/drill up).
Drawings
FIG. 1 illustrates an exemplary computing device;
FIG. 2 illustrates an exemplary system for visually navigating a display of objects obtained from one or more electronic object sources;
FIG. 3 illustrates a process for creating and interacting with a navigable display of objects;
FIG. 4 illustrates a process for updating a navigable display of objects;
FIG. 5 illustrates a system architecture for creating and interacting with a navigable display of objects; and
FIGS. 6-15 show exemplary displays illustrating visual navigation of objects.
Detailed Description
Referring now to the drawings, in which like numerals represent like elements, various embodiments will be described. In particular, FIG. 1 and the corresponding discussion are intended to provide a brief, general description of a suitable computing environment in which embodiments may be implemented.
Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Other computer system configurations may also be used, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. Distributed computing environments may also be used where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Referring now to FIG. 1, an illustrative computer architecture for a computer 100 utilized in the various embodiments will be described. The computer architecture shown in FIG. 1 may be configured as a server computing device, a desktop computing device, or a mobile computing device (e.g., a smartphone, a notebook, a tablet, …) and includes a central processing unit 5 ("CPU"), a system memory 7 including a random access memory 9 ("RAM") and a read only memory ("ROM") 10, and a system bus 12 that couples the memory to the central processing unit ("CPU") 5.
A basic input/output system containing the basic routines that help to transfer information between elements within the computer, such as during startup, is stored in the ROM 10. The computer 100 also includes a mass storage device 14 for storing an operating system 16, applications 24, presentations/documents 27, and other program modules, such as a Web browser 25, a navigation manager 26, which will be described in greater detail below.
The mass storage device 14 is connected to the CPU 5 through a mass storage controller (not shown) connected to the bus 12. The mass storage device 14 and its associated computer-readable media provide non-volatile storage for the computer 100. Although the description of computer-readable media contained herein refers to a mass storage device, such as a hard disk or CD-ROM drive, the computer-readable media can be any available media that can be accessed by the computer 100.
By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, erasable programmable read-only memory ("EPROM"), electrically erasable programmable read-only memory ("EEPROM"), flash memory or other solid state memory technology, CD-ROM, digital versatile disks ("DVD"), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 100.
According to various embodiments, the computer 100 may operate in a networked environment using logical connections to remote computers through a network 18, such as the Internet. The computer 100 may connect to the network 18 through a network interface unit 20 connected to the bus 12. The network connection may be wireless and/or wired. The network interface unit 20 may also be used to connect to other types of networks and remote computer systems. The computer 100 may also include an input/output controller 22 for receiving and processing input from a number of other devices, such as a touch input device. The touch input device may utilize any technology that allows for the recognition of single/multi-touch inputs (touch/no-touch). For example, the above techniques may include, but are not limited to: heat, finger pressure, high capture rate cameras, infrared light, optical capture, tuned electromagnetic induction, ultrasonic receivers, sensing microphones, laser rangefinders, shadow capture, and the like. According to one embodiment, the touch input device may be configured to detect a proximity touch (i.e., within a certain distance from, but not in physical contact with, the touch input device). A touch input device may also serve as the display 28. An input/output controller 22 may also provide output to one or more display screens, printers, or other types of output devices.
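A minimal TypeScript sketch of how such touch/non-touch input might drive a navigable display is shown below. PointerEvent is a standard DOM API that covers touch, pen, and mouse input, while NavigableDisplay and its methods are hypothetical names used only for illustration.

```typescript
// Minimal sketch of routing touch and non-touch pointer input to a navigable
// display via the standard DOM PointerEvent API. `NavigableDisplay` and its
// methods are hypothetical names, not part of the described system.
interface NavigableDisplay {
  zoomToObjectAt(x: number, y: number): void; // tap => zoom to full screen
  panBy(dx: number, dy: number): void;        // drag => pan among objects
}

function attachInput(el: HTMLElement, display: NavigableDisplay): void {
  let start: { x: number; y: number } | null = null; // where the press began
  let last: { x: number; y: number } | null = null;  // last pointer position

  el.addEventListener("pointerdown", (e) => {
    start = last = { x: e.clientX, y: e.clientY };
  });

  el.addEventListener("pointermove", (e) => {
    if (!last) return; // only pan while the pointer is down
    display.panBy(e.clientX - last.x, e.clientY - last.y);
    last = { x: e.clientX, y: e.clientY };
  });

  el.addEventListener("pointerup", (e) => {
    // A press-and-release with almost no movement is treated as a tap.
    if (start && Math.hypot(e.clientX - start.x, e.clientY - start.y) < 4) {
      display.zoomToObjectAt(e.clientX, e.clientY);
    }
    start = last = null;
  });
}
```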
The camera and/or some other sensing device may be operable to record one or more users and capture motions and/or gestures made by the user of the computing device. The sensing device may also be operable to capture spoken words, such as by a microphone, and/or to capture other input from a user, such as by a keyboard and/or mouse (not depicted). The sensing device may comprise any motion detection device capable of detecting movement of a user. For example, the camera may comprise a Microsoft KINECT motion capture device comprising a plurality of cameras and a plurality of microphones.
Embodiments of the invention may be practiced with a system on a chip (SOC) in which each or many of the components/processes shown in the figures may be integrated onto a single integrated circuit. Such SOC devices may include one or more processing units, graphics units, communication units, system virtualization units, and various application functions, all integrated (or "burned") onto a chip substrate as a single integrated circuit. When running via an SOC, all/some of the functionality described herein may be integrated with other components of the computer 100 onto a single integrated circuit (chip).
As mentioned briefly above, a number of program modules and data files may be stored in the mass storage device 14 and RAM 9 of the computer 100, including an operating system 16 suitable for controlling the operation of a networked computer, such as a WINDOWS operating system from MICROSOFT CORPORATION of Redmond, Washington.
The mass storage device 14 and RAM 9 may also store one or more program modules. In particular, the mass storage device 14 and the RAM 9 may store one or more applications, such as a navigation manager 26, a productivity application 24 (e.g., a spreadsheet application such as Microsoft EXCEL, a presentation application such as Microsoft POWERPOINT, a word processing application such as Microsoft WORD, a messaging application such as Microsoft OUTLOOK, etc.), and may store one or more Web browsers 25. The Web browser 25 may be used to request, receive, render, and provide interaction with electronic content, such as Web pages, videos, documents, and the like. According to one embodiment, the Web browser includes the INTERNET EXPLORER Web browser application program from MICROSOFT CORPORATION.
Navigation manager 26 may be located on a client device and/or a server device (e.g., within service 19). Navigation manager 26 may be configured as an application/process for providing resources to different tenants (e.g., Microsoft OFFICE 365, Microsoft WEB APPS, Microsoft SHAREPOINT ONLINE) and/or as part of a cloud-based multi-tenant service.
Generally speaking, navigation manager 26 is configured to display navigable views of objects (e.g., charts, tables, graphs, data sources, pages of a workbook, …) obtained from one or more electronic object sources, which are displayed as graphical objects (e.g., thumbnails). Some objects included within the navigable display may be from different pages of a spreadsheet workbook, other objects may be from another spreadsheet workbook, and still other objects displayed in the navigable display may be from other object sources (e.g., spreadsheets, web pages, documents, etc.). Objects may be automatically/manually organized within the navigable display (e.g., by type, by page, by relationship, by data source, …). The user may navigate through the displayed objects. For example, a user may zoom in on an object to view a full screen version of the object. When an object is zoomed in, the navigable display can be panned (e.g., to the left, right, up, or down) to view nearby objects. The user may explore related elements from one object using the same shared data (e.g., exploring a chart object also displays its data source elements). The user may also perform supported operations on the objects that affect the display of related objects (e.g., sort/filter/drill down/drill up). Additional details regarding navigation manager 26 will be provided below.
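As an illustration of the kind of object model a navigation manager might operate over, the following TypeScript sketch is one possible (assumed, not prescribed) representation of objects, their sources, and the shared data sources that relate them.

```typescript
// One possible (assumed) object model for a navigation manager. Type and
// field names are illustrative, not part of the described system.
type ObjectKind = "table" | "chart" | "pivotTable" | "page" | "dataSource";

interface NavObject {
  id: string;
  kind: ObjectKind;
  title: string;
  sourceDocument: string;  // e.g., the workbook, web page, or document of origin
  page?: string;           // page/sheet the object was obtained from
  dataSourceIds: string[]; // data sources the object draws on
  thumbnailUrl?: string;   // rendered preview shown in the top-level view
}

// Two objects are related when they share at least one data source.
function related(a: NavObject, b: NavObject): boolean {
  return a.dataSourceIds.some((id) => b.dataSourceIds.includes(id));
}
```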
FIG. 2 illustrates an exemplary system for visually navigating a display of objects obtained from one or more electronic object sources. As illustrated, system 200 includes a service 210, a navigation manager 240, storage 245, a touch screen input device/display 250 (e.g., a slate), and a smartphone 230.
As shown, service 210 is a cloud-based and/or enterprise-based service that may be configured to provide productivity services (e.g., Microsoft OFFICE 365, Microsoft WEB APPS, Microsoft POWERPOINT). The functionality of one or more of the services/applications provided by service 210 may also be configured as client-based applications. For example, a client device may include a spreadsheet application that interacts with pages that store objects within a grid. Although the system 200 illustrates a productivity service, other services/applications may be configured to visually navigate objects obtained from one or more electronic object sources.
As illustrated, service 210 is a multi-tenant service that provides resources 215 and services to any number of tenants (e.g., tenants 1-N). According to an embodiment, multi-tenant service 210 is a cloud-based service that provides resources/services 215 to tenants that subscribe to the service, as well as maintains and protects each tenant's data separately from other tenant data.
The illustrated system 200 includes a touch screen input device/display 250 (e.g., a pad/tablet device) and a mobile phone 230 that detect when a touch input (e.g., a finger touch or near contact to the touch screen) has been received. Any type of touch screen that detects a touch input by a user may be utilized. For example, a touch screen may include one or more layers of capacitive material that detect touch inputs. Other sensors may be used in addition to or in place of capacitive materials. For example, an Infrared (IR) sensor may be used. According to an embodiment, the touch screen is configured to detect an object in contact with or above the touchable surface. Although the term "above" is used in this specification, it should be understood that the orientation of the touch panel system is irrelevant. The term "above" is intended to be applicable to all such orientations. The touch screen may be configured to determine a location (e.g., a start point, an intermediate point, and an end point) at which the touch input is received. The actual contact between the touchable surface and the object may be detected by any suitable means, including, for example, a vibration sensor or microphone coupled to the touch panel. A non-exhaustive list of examples of sensors for detecting contact includes: pressure-based mechanisms, micromechanical accelerometers, piezoelectric devices, capacitive sensors, resistive sensors, inductive sensors, laser vibrometers, and LED vibrometers.
As illustrated, the touchscreen input device/display 250 shows an exemplary navigable display 252 of objects obtained from one or more electronic object sources (e.g., spreadsheet documents). As illustrated, navigable display 252 shows a display (e.g., as thumbnails) of objects obtained from a spreadsheet (e.g., one or more pages of a spreadsheet from one or more workbooks) and organized by object type. A spreadsheet workbook may include multiple objects on different pages. The objects may be any item in the workbook that may be individually selected or manipulated. For example, objects in a workbook may include tables, charts, pivot tables, pivot charts, pages of a spreadsheet workbook, data sources, and so forth. According to an embodiment, objects may be obtained from other electronic object sources (e.g., web pages, databases, documents, data feeds, etc.). Some of the objects displayed in the navigable display 252 may use data from a common data source. For example, both charts and tables may use data from the same data source. According to one embodiment, an object may be defined to include a plurality of smaller objects. For example, an object may correspond to multiple pages of a workbook. Each of the object sources may include a plurality of objects. The objects may be associated with static or dynamic information. Objects within different object sources may be associated/related to each other. For example, an object from a first object source may be a different view of an object within a second object source. An object may also be similar to another object. According to an embodiment, the object source is configurable. For example, a user may use the user interface to select an object from a single spreadsheet, multiple spreadsheets, a single document, multiple documents, and/or from other electronic object sources to create a navigable display. An application (e.g., a spreadsheet application) may be configured to create a navigable display of spreadsheet objects associated with a current file being used by a user.
The navigation manager 240 is configured to determine electronic objects within the selected electronic object source and then graphically display the objects (e.g., as thumbnails). Objects may be automatically/manually organized within the navigable display (e.g., by type, by page, by relationship, by data source, …). As illustrated, navigable display 252 arranges the objects by type (e.g., chart, worksheet, table, and pivot table). The navigable display 252 displays to the user a top-level view of each of the different types of objects contained within the selected electronic object source. The user may interact with navigable display 252 to navigate through different objects. For example, navigable display 252 can be manipulated (e.g., zoomed, panned, …) to display one or more objects differently. For example, when a user selects an object, the view is updated to show a larger version of the object. The user may zoom in on the object by selecting the object to view a full screen version (or different zoom levels) of the object. In the current example, the user of the mobile device has enlarged the chart object 260 to obtain a better view of the object. When an object is zoomed in, the navigable display (e.g., display 232) can be panned (e.g., to the left, right, up, or down) to view nearby objects. The user can explore related elements from one object using the same shared data (e.g., exploring a table object also displays its data source elements). The user may also perform supported operations on the objects that affect the display of related objects (e.g., sort/filter/drill down/drill up). For example, a user may select chart 260 to interact with and/or drill down in the chart to see the associated data source from which the chart was created. According to an embodiment, the navigable display displays the relationship between different objects by showing lines between them. The navigable display may also display elements in a relationship. For example, the navigable display can be arranged by usage of the data sources (e.g., objects using a first data source grouped together, objects using a second data source grouped together).
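One way the automatic organization described above might be implemented is a simple grouping pass over the object model; the sketch below (reusing the hypothetical NavObject type from the earlier sketch) groups objects by type or by shared data source.

```typescript
// Illustrative grouping pass for the automatic organization described above,
// reusing the hypothetical NavObject type sketched earlier.
declare const allObjects: NavObject[];

function groupBy<K extends string>(
  objects: NavObject[],
  key: (o: NavObject) => K,
): Map<K, NavObject[]> {
  const groups = new Map<K, NavObject[]>();
  for (const o of objects) {
    const k = key(o);
    const bucket = groups.get(k) ?? [];
    bucket.push(o);
    groups.set(k, bucket);
  }
  return groups;
}

// Organize by type (as navigable display 252 does) ...
const byType = groupBy(allObjects, (o) => o.kind);
// ... or by shared data source, so that related objects are grouped together.
const byDataSource = groupBy(allObjects, (o) => o.dataSourceIds[0] ?? "none");
```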
The user may manually configure the layout of the objects. For example, the user may select objects to display together within the navigable display. Thus, users can navigate the workbook in their own manner and are not limited by how the workbook was originally authored.
FIGS. 3-4 illustrate exemplary processes for creating and interacting with a navigable display of objects. Upon reading the discussion of the routines presented herein, it should be appreciated that the logical operations of the various embodiments are implemented as: (1) a series of computer implemented acts or program modules running on a computing system; and/or (2) interconnected machine logic circuits or circuit modules within the computing system. Such implementation is a matter of choice dependent on the performance requirements of the computing system implementing the invention. Accordingly, the logical operations illustrated and making up the embodiments described herein are referred to variously as operations, structural devices, acts, or modules. These operations, structural devices, acts, and modules may be implemented in software, in firmware, in special purpose digital logic, and any combination thereof. Although the operations are shown in a particular order, the order of the operations may be changed and performed in other orders.
FIG. 3 illustrates a process for creating and interacting with a navigable display of objects.
After a start operation, the process flows to operation 310, where objects to be included within the navigable display are determined from one or more electronic object sources. According to one embodiment, objects are obtained from one or more spreadsheet workbooks. The objects may also be obtained from other electronic object sources (e.g., web pages, databases, documents, data feeds, etc.). An object included within a navigable display can be any item in an electronic object source that is defined as an object (e.g., a table, a chart, a pivot table, a page of a spreadsheet workbook, a data source, etc.).
Moving to operation 320, the organization of the objects is determined. The organization may be determined automatically/manually and may be updated during interaction with the navigable display. For example, objects may be organized by: the type of the object; the use of a common data source; coming from the same spreadsheet/workbook; and the like. The user may determine how to organize a portion/all of the objects. For example, the objects may be automatically organized first, and then the user may selectively change the organization (e.g., move the objects to another area within the navigable display).
Flowing to operation 330, the object is displayed on the navigable display. According to an embodiment, each object is displayed as a selectable thumbnail on a canvas that can be manipulated (e.g., moved, zoomed).
Turning to operation 340, interactions are received that affect a navigable display. The interaction may be a variety of different interactions, such as but not limited to: touch input, mouse input, stylus input, and the like. The interaction may include selecting an object, adjusting the display of the object (e.g., zooming, panning), adjusting the organization of elements, and so forth. For example, the user may tap an object within the display to zoom in on the object. The user may perform a pan gesture to pan the navigable display, and the like. The interaction may include selection of one or more of the displayed objects and/or manipulation of the navigable display to generate different views (e.g., pan/zoom). For example, the user may zoom in on the object. The user may then drill down into the object to view the data source associated with the object. During interaction, a user may change data that affects the display of an object (e.g., changing the original data of a chart changes the display of the chart). The user can explore other elements from the top-level display of the object and from lower-level views of the object. For example, the user may manually and/or automatically cycle through each object (e.g., each object is displayed for a period of time until interrupted by the user). The user may also perform supported operations on the objects that affect the display of related objects (e.g., sort/filter/search/drill down/drill up).
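The interaction handling of operation 340 might be modeled as a small state transition, as in the following hedged sketch; the Interaction and ViewState shapes are assumptions for illustration, again reusing the hypothetical NavObject type.

```typescript
// Assumed shapes for the interactions of operation 340 and the view state
// they act on; NavObject is the hypothetical type sketched earlier.
type Interaction =
  | { kind: "tap"; objectId: string }       // zoom in on an object
  | { kind: "pan"; dx: number; dy: number } // pan to nearby objects
  | { kind: "drillDown"; objectId: string } // surface the data source
  | { kind: "sort"; objectId: string; field: string }
  | { kind: "filter"; objectId: string; field: string };

interface ViewState {
  focusedId: string | null;         // object shown full screen, if any
  offset: { x: number; y: number }; // pan offset of the canvas
}

function applyInteraction(
  state: ViewState,
  i: Interaction,
  objects: NavObject[],
): ViewState {
  switch (i.kind) {
    case "tap":
      return { ...state, focusedId: i.objectId };
    case "pan":
      return {
        ...state,
        offset: { x: state.offset.x + i.dx, y: state.offset.y + i.dy },
      };
    case "drillDown": {
      // Focus the first data source the object uses, if it has one.
      const obj = objects.find((o) => o.id === i.objectId);
      const src = obj?.dataSourceIds[0];
      return src !== undefined ? { ...state, focusedId: src } : state;
    }
    default:
      // Sort/filter change the object's data; the caller then re-renders
      // every related object (operation 350).
      return state;
  }
}
```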
Moving to operation 350, in response to the interaction, the navigable display is updated (e.g., the view of one or more objects in the navigable display is changed).
The process then moves to an end operation and returns to processing other actions.
FIG. 4 illustrates a process for updating a navigable display of objects.
After a start operation, process 400 flows to operation 410, where the organization of the objects is updated when a change is determined. For example, an object may be deleted from or added to an electronic object source. The user may change the organization of the objects. For example, a user may organize objects by their relationship (e.g., sales, purchases, …), by the type of data, by the data source, and so forth. The organization of the navigable display is updated in response to any changes.
Moving to operation 420, operations performed on an object are applied when determined. For example, the user may edit values within the object, select a different display option for the object (e.g., display a pie chart instead of a bar chart), drill down/up into the object, and so on.
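For example, an edit made in operation 420 could be propagated by finding every object that shares the edited data source; the following sketch (with assumed names, building on the earlier NavObject type) illustrates the idea.

```typescript
// Sketch (assumed names) of propagating an edit: any object sharing the
// edited data source must be redisplayed, per the NavObject model above.
function objectsToRefresh(
  objects: NavObject[],
  editedDataSourceId: string,
): NavObject[] {
  return objects.filter((o) => o.dataSourceIds.includes(editedDataSourceId));
}
```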
Flowing to operation 430, the navigable display is updated as determined. For example, the user may pan the navigable display, zoom in on the navigable display, select another object to display, and so on.
The process then moves to an end operation and returns to processing other actions.
FIG. 5 illustrates a system architecture for creating and interacting with a navigable display of objects as described herein. Content used and displayed by applications (e.g., application 1020) and navigation manager 26 may be stored in different locations. For example, application 1020 may use/store data using directory service 1022, web portal 1024, mailbox service 1026, instant messaging store 1028, and social networking site 1030. The application 1020 may use any of these types of systems or the like. A server 1032 may be used to access electronic object sources and to generate navigable displays of objects. For example, the server 1032 may generate a navigable display of the application 1020 for display at a client (e.g., a browser or some other window). As one example, server 1032 may be a web server configured to provide productivity services (e.g., spreadsheets, presentations, word processing, messaging, document collaboration, etc.) to one or more users. The server 1032 may interact with clients over the web through the network 1008. The server 1032 may also include an application program (e.g., a spreadsheet application). Examples of clients that may interact with the server 1032 and a presentation application include computing device 1002, which may be any general purpose personal computer, tablet computing device 1004, and/or mobile computing device 1006, which may include a smart phone. Any of these devices may obtain content from storage 1016.
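Purely as an assumed illustration of the client/server split described above, a client might request a document's objects from a server such as server 1032 as follows; the endpoint and response shape are invented for the sketch and are not part of the described system.

```typescript
// Assumed client call for obtaining a document's objects from a server such
// as server 1032; the endpoint and response shape are hypothetical.
async function fetchObjects(documentUrl: string): Promise<NavObject[]> {
  const res = await fetch(`/api/objects?doc=${encodeURIComponent(documentUrl)}`);
  if (!res.ok) throw new Error(`object source unavailable: ${res.status}`);
  return (await res.json()) as NavObject[];
}
```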
FIGS. 6-15 show exemplary displays illustrating visual navigation of objects. The examples shown herein are for illustration purposes and are not intended to be limiting. Although the objects illustrated in FIGS. 6-15 are obtained from spreadsheets, electronic objects may be obtained from other electronic object sources.
FIG. 6 illustrates an exemplary landscape slate display showing a top-level view of objects in a navigable display.
Display 600 shows an exemplary top-level view of objects within a spreadsheet. As shown, objects within the navigable display are organized by a worksheet 610 within a spreadsheet workbook, a pivot table 620 within a spreadsheet workbook, a chart 630 within a spreadsheet workbook, and a table within a spreadsheet workbook. By viewing the top-level navigable display, a user can easily view objects contained within a selected electronic object source (e.g., spreadsheet).
In the current example, user 622 selects the display of worksheet 610 on the navigable display.
FIG. 7 illustrates an exemplary landscape slate display showing selection of an organization classification from a top-level view of objects in a navigable display.
Display 700 shows a larger view of the objects contained within the worksheet 610 of the navigable display. As illustrated, the worksheet 610 includes a gas usage 710 object, a power usage 720 object, and a data source object 730. The data source objects may be associated with gas usage 710 objects and power usage 720 objects.
In the current example, user 622 selects gas usage object 710 on the display of worksheet 610 of the navigable display.
FIG. 8 illustrates an exemplary landscape slate display showing selection of a single object from a navigable display.
Display 800 shows a larger view of gas usage object 710 contained within a navigable display's worksheet 610. The user may also drill down into the gas usage object 710 to display one or more data sources used by the gas usage object 710. For example, selecting the gas usage object 710 may display an associated data source object (e.g., see FIG. 10). The user may also interact with the gas usage object (change view, filter data, edit data, …).
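A drill-down of this kind might be implemented by filtering the object model for the data source objects an object references, as in this brief sketch (using the hypothetical NavObject type from earlier).

```typescript
// Sketch of the drill-down described above: selecting an object such as gas
// usage object 710 surfaces the data source objects it references.
function drillDown(obj: NavObject, all: NavObject[]): NavObject[] {
  return all.filter(
    (o) => o.kind === "dataSource" && obj.dataSourceIds.includes(o.id),
  );
}
```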
According to an embodiment, a user may move between layers of a navigable display and/or move from one object to the next. For example, the user may move from the display of gas usage object 710 and/or from the display of the worksheet to power usage object 720.
FIG. 9 illustrates an exemplary landscape slate display showing selection of a single object from a navigable display.
Display 900 shows a larger view of power usage object 720 contained within worksheet 610 of the navigable display. The user may also drill down into power usage object 720 to display one or more data sources used by power usage object 720. For example, selecting power usage object 720 may display an associated data source object (e.g., see FIG. 10). The user may also interact with the power usage object (change view, filter data, edit data, …).
FIG. 10 illustrates an exemplary landscape slate display showing data source objects.
Display 1000 shows a larger view of data source object 730 contained within worksheet 610 of the navigable display. The data source object 730 may be associated with one or more of the objects in the navigable display. The user may also drill down into the data source object 730 to display one or more data sources included within the data source object 730. The user may also interact with the data source object (change view, filter data, edit data, …).
FIG. 11 illustrates an exemplary landscape slate display showing selection of a chart category from a top-level view of objects in a navigable display.
Display 1100 shows a larger view of the objects contained within chart 630 of the navigable display. As illustrated, chart 630 includes a temperature by time of day 1110 chart object, a heat by month and temperature 1120 chart object, a projected weekday heat 1130 chart object, and an average KWH per day 1140 chart object.
In the current example, user 622 selects chart object 1110 on the display of chart 630 of the navigable display.
FIG. 12 illustrates an exemplary landscape slate display showing selection of a single object from a navigable display.
Display 1200 shows a larger view of the temperature by time of day 1110 chart object contained within chart 630 of the navigable display. The user may also drill down into chart object 1110 to display one or more data sources used by object 1110.
FIG. 13 illustrates an exemplary landscape slate display showing selection of a single object from the navigable display.
Display 1300 shows a larger view of the heat by month and temperature 1120 chart object contained within chart 630 of the navigable display. The user may also drill down into chart object 1120 to display one or more data sources used by object 1120. As illustrated, the user 622 selects a bar within the chart object. According to an embodiment, objects may be interacted with in the same manner as when they are within the electronic object source from which they were obtained. For example, objects retain the same functionality when included within a navigable display.
FIG. 14 illustrates an exemplary landscape slate display showing selection of a single object from a navigable display.
Display 1400 shows a larger view of the projected weekday heat 1130 chart object contained within chart 630 of the navigable display. The user may also drill down into chart object 1130 to display one or more data sources used by object 1130. As illustrated, the user 622 selects a bar within the chart object.
FIG. 15 illustrates an exemplary landscape slate display showing selection of a single object from a navigable display.
Display 1500 shows a larger view of the average KWH per day 1140 chart object contained within chart 630 of the navigable display. The user may also drill down into chart object 1140 to display one or more data sources used by object 1140.
While specific embodiments of the invention have been described, other embodiments are possible. In addition, although embodiments of the present invention have been described as being associated with data stored in memory and other storage media, data may also be stored on or read from other types of computer-readable media, such as secondary storage devices (like hard disks, floppy disks, or a CD-ROM), a carrier wave from the Internet, or other forms of RAM or ROM. In addition, steps of the disclosed methods may be modified in any manner, including by reordering steps and/or inserting or deleting steps, without departing from the invention.
The above specification, examples and data provide a complete description of the manufacture and use of the composition of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention resides in the claims hereinafter appended.

Claims (9)

1. A method for visually navigating an object, comprising:
determining objects (310) within a file, the objects comprising one or more of the following types of objects: table objects, chart objects, spreadsheet objects, and data source objects; wherein the file includes other content;
determining an organization of the objects (320), including automatically organizing the objects according to a type of each of the objects; wherein the organization links related objects;
displaying the objects organized according to object type on a navigable display according to the determined organization (330);
receiving an interaction (340) related to the displayed object, the interaction comprising a drill-down to one of the object types;
updating a view of one of the object types in response to the interaction (350);
receiving an interaction with the updated view of the one of the object types that drills down to one of a table, chart, or spreadsheet; and
updating a view of the one of the table, chart, or spreadsheet in response to receiving the interaction with the updated view that drills down to the one of the table, chart, or spreadsheet.
2. The method of claim 1, wherein determining the objects (310) within the file comprises determining each of the objects contained within a spreadsheet file.
3. The method of claim 2, further comprising correlating a data source object with at least a portion of the determined objects such that the correlated data source object is displayed in response to drilling down (340) into at least a portion of the determined objects.
4. The method of claim 1, wherein determining the organization (320) of the object comprises determining at least one of: automatic organization of the objects, and manual organization of the objects.
5. The method of claim 1, wherein receiving the interaction (340) related to the displayed object comprises at least one of: determining when the interaction is at least one of: a sorting operation, a filtering operation, a drill-down operation, and a drill-up operation; automatically zooming in to display the object; updating any other objects affected in response to the received interaction; and determining when the indication is an update to the navigable display, the update comprising at least one of: panning the navigable display to show another object, and changing a zoom of the navigable display.
6. A method for visually navigating an object, comprising:
determining objects (310) from an electronic object source, the objects comprising two or more of the following types of objects: table objects, chart objects, spreadsheet objects, graphic objects, and data source objects; wherein the electronic object source includes other content;
automatically organizing the objects according to object type (320);
displaying the objects organized according to object type on a navigable display according to the organization (330);
receiving an interaction (340) related to the displayed object, the interaction comprising a drill-down to one of the object types;
updating a view of one of the object types in response to the interaction (350);
receiving an interaction with the updated view of the one of the object types that drills down to one of a table, chart, or spreadsheet; and
updating a view of the one of the table, chart, or spreadsheet in response to receiving the interaction with the updated view that drills down to the one of the table, chart, or spreadsheet.
7. A system for visually navigating an object, comprising:
a display section (28);
a network connection (20) coupled to a tenant of a multi-tenant service;
a processor (5) and a computer readable medium (14); and
an operating environment (16) stored on the computer-readable medium and executing on the processor;
wherein the processor (5) is configured to perform a process operating under control of the operating environment and operative to perform actions (26), comprising:
determining objects (310) from an electronic object source, the objects comprising two or more of the following types of objects: table objects, chart objects, spreadsheet objects, graphic objects, and data source objects; wherein the electronic object source includes other content;
automatically organizing the objects according to object type (320);
displaying the objects organized according to object type on a navigable display according to the organization (330);
receiving an interaction (340) related to the displayed object, the interaction comprising a drill-down to one of the object types;
updating a view of one of the object types in response to the interaction (350);
receiving an interaction with the updated view of the one of the object types that drills down to one of a table, chart, or spreadsheet; and
updating a view of the one of the table, chart, or spreadsheet in response to receiving the interaction with the updated view that drills down to the one of the table, chart, or spreadsheet.
8. The system of claim 7, further comprising, for each object, determining when the object uses data from a data source object (340) and displaying the data source object within the navigable display.
9. The system of claim 7, wherein receiving the interaction (340) related to the displayed object comprises determining at least one of: when the interaction is at least one of: a sorting operation, a filtering operation, a drill-down operation, and a drill-up operation; and when the indication is an update to the navigable display, the update comprising at least one of: panning the navigable display to show another object, and changing a zoom of the navigable display.

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/287,754 2011-11-02

Publications (2)

Publication Number Publication Date
HK1181163A (en) 2013-11-01
HK1181163B (en) 2018-04-20

