HK1131230A - Method, apparatus and computer program product for providing metadata entry
Description
Technical Field
Embodiments of the present invention relate generally to content management technology and, more particularly, to a method, apparatus, mobile terminal and computer program product for supporting content management using metadata entries.
Background
The modern communications era has brought about a tremendous expansion of wireline and wireless networks. Computer networks, television networks, and telephony networks are experiencing an unprecedented technological expansion, fueled by consumer demand. Wireless networking and mobile networking technologies have addressed related consumer demands and have made information transfer more flexible and intuitive.
Current and future networking technologies continue to facilitate ease of information transfer and convenience to users by expanding the capabilities of mobile electronic devices. As the capabilities of mobile electronic devices expand, the storage capacity of these devices correspondingly increases, allowing users to store very large amounts of content on the devices. Given this growth in stored content, and given the limitations often faced by mobile electronic devices such as mobile phones with respect to display size, text entry speed, and the physical implementation of the user interface (UI), challenges have been created in content management. In particular, an imbalance may be perceived between the growth of the stored content and the development of the physical UI.
To address the aforementioned imbalance, content management may be enhanced with metadata. Metadata, which may also be referred to as "tags," includes information separate from but related to an object. Objects can be tagged by adding metadata to them. Likewise, metadata may be used to specify properties of an object that are not apparent from the object itself. The metadata may then be used to organize the objects so as to improve content management performance.
Currently, devices such as mobile terminals are becoming more and more adept at content creation (e.g., images, videos, product descriptions, event descriptions, etc.). However, tagging objects as a result of content creation is typically a challenge given the limited physical UI capabilities of mobile terminals. For example, it is cumbersome to type in a new metadata entry for each content item created. Thus, while tagging objects with metadata improves content management performance, the efficiency of tagging may become a limiting factor.
In addition, some methods of inserting metadata based on context have been developed. Context metadata describes the context in which a particular content item is "created". In the following, the term "created" should be understood to also encompass "captured", "received" and "downloaded". In other words, content is defined as "created" whenever it is first stored on a device, by whatever means, regardless of whether the content was previously present on other devices. Context metadata may be associated with each content item to provide annotations that facilitate effective content management features such as search functionality and organization functionality. Thus, context metadata can provide an automated mechanism that enhances content management and minimizes user labor. However, context metadata, as well as other types of metadata, may be standardized based on factors such as context. Thus, for example, tagging content items that may have more than one context may become complicated.
It would therefore be advantageous to provide an improved method of associating metadata with a created content item, which is simpler and easier to use in a mobile environment.
Disclosure of Invention
Methods, apparatus and computer program products are therefore provided to support efficient metadata entry. In particular, methods, apparatuses and computer program products are provided that allow a user (e.g., of a mobile terminal) to associate particular metadata with particular functional elements, such as keys of a keyboard, keypad or touch screen. Thus, more efficient tagging can be performed without using drag-and-drop operations, which are difficult to employ on mobile devices. In addition, dedicated keys or shared dedicated keys may be provided to implement certain embodiments of the provided tagging features. Selection of a key may result in insertion of an editable tag, may initiate a separate tagging feature in an internet service or mobile device application, or may result in display of a pre-created set of metadata tags from which a user may select one or more tags to be used. Accordingly, efficiency and versatility of metadata tag use may be enhanced, and content management of the mobile terminal may be improved.
In one exemplary embodiment, a method for providing metadata tagging is presented. The method includes rendering an activity, such as a media file, via an electronic device, receiving a selection of a key of the electronic device, and modifying a metadata entry of the rendered media file in response to the selection of the key. Modifying the metadata may include designating the metadata entry associated with the selected key as metadata for the rendered media file. Optionally, modifying the metadata may include rendering metadata of the rendered media file that is associated with the selected key.
In another exemplary embodiment, a computer program product for providing metadata tags is presented. The computer program product includes at least one computer-readable storage medium having computer-readable program code portions stored therein. The computer-readable program code portions include first, second and third executable portions. The first executable portion is for rendering an activity, such as a media file, via an electronic device. The second executable portion is for receiving a selection of a key of the electronic device. The third executable portion is for modifying a metadata entry for the rendered activity in response to selection of the key.
In another exemplary embodiment, an apparatus for providing metadata tagging is presented. The apparatus includes an output device, an interface element, and a tagging element. The output device is capable of rendering an activity, such as a media file, via the electronic device. The interface element is capable of receiving a user input selecting a key of the electronic device. The tagging element is configured to modify a metadata entry of the rendered activity in response to selection of the key.
In another exemplary embodiment, an apparatus for providing metadata tagging is presented. The apparatus comprises: means for rendering an activity, such as a media file, via an electronic device; means for receiving a selection of a key of the electronic device; and means for modifying a metadata entry of the rendered activity in response to selection of the key.
In further embodiments, metadata assigned to a key is also assigned to a selected or rendered media file by selection of the key.
Embodiments of the present invention may provide methods, apparatus and computer program products for beneficial deployment in a mobile electronic device environment, such as on a mobile terminal capable of creating content items and objects related to various media types. As a result, for example, mobile terminal users may enjoy improved content management capabilities.
Drawings
Having thus described embodiments of the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
FIG. 1 is a schematic block diagram of a mobile terminal according to an exemplary embodiment of the present invention;
FIG. 2 is a schematic block diagram of a wireless communication system according to an exemplary embodiment of the present invention;
FIG. 3 shows a schematic block diagram of a portion of a system for providing metadata entry in accordance with an exemplary embodiment of the present invention;
FIG. 4 illustrates a keyboard including dedicated keys for use in a system for providing metadata entry according to an exemplary embodiment of the present invention;
FIG. 5 illustrates a list of metadata tags and related content items according to an exemplary embodiment of the present invention;
FIG. 6 illustrates an example of a metadata entry that may be used to specify a predetermined relationship between a key and particular metadata, according to an exemplary embodiment of the invention; and
FIG. 7 illustrates a block diagram of an exemplary method for providing metadata entries, according to an exemplary embodiment of the present invention.
Detailed Description
Embodiments of the present invention now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, embodiments of the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout.
FIG. 1 illustrates a block diagram of a mobile terminal 10 that would benefit from embodiments of the present invention. It should be understood, however, that a mobile telephone as illustrated and hereinafter described is merely illustrative of one type of mobile terminal that would benefit from embodiments of the present invention and, therefore, should not be taken to limit the scope of embodiments of the present invention. While some embodiments of the mobile terminal 10 are illustrated and will be hereinafter described for purposes of example, other types of mobile terminals, such as portable digital assistants (PDAs), pagers, mobile televisions, gaming devices, laptop computers, cameras, video recorders, audio/video players, radios, GPS devices, or any combination of the aforementioned, and other types of voice and text communications systems, can readily employ embodiments of the present invention. Furthermore, devices that are not mobile may also readily employ embodiments of the present invention.
Additionally, while some embodiments of the method of the present invention are performed or used by a mobile terminal 10, the method may also be used without a mobile terminal. Moreover, the system and method of embodiments of the present invention will be primarily described in conjunction with mobile communications applications. It will be appreciated, however, that the system and method of embodiments of the present invention can be utilized in conjunction with a variety of other applications, both in the mobile communications industries and outside of the mobile communications industries.
The mobile terminal 10 includes an antenna 12, or multiple antennas, in operable communication with a transmitter 14 and a receiver 16. The mobile terminal 10 further includes a controller 20 or other processing element that provides signals to and receives signals from the transmitter 14 and receiver 16, respectively. The signals include signaling information in accordance with the air interface standard of the applicable cellular system, and also user speech and/or user generated data. In this regard, the mobile terminal 10 is capable of operating with one or more air interface standards, communication protocols, modulation types, and access types. By way of illustration, the mobile terminal 10 is capable of operating in accordance with any of a number of first, second and/or third-generation communication protocols or the like. For example, the mobile terminal 10 may be capable of operating in accordance with second-generation (2G) wireless communication protocols IS-136(TDMA), GSM and IS-95(CDMA), or third-generation (3G) wireless communication protocols such as UMTS, CDMA2000 and TD-SCDMA. Alternatively, the mobile device may receive a broadband broadcast program and thus have a corresponding receiver for this purpose.
It is understood that the controller 20 includes circuitry required for implementing audio and logic functions of the mobile terminal 10. For example, the controller 20 may be comprised of a digital signal processor device, a microprocessor device, and various analog-to-digital converters, digital-to-analog converters, and other support circuits. Control and signal processing functions of the mobile terminal 10 are allocated among these devices according to their respective capabilities. The controller 20 thus may also include the functionality to convolutionally encode and interleave messages and data prior to modulation and transmission. The controller 20 may additionally include an internal voice coder, and may also include an internal data modem. Further, the controller 20 may also include the functionality to operate one or more software programs, which may be stored in memory. For example, the controller 20 may be capable of operating a connectivity program, such as a conventional Web browser. The connectivity program may then allow the mobile terminal 10 to transmit and receive Web content, such as location-based content, according to a Wireless Application Protocol (WAP), for example.
The mobile terminal 10 further comprises a user interface including an output device such as a conventional earphone or speaker 24, a ringer 22, a microphone 26, a display 28, which may also be an external display (e.g., a TV, monitor, or LCD projector), and a user input interface, all of which are coupled to the controller 20. The user input interface, which allows the mobile terminal 10 to receive data, may comprise any of a number of devices allowing the mobile terminal 10 to receive data, such as a keypad 30, a touch screen (not shown) or other input device. In embodiments including the keypad 30, the keypad 30 may include conventional numeric keys (0-9) and related keys (#, *), as well as other keys for operating the mobile terminal 10. Alternatively, the keypad 30 may include a conventional QWERTY keypad arrangement or any variation of a keypad or keyboard specifically for mobile devices. The keypad 30 may also include various soft keys with associated functions. In addition, or alternatively, the mobile terminal 10 may include an interface device such as a joystick or other user input interface. The mobile terminal 10 further includes a battery 34, such as a vibrating battery pack, for powering various circuits that are required to operate the mobile terminal 10, as well as optionally providing mechanical vibration as a detectable output.
In an exemplary embodiment, the mobile terminal 10 includes a media capture module 36, such as a camera, video and/or audio module, in communication with the controller 20. The media capture module 36 may be any means of capturing images, video, and/or audio for storage, display, or transmission. For example, in an exemplary embodiment in which the media capture module 36 is a camera module, the media capture module 36 may include a digital camera capable of forming a digital image file from a captured image. As such, the camera module 36 includes all hardware, such as a lens or other optical device, and software necessary for creating a digital image file from a captured image. Alternatively, the camera module 36 may include only the hardware needed to view an image, while a memory device of the mobile terminal 10 stores instructions executed by the controller 20 in the form of software necessary to create a digital image file from a captured image. In an exemplary embodiment, the camera module 36 may further include processing elements such as a co-processor that assists the controller 20 in processing image data and an encoder and/or decoder for compressing and/or decompressing image data. The encoder and/or decoder may encode and/or decode according to a JPEG standard format.
The mobile terminal 10 may further include a user identity module (UIM) 38. Typically, the UIM 38 is a memory device having a built-in processor. For example, the UIM 38 may comprise a subscriber identity module (SIM), a universal integrated circuit card (UICC), a universal subscriber identity module (USIM), a removable user identity module (R-UIM), or the like. Typically, the UIM 38 stores information elements related to a mobile subscriber. In addition to the UIM 38, the mobile terminal 10 may be equipped with memory. For example, the mobile terminal 10 may include volatile memory 40, such as volatile Random Access Memory (RAM) including a cache area for the temporary storage of data. The mobile terminal 10 may also include other non-volatile memory 42, which can be embedded and/or may be removable. The non-volatile memory 42 may additionally or alternatively comprise an EEPROM, flash memory or the like, such as those available from SanDisk Corporation of Sunnyvale, California, or Lexar Media Inc. of Fremont, California. The memories can store any of a number of pieces of information, and data, used by the mobile terminal 10 to implement the functions of the mobile terminal 10. For example, the memories can include an identifier, such as an International Mobile Equipment Identification (IMEI) code, capable of uniquely identifying the mobile terminal 10.
Referring now to FIG. 2, an illustration of one type of system that would benefit from embodiments of the present invention is provided. The system includes a plurality of network devices. As shown, one or more mobile terminals 10 may each include an antenna 12 for transmitting signals to and for receiving signals from a base site or Base Station (BS) 44. The base station 44 may be a part of one or more cellular or mobile networks, each of which includes elements required to operate the network, such as a Mobile Switching Center (MSC) 46. As known to those skilled in the art, the mobile network may also be referred to as a base station/MSC/interworking function (BMI). In operation, the MSC 46 is capable of routing calls to and from the mobile terminal 10 when the mobile terminal 10 is making and receiving calls. The MSC 46 can also provide a connection to landline trunks when the mobile terminal 10 is involved in a call. In addition, the MSC 46 may be capable of controlling the forwarding of messages to and from the mobile terminal 10, and may also control the forwarding of messages for the mobile terminal 10 to and from a messaging center. It should be appreciated that although the MSC 46 is shown in the system of FIG. 2, the MSC 46 is merely an exemplary network device and embodiments of the present invention are not limited to use in a network employing an MSC.
The MSC 46 can be coupled to a data network, such as a Local Area Network (LAN), a Metropolitan Area Network (MAN), and/or a Wide Area Network (WAN). The MSC 46 can be directly coupled to the data network. In one typical embodiment, however, the MSC 46 is coupled to a gateway device (GTW) 48, and the GTW 48 is coupled to a WAN, such as the Internet 50. In turn, devices such as processing elements (e.g., personal computers, server computers or the like) can be coupled to the mobile terminal 10 via the Internet 50. For example, the processing elements may include one or more processing elements associated with a computing system 52 (two shown in FIG. 2), an origin server 54 (one shown in FIG. 2) or the like, as described below.
The BS 44 can also be coupled to a serving GPRS (General Packet Radio Service) support node (SGSN) 56. As known to those skilled in the art, the SGSN 56 is typically capable of performing functions similar to the MSC 46 for packet-switched services. The SGSN 56, like the MSC 46, can be coupled to a data network, such as the Internet 50. The SGSN 56 can be directly coupled to the data network. In a more typical embodiment, however, the SGSN 56 is coupled to a packet-switched core network, such as a GPRS core network 58. The packet-switched core network is then coupled to another GTW 48, such as a gateway GPRS support node (GGSN) 60, and the GGSN 60 is coupled to the Internet 50. In addition to the GGSN 60, the packet-switched core network can also be coupled to a GTW 48. Also, the GGSN 60 can be coupled to a messaging center. In this regard, the GGSN 60 and the SGSN 56, similar to the MSC 46, may be capable of controlling the forwarding of messages, such as MMS messages. The GGSN 60 and SGSN 56 may also be capable of controlling the forwarding of messages for the mobile terminal 10 to and from the messaging center.
In addition, by coupling the SGSN 56 to the GPRS core network 58 and the GGSN 60, devices such as a computing system 52 and/or origin server 54 may be coupled to the mobile terminal 10 via the Internet 50, SGSN 56 and GGSN 60. In this regard, devices such as the computing system 52 and/or origin server 54 may communicate with the mobile terminal 10 across the SGSN 56, GPRS core network 58 and the GGSN 60. By directly or indirectly connecting mobile terminals 10 and the other devices (e.g., computing system 52, origin server 54, etc.) to the Internet 50, the mobile terminals 10 may communicate with the other devices and with one another, such as according to the hypertext transfer protocol (HTTP), to thereby carry out various functions of the mobile terminals 10.
Although not all elements of all possible mobile networks are shown and described herein, it should be appreciated that the mobile terminal 10 may be coupled to one or more of any of a number of different networks through the BS 44. In this regard, the network(s) may be capable of supporting communication in accordance with any one or more of first-generation (1G), second-generation (2G), 2.5G and/or third-generation (3G) mobile communication protocols or the like. For example, one or more of the networks may be capable of supporting communication in accordance with 2G wireless communication protocols IS-136 (TDMA), GSM, and IS-95 (CDMA). Also, for example, one or more of the networks may be capable of supporting communication in accordance with 2.5G wireless communication protocols GPRS, Enhanced Data GSM Environment (EDGE), or the like. Further, for example, one or more of the networks may be capable of supporting communication in accordance with 3G wireless communication protocols, such as a Universal Mobile Telecommunications System (UMTS) network employing Wideband Code Division Multiple Access (WCDMA) radio access technology. Some narrow-band AMPS (NAMPS), as well as TACS, network(s) may also benefit from embodiments of the present invention, as should dual or higher mode mobile stations (e.g., digital/analog or TDMA/CDMA/analog phones).
The mobile terminal 10 may further be coupled to one or more wireless access points (APs) 62. The APs 62 may comprise access points configured to communicate with the mobile terminal 10 in accordance with techniques such as, for example, radio frequency (RF), Bluetooth (BT), infrared (IrDA) or any of a number of different wireless networking technologies, including wireless LAN (WLAN) techniques such as IEEE 802.11 (e.g., 802.11a, 802.11b, 802.11g, 802.11n, etc.), WiMAX techniques such as IEEE 802.16, and/or ultra wideband (UWB) techniques such as IEEE 802.15. The APs 62 may be coupled to the Internet 50. As with the MSC 46, the APs 62 can be directly coupled to the Internet 50. In one embodiment, however, the APs 62 are indirectly coupled to the Internet 50 via a GTW 48. Also, in one embodiment, the BS 44 may be considered as another AP 62. As will be appreciated, by directly or indirectly connecting the mobile terminals 10 and the computing system 52, the origin server 54, and/or any number of other devices to the Internet 50, the mobile terminals 10 can communicate with one another and with the computing system, etc., to thereby carry out various functions of the mobile terminals 10, such as to transmit data, content or the like to the computing system 52 and/or to receive content, data or the like from the computing system 52. As used herein, the terms "data," "content," "information" and similar terms may be used interchangeably to refer to data capable of being transmitted, received and/or stored in accordance with embodiments of the present invention. Thus, use of any such terms should not be taken to limit the spirit and scope of the present invention.
Although not shown in FIG. 2, in addition to or in lieu of coupling the mobile terminal 10 to computing systems 52 across the Internet 50, the mobile terminal 10 and computing system 52 may be coupled to one another and communicate in accordance with, for example, RF, BT, IrDA or any of a number of different wireline or wireless communication techniques, including LAN, WLAN, WiMAX and/or UWB techniques. One or more of the computing systems 52 can additionally, or alternatively, include a removable memory capable of storing content, which can thereafter be transferred to the mobile terminal 10. Moreover, the mobile terminal 10 can be coupled to one or more electronic devices, such as printers, digital projectors and/or other multimedia capturing, producing and/or storing devices (e.g., other terminals). Similar to the computing systems 52, the mobile terminal 10 may be configured to communicate with these electronic devices in accordance with techniques such as, for example, RF, BT, IrDA or any of a number of different wireline or wireless communication techniques, including LAN, WLAN, WiMAX and/or UWB techniques.
An exemplary embodiment of the present invention will now be described with reference to FIG. 3, which illustrates certain elements of a system for providing metadata entry. The system of FIG. 3 may be employed, for example, on the mobile terminal 10 of FIG. 1. It will be appreciated, however, that the system of FIG. 3 may also be employed on a variety of other devices, both mobile and fixed, and thus the present invention should not be limited to application on devices such as the mobile terminal 10 of FIG. 1. For example, the system of FIG. 3 may be used in a personal computer, a camera, a video recorder, a remote server, and so forth. Alternatively, embodiments may be used in combinations that include, for example, the devices listed above. It will be appreciated, however, that while FIG. 3 illustrates one example of a configuration of a system for providing metadata entries for use in metadata-based content management, numerous other configurations may also be used to implement embodiments of the present invention.
Referring now to FIG. 3, a system for providing metadata entry is provided. The system includes a tagging element 70, an interface element 72, and a storage element 74. It will be appreciated that any or all of the tagging element 70, the interface element 72, and the storage element 74 may be configured in a single device. For example, the mobile terminal 10 of FIG. 1 may include all of the tagging element 70, the interface element 72, and the storage element 74. Alternatively, any or all of the tagging element 70, the interface element 72, and the storage element 74 may be disposed in different devices. For example, one or more of the tagging element 70, the interface element 72, and the storage element 74 may be disposed at a server or remote display, while others are disposed at a mobile terminal in communication with the server or remote display. In an exemplary embodiment, the tagging element 70 may be embodied as software instructions stored in a memory of the mobile terminal 10 or server and executed by a processing element, such as the controller 20 or a processing element of the server. The interface element 72 may include, for example, the keypad 30 and the display 28 and associated hardware and software. It will be appreciated that the interface element 72 may optionally be embodied entirely in software, such as in the case of an interface employing a touch screen with functional elements (such as software keys accessible via the touch screen using a finger, stylus, or the like). The storage element 74 may be any of the memory devices described above in connection with the mobile terminal 10 of FIG. 1, or any other suitable memory device accessible by the processing element and in communication with the tagging element 70 and the interface element 72. It will also be appreciated that while embodiments of the present invention are described below primarily in the context of content items such as still images, e.g., pictures or photographs, any content item that may be created at the mobile terminal 10 or any other device employing embodiments of the present invention is contemplated. One alternative is video, which is typically recorded by the device. According to embodiments of the present invention, a user can easily add metadata to a video. Depending on the context, the device may suggest metadata to be inserted, for example, "Do you want to add today's weather conditions as metadata to the video?" Later, the user can find out when the recording was made and what the weather was when it was recorded (e.g., summer, temperature 70 °F, etc.).
In a further embodiment, using this approach, e-mails or any other application that the user has active on the screen may be tagged by means of metadata. In one embodiment, a user may be enabled to cross-reference information between different applications. For example, if the user has opened an e-mail and also opened a browser, the user may add metadata information to the e-mail by making a tag. In one embodiment, when the tag button is clicked, the system may query "Do you want to add the current URL to associate with the e-mail?" By clicking the option "yes", the metadata is added to the e-mail. Further, in one embodiment, a user may be browsing many web pages and may be linking different things or transactions together in the user's mind by associating them. When the user desires to add metadata to a browsed web page, the user is asked, when the tag button is clicked, for example, "Do you want to add www.nokia.com as metadata to this web page?" By clicking "yes," the metadata is added. In one example, the metadata information may be extracted from the web page text or even a picture shown on the web page. By adding browser functionality to the metadata selection, the interface element 72 may include an option to capture text or image data from the web page. For example, when the user has selected "add metadata" (e.g., via a tag button), a browser-type window may be opened and the user can scroll through the page and select a desired portion of text or an image, video, or, for example, an advertisement. When the user selects the desired portions, the user may be asked whether to accept adding those portions as metadata to the currently open application. If the user selects "yes", the metadata is added and the user then has the advantage of using this information, for example to search on the device. The metadata may even be linked to a specific part of an HTML or XML file. When this is done, a mapping between these specific portions and the metadata may be further formulated.
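By way of illustration only, the following Python sketch outlines one way the cross-application prompting described above might be modeled; all class and function names (ActiveApplication, propose_browser_metadata, and so forth) are assumptions made for this example and do not appear in the embodiments themselves.

```python
# Hypothetical sketch of the cross-application tagging flow described above.
# All names are illustrative and not taken from the described embodiments.

class ActiveApplication:
    """An application currently open on the device (e.g. an e-mail editor)."""
    def __init__(self, name):
        self.name = name
        self.metadata = []          # metadata entries attached to this application's content

def propose_browser_metadata(target_app, browser_url, selected_text=None, confirm=None):
    """Offer browser-derived data (URL, selected text) as metadata for the target application."""
    if confirm is None:
        # In a real UI this would be a dialog such as
        # "Do you want to add the current URL to associate with the e-mail?"
        confirm = lambda prompt: True
    candidates = [("url", browser_url)]
    if selected_text:
        candidates.append(("text", selected_text))
    for kind, value in candidates:
        if confirm(f"Add {kind} '{value}' as metadata to {target_app.name}?"):
            target_app.metadata.append((kind, value))
    return target_app.metadata

# Example: tagging an open e-mail with the URL shown in the browser.
email = ActiveApplication("e-mail")
print(propose_browser_metadata(email, "www.nokia.com"))    # -> [('url', 'www.nokia.com')]
```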
In an exemplary embodiment, the tagging element 70 and/or the interface element 72 may be embodied in software as instructions stored in a memory of the mobile terminal 10 and executed by the controller 20. However, each of the above-described elements may alternatively operate under control of a respective local processing element or processing element of other devices not shown in FIG. 3. Processing elements such as those described above may be embodied in a variety of ways. For example, the processing element may be embodied as a processor, a coprocessor, a controller or various other processing means or devices including integrated circuits such as, for example, an Application Specific Integrated Circuit (ASIC).
The tagging element 70 may be any device or means, embodied in hardware, software, or a combination of hardware and software, that is capable of executing a tagging application 82 for assigning metadata tags 76 to content items 78 or modifying metadata tags 76. In an exemplary embodiment, the tagging element 70 is in operable communication with the camera module 36. In this regard, the tagging element 70 may receive an indication 80 from the camera module 36 that a content item 78 is to be created or has been created. For example, the indication 80 may be an indication of an intent to create a content item, which may be inferred when a camera application is launched, when lens cover removal is detected, or in any other suitable manner. In embodiments where the indication 80 is an indication of an intent to create a content item, for example, the tagging element 70 may utilize context information or proximity information regarding nearby devices or individuals to assign a metadata tag 76, including a reference to the context or proximity information, to the content item 78 after the content item 78 is created. In this regard, the contextual information may be collected from any suitable source, such as the currently running application, sound, time and date, and other real-time means, i.e., "on the fly". Alternatively, the indication 80 may be triggered in response to the actual creation of the content item 78. Likewise, receipt of the indication 80 may trigger the tagging element 70 to launch the tagging application to enable the user to apply a metadata tag 76 to the content item 78.
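By way of illustration only, the following Python sketch models a tagging element that reacts to a creation indication by attaching context metadata collected "on the fly". The class and function names are assumptions for the sketch, not part of the described embodiments.

```python
# Minimal sketch of a tagging element reacting to a creation indication,
# assuming a simple in-memory model; all names are illustrative only.

import datetime

class ContentItem:
    def __init__(self, name):
        self.name = name
        self.tags = []              # metadata tags associated with this content item

class TaggingElement:
    def __init__(self, context_provider):
        # context_provider supplies "on the fly" context such as time, nearby devices, etc.
        self.context_provider = context_provider

    def on_creation_indication(self, item):
        """Called when an indication that a content item is being or has been created arrives."""
        context = self.context_provider()
        for key, value in context.items():
            item.tags.append((key, value))
        return item.tags

def simple_context():
    # A real device might also add proximity (Bluetooth/WLAN) or running-application info here.
    return {"date": datetime.date.today().isoformat(),
            "time": datetime.datetime.now().strftime("%H:%M")}

tagger = TaggingElement(simple_context)
photo = ContentItem("birthday_photo.jpg")
print(tagger.on_creation_indication(photo))
```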
The tagging application may also be launched manually, such as by selecting and/or rendering an object such as the content item 78 and then selecting an option to launch the tagging application, for example via a menu option. Optionally, as shown in FIG. 4, in one aspect of the invention, a dedicated key 84 may be provided to launch the tagging application. It will be appreciated that while FIG. 4 shows the dedicated key 84 as a key of a standard QWERTY keyboard, the dedicated key 84 may also be a button or key on a typical mobile terminal keypad or touch screen display. In this regard, the dedicated key 84 may be a soft key or any other key, regardless of the location of the key on the mobile terminal.
The dedicated key 84 may have any of a number of functions. For example, the dedicated key 84 may be used to manually launch the tagging application, thereby enabling a user to utilize the tagging element 70 to apply metadata tagging, for example, to a selected content item. Alternatively, as shown in FIG. 5, which describes one aspect of the present invention, if no content item is currently selected, the dedicated key 84 may be used to launch a tagging application that displays a list 86 of metadata tags and related content items. In this regard, the list 86 may further include contextual information or other information that may be used to organize the list 86. For example, user preferences may be used to specify that the list 86 is organized based on the date when entered or modified, proximity information, contextual information, alphabetical order, or the like, or a combination of the above. The list 86 may be considered a universal tag database because the list 86 may access and display all tags within all services, regardless of their associated services or applications. Conventionally, tags can be reused in different contexts, so that in a particular context only the tags used in that particular context can be browsed. Embodiments of the present invention, however, provide that the list 86 presents a general list of tags across contexts, so that all tags and their corresponding content items can be browsed from a single database according to user preference or user selection. The list 86 may be stored, for example, on the storage element 74, which may be disposed, for example, at a mobile terminal or server. Individual metadata tags in the list 86 may be selected in order to modify the selected metadata tag. The modification of a metadata tag may also be undertaken via the content item itself. In other words, when a content item is rendered, it is possible to select an option to modify the metadata tag associated with the content item. In a further embodiment of the present invention, the list 86 may include, for example, a list of browsable content such as web pages, URLs, or other portions of content shown at a particular URL. For example, when a URL is selected, the next step may be to define whether the URL itself should be associated as metadata or a part of the web page, such as text. Prompts such as "Do you want to add the URL as metadata?", "Do you want to add text from the web page as metadata?", or "Do you want to add the selected portion as metadata?" may be displayed hierarchically in the user interface for the user to select from. For example, when adding metadata to a video, the added metadata may be detailed information obtained via browsing, or at least a user-selected portion of the information in a tagging window to be included as metadata.
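By way of illustration only, the following Python sketch models the universal tag list (list 86) described above as a single database of tags that can be browsed across services and sorted according to user preference. The tuple layout and function names are assumptions made for this example.

```python
# Sketch of a "universal tag list": one database of tags and their content items
# across all services, sortable per user preference. Field layout is assumed.

tag_list = [
    # (tag, content item, service/application, date added, context)
    ("birthday", "img_0001.jpg", "camera",  "2007-03-12", "home"),
    ("hockey",   "clip_07.mp4",  "video",   "2007-02-02", "arena"),
    ("nokia",    "mail_15.eml",  "e-mail",  "2007-03-14", "office"),
]

def browse_tags(entries, order="alphabetical", context=None):
    """Return all tags regardless of originating service, optionally filtered by context."""
    if context is not None:
        entries = [e for e in entries if e[4] == context]
    if order == "date":
        return sorted(entries, key=lambda e: e[3])
    return sorted(entries, key=lambda e: e[0])        # alphabetical by tag

# Browse every tag from a single database, or only those used in a given context.
print(browse_tags(tag_list))
print(browse_tags(tag_list, order="date", context="home"))
```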
It will be appreciated that although the dedicated key 84 may be used in any number of services, the dedicated key 84 may have different functionality in each service, as provided by the developer of each service. In this way, the dedicated key 84 provides an open API for the service developer by enabling the developer to "register" the functionality required of the dedicated key 84 with respect to launching the tagging application and specifying the tags in each of the different services.
In an exemplary embodiment, the tagging element 70 may associate a particular key with corresponding metadata, thereby providing a predetermined metadata tag and value to be associated with the selected key. Likewise, when the tagging application is executed for a content item, a particular key may be selected as a shortcut to tag the content item with the metadata corresponding to that key. In this regard, the tagging application may be launched in advance, such as by invoking tagging of the content item using the dedicated key 84, and the particular key may then be selected. Alternatively, when a particular key is selected, merely selecting a content item may be sufficient to support tagging the content item with the tag corresponding to the particular key. For example, a key of a mobile terminal keypad, such as the number 2 key, may be associated with a particular metadata type, such as a theme, and a value, such as a birthday. Thus, after an image obtained at a birthday event is created, an indication of the creation of the image may launch the tagging application, and the tagging element 70 may enable the user to press the number 2 key to designate metadata identifying the image as being associated with the birthday event. Alternatively, pressing the number 2 key may launch the tagging application when the content item is rendered. In an exemplary embodiment, further features regarding the specified metadata may be obtained. For example, other information, such as the name of the person whose birthday it is, may be entered. In this regard, the other information may be pre-assigned to the number 2 key, or added after the number 2 key is pressed to specify the birthday metadata. The metadata may be in hierarchical order, such as birthday and name, or trip, China, Shanghai, Xuan restaurant, attic building. The latter information may be provided, for example, via WLAN access point information based on, for example, local services in the area or at the restaurant. As another alternative, the metadata entry may be used as an input to a journal.
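By way of illustration only, the following Python sketch shows a predefined mapping of keys to metadata, applied as a shortcut to the currently rendered content item, including a hierarchical example and a user-supplied modifier. The data structures and names are assumptions for the sketch.

```python
# Sketch of key-shortcut tagging: a predefined mapping from keys to metadata,
# applied to the currently rendered content item. Bindings shown are illustrative.

key_bindings = {
    "2": {"attribute": "birthday"},                                    # number 2 key -> birthday
    "5": {"attribute": "trip", "modifiers": ["China", "Shanghai"]},    # hierarchical example
}

def tag_with_key(content_tags, key, extra_modifier=None):
    """Apply the metadata bound to 'key' to the rendered content item's tag list."""
    binding = key_bindings.get(key)
    if binding is None:
        return content_tags                                # key not bound to any metadata
    entry = [binding["attribute"]] + binding.get("modifiers", [])
    if extra_modifier:                                     # e.g. the name of the person whose birthday it is
        entry.append(extra_modifier)
    content_tags.append(entry)
    return content_tags

photo_tags = []
tag_with_key(photo_tags, "2", extra_modifier="Ari")        # -> [["birthday", "Ari"]]
print(photo_tags)
```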
FIG. 6 shows an example of a metadata entry that may be used to specify a predetermined relationship between a key and particular metadata, according to an exemplary embodiment of the invention. The metadata entries may be stored in the storage element 74 and be accessible by a user, for example via a mobile terminal, to specify the predetermined relationship. Thus, the metadata entry may be stored in a server (e.g., an internet server or a presence server), a mobile terminal, or another electronic device. As shown in FIG. 6, the display 88 may present information about the metadata entry 89, which may include fields that define characteristics of the metadata entry 89. For example, the characteristics may include an attribute field 90 that defines the subject or type of metadata, such as event, date, time, title, name of a device or person in proximity (determined via Bluetooth, WLAN, or RFID), presence information, playlist or genre information (e.g., rock, leisure, party, sports, Latin, R&B, heavy metal), and so forth. The characteristics may also include a key indicator field 92 that indicates which key is associated with the metadata entry 89. Optionally, the characteristics may also include a modifier field 94 (or multiple modifier fields) that may provide a value related to the value of the attribute field 90 or further specify information associated with the metadata entry 89.
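By way of illustration only, the metadata entry 89 with its attribute field 90, key indicator field 92, and modifier fields 94 might be represented as in the following Python sketch; the dataclass layout is an assumption, as the embodiments do not mandate any particular storage format.

```python
# Sketch of a metadata entry with an attribute field, a key indicator field,
# and optional modifier fields. The layout is an illustrative assumption.

from dataclasses import dataclass, field
from typing import List

@dataclass
class MetadataEntry:
    attribute: str                  # attribute field 90, e.g. "event", "genre", "person nearby"
    key_indicator: str              # key indicator field 92, e.g. "2" for the number 2 key
    modifiers: List[str] = field(default_factory=list)   # modifier fields 94, e.g. ["Ari"]

birthday_entry = MetadataEntry(attribute="birthday", key_indicator="2", modifiers=["Ari"])
print(birthday_entry)
```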
The predetermined relationship may be specified via the tagging application. The keys of the keypad may be pre-assigned characteristics related to the metadata entry. For example, the tagging application may be launched via the dedicated key 84 or any other mechanism, and the tagging element 70 may be used in cooperation with the interface element 72 to assign a value to the attribute field 90 associated with a particular key. One way of specifying (i.e., predefining) the relationship between the keys and the metadata entries may be a separate option or menu within the tagging application. As shown in FIG. 6, a key of the keypad, such as the number 2 key, may be selected. The number 2 is then displayed in the key indicator field 92 to specify that the metadata entry 89 for the current key assignment will be associated with the number 2 key. The interface element 72 may then be used to enter a value into the attribute field 90 and, when desired, into the modifier field 94. In an exemplary embodiment, a value may be inserted into the modifier field 94 in response to assigning the metadata entry 89 to a content item, rather than being predefined.
Values may be entered into the attribute field 90 or the modifier field 94 via the interface element 72 in any suitable manner. For example, values may be manually entered into one or more of the attribute and/or modifier fields 90 and 94 via typing or spelling out the values using the keypad. As another example, various themes may be predefined and selected from a list. For example, the interface element 72 may be used to cycle through various possible pre-entered values such as holiday, soccer, hockey, Christmas, spring holiday, anniversary, birthday, and the like. As another example, a combination of the above examples may be provided. Additionally, it will be appreciated that a particular combination or sequence of keys may also be associated with a particular metadata entry. For example, entering the key sequence "313" may associate a particular metadata entry, such as "car", with the content item currently being rendered. Additional information about a particular car may then be added as a modifier, such as "Donald Duck" for Donald Duck's car.
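By way of illustration only, the following Python sketch covers the two value-entry mechanisms mentioned above: cycling through pre-entered values and mapping a key sequence such as "313" to a metadata entry. The value lists and bindings shown are illustrative assumptions.

```python
# Sketch of value entry: cycling through pre-entered attribute values, and
# associating a key sequence with a metadata entry. Data shown is illustrative.

preset_values = ["holiday", "soccer", "hockey", "Christmas", "anniversary", "birthday"]

def cycle_value(current_index, step=1):
    """Cycle to the next pre-entered value (e.g. via repeated key presses)."""
    index = (current_index + step) % len(preset_values)
    return index, preset_values[index]

sequence_bindings = {"313": "car"}      # a key sequence associated with a metadata entry

def tag_by_sequence(content_tags, sequence, modifier=None):
    value = sequence_bindings.get(sequence)
    if value:
        content_tags.append([value] + ([modifier] if modifier else []))
    return content_tags

index, value = cycle_value(0)                                # -> 1, "soccer"
print(index, value)
print(tag_by_sequence([], "313", modifier="Donald Duck"))    # -> [["car", "Donald Duck"]]
```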
Thus, in operation according to the example shown in FIG. 6, if a picture is taken at a birthday event, the number 2 key may be pressed following image capture to specify metadata corresponding to the birthday event. An option to add a modifier may then be presented, for example inserting the name Ari to indicate that the birthday is Ari's. Alternatively, the number 2 key may be associated with Ari, and the birthday event may be added as a modifier. As still another alternative, rather than associating the number 2 key with one of Ari and the birthday event and adding the other as a modifier, the number 2 key may be associated with the combination of Ari and the birthday. In other words, the number 2 key may be directly associated with Ari's birthday via the predefined relationship established above.
As a more specific example of an exemplary embodiment, after taking a picture and displaying the image on the display 88, the number 2 key (previously defined as corresponding to Ari's birthday) may be selected. A dialog may be displayed, for example, "Do you wish to add Ari's birthday as metadata for this image?" The user may then select the number 2 key (or another designated key) again to accept and assign the metadata to the image. Alternatively, the metadata may be assigned automatically after the selection. Ari's birthday may then be associated with the image as metadata, which may be added to the list 86 for browsing among all other created metadata entries, sorted or organized in any suitable manner. In addition, context or proximity information may also be associated with the image as metadata, and such information may also be displayed in the list 86 in association with the corresponding metadata entry.
The addition or modification of metadata associated with a content item need not be limited to the time of creation. Rather, the metadata may be added or modified at any time. For example, if a series of images is being viewed from a photo gallery (or a song is being listened to from an audio collection), the currently selected image (or song) may have metadata assigned (or modified) by similarly pressing the key associated with the desired metadata.
In one exemplary embodiment, for example, each of the keys associated with the numbers 0 through 9 may be assigned a single metadata entry, such that the 10 most frequently used metadata entries may be assigned to the corresponding keys associated with the numbers 0 through 9. Metadata entries may differ for different applications (e.g., audio, video, image, or other media applications), for different content types (e.g., media, text, web pages, etc.), or for different contexts; a metadata entry may also be created in real time from contextual information (e.g., location, time, temperature, etc.); or the metadata entries may differ for different combinations of the above. Thus, each key may be associated with multiple metadata entries within a particular application, content type or context, or across multiple applications, content types or contexts. The tagging element 70 may be capable of determining the application, content type, or context associated with a currently selected content item, and of allowing free text entry or modification of existing metadata entries, thereby providing flexibility to support tagging in any of a variety of applications. Likewise, the association of a particular key and metadata entry may include a reference to the application, content type, and/or context associated with the content item at the time of creation or rendering. References to applications, content types, and/or contexts may be provided, for example, in the modifier field 94 or another portion of the metadata entry. Alternatively, one or more media files may be selected, and one or more keys may also be selected, so that metadata may be designated for a group of media files.
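By way of illustration only, the following Python sketch shows context-dependent key assignments, in which the same key maps to different metadata entries per application or content type, together with binding the ten most frequently used entries to the keys 0 through 9. The data shown is assumed for the example.

```python
# Sketch of context-dependent key assignments and frequency-based bindings.
# All assignments and usage history shown are illustrative.

from collections import Counter

assignments = {
    ("image", "2"): "birthday",
    ("audio", "2"): "rock",          # the same key, different application/content type
    ("video", "2"): "hockey",
}

def lookup(content_type, key):
    return assignments.get((content_type, key))

def bind_most_frequent(usage_history):
    """Assign the ten most frequently used metadata entries to the keys 0-9."""
    most_common = Counter(usage_history).most_common(10)
    return {str(i): entry for i, (entry, _count) in enumerate(most_common)}

print(lookup("audio", "2"))                               # -> "rock"
history = ["birthday", "hockey", "birthday", "trip", "birthday", "hockey"]
print(bind_most_frequent(history))                        # -> {'0': 'birthday', '1': 'hockey', '2': 'trip'}
```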
In one embodiment, each value of the key indicator field 92 may have multiple corresponding values in the attribute and/or modifier fields 90 and 94. In other words, in the key assignment process the same key may be assigned to more than one different metadata entry. Thus, the above-mentioned cycling function may be performed such that any of the fields may remain unchanged while the variable values of the remaining fields are cycled through. For example, all possible values of the attribute field 90 associated with the number 2 key may be cycled through. In such an embodiment, when a key associated with a plurality of attribute and/or modifier values is pressed during the tagging process, the values associated with the key may be presented for user selection in order of statistical probability. In other words, the most likely value may be presented first or more prominently than the other values, and the other values may be presented in order of decreasing likelihood. For example, if the number 2 key is associated with the attribute field 90 values hockey, football, and baseball, and 65% of the metadata entries associated with the number 2 key relate to hockey, 25% relate to football, and 10% relate to baseball, then hockey may be displayed first or be selectable via a single press of the button, while football and baseball may be displayed second and third, respectively, or be selectable via additional presses. As shown in FIG. 6, after modifying the value of any field, the metadata entry 89 may be stored or discarded. In addition, an options menu may be provided that extends functionality and provides back and exit selections to return to a previous screen or exit from the key designation or tagging application.
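By way of illustration only, the statistical ordering described above might be implemented as in the following Python sketch, using the hockey/football/baseball example; the usage counts are assumptions taken from the percentages given.

```python
# Sketch of probability-ordered presentation: when a key maps to several values,
# they are offered in order of how often each value has been used with that key.

usage_counts = {"2": {"hockey": 65, "football": 25, "baseball": 10}}

def values_in_probability_order(key):
    """Return the candidate values for a key, most likely first."""
    counts = usage_counts.get(key, {})
    return sorted(counts, key=counts.get, reverse=True)

print(values_in_probability_order("2"))    # -> ['hockey', 'football', 'baseball']
```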
FIG. 7 is a flowchart of a system, method and program product according to exemplary embodiments of the invention. It will be understood that each block or step of the flowcharts, and combinations of blocks in the flowcharts, can be implemented by various means, such as hardware, firmware, and/or software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions which embody the procedures described above may be stored by a memory device of the mobile terminal and executed by a built-in processor in the mobile terminal. As will be appreciated, any such computer program instructions may be loaded onto a computer or other programmable apparatus (i.e., hardware) to produce a machine, such that the instructions which execute on the computer or other programmable apparatus create means for implementing the functions specified in the flowcharts block(s) or step(s). These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowcharts block(s) or step(s). The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowcharts block(s) or step(s).
Accordingly, blocks or steps of the flowcharts support combinations of means for performing the specified functions, combinations of steps for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that one or more blocks or steps of the flowcharts, and combinations of blocks or steps in the flowcharts, can be implemented by special purpose hardware-based computer systems which perform the specified functions or steps, or combinations of special purpose hardware and computer instructions.
In this regard, one embodiment of a method of providing metadata entries comprises rendering an activity, such as a media file, via an electronic device at operation 200. At operation 210, a selection of a key of the electronic device is received. The key may be a hardware key or a software key. The metadata entry may be accessed from a user device or a network device, and may be created manually or from metadata that is already present on the user device, a server, or a particular application. At operation 220, the metadata entry for the rendered activity, such as a media file, may be modified in response to selection of the key. Modifying the metadata may include designating the metadata entry associated with the selected key as metadata for the rendered activity. Optionally, modifying the metadata may include rendering the metadata entry of the rendered activity that is associated with the selected key. Optionally, at operation 230, additional related data may be associated with the metadata entry. The additional related data may be based on the proximity of a device or user, or on presence information. Optionally, the additional related data may include contextual information, media type information, and/or information related to the rendering application. The metadata entry may be displayed when the content item is rendered. As a further optional operation, at operation 240, a dedicated key may be selected to display a list including all metadata entries and corresponding content items. The list may be a general list that includes all metadata tags regardless of the application with which each metadata tag is associated.
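By way of illustration only, the following Python sketch strings together the operations of FIG. 7 (operations 200 through 240); the function names are assumptions made for the example and the flow is a simplification of the described method.

```python
# Sketch of the overall flow: render an activity, receive a key selection, modify
# the metadata entry, optionally attach related data, and optionally show the
# universal list via the dedicated key. Names are illustrative only.

def render_activity(media_file):                      # operation 200
    print(f"rendering {media_file}")

def handle_key(media_tags, key, bindings, related_data=None):
    entry = bindings.get(key)                         # operation 210: key selection received
    if entry is None:
        return media_tags
    media_tags.append(entry)                          # operation 220: modify metadata entry
    if related_data:                                  # operation 230: associate additional data
        media_tags.append(related_data)
    return media_tags

def show_universal_list(all_entries):                 # operation 240: dedicated key pressed
    for tag, item in all_entries:
        print(tag, "->", item)

bindings = {"2": "birthday"}
tags = []
render_activity("img_0001.jpg")
handle_key(tags, "2", bindings, related_data="context: home")
show_universal_list([("birthday", "img_0001.jpg")])
```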
It should again be noted that although the foregoing exemplary embodiments have been described primarily in the context of image-related content items, embodiments of the present invention may also be implemented in the context of any other content item. For example, the content items may include, but are not limited to: images, video files, television broadcast data, text, web pages, web links, audio files, radio broadcast data, broadcast program guide data, and so forth. It should also be noted that embodiments of the present invention need not be limited to application to a single device. In other words, some operations of a method according to embodiments of the present invention may be performed on one device while other operations are performed on another device. Similarly, one or more of the operations described above may be performed by a combined effort of devices or apparatuses communicating with each other.
The above-described functions may be performed in a variety of ways. For example, any suitable means for carrying out each of the functions described above may be employed to carry out the invention. In one embodiment, all or a portion of the elements of the invention generally operate under control of a computer program product. The computer program product for performing the method of an embodiment of the invention comprises: a computer-readable storage medium, such as the non-volatile storage medium, and computer-readable program code portions, such as a series of computer instructions, embodied in the computer-readable storage medium.
Many modifications and other embodiments of the invention will come to mind to one skilled in the art to which this invention pertains having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
Claims (35)
1. A method, comprising:
rendering the activity via the electronic device;
receiving a selection of a key of the electronic device; and
modifying a metadata entry of the rendered activity in response to selection of the key.
2. The method of claim 1, wherein modifying the metadata comprises: designating the metadata entry for the selected key as metadata for the rendered activity.
3. The method of claim 1, wherein modifying the metadata entry comprises: rendering metadata of the rendered activity associated with the selected key.
4. The method of claim 1, further comprising:
additional related data is associated with the metadata entry.
5. The method of claim 4, wherein associating additional related data with the metadata entry comprises: associating the metadata entry with at least one of:
presence information;
proximity information;
context information;
a media type; or
a rendering application.
6. The method of claim 1, further comprising: displaying the possible metadata entries in a probabilistic order in response to the selected key being associated with more than one metadata entry.
7. The method of claim 1, further comprising: displaying a list of all metadata entries and corresponding content items in response to selection of a dedicated key.
8. The method of claim 1, wherein rendering the activity comprises rendering one of:
an image;
a video file;
an audio file;
a television broadcast;
a radio broadcast;
text; or
a web page.
9. The method of claim 1, further comprising: displaying a corresponding metadata entry when a content item is rendered.
10. The method of claim 1, wherein modifying the metadata entry comprises: modifying the metadata entry using tag information obtained via browsing.
11. The method of claim 1, wherein rendering the activity comprises: recording data.
12. A computer program product comprising at least one computer-readable storage medium having computer-readable program code portions stored therein, the computer-readable program code portions comprising:
a first executable portion for rendering an activity via an electronic device;
a second executable portion for receiving a selection of a key of the electronic device; and
a third executable portion for modifying a metadata entry for the rendered activity in response to selection of the key.
13. A computer program product according to claim 12, wherein the third executable portion includes: instructions for designating a metadata entry for the selected key as metadata for the rendered activity.
14. A computer program product according to claim 12, wherein the third executable portion includes: instructions for rendering metadata of the rendered activity associated with the selected key.
15. A computer program product according to claim 12, further comprising a fourth executable portion for associating additional related data with the metadata entry.
16. A computer program product according to claim 15, wherein the fourth executable portion includes instructions for associating the metadata entry with at least one of:
presence information;
proximity information;
context information;
a media type; or
a rendering application.
17. A computer program product according to claim 12, further comprising a fourth executable portion for displaying possible metadata entries in a probabilistic order in response to the selected key being associated with more than one metadata entry.
18. A computer program product according to claim 12, further comprising a fourth executable portion for displaying a list of all metadata entries and corresponding content items in response to selection of a dedicated key.
19. A computer program product according to claim 12, wherein the first executable portion includes instructions for rendering one of:
an image;
a video file;
an audio file;
a television broadcast;
a radio broadcast;
text; or
a web page.
20. A computer program product according to claim 12, further comprising a fourth executable portion for displaying a corresponding metadata entry when rendering the content item.
21. A computer program product according to claim 12, wherein the third executable portion includes instructions for using tag information obtained via browsing to modify the metadata entry.
22. A computer program product according to claim 12, wherein the first executable portion includes instructions for recording data.
23. An apparatus, comprising:
an output device for rendering an activity via an electronic device;
an interface element for accepting user input of a selection of a key of the electronic device; and
a tagging element configured to modify a metadata entry of the rendered activity in response to selection of the key.
24. The apparatus of claim 23, wherein the tagging element is configured to designate a metadata entry of the selected key as metadata of the rendered activity.
25. The apparatus of claim 23, wherein the tagging element is configured to render metadata of the rendered activity associated with the selected key.
26. The apparatus of claim 23, wherein the tagging element is configured to associate additional related data with the metadata entry.
27. The apparatus of claim 26, wherein the additional related data comprises at least one of:
presence information;
proximity information;
context information;
a media type; or
a rendering application.
28. The apparatus of claim 23, wherein the activity comprises one of:
an image;
a video file;
an audio file;
a television broadcast;
a radio broadcast;
text; or
a web page.
29. The apparatus of claim 23, wherein the tagging element is configured to: displaying the possible metadata entries in a probabilistic order in response to the selected key being associated with more than one metadata entry.
30. The apparatus of claim 23, wherein the tagging element is configured to associate the ten most commonly used metadata entries with respective keys of keypad numbers 0 through 9.
31. The apparatus of claim 23, further comprising a dedicated key for performing a function associated with metadata tagging in response to selection of the dedicated key.
32. The apparatus of claim 31, wherein the functions comprise: a list of all metadata entries and corresponding content items is displayed.
33. The apparatus of claim 23, further comprising a processing element configured to display a corresponding metadata entry when a content item is rendered.
34. The apparatus of claim 23, wherein the apparatus is embodied as a mobile terminal.
35. An apparatus, comprising:
means for rendering an activity via an electronic device;
means for receiving a selection of a key of the electronic device; and
means for modifying a metadata entry for the rendered activity in response to selection of the key.
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US11/424,615 | 2006-06-16 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| HK1131230A (en) | 2010-01-15 |
Similar Documents
| Publication | Title |
|---|---|
| WO2007144727A2 (en) | Method, apparatus and computer program product for providing metadata entry |
| US9977783B2 (en) | Systems, methods, devices, and computer program products for arranging a user's media files |
| US20090003797A1 (en) | Method, Apparatus and Computer Program Product for Providing Content Tagging |
| US20090012959A1 (en) | Method, Apparatus and Computer Program Product for Providing Presentation of a Media Collection |
| US9910934B2 (en) | Method, apparatus and computer program product for providing an information model-based user interface |
| US20080320033A1 (en) | Method, Apparatus and Computer Program Product for Providing Association of Objects Using Metadata |
| CN101889279A (en) | Method, apparatus and computer program product for hierarchical navigation of content items in a media collection |
| US8059139B2 (en) | Display controller, display control method, display control program, and mobile terminal device |
| US20090164928A1 (en) | Method, apparatus and computer program product for providing an improved user interface |
| CN101896905A (en) | System, method, apparatus and computer program product for providing presentation of content items of a media collection |
| KR20120125377A (en) | Apparatus and methods of receiving and acting on user-entered information |
| US20070245006A1 (en) | Apparatus, method and computer program product to provide ad hoc message recipient lists |
| WO2007116281A1 (en) | Method for utilizing speaker recognition in content management |
| CN101232678A (en) | A method and terminal for menu selection |
| US20080154905A1 (en) | System, Method, Apparatus and Computer Program Product for Providing Content Selection in a Network Environment |
| US20090163239A1 (en) | Method, apparatus and computer program product for generating media content by recording broadcast transmissions |
| HK1131230A (en) | Method, apparatus and computer program product for providing metadata entry |
| US20070136758A1 (en) | System, method, mobile terminal and computer program product for defining and detecting an interactive component in a video data stream |
| US20110225147A1 (en) | Apparatus and method for providing tag information of multimedia data in mobile terminal |