HK1177567B - Method and device for responding to missed calls
- Publication number: HK1177567B (application number HK13104369.0A)
- Authority: HK (Hong Kong)
- Prior art keywords: user, contact, icon, interactive display, call
Abstract
In one aspect of the invention, a computer-implemented method is disclosed for use in conjunction with a portable electronic device with a touch screen display. Upon detecting an incoming telephone call from a caller, a text identifier of the caller is displayed; an image associated with the caller is displayed; a call answer icon is displayed which, if selected by a user of the device, answers the incoming telephone call; and a call decline icon is displayed which, if selected by the user of the device, declines the incoming telephone call.
Description
This application is a divisional application of PCT national phase entry application No. 200780040472.8 (PCT application No. PCT/US2007/077436), entitled "Management of incoming phone calls for portable multifunction devices with touch screen display", filed on August 31, 2007.
Technical Field
The disclosed embodiments relate generally to portable electronic devices and, more particularly, to a portable device having a touch screen display that is capable of performing a variety of functions, including making telephone calls.
Background
As portable electronic devices become more compact and the number of functions performed by a given device increases, it becomes a significant challenge to design a user interface that enables a user to easily interact with a multifunction device. This challenge is particularly pronounced for handheld portable devices that have much smaller screens than desktop or laptop computers. This is disadvantageous because the user interface is not only the way the user receives content, but also the way the user receives responses to user actions or behaviors, including the user's attempts to access the device's features, tools, and functions. Some portable communication devices (e.g., mobile phones, sometimes referred to as cell phones, cellular telephones, and so forth) have resorted to adding more push buttons, increasing the density of push buttons, overloading the functionality of push buttons, or using complex menu systems to allow a user to access, store, and process data. These conventional user interfaces often result in the user having to memorize complex key sequences and menu hierarchies.
Many conventional user interfaces, such as those that include physical push buttons, are also inflexible. This is disadvantageous because it may prevent the user interface from being configured and/or modified by an application running on the portable device or by the user. Such inflexibility, coupled with the time spent memorizing multiple key sequences and menu hierarchies and the difficulty of activating a desired push button, is frustrating to most users.
In particular, user interfaces for managing telephone calls on portable devices can be frustrating to users because they do not present call information and the call options available to the user in a simple and clear manner.
Accordingly, there is a need for a portable multifunction device with a more transparent and intuitive user interface for managing telephone calls that is easy to use, configure, and/or modify.
Disclosure of Invention
The disclosed portable multifunction device reduces or eliminates the above-described disadvantages and other problems associated with user interfaces of portable devices. In certain embodiments, the device includes a touch-sensitive display (also referred to as a "touch screen") having a Graphical User Interface (GUI), one or more processors, memory, and one or more modules, programs, or sets of instructions stored in the memory for performing various functions. In some embodiments, the user interacts with the GUI primarily through finger contacts and gestures on the touch-sensitive display. In some embodiments, the functionality may include telephony, video conferencing, email, instant messaging, blogging, digital photography, digital video, web browsing, digital music playing, and/or digital video playing. Instructions for performing these functions may be stored in a computer-readable storage medium configured to be executed by one or more processors.
A computer-implemented method is disclosed for use in conjunction with a portable electronic device having a touch screen display with a plurality of user interface objects. Upon detecting an incoming telephone call from a caller: displaying a text identifier of the caller; displaying an image associated with the caller; displaying a call answer icon, wherein the incoming telephone call is answered if the call answer icon is selected by a user of the device; and displaying a call decline icon, wherein the incoming telephone call is declined if the call decline icon is selected by the user of the device.
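For illustration only, the following sketch (in Python, using hypothetical names such as Icon, IncomingCallScreen, show_incoming_call, answer_call, and decline_call that are not part of this disclosure) shows one way the four user interface objects described above might be assembled when an incoming call is detected.

```python
# Minimal sketch (hypothetical names) of the incoming-call screen described above.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Icon:
    label: str            # text shown on the icon
    on_select: Callable   # action performed when the user taps the icon


@dataclass
class IncomingCallScreen:
    caller_name: str      # text identifier of the caller
    caller_image: str     # path or identifier of the image associated with the caller
    icons: List[Icon] = field(default_factory=list)

    def render(self) -> List[str]:
        # Return a simple textual description of what would be displayed.
        lines = [f"Caller: {self.caller_name}", f"Image: {self.caller_image}"]
        lines += [f"[{icon.label}]" for icon in self.icons]
        return lines


def show_incoming_call(caller_name: str, caller_image: str,
                       answer_call: Callable, decline_call: Callable) -> IncomingCallScreen:
    # Upon detecting an incoming call, display the caller's text identifier,
    # the associated image, a call answer icon, and a call decline icon.
    screen = IncomingCallScreen(caller_name, caller_image)
    screen.icons.append(Icon("Answer", answer_call))
    screen.icons.append(Icon("Decline", decline_call))
    return screen


if __name__ == "__main__":
    ui = show_incoming_call("Jane Doe", "jane_doe.png",
                            answer_call=lambda: print("call answered"),
                            decline_call=lambda: print("call declined"))
    print("\n".join(ui.render()))
```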
A computer-implemented method is disclosed for use in conjunction with a portable electronic device having a touch screen display with a plurality of user interface objects. Upon detecting that a telephone call has been established between a user of the device and another entity, a mute icon for muting a microphone of the device is displayed and a speaker icon for activating a speaker mode of the device is displayed. In addition, a conference call icon for forming a multi-party telephone call between the user, the other entity, and at least one additional entity is also displayed. Also, a call hold icon for suspending the telephone call is displayed, and an end call icon for ending the telephone call is displayed.
A computer-implemented method is disclosed for use in conjunction with a portable electronic device having a touch screen display with a plurality of user interface objects. Upon detecting that a telephone call has been established between a user of the device and another entity, displaying a mute icon for muting a microphone of the device, displaying a keypad icon for displaying a keypad, and displaying a speaker icon for activating a speaker mode of the device. Additionally, an add call icon for forming a multiparty telephone call between the user, the other entity, and at least one additional entity is also displayed. Also, a call hold icon for suspending the telephone call, a contact icon for displaying a contact list, and an end call icon for ending the telephone call are displayed.
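For illustration only, the following sketch (in Python, with a hypothetical PhoneDevice stub and in_call_icons helper that are not part of this disclosure) shows one way the two in-call icon sets described above might be mapped to device actions; the conference-call and add-call icons are modeled by the same assumed add_call action.

```python
# Minimal sketch (hypothetical names) mapping the in-call icons described above
# to the device actions they trigger.
from typing import Callable, Dict


class PhoneDevice:
    # Stub actions standing in for device behaviour; real behaviour is out of scope.
    def mute_microphone(self): print("microphone muted")
    def enable_speaker_mode(self): print("speaker mode on")
    def show_keypad(self): print("keypad shown")
    def add_call(self): print("adding a party to the call")
    def hold_call(self): print("call on hold")
    def show_contacts(self): print("contact list shown")
    def end_call(self): print("call ended")


def in_call_icons(device: PhoneDevice, with_keypad_and_contacts: bool = False) -> Dict[str, Callable]:
    icons = {
        "Mute": device.mute_microphone,
        "Speaker": device.enable_speaker_mode,
        "Hold": device.hold_call,
        "End Call": device.end_call,
    }
    if with_keypad_and_contacts:
        # Second embodiment: keypad, add-call, and contacts icons are also displayed.
        icons.update({"Keypad": device.show_keypad,
                      "Add Call": device.add_call,
                      "Contacts": device.show_contacts})
    else:
        # First embodiment: a conference-call icon forms a multiparty call.
        icons["Conference"] = device.add_call
    return icons


if __name__ == "__main__":
    for label, action in in_call_icons(PhoneDevice(), with_keypad_and_contacts=True).items():
        print(label)
        action()
```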
A computer-readable storage medium for use in conjunction with a portable electronic device having a touch screen display with a plurality of user interface objects, the computer-readable storage medium storing one or more programs, the one or more programs including instructions for displaying, upon detection of an incoming telephone call from a caller: a text identifier of the caller; an image associated with the caller; a call answer icon, wherein the incoming telephone call is answered if the call answer icon is selected by a user of the device; and a call decline icon, wherein the incoming telephone call is declined if the user of the device selects the call decline icon.
A computer readable storage medium for use in conjunction with a portable electronic device having a touch screen display with a plurality of user interface objects, the computer readable storage medium storing one or more programs, the one or more programs including instructions for displaying, upon detecting that a telephone call has been established between a user of the device and another entity: a mute icon for muting a microphone of the device; a speaker icon for activating a speaker mode of the device; a conference call icon for forming a multi-party telephone call between the user, the other entity, and at least one additional entity; a call hold icon for suspending the telephone call; and an end call icon for ending the telephone call.
A computer readable storage medium for use in conjunction with a portable electronic device having a touch screen display with a plurality of user interface objects, the computer readable storage medium storing one or more programs, the one or more programs including instructions for displaying, upon detecting that a telephone call has been established between a user of the device and another entity: a mute icon for muting a microphone of the device; a keypad icon for displaying a keypad; a speaker icon for activating a speaker mode of the device; an add call icon for forming a multiparty telephone call between the user, the other entity and at least one further entity; a call hold icon for suspending the telephone call; a contact icon for displaying a contact list; and an end call icon for ending the telephone call.
A portable electronic device with a touch screen display with a plurality of user interface objects is disclosed that includes memory, one or more processors, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying, upon detecting an incoming telephone call from a caller: a text identifier of the caller; an image associated with the caller; a call answer icon, wherein the incoming telephone call is answered if the call answer icon is selected by a user of the device; and a call decline icon, wherein the incoming telephone call is declined if the user of the device selects the call decline icon.
A portable electronic device with a touch screen display with a plurality of user interface objects is disclosed that includes memory, one or more processors, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying, upon detecting that a telephone call has been established between a user of the device and another entity: a mute icon for muting a microphone of the device; a speaker icon for activating a speaker mode of the device; a conference call icon for forming a multi-party telephone call between the user, the other entity, and at least one additional entity; a call hold icon for suspending the telephone call; and an end call icon for ending the telephone call.
A portable electronic device with a touch screen display with a plurality of user interface objects is disclosed that includes memory, one or more processors, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying, upon detecting that a telephone call has been established between a user of the device and another entity: a mute icon for muting a microphone of the device; a keypad icon for displaying a keypad; a speaker icon for activating a speaker mode of the device; an add call icon for forming a multiparty telephone call between the user, the other entity and at least one further entity; a call hold icon for suspending the telephone call; a contact icon for displaying a contact list; and an end call icon for ending the telephone call.
The disclosed embodiments provide a more transparent and intuitive user interface for managing telephone calls, thereby increasing user efficiency and satisfaction with portable communication devices.
Drawings
For a better understanding of the above-described embodiments of the invention, as well as other embodiments, reference should be made to the following detailed description read in conjunction with the accompanying drawings, wherein like reference numerals represent corresponding parts throughout the several views.
FIG. 1 is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with certain embodiments.
FIG. 2 illustrates a portable multifunction device with a touch screen in accordance with certain embodiments.
FIG. 3 illustrates an exemplary user interface for unlocking a portable electronic device, in accordance with certain embodiments.
FIG. 4 illustrates an exemplary user interface for an application menu on a portable multifunction device, in accordance with certain embodiments.
FIG. 5 illustrates an exemplary user interface for listing instant messaging sessions on a portable multifunction device in accordance with certain embodiments.
Fig. 6A-6E illustrate exemplary user interfaces for entering text for an instant message, in accordance with some embodiments.
Fig. 7 illustrates an exemplary user interface for deleting an instant messaging session, in accordance with certain embodiments.
Fig. 8A and 8B illustrate an exemplary user interface for a contact list, according to some embodiments.
Fig. 9 illustrates an exemplary user interface for entering a telephone number for instant messaging, in accordance with certain embodiments.
FIGS. 10A-10M illustrate exemplary user interfaces for displaying and managing contacts, according to some embodiments.
FIGS. 11A-11C illustrate exemplary user interfaces for displaying and managing favorite contacts, according to some embodiments.
FIGS. 12A-12D illustrate exemplary user interfaces for displaying and managing recent calls, according to some embodiments.
Fig. 13 illustrates an exemplary dial interface for a call, according to some embodiments.
FIGS. 14A-14D illustrate exemplary user interfaces displayed during a call, according to some embodiments.
Fig. 15A and 15B illustrate exemplary user interfaces displayed during an incoming call, in accordance with some embodiments.
Fig. 16A and 16B illustrate an exemplary user interface for voicemail, according to some embodiments.
FIG. 17 illustrates an exemplary user interface for organizing and managing emails, according to some embodiments.
FIGS. 18A and 18B illustrate an exemplary user interface for creating an email, according to some embodiments.
FIGS. 19A-19F illustrate exemplary user interfaces for displaying and managing an inbox, according to some embodiments.
FIG. 20 illustrates an exemplary user interface for setting email user preferences, according to some embodiments.
FIGS. 21A and 21B illustrate exemplary user interfaces for creating and managing email rules, in accordance with certain embodiments.
FIGS. 22A and 22B illustrate exemplary user interfaces for moving email messages, according to some embodiments.
FIG. 23 is a flow chart illustrating a process for handling missed telephone calls on a portable electronic device with a touch screen display, in accordance with certain embodiments.
FIG. 24 is a flow chart illustrating a process for handling missed telephone calls on a portable electronic device with a touch screen display, in accordance with certain embodiments.
FIG. 25 is a flow chart illustrating a process for handling missed telephone calls on a portable electronic device with a touch screen display, in accordance with certain embodiments.
FIG. 26 is a flow chart illustrating a process for processing information of a previous phone call on a portable electronic device having a touch screen display with multiple user interface objects, in accordance with certain embodiments.
FIG. 27 is a flow chart illustrating a process for handling a previous phone call on a portable electronic device having a touch screen display with multiple user interface objects, in accordance with certain embodiments.
FIG. 28 is a flow chart illustrating a process for handling an incoming telephone call on a portable electronic device having a touch screen display with a plurality of user interface objects, in accordance with certain embodiments.
FIG. 29 is a flow chart illustrating a process for handling an established telephone call on a portable electronic device having a touch screen display with a plurality of user interface objects, in accordance with certain embodiments.
Detailed Description
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail as not to unnecessarily obscure aspects of the embodiments.
Embodiments of portable multifunction devices, user interfaces for such devices, and related processes utilizing such devices are described. In some embodiments, the device is a portable communication device, such as a cell phone that also contains other functions, such as PDA and/or music player functions.
In addition to a touch screen, the user interface may also include a click wheel. A click wheel is a physical user interface device that may provide navigation commands based on the angular displacement of the wheel or on a point of contact of a device user with the wheel. A click wheel may also be used to provide user commands corresponding to a selection of one or more items, such as when the device user presses down on at least a portion of the wheel or on the center of the wheel. For brevity, in the discussion that follows, a portable multifunction device that includes a touch screen is used as an exemplary embodiment. It should be understood, however, that some of the user interfaces and associated processes may be applied to other devices, such as personal computers and laptop computers, which may include one or more other physical user interface devices such as a click wheel, physical keyboard, mouse, and/or joystick.
The device supports a variety of applications, such as a telephony application, a video conferencing application, an email application, an instant messaging application, a blogging application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications executable on the device may use at least one common physical user interface device, such as a touch screen. One or more functions of the touch screen and corresponding information displayed on the device may be adjusted and/or changed from one application to the next and/or in various applications. In this way, a common physical structure of the device (such as a touch screen) may support multiple applications with potentially intuitive and transparent user interfaces.
The user interface may include one or more soft keyboard embodiments. Soft keyboard embodiments may include standard (QWERTY) and/or non-standard configurations of symbols on the displayed icons of the keyboard, such as those described in U.S. patent application No. 11/459,606, entitled "Keyboards For Portable Electronic Devices", filed July 24, 2006, and U.S. patent application No. 11/459,615, entitled "Touch Screen Keyboards For Portable Electronic Devices", filed July 24, 2006, the entire contents of which are incorporated herein by reference. A keyboard embodiment may include icons (or soft keys) that are fewer in number than the number of keys in an existing physical keyboard, such as the keyboard of a typewriter. This may make it easier for the user to select one or more icons in the keyboard, and thus one or more corresponding symbols. The keyboard embodiments may be adaptive. For example, the displayed icons may be modified in accordance with user actions, such as selecting one or more icons and/or one or more corresponding symbols. One or more applications on the portable device may utilize common and/or different keyboard embodiments. In this way, the keyboard embodiment used may be tailored to at least some of the applications. In some embodiments, one or more keyboard embodiments may be tailored to an individual user, for example based on that user's word usage history (lexicography, slang, individual usage). Some keyboard embodiments may be adjusted to reduce the probability of user error when selecting one or more icons, and thus one or more symbols, when using a soft keyboard embodiment.
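For illustration only, the following sketch (in Python, with hypothetical helpers letter_likelihood and pick_key and an assumed bias parameter) shows one way a soft keyboard could be tuned to a user's word usage history so that an ambiguous touch resolves to the more probable key.

```python
# Illustrative sketch (not from the disclosure): biasing soft-key selection toward
# letters that are likely given the user's word usage history, to reduce typing errors.
from collections import Counter
from typing import Dict, List, Tuple

Point = Tuple[float, float]


def letter_likelihood(prefix: str, history: List[str]) -> Dict[str, float]:
    # Count which letter tends to follow `prefix` in the user's previously typed words.
    counts = Counter(word[len(prefix)] for word in history
                     if word.startswith(prefix) and len(word) > len(prefix))
    total = sum(counts.values()) or 1
    return {letter: n / total for letter, n in counts.items()}


def pick_key(touch: Point, key_centers: Dict[str, Point],
             prefix: str, history: List[str], bias: float = 0.3) -> str:
    # Score each key by distance to the touch point, discounted by how likely
    # the letter is to follow what the user has already typed.
    likelihood = letter_likelihood(prefix, history)

    def score(item):
        letter, (x, y) = item
        distance = ((touch[0] - x) ** 2 + (touch[1] - y) ** 2) ** 0.5
        return distance * (1.0 - bias * likelihood.get(letter, 0.0))

    return min(key_centers.items(), key=score)[0]


if __name__ == "__main__":
    keys = {"q": (0.0, 0.0), "w": (1.0, 0.0), "e": (2.0, 0.0)}
    history = ["we", "week", "weather", "quest"]
    # A touch midway between 'q' and 'w' resolves to 'w', the more likely first letter.
    print(pick_key((0.5, 0.0), keys, prefix="", history=history))
```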
Attention is now directed to embodiments of the apparatus. FIG. 1 is a block diagram illustrating a portable multifunction device 100 with a touch-sensitive display 112 in accordance with some embodiments. Touch-sensitive display 112 is sometimes referred to as a "touch screen" for convenience, and may also be considered or referred to as a touch-sensitive display system. Device 100 may include memory 102 (which may include one or more computer-readable storage media), a memory controller 122, one or more processing units (CPUs) 120, a peripheral interface 118, RF circuitry 108, audio circuitry 110, a speaker 111, a microphone 113, an input/output (I/O) subsystem 106, other input or control devices 116, and an external port 124. The device 100 may include one or more optical sensors 164. These components may communicate over one or more communication buses or signal lines 103.
It should be understood that device 100 is merely one example of a portable multifunction device 100 and that device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of components. The various components shown in fig. 1 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The memory 102 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. Access to the memory 102 by other components of the device 100, such as the CPU120 and the peripheral interface 118, may be controlled by a memory controller 122.
Peripheral interface 118 couples the input and output peripherals of the device to CPU120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in the memory 102 to perform various functions and process data for the device 100.
In some embodiments, peripheral interface 118, CPU120, and memory controller 122 may be implemented on a single chip, such as chip 104. In some other embodiments, they may be implemented separately on separate chips.
RF (radio frequency) circuitry 108 receives and transmits RF signals, also referred to as electromagnetic signals. The RF circuitry 108 may convert electrical signals to and/or from electromagnetic signals and communicate with communication networks and other communication devices via the electromagnetic signals. The RF circuitry 108 may include known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a Subscriber Identity Module (SIM) card, memory, and so forth. The RF circuitry 108 may communicate with networks, such as the Internet (also referred to as the World Wide Web (WWW)), intranets, and/or wireless networks (e.g., a cellular telephone network, a wireless local area network (LAN), and/or a metropolitan area network (MAN)), and with other devices via wireless communication. The wireless communication may use any of a variety of communication standards, protocols, and technologies, including but not limited to: Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), High-Speed Downlink Packet Access (HSDPA), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Wi-MAX, protocols for email, instant messaging, and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. The audio circuitry 110 receives audio data from the peripheral interface 118, converts the audio data to electrical signals, and sends the electrical signals to the speaker 111. The speaker 111 may convert the electrical signal into a sound wave audible to a human. The audio circuit 110 also receives electrical signals converted from sound waves by the microphone 113. The audio circuit 110 converts the electrical signals to audio data and sends the audio data to the peripheral interface 118 for processing. Audio data may be obtained from memory 102 and/or RF circuitry 108 and/or transmitted to memory 102 and/or RF circuitry 108 through peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (not shown). The headset jack provides an interface between the audio circuitry 110 and a removable audio input/output peripheral such as output-only headphones or a headset with an output (e.g., headphones for one or both ears) and an input (e.g., a microphone).
The I/O subsystem 106 couples input/output peripherals on the device 100, such as the touch screen 112 and other input/control devices 116, to the peripheral interface 118. The I/O subsystem 106 may include a display controller 156 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive electrical signals from the other input or control devices 116 and/or transmit electrical signals to the other input or control devices 116. The other input/control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slide switches, joysticks, click wheels, and so forth. In some alternative embodiments, the one or more input controllers 160 may be coupled to any of (or none of) the following: a keyboard, an infrared port, a USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208 of fig. 2) may include an up/down button for volume control of the speaker 111 and/or microphone 113. The one or more buttons may include a push button (e.g., 206 of fig. 2). A quick press of the push button may unlock the touch screen 112 or initiate a process of unlocking the device using gestures on the touch screen, as described in U.S. patent application No. 11/322,549, entitled "Unlocking a Device by Performing Gestures on an Unlock Image", filed December 23, 2005, which is incorporated herein by reference. Pressing the push button (e.g., 206) for a longer period of time may turn power to the device 100 on or off. The user may also customize the functionality of one or more of the buttons. The touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
Touch-sensitive touch screen 112 provides an input interface and an output interface between the device and the user. Display controller 156 may receive electrical signals from touch screen 112 and/or send electrical signals to touch screen 112. Touch screen 112 may display visual output to a user. The visual output may include graphics, text, icons, video, or any combination thereof (collectively "graphics"). In some embodiments, some or all of the visual output may correspond to user interface objects, further details of which are described below.
Touch screen 112 has a touch-sensitive surface, sensor or group of sensors that accept input from a user based on tactile (haptic) and/or tactile (tactile) contact. Touch screen 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of contact) on touch screen 112 and convert the detected contact into interaction with user interface objects (e.g., one or more soft keys, icons, web pages, or images) displayed on the touch screen. In an exemplary embodiment, the point of contact between touch screen 112 and the user corresponds to a finger of the user.
The touch screen 112 may use LCD (liquid crystal display) technology or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen 112 and the display controller 156 may detect contact and any movement or interruption thereof using any of a variety of touch sensing technologies, now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen 112. The touch-sensitive display in some embodiments of the touch screen 112 may be similar to the multi-touch sensitive tablets described in the following U.S. patents: 6,323,846 (Westerman et al.), 6,570,557 (Westerman et al.), and/or 6,677,932 (Westerman), and/or U.S. patent publication 2002/0015024A1, which are incorporated herein by reference. However, the touch screen 112 displays visual output from the portable device 100, whereas touch-sensitive tablets do not provide visual output. The touch screen 112 may have a resolution in excess of 100 dpi. In one exemplary embodiment, the touch screen 112 has a resolution of approximately 168 dpi. The user may make contact with the touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger with the touch screen 112. In some embodiments, the device translates the coarse finger-based input into a precise pointer/cursor position or command for performing the action desired by the user.
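For illustration only, the following sketch (in Python, assuming the contact is reported as a set of sensor cells with signal strengths, which is an assumption rather than part of this disclosure) shows one way coarse finger-based input could be translated into a single precise cursor position by taking a signal-weighted centroid.

```python
# Illustrative sketch (assumed representation): translating a coarse finger contact,
# reported as a set of activated sensor cells with signal strengths, into a single
# precise cursor position by taking the signal-weighted centroid.
from typing import Iterable, Tuple

Cell = Tuple[int, int, float]  # (row, column, signal strength)


def contact_to_point(cells: Iterable[Cell]) -> Tuple[float, float]:
    cells = list(cells)
    total = sum(strength for _, _, strength in cells)
    if total == 0:
        raise ValueError("no contact detected")
    x = sum(col * strength for _, col, strength in cells) / total
    y = sum(row * strength for row, _, strength in cells) / total
    return (x, y)


if __name__ == "__main__":
    # A fingertip covering several adjacent cells collapses to the single point (1.8, 1.3).
    patch = [(1, 1, 0.2), (1, 2, 0.5), (2, 2, 0.3)]
    print(contact_to_point(patch))
```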
Touch-sensitive displays in some embodiments of touch screen 112 may be as described in the following applications: (1) U.S. patent application No. 11/381,313, "Multipoint Touch Surface Controller", filed May 2, 2006; (2) U.S. patent application No. 10/840,862, "Multipoint Touchscreen", filed May 6, 2004; (3) U.S. patent application No. 10/903,964, "Gestures For Touch Sensitive Input Devices", filed July 30, 2004; (4) U.S. patent application No. 11/048,264, "Gestures For Touch Sensitive Input Devices", filed January 31, 2005; (5) U.S. patent application No. 11/038,590, "Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices", filed September 16, 2005; (6) U.S. patent application No. 11/228,758, "Virtual Input Device Placement On A Touch Screen User Interface", filed September 16, 2005; (7) U.S. patent application No. 11/228,700, "Operation Of A Computer With A Touch Screen Interface", filed September 16, 2005; (8) a U.S. patent application entitled "Activating Virtual Keys Of A Touch-Screen Virtual Keyboard", filed September 16, 2005; and (9) a U.S. patent application entitled "Multi-Functional Hand-Held Device". All of these applications are incorporated herein by reference.
In some embodiments, in addition to a touch screen, device 100 may include a touch pad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch sensitive area of the device that, unlike a touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
In some embodiments, the device 100 may include a click wheel as the input control device 116. By rotating the click wheel or by moving the point of contact with the click wheel (e.g., the amount of movement of the point of contact is measured by its angular displacement relative to a center point of the click wheel), the user may navigate between or interact with one or more graphical objects (hereinafter referred to as icons) displayed in touch screen 112. A click wheel may also be used to select one or more of the displayed icons. For example, the user may press down on at least a portion of the click wheel or an associated physical button. User commands and navigation commands provided by a user via a click wheel may be processed by input controller 160, as well as by one or more modules and/or sets of instructions in memory 102.
The device 100 also includes a power system 162 for powering the various components. The power supply system 162 may include a power management system, one or more power sources (e.g., battery, Alternating Current (AC)), a charging system, power failure detection circuitry, a power converter or inverter, a power status indicator (e.g., Light Emitting Diode (LED)), and various other components related to power generation, management, and distribution in portable devices.
The device 100 may also include one or more optical sensors 164. FIG. 1 shows an optical sensor coupled to an optical sensor controller 158 in the I/O subsystem 106. The optical sensor 164 may include a charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistor. The optical sensor 164 receives light from the environment, projected through one or more lenses, and converts the light into data representing an image. In conjunction with the camera module 143, the optical sensor 164 may capture still images or video. In some embodiments, an optical sensor is located on the back of the device 100, opposite the touch screen display 112 on the front of the device, so that the touch screen display can be used as a viewfinder for acquiring still and/or video images. In some embodiments, an optical sensor is located on the front of the device so that an image of the user can be obtained for a video conference while the user views the other video conference participants on the touch screen display. In some embodiments, the position of the optical sensor 164 may be changed by the user (e.g., by rotating the lens and sensor in the device housing) so that a single optical sensor 164, together with the touch screen display, may be used for both video conferencing and still and/or video image acquisition.
The device 100 may also include one or more proximity sensors 166. Fig. 1 shows a proximity sensor 166 coupled to the peripheral interface 118. Alternatively, the proximity sensor 166 may be coupled to the input controller 160 in the I/O subsystem 106. The proximity sensor 166 may operate as described in U.S. patent application No. 11/241,839, entitled "Proximity Detector In Handheld Device", filed September 30, 2005, and U.S. patent application No. 11/240,788, entitled "Proximity Detector In Handheld Device", filed September 30, 2005, which are hereby incorporated by reference. In some embodiments, the proximity sensor turns off and disables the touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call). In some embodiments, the proximity sensor keeps the screen off when the device is in the user's pocket, purse, or other dark area, to prevent unnecessary battery drain when the device is in a locked state.
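For illustration only, the following sketch (in Python, with a hypothetical update_display helper) shows one way proximity readings might be combined with call and lock state to decide whether the display should be turned off.

```python
# Illustrative sketch (hypothetical names): using proximity readings to turn the
# touch screen off near the user's ear during a call, and to keep it off in a pocket.
def update_display(proximity_near: bool, in_call: bool, device_locked: bool,
                   display_on: bool) -> bool:
    # Returns the new display state.
    if proximity_near and in_call:
        return False            # device held to the ear: disable the touch screen
    if proximity_near and device_locked:
        return False            # locked and in a pocket/bag: avoid battery drain
    return display_on           # otherwise leave the display state unchanged


if __name__ == "__main__":
    print(update_display(proximity_near=True, in_call=True,
                         device_locked=False, display_on=True))   # -> False
    print(update_display(proximity_near=False, in_call=False,
                         device_locked=False, display_on=True))   # -> True
```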
In certain embodiments, the software components stored in memory 102 may include an operating system 126, a communication module (or set of instructions) 128, a contact/motion module (or set of instructions) 130, a graphics module (or set of instructions) 132, a text input module (or set of instructions) 134, a Global Positioning System (GPS) module (or set of instructions) 135, and an application (or set of instructions) 136.
Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
Communications module 128 facilitates communications with other devices through one or more external ports 124, and also includes various software components for processing data received by RF circuitry 108 and/or external ports 124. The external port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted to couple directly to other devices or indirectly to other devices through a network (e.g., the internet, wireless LAN, etc.). In some embodiments, the external port may be a multi-pin (e.g., 30-pin) connector that is the same as, similar to, and/or compatible with the 30-pin connector used on iPod (trademark of Apple Computer, inc.) devices.
The contact/motion module 130 may detect contact with the touch screen 112 (in conjunction with the display controller 156) and with other touch-sensitive devices (e.g., a touchpad or a click wheel). The contact/motion module 130 includes various software components for performing various operations related to the detection of contact, such as determining whether contact has occurred, determining whether there is movement of the contact and tracking the movement across the touch screen 112, and determining whether the contact has been broken (i.e., whether the contact has ceased). Determining movement of the point of contact may include determining the speed (magnitude), velocity (magnitude and direction), and/or acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to a single point of contact (e.g., one finger contact) or to multiple simultaneous contacts (e.g., "multi-touch"/multiple finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 also detect contact on a touchpad. In some embodiments, the contact/motion module 130 and the controller 160 detect contact on a click wheel.
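For illustration only, the following sketch (in Python, assuming the contact point is sampled as (timestamp, x, y) tuples) shows one way the speed, velocity, and acceleration of a point of contact could be estimated from successive samples.

```python
# Illustrative sketch: estimating speed, velocity, and acceleration of a point of
# contact from successive (time, x, y) samples, as a contact/motion module might.
from typing import List, Tuple

Sample = Tuple[float, float, float]  # (timestamp in seconds, x, y)


def velocity(a: Sample, b: Sample) -> Tuple[float, float]:
    dt = b[0] - a[0]
    return ((b[1] - a[1]) / dt, (b[2] - a[2]) / dt)


def speed(a: Sample, b: Sample) -> float:
    vx, vy = velocity(a, b)
    return (vx ** 2 + vy ** 2) ** 0.5


def acceleration(a: Sample, b: Sample, c: Sample) -> Tuple[float, float]:
    # Change in velocity between the two most recent intervals, per unit time.
    (vx1, vy1), (vx2, vy2) = velocity(a, b), velocity(b, c)
    dt = c[0] - b[0]
    return ((vx2 - vx1) / dt, (vy2 - vy1) / dt)


if __name__ == "__main__":
    track: List[Sample] = [(0.00, 10.0, 10.0), (0.01, 12.0, 10.0), (0.02, 16.0, 10.0)]
    print(speed(track[0], track[1]))   # 200.0 units per second
    print(acceleration(*track))        # velocity increasing along x
```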
Graphics module 132 includes various known software components for rendering and displaying graphics on the touch screen 112, including components for changing the brightness of the displayed graphics. As used herein, the term "graphic" includes any object that may be displayed to a user, including but not limited to: text, web pages, icons (such as user interface objects including soft keys), digital images, videos, animations, and the like. In this context, an animation is a display of a series of images that gives the appearance of motion and informs the user that an action has been performed (such as moving an email message to a folder). Each animation that confirms an action by the user of the device typically takes a predefined, finite amount of time, typically between 0.2 and 1.0 seconds, and generally less than two seconds.
Text input module 134, which may be a component of graphics module 132, provides a soft keyboard for entering text in various applications, such as contacts 137, email 140, IM141, blog 142, browser 147, and any other application requiring text input.
The GPS module 135 determines the location of the device and provides this information for use by various applications (e.g., to the phone 138 for location-based dialing, to the camera 143 and/or blog 142 as picture/video metadata, and to applications that provide location-based services, such as weather widgets (widgets), local yellow pages widgets, and map/navigation widgets).
The applications 136 may include the following modules (or sets of instructions), or a subset or superset thereof:
● contact module 137 (sometimes referred to as an address book or contact list);
● a telephone module 138;
● video conferencing module 139;
● e-mail client module 140;
● Instant Messaging (IM) module 141;
● blog module 142;
● camera module 143 for still and/or video images;
● an image management module 144;
● video player module 145;
● music player module 146;
● browser module 147;
● calendar module 148;
● Widget Module 149, which may include a weather widget 149-1, a stock widget 149-2, a calculator widget 149-3, an alarm widget 149-4, a dictionary widget 149-5, and other widgets acquired by the user, and a user-created widget 149-6;
● widget creator module 150 for making user-created widgets 149-6; and/or
● search module 151.
Examples of other applications 136 that may be stored in memory 102 include notepads and other word processing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In cooperation with the touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the contacts module 137 may be used to manage an address book or contact list, including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), email address(es), physical address(es), or other information with a name; associating an image with a name; categorizing and sorting names; providing a telephone number or email address to initiate and/or facilitate communication by telephone 138, video conference 139, email 140, or IM 141; and so forth. Embodiments of user interfaces and associated processes utilizing the contacts module 137 are described further below.
In cooperation with the RF circuitry 108, the audio circuitry 110, the speaker 111, the microphone 113, the touch screen 112, the display controller 156, the contact module 130, the graphics module 132, and the text input module 134, the phone module 138 may be used to type a sequence of characters corresponding to a phone number, access one or more phone numbers in the address book 137, modify phone numbers that have been typed, dial individual phone numbers, conduct a call, and disconnect or hang up when the call is completed. As mentioned above, wireless communication may use any of a variety of communication standards, protocols, and technologies. Embodiments of the user interface and associated processing using the telephony module 138 are described further below.
In cooperation with the RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephony module 138, the video conference module 139 may be used to initiate, conduct, and terminate video conferences between the user and one or more other participants.
In cooperation with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, email client module 140 may be used to create, send, receive, and manage emails. In cooperation with the image management module 144, the e-mail module 140 can easily create and transmit an e-mail with a still or video image photographed by the camera module 143. Embodiments of user interfaces and associated processes utilizing email module 140 are described further below.
In cooperation with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, instant messaging module 141 may be used to type a sequence of characters corresponding to an instant message, modify previously-typed characters, send individual instant messages (e.g., using a Short Message Service (SMS) or Multimedia Messaging Service (MMS) protocol), receive instant messages, and view received instant messages. In some embodiments, the sent and/or received instant messages may include graphics, photos, audio files, video files, and/or other attachments supported by MMS and/or Enhanced Messaging Service (EMS). Embodiments utilizing the user interface and associated processing of messaging module 141 are described further below.
In cooperation with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, image management module 144, and browser module 147, blog module 142 may be used to send text, still images, video, and/or other graphics to a blog (e.g., a user's blog).
In cooperation with touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, camera module 143 may be used to capture and store still images or video (including video streams) into memory 102, modify characteristics of the still images or video, or delete the still images or video from memory 102.
In cooperation with touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 may be used to arrange, modify or copy, mark, delete, present (e.g., in the form of a digital slide show or photo album), and store still and/or video images.
In cooperation with touch screen 112, display controller 156, contact module 130, graphics module 132, audio circuitry 110, and speakers 111, video player module 145 may be used to display, render, or playback video (e.g., on the touch screen or on an external display connected through external port 124).
In cooperation with touch screen 112, display controller 156, contact module 130, graphics module 132, audio circuitry 110, speakers 111, RF circuitry 108, and browser module 147, music player module 146 enables a user to download and playback recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files. In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.).
In cooperation with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 may be used to browse the internet, including searching for, linking to, receiving, and displaying web pages or portions of web pages, as well as attachments and other files linked to web pages.
In cooperation with the RF circuitry 108, the touch screen 112, the display controller 156, the contact module 130, the graphics module 132, the text input module 134, the email module 140, and the browser module 147, the calendar module 148 may be used to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.).
In cooperation with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget modules 149 are small applications that may be downloaded and used by a user (e.g., weather widget 149-1, stock widget 149-2, calculator widget 149-3, alarm widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (HyperText Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! widgets).
In cooperation with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, widget creator module 150 may be used by a user to create a widget (e.g., to change a portion of a web page specified by the user into a widget).
In cooperation with touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 may be operable to search memory 102 for text, music, sound, images, videos, and/or other files that match one or more search criteria (e.g., one or more user-specified search terms).
Each of the modules and applications described above corresponds to a respective set of instructions for performing one or more of the functions described above. These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or reconfigured in various embodiments. In some embodiments, memory 102 may store a subset of the modules and data structures described above. Moreover, memory 102 may store additional modules and data structures not described above.
In some embodiments, device 100 is a device on which operations of a predefined set of functions are performed solely by touch screen 112 and/or a touchpad. By using a touch screen and/or touch pad as the primary input/control device for operating device 100, the number of physical input/control devices (e.g., push buttons, dials, etc.) on device 100 may be reduced.
The predefined set of functions that may be performed exclusively through the touch screen and/or touchpad includes navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates the device 100 to a main menu, home menu, or root menu from any user interface that may be displayed on the device 100. In such embodiments, the touchpad may be referred to as a "menu button". In some other embodiments, the menu button may be a physical push button or other physical input/control device instead of a touchpad.
FIG. 2 illustrates a portable multifunction device 100 with a touch screen 112 in accordance with some embodiments. The touch screen may display one or more graphics. In this embodiment, as well as in other embodiments described below, a user may select one or more of the graphics by, for example, contacting or touching the graphics with one or more fingers 202 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the contact may include a gesture made while in contact with the device 100, such as one or more taps, one or more swipes (left-to-right, right-to-left, upward, and/or downward), and/or a rolling of a finger (right-to-left, left-to-right, upward, and/or downward). In some embodiments, inadvertent contact with a graphic does not select the graphic. For example, when the gesture corresponding to selection is a tap, a swipe gesture that sweeps over an application icon does not select the corresponding application.
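For illustration only, the following sketch (in Python, with an assumed tap_radius threshold and hypothetical classify_gesture and handle_gesture helpers) shows one way a tap could be distinguished from a swipe so that a swipe sweeping over an application icon does not select it.

```python
# Illustrative sketch: distinguishing a tap from a swipe so that a swipe passing over
# an application icon does not select it, consistent with the behaviour described above.
from typing import List, Tuple

Point = Tuple[float, float]


def classify_gesture(points: List[Point], tap_radius: float = 10.0) -> str:
    # A gesture whose contact stays within a small radius of the first touch is a tap;
    # anything that travels further is treated as a swipe.
    x0, y0 = points[0]
    max_travel = max(((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 for x, y in points)
    return "tap" if max_travel <= tap_radius else "swipe"


def handle_gesture(points: List[Point], icon_hit: bool) -> str:
    gesture = classify_gesture(points)
    if gesture == "tap" and icon_hit:
        return "activate icon"      # selection occurs on a tap over the icon
    if gesture == "swipe":
        return "scroll"             # a swipe over an icon does not select it
    return "ignore"


if __name__ == "__main__":
    print(handle_gesture([(5, 5), (6, 5), (5, 6)], icon_hit=True))    # activate icon
    print(handle_gesture([(5, 5), (40, 5), (90, 5)], icon_hit=True))  # scroll
```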
The device 100 may also include one or more physical buttons, such as a "home" or menu button 204. As previously described, the menu button 204 may be used to navigate to any application 136 in the set of applications that are executable on the device 100. Alternatively, in some embodiments, the menu buttons may be implemented as soft keys in a GUI in touch screen 112.
In one embodiment, the device 100 includes a touch screen 112, a menu button 204, a push button 206 for powering the device on/off and locking the device, and volume adjustment button(s) 208. The push button 206 may be used to turn the power to the device on or off by depressing the button and holding it in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing it before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, the device 100 may also accept verbal input through the microphone 113 for activating or deactivating some functions.
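For illustration only, the following sketch (in Python, with an assumed LONG_PRESS_SECONDS threshold and hypothetical on_button_release helper) shows one way the duration of a press of the push button 206 could be mapped to the lock, power, and unlock behaviors described above.

```python
# Illustrative sketch (threshold value is an assumption): mapping the duration of a
# press of button 206 to lock versus power-off behaviour as described above.
LONG_PRESS_SECONDS = 2.0   # assumed predefined time interval


def on_button_release(press_duration: float, powered_on: bool, locked: bool) -> dict:
    if press_duration >= LONG_PRESS_SECONDS:
        return {"powered_on": not powered_on, "locked": locked}    # toggle power
    if not locked:
        return {"powered_on": powered_on, "locked": True}          # quick press locks
    # Quick press while locked: begin the unlock process (e.g., show the unlock image).
    return {"powered_on": powered_on, "locked": locked, "show_unlock_ui": True}


if __name__ == "__main__":
    print(on_button_release(0.3, powered_on=True, locked=False))  # locks the device
    print(on_button_release(2.5, powered_on=True, locked=False))  # powers the device off
```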
Attention is now directed to embodiments of a user interface ("UI") and related processes that may be implemented on portable multifunction device 100.
FIG. 3 illustrates an exemplary user interface for unlocking a portable electronic device, according to some embodiments. In certain embodiments, the user interface 300 includes the following elements, or a subset or superset thereof:
● unlock image 302, which moves with a finger gesture to unlock the device;
● arrow 304, which provides a visual cue to the unlock gesture;
● channel 306, which provides additional prompts for an unlock gesture;
● time 308;
● day of the week 310;
● date 312; and
● wallpaper image 314.
In some embodiments, the device detects contact with the touch-sensitive display (e.g., a user's finger making contact on or near the unlock image 302) while the device is in a user interface lock state. The device moves the unlock image 302 in accordance with the contact. If the detected contact corresponds to a predefined gesture, such as moving the unlock image across the channel 306, the device transitions to a user interface unlock state. Conversely, if the detected contact does not correspond to the predefined gesture, the device remains in the user interface lock state. As noted above, the process of using a gesture on the touch screen to unlock the device is described in U.S. patent application No. 11/322,549, entitled "Unlocking a Device by Performing Gestures on an Unlock Image", filed December 23, 2005, which is incorporated herein by reference.
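For illustration only, the following sketch (in Python, with assumed coordinates, deviation, and completion thresholds) shows one way a device might decide whether a drag of the unlock image 302 along the channel 306 corresponds to the predefined unlock gesture.

```python
# Illustrative sketch: deciding whether a drag of the unlock image 302 along the
# channel 306 corresponds to the predefined unlock gesture. Coordinates and the
# completion threshold are assumptions for illustration; the channel is horizontal.
from typing import List, Tuple

Point = Tuple[float, float]


def unlock_gesture_performed(track: List[Point], channel_start: Point, channel_end: Point,
                             max_deviation: float = 20.0, completion: float = 0.9) -> bool:
    # The contact must stay close to the channel and carry the image most of the way across.
    (x0, y0), (x1, y1) = channel_start, channel_end
    channel_length = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    for x, y in track:
        if abs(y - y0) > max_deviation:   # drifted off the (horizontal) channel
            return False
    progress = (track[-1][0] - x0) / channel_length
    return progress >= completion


if __name__ == "__main__":
    start, end = (0.0, 100.0), (200.0, 100.0)
    print(unlock_gesture_performed([(0, 100), (80, 105), (195, 98)], start, end))  # True
    print(unlock_gesture_performed([(0, 100), (60, 102), (90, 101)], start, end))  # False
```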
FIG. 4 illustrates an exemplary user interface for an application menu on a portable multifunction device in accordance with certain embodiments. In some embodiments, the user interface 400 includes the following elements, or a subset or superset thereof:
● signal strength indicator 402 for wireless communication;
● time 404;
● battery status indicator 406;
● a tray 408 containing icons for frequently used applications, such as:
○ telephone 138;
○ email client 140, which may include an indicator 410 of the number of unread emails;
○ browser 147; and
○ music player 146; and
● icons for other applications, such as:
○ IM 141;
○ image management 144;
○ camera 143;
○ video player 145;
○ weather widget 149-1;
○ stock widget 149-2;
○ blog 142;
○ calendar 148;
○ calculator widget 149-3;
○ alarm clock widget 149-4;
○ dictionary widget 149-5; and
○ user-created widget 149-6.
In some embodiments, the UI400 displays all available applications 136 on one screen, so that scrolling through the list of applications (e.g., via a scroll bar) is not required. In some embodiments, as the number of applications increases, the icons corresponding to those applications may decrease in size so that all of the applications may be displayed on a single screen without scrolling. In some embodiments, utilizing a menu button and having all applications on one screen enables a user to access any desired application by up to two inputs, such as activating the menu button 204 and then activating the desired application (e.g., by tapping or using other finger gestures on an icon corresponding to the application).
In some embodiments, the UI400 provides integrated access to widget-based applications and non-widget-based applications. In some embodiments, all widgets, whether user-created or not, are displayed in the UI400. In other embodiments, activating the icon for the user-created widget 149-6 may lead to another UI (not shown) that contains the user-created widgets or icons corresponding to the user-created widgets.
In some embodiments, the user may rearrange the icons in the UI400, for example, using the process described in U.S. patent application No. 11/459,602, entitled "Portable Electronic Device With Interface Reconfiguration Mode", filed July 24, 2006, which is incorporated herein by reference. For example, the user may move application icons into and out of the tray 408 using finger gestures.
In certain embodiments, the UI400 includes a meter (not shown) that displays updated account usage metrics for accounts associated with device usage, as described in U.S. patent application No. 11/322,552, entitled "Account Information Display For Portable Communication Device", filed December 23, 2005, which is incorporated herein by reference.
FIG. 5 illustrates an exemplary user interface for listing instant messaging sessions on a portable multifunction device in accordance with certain embodiments. In some embodiments, the user interface 500 includes the following elements, or a subset or superset thereof:
● the above-described 402, 404, and 406;
● "instant message" or other similar label 502;
● the name 504 of the person with whom the user is conducting an instant messaging conversation (e.g., Jane Doe 504-1), or, if the person's name is not available, the telephone number (e.g., 408-123-4567 504-3);
● text 506 of the last message in the conversation;
● date 508 and/or time of last message in conversation;
● selects icon 510, which when activated (e.g., by tapping a finger on the icon) initiates a transition to a UI for the corresponding session (e.g., FIG. 6A for Jane Doe 504-1);
● edit icon 512, which when activated (e.g., by tapping a finger on the icon), initiates a transition to a UI for the delete session (e.g., FIG. 7); and
● creates a message icon 514 that when activated (e.g., by tapping a finger on the icon) initiates a transition to the user's contact list (e.g., fig. 8A).
In some embodiments, the name 504 for the instant messaging session is determined by looking up an entry in the user contact list 137 that contains the telephone number used for the instant messaging session. If no such entry is found, the telephone number (of the other party with which the user is exchanging messages) is displayed (e.g., 504-3). In some embodiments, if another party sends a message from two or more different telephone numbers, the message may appear as a single conversation under a single name if all of the telephone numbers used are found to be in the same entry of the user's contact list 137 (i.e., the other party's entry).
Automatically grouping instant messages into "conversations" (instant message exchanges with the same user or the same phone number) makes it easier for a user to handle and track instant message exchanges with multiple parties.
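As an illustrative sketch only, the conversation grouping described above might be implemented along the following lines in Swift. The type names, fields, and the contact-lookup strategy are assumptions for illustration, not the actual data structures of the disclosed embodiments.

```swift
import Foundation

// Illustrative sketch of the conversation-grouping behavior described above.
struct ContactEntry {
    let name: String
    let phoneNumbers: [String]
}

struct Message {
    let phoneNumber: String
    let text: String
    let date: Date
}

/// Groups messages into conversations keyed by the matching contact entry's name
/// (so two numbers belonging to the same entry share one conversation), falling
/// back to the raw phone number when no entry contains that number.
func groupIntoConversations(messages: [Message],
                            contacts: [ContactEntry]) -> [String: [Message]] {
    var conversations: [String: [Message]] = [:]
    for message in messages {
        let key = contacts.first { $0.phoneNumbers.contains(message.phoneNumber) }?.name
            ?? message.phoneNumber
        conversations[key, default: []].append(message)
    }
    // Keep each conversation in chronological order so the newest message is last.
    for key in Array(conversations.keys) {
        conversations[key]?.sort { $0.date < $1.date }
    }
    return conversations
}
```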
Fig. 6A-6E illustrate exemplary user interfaces for entering text for an instant message, in accordance with some embodiments.
In some embodiments, the user interface 600A includes the following elements, or a subset or superset thereof:
● the elements 402, 404, and 406 described above;
● the name (or the phone number itself if the name is not available) corresponding to the phone number used in the instant messaging session;
● instant message icon 602, which when activated (e.g., by tapping a finger on the icon), initiates a transition to a UI (e.g., UI 500) listing instant message sessions;
● instant messages 604 from the other party, typically listed in order along one side of UI600A;
● instant messages 606 sent to the other party, typically listed in order along the other side of UI600A to show the back-and-forth of messages in the conversation;
● timestamps 608 for at least some instant messages;
● text entry box 612;
● a send icon 614 that, when activated (e.g., by tapping a finger on the icon), initiates sending of the message in text box 612 to the other party (e.g., Jane Doe 504-1);
● an alphabetical keyboard 616 for typing text in box 612;
● an alternate keyboard selector icon 618 that, when activated (e.g., by tapping a finger on the icon), initiates display of a different keyboard (e.g., 624 in FIG. 6C);
● a send icon 620 that, when activated (e.g., by tapping a finger on the icon), initiates sending of the message in text box 612 to the other party (e.g., Jane Doe 504-1); and
● a shift key 628 that, when activated (e.g., by a finger tap on the icon), capitalizes the next letter selected on the alphabetical keyboard 616.
In some embodiments, the user may scroll through the message conversation (comprised of messages 604 and 606) by applying a vertical swipe gesture 610 to the area displaying the conversation. In some embodiments, a vertical downward gesture scrolls the conversation downward, thereby displaying older messages in the conversation. In some embodiments, a vertical upward gesture scrolls the conversation upward, thereby displaying newer, more recent messages in the conversation. In some embodiments, as noted above, the most recent message in the conversation (e.g., 606-2) is displayed in the instant message list 500 (e.g., 506-1).
In some embodiments, keys in keyboards 616, 624, and/or 638 change shade and/or color momentarily when touched/activated by a user to help the user understand the activation of a desired key.
In some embodiments, user interface 600B (FIG. 6B) includes the following elements, or a subset or superset thereof:
● the components 402, 404, 406, 602, 604, 606, 608, 612, 614, 616, 618, and 620 described above; and
● a word suggestion area 622, which provides a list of possible words for completing the word fragment being typed by the user in text box 612.
In some embodiments, the word suggestion area does not appear in UI600B until a predetermined time delay (e.g., 2-3 seconds) has elapsed after the user types text. In some embodiments, the word suggestion area is not used, or may be turned off by the user.
In certain embodiments, user interface 600C (FIG. 6C) includes the following elements, or a subset or superset thereof:
● the components 402, 404, 406, 602, 604, 606, 608, 612, 614, 620, and 622 described above;
● an alternate keyboard 624, which may consist primarily of digits and punctuation; in some embodiments, frequently used punctuation keys (e.g., the period key 630, comma key 632, question mark key 634, and exclamation point key 636) are larger than the other keys of keyboard 624;
● alphanumeric keyboard selector icon 626, which when activated (e.g., by tapping a finger on the icon), initiates display of an alphanumeric keyboard (e.g., 616 in FIG. 6A); and
● Shift key 628, which when activated (e.g., by tapping a finger on an icon), initiates display of another keyboard (e.g., 638 in FIG. 6D).
In some embodiments, the period key 630 is located near the keyboard selector icon 626, which reduces the distance a user's finger needs to travel in order to type the frequently used period.
In certain embodiments, user interface 600D (FIG. 6D) includes the following elements, or a subset or superset thereof:
● the elements 402, 404, 406, 602, 604, 606, 608, 612, 614, 620, 622, 626, and 628 described above; and
● another alternate keyboard 638, which may consist primarily of symbols and punctuation; in some embodiments, frequently used punctuation keys (e.g., the period key 630, comma key 632, question mark key 634, and exclamation point key 636) are larger than the other keys.
In some embodiments, user interface 600E (FIG. 6E) includes the following elements, or a subset or superset thereof:
● the components 402, 404, 406, 602, 604, 606, 608, 612, 614, 616, 618, and 620 described above; and
● a new instant message 606-3 sent to the other party.
In some embodiments, when the user activates a send key (e.g., 614 or 620), the text in text box 612 "pops" out of the text box and becomes part of the string of messages 606 sent to the other party. The black arrows in FIG. 6E illustrate the animated formation of the quote bubble 606-3. In some embodiments, the size of the quote bubble scales with the size of the message. In some embodiments, a sound, such as a water droplet sound, is also generated to notify the user that the message has been sent.
Fig. 7 illustrates an exemplary user interface for deleting an instant messaging session, in accordance with certain embodiments. In certain embodiments, the user interface 700 includes the following elements, or a subset or superset thereof:
● the elements 402, 404, 406, 504, 506, 508, and 510 described above;
● delete icon 702;
● remove icon 704; and
● a done icon 706.
In some embodiments, if the user activates the edit icon 512 (fig. 5), a delete icon 702 appears next to each instant messaging session. If the user activates the delete icon (e.g., by tapping it with a finger), the icon may rotate 90 degrees (e.g., 702-4) or change its appearance, and/or a second icon may appear (e.g., remove icon 704). If the user activates the second icon, the corresponding instant message session is deleted.
This deletion process requires the user to perform multiple gestures on different portions of the touch screen (e.g., delete icon 702-4 and remove icon 704 are on opposite sides of the touch screen), which greatly reduces the likelihood that the user will inadvertently delete a conversation or other similar item.
When the user has finished deleting the IM session, the user activates the done icon 706 (e.g., by tapping it with a finger), and the device returns to UI 500.
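The two-step deletion described above can be illustrated with a minimal Swift sketch in which the first gesture only arms a row and a second gesture on a separate icon performs the deletion. The types and names below are assumptions for illustration only.

```swift
import Foundation

// Illustrative sketch of the two-step deletion flow described above: a first tap on
// the per-row delete icon only arms the row, and only a second tap on the separate
// remove icon actually deletes it.
enum RowTarget {
    case deleteIcon   // the small icon next to the row (e.g., 702)
    case removeIcon   // the confirmation icon that appears on the opposite side (e.g., 704)
    case elsewhere
}

final class DeletionController {
    private(set) var items: [String]
    private var armedIndex: Int? = nil    // row whose delete icon was tapped

    init(items: [String]) { self.items = items }

    /// Handles a tap in edit mode and returns true only if an item was actually deleted.
    func handleTap(on target: RowTarget, row: Int) -> Bool {
        switch target {
        case .deleteIcon:
            armedIndex = row              // first gesture: arm the row, show the remove icon
            return false
        case .removeIcon:
            guard armedIndex == row, items.indices.contains(row) else { return false }
            items.remove(at: row)         // second gesture, on a different screen region
            armedIndex = nil
            return true
        case .elsewhere:
            armedIndex = nil              // any other tap disarms the pending deletion
            return false
        }
    }
}

// Example: a single tap never deletes; deletion requires two gestures on the same row.
let controller = DeletionController(items: ["Jane Doe", "John Smith"])
print(controller.handleTap(on: .deleteIcon, row: 0))  // false (armed only)
print(controller.handleTap(on: .removeIcon, row: 0))  // true  (deleted)
```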
If there is a long list of conversations (not shown) that fills beyond the screen area, the user may scroll through the list on the touch screen using a vertical up and/or vertical down gesture 708.
Fig. 8A and 8B illustrate an exemplary user interface for a contact list, according to some embodiments.
In certain embodiments, the user interfaces 800A and 800B include the following elements, or a subset or superset thereof:
● the elements 402, 404, and 406 described above;
● group icon 802, which when activated (e.g., by tapping a finger on the icon), initiates display of a contact group;
● a first name icon 804 that, when activated (e.g., by a finger tap on the icon), initiates display of the user's contacts in alphabetical order by first name (FIG. 8B);
● a last name icon 806 that, when activated (e.g., by tapping a finger on the icon), initiates display of the user's contacts in alphabetical order of their last name (FIG. 8A);
● alphabetical list icon 808 that the user may touch to quickly reach a particular initial in the displayed contact list;
● cancel icon 810, which when activated (e.g., by tapping a finger on the icon), initiates a transition back to the previous UI (e.g., UI 500); and
● an other number icon 812 that, when activated (e.g., by tapping a finger on the icon), initiates a transition to a UI (e.g., UI900 of FIG. 9) for typing a phone number for instant messaging, such as a phone number that is not in the user's contact list.
As described in U.S. patent application No.11/322,547 entitled "Scrolling List With Floating Adjacent Index Symbols", filed on 23/12/2005, which is incorporated herein by reference, a user can scroll through the list of contacts on the touch screen using a vertical up and/or vertical down gesture 814.
Fig. 9 illustrates an exemplary user interface for entering a telephone number for instant messaging, in accordance with certain embodiments. In certain embodiments, user interface 900 includes the following elements, or a subset or superset thereof:
● the elements 402, 404, 406, 602, and 624 described above;
● cancel the icon 902, which when activated (e.g., by tapping a finger on the icon), initiates a transition back to the previous UI (e.g., UI800A or UI 800B);
● save icon 904 which, when activated (e.g., by tapping a finger on the icon), initiates saving of the typed phone number in an instant messaging session list (e.g., UI 500) and displaying a UI (e.g., UI 600A) for composing an instant message to be sent to the typed phone number; and
● number entry box 906 for entering telephone numbers using keyboard 624.
Note that the displayed keyboard may depend on the application environment. For example, when numeric input is needed or desired, the UI displays a soft keyboard (e.g., 624) with the numbers. When letter input is needed or desired, the UI displays a soft keyboard (e.g., 616) with letters.
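A minimal Swift sketch of this context-dependent keyboard selection is shown below; the enum cases and input-field types are assumptions for illustration and do not correspond to actual modules of the device.

```swift
import Foundation

// Illustrative sketch of the context-dependent soft keyboard selection described above.
enum SoftKeyboard {
    case letters      // e.g., keyboard 616
    case numbers      // e.g., keyboard 624
}

enum InputField {
    case instantMessageText
    case emailBody
    case phoneNumberEntry
}

/// Returns the soft keyboard to display for the field the user is editing.
func keyboard(for field: InputField) -> SoftKeyboard {
    switch field {
    case .phoneNumberEntry:
        return .numbers   // numeric input needed or desired
    case .instantMessageText, .emailBody:
        return .letters   // letter input needed or desired
    }
}

print(keyboard(for: .phoneNumberEntry))    // numbers
print(keyboard(for: .instantMessageText))  // letters
```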
FIGS. 10A-10M illustrate exemplary user interfaces for displaying and managing contacts, according to some embodiments.
In some embodiments, in response to the user activating phone icon 138 (fig. 4) in UI400 (e.g., by tapping a finger on the icon), the user contact list (e.g., UI2600A in fig. 10A) is displayed.
As described in U.S. patent application No.11/322,547 entitled "Scrolling List With flowing Adjacent Index Symbols," filed on 23.12.2005, which is incorporated herein by reference, a user may scroll through a List of contacts on a touch screen using a vertical up and/or vertical down gesture 2602.
In some embodiments, in response to user activation of the add new contact icon 2604 (e.g., by tapping a finger on the icon), the touch screen displays a user interface (e.g., UI2600B in fig. 10B) for editing the contact name.
In some embodiments, in response to the user typing in a contact name (e.g., "Ron Smith" via the keyboard 616 in the UI2600C of fig. 10C) and activating the save icon 2606 (e.g., by tapping a finger on the icon), the contact module creates and displays a new entry for the contact (e.g., the UI2600D of fig. 10D).
In some embodiments, in response to user activation of the add new phone icon 2608 (e.g., by tapping a finger on the icon or on the row containing the icon), the touch screen displays a user interface (e.g., UI2600E in fig. 10E) for editing the phone number(s) of the contact.
In some embodiments, in response to the user typing in a telephone number (e.g., via keyboard 624 in UI2600E of fig. 10E); specifying the type of phone number (e.g., by a tap or other predetermined gesture on cell phone icon 2618, home icon 2620, or work icon 2622); and activating the save icon 2618 (e.g., by tapping a finger on the icon), the contacts module creates a phone number for the corresponding contact.
In some embodiments, the user may select additional types of phone numbers. For example, in response to user activation of the selection icon 2624 (e.g., by tapping a finger on the icon), the touch screen displays a phone label UI (e.g., UI2600F in fig. 10F). In some embodiments, in response to the user activating a particular label in UI2600F, the selected label is displayed at the location of the work icon 2622 of UI2600E. In some embodiments, the selected label is also highlighted in UI2600E to indicate to the user that the entered telephone number will be given the selected label.
In some embodiments, the user may add custom phone labels to UI2600F by activating the add label icon 2628 and typing the label via a soft keyboard (e.g., 616, not shown).
In some embodiments, the user may delete one or more labels in UI2600F. In some embodiments, only user-defined labels may be deleted. For example, in response to the user activating the edit icon 2630 (e.g., by tapping a finger on the icon), the touch screen displays a delete icon 2632 (e.g., UI2600G in fig. 10G) next to the labels that can be deleted. If the user activates a delete icon (e.g., by tapping it with a finger), the icon may rotate 90 degrees (e.g., 2634 in FIG. 10H) or otherwise change its appearance, and/or a second icon may appear (e.g., the remove icon 2636 in FIG. 10H). If the user activates the second icon, the contacts module deletes the corresponding label. This deletion process is similar to the process described above with reference to fig. 7. As described above, the deletion process requires the user to perform multiple gestures on different portions of the touch screen (e.g., in UI2600H, the delete icon 2632 and the remove icon 2636 are located on opposite sides of the touch screen), which greatly reduces the likelihood that the user will inadvertently delete a label or other similar item. When the user has finished deleting labels, the user activates the done icon 2638 (e.g., by tapping it with a finger) and the device returns to UI2600F.
In some embodiments, in response to user activation of add new email icon 2610 (fig. 10D) (e.g., by tapping a finger on the icon or on the row containing the icon), the touch screen displays a user interface (e.g., UI2600I in fig. 10I) for editing the email address (es) of the contact.
In some embodiments, in response to the user typing in an email address (e.g., via keyboard 616 in UI2600I of FIG. 10I); specifying the type of email address (e.g., by tapping or other predetermined gesture on home icon 2640, work icon 2642, or other icon 2644); and activating the save icon 2648 (e.g., by tapping a finger on the icon), the contacts module creates an email address for the corresponding contact.
In some embodiments, the user may select additional email address types by activating selection icon 2646; custom email address types may be added and/or email address types may be deleted using processes and UIs similar to those described for the phone number types (FIGS. 10F-10H).
In some embodiments, in response to user activation of the add new address icon 2612 (fig. 10D) (e.g., by tapping a finger on the icon or on the row containing the icon), the touch screen displays a user interface (e.g., UI2600J in fig. 10J) for editing the physical address(es) of the contact.
In some embodiments, in response to the user typing an address (e.g., via keyboard 616 in UI2600J of FIG. 10J); specifying the type of address (e.g., by tapping or other predetermined gesture on home icon 2650, work icon 2652, or other icon 2654); and activating the save icon 2658 (e.g., by tapping a finger on the icon), the contacts module creates an address for the corresponding contact.
In some embodiments, the user may select additional address types by activating selection icon 2656; custom address types may be added and/or deleted using processes and UIs similar to those described for the phone number types (FIGS. 10F-10H).
In some embodiments, in response to a user activation of an add new ring tone (ringtone) icon 2614 (fig. 10D) (e.g., by tapping a finger on the icon or on the row containing the icon), the touch screen displays a user interface (e.g., UI2600K in fig. 10K) for selecting a ring tone.
In some embodiments, in response to the user selecting a ringtone (e.g., by tapping or other predetermined gesture on the ringtone or on the row containing the ringtone) and activating the save icon 2660 (e.g., by tapping a finger on the icon), the contact module assigns the ringtone to the corresponding contact.
In some embodiments, the user may add a new ring tone by activating add icon 2662.
In some embodiments, a thumbnail or other graphic 2616 (fig. 10D) may be added to the contact.
FIG. 10L illustrates an exemplary user interface for an existing contact list entry, according to some embodiments. In response to the user selecting the edit icon 2664 (e.g., by tapping a finger on the icon), the touch screen displays a user interface (e.g., UI2600M in fig. 10M) for editing the contact. In response to user selections, using the processes and UIs described above (e.g., FIGS. 10E-10K), the contact list module may delete one or more items of existing contact information, add a new phone number, add a new email address, add a new physical address, and/or add a new ring tone.
FIGS. 11A-11C illustrate exemplary user interfaces for displaying and managing favorite contacts (favorites), according to some embodiments. UI2700A (FIG. 11A) displays an exemplary favorites list. In some embodiments, each row in the list corresponding to a favorite includes the name 2702 of the favorite, the type of phone number 2704 of the favorite to be called, and an additional information icon 2706. In some embodiments, in response to a user activating the icon 2706 for a particular favorite (e.g., by tapping a finger on the icon), the touch screen displays the contact list entry corresponding to the favorite (e.g., UI2600L in FIG. 10L). In some embodiments, in response to the user tapping or performing another predetermined gesture elsewhere in the row corresponding to a particular favorite (i.e., a tap or gesture that is not on icon 2706), the phone module dials the corresponding phone number 2704 for that particular favorite.
In some embodiments, in response to user activation of the add favorites icon 2708 (e.g., by tapping a finger on the icon), the device displays a user contact list from which the user selects a contact list entry for the new favorite and in which the user selects a telephone number for the new favorite.
In response to the user activating the edit icon 2710 (e.g., by tapping a finger on the icon), the touch screen displays a delete icon 2712 (e.g., UI2700B in fig. 11B) next to the favorites. If the user activates the delete icon (e.g., by tapping it with a finger), the icon may be rotated 90 degrees (e.g., 2714 in FIG. 11C) or otherwise change its appearance, and/or a second icon may appear (e.g., the remove icon 2716 in FIG. 11C). If the user activates the second icon, the corresponding favorite is deleted. This deletion process is similar to the process described above with respect to fig. 7, 10G, and 10H. As described above, the delete process requires the user to perform multiple gestures on different portions of the touch screen (e.g., delete icon 2714 and remove icon 2716 are on opposite sides of the touch screen in UI 2700C), which greatly reduces the likelihood that the user will inadvertently delete favorites or other similar items. When the user has finished deleting favorites, the user activates the done icon 2718 (e.g., by tapping it with a finger), and the device returns to UI 2700A.
Figures 12A-12D illustrate exemplary user interfaces for displaying and managing recent calls, according to some embodiments.
In some embodiments, in response to the user activating the "all" icon 2810, the touchscreen displays a list of all recent calls (e.g., UI2800A of fig. 12A). In some embodiments, in response to the user activating the "missed" icon 2812, the touchscreen displays a list of recently missed calls (e.g., UI2800B of fig. 12B). The processing of missed calls will be further described below with reference to fig. 23-25. The processing of telephone call information, including recent calls, will be further described with reference to fig. 26-27.
In some embodiments, each row in the list corresponds to a call or a series of consecutive calls involving the same person or the same number (no call involving another person or another telephone number is inserted). In some embodiments, each row comprises: the other party's name 2802 (if available through the contacts module) or phone number (if the other party's name is not available); a number of consecutive calls 2804 (in an exemplary embodiment, number of consecutive calls 2804 is not displayed if it equals 1); date and/or time of last call 2806; and an additional information icon 2808. In some embodiments, in response to a user activating an icon 2808 for a particular row (e.g., by tapping a finger on the icon), the touchscreen displays the other party's corresponding contact list entry (e.g., UI2800C in fig. 12C), or UI2800D (fig. 12D), if the phone number cannot be associated with an entry in the user's contact list. In some embodiments, in response to a user tapping or other predetermined gesture elsewhere in a given row (i.e., a tap or gesture that is not on icon 2808), the phone module dials the corresponding phone number for that row.
In some embodiments, some rows may include an icon indicating whether the last call associated with the row was missed or answered.
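As an illustrative sketch only, the row-grouping behavior described above (consecutive calls involving the same party collapse into a single row with a count, the date of the last call, and a missed or answered indicator) might look as follows in Swift; the types and field names are assumptions.

```swift
import Foundation

// Illustrative sketch of the call-log grouping described above.
struct CallRecord {
    let party: String        // contact name if available, otherwise the phone number
    let date: Date
    let missed: Bool
}

struct CallRow {
    let party: String
    var count: Int           // a UI might hide this when it equals 1
    var lastDate: Date
    var lastMissed: Bool
}

/// Collapses a chronologically ordered call log into rows; a new row starts whenever
/// the party changes, so an interleaved call from someone else breaks the grouping.
func recentCallRows(from log: [CallRecord]) -> [CallRow] {
    var rows: [CallRow] = []
    for call in log {
        if var last = rows.last, last.party == call.party {
            last.count += 1
            last.lastDate = call.date
            last.lastMissed = call.missed
            rows[rows.count - 1] = last
        } else {
            rows.append(CallRow(party: call.party, count: 1,
                                lastDate: call.date, lastMissed: call.missed))
        }
    }
    return rows
}
```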
If there is a list of recent calls that fill in beyond the screen area, the user may scroll through the list on the touch screen using the vertical up and/or vertical down gesture 2814.
In some embodiments, the UI2800C highlights (e.g., with color, shading, and/or bolding) the phone number associated with the recent call (e.g., the two recent incoming calls from Bruce Walker in UI2800A came from Bruce Walker's work number 2816). In some embodiments, in response to the user tapping or performing another predetermined gesture on the highlighted number 2816, the phone module dials the highlighted number (e.g., 2816). In some embodiments, in response to the user tapping or performing another predetermined gesture on another number in the contact list entry (e.g., home number 2818), the phone module dials the corresponding number. In some embodiments, in response to the user tapping or performing another predetermined gesture on an email address in the contact list entry (e.g., work email 2820 or home email 2822), the mail module prepares an email message with the selected email address, ready for text input by the user. In some embodiments, in response to the user tapping or performing another predetermined gesture on an instant message object (not shown) corresponding to the telephone number, the instant message module prepares an instant message addressed to the corresponding telephone number, ready for text entry by the user. Thus, by selecting icon 2808 (fig. 12A) in the recent calls UI, the user can easily respond to the caller using the same number involved in the previous call (e.g., 2816), another number associated with the same caller (e.g., 2818), or another means of communication other than the telephone, such as an email to the caller's work 2820 or home 2822 email addresses.
In some embodiments, UI2800D provides one or more options so that the user can use the phone number in the recent call that is not associated with an entry in the user's contact list. In some embodiments, in response to a tap or other predetermined user gesture, the device may: call a phone number (e.g., if the gesture is applied to icon 2824); create a new contact with the phone number (e.g., if the gesture is applied to icon 2826); add the phone number to an existing contact (e.g., if the gesture is applied to icon 2828); or check the call history associated with the number (e.g., if a gesture is applied to icon 2830).
FIG. 13 illustrates an exemplary dial pad interface for placing a call, in accordance with certain embodiments. In response to the user activating number keys of the dial pad 2902 (e.g., by finger taps on the number icons), the touch screen displays the selected digits 2904. In some embodiments, the phone module automatically adds parentheses and dashes to the selected digits to make the number easier to read. In response to the user activating the call icon 2906, the phone module dials or transmits the selected number.
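A minimal Swift sketch of the legibility formatting described above is given below. It assumes a ten-digit North American number and a particular grouping pattern, neither of which is specified by this disclosure.

```swift
import Foundation

// Illustrative sketch of adding parentheses and dashes to entered digits for legibility.
func formatDialedDigits(_ digits: String) -> String {
    let d = String(digits.filter { $0.isNumber })
    guard d.count == 10 else { return d }      // only format complete 10-digit numbers here
    let area = d.prefix(3)
    let exchange = d.dropFirst(3).prefix(3)
    let line = d.suffix(4)
    return "(\(area)) \(exchange)-\(line)"
}

print(formatDialedDigits("4081234567"))   // (408) 123-4567
```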
FIGS. 14A-14D illustrate exemplary user interfaces displayed during a call, according to some embodiments. In some embodiments, the UI indicates that a call is being attempted 3002 (UI 3000A of fig. 14A and UI3000C of fig. 14C), and then indicates a connection time 3004 (UI 3000B of fig. 14B and UI3000D of fig. 14D) after the connection is achieved.
In some embodiments, in response to a tap or other predetermined user gesture, the device may: mute the call (e.g., if the gesture is applied to mute icon 3006); put the call on hold (e.g., if a gesture is applied to the call on hold icon 3008); placing the call on a speaker (e.g., if a gesture is applied to the speaker icon 3010); establish a conference call (e.g., if the gesture is applied to conference icon 3012 in fig. 14A-14B or add call icon 3018 in fig. 14C-14D); display the keypad (e.g., if the gesture is applied to the keypad icon 3016); display a contact list (e.g., if a gesture is applied to icon 3020); or end the call (e.g., if the gesture is applied to the end call icon 3014). The display of the respective items in the user interface will be further described below with reference to a process 5600 (fig. 29).
Fig. 15A and 15B illustrate exemplary user interfaces displayed during an incoming call, in accordance with some embodiments.
In some embodiments, if the incoming call is from a phone number associated with a person or other entry in the user's contact list, the touch screen may display: the name 3102 of the person or entry; a graphic 3104 associated with the person or entry; a decline icon 3106 that, when activated (e.g., by tapping a finger on the icon), causes the phone module to decline the call and/or initiate a voicemail for the call; and an answer icon 3108 that, when activated (e.g., by tapping a finger on the icon), causes the phone module to answer the call (e.g., UI3100A of fig. 15A). The display of these items will be further described below with reference to process 5500 (FIG. 28).
In some embodiments, if the incoming call is from a phone number that is not associated with a person or other entry in the user's contact list, the touch screen may display: the telephone number 3110 of the other party; a decline icon 3106 that, when activated (e.g., by tapping a finger on the icon), causes the phone module to decline the call and/or initiate a voicemail for the call; and an answer icon 3108 that, when activated (e.g., by tapping a finger on the icon), causes the phone module to answer the call (e.g., UI3100B of fig. 15B).
In some embodiments, when there is an incoming call, the device pauses certain other applications (e.g., the music player 146, video player, and/or slide show); displays UI3100A or UI3100B before the call is answered; displays UI3000B during the call; and ends the suspension of the other applications if the incoming call is declined or when the call is ended. In some embodiments, the transitions into and out of the pause are smooth (e.g., the volume of the music player is smoothly decreased and increased).
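As an illustrative sketch only, the smooth transition into and out of the pause might be modeled as a volume ramp around the pause and resume operations, as in the Swift sketch below. The protocol, the linear ramp, and the step count are assumptions; no real media-player API is used.

```swift
import Foundation

// Illustrative sketch of the smooth pause-in/pause-out behavior described above.
protocol VolumeAdjustable: AnyObject {
    var volume: Double { get set }   // 0.0 ... 1.0
    func pause()
    func resume()
}

/// Ramps the player volume down over `steps` increments, then pauses it.
func fadeOutAndPause(_ player: VolumeAdjustable, steps: Int = 10) {
    let start = player.volume
    for i in 1...steps {
        player.volume = start * Double(steps - i) / Double(steps)
        // On a real device each step would be spread over a short time interval.
    }
    player.pause()
}

/// Resumes the player and ramps its volume back up to `target`.
func resumeAndFadeIn(_ player: VolumeAdjustable, target: Double, steps: Int = 10) {
    player.resume()
    for i in 1...steps {
        player.volume = target * Double(i) / Double(steps)
    }
}
```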
Fig. 16A and 16B illustrate an exemplary user interface for voicemail, according to some embodiments. In certain embodiments, the user interfaces 3200A and 3200B comprise the following elements, or a subset or superset thereof:
● the elements 402, 404, and 406 described above;
● go back icon 3202 which, when activated (e.g. by tapping a finger on the icon), initiates a process to go back and replay the previous few seconds of voicemail messages;
● progress bar 3204, which indicates a portion of a voicemail message that has been played and is available to help scroll through the message in response to user gesture 3206;
● a speed-up icon 3208 that, when activated (e.g., by a finger tap on the icon), initiates a process that speeds up playback of the voicemail message; the process may also adjust the frequency or pitch of the sped-up audio so that the words, although spoken quickly, remain easy to understand;
● the name 3210 of the person who left the voicemail message (e.g., Aaron Jones 3210-1), associated with the incoming telephone number via the user's contact list, or, if the person's name is not available, the telephone number (e.g., 408-246-8101 3210-3);
● date 3212 and/or time of the voicemail;
● append information icon 3214 which, when activated (e.g., by tapping a finger on the icon), initiates a transition to a corresponding contact list entry (e.g., UI2800C of FIG. 12C) or to a UI for unknown phone numbers (e.g., UI2800D of FIG. 12D);
● speaker icon 3216 which when activated (e.g. by tapping a finger on the icon) initiates playback of the voicemail through the speaker;
● options icon 3218 which when activated (e.g. by a finger tap on the icon) initiates display of an additional voicemail options menu;
● pause icon 3220 which, when activated (e.g., by tapping a finger on the icon), initiates the pausing of the voicemail;
● a delete symbol icon 3222 that, when activated (e.g., by tapping a finger on the icon), initiates display of a UI (e.g., UI3200B of FIG. 16B) confirming that the user wishes to delete the corresponding voicemail;
● a cancel icon 3226 that, when activated (e.g., by tapping a finger on the icon), changes the display from UI3200B back to UI3200A without deleting the corresponding voicemail;
● delete icon 3228, which when activated (e.g., by tapping a finger on the icon), deletes the corresponding voicemail and changes the display from UI3200B to UI 3200A; and
● plays icon 3230, which when activated (e.g., by tapping a finger on the icon) initiates or continues playback of the voicemail.
If the list of voicemails fills beyond the screen area, the user may scroll through the list on the touch screen using a vertical up and/or vertical down gesture 3224.
In some embodiments, in response to a user tap or other predetermined gesture in the row corresponding to a particular voicemail (rather than a tap or gesture on icon 3214), the phone module initiates playback of the corresponding voicemail. Thus, there is random access to the voicemail and the voicemail can be listened to in any order.
In some embodiments, the playback position in the voicemail may be modified in response to a user gesture. For example, in response to the user's finger touching 3206 at or near the current playback position in the progress bar and then sliding along the progress bar, the playback position may be changed to correspond to the position of the user's finger along the progress bar. Such user gestures on the progress bar make it easy for the user to jump to and/or replay a portion of interest in the voicemail.
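A minimal Swift sketch of this scrubbing behavior is shown below: the playback position is derived from the finger's position along the progress bar, clamped to the bar's extent. The function and parameter names are assumptions for illustration.

```swift
import Foundation

// Illustrative sketch of the scrubbing behavior described above: the playback
// position follows the finger's position along the progress bar.
/// Maps a horizontal touch coordinate on a progress bar to a playback time in seconds.
func playbackPosition(forTouchX x: Double,
                      barMinX: Double,
                      barWidth: Double,
                      messageDuration: Double) -> Double {
    guard barWidth > 0 else { return 0 }
    let fraction = min(max((x - barMinX) / barWidth, 0), 1)   // clamp to the bar
    return fraction * messageDuration
}

// Example: a 40-second voicemail, finger halfway along a 200-point bar at x = 120.
print(playbackPosition(forTouchX: 120, barMinX: 20, barWidth: 200, messageDuration: 40))  // 20.0
```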
FIG. 17 illustrates an exemplary user interface for organizing and managing emails, according to some embodiments. In some embodiments, user interface 3300 includes the following elements, or a subset or superset thereof:
● the elements 402, 404, and 406 described above;
● a set of mailboxes, such as inbox 3302, which may be organized into a plurality of rows, each row having a selection icon 3306;
● setting icon 3308 which, when activated (e.g., by tapping a finger on the icon), initiates the display of a UI (e.g., UI3600 in fig. 20) for entering mailbox settings; and
● creates an email icon 3310 which, when activated (e.g., by tapping a finger on the icon), initiates the display of a UI (e.g., UI3400 in fig. 18) for creating a new email message.
If the set of mailboxes fills beyond the screen area, the user may scroll through the mailboxes on the touch screen using a vertical up and/or vertical down gesture 3312.
FIGS. 18A and 18B illustrate an exemplary user interface for creating an email, according to some embodiments.
In response to a user activation to create email icon 3310 (UI 3300 of fig. 17), the device displays UI 3400A.
In some embodiments, if the user taps or performs another predetermined gesture on the subject line 3408 or in the body 3412 of the email (fig. 18A), an alphabetical keyboard 616 appears and the user can enter the subject and/or body text (fig. 18B). In some embodiments, to enter an email address, the user taps or performs another predetermined gesture on the recipient line (To: line) 3406 of the email; the user's contact list appears (e.g., FIG. 8A); the user taps or performs another predetermined gesture on the desired recipient/contact; and the device places the corresponding email address in the email message (fig. 18B). In some embodiments, the user may also type in an email address using one or more keyboards (e.g., 616 and 624, not shown). In response to the user activating the send icon 3404 (fig. 18B) (e.g., by tapping a finger on the icon), the device sends the email message. Alternatively, if the user activates the cancel icon 3402, the device may display a save draft icon and a don't save icon (not shown). If the user activates the save draft icon, the device saves the draft, for example, in a drafts folder of the email client 140 (fig. 17). If the user activates the don't save icon, the device deletes the draft.
In some embodiments, in response to a user activating the attachment icon 3410 (e.g., by tapping a finger on the icon), the touchscreen displays a UI (not shown) for adding an attachment.
FIGS. 19A-19F illustrate exemplary user interfaces for displaying and managing inboxes, according to some embodiments. Similar user interfaces may be used to display and manage other mailboxes (e.g., draft boxes, outboxes, trash boxes, private, and/or work in UI 3300). In certain embodiments, user interfaces 3500A-3500F include the following elements, or a subset or superset thereof:
● the 402, 404, 406, and 3310 components described above;
● mailbox icon 3502 which, when activated (e.g., by tapping a finger on the icon), initiates display of mailbox UI3300 (fig. 17);
● unread messages icon 3504, which displays the number of unread messages in the inbox;
● name 3506 of the sender of the email message;
● subject line 3508 of the email message;
● date of email message 3510;
● unread messages icon 3512, indicating messages that have not yet been opened;
● a preview pane separator bar 3518, which separates the list of messages from a preview of the selected message in the list;
● set icon 3520 which, when activated (e.g., by tapping a finger on the icon), initiates display of the settings UI3600 (fig. 20);
● move message icon 3522, which when activated (e.g., by tapping a finger on the icon), initiates display of move message UI3800A (fig. 22);
● a delete symbol icon 3524 that, when activated (e.g., by tapping a finger on the icon), initiates display of a UI (e.g., UI3500E of fig. 19E) for confirming that the user wants to delete the selected email;
● reply/forward icon 3526 that, when activated (e.g., by tapping a finger on the icon), initiates display of a UI (e.g., UI3500F of fig. 19F) for selecting how to reply to or forward the selected email;
● preview pane 3528 displaying a portion of the selected email message;
● detailed information icon 3530 which, when activated (e.g., by a finger tap on the icon), initiates display of detailed information 3534 (FIG. 19C) of the email address;
● cancel icon 3540, which when activated (e.g., by tapping a finger on the icon), returns the device to the previous user interface (e.g., UI 3500D);
● delete icon 3542, which when activated (e.g., by tapping a finger on the icon), deletes the selected email;
● reply icon 3544 which, when activated (e.g., by a finger tap on the icon), initiates creation of an email reply to the sender;
● a reply all icon 3546 that, when activated (e.g., by tapping a finger on the icon), initiates creation of an email replying to the sender and the other parties included in the selected email (e.g., by cc); and
● forward icon 3548, which when activated (e.g., by tapping a finger on the icon), initiates creation of an email to be forwarded.
If the set of emails fills beyond the screen area (or beyond the screen area above the preview pane), the user can scroll through the emails on the touch screen using the vertical up and/or vertical down gesture 3514.
In some embodiments, if the preview pane 3528 is used, the email subject 3508 is not displayed (as shown in FIGS. 19B-19F). In some embodiments, the user can adjust the position of the preview pane separator bar by making contact 3516 on or near the separator bar and dragging the finger contact 3538 to move the separator bar to the desired position (see fig. 19A). In some embodiments, an arrow 3539 or other graphic may appear while the position of the preview pane separator bar is being adjusted (e.g., UI3500D of fig. 19D) to help guide the user.
In some embodiments, in response to a user tapping or other predetermined gesture in a row containing information (e.g., 3506, 3510, and/or 3508) about a particular email message, some or all of the text in the row is highlighted (e.g., by color, shading, or bolding) and the corresponding message is displayed in the preview pane area. In some embodiments, in response to a user tapping or other predetermined gesture in a row containing information (e.g., 3506, 3510, and/or 3508) about a particular email message, the email message is displayed on the full screen if the preview pane is not used.
In some embodiments, if the selected email fills beyond the preview pane area, the user may scroll through the email using the two-dimensional gesture 3532 in the preview pane, thereby causing the email to move vertically and/or horizontally on the touch screen.
In some embodiments, in response to the user activating an additional information icon (e.g., ">") on the detailed information 3534 (e.g., by tapping the finger 3536 on the icon), the touchscreen may display contact list information for the respective party, if available (e.g., the UI2800C of fig. 12C), or display a UI similar to the UI2800D of fig. 12D.
FIG. 20 illustrates an exemplary user interface for setting email user preferences, according to some embodiments. In some embodiments, the user interface 3600 includes the following elements, or a subset or superset thereof:
● the elements 402, 404, and 406 described above;
● a done icon 3602 that, when activated (e.g., by tapping a finger on the icon), returns the device to the previous UI;
● accounts 3604, for entering email account information;
● message list display 3606, for selecting whether to display sender 3506 and/or subject 3508 information in the email list;
● display newest messages 3608, for selecting whether the newest messages are displayed at the top or the bottom of the screen;
● message display location 3610, for selecting whether messages are displayed in the preview pane or in a full screen display;
● preferred message format 3612, for selecting how messages are formatted (e.g., HTML or plain text);
● rules 3614, for creating rules for managing email messages (e.g., using UI3700A of FIG. 21A and UI3700B of FIG. 21B); and
● a selection icon 3616 that, when activated (e.g., by tapping a finger on the icon), displays options for the corresponding setting.
In some embodiments, the user may tap anywhere in the row for a particular setting to initiate display of the corresponding setting option.
FIGS. 21A and 21B illustrate exemplary user interfaces for creating and managing email rules, in accordance with certain embodiments. In certain embodiments, the user interface 3700A includes the following elements, or a subset or superset thereof:
● the elements 402, 404, and 406 described above;
● a settings icon 3702 that, when activated (e.g., by tapping a finger on the icon), returns the device to the settings UI3600 (fig. 20);
● rules 3704;
● a selection icon 3706 that, when activated (e.g., by tapping a finger on the icon), displays options for the corresponding rule;
● an add icon 3708 that, when activated (e.g., by tapping a finger on the icon), displays a UI for creating a new rule (e.g., UI3700B of fig. 21B); and
● a done icon 3710 that, when activated (e.g., by tapping a finger on the icon), returns the device to the settings UI3600 (fig. 20).
In some embodiments, the user may tap anywhere in the row for a particular rule to initiate display of the corresponding rule (e.g., UI3700B of FIG. 21B).
FIGS. 22A and 22B illustrate exemplary user interfaces for moving email messages, according to some embodiments.
In response to the user activating the move message icon 3522 (see UI3500A of fig. 19A), the device displays UI3800A together with some information 3804 about the selected message.
In some embodiments, if the user taps 3802 or performs another predetermined gesture on the row corresponding to a particular mailbox or other folder, the message is moved to that mailbox or folder (e.g., the work mailbox in FIG. 22A). In some embodiments, the selected row is highlighted and an animation moves the message information 3804 into the selected row (as schematically illustrated in fig. 22B).
Fig. 23 is a flow diagram illustrating a process 5000 of handling a missed telephone call on a portable electronic device with a touch screen display, in accordance with certain embodiments. A list of items including missed telephone calls is displayed (5002). For example, UI2800B (fig. 12B) displays a list of missed calls. In some embodiments, a single item in the list of items corresponds to multiple missed telephone calls (5004). In some embodiments, information indicating the number of missed calls is displayed in each individual entry in the list (5006). For example, the first entry in UI2800B for Bruce Walker2803 corresponds to two missed telephone calls, as indicated by numeral 2805. In some embodiments, multiple calls corresponding to a single item are time-sequential. In some embodiments, a single item in the list of items corresponds to multiple missed telephone calls from multiple different telephone numbers associated with a respective caller (5008). Displaying a single item in the list of items that corresponds to multiple missed telephone calls narrows the list of missed calls and makes it easy for the user to determine which people are attempting to contact the user and how many times they have attempted to contact the user.
In some embodiments, a scroll gesture is detected, wherein the scroll gesture includes a substantially vertical motion of user contact with the touch screen display (5010). In response, the displayed list of items is scrolled (5012). For example, in response to the portrait gesture 2814, the list of items displayed in the UI2800B is scrolled (fig. 12B). The scrolling gesture provides a simple way for a user to quickly browse a list of items.
Upon detecting a user selection of an item in the list (5014), contact information of a respective caller corresponding to the item selected by the user is displayed (5016). For example, if the user selects the item of Bruce Walker2803 in UI2800B (fig. 12B), the contact information of Bruce Walker is displayed in UI2800C (fig. 12C). The contact information includes a plurality of contact objects. The plurality of contact objects includes a first contact object and a second contact object, wherein the first contact object includes a phone number object having a first phone number associated with a missed phone call. In some embodiments, the second contact object is an email contact object. In some embodiments, the second contact object is a phone number object having a second phone number different from the first phone number. In some embodiments, the second contact object is an instant messaging object. In the example of fig. 12C, the work phone number 2816 from which the two missed calls came corresponds to the first contact object. Any of objects 2818, 2820, and 2822 correspond to a second contact object.
Upon detecting a user selection of the second contact object (5018), communication with the respective caller is initiated via the contact modality corresponding to the second contact object (5020). In some embodiments, when the second contact object is an email contact object, communicating via the contact modality corresponding to the second contact object includes sending an email message. For example, a user selection of object 2820 in UI2800C (FIG. 12C) would initiate an email to Bruce Walker's work email address. In some embodiments, when the second contact object is a phone number object having a second phone number different from the first phone number, communicating via the contact modality corresponding to the second contact object includes initiating a phone call to the second phone number. For example, a user selection of object 2818 in UI2800C would initiate a telephone call to Bruce Walker's home number. In some embodiments, when the second contact object is an instant messaging object, communicating via the contact modality corresponding to the second contact object includes sending an instant message. Providing multiple contact objects makes it easy for the user to select and initiate communication with a missed caller via any available contact modality, without being limited to calling the missed caller back at the phone number associated with the missed call. For example, the user may easily dial Bruce Walker's home phone or send an email message to Bruce Walker, rather than calling back his work number.
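As an illustrative sketch only, the dispatch over the selected contact object in operations 5018-5020 might be modeled as follows in Swift; the enum, the associated values, and the stubbed actions are assumptions, not the actual implementation of the disclosed embodiments.

```swift
import Foundation

// Illustrative sketch: selecting a contact object initiates communication via the
// corresponding contact modality.
enum ContactObject {
    case phoneNumber(String)
    case emailAddress(String)
    case instantMessageHandle(String)
}

func initiateCommunication(with object: ContactObject) {
    switch object {
    case .phoneNumber(let number):
        print("Dialing \(number)")                       // e.g., the home number 2818
    case .emailAddress(let address):
        print("Composing email to \(address)")           // e.g., work email 2820
    case .instantMessageHandle(let handle):
        print("Composing instant message to \(handle)")
    }
}

// The user is not limited to calling back the number of the missed call:
initiateCommunication(with: .emailAddress("work@example.com"))
```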
Although the missed telephone call processing procedure 5000 described above includes a number of operations that appear to occur in a particular order, it should be apparent that: process 5000 may include more or fewer operations, which may be performed sequentially or in parallel (e.g., using parallel processors or a multi-threaded environment); the order of two or more operations may be changed; and/or two or more operations may be combined into a single operation. For example, if the item to be selected in operation 5014 is initially displayed in operation 5002, operations 5010 and 5012 may be omitted.
Fig. 24 is a flow chart illustrating a process 5100 of handling missed phone calls on a portable electronic device with a touch screen display, in accordance with certain embodiments. As described above with reference to process 5000 (fig. 23), a list of items including missed telephone calls is displayed (5002). In some embodiments, a single item in the list of items corresponds to multiple missed telephone calls (5004). In some embodiments, information indicating the number of missed calls is displayed in individual entries in the list (5006). In some embodiments, a single item in the list of items corresponds to multiple missed telephone calls from multiple different telephone numbers associated with a respective caller (5008). Displaying a single item in the list of items that corresponds to multiple missed telephone calls narrows the list of missed calls and makes it easy for the user to determine which people are attempting to contact the user and how many times they have attempted to contact the user.
In some embodiments, a scroll gesture is detected, wherein the scroll gesture includes a substantially longitudinal motion of a user contact with the touch screen display. In response, the displayed list of items is scrolled (not shown). The scrolling gesture provides a simple way for a user to quickly browse a list of items.
Upon detecting a finger contact with a first portion of the user-selected item in the list (5110), a return telephone call to a return number associated with the user-selected item is initiated (5112). For example, in some embodiments, in response to a tap or other predetermined gesture on a row of UI2800B (fig. 12B) other than on icon 2808, a return call to the corresponding number of the row is initiated.
Upon detecting a finger contact with a second portion of a respective item in the list (5114), contact information of the respective caller corresponding to the user-selected item is displayed (5116). The contact information includes a plurality of contact objects. The plurality of contact objects includes a first contact object and a second contact object, wherein the first contact object includes a phone number object having the return number. In some embodiments, the second contact object is an email contact object. In some embodiments, the second contact object is a phone number object having a second phone number different from the return number. In some embodiments, the second contact object is an instant messaging object. For example, in some embodiments, in response to a tap or other predetermined gesture on icon 2808 in the top row of UI2800B (fig. 12B), the corresponding contact information is displayed in UI2800C (fig. 12C). In the example of fig. 12C, the work phone number 2816 from which the two missed calls came corresponds to the first contact object. Any of objects 2818, 2820, and 2822 may correspond to the second contact object.
Upon detecting a user selection of the second contact object (5018), communication with the respective caller is initiated via the contact modality corresponding to the second contact object (5020). In some embodiments, when the second contact object is an email contact object, communicating via the contact modality corresponding to the second contact object includes sending an email message. In some embodiments, when the second contact object is a phone number object having a second phone number different from the return number, communicating via the contact modality corresponding to the second contact object includes initiating a phone call to the second phone number. In some embodiments, when the second contact object is an instant messaging object, communicating via the contact modality corresponding to the second contact object includes sending an instant message. In some embodiments, detecting contact with the first portion or the second portion of the item makes it easy for the user to either: (a) immediately call back the telephone number associated with the missed call, without having to view the contact information associated with the missed call (e.g., fig. 12C); or (b) view the contact information in order to select from a plurality of contact modalities associated with the missed caller.
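A minimal Swift sketch of the two tap targets used in process 5100 is given below: a tap on the body of a row returns the call immediately, while a tap on the additional information icon region displays the contact information. The row geometry and type names are assumptions for illustration.

```swift
import Foundation

// Illustrative sketch of the first-portion/second-portion distinction described above.
enum MissedCallAction {
    case returnCall(number: String)
    case showContactInfo(party: String)
}

struct MissedCallRow {
    let party: String
    let returnNumber: String
    let infoIconMinX: Double      // x-coordinate where the info icon region begins
}

/// Decides what a tap on a missed-call row should do, based on where the finger landed.
func action(forTapAtX x: Double, on row: MissedCallRow) -> MissedCallAction {
    if x >= row.infoIconMinX {
        return .showContactInfo(party: row.party)     // second portion (e.g., icon 2808)
    } else {
        return .returnCall(number: row.returnNumber)  // first portion of the row
    }
}
```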
While the missed telephone call handling procedure 5100 described above includes a number of operations that appear to occur in a particular order, it should be apparent that: process 5100 may include more or fewer operations, which may be performed sequentially or in parallel (e.g., using parallel processors or a multi-threaded environment); the order of two or more operations may be changed; and/or two or more operations may be combined into a single operation.
Fig. 25 is a flow diagram illustrating a process 5200 of handling missed telephone calls on a portable electronic device with a touch screen display, in accordance with certain embodiments. Missed telephone call information is displayed, including a list of items. At least one of the items corresponds to missed telephone calls from a respective caller (5202). In some embodiments, an item in the list of items corresponds to missed telephone calls from a plurality of different telephone numbers associated with a respective caller (5204). In some embodiments, at least two of the plurality of missed telephone calls from the respective caller include missed telephone calls from at least two different telephone numbers associated with the respective caller (5208). In some embodiments, one entry in the list of entries corresponds to one or more VoIP calls having an associated IP address (5206). Displaying a single item in the list of items that corresponds to multiple missed telephone calls narrows the list of missed calls and makes it easy for the user to determine which people are attempting to contact the user and how many times they have attempted to contact the user.
In some embodiments, a scroll gesture is detected, wherein the scroll gesture includes a substantially longitudinal movement of user contact with the touch screen display (5010). In response, the displayed list of items is scrolled (5210). The scrolling gesture provides a simple way for a user to quickly browse a list of items.
Upon detecting a user selection of an item in the list of items (5212), contact information for the respective caller corresponding to the user-selected item is displayed. The contact information includes a plurality of contact objects (5214). In some embodiments, examples of the contact objects include the examples described above with reference to operation 5016 of FIG. 23. Upon detecting a user selection of a respective one of the plurality of contact objects (5216), communication with the respective caller is initiated via the contact modality corresponding to the user-selected contact object (5218). In some embodiments, examples of the contact modalities include the examples described above with reference to operation 5020 of fig. 23. Providing multiple contact objects makes it easy for the user to select and initiate communication with a missed caller via any available contact modality, without being limited to calling the missed caller back at the phone number associated with the missed call. For example, the user may easily dial Bruce Walker's home phone or send an email message to Bruce, rather than calling back his work number.
While the missed telephone call processing procedure 5200 described above includes a number of operations that appear to occur in a particular order, it should be apparent that: process 5200 can include more or fewer operations, which can be performed sequentially or in parallel (e.g., using parallel processors or a multi-threaded environment); the order of two or more operations may be changed; and/or two or more operations may be combined into a single operation. For example, if the item to be selected in operation 5014 is initially displayed in operation 5002, operations 5010 and 5012 may be omitted. In another example, all operations except operation 5202 may be omitted.
Fig. 26 is a flow diagram illustrating a process 5300 for handling phone call information on a portable electronic device with a touch screen display, according to some embodiments. Telephone call information is displayed, including a list of items. At least one of the items corresponds to a plurality of telephone calls for a respective caller (5302). For example, UI2800A (fig. 12A) displays telephone call information for all recent calls. The entries of Kim Brook 2802 correspond to three telephone calls as indicated by numeral 2804. Displaying a single item in the list of items that corresponds to multiple missed telephone calls narrows the list of missed calls and makes it easy for the user to determine which people are attempting to contact the user and how many times they have attempted to contact the user. A scroll gesture is detected, wherein the scroll gesture includes a substantially longitudinal movement of user contact with the touch screen display (5010). For example, a longitudinal gesture 2814 is detected. In response, the display of the phone call information is scrolled (5304). The scrolling gesture provides a simple way for a user to quickly browse a list of items.
While the telephone call information handling process 5300 described above includes a number of operations that appear to occur in a particular order, it should be apparent that: process 5300 can include more or fewer operations, which can be performed sequentially or in parallel (e.g., with parallel processors or a multi-threaded environment); the order of two or more operations may be changed; and/or two or more operations may be combined into a single operation.
Figure 27 is a flow diagram illustrating a process 5400 for processing a previous phone call on a portable electronic device with a touch screen display, in accordance with certain embodiments. A list of items of previous phone calls is displayed (5402). In some embodiments, a single item in the list of items corresponds to multiple previous phone calls (5404). For example, UI2800A (fig. 12A) displays telephone call information for all recent calls. The entries of Kim Brook 2802 correspond to three telephone calls as indicated by numeral 2804. Displaying a single item in the list of items that corresponds to multiple missed telephone calls narrows the list of missed calls and makes it easy for the user to determine which people are attempting to contact the user and how many times they have attempted to contact the user.
Upon detecting a finger contact with a first portion of the user-selected item in the list (5406), a telephone call to a primary phone number associated with the user-selected item is initiated (5408). For example, in some embodiments, in response to a tap or other predetermined gesture on a row of UI2800A (FIG. 12A), other than on icon 2808, a return call to the number corresponding to that row is initiated.
Upon detecting a finger contact with a second portion of a respective item in the list, such as icon 2808 (5410), contact information of the respective caller associated with the item selected by the user is displayed (5412). The displayed contact information includes a plurality of contact objects, including a first contact object and a second contact object. The first contact object includes a phone number object having a primary phone number. In some embodiments, the second contact object is an email contact object, an instant messaging object, or a phone number object having a secondary phone number different from the primary phone number.
Upon detecting a user selection of the second contact object (5414), communication with the respective caller is initiated through the contact modality corresponding to the second contact object (5416). In some embodiments, when the second contact object is an email contact object, the communication corresponding to the second contact object includes sending an email message. In some embodiments, when the second contact object is a phone number object having a secondary phone number different from the primary phone number, the communication corresponding to the second contact object includes initiating a telephone call to the secondary phone number. In some embodiments, when the second contact object is an instant messaging object, the communication corresponding to the second contact object includes sending an instant message. In some embodiments, examples of the contact objects and the corresponding communications correspond to the examples provided with reference to operations 5016 and 5020 in FIG. 23. In some embodiments, detecting contact with the first portion or the second portion of the item enables the user to: (a) immediately call back the telephone number associated with the missed call without having to view contact information associated with the missed call (e.g., FIG. 12C); or (b) view the contact information to select from a plurality of communication modalities associated with the missed caller.
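The two tap targets described above can be modeled as follows in a minimal Swift sketch; the names (RowPortion, handleTap) and the sample contact data are hypothetical and only illustrate the branch between an immediate call-back and showing the contact card.

```swift
import Foundation

// Hypothetical sketch of the two tap targets on a recent-calls row: the row body
// (first portion) calls back the primary number immediately; the detail icon
// (second portion, cf. icon 2808) reveals the full set of contact objects.
enum RowPortion {
    case body
    case detailIcon
}

struct RecentCallItem {
    let callerName: String
    let primaryNumber: String           // number associated with the previous/missed call
    let secondaryNumbers: [String]
    let emailAddresses: [String]
}

func handleTap(on item: RecentCallItem, at portion: RowPortion) {
    switch portion {
    case .body:
        // (a) immediate call back, no contact card shown
        print("Calling \(item.callerName) at \(item.primaryNumber)")
    case .detailIcon:
        // (b) show contact information so the user can pick another modality
        print("Contact card for \(item.callerName):")
        print("  phone (primary): \(item.primaryNumber)")
        for number in item.secondaryNumbers { print("  phone: \(number)") }
        for address in item.emailAddresses { print("  email: \(address)") }
    }
}

let item = RecentCallItem(
    callerName: "Kim Brook",
    primaryNumber: "+1-650-555-0123",
    secondaryNumbers: ["+1-650-555-0199"],
    emailAddresses: ["kim@example.com"]
)

handleTap(on: item, at: .body)        // calls back immediately
handleTap(on: item, at: .detailIcon)  // displays the contact objects
```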
While the previous phone call processing procedure 5400 described above includes a number of operations that appear to occur in a particular order, it should be apparent that: process 5400 can include more or fewer operations, which can be performed sequentially or in parallel (e.g., using parallel processors or a multi-threaded environment); the order of two or more operations may be changed; and/or two or more operations may be combined into a single operation.
FIG. 28 is a flow diagram illustrating a process 5500 for handling an incoming telephone call on a portable electronic device with a touch screen display, in accordance with some embodiments. An incoming telephone call from a caller is detected (5502). In some embodiments, contact information corresponding to the caller is identified (5504). A text identifier of the caller (e.g., caller name 3102 of FIG. 15A) and an image associated with the caller (e.g., graphic 3104) are displayed (5506). In some embodiments, the text identifier and the image are taken from the identified contact information. A call answer icon (e.g., icon 3108 of FIG. 15A) is displayed, which, if selected by the device user, answers the incoming telephone call. A call decline icon (e.g., icon 3106) is displayed, which, if selected by the device user, declines the incoming telephone call. The process 5500 provides call information and presents the call options available to the user in a simple and clear manner.
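A minimal Swift sketch of the incoming-call state described above, assuming the caller's contact record has already been looked up; the caller name, image file name, and type names are hypothetical.

```swift
import Foundation

// Hypothetical sketch of the incoming-call screen state: a text identifier and an
// optional image for the caller, plus answer/decline actions (cf. icons 3108 and 3106).
struct IncomingCallScreen {
    let callerText: String       // contact name, or the raw number if no contact matches
    let callerImageName: String? // nil when no image is associated with the caller

    enum Action { case answer, decline }

    func handle(_ action: Action) {
        switch action {
        case .answer:  print("Answering call from \(callerText)")
        case .decline: print("Declining call from \(callerText)")
        }
    }
}

// Assumes the caller's contact information has already been identified; the
// name and image file are made-up sample values.
let screen = IncomingCallScreen(callerText: "Bruce Walker", callerImageName: "bruce.png")
print("Incoming call: \(screen.callerText), image: \(screen.callerImageName ?? "none")")
screen.handle(.answer)
```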
While the incoming call processing procedure 5500 described above includes a number of operations that appear to occur in a particular order, it should be apparent that: process 5500 may include more or fewer operations, which may be performed sequentially or in parallel (e.g., using parallel processors or a multi-threaded environment); the order of two or more operations may be changed; and/or two or more operations may be combined into a single operation.
FIG. 29 is a flow diagram illustrating a process 5600 for handling an established telephone call on a portable electronic device with a touch screen display, according to some embodiments. Upon detecting that a telephone call has been established between the device user and another entity (5602), the following items are displayed simultaneously (5604): a mute icon (e.g., mute icon 3006 in FIGS. 14B and 14D) for muting a microphone of the device; a keypad icon for displaying a keypad; a speaker icon for activating a speaker mode of the device; a conference call icon or an add call icon for forming a multi-party telephone call between the user, the other entity, and at least one further entity; a call hold icon for suspending the telephone call; a contacts icon for displaying a contact list; and an end call icon for ending the telephone call. Examples of these items are illustrated in UI3000B of FIG. 14B and UI3000D of FIG. 14D. In some embodiments, not all of the listed icons are displayed; for example, in UI3000B (FIG. 14B), neither the contacts icon nor the keypad icon is displayed. Process 5600 provides call information and presents the call options available to the user in a simple and clear manner.
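A minimal Swift sketch of the in-call options described above; which icons are visible is passed in as state, mirroring the note that not every embodiment displays all of them. All identifiers are illustrative.

```swift
import Foundation

// Hypothetical sketch of the in-call option icons shown once a call is established.
enum InCallIcon: String {
    case mute, keypad, speaker, addCall, hold, contacts, endCall
}

struct InCallScreen {
    let visibleIcons: [InCallIcon]

    func tap(_ icon: InCallIcon) {
        // Only icons displayed in the current state can be selected.
        guard visibleIcons.contains(icon) else {
            print("\(icon.rawValue) icon is not displayed in this state")
            return
        }
        switch icon {
        case .mute:     print("Microphone muted")
        case .keypad:   print("Keypad displayed")
        case .speaker:  print("Speaker mode activated")
        case .addCall:  print("Adding another party (conference call)")
        case .hold:     print("Call placed on hold")
        case .contacts: print("Contact list displayed")
        case .endCall:  print("Call ended")
        }
    }
}

// A state analogous to UI3000B, where the contacts and keypad icons are omitted.
let screen = InCallScreen(visibleIcons: [.mute, .speaker, .addCall, .hold, .endCall])
screen.tap(.mute)      // "Microphone muted"
screen.tap(.keypad)    // "keypad icon is not displayed in this state"
```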
The foregoing description has, for purposes of explanation, been presented with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilize the invention and the various embodiments with such modifications as are suited to the particular use contemplated.
Claims (16)
1. A method of responding to a missed call, comprising:
displaying, on a touch screen display of a portable electronic device, a list comprising a plurality of interactive display items associated with telephone calls,
wherein at least one interactive display item in the list is associated with a missed telephone call and is also associated with contact information stored in the portable electronic device;
wherein each such interactive display item associated with both the missed telephone call and the contact information comprises at least a first interactive display portion and a second interactive display portion, the second interactive display portion being separate from the first interactive display portion;
detecting a finger tap input at a first user-selected interactive display item associated with both the missed phone call and the contact information, wherein the finger tap input is detected at a first interactive display portion of the first user-selected interactive display item;
in response to a finger tap input detected at a first user-selected interactive display item, initiating a phone call to a phone number associated with the first user-selected interactive display item;
detecting a finger tap input at a second user-selected interactive display item associated with both the missed phone call and the contact information, wherein the finger tap input is detected at a second interactive display portion of the second user-selected interactive display item;
in response to a finger tap input detected at a second user-selected interactive display item, displaying contact information of a caller corresponding to the second user-selected interactive display item, the displayed contact information including a plurality of contact objects displayed simultaneously, the plurality of contact objects including:
a first contact object associated with a telephonic communication modality for calling the caller, and
a second contact object associated with a non-telephonic communication modality for contacting the caller; and
in response to detecting a user selection of the second contact object, initiating communication with the corresponding caller via the non-telephonic communication modality corresponding to the second contact object.
2. The method of claim 1, wherein the second contact object is an email contact object, and wherein initiating communication through the non-telephonic communication modality corresponding to the second contact object includes preparing, for text entry by the user, an email message addressed to an email address associated with the second contact object.
3. The method of claim 1, wherein the second contact object is an instant message object, and wherein initiating communication via the non-telephonic communication modality corresponding to the second contact object comprises preparing, for text entry by the user, an instant message addressed to a number associated with the second contact object.
4. The method of claim 1, wherein a single interactive display item in the list corresponds to a plurality of consecutive missed telephone calls.
5. The method of claim 1, comprising displaying the list of items including missed telephone calls while displaying, within a respective single item of the list, a numeral indicating the number of consecutive missed telephone calls.
6. The method of claim 1, wherein a single interactive display item in the list corresponds to missed telephone calls from different telephone numbers associated with the respective caller.
7. The method of claim 1, wherein the second interactive display portion of the second user-selected interactive display item is identified by an icon displayed in the second user-selected interactive display item.
8. The method of claim 1, wherein the second contact object is a contact object associated with sending an email to the caller, and the plurality of simultaneously displayed contact objects includes a third contact object associated with sending an instant message to the caller.
9. An apparatus for responding to a missed call, comprising:
means for displaying, on a touch screen display of a portable electronic device, a list comprising a plurality of interactive display items associated with telephone calls,
wherein at least one interactive display item in the list is associated with a missed telephone call and is also associated with contact information stored in the portable electronic device;
wherein each such interactive display item associated with both the missed telephone call and the contact information comprises at least a first interactive display portion and a second interactive display portion, the second interactive display portion being separate from the first interactive display portion;
means for detecting a finger tap input at a first user selected interactive display item associated with both the missed phone call and the contact information, wherein the finger tap input is detected at a first interactive display portion of the first user selected interactive display item;
means for initiating a phone call to a phone number associated with a first user-selected interactive display item in response to a finger tap input detected at the first user-selected interactive display item;
means for detecting a finger tap input at a second user selected interactive display item associated with both the missed phone call and the contact information, wherein the finger tap input is detected at a second interactive display portion of the second user selected interactive display item;
means for displaying contact information of a caller corresponding to a second user-selected interactive display item in response to a finger tap input detected at the second user-selected interactive display item, the displayed contact information including a plurality of contact objects displayed simultaneously, the plurality of contact objects including:
a first contact object associated with a telephonic communication modality for calling the caller, and
a second contact object associated with a non-telephonic communication modality for contacting the caller; and
means for initiating, in response to detecting a user selection of the second contact object, communication with the corresponding caller via the non-telephonic communication modality corresponding to the second contact object.
10. The apparatus of claim 9, wherein the second contact object is an email contact object, and wherein initiating communication through the non-telephonic communication modality corresponding to the second contact object includes preparing, for text entry by the user, an email message addressed to an email address associated with the second contact object.
11. The apparatus of claim 9, wherein the second contact object is an instant message object, and wherein initiating communication via the non-telephonic communication modality corresponding to the second contact object includes preparing, for text entry by the user, an instant message addressed to a number associated with the second contact object.
12. The apparatus of claim 9, wherein a single interactive display item in the list corresponds to a plurality of consecutive missed telephone calls.
13. The apparatus of claim 9, including means for displaying the list of items including missed telephone calls while displaying, within a respective single item of the list, a numeral indicating the number of consecutive missed telephone calls.
14. The apparatus of claim 9, wherein a single interactive display item in the list corresponds to missed telephone calls from different telephone numbers associated with the respective caller.
15. The apparatus of claim 9, wherein the second interactive display portion of the second user-selected interactive display item is identified by an icon displayed in the second user-selected interactive display item.
16. The apparatus of claim 9, wherein the second contact object is a contact object associated with sending an email to the caller, and the plurality of simultaneously displayed contact objects includes a third contact object associated with sending an instant message to the caller.
Applications Claiming Priority (10)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US82476906P | 2006-09-06 | 2006-09-06 | |
| US60/824,769 | 2006-09-06 | | |
| US88378307P | 2007-01-06 | 2007-01-06 | |
| US60/883,783 | 2007-01-06 | | |
| US87925307P | 2007-01-07 | 2007-01-07 | |
| US60/879,253 | 2007-01-07 | | |
| US87946907P | 2007-01-08 | 2007-01-08 | |
| US60/879,469 | 2007-01-08 | | |
| US11/769,695 | 2007-06-27 | | |
| US11/769,695 US20080055263A1 (en) | 2006-09-06 | 2007-06-27 | Incoming Telephone Call Management for a Portable Multifunction Device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1177567A1 (en) | 2013-08-23 |
| HK1177567B (en) | 2015-07-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20230362601A1 (en) | Missed telephone call management for a portable multifunction device | |
| AU2009100722A4 (en) | Incoming telephone call management for a portable multifunction device with touch screen display | |
| US11057335B2 (en) | Portable multifunction device, method, and graphical user interface for an email client | |
| EP2069899B1 (en) | Deletion gestures on a portable multifunction device | |
| US8253695B2 (en) | Email client for a portable multifunction device | |
| CN101529874A (en) | Incoming call management for portable multifunction device with touch screen display | |
| AU2020239803B2 (en) | Incoming telephone call management for a portable multifunction device with touch screen display | |
| HK1177567B (en) | Method and device for making respondiveness for missed calls | |
| HK1172980B (en) | Deletion gestures on a portable multifunction device |