
WO2024144800A1 - Methods and systems for operating handheld devices - Google Patents


Info

Publication number
WO2024144800A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
touchscreen
input
hand
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2022/082399
Other languages
French (fr)
Inventor
Jingmin Zhou
Bin Liu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Innopeak Technology Inc
Original Assignee
Innopeak Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Innopeak Technology Inc filed Critical Innopeak Technology Inc
Priority to CN202280100993.2A priority Critical patent/CN120035801A/en
Priority to PCT/US2022/082399 priority patent/WO2024144800A1/en
Publication of WO2024144800A1 publication Critical patent/WO2024144800A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 1/1626: Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F 1/1692: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a secondary touch screen used as a control interface, e.g. virtual buttons or sliders
    • G06F 1/1694: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886: Interaction techniques using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 2203/0339: Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust a parameter or to implement a row of soft keys

Definitions

  • a handheld device typically includes a touchscreen that functions both as an output display and an interface for receiving user input. While the large display is almost always welcome, large touchscreens are often difficult for users to operate, especially with one hand. There are various solutions to make one-hand operation easier for users, but they have been inadequate, as described below.
  • the present invention is directed to methods and systems for operating handheld devices.
  • side sensors that are configured on the sides of a handheld device capture the operating characteristics of a user’s hand.
  • the display is changed to make one-hand operation easier.
  • Embodiments of the present invention can be implemented in conjunction with existing systems and processes.
  • the present device configuration and its related methods according to the present invention can be used in a wide variety of handheld systems, including cellphones, tablets, and other mobile devices.
  • various techniques according to the present invention can be adopted into existing systems via integrated circuit fabrication, mobile operating software, and wireless communication protocols. There are other benefits as well.
  • a system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions.
  • One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by the data processing apparatus, cause the apparatus to perform the actions.
  • One general aspect includes a handheld system that includes a housing.
  • the system also includes a touchscreen configured on a front side of the housing, the touchscreen may include a first one-hand display region.
  • the system also includes a first sensor configured on a left side of the housing, the first sensor being configured to generate a first sensor reading.
  • the system also includes a second sensor configured on a right side of the housing, the second sensor being configured to generate a second sensor reading.
  • the system also includes a storage configured to store display preference data.
  • the system also includes a processor coupled to the first sensor and the second sensor, the processor being configured to: process the display preference data, select a one-hand operation mode based on at least the first sensor reading and the second sensor reading, and move a user interface to the first one-hand display region.
  • Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
  • Implementations may include one or more of the following features.
  • the handheld system where the processor may include a central processing unit and a graphic processing unit, the graphic processing unit being coupled to the touchscreen.
  • the first sensor may include a pressure sensor.
  • the handheld system includes the first sensor, which may include a pressure sensor.
  • the first sensor may include a capacitive sensor.
  • the handheld system may include an accelerometer, the processor being configured to move the user interface to a second one-hand display region based on an input from the accelerometer.
  • the processor is further configured to update the display preference data based on user input received from the touchscreen.
  • the processor is further configured to select between a left-hand mode and a right-hand mode based on the first sensor reading and the second sensor reading.
  • the touch screen further may include a second one-hand display region, the first one-hand display region being associated with the right-hand mode, and the second one-hand display region being associated with the left-hand mode.
  • the user interface may include one or more control items.
  • the handheld system may include a communication module configured to transmit the display preference data to a server.
  • the processor may include a neural processing unit configured to update the display preference data based on user inputs. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
  • One general aspect includes a method for operating a handheld system.
  • the method includes displaying a user interface on a first region and a second region of a touchscreen.
  • the method also includes receiving a first input from a first side sensor, the first side sensor being configured on a first side of a housing.
  • the method also includes receiving a second input from a second side sensor, the second side sensor being configured on a second side of a housing.
  • the method also includes obtaining display preference data from a storage.
  • the method also includes processing the first input and the second input.
  • the method also includes selecting a one-hand operation mode based on the first input and the second input.
  • the method also includes selecting the first region of the touchscreen based on the one-hand operation mode and the display preference data.
  • the method also includes removing the user interface from the second region of the touchscreen based on the one-hand operation mode and the display preference data.
  • the method also includes receiving a user input within the first region of the touchscreen.
  • Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
  • Implementations may include one or more of the following features.
  • the method may include displaying the user interface in the first region and the second region of the touchscreen upon receiving an indication of leveling change.
  • the method may include shifting or scaling the user interface to the first region of the touchscreen.
  • the method may include determining a control item location on the touchscreen.
  • the method may include determining a distance between the control item location and a thumb.
  • Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
  • One general aspect includes a method for calibrating an input mode for operating a handheld device.
  • the method also includes providing a touchscreen.
  • the method also includes displaying a calibration interface on the touch screen.
  • the method also includes creating a display preference profile based on a predetermined template.
  • the method also includes receiving user input in a one-hand mode on the touchscreen during a first time interval.
  • the method also includes receiving a first side sensor input from a first side sensor configured on a first side of the handheld device during the first time interval.
  • the method also includes receiving a second side sensor input from a second side sensor configured on a second side of the handheld device during the first time interval.
  • the method also includes updating the display preference profile using at least the first side sensor input and the second side sensor input.
  • the present invention provides configurations and methods for handheld devices that allow users to use handhold gestures to move a user interface displayed on the touchscreen to position certain control items of the user interface in more accessible locations (e.g., within reach while in a single-hand operating mode). Additionally, the present invention implements flexible approaches to configure and calibrate display preference profiles related to the operation of the user interface.
  • Figure 1 is a simplified diagram illustrating a handheld device configured with side sensors according to embodiments of the present invention.
  • Figure 2 is a simplified block diagram illustrating components of a handheld device according to embodiments of the present invention.
  • Figure 3 is a simplified flow diagram illustrating a method for operating a handheld device according to embodiments of the present invention.
  • Figure 4 is a simplified flow diagram illustrating a method for setting a one-hand operation mode for a handheld device according to embodiments of the present invention.
  • Figure 5 is a simplified diagram illustrating shifting the display region to the left side to make a control item reachable to a user according to embodiments of the present invention.
  • Figure 6 is a simplified diagram illustrating shifting the display region to the bottom to make a control item reachable to a user according to embodiments of the present invention.
  • Figure 7 is a simplified diagram illustrating shifting the display region to the bottom-left corner to make a control item reachable to a user according to embodiments of the present invention.
  • the present invention is directed to methods and systems for operating handheld devices.
  • side sensors configured on the sides of a handheld device capture the operating characteristics of a user's hand. Upon detection of a one-hand operation mode using the side sensors, the display is changed to make one-hand operation easier.
  • the display is changed to make one-hand operation easier.
  • embodiments of the present invention provide improved methods and systems for operating handheld devices.
  • embodiments of the present invention can determine one or more touch points of a user’s hand on the phone, as well as the relative pressure of each touch point.
  • the handheld device can determine various user gestures that can trigger different operating modes, such as a one-hand operation mode to provide more convenient access for a user’s thumb to a particular control item of a user interface.
  • the handheld device can also include a level sensor array, which can be used to determine the leveling state of the phone and determine particular operation modes as well.
  • FIG. 1 is a simplified diagram illustrating a handheld device 100 configured with side sensors according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
  • the first sensor 131 can be configured to generate at least a first sensor reading
  • the second sensor 132 can be configured to generate at least a second sensor reading.
  • Each of the first and second sensors can include a pressure sensor, a capacitive sensor, or other touch sensor and combinations thereof. These sensors can be used to detect and process pressure readings of a user’s fingers.
  • the device 100 may include one or more additional sensors that generate one or more additional sensor readings.
  • the storage 140 can be configured to store display preference data.
  • the storage 140 can include various memory devices, such as random-access memory (RAM), flash memory, and the like.
  • the method includes processing the first input and the second input, and selecting a one-hand operation mode based on the first and second inputs, respectively.
  • the processor can be configured to process the first and second inputs, as well as any additional inputs from additional sensors, to determine the one-hand operation mode.
  • the method can also include processing inputs related to one or more control items, such as a control item location on the touchscreen, a distance between the control item location and a user’s finger (e.g., thumb), etc.
  • the method includes selecting the first region of the touchscreen based on the one-hand operation mode and the display preference data.
  • the method includes creating a display preference profile based on a predetermined template.
  • the predetermined template can include one or more settings, configurations, or functions that are related to the display preference profile.
  • the predetermined template can also include a plurality of predetermined display preference profiles. A user can provide inputs to one or more control items provided by the calibration interface to determine profile elements, select certain profiles, or the like.
  • the method includes receiving user input in a one-hand mode on the touchscreen during a first time interval. Also, in steps 410 and 412, the method includes receiving a first side sensor input from a first side sensor configured on a first side of the handheld device during the first time interval, and receiving a second side sensor input from a second side sensor configured on a second side of the handheld device during the first time interval. As discussed previously, the method can also include receiving one or more sensor inputs from one or more sensors configured within various portions of the handheld device.
  • the method includes updating the display preference profile using at least the first side sensor input and the second side sensor input.
  • the update to the profile can include changes to one or more profile elements, a selection of another predetermined profile template, or the like.
  • the method can include updating the display preference profile during a second time interval based on a third side sensor input received from the first side sensor.
  • additional sensor inputs from the previous sensors or additional sensors may be used in the previous time intervals or additional time intervals to perform interactions with the user interface, the display preference profile, or other related functions.
  • the application can build an AI model/profile of the user's single-handhold gesture (e.g., instead of the template profile creation of step 406 or as part of the profile update of step 414).
  • This data can be used to determine the shifting offset of the displayed user interface, the threshold of activating the shifting of the user interface while tilting the handheld device, etc.
  • the application can also give the user the choice of enabling a continuous user behavior learning process in the background to continue updating the profile while the user is operating the device in daily activities.
  • An NPU can be used with the calibration application to build the model/profile and for the continuous learning process as well.
  • Figure 5 shows a handheld device 501 displaying a user interface in a first state and a handheld device 502 displaying a user interface (UI) in a second state following a user action trigger.
  • Devices 501 and 502 have a touchscreen display 510 that displays a user interface area 520, which includes at least one control item 530 (e.g., button, icon, text field, interactive element, etc.).
  • the user action trigger can include one or more sensor inputs received from such sensors.
  • a user may have difficulty reaching a particular control item 530 while holding the device 501/502 in one hand.
  • the control item 530 being in the upper-right corner of the touchscreen may be difficult to reach in a left-handed operation (i.e., using the left-hand thumb).
  • the user action trigger results in moving the UI area 520 such that the control item 530 moves from the right-side (denoted as the higher side) to the left-side (denoted as the lower side).
  • the movement of the UI area 520 leaves a portion of the touchscreen 510 with a blank screen area 522.
  • the process 900 can include various user action triggers, such as: a fast shake trigger 912, a side multi-tap trigger 914, and a side finger squeeze trigger 916.
  • these user action triggers can be detected by receiving one or more sensor inputs from sensors configured within various portions of the handheld device. If any of these triggers are detected, then the process 900 performs a sequence of check conditions to determine whether to enter a one-hand operation mode.
  • the check conditions can include: a threshold leveling change in a time interval 922, a sensor detection of one-hand operation 924, a control item presence on user interface 926, and a threshold control item distance to user thumb 928.
  • the order of the check conditions for the sensor detection of one-hand operation 924 and the control item presence on user interface 926 can be reversed.
  • the threshold control item distance to user thumb 928 can range from the distance between the edge and the middle of the screen (e.g., about 3 cm) to the distance between opposite corners of the screen (e.g., about 17 cm). In some instances, smaller threshold distances may not be useful because the user's thumb can already reach the control item without adjusting the user interface. And, in some instances, greater threshold distances may result in a poor user experience because the device is too large for comfortable one-hand operation. A minimal sketch of this trigger-and-check sequence appears after this list.
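For illustration only, the following Kotlin sketch summarizes the trigger-and-check sequence described in the bullets above. The enum values, field names, and default thresholds are assumptions of this sketch and are not terms defined by the disclosure; the roughly 3 cm lower bound reflects the example range given above.

```kotlin
// Hypothetical sketch of the one-hand-mode entry checks; names and thresholds are illustrative.
enum class Trigger { FAST_SHAKE, SIDE_MULTI_TAP, SIDE_FINGER_SQUEEZE }

data class EntryState(
    val levelingChangeDeg: Float,     // tilt change observed within the check's time interval
    val oneHandDetected: Boolean,     // one-hand grip confirmed by the side sensors
    val controlItemOnScreen: Boolean, // a target control item is present on the user interface
    val thumbToItemCm: Float          // estimated thumb-to-control-item distance
)

fun shouldEnterOneHandMode(
    trigger: Trigger?,
    state: EntryState,
    levelingThresholdDeg: Float = 15f, // assumed value; the source does not specify one
    distanceThresholdCm: Float = 3f    // source suggests a range of roughly 3 cm to 17 cm
): Boolean {
    if (trigger == null) return false                              // no user action trigger detected
    if (state.levelingChangeDeg < levelingThresholdDeg) return false // assumed direction of the leveling check
    if (!state.oneHandDetected) return false
    if (!state.controlItemOnScreen) return false                   // nothing to bring closer
    return state.thumbToItemCm >= distanceThresholdCm              // only shift when the item is out of reach
}
```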

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention is directed to methods and systems for operating handheld devices. In a specific embodiment, side sensors that are configured on the sides of a handheld device capture the operating characteristics of a user's hand. Upon detection of a one-hand operation mode using the side sensors, the display is changed to make one-hand operation easier. There are other embodiments as well.

Description

METHODS AND SYSTEMS FOR OPERATING HANDHELD DEVICES
BACKGROUND OF THE INVENTION
[0001] Over the past two decades, handheld devices such as mobile phones and small tablets have become ubiquitous. A handheld device typically includes a touchscreen that functions both as an output display and an interface for receiving user input. While the large display is almost always welcome, large touchscreens are often difficult for users to operate, especially with one hand. There are various solutions to make one-hand operation easier for users, but they have been inadequate, as described below.
[0002] Therefore, new and improved methods and systems for operating handheld devices are desired.
BRIEF SUMMARY OF THE INVENTION
[0003] The present invention is directed to methods and systems for operating handheld devices. In a specific embodiment, side sensors that are configured on the sides of a handheld device capture the operating characteristics of a user's hand. Upon detection of a one-hand operation mode using the side sensors, the display is changed to make one-hand operation easier. There are other embodiments as well.
[0004] Embodiments of the present invention can be implemented in conjunction with existing systems and processes. For example, the present device configuration and its related methods according to the present invention can be used in a wide variety of handheld systems, including cellphones, tablets, and other mobile devices. Additionally, various techniques according to the present invention can be adopted into existing systems via integrated circuit fabrication, mobile operating software, and wireless communication protocols. There are other benefits as well.
[0005] A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by the data processing apparatus, cause the apparatus to perform the actions. One general aspect includes a handheld system that includes a housing. The system also includes a touchscreen configured on a front side of the housing, the touchscreen may include a first one-hand display region. The system also includes a first sensor configured on a left side of the housing, the first sensor being configured to generate a first sensor reading. The system also includes a second sensor configured on a right side of the housing, the second sensor being configured to generate a second sensor reading. The system also includes a storage configured to store display preference data. The system also includes a processor coupled to the first sensor and the second sensor, the processor being configured to: process the display preference data, select a one-hand operation mode based on at least the first sensor reading and the second sensor reading, and move a user interface to the first one-hand display region. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
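For illustration only, the following Kotlin sketch models this system aspect: two side-sensor readings, stored display preference data, and a processor routine that selects a one-hand mode and returns the region the user interface should move into. The class names, the pressure and contact-count fields, and the left/right heuristic are assumptions of this sketch and do not appear in the disclosure.

```kotlin
// Hypothetical sketch of the claimed handheld system; not an actual implementation.
data class SideReading(val pressure: Float, val contactCount: Int)
enum class OneHandMode { LEFT_HAND, RIGHT_HAND, NONE }
data class Region(val x: Int, val y: Int, val width: Int, val height: Int)

class OneHandProcessor(private val displayPreferences: Map<OneHandMode, Region>) {

    // Heuristic only: the gripping palm side tends to show one broad, high-pressure contact,
    // while the opposite side shows several lighter fingertip contacts.
    fun selectMode(left: SideReading, right: SideReading): OneHandMode = when {
        left.pressure > right.pressure && left.contactCount < right.contactCount -> OneHandMode.LEFT_HAND
        right.pressure > left.pressure && right.contactCount < left.contactCount -> OneHandMode.RIGHT_HAND
        else -> OneHandMode.NONE
    }

    // Returns the one-hand display region the user interface should be moved into, if any.
    fun regionFor(mode: OneHandMode): Region? =
        if (mode == OneHandMode.NONE) null else displayPreferences[mode]
}
```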
[0006] Implementations may include one or more of the following features. The handheld system where the processor may include a central processing unit and a graphic processing unit, the graphic processing unit being coupled to the touchscreen. The first sensor may include a pressure sensor. The handheld system includes the first sensor, which may include a pressure sensor. The first sensor may include a capacitive sensor. The handheld system may include an accelerometer, the processor being configured to move the user interface to a second one-hand display region based on an input from the accelerometer. The processor is further configured to update the display preference data based on user input received from the touchscreen. The processor is further configured to select between a left-hand mode and a right-hand mode based on the first sensor reading and the second sensor reading. The touch screen further may include a second one-hand display region, the first one-hand display region being associated with the right-hand mode, and the second one-hand display region being associated with the left-hand mode. The user interface may include one or more control items. The handheld system may include a communication module configured to transmit the display preference data to a server. The processor may include a neural processing unit configured to update the display preference data based on user inputs. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
[0007] One general aspect includes a method for operating a handheld system. The method includes displaying a user interface on a first region and a second region of a touchscreen. The method also includes receiving a first input from a first side sensor, the first side sensor being configured on a first side of a housing. The method also includes receiving a second input from a second side sensor, the second side sensor being configured on a second side of a housing. The method also includes obtaining display preference data from a storage. The method also includes processing the first input and the second input. The method also includes selecting a one-hand operation mode based on the first input and the second input. The method also includes selecting the first region of the touchscreen based on the one-hand operation mode and the display preference data. The method also includes removing the user interface from the second region of the touchscreen based on the one-hand operation mode and the display preference data. The method also includes receiving a user input within the first region of the touchscreen. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
[0008] Implementations may include one or more of the following features. The method may include displaying the user interface in the first region and the second region of the touchscreen upon receiving an indication of leveling change. The method may include shifting or scaling the user interface to the first region of the touchscreen. The method may include determining a control item location on the touchscreen. The method may include determining a distance between the control item location and a thumb. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
[0009] One general aspect includes a method for calibrating an input mode for operating a handheld device. The method also includes providing a touchscreen. The method also includes displaying a calibration interface on the touch screen. The method also includes creating a display preference profile based on a predetermined template. The method also includes receiving user input in a one-hand mode on the touchscreen during a first time interval. The method also includes receiving a first side sensor input from a first side sensor configured on a first side of the handheld device during the first time interval. The method also includes receiving a second side sensor input from a second side sensor configured on a second side of the handheld device during the first time interval. The method also includes updating the display preference profile using at least the first side sensor input and the second side sensor input. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
[0010] Implementations may include one or more of the following features. The method may include updating the display preference profile during a second time interval based on a third side sensor input received from the first side sensor. The method may include processing pressure readings of user fingers. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
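A compact Kotlin sketch of this calibration aspect is shown below. The profile fields, the template defaults, and the averaging update rule are assumptions made for illustration; the disclosure only requires that the profile be created from a predetermined template and then updated from the side-sensor inputs collected during a time interval.

```kotlin
// Hypothetical calibration flow; field names and the averaging update are illustrative only.
data class SideSample(val pressure: Float, val contactY: Float) // contactY: position along the side strip

data class DisplayPreferenceProfile(
    var preferredHand: String = "unknown",
    var gripPressure: Float = 0f,  // typical holding pressure observed during calibration
    var gripCenterY: Float = 0f    // typical vertical center of the grip on the side strip
)

fun createProfileFromTemplate(): DisplayPreferenceProfile = DisplayPreferenceProfile()

fun updateProfile(
    profile: DisplayPreferenceProfile,
    firstSide: List<SideSample>,   // samples collected during the first time interval
    secondSide: List<SideSample>
): DisplayPreferenceProfile {
    val dominant = if (firstSide.sumOf { it.pressure.toDouble() } >=
                       secondSide.sumOf { it.pressure.toDouble() }) firstSide else secondSide
    if (dominant.isNotEmpty()) {
        profile.preferredHand = if (dominant === firstSide) "first-side" else "second-side"
        profile.gripPressure = (dominant.sumOf { it.pressure.toDouble() } / dominant.size).toFloat()
        profile.gripCenterY = (dominant.sumOf { it.contactY.toDouble() } / dominant.size).toFloat()
    }
    return profile
}
```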
[0011] It is to be appreciated that embodiments of the present invention provide many advantages over conventional techniques. Among other things, the present invention provides configurations and methods for handheld devices that allow users to use handhold gestures to move a user interface displayed on the touchscreen to position certain control items of the user interface in more accessible locations (e.g., within reach while in a single-hand operating mode). Additionally, the present invention implements flexible approaches to configure and calibrate display preference profiles related to the operation of the user interface.
[0012] The present invention achieves these benefits and others in the context of known technology. However, a further understanding of the nature and advantages of the present invention may be realized by reference to the latter portions of the specification and attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] Figure 1 is a simplified diagram illustrating a handheld device configured with side sensors according to embodiments of the present invention.
[0014] Figure 2 is a simplified block diagram illustrating components of a handheld device according to embodiments of the present invention.
[0015] Figure 3 is a simplified flow diagram illustrating a method for operating a handheld device according to embodiments of the present invention.
[0016] Figure 4 is a simplified flow diagram illustrating a method for setting a one-hand operation mode for a handheld device according to embodiments of the present invention.
[0017] Figure 5 is a simplified diagram illustrating shifting the display region to the left side to make a control item reachable to a user according to embodiments of the present invention.
[0018] Figure 6 is a simplified diagram illustrating shifting the display region to the bottom to make a control item reachable to a user according to embodiments of the present invention.
[0019] Figure 7 is a simplified diagram illustrating shifting the display region to the bottom-left corner to make a control item reachable to a user according to embodiments of the present invention.
[0020] Figure 8 is a simplified diagram illustrating a mechanism for the user to activate one-hand operation mode using side sensors according to embodiments of the present invention.
[0021] Figure 9 is a simplified flow diagram illustrating a process for entering a one-hand operation mode according to embodiments of the present invention.
[0022] Figure 10 is a simplified flow diagram illustrating a process for exiting a one-hand operation mode according to embodiments of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[0023] The present invention is directed to methods and systems for operating handheld devices. In a specific embodiment, side sensors configured on the sides of a handheld device capture the operating characteristics of a user's hand. Upon detection of a one-hand operation mode using the side sensors, the display is changed to make one-hand operation easier. There are other embodiments as well.
[0024] As described above, operating a handheld device with a large touchscreen can be difficult. Particularly, when a user is operating a handheld device with one hand, the user’s thumb is often too short to conveniently reach the opposite side of the touchscreen. Thus, the present invention provides configurations and methods for handheld devices that allow users to use handhold gestures to move a user interface displayed on the touchscreen. By doing so, certain control items of the user interface (e.g., buttons, icons, text fields, interactive elements, etc.) can be positioned in more convenient locations.
[0025] The related methods and devices include using arrays of sensors configured within a handheld device to detect handhold gestures by the user and to identify one or more single-handhold operation modes. In these modes, the handhold gestures can be used to trigger user interface moving processes that facilitate thumb operations on the touchscreen, or to trigger the user interface to return to a default or previous display mode (e.g., full-screen mode).
[0026] Existing one-hand operation mechanisms, such as half-screen modes, zoom desktops, and distort desktops, are inadequate. Existing half-screen mode implementations (e.g., from Apple, Samsung, Huawei, etc.) only make use of the bottom half of the screen, with no option for left-right half-screen modes. Touchscreen interactions are also needed to enable and disable the half-screen mode, which has a fixed size and layout. Similarly, zoom desktop implementations also require touchscreen interactions to enable and disable their functions. And distort desktop implementations are vague and unclear about how mode changes are triggered.
[0027] It is to be appreciated that embodiments of the present invention provide improved methods and systems for operating handheld devices. By using arrays of sensors, embodiments of the present invention can determine one or more touch points of a user’s hand on the phone, as well as the relative pressure of each touch point. With these sensors configured within different portions of the device housing, the handheld device can determine various user gestures that can trigger different operating modes, such as a one-hand operation mode to provide more convenient access for a user’s thumb to a particular control item of a user interface. The handheld device can also include a level sensor array, which can be used to determine the leveling state of the phone and determine particular operation modes as well.
[0028] The following description is presented to enable one of ordinary skill in the art to make and use the invention and to incorporate it in the context of particular applications. Various modifications, as well as a variety of uses in different applications will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to a wide range of embodiments. Thus, the present invention is not intended to be limited to the embodiments presented, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
[0029] In the following detailed description, numerous specific details are set forth in order to provide a more thorough understanding of the present invention. However, it will be apparent to one skilled in the art that the present invention may be practiced without necessarily being limited to these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present invention.
[0030] The reader’s attention is directed to all papers and documents which are filed concurrently with this specification and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference. All the features disclosed in this specification, (including any accompanying claims, abstract, and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
[0031] Furthermore, any element in a claim that does not explicitly state "means for" performing a specified function, or "step for" performing a specific function, is not to be interpreted as a "means" or "step" clause as specified in 35 U.S.C. Section 112, Paragraph 6. In particular, the use of "step of" or "act of" in the Claims herein is not intended to invoke the provisions of 35 U.S.C. 112, Paragraph 6.
[0032] Please note, if used, the labels left, right, front, back, top, bottom, forward, reverse, clockwise and counter clockwise have been used for convenience purposes only and are not intended to imply any particular fixed direction. Instead, they are used to reflect relative locations and/or directions between various portions of an object.
[0033] Figure 1 is a simplified diagram illustrating a handheld device 100 configured with side sensors according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
[0034] As shown, device 100 can be configured within a housing 110 and can include a touchscreen 120, a first sensor 131, a second sensor 132, a storage 140, and a processor 150. The touchscreen 120 can be configured on a front side of the housing 110, while the first sensor 131 and the second sensor 132 can be configured on a first side and a second side of the housing, respectively. The storage 140 and the processor 150 can be configured within the housing 110 (e.g., in an integrated circuit device, or the like), as shown within the dotted-line cross-section 101. Further, the processor 150 can be coupled to the touchscreen 120, the sensors 131 and 132, and the storage 140 to communicate between these device elements.
[0035] In an example, the first sensor 131 can be configured to generate at least a first sensor reading, and the second sensor 132 can be configured to generate at least a second sensor reading. Each of the first and second sensors can include a pressure sensor, a capacitive sensor, or other touch sensor and combinations thereof. These sensors can be used to detect and process pressure readings of a user’s fingers. In other cases, the device 100 may include one or more additional sensors that generate one or more additional sensor readings. The storage 140 can be configured to store display preference data. The storage 140 can include various memory devices, such as random-access memory (RAM), flash memory, and the like.
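As one way such side sensors might report "pressure readings of a user's fingers," the sketch below assumes each sensor is a one-dimensional strip of pressure cells and recovers individual finger contacts as local peaks. The sensor format, the peak-detection rule, and the threshold value are assumptions for illustration, not a description of the actual hardware.

```kotlin
// Hypothetical: treat a side sensor as a strip of pressure cells and find per-finger contacts.
data class FingerContact(val cellIndex: Int, val pressure: Float)

fun detectFingerContacts(strip: FloatArray, threshold: Float = 0.2f): List<FingerContact> {
    val contacts = mutableListOf<FingerContact>()
    for (i in strip.indices) {
        val left = if (i > 0) strip[i - 1] else 0f
        val right = if (i < strip.lastIndex) strip[i + 1] else 0f
        // A cell counts as a finger contact when it exceeds the threshold and is not
        // smaller than either neighbor (a simple, non-strict local-maximum test).
        if (strip[i] >= threshold && strip[i] >= left && strip[i] >= right) {
            contacts.add(FingerContact(i, strip[i]))
        }
    }
    return contacts
}
```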
[0036] The processor 150 can be configured to process the display preference data, select a one-hand operation mode based on at least the first sensor reading and the second sensor reading, and move a user interface to one or more one-hand display regions. In other cases, the processor 150 may receive a plurality of sensor readings from a plurality of sensors configured in different portions of the housing 110, determine an operation mode based on the plurality of sensor readings, and move a user interface to one or more display regions from an initial display region or from a state in which the user interface was not previously displayed.
[0037] In a specific example, the processor 150 can be configured to select between a left-handhold mode and a right-handhold mode based on the first sensor reading and the second sensor reading. For example, a calibration application can prompt the user on the display of the touchscreen 120 to hold the device 100 with either the left hand or the right hand and to operate the phone with the thumb of the holding hand. In this case, the first sensor is configured on the left side of the housing 110, while the second sensor is configured on the right side of the housing 110. Further, the touchscreen 120 would include a first one-hand display region associated with the left-handhold mode, and a second one-hand display region associated with the right-handhold mode.
[0038] Also, the processor 150 can be configured to move the user interface to one or more display regions based on one or more inputs from one or more additional sensors, such as an accelerometer, a gyroscope, a magnetometer, a level sensor, or other sensors and combinations thereof. In other cases, the device 100 can include a plurality of sensors, which can be configured within various portions of the housing 110 (e.g., left, right, top, bottom, front-side and back-side). These sensors can all provide sensor readings that can be used to determine the desired operation mode and move the user interface to desired display regions of the touchscreen, among other processes.
[0039] In an example, the user interface can include one or more control items, such as buttons, icons, text fields, interactive elements, and the like. Depending on the operating mode, the user interface may be shifted, scaled, or otherwise transformed during the moving process to a particular display region. The location of the one or more control items can also be used as an input to determine how the user interface is transformed. For example, one or more of the sensors may be used to determine the distance from a user's thumb to a control item, which can be used as an input to determine the type of operation mode or the extent to which the user interface is transformed. Other handhold gestures, determined by one or more sensor inputs, can also be used to trigger different operating modes in which the user interface is transformed to provide the user with more convenient access to control items.
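The distance-driven behavior described in the preceding paragraph can be sketched as follows. The thumb-anchor estimate, the learned reach radius, and the proportional shift are assumptions of this sketch; the disclosure only states that the thumb-to-item distance can inform the extent of the transformation.

```kotlin
import kotlin.math.hypot

// Hypothetical: shift the UI by an amount proportional to how far the control item
// sits beyond the thumb's comfortable reach.
data class Point(val x: Float, val y: Float)

fun shiftOffsetFor(
    controlItem: Point,
    thumbAnchor: Point,   // e.g., estimated from the grip position reported by the side sensors
    reachRadiusPx: Float  // comfortable thumb reach, e.g., learned during calibration
): Point {
    val dx = controlItem.x - thumbAnchor.x
    val dy = controlItem.y - thumbAnchor.y
    val distance = hypot(dx, dy)
    if (distance <= reachRadiusPx) return Point(0f, 0f)  // already reachable: no shift needed
    val excess = (distance - reachRadiusPx) / distance   // fraction of the gap to close
    return Point(-dx * excess, -dy * excess)             // move the UI toward the thumb
}
```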
[0040] Figure 2 is a simplified block diagram illustrating components of a handheld device 200 according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
[0041] As shown, Figure 2 expands on the internal and external components that can be included in a handheld device according to the present invention, such as the device 100 of Figure 1. Configured to a housing 201, these components can include a central processing unit (CPU) 210 coupled to a graphic processing unit (GPU) 212 and a touchscreen including a display 220 and a screen touch sensor 222. Here, the CPU 210 is coupled to the screen touch sensor 222, an auxiliary hand sensor 230, and an accelerometer 232, all of which can provide sensor inputs to determine user gestures and device operating modes, as discussed previously. The GPU 212 can be coupled to the display 220 and a camera 250. Further, the GPU 212 can be configured to transmit various forms of the user interface to the display 220 depending on the operating mode.
[0042] As discussed previously, the CPU 210 can be further configured to update the display preference data in a storage (see Figure 1) based on user input received from the touchscreen. In a specific example, the CPU 210 is also coupled to the neural processing unit (NPU) 214 that is configured to update the display preference data based on user inputs, which can be received via the screen touch sensor 222, or the sensors 131/132 in Figure 1, or the like. In a specific example, the NPU 214 can be used to learn the user’s holding gestures via one or more of the sensors within the device 200. Such gesture data can also be stored within memory (e.g., storage 140 of Figure 1).
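The NPU-based learning is not specified in detail. As a stand-in, the sketch below keeps an exponentially weighted running estimate of the grip pressure observed on each side, which could be written back into the display preference data. This learning rule is an assumption for illustration and is not the disclosed method.

```kotlin
// Hypothetical stand-in for the NPU learning step: an exponential moving average of grip readings.
class GripLearner(private val alpha: Float = 0.05f) {
    var leftPressure = 0f
        private set
    var rightPressure = 0f
        private set
    private var initialized = false

    // Call whenever a new pair of side-sensor readings arrives while the device is held.
    fun observe(left: Float, right: Float) {
        if (!initialized) {
            leftPressure = left
            rightPressure = right
            initialized = true
        } else {
            leftPressure += alpha * (left - leftPressure)
            rightPressure += alpha * (right - rightPressure)
        }
    }
}
```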
[0043] The device 200 can further include a communication module 240 coupled to the CPU 210 and configured to transmit the display preference data to a server. This communication module 240 can be configured for mobile data, Wi-Fi, Bluetooth, etc. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium. Of course, there can be other variations, modifications, and alternatives to these device elements and their configurations.
[0044] Figure 3 is a simplified flow diagram illustrating a method 300 for operating a handheld device according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications. For example, one or more steps may be added, removed, repeated, rearranged, modified, replaced, and/or overlapped, and should not limit the scope of the claims.
[0045] As shown, the method 300 includes the step 302 of displaying a user interface on a first region and a second region of a touchscreen of the handheld device. As discussed previously, the touchscreen can be configured on a front side of a housing of the handheld device (e.g., cellphone, tablet, etc.). In a specific example, displaying the user interface in both the first and second regions can be performed upon receiving an indication of leveling change. Further, the user interface can include one or more control items (e.g., buttons, icons, text fields, interactive elements, etc.).
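The "indication of leveling change" is not defined precisely in this paragraph. The sketch below assumes it means the device's tilt angle, derived from two accelerometer samples, changing by more than a threshold; the angle formula and the 20-degree default are assumptions for illustration.

```kotlin
import kotlin.math.PI
import kotlin.math.abs
import kotlin.math.atan2
import kotlin.math.sqrt

// Hypothetical leveling-change check based on the tilt angle implied by gravity.
fun tiltDegrees(ax: Float, ay: Float, az: Float): Float {
    // Angle between the device's screen normal (z axis) and the gravity vector.
    val horizontal = sqrt(ax * ax + ay * ay)
    return atan2(horizontal, az) * (180f / PI.toFloat())
}

fun levelingChanged(previous: FloatArray, current: FloatArray, thresholdDeg: Float = 20f): Boolean {
    val before = tiltDegrees(previous[0], previous[1], previous[2])
    val after = tiltDegrees(current[0], current[1], current[2])
    return abs(after - before) >= thresholdDeg
}
```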
[0046] In steps 304 and 306, the method includes receiving a first input from a first sensor and receiving a second input from a second sensor, respectively. In an example, the first sensor can be a first side sensor configured on a first side of the housing, while the second sensor can be a second side sensor configured on a second side of the housing. As discussed previously, the handheld device can include a plurality of sensors configured within various portions of the housing. Thus, the method can also include receiving one or more inputs from such sensors.
[0047] In step 308, the method includes obtaining display preference data from a storage of the handheld system. As discussed previously, the storage can include various types of memory devices that are configured within the housing and coupled to a processor of the handheld device. The display preference data can include dimensions of the display regions, type of transformations of the user interface, detected gesture configurations, or the like.
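As an illustration of the kind of record implied by this paragraph, the data class below groups the three kinds of information it mentions: region dimensions, the type of user-interface transformation, and detected gesture configurations. The field names and types are assumptions of this sketch.

```kotlin
// Hypothetical layout for the display preference data described above.
data class RegionSpec(val x: Int, val y: Int, val width: Int, val height: Int)

enum class UiTransform { SHIFT, SCALE, SHIFT_AND_SCALE }

data class GestureConfig(val name: String, val minPressure: Float, val maxIntervalMs: Long)

data class DisplayPreferenceData(
    val leftHandRegion: RegionSpec,   // one-hand display region for left-hand operation
    val rightHandRegion: RegionSpec,  // one-hand display region for right-hand operation
    val transform: UiTransform,       // how the user interface is moved into the region
    val gestures: List<GestureConfig> // handhold gestures that trigger mode changes
)
```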
[0048] In steps 310 and 312, the method includes processing the first input and the second input, and selecting a one-hand operation mode based on the first and second inputs, respectively. As discussed previously, the processor can be configured to process the first and second inputs, as well as any additional inputs from additional sensors, to determine the one-hand operation mode. In a specific example, the method can also include processing inputs related to one or more control items, such as a control item location on the touchscreen, a distance between the control item location and a user's finger (e.g., thumb), etc.
[0049] In step 314, the method includes selecting the first region of the touchscreen based on the one-hand operation mode and the display preference data. And, in step 316, the method includes removing the user interface from the second region of the touchscreen based on the one-hand operation mode and the display preference data. As discussed previously, the processor can be configured to select certain display regions and transform the user interface based on the selected operation mode and the display preference data. The step of removing the user interface from the second region can include transforming (e.g., shifting, scaling, etc.) the user interface to the first region. In a specific example, the transformation of the user interface can be such that the portion of the user interface within the second region is instead accessible to the user in the first region.
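A minimal sketch of the shift-or-scale transformation in steps 314 and 316 might look like the following. The rectangle math simply maps the full user-interface bounds into the selected first region; it is only one plausible way to realize the transformation, and the bottom-edge pinning is an assumption.

```kotlin
// Hypothetical: map the full UI bounds into the selected one-hand region,
// either by shifting (same size) or by scaling uniformly to fit.
data class Bounds(val x: Float, val y: Float, val width: Float, val height: Float)

fun shiftInto(ui: Bounds, region: Bounds): Bounds =
    ui.copy(x = region.x, y = region.y + region.height - ui.height) // pin to the region's bottom edge

fun scaleInto(ui: Bounds, region: Bounds): Bounds {
    val s = minOf(region.width / ui.width, region.height / ui.height) // uniform scale that fits
    return Bounds(region.x, region.y, ui.width * s, ui.height * s)
}
```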
[0050] In step 318, the method includes receiving a user input within the first region of the touchscreen. This step can include any user interaction with the user interface to update the display preference data, execute an application, send a communication, etc. Further details are discussed with respect to the subsequent figures.
[0051] Figure 4 is a simplified flow diagram illustrating a method for setting a one-hand operation mode for a handheld device according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications. For example, one or more steps may be added, removed, repeated, rearranged, modified, replaced, and/or overlapped, and should not limit the scope of the claims.
[0052] As shown, the method 400 includes the steps 402 and 404 of providing a touchscreen and displaying a calibration interface on the touchscreen, respectively. The touchscreen can be configured on a front side of the handheld device, as described in previous examples. The calibration interface can be a user interface configured to receive user input to determine a display preference profile. In an example, the calibration interface can include one or more control items (e.g., buttons, software keyboard, sliders, etc.) configured to obtain inputs related to the display preference profile. These control items can be displayed at different positions on the touchscreen with prompts for the user to operate these control items.
[0053] In step 406, the method includes creating a display preference profile based on a predetermined template. In an example, the predetermined template can include one or more settings, configurations, or functions that are related to the display preference profile. The predetermined template can also include a plurality of predetermined display preference profiles. A user can provide inputs to one or more control items provided by the calibration interface to determine profile elements, select certain profiles, or the like.
[0054] In step 408, the method includes receiving user input in a one-hand mode on the touchscreen during a first time interval. Also, in steps 410 and 412, the method includes receiving a first side sensor input from a first side sensor configured on a first side of the handheld device during the first time interval, and receiving a second side sensor input from a second side sensor configured on a second side of the handheld device during the first time interval. As discussed previously, the method can also include receiving one or more sensor inputs from one or more sensors configured within various portions of the handheld device.
[0055] In step 414, the method includes updating the display preference profile using at least the first side sensor input and the second side sensor input. The update to the profile can include changes to one or more profile elements, a selection of another predetermined profile template, or the like. In a specific example, the method can include updating the display preference profile during a second time interval based on a third side sensor input received from the first side sensor. In other cases, additional sensor inputs from the previous sensors or additional sensors may be used in the previous time intervals or additional time intervals to perform interactions with the user interface, the display preference profile, or other related functions.
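The profile update of step 414 could, under the assumptions below, look like the following sketch. The per-sample fields and the simple blending rule are assumptions made for illustration; an actual profile may carry many more elements.

```kotlin
// Hedged sketch of updating a display preference profile from side-sensor samples
// collected during a calibration interval (steps 408-414). Names and rule are assumed.
data class SideSample(val pressure: Float, val positionAlongEdge: Float)

data class DisplayPreferenceProfile(
    var shiftOffsetPx: Float,  // how far the user interface shifts in one-hand mode
    var gripPressure: Float    // typical holding pressure reported by the side sensors
)

fun updateProfile(
    profile: DisplayPreferenceProfile,
    firstSideInput: List<SideSample>,
    secondSideInput: List<SideSample>
): DisplayPreferenceProfile {
    val samples = firstSideInput + secondSideInput
    if (samples.isEmpty()) return profile
    val meanPressure = samples.map { it.pressure }.average().toFloat()
    // Blend rather than overwrite, so one noisy interval does not erase what
    // earlier calibration intervals (or later ones, per step 414) have learned.
    profile.gripPressure = 0.7f * profile.gripPressure + 0.3f * meanPressure
    return profile
}
```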
[0056] In a specific example, the calibration application can prompt the user on the touchscreen to hold the device with one hand and to operate the phone with the thumb of the holding hand. The application can also prompt the user to draw the largest area on the touchscreen with the user’s thumb. The user may be prompted to perform other calibration gestures as well. The calibration application can read the data from the sensors during the user’s operation (i.e., user’s response to prompts) to obtain the position and pressure of each finger, distance between fingers, length and common maneuver area of the thumb, angle and speed of tilting the phone, etc.
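As one hedged example of processing the "largest area" prompt above, the reach of the thumb could be estimated from the sampled touch points; the anchor point and the distance heuristic are assumptions for illustration, and the actual calibration data described above is richer (per-finger pressure, tilt angle and speed, and so on).

```kotlin
import kotlin.math.hypot

// Assumed touch-point structure for the calibration samples.
data class TouchPoint(val x: Float, val y: Float)

// Estimate the thumb's reach as the farthest sampled touch point from an assumed
// anchor (for example, the lower corner of the screen on the holding side).
fun thumbReach(anchor: TouchPoint, samples: List<TouchPoint>): Float =
    samples.maxOfOrNull { hypot(it.x - anchor.x, it.y - anchor.y) } ?: 0f
```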
[0057] Using the sensor data, the application can build an AI model/profile of the user's single-hand hold gesture (e.g., instead of the template profile creation of step 406 or as part of the profile update of step 414). This data can be used to determine the shifting offset of the displayed user interface, the threshold for activating the shifting of the user interface while tilting the handheld device, etc. The application can also give the user the choice of enabling a continuous user behavior learning process in the background to continue updating the profile while the user is operating the device in daily activities. An NPU can be used with the calibration application to build the model/profile and for the continuous learning process as well.
[0058] A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions. Those of ordinary skill in the art will recognize other variations, modifications, and alternatives to the steps described previously.
[0059] Figure 5 is a simplified diagram illustrating shifting of the display region to the left side to make a control item reachable to a user according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
[0060] As shown, Figure 5 shows a handheld device 501 displaying a user interface in a first state and a handheld device 502 displaying a user interface (UI) in a second state following a user action trigger. Devices 501 and 502 have a touchscreen display 510 that displays a user interface area 520, which includes at least one control item 530 (e.g., button, icon, text field, interactive element, etc.). These devices can also include elements discussed previously, such as the sensors and processors described for Figures 1 and 2. And, the user action trigger can include one or more sensor inputs received from such sensors.
[0061] In certain situations, a user may have difficulty reaching a particular control item 530 while holding the device 501/502 in one hand. In this case, the control item 530 being in the upper-right corner of the touchscreen may be difficult to reach in a left-handed operation (i.e., using the left-hand thumb). In an example, the user action trigger results in moving the UI area 520 such that the control item 530 moves from the right-side (denoted as the higher side) to the left-side (denoted as the lower side). The movement of the UI area 520 leaves a portion of the touchscreen 510 with a blank screen area 522. An overflow UI area 524 is illustrated to show the portion of the UI area 520 that is not visible on the touchscreen 510 during the user action trigger. Being closer to the left side of the touchscreen 510, the control item 530 can be accessed more easily by a user in a left-handed operation mode.
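A minimal sketch of the leftward shift in Figure 5 follows, assuming a pixel coordinate system with the origin at the top-left of the touchscreen; the reach parameter and helper names are illustrative assumptions.

```kotlin
// Illustrative computation of the left shift that brings a right-side control item
// within thumb reach (Figure 5). Coordinates and names are assumptions.
data class UiRect(val left: Float, val top: Float, val width: Float, val height: Float)

// Horizontal offset (negative = shift left) so that a control item at controlItemX
// ends up within `reach` pixels of the left edge of the touchscreen.
fun leftShiftOffset(controlItemX: Float, reach: Float): Float =
    if (controlItemX <= reach) 0f else reach - controlItemX

fun shifted(ui: UiRect, offset: Float): UiRect = ui.copy(left = ui.left + offset)

// After the shift, the part of the UI rect with left < 0 corresponds to the overflow
// UI area 524, and the uncovered strip on the right of the screen to the blank area 522.
```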
[0062] Figure 6 is a simplified diagram illustrating shifting of the display region to the bottom to make a control item reachable to a user according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
[0063] As shown, Figure 6 shows a handheld device 601 displaying a user interface in a first state and a handheld device 602 displaying a user interface in a second state following a user action trigger. Similar to the devices of Figure 5, devices 601 and 602 have a touchscreen display 610 that displays a user interface area 620, which includes at least one control item 630 (e.g., button, text field, icon, interactive image, etc.). These devices can also include elements discussed previously, such as the sensors and processors described for Figures 1 and 2. And, the user action trigger can include one or more sensor inputs received from such sensors.
[0064] Here, the control item 630 being in the upper-right corner of the touchscreen may be difficult to reach in a right-handed operation (i.e., using the right-hand thumb). In an example, the user action trigger results in moving the UI area 620 such that the control item 630 moves from the top-side (denoted as the higher side) to the bottom-side (denoted as the lower side). Similar to the previous example, the movement of the UI area 620 leaves a portion of the touchscreen 610 with a blank screen area 622 and an overflow UI area 624, which is moved to the bottom-side in this case. Being closer to the bottom-side of the touchscreen 610, the control item 630 can be accessed more easily by a user in a right-handed operation mode.
[0065] Figure 7 is a simplified diagram illustrating shifting of the display region to the bottom-left corner to make a control item reachable to a user according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
[0066] As shown, Figure 7 shows a handheld device 701 displaying a user interface in a first state and a handheld device 702 displaying a user interface in a second state following a user action trigger. Similar to the devices of Figures 5 and 6, devices 701 and 702 have a touchscreen display 710 that displays a user interface area 720, which includes at least one control item 730 (e.g., button, text field, icon, interactive image, etc.). These devices can also include elements discussed previously, such as the sensors and processors described for Figures 1 and 2. And, the user action trigger can include one or more sensor inputs received from such sensors.

[0067] Here, the control item 730 being in the upper-right corner of the touchscreen may be difficult to reach in either a left-handed or right-handed operation. In an example, the user action trigger results in moving the UI area 720 such that the control item 730 moves from the top-right (denoted as the higher side) to the bottom-left (denoted as the lower side). Similar to the previous example, the movement of the UI area 720 leaves a portion of the touchscreen 710 with a blank screen area 722 and an overflow UI area 724, which is moved to the bottom-left in this case. Being closer to the center of the touchscreen 710, the control item 730 can be accessed more easily by a user in either a left-handed or right-handed operation mode.
[0068] Figure 8 is a simplified diagram illustrating a mechanism for the user to activate one-hand operation mode using side sensors according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
[0069] In an example, the handheld device 801 represents a single-finger squeeze mechanism to activate a one-hand operation mode in which the user interface area 820 is shifted to the right to bring the control item 830 closer to the user’s thumb. The single-finger squeeze can be detected by a sensor configured within the device 801 near the illustrated middle finger. In other examples, the device 801 can have a plurality of sensors configured to detect the single-finger squeeze. Using a calibration application, these sensors may detect different pressures, areas, finger positions, etc. Also, an NPU can be used to determine a user’s gesture profile based on how the user holds the device and performs gestures. For reference, the portion of the UI area extending past the dotted lines shows the UI overflow area that is not displayed on the touchscreen 810 when the single-finger squeeze mechanism is triggered.
[0070] In an example, the handheld device 802 represents a multi-finger squeeze mechanism to activate a one-hand operation mode in which the user interface area 820 is shifted to the upper-right to bring the control item 830 closer to the user's thumb. Here, the multi-finger squeeze can be detected by at least two sensors configured within the device 802 near the illustrated middle and ring fingers. Similar to the previous example, a plurality of sensors can also be configured within the device 802 to detect the multi-finger squeeze, using a calibration application and/or NPU learning process as well. Further, the portion of the UI area extending past the dotted lines shows the UI overflow area that is not displayed on the touchscreen 810 when the multi-finger squeeze mechanism is triggered.

[0071] These finger mechanisms, as well as others, can also be triggered by other fingers at other locations on the handheld device via one or more sensors configured within such locations. Also, the multi-finger mechanisms can include additional finger squeeze inputs (e.g., middle, ring, and pinky fingers). Other mechanisms such as shaking and tilting can be used along with such finger squeeze inputs as well. Such mechanisms can be set up in a calibration mode, selected by predetermined template settings, or another similar method of configuration. A sketch of such squeeze detection is given below.
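The squeeze triggers of Figure 8 could be detected along the lines of the following sketch; the pressure threshold, debounce window, and reading structure are assumptions introduced here, not the device's actual detection logic.

```kotlin
// Hedged sketch of single- and multi-finger squeeze detection from side sensors.
data class SensorReading(val sensorId: Int, val pressure: Float, val timestampMs: Long)

fun detectSqueeze(
    readings: List<SensorReading>,
    pressureThreshold: Float,
    requiredFingers: Int,   // 1 for a single-finger squeeze, 2 or more for multi-finger
    windowMs: Long = 150L   // assumed debounce window
): Boolean {
    if (readings.isEmpty()) return false
    val latest = readings.maxOf { it.timestampMs }
    // Count distinct sensors pressed hard enough within the recent window.
    val pressedSensors = readings
        .filter { latest - it.timestampMs <= windowMs && it.pressure >= pressureThreshold }
        .map { it.sensorId }
        .distinct()
    return pressedSensors.size >= requiredFingers
}
```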
[0072] As discussed previously, a calibration application and/or an NPU can be used to configure interpretation of user inputs and gestures (e.g., when the user's fingers do not align perfectly with the sensors or are not consistently in the same positions). For example, the calibration application can be used to set up predetermined gesture profiles, or the NPU can compare future user gestures to previously recorded gesture profiles to determine the user's intended gesture. Such gesture profiles or display preference profiles can be configured as AI models that are stored within the memory of the handheld device.
[0073] Figure 9 is a simplified flow diagram illustrating a process 900 for entering a one- hand operation mode according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications. For example, one or more steps may be added, removed, repeated, rearranged, modified, replaced, and/or overlapped, and should not limit the scope of the claims.
[0074] As shown, the process 900 can include various user action triggers, such as: a fast shake trigger 912, a side multi-tap trigger 914, and a side finger squeeze trigger 916. As discussed previously, these user action triggers can be detected by receiving one or more sensor inputs from sensors configured within various portions of the handheld device. If any of these triggers are detected, then the process 900 performs a sequence of check conditions to determine whether to enter a one-hand operation mode. The check conditions can include: a threshold leveling change in a time interval 922, a sensor detection of one-hand operation 924, a control item presence on user interface 926, and a threshold control item distance to user thumb 928. In an example, the order of the check conditions for the sensor detection of one-hand operation 924 and the control item presence on user interface 926 can be reversed. In an example, the threshold control item distance to user thumb 928 can range from the distance between the edge and the middle of the screen (e.g., about 3 cm) to the distance between opposite corners of the screen (e.g., about 17 cm). In some instances, smaller threshold distances may not be useful if the user thumb is able to reach the control item without adjusting the user interface. And, in some instances, greater threshold distances may result in a poor user experience due to the mobile device being too large for comfortable one-hand operation. If all conditions are satisfied, then the handheld device can be configured to enter the particular one-hand operation mode and to move the user interface towards the user's thumb depending on sensor inputs and a display preference profile 932. If any of the conditions are not satisfied, then the process 900 can be configured to ignore the user action triggers 934.
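Under the assumptions stated in the comments, the check sequence of process 900 could be summarized as the sketch below; only the 3 cm to 17 cm distance bounds come from the description above, and the remaining names and units are illustrative.

```kotlin
// Illustrative check chain for entering a one-hand operation mode (process 900).
data class EntryChecks(
    val levelingChangeDeg: Float,     // tilt change within the time interval (922)
    val oneHandDetected: Boolean,     // side sensors report a one-hand hold (924)
    val controlItemPresent: Boolean,  // a control item exists on the user interface (926)
    val controlItemDistanceCm: Float  // distance from the control item to the thumb (928)
)

fun shouldEnterOneHandMode(c: EntryChecks, levelingThresholdDeg: Float): Boolean =
    c.levelingChangeDeg >= levelingThresholdDeg &&
        c.oneHandDetected &&
        c.controlItemPresent &&
        // Below about 3 cm the item is already reachable; beyond about 17 cm
        // (opposite corners) one-hand use is impractical, so the trigger is ignored.
        c.controlItemDistanceCm in 3f..17f
```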
[0075] Figure 10 is a simplified flow diagram illustrating a process 1000 for exiting a one- hand operation mode according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications. For example, one or more steps may be added, removed, repeated, rearranged, modified, replaced, and/or overlapped, and should not limit the scope of the claims.
[0076] As shown, the process 1000 can include various user action triggers, such as: a fast shake trigger 1012, a side multi-tap trigger 1014, and a release of a side finger squeeze (i.e., fingers no longer squeezing sides of handheld device) trigger 1016. Similar to the previous example, these user action triggers can be detected by receiving one or more sensor inputs from sensors configured within various portions of the handheld device. If any of these triggers are detected, then the process 1000 performs a sequence of check conditions to determine whether to exit a one-hand operation mode. The check conditions can include: a threshold leveling change in a time interval 1022, a sensor detection of one-hand operation 1024, a prior user interface movement 1026, and a leveling change opposite to the user interface movement 1028. In an example, the order of the check conditions for the sensor detection of one-hand operation 1024 and the prior user interface movement 1026 can be reversed. If all conditions are satisfied, then the handheld device can be configured to exit the particular one-hand operation mode and to restore the user interface to a default position or a previous position 1032. If any of the conditions are not satisfied, then the process 1000 can be configured to ignore the user action triggers 1034.
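A companion sketch for process 1000 is given below, with the same caveat: the field names, units, and sign conventions are assumptions for illustration.

```kotlin
// Illustrative check chain for exiting a one-hand operation mode (process 1000).
data class ExitChecks(
    val levelingChangeDeg: Float,     // tilt change within the time interval (1022)
    val oneHandDetected: Boolean,     // side sensors still report a one-hand hold (1024)
    val uiWasMoved: Boolean,          // the UI was previously shifted (1026)
    val tiltOpposesMovement: Boolean  // leveling change is opposite to that shift (1028)
)

fun shouldExitOneHandMode(c: ExitChecks, levelingThresholdDeg: Float): Boolean =
    c.levelingChangeDeg >= levelingThresholdDeg &&
        c.oneHandDetected && c.uiWasMoved && c.tiltOpposesMovement
// When true, the UI is restored to its default or previous position (1032);
// otherwise the user action triggers are ignored (1034).
```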
[0077] While the above is a full description of the specific embodiments, various modifications, alternative constructions and equivalents may be used. Therefore, the above description and illustrations should not be taken as limiting the scope of the present invention which is defined by the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A handheld system comprising: a housing; a touchscreen configured on a front side of the housing, the touchscreen comprising a first one-hand display region; a first sensor configured on a left side of the housing, the first sensor being configured to generate a first sensor reading; a second sensor configured on a right side of the housing, the second sensor being configured to generate a second sensor reading; a storage configured to store display preference data; and a processor coupled to the first sensor and the second sensor, the processor being configured to: process the display preference data; select a one-hand operation mode based on at least the first sensor reading and the second sensor reading; and move a user interface to the first one-hand display region.
2. The handheld system of claim 1 wherein the processor comprises a central processing unit and a graphic processing unit, the graphic processing unit being coupled to the touchscreen.
3. The handheld system of claim 1 wherein the first sensor comprises a pressure sensor.
4. The handheld system of claim 1 wherein the first sensor comprises a pressure sensor.
5. The handheld system of claim 1 wherein the first sensor comprises a capacitive sensor.
6. The handheld system of claim 1 further comprising an accelerometer, the processor being configured to move the user interface to a second one-hand display region based on an input from the accelerometer.
7. The handheld system of claim 1 wherein the processor is further configured to update the display preference data based on user input received from the touchscreen.
8. The handheld system of claim 1 wherein the processor is further configured to select between a left-hand mode and a right-hand mode based on the first sensor reading and the second sensor reading.
9. The handheld system of claim 8 wherein the touchscreen further comprises a second one-hand display region, the first one-hand display region being associated with the right-hand mode, and the second one-hand display region being associated with the left-hand mode.
10. The handheld system of claim 1 wherein the user interface comprises one or more control items.
11. The handheld system of claim 1 further comprising a communication module configured to transmit the display preference data to a server.
12. The handheld system of claim 1 wherein the processor comprises a neural processing unit configured to update the display preference data based on user inputs.
13. A method for operating a handheld system, the method comprising: displaying a user interface on a first region and a second region of a touchscreen; receiving a first input from a first side sensor, the first side sensor being configured on a first side of a housing; receiving a second input from a second side sensor, the second side sensor being configured on a second side of a housing; obtaining display preference data from a storage; processing the first input and the second input; selecting a one-hand operation mode based on the first input and the second input; selecting the first region of the touchscreen based on the one-hand operation mode and the display preference data; removing the user interface from the second region of the touchscreen based on the one-hand operation mode and the display preference data; and receiving a user input within the first region of the touchscreen.
14. The method of claim 13 further comprising displaying the user interface in the first region and the second region of the touchscreen upon receiving an indication of leveling change.
15. The method of claim 13 further comprising shifting or scaling the user interface to the first region of the touchscreen.
16. The method of claim 13 further comprising determining a control item location on the touchscreen.
17. The method of claim 16 further comprising determining a distance between the control item location and a thumb.
18. A method for calibrating an input mode for operating a handheld device, the method comprising: providing a touchscreen; displaying a calibration interface on the touch screen; creating a display preference profile based on a predetermined template; receiving user input in a one-hand mode on the touchscreen during a first time interval; receiving a first side sensor input from a first side sensor configured on a first side of the handheld device during the first time interval; receiving a second side sensor input from a second side sensor configured on a second side of the handheld device during the first time interval; and updating the display preference profile using at least the first side sensor input and the second side sensor input.
19. The method of claim 18 further comprising updating the display preference profile during a second time interval based on a third side sensor input received from the first side sensor.
20. The method of claim 18 further comprising processing pressure readings of user fingers.