CN114518817A - Display method, electronic equipment and storage medium - Google Patents

Display method, electronic equipment and storage medium

Info

Publication number
CN114518817A
CN114518817A (application CN202210023829.8A)
Authority
CN
China
Prior art keywords
dynamic effect
interface
electronic device
playing
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210023829.8A
Other languages
Chinese (zh)
Other versions
CN114518817B (en)
Inventor
金东洙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210023829.8A priority Critical patent/CN114518817B/en
Priority to CN202310382648.9A priority patent/CN116501210B/en
Publication of CN114518817A publication Critical patent/CN114518817A/en
Application granted granted Critical
Publication of CN114518817B publication Critical patent/CN114518817B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/36: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 5/363: Graphics controllers
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/265: Mixing
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00: Reducing energy consumption in communication networks
    • Y02D 30/70: Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a display method, an electronic device, and a storage medium, relating to the field of terminal technologies and capable of solving the problem of stutter that occurs when an electronic device switches its screen refresh rate. The method is applied to an electronic device that supports a first refresh rate and a second refresh rate, and includes: the electronic device displays a first interface on a display screen at the first refresh rate; the electronic device receives a first operation from a user; in response to the first operation, the electronic device starts playing a dynamic effect (transition animation); when the electronic device detects that playback of the dynamic effect has ended, it switches to displaying a second interface on the display screen at the second refresh rate. The dynamic effect is the sequence of frames displayed while the electronic device switches from the first interface to the second interface.

Description

Display method, electronic equipment and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a display method, an electronic device, and a storage medium.
Background
At present, the display screen of an electronic device is usually a low-temperature polycrystalline oxide (LTPO) display screen. An LTPO display screen adds an extra oxide layer to the substrate of an organic light-emitting diode (OLED) display screen, which reduces the energy required to drive the pixels and therefore reduces the power consumption of the electronic device during display.
An LTPO display screen can support a variety of screen refresh rates, for example from 1 hertz (Hz) up to 144 Hz. An electronic device with an LTPO display screen may use different screen refresh rates when running different applications. For example, when the electronic device runs a video application, the screen refresh rate is 60 Hz, while when the electronic device displays the desktop, the screen refresh rate is 120 Hz. Switching the screen refresh rate may therefore be required when the electronic device launches an application. For example, when the electronic device starts a video application from the desktop, the screen refresh rate may be switched from 120 Hz to 60 Hz. However, when the electronic device switches the screen refresh rate, stutter may occur.
Disclosure of Invention
The embodiments of the present application provide a display method, an electronic device, and a storage medium, which can solve the problem of stutter that occurs when the electronic device switches the screen refresh rate.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
in a first aspect, a display method is provided, where the method is applied to an electronic device, and the electronic device supports a first refresh rate and a second refresh rate, and the method includes: the electronic equipment displays a first interface on a display screen at a first refresh rate; the electronic equipment receives a first operation of a user; the electronic equipment responds to the first operation and starts playing the dynamic effect; the electronic equipment detects that the dynamic effect playing is finished, and switches to display a second interface on the display screen at a second refresh rate; the dynamic effect indicates the frames displayed in the process that the electronic equipment is switched from the first interface to the second interface.
Based on the first aspect, the electronic device displays the first interface on the display screen at the first refresh rate; when the electronic device receives the first operation of the user, it responds to the first operation and starts playing the dynamic effect; when the electronic device detects that the dynamic effect playing is finished, it switches to displaying the second interface on the display screen at the second refresh rate. Because the dynamic effect is the sequence of frames displayed in the process that the electronic device switches from the first interface to the second interface, the electronic device switches from the first refresh rate to the second refresh rate and displays the second interface only after it detects that the dynamic effect playing is finished, thereby avoiding the stutter caused by switching the refresh rate while the dynamic effect is playing.
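This sequencing can be illustrated with a minimal Kotlin sketch; all class and method names below are hypothetical, and the sketch only models the ordering of events (play the dynamic effect first, switch the refresh rate afterwards), not an actual display driver.

```kotlin
data class RefreshRate(val hz: Int)

class DisplayController(private var currentRate: RefreshRate) {
    fun showInterface(name: String, rate: RefreshRate) {
        currentRate = rate
        println("Displaying $name at ${rate.hz} Hz")
    }
}

class TransitionAnimation(private val onEnd: () -> Unit) {
    fun play() {
        // ... the N frames of the dynamic effect are rendered here at the
        // refresh rate that was already in use for the first interface ...
        onEnd() // invoked only after the last frame has been shown
    }
}

fun main() {
    val display = DisplayController(RefreshRate(120))
    display.showInterface("first interface", RefreshRate(120))

    // First operation received (e.g. the user taps an application icon):
    val animation = TransitionAnimation(onEnd = {
        // The refresh rate switch is deferred until playback has ended,
        // so no frames are dropped in the middle of the dynamic effect.
        display.showInterface("second interface", RefreshRate(60))
    })
    animation.play()
}
```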
In a possible design manner of the first aspect, the dynamic effect includes consecutive N frames of pictures, where N is greater than or equal to 1; the electronic equipment starts playing the dynamic effect, comprising: the electronic equipment starts playing the dynamic effect according to the dynamic effect attribute; the dynamic effect attribute comprises at least one of dynamic effect content, dynamic effect size, dynamic effect duration or dynamic effect starting position and dynamic effect ending position; the dynamic effect starting position is used for indicating the position of the first frame picture in the N frame pictures on the display screen, and the dynamic effect ending position is used for indicating the position of the last frame picture in the N frame pictures on the display screen.
In the design mode, the electronic device starts playing the dynamic effect according to the dynamic effect attribute; the dynamic effect attribute comprises at least one of the dynamic effect content, the dynamic effect size, the dynamic effect duration, or the dynamic effect starting position and the dynamic effect ending position; the dynamic effect starting position indicates the position of the first of the N frames on the display screen, and the dynamic effect ending position indicates the position of the last of the N frames on the display screen, so that the playback quality of the dynamic effect on the electronic device can be improved.
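A minimal sketch of how these dynamic effect attributes might be grouped; the field names and example values are invented for illustration, as the patent does not prescribe a data structure.

```kotlin
data class Position(val x: Int, val y: Int)

data class DynamicEffectAttributes(
    val content: String,            // what is drawn, e.g. a snapshot of the target application
    val sizePx: Pair<Int, Int>,     // width and height of the animated surface
    val durationMs: Long,           // total playback duration
    val startPosition: Position,    // where the first of the N frames is drawn on the display
    val endPosition: Position       // where the last of the N frames is drawn on the display
)

fun main() {
    val attrs = DynamicEffectAttributes(
        content = "app snapshot",
        sizePx = 1080 to 2400,
        durationMs = 400,
        startPosition = Position(90, 180),
        endPosition = Position(0, 0)
    )
    println(attrs)
}
```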
In a possible design manner of the first aspect, the dynamic effect includes consecutive N frames of pictures, where N is greater than or equal to 1; the electronic equipment detects that the dynamic effect playing is finished, and the method comprises the following steps: when the electronic equipment plays the Mth frame of picture, if the distance between the position of the Mth frame of picture on the display screen and the target position is smaller than a preset value, the electronic equipment detects that the dynamic effect playing is finished; m is more than or equal to 1 and less than or equal to N; the target position is used for indicating the position of the last frame of the N frames of pictures on the display screen.
In the design mode, when the electronic equipment plays the Mth frame of picture, if the distance between the position of the Mth frame of picture on the display screen and the target position is smaller than a preset value, the electronic equipment detects that the dynamic effect playing is finished; the target position is used for indicating the position of the last frame of the N frames of pictures on the display screen, namely when the electronic equipment plays the last frame of picture, the electronic equipment detects that the dynamic effect playing is finished, and the power consumption of the equipment is reduced.
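This position-based end-of-playback check can be sketched as follows; the coordinates and the preset threshold are illustrative values only.

```kotlin
import kotlin.math.hypot

data class FramePosition(val x: Double, val y: Double)

// The M-th frame is treated as the end of playback once its distance to the
// target position (the position of the last frame) drops below a preset value.
fun isPlaybackFinished(current: FramePosition, target: FramePosition, presetPx: Double): Boolean =
    hypot(current.x - target.x, current.y - target.y) < presetPx

fun main() {
    val target = FramePosition(540.0, 1170.0)        // assumed position of the final frame
    val currentFrame = FramePosition(538.0, 1168.0)  // position of the M-th frame
    println(isPlaybackFinished(currentFrame, target, presetPx = 5.0)) // true
}
```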
In a possible design manner of the first aspect, the electronic device detecting that playback of the dynamic effect has ended includes: when the duration for which the electronic device has been playing the dynamic effect reaches a preset duration, the electronic device detects that playback of the dynamic effect has ended.
In the design mode, when the time length of playing the dynamic effect by the electronic equipment meets the preset time length, the electronic equipment detects that the dynamic effect playing is finished, and the power consumption of the equipment is favorably reduced.
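The duration-based variant reduces to a simple comparison; the 400 ms figure below is an assumed preset duration, not a value fixed by the patent.

```kotlin
// Playback is treated as finished once the elapsed playing time reaches the
// preset duration.
fun isPlaybackFinished(elapsedMs: Long, presetDurationMs: Long): Boolean =
    elapsedMs >= presetDurationMs

fun main() {
    println(isPlaybackFinished(elapsedMs = 250, presetDurationMs = 400)) // false, keep playing
    println(isPlaybackFinished(elapsedMs = 400, presetDurationMs = 400)) // true, safe to switch the refresh rate
}
```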
In a possible design manner of the first aspect, the electronic device includes a target application, and the second interface is an interface of the target application; after the electronic device starts playing the animation, the method further comprises: the electronic equipment acquires first information; the first information comprises an application package name of the target application; the electronic equipment determines a second refresh rate according to the application package name of the target application and a preset refresh rate switching rule; the preset refresh rate switching rule is used for indicating the mapping relation between the application package name and the refresh rate of the display screen.
In the design mode, after the electronic equipment starts playing the dynamic effect, the electronic equipment can determine a second refresh rate according to the application package name of the target application and a preset refresh rate switching rule; the preset refresh rate switching rule is used for indicating the mapping relation between the application package name and the refresh rate of the display screen, so that the power consumption of the equipment is reduced.
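A sketch of the preset refresh rate switching rule as a package-name-to-refresh-rate mapping; the package names, rates, and fallback value are examples, not values from the patent.

```kotlin
// Hypothetical mapping from application package name to target refresh rate (Hz).
val refreshRateRule: Map<String, Int> = mapOf(
    "com.example.video" to 60,      // video playback: lower rate to save power
    "com.example.launcher" to 120   // desktop/launcher: higher rate for smooth scrolling
)

// The second refresh rate is resolved from the target application's package name.
fun resolveSecondRefreshRate(packageName: String, defaultHz: Int = 60): Int =
    refreshRateRule[packageName] ?: defaultHz

fun main() {
    println(resolveSecondRefreshRate("com.example.video"))   // 60
    println(resolveSecondRefreshRate("com.example.unknown")) // 60 (fallback)
}
```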
In one possible design of the first aspect, the electronic device includes a target application; the first interface is a desktop of the electronic equipment, and the second interface is an interface after the target application is started; the animation includes a picture displayed during the process of starting the target application by the electronic equipment.
In a possible design manner of the first aspect, when the target application is not running in the background of the electronic device, the second interface is a main interface of the target application; or when the target application runs in the background of the electronic device, the second interface is an interface of the target application when the target application runs in the background.
In a possible design manner of the first aspect, the dynamic effect includes N consecutive frames, where N is greater than or equal to 1, and the N frames differ in size; during startup of the target application, the electronic device displays the N frames in sequence, and the frame size increases monotonically from the first frame to the Nth frame.
In the design mode, during startup of the target application the electronic device displays the N frames in sequence, with the frame size increasing monotonically from the first frame to the Nth frame, which improves the visual appeal of the dynamic effect played while the target application is starting.
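The monotonically increasing frame sizes can be sketched as a simple linear interpolation (the exit dynamic effect in the later design is the same sequence reversed); the pixel values are illustrative assumptions.

```kotlin
// Sizes (in pixels) of frames 1..n, growing linearly from startPx to endPx.
// Reversing the list gives the shrinking sequence used by an exit dynamic effect.
fun frameSizes(startPx: Int, endPx: Int, n: Int): List<Int> {
    require(n >= 2) { "need at least a first and a last frame" }
    return (1..n).map { i -> startPx + (endPx - startPx) * (i - 1) / (n - 1) }
}

fun main() {
    // e.g. an icon-sized 96 px frame growing to a 1080 px full-width frame over 6 frames
    println(frameSizes(96, 1080, 6)) // [96, 292, 489, 686, 883, 1080]
}
```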
In one possible design of the first aspect, the electronic device includes a target application; the first interface is an interface of a target application, and the second interface is a desktop of the electronic equipment; the dynamic effect comprises a picture displayed in the process that the electronic equipment exits the target application.
In a possible design manner of the first aspect, the dynamic effect includes N consecutive frames, where N is greater than or equal to 1, and the N frames differ in size; while the target application is exiting, the electronic device displays the N frames in sequence, and the frame size decreases monotonically from the first frame to the Nth frame.
In the design mode, while the target application is exiting, the electronic device displays the N frames in sequence, with the frame size decreasing monotonically from the first frame to the Nth frame, which improves the visual appeal of the dynamic effect played while the target application is exiting.
In one possible design of the first aspect, the electronic device includes a source application and a target application; the first interface and the second interface are multi-task interfaces of the electronic device; the first interface comprises the interface of the source application in the recent tasks, and the second interface comprises the interface of the target application in the recent tasks.
In a possible design manner of the first aspect, the electronic device includes a desktop starter, a dynamic effect identification module, and a dynamic effect playing component; the electronic equipment responds to a first operation and starts playing the dynamic effect, and the method comprises the following steps: the desktop starter responds to the first operation and sends a first dynamic effect notification message to the dynamic effect identification module; the first dynamic effect notification message is used for notifying the dynamic effect identification module that the dynamic effect starts; the dynamic effect identification module sends a first target message to the dynamic effect playing component according to the first dynamic effect notification message; the first target message is used for indicating the dynamic effect attribute; the dynamic effect playing component starts playing the dynamic effect according to the first target message.
In the design mode, the desktop starter responds to a first operation and sends a first dynamic effect notification message to the dynamic effect identification module; the first dynamic effect notification message is used for notifying the dynamic effect identification module that the dynamic effect starts; the dynamic effect identification module sends a first target message to the dynamic effect playing component according to the first dynamic effect notification message; the first target message is used for indicating the dynamic effect attribute; and the dynamic effect component starts playing the dynamic effect according to the first target message, so that the power consumption of the equipment is reduced.
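A sketch of this start-of-dynamic-effect message flow, with hypothetical class and method names standing in for the desktop starter (launcher), the dynamic effect identification module, and the dynamic effect playing component; the attribute values are assumptions.

```kotlin
data class DynamicEffectAttributes(val durationMs: Long, val rateDuringPlaybackHz: Int)

class DynamicEffectPlayer {
    // Receives the first target message, which carries the dynamic effect attributes.
    fun onFirstTargetMessage(attrs: DynamicEffectAttributes) {
        println("Start playback: ${attrs.durationMs} ms at ${attrs.rateDuringPlaybackHz} Hz")
    }
}

class DynamicEffectRecognizer(private val player: DynamicEffectPlayer) {
    // Receives the first dynamic effect notification and forwards the attributes.
    fun onDynamicEffectStartNotification() {
        player.onFirstTargetMessage(DynamicEffectAttributes(durationMs = 400, rateDuringPlaybackHz = 120))
    }
}

class DesktopLauncher(private val recognizer: DynamicEffectRecognizer) {
    // The first operation (e.g. tapping an icon) triggers the notification chain.
    fun onFirstOperation() = recognizer.onDynamicEffectStartNotification()
}

fun main() {
    DesktopLauncher(DynamicEffectRecognizer(DynamicEffectPlayer())).onFirstOperation()
}
```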
In a possible design manner of the first aspect, the electronic device includes a desktop starter, a dynamic effect identification module, and a dynamic effect playing component; the electronic equipment detects that the dynamic effect playing is finished, and the method comprises the following steps: when the dynamic effect playing component plays the Mth frame of picture, if the distance between the position of the Mth frame of picture on the display screen and the target position is smaller than a preset value, the dynamic effect playing component informs the desktop manager that the dynamic effect playing is finished; the desktop manager sends a second dynamic effect notification message to the dynamic effect identification module; and the dynamic effect identification module detects that the dynamic effect playing is finished according to the second dynamic effect notification message.
In the design mode, when the dynamic effect playing component plays the Mth frame of picture, if the distance between the position of the Mth frame of picture on the display screen and the target position is smaller than the preset value, the dynamic effect playing component informs the desktop manager that the dynamic effect playing is finished; the desktop manager sends a second dynamic effect notification message to the dynamic effect identification module; the dynamic effect identification module detects that the dynamic effect playing is finished according to the second dynamic effect notification message, and the power consumption of the equipment is favorably reduced.
In a possible design manner of the first aspect, the electronic device includes a desktop starter, a dynamic effect identification module, and a dynamic effect playing component; the electronic equipment detects that the dynamic effect playing is finished, and the method comprises the following steps: when the time length of playing the dynamic effect by the dynamic effect playing component meets the preset time length, the dynamic effect playing component informs the desktop manager that the dynamic effect playing is finished; the desktop manager sends a second dynamic effect notification message to the dynamic effect identification module; and the dynamic effect identification module detects that the dynamic effect playing is finished according to the second dynamic effect notification message.
In the design mode, when the time length of playing the dynamic effect by the dynamic effect playing component meets the preset time length, the dynamic effect playing component informs the desktop manager that the dynamic effect playing is finished; the desktop manager sends a second dynamic effect notification message to the dynamic effect identification module; the dynamic effect identification module detects that the dynamic effect playing is finished according to the second dynamic effect notification message, and the power consumption of the equipment is favorably reduced.
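The end-of-playback notification chain can be sketched in the same style; again all names are hypothetical, and the duration check stands in for either end-detection criterion.

```kotlin
class DynamicEffectRecognizer {
    // On the second dynamic effect notification, the recognizer treats playback as ended,
    // which is the point at which the refresh rate may be switched.
    fun onDynamicEffectEndNotification() = println("Playback ended: switch to the second refresh rate")
}

class DesktopManager(private val recognizer: DynamicEffectRecognizer) {
    fun onPlaybackFinished() = recognizer.onDynamicEffectEndNotification()
}

class DynamicEffectPlayer(private val manager: DesktopManager) {
    // Either end criterion (frame close enough to the target position, or preset
    // duration reached) triggers the notification; the duration check is used here.
    fun onFrame(elapsedMs: Long, presetMs: Long) {
        if (elapsedMs >= presetMs) manager.onPlaybackFinished()
    }
}

fun main() {
    DynamicEffectPlayer(DesktopManager(DynamicEffectRecognizer())).onFrame(elapsedMs = 400, presetMs = 400)
}
```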
In a second aspect, an electronic device is provided, which has the function of implementing the first aspect. The function can be realized by hardware, and can also be realized by executing corresponding software by hardware. The hardware or software includes one or more modules corresponding to the functions described above.
In a third aspect, an electronic device is provided that includes a display screen, a memory, and one or more processors; the display screen, the memory and the processor are coupled; the memory is for storing computer program code, the computer program code comprising computer instructions; the computer instructions, when executed by the processor, cause the electronic device to perform the steps of: the electronic equipment displays a first interface on a display screen at a first refresh rate; the electronic equipment receives a first operation of a user; the electronic equipment responds to the first operation and starts playing the dynamic effect; the electronic equipment detects that the dynamic effect playing is finished, and switches to display a second interface on the display screen at a second refresh rate; the dynamic effect indicates the frames displayed in the process that the electronic equipment is switched from the first interface to the second interface.
In one possible design of the third aspect, the dynamic effect includes N consecutive frames of pictures, where N is greater than or equal to 1; when the computer instructions are executed by the processor, the electronic device is enabled to specifically execute the following steps: the electronic equipment starts playing the dynamic effect according to the dynamic effect attribute; the dynamic effect attribute comprises at least one of dynamic effect content, dynamic effect size, dynamic effect duration or dynamic effect starting position and dynamic effect ending position; the dynamic effect starting position is used for indicating the position of the first frame picture in the N frame pictures on the display screen, and the dynamic effect ending position is used for indicating the position of the last frame picture in the N frame pictures on the display screen.
In one possible design of the third aspect, the dynamic effect includes N consecutive frames of pictures, where N is greater than or equal to 1; when the computer instructions are executed by the processor, the electronic device is enabled to specifically execute the following steps: when the electronic equipment plays the Mth frame of picture, if the distance between the position of the Mth frame of picture on the display screen and the target position is smaller than a preset value, the electronic equipment detects that the dynamic effect playing is finished; m is more than or equal to 1 and less than or equal to N; the target position is used for indicating the position of the last frame of the N frames of pictures on the display screen.
In one possible design of the third aspect, the computer instructions, when executed by the processor, cause the electronic device to perform the following steps: when the time length of playing the dynamic effect by the electronic equipment meets the preset time length, the electronic equipment detects that the playing of the dynamic effect is finished.
In one possible design of the third aspect, the electronic device includes a target application, and the second interface is an interface of the target application; the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: the electronic equipment acquires first information; the first information comprises an application package name of the target application; the electronic equipment determines a second refresh rate according to the application package name of the target application and a preset refresh rate switching rule; the preset refresh rate switching rule is used for indicating the mapping relation between the application package name and the refresh rate of the display screen.
In one possible design of the third aspect, the electronic device includes a target application; the first interface is a desktop of the electronic equipment, and the second interface is an interface after the target application is started; the animation includes a picture displayed during the process of starting the target application by the electronic equipment.
In a possible design of the third aspect, when the target application is not running in the background of the electronic device, the second interface is a main interface of the target application; or when the target application runs in the background of the electronic device, the second interface is an interface of the target application when the target application runs in the background.
In one possible design of the third aspect, the dynamic effect includes N consecutive frames, where N is greater than or equal to 1; in N frames of pictures, the size of each frame of picture is different; in the process of starting the target application, the electronic equipment sequentially displays N frames of pictures, and in the N frames of pictures, the size of a first frame of picture to the size of an Nth frame of picture are sequentially increased.
In one possible design of the third aspect, the electronic device includes a target application; the first interface is an interface of a target application, and the second interface is a desktop of the electronic equipment; the dynamic effect comprises a picture displayed in the process that the electronic equipment exits the target application.
In a possible design manner of the third aspect, the dynamic effect includes N consecutive frames of pictures, where N is greater than or equal to 1; in the N frames, the size of each frame is different; in the process that the target application exits, the electronic equipment sequentially displays N frames of pictures, and in the N frames of pictures, the size of a first frame of picture to the size of an Nth frame of picture are sequentially reduced.
In one possible design of the third aspect, the electronic device includes a source application and a target application; the first interface and the second interface are multi-task interfaces of the electronic device; the first interface comprises the interface of the source application in the recent tasks, and the second interface comprises the interface of the target application in the recent tasks.
In a possible design manner of the third aspect, the electronic device includes a desktop starter, a dynamic effect identification module, and a dynamic effect playing component; the desktop starter responds to the first operation and sends a first dynamic effect notification message to the dynamic effect identification module; the first dynamic effect notification message is used for notifying the dynamic effect identification module that the dynamic effect starts; the dynamic effect identification module sends a first target message to the dynamic effect playing component according to the first dynamic effect notification message; the first target message is used for indicating the dynamic effect attribute; the dynamic effect playing component starts playing the dynamic effect according to the first target message.
In a possible design manner of the third aspect, the electronic device includes a desktop starter, a dynamic effect identification module, and a dynamic effect playing component; when the dynamic effect playing component plays the Mth frame of picture, if the distance between the position of the Mth frame of picture on the display screen and the target position is smaller than the preset value, the dynamic effect playing component informs the desktop manager that the dynamic effect playing is finished; the desktop manager sends a second dynamic effect notification message to the dynamic effect identification module; and the dynamic effect identification module detects that the dynamic effect playing is finished according to the second dynamic effect notification message.
In a possible design manner of the third aspect, the electronic device includes a desktop starter, a dynamic effect identification module, and a dynamic effect playing component; the electronic equipment detects that the dynamic effect playing is finished, and the method comprises the following steps: when the time length of playing the dynamic effect by the dynamic effect playing component meets the preset time length, the dynamic effect playing component informs the desktop manager that the dynamic effect playing is finished; the desktop manager sends a second dynamic effect notification message to the dynamic effect identification module; and the dynamic effect identification module detects that the dynamic effect playing is finished according to the second dynamic effect notification message.
In a fourth aspect, a computer-readable storage medium is provided, in which computer instructions are stored, and when the computer instructions are executed on a computer, the computer is enabled to execute the display method according to any one of the first aspect.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the display method of any of the first aspects above.
For technical effects brought by any one of the design manners in the second aspect to the fourth aspect, reference may be made to technical effects brought by different design manners in the first aspect, and details are not described herein.
Drawings
Fig. 1 is a first schematic diagram illustrating a screen refresh rate switching according to an embodiment of the present disclosure;
fig. 2 is a second schematic diagram illustrating a screen refresh rate switching according to an embodiment of the present disclosure;
fig. 3 is a third schematic diagram illustrating a screen refresh rate switching according to an embodiment of the present application;
fig. 4a is a first schematic diagram of interface switching according to an embodiment of the present disclosure;
fig. 4b is a schematic diagram of a dynamic effect playing provided in the embodiment of the present application;
fig. 5 is a second schematic diagram of interface switching according to an embodiment of the present disclosure;
fig. 6 is a third schematic diagram of interface switching according to an embodiment of the present disclosure;
fig. 7 is a fourth schematic view of interface switching provided in the embodiment of the present application;
fig. 8 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application;
fig. 9 is a software framework diagram of an electronic device according to an embodiment of the present application;
fig. 10 is a schematic view illustrating a first processing flow of interface display of an electronic device according to an embodiment of the present application;
fig. 11 is a schematic view illustrating a second processing flow of interface display of an electronic device according to an embodiment of the present application;
fig. 12 is a schematic view illustrating a third processing flow of interface display of an electronic device according to an embodiment of the present application;
fig. 13 is a first flowchart illustrating a display method according to an embodiment of the present disclosure;
fig. 14 is a second flowchart illustrating a display method according to an embodiment of the present application;
fig. 15 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. Where in the description of the present application, "/" indicates a relationship where the objects associated before and after are an "or", unless otherwise stated, for example, a/B may indicate a or B; in the present application, "and/or" is only an association relationship describing an associated object, and means that there may be three relationships, for example, a and/or B, and may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. Also, in the description of the present application, "a plurality" means two or more than two unless otherwise specified. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple. In addition, in order to facilitate clear description of technical solutions of the embodiments of the present application, in the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same items or similar items having substantially the same functions and actions. Those skilled in the art will appreciate that the terms "first," "second," etc. do not denote any order or quantity, nor do the terms "first," "second," etc. denote any order or importance. Also, in the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as examples, illustrations or illustrations. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present relevant concepts in a concrete fashion for ease of understanding.
Currently, an electronic device plays a start dynamic effect when an application is started, and plays an exit dynamic effect when the application exits. The start dynamic effect refers to the frames displayed from the moment the electronic device receives a user's start operation until the electronic device displays the interface of the application; the exit dynamic effect refers to the frames displayed from the moment the electronic device receives the user's exit operation until the electronic device displays the desktop interface. In the related art, the screen refresh rate of the electronic device is often switched while a dynamic effect (for example, the start dynamic effect or the exit dynamic effect) is being played. Because a dynamic effect is a process in which the interface changes, switching the screen refresh rate during playback can cause the electronic device to stutter, which degrades the user experience.
Illustratively, as shown in fig. 1, taking the case where the electronic device enters a video-class application from the desktop as an example, the electronic device switches the screen refresh rate while the start dynamic effect of the video-class application is being played. Referring to fig. 1, because the start dynamic effect of the video-class application is a dynamic process, switching the screen refresh rate of the electronic device directly from 120 Hz to 60 Hz during playback of the start dynamic effect causes the electronic device to stutter.
Specifically, there are two main reasons why stutter occurs while the dynamic effect is playing. On one hand, when the electronic device uses hardware refresh, as shown in fig. 2, hardware limitations mean that the electronic device needs about two frames to switch the screen refresh rate, which causes frame loss and therefore stutter. On the other hand, the screen refresh rate changes at the moment of switching, so the user perceives the display as not smooth, that is, as stuttering. The stutter is especially noticeable when switching from a high refresh rate (e.g., 120 Hz) to a low refresh rate (e.g., 60 Hz).
In order to solve the above technical problem, a fixed delay mechanism has been proposed in the related art. The fixed delay mechanism means that the electronic device delays the moment of switching the screen refresh rate by a fixed time. However, because the durations of an application's start dynamic effect and exit dynamic effect are not the same, delaying the switch by one fixed time can only avoid the stutter in either the start dynamic effect or the exit dynamic effect, not both. Taking the start and exit dynamic effects of applications on an Honor device as an example, suppose the start dynamic effect lasts 400 ms and the exit dynamic effect lasts 700 ms. If the switch of the screen refresh rate is delayed by 400 ms, the stutter of the start dynamic effect can be avoided, but the stutter of the exit dynamic effect cannot. If the switch is delayed by 700 ms, the stutter of both the start and exit dynamic effects can be avoided; however, if the user performs a touch operation (such as sliding or clicking) within 300 ms after the application has started (that is, before the delayed switch takes effect), the screen refresh rate is still switched during the interaction and stutter occurs.
In addition, because different devices have different performances, the switching time of the screen refresh rate may be advanced or delayed, and thus the method for fixing the time delay needs to set different time delays for different devices, thereby increasing the workload of developers and reducing the research and development efficiency.
Based on this, the embodiments of the present application provide a display method that allows the electronic device to switch the screen refresh rate only after the dynamic effect has finished playing, so as to solve the problem of the electronic device stuttering when it switches the screen refresh rate.
Illustratively, as shown in fig. 3, taking the case where the electronic device enters a video-class application from the desktop as an example, the electronic device switches the screen refresh rate after the start dynamic effect of the video-class application has been played. Referring to fig. 3, the screen refresh rate of the electronic device is still 120 Hz while the start dynamic effect of the video-class application is being played, and the screen refresh rate is switched to 60 Hz only after the start dynamic effect has finished.
It should be understood that the scheme of the embodiment of the application is suitable for switching the screen refresh rate in the dynamic effect scene. The dynamic effect scene refers to a scene in which an interface of a display screen of the electronic device changes. That is, in the case that the interface of the display screen of the electronic device changes, the electronic device may switch the screen refresh rate. Illustratively, the electronic device refreshes the display screen using a first refresh rate (e.g., 120Hz) while displaying the first interface; when the electronic device switches from the first interface to the second interface, the electronic device refreshes the display screen using a second refresh rate (e.g., 60 Hz). The process of switching the electronic device from the first interface to the second interface is the dynamic effect scene in the embodiment of the application.
In some embodiments, the dynamic scenarios may be, for example, scenarios of application launch, scenarios of application exit, scenarios of application switch (e.g., switch from one application to another), scenarios of application switch between recent tasks in a multitasking interface, and the like.
The following takes an electronic device as a mobile phone as an example to illustrate the dynamic effect scenario described in the embodiment of the present application. It should be understood that the scenes described in the following embodiments are only some examples in the embodiments of the present application, and do not constitute a limitation to the present application, and other dynamic scenes suitable for screen refresh rate switching should also belong to the scope of the embodiments of the present application.
Taking the application as the "communication" application and the dynamic effect scene as the scene in which the "communication" application starts, as shown in fig. 4a (a), in response to the user operating the icon 101 of the "communication" application in the mobile phone home screen interface (e.g., the user taps the icon 101), the mobile phone displays the interface 102 shown in fig. 4a (b). The interface 102 is the interface displayed after the "communication" application has started. In some embodiments, as shown in fig. 4a (b), the interface 102 may be an interface showing the user's contact list. In the scene in which the "communication" application starts, the start dynamic effect of the "communication" application covers the process from the user tapping the icon 101 to the mobile phone displaying the interface 102.
It should be noted that dynamic effect playback has a fixed duration (e.g., 400 ms). A dynamic effect scene consists of a series of image frames played continuously within this fixed duration. Taking the above start dynamic effect as an example, the start dynamic effect is the process in which the image displayed in the interface 102 grows from small to large. Specifically, when the user taps the icon 101, the start dynamic effect begins and the image that will form the interface 102 starts to be played; when that image fills the entire screen (or display screen), the start dynamic effect ends.
For example, suppose the mobile phone plays six image frames continuously within the fixed duration of the dynamic effect playback. When the user taps the icon 101 of the "communication" application, the mobile phone plays frame 1, frame 2, frame 3, frame 4, frame 5 and frame 6 in sequence, as shown in fig. 4b. The process of the mobile phone playing frames 1 to 6 is the start dynamic effect.
Additionally, in a scenario where dynamic effects are enabled, in some embodiments, the mobile phone displays a home screen interface and refreshes the display screen using a first preset refresh rate (e.g., 120 Hz); when the user starts the communication application and displays the interface of the communication application, the mobile phone refreshes the display screen by using a second preset refresh rate (such as 60 Hz). In other embodiments, the handset displays the home screen interface and refreshes the display screen using a first preset refresh rate (e.g., 120 Hz); when the main screen interface of the mobile phone does not receive the operation of the user within a certain time (i.e. the main interface does not change within a certain time and is in a static state), the mobile phone may reduce the screen refresh rate (e.g. to 60Hz), and refresh the display screen using the reduced screen refresh rate. When the mobile phone receives the operation of the user (such as the operation of starting the communication application by the user), the mobile phone firstly increases the screen refresh rate to a first preset refresh rate (such as 120 Hz); and then, after the communication application is started, the mobile phone refreshes the display screen by using a second preset refresh rate (such as 60Hz) and displays the interface of the communication application.
It should be noted that, in the start dynamic effect scene, the start dynamic effect may be the dynamic effect played when the user starts the application for the first time, or the dynamic effect played when the user starts the application not for the first time. Starting for the first time means that the application is running neither in the foreground nor in the background. Starting not for the first time means that the application is already running in the background; in this case, the start dynamic effect may also be understood as the process in which the application is switched from the background to the foreground.
Taking the application as the "communication" application and the dynamic effect scene as the scene in which the "communication" application exits, the mobile phone displays the interface 103 shown in fig. 5 (a), where the interface 103 is the interface displayed after the "communication" application has started. Illustratively, as shown in fig. 5 (a), the interface 103 may be, for example, an interface showing the user's contact list. Then, in response to the user's exit operation on the "communication" application, the mobile phone displays the interface 104 shown in fig. 5 (b). The interface 104 may be, for example, the mobile phone home screen interface.
In some embodiments, the exit operation may be, for example, one of a gesture operation, a voice operation, or a touch operation. The touch operation may be, for example, a click operation, a slide operation, or the like. Taking the exit operation as the sliding operation, for example, as shown in fig. 5 (b), the exit operation may be, for example, an operation of a user sliding up the interface of the "communication" application.
In the scene in which the "communication" application exits, the exit dynamic effect of the "communication" application covers the process from the user sliding up on the interface of the "communication" application to the mobile phone displaying the home screen interface. In some embodiments, in conjunction with the above embodiments, the exit dynamic effect is the process in which the image displayed in the interface 102 shrinks from large to small. Specifically, when the user slides up on the interface of the "communication" application, the exit dynamic effect begins and the image in the interface 102 displayed by the mobile phone starts to shrink; the exit dynamic effect ends when the image displayed in the interface 102 has completely disappeared and the mobile phone displays the home screen interface.
Taking the dynamic effect scene as the scene in which the mobile phone switches from the home screen interface to the multitasking interface as an example, the mobile phone displays the interface 105 shown in fig. 6 (a), where the interface 105 is the mobile phone home screen interface. Then, in response to the user's operation on the home screen interface, the mobile phone displays the interface 106 shown in fig. 6 (b), where the interface 106 is the multitasking interface of the mobile phone. The interface 106 includes the interface of an application 1 running in the background, where application 1 is the application that has been running in the background for the shortest time. In this dynamic effect scene, the dynamic effect refers to the process from the user's operation on the home screen interface to the mobile phone displaying the multitasking interface.
It should be noted that, for the operation of the user on the home screen interface, reference may be made to the example of the exit operation in the foregoing embodiment, and details are not described here any more. Taking this operation as an example of a slide operation, as shown in fig. 6 (b), this operation may be, for example, a slide-up operation of the home screen interface by the user.
Taking the dynamic effect scene as the scene of switching between applications in the recent tasks of the mobile phone's multitasking interface as an example, the mobile phone displays the interface 107 shown in fig. 7 (a), where the interface 107 is the multitasking interface of the mobile phone. The interface 107 includes the interface of an application 1 running in the background, where application 1 is the application that has been running in the background for the shortest time. Then, in response to the user's operation on the interface 107, the mobile phone displays the interface 108 shown in fig. 7 (b), where the interface 108 includes the interface of an application 2 running in the background, and application 2 has been running in the background longer than application 1. In this dynamic effect scene, the dynamic effect refers to the process from the user's operation on the interface 107 to the mobile phone displaying the interface 108.
It should be noted that, for the operation of the interface 107 by the user, reference may be made to the example of the exit operation in the foregoing embodiment, and details are not described here any more. Taking this operation as an example of a slide operation, as shown in fig. 7 (b), this operation may be, for example, a right slide operation.
The display method provided by the embodiment of the present application will be described in detail below with reference to the drawings of the specification.
The display method provided in the embodiment of the present application may be applied to an electronic device with a display function, where the electronic device may be a mobile phone, a sports camera (GoPro), a digital camera, a tablet pc, a desktop, a laptop, a handheld computer, a notebook, a vehicle-mounted device, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a Personal Digital Assistant (PDA), an Augmented Reality (AR) device, and the like, and the embodiment of the present application does not specially limit a specific form of the electronic device.
Fig. 8 is a schematic structural diagram of the electronic device 100. Among them, the electronic device 100 may include: the mobile terminal includes a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation to the electronic apparatus 100. In other embodiments, electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
It should be understood that the interface connection relationship between the modules illustrated in this embodiment is only an exemplary illustration, and does not constitute a limitation on the structure of the electronic device. In other embodiments, the electronic device may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel (or display substrate). The display panel may employ an organic light-emitting diode (OLED). In the embodiment of the present application, the display screen is an LTPO display screen; that is, the display cells (e.g., TFTs) in the display panel of the LTPO display screen are LTPO TFTs. For the description of LTPO, reference may be made to the above embodiments, and details are not repeated here.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process digital image signals as well as other digital signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent cognition of electronic equipment, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as audio, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 121. For example, in the embodiment of the present application, the processor 110 may execute instructions stored in the internal memory 121, and the internal memory 121 may include a program storage area and a data storage area.
The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area can store data (such as audio data, phone book and the like) created in the using process of the electronic device. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc. The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to and detached from the electronic device by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The electronic equipment can support 1 or N SIM card interfaces, and N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc.
In some embodiments, the software system of the electronic device 100 may employ a hierarchical architecture, an event-driven architecture, a micro-core architecture, or a cloud architecture. In the embodiment of the present application, a layered architecture Android system is taken as an example to exemplarily illustrate a software structure of the electronic device 100.
Fig. 9 is a software structure diagram of an electronic device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into five layers, which are an application layer (or called application layer), an application framework layer, an Android runtime (Android runtime) and system library layer, a hardware abstraction layer, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 9, the application packages may include phone, mailbox, calendar, camera, and other applications. In some embodiments, the application layer also includes a launcher. The launcher is the desktop launcher of the Android system; the desktop interface (UI) of the Android system is generally referred to as the launcher. The desktop interface includes icons of the applications installed on the electronic device. Illustratively, the desktop interface includes a phone icon, a mailbox icon, a calendar icon, a camera icon, and the like.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 9, the application framework layer may include a window manager, a refresh rate switching module, a dynamic effect identification module, an image synthesis system, a view system, a package manager, an input manager, an activity manager, a resource manager, a dynamic effect playing component, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The refresh rate switching module is used for adjusting the screen refresh rate.
The dynamic effect identification module is used for identifying the dynamic effect of interface switching under the scene that the electronic equipment has interface switching. For example, to identify the beginning of a dynamic effect, the end of a dynamic effect, etc.
The image synthesis system is used to control image synthesis and to generate vertical synchronization (Vsync) signals. In some embodiments, the image synthesis system further comprises an image buffer queue. Illustratively, the application draws an image through the view system and renders the drawn image through the image rendering library. The application then sends the drawn and rendered image to the image buffer queue in the image synthesis system; the image buffer queue is used for buffering the images drawn and rendered by the application. Each time the Vsync signal arrives, the image synthesis system sequentially acquires one frame of image to be synthesized from the image buffer queue and then performs image synthesis.
The image synthesis system includes a composition thread, a Vsync thread, and a buffer queue (queue buffer) thread. The composition thread is woken up by the Vsync signal to perform composition. The Vsync thread is used to generate the next Vsync signal according to a Vsync signal request. The buffer queue thread is used for storing buffers, generating Vsync signal requests, waking up the composition thread, and the like.
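The buffer-queue behavior described above can be illustrated with a short sketch. The following Java snippet is a minimal, hypothetical model of a Vsync-driven composition loop that consumes at most one frame per Vsync from a bounded buffer queue; the class and method names (ImageCompositionSketch, submitFrame, onVsync) are assumptions made for illustration and are not the actual Android SurfaceFlinger interfaces.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hypothetical sketch of a Vsync-driven composition loop (not the real SurfaceFlinger API).
public class ImageCompositionSketch {
    // Rendered frame produced by an application; fields are illustrative.
    record RenderedFrame(int frameId, long renderTimeNanos) {}

    // Buffer queue holding frames the application has finished drawing and rendering.
    private final BlockingQueue<RenderedFrame> bufferQueue = new ArrayBlockingQueue<>(3);

    // Called by the application after it finishes drawing and rendering a frame.
    public void submitFrame(RenderedFrame frame) throws InterruptedException {
        bufferQueue.put(frame); // blocks if the queue is full
    }

    // Called once per Vsync signal: take one pending frame and compose it.
    public void onVsync(long vsyncTimeNanos) {
        RenderedFrame frame = bufferQueue.poll(); // at most one frame per Vsync period
        if (frame == null) {
            return; // nothing to compose this period
        }
        compose(frame, vsyncTimeNanos);
    }

    private void compose(RenderedFrame frame, long vsyncTimeNanos) {
        System.out.printf("Composing frame %d at vsync %d%n", frame.frameId(), vsyncTimeNanos);
    }

    public static void main(String[] args) throws InterruptedException {
        ImageCompositionSketch system = new ImageCompositionSketch();
        system.submitFrame(new RenderedFrame(1, System.nanoTime())); // application side
        system.onVsync(System.nanoTime());                           // composition thread woken by Vsync
    }
}
```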
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The package manager is used for program management within the system, for example: application installation, uninstallation, upgrade, and the like.
The input manager is used to manage the programs of input devices. For example, the input manager can determine input operations such as a mouse click operation, a keyboard input operation, and a touch-and-slide operation.
The activity manager is used for managing the life cycle of each application program and the navigation back function. It is responsible for creating the Android main thread and maintaining the life cycle of each application program.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The dynamic effect playing component is used for playing dynamic effects. Illustratively, the dynamic effect playing component is used for playing a dynamic effect when the application is started (such as starting the dynamic effect) or a dynamic effect when the application is exited (such as exiting the dynamic effect).
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part consists of functions that the Java language needs to call, and the other part is the Android core library.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: an image rendering library, an image composition library, a function library, a media library, an input processing library, and the like.
The image rendering library is used for rendering two-dimensional or three-dimensional images. The image composition library is used for composition of two-dimensional or three-dimensional images.
In a possible implementation manner, the application draws and renders an image through the image rendering library, and then sends the drawn and rendered image to the buffer queue of the image synthesis system. Each time the Vsync signal arrives, the image synthesis system (e.g., SurfaceFlinger) sequentially acquires one frame of image to be synthesized from the buffer queue, and image synthesis is then performed by the image composition library.
The function library provides macros, type definitions, character string operation functions, mathematical calculation functions, input and output functions, and the like used in the C language.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The input processing library is a library for processing input devices, and can implement mouse, keyboard, and touch input processing, among others.
The hardware abstraction layer may include a plurality of library modules, which may be, for example, hardware compositor (hwcomposer, HWC), camera library modules, and the like. The Android system can load corresponding library modules for the equipment hardware, and then the purpose that the application program framework layer accesses the equipment hardware is achieved. The device hardware may include, for example, a display screen, a camera, etc. in the electronic device.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a Touch Panel (TP) driver, a display driver, a Bluetooth driver, a WIFI driver, a keyboard driver, a shared memory driver, a camera driver and the like.
The hardware may be audio devices, bluetooth devices, camera devices, sensor devices, etc.
The following describes an exemplary workflow of software and hardware of an electronic device in conjunction with a scenario in which the electronic device is switched between interfaces.
When a touch sensor in the touch panel receives a touch operation, the kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates, the touch force, and the time stamp of the touch operation). The raw input event is stored at the kernel layer. Through the input processing library, the kernel layer determines the information of the raw input event (including the operation type, the reported position, and the like) and the focus application according to the current focus, and sends the parsed information to the focus application. The focus may be a touch point in a touch operation or a click position in a mouse click operation. The focus application is the application running in the foreground of the electronic device or the application corresponding to the touch position in the touch operation. The focus application determines the control corresponding to the raw input event according to the parsed information of the raw input event (such as the reported touch position).
Taking the touch operation as a click operation, and taking the control corresponding to the click operation as the icon of application 1 as an example, application 1 calls the image rendering library in the system library through the view system of the application framework layer to draw and render an image. Application 1 sends the drawn and rendered image to the buffer queue of the image synthesis system. The drawn and rendered image in the image synthesis system is then synthesized into the interface of application 1 through the image composition library in the system library. The image synthesis system uses the display driver of the kernel layer so that the screen (display screen) displays the corresponding interface of application 1.
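As an illustration of the workflow above, the following Java sketch models a raw input event carrying touch coordinates, pressure, and a timestamp, together with a greatly simplified lookup of the focus application from the touch position. All types, names, and the screen-region assumption are hypothetical and are not the real Android input stack.

```java
// Illustrative sketch of a raw input event and its dispatch to the focus application;
// the types here are assumptions for explanation only.
public class RawInputEventSketch {
    // Raw input event produced by the kernel layer from a touch operation.
    record RawInputEvent(float touchX, float touchY, float pressure, long timestampMillis) {}

    // Parsed information forwarded by the input processing library to the focus application.
    record ParsedEvent(String operationType, float reportX, float reportY) {}

    // Decide the focus application from the touch position (greatly simplified).
    static String findFocusApplication(RawInputEvent event) {
        // Assume the icon of "application 1" occupies the top-left quarter of the screen.
        return (event.touchX() < 540 && event.touchY() < 990) ? "application1" : "launcher";
    }

    static ParsedEvent parse(RawInputEvent event) {
        return new ParsedEvent("click", event.touchX(), event.touchY());
    }

    public static void main(String[] args) {
        RawInputEvent raw = new RawInputEvent(120f, 300f, 0.8f, System.currentTimeMillis());
        String focusApp = findFocusApplication(raw);
        ParsedEvent parsed = parse(raw);
        // The focus application would then look up the control at the reported position.
        System.out.println(focusApp + " receives " + parsed);
    }
}
```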
For ease of understanding, an explanation of some of the concepts related to the embodiments of the present application is given below for illustrative purposes.
1. Frame: refers to a single picture of the smallest unit in the interface display. A frame can be understood as a still picture and displaying a number of consecutive frames in rapid succession can create the illusion of object motion. The frame rate is the number of frames of a picture refreshed in 1 second, and can also be understood as the number of times of refreshing the picture per second by an image processor in the electronic device. A high frame rate may result in a smoother and more realistic animation. The greater the number of frames per second, the more fluid the displayed motion will be.
It should be noted that the interface usually needs to go through drawing, rendering, composition, and other processes before displaying the frame.
2. Frame drawing: drawing refers to drawing the pictures displayed on the interface. The display interface may be composed of one or more views. Each view may be drawn by a visual control of the view system, and each view consists of sub-views; a sub-view corresponds to a widget in the view, for example, a sub-view corresponds to a symbol in a picture view.
3. Frame rendering: frame rendering refers to performing a rendering operation on the drawn view, adding a 3D effect, or the like. For example, the 3D effect may be a light effect, a shadow effect, a texture effect, and the like.
4. Frame synthesis: frame synthesis is the process of compositing one or more rendered views into a display interface.
In order to improve display smoothness and reduce display stuttering and the like, electronic devices generally perform display based on Vsync signals so as to synchronize the processes of drawing, rendering, synthesizing, and screen refresh display of images. Those skilled in the art will appreciate that the Vsync signal is a periodic signal, and the period of the Vsync signal may be set according to the refresh rate of the display screen. For example, when the refresh rate of the display screen is 60 Hz, the Vsync signal period may be 16.6 ms; that is, the electronic device generates a control signal every 16.6 ms to trigger the Vsync signal.
In addition, the Vsync signal may be divided into a software Vsync signal and a hardware Vsync signal. The software Vsync signals include Vsync-APP and Vsync-SF. Vsync-APP is used to trigger the rendering process. Vsync-SF is used to trigger the synthesis flow. The hardware Vsync signal (Vsync-HW) is used to trigger the screen display refresh process.
Typically, the software Vsync signal and the hardware Vsync signal maintain period synchronization. Taking the variation of 120Hz and 60Hz as an example, if Vsync-HW is switched from 120Hz to 60Hz, Vsync-APP and Vsync-SF are synchronously varied and switched from 120Hz to 60 Hz.
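A small sketch of the period relationship described above: the Vsync period is derived from the refresh rate, and the two software Vsync signals are kept in step with the hardware Vsync when the rate changes. The class and field names below are illustrative assumptions, not framework APIs.

```java
// Minimal sketch: keeping the software Vsync signals (Vsync-APP, Vsync-SF) in step with
// the hardware Vsync (Vsync-HW) when the refresh rate changes.
public class VsyncPeriodSketch {
    static double periodMillis(int refreshRateHz) {
        return 1000.0 / refreshRateHz; // 60 Hz -> ~16.6 ms, 120 Hz -> ~8.3 ms
    }

    // Periods of the three Vsync signals; they are kept equal (period synchronization).
    private double vsyncAppMs, vsyncSfMs, vsyncHwMs;

    void onRefreshRateChanged(int newRateHz) {
        double p = periodMillis(newRateHz);
        vsyncHwMs = p;   // hardware Vsync drives screen refresh
        vsyncAppMs = p;  // software Vsync that triggers drawing and rendering
        vsyncSfMs = p;   // software Vsync that triggers composition
    }

    public static void main(String[] args) {
        VsyncPeriodSketch sketch = new VsyncPeriodSketch();
        sketch.onRefreshRateChanged(120);
        System.out.printf("120 Hz period: %.1f ms%n", sketch.vsyncHwMs);
        sketch.onRefreshRateChanged(60);
        System.out.printf("60 Hz period: %.1f ms%n", sketch.vsyncHwMs);
    }
}
```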
Fig. 10 is a schematic view of an electronic device interface display processing flow provided in an embodiment of the present application. The content displayed by the electronic device corresponds to frame 1, frame 2, and frame 3 in chronological order.
Specifically, taking the display of frame 1 as an example, the application of the electronic device draws and renders frame 1 through the view system of the application framework layer. After frame 1 is drawn and rendered, the application of the electronic device sends the drawn and rendered frame 1 to the image synthesis system. The image synthesis system synthesizes the drawn and rendered frame 1. After frame 1 is synthesized, the electronic device may display the content corresponding to frame 1 on the screen (e.g., the display screen) by calling the display driver of the kernel layer. It should be noted that frames 2 and 3 are also synthesized and displayed through a process similar to that of frame 1, which is not described here again. Each frame in fig. 10 lags from drawing to display by 2 periods of the Vsync signal, i.e., the display of the electronic device has hysteresis.
Fig. 11 is a schematic view of a processing flow of interface display of an electronic device according to an embodiment of the present application. The content displayed by the electronic device corresponds in chronological order to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6.
Specifically, taking the display of frame 2 as an example, the application of the electronic device draws and renders frame 2 through the view system of the application framework layer. After frame 2 is drawn and rendered, the application of the electronic device sends the drawn and rendered frame 2 to the image synthesis system. The image synthesis system synthesizes the drawn and rendered frame 2. After frame 2 is synthesized, the electronic device may display the content corresponding to frame 2 by calling the display driver of the kernel layer. Frames 3, 4, 5, and 6 are also synthesized and displayed through a process similar to that of frame 2, which is not described in detail here.
When the frame 3 is rendered, a screen refresh rate switching module of the electronic device decides to switch the refresh rate (for example, switching from 120Hz to 60 Hz); and when the frame 4 is drawn and rendered, switching the screen refresh rate, wherein the period duration of the Vsync signal corresponding to the frame 4 drawing and rendering is prolonged, and the switching of the screen refresh rate is completed.
As can be seen in fig. 11, each frame in fig. 11 lags from drawing to display by 2 periods of the Vsync signal. It can be known from the foregoing embodiment that the switching of the screen refresh rate of the electronic device occurs during the dynamic effect playing process; as can be seen with reference to fig. 11, during the dynamic effect playing process, the display screen of the electronic device sequentially displays frames 2 and 3. When the screen refresh rate is switched during the dynamic effect playing, the frame interval corresponding to the drawing and rendering of frame 2 is not consistent with the frame interval corresponding to the display of frame 2 (for example, the frame interval corresponding to the display of frame 2 is longer than the frame interval corresponding to the drawing and rendering of frame 2). Similarly, the frame interval corresponding to the drawing and rendering of frame 3 is not consistent with the frame interval corresponding to the display of frame 3, thereby causing stuttering during the dynamic effect playing.
Fig. 12 is a schematic view of an electronic device interface display processing flow provided in an embodiment of the present application. In chronological order, the content displayed by the electronic device corresponds to frame 0, frame 1, frame 2, frame 3, frame 4, frame 5, and frame 6, in that order.
When the frame 4 is rendered, the screen refresh rate switching module of the electronic device decides to switch the refresh rate (for example, switching from 120Hz to 60 Hz); and when the frame 5 is drawn and rendered, switching the screen refresh rate, wherein the period duration of the Vsync signal corresponding to the frame 5 drawing and rendering is prolonged, and the switching of the screen refresh rate is completed.
As can be seen from fig. 12, the switching of the screen refresh rate of the electronic device occurs after the dynamic effect is played, i.e., the screen refresh rate is switched immediately after the dynamic effect playing ends. Referring to fig. 12, during the dynamic effect playing process, the display screen of the electronic device displays frame 0, frame 1, and frame 2. At this time, the frame interval corresponding to the drawing and rendering of frame 1 coincides with the frame interval corresponding to the display of frame 1. Similarly, the frame interval corresponding to the drawing and rendering of frame 2 is consistent with the frame interval corresponding to the display of frame 2, so that stuttering of the electronic device in the process of playing the dynamic effect can be avoided.
It should be noted that, when the screen refresh rate of the display screen is 120Hz, the Vsync signal period may be 8.3ms, that is, the electronic device generates a control message every 8.3ms to trigger the Vsync signal period. When the refresh rate of the display screen is 60Hz, the Vsync signal period may be 16.6ms, i.e., the electronics generate a control signal every 16.6ms to cause the Vsync signal period to trigger.
In addition, assume that the size of the display screen of the electronic device is 1080 × 1980 (in pixels), that is, the display screen of the electronic device is 1980 pixels in the longitudinal direction and 1080 pixels in the transverse direction, and that the application start-up duration is 400 ms. Then 24 frames need to be drawn at a 60 Hz refresh rate to play the application start-up dynamic effect. Typically, adjacent frames are then spaced about 83 pixels apart in the longitudinal direction and about 45 pixels apart in the transverse direction. If the screen refresh rate of the electronic device is switched to 120 Hz, the interval per frame is halved, i.e., about 42 pixels per frame in the longitudinal direction and about 23 pixels per frame in the transverse direction.
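The arithmetic in this example can be written out as a short sketch; the frame counts and per-frame pixel intervals below follow from the 400 ms duration and the screen size stated above, with illustrative rounding.

```java
// A small arithmetic sketch: frames needed for a 400 ms start-up dynamic effect at a
// given refresh rate, and the per-frame pixel interval on a 1080 x 1980 screen.
public class AnimationFrameMath {
    public static void main(String[] args) {
        int durationMs = 400;
        int verticalPixels = 1980;   // longitudinal direction, per the example above
        int horizontalPixels = 1080; // transverse direction

        for (int refreshRateHz : new int[] {60, 120}) {
            int frames = durationMs * refreshRateHz / 1000;                      // 24 frames at 60 Hz, 48 at 120 Hz
            int verticalStep = Math.round((float) verticalPixels / frames);      // ~83 px, ~41-42 px
            int horizontalStep = Math.round((float) horizontalPixels / frames);  // ~45 px, ~23 px
            System.out.printf("%d Hz: %d frames, %d px/frame longitudinal, %d px/frame transverse%n",
                    refreshRateHz, frames, verticalStep, horizontalStep);
        }
    }
}
```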
For convenience of understanding, the following describes a process of interaction between the modules involved in the display method provided in the embodiment of the present application with reference to a software architecture diagram shown in fig. 9.
Exemplarily, fig. 13 is a schematic process diagram of interaction between modules in the display method provided in the embodiment of the present application. As shown in fig. 13, the electronic device may include a desktop launcher, a dynamic effect identification module, an activity manager, a dynamic effect playing component, a screen refresh rate switching module, an image synthesis system, a hardware synthesizer, and a display driver. For example, the display method may include S201-S215.
S201, the desktop launcher receives a first operation of a user.
The first operation may be one of a voice operation, a gesture operation, or a touch operation. The touch operation may be, for example, a click operation or a slide operation.
Taking the first operation as a click operation as an example, as shown in fig. 4a, the desktop launcher receives a click operation of the user on the communication application and launches the communication application in response to the click operation. After the communication application is started, the electronic device plays a start dynamic effect. Taking the first operation as a sliding operation as an example, as shown in fig. 5, the desktop launcher receives a sliding operation of the user on the communication application interface and exits the communication application in response to the sliding operation. When the communication application exits, the electronic device plays an exit dynamic effect.
S202, the desktop launcher sends a first dynamic effect notification message to the dynamic effect identification module.
The first dynamic effect notification message is used for notifying the dynamic effect identification module that a target dynamic effect is generated (in other words, that the target dynamic effect starts).
In some embodiments, the first dynamic effect notification message further comprises a first dynamic effect type. The first dynamic effect type is used for indicating the type of the dynamic effect that starts. For example, when the first dynamic effect type is "0", it indicates that the target dynamic effect is the start dynamic effect; when the first dynamic effect type is "1", it indicates that the target dynamic effect is the exit dynamic effect.
Illustratively, the first dynamic effect notification message may be start "0" or start "1", where start denotes the start of the dynamic effect. On this basis, when the first dynamic effect notification message is start "0", it indicates that the target dynamic effect is the start dynamic effect and it begins; when the first dynamic effect notification message is start "1", it indicates that the target dynamic effect is the exit dynamic effect and it begins.
In some embodiments, after the desktop launcher receives a first operation of a user, the target application is launched in response to the first operation. The dynamic effect played by the target application in the starting process is called a target dynamic effect. On this basis, for example, when the desktop launcher detects that the user's finger leaves the desktop (i.e., the user's finger leaves the screen of the electronic device), the target dynamic effect (e.g., the start dynamic effect or the exit dynamic effect) starts playing. In addition, in some embodiments, when the target dynamic effect starts to play, the desktop launcher sends a first dynamic effect notification message to the dynamic effect identification module; that is, when the desktop launcher detects that the user's finger leaves the desktop, the desktop launcher sends the first dynamic effect notification message to the dynamic effect identification module.
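The encoding of these notification messages (start "0", start "1", and the matching end "0"/end "1" messages of S210 below) can be sketched as follows; the enum and class names are assumptions made purely for illustration.

```java
// Illustrative sketch of how the first/second dynamic effect notification messages could
// be encoded; the string format follows the examples in the text, the class is an assumption.
public class DynamicEffectNotification {
    enum Phase { START, END }           // start of the dynamic effect vs. end of it
    enum EffectType { LAUNCH, EXIT }    // "0": start (launch) dynamic effect, "1": exit dynamic effect

    static String encode(Phase phase, EffectType type) {
        String phaseStr = (phase == Phase.START) ? "start" : "end";
        String typeStr = (type == EffectType.LAUNCH) ? "0" : "1";
        return phaseStr + " \"" + typeStr + "\"";
    }

    public static void main(String[] args) {
        // Sent by the desktop launcher when the user's finger leaves the screen and the
        // start dynamic effect begins playing.
        System.out.println(encode(Phase.START, EffectType.LAUNCH)); // start "0"
        // Sent when the exit dynamic effect finishes playing.
        System.out.println(encode(Phase.END, EffectType.EXIT));     // end "1"
    }
}
```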
S203, the dynamic effect identification module registers a callback with the desktop launcher.
In some embodiments, when the dynamic effect identification module receives the first dynamic effect notification message sent by the desktop launcher, the dynamic effect identification module registers a callback with the desktop launcher to notify the desktop launcher that the target dynamic effect is beginning.
S204, the desktop launcher informs the activity manager that the current activity is started.
S205, the activity manager sends a first message to the screen refresh rate switching module.
The first message is used for informing the screen refresh rate switching module that the current activity starts. In some embodiments, the first message includes the application package name of the target application.
S206, the screen refresh rate switching module determines a preset refresh rate of the target application according to the first message and a preset refresh rate switching rule.
The preset refresh rate switching rule is used for indicating the corresponding relation between the application package name of the target application and the preset refresh rate of the target application.
For example, the application package name of the target application is com.ent.qqlive, and the preset refresh rate corresponding to the application package name of the target application is 60 Hz.
It should be noted that, if the dynamic effect scene is a scene in which a dynamic effect is played when the application is started, the first message includes the package name of the started application (i.e., the target application). If the dynamic effect scene is a scene in which a dynamic effect is played when the first application is switched to the second application, the first message may be a package name switching message, that is, a message indicating that the application package name of the first application is switched to the application package name of the second application. On this basis, the screen refresh rate switching module can determine the preset refresh rate of the switched-to second application according to the package name switching message.
It should be noted that, in the related art, after the screen refresh rate switching module receives the first message sent by the activity manager, the screen refresh rate switching module determines a preset refresh rate of the target application; then, the screen refresh rate switching module switches the current screen refresh rate to the preset refresh rate of the target application. However, in the embodiment of the present application, after the screen refresh rate switching module receives the first message sent by the activity manager, the screen refresh rate switching module determines that the current activity starts according to the first message. And then, the screen refresh rate switching module determines a preset refresh rate of the target application according to the application package name of the target application carried by the first message and the screen refresh rate switching rule, and stores the preset refresh rate in the screen refresh rate switching module. When the screen refresh rate switching module receives the message of finishing the dynamic effect, the screen refresh rate switching module switches the current screen refresh rate to the preset refresh rate of the target application.
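The deferred switching described in the preceding paragraph can be sketched in Java as follows. The rule table, class name, and the second package-name entry are illustrative assumptions; only the com.ent.qqlive → 60 Hz mapping comes from the example above, and this is a sketch of the idea rather than the actual implementation.

```java
import java.util.Map;

// Minimal sketch of deferred refresh rate switching: store the preset rate when the
// activity starts, apply it only when the dynamic effect ends.
public class RefreshRateSwitchModule {
    // Preset refresh rate switching rule: application package name -> preset refresh rate (Hz).
    private static final Map<String, Integer> SWITCH_RULE = Map.of(
            "com.ent.qqlive", 60,
            "com.example.game", 120); // second entry is hypothetical

    private int currentRefreshRateHz = 120;
    private Integer pendingRefreshRateHz; // stored on activity start, applied on effect end

    // S205/S206: the first message carries the target application's package name.
    public void onActivityStart(String packageName) {
        pendingRefreshRateHz = SWITCH_RULE.getOrDefault(packageName, currentRefreshRateHz);
    }

    // S211: the dynamic effect identification module reports that the target dynamic effect ended.
    public void onDynamicEffectEnd() {
        if (pendingRefreshRateHz != null && pendingRefreshRateHz != currentRefreshRateHz) {
            currentRefreshRateHz = pendingRefreshRateHz; // e.g. switch from 120 Hz to 60 Hz
        }
        pendingRefreshRateHz = null; // no switch if the rates are already equal
    }

    public int currentRefreshRateHz() {
        return currentRefreshRateHz;
    }

    public static void main(String[] args) {
        RefreshRateSwitchModule module = new RefreshRateSwitchModule();
        module.onActivityStart("com.ent.qqlive"); // rate stays at 120 Hz while the effect plays
        module.onDynamicEffectEnd();              // now switched to 60 Hz
        System.out.println(module.currentRefreshRateHz());
    }
}
```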
And S207, the dynamic effect identification module sends a second message to the dynamic effect playing component.
Wherein, the second message (or called the first target message) is used to indicate the action attribute of the target action. Illustratively, the dynamic effect attribute includes one or more of dynamic effect content, dynamic effect size (e.g., from small to large or from large to small), dynamic effect duration, or dynamic effect start and end positions of the target dynamic effect.
For example, when the target dynamic effect is the start dynamic effect and it begins, the dynamic effect identification module may send one or more of the dynamic effect content, the dynamic effect size, the dynamic effect duration, or the dynamic effect start and end positions of the target dynamic effect to the dynamic effect playing component.
It should be noted that the dynamic effect includes a plurality of consecutive image frames. The dynamic effect start position refers to the position of the first image frame, and the dynamic effect end position refers to the position of the last image frame.
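A possible shape of the dynamic effect attributes carried by the second message is sketched below; all field names and the example values are assumptions made for illustration only.

```java
// Illustrative sketch of the dynamic effect attributes of S207: content, size change,
// duration, and start/end positions of the first and last image frames.
public class DynamicEffectAttributes {
    record Position(int x, int y, int width, int height) {}

    record EffectAttributes(
            String content,          // what the dynamic effect shows, e.g. the application window
            boolean smallToLarge,    // dynamic effect size: from small to large (start) or the reverse (exit)
            int durationMs,          // e.g. 400 ms
            Position startPosition,  // position of the first image frame on the display screen
            Position endPosition) {} // position of the last image frame on the display screen

    public static void main(String[] args) {
        EffectAttributes startEffect = new EffectAttributes(
                "application window", true, 400,
                new Position(500, 900, 108, 198),  // roughly the icon bounds (hypothetical)
                new Position(0, 0, 1080, 1980));   // full screen
        System.out.println(startEffect);
    }
}
```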
And S208, the dynamic effect playing component plays the target dynamic effect according to the second message.
Still taking the case where the target dynamic effect is the start dynamic effect as an example, the dynamic effect playing component plays the target dynamic effect according to the dynamic effect content of the start dynamic effect, its size (e.g., from small to large), its duration (e.g., 400 ms), and its start and end positions.
S209, the dynamic effect playing component registers a callback with the desktop launcher.
Illustratively, when the dynamic effect playing component finishes playing the target dynamic effect, the dynamic effect playing component registers a callback with the desktop launcher to notify the desktop launcher that the target dynamic effect is finished.
Illustratively, the target dynamic effect is a series of image frames (frame 1 to frame 6 shown in fig. 4b) that the electronic device plays continuously within a fixed time period. In some embodiments, the dynamic effect playing component registers a callback with the desktop launcher when it has played the last image frame of the target dynamic effect. Illustratively, when the image frame currently played by the dynamic effect playing component reaches the dynamic effect end position (i.e., the frame being played is the last frame), the dynamic effect playing component registers a callback with the desktop launcher. In other embodiments, the dynamic effect playing component registers a callback with the desktop launcher when the duration for which it has played the target dynamic effect reaches a fixed duration (e.g., 400 ms).
S210, the desktop launcher sends a second dynamic effect notification message to the dynamic effect identification module.
The second dynamic effect notification message is used for notifying the dynamic effect identification module that the target dynamic effect is finished.
In some embodiments, the second dynamic effect notification message further comprises a second dynamic effect type. The second dynamic effect type is used for indicating the type of the dynamic effect that ends. For example, when the second dynamic effect type is "0", it indicates that the target dynamic effect is the start dynamic effect; when the second dynamic effect type is "1", it indicates that the target dynamic effect is the exit dynamic effect.
Illustratively, the second dynamic effect notification message may be end "0" or end "1", where end denotes the end of the dynamic effect. On this basis, when the second dynamic effect notification message is end "0", it indicates that the start dynamic effect has ended; when the second dynamic effect notification message is end "1", it indicates that the exit dynamic effect has ended.
S211, the dynamic effect identification module informs the screen refresh rate switching module that the target dynamic effect is finished.
Illustratively, after the dynamic effect identification module notifies the screen refresh rate switching module of the end of the target dynamic effect, the screen refresh rate switching module switches the current screen refresh rate to the preset refresh rate of the target application (e.g., from 120 Hz to 60 Hz) according to the previously stored preset refresh rate of the target application. It should be noted that, if the current screen refresh rate is the same as the preset refresh rate of the target application, the screen refresh rate switching module does not switch the refresh rate.
It should be noted that the preset refresh rate of the target application described in the embodiment of the present application refers to a screen refresh rate of a display screen of the electronic device, that is, the number of times of refreshing pictures of the display screen per second. In other words, the electronic device displays the interface of the target application on the display screen at the preset refresh rate of the target application.
It should be noted that, when the desktop launcher sends the first dynamic effect notification message and the second dynamic effect notification message to the dynamic effect identification module, a time delay may occur. For example, the desktop launcher may send the first dynamic effect notification message to the dynamic effect identification module only after a certain period of time (e.g., 3 ms) has passed since it received the first operation of the user, i.e., the message is sent with a delay. This delay can cause the electronic device to stutter while playing the dynamic effect. Based on this, in the embodiment of the application, the desktop launcher can send the first dynamic effect notification message and the second dynamic effect notification message to the dynamic effect identification module a certain time in advance, so that the stuttering caused by the time delay is effectively avoided.
S212, the screen refresh rate switching module sends the preset refresh rate of the target application to the image synthesis system.
And S213, the image synthesis system synthesizes the target image according to the preset refresh rate of the target application.
Illustratively, the image composition system triggers the target application to draw and render the image data. For example, the target application draws the image data through the view system and renders the drawn image data through the image rendering library. The target application then sends the rendered image data to the image composition system, and the image composition system synthesizes the image data to obtain the target image.
And S214, the image synthesis system sends the synthesized target image to a hardware synthesizer.
And S215, sending the target image to a display driver by the hardware synthesizer.
Specifically, after the hardware synthesizer sends the synthesized target image to the display driver, the display driver drives the display screen to display the target image.
For example, in this embodiment of the present application, the target image may be an image displayed after the target application is started (e.g., an image of a main interface of the target application, or an image of an interface of the target application running in the background).
In summary, in the embodiment of the application, since the electronic device can recognize the start and the end of the target dynamic effect through the dynamic effect recognition module, after the electronic device recognizes the end of the target dynamic effect, the electronic device can switch the refresh rate of the display screen through the screen refresh rate switching module, so as to avoid the problem of stuttering of the electronic device in the process of playing the target dynamic effect.
Fig. 14 is a schematic flow chart of a display method according to an embodiment of the present application. The display method is applied to an electronic device which supports a first refresh rate and a second refresh rate. The display method comprises the following steps:
S301, the electronic device displays a first interface on the display screen at a first refresh rate.
In some embodiments, as shown in fig. 4a and 6, the first interface is a desktop of the electronic device; at this time, the first refresh rate is the preset refresh rate of the desktop. In other embodiments, as shown in FIG. 5, the first interface is an interface of a target application (e.g., a communication application); the first refresh rate is now the preset refresh rate of the target application. In still other embodiments, the first interface is a multitasking interface of the electronic device, and the first interface includes an interface of the source application when it ran in the most recent task; in this case, the first refresh rate is the preset refresh rate of the source application. Illustratively, as shown in fig. 7 (a), the first interface is the interface of the source application (or application 1) when it ran in the most recent task.
As described in conjunction with the above embodiments, in the embodiment of the present application, the source application may also be referred to as a first application (or application 1), and the target application may also be referred to as a second application (or application 2).
S302, the electronic equipment receives a first operation of a user.
For the illustration of the first operation, reference may be made to the description of the first operation in S201 above; details are not repeated here.
And S303, the electronic equipment responds to the first operation, and the electronic equipment starts playing the dynamic effect.
In some embodiments, a dynamic effect (or target dynamic effect) includes a succession of N frames of pictures (or image frames); wherein N is greater than or equal to 1. Illustratively, the electronic device starts playing the dynamic effect according to the dynamic effect attribute; the dynamic effect attribute comprises at least one of dynamic effect content, dynamic effect size, dynamic effect duration or dynamic effect starting position and dynamic effect ending position; the dynamic effect starting position is used for indicating the position of the first frame picture in the N frame pictures on the display screen, and the dynamic effect ending position is used for indicating the position of the last frame picture in the N frame pictures on the display screen.
S304, the electronic equipment detects that the dynamic effect playing is finished, and switches to display a second interface on the display screen at a second refresh rate.
The dynamic effect is used for indicating a picture displayed in the process in which the electronic device is switched from the first interface to the second interface.
In some embodiments, as shown in fig. 4a, the second interface is an interface of a target application (e.g., a communication application), and the second refresh rate is a preset refresh rate of the target application. In other embodiments, as shown in fig. 5, the second interface is a desktop of the electronic device, and the second refresh rate is a preset refresh rate of the desktop. In other embodiments, as shown in FIG. 6, the second interface is the interface when the source application (e.g., application 1) was running during the last task, and the second refresh rate is the preset refresh rate of the source application. In other embodiments, as shown in fig. 7, the second interface is an interface of the target application (e.g., application 2) running during the latest task time, and the second refresh rate is the preset refresh rate of the target application.
In some embodiments, when the electronic device plays the mth frame of picture, if the distance between the position of the mth frame of picture on the display screen and the target position is smaller than a preset value, the electronic device detects that the dynamic effect playing is finished; wherein M is more than or equal to 1 and less than or equal to N; the target position is used for indicating the position of the last frame picture in the N frame pictures on the display screen.
It should be noted that the preset value may be set according to actual requirements, and this is not limited in the embodiment of the present application.
For example, when the electronic device plays the Mth frame of picture, if the position of the Mth frame of picture on the display screen is the same as the target position, the electronic device detects that the dynamic effect playing is finished. In other words, when the electronic device plays the last frame of the N frames of pictures, the electronic device detects that the dynamic effect playing is finished.
In other embodiments, when the duration of playing the dynamic effect by the electronic device satisfies a preset duration, the electronic device detects that the playing of the dynamic effect is finished.
It should be noted that the preset time period may be set according to actual requirements, and this is not limited in the embodiment of the present application.
For example, when the duration of playing the dynamic effect by the electronic device reaches the preset 400ms, the electronic device detects that the playing of the dynamic effect is finished.
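The two detection conditions above can be summarized in a short sketch; the preset distance and preset duration values below are illustrative assumptions.

```java
// Minimal sketch of the two end-detection conditions: the current frame's position is
// within a preset distance of the target (last-frame) position, or the playing duration
// has reached a preset duration.
public class DynamicEffectEndDetector {
    record Position(int x, int y) {}

    private static final double PRESET_DISTANCE = 5.0;   // pixels, illustrative value
    private static final long PRESET_DURATION_MS = 400;  // e.g. the 400 ms start-up dynamic effect

    // Condition 1: distance between the Mth frame's position and the target position.
    static boolean endedByPosition(Position current, Position target) {
        double dx = current.x() - target.x();
        double dy = current.y() - target.y();
        return Math.hypot(dx, dy) < PRESET_DISTANCE;
    }

    // Condition 2: the dynamic effect has been playing for the preset duration.
    static boolean endedByDuration(long playStartMs, long nowMs) {
        return nowMs - playStartMs >= PRESET_DURATION_MS;
    }

    public static void main(String[] args) {
        System.out.println(endedByPosition(new Position(2, 1), new Position(0, 0))); // true
        long start = System.currentTimeMillis();
        System.out.println(endedByDuration(start, start + 400));                     // true
    }
}
```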
In the embodiment of the application, the electronic device displays a first interface on the display screen at a first refresh rate; when the electronic device receives a first operation of a user, the electronic device starts playing a dynamic effect in response to the first operation; when the electronic device detects that the dynamic effect playing is finished, the electronic device switches to displaying a second interface on the display screen at a second refresh rate. Because the dynamic effect is used for indicating the picture displayed in the process in which the electronic device is switched from the first interface to the second interface, the electronic device switches the first refresh rate to the second refresh rate and displays the second interface only when it detects that the dynamic effect playing is finished, thereby avoiding the stuttering caused by switching the refresh rate during the dynamic effect playing.
An embodiment of the present application provides an electronic device, which may include: a display screen (e.g., a touch screen), memory, and one or more processors. The display, memory and processor are coupled. The memory is for storing computer program code comprising computer instructions. When the processor executes the computer instructions, the electronic device may perform various functions or steps performed by the mobile phone in the above-described method embodiments. The structure of the electronic device may refer to the structure of the electronic device 100 shown in fig. 8.
An embodiment of the present application further provides a chip system, as shown in fig. 15, the chip system 1800 includes at least one processor 1801 and at least one interface circuit 1802. The processor 1801 may be the processor 110 shown in fig. 8 in the foregoing embodiment. The interface circuit 1802 may be, for example, an interface circuit between the processor 110 and the external memory 120; or an interface circuit between the processor 110 and the internal memory 121.
The processor 1801 and the interface circuit 1802 may be interconnected by wires. For example, the interface circuit 1802 may be used to receive signals from other devices (e.g., a memory of an electronic device). Also for example, the interface circuit 1802 may be used to send signals to other devices, such as the processor 1801. Illustratively, the interface circuit 1802 may read instructions stored in the memory and send the instructions to the processor 1801. The instructions, when executed by the processor 1801, may cause the electronic device to perform the steps performed by the mobile phone in the embodiments described above. Of course, the chip system may further include other discrete devices, which is not specifically limited in this embodiment of the present application.
Embodiments of the present application further provide a computer-readable storage medium, where the computer-readable storage medium includes computer instructions, and when the computer instructions are executed on an electronic device, the electronic device is caused to perform various functions or steps performed by a mobile phone in the foregoing method embodiments.
The embodiments of the present application further provide a computer program product, which when run on a computer, causes the computer to execute each function or step executed by the mobile phone in the above method embodiments.
Through the description of the above embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially or partially contributed to by the prior art, or all or part of the technical solutions may be embodied in the form of a software product, where the software product is stored in a storage medium and includes several instructions to enable a device (which may be a single chip, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

1. A display method is applied to an electronic device, wherein the electronic device supports a first refresh rate and a second refresh rate; the method comprises the following steps:
the electronic equipment displays a first interface on a display screen at the first refresh rate;
the electronic equipment receives a first operation of a user;
the electronic equipment responds to the first operation and starts playing the dynamic effect;
the electronic equipment detects that the dynamic effect playing is finished, and switches to display a second interface on the display screen at the second refresh rate;
the dynamic effect is used for indicating a picture displayed in the process that the electronic device is switched from the first interface to the second interface.
2. The method of claim 1, wherein the dynamic effect comprises consecutive N frames of pictures, wherein N is greater than or equal to 1; the electronic device starts playing the dynamic effect, comprising:
the electronic equipment starts playing the dynamic effect according to the dynamic effect attribute; the dynamic effect attribute comprises at least one of dynamic effect content, dynamic effect size, dynamic effect duration or dynamic effect starting position and dynamic effect ending position; the dynamic effect starting position is used for indicating the position of the first frame of the N frames of pictures on the display screen, and the dynamic effect ending position is used for indicating the position of the last frame of the N frames of pictures on the display screen.
3. The method according to claim 1 or 2, wherein the motion effect comprises consecutive N frames of pictures, where N is greater than or equal to 1; the electronic device detects that the dynamic effect playing is finished, and the method comprises the following steps:
when the electronic equipment plays the Mth frame of picture, if the distance between the position of the Mth frame of picture on the display screen and the target position is smaller than a preset value, the electronic equipment detects that the dynamic effect playing is finished; m is more than or equal to 1 and less than or equal to N;
the target position is used for indicating the position of the last frame of the N frames of pictures on the display screen.
4. The method according to claim 1 or 2, wherein the electronic device detecting the end of the dynamic effect playing comprises:
and when the time length of playing the dynamic effect by the electronic equipment meets the preset time length, the electronic equipment detects that the playing of the dynamic effect is finished.
5. The method according to any one of claims 1-4, wherein the electronic device comprises a target application, and the second interface is an interface of the target application; after the electronic device starts playing the dynamic effect, the method further comprises:
the electronic device obtains first information; the first information comprises an application package name of the target application;
the electronic device determines the second refresh rate according to the application package name of the target application and a preset refresh rate switching rule; the preset refresh rate switching rule is used for indicating a mapping relationship between application package names and refresh rates of the display screen.
6. The method according to any one of claims 1-5, wherein the electronic device comprises a target application;
the first interface is a desktop of the electronic device, and the second interface is an interface displayed after the target application is started;
the dynamic effect comprises pictures displayed in a process in which the electronic device starts the target application.
7. The method according to claim 6, wherein
when the target application is not running in a background of the electronic device, the second interface is a main interface of the target application; or,
when the target application is running in the background of the electronic device, the second interface is an interface of the target application running in the background.
8. The method according to claim 6 or 7, wherein the dynamic effect comprises N consecutive frames of pictures, N is greater than or equal to 1, and the N frames of pictures differ in size;
in a process of starting the target application, the electronic device displays the N frames of pictures in sequence, and sizes of the N frames of pictures increase sequentially from the first frame of picture to the Nth frame of picture.
9. The method according to any one of claims 1-5, wherein the electronic device comprises a target application;
the first interface is an interface of the target application, and the second interface is a desktop of the electronic device;
the dynamic effect comprises pictures displayed in a process in which the electronic device exits the target application.
10. The method according to claim 9, wherein the dynamic effect comprises N consecutive frames of pictures, N is greater than or equal to 1, and the N frames of pictures differ in size;
in a process of exiting the target application, the electronic device displays the N frames of pictures in sequence, and sizes of the N frames of pictures decrease sequentially from the first frame of picture to the Nth frame of picture.
11. The method according to any one of claims 1-5, wherein the electronic device comprises a source application and a target application;
the first interface and the second interface are multitask interfaces of the electronic device; the first interface comprises an interface of the source application when running, displayed in the recent tasks, and the second interface comprises an interface of the target application when running, displayed in the recent tasks.
12. The method according to claim 2, wherein the electronic device comprises a desktop launcher, a dynamic effect identification module, and a dynamic effect playing component; the electronic device starting to play the dynamic effect in response to the first operation comprises:
the desktop launcher sends a first dynamic effect notification message to the dynamic effect identification module in response to the first operation; the first dynamic effect notification message is used for notifying the dynamic effect identification module that the dynamic effect starts;
the dynamic effect identification module sends a first target message to the dynamic effect playing component according to the first dynamic effect notification message; the first target message is used for indicating the dynamic effect attribute;
the dynamic effect playing component starts playing the dynamic effect according to the first target message.
13. The method according to claim 3, wherein the electronic device comprises a desktop launcher, a dynamic effect identification module, and a dynamic effect playing component; the electronic device detecting that playing of the dynamic effect has ended comprises:
when the dynamic effect playing component plays the Mth frame of picture, if the distance between the position of the Mth frame of picture on the display screen and the target position is smaller than the preset value, the dynamic effect playing component notifies the desktop launcher that playing of the dynamic effect has ended;
the desktop launcher sends a second dynamic effect notification message to the dynamic effect identification module;
the dynamic effect identification module detects, according to the second dynamic effect notification message, that playing of the dynamic effect has ended.
14. The method according to claim 4, wherein the electronic device comprises a desktop launcher, a dynamic effect identification module, and a dynamic effect playing component; the electronic device detecting that playing of the dynamic effect has ended comprises:
when the duration for which the dynamic effect playing component has played the dynamic effect reaches the preset duration, the dynamic effect playing component notifies the desktop launcher that playing of the dynamic effect has ended;
the desktop launcher sends a second dynamic effect notification message to the dynamic effect identification module;
the dynamic effect identification module detects, according to the second dynamic effect notification message, that playing of the dynamic effect has ended.
15. An electronic device, characterized in that the electronic device comprises: a display screen, a memory, and one or more processors; the display screen, the memory, and the one or more processors are coupled;
the memory is configured to store computer program code, and the computer program code comprises computer instructions; when the computer instructions are executed by the one or more processors, the electronic device is caused to perform the method according to any one of claims 1-14.
16. A computer-readable storage medium, comprising computer instructions, wherein when the computer instructions are run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1-14.
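
For orientation, the following is a minimal, non-authoritative Kotlin sketch of the flow described in claims 1, 3 and 4: the first refresh rate is kept while the dynamic effect plays, playback is treated as ended either when the current frame comes close enough to the target position or when a preset duration elapses, and only then is the display switched to the second refresh rate. All class names, thresholds and values here are assumptions for illustration; the patent provides no source code.

```kotlin
import kotlin.math.hypot

// One frame of the transition ("dynamic effect") and its position on the display screen.
data class EffectFrame(val index: Int, val x: Float, val y: Float)

// Minimal stand-in for the part of the system that owns the display mode (assumed API).
class DisplayController(var refreshRateHz: Int) {
    fun switchTo(rateHz: Int) { refreshRateHz = rateHz }
}

class DynamicEffectPlayer(
    private val targetX: Float,              // position of the last frame (the "target position")
    private val targetY: Float,
    private val positionThresholdPx: Float,  // preset distance value, as in claim 3
    private val presetDurationMs: Long       // preset duration, as in claim 4
) {
    private var startTimeMs = 0L
    fun start(nowMs: Long) { startTimeMs = nowMs }

    // Playback counts as ended when the current frame is near the target position (claim 3)
    // or the preset duration has elapsed (claim 4).
    fun hasEnded(frame: EffectFrame, nowMs: Long): Boolean {
        val nearTarget = hypot(frame.x - targetX, frame.y - targetY) < positionThresholdPx
        val timedOut = nowMs - startTimeMs >= presetDurationMs
        return nearTarget || timedOut
    }
}

// A first operation arrives while the first interface is shown at the first refresh rate:
// keep that rate while the dynamic effect plays, and switch only once the effect has ended.
fun playEffectThenSwitch(
    frames: List<EffectFrame>,
    player: DynamicEffectPlayer,
    display: DisplayController,
    secondRefreshRateHz: Int,
    frameIntervalMs: Long
) {
    var nowMs = 0L
    player.start(nowMs)
    for (frame in frames) {
        // ... render `frame` of the transition at the current (first) refresh rate ...
        nowMs += frameIntervalMs
        if (player.hasEnded(frame, nowMs)) {
            display.switchTo(secondRefreshRateHz) // the second interface is then drawn at the second rate
            return
        }
    }
    display.switchTo(secondRefreshRateHz) // fallback: all N frames were shown
}

fun main() {
    val frames = (1..10).map { EffectFrame(it, it * 50f, it * 80f) } // positions are made up
    val display = DisplayController(refreshRateHz = 60)
    val player = DynamicEffectPlayer(targetX = 500f, targetY = 800f, positionThresholdPx = 60f, presetDurationMs = 300)
    playEffectThenSwitch(frames, player, display, secondRefreshRateHz = 120, frameIntervalMs = 16)
    println("Refresh rate after the effect: ${display.refreshRateHz} Hz") // expected: 120
}
```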
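
Claim 5 determines the second refresh rate from the target application's package name via a preset switching rule. The sketch below assumes that rule is a simple package-name-to-rate lookup with a default rate; the package names and rates shown are invented examples, not values taken from the patent.

```kotlin
// Hypothetical "preset refresh rate switching rule": a mapping from application package
// names to the refresh rate used for the second interface.
class RefreshRateSwitchingRule(
    private val packageToRate: Map<String, Int>,
    private val defaultRateHz: Int
) {
    // Determine the second refresh rate from the target application's package name.
    fun secondRefreshRateFor(packageName: String): Int =
        packageToRate[packageName] ?: defaultRateHz
}

fun main() {
    val rule = RefreshRateSwitchingRule(
        packageToRate = mapOf(
            "com.example.video" to 60,   // assumed: video playback capped at 60 Hz
            "com.example.game" to 120,   // assumed: games allowed the high rate
            "com.example.reader" to 30   // assumed: mostly static content lowered further
        ),
        defaultRateHz = 90
    )
    println(rule.secondRefreshRateFor("com.example.game"))    // 120
    println(rule.secondRefreshRateFor("com.example.unknown")) // 90 (falls back to the default)
}
```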
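
Claims 12-14 split the work across a desktop launcher, a dynamic effect identification module, and a dynamic effect playing component that exchange start and end notifications. The following sketch wires three such components together under those assumed roles; the interfaces, message names, and behaviour are illustrative only and are not the patented implementation.

```kotlin
// Attributes carried by the "first target message" (contents are illustrative).
data class DynamicEffectAttribute(
    val durationMs: Long,
    val startPosition: Pair<Float, Float>,
    val endPosition: Pair<Float, Float>
)

// Plays the effect and reports back when playback has ended (claims 13/14).
class DynamicEffectPlayingComponent(private val onPlaybackEnded: () -> Unit) {
    fun play(attribute: DynamicEffectAttribute) {
        println("Playing effect for ${attribute.durationMs} ms from ${attribute.startPosition} to ${attribute.endPosition}")
        onPlaybackEnded() // end reached by the position threshold or the preset duration
    }
}

// Receives the start/end notifications and drives the refresh-rate decision.
class DynamicEffectIdentificationModule(private val player: DynamicEffectPlayingComponent) {
    fun onFirstEffectNotification(attribute: DynamicEffectAttribute) {
        player.play(attribute) // forwards the attribute as the "first target message"
    }
    fun onSecondEffectNotification() {
        println("Dynamic effect ended; the display can now switch to the second refresh rate.")
    }
}

// Reacts to the user's first operation and relays the end-of-playback notification.
class DesktopLauncher(private val identifier: DynamicEffectIdentificationModule) {
    fun onFirstOperation(attribute: DynamicEffectAttribute) {
        identifier.onFirstEffectNotification(attribute) // the "first dynamic effect notification message"
    }
    fun onPlaybackEnded() {
        identifier.onSecondEffectNotification() // the "second dynamic effect notification message"
    }
}

fun main() {
    var launcherRef: DesktopLauncher? = null
    val player = DynamicEffectPlayingComponent(onPlaybackEnded = { launcherRef?.onPlaybackEnded() })
    val identifier = DynamicEffectIdentificationModule(player)
    val launcher = DesktopLauncher(identifier)
    launcherRef = launcher
    launcher.onFirstOperation(DynamicEffectAttribute(300, 0f to 0f, 500f to 800f))
}
```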
CN202210023829.8A 2022-01-10 2022-01-10 Display method, electronic device and storage medium Active CN114518817B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210023829.8A CN114518817B (en) 2022-01-10 2022-01-10 Display method, electronic device and storage medium
CN202310382648.9A CN116501210B (en) 2022-01-10 2022-01-10 Display method, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210023829.8A CN114518817B (en) 2022-01-10 2022-01-10 Display method, electronic device and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310382648.9A Division CN116501210B (en) 2022-01-10 2022-01-10 Display method, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114518817A true CN114518817A (en) 2022-05-20
CN114518817B CN114518817B (en) 2023-04-07

Family

ID=81597576

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310382648.9A Active CN116501210B (en) 2022-01-10 2022-01-10 Display method, electronic equipment and storage medium
CN202210023829.8A Active CN114518817B (en) 2022-01-10 2022-01-10 Display method, electronic device and storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202310382648.9A Active CN116501210B (en) 2022-01-10 2022-01-10 Display method, electronic equipment and storage medium

Country Status (1)

Country Link
CN (2) CN116501210B (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115774594A (en) * 2022-12-21 2023-03-10 深圳市坚果软件有限责任公司 Application switching method, projection device and computer-readable storage medium
CN116684677A (en) * 2022-09-20 2023-09-01 荣耀终端有限公司 A method for playing dynamic effects of electronic equipment, electronic equipment and storage medium
CN116701307A (en) * 2022-12-20 2023-09-05 荣耀终端有限公司 Interface display method and terminal device of reading application
CN116884368A (en) * 2023-07-14 2023-10-13 广州炫视智能科技有限公司 A system and method for automatic adjustment of screen refresh rate and power consumption
CN116991274A (en) * 2023-09-28 2023-11-03 荣耀终端有限公司 A method for handling exceptions in scrolling effects and electronic equipment
CN117130698A (en) * 2023-03-29 2023-11-28 荣耀终端有限公司 Menu display method and electronic device
WO2024041047A1 (en) * 2022-08-24 2024-02-29 荣耀终端有限公司 Screen refresh rate switching method and electronic device
CN117724772A (en) * 2023-05-25 2024-03-19 荣耀终端有限公司 Application program exit control method and device
WO2024055904A1 (en) * 2022-09-14 2024-03-21 荣耀终端有限公司 Method for requesting vsync signal, and electronic device
CN118053408A (en) * 2022-11-16 2024-05-17 中兴通讯股份有限公司 A method, device and computer readable storage medium for adjusting refresh rate
WO2024148945A1 (en) * 2023-01-10 2024-07-18 荣耀终端有限公司 Refresh rate adjustment method and electronic device
CN118550617A (en) * 2023-02-27 2024-08-27 华为技术有限公司 Display method of electronic equipment, electronic equipment and storage medium
WO2024212627A1 (en) * 2023-04-11 2024-10-17 华为技术有限公司 Method for adjusting screen refresh rate and electronic device
CN119649723A (en) * 2024-10-17 2025-03-18 荣耀终端股份有限公司 Display screen refresh rate switching method, electronic device and storage medium
EP4546126A4 (en) * 2022-10-19 2025-09-24 Huawei Tech Co Ltd INTERFACE GENERATING METHOD AND ELECTRONIC DEVICE
WO2025208446A1 (en) * 2024-04-03 2025-10-09 荣耀终端股份有限公司 Method for setting refresh rate, and electronic device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN120276794A (en) * 2023-12-29 2025-07-08 荣耀终端股份有限公司 PWM frequency adjustment method, electronic device, and readable storage medium
CN119271318B (en) * 2024-01-04 2025-09-16 荣耀终端股份有限公司 Application management method and related device
CN120335906A (en) * 2024-01-10 2025-07-18 荣耀终端股份有限公司 Frame synthesis method, electronic device and computer readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070279407A1 (en) * 2006-05-30 2007-12-06 Maximino Vasquez Switching of display refresh rates
US20180261190A1 (en) * 2017-03-10 2018-09-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method for Controlling Display Refresh Rate and Electronic Device
CN110377251A (en) * 2019-06-06 2019-10-25 努比亚技术有限公司 A kind of screen refresh rate method of adjustment, terminal and computer readable storage medium
CN112256219A (en) * 2020-10-13 2021-01-22 北京小米移动软件有限公司 Display method and device, terminal and storage medium
US10964262B1 (en) * 2018-08-30 2021-03-30 Apple Inc. Systems and methods for reducing visual artifacts in displays due to refresh rate
CN112667340A (en) * 2020-12-31 2021-04-16 努比亚技术有限公司 Screen refresh control method, mobile terminal and computer readable storage medium
CN112689168A (en) * 2020-12-09 2021-04-20 四川金熊猫新媒体有限公司 Dynamic effect processing method, dynamic effect display method and dynamic effect processing device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112445315B (en) * 2019-08-28 2024-11-05 北京小米移动软件有限公司 Screen refresh frame rate control method, device and storage medium
US11276340B2 (en) * 2019-12-31 2022-03-15 Micron Technology, Inc. Intelligent adjustment of screen refresh rate
CN114842816A (en) * 2020-03-06 2022-08-02 华为技术有限公司 Refresh rate switching method and electronic device
CN113438552B (en) * 2021-05-19 2022-04-19 荣耀终端有限公司 A refresh rate adjustment method and electronic device

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024041047A1 (en) * 2022-08-24 2024-02-29 荣耀终端有限公司 Screen refresh rate switching method and electronic device
CN117711356A (en) * 2022-08-24 2024-03-15 荣耀终端有限公司 Screen refresh rate switching method and electronic device
CN117711355A (en) * 2022-08-24 2024-03-15 荣耀终端有限公司 Screen refresh rate switching method and electronic equipment
WO2024055904A1 (en) * 2022-09-14 2024-03-21 荣耀终端有限公司 Method for requesting vsync signal, and electronic device
EP4538875A4 (en) * 2022-09-14 2025-07-02 Honor Device Co Ltd Method for requesting a VSYNC signal and electronic device
CN116684677A (en) * 2022-09-20 2023-09-01 荣耀终端有限公司 A method for playing dynamic effects of electronic equipment, electronic equipment and storage medium
CN116684677B (en) * 2022-09-20 2024-06-11 荣耀终端有限公司 Electronic device motion effect playing method, electronic device and storage medium
EP4546126A4 (en) * 2022-10-19 2025-09-24 Huawei Tech Co Ltd INTERFACE GENERATING METHOD AND ELECTRONIC DEVICE
CN118053408A (en) * 2022-11-16 2024-05-17 中兴通讯股份有限公司 A method, device and computer readable storage medium for adjusting refresh rate
WO2024103697A1 (en) * 2022-11-16 2024-05-23 中兴通讯股份有限公司 Method and apparatus for adjusting refresh rate, and computer-readable storage medium
CN116701307A (en) * 2022-12-20 2023-09-05 荣耀终端有限公司 Interface display method and terminal device of reading application
CN115774594A (en) * 2022-12-21 2023-03-10 深圳市坚果软件有限责任公司 Application switching method, projection device and computer-readable storage medium
WO2024148945A1 (en) * 2023-01-10 2024-07-18 荣耀终端有限公司 Refresh rate adjustment method and electronic device
WO2024179249A1 (en) * 2023-02-27 2024-09-06 华为技术有限公司 Electronic device display method, electronic device, and storage medium
CN118550617B (en) * 2023-02-27 2025-11-21 华为技术有限公司 Display method of electronic equipment, electronic equipment and storage medium
CN118550617A (en) * 2023-02-27 2024-08-27 华为技术有限公司 Display method of electronic equipment, electronic equipment and storage medium
CN117130698A (en) * 2023-03-29 2023-11-28 荣耀终端有限公司 Menu display method and electronic device
WO2024212627A1 (en) * 2023-04-11 2024-10-17 华为技术有限公司 Method for adjusting screen refresh rate and electronic device
CN117724772A (en) * 2023-05-25 2024-03-19 荣耀终端有限公司 Application program exit control method and device
CN116884368A (en) * 2023-07-14 2023-10-13 广州炫视智能科技有限公司 A system and method for automatic adjustment of screen refresh rate and power consumption
CN116991274B (en) * 2023-09-28 2023-12-19 荣耀终端有限公司 A method for handling exceptions in upward sliding effects and electronic equipment
CN116991274A (en) * 2023-09-28 2023-11-03 荣耀终端有限公司 A method for handling exceptions in scrolling effects and electronic equipment
WO2025208446A1 (en) * 2024-04-03 2025-10-09 荣耀终端股份有限公司 Method for setting refresh rate, and electronic device
CN119649723A (en) * 2024-10-17 2025-03-18 荣耀终端股份有限公司 Display screen refresh rate switching method, electronic device and storage medium

Also Published As

Publication number Publication date
CN114518817B (en) 2023-04-07
CN116501210A (en) 2023-07-28
CN116501210B (en) 2024-06-11

Similar Documents

Publication Publication Date Title
CN116501210B (en) Display method, electronic equipment and storage medium
CN114092595B (en) Image processing method and electronic equipment
CN114579075B (en) Data processing method and related device
CN117711356B (en) Screen refresh rate switching method and electronic equipment
CN114697446B (en) Refresh rate switching method, electronic device and storage medium
CN114579076A (en) Data processing method and related device
CN114661263A (en) Display method, electronic equipment and storage medium
US12452486B2 (en) Refresh rate setting method and related device
CN117724781B (en) A method for playing application startup animation and electronic device
WO2024016798A1 (en) Image display method and related apparatus
CN116414337A (en) Frame rate switching method and device
CN120510057A (en) Image processing method and electronic equipment
CN116414336A (en) Frame rate switching method and device
CN117689785B (en) Rendering method, electronic device and computer readable storage medium
CN117724779B (en) Method for generating interface image and electronic device
CN116077940B (en) Drawing processing method and related device in game application
CN118363688A (en) Interface rendering method, electronic device and computer readable storage medium
WO2024212549A1 (en) Window animation processing method and electronic device
WO2023124227A9 (en) Frame rate switching method and device
WO2026001314A1 (en) Image composition method and electronic device
CN119948519A (en) Data processing method and related device
WO2025148971A1 (en) Display method, electronic device and storage medium
CN119311236A (en) A layer synthesis method, electronic device and storage medium
CN120704571A (en) Display processing method and electronic device
WO2025148975A1 (en) Frame composition method, electronic device, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address
Address after: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040
Patentee after: Honor Terminal Co.,Ltd.
Country or region after: China
Address before: 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong
Patentee before: Honor Device Co.,Ltd.
Country or region before: China