
WO2018161534A1 - Image display method, dual-screen terminal, and computer-readable non-volatile storage medium - Google Patents


Info

Publication number
WO2018161534A1
WO2018161534A1 · PCT/CN2017/102896
Authority
WO
WIPO (PCT)
Prior art keywords
screen
image
layers
display
channel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2017/102896
Other languages
English (en)
Chinese (zh)
Inventor
梅正怡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Mobile Communications Technology Co Ltd
Original Assignee
Hisense Mobile Communications Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Mobile Communications Technology Co Ltd filed Critical Hisense Mobile Communications Technology Co Ltd
Publication of WO2018161534A1
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1431Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using a single graphics controller

Definitions

  • the present invention relates to the field of electronic technologies, and in particular, to a method for displaying an image, a dual screen terminal, and a computer readable nonvolatile storage medium.
  • Terminals such as mobile phones commonly have display functions.
  • Typically, only one screen is provided in the terminal used by the user, and the user interacts with the terminal through the image displayed on that screen.
  • Most terminals therefore support display on only a single screen, and terminals that can simultaneously display images on two screens are relatively rare.
  • Embodiments of the present invention provide a method of displaying an image, a dual screen terminal, and a computer readable nonvolatile storage medium.
  • an embodiment of the present invention provides a method for displaying an image, where the method includes:
  • the first image is transmitted to the first screen for display by a hardware synthesizer in the dual screen terminal, and the second image is transmitted to the second screen for display.
  • the synthesizing the plurality of layers to be synthesized corresponding to the first screen to form a first image includes:
  • synthesizing the plurality of layers to be synthesized corresponding to the second screen to form a second image including:
  • the first screen is an ink screen
  • the acquiring a plurality of layers to be synthesized corresponding to the first screen includes:
  • synthesizing the plurality of layers to be synthesized corresponding to the first screen, and forming the first image includes:
  • acquiring the plurality of layers to be synthesized corresponding to the second screen includes:
  • synthesizing the plurality of layers to be synthesized corresponding to the second screen, and forming the second image includes:
  • acquiring the plurality of layers to be synthesized corresponding to the second screen includes:
  • acquiring, through the channels of the hardware synthesizer other than the first channel, the plurality of layers to be synthesized corresponding to the second screen, where one channel of the hardware synthesizer acquires one layer, and the number of the plurality of layers to be synthesized corresponding to the second screen is less than or equal to the number of channels of the hardware synthesizer minus one;
  • synthesizing the plurality of layers to be synthesized corresponding to the second screen to form a second image including:
  • the layers in the first layer set do not change.
  • the layers in the second layer set are all changed, and the multiple layers to be synthesized corresponding to the second screen are obtained, including:
  • synthesizing the plurality of layers to be synthesized corresponding to the second screen to form a second image including:
  • synthesizing each layer in the second layer set with the third image to form the second image.
  • the acquiring, through the first channel of the hardware synthesizer, the first image stored in the first buffer area, and transmitting the first image to the first screen for display includes:
  • the method further includes:
  • the acquiring, through the first channel of the hardware synthesizer, the first image stored in the first buffer area, and transmitting the first image to the first screen for display includes:
  • decoding the first image to obtain logical TCON data corresponding to the first image, and driving the first screen to display according to the TCON data.
  • an embodiment of the present invention provides a dual-screen terminal, including: a memory, a processor, a GPU, a hardware synthesizer, a first screen, a second screen, and a program that is stored in the memory and executable on the processor; the processor is connected to the memory, the hardware synthesizer, and the GPU respectively, the GPU is connected to the memory, and the hardware synthesizer is connected to the memory, the first screen, and the second screen respectively; when executing the program, the processor implements:
  • the first image is transmitted to the first screen for display by a hardware synthesizer in the dual screen terminal, and the second image is transmitted to the second screen for display.
  • the processor implements the program when:
  • the first screen is an ink screen
  • when the processor executes the program, the following is implemented:
  • the processor executes the program:
  • the processor executes the program:
  • acquiring, through the channels of the hardware synthesizer other than the first channel, the plurality of layers to be synthesized corresponding to the second screen, where one channel of the hardware synthesizer acquires one layer, and the number of the plurality of layers to be synthesized corresponding to the second screen is less than or equal to the number of channels of the hardware synthesizer minus one;
  • when the plurality of layers to be synthesized corresponding to the second screen include a first layer set and a second layer set, the processor, when executing the program, implements:
  • synthesizing each layer in the second layer set with the third image to form the second image.
  • the processor implements the program when:
  • the processor when executing the program further implements:
  • the processor implements the program when:
  • decoding the first image to obtain logical TCON data corresponding to the first image, and driving the first screen to display according to the TCON data.
  • an embodiment of the present invention provides a computer-readable non-volatile storage medium, where the non-volatile storage medium stores a program, and when the program is executed, the method steps of the first aspect are carried out.
  • an embodiment of the present invention provides an apparatus for displaying an image, including:
  • An acquiring module configured to respectively acquire a plurality of layers to be synthesized corresponding to the first screen and multiple layers to be synthesized corresponding to the second screen;
  • a synthesizing module configured to synthesize a plurality of layers to be synthesized corresponding to the first screen to form a first image, and synthesize the plurality of layers to be synthesized corresponding to the second screen to form a second image ;
  • a display module configured to transmit the first image to the first screen for display through a hardware synthesizer in the dual screen terminal, and transmit the second image to the second screen for display.
  • the synthesizing module is configured to synthesize, according to the display position on the first screen of each of the plurality of layers to be synthesized corresponding to the first screen, the plurality of layers to be synthesized corresponding to the first screen to form the first image; and to synthesize, according to the display position on the second screen of each of the plurality of layers to be synthesized corresponding to the second screen, the plurality of layers to be synthesized corresponding to the second screen to form the second image.
  • the acquiring module is configured to acquire, when the first screen is an ink screen, the plurality of layers to be synthesized corresponding to the first screen through the graphics processor (GPU) in the dual-screen terminal;
  • the synthesizing module is configured to synthesize a plurality of layers to be synthesized corresponding to the first screen by using the GPU to form the first image, and store the first image in a first buffer area;
  • the display module is configured to acquire a first image stored in the first buffer area by using a first channel of the hardware synthesizer, and transmit the first image to the first screen for display.
  • the acquiring module is configured to acquire, through the GPU, the plurality of layers to be synthesized corresponding to the second screen when each of the plurality of layers to be synthesized corresponding to the second screen does not change;
  • the synthesizing module is configured to synthesize a plurality of layers to be synthesized corresponding to the second screen by using the GPU to form the second image, and store the second image in a second buffer area;
  • the display module is configured to acquire a second image stored in the second buffer area by using a second channel of the hardware synthesizer, and transmit the second image to the second screen for display.
  • the acquiring module is configured to acquire, when each of the plurality of layers to be synthesized corresponding to the second screen changes, the plurality of layers to be synthesized corresponding to the second screen through the channels of the hardware synthesizer other than the first channel, where one channel of the hardware synthesizer acquires one layer, and the number of the plurality of layers to be synthesized corresponding to the second screen is less than or equal to the number of channels of the hardware synthesizer minus one;
  • the synthesizing module is configured to synthesize a plurality of layers to be synthesized corresponding to the second screen by using the hardware synthesizer to form the second image.
  • the acquiring module is configured to acquire, when the plurality of layers to be synthesized corresponding to the second screen include a first layer set and a second layer set, each layer in the first layer set through the GPU, and to acquire each layer in the second layer set through the remaining channels of the hardware synthesizer other than the first channel and the second channel, where one channel of the hardware synthesizer acquires one layer, and the number of layers in the second layer set is less than or equal to the number of channels of the hardware synthesizer minus two;
  • the synthesizing module is configured to synthesize each layer in the first layer set by the GPU to form a third image, and store the third image in a second buffer area;
  • the acquiring module is further configured to acquire, by using a second channel of the hardware synthesizer, a third image stored in the second buffer area;
  • the synthesizing module is further configured to synthesize each layer in the second layer set and the third image by using the hardware synthesizer to form the second image.
  • the display module is configured to acquire, when a synthesis completion instruction of the first image from the GPU is detected, the first image stored in the first buffer area through the first channel of the hardware synthesizer in the dual-screen terminal and transmit the first image to the first screen for display; and to acquire, when a synthesis completion instruction of the second image from the GPU is detected, the second image stored in the second buffer area through the second channel of the hardware synthesizer and transmit the second image to the second screen for display.
  • the apparatus further includes a deletion module
  • the deleting module is configured to delete the first image in the first buffer area when a display completion instruction of the first image is detected, and to delete the second image in the second buffer area when a display completion instruction of the second image is detected.
  • the apparatus further includes a decoding module
  • the acquiring module is further configured to acquire, by using the first channel of the hardware synthesizer, the first image stored in the first buffer area;
  • the decoding module is configured to decode the first image, obtain logical TCON data corresponding to the first image, and drive the first screen to display according to the TCON data.
  • FIG. 1 is a schematic flowchart of Embodiment 1 of a method for displaying an image provided by the present invention;
  • FIG. 2 is a schematic diagram of layer composition according to an embodiment of the present invention.
  • FIG. 3 is a schematic flowchart diagram of Embodiment 2 of a method for displaying an image according to the present invention
  • FIG. 3a is a schematic diagram of layer synthesis according to Embodiment 2 of the present invention.
  • FIG. 4 is a schematic flow chart of a third embodiment of a method for displaying an image according to the present invention.
  • FIG. 4a is a schematic diagram of layer synthesis according to Embodiment 3 of the present invention.
  • FIG. 4b is a schematic diagram showing a display flow of the first image and the second image;
  • FIG. 5 is a schematic flowchart diagram of Embodiment 4 of a method for displaying an image according to the present invention
  • FIG. 5a is a schematic diagram of layer composition according to Embodiment 4 of the present invention.
  • FIG. 6 is a schematic flowchart diagram of Embodiment 5 of a method for displaying an image according to the present invention.
  • FIG. 6a is a schematic diagram of layer composition according to Embodiment 5 of the present invention.
  • FIG. 7 is a schematic structural diagram of a terminal according to an embodiment of the present disclosure.
  • FIG. 8 is a schematic structural diagram of a terminal according to an embodiment of the present disclosure.
  • FIG. 9 is a schematic flowchart diagram of Embodiment 1 of an apparatus for displaying an image according to the present invention.
  • FIG. 10 is a schematic flowchart diagram of Embodiment 2 of an apparatus for displaying an image according to the present invention.
  • FIG. 11 is a schematic flowchart diagram of Embodiment 3 of an apparatus for displaying an image according to the present invention.
  • the embodiment of the invention provides a method for displaying an image, and the execution body of the method is a dual screen terminal.
  • the dual-screen terminal may be any terminal having a display function, such as a dual-screen mobile phone.
  • the dual-screen terminal may be provided with a graphics processor, a memory, a hardware synthesizer, and screens; the graphics processor may be used for the processing related to synthesizing multiple layers, the memory may be used to store the data required and generated in the following processes, the hardware synthesizer may be used for the processing related to synthesizing multiple layers and to acquire images stored in the buffer areas and transmit them to the screens, and the screens may be used to display images.
  • FIG. 1 is a schematic flowchart diagram of Embodiment 1 of a method for displaying an image provided by the present invention.
  • the execution body of this embodiment is a dual screen terminal, and specifically may be a device having a display image function in a dual screen terminal.
  • the method in this embodiment may include:
  • the method for synthesizing the image to be displayed on the first screen is similar to the method for synthesizing the image to be displayed on the second screen, and the first screen will be taken as an example in conjunction with the specific embodiment.
  • the image displayed by the dual-screen terminal is synthesized from multiple layers; that is, before displaying an image, the dual-screen terminal usually needs to synthesize the multiple layers contained in the image to be displayed on the screen.
  • as the display content and scene change constantly, the number of layers also changes constantly; to display the desired content on the screen, these layers must be combined into a final picture.
  • a smart terminal with a display generally has a GPU and a hardware synthesizer, and the synthesis of layers is done by these two modules; of the two, the hardware synthesizer consumes less power during synthesis.
  • GPU synthesis refers to merging the layers, in order from top to bottom, according to the size and starting position of each layer, as sketched below.
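The following C++ fragment is only a minimal sketch of that merging idea. The Layer and FrameBuffer types, the ARGB pixel format, and the composeLayers function are illustrative assumptions and do not correspond to the terminal's actual GPU composition code, which would also alpha-blend rather than simply overwrite.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical types for illustration; not the terminal's real graphics interfaces.
struct FrameBuffer {
    int width;
    int height;
    std::vector<uint32_t> pixels;                      // ARGB, row-major
    FrameBuffer(int w, int h)
        : width(w), height(h), pixels(static_cast<std::size_t>(w) * h, 0) {}
};

struct Layer {
    int x = 0, y = 0;                                  // starting position on the screen
    int width = 0, height = 0;                         // size of the layer
    std::vector<uint32_t> pixels;                      // the layer's own content
};

// Merge the layers one after another according to each layer's size and
// starting position; layers later in the list end up on top.
void composeLayers(const std::vector<Layer>& layersBottomToTop, FrameBuffer& target) {
    for (const Layer& layer : layersBottomToTop) {
        for (int row = 0; row < layer.height; ++row) {
            for (int col = 0; col < layer.width; ++col) {
                const int dstX = layer.x + col;
                const int dstY = layer.y + row;
                if (dstX < 0 || dstY < 0 || dstX >= target.width || dstY >= target.height)
                    continue;                          // clip pixels that fall off the screen
                // Simplification: an upper layer overwrites the one below it.
                target.pixels[static_cast<std::size_t>(dstY) * target.width + dstX] =
                    layer.pixels[static_cast<std::size_t>(row) * layer.width + col];
            }
        }
    }
}
```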
  • the dual-screen terminal may draw the layers through the GPU (Graphics Processing Unit) or the CPU (Central Processing Unit), or the layers may be drawn by an upper-layer application, to obtain the multiple layers corresponding to the first screen; the GPU or the hardware synthesizer (for example, the overlay module) may then acquire the plurality of layers to be synthesized corresponding to the first screen and the plurality of layers to be synthesized corresponding to the second screen.
  • the GPU or the hardware synthesizer may combine the multiple layers to be synthesized corresponding to the first screen to form a first image.
  • for example, suppose the first image corresponds to four layers to be synthesized, where the contents of the first layer and the second layer are unchanged while the contents of the third layer and the fourth layer change constantly; in that case the GPU can be used to synthesize the first layer and the second layer, the hardware synthesizer can be used to synthesize the third layer and the fourth layer, and the hardware synthesizer then combines the GPU-synthesized image with its own synthesized image to finally obtain the first image.
  • the method for synthesizing the first image is not limited, and is specifically determined according to actual needs.
  • the obtaining process of the second image is the same as that of the first image; refer to the description of the first image, which is not repeated here.
  • the first image is transmitted to the first screen for display through a hardware synthesizer in the dual screen terminal, and the second image is transmitted to the second screen for display.
  • specifically, the dual-screen terminal may send the first image to the first screen for display through the hardware synthesizer (HW overlay), and send the second image to the second screen for display.
  • in this step, the synthesis function of the hardware synthesizer is turned off and the hardware synthesizer works in channel mode.
  • if the first image or the second image is synthesized by the hardware synthesizer itself, the hardware synthesizer transmits the synthesized image directly to the corresponding screen for display; if the first image and the second image are synthesized by the GPU, the hardware synthesizer acquires the two GPU-synthesized images and, using different channels, transmits the first image to the first screen for display and the second image to the second screen for display, as illustrated in the sketch below.
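As a rough picture of that channel mode, the sketch below routes two already-composed images to the two screens over separate channels. The HardwareSynthesizer and Screen types, the bind/presentAll names, and the use of plain pixel vectors for the buffers are assumptions made purely for illustration; this is not the real overlay hardware interface.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

using Image = std::vector<uint32_t>;                   // an already-composed frame (hypothetical)

struct Screen {                                        // hypothetical display endpoint
    void show(const Image& image) { (void)image; /* push the frame to the panel */ }
};

// A hardware synthesizer whose composition function is turned off: each channel
// simply fetches one composed image from its buffer and hands it to one screen.
class HardwareSynthesizer {
public:
    explicit HardwareSynthesizer(std::size_t channelCount) : channels_(channelCount) {}

    void bind(std::size_t channel, const Image* buffer, Screen* screen) {
        channels_.at(channel) = Channel{buffer, screen};
    }

    void presentAll() {                                // pure pass-through ("channel mode")
        for (const Channel& ch : channels_)
            if (ch.buffer && ch.screen)
                ch.screen->show(*ch.buffer);
    }

private:
    struct Channel {
        const Image* buffer = nullptr;
        Screen* screen = nullptr;
    };
    std::vector<Channel> channels_;
};

int main() {
    Image firstImage(100 * 100), secondImage(100 * 100);   // e.g. the two composed frames
    Screen inkScreen, colorScreen;
    HardwareSynthesizer hw(4);
    hw.bind(0, &firstImage, &inkScreen);               // one channel -> first screen
    hw.bind(1, &secondImage, &colorScreen);            // another channel -> second screen
    hw.presentAll();
}
```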
  • the solution of the embodiment of the present invention may be applicable to a scenario in which the first screen and the second screen are simultaneously displayed.
  • the dual screen terminal can also adopt the method of displaying an image on the first screen or the second screen described in the present scheme.
  • with the method for displaying an image provided by the embodiment of the present invention, a plurality of layers to be synthesized corresponding to the first screen and a plurality of layers to be synthesized corresponding to the second screen are acquired respectively; the plurality of layers corresponding to the first screen are synthesized to form a first image, and the plurality of layers corresponding to the second screen are synthesized to form a second image; the first image is transmitted to the first screen for display through the hardware synthesizer in the dual-screen terminal, and the second image is transmitted to the second screen for display, so that both screens of the dual-screen terminal can display images composed of multiple layers.
  • the multiple layers to be synthesized corresponding to the first screen are combined to form a first image, and the method includes:
  • synthesizing, according to the display position on the first screen of each of the plurality of layers to be synthesized corresponding to the first screen, the plurality of layers to be synthesized corresponding to the first screen to form the first image.
  • an upper layer application (such as a desktop application or a game application) that triggers a dual screen terminal to display an image may send a composite location corresponding to each layer to a GPU or a hardware synthesizer before synthesizing a plurality of layers.
  • the composite location may be a display location for identifying the layer ultimately in the screen.
  • the GPU or the hardware synthesizer may synthesize the multiple layers according to the composite position corresponding to each of the layers to form the first image; for example, the desktop image displayed by the dual-screen terminal is synthesized from a status bar layer, a wallpaper layer, and an icon layer, which can be combined according to the composite position of each layer to obtain the final desktop image, as shown in FIG. 2.
  • the composite location may be a display location in the first screen.
  • the composite position corresponding to each of the plurality of layers to be synthesized corresponding to the first screen may include the display position of that layer on the first screen, that is, its position within the screen (for example, the upper area or the middle area of the screen); the display position may be given as the pixel positions of the display area in which the layer is located (for example, the pixel positions of two diagonal vertices of that area).
  • the composite position of a layer may also include the ordinal number of the layer among all the layers (for example, the layer may be the second of all the layers), as captured in the sketch below.
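A small data structure is enough to carry such a composite position; the field names below are hypothetical and only mirror what the paragraph describes, namely two diagonal vertices of the layer's display area plus the layer's place in the stacking order.

```cpp
// Hypothetical representation of the "composite position" an upper-layer
// application could send for each layer before composition.
struct PixelPoint {
    int x = 0;
    int y = 0;
};

struct CompositePosition {
    PixelPoint topLeft;        // one diagonal vertex of the layer's display area
    PixelPoint bottomRight;    // the opposite diagonal vertex
    int stackIndex = 0;        // which layer this is among all layers (0 = bottom-most)
};
```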
  • the multiple layers to be synthesized corresponding to the second screen are synthesized to form the second image in the same way: the GPU and/or the hardware synthesizer synthesize the plurality of layers to be synthesized corresponding to the second screen according to the display position on the second screen of each of those layers to form the second image; the specific process is the same as that of synthesizing the first image and, with reference to the above description, is not repeated here.
  • FIG. 3 is a schematic flowchart diagram of Embodiment 2 of a method for displaying an image provided by the present invention.
  • this embodiment relates to the specific process of acquiring and displaying the first image when the first screen is an ink screen; as shown in FIG. 3, the method in this embodiment may include:
  • an electronic ink screen differs from an ordinary color screen, and its driving method also differs: the composite display data of an ordinary color screen is output directly to the color screen for display, whereas the composite display data of an ink screen needs to be converted into TCON data through software decoding.
  • the layers for the ink screen are therefore synthesized by the GPU, because data synthesized by the GPU can be saved and the saved composite data can then be decoded; if the composite data were output directly to the screen by the hardware synthesizer, the final composite data could not be obtained and therefore could not be decoded. Accordingly, in this embodiment, when the first screen is an ink screen, the plurality of layers to be synthesized corresponding to the first screen are acquired through the GPU, as shown in FIG. 3a.
  • the GPU acquires the multiple layers to be synthesized corresponding to the first screen and synthesizes them; for example, the GPU synthesizes the plurality of layers to be synthesized corresponding to the first screen according to the display position of each of those layers on the first screen to form the first image.
  • the synthesized first image may be stored in a storage area corresponding to the first screen, that is, in the first buffer area, where the first buffer area may be a memory storage area used to store the image to be displayed on the first screen; for example, the first image may be stored in the fb_target0 frame buffer.
  • the dual screen terminal may acquire the first image stored in the first buffer area through the first channel in the hardware synthesizer (HW overlay), and then, Send to the first screen for display.
  • the synthesis function of the hardware synthesizer in this step is turned off, in the channel mode, and the first channel may be any channel of the hardware synthesizer that is not currently used.
  • the foregoing S203 may specifically include:
  • decoding the first image to obtain logical TCON data corresponding to the first image, and driving the first screen to display according to the TCON data.
  • specifically, the decoding program in the dual-screen terminal reads the first image from the first buffer area, decodes the first image into TCON data, and drives the ink screen with the TCON data, thereby displaying the first image on the first screen; a sketch of this path follows.
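A minimal sketch of that ink-screen path, under the assumption of hypothetical decodeToTcon and InkScreen names; the real decoding into TCON data is panel-specific and is not specified in the text, so a toy grayscale reduction stands in for it.

```cpp
#include <cstdint>
#include <vector>

using Image = std::vector<uint32_t>;                   // GPU-composed frame (hypothetical)
using TconData = std::vector<uint8_t>;                 // driving data for the ink panel (hypothetical)

// Hypothetical software decoder: converts the composed image into the logical
// TCON data the electronic-ink panel needs.
TconData decodeToTcon(const Image& composedImage) {
    TconData data;
    data.reserve(composedImage.size());
    for (uint32_t px : composedImage) {
        const uint8_t r = (px >> 16) & 0xFF;
        const uint8_t g = (px >> 8) & 0xFF;
        const uint8_t b = px & 0xFF;
        data.push_back(static_cast<uint8_t>((r + g + b) / 3));   // placeholder conversion
    }
    return data;
}

struct InkScreen {                                     // hypothetical panel driver
    void drive(const TconData& tcon) { (void)tcon; /* push driving data to the panel */ }
};

// First-screen display path: the image read from the first buffer area over the
// first channel is decoded into TCON data, which then drives the ink screen.
void displayOnInkScreen(const Image& firstImage, InkScreen& screen) {
    screen.drive(decodeToTcon(firstImage));
}
```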
  • the foregoing S203 may be specifically:
  • after the GPU completes the synthesis of the multiple layers and stores the synthesized first image in the first buffer area, it may send a synthesis completion notification to the hardware synthesizer; when the hardware synthesizer receives this notification, that is, when the dual-screen terminal detects the synthesis completion instruction of the first image from the GPU, the first image stored in the first buffer area may be acquired through the first channel and transmitted to the first screen for display.
  • the first image in the first buffer area may then be deleted; the processing may be as follows: when the display completion instruction of the first image is detected, the first image in the first buffer area is deleted.
  • the dual-screen terminal may further include a buffer management service, and the dual-screen terminal may manage the first buffer area and the second buffer area through the buffer management service.
  • when the first image has been displayed, the display completion instruction of the first image may be sent to the buffer management service; after the buffer management service receives the display completion instruction of the first image, the first image in the first buffer area can be deleted, as sketched below.
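The buffer management just described can be pictured as below; the BufferManagementService class and its method names are assumptions made for illustration only.

```cpp
#include <cstdint>
#include <optional>
#include <utility>
#include <vector>

using Image = std::vector<uint32_t>;                   // composed frame (hypothetical)

// Hypothetical buffer management service: it holds the first and second buffer
// areas and frees an image once the corresponding display completion
// instruction has been received.
class BufferManagementService {
public:
    void storeFirst(Image image)  { firstBuffer_  = std::move(image); }
    void storeSecond(Image image) { secondBuffer_ = std::move(image); }

    // Called when the display completion instruction of the first image arrives.
    void onFirstImageDisplayed()  { firstBuffer_.reset(); }
    // Called when the display completion instruction of the second image arrives.
    void onSecondImageDisplayed() { secondBuffer_.reset(); }

private:
    std::optional<Image> firstBuffer_;                 // stands in for the first buffer area
    std::optional<Image> secondBuffer_;                // stands in for the second buffer area
};
```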
  • with the method for displaying an image provided by the embodiment of the present invention, when the first screen is an ink screen, the plurality of layers to be synthesized corresponding to the first screen are acquired through the GPU, the GPU synthesizes those layers to form the first image, the first image is stored in the first buffer area, and finally the first image stored in the first buffer area is acquired through the first channel of the hardware synthesizer and transmitted to the first screen for display, thereby displaying the first image on the first screen.
  • FIG. 4 is a schematic flowchart diagram of Embodiment 3 of a method for displaying an image according to the present invention.
  • the present embodiment relates to a specific process of acquiring a second image and displaying a second image when each of the plurality of layers to be synthesized corresponding to the second screen is unchanged.
  • the method in this embodiment may specifically include:
  • a layer whose content does not change is regarded as an unchanged layer; when each of the plurality of layers to be synthesized corresponding to the second screen does not change, these layers can be synthesized by the GPU to form the second image. This is mainly because the image synthesized by the GPU is saved, so the unchanged layers need to be synthesized only once: the synthesized second image is placed in the fb_target1 frame buffer, and when the next frame is displayed the layers corresponding to the second image do not need to be re-synthesized; the second image can be read directly from the fb_target1 frame buffer, which reduces the workload of the GPU and the power consumption of the mobile phone.
  • specifically, the dual-screen terminal may acquire the plurality of layers to be synthesized corresponding to the second screen through the GPU (the drawing of the layers may be consistent with the method described in S101), then synthesize the plurality of layers corresponding to the second screen through the GPU to obtain the second image, and store it in the second buffer area; the second buffer area may be a memory area, different from the first buffer area, used to store the image to be displayed on the second screen, and the second image may, for example, be stored in the fb_target1 frame buffer, as shown in FIG. 4a. A sketch of this reuse follows.
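A sketch of that reuse, assuming a hypothetical gpuCompose placeholder and a SecondScreenCache wrapper standing in for the fb_target1 buffer; it only illustrates composing once and returning the cached frame while nothing changes.

```cpp
#include <cstdint>
#include <optional>
#include <vector>

struct Layer { std::vector<uint32_t> pixels; };        // hypothetical layer content
using Image = std::vector<uint32_t>;                   // composed frame (hypothetical)

// Placeholder for the GPU composition step (see the earlier composition sketch).
Image gpuCompose(const std::vector<Layer>& layers) { (void)layers; return Image{}; }

// While none of the second screen's layers change, compose once with the GPU,
// keep the result (the second buffer area's contents), and hand back the cached
// image for every following frame instead of re-synthesizing.
class SecondScreenCache {
public:
    const Image& frame(const std::vector<Layer>& layers, bool anyLayerChanged) {
        if (anyLayerChanged || !cached_) {
            cached_ = gpuCompose(layers);              // recompose only when something changed
        }
        return *cached_;                               // unchanged frames reuse the stored image
    }

private:
    std::optional<Image> cached_;                      // stands in for the fb_target1 buffer
};
```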
  • the dual screen terminal may acquire the second image through the second channel of the hardware synthesizer, and transmit the second image to the second screen for display, where the second The channel may be any channel of the hardware synthesizer different from the first channel and not used, and the display flow of the first image and the second image is as shown in FIG. 4b.
  • the specific implementation in the process of displaying the second image is similar to the process of displaying the first image, and details are not described herein.
  • the foregoing S303 may be specifically:
  • after the GPU completes the synthesis of the multiple layers and stores the synthesized second image in the second buffer area, a synthesis completion notification may be sent to the hardware synthesizer; when the hardware synthesizer receives the notification of the completion of the synthesis of the second image, that is, when the dual-screen terminal detects the synthesis completion instruction of the second image from the GPU, the second image stored in the second buffer area may be acquired through the second channel and transmitted to the second screen for display.
  • the second image in the second buffer area may then be deleted; the processing may be as follows: when the display completion instruction of the second image is detected, the second image in the second buffer area is deleted.
  • the dual-screen terminal may further include a buffer management service through which the first buffer area and the second buffer area are managed; when the second image has been displayed, the display completion instruction of the second image may be sent to the buffer management service, and after the buffer management service receives the display completion instruction of the second image, the second image in the second buffer area can be deleted.
  • the method for displaying an image according to the embodiment of the present invention when each of the plurality of layers to be synthesized corresponding to the second screen does not change, the GPU acquires multiple layers to be synthesized corresponding to the second screen. And synthesizing the plurality of layers to be synthesized corresponding to the second screen by the GPU to form a second image, storing the second image in the second buffer area, and finally acquiring the second buffer area through the second channel of the hardware synthesizer The stored second image is transmitted to the second screen for display, thereby implementing display of the second image on the second screen.
  • FIG. 5 is a schematic flowchart diagram of Embodiment 4 of a method for displaying an image provided by the present invention.
  • the present embodiment relates to a specific process of acquiring a second image and displaying a second image when each of the plurality of layers to be synthesized corresponding to the second screen changes.
  • the method in this embodiment may specifically include:
  • the one channel of the hardware synthesizer acquires a layer, and the number of layers of the plurality of layers to be synthesized corresponding to the second screen is less than or equal to the number of channels of the hardware synthesizer minus one.
  • this embodiment uses a hardware synthesizer for synthesis.
  • for example, if the hardware synthesizer of the dual-screen terminal has four channels and channel 1 is used for the first screen, then the second screen can only use the three channels from channel 2 to channel 4.
  • in the hardware synthesizer, one layer corresponds to one channel, that is, one channel can acquire only one layer; therefore, in this embodiment, if the number of channels of the hardware synthesizer is n and the number of layers to be synthesized corresponding to the second screen is m, then m must be less than or equal to n-1, so that the hardware synthesizer can acquire every one of the layers to be synthesized corresponding to the second screen. In this embodiment, the channels of the hardware synthesizer other than the first channel acquire the plurality of layers to be synthesized corresponding to the second screen, as shown in FIG. 5a.
  • the hardware synthesizer then combines these layers to form the second image; after the hardware synthesizer synthesizes the second image, the second image is not saved but is sent directly to the second screen for display. When the next frame is to be displayed, the hardware synthesizer combines the layers corresponding to the next frame to form a new second image and sends it directly to the second screen for display. In this way, the power consumption of the dual-screen terminal for processing a changing second image is reduced; a sketch follows.
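The sketch below captures that all-layers-changed case under stated assumptions: hardwareCompose and sendToSecondScreen are hypothetical stand-ins, and the only real logic shown is the one-layer-per-channel constraint (at most the channel count minus one, since one channel is reserved for the first screen) and the compose-and-send-without-caching flow.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

struct Layer { std::vector<uint32_t> pixels; };        // hypothetical layer content
using Image = std::vector<uint32_t>;

// Hypothetical stand-ins for the overlay hardware and the second panel.
Image hardwareCompose(const std::vector<Layer>& layers) { (void)layers; return Image{}; }
void sendToSecondScreen(const Image& image) { (void)image; }

// When every layer of the second screen changes: feed the layers to the free
// channels of the hardware synthesizer (one layer per channel, one channel
// reserved for the first screen), compose there, and send the result straight
// to the second screen without storing it in a buffer area.
bool presentChangingSecondScreen(const std::vector<Layer>& layers, std::size_t totalChannels) {
    if (totalChannels < 2)
        return false;                                  // need at least one free channel
    const std::size_t freeChannels = totalChannels - 1;
    if (layers.size() > freeChannels)
        return false;                                  // cannot give every layer its own channel

    const Image secondImage = hardwareCompose(layers); // composed inside the synthesizer
    sendToSecondScreen(secondImage);                   // displayed directly, nothing cached
    return true;
}
```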
  • with this method for displaying an image, when each of the plurality of layers to be synthesized corresponding to the second screen changes, the plurality of layers to be synthesized corresponding to the second screen are acquired through the channels of the hardware synthesizer other than the first channel and are synthesized by the hardware synthesizer to form the second image, and the second image is transmitted to the second screen for display, thereby displaying the second image on the second screen while reducing the power consumption of the dual-screen terminal.
  • FIG. 6 is a schematic flowchart diagram of Embodiment 5 of a method for displaying an image according to the present invention.
  • this embodiment relates to the specific process of acquiring and displaying the second image when the plurality of layers to be synthesized corresponding to the second screen include both changed layers and unchanged layers. As shown in FIG. 6, the method in this embodiment may specifically include:
  • one channel in the hardware synthesizer acquires a layer
  • the number of layers in the second set is less than or equal to the number of channels of the hardware synthesizer minus 2.
  • the standby interface of a mobile phone usually has several layers: the status bar (displayed at the top of the screen, showing signal quality and battery level), the navigation bar (the virtual buttons at the bottom of the screen), the wallpaper, and the launcher (the APP icons). When the screen is on and none of these layers change, all the layers can be synthesized by the GPU, because an unchanged layer needs to be synthesized only once.
  • if the status bar, navigation bar, and wallpaper are unchanged layers and only the launcher icons change, mixed composition can be used: the status bar, navigation bar, and wallpaper are synthesized by the GPU, and the result of that synthesis, together with the launcher layer, is sent to the hardware synthesizer for re-synthesis.
  • when the plurality of layers to be synthesized corresponding to the second screen include both changed layers and unchanged layers, for ease of distinction the unchanged layers are grouped into the first layer set and the changed layers are grouped into the second layer set.
  • in the hardware synthesizer, one channel acquires one layer; one channel of the hardware synthesizer (for example, the first channel) serves the first screen, and another channel (for example, the second channel) acquires the image synthesized by the GPU. Thus, assuming the hardware synthesizer has n channels, n-2 channels remain for acquiring the layers in the second layer set; to ensure that the hardware synthesizer can acquire every layer in the second layer set, the number of layers in the second layer set should not exceed n-2, as shown in FIG. 6a.
  • the GPU acquires each layer in the first layer set, which are the unchanged layers, and each changed layer in the second layer set is acquired through the remaining channels of the hardware synthesizer other than the first channel and the second channel.
  • the GPU synthesizes the unchanged layers it has acquired from the first layer set to form a third image; the specific synthesis process is as described in the foregoing embodiments, for example, the layers are combined according to the specific position of each layer in the first layer set on the second screen to form the third image.
  • the third image is saved to the second buffer area, where the second buffer area is used to store the second image or an image related to the second image.
  • the GPU may send a synthesis completion notification to the hardware synthesizer; when the hardware synthesizer receives the notification of the completion of the synthesis of the third image, that is, when the dual-screen terminal detects the synthesis completion instruction of the third image from the GPU, the third image stored in the second buffer area may be acquired through the second channel.
  • the hardware synthesizer acquires the third image in the second buffer area through the second channel and acquires each layer in the second layer set through the remaining channels other than the first channel and the second channel; it then combines the third image and the layers in the second layer set to form the second image and transmits the second image directly to the second screen for display.
  • the hardware synthesizer may synthesize the third image and each layer in the second layer set at the same time, or it may first synthesize the layers in the second layer set to form a fourth image and then combine the third image and the fourth image to generate the second image; a sketch of the overall mixed composition follows.
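Putting the mixed case together, the sketch below again uses the hypothetical gpuCompose, hardwareCompose, and sendToSecondScreen helpers; the points it illustrates are that the unchanged set is composed once by the GPU into a third image kept in the second buffer area, and that the changed set is limited to the channel count minus two (one channel for the first screen, one for fetching the third image).

```cpp
#include <cstddef>
#include <cstdint>
#include <optional>
#include <vector>

struct Layer { std::vector<uint32_t> pixels; };        // hypothetical layer content
using Image = std::vector<uint32_t>;

// Hypothetical composition helpers (see the earlier sketches).
Image gpuCompose(const std::vector<Layer>& layers) { (void)layers; return Image{}; }
Image hardwareCompose(const Image& base, const std::vector<Layer>& onTop) {
    (void)onTop;
    return base;                                       // placeholder: real hardware would merge
}
void sendToSecondScreen(const Image& image) { (void)image; }

// Mixed composition for the second screen: the unchanged layers (first set) are
// composed once by the GPU into a third image held in the second buffer area;
// the changed layers (second set) go to the remaining hardware channels and are
// merged with that third image to form the second image.
class MixedCompositor {
public:
    bool present(const std::vector<Layer>& unchangedSet,
                 const std::vector<Layer>& changedSet,
                 std::size_t totalChannels) {
        if (totalChannels < 2 || changedSet.size() > totalChannels - 2)
            return false;                              // not enough free channels for the changed layers

        if (!thirdImage_) {                            // unchanged layers: compose only once
            thirdImage_ = gpuCompose(unchangedSet);
        }
        const Image secondImage = hardwareCompose(*thirdImage_, changedSet);
        sendToSecondScreen(secondImage);               // displayed without extra caching
        return true;
    }

private:
    std::optional<Image> thirdImage_;                  // stands in for the second buffer area
};
```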
  • with the method for displaying an image provided by the embodiment of the present invention, when the plurality of layers to be synthesized corresponding to the second screen include both changed and unchanged layers, the GPU acquires each layer in the first layer set, each layer in the second layer set is acquired through the remaining channels of the hardware synthesizer other than the first channel and the second channel, the GPU synthesizes the layers in the first layer set to form a third image and stores the third image in the second buffer area, the third image stored in the second buffer area is acquired through the second channel of the hardware synthesizer, and the hardware synthesizer synthesizes each layer in the second layer set with the third image to form the second image. That is, in this embodiment the unchanged layers are synthesized by the GPU and the changed layers are synthesized by the hardware synthesizer, so that the second image is obtained accurately in preparation for its display.
  • a person skilled in the art may understand that all or part of the steps of the above embodiments may be completed by hardware, or by a program instructing related hardware, and the program may be stored in a computer-readable storage medium.
  • the storage medium mentioned may be a read only memory, a magnetic disk or an optical disk or the like.
  • FIG. 7 is a schematic structural diagram of a dual screen terminal according to an embodiment of the present invention.
  • the dual-screen terminal 600 includes a memory 120, a processor 110, a GPU 130, a hardware synthesizer 151, a first screen 161, a second screen 162, and a program that is stored in the memory 120 and executable on the processor 110; the processor 110 is connected to the memory 120, the hardware synthesizer 151, and the GPU 130 respectively, the GPU 130 is connected to the memory 120, and the hardware synthesizer 151 is connected to the memory 120, the first screen 161, and the second screen 162 respectively; the processor 110 can be used to implement the method for displaying an image provided in the above embodiments.
  • the memory 120 can be used to store software programs and modules, and the processor 110 executes various functional applications and data processing by running software programs and modules stored in the memory 120.
  • the memory 120 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and applications required for at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to the use of the dual-screen terminal 600 (such as audio data and a phone book).
  • the memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device; accordingly, the memory 120 may also include a memory controller to provide the processor 110 with access to the memory 120.
  • the processor 110 is the control center of the dual-screen terminal 600: it connects the various parts of the entire handset using various interfaces and lines, and performs the various functions of the dual-screen terminal 600 and processes data by running or executing the software programs and/or modules stored in the memory 120 and recalling the data stored in the memory 120, thereby monitoring the handset as a whole.
  • the processor 110 may include one or more processing cores; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 110.
  • the GPU 130 and the hardware synthesizer 151 are used to synthesize a layer, and the hardware synthesizer 151 is further configured to transmit the synthesized picture to the display unit 160 to cause the first screen 161 or the second screen 162 in the display unit 160 to display.
  • the hardware synthesizer 151 of the present embodiment belongs to the display controller 150 of the dual screen terminal 600, and the first screen 161 and the second screen 162 belong to the display unit 160, wherein the display controller 150 is connected to the display unit 160.
  • the processor 110, the memory 120, and the display controller 150 of the present embodiment are both connected to the system bus 140, and data can be transmitted through the system bus 140.
  • the GPU 130 is connected to the memory 120, and can read data from the memory 120 or write the data into the memory 120.
  • the hardware synthesizer 151 in the display controller 150 reads the data in the memory 120 through the system bus 140.
  • the dual-screen terminal 600 may further include an RF (Radio Frequency) circuit 210, an input unit 220, a sensor 170, an audio circuit 180, a WiFi (Wireless Fidelity) module 190, a power supply 200, and other components.
  • the structure of the dual-screen terminal 600 shown in FIG. 8 does not constitute a limitation of the dual-screen terminal, and the terminal may include more or fewer components than those illustrated, combine some components, or use a different arrangement of components. Specifically:
  • the RF circuit 210 can be used for receiving and transmitting signals during and after receiving or transmitting information, in particular, after receiving downlink information of the base station, and processing it by one or more processors 110; in addition, transmitting data related to the uplink to the base station.
  • the RF circuit 210 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier). , duplexer, etc.
  • the RF circuit 210 can also communicate with networks and other devices through wireless communication.
  • the wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and the like.
  • the input unit 220 can be configured to receive input numeric or character information and to generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function controls.
  • input unit 220 can include touch-sensitive surface 221 as well as other input devices 222.
  • the touch-sensitive surface 221, also referred to as a touch display or touchpad, can collect touch operations by the user on or near it (for example, operations performed by the user with a finger, a stylus, or any other suitable object on or near the touch-sensitive surface 221) and drive the corresponding connecting device according to a preset program.
  • the touch sensitive surface 221 can include two portions of a touch detection device and a touch controller.
  • the touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 110, and receives and executes the commands sent by the processor 110.
  • the touch-sensitive surface 221 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
  • the input unit 220 can also include other input devices 222.
  • other input devices 222 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackballs, mice, joysticks, and the like.
  • the display unit 160 can be used to display information input by the user or information provided to the user and various graphical user interfaces of the dual screen terminal 600, which can be composed of graphics, text, icons, video, and any combination thereof.
  • the display unit 160 may include a first screen 161 and a second screen 162.
  • the second screen 162 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
  • the first screen 161 can be an ink screen.
  • the touch-sensitive surface 221 may cover the display unit 160: when the touch-sensitive surface 221 detects a touch operation on or near it, the operation is transmitted to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display unit 160 according to the type of the touch event.
  • while the touch-sensitive surface 221 and the display unit 160 are implemented here as two separate components to realize the input and output functions, in some embodiments the touch-sensitive surface 221 may be integrated with the display unit 160 to realize the input and output functions.
  • the dual-screen terminal 600 may also include at least one type of sensor 170, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display unit 160 according to the brightness of the ambient light, and the proximity sensor may close the display unit when the dual screen terminal 600 moves to the ear. 160 and / or backlight.
  • the gravity acceleration sensor can detect the magnitude of acceleration in all directions (usually three axes) and, when stationary, the magnitude and direction of gravity; it can be used for applications that recognize the posture of the mobile phone (such as switching between landscape and portrait orientation, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The dual-screen terminal 600 may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described here.
  • the audio circuit 180, the speaker 181, and the microphone 182 can provide an audio interface between the user and the dual screen terminal 600.
  • the audio circuit 180 can convert received audio data into an electrical signal and transmit it to the speaker 181, which converts it into a sound signal for output; conversely, the microphone 182 converts a collected sound signal into an electrical signal, which the audio circuit 180 receives and converts into audio data. The audio data is then processed by the processor 110 and transmitted, for example via the RF circuit 210, to another terminal, or output to the memory 120 for further processing.
  • the audio circuit 180 may also include an earbud jack to provide communication of the peripheral earphones with the dual screen terminal 600.
  • WiFi is a short-range wireless transmission technology
  • the dual-screen terminal 600 can help users to send and receive emails, browse web pages, and access streaming media through the WiFi module 190, which provides wireless broadband Internet access for users.
  • although FIG. 8 shows the WiFi module 190, it can be understood that the module is not an essential part of the dual-screen terminal 600 and may be omitted as needed without changing the essence of the invention.
  • the dual screen terminal 600 also includes a power source 200 (such as a battery) for powering various components.
  • the power source 200 can be logically coupled to the processor 110 through a power management system, so that functions such as charging, discharging, and power consumption management are handled through the power management system.
  • the power supply 200 can also include any one or more of a DC or AC power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
  • the dual screen terminal 600 may further include a camera, a Bluetooth module, and the like, and details are not described herein.
  • the display unit 160 of the dual screen terminal 600 is a touch screen display
  • the dual-screen terminal 600 further includes the memory 120 and one or more programs, where the one or more programs are stored in the memory 120 and configured to be executed by the one or more processors 110; when the processor 110 executes the programs, the following is implemented:
  • the first image is transmitted to the first screen for display by a hardware synthesizer in the dual screen terminal, and the second image is transmitted to the second screen for display.
  • the step of synthesizing the plurality of layers to be synthesized corresponding to the first screen to form a first image is performed, and the step includes:
  • the step of synthesizing the plurality of layers to be combined corresponding to the second screen to form a second image is further implemented, the step comprising:
  • the first screen is an ink screen
  • the processor 110 executes the program, the step of acquiring multiple layers to be synthesized corresponding to the first screen is implemented, and the step includes:
  • the step includes:
  • the step of transmitting the first image to the first screen for display by using a hardware synthesizer in the dual screen terminal is further implemented, the step comprising:
  • when each of the plurality of layers to be synthesized corresponding to the second screen does not change, the processor 110, when executing the program, implements the step of acquiring the plurality of layers to be synthesized corresponding to the second screen, and the step includes:
  • the step of synthesizing the plurality of layers to be combined corresponding to the second screen to form a second image is further implemented, the step comprising:
  • the step of transmitting the second image to the second screen for display by the hardware synthesizer is further implemented, and the step includes:
  • this step includes:
  • acquiring, through the channels of the hardware synthesizer other than the first channel, the plurality of layers to be synthesized corresponding to the second screen, where one channel of the hardware synthesizer acquires one layer, and the number of the plurality of layers to be synthesized corresponding to the second screen is less than or equal to the number of channels of the hardware synthesizer minus one;
  • the step of synthesizing the plurality of layers to be combined corresponding to the second screen to form a second image is further implemented, the step comprising:
  • when the plurality of layers to be synthesized corresponding to the second screen include a first layer set and a second layer set, in which the layers in the first layer set do not change and the layers in the second layer set all change, the processor 110, when executing the program, implements the step of acquiring the plurality of layers to be synthesized corresponding to the second screen, and the step includes:
  • the step of synthesizing the plurality of layers to be combined corresponding to the second screen to form a second image is further implemented, the step comprising:
  • each layer in the second layer set and the third image to form the second image.
  • when the processor 110 executes the program, the step of acquiring, through the first channel of the hardware synthesizer, the first image stored in the first buffer area and transmitting the first image to the first screen for display includes: when a synthesis completion instruction of the first image by the GPU is detected, acquiring, through the first channel of the hardware synthesizer, the first image stored in the first buffer area, and transmitting the first image to the first screen for display; similarly, when a synthesis completion instruction of the second image by the GPU is detected, the second image stored in the second buffer area is acquired through the second channel of the hardware synthesizer and transmitted to the second screen for display.
  • the processor 110 also implements the following when executing the program: when a display completion instruction of the first image is detected, the first image in the first buffer area is deleted; and when a display completion instruction of the second image is detected, the second image in the second buffer area is deleted.
  • when the processor 110 executes the program, the step of acquiring, through the first channel of the hardware synthesizer, the first image stored in the first buffer area and transmitting the first image to the first screen for display includes: decoding the first image to obtain logical TCON (timing controller) data corresponding to the first image, and driving the first screen to display according to the TCON data.
  • the dual-screen terminal of this embodiment may be used to implement the technical solution of the foregoing method embodiment, and the implementation principle and the technical effect are similar, and details are not described herein again.
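The flow implemented by the processor 110 above can be summarized in code. The following C++ fragment is a minimal, non-authoritative sketch under simplifying assumptions: the type names (Layer, Image, HardwareSynthesizer), the fixed resolution, and the "last layer wins" blending are illustrative choices of this sketch rather than the terminal's actual interfaces; only the overall structure (per-screen GPU composition into a buffer area, then one hardware-synthesizer channel per screen) follows the embodiment.

```cpp
// Illustrative sketch only: type names, resolution, and blending rule are
// assumptions made for clarity, not the terminal's actual API.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Layer { std::vector<uint32_t> pixels; int x = 0, y = 0, w = 0, h = 0; };
struct Image { std::vector<uint32_t> pixels; int w = 0, h = 0; };

// Stand-in for the GPU compositor: places each layer at its display position.
Image gpuCompose(const std::vector<Layer>& layers, int screenW, int screenH) {
    Image out;
    out.w = screenW;
    out.h = screenH;
    out.pixels.assign(static_cast<std::size_t>(screenW) * screenH, 0u);
    for (const Layer& l : layers) {
        for (int row = 0; row < l.h; ++row) {
            for (int col = 0; col < l.w; ++col) {
                const int dx = l.x + col, dy = l.y + row;
                if (dx < 0 || dy < 0 || dx >= screenW || dy >= screenH) continue;
                // "Last layer wins" overwrite; a real compositor would alpha-blend.
                out.pixels[static_cast<std::size_t>(dy) * screenW + dx] =
                    l.pixels[static_cast<std::size_t>(row) * l.w + col];
            }
        }
    }
    return out;
}

// Stand-in for the hardware synthesizer: here, one channel per output screen.
struct HardwareSynthesizer {
    void sendToScreen(int channel, const Image& img) {
        (void)channel;
        (void)img; // a real device would scan the image out to the panel
    }
};

// Overall flow of the embodiment: compose per screen, store each result in its
// buffer area, then hand each composed image to its own hardware channel.
void displayFrame(const std::vector<Layer>& firstScreenLayers,
                  const std::vector<Layer>& secondScreenLayers,
                  HardwareSynthesizer& hwc) {
    const Image firstImage  = gpuCompose(firstScreenLayers, 1080, 1920);  // first buffer area
    const Image secondImage = gpuCompose(secondScreenLayers, 1080, 1920); // second buffer area
    hwc.sendToScreen(/*channel=*/1, firstImage);  // first channel -> first screen
    hwc.sendToScreen(/*channel=*/2, secondImage); // second channel -> second screen
}

int main() {
    HardwareSynthesizer hwc;
    std::vector<Layer> firstScreenLayers(1), secondScreenLayers(2);
    displayFrame(firstScreenLayers, secondScreenLayers, hwc);
    return 0;
}
```

In this arrangement the hardware synthesizer only scans out two already-composed images, one per screen, which is why a single compositing pass per screen is sufficient when the layers do not change between frames.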
  • the embodiment provides a computer readable non-volatile storage medium, where a program is stored in the non-volatile storage medium, and when the program is executed, the above method steps of displaying an image are performed; the implementation principle and technical effect are similar and are not described here again.
  • FIG. 9 is a schematic flowchart diagram of Embodiment 1 of an apparatus for displaying an image provided by the present invention.
  • the device of this embodiment can be implemented by software, hardware, or a combination of software and hardware.
  • the apparatus of this embodiment may include:
  • the obtaining module 410 is configured to respectively acquire a plurality of layers to be synthesized corresponding to the first screen and a plurality of layers to be synthesized corresponding to the second screen.
  • the compositing module 420 is configured to synthesize a plurality of layers to be synthesized corresponding to the first screen to form a first image, and synthesize the plurality of layers to be synthesized corresponding to the second screen to form a second image.
  • the display module 430 is configured to transmit the first image to the first screen for display through a hardware synthesizer in the dual screen terminal, and transmit the second image to the second screen for display.
  • the device in this embodiment may be used to implement the technical solution of the foregoing method embodiment, and the implementation principle and the technical effect are similar, and details are not described herein again.
  • the synthesizing module 420 is configured to synthesize, according to the display position of each of the plurality of layers to be synthesized corresponding to the first screen on the first screen, the plurality of layers to be synthesized corresponding to the first screen to form the first image; and to synthesize, according to the display position of each of the plurality of layers to be synthesized corresponding to the second screen on the second screen, the plurality of layers to be synthesized corresponding to the second screen to form the second image.
  • the obtaining module 410 is configured to acquire, by the graphics processor GPU in the dual-screen terminal, a plurality of layers to be synthesized corresponding to the first screen when the first screen is an ink screen.
  • the synthesizing module 420 is configured to synthesize a plurality of layers to be synthesized corresponding to the first screen by using the GPU to form the first image, and store the first image in a first buffer area.
  • the display module 430 is configured to acquire a first image stored in the first buffer area by using a first channel of the hardware synthesizer, and transmit the first image to the first screen for display.
  • the obtaining module 410 is configured to acquire, by the GPU, the plurality of layers to be synthesized corresponding to the second screen when each of the plurality of layers to be synthesized corresponding to the second screen does not change.
  • the synthesizing module 420 is configured to synthesize a plurality of layers to be synthesized corresponding to the second screen by using the GPU to form the second image, and store the second image in a second buffer area.
  • the display module 430 is configured to acquire a second image stored in the second buffer area by using a second channel of the hardware synthesizer, and transmit the second image to the second screen for display.
  • the obtaining module 410 is configured to acquire, through the channels of the hardware synthesizer other than the first channel, the plurality of layers to be synthesized corresponding to the second screen when each of the plurality of layers to be synthesized corresponding to the second screen changes, where one channel of the hardware synthesizer acquires one layer, and the number of the plurality of layers to be synthesized corresponding to the second screen is less than or equal to the number of channels of the hardware synthesizer minus one.
  • the synthesizing module 420 is configured to synthesize a plurality of layers to be synthesized corresponding to the second screen by using the hardware synthesizer to form the second image.
  • the obtaining module 410 is configured to, when the plurality of layers to be synthesized corresponding to the second screen include a first layer set and a second layer set, acquire each layer in the first layer set by the GPU and acquire each layer in the second layer set through the remaining channels of the hardware synthesizer other than the first channel and the second channel, where one channel of the hardware synthesizer acquires one layer, and the number of layers in the second layer set is less than or equal to the number of channels of the hardware synthesizer minus two; these channel-allocation rules are illustrated in the sketch after this embodiment.
  • the synthesizing module 420 is configured to synthesize each layer in the first layer set by the GPU to form a third image, and store the third image in a second buffer area.
  • the obtaining module 410 is further configured to acquire, by using the second channel of the hardware synthesizer, the third image stored in the second buffer area.
  • the synthesizing module 420 is further configured to synthesize each layer in the second layer set and the third image by using the hardware synthesizer to form the second image.
  • the display module 430 is configured to: when a synthesis completion instruction of the first image by the GPU is detected, acquire, through the first channel of the hardware synthesizer in the dual-screen terminal, the first image stored in the first buffer area, and transmit the first image to the first screen for display; and when a synthesis completion instruction of the second image by the GPU is detected, acquire, through the second channel of the hardware synthesizer, the second image stored in the second buffer area, and transmit the second image to the second screen for display.
  • the device in this embodiment may be used to implement the technical solution of the foregoing method embodiment, and the implementation principle and the technical effect are similar, and details are not described herein again.
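The channel-allocation rules used by the obtaining module 410 above can be sketched as follows. This is an assumption-laden illustration: the function and struct names are invented for clarity, channel 1 is assumed reserved for the first screen's composed image and channel 2 for a GPU-composed image, and only the two counting constraints (channels minus one, channels minus two) are taken directly from the embodiment.

```cpp
// Illustrative sketch of the channel-allocation rules: only the two counting
// constraints (channels - 1 and channels - 2) come from the embodiment; the
// names and the fixed meaning of channels 1 and 2 are assumptions.
#include <cstddef>
#include <stdexcept>
#include <vector>

struct ChannelPlan {
    bool gpuComposesUnchangedSet = false;      // unchanged layers -> GPU -> "third image"
    std::vector<int> channelsForChangedLayers; // one hardware channel per changed layer
};

// channelCount: total number of channels of the hardware synthesizer.
// Channel 1 is assumed reserved for the first screen's composed image and
// channel 2 for an image composed by the GPU (the second image or third image).
ChannelPlan planSecondScreen(std::size_t unchangedLayers,
                             std::size_t changedLayers,
                             int channelCount) {
    if (channelCount < 2)
        throw std::runtime_error("at least two hardware channels are assumed");
    ChannelPlan plan;
    if (unchangedLayers == 0) {
        // Every second-screen layer changed: use every channel except channel 1,
        // so the layer count must not exceed channelCount - 1.
        if (changedLayers > static_cast<std::size_t>(channelCount - 1))
            throw std::runtime_error("too many layers for the hardware synthesizer");
        for (std::size_t i = 0; i < changedLayers; ++i)
            plan.channelsForChangedLayers.push_back(2 + static_cast<int>(i));
    } else {
        // Mixed case: the GPU pre-composes the unchanged set into the third image
        // (read back through channel 2); the changed layers take the remaining
        // channels, so their count must not exceed channelCount - 2.
        if (changedLayers > static_cast<std::size_t>(channelCount - 2))
            throw std::runtime_error("too many changed layers for the hardware synthesizer");
        plan.gpuComposesUnchangedSet = true;
        for (std::size_t i = 0; i < changedLayers; ++i)
            plan.channelsForChangedLayers.push_back(3 + static_cast<int>(i));
    }
    return plan;
}
```

For example, with a four-channel synthesizer, planSecondScreen(0, 3, 4) assigns channels 2, 3, and 4 to the three changed layers, while planSecondScreen(2, 2, 4) marks the unchanged set for GPU composition and assigns channels 3 and 4 to the two changed layers.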
  • FIG. 10 is a schematic flowchart diagram of Embodiment 2 of an apparatus for displaying an image provided by the present invention. Based on the above embodiment, as shown in FIG. 10, the apparatus further includes a deletion module 440:
  • the deleting module 440 is configured to delete the first image in the first buffer area when the display completion instruction of the first image is detected, and to delete the second image in the second buffer area when the display completion instruction of the second image is detected; a brief sketch of this buffer clean-up follows this embodiment.
  • the device in this embodiment may be used to implement the technical solution of the foregoing method embodiment, and the implementation principle and the technical effect are similar, and details are not described herein again.
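As a brief illustration of the deleting module 440, the sketch below assumes a callback-style notification of display completion and simple optional-valued buffer areas; all names here are hypothetical.

```cpp
// Minimal sketch of the deleting module's behaviour, assuming a callback-style
// display-completion notification; all names here are hypothetical.
#include <cstdint>
#include <optional>
#include <vector>

struct Image { std::vector<uint32_t> pixels; };

struct FrameBuffers {
    std::optional<Image> firstBufferArea;  // holds the first image until it is displayed
    std::optional<Image> secondBufferArea; // holds the second image until it is displayed

    // Called when the display completion instruction for a screen is detected.
    void onDisplayComplete(int screen) {
        if (screen == 1) firstBufferArea.reset();       // delete the first image
        else if (screen == 2) secondBufferArea.reset(); // delete the second image
    }
};
```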
  • FIG. 11 is a schematic flowchart diagram of Embodiment 3 of an apparatus for displaying an image according to the present invention. Based on the above embodiment, as shown in FIG. 11, the apparatus further includes a decoding module 450:
  • the obtaining module 410 is further configured to acquire, by using the first channel of the hardware synthesizer, the first image stored in the first buffer area.
  • the decoding module 450 is configured to decode the first image, obtain logical TCON data corresponding to the first image, and drive the first screen to display according to the TCON data, as illustrated in the sketch after this embodiment.
  • the device in this embodiment may be used to implement the technical solution of the foregoing method embodiment, and the implementation principle and the technical effect are similar, and details are not described herein again.
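For the decoding module 450, the sketch below illustrates one plausible way of turning the composed first image into logical TCON data for an ink (e-paper) screen. The grey-level mapping, the 16-level quantization, and the InkScreenDriver interface are assumptions of this sketch; the embodiment only specifies that the first image is decoded into TCON data which then drives the first screen.

```cpp
// Non-authoritative sketch: decode the composed first image into logical TCON
// (timing controller) data and drive the ink screen with it. The grey-level
// mapping, 16-level quantization, and InkScreenDriver are assumptions.
#include <cstdint>
#include <vector>

struct Image { std::vector<uint32_t> pixels; int w = 0, h = 0; };

struct InkScreenDriver {
    void drive(const std::vector<uint8_t>& tconData, int w, int h) {
        (void)tconData;
        (void)w;
        (void)h; // a real driver would push the corresponding waveforms to the panel
    }
};

// Decode the composed image (assumed 0xAARRGGBB) into grey levels for the TCON.
std::vector<uint8_t> decodeToTcon(const Image& img) {
    std::vector<uint8_t> tcon;
    tcon.reserve(img.pixels.size());
    for (const uint32_t px : img.pixels) {
        const uint8_t r = (px >> 16) & 0xFF;
        const uint8_t g = (px >> 8) & 0xFF;
        const uint8_t b = px & 0xFF;
        // Luma approximation, then quantized to 16 grey levels (a common e-paper depth).
        const uint8_t grey = static_cast<uint8_t>((299 * r + 587 * g + 114 * b) / 1000);
        tcon.push_back(static_cast<uint8_t>(grey >> 4));
    }
    return tcon;
}

void displayOnInkScreen(const Image& firstImage, InkScreenDriver& driver) {
    const std::vector<uint8_t> tconData = decodeToTcon(firstImage); // logical TCON data
    driver.drive(tconData, firstImage.w, firstImage.h);
}
```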

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Image Processing (AREA)

Abstract

Embodiments of the present invention relate to a method for displaying images, a dual-screen terminal, and a computer readable non-volatile storage medium. The method comprises: respectively acquiring a plurality of layers to be synthesized corresponding to a first screen and a plurality of layers to be synthesized corresponding to a second screen; synthesizing the plurality of layers to be synthesized corresponding to the first screen to form a first image, and synthesizing the plurality of layers to be synthesized corresponding to the second screen to form a second image; and transmitting, by means of a hardware synthesizer in the dual-screen terminal, the first image to the first screen for display and the second image to the second screen for display, thereby enabling simultaneous display of images comprising multiple layers on the two screens of the dual-screen terminal.
PCT/CN2017/102896 2017-03-09 2017-09-22 Procédé d'affichage d'image, terminal à double écran et support d'informations non volatil lisible par ordinateur Ceased WO2018161534A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710138031.7 2017-03-09
CN201710138031.7A CN106933525B (zh) 2017-03-09 2017-03-09 一种显示图像的方法和装置

Publications (1)

Publication Number Publication Date
WO2018161534A1 true WO2018161534A1 (fr) 2018-09-13

Family

ID=59433823

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/102896 Ceased WO2018161534A1 (fr) 2017-03-09 2017-09-22 Procédé d'affichage d'image, terminal à double écran et support d'informations non volatil lisible par ordinateur

Country Status (2)

Country Link
CN (1) CN106933525B (fr)
WO (1) WO2018161534A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109885271A (zh) * 2019-03-18 2019-06-14 青岛海信电器股份有限公司 数据显示处理方法、装置及电子设备

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106933525B (zh) * 2017-03-09 2019-09-20 青岛海信移动通信技术股份有限公司 一种显示图像的方法和装置
CN107728988B (zh) * 2017-10-12 2020-11-06 青岛海信移动通信技术股份有限公司 用于墨水屏的内容显示方法及装置
CN107678825A (zh) * 2017-10-16 2018-02-09 青岛海信电器股份有限公司 一种应用于电子白板的渲染方法及电子白板
CN107783749A (zh) * 2017-11-09 2018-03-09 青岛海信移动通信技术股份有限公司 一种图像数据的显示方法、装置和移动终端
CN108563413A (zh) * 2018-03-13 2018-09-21 安徽思帕德信息技术有限公司 计算机实现分层显示的系统及方法
CN109324915A (zh) * 2018-09-26 2019-02-12 努比亚技术有限公司 一种信息处理方法、终端和计算机可读存储介质
CN110022445B (zh) * 2019-02-26 2022-01-28 维沃软件技术有限公司 一种内容输出方法及终端设备
CN110641382B (zh) * 2019-09-10 2023-06-16 沈阳中科创达软件有限公司 一种车载界面的显示方法、装置、电子设备和存储介质
CN112083905A (zh) * 2020-09-16 2020-12-15 青岛海信移动通信技术股份有限公司 电子设备及其图层绘制方法
CN113625983B (zh) * 2021-08-10 2024-08-27 Oppo广东移动通信有限公司 图像显示方法、装置、计算机设备及存储介质
CN113986162B (zh) * 2021-09-22 2022-11-11 荣耀终端有限公司 图层合成方法、设备及计算机可读存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102184720A (zh) * 2010-06-22 2011-09-14 上海盈方微电子有限公司 一种支持多层多格式输入的图像合成显示的方法及装置
CN103294453A (zh) * 2012-02-24 2013-09-11 华为技术有限公司 图像处理方法和图像处理设备
CN104423946A (zh) * 2013-08-30 2015-03-18 联想(北京)有限公司 一种图像处理方法以及电子设备
CN104994276A (zh) * 2015-06-26 2015-10-21 三星电子(中国)研发中心 一种拍摄的方法和装置
CN106933525A (zh) * 2017-03-09 2017-07-07 青岛海信移动通信技术股份有限公司 一种显示图像的方法和装置

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4722784B2 (ja) * 2006-07-05 2011-07-13 パイオニア株式会社 電子黒板装置および電子黒板装置における画像処理方法並びにそのプログラム
CN202217260U (zh) * 2011-09-08 2012-05-09 福州瑞芯微电子有限公司 一种多屏幕显示控制器
CN103686304B (zh) * 2013-12-09 2017-02-01 华为技术有限公司 一种图层合成方法、装置及终端设备
CN104850327B (zh) * 2015-05-27 2019-07-16 小米科技有限责任公司 移动终端的屏幕截图方法及装置、电子设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102184720A (zh) * 2010-06-22 2011-09-14 上海盈方微电子有限公司 一种支持多层多格式输入的图像合成显示的方法及装置
CN103294453A (zh) * 2012-02-24 2013-09-11 华为技术有限公司 图像处理方法和图像处理设备
CN104423946A (zh) * 2013-08-30 2015-03-18 联想(北京)有限公司 一种图像处理方法以及电子设备
CN104994276A (zh) * 2015-06-26 2015-10-21 三星电子(中国)研发中心 一种拍摄的方法和装置
CN106933525A (zh) * 2017-03-09 2017-07-07 青岛海信移动通信技术股份有限公司 一种显示图像的方法和装置

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109885271A (zh) * 2019-03-18 2019-06-14 青岛海信电器股份有限公司 数据显示处理方法、装置及电子设备

Also Published As

Publication number Publication date
CN106933525B (zh) 2019-09-20
CN106933525A (zh) 2017-07-07

Similar Documents

Publication Publication Date Title
US11861161B2 (en) Display method and apparatus
WO2018161534A1 (fr) Procédé d'affichage d'image, terminal à double écran et support d'informations non volatil lisible par ordinateur
CN110602321B (zh) 应用程序切换方法、装置、电子装置及存储介质
KR102776206B1 (ko) 애플리케이션 공유 방법, 제1 전자기기 및 컴퓨터 판독가능 저장 매체
CN114741012B (zh) 在通知栏下拉菜单中管理多个自由窗口
JP2021525430A (ja) 表示制御方法及び端末
CN108021321B (zh) 一种应用运行状态控制方法及移动终端
CN108762881B (zh) 界面绘制方法、装置、终端及存储介质
CN110989882B (zh) 一种控制方法、电子设备和计算机可读存储介质
JP6202345B2 (ja) 表示制御装置、表示制御方法、およびプログラム
CN106293375B (zh) 一种场景切换方法,及设备
KR102090745B1 (ko) 전자장치에서 외부 디스플레이 장치를 이용하여 멀티태스킹을 수행하는 방법 및 장치
CN113129417B (zh) 一种全景应用中图像渲染的方法及终端设备
WO2018006841A1 (fr) Procédé, dispositif et appareil de transmission d'informations de code qr
CN103488450A (zh) 一种投射图片的方法、装置及终端设备
CN109003194A (zh) 评论分享方法、终端以及存储介质
CN110460894A (zh) 一种视频图像显示方法及终端设备
WO2018137304A1 (fr) Procédé d'affichage d'une application 2d dans un dispositif vr, et terminal
CN111158815B (zh) 一种动态壁纸模糊方法、终端和计算机可读存储介质
CN106502608A (zh) 显示方法、装置及终端设备
CN110045890A (zh) 应用标识的显示方法及终端设备
CN107479799B (zh) 一种显示窗口的方法和装置
CN111240551A (zh) 应用程序控制方法及电子设备
WO2015014138A1 (fr) Procédé, dispositif et équipement d'affichage de trame d'affichage
CN110493451B (zh) 一种数据传输方法、电子设备及终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17899701

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 13/12/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17899701

Country of ref document: EP

Kind code of ref document: A1