WO2022121775A1 - Screen projection method and device - Google Patents
- Publication number
- WO2022121775A1 (application PCT/CN2021/135158)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- terminal
- interface
- screen
- data
- content
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Definitions
- the present application relates to the field of electronic devices, and in particular, to a screen projection method and device.
- the display interface of one device can be projected onto the display screen of another device for the user to view.
- in the related art, the display interface of one device can be presented on another device mainly through one-to-one mirror projection technology; that is, only one-to-one screen projection can be realized.
- the embodiments of the present application provide a screen projection method and device, which realizes the presentation of display interfaces of multiple devices on the same device, that is, realizes many-to-one screen projection.
- the screen projection source creates multiple media streams and distributes them to one or more screen projection destinations according to a policy, so that the contents of multiple applications in one device can be projected and displayed on other devices.
- an embodiment of the present application provides a screen projection method.
- the method can be applied to a first terminal.
- the first terminal is connected to a plurality of second terminals.
- the method may include: the first terminal receives data from each of the plurality of second terminals; the first terminal displays multiple first interfaces on the first terminal according to the data received from the multiple second terminals, and the multiple first interfaces correspond to the multiple second terminals one-to-one; wherein the content of the first interface is a mirror image of the content of the second interface displayed by the corresponding second terminal, or the content of the first interface is the same as part of the content of the second interface displayed by the corresponding second terminal.
- the first terminal serving as the screen projection destination can display multiple first interfaces on the display screen of the first terminal according to data sent by multiple second terminals serving as the screen projection source.
- the multiple first interfaces are in one-to-one correspondence with the multiple second terminals.
- the content of the first interface is a mirror image of the content of the second interface displayed by the corresponding second terminal, or the content of the first interface is the same as part of the content of the second interface displayed by the corresponding second terminal. This realizes many-to-one projection from multiple projection sources to one projection destination. In this way, in scenarios such as meetings and conference presentations, multiple mobile phones and tablet computers can project the content on their display screens (such as PPT slides or playing videos) to the same large-screen device for presentation, realizing many-to-one screen projection. The efficiency of collaborative use of multiple devices is improved, and the user experience is improved.
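The many-to-one behavior described above can be modeled as a minimal Python sketch. All names here (`ProjectionDestination`, the source identifiers) are illustrative, not from the patent or any real API: the destination simply keeps one "first interface" window per connected source terminal.

```python
# Minimal model of a many-to-one screen projection destination:
# one displayed window per source terminal, updated independently.
class ProjectionDestination:
    def __init__(self):
        self.windows = {}  # source id -> currently displayed frame

    def on_data(self, source_id, frame):
        # Each source's data is rendered into its own window, giving a
        # one-to-one correspondence between windows and source terminals.
        self.windows[source_id] = frame

dest = ProjectionDestination()
dest.on_data("phone-A", "slide 3")
dest.on_data("tablet-B", "video frame 120")
dest.on_data("phone-A", "slide 4")  # updates only phone-A's window
```

After these three events the destination shows two windows, with phone-A's window holding its latest frame.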
- the method may further include: the first terminal may create multiple drawing components, and the multiple drawing components are in one-to-one correspondence with the multiple second terminals.
- a drawing component can be a view or a canvas.
- the first terminal displaying a plurality of first interfaces on the first terminal according to the data received from the plurality of second terminals may include: the first terminal, according to the data received from the plurality of second terminals, draws the first interface corresponding to each second terminal on the corresponding drawing component, so as to display the plurality of first interfaces on the first terminal.
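The one-to-one mapping between drawing components and source terminals can be sketched as follows. The `Canvas` class is a hypothetical stand-in for the patent's drawing component (a view or canvas); it is not a real UI API.

```python
# Model of per-source drawing components at the projection destination.
class Canvas:
    """Stands in for the patent's drawing component (a view or canvas)."""
    def __init__(self):
        self.content = None

    def draw(self, frame):
        self.content = frame

def create_components(terminal_ids):
    # One drawing component per second terminal (one-to-one correspondence).
    return {tid: Canvas() for tid in terminal_ids}

components = create_components(["phone-A", "tablet-B"])
components["phone-A"].draw("decoded frame from A")
components["tablet-B"].draw("decoded frame from B")
```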
- the method may further include: the first terminal configures multiple decoding parameters, and the multiple decoding parameters are in one-to-one correspondence with the multiple second terminals; the first terminal decodes the data received from the corresponding second terminals according to the multiple decoding parameters.
- by configuring corresponding decoding parameters for different second terminals, which are used to decode the corresponding data, multi-channel decoding is realized.
- the method may further include: the first terminal acquires the connection information of the plurality of second terminals, where the connection information is used for establishing a connection between the first terminal and the corresponding second terminal; wherein the one-to-one correspondence between the multiple drawing components and the multiple second terminals includes: a one-to-one correspondence between the multiple drawing components and the connection information of the multiple second terminals; and the one-to-one correspondence between the multiple decoding parameters and the multiple second terminals includes: a one-to-one correspondence between the multiple decoding parameters and the connection information of the multiple second terminals.
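Multi-channel decoding keyed by connection information can be sketched like this. The addresses, codec names, and helper functions are illustrative assumptions; a real implementation would hand each stream to its own hardware or software decoder.

```python
# Per-source decoder configuration, looked up by connection information.
decoders = {}  # connection info (e.g. address) -> decoding parameters

def configure(conn_info, codec, width, height):
    decoders[conn_info] = {"codec": codec, "size": (width, height)}

def decode(conn_info, payload):
    # Select the decoder channel that matches this source's connection.
    params = decoders[conn_info]
    return f"{payload} decoded as {params['codec']} {params['size']}"

configure("192.168.1.10:5000", "h264", 1920, 1080)
configure("192.168.1.11:5000", "h265", 1280, 720)
out = decode("192.168.1.11:5000", "frame-7")
```

Because each channel is configured independently, streams with different codecs and resolutions can be decoded in parallel without interfering with one another.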
- the method may further include: the first terminal receives a user's first operation on the window of the first interface; in response to the first operation, the first terminal reduces, enlarges or closes the window, or switches the focus window.
- the user can control the first interface by using the input device of the screen projection destination, for example, by setting the focus and switching the focus between the screen projection interfaces of different source devices according to user operations, or by independently controlling the screen projection interfaces of different screen projection sources (such as zooming out, zooming in or closing a screen projection interface).
- the screen projection destination can also adjust the layout of the presented screen projection interface according to the increase or decrease of the source device, so as to present the best visual effect to the user.
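The layout adjustment mentioned above can be sketched with a simple near-square grid. The patent does not prescribe a specific layout policy; this grid rule is an illustrative assumption.

```python
import math

def grid_layout(n):
    """Rows x columns of a grid just large enough for n projection windows."""
    if n == 0:
        return (0, 0)
    cols = math.ceil(math.sqrt(n))   # widen first
    rows = math.ceil(n / cols)       # then add rows as sources join
    return (rows, cols)
```

The destination would call this whenever a source connects or disconnects, then reposition each window into its grid cell.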
- the method may further include: the first terminal receives a user's second operation on the first interface corresponding to the second terminal; the first terminal sends the data of the second operation to the second terminal for the second terminal to display a third interface according to the second operation.
- after the first terminal receives, via the input device of the screen-casting destination, the user's operation on the first interface (that is, the screen-casting interface), the first terminal sends the data corresponding to the operation to the screen-casting source corresponding to that first interface, so that the screen-casting source can make a corresponding response. In this way, the user can use the input device of the screen-casting destination to realize reverse control of the screen-casting source.
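Reverse control requires translating an input event from the projection window's coordinates into the source's own screen coordinates before sending it back. This sketch assumes a simple linear mapping; the window geometry and screen sizes are illustrative.

```python
def to_source_coords(event, window, source_size):
    """Map a destination-side event into the source terminal's coordinates.

    window: (x, y, w, h) of the projection window on the destination screen.
    source_size: (width, height) of the source terminal's display.
    """
    wx, wy, ww, wh = window
    sx = (event["x"] - wx) / ww * source_size[0]
    sy = (event["y"] - wy) / wh * source_size[1]
    return {"type": event["type"], "x": sx, "y": sy}

evt = {"type": "tap", "x": 150, "y": 100}
mapped = to_source_coords(evt, window=(100, 50, 200, 400),
                          source_size=(1080, 2340))
```

The mapped event is what the destination would send over the connection; the source then injects it as if the tap had happened locally.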
- the method may further include: the first terminal receives updated data from the second terminal; according to the updated data, the first interface corresponding to the second terminal is updated to a fourth interface, and the content of the fourth interface is a mirror image of the content of the third interface, or the content of the fourth interface is the same as part of the content of the third interface.
- the data of the updated interface can be sent to the first terminal, so that the first terminal can update the corresponding interface displayed by the first terminal.
- the first terminal further establishes a connection with a third terminal; the method may further include: the first terminal sends the data received from the multiple second terminals to the third terminal, for the third terminal to display the multiple first interfaces.
- the third terminal may be a terminal that conducts a video call with the first terminal.
- in this way, the third terminal can also display the interfaces of the screen projection sources, realizing cross-regional office work. This can improve meeting efficiency and save communication costs for cross-regional collaboration.
- the method may further include: the first terminal receives video data from the third terminal; while displaying the plurality of first interfaces, the first terminal displays a video call screen on the first terminal according to the video data of the third terminal.
- the method may further include: the first terminal collects video data and sends it to the third terminal, so that the third terminal displays a video call screen while displaying the multiple first interfaces on the third terminal.
- in this way, the terminals in the two regions can not only display the video call screen but also display the content projected by the local end and the peer end, which further improves meeting efficiency and saves the communication cost of cross-regional office work.
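The cross-region case can be sketched as the first terminal both rendering each source's data locally and relaying it to the third terminal over the call connection. The class and the inbox list are illustrative models, not a real transport API.

```python
# Model of a destination terminal that relays projection data to a peer
# (the third terminal) during a call, so both sides see the same interfaces.
class FirstTerminal:
    def __init__(self):
        self.local_windows = {}  # source id -> locally displayed frame

    def on_source_data(self, source_id, frame, peer_inbox):
        self.local_windows[source_id] = frame   # display locally
        peer_inbox.append((source_id, frame))   # forward over the call link

peer_inbox = []  # stands in for the connection to the third terminal
t1 = FirstTerminal()
t1.on_source_data("phone-A", "slide 1", peer_inbox)
t1.on_source_data("tablet-B", "chart", peer_inbox)
```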
- an embodiment of the present application provides a screen projection method, which can be applied to a second terminal connected to a first terminal. The method may include: the second terminal displays a second interface; the second terminal receives a user operation; in response to the user operation, the second terminal sends the data of the second interface to the first terminal for the first terminal to display the first interface corresponding to the second terminal, and the first terminal also displays the first interfaces corresponding to other second terminals; wherein the content of the first interface is a mirror image of the content of the second interface displayed by the corresponding second terminal, or the content of the first interface is the same as part of the content of the second interface displayed by the corresponding second terminal.
- multiple second terminals serving as screen projection sources can send the data of their current interfaces to the first terminal serving as the screen projection destination according to user triggers, so that the first terminal can display multiple first interfaces on the display screen of the first terminal according to the data sent by the multiple second terminals, the multiple first interfaces being in one-to-one correspondence with the multiple second terminals.
- the content of the first interface is a mirror image of the content of the second interface displayed by the corresponding second terminal, or the content of the first interface is the same as part of the content of the second interface displayed by the corresponding second terminal. It realizes many-to-one projection from multiple projection sources to one projection destination.
- multiple mobile phones and tablet computers can project the content on their display screens (such as PPT slides or playing videos) to the same large-screen device for presentation, realizing many-to-one screen projection.
- the efficiency of collaborative use of multiple devices is improved, and the user experience is improved.
- the above-mentioned user operation may be an operation of starting screen projection; before the second terminal sends the data of the second interface to the first terminal, the method may further include: the second terminal obtains the data of the second interface; wherein, when the content of the first interface is a mirror image of the content of the second interface, the data of the second interface is the screen recording data of the second interface; when the content of the first interface is the same as part of the content of the second interface, the data of the second interface is the screen recording data of the layer where the predetermined element in the second interface is located.
- multiple second terminals can project their currently displayed interface or part of the content in the interface to the first terminal for display, so as to realize many-to-one screen projection.
- the method may further include: the second terminal displays a configuration interface, where the configuration interface includes a layer filter setting option; the second terminal receives a user's selection operation on the layer filter setting option.
- the second terminal serving as the screen-casting source can project the layer where certain elements in the current interface are located (such as the element dragged by the user, or a predetermined element) to the screen-casting destination, implementing layer filtering. In this way, it can be ensured that the private information of the screen projection source is not projected to the screen projection destination, and the privacy of the user is protected.
- the second terminal receiving the user operation may include: the second terminal receives the user's drag operation on the second interface or on an element in the second interface; before the second terminal sends the data of the second interface to the first terminal, the method may further include: the second terminal determines that the user's drag intention is to drag across devices; and the second terminal obtains the data of the second interface.
- the user can trigger screen projection by dragging the interface of the second terminal or an element in the interface.
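One plausible way to judge drag intent, assumed here for illustration (the patent does not specify the rule), is to treat a drag that reaches the edge of the source's screen as a cross-device drag:

```python
def is_cross_device_drag(x, y, screen_w, screen_h, edge=10):
    """True when the drag position is within `edge` pixels of any screen edge.

    The edge threshold is an illustrative value, not from the patent.
    """
    return (x <= edge or x >= screen_w - edge or
            y <= edge or y >= screen_h - edge)
```

Only when this check passes would the second terminal capture the interface data and start projecting.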
- in the case of receiving a user's drag operation on an element in the second interface, the element may be a video component, a floating window, a picture-in-picture or a free-form window.
- in this case, the data of the second interface is the screen recording data of the layer where the element is located; or, the element is a user interface (UI) control in the second interface, and the data of the second interface is the instruction stream of the second interface and the identifier of the UI control, or the data of the second interface is the drawing instructions and identifiers of the UI control.
- the instruction stream corresponding to the content to be projected can be sent to the projection destination to realize the projection. In this way, the display effect of the projection interface at the projection destination can be improved, and transmission bandwidth can be saved.
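The contrast between the two data forms can be sketched as follows: instead of sending recorded pixel frames, instruction-stream projection sends compact drawing instructions plus the identifiers of the UI controls, which the destination replays. The instruction format and control names below are illustrative assumptions.

```python
# Model of instruction-stream projection: one compact drawing instruction
# per projected UI control, instead of full pixel frames.
def make_instruction_stream(controls):
    return [{"op": "draw", "id": c["id"], "kind": c["kind"]} for c in controls]

stream = make_instruction_stream([
    {"id": "btn_play",  "kind": "button"},
    {"id": "txt_title", "kind": "label"},
])
```

A few dozen bytes per control typically replaces a full video frame, which is why this form both saves bandwidth and lets the destination redraw at native resolution.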
- an embodiment of the present application provides a screen projection device, which can be applied to a first terminal connected to a plurality of second terminals. The device may include: a receiving unit, configured to receive data from each of the plurality of second terminals; and a display unit, configured to display a plurality of first interfaces on the first terminal according to the data received from the plurality of second terminals, the plurality of first interfaces being in one-to-one correspondence with the plurality of second terminals.
- the content of the first interface is a mirror image of the content of the second interface displayed by the corresponding second terminal, or the content of the first interface is the same as part of the content of the second interface displayed by the corresponding second terminal.
- the apparatus may further include: a creating unit, configured to create multiple drawing components, the multiple drawing components being in one-to-one correspondence with the multiple second terminals, where a drawing component is a view or a canvas; displaying a plurality of first interfaces on the first terminal according to the data received from the plurality of second terminals may include: according to the data received from the plurality of second terminals, respectively drawing the first interface corresponding to each second terminal on the corresponding drawing component, so as to display the plurality of first interfaces on the first terminal.
- the apparatus may further include: a configuration unit, configured to configure a plurality of decoding parameters, the plurality of decoding parameters being in one-to-one correspondence with the plurality of second terminals; and a decoding unit, configured to decode the data received from the corresponding second terminal according to the plurality of decoding parameters.
- the apparatus may further include: an acquiring unit, configured to acquire connection information of the multiple second terminals, where the connection information is used for establishing a connection between the first terminal and the corresponding second terminal; wherein the one-to-one correspondence between the multiple drawing components and the multiple second terminals includes: a one-to-one correspondence between the multiple drawing components and the connection information of the multiple second terminals; and the one-to-one correspondence between the multiple decoding parameters and the multiple second terminals includes: a one-to-one correspondence between the multiple decoding parameters and the connection information of the multiple second terminals.
- the apparatus may further include: an input unit, configured to receive a user's first operation on the window of the first interface; the display unit is further configured to, in response to the first operation, reduce, enlarge or close the window, or switch the focus window.
- the input unit is further configured to receive a user's second operation on the first interface corresponding to the second terminal; the apparatus may further include: a sending unit, configured to send the data of the second operation to the second terminal for the second terminal to display a third interface according to the second operation.
- the receiving unit is further configured to receive updated data from the second terminal; the display unit is further configured to update the first interface corresponding to the second terminal to a fourth interface according to the updated data, where the content of the fourth interface is a mirror image of the content of the third interface, or the content of the fourth interface is the same as part of the content of the third interface.
- the first terminal further establishes a connection with the third terminal; the sending unit is further configured to send the data received from the multiple second terminals to the third terminal, so that the third terminal can display the multiple first interfaces.
- the receiving unit is further configured to receive video data from the third terminal; the display unit is further configured to, while the first terminal displays the plurality of first interfaces, display a video call screen on the first terminal according to the video data of the third terminal.
- the apparatus may further include: a collection unit, configured to collect video data; the sending unit is further configured to send the video data to the third terminal, for the third terminal to display a video call screen while displaying the multiple first interfaces on the third terminal.
- an embodiment of the present application provides a screen projection device, which can be applied to a second terminal connected to a first terminal. The device may include: a display unit, configured to display a second interface; an input unit, configured to receive a user operation; and a sending unit, configured to send data of the second interface to the first terminal in response to the user operation, so that the first terminal can display the first interface corresponding to the second terminal, the first terminal also displaying the first interfaces corresponding to other second terminals.
- the user operation is an operation of starting screen projection; the device may further include: an acquiring unit, configured to acquire data of the second interface; wherein, when the content of the first interface is a mirror image of the content of the second interface, the data of the second interface is the screen recording data of the second interface; when the content of the first interface is the same as part of the content of the second interface, the data of the second interface is the screen recording data of the layer where the predetermined element in the second interface is located.
- the display unit is further configured to display a configuration interface, where the configuration interface includes a layer filter setting option; the input unit is further configured to receive a user's selection operation on the layer filter setting option.
- the input unit receiving a user operation may include: the input unit receives a user's drag operation on the second interface or on an element in the second interface; the apparatus may further include: a determination unit, configured to determine that the user's drag intention is to drag across devices; the acquiring unit is further configured to acquire the data of the second interface.
- in the case of receiving a user's drag operation on an element in the second interface, the element may be a video component, a floating window, a picture-in-picture or a free-form window.
- in this case, the data of the second interface is the screen recording data of the layer where the element is located; or, the element may be a user interface (UI) control in the second interface, and the data of the second interface is the instruction stream of the second interface and the identifier of the UI control, or the data of the second interface is the drawing instructions and identifier of the UI control.
- an embodiment of the present application provides a screen projection method, which is applied to a first terminal.
- the method may include: the first terminal displays an interface of a first application; the first terminal receives a first operation; and in response to the first operation, the first terminal sends data of the interface of the first application to the second terminal for the second terminal to display a first interface, where the content of the first interface is a mirror image of the interface content of the first application, or the content of the first interface is the same as part of the interface content of the first application.
- the first terminal receives a second operation; in response to the second operation, the first terminal displays an interface of a second application; the first terminal receives a third operation; in the case where the first terminal projects the interface of the first application to the second terminal, in response to the third operation, the first terminal sends data of the interface of the second application to a third terminal for the third terminal to display a second interface, where the content of the second interface is a mirror image of the interface content of the second application, or the content of the second interface is the same as part of the interface content of the second application.
- the first terminal serving as the screen projection source can project the contents of multiple applications of the first terminal to one or more screen projection destinations by creating multiple media streams, which satisfies the requirement of multi-task parallelism, improves the use efficiency of the terminal, and improves the user experience.
- the method may further include: the first terminal creates a first virtual display; the first terminal draws the interface of the first application, or the first element in the interface of the first application, to the first virtual display to obtain the data of the interface of the first application; the first terminal creates a second virtual display; the first terminal draws the interface of the second application, or the second element in the interface of the second application, to the second virtual display to obtain the data of the interface of the second application.
- by creating virtual displays, the first terminal performs screen recording on the content of the projection source, so as to realize the display of the content of the projection source on the projection destination, supporting both mirror projection and hetero-source projection.
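The per-application virtual displays can be sketched as follows. On Android this would use real virtual displays (e.g. via `DisplayManager.createVirtualDisplay`); the `VirtualDisplay` class here is only a bookkeeping model of that idea, with illustrative application names.

```python
# Model of hetero-source projection: one virtual display per projected
# application, each yielding its own independent media stream.
class VirtualDisplay:
    def __init__(self, app):
        self.app = app
        self.frames = []  # recording this display yields the app's stream

    def draw(self, frame):
        self.frames.append(frame)

displays = {app: VirtualDisplay(app) for app in ("slides", "player")}
displays["slides"].draw("slide 1")
displays["player"].draw("video frame 0")
```

Because each application is drawn into its own display, its stream can be recorded and routed to a different destination without affecting the other applications.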
- the method may further include: the first terminal sends audio data of the first application to the second terminal for the second terminal to output corresponding audio; the first terminal sends audio data of the second application to the third terminal for the third terminal to output corresponding audio. In this way, projection of audio data from the projection source to the projection destination is supported.
- the method may further include: the first terminal creates a first audio recording (AudioRecord) object, and records audio data of the first application based on the first AudioRecord object; the first terminal creates a second AudioRecord object, and records audio data of the second application based on the second AudioRecord object.
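The per-application audio capture can be modeled in the same way. The patent names Android's AudioRecord; the `AudioCapture` class below only mirrors the bookkeeping (one capture object per application), not the real AudioRecord API, and the application names and samples are illustrative.

```python
# Model of per-application audio recording: one capture object per app,
# so each app's audio can be routed to its own projection destination.
class AudioCapture:
    def __init__(self, app):
        self.app = app
        self.buffer = []

    def record(self, samples):
        self.buffer.extend(samples)

captures = {"slides": AudioCapture("slides"), "player": AudioCapture("player")}
captures["player"].record([0.1, 0.2])  # only the player's stream grows
```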
- the second terminal is the same as the third terminal.
- an embodiment of the present application provides a screen projection method, which is applied to a second terminal.
- the method may include: the second terminal receives data of an interface of a first application from the first terminal; the second terminal displays a first interface, where the content of the first interface is a mirror image of the interface content of the first application, or the content of the first interface is the same as part of the content of the interface of the first application; the second terminal receives data of an interface of a second application from the first terminal; the second terminal displays a third interface, where the third interface includes the content of the first interface and the content of a second interface, and the content of the second interface is a mirror image of the interface content of the second application, or the content of the second interface is the same as part of the interface content of the second application.
- the second terminal serving as the screen projection destination can receive multiple media streams from the first terminal serving as the screen projection source, so as to realize the projection of the contents of multiple applications of the first terminal to the second terminal, which satisfies the requirement of multi-task parallelism, improves the use efficiency of the terminal, and improves the user experience.
- an embodiment of the present application provides a screen projection device, which is applied to a first terminal.
- the device may include: a display unit, configured to display an interface of the first application; an input unit, configured to receive a first operation; and a sending unit, configured to send the data of the interface of the first application to the second terminal in response to the first operation, for the second terminal to display a first interface, where the content of the first interface is a mirror image of the interface content of the first application, or the content of the first interface is the same as part of the content of the interface of the first application;
- the input unit is also used to receive the second operation;
- the display unit is further configured to display the interface of the second application in response to the second operation; the input unit is further configured to receive a third operation;
- the sending unit is further configured to, in the case where the first terminal projects the interface of the first application to the second terminal, in response to the third operation, send the data of the interface of the second application to the third terminal for the third terminal to display a second interface, where the content of the second interface is a mirror image of the interface content of the second application, or the content of the second interface is the same as part of the interface content of the second application.
- the apparatus may further include: a creating unit, configured to create a first virtual display; a drawing unit, configured to draw the interface of the first application, or the first element in the interface of the first application, to the first virtual display to obtain the data of the interface of the first application; the creating unit is further configured to create a second virtual display; the drawing unit is further configured to draw the interface of the second application, or the second element in the interface of the second application, to the second virtual display to obtain the data of the interface of the second application.
- the sending unit is further configured to send the audio data of the first application to the second terminal, so that the second terminal outputs corresponding audio, and to send the audio data of the second application to the third terminal, so that the third terminal outputs corresponding audio.
- the creating unit is further configured to create a first AudioRecord object; the apparatus may further include: a recording unit, configured to record audio data of the first application based on the first AudioRecord object; the creating unit is further configured to create a second AudioRecord object; the recording unit is further configured to record audio data of the second application based on the second AudioRecord object.
- the second terminal is the same as the third terminal.
- an embodiment of the present application provides a screen projection device, which is applied to a second terminal.
- the device may include: a receiving unit, configured to receive data of an interface of a first application from the first terminal; and a display unit, configured to display a first interface, where the content of the first interface is a mirror image of the interface content of the first application, or the content of the first interface is the same as part of the content of the interface of the first application; the receiving unit is further configured to receive data of an interface of a second application from the first terminal; the display unit is further configured to display a third interface, where the third interface includes the content of the first interface and the content of a second interface, and the content of the second interface is a mirror image of the interface content of the second application, or the content of the second interface is the same as part of the content of the interface of the second application.
- an embodiment of the present application provides a screen projection device. The device may include: a processor; and a memory for storing instructions executable by the processor, where the processor is configured to execute the instructions so that the screen projection device implements the method described in any one of the first aspect or its possible implementations, the method described in any one of the second aspect or its possible implementations, the method described in any one of the fifth aspect or its possible implementations, or the method described in the sixth aspect.
- embodiments of the present application provide a computer-readable storage medium on which computer program instructions are stored. When the computer program instructions are executed by an electronic device, the electronic device implements the method described in any one of the first aspect or its possible implementations, the method described in any one of the second aspect or its possible implementations, the method described in any one of the fifth aspect or its possible implementations, or the method described in the sixth aspect.
- an embodiment of the present application provides a screen projection system
- the system may include a first terminal and a plurality of second terminals. Each of the plurality of second terminals is configured to display a second interface and, after receiving a user operation, send the data of its second interface to the first terminal. The first terminal is configured to receive the data from each of the plurality of second terminals and, according to the data received from the plurality of second terminals, display a plurality of first interfaces on the first terminal, the plurality of first interfaces being in one-to-one correspondence with the plurality of second terminals. The content of each first interface is a mirror image of the content of the second interface displayed by the corresponding second terminal, or is the same as part of the content of that second interface.
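The many-to-one aggregation described above can be sketched as a window-layout step on the destination terminal: each projecting source gets its own window, in one-to-one correspondence with the sources. This is a simplistic illustration under stated assumptions (one row of equal-width windows); a real implementation would choose geometry from source aspect ratios and user settings.

```python
def layout_windows(sources, screen_w, screen_h):
    """Assign each projecting source terminal its own projection window.

    Returns {source: (x, y, width, height)} in one-to-one correspondence
    with the sources, laid out side by side in a single row.
    """
    if not sources:
        return {}
    w = screen_w // len(sources)
    return {src: (i * w, 0, w, screen_h) for i, src in enumerate(sources)}


# Three phones projecting to one 1920x1080 television screen:
windows = layout_windows(["phoneA", "phoneB", "phoneC"], 1920, 1080)
```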
- an embodiment of the present application provides an electronic device (such as the above-mentioned first terminal or second terminal). The electronic device includes a display screen, one or more processors, and a memory, where the display screen, the processor, and the memory are coupled
- the memory is used to store computer program code including computer instructions. When the computer instructions are executed by the electronic device, the electronic device performs the method described in any one of the first aspect or its possible implementations, the method described in any one of the second aspect or its possible implementations, the method described in any one of the fifth aspect or its possible implementations, or the method described in the sixth aspect.
- embodiments of the present application provide a computer program product including computer-readable code, or a non-volatile computer-readable storage medium carrying computer-readable code. When the computer-readable code runs in an electronic device, a processor in the electronic device executes the method described in any one of the first aspect or its possible implementations, the method described in any one of the second aspect or its possible implementations, the method described in any one of the fifth aspect or its possible implementations, or the method described in the sixth aspect.
- for the beneficial effects that can be achieved by the screen projection devices, the electronic device, the computer-readable storage medium, the screen projection system, and the computer program product described above, reference may be made to the beneficial effects in the first aspect, the second aspect, the fifth aspect, or the sixth aspect and any possible implementations thereof, which are not repeated here.
- FIG. 1A is a schematic diagram of a scenario provided by an embodiment of the present application.
- FIG. 1B is a simplified schematic diagram of a system architecture provided by an embodiment of the present application.
- FIG. 2 is a schematic structural diagram of a mobile phone according to an embodiment of the present application.
- FIG. 3 is a schematic diagram of the composition of a software architecture provided by an embodiment of the present application.
- FIG. 4 is a schematic flowchart of a screen projection method provided by an embodiment of the present application.
- FIG. 5 is a schematic diagram of a display interface provided by an embodiment of the present application.
- FIG. 6 is a schematic flowchart of another screen projection method provided by an embodiment of the present application.
- FIG. 7 is another schematic diagram of a display interface provided by an embodiment of the present application.
- FIG. 8 is another schematic diagram of a display interface provided by an embodiment of the present application.
- FIG. 9 is a schematic flowchart of another screen projection method provided by an embodiment of the present application.
- FIG. 10 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 11 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 12 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 13 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 14 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 16 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 17 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 18 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 19 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 20 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 21 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 22 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 23 is another schematic diagram of a display interface provided by an embodiment of the present application.
- FIG. 24 is another schematic diagram of a display interface provided by an embodiment of the present application.
- FIG. 25 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 26 is a schematic diagram of another display interface provided by an embodiment of the present application.
- FIG. 27 is a schematic diagram of the composition of a screen projection device according to an embodiment of the present application.
- FIG. 28 is a schematic diagram of the composition of another screen projection device provided by an embodiment of the present application.
- FIG. 29 is a schematic diagram of the composition of another software architecture provided by an embodiment of the present application.
- FIG. 30 is another schematic diagram of a display interface provided by an embodiment of the present application.
- FIG. 31 is a schematic diagram of data transmission provided by an embodiment of the present application.
- FIG. 32 is another schematic diagram of data transmission provided by an embodiment of the present application.
- FIG. 34 is a schematic diagram of the composition of a chip system according to an embodiment of the present application.
- the terms "first" and "second" are used for descriptive purposes only, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
- a feature defined with "first" or "second" may expressly or implicitly include one or more of that feature.
- "plural" means two or more.
- a user can connect multiple terminals to work together. For example, after two terminals are connected, collaborative office between the two terminals can be realized by using multi-screen collaboration.
- Multi-screen collaboration can use the mirror projection method to project the interface displayed by one terminal to the display screen of another terminal for display.
- a terminal that projects its display interface can be called a screen-casting source end, or source end.
- a terminal that receives the projection from the screen-casting source and displays the source's interface can be called a screen-casting destination end, or sink end.
- the interface projected by the screen-casting source and displayed on the screen-casting destination is called the projection interface.
- the window used by the screen-casting destination to display the projection interface is called the projection window.
- the mirror projection method can only project the display interface of one terminal onto another terminal, that is, it can only realize one-to-one screen projection.
- in some scenarios, the display interfaces of multiple terminals may need to be presented on the same terminal (e.g., a large-screen device), that is, there is a many-to-one screen projection requirement.
- currently, many-to-one projection can be realized with dedicated hardware: a wireless screen projection gateway, such as an AWIND wireless projector, projects the interfaces of multiple terminals onto the display screen of one terminal.
- however, this technology for implementing many-to-one screen projection requires a corresponding wireless screen projection device.
- the embodiment of the present application provides a screen projection method, which can be applied to a screen projection scenario.
- the display interfaces of multiple terminals can be displayed on the same terminal's display screen without the help of other devices, which satisfies many-to-one screen projection requirements in scenarios such as meetings and presentations, improves the efficiency of multi-terminal collaborative use, and improves the user experience.
- DLNA (Digital Living Network Alliance) defines device roles including the digital media server (DMS) and the digital media player (DMP).
- DMS provides the ability to acquire, record, store and serve as a source of media streams, such as providing content to various DMPs and sharing content with other devices in the network.
- DMS can be regarded as a multimedia network disk.
- DMP can find and play any media file provided by DMS.
- many computers and TVs support DLNA, but users need to turn it on manually.
- DLNA has no connection state; devices connected to the same local area network are considered connected by default.
- DLNA only supports the delivery of multimedia files (such as pictures, audio, and video). After delivery, the DMS displays a control interface and does not play synchronously. In addition, DLNA only sends the pictures, audio, or video of the mobile phone to the large screen for display or playback; for online video, third-party application support is required, and the TV (box) or large screen must support DLNA. Since DLNA essentially pushes a uniform resource locator (URL) of a resource, when multiple devices acting as DMSs deliver content to the same target device acting as the DMP, delivery is preemptive: whichever device sends content last is the one whose media file the target device plays.
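The preemptive "last push wins" behavior described above can be sketched in a few lines (an illustrative model, not the actual UPnP/DLNA protocol): the player keeps only the most recently pushed URL, so each new DMS push replaces whatever was playing.

```python
class PreemptiveRenderer:
    """Sketch of the preemptive DMP behavior: when several DMS devices push
    media URLs to the same player, only the most recent push is played."""
    def __init__(self):
        self.current_url = None
        self.history = []

    def push(self, sender, url):
        self.history.append((sender, url))
        self.current_url = url           # preempts whatever was playing
        return self.current_url


player = PreemptiveRenderer()
player.push("phoneA", "http://phoneA/video1.mp4")
player.push("phoneB", "http://phoneB/song.mp3")   # phoneB preempts phoneA
```

This is exactly why DLNA alone cannot satisfy the many-to-one requirement: the sources overwrite each other rather than being displayed side by side.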
- Miracast is a wireless display standard based on Wi-Fi Direct, established by the Wireless Fidelity (Wi-Fi) Alliance in 2012.
- Miracast is a mirror projection, that is, the interface of the projection source and the projection destination are exactly the same, which is suitable for remote sharing.
- Devices that support this standard can share video footage wirelessly.
- a mobile phone can play a movie or photo directly on a large screen such as a TV through Miracast without being affected by the length of the connecting cable.
- Miracast requires accessory support.
- not all devices support Miracast. For example, PCs have only supported Miracast since Windows 8.1; PCs with earlier versions of Windows do not support it.
- mirror projection needs to send a large number of real-time encoded data streams, which has high requirements on network quality.
- AirPlay is a wireless technology developed by Apple to transmit pictures, audio, or video wirelessly from an iOS device to an AirPlay-enabled device via Wi-Fi.
- AirPlay has a mirroring function that DLNA does not have, and can wirelessly transmit images from iOS devices such as mobile phones and tablets to the TV. That is to say, whatever is displayed on the iOS device will be displayed on the TV screen, and it is not limited to pictures and videos.
- AirPlay only works with Apple-certified devices or authorized partners' devices.
- AirPlay is not open source, and device interaction is also limited.
- the above technologies for realizing multi-screen collaboration can only project the content corresponding to one application of a device to another device for display, that is, they can only realize one-to-one screen projection, or can only "transfer" one application of a device to another device, and cannot achieve true multi-task parallelism.
- the terminal can also project the content of one or more of its applications onto other terminals for display by creating multiple media streams, so as to meet multitasking requirements, improve the use efficiency of the terminal, and improve the user experience.
- FIG. 1B shows a simplified schematic diagram of a system architecture to which embodiments of the present application can be applied.
- the system architecture may include: a first terminal 101 and at least one second terminal 102 .
- each second terminal 102 can establish a connection with the first terminal 101 in a wired or wireless manner. Based on the established connection, the first terminal 101 and the second terminal 102 may be used together cooperatively.
- the wireless communication protocol adopted when the first terminal 101 and the second terminal 102 establish a connection wirelessly may be a Wi-Fi protocol, a Bluetooth protocol, a ZigBee protocol, a near field communication (NFC) protocol, or the like, and may also be any of various cellular network protocols, which is not specifically limited here. The wireless communication protocols used by different second terminals 102 to establish connections with the first terminal 101 may be the same or different.
- the screen-casting source end among the first terminal 101 and the multiple second terminals 102 can project the interface displayed on its display screen, or some elements in that interface, onto the display screen of the screen-casting destination end.
- the first terminal 101 as the screen projection destination and multiple second terminals 102 as the screen projection source as an example.
- Each second terminal 102 in the plurality of second terminals 102 may project the interface displayed on its display screen or some elements in the interface to the display screen of the first terminal 101 for display.
- the first terminal 101 may aggregate the interfaces of multiple second terminals 102 and display them on the display screen of the first terminal 101 for the user to view.
- the user can also use the input device of the first terminal 101 to operate on the projection interface corresponding to each second terminal 102 displayed on the display screen of the first terminal 101, so as to operate on the actual interface displayed on that second terminal 102.
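Operating on a source's actual interface through its projection window implies a coordinate-mapping step, which can be sketched as follows (an illustrative model with hypothetical geometry, not the actual injection mechanism): a touch point in the destination's projection window is translated into the source terminal's own screen coordinates before the operation is injected there.

```python
def map_to_source(x, y, window, source_size):
    """Map a touch point inside the projection window (destination
    coordinates) back to the source terminal's screen coordinates,
    so the corresponding operation can be performed on the source."""
    wx, wy, ww, wh = window              # projection window rect on destination
    sw, sh = source_size                 # source terminal's screen resolution
    if not (wx <= x < wx + ww and wy <= y < wy + wh):
        return None                      # touch landed outside this source's window
    return ((x - wx) * sw // ww, (y - wy) * sh // wh)


# phoneA's window occupies (0, 0, 640, 1080) on the TV; its own screen is 1080x2340.
pt = map_to_source(320, 540, (0, 0, 640, 1080), (1080, 2340))
```

Returning `None` for points outside the window lets the destination decide which source terminal, if any, should receive the injected operation.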
- the screen projection source ends among the first terminal 101 and the second terminals 102 can create multiple media streams and project the content of one or more of their applications onto the display screen of the projection destination end.
- the first terminal 101 can project the content of one or more applications in the first terminal 101 to display on the display screen of at least one second terminal 102 by creating multiple media streams, so as to meet the requirement of parallel multitasking.
- the first terminal 101 can project the contents of multiple applications in the first terminal 101 to display on one or more display screens of the second terminal 102 by creating multiple media streams.
- the first terminal 101 may project the content of an application in the first terminal 101 to display screens of multiple second terminals 102 by creating multiple media streams.
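The multi-stream cases above (one application to one terminal, one application to several terminals, several applications to one terminal) can be summarized with a small routing sketch. This is plain Python with hypothetical names, not the actual media pipeline: each stream carries one application's content and lists its destination terminals.

```python
class StreamRouter:
    """Sketch of the multi-stream projection described above: each media
    stream carries one application's content and may be delivered to one
    or more destination terminals."""
    def __init__(self):
        self.streams = []                # list of (app, [destinations])

    def add_stream(self, app, destinations):
        self.streams.append((app, list(destinations)))

    def deliveries(self):
        # Flatten to the (app, destination) pairs actually sent out.
        return [(app, dst) for app, dsts in self.streams for dst in dsts]


router = StreamRouter()
router.add_stream("video_app", ["tv"])            # one app -> one terminal
router.add_stream("notes_app", ["tv", "tablet"])  # one app -> many terminals
```

Since each stream is independent, the same model covers many applications being projected to one destination in parallel, which is what enables multi-task parallelism.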
- the terminals in the embodiments of the present application may be mobile phones, tablet computers, handheld computers, personal computers (PCs), cellular phones, personal digital assistants (PDAs), wearable devices (such as smart watches), in-vehicle computers, game consoles, augmented reality (AR)/virtual reality (VR) devices, and the like.
- the technical solutions provided in this embodiment can be applied to other electronic devices, such as smart home devices (eg, TV sets), in addition to the above-mentioned terminals (or mobile terminals).
- the device shapes of the first terminal 101 and the second terminal 102 may be the same or different.
- the device forms of the multiple second terminals 102 may be the same or different, which is not limited in this embodiment.
- the first terminal 101 may be a large-screen device such as a PC and a TV
- the second terminal 102 may be a mobile device such as a mobile phone and a tablet computer.
- the first terminal 101 may be a mobile device such as a mobile phone and a tablet
- the second terminal 102 may be a large-screen device such as a PC and a TV.
- the following description takes the first terminal 101 being a television and the plurality of second terminals 102 being mobile phones as an example, but this embodiment is not limited thereto.
- the terminal is a mobile phone as an example.
- FIG. 2 is a schematic structural diagram of a mobile phone according to an embodiment of the present application. The methods in the following embodiments can be implemented in a mobile phone having the above-mentioned hardware structure.
- the mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and the like.
- the mobile phone may further include a mobile communication module 150, a subscriber identification module (subscriber identification module, SIM) card interface 195 and the like.
- the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
- the structure illustrated in this embodiment does not constitute a specific limitation on the mobile phone.
- the cell phone may include more or fewer components than shown, or some components may be combined, or some components may be split, or a different arrangement of components.
- the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
- the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), and the like. Different processing units may be independent devices, or may be integrated in one or more processors.
- the controller can be the nerve center and command center of the phone.
- the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
- a memory may also be provided in the processor 110 for storing instructions and data.
- the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
- the processor 110 may include one or more interfaces.
- the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and/or a USB interface, etc.
- the charging management module 140 is used to receive charging input from the charger. While the charging management module 140 charges the battery 142 , it can also supply power to the mobile phone through the power management module 141 .
- the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
- the power management module 141 can also receive the input of the battery 142 to supply power to the mobile phone.
- the wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
- Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
- Each antenna in a cell phone can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
- the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
- the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G etc. applied on the mobile phone.
- the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
- the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
- the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
- at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
- at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
- the modem processor may include a modulator and a demodulator.
- the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
- the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
- the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
- the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
- the modem processor may be a stand-alone device.
- the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
- the wireless communication module 160 can provide solutions for wireless communication applied on the mobile phone, including wireless local area network (WLAN) (such as a Wi-Fi network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), NFC, and infrared (IR) technology.
- the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
- the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
- the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify the signal, and convert it into electromagnetic waves for radiation through the antenna 2 .
- the antenna 1 of the mobile phone is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile phone can communicate with the network and other devices through wireless communication technology.
- the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
- the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
- the mobile phone realizes the display function through the GPU, the display screen 194, and the application processor.
- the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
- Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
- Display screen 194 is used to display images, videos, and the like.
- Display screen 194 includes a display panel.
- the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Miniled, a MicroLed, a Micro-oLed, a quantum dot light-emitting diode (QLED), or the like.
- the handset may include 1 or N display screens 194, where N is a positive integer greater than 1.
- the mobile phone can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
- the mobile phone may include 1 or N cameras 193 , where N is a positive integer greater than 1.
- the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone.
- the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
- Internal memory 121 may be used to store computer executable program code, which includes instructions.
- the processor 110 executes various functional applications and data processing of the mobile phone by executing the instructions stored in the internal memory 121 .
- the internal memory 121 may include a storage program area and a storage data area.
- the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
- the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone.
- the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
- the mobile phone can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
- the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
- the pressure sensor 180A may be provided on the display screen 194 .
- the gyroscope sensor 180B can be used to determine the motion attitude of the mobile phone.
- the air pressure sensor 180C is used to measure air pressure.
- the magnetic sensor 180D includes a Hall sensor.
- the mobile phone can use the magnetic sensor 180D to detect the opening and closing of the flip holster.
- the acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone in various directions (generally three axes).
- Distance sensor 180F for measuring distance.
- the mobile phone can use the proximity light sensor 180G to detect the user holding the mobile phone close to the ear to talk, so as to automatically turn off the screen to save power.
- the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
- the ambient light sensor 180L is used to sense ambient light brightness.
- the fingerprint sensor 180H is used to collect fingerprints. The mobile phone can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint call answering, and the like.
- the touch sensor 180K is also called a "touch panel".
- the touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen.
- the touch sensor 180K is used to detect a touch operation on or near it.
- the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
- Visual output related to touch operations may be provided through display screen 194 .
- the touch sensor 180K may also be disposed on the surface of the mobile phone, at a location different from that of the display screen 194.
- the bone conduction sensor 180M can acquire vibration signals.
- the keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
- Motor 191 can generate vibrating cues. The motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
- the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
- the SIM card interface 195 is used to connect a SIM card.
- the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the mobile phone.
- the mobile phone can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
- the mobile phone interacts with the network through the SIM card to realize functions such as calls and data communication.
- the handset employs an eSIM, i.e., an embedded SIM card.
- the eSIM card can be embedded in the mobile phone and cannot be separated from the mobile phone.
- the embodiments of the present application exemplarily illustrate the software architectures of the first terminal 101 and the second terminal 102 .
- here, the first terminal 101 is used as the screen projection destination end and the second terminal 102 is used as the screen projection source end as an example.
- FIG. 3 is a schematic diagram of the composition of a software architecture provided by an embodiment of the present application.
- the software architectures of both the first terminal 101 and the second terminal 102 may include: an application layer and a framework layer (framework, FWK).
- the first terminal 101 may include: a network management module, a decoding module and a window management module.
- Each module included in the first terminal 101 may be included in any layer of the software architecture of the first terminal 101 .
- the network management module of the first terminal 101, the decoding module and the window management module are all included in the framework layer of the first terminal 101, which is not specifically limited in this embodiment.
- the first terminal 101 may also include an application program, which may be included in the above-mentioned application layer.
- the application program may include a screencasting application, and the screencasting application may assist the first terminal 101 serving as the screencasting destination to implement a many-to-one screencasting function.
- the second terminal 102 may include: a network management module, an encoding module and a setting module. Each module included in the second terminal 102 may be included in any layer of the software architecture of the second terminal 102 .
- the network management module and the encoding module of the second terminal 102 are included in the framework layer of the second terminal 102 .
- the setting module of the second terminal 102 is included in the application layer of the second terminal 102, which is not specifically limited in this embodiment.
- the second terminal 102 may also include an application program, which may be included in the above-mentioned application layer.
- the application program may include a screen projection application, and the screen projection application can assist the second terminal 102 serving as the screen projection source end to implement a many-to-one screen projection function.
- the network management module of the first terminal 101 may be responsible for establishing a transmission channel between the first terminal 101 and the second terminal 102 .
- the network management module of the first terminal 101 can support the establishment of transmission channels between the first terminal 101 and a plurality of second terminals 102, that is, supports the establishment of a 1-to-N connection.
- the decoding module of the first terminal 101 may be responsible for decoding the data from the second terminal 102 serving as the screen projection source end (for example, called screen projection data, and may also be called screen recording data).
- the decoding module supports multi-channel decoding. For example, for data from different second terminals 102, the decoding module of the first terminal 101 can use different decoding parameters to decode the corresponding data.
- the window management module of the first terminal 101 may be responsible for presenting multiple screen projection windows on the first terminal 101 according to the decoded multi-channel data.
- the plurality of screen projection windows are in one-to-one correspondence with the plurality of second terminals 102 .
- the content in the screen projection window is the same as all or part of the content of the interface presented by the corresponding second terminal 102 .
- the window management module of the first terminal 101 is also responsible for dynamically increasing and decreasing the screen projection window on the first terminal 101, and reducing, enlarging, and switching the focus window on the screen projection window presented on the first terminal 101 according to user operations.
- the network management module of the second terminal 102 may be responsible for establishing a transmission channel between the second terminal 102 and the first terminal 101 .
- the encoding module of the second terminal 102 may be responsible for encoding the currently displayed interface or data corresponding to some elements in the interface (for example, called screen projection data).
- the setting module of the second terminal 102 may be responsible for setting audio and video parameters according to user settings, and the audio and video parameters may include resolution, horizontal and vertical screen, homologous/heterogeneous, layer filtering, and the like.
- homologous/heterogeneous may refer to whether the current interface will continue to be displayed on the second terminal 102 after the second terminal 102 casts the screen, and homologous means that the second terminal 102 will continue to display the current interface after the second terminal 102 casts the screen.
- heterogeneous means that the second terminal 102 does not continue to display the current interface after the screen is projected by the second terminal 102.
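The audio and video parameters handled by the setting module can be modeled as a simple structure. The following is a minimal illustrative sketch, not the patent's actual implementation; the field names, types, and defaults are assumptions:

```python
from dataclasses import dataclass

@dataclass
class AudioVideoSettings:
    """Audio/video parameters a user may set on the screen projection source end."""
    resolution: str = "1080P"      # e.g. "720P", "1080P", "2K"
    landscape: bool = True         # horizontal vs. vertical screen
    homologous: bool = True        # True: the source keeps displaying the current
                                   # interface after casting; False: heterogeneous
    layer_filtering: bool = False  # whether some layers are filtered out

# Example: a heterogeneous, 720P, portrait configuration.
settings = AudioVideoSettings(resolution="720P", landscape=False, homologous=False)
print(settings.homologous)  # False
```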
- hereinafter, taking the first terminal 101 being a television set and the plurality of second terminals 102 being mobile phones (for example, the plurality of second terminals 102 include a mobile phone 1 and a mobile phone 2) as an example, the screen projection method provided by the embodiments of the present application will be introduced in detail with reference to the accompanying drawings.
- FIG. 4 is a schematic flowchart of a screen projection method provided by an embodiment of the present application. As shown in FIG. 4, the method may include the following S401-S406.
- the mobile phone 1 establishes a connection with the TV set, and the mobile phone 2 establishes a connection with the TV set.
- in many-to-one screen projection, the display interfaces of multiple terminals (the second terminals, such as the above-mentioned mobile phone 1 and mobile phone 2) are projected onto the same terminal (the first terminal, such as the above-mentioned TV set). When many-to-one screen projection is performed, the plurality of second terminals can be respectively connected to the first terminal.
- the first terminal and the second terminal may establish a connection in a wired manner.
- a wired connection can be established between the mobile phone 1 and the TV set through a data line.
- a wired connection can be established between the mobile phone 2 and the television set through a data line.
- the first terminal and the second terminal may establish a connection wirelessly.
- the connection information may be a device identifier of the terminal, such as an internet protocol (IP) address or a port number, or an account logged in by the terminal, and the like.
- the account logged in by the terminal may be an account provided by the operator for the user, such as a Huawei account.
- the account logged in by the terminal can also be an application account, such as WeChat Account, Youku account, etc.
- the transmission capability of the terminal may be a near-field communication capability or a long-distance communication capability. That is to say, the wireless communication protocol used to establish a connection between terminals, for example between the mobile phone 1 (or the mobile phone 2) and the TV set, may be a near-field communication protocol such as the Wi-Fi protocol, the Bluetooth protocol or the NFC protocol, or may be a cellular network protocol.
- different second terminals may establish connections with the first terminal in the same manner or in different manners. For example, the manner in which the TV establishes a connection with the mobile phone 1 may be the same as or different from the manner in which it establishes a connection with the mobile phone 2; there is no specific limitation in this embodiment.
- a plurality of second terminals all establish connections with the first terminal in a wireless manner.
- the user wants to implement many-to-one screen projection from multiple second terminals to the first terminal, that is, multiple second terminals, such as mobile phone 1 and mobile phone 2, are the source ends of the screen projection, and the first terminal, such as a TV, is the destination end of the screen projection.
- the user can manually enable the screen projection service function of the TV set serving as the screen projection destination (it may also be referred to as a many-to-one screen projection function).
- the screen projection service function of the TV can also be automatically turned on, for example, when the TV is turned on. After the screen projection service function of the TV is turned on, the TV can obtain connection information, such as IP addresses, of each screen projection source (eg, mobile phone 1 and mobile phone 2 ).
- the TV can obtain the connection information of each second terminal serving as the screen projection source end in the following manner.
- the connection information of each second terminal may be manually input by the user.
- the TV may display a configuration interface 1 for the user to input connection information of each second terminal, such as an IP address.
- the television can obtain the connection information of each second terminal.
- in the configuration interface 1, the number of controls (eg, input boxes) for the user to input connection information may be fixed (eg, 2, 3 or more, which is not specifically limited in this embodiment).
- the user can input the connection information of the second terminal in the control.
- the amount of connection information entered by the user can be equal to or less than the number of controls. It can be understood that the number of connection information entries input by the user is the same as the number of screen projection source ends that can be connected to the TV.
- the TV can display a configuration interface 501 , which includes an input box 502 and an input box 503 for the user to input connection information.
- the user may input connection information, such as an IP address, of the second terminal serving as the screen projection source in the input box 502 and the input box 503 respectively.
- the user inputs the IP address of the mobile phone 1: 192.168.43.164 in the input box 502, and the IP address of the mobile phone 2 in the input box 503: 192.168.43.155.
- the TV can obtain the connection information of each second terminal from the configuration interface 501 .
- the aggregation button 504 in the configuration interface 501 may be operated, such as a click operation.
- the TV set can obtain the connection information of each second terminal from the configuration interface 501, such as IP address: 192.168.43.164 and IP address: 192.168.43.155.
- the IP address: 192.168.43.164 and the IP address: 192.168.43.155 can be obtained from the configuration interface 501 by the window management module of the television.
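The collection of the user-entered IP addresses into "array 1" can be sketched as follows. This is an illustrative sketch using Python's standard `ipaddress` module; the function name `collect_source_addresses` is an assumption, not part of the patent:

```python
import ipaddress

def collect_source_addresses(inputs):
    """Validate user-entered connection information and build 'array 1':
    the list of IP addresses of the screen projection source ends."""
    array_1 = []
    for text in inputs:
        text = text.strip()
        if not text:  # the user may leave some input boxes empty
            continue
        # ipaddress.ip_address raises ValueError for malformed input
        array_1.append(str(ipaddress.ip_address(text)))
    return array_1

# The addresses from input box 502 and input box 503 in the example above.
array_1 = collect_source_addresses(["192.168.43.164", "192.168.43.155"])
print(array_1)  # ['192.168.43.164', '192.168.43.155']
```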
- the connection information of each second terminal serving as the screen projection source end may be monitored by the TV.
- the mobile phone 1, the mobile phone 2 and the TV set have the Bluetooth function turned on.
- the TV can start to perform the device discovery process.
- the TV has Bluetooth monitoring enabled.
- after the second terminal serving as the screen projection source end, such as the mobile phone 1 or the mobile phone 2, has its Bluetooth function enabled, it can send a Bluetooth broadcast.
- the TV can receive the Bluetooth broadcast sent by the second terminal.
- the TV may also exchange connection information, such as an IP address, with the discovered device (such as the above-mentioned second terminal).
- the TV may send notification messages to the second terminals, such as the mobile phone 1 and the mobile phone 2, respectively, to notify them to report their own IP addresses.
- the TV set (eg, the network management module of the TV set) can then receive the IP addresses from the second terminals, such as the mobile phone 1 and the mobile phone 2.
- the TV may monitor the Bluetooth broadcasts sent by all terminals within its monitoring range.
- the television set may send the above notification message to all monitored terminals, so that they can report their own connection information. For example, if the TV monitors the Bluetooth broadcasts of the mobile phone 1 and the mobile phone 2, it sends the above notification message to both the mobile phone 1 and the mobile phone 2.
- the TV may display a list of discovered devices.
- the discovered device list includes the identifiers of all terminals monitored by the TV, for example, including the identifier of the mobile phone 1 and the identifier of the mobile phone 2 .
- the discovered device list is for the user to select the terminal that the user wants to connect with the TV.
- the television set may send the above notification message only to the terminals selected by the user. For example, if the user selects the identification of the mobile phone 1 and the identification of the mobile phone 2 in the discovered device list, the TV can send the above notification message to the mobile phone 1 and the mobile phone 2.
- after acquiring the connection information of each second terminal, the television can establish a connection with the corresponding second terminal according to each piece of acquired connection information.
- the wireless communication protocol adopted when the TV sets establish the connection with each second terminal may be the same or different, which is not specifically limited in this embodiment.
- the TV can establish a connection with the mobile phone 1 using the Wi-Fi protocol according to the IP address 192.168.43.164 of the mobile phone 1, and establish a connection with the mobile phone 2 using the Wi-Fi protocol according to the IP address 192.168.43.155 of the mobile phone 2.
- the TV can establish a connection with the mobile phone 1 using the Wi-Fi protocol according to the IP address 192.168.43.164 of the mobile phone 1, and establish a connection with the mobile phone 2 using the Bluetooth protocol according to the IP address 192.168.43.155 of the mobile phone 2.
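The point that the TV may reach different source ends over different wireless protocols can be illustrated with a trivial per-source protocol choice. This sketch is purely illustrative; the function name and the selection rule are assumptions:

```python
def choose_protocol(ip, prefer_bluetooth=()):
    """Illustrative per-source protocol selection: the screen projection
    destination may connect to different source ends over different protocols."""
    return "Bluetooth" if ip in prefer_bluetooth else "Wi-Fi"

# Mirrors the example: Wi-Fi to mobile phone 1, Bluetooth to mobile phone 2.
connections = {ip: choose_protocol(ip, prefer_bluetooth={"192.168.43.155"})
               for ip in ["192.168.43.164", "192.168.43.155"]}
print(connections)
# {'192.168.43.164': 'Wi-Fi', '192.168.43.155': 'Bluetooth'}
```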
- the process of establishing a connection between the TV and the second terminal may be: the network management module of the TV initiates a network connection to the second terminal according to the IP address, such as by sending a connection establishment request.
- the network management module of the second terminal completes the establishment of the connection with the TV set.
- the connection information of each second terminal is specifically obtained by the window management module of the TV.
- the window management module of the TV can send the obtained connection information of each second terminal to the network management module of the TV, so that the network management module of the TV can initiate network connection.
- the TV creates views corresponding to the mobile phone 1 and the mobile phone 2 respectively, and configures decoding parameters corresponding to the mobile phone 1 and the mobile phone 2 respectively.
- the terminal serving as the screen projection source end can project the interface displayed on its display screen to display on the display screen of the terminal serving as the screen projection destination end.
- multiple second terminals are used as the source ends of screen projection and the first terminal is used as the destination end of screen projection; that is, multiple second terminals can project the interfaces displayed on their display screens onto the display screen of the first terminal to realize many-to-one screen projection.
- the first terminal serving as the screen projection destination can perform the following preparations:
- the first terminal may create a corresponding view (view), used for rendering the interface projected by the second terminal.
- views may be drawing components in the embodiments of the present application.
- the user can input the connection information of each second terminal, such as an IP address, through the configuration interface 1.
- the first terminal such as the window management module of the first terminal, can obtain the IP addresses of the second terminals from the configuration interface 1 (eg, step 1 in FIG. 6 ).
- the first terminal may locally store an array, such as array 1.
- the array 1 includes the IP addresses of each second terminal serving as the screen projection source end.
- the first terminal may, according to the array 1, create a corresponding view for each second terminal serving as the screen projection source, for rendering the interface projected by each second terminal.
- a view array is created by the window management module of the first terminal, and the view array may include: views corresponding to the IP addresses in the array 1 one-to-one (eg, step 2 in FIG. 6 ).
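The one-to-one mapping from array 1 to the view array can be sketched as follows. The `View` class here is only a stand-in for the real drawing component, which this sketch does not model:

```python
class View:
    """Stand-in for the drawing component used to render one projected interface."""
    def __init__(self, source_ip):
        self.source_ip = source_ip

def create_view_array(array_1):
    # One view per screen projection source end, in the same order as array 1.
    return [View(ip) for ip in array_1]

array_1 = ["192.168.43.164", "192.168.43.155"]
view_array = create_view_array(array_1)
print([v.source_ip for v in view_array])
# ['192.168.43.164', '192.168.43.155']
```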
- the first terminal configures decoding parameters for each of the plurality of second terminals for decoding screen projection data from each of the second terminals.
- the specific implementation of projecting the currently displayed interface by the screencasting source end to the screencasting destination end may be that the screencasting source end obtains the data corresponding to the currently displayed interface, such as screencasting data, and sends it to the screencasting destination end, so that the screencasting destination end displays the corresponding content on its display screen.
- the screen projection source terminal transmits the screen projection data
- the screen projection data can be encoded, and the encoded screen projection data can be transmitted to the screen projection destination end.
- after receiving the screen-casting data from the screen-casting source, the screen-casting destination can decode it.
- the first terminal may use the same decoding parameters to decode screen projection data from different second terminals, or use different decoding parameters to decode screen projection data from different second terminals.
- the screen projection data of the second terminal is decoded.
- after the window management module of the first terminal successfully creates the view corresponding to each IP address, the window management module may configure decoding parameters associated with the corresponding IP address in the decoding module of the first terminal (eg, step 3 in FIG. 6).
- the window management module of the first terminal may configure decoding parameters associated with the corresponding IP address in the decoding module through a callback function after the view is successfully created.
- the first terminal may configure different decoding parameters for each of the second terminals for decoding the screen projection data from each of the second terminals.
- the above-mentioned decoding parameters may be negotiated between the first terminal and the second terminal, or may be pre-configured on the first terminal, which is not specifically limited in this embodiment.
- the above-mentioned decoding parameters may include: the distribution mode of the video stream, the specification of the video stream, the video encoding format, the bit rate of the video encoding, the flag of the virtual display (Virtual Display), whether to project audio data, and the like.
- the distribution mode of the video stream may include a broadcast mode, a distribution mode, a convergence mode, and the like. Broadcast mode can refer to starting only a single video stream and distributing it to multiple projection destinations with low latency.
- the distribution mode may refer to enabling multiple video streams to be distributed to multiple different projection destinations.
- Convergence mode may refer to enabling multiple video streams to be distributed to the same projection destination.
- the specification of the video stream may refer to the resolution of the video encoder, such as 720P, 1080P, 2K, etc.
- the encoding format of the video may be H.264 (Advanced Video Coding (AVC)), H.265 (High Efficiency Video Coding (HEVC)), and the like.
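The decoding parameters enumerated above can be grouped into one structure per source end. The following sketch is illustrative only; the field names, defaults, and the bit-rate unit are assumptions, not values from the patent:

```python
from dataclasses import dataclass
from enum import Enum

class DistributionMode(Enum):
    BROADCAST = "broadcast"        # one stream to many destinations, low latency
    DISTRIBUTION = "distribution"  # multiple streams to multiple destinations
    CONVERGENCE = "convergence"    # multiple streams to the same destination

@dataclass
class DecodingParams:
    mode: DistributionMode = DistributionMode.CONVERGENCE
    resolution: str = "1080P"           # specification of the video stream
    codec: str = "H.265"                # H.264 (AVC) or H.265 (HEVC)
    bitrate_kbps: int = 8000            # bit rate of the video encoding (assumed unit)
    virtual_display_flag: str = "vd-0"  # flag of the Virtual Display (illustrative)
    project_audio: bool = True          # whether audio data is projected

# Many-to-one projection: multiple streams converge on one destination (the TV).
params_phone1 = DecodingParams(codec="H.264")
print(params_phone1.mode.value)  # convergence
```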
- the first terminal saves a connection instance for each of the plurality of second terminals for receiving screen projection data from the second terminal.
- the first terminal establishes a connection with each second terminal based on the obtained (eg, user input) IP address.
- the window management module of the first terminal can transmit the obtained IP address of each second terminal to the network management module of the first terminal, and the network management module can establish a connection with each second terminal according to the obtained IP addresses (eg, step 4 in FIG. 6).
- the first terminal, such as the network management module of the first terminal, can locally maintain an array, such as called array 2, and the array 2 includes connection instances (or referred to as instances) corresponding to the IP addresses one-to-one, used to receive screen projection data from the corresponding second terminals.
- the mobile phone 1 and the mobile phone 2 are used as the screen projection source ends, and the TV set is used as the screen projection destination end.
- the TV displays the configuration interface 1 (the configuration interface 501 shown in FIG. 5 )
- the user can input the IP addresses of the mobile phone 1 and the mobile phone 2 in the configuration interface 1.
- the window management module of the TV can obtain the IP address of the mobile phone 1 and the IP address of the mobile phone 2 from the configuration interface 1 .
- the TV set can save an array 1 locally.
- the array 1 includes the IP address of mobile phone 1 and the IP address of mobile phone 2.
- the window management module of the TV can create a view array according to the array 1.
- the view array includes: the view corresponding to the IP address of mobile phone 1 in array 1, such as view 1, which is used to render the interface projected by mobile phone 1; and the view corresponding to the IP address of mobile phone 2 in array 1, such as view 2, which is used to render the interface projected by mobile phone 2.
- after the window management module of the TV successfully creates the view 1 corresponding to the IP address of the mobile phone 1, the decoding parameters associated with the IP address of the mobile phone 1, such as decoding parameter 1, are configured in the decoding module through the callback function.
- similarly, after the view 2 corresponding to the IP address of the mobile phone 2 is successfully created, the decoding parameters associated with the IP address of the mobile phone 2, such as decoding parameter 2, are configured in the decoding module through the callback function.
- the TV can configure different decoding parameters for mobile phone 1 and mobile phone 2 for decoding the screen projection data.
- the network management module of the TV set can also maintain an array 2 locally.
- the array 2 includes: a connection instance corresponding to the IP address of mobile phone 1 in array 1, such as connection instance 1, which is used to receive screen projection data from mobile phone 1, and a connection corresponding to the IP address of mobile phone 2 in array 1 Instances, such as connection instance 2, are used to receive screencast data from mobile phone 2.
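The per-source bookkeeping on the destination end (view array, decoding parameters configured via the callback, and the connection instances of array 2) can be tied together in one sketch. The class and method names are assumptions; the view, decoder, and connection objects are represented by plain strings:

```python
class ProjectionDestination:
    """Sketch of the per-source state kept by the screen projection
    destination: one view, one set of decoding parameters and one
    connection instance per source IP address."""
    def __init__(self):
        self.views = {}        # IP -> view (window management module)
        self.decoders = {}     # IP -> decoding parameters (decoding module)
        self.connections = {}  # IP -> connection instance, i.e. "array 2"

    def register_source(self, ip, decoding_params):
        self.views[ip] = f"view for {ip}"
        # In the patent, this configuration happens in a callback fired
        # after the view is successfully created.
        self.decoders[ip] = decoding_params
        self.connections[ip] = f"connection instance for {ip}"

tv = ProjectionDestination()
tv.register_source("192.168.43.164", "decoding parameter 1")
tv.register_source("192.168.43.155", "decoding parameter 2")
print(len(tv.connections))  # 2
```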
- the mobile phone 1 acquires the screen projection data 1 and sends it to the TV.
- the mobile phone 2 acquires the screen projection data 2 and sends it to the TV.
- when the first terminal and the second terminal are connected, the second terminal can act as the screen projection source end to project the interface displayed on its display screen onto the display screen of the first terminal, which acts as the screen projection destination end, for display.
- the conditions for the second terminal to start screen projection include not only successfully establishing a connection with the first terminal, but also receiving a corresponding user operation.
- the user operation may be an operation that the user selects to start screencasting, such as a user's click operation on the start screencasting button.
- the operation of selecting to start screencasting may be received by the second terminal before establishing the connection with the first terminal, or may be received after the connection with the first terminal is established. If the operation is received before the connection is established, the second terminal can start screencasting once it successfully establishes a connection with the first terminal. If the operation is received after the connection is established, the second terminal starts screencasting once the connection has been successfully established and the operation has been received.
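The condition above (casting starts only when both the connection and the user operation have occurred, in either order) can be sketched as a tiny state holder. Names and structure are illustrative assumptions:

```python
class CastingState:
    """Minimal sketch: the second terminal starts screencasting only after
    both (1) the connection with the first terminal is established and
    (2) the user operation selecting to start casting is received,
    regardless of which happens first."""
    def __init__(self):
        self.connected = False
        self.start_requested = False
        self.casting = False

    def _maybe_start(self):
        if self.connected and self.start_requested:
            self.casting = True

    def on_connected(self):
        self.connected = True
        self._maybe_start()

    def on_start_screencast_tapped(self):
        self.start_requested = True
        self._maybe_start()

# Case: the user taps "start casting" before the connection is established.
s = CastingState()
s.on_start_screencast_tapped()
print(s.casting)  # False: still waiting for the connection
s.on_connected()
print(s.casting)  # True
```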
- the user operation may be an operation of the user confirming screen projection during the process of establishing a connection between the second terminal and the first terminal.
- the second terminal may display a confirmation interface to ask the user whether to project the display interface of the second terminal to the first terminal for display.
- the operation for confirming screen projection may be a click operation of the user on the confirmation screen projection button in the confirmation interface. Afterwards, after the second terminal successfully establishes a connection with the first terminal, the second terminal can start to perform screen projection.
- the specific implementation of projecting the interface displayed on the display screen of the second terminal onto the display screen of the first terminal may be: the second terminal obtains the data corresponding to its currently displayed interface (referred to as the second interface in this embodiment of the present application), such as screen projection data, and sends it to the first terminal for the first terminal to display corresponding content on its display screen, thereby realizing the projection display of the display interface of the second terminal on the display screen of the first terminal.
- mobile phone 1 and mobile phone 2 are used as the screen projection source, and the TV is used as the screen projection destination.
- this example takes a wireless screen projection scenario in which the above user operation is executed before the mobile phone 1 and the mobile phone 2 establish connections with the TV set.
- the user can trigger the mobile phone 1 and the mobile phone 2 to start screen projection respectively.
- the mobile phone 1 currently displays an interface 701
- the mobile phone 2 currently displays an interface 702 .
- the user can trigger the mobile phone 1 and the mobile phone 2 to display an interface including a start screen projection button, such as a configuration interface 2, through which the mobile phone 1 and the mobile phone 2 can be triggered to start screen projection.
- the user can trigger the mobile phone 1 to display a configuration interface 801 , and the configuration interface 801 includes a start screen projection button 802 .
- the user can perform a click operation on the start screen projection button 802 .
- the mobile phone 1 receives the user's click operation on the start screen casting button 802 .
- the mobile phone 1 can acquire the data corresponding to the current display interface 701 .
- the mobile phone 1 can obtain the corresponding data of the current display interface 701 of the mobile phone 1 through the display management module of the mobile phone 1 (or called a display manager, DisplayManager, which can be a module of the framework layer of the mobile phone 1), such as the screen projection data 1.
- the user can also trigger the mobile phone 2 to display the configuration interface 2 (eg, similar to the configuration interface 801 in FIG. 8 ).
- the data corresponding to the current display interface 702 can be obtained.
- the mobile phone 2 can obtain data corresponding to the current display interface of the mobile phone 2 through the display management module of the mobile phone 2 (or a display manager, which can be a module of the framework layer of the mobile phone 2 ), such as screen projection data 2 .
- the TV set serving as the screen projection destination can establish connections with the mobile phone 1 and the mobile phone 2 respectively according to the IP addresses of the mobile phone 1 and the mobile phone 2 .
- the mobile phone 1 can send the obtained screen projection data 1 to the TV set, so as to realize the projection display of the display interface 701 of the mobile phone 1 on the TV display screen.
- the mobile phone 2 can send the obtained screen projection data 2 to the TV set, so as to realize the projection display of the display interface of the mobile phone 2 on the TV display screen.
- a distributed multimedia protocol (Distributed Multi-media Protocol, DMP) may be used to implement the projection display of the display interface of the second terminal on the display screen of the first terminal.
- the second terminal may use its display management module to create a virtual display (VirtualDisplay). Afterwards, the second terminal may move the drawing of the interface displayed on its display screen into the VirtualDisplay, so that the second terminal can obtain the corresponding screen projection data, which it may then send to the first terminal.
- after obtaining the screen projection data, the second terminal can encode the screen projection data with its encoding module and send it to the network management module of the second terminal.
- the network management module of the second terminal may then send the encoded screen projection data to the first terminal through the connection established with the first terminal.
- wireless projection can also be used to realize the projection display of the display interface of the second terminal on the display screen of the first terminal: the second terminal can obtain all layers of its display interface and integrate all the obtained layers into a video stream (or screencast data), which the encoding module of the second terminal then encodes and passes to the network management module of the second terminal, so that the network management module sends it to the first terminal through the established connection using the real-time streaming protocol (RTSP).
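the source-side flow described above (capture into a virtual display, encode, hand off to the network module for sending) can be sketched as follows. This is an illustrative stand-in, not the patent's actual implementation: `EncodingModule`, `NetworkModule`, and `ScreenProjectionSource` are hypothetical names, and the "encoding" is a byte-tagging placeholder for a real video encoder.

```python
# Illustrative sketch of the capture -> encode -> send pipeline.
# All class and method names are hypothetical.

class EncodingModule:
    def encode(self, frame: bytes) -> bytes:
        # Stand-in for a real video encoder (e.g. H.264).
        return b"enc:" + frame

class NetworkModule:
    def __init__(self):
        self.sent = []          # frames "sent" over the established connection
    def send(self, payload: bytes) -> None:
        # Stand-in for transmission to the first terminal (e.g. via RTSP).
        self.sent.append(payload)

class ScreenProjectionSource:
    """Second terminal: capture a frame, encode it, send it."""
    def __init__(self):
        self.encoder = EncodingModule()
        self.network = NetworkModule()
    def project_frame(self, frame: bytes) -> None:
        self.network.send(self.encoder.encode(frame))

src = ScreenProjectionSource()
src.project_frame(b"interface-701")
print(src.network.sent[0])  # b'enc:interface-701'
```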
- the above embodiments are described by projecting all the contents of the display interface on the display screen of the second terminal to the display screen of the first terminal for display as an example.
- part of the content of the interface displayed on the display screen of the second terminal, such as part of the elements of the interface, may also be projected onto the display screen of the first terminal for display.
- the element to be projected to the first terminal may be a predetermined element in the interface, such as a video element.
- when the second terminal performs screen projection, only the layer where the predetermined element is located may be projected to the first terminal, without projecting the other layers. In this way, private information on the second terminal can be protected from being displayed on the first terminal.
- whether the second terminal only projects the layer where the predetermined element is located may be predefined by the system. For example, when the interface displayed on the display screen of the second terminal includes a predetermined element, the second terminal only projects the layer where the predetermined element is located to the first terminal; when the interface displayed on the display screen of the second terminal does not include the predetermined element, the second terminal projects all the contents of the current interface to the first terminal. Whether the second terminal only projects the layer where the predetermined element is located may also be set by the user. For example, continuing with reference to FIG. 8, the configuration interface 801 further includes an option 803 for enabling layer filtering (this option 803 may be the layer filtering setting option in this embodiment of the present application).
- if the user selects the option 803 for enabling layer filtering in the configuration interface 801, the second terminal activates the layer filtering function, that is, the second terminal only projects the layer where the predetermined element is located to the first terminal; if the option 803 for enabling layer filtering is not selected in the interface 801, the second terminal projects the entire content of the current interface to the first terminal.
- the specific implementation in which the second terminal only projects the layer where the predetermined element is located may be as follows: after the second terminal creates the VirtualDisplay, the display synthesis (SurfaceFlinger) module of the second terminal (for example, a module of the application layer of the second terminal) can composite the interface displayed on the display screen of the second terminal into the VirtualDisplay layer by layer. In the process of layer-by-layer synthesis, the SurfaceFlinger module of the second terminal can determine whether the layer currently to be synthesized includes a video element.
- the second terminal may determine whether a video element is included in the layer according to the prefix of the layer name of the layer.
- the prefix of the layer name of the layer where the video element is located is generally SurfaceView. Therefore, when the second terminal determines that the layer name of the layer currently to be synthesized has the prefix SurfaceView, it can determine that the layer includes a video element; when it determines that the prefix of the layer name of the layer to be composited is not SurfaceView, it determines that the layer does not include a video element.
- the SurfaceFlinger module of the second terminal only synthesizes the layers that include video elements into the VirtualDisplay; layers that do not include video elements are not synthesized into the VirtualDisplay. The obtained screen projection data thus only includes data corresponding to the layer where the video element is located, achieving the purpose of projecting only the video element to the first terminal.
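the layer-filtering rule described above can be sketched as a simple predicate over layer names. This is an illustrative sketch, not SurfaceFlinger code; the prefix check and the `layers_to_composite` helper are assumptions for demonstration (the exact layer naming depends on the platform version).

```python
# Hypothetical sketch: during layer-by-layer synthesis, keep only layers
# whose name has the video-element prefix when layer filtering (option
# 803) is enabled; otherwise project the whole interface.

VIDEO_LAYER_PREFIX = "SurfaceView"   # prefix of layers carrying video elements

def layers_to_composite(layer_names, layer_filter_enabled=True):
    """Return the layer names to be synthesized into the VirtualDisplay."""
    if not layer_filter_enabled:
        return list(layer_names)     # project the entire current interface
    return [name for name in layer_names
            if name.startswith(VIDEO_LAYER_PREFIX)]

layers = ["StatusBar", "SurfaceView - com.example.video", "NavigationBar"]
print(layers_to_composite(layers))   # only the video layer is composited
```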
- when the second terminal is currently playing sound, for example when the user uses the second terminal to watch videos or listen to music, after the second terminal enables screen projection, it can project not only the currently displayed interface but also the audio to the first terminal.
- the above-mentioned screen projection data (such as screen projection data 1 or screen projection data 2) may include video data and audio data.
- the video data is used for the first terminal to display the corresponding screen projection interface on the display screen of the first terminal, and the audio data is used for the first terminal to play the corresponding sound.
- the specific acquisition process of the video data is the same as the process described in the above-mentioned embodiments to realize screen projection by using DMP or wireless projection.
- the acquisition process of the audio data may be as follows: the second terminal may create an audio recording (AudioRecord) object in advance and create a buffer. After the user triggers the second terminal to start screen projection, the second terminal may call the AudioRecord object; once the AudioRecord object is called, the audio in the second terminal can be recorded. If the projected interface includes a video component, the audio of the video played in the video component can be recorded to obtain the audio data. The audio data is stored in the created buffer, from which the second terminal can then obtain it and send it to the first terminal.
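the audio path described above (a pre-created buffer filled by an AudioRecord-like object, then drained and sent to the first terminal) can be sketched as follows. The `AudioRecorder` class and its methods are illustrative assumptions, not Android's actual `AudioRecord` API.

```python
# Minimal sketch of the audio capture/buffer/send flow; names are
# hypothetical stand-ins for the AudioRecord mechanism.
from collections import deque

class AudioRecorder:
    def __init__(self, buffer):
        self.buffer = buffer        # buffer created in advance
        self.recording = False
    def start(self):                # invoked once screen projection starts
        self.recording = True
    def on_audio_captured(self, chunk: bytes):
        if self.recording:
            self.buffer.append(chunk)   # recorded audio lands in the buffer

buffer = deque()                    # the pre-created buffer
recorder = AudioRecorder(buffer)
recorder.start()
recorder.on_audio_captured(b"pcm-chunk-1")

sent_to_first_terminal = []
while buffer:                       # drain the buffer and send each chunk
    sent_to_first_terminal.append(buffer.popleft())
print(sent_to_first_terminal)       # [b'pcm-chunk-1']
```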
- both the video data and the audio data may be screen-cast to the first terminal, or only the video data may be screen-cast to the first terminal without the audio data being screen-cast to the first terminal.
- Whether or not to project audio data can be predefined by the system or set by the user.
- the configuration interface 801 also includes an option 804 for enabling audio.
- when the user selects the audio-enabled option 804 in the configuration interface 801, the second terminal projects both the video data and the audio data to the first terminal; when the user does not select the audio-enabled option 804, the second terminal only projects the video data to the first terminal.
- the TV respectively decodes the screen projection data 1 and the screen projection data 2 according to the configured corresponding decoding parameters.
- the TV draws the screen projection interface 1 and the screen projection interface 2 by using the created corresponding views according to the decoded screen projection data 1 and the screen projection data 2, and displays them on the TV.
- the screen projection interface 1 and the screen projection interface 2 may be the first interface in the embodiment of the application.
- the first terminal may display screen projection interfaces corresponding to the plurality of second terminals one-to-one on the display screen of the first terminal according to the received screen projection data.
- after the TV receives the screen projection data 1, it can display the screen projection interface, such as the screen projection interface 1, on the TV according to the screen projection data 1.
- the content displayed in the screen projection interface 1 is the same as all or part of the content of the display interface on the display screen of the mobile phone 1, or the content in the screen projection interface 1 is a mirror image of all or part of that content.
- similarly, after the TV receives the screen projection data 2, it can display the screen projection interface, such as the screen projection interface 2, on the TV according to the screen projection data 2; the content displayed in the screen projection interface 2 is the same as all or part of the content of the display interface on the display screen of the mobile phone 2, or is a mirror image of all or part of that content.
- the specific implementation in which the first terminal correspondingly displays the screen projection interface may be: after receiving the screen projection data from the second terminal, the network management module of the first terminal may send the screen projection data to the decoding module of the first terminal for decoding (eg, step 5 shown in FIG. 6). After the decoding module of the first terminal decodes the screen projection data using the corresponding decoding parameters, it sends the decoded data to the window management module of the first terminal; the window management module of the first terminal then uses the corresponding view to draw, according to the received data, the corresponding screen projection interface and display it on the display screen of the first terminal (eg, step 6 in FIG. 6).
- after the network management module of the mobile phone 1 sends the encoded screen projection data 1 to the TV through the connection established with the TV, the network management module of the TV can receive the encoded screen projection data 1.
- the network management module of the TV can receive the encoded screen projection data 1 through the connection instance 1 in the array 2 maintained locally.
- the network management module of the TV can determine that the IP address of the screen projection source is the IP address of the mobile phone 1 according to the connection instance 1 of the received data.
- the network management module of the TV can send the encoded screen projection data 1 and the IP address of the mobile phone 1 to the decoding module of the TV.
- the decoding module of the TV can obtain the corresponding decoding parameters according to the IP address of the mobile phone 1, such as obtaining the decoding parameter 1, and use the decoding parameter 1 to decode the screen projection data 1.
- the decoding module of the TV can send the decoded screen projection data 1 to the window management module of the TV.
- the window management module of the TV uses the view 1 corresponding to the IP address of the mobile phone 1 in the created view array to realize the drawing of the screen projection interface 1, as shown in (c) in Figure 7.
- the screen projection interface 1 is displayed on the display screen of the TV.
- the content in the screen projection interface 1 is the same as the content in the interface 701 displayed by the mobile phone 1 in (a) of FIG. 7 .
- similarly, after the network management module of the mobile phone 2 sends the encoded screen projection data 2 to the TV through the connection established with the TV, the network management module of the TV can receive the encoded screen projection data 2 through the connection instance 2 in the locally maintained array 2.
- the network management module of the TV can determine, according to the connection instance 2 on which the data was received, that the IP address of the screen projection source is the IP address of the mobile phone 2. After that, the network management module of the TV can send the encoded screen projection data 2 and the IP address of the mobile phone 2 to the decoding module of the TV.
- the decoding module of the TV can obtain the corresponding decoding parameters according to the IP address of the mobile phone 2, such as obtaining the decoding parameter 2, and use the decoding parameter 2 to decode the screen projection data 2.
- the decoding module of the TV can send the decoded screen projection data 2 to the window management module of the TV.
- the window management module of the TV uses the view 2 corresponding to the IP address of the mobile phone 2 in the created view array to realize the drawing of the screen projection interface 2, as shown in (c) in Figure 7.
- the screen projection interface 2 is displayed on the display screen of the TV.
- the content in the screen projection interface 2 is the same as the content in the interface 702 displayed by the mobile phone 2 in (b) of FIG. 7 .
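the destination-side dispatch described above (per-source decoding parameters and views keyed by the source's IP address, so each incoming stream is decoded with the right parameters and drawn into the right view) can be sketched as follows. All names are hypothetical, and "decoding" is a string-prefix placeholder for a real decoder.

```python
# Sketch of the TV-side dispatch: source IP -> decoding parameters and view.

class ProjectionDestination:
    def __init__(self):
        self.decode_params = {}   # source IP -> decoding parameters
        self.views = {}           # source IP -> view (here: list of drawn frames)

    def add_source(self, ip, params):
        # Done when the connection is established: create the view and
        # configure the decoding parameters for this source.
        self.decode_params[ip] = params
        self.views[ip] = []

    def on_data(self, ip, encoded):
        params = self.decode_params[ip]       # e.g. decoding parameter 1
        assert encoded.startswith(params)     # sanity check on the stream
        decoded = encoded[len(params):]       # stand-in for decoding
        self.views[ip].append(decoded)        # draw into view 1 / view 2

tv = ProjectionDestination()
tv.add_source("192.168.0.11", "enc1:")        # mobile phone 1 (IP assumed)
tv.add_source("192.168.0.12", "enc2:")        # mobile phone 2 (IP assumed)
tv.on_data("192.168.0.11", "enc1:interface-701")
tv.on_data("192.168.0.12", "enc2:interface-702")
print(tv.views["192.168.0.11"])               # ['interface-701']
```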
- the window used by the first terminal to display the screen-casting interface may be referred to as a screen-casting window.
- the window used for displaying the screen projection interface 1 may be referred to as the screen projection window 1
- the window used for displaying the screen projection interface 2 may be referred to as the screen projection window 2 .
- the first terminal may display a corresponding screen projection window after determining to be connected to the second terminal (such as the above-mentioned mobile phone 1 or mobile phone 2).
- the first terminal may set the size and layout of the screen projection window corresponding to each second terminal according to the number of the second terminals serving as the screen projection source and the size of the display screen of the first terminal.
- the number of the second terminals serving as the screen projection source is two.
- screen projection windows corresponding to the two second terminals respectively can be displayed on the display screen of the first terminal.
- the two screen projection windows may be arranged vertically or horizontally on the display screen of the first terminal.
- the size of the two projection windows can be the same or different.
- the screen projection window 1 corresponding to the mobile phone 1 and the screen projection window 2 corresponding to the mobile phone 2 are vertically arranged, and the screen projection window 1 and the screen projection window 2 have the same size.
- the two screencasting windows may be displayed on the display screen of the first terminal at the same time, or may be displayed successively in the order in which the corresponding second terminals start screencasting, or in the order in which the first terminal receives the corresponding screencasting data.
- in the latter case, the size of the screencasting window displayed first may be the same as the size of the display screen of the first terminal, and the screencasting window displayed later may be smaller than the display screen of the first terminal and displayed in the form of a floating window above the screencasting window displayed first.
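the layout rule described above (equal-size windows, arranged side by side or stacked, sized from the number of sources and the display size) can be sketched as a small layout function. The geometry is an illustrative assumption; the patent does not prescribe exact coordinates.

```python
# Sketch of the window layout: N equal windows tiling the display.

def layout_windows(screen_w, screen_h, n_sources, side_by_side=True):
    """Return (x, y, w, h) for each screen projection window."""
    if side_by_side:   # windows arranged next to each other along the width
        w, h = screen_w // n_sources, screen_h
        return [(i * w, 0, w, h) for i in range(n_sources)]
    else:              # windows stacked along the height
        w, h = screen_w, screen_h // n_sources
        return [(0, i * h, w, h) for i in range(n_sources)]

# Two sources (mobile phone 1 and mobile phone 2) on a 1920x1080 TV:
print(layout_windows(1920, 1080, 2))
# [(0, 0, 960, 1080), (960, 0, 960, 1080)]
```

when a source connects or disconnects, the first terminal can simply call the function again with the new source count to re-tile the remaining windows.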
- the first terminal can reduce, enlarge, or close the corresponding screen projection window, or switch the focus window, according to the user's operation (the operation may be the first operation in this embodiment of the application).
- the operation may be a user's touch operation on the screen of the first terminal, or may be an operation input by the user using an input device of the first terminal (eg, a mouse, a keyboard of a PC; another example, a remote control of a TV).
- take the case where the screen projection interface 1 and the screen projection interface 2 are displayed on the TV, the window for displaying the screen projection interface 1 is the screen projection window 1, and the window for displaying the screen projection interface 2 is the screen projection window 2 as an example.
- the user can use the remote control of the television to control the interface currently displayed on the television.
- after receiving the user's control operation (eg, step 1 in FIG. 9), the television can determine whether the focus window needs to be switched according to the received control operation (eg, step 2 in FIG. 9). If the control operation is an operation of switching the focus window, it is determined that the focus window needs to be switched.
- the operation of switching the focus window may be the user's operation of the left button or the right button of the remote control. That is, if the control operation received by the television is an operation of the left or right button of the remote control, the television may determine that the focus window needs to be switched, and the television may switch the focus (eg, step 3 in FIG. 9 ).
- the television set may locally save a focus window variable, and the focus window variable is used to indicate which window is the focus window among the multiple screen projection windows currently displayed.
- the operation of switching the focus of the television set may include that the television set updates the focus window variable from identifier 1 to identifier 2.
- the identifier 1 is the identifier of the screen projection window that is the focus window before the focus is switched
- the identifier 2 is the identifier of the screen projection window that is the focus window after the focus is switched.
- the screen projection window of one of the screen projection interfaces can be the focus window by default.
- the TV is used to display screen projection interface 1 by default.
- the projection window 1 is the focus window.
- the television set can display a prompt sign 1001 for prompting the user that the current screen projection window 1 is the focus window.
- the TV set can also set the focus window variable as the identifier of the screen projection window 1, which is used to indicate that the screen projection window 1 is the focus window.
- if the TV determines that the focus window needs to be switched, the TV updates the focus window variable from the identifier 1 to the identifier 2 of the screen projection window 2, which is used to indicate that the screen projection window 2 is the current focus window.
- the TV can also update the position of the prompt sign 1001 on the TV display screen, that is, slide it from the position of the screen projection window 1 to the position of the screen projection window 2, so as to remind the user that the screen projection window 2 is the current focus window.
- the TV can determine whether the current focus window needs to be enlarged according to the received control operation and in combination with the size of the current focus window (eg, step 4 in FIG. 9 ).
- if the control operation is a selection operation on the focus window (for example, an operation of the confirm button on the remote control) and the current focus window is not a maximized window, the TV can enlarge the current focus window and hide the other screen projection windows (eg, step 5 in FIG. 9). It can be understood that the size of the screen-casting interface changes with the size of the screen-casting window, and a hidden screen-casting window's screen-casting interface is hidden along with it.
- the current focus window is the screen-casting window 1 .
- if the TV receives the user's operation of the remote control confirm button and determines that the current focus window, that is, the screen projection window 1, is not a maximized window, the TV can maximize the screen projection window 1 and hide the other screen projection windows (ie, hide the screen projection window 2).
- the television may determine the enlarged size of the current focus window according to the size of the display screen of the television, for example, the enlarged size is the same as the size of the display screen of the television.
- the TV can determine whether the current focus window needs to be reduced according to the received control operation in combination with the size of the current focus window (eg, step 6 in FIG. 9). If the control operation is an operation of the confirm button on the remote control and the current focus window is a maximized window, the TV can reduce the current focus window and display the other non-focus windows (eg, step 7 in FIG. 9).
- for example, if the TV currently displays the maximized screen projection window 1 and the screen projection window 2 is hidden, after the TV receives the user's operation of the remote control confirm button, the TV can reduce the screen projection window 1 and display the hidden screen projection window 2.
- the TV may determine the reduced size of the current focus window according to the display size of the TV and the number of other hidden screen projection windows, for example, so that the reduced focus window and the other previously hidden windows have the same size and the sum of the sizes of all screen projection windows equals the size of the TV display.
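the remote-control handling of steps 2 to 7 above can be sketched as a small state machine: a focus-window variable tracks which projection window has focus, left/right switches focus, and the confirm button toggles the focus window between maximized (other windows hidden) and restored. The class and key names are illustrative assumptions.

```python
# Sketch of the focus-window state machine driven by remote-control keys.

class WindowManager:
    def __init__(self, window_ids):
        self.windows = list(window_ids)
        self.focus = window_ids[0]     # the focus window variable
        self.maximized = None          # id of the maximized window, if any

    def on_key(self, key):
        if key in ("LEFT", "RIGHT"):   # switch the focus window
            i = self.windows.index(self.focus)
            step = 1 if key == "RIGHT" else -1
            self.focus = self.windows[(i + step) % len(self.windows)]
        elif key == "CONFIRM":
            if self.maximized == self.focus:
                self.maximized = None          # reduce: redisplay the others
            else:
                self.maximized = self.focus    # enlarge: hide the others

    def visible_windows(self):
        return [self.maximized] if self.maximized else list(self.windows)

wm = WindowManager(["window1", "window2"])
wm.on_key("RIGHT")      # focus moves from window1 to window2
wm.on_key("CONFIRM")    # maximize window2, hide window1
print(wm.focus, wm.visible_windows())  # window2 ['window2']
```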
- the TV set may update the screen projection interface in the current focus window according to the received control operation (eg, step 8 in FIG. 9). The control operation may be an operation for operating the screen projection interface (this operation may be the second operation in this embodiment of the application).
- the TV can send the control operation to the screen projection source corresponding to the current focus window, so that the screen projection source can execute the corresponding event according to the received control operation and update the interface displayed by the screen projection source (the updated interface of the screen projection source may be the third interface in this embodiment of the present application).
- the screen projection source end can project the updated interface to the screen projection destination end, such as a TV set, that is, the screen projection source end can obtain new screen projection data and send it to the TV set.
- after the TV receives the new screen projection data, it can update the screen projection interface in the current focus window according to the new screen projection data (the updated screen projection interface of the TV may be the fourth interface in this embodiment of the application).
- the current focus window is screen projection window 1.
- the content of the screen-casting interface 1 in the screen-casting window 1 is PPT.
- the TV can send the operation of the up or down button on the remote control to the mobile phone 1 corresponding to the screen projection window 1 .
- the mobile phone 1 can perform page-up or page-down operations on the PPT according to the operation, and can acquire new screen projection data and send it to the TV.
- the TV can update and display the screen projection interface 1 in the screen projection window 1 according to the new screen projection data.
- the mobile phone 1 acquires and sends new screen projection data, and the TV receives the new screen projection data and displays the screen projection interface according to it; the specific implementation is similar to the corresponding processes in S403-S406 in the above embodiment and will not be described in detail here.
- the control operation used to operate the screen projection interface may also be another operation, such as an operation on an operable element in the screen projection interface. If the control operation is an operation on an operable element in the screen projection interface, the TV can send to the corresponding screen projection source not only the operation but also the operation position of the operation in the screen projection interface. According to the operation position, the screen projection source can determine which element in the current display interface the user is operating, then execute the corresponding event according to the received operation and the determined element to be operated, and update the interface displayed by the screen projection source.
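forwarding the operation position only helps the source if the position can be related back to its own screen; one plausible way, sketched below, is to scale the position in the projection window by the window-to-source size ratio. This coordinate mapping is an assumption for illustration; the patent text does not specify it.

```python
# Sketch: map a click inside the projection window to source coordinates.

def to_source_coords(pos, window_size, source_resolution):
    """Scale a position in the screen projection interface to the
    corresponding position on the screen projection source's display."""
    (x, y), (ww, wh), (sw, sh) = pos, window_size, source_resolution
    return (x * sw // ww, y * sh // wh)

# A click at (480, 540) in a 960x1080 projection window, where the
# source phone's resolution is 1080x2340 (values assumed):
print(to_source_coords((480, 540), (960, 1080), (1080, 2340)))  # (540, 1170)
```

the source can then hit-test `(540, 1170)` against its current interface to find the operated element and execute the corresponding event.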
- the first terminal can also dynamically adjust the size and arrangement of the screen projection windows corresponding to each second terminal displayed by the first terminal according to the number of the second terminals serving as the screen projection source.
- the number of the second terminals serving as the screen projection source end can be dynamically increased or decreased.
- the first terminal has established connections with multiple second terminals, and the first terminal currently displays screen projection windows corresponding to the multiple terminals respectively.
- when the first terminal is disconnected from one of the second terminals, or the first terminal receives the user's operation to close a screen projection window (for example, when a screen projection window is the focus window and the TV receives the user's operation of the return key on the remote control), that is, when the number of second terminals serving as the screen projection source decreases, the first terminal can stop displaying the screen projection window corresponding to the disconnected second terminal and adjust the size and arrangement of the screen projection windows corresponding to the remaining connected second terminals according to their number.
- similarly, when a new second terminal establishes a connection and starts screen projection, the first terminal can add a screen projection window corresponding to the new second terminal and adjust the size and arrangement of the screen projection window corresponding to each second terminal according to the number of second terminals currently serving as the screen projection source.
- the examples in the above embodiments are described by taking the implementation of many-to-one screen projection as an example in a wireless screen projection scenario.
- the many-to-one screen projection method in this embodiment may also be applied to a cross-device dragging scenario.
- the specific implementation of many-to-one screen projection is similar to the implementation in S401-S406 above, with the following differences:
- the timing at which the first terminal creates a view and configures decoding parameters may be after the connection with the corresponding second terminal, such as the mobile phone 1 or the mobile phone 2, is successfully established, or after the first terminal determines that the corresponding second terminal will start screen projection.
- when the second terminal determines that the user triggers cross-device dragging, it may send corresponding drag data to the first terminal (such as a TV set).
- the drag data may be related data in the drag start event and may indicate the start of the dragging. According to this indication, the first terminal may determine that the second terminal will start screen projection, and the TV may then create a view corresponding to the second terminal and configure decoding parameters corresponding to the second terminal.
- the conditions for the second terminal to start screen projection include not only successfully establishing a connection with the first terminal, but also determining that the user's dragging intention is cross-device dragging.
- the object dragged by the user may be an interface displayed by the second terminal, or an element in the interface (such as a video element, a picture-in-picture, or a floating window).
- after the user triggers dragging, the second terminal can determine whether the user's drag intention is cross-device dragging. If it is determined that the user's intention in dragging the element is cross-device dragging, screen projection can be started.
- the second terminal may set a drag-aware area to determine whether the user's drag intention is to drag across devices.
- the drag sensing area may be an area on the display screen of the second terminal at a predetermined distance from the edge of the display screen. The predetermined distance may be predefined, or a setting interface may be provided for the user to set.
- the drag sensing area of the second terminal may be one or multiple.
- a transparent view control is set at the drag-aware area. After a dragged object, such as an interface or an element in the interface, is dragged into the drag-aware area, the view control set in the corresponding area can detect that the object has been dragged in, and the second terminal can then determine that the user's drag intention is cross-device dragging.
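the drag-sensing rule described above (a band within a predetermined distance of the display edge; entering it signals a cross-device drag) can be sketched as a simple geometric check. The band geometry and the default distance are illustrative assumptions.

```python
# Sketch: is the dragged object's position inside the drag-aware band
# along any edge of the second terminal's display?

def is_cross_device_drag(x, y, screen_w, screen_h, edge_distance=100):
    """True if (x, y) lies within edge_distance of any screen edge."""
    return (x < edge_distance or x > screen_w - edge_distance or
            y < edge_distance or y > screen_h - edge_distance)

# 1080x2340 source screen (resolution assumed):
print(is_cross_device_drag(1050, 400, 1080, 2340))  # True: near the right edge
print(is_cross_device_drag(540, 1170, 1080, 2340))  # False: screen center
```

in practice the predetermined distance could be predefined or exposed in a setting interface, as the text above notes; a per-edge distance would be a straightforward extension.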
- the second terminal can project the display interface (the display interface can be the second interface in the embodiment of the application) to the first terminal.
- the specific implementation is similar to the implementation of the second terminal projecting the display interface to the first terminal in the wireless screen projection scenario in S403 and S404.
- the second terminal may only project the element to the first terminal.
- the second terminal may obtain the layer name (or layer name, layer Name) of the element in the currently displayed interface.
- the second terminal can determine whether the layer name of the layer currently to be synthesized is the same as the acquired layer name. If the same, the second terminal composites the layer into the VirtualDisplay. If not, the second terminal does not synthesize the layer into the VirtualDisplay, so as to achieve the purpose of projecting only the element dragged by the user to the first terminal.
- the second terminal may display the dragged object on the first terminal after receiving the user's drag release operation. .
- a part of the area of the dragged object is displayed on the display screen of the second terminal, and another part of the area is hidden (or overflows the display screen).
- the object is also displayed on the first terminal. Specifically, for the dragged object, a part of the area is displayed on the second terminal, and the other part of the area (the area overflowing the second terminal) is displayed on the first terminal.
- the specific implementation of displaying the dragged object on the first terminal and the second terminal at the same time may be as follows: after the screen projection is started, the second terminal not only needs to send the screen projection data to the first terminal, but also needs to send to the first terminal the rectangle (rect) information of the dragged object and the coordinate information of a certain corner of the object during the dragging process (such as any one of the upper left corner, lower left corner, upper right corner and lower right corner). That is, the data sent by the second terminal to the first terminal includes the screen projection data, the rectangle information of the dragged object, and the coordinate information of a certain corner of the object during the dragging process.
- the rectangle information of the object includes coordinate information of the four corners of the upper left corner, the upper right corner, the lower left corner and the lower right corner of the object when the dragging starts.
- the first terminal can determine whether the object has an area overflowing the display screen of the second terminal according to the rectangle information of the object, the coordinate information of a corner of the object during the dragging process, and the resolution of the second terminal. If an area of the object overflows the display screen of the second terminal, the first terminal can determine the information of the area of the object that can be displayed on the display screen of the first terminal (this area is the same as the area where the object overflows the display screen of the second terminal).
- the resolution of the second terminal may be sent by the second terminal to the first terminal during the process of establishing a connection between the first terminal and the second terminal, or after the connection is successfully established.
- the first terminal may display the content of the area corresponding to the object on the display screen of the first terminal according to the determined area information and screen projection data.
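- The overflow calculation described above can be sketched as follows, assuming for simplicity that the object overflows only the right edge of the second terminal's screen; all names are illustrative:

```python
def overflow_region(rect_w, rect_h, top_left_x, top_left_y, src_w, src_h):
    """Given the dragged object's size, its current top-left corner on the
    source screen, and the source resolution, return the right-edge overflow
    as (offset within the object where overflow starts, overflow width),
    or None if nothing overflows yet."""
    right = top_left_x + rect_w
    if right <= src_w:
        return None                      # fully on the source screen
    visible_w = max(src_w - top_left_x, 0)
    return (visible_w, rect_w - visible_w)
```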
- the TV obtains the IP address 1 of the mobile phone 1 and establishes a connection with the mobile phone 1 .
- the TV creates a view corresponding to IP address 1, eg, view a.
- the TV sets the decoding parameters associated with IP address 1, such as decoding parameters a.
- the TV saves the connection instance a corresponding to the IP address 1, and is used to receive the screen projection data from the mobile phone 1.
- the user opens the video application of the mobile phone 1 to play the video X.
- the mobile phone 1 receives an operation triggered by the user to drag up the video element 1201 for presenting the video X.
- the mobile phone 1 can drag the video element 1201 up, and can also perform background blur processing.
- the mobile phone 1 receives the user's drag operation on the dragged video element 1201 .
- the mobile phone 1 makes the video element 1201 move on the display screen of the mobile phone 1 following the movement of the user's finger, giving the user a visual effect of the video element 1201 being dragged by the user's finger.
- the dragging direction of the video element 1201 may be upward, leftward, rightward, or downward.
- the user may use a finger to perform a drag operation on the dragged video element 1201 , such as an operation of long pressing and moving the finger to the right.
- the phone can draw and display an animation of the video element 1201 as the user's finger moves.
- the mobile phone 1 can determine whether the user's drag intention is a cross-device operation.
- after the mobile phone 1 determines that the user's drag intention is a cross-device operation, the mobile phone 1 can create a virtual display, and draw the layer where the video element 1201 in the current interface is located on the virtual display to obtain screen projection data, for example called screen projection data a.
- the mobile phone 1 can encode the screen projection data a and send it to the TV.
- the mobile phone 1 can also send the rectangle information of the video element 1201 and the coordinate information of a certain corner (eg, upper left corner) of the video element 1201 to the TV set during the dragging process.
- the TV can receive the encoded screen projection data a, the rectangle information of the video element 1201, and the coordinate information of the upper left corner of the video element 1201 during the dragging process through the connection instance a.
- according to the received rectangle information of the video element 1201, the coordinate information of the upper left corner of the video element 1201 during the dragging process, and the resolution of the mobile phone 1, the TV can determine that an area of the video element 1201 overflows the display screen of the mobile phone 1.
- according to the rectangle information of the video element 1201, the coordinate information of the upper left corner of the video element 1201 during the dragging process, and the resolution of the mobile phone 1, the TV can determine the information of the area of the video element 1201 that can be displayed on the display screen of the TV.
- the TV can determine that the IP address of the screen projection source is the IP address 1 of the mobile phone 1 .
- the TV can decode the received screen projection data a according to the IP address 1, by using the decoding parameter a corresponding to the IP address 1.
- the TV can use the created view a corresponding to IP address 1 to realize the drawing of the screen projection interface 1. As shown in (a) of the figure, the screen projection interface 1 is displayed on the display screen of the TV, and the content in the screen projection interface 1 is the same as the part of the video X carried in the video element 1201 that overflows the display screen of the mobile phone 1.
- the mobile phone 1 can acquire the screen projection data a and the coordinate information of the upper left corner of the video element 1201 during the dragging process in real time, and send them to the TV.
- the TV can update the screen projection interface 1 in real time according to the received data.
- the TV can display the screen projection interface 1 in full screen on the display screen of the TV set according to the screen projection data a received in real time.
- the content in the screen projection interface 1 is the same as the entire content of the video X carried in the video element 1201 .
- the TV obtains the IP address 2 of the mobile phone 2 and establishes a connection with the mobile phone 2 .
- the TV creates a view corresponding to IP address 2, eg, view b.
- the TV configures decoding parameters associated with IP address 2, such as decoding parameters b.
- the TV saves the connection instance b corresponding to the IP address 2 for receiving the screen projection data from the mobile phone 2 .
- the user opens the fitness application of the mobile phone 2 to view the fitness video.
- the mobile phone 2 receives the user's drag operation on the video element bearing the fitness video.
- the mobile phone 2 makes the video element move on the display screen of the mobile phone 2 following the movement of the user's finger, giving the user a visual effect that the video element is dragged by the user's finger.
- the mobile phone 2 can determine whether the user's drag intention is a cross-device operation.
- after the mobile phone 2 determines that the user's drag intention is a cross-device operation, the mobile phone 2 can create a virtual display, and draw the layer where the video element in the current interface is located on the virtual display to obtain screen projection data, for example called screen projection data b.
- the mobile phone 2 can encode the screen projection data b and send it to the TV.
- the mobile phone 2 can also send the rectangle information of the video element and the coordinate information of a certain corner (eg, upper left corner) of the video element to the TV set during the dragging process.
- the TV can receive the encoded screen projection data b, the rectangle information of the video element, and the coordinate information of the upper left corner of the video element during the dragging process through the connection instance b.
- according to the rectangle information of the video element, the coordinate information of the upper left corner of the video element during the dragging process, and the resolution of the mobile phone 2, the TV can determine the information of the area of the video element that can be displayed on the display screen of the TV.
- the TV can determine that the IP address of the screen projection source is the IP address 2 of the mobile phone 2 .
- the TV can decode the received screen projection data b according to the IP address 2, by using the decoding parameter b corresponding to the IP address 2.
- the TV can use the created view b corresponding to the IP address 2 to realize the drawing of the screen projection interface 2.
- the TV can simultaneously display the screen projection interface 1 and the screen projection interface 2 on the TV display screen. For example, a screen projection interface 1 is currently displayed in full screen on the TV.
- the TV set may display the screen projection interface 2 in the form of a small window (or picture-in-picture, floating window) on the display screen of the TV set.
- the content in the screen projection interface 2 is the same as the part of the fitness video that overflows the display screen of the mobile phone 2.
- the mobile phone 2 can acquire the screen projection data b and the coordinate information of the upper left corner of the video element in the dragging process in real time, and send it to the TV. In this way, the TV can update the screen projection interface 2 in real time according to the received data.
- the TV can continue to display the screen-casting interface 2 in the form of a small window on the display screen of the TV according to the screen-casting data b received in real time.
- the content in the screen projection interface 2 is the same as the entire content of the fitness video displayed on the mobile phone 2 .
- the TV set may set the projection window of one of the projection interfaces as the focus window by default; for example, the TV sets the small window as the focus window by default.
- the TV displays a prompt mark 1301 for prompting the user that the small window, that is, the screen-casting window of the screen-casting interface 2, is the focus window.
- the user can use the remote control of the TV to select and switch the focus window, and can also switch the layout of the large and small windows (wherein the window used for displaying the screen projection interface 1 in full screen may be called a large window), and can also close the large and small windows.
- if the TV receives the user's operation of the left button or the right button of the remote control, it switches the focus window.
- the TV can display the small window, that is, the screen projection interface 2, in full screen, and display the large window, that is, the screen projection interface 1, in the form of a small window.
- the TV can stop displaying the small window, or close the small window, and the TV can also notify the mobile phone 2 corresponding to the small window to stop projecting the screen.
- the TV can stop displaying the large window, and the TV can also notify the mobile phone 1 corresponding to the large window to stop projecting the screen.
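- The window management behavior described in this example (default focus, focus switching, layout swapping, and closing with a stop-projection notification) can be sketched as follows; the class and window names are assumptions for illustration:

```python
class ProjectionWindows:
    """Sketch of the destination-side (TV) management of two projection windows."""

    def __init__(self):
        self.windows = {"large": "interface 1", "small": "interface 2"}
        self.focus = "small"            # the small window is the focus by default

    def switch_focus(self):
        # the left/right remote buttons toggle which window holds focus
        self.focus = "large" if self.focus == "small" else "small"

    def swap_layout(self):
        # the small window goes full screen while the large one shrinks
        self.windows["large"], self.windows["small"] = (
            self.windows["small"], self.windows["large"])

    def close(self, which):
        # closing a window also notifies its source phone to stop projecting
        self.windows.pop(which, None)
        return f"notify source of {which} window: stop projecting"
```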
- the object dragged by the user is an interface displayed by the second terminal, or an element in the interface such as a video element, a picture-in-picture, or a floating window as an example.
- the object dragged by the user may also be a UI control in an interface displayed by the second terminal.
- the dragged UI control can be defined by a third-party application, selected by the user, or recommended by the system.
- the specific implementation of many-to-one screen projection is similar to the implementation of the dragged object as an interface or an element in the interface. The differences are:
- the second terminal does not acquire screen projection data and send it to the first terminal to realize screen projection. Instead, after the screen projection is started, the second terminal obtains data, such as the instruction stream of the current interface, and sends the instruction stream to the first terminal.
- the second terminal may also send the identifier of the dragged UI control (that is, the above-mentioned data may also include the identifier of the dragged UI control) to the first terminal.
- the first terminal can extract the canvas instruction of the dragged UI control from the received instruction stream, so as to realize, according to the canvas instruction, the display of the dragged UI control on the first terminal.
- in this way, the screen projection, on the first terminal, of the UI control in the interface currently displayed by the second terminal (the interface may be the second interface in the embodiment of the application) is realized.
- the UI control displayed on the first terminal may be the first interface in this embodiment of the application.
- the first terminal and the second terminal may further include an instruction management module.
- the instruction management module of the second terminal may be responsible for extracting the content of the screen projection source end interface, that is, for obtaining the instruction stream of the current interface.
- the instruction management module of the first terminal may be responsible for restoring the content of the screen projection source end, for example, drawing the corresponding UI controls according to the instruction stream.
- the second terminal acquires data, such as the 2D drawing instruction and the identifier of the dragged UI control, and sends it to the first terminal.
- the first terminal draws the dragged UI control on the display screen of the first terminal according to the received 2D drawing instruction and identifier and according to the corresponding layout file, that is, realizes the display, on the first terminal, of the UI controls dragged by the user in the interface displayed by the second terminal.
- the layout file also includes other configurations of the drawing area (such as configurations such as positions and styles corresponding to the identifiers of UI controls).
- the first terminal reads the configuration corresponding to the identifier from the layout file according to the received 2D drawing instruction and identifier, so as to realize the drawing and layout of the UI controls on the display screen of the first terminal.
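- The layout-file lookup described above can be sketched as follows. The layout entries, control identifiers, and instruction format are hypothetical; they only illustrate how a destination terminal might place dragged controls according to a preconfigured layout file:

```python
# Hypothetical layout file: each control identifier maps to a drawing area
# (position and size) on the destination screen.
LAYOUT_FILE = {
    "product_preview": {"pos": (0, 0),   "size": (600, 400)},
    "buy_now":         {"pos": (0, 420), "size": (280, 80)},
}

def draw_controls(instruction_stream, dragged_ids, layout=LAYOUT_FILE):
    """Replay only the draw instructions for the dragged controls, placing each
    according to the configuration read from the layout file."""
    drawn = []
    for instr in instruction_stream:           # instr: {"id": ..., "op": ...}
        if instr["id"] in dragged_ids and instr["id"] in layout:
            cfg = layout[instr["id"]]
            drawn.append((instr["id"], cfg["pos"], cfg["size"]))
    return drawn
```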
- the data used to realize the screen projection of the second terminal on the first terminal can be understood as video data, or as including video data, so the channel used for transmitting screen projection data between the first terminal and the second terminal may be called a video channel, or a video transmission channel.
- the data used to implement screen projection by the second terminal on the first terminal is an instruction stream.
- the above-mentioned video channel may continue to be used to implement the transmission of the instruction stream.
- an instruction channel (or instruction transmission channel) may also be used to implement the transmission of the instruction stream. That is to say, in this embodiment, multiple instruction streams can be projected to one screen projection destination, such as the screen of the first terminal, so as to realize many-to-one projection.
- the first terminal can create a canvas corresponding to each second terminal (the canvas may be the drawing component in this embodiment of the application), which is used to implement the projection of the UI controls of the second terminal on the first terminal.
- the process for the first terminal to project multiple instruction streams onto one screen may include: after the second terminal is connected to the first terminal, or after the second terminal is connected to the first terminal and starts screen projection, The first terminal creates a canvas corresponding to the second terminal for carrying (or drawing) the UI controls projected by the second terminal (eg, step 1 in FIG. 15 ).
- the first terminal draws corresponding content on the corresponding canvas according to the instruction stream from each second terminal and the identifier of the dragged UI control (eg, step 2 in FIG. 15 ).
- the first terminal synthesizes the canvases corresponding to the second terminals into one canvas (eg, step 3 in FIG. 15 ).
- the first terminal displays the synthesized canvas on the screen of the first terminal (eg, step 4 in FIG. 15 ).
- as shown in FIG. 16, when there is only one second terminal serving as the screen projection source, only the content of the canvas corresponding to that second terminal (canvas 1 in (a) of FIG. 16) is displayed on the screen of the first terminal.
- the canvases corresponding to the two second terminals can be displayed on the screen of the first terminal according to the corresponding layout. For example, the screen of the first terminal is divided into two areas, one area is used to display the content of the canvas corresponding to one of the second terminals (canvas 1 in FIG. 16(b)), and the other area is used to display The content of the canvas corresponding to another second terminal (canvas 2 in (b) of FIG. 16 ) is displayed.
- the canvases corresponding to the multiple second terminals can be displayed on the screen of the first terminal according to the corresponding layout.
- the screen of the first terminal can be divided into a corresponding number of areas, each used to display the content of the canvas corresponding to one second terminal.
- the layout of the multiple canvases on the screen of the first terminal may be predetermined or set according to the user's settings, for example, in the form of horizontal equal division, vertical equal division, picture-in-picture, three-way division, four-way division, etc. on the screen, and is not limited to the horizontal division layout shown in (b) in FIG. 16.
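- The division of the destination screen into per-source canvas regions can be sketched as follows; only horizontal and vertical equal division are illustrated, and all names are assumptions:

```python
def canvas_regions(n_sources, screen_w, screen_h, layout="horizontal"):
    """Split the destination screen into one (x, y, w, h) region per source
    canvas. Only horizontal and vertical equal division are sketched here."""
    if layout == "horizontal":
        w = screen_w // n_sources
        return [(i * w, 0, w, screen_h) for i in range(n_sources)]
    if layout == "vertical":
        h = screen_h // n_sources
        return [(0, i * h, screen_w, h) for i in range(n_sources)]
    raise ValueError("layout not sketched: " + layout)
```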
- taking mobile phone 1 and mobile phone 2 as the screen projection sources, the TV as the screen projection destination, and the dragged UI controls as selected by the user as an example, the implementation process of many-to-one screen projection in the scenario of dragging and dropping UI controls across devices is introduced below.
- network monitoring can be activated to monitor connection requests.
- the TV can also broadcast its own IP address for other devices to initiate connection requests.
- the mobile phone 1 receives the IP address of the TV.
- the mobile phone 1 can initiate a connection request according to the IP address of the TV set to request to establish a connection with the TV set.
- the TV can obtain the IP address 1 of the mobile phone 1 .
- the TV can start the distribution function, such as creating a canvas corresponding to IP address 1, such as canvas x, and configuring the decoding parameters associated with IP address 1, such as decoding parameter x , and save the connection instance x corresponding to IP address 1, which is used to receive data from mobile phone 1, such as the command stream, the identifier of the UI control being dragged, etc., so as to prepare for the screen projection of mobile phone 1.
- the television can also notify the mobile phone 1 that it is ready.
- the user can trigger the mobile phone 1 to start screen projection by dragging and dropping the UI controls in the current display interface of the mobile phone 1 .
- the mobile phone 1 displays a shopping details page 1701 of the shopping application.
- the mobile phone 1 receives the user's drag operation on the UI controls in the shopping details page 1701 .
- the dragging operation may include: an operation of the user selecting a UI control and an operation of triggering the movement of the selected UI control.
- Take the dragged UI controls including: product preview control 1702, product price control 1703, product introduction control 1704, add to cart button 1705 and buy now button 1706 on the shopping details page 1701 as an example.
- the mobile phone 1 in response to the drag operation, can display an animation of the corresponding UI control moving with the movement of the user's finger, giving the user a visual effect of the UI control being dragged by the user's finger.
- the mobile phone 1 can determine whether the user's dragging intention is a cross-device operation. After the mobile phone 1 determines that the user's drag intention is a cross-device operation, the mobile phone 1 can start instruction capture. For example, the mobile phone 1 can perform instruction extraction on the shopping details page 1701 to obtain the instruction stream corresponding to the shopping details page 1701, for example called instruction stream x.
- the instruction stream x may include information such as the canvas instruction of each UI control in the current interface, the layer name, and the identifier of the control.
- the mobile phone 1 can encode the instruction stream x and send it to the TV.
- the mobile phone 1 can also send the identifier of the dragged UI control to the TV.
- the identifier of the control may be a specific field identifier (eg, dup ID) defined by the application developer.
- the mobile phone 1 can identify the type of UI control dragged by the user through UI control identification.
- the mobile phone 1 can determine the ID of the dragged UI control according to the identified type of UI control.
- the types of the controls correspond to the identifiers one-to-one, and the corresponding relationship is pre-stored in the mobile phone 1 .
- an artificial intelligence (AI) recognition method can be used to recognize the type of UI control dragged by the user.
- each interface of each application in the mobile phone can be processed in advance. For example, the whole-frame image data of the product detail page 1701 can be obtained by taking a screenshot, and a target detection technology in machine learning (such as the R-CNN, Fast-R-CNN or YOLO model algorithms) can be used to locate the area of each UI control in the product detail page 1701. The located area and type of each UI control in the product detail page 1701 are then stored in the mobile phone 1 in correspondence with the identifier of the product detail page 1701.
- after receiving the user's operation of dragging UI controls in the product detail page 1701, the mobile phone can identify the type of the dragged UI control according to the position touched by the user when selecting the control and the stored area of each UI control in the product detail page 1701. For another example, after receiving the user's operation of dragging a UI control on the product details page 1701, the mobile phone can capture an image of the UI control selected by the user, and then use a target classification technology in machine learning (such as the ResNet model algorithm) to identify the type of the captured UI control.
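- The touch-position lookup described in the first example can be sketched as follows; the stored control areas are hypothetical values standing in for the output of the offline target-detection step:

```python
# Pre-stored control areas for the product detail page (hypothetical values),
# keyed by control type, each as (left, top, right, bottom).
CONTROL_AREAS = {
    "product_preview": (0, 0, 1080, 600),
    "add_to_cart":     (0, 1900, 540, 2000),
    "buy_now":         (540, 1900, 1080, 2000),
}

def control_at(touch_x, touch_y, areas=CONTROL_AREAS):
    """Return the identifier of the control whose stored area contains the
    touch point, or None if the touch falls outside every stored area."""
    for name, (l, t, r, b) in areas.items():
        if l <= touch_x < r and t <= touch_y < b:
            return name
    return None
```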
- the TV can receive the encoded instruction stream x and the identifier of the dragged UI control through the connection instance x.
- the TV set can determine that the IP address of the source end of the screen projection is the IP address 1 of the mobile phone 1 according to the connection instance x of the received data.
- the TV set can decode the received instruction stream x by using the decoding parameter x corresponding to the IP address 1.
- the TV can use the created canvas x corresponding to the IP address 1 to realize the drawing and display of the dragged UI control on the TV screen. For example, after the user releases the drag, as shown in (a) of FIG. 18, the TV can display the screen projection interface x.
- the content in the screen projection interface x is the same as the UI control dragged by the user in the product detail page 1701 displayed on the mobile phone 1 .
- when the TV implements the drawing of UI controls on the canvas, it may draw each UI control according to a preconfigured layout file.
- the layout file includes the configuration of the drawing area of each UI control (for example, including the configuration of the identifier, position and style of the UI control), and the drawing area of each UI control does not overlap.
- the drawing area of each UI control in the layout file may not correspond to the area of the corresponding UI control in the original interface, that is, through the layout file, the UI controls can be rearranged.
- the layout file can be generated by a system developer or an application developer using Android Studio. Android Studio can realize the capture and preview display of UI control related layouts: system developers or application developers can adjust the layout of the UI controls in the preview, and the layout file can be generated according to the final layout.
- the user can project the UI controls in the interface displayed on the mobile phone 2 to the TV for display by dragging and dropping.
- the specific implementation is similar to the display of the UI controls in the display interface of the mobile phone 1 projected on the TV, and will not be repeated here.
- the mobile phone 2 displays a shopping details page 1901 of the shopping application.
- the user performs a drag operation on the UI controls in the shopping details page 1901 .
- the dragged UI controls include: product preview control 1902 on the shopping details page 1901 , product price control 1903 , product introduction control 1904 , add to cart button 1905 and buy now button 1906 .
- the TV set can decode the received instruction stream (eg, instruction stream y) using the corresponding decoding parameter (eg, decoding parameter y).
- the TV can use the created corresponding canvas (eg canvas y) to realize the drawing of the dragged UI control on the mobile phone 2 .
- the TV also draws the dragged UI controls on the mobile phone 1 on the canvas x.
- the TV can combine canvas x and canvas y into one canvas and display it on the TV screen. For example, as shown in (b) of FIG. 18 , the TV can display a screen projection interface x and a screen projection interface y.
- the content in the screen projection interface x is the same as the UI controls dragged by the user in the product details page 1701 displayed by the mobile phone 1.
- the content in the screen projection interface y is the same as the UI controls dragged by the user in the product details page 1901 displayed by the mobile phone 2.
- the television set may by default set the projection window of one of the screen projection interfaces as the focus window.
- the focus position may specifically be a UI control in the screen-casting interface presented by the screen-casting window.
- the focus position of the television is the product preview control 1801 of the screen projection interface x.
- the user can switch the focus position using the remote control of the TV. For example, if the TV receives the user's operation of the left, right, up or down button of the remote control, it can switch the focus position. For example, in conjunction with (b) in FIG. 18, if the TV receives the user's operation of the right button of the remote control, then as shown in (c) in FIG. 18, the TV switches the focus position from the product preview control 1801 of the screen projection interface x to the product preview control 1802 of the screen projection interface y. After that, when the TV receives the user's operation of the down button on the remote control, as shown in (d) of FIG. 18, the TV switches the focus position from the product preview control 1802 of the screen projection interface y to the add to cart button 1803 of the screen projection interface y.
- Users can also use the remote control of the TV to achieve reverse control.
- the television can obtain the location information of the operation.
- the TV can map the position (such as coordinates) of the operation to the original position (such as coordinates) in the interface of the mobile phone, so as to determine which UI control on the mobile phone the user wants to operate.
- the TV can send the corresponding operation instructions to the mobile phone, so that the mobile phone can respond accordingly, so as to realize the reverse control.
- the mobile phone can re-project the updated interface content to the TV, so that the TV can update the corresponding projection interface.
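- The coordinate mapping used for reverse control can be sketched as follows, assuming the projection window's position and size on the TV and the source phone's resolution are known; all names are illustrative:

```python
def map_to_source(tv_x, tv_y, tv_region, src_resolution):
    """Map a coordinate inside the TV's projection window back to the source
    phone's screen coordinates, so the phone can dispatch the input event."""
    rx, ry, rw, rh = tv_region            # projection window rect on the TV
    sw, sh = src_resolution               # source phone resolution
    # normalise within the window, then scale to the source screen
    return (round((tv_x - rx) * sw / rw), round((tv_y - ry) * sh / rh))
```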
- the focus position is the product preview control 1801 of the screen projection interface x.
- the television receives the user's operation of the confirmation button of the remote control.
- the television can determine that the user wants to operate the product preview control on the mobile phone 1 according to the current focus position and layout.
- the television set can send the corresponding operation instruction to the mobile phone 1 .
- the mobile phone 1 can respond accordingly according to the operation instruction, such as playing a product preview video.
- the mobile phone 1 can also record the played video and send it to the TV.
- the TV can play the preview video of the product in full screen.
- the mobile phone may not project the updated interface to the TV.
- the user can continue to operate on the phone.
- the focus position is the buy now button 2001 of the screen-casting interface x.
- the television receives the user's operation of the confirmation button of the remote control. According to the current focus position and the stored correspondence, the television can determine that the user wants to operate the buy now control on the mobile phone 1 .
- the television set can send the corresponding operation instruction to the mobile phone 1. As shown in (b) of the figure, after receiving the operation instruction, the mobile phone 1 can display a purchase interface 2002.
- the user can continue to operate on the mobile phone 1 .
- the TV set can also set the screen projection interface x corresponding to the mobile phone 1 to gray, and can also display prompt information 2003, such as the words "continue to operate on the mobile phone", to prompt the user to continue the operation on the mobile phone.
- the user can switch back to the television to continue the operation.
- the prompt message 2003 may also include the words "to exit, please press the 'return' key".
- the above screen projection application can realize many-to-one screen projection from multiple screen projection sources to one screen projection destination.
- multiple mobile phones and tablet computers can project the content on their display screens (such as PPT slides or broadcast video) to the same large-screen device for presentation, realizing many-to-one screen projection.
- the efficiency of collaborative use of multiple devices is improved, and the user experience is improved. The user can control the screen-casting interface using the input device of the screen-casting destination, which also realizes reverse control of the screen-casting source.
- the screen projection destination can also adjust the layout of the presented screen projection interface according to the increase or decrease of the source device, so as to present the best visual effect to the user.
- layer filtering is supported, so that only the layers where certain elements of the current interface (such as elements dragged by the user, or predetermined elements) are located are projected to the screen projection destination. In this way, it can be ensured that the private information of the screen projection source end is not projected to the screen projection destination end, and the privacy of the user is protected.
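The layer-filtering idea above can be sketched in a few lines. This is a hedged illustration: the layer data shape, the function name `filter_layers`, and the element names are assumptions made for clarity, not the disclosed data model.

```python
# Sketch of layer filtering: only layers that contain an element to be
# projected are recorded and sent; layers holding private information
# (e.g. notifications) never leave the screen projection source.

def filter_layers(layers, elements_to_project):
    """Keep only layers containing at least one element to be projected."""
    return [
        layer for layer in layers
        if any(e in layer["elements"] for e in elements_to_project)
    ]

layers = [
    {"id": 1, "elements": ["video_component"]},
    {"id": 2, "elements": ["notification_banner"]},  # private; filtered out
]
projected = filter_layers(layers, ["video_component"])
```

Only the screen recording data of the surviving layers would then be encoded and transmitted to the screen projection destination.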
- the content to be projected can be replaced from a pure video stream to an instruction stream, which can improve the display effect of the projection interface at the destination end of the projection screen and save transmission bandwidth.
- for each meeting, participants need to carry various devices and cables and prepare in advance. This reduces meeting efficiency and increases the communication costs of cross-regional office work.
- the many-to-one screen projection solution provided in this embodiment can be combined with a smooth call to realize cross-regional office work. This cross-regional office method can improve meeting efficiency and save communication costs for cross-regional office work.
- Changlian Call realizes high-definition audio and video calls between multiple devices. Video calls can be made between mobile phones, large-screen devices, smart speakers with screens and other devices; calls can be freely connected between these devices, and the best device can be chosen for answering, bringing consumers a smoother and freer call experience. At the same time, it provides users with a good audio and video call experience: it can realize 1080P high-definition video calls, and can maintain smoothness under dark light and poor network quality (such as subway or high-speed rail scenes).
- Region A includes a first terminal, such as large-screen device A.
- Region B includes a third terminal, such as large-screen device B.
- Large-screen device A communicates with large-screen device B smoothly.
- the large-screen device A displays the conference site picture of region B, and can also display the local conference site picture (that is, of region A).
- the large-screen device B displays a picture of a conference site in region A, and can also display a picture of a conference site in the local area (ie region B).
- the large-screen device displays the conference site picture of the other party's conference site, which is drawn by the large-screen device according to the video data collected in real time by the opposite-end large-screen device.
- the local venue screen displayed by the large-screen device is drawn based on the video data collected in real time by itself.
- the video data collected in real time can be transmitted between the large-screen devices through the far-field data channel established between them.
- Participants in region A can project documents displayed on one or more second terminals, such as mobile phone 1 and mobile phone 2 (for example, document 1 and document 2, respectively), to the large-screen device A in region A using the many-to-one screen projection solution provided by the above embodiment.
- For example, the document 1 displayed on the mobile phone 1 and the document 2 displayed on the mobile phone 2 can be projected onto the large-screen device A by dragging and dropping across devices or by wireless screen projection.
- the mobile phone 1 can send the screen projection data A1 to the large-screen device A through the near-field data channel established with the large-screen device A, so that the large-screen device A can display the document 1; in this way, the document 1 displayed on the mobile phone 1 is displayed on the large-screen device A.
- the mobile phone 2 sends the screen projection data A2 to the large-screen device A through the near-field data channel established with the large-screen device A, which is used to display the document 2 on the large-screen device A, so that the document 2 displayed on the mobile phone 2 can be displayed on the large-screen device A.
- that is, with reference to FIG. 21, as shown in FIG. 22, according to the received screen projection data A1, the screen projection data A2, the video data from the large-screen device B, and the video data collected by the large-screen device A itself, the large-screen device A can display on its screen the conference site image of region B, the conference site image of region A, document 1 projected by mobile phone 1, and document 2 projected by mobile phone 2.
- the local conference site image, that is, the conference site image of region A, may not be displayed.
- the large-screen device A and the large-screen device B will respectively capture the local site images in real time, and send the corresponding video data to the opposite-end large-screen device.
- after the large-screen device A receives the screen projections of the mobile phone 1 and the mobile phone 2, that is, after receiving the above-mentioned screen projection data A1 and A2,
- the large-screen device A not only needs to send the video data collected in real time to the large-screen device B, but can also send the projection data A1 and A2 to the large-screen device B through the far-field data channel with the large-screen device B, so that the large-screen device B can also display document 1 and document 2 on its screen.
- the large-screen device B can display the site image of the region A on the screen of the large-screen device B according to the projection data A1, the projection data A2 and the video data from the large-screen device A, Document 1 and Document 2.
- the large-screen device B can also display the local conference site picture, that is, the conference site picture of the region B, according to the video data collected by itself.
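The forwarding behavior described above can be sketched as follows. This is an illustrative assumption: the class name `LargeScreenDevice`, the message dictionaries, and the outbox queue are invented to show the routing, not the disclosed protocol.

```python
# Sketch: a large-screen device receives projection data over near-field
# channels, displays it locally, and relays it (together with its own
# real-time video data) to the peer large-screen device over the
# far-field data channel, so both sites show the same documents.

class LargeScreenDevice:
    def __init__(self, name):
        self.name = name
        self.received = []          # projection data from near-field sources
        self.far_field_outbox = []  # messages queued for the peer device

    def on_near_field_projection(self, source, projection_data):
        self.received.append((source, projection_data))   # display locally
        self.far_field_outbox.append(                     # relay to the peer
            {"kind": "projection", "source": source, "data": projection_data})

    def on_video_frame(self, frame):
        # Locally captured conference site video is also sent to the peer.
        self.far_field_outbox.append({"kind": "video", "data": frame})

device_a = LargeScreenDevice("A")
device_a.on_near_field_projection("phone1", "data_A1")
device_a.on_near_field_projection("phone2", "data_A2")
device_a.on_video_frame("site_A_frame")
```

The peer device would draw its document display area and video call area from the relayed projection data and video data, respectively.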
- participants in region B can also project one or more documents displayed on the second terminals, such as mobile phone 3 and mobile phone 4 (for example, document 3 and document 4, respectively), to the large-screen device B in region B using the many-to-one screen projection solution provided by the above embodiment.
- the large-screen device A and the large-screen device B can respectively display the corresponding conference site screen and the documents of the two regions.
- the screen projection data used by the mobile phone 3 to realize screen projection is called screen projection data B1
- the screen projection data used by the mobile phone 4 to realize screen projection is called screen projection data B2 as an example.
- the large-screen device A can, according to the projection data A1 from the mobile phone 1, the projection data A2 from the mobile phone 2, the video data from the large-screen device B, the projection data B1 and the projection screen Data B2, and the video data collected by the large-screen device A itself, display on the screen of the large-screen device A the site picture of region B, the site picture of region A, document 1 projected by mobile phone 1, and document 2 projected by mobile phone 2, Document 3 cast by phone 3 and document 4 cast by phone 4.
- the large-screen device B can, according to the projection data B1 from the mobile phone 3, the projection data B2 from the mobile phone 4, the video data from the large-screen device A, the projection data A1 and the projection data A2, display on the screen of the large-screen device B the conference site picture of region A, document 1 projected by mobile phone 1, document 2 projected by mobile phone 2, document 3 projected by mobile phone 3, and document 4 projected by mobile phone 4.
- the screen of the large-screen device can be used to display the video call screen and the screen projection interface. As described above, the area used to display the conference site pictures is called the video call area, and the area used to display the documents is called the document presentation area, as shown in FIG. 23 .
- the layout of the video call area and the document presentation area on the screen of the large-screen device may be predefined.
- the predefined layout is not limited to the horizontal layout shown in FIG. 23 , but can also be arranged vertically, in a picture-in-picture manner, and the like.
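The predefined layout can be sketched as a simple region computation. This is a minimal sketch under stated assumptions: the 50/50 split ratio, the function name `split_screen`, and the `(x, y, w, h)` tuple format are invented for illustration.

```python
# Sketch of a predefined horizontal layout: with no projection data, the
# video call screen occupies the whole screen; once projection data
# arrives, the screen is divided into a video call area (left) and a
# document display area (right).

def split_screen(width, height, has_projection, ratio=0.5):
    """Return named screen regions as (x, y, w, h) rectangles."""
    if not has_projection:
        return {"video_call": (0, 0, width, height)}
    call_w = int(width * ratio)
    return {
        "video_call": (0, 0, call_w, height),
        "document": (call_w, 0, width - call_w, height),
    }

layout = split_screen(1920, 1080, has_projection=True)
```

A vertical or picture-in-picture layout would return differently shaped regions from the same kind of function.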
- when the large-screen device currently displays only the video call screen, if screen projection data from a mobile phone is received, the large-screen device can divide the screen into two areas, the video call area and the document display area, according to the predefined layout, which are used to display the video call screen and the corresponding screen projection interface respectively.
- the predefined layout is horizontal layout
- mobile phone 1 projects document 1 to large-screen device A as an example.
- the large-screen device A currently displays the video call screen, including the conference site screen in region B and the conference site screen in region A.
- the user can trigger cross-device screen projection by dragging.
- correspondingly, the large-screen device A can receive the screen casting request.
- the large-screen device A can display a request notification 2401 to inform the user that the mobile phone 1 requests screen projection and to ask whether to allow it.
- if permission is granted, for example, the permission button 2402 is selected,
- the large-screen device A can divide the screen vertically into the video call area and the document display area according to the predefined layout, and present an animation effect as the document 1 projected by the mobile phone is added. For example, the video call screen retracts to the left area of the screen, and the document 1 is displayed in the right area of the screen. After that, as shown in (c) of FIG. 24 , the large-screen device A can display the video call screen and the document 1 at the same time.
- the user may also use the input device of the large-screen device to control the content presented on the screen.
- the user may use the remote controller of the large-screen device to switch the layout.
- the large-screen device being the large-screen device A as an example.
- the large-screen device A can display a full-screen button in the window where each picture is presented.
- a full-screen button 2501 is displayed in the window corresponding to the conference site screen of region B
- a full-screen button 2503 is displayed in the window corresponding to the conference site screen of region A
- a full-screen button 2502 is displayed in the window corresponding to the screen of document 1.
- the large-screen device A can display the picture of the corresponding window in full screen, and hide the pictures of other windows.
- the user can use the remote control of the large-screen device A to switch the focus position of the remote control operation on the screen, for example, to switch the focus position to the full-screen button 2502.
- the large-screen device A receives the user's operation of the confirmation button of the remote control. In response to this operation, as shown in (b) of FIG. 25 , the large-screen device A displays the document 1 in full screen; the conference site screen of region B and the conference site screen of region A can be hidden.
- when the large-screen device A displays a picture in full screen, such as the above-mentioned picture of document 1, the large-screen device A can also display a zoom-out button 2504, as shown in (b) of FIG. 25 . After receiving the user's operation of the zoom-out button 2504, the large-screen device A can present all the pictures on the screen at the same time, as shown in (a) of FIG. 25 .
- the large-screen device may not display full-screen buttons corresponding to different pictures.
- when a large-screen device, such as the large-screen device A, displays multiple pictures, the window of one of the pictures can be set as the focus window by default.
- the user can use the direction keys of the remote control of the large-screen device A to switch the focus window.
- the large-screen device A receives the user's operation of the confirmation button on the remote control, and the large-screen device A presents the screen of the focus window in full screen.
- when the large-screen device A receives the user's operation of the confirmation button or the return button on the remote control, it exits full screen and presents all the pictures on the screen at the same time.
- the above example is described by taking displaying only the pictures in the document display area as an example; the user can also trigger the above corresponding operations to display only the pictures in the video call area, which will not be repeated in this embodiment.
- the content projected by the projection source can be as follows:
- Solution 1: The document display area supports many-to-one coexistence sharing.
- when the screen projection sources of large-screen device A (namely mobile phone 1 and mobile phone 2) and the screen projection sources of large-screen device B (namely mobile phone 3 and mobile phone 4) all perform screen projection,
- large-screen device A and large-screen device B can display document 1 projected by mobile phone 1, document 2 projected by mobile phone 2, document 3 projected by mobile phone 3, and document 4 projected by mobile phone 4 at the same time.
- document 1, document 2, document 3 and document 4 are displayed in the document display area in the form of a square grid.
- the document display area is divided into four document display sub-areas, namely document display sub-area 1 , document display sub-area 2 , document display sub-area 3 and document display sub-area 4 .
- the large-screen device A and the large-screen device B display the documents in the corresponding document display sub-areas in the order in which the corresponding screen projection data is received, respectively.
- the sequence of screen projection data is: the screen projection data of mobile phone 1, the screen projection data of mobile phone 2, the screen projection data of mobile phone 3, and finally the screen projection data of mobile phone 4.
- the large-screen device A and the large-screen device B sequentially display document 1, document 2, document 3, and document 4 in the corresponding document display sub-area 1, document display sub-area 2, document display sub-area 3, and document display sub-area 4.
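The arrival-order assignment of Solution 1 can be sketched as follows. The function name and mapping shape are illustrative assumptions; sub-areas are numbered 1 to 4 as in the description above.

```python
# Sketch of coexistence sharing: documents are placed into the document
# display sub-areas in the order their screen projection data arrives.

def assign_subareas(arrival_order, num_subareas=4):
    """Map each projection source to a document display sub-area (1-based)
    according to the order in which its projection data was received."""
    return {source: i + 1 for i, source in enumerate(arrival_order[:num_subareas])}

# Projection data arrives from phone 1 first and phone 4 last.
assignment = assign_subareas(["phone1", "phone2", "phone3", "phone4"])
```

Both large-screen devices apply the same ordering, so the grid looks identical at both conference sites.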
- Solution 2: The document display area supports preemptive sharing. That is, there is only one document display area on a large-screen device.
- the document projected on the latter screen can cover the document projected on the previous screen.
- mobile phone 1 is first connected to large-screen device A and projects document 1; that is, large-screen device A and large-screen device B receive the screen projection data of mobile phone 1 first, and then the large-screen device A and the large-screen device B display document 1 in their document display areas.
- then the mobile phone 2 is connected to the large-screen device A and projects document 2; that is, after the large-screen device A and the large-screen device B receive the screen projection data of the mobile phone 2, the large-screen device A and the large-screen device B no longer display document 1 in their document display areas, but display document 2 instead.
- then the mobile phone 3 is connected to the large-screen device B and projects document 3; that is, after the large-screen device B and the large-screen device A receive the screen projection data of the mobile phone 3, the large-screen device A and the large-screen device B no longer display document 2 in their document display areas, but display document 3 instead.
- then the mobile phone 4 is connected to the large-screen device B and projects document 4; that is, after the large-screen device B and the large-screen device A receive the screen projection data of the mobile phone 4, the large-screen device A and the large-screen device B no longer display document 3 in their document display areas, but display document 4 instead.
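The preemptive behavior of Solution 2 can be sketched with a single display slot where each new document covers the previous one. The class name and the `history` list are illustrative assumptions added to make the replacement order visible.

```python
# Sketch of preemptive sharing: one document display area, latest wins.

class DocumentDisplayArea:
    def __init__(self):
        self.current = None   # the document currently shown
        self.history = []     # documents that have been covered

    def on_projection(self, document):
        if self.current is not None:
            self.history.append(self.current)
        self.current = document  # the new document covers the previous one

area = DocumentDisplayArea()
for doc in ["document1", "document2", "document3", "document4"]:
    area.on_projection(doc)
```

After the four projections above, only document 4 remains visible, matching the sequence described for mobile phones 1 through 4.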
- Solution 3: The above Solution 1 and Solution 2 can also be combined.
- a large-screen device supports up to four screen projection sources to present content on the screen at the same time.
- the content of each projection source can be displayed on the large-screen device according to the result shown in (a) in FIG. 26 .
- if there are more projection sources, preemptive sharing can be used to present the projected content. For example, in conjunction with (a) in FIG. 26 ,
- when the large-screen device currently presents the content projected by mobile phone 1, mobile phone 2, mobile phone 3 and mobile phone 4, if mobile phone 5 needs to perform screen projection, the content projected by the mobile phone 5, such as document 5, can cover the document 1 projected by the mobile phone 1 and be presented on the large-screen device. After that, if the mobile phone 6 needs to perform screen projection, the content projected by the mobile phone 6, such as document 6, can cover the document 2 projected by the mobile phone 2 and be presented on the large-screen device, and so on.
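The combined Solution 3 can be sketched as a grid that fills up to its capacity and then overwrites slots in round-robin order. The class name, slot list, and `next_victim` index are illustrative assumptions.

```python
# Sketch of Solution 3: up to max_slots sources coexist in a grid; once
# the grid is full, each new document covers the slot of the earliest
# remaining source (phone 5 covers phone 1's slot, phone 6 covers
# phone 2's slot, and so on).

class ScreenGrid:
    def __init__(self, max_slots=4):
        self.max_slots = max_slots
        self.slots = []       # documents currently shown, one per sub-area
        self.next_victim = 0  # index of the slot the next document covers

    def project(self, document):
        if len(self.slots) < self.max_slots:
            self.slots.append(document)            # coexistence phase
        else:
            self.slots[self.next_victim] = document  # preemption phase
            self.next_victim = (self.next_victim + 1) % self.max_slots

grid = ScreenGrid()
for doc in ["doc1", "doc2", "doc3", "doc4", "doc5", "doc6"]:
    grid.project(doc)
```

After six projections the grid holds documents 5, 6, 3 and 4, matching the replacement order described above.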
- FIG. 27 is a schematic diagram of the composition of a screen projection device according to an embodiment of the present application.
- the apparatus can be applied to a first terminal, and the first terminal is connected with a plurality of second terminals.
- the apparatus may include: a receiving unit 2701 and a display unit 2702 .
- the receiving unit 2701 is configured to receive data from each of the plurality of second terminals.
- the display unit 2702 is configured to display a plurality of first interfaces on the first terminal according to data received from a plurality of second terminals, and the plurality of first interfaces are in one-to-one correspondence with the plurality of second terminals;
- the content of the first interface is a mirror image of the content of the second interface displayed by the corresponding second terminal, or the content of the first interface is the same as part of the content of the second interface displayed by the corresponding second terminal.
- the apparatus may further include: a creating unit 2703 .
- the creating unit 2703 is configured to create multiple drawing components, the multiple drawing components are in one-to-one correspondence with the multiple second terminals, and the drawing components are views or canvases.
- the display unit 2702 displays a plurality of first interfaces on the first terminal according to the data received from the plurality of second terminals, which may include: the display unit 2702, according to the data received from the plurality of second terminals, displays on the plurality of drawing components respectively A first interface corresponding to the second terminal is drawn to display a plurality of first interfaces on the first terminal.
- the apparatus may further include: a configuration unit 2704 and a decoding unit 2705 .
- the configuration unit 2704 is configured to configure a plurality of decoding parameters, and the plurality of decoding parameters are in one-to-one correspondence with the plurality of second terminals.
- the decoding unit 2705 is configured to decode the data received from the corresponding second terminal according to the plurality of decoding parameters.
- the apparatus may further include: an obtaining unit 2706 .
- the obtaining unit 2706 is configured to obtain connection information of the multiple second terminals, and the connection information is used for establishing a connection between the first terminal and the corresponding second terminal. The multiple drawing components being in one-to-one correspondence with the multiple second terminals includes: the multiple drawing components being in one-to-one correspondence with the connection information of the multiple second terminals; and the multiple decoding parameters being in one-to-one correspondence with the multiple second terminals includes: the multiple decoding parameters being in one-to-one correspondence with the connection information of the multiple second terminals.
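The per-terminal correspondence keyed by connection information can be sketched as a small registry. This is a hedged illustration: the class and field names, the string-tagged drawing component, and the codec labels are assumptions, not the disclosed implementation.

```python
# Sketch: each second terminal's connection information indexes both its
# drawing component (view or canvas) and its decoding parameters, so data
# from each source is decoded and drawn with its own resources.

class ProjectionRegistry:
    def __init__(self):
        self.by_connection = {}  # connection info -> per-source resources

    def register(self, connection_info, decoder_params):
        self.by_connection[connection_info] = {
            "drawing_component": f"view-{connection_info}",  # view or canvas
            "decoder_params": decoder_params,
        }

    def decode(self, connection_info, data):
        params = self.by_connection[connection_info]["decoder_params"]
        # A real implementation would feed `data` into a codec configured
        # with `params`; here we only tag it to show the per-source routing.
        return (params["codec"], data)

reg = ProjectionRegistry()
reg.register("192.168.1.10", {"codec": "h264"})
reg.register("192.168.1.11", {"codec": "h265"})
decoded = reg.decode("192.168.1.11", b"frame")
```

Keying both maps by the same connection information keeps the drawing component and decoder for one source from being confused with another's.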
- the apparatus may further include: an input unit 2707 .
- the input unit 2707 is configured to receive a user's first operation on the window of the first interface.
- the display unit 2702 is further configured to reduce, enlarge or close the window, or switch the focus window in response to the first operation.
- the input unit 2707 is further configured to receive a second operation of the user on the first interface corresponding to the second terminal.
- the apparatus may further include: a sending unit 2708, configured to send the data of the second operation to the second terminal, so that the second terminal can display the third interface according to the second operation.
- the receiving unit 2701 is further configured to receive updated data from the second terminal.
- the display unit 2702 is further configured to update the first interface corresponding to the second terminal to a fourth interface according to the updated data, and the content of the fourth interface is a mirror image of the content of the third interface, or the content of the fourth interface is the same as part of the content of the third interface.
- the first terminal also establishes a connection with the third terminal; the sending unit 2708 is further configured to send data received from multiple second terminals to the third terminal, so that the third terminal displays multiple first interfaces.
- the receiving unit 2701 is further configured to receive video data from a third terminal.
- the display unit 2702 is further configured to display a video call picture on the first terminal according to video data of the third terminal while the first terminal displays a plurality of first interfaces.
- the apparatus may further include: a collection unit for collecting video data.
- the sending unit 2708 is further configured to send video data to the third terminal, for the third terminal to display a video call screen while displaying a plurality of first interfaces on the third terminal.
- FIG. 28 is a schematic diagram of the composition of another screen projection device according to an embodiment of the present application.
- the apparatus can be applied to a second terminal, and the second terminal is connected to the first terminal.
- the apparatus may include: a display unit 2801 , an input unit 2802 and a sending unit 2803 .
- the display unit 2801 is used to display the second interface.
- the input unit 2802 is used for receiving user operations.
- the sending unit 2803 is used to send the data of the second interface to the first terminal in response to the user operation, so that the first terminal can display the first interface corresponding to the second terminal, while the first terminal also displays first interfaces corresponding to other second terminals;
- wherein the content of the first interface is a mirror image of the content of the second interface displayed by the corresponding second terminal, or the content of the first interface is the same as part of the content of the second interface displayed by the corresponding second terminal.
- the apparatus may further include: an acquiring unit 2804, configured to acquire data of the second interface.
- when the content of the first interface is a mirror image of the content of the second interface, the data of the second interface is the screen recording data of the second interface; when the content of the first interface is the same as part of the content of the second interface, the data of the second interface is the screen recording data of the layer where the predetermined element in the second interface is located.
- the display unit 2801 is further configured to display a configuration interface, where the configuration interface includes layer filter setting options.
- the input unit 2802 is further configured to receive a user's selection operation on a layer filter setting option.
- the input unit 2802 receives a user operation, which may include: the input unit 2802 receives a user's drag operation on the second interface or elements in the second interface.
- the apparatus may further include: a determining unit 2805, configured to determine that the user's drag intention is to drag across devices; an acquiring unit 2804, further configured to acquire data of the second interface.
- in the case of receiving a user's drag operation on an element in the second interface, the element can be a video component, a floating window, a picture-in-picture or a freeform small window, and the data of the second interface is the screen recording data of the layer where the element is located;
- or, the element can be a user interface (UI) control in the second interface, and the data of the second interface is the instruction stream of the second interface and the identifier of the UI control, or the data of the second interface is the drawing instructions and identifier of the UI control.
- the above embodiments describe the process of screen projection from multiple terminals to one terminal.
- a terminal such as a mobile phone
- the terminal serving as the screen projection source end can realize the projection display of the content of one or more of its applications on other terminals serving as the screen projection destination end by creating multiple media streams, so as to meet the needs of multi-task parallelism.
- the first terminal 101 is used as the screen projection source end
- the second terminal 102 is used as the screen projection destination end as an example.
- the first terminal 101 may be a mobile device such as a mobile phone and a tablet
- the second terminal 102 may be a large-screen device such as a PC and a TV.
- FIG. 29 is a schematic diagram of the composition of another software architecture provided by an embodiment of the present application.
- the software architectures of both the first terminal 101 and the second terminal 102 may include: an application layer and a framework layer.
- the first terminal 101 may include: a service scheduling and policy selection module, a video collection module, an audio collection module, a privacy mode setting module, an audio and video encoding module, a multi-device connection management protocol adaptation module and a media stream transmission module.
- Each module included in the first terminal 101 may be included in any layer of the software architecture of the first terminal 101 .
- the above-mentioned modules included in the first terminal 101 are all included in the framework layer of the first terminal 101 , which is not specifically limited in this embodiment.
- the first terminal 101 may also include an application program, which may be included in the above-mentioned application layer.
- the second terminal 102 may include: a video rendering module, an audio rendering module, a video cropping module, an audio and video decoding module, a multi-device connection management protocol adaptation module and a media stream transmission module .
- Each module included in the second terminal 102 may be included in any layer of the software architecture of the second terminal 102 .
- each module included in the second terminal 102 is included in the framework layer of the second terminal 102, which is not specifically limited in this embodiment.
- the second terminal 102 may also include an application program, which may be included in the above-mentioned application layer.
- the first terminal 101 and the second terminal 102 may establish a connection in a wireless or wired manner.
- the first terminal 101 and the second terminal 102 can discover each other through a discovery process, and establish a connection through a connection process, or form a network.
- a transmission channel may be provided between the first terminal 101 and the second terminal 102 for data transmission between the two, so as to realize the display of the content of one or more applications of the first terminal 101 on the display screen of the second terminal 102.
- the composition of the software architecture illustrated in this embodiment does not constitute a specific limitation on the composition of the terminal software architecture.
- the terminals may include more or less modules than those shown in the figure, or combine some modules, or split some modules, or different module layout.
- the above-mentioned first terminal 101 may not include a privacy mode setting module.
- the above-mentioned first terminal 101 does not include an audio collection module, and the second terminal 102 does not include an audio rendering module.
- the above-mentioned first terminal 101 does not include a video acquisition module
- the second terminal 102 does not include a video rendering module and a video cropping module.
- the above-mentioned second terminal 102 does not include a video cropping module.
- the first terminal 101 serving as the screen projection source can project the content of one or more of its applications onto the display screen of the second terminal 102 serving as the screen projection destination by creating multiple media streams.
- the first terminal 101 serving as the screen projection source is a mobile phone
- the second terminal 102 serving as the screen projection destination is a TV as an example.
- the video capture module and the audio capture module of the mobile phone can perform audio extraction and video extraction according to the media strategy customized by the service scheduling and policy selection module, to obtain audio data and video data.
- the video acquisition module and the audio acquisition module of the mobile phone can transmit the collected audio data and video data to the audio and video coding module of the mobile phone.
- the audio and video encoding module of the mobile phone can encode the audio data and the video data respectively, packetize them, and store them in a buffer queue.
- the multi-device connection management protocol adaptation module of the mobile phone can start network monitoring and connection management.
- the mobile phone can establish a connection with the TV to establish a connection channel between the mobile phone and the TV.
- the media stream transmission module of the mobile phone can take out the buffered audio data and video data from the buffer queue, and transmit them to the TV through the connection channel between the mobile phone and the TV, such as to the media stream transmission module of the TV.
- after the media stream transmission module of the TV receives the data, the data is unpacked and decoded by the audio and video decoding module of the TV to obtain the audio data and the video data.
- the audio and video decoding module of the TV transmits the audio data to the audio rendering module of the TV, and the audio rendering module outputs the corresponding audio.
- the audio and video decoding module of the TV transmits the video data to the video rendering module of the TV, and the video rendering module outputs the corresponding video, that is, the corresponding interface content is displayed.
- the process of audio and video extraction, encoding, packetizing and caching performed by the mobile phone can be called creating a media stream.
- the mobile phone can complete the projection of the content of an application on the mobile phone to the TV by creating a media stream (for example, called the first media stream).
- the mobile phone can also create another one or more media streams (such as the second media stream, the third media stream, etc.) to realize the projection of the content applied on the mobile phone to the TV or other screen projection destination.
- the other media streams created such as the second media stream, the third media stream, etc., may be media streams created for the content of the application or media streams created for the content of other applications.
- Scenario 1: Mobile phone A does not support parallel multitasking.
- the user wants to view the content of APP1 and the content of APP2 of mobile phone A at the same time.
- APP1 may be the first application in this embodiment of the present application.
- APP2 may be the second application in this embodiment of the present application.
- APP1 is a video application
- APP2 is a fitness application.
- mobile phone A (mobile phone A may be the above-mentioned first terminal) can be used as the screen projection source terminal, and the contents of the two applications can be projected to one or more other terminals serving as the screen projection destination, so as to satisfy the user's demand for viewing the content of the video application and the content of the fitness application at the same time.
- the following takes the screen projection destination including one terminal, such as a TV (the TV may be the above-mentioned second terminal), as an example.
- the user can trigger the mobile phone A to create two media streams by dragging, so as to project the content of the video application and the content of the fitness application on the mobile phone A to the TV.
- Mobile phone A establishes a connection with the TV.
- the description of establishing a connection between the mobile phone A and the TV is similar to the description of the corresponding content in the above-mentioned embodiment S401 shown in FIG. 4 , and details are not repeated here.
- when mobile phone A is connected to the TV, mobile phone A can act as the screen projection source and project the content of an application to the TV serving as the screen projection destination.
- the specific description that the mobile phone A projects the content of the application to the TV is similar to the description that the mobile phone 1 or the mobile phone 2 projects the content to the TV in the foregoing embodiment, and will not be repeated here.
- the following takes the user triggering mobile phone A to start projecting the content of an application to the TV by dragging as an example.
- the content of the application may include the interface content of the application displayed by the mobile phone A.
- mobile phone A currently displays an interface of a video application.
- the user may perform a drag and drop operation on the interface of the video application displayed on the mobile phone A or an element in the interface.
- Mobile phone A can receive the drag operation.
- the dragging operation may be the first operation in this embodiment of the present application. It can be understood that dragging can be divided into intra-device dragging and cross-device dragging (or inter-device dragging).
- an intra-device drag may refer to a drag whose intent is to move the dragged object from one location on the device to another location on the same device.
- a cross-device drag may refer to a drag whose intent is to drag a dragged object from one location on the device into another device.
- the mobile phone A may determine whether the user's drag intention is to drag across devices. If it is determined that the user's drag intention is to drag across devices, projection of the content of the video application, such as its interface content, to the TV is started. As an example, mobile phone A may perform video extraction on the currently displayed interface of the video application to obtain corresponding video data, and send the video data to the TV serving as the screen projection destination.
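The text does not specify how the cross-device intention is determined. As one possible heuristic (an assumption for illustration only), the drag could be classified as cross-device once any part of the dragged object's rectangle moves past the screen edge:

```python
def is_cross_device_drag(obj_left: int, obj_width: int,
                         screen_width: int) -> bool:
    """Illustrative heuristic: treat the drag as cross-device once any part
    of the dragged object's rectangle has moved past the right screen edge.
    The real determination rule is not specified by the description above."""
    return obj_left + obj_width > screen_width

# An object 400 px wide whose left edge is dragged to x=900 on a
# 1080-px-wide screen overflows the edge -> cross-device drag.
print(is_cross_device_drag(900, 400, 1080))  # -> True
print(is_cross_device_drag(100, 400, 1080))  # -> False
```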
- the video data can be used to project and display the interface of the video application or the elements in the interface on the destination end of the projection screen.
- the video data may be data of the interface of the first application in this embodiment of the present application.
- the object dragged by the user may be the interface of the video application, or may be an element in the interface of the video application, such as a video element, a picture-in-picture, or a floating window.
- when the object dragged by the user is the interface of the video application displayed by mobile phone A, referring to FIG. 29, mobile phone A performs video extraction to obtain the corresponding video data.
- Mobile phone A creates a virtual display (VirtualDisplay).
- the video capture module of mobile phone A sends a request to create a VirtualDisplay to the display manager of mobile phone A.
- after the display manager of mobile phone A completes the creation of the VirtualDisplay, it can return the created VirtualDisplay to the video capture module of mobile phone A.
- the mobile phone A can start the video application into the VirtualDisplay, or in other words, move the interface drawing of the video application to the VirtualDisplay.
- mobile phone A can also bind VirtualDisplay to the video capture module of mobile phone A for screen recording, or video extraction. In this way, the video acquisition module of the mobile phone A can obtain corresponding video data.
- the mobile phone A may only project the element to the screen projection destination.
- mobile phone A performs video extraction, and the process of obtaining video data may be: after determining that the user's drag intention is to drag across devices, mobile phone A creates a VirtualDisplay. After that, the mobile phone A can move the drawing of the element dragged by the user in the interface of the video application to the VirtualDisplay.
- Mobile phone A can also bind VirtualDisplay to the video capture module of mobile phone A for screen recording, or video extraction. In this way, the video acquisition module of the mobile phone A can obtain corresponding video data.
- the specific implementation of mobile phone A moving the drawing of the element dragged by the user in the application interface to the VirtualDisplay may be: after receiving the user's drag operation on the element in the interface of the video application, mobile phone A can obtain the layer name of the layer where the dragged element is located in the current interface of the video application.
- Mobile phone A can synthesize the interface of the video application into the VirtualDisplay layer by layer. In the process of layer-by-layer synthesis, mobile phone A can determine whether the layer name of the layer to be synthesized currently is the same as the layer name of the layer where the dragged element is located. If it is the same, Phone A composites the layer into the VirtualDisplay. If not, Phone A does not composite the layer into VirtualDisplay.
- the mobile phone A can also only project specific elements in the interface, such as video elements, to the destination end of the projection screen, so as to protect the privacy of the user.
- mobile phone A may provide a setting interface for the user to enable or disable this function, such as a so-called privacy mode.
- when the user chooses to turn on the privacy mode, mobile phone A only synthesizes the layer where the specific element in the interface is located into the VirtualDisplay to obtain the video data.
- when the user chooses to turn off the privacy mode, mobile phone A can synthesize all layers of the interface into the VirtualDisplay to obtain the video data.
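The layer-by-layer synthesis decision described above — composite only the layer(s) whose name matches the dragged or privacy-permitted element, or all layers when privacy mode is off — can be sketched as a simple filter. Layer names and the `allowed_layer_names` parameter are illustrative, not the actual compositor API.

```python
def compose_into_virtual_display(layers, allowed_layer_names=None):
    """Return the layers composited into the virtual display.
    allowed_layer_names=None models privacy mode off (all layers kept);
    otherwise only layers whose name is in the allowed set are kept."""
    if allowed_layer_names is None:
        return list(layers)
    return [name for name in layers if name in allowed_layer_names]

interface_layers = ["status-bar", "video-element", "comments", "ad-banner"]
# Privacy mode on: only the dragged video element's layer is composited.
print(compose_into_virtual_display(interface_layers, {"video-element"}))
# -> ['video-element']
```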
- the video data can be encoded and sent to the TV serving as the screen projection destination.
- the acquired video data can be transmitted to the audio and video encoding module of mobile phone A.
- the audio and video encoding module of mobile phone A can encode the video data, packetize it, and store it in the cache queue.
- the mobile phone A can send the video data in the buffer queue to the TV.
- the media stream transmission module of the mobile phone A can take out the buffered video data from the buffer queue and transmit it to the TV through the connection channel between the mobile phone A and the TV, such as to the media stream transmission module of the TV.
- the TV can display an interface or elements in the interface corresponding to the video application on the TV according to the video data.
- the data is depacketized and decoded by the audio and video decoding module of the TV, and the corresponding video data can be obtained.
- the audio and video decoding module of the TV transmits the video data to the video rendering module of the TV, and the video rendering module displays the corresponding interface content.
- the interface of the video application in the mobile phone A or the elements in the interface can be projected and displayed on the TV, or the "transfer" of the video application from the mobile phone A to the TV can be realized.
- the user can continue to view the content of the video application on the TV.
- the object dragged by the user is an element in the interface of the video application, such as a video element.
- the user performs a drag operation on the video element 1201, such as an operation of long pressing and moving a finger to the right.
- the phone can draw and display an animation of the video element 1201 as the user's finger moves.
- mobile phone A creates a virtual display, such as virtual display 1 (the virtual display 1 may be the first virtual display in this embodiment of the application), and draws the layer where the video element 1201 (the video element 1201 may be the first element in this embodiment of the application) is located in the current interface on the virtual display 1, so that mobile phone A can perform video extraction to obtain video data, such as video data a (the video data a may be the data of the interface of the first application in this embodiment of the application).
- the mobile phone A can encode and packetize the video data a and store it in the cache queue.
- the mobile phone A can send the video data a in the buffer queue to the TV.
- the video data a is depacketized, decoded, and then rendered, so as to display the video X played in the video element 1201 on the TV.
- the "transfer" of the video application from the mobile phone A to the TV is realized, and the user can continue to watch the video X on the TV.
- the TV may display the dragged object on the TV after the user releases the drag of the object on the mobile phone A.
- the mobile phone A sends the video data a in the cache queue to the TV.
- for example, mobile phone A sends the video data a to the TV after receiving the user's operation of releasing the drag of the video element 1201 toward the TV.
- a visual effect of dragging the object from mobile phone A to the TV is provided to the user. During the dragging process, if part of the object overflows the display screen, the object can be displayed on both mobile phone A and the TV at the same time.
- mobile phone A can perform video extraction to obtain video data a, encode and packetize the video data a, and send it to the TV.
- the mobile phone A can also send the rectangle information of the video element 1201 and the coordinate information of a certain corner (eg, upper left corner) of the video element 1201 to the TV during the dragging process.
- according to the received rectangle information of the video element 1201, the coordinate information of the upper left corner of the video element 1201 during the dragging process, and the resolution of mobile phone A, the TV can determine that part of the video element 1201 overflows the display screen of mobile phone A, and can further determine the information corresponding to the region of the video element 1201 that should be displayed on the TV screen.
- the TV depacketizes and decodes the video data a, and performs interface rendering according to the information of the determined region and the decoded video data a, so as to draw the video X played in the video element 1201 on the TV.
- an interface 1 is displayed on the display screen of the TV, and the content of the interface 1 is the same as the part of the video X carried in the video element 1201 that overflows the display screen of mobile phone A.
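The region computation described above — combining the element's rectangle, its top-left corner during the drag, and the phone's resolution to find the part that should be drawn on the TV — might look like the following for a rightward drag. This is a sketch under assumed coordinate conventions; the patent does not give the formula.

```python
def overflow_region(left, top, width, height, phone_width):
    """Return the part of the element (in element-local coordinates) that has
    overflowed the phone's right edge and should be drawn on the TV, or None
    if the element is still fully on the phone screen."""
    overflow_w = (left + width) - phone_width
    if overflow_w <= 0:
        return None
    return {"x": width - overflow_w, "y": 0,
            "width": overflow_w, "height": height}

# A 400x300 element whose top-left corner is at (900, 100) on a
# 1080-px-wide phone overflows by 220 px.
print(overflow_region(900, 100, 400, 300, 1080))
# -> {'x': 180, 'y': 0, 'width': 220, 'height': 300}
```

Recomputing this region as the corner coordinates arrive in real time is what lets the TV update interface 1 while the drag is in progress.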
- the mobile phone A can acquire the video data a and the coordinate information of the upper left corner of the video element 1201 in real time during the dragging process, and send it to the TV.
- the TV can update the interface 1 in real time according to the received data.
- the TV can display the interface 1 in full screen on the display screen of the TV according to the video data a received in real time.
- the content in the interface 1 is the same as the entire content of the video X carried in the video element 1201 .
- the interface 1 may be the first interface in this embodiment of the application.
- the above-mentioned process of extracting, encoding, packetizing, and buffering the content of the application may be referred to as creating a media stream. That is, in combination with the above example, when the content of the video application includes the interface content, mobile phone A can create a virtual display (such as virtual display 1), and use the virtual display 1 to realize the creation of one media stream (such as the first media stream). Afterwards, mobile phone A can realize the projection of the interface content of the video application to the TV by sending the data corresponding to the created first media stream, such as the above-mentioned video data a, or the first video data, to the TV.
- the mobile phone A can realize the projection of the content of other applications on mobile phone A to the TV by creating one or more other media streams.
- mobile phone A can create another media stream for the fitness application, such as the second media stream, so as to realize the projection of the content of the fitness application, such as its interface content, to the TV.
- the process of creating a media stream for a fitness application to project the content of the fitness application to the TV is similar to the above-mentioned process of creating a media stream for a video application to project the content of the video application to the TV, and will not be described in detail here.
- after the content of the video application in mobile phone A is projected to the TV, as shown in FIG. 14, the user opens the fitness application of mobile phone A (the operation of opening the fitness application may be the second operation in this embodiment of the application) to view a fitness video.
- mobile phone A receives the user's drag operation (the drag operation may be the third operation in this embodiment of the application) on the video element (the video element may be the second element in this embodiment of the application) that carries the fitness video.
- the mobile phone A makes the video element move on the display screen of the mobile phone A following the movement of the user's finger, giving the user a visual effect that the video element is dragged by the user's finger.
- the mobile phone A can determine whether the user's drag intention is to drag across devices.
- mobile phone A may create another virtual display, such as one called virtual display 2 (this virtual display 2 may be the second virtual display in this embodiment of the application), and draw the layer where the video element is located in the current interface on the virtual display 2, so that mobile phone A performs video extraction to obtain video data, such as video data b (the video data b may be the data of the interface of the second application).
- the mobile phone A can encode the video data b, packetize it, and store it in the cache queue. After that, mobile phone A can send the video data b in the buffer queue to the TV.
- the mobile phone A may also send the rectangle information of the video element and the coordinate information of a certain corner (eg, upper left corner) of the video element to the TV during the dragging process.
- the TV can receive the video data b, the rectangle information of the video element, and the coordinate information of the upper left corner of the video element during the dragging process.
- according to the rectangle information of the video element, the coordinate information of the upper left corner of the video element during the dragging process, and the resolution of mobile phone A, the TV can determine the information corresponding to the region of the video element that should be displayed on the display screen of the TV.
- the TV depacketizes and decodes the video data b, and renders the interface according to the information of the determined region and the decoded video data b, so as to draw interface 2, the content of which is the same as the content of the fitness video in the fitness application of mobile phone A.
- the TV can simultaneously display the content of the video application of the mobile phone A and the content of the fitness application on the TV display screen.
- the TV currently displays the content of the video application in full screen (such as the above interface 1).
- the TV can display the above-mentioned interface 2 in the form of a small window (or picture-in-picture, or floating window) on the display screen of the TV.
- the mobile phone A can obtain the video data b and the coordinate information of the upper left corner of the video element during the dragging process in real time, and send it to the TV.
- the TV can update the interface 2 in real time according to the received data. After the user releases the drag, as shown in (d) of FIG. 13, the TV can continue to display the interface 2 in the form of a small window on the display screen of the TV according to the video data b received in real time.
- the content of the interface 2 is the same as the entire content of the fitness video in the fitness application. It can be seen from the above description that mobile phone A creates a virtual display 2 and uses the virtual display 2 to realize the creation of another media stream, such as the second media stream. By sending the data corresponding to the created second media stream, such as the above-mentioned video data b, or the second video data, to the TV, mobile phone A realizes the projection of the content of the fitness application to the TV.
- the interface 2 may be the second interface in this embodiment of the application.
- the interface including the content of the interface 2 and the content of the interface 1 may be the third interface in this embodiment of the application.
- in this way, the content of the video application and the fitness application of mobile phone A, such as the interface content, is projected to the TV serving as the screen projection destination, which satisfies the user's demand for viewing the content of the video application and the fitness application at the same time.
- the content of the above application may also include audio.
- for example, when the user uses an application of mobile phone A, such as a video application to watch a video or a music application to listen to music, mobile phone A can not only project the interface content of the currently displayed application to the screen-casting destination, but can also project the audio to the screen-casting destination.
- the mobile phone A not only needs to send the above-mentioned video data (such as video data a or video data b) to the TV, but also needs to send audio data to the TV.
- the video data is used for the TV to display the corresponding interface on the display screen of the TV
- the audio data is used for the TV to play the corresponding sound.
- audio data can be obtained by creating an audio recording (AudioRecord) object. That is to say, when the user triggers mobile phone A to start projecting the content of an application, if the content of the application includes interface content and audio, mobile phone A can create a virtual display and an AudioRecord object, and use them to realize the creation of one media stream. After that, the corresponding video data and audio data are sent to the TV through the created media stream, so as to realize the projection of the application content, including the interface content and the audio, to the TV.
- the mobile phone A may create multiple AudioRecord objects in advance, which are used for subsequent audio extraction of different media streams. For example, it can be used for subsequent audio extraction of different applications, that is, based on the created AudioRecord object, the audio data of the application that needs to be projected is redirected to the corresponding media stream, and other audio data is still output from the projection source.
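The pool of pre-created AudioRecord objects and the redirection policy described above can be modeled abstractly. The recorder names and the `local-speaker` fallback are invented for illustration; a real implementation would use Android's AudioRecord objects rather than strings.

```python
class AudioRouter:
    """A small pool of pre-created recorder objects (standing in for
    AudioRecord objects) is assigned to apps whose audio is projected;
    audio from every other app keeps playing on the source device."""

    def __init__(self, pool_size: int = 2):
        self.free = [f"audio-record-{i + 1}" for i in range(pool_size)]
        self.assigned = {}

    def start_projection(self, app: str) -> str:
        # Redirect this app's audio into a projected media stream.
        if not self.free:
            raise RuntimeError("no pre-created recorder object is available")
        self.assigned[app] = self.free.pop(0)
        return self.assigned[app]

    def route(self, app: str) -> str:
        # Where this app's audio goes: a projected media stream or local output.
        return self.assigned.get(app, "local-speaker")

router = AudioRouter()
router.start_projection("video-app")
print(router.route("video-app"))   # -> audio-record-1
print(router.route("music-app"))   # -> local-speaker
```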
- the content of the video application includes interface content and audio.
- mobile phone A pre-creates two AudioRecord objects and creates a cache. After the user triggers the projection of the content of the video application, mobile phone A can realize the projection of the content of the video application to the TV by creating the first media stream. The process of projecting the interface content of the video application to the TV is as described in the foregoing embodiment, and details are not repeated here.
- the mobile phone A can also call the AudioRecord object to perform audio extraction to obtain audio data, such as audio data a, which is used to realize the projection of the audio of the video application to the TV.
- the specific process of acquiring the audio data a may include: the audio acquisition module of mobile phone A can call one of the two AudioRecord objects created in advance, such as one called AudioRecord object 1 (the AudioRecord object 1 may be the first AudioRecord object in this embodiment of the application). After the AudioRecord object 1 is called, the audio acquisition module of mobile phone A can record the audio in the video played by the video application to obtain audio data, such as audio data a (the audio data a may be the audio data of the first application in this embodiment of the application). After acquiring the audio data a, the audio acquisition module of mobile phone A can transmit it to the audio and video encoding module of mobile phone A, which can encode the audio data a, packetize it, and store it in the cache.
- the media stream transmission module of the mobile phone A can obtain the audio data a from the buffer, and send it to the TV through the connection channel between the mobile phone A and the TV.
- the television can output corresponding audio according to the audio data a.
- the data is depacketized and decoded by the audio and video decoding module of the TV, and the corresponding audio data a can be obtained.
- the audio and video decoding module of the TV transmits the audio data a to the audio rendering module of the TV, and the audio rendering module outputs the corresponding audio. In this way, the audio of the video application in mobile phone A is projected to the TV. At this time, other audio of mobile phone A is still output through mobile phone A.
- mobile phone A can create a second media stream to realize the projection of the content of the fitness application to the TV.
- the process of projecting the interface content of the fitness application to the TV is as described in the foregoing embodiment, and details are not repeated here.
- mobile phone A can also call the other AudioRecord object pre-created by mobile phone A, such as one called AudioRecord object 2 (the AudioRecord object 2 may be the second AudioRecord object in this embodiment of the application), so as to realize the projection of the audio of the fitness application to the TV. The specific implementation process is similar to the projection of the audio of the video application to the TV, and is not repeated here.
- the audio of the video application and fitness application of mobile phone A is output through the TV, and other audio is output through mobile phone A.
- the TV can select one channel of audio to output. For example, when the TV displays the interface content projected by different applications in the form of a large window (that is, a full-screen display window) and a small window, the TV can be configured not to output the audio of the small window, but to output the audio of the large window. For example, in conjunction with the example shown in (d) of FIG. 13, the TV plays the sound of the video X and does not play the sound of the fitness video.
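The single-audio-channel policy just described (output the large window's audio, mute small windows) might be expressed as a simple selection rule; the window descriptions here are illustrative.

```python
def audible_app(windows):
    """Given the projected windows on the TV, return the app whose audio is
    output: the large (full-screen) window wins; small windows are muted."""
    for window in windows:
        if window["kind"] == "large":
            return window["app"]
    return None  # no large window -> nothing is played in this sketch

windows = [{"app": "video-app", "kind": "large"},
           {"app": "fitness-app", "kind": "small"}]
print(audible_app(windows))  # -> video-app
```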
- media policies may be configured for creating the above-described media streams.
- the media policy may be pre-configured, or a configuration interface (eg, the configuration interface may be the interface shown in FIG. 8 ) may be provided for the user to set.
- the media policies corresponding to different media streams may be the same or different.
- the media policy corresponding to one media stream may include: whether to distribute audio (or project audio), whether to distribute video (or project interface content), the parameters of the virtual display used when distributing video (such as name, width, height, bit rate, encoding format, and dots per inch (DPI)), the specifications of the audio collected when distributing audio, and so on.
- the mobile phone A can determine whether to project audio and whether to project video according to the corresponding media policy, and collect video data and audio data of the specified specifications according to the corresponding parameters.
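A media policy of the shape listed above could be represented as a small per-stream configuration object. The field names and default values here are invented for illustration; the text only names the categories of parameters.

```python
from dataclasses import dataclass, field

@dataclass
class MediaPolicy:
    """Per-media-stream policy: whether to distribute audio/video, the
    virtual display parameters used when distributing video, and the
    audio capture specification used when distributing audio."""
    distribute_audio: bool = True
    distribute_video: bool = True
    display_params: dict = field(default_factory=lambda: {
        "name": "virtual-display-1", "width": 1920, "height": 1080,
        "bit_rate": 4_000_000, "codec": "h264", "dpi": 320,
    })
    audio_spec: dict = field(default_factory=lambda: {
        "sample_rate": 48_000, "channels": 2,
    })

# Policies for different media streams may be the same or different.
video_policy = MediaPolicy()
fitness_policy = MediaPolicy(distribute_audio=False)
print(fitness_policy.distribute_audio)  # -> False
```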
- the TV can set a window of one of the interfaces as the focus window by default, for example, the TV defaults a small window as the focus window.
- the TV displays a prompt sign 1301 for prompting the user that the small window, that is, the window of the interface 2, is the focus window.
- the user can use the remote control of the TV to select and switch the focus window, switch the layout of the large and small windows, and also close the large and small windows.
- the window used for full-screen display interface 1 may be referred to as a large window.
- when the TV receives the user's operation of the left button or the right button of the remote control, it switches the focus window.
- when the focus window is the small window and the TV receives the user's operation of the confirmation button on the remote control, as shown in (e) of FIG. 13, the TV can display the small window, namely interface 2, in full screen, and display the large window, namely interface 1, in the form of a small window.
- when the TV receives the user's operation of the return key on the remote control, the TV can stop displaying the small window, that is, close the small window, and the TV can also notify mobile phone A to stop projecting the content of the application corresponding to the small window.
- the mobile phone A can switch the application corresponding to the small window to the home screen to continue running. If the TV continues to receive the user's operation of the return key on the remote control, the TV can stop displaying the large window, and the TV can also notify mobile phone A to stop projecting the content of the application corresponding to the large window. In addition, mobile phone A can stop the application corresponding to the small window from running on the home screen, and start the application corresponding to the large window to run on the home screen.
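The remote-control behaviour described in the last few points — left/right switches the focus window, confirm swaps the large and small windows, and back closes the small window first and then the large one — can be sketched as a small state machine. The method names, key names, and interface labels are illustrative.

```python
class ProjectionWindows:
    def __init__(self):
        self.windows = {"large": "interface-1", "small": "interface-2"}
        self.focus = "small"  # the TV defaults the small window to focus

    def press(self, key: str):
        if key in ("left", "right"):
            # Left/right keys toggle the focus window.
            self.focus = "large" if self.focus == "small" else "small"
        elif key == "confirm" and self.focus == "small":
            # Confirm on the small window swaps the two windows' layout.
            self.windows["large"], self.windows["small"] = (
                self.windows["small"], self.windows["large"])
        elif key == "back":
            # Back closes the small window first, then the large window.
            target = "small" if "small" in self.windows else "large"
            self.windows.pop(target, None)
            return target  # the source is notified to stop this projection

tv = ProjectionWindows()
tv.press("confirm")
print(tv.windows["large"])  # -> interface-2
print(tv.press("back"))     # -> small
```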
- the projection content corresponding to different media streams is displayed on the screen projection destination in the form of a large window and a small window, which is only an example.
- the projection destination can also use other arrangements, such as vertical arrangement and horizontal arrangement, to display windows corresponding to different media streams.
- the specific implementation in which the projection destination displays the windows corresponding to different media streams is not specifically limited in this embodiment.
- the screen projection destination can also dynamically adjust the size and arrangement of the windows corresponding to each media stream displayed by the projection destination according to the number of projected media streams.
- the number of projected media streams can be dynamically increased or decreased. When the number of projected media streams increases or decreases, the screen projection destination can adjust the size and arrangement of windows corresponding to each media stream according to the current number of projected media streams.
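One possible layout rule keyed on the number of projected media streams is sketched below. The size fractions are invented for illustration; the text only requires that window size and arrangement adapt as streams are added or removed.

```python
def window_layout(stream_count: int):
    """Return a list of relative window sizes, one per projected stream."""
    if stream_count <= 0:
        return []
    if stream_count == 1:
        return [{"w": 1.0, "h": 1.0}]       # one full-screen window
    if stream_count == 2:
        return [{"w": 1.0, "h": 1.0},       # large (full-screen) window
                {"w": 0.25, "h": 0.25}]     # small floating window
    # Three or more streams: tile equal-width columns side by side.
    return [{"w": round(1.0 / stream_count, 3), "h": 1.0}
            for _ in range(stream_count)]

print(len(window_layout(3)))  # -> 3
```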
- the above scenario 1 is described by taking the screen projection source end projecting the contents of multiple applications to the same screen projection destination as an example.
- the screen projection source terminal may also project multiple applications thereof to different screen projection destinations.
- the following description will be given in conjunction with scenario 2.
- Mobile phone B does not support parallel multitasking.
- the user wants to view the content of APP3 and the content of APP4 of mobile phone B at the same time.
- APP3 is a fitness application
- APP4 is an educational application.
- APP3 may be the first application in the embodiment of the present application
- APP4 may be the second application in the embodiment of the present application.
- mobile phone B (mobile phone B may be the above-mentioned first terminal) can be used as the screen projection source terminal, and the contents of the two applications can be projected to one or more other terminals serving as the screen projection destination, so as to satisfy the user's demand for viewing the content of the fitness application and the content of the educational application at the same time.
- the following takes the screen projection destination including two terminals, such as a TV and a tablet (the TV may be the above-mentioned second terminal, and the tablet may be the third terminal in this embodiment of the application), as an example.
- Phone B can create two media streams to project the content of the fitness application to the TV and the content of the education application to the tablet.
- the specific implementation is similar to the corresponding description in the above scenario 1, and will not be described in detail here. The difference is that the data corresponding to one of the media streams created by mobile phone B is transmitted to the TV, which is used to realize the projection of the content of the fitness application on the TV.
- the data corresponding to another media stream is transmitted to the tablet, which is used to realize the projection of the content of the educational application on the tablet.
- the description continues by taking the user triggering the mobile phone B to start projecting the content of the application by dragging as an example. That is, in the cross-device dragging scenario, the user can trigger mobile phone B to create two media streams by dragging, so as to project the content of the fitness application on mobile phone B to the TV and the content of the education application to the tablet.
- the content of fitness applications and the content of educational applications include interface content and audio.
- mobile phone B is connected to both the tablet and the TV.
- Phone B pre-creates two AudioRecord objects.
- the user opens the fitness application of mobile phone B to view the fitness video.
- the mobile phone B receives the user's drag operation on the video element carrying the fitness video (the video element may be the first element in the embodiment of the application).
- the mobile phone B can determine whether the user's drag intention is to drag across devices.
- the mobile phone B may create a virtual display, such as a virtual display A (the virtual display A may be the first virtual display in this embodiment of the application), and call one of the two pre-created AudioRecord objects, such as AudioRecord object A (the AudioRecord object A may be the first AudioRecord object in this embodiment of the application).
- using virtual display A and AudioRecord object A, mobile phone B can create one media stream and obtain the corresponding video data and audio data, such as video data a' and audio data a' respectively.
- the mobile phone B can send the video data a' and the audio data a' to the tablet or TV connected to the mobile phone B to realize the projection of the fitness application content to the screen projection destination.
- the mobile phone B may use one of the tablet and the TV as the destination terminal for screen projection. For example, after mobile phone B determines that the user's drag intention is to drag across devices, mobile phone B may display a device list, where the device list includes the device identifier of the tablet and the device identifier of the TV. The user can select a device identifier in the device list, so that mobile phone B can determine the destination of this screen projection. If mobile phone B receives the user's selection operation on the device identifier of the TV, indicating that the user wants to project the content of the fitness application to the TV, then according to the user's selection operation, mobile phone B can send the above-mentioned video data a' and audio data a' to the TV.
- mobile phone B may determine the screen projection destination of this screen projection according to the drag direction of the drag operation performed by the user and the direction of the terminal connected to mobile phone B relative to mobile phone B.
- mobile phone B can obtain the direction of each terminal connected to mobile phone B relative to mobile phone B, and determine the terminal located in the drag direction as the destination of this screen projection. For example, suppose the tablet is located in the direction pointing to the upper edge of the mobile phone, the TV is located in the direction pointing to the right edge of the mobile phone, and the drag direction of the drag operation performed by the user is to the right.
- mobile phone B can obtain the directions of the TV and the tablet connected to mobile phone B relative to mobile phone B. According to these directions and the drag direction, mobile phone B can determine that the TV is located in the drag direction, indicating that the user wants to project the content of the fitness application to the TV; mobile phone B can then send the video data a' and the audio data a' to the TV. The direction of another terminal relative to mobile phone B can be obtained by mobile phone B using positioning technologies such as Bluetooth, ultra-wideband (UWB), and ultrasound.
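The destination-selection step described above can be sketched as follows. This is a conceptual model, not the patent's implementation: the drag direction is matched against the bearing of each connected terminal relative to the source phone. The device names, the discrete bearing values, and the fallback behavior are illustrative assumptions.

```python
def pick_destination(drag_direction, device_bearings):
    """Return the connected device whose bearing matches the drag direction.

    drag_direction: one of "up", "down", "left", "right".
    device_bearings: dict mapping device name -> bearing relative to the
    source phone (e.g. obtained via Bluetooth/UWB/ultrasound positioning).
    """
    for device, bearing in device_bearings.items():
        if bearing == drag_direction:
            return device
    # No device in the drag direction; the source could fall back to
    # showing a device list for the user to choose from.
    return None

# Example: tablet toward the phone's upper edge, TV toward its right edge.
bearings = {"tablet": "up", "tv": "right"}
assert pick_destination("right", bearings) == "tv"
assert pick_destination("up", bearings) == "tablet"
```

A drag to the right thus selects the TV, matching the fitness-application example in the text.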
- After the TV receives the video data a' and the audio data a' from mobile phone B, it can unpack and decode the data, and then render the audio and video, so as to display the fitness video on the TV, as shown in 3001 in Figure 30, and play the corresponding audio, thereby realizing the projection of the content of the fitness application of mobile phone B to the TV.
- After projecting the content of the fitness application of mobile phone B to the TV, the user opens the educational application of mobile phone B to view an educational video.
- Mobile phone B receives the user's drag operation on the video element bearing the educational video.
- the mobile phone B may create a virtual display after determining that the user's drag intention is to drag across devices, such as a virtual display B (the virtual display B may be the second virtual display in this embodiment of the application), and call the other of the two pre-created AudioRecord objects, such as AudioRecord object B (the AudioRecord object B may be the second AudioRecord object in this embodiment of the application).
- using virtual display B and AudioRecord object B, mobile phone B can create another media stream and obtain the corresponding video data and audio data, such as video data b' and audio data b' respectively. Afterwards, mobile phone B can send the video data b' and the audio data b' to the tablet or TV connected to mobile phone B to realize the projection of the educational application content to the screen projection destination.
- mobile phone B can determine the destination of this screen projection according to the user's selection operation, or according to the drag direction of the drag operation performed by the user and the direction of the terminal connected to mobile phone B relative to mobile phone B. For example, if mobile phone B receives the user's operation of selecting the tablet, or determines that the tablet is located in the drag direction, indicating that the user wants to project the content of the educational application to the tablet, mobile phone B can send the above-mentioned video data b' and audio data b' to the tablet.
- After the tablet receives the video data b' and the audio data b' from mobile phone B, it can unpack and decode the data, and then render the audio and video, so as to display the educational video on the tablet, as shown in 3002 in Figure 30, and play the corresponding audio, thereby realizing the projection of the content of the educational application of mobile phone B to the tablet.
- the contents of the fitness application and education application of mobile phone B such as interface content and audio, are respectively projected to the TV and tablet as the screen projection destination, which satisfies the user's demand for viewing the content of the fitness application and the education application at the same time.
- the mode in which the screen projection source end in scenario 1 creates multiple media streams and sends them to the same screen projection destination end to realize application content projection is called the aggregation mode.
- the mode in which the screen projection source end in scenario 2 creates multiple media streams and sends them to multiple different screen projection destination ends to realize application content projection is called the distribution mode.
- the screen projection source terminal also supports projecting one media stream created by it to multiple screen projection destinations, and this mode may be called a broadcast mode.
- the screen projection source terminal can simultaneously support the above three video distribution modes, that is, the screen projection source terminal has the ability to implement the above three video distribution modes.
- the screen projection source supports configuration among the three video distribution modes; for example, a setting interface may be provided for the user to set the mode, or the system default configuration may be used.
- the configured video distribution mode can also be understood as the above-mentioned media strategy. That is to say, the screen projection source can obtain the relevant configuration of the video distribution mode from the media policy.
- the screen projection source end has the ability to realize the above three video distribution modes. If the user sets the video distribution mode of the screen projection source end to the above aggregation mode, then after the multiple media streams are created at the screen projection source end, according to the setting, the screen projection source end can project the multiple media streams to the same screen projection destination to meet the user's multitasking needs. Take the screen projection source creating two media streams as an example.
- the screen projection source can obtain the video distribution mode as the aggregation mode according to the customized media policy in the service scheduling and policy selection module. According to the user's trigger, the screen projection source can collect the audio and video data of the first channel and the audio and video data of the second channel.
- the screen projection source end separately encodes the first channel of audio and video data and the second channel of audio and video data to realize the creation of the two media streams.
- after the encoded data is adapted by the multi-device connection management protocol, the projection source can transmit it to the same projection destination.
- the source device may assign different identifiers to different channels of audio and video data (for example, the identifier may be an identifier corresponding to the virtual display, such as the name of the virtual display, or an index allocated by the source device for the different media streams), so that the projection destination can distinguish them.
- the destination end of the screen projection can distinguish the first channel of audio and video data and the second channel of audio and video data according to the difference in the identification of the received audio and video data.
- after decoding, the two channels of audio and video data are respectively rendered, so as to realize the projection of the application content corresponding to the two media streams at the screen projection destination.
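The aggregation mode's identifier-based separation can be illustrated with a minimal sketch. This is a conceptual model rather than the patent's code: the source tags each packet of encoded audio/video with a stream identifier (such as the virtual display name or an allocated index), and the single destination groups packets by that identifier before decoding and rendering each stream. The field names are hypothetical.

```python
def demultiplex(packets):
    """Group received packets by their stream identifier."""
    streams = {}
    for packet in packets:
        streams.setdefault(packet["stream_id"], []).append(packet["payload"])
    return streams

# Two interleaved media streams arriving at one destination.
received = [
    {"stream_id": "virtual_display_A", "payload": "fitness-frame-1"},
    {"stream_id": "virtual_display_B", "payload": "education-frame-1"},
    {"stream_id": "virtual_display_A", "payload": "fitness-frame-2"},
]
streams = demultiplex(received)
assert streams["virtual_display_A"] == ["fitness-frame-1", "fitness-frame-2"]
assert streams["virtual_display_B"] == ["education-frame-1"]
```

Each recovered group would then be fed to its own decoder and rendered in its own window at the destination.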
- the projection source has the ability to implement the above three video distribution modes.
- the system defaults the video distribution mode of the projection source to the above distribution mode.
- the projection source end can project the multi-channel media stream to multiple different projection destinations to meet the multitasking needs of users.
- the screen projection source creates two media streams.
- the screen projection source can obtain the video distribution mode as the distribution mode according to the customized media policy in the service scheduling and policy selection module. According to the user's trigger, the screen projection source can collect the audio and video data of the first channel and the audio and video data of the second channel.
- the screen projection source end separately encodes the first channel of audio and video data and the second channel of audio and video data to realize the creation of the two media streams.
- after the encoded data is adapted by the multi-device connection management protocol, the projection source can transmit it to different projection destinations.
- the first channel of audio and video data is transmitted to screen projection destination 1, and the second channel of audio and video data is transmitted to screen projection destination 2.
- after receiving the corresponding audio and video data, screen projection destination 1 and screen projection destination 2 respectively decode the received audio and video data and then perform audio and video rendering, so as to realize the projection of the application content corresponding to the two media streams on screen projection destination 1 and screen projection destination 2.
- the screencasting source has the ability to realize the above three video distribution modes.
- if the system defaults the video distribution mode of the screencasting source to the above broadcast mode, then the screencasting source can create one media stream and, according to the setting, project this media stream to multiple different projection destinations.
- the screen projection source can obtain the video distribution mode as the broadcast mode according to the customized media strategy in the service scheduling and strategy selection module.
- the screen projection source can collect single-channel audio and video data according to the user's trigger.
- the source end of the screen projection performs audio and video encoding on the audio and video data to realize the creation of a media stream.
- the projection source can transmit to different projection destinations after being adapted by the multi-device connection management protocol.
- the audio and video data of this channel can be transmitted to the projection destination 1 and the projection destination 2.
- projection destination 1 and projection destination 2 respectively decode the received audio and video data and then perform audio and video rendering, so as to realize the projection of the application content corresponding to this media stream on projection destination 1 and projection destination 2.
- the screen projection source end when the screen projection source end has the ability to implement the above three video distribution modes, the screen projection source end can also determine the video distribution mode according to the number of devices connected to it. For example, if there is one device connected to the screen projection source, the screen projection source can determine that the video distribution mode is the aggregation mode. For the created multi-channel media stream, the screen projection source end can project the multi-channel media stream to the device, so as to realize the projection of different application contents in the screen projection source end on the same screen projection destination end, as shown in the above scenario 1. For another example, if there are multiple devices connected to the screen projection source, the screen projection source may determine that the video distribution mode is the distribution mode.
- the screen projection source can project the multi-channel media streams to different devices, so as to realize the projection of different application contents in the screen projection source to different screen projection destinations, as shown in the above scenario 2.
- the screencasting source can also determine the video distribution mode according to the drag directions when the user performs drag operations for different applications.
- For example, if the user drags in different directions for different applications, the projection source can determine that the video distribution mode is the distribution mode; therefore, for the created multiple media streams, the projection source can project the media streams to different devices.
- If the user drags in the same direction for different applications, the projection source can determine that the video distribution mode is the aggregation mode; therefore, for the created multiple media streams, the projection source can project the media streams to the same device.
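The mode-selection rules above can be summarized in a short sketch. This is an illustrative model only: the mode names mirror the text, while the decision order (drag directions first, then connected-device count) is an assumption made for the example.

```python
def select_mode(num_devices, drag_directions=None):
    """Pick a video distribution mode.

    num_devices: number of devices connected to the projection source.
    drag_directions: optional list of drag directions the user performed
    for different applications (cross-device drag scenario).
    """
    if drag_directions is not None and len(drag_directions) > 1:
        if len(set(drag_directions)) > 1:
            # Different directions for different apps -> different destinations.
            return "distribution"
        return "aggregation"
    # Otherwise decide by how many destination devices are connected.
    return "aggregation" if num_devices == 1 else "distribution"

assert select_mode(1) == "aggregation"            # one connected device
assert select_mode(2) == "distribution"           # several connected devices
assert select_mode(2, ["right", "up"]) == "distribution"
assert select_mode(2, ["right", "right"]) == "aggregation"
```

The broadcast mode is not covered by this sketch, since the text describes it as a configured choice rather than one inferred from device count or drag direction.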
- a terminal serving as a screencasting source can create multiple media streams to realize the projection of the contents of multiple applications of the terminal to one or more screencasting destinations. It satisfies the requirement of multitasking and parallelism, which can improve the use efficiency of the terminal and improve the user experience.
- by screen recording the content at the screen projection source end, encoding it, and buffering it in the local cache, the display of the content of the screen projection source end at the screen projection destination end can be realized, supporting both mirror projection and heterogeneous projection.
- Third-party applications can integrate the corresponding screen projection capabilities (for example, by providing a dll library or an aar package) and call the API interface of the Distributed Multimedia Protocol (DMP) to realize screen projection, so that online video projection can be realized.
- Mirror projection means that the audio and video rendered by the destination end of the projection screen are exactly the same as the source end of the projection screen.
- for example, if a picture, audio, or video is opened at the screen projection source end, the destination end also displays the picture and plays the audio or video;
- Heterogeneous projection means projecting a specific application, such as an application interface or a window, to the projection destination, which can achieve both sharing and privacy protection.
- the display of multiple projection contents can be realized on the device.
- the user datagram protocol (UDP) and the forward error correction (FEC) protocol can be used to realize the transmission of the media stream from the source end to the screen projection destination end, which can effectively alleviate packet loss and avoid congestion.
- the invalidate reference frame (IFR) technology can be used to ensure fast recovery after packet loss and avoid blurred screens and long freezes.
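The role of FEC in this transport can be shown with a generic illustration (not the specific FEC scheme the patent uses): alongside a group of media packets the sender transmits one XOR parity packet, so the receiver can rebuild any single lost packet of the group without waiting for a retransmission.

```python
def xor_bytes(a, b):
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(packets):
    """Compute the XOR parity packet over a group of media packets."""
    parity = packets[0]
    for p in packets[1:]:
        parity = xor_bytes(parity, p)
    return parity

def recover(received, parity):
    """Rebuild the single missing packet from the survivors and the parity."""
    missing = parity
    for p in received:
        missing = xor_bytes(missing, p)
    return missing

group = [b"pkt1", b"pkt2", b"pkt3"]
parity = make_parity(group)
# Suppose pkt2 is lost in transit; XOR of the survivors with the parity restores it.
assert recover([group[0], group[2]], parity) == b"pkt2"
```

Real FEC schemes use stronger codes that tolerate multiple losses per group, but the recovery principle over unreliable UDP transport is the same.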
- An embodiment of the present application further provides a screen projection device, and the device can be applied to an electronic device, such as the first terminal or the second terminal in the foregoing embodiment.
- the device may include: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions to cause the screen projection device to implement the functions or steps performed by the first terminal or the second terminal (such as a television set or a mobile phone) in the above method embodiments.
- An embodiment of the present application provides an electronic device (such as the above-mentioned first terminal or second terminal). The electronic device includes a display screen, one or more processors, and a memory; the display screen, the processor, and the memory are coupled; the memory is used for storing computer program code, and the computer program code includes computer instructions. When the computer instructions are executed by the electronic device, the electronic device implements each function or step performed by the first terminal (such as a TV, mobile phone A, or mobile phone B) or the second terminal (such as a mobile phone, TV, or tablet) in the above method embodiments.
- the electronic device includes but is not limited to the above-mentioned display screen, memory and one or more processors.
- the structure of the electronic device may refer to the structure of the mobile phone shown in FIG. 2 .
- An embodiment of the present application further provides a chip system. The chip system includes at least one processor 3401 and at least one interface circuit 3402.
- the processor 3401 may be the processor in the above-mentioned terminal.
- the processor 3401 and the interface circuit 3402 may be interconnected by wires.
- the processor 3401 may receive and execute computer instructions from the memory of the terminal through the interface circuit 3402.
- the terminal such as the first terminal or the second terminal described above
- the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
- Embodiments of the present application further provide a computer-readable storage medium, which is used to store computer instructions run by the above-mentioned terminal (eg, the first terminal or the second terminal).
- Embodiments of the present application further provide a computer program product, including computer instructions run by the above-mentioned terminal (eg, the first terminal or the second terminal).
- the disclosed apparatus and method may be implemented in other manners.
- the device embodiments described above are only illustrative.
- the division of the modules or units is only a logical function division; in actual implementation, there may be other division methods.
- for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
- the shown or discussed mutual coupling, direct coupling, or communication connection may be implemented through some interfaces, or through indirect coupling or communication connection of devices or units, and may be in electrical, mechanical, or other forms.
- the units described as separate components may or may not be physically separated, and the components shown as units may be one physical unit or multiple physical units; that is, they may be located in one place, or may be distributed to multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
- each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
- the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
- if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
- the technical solutions of the embodiments of the present application, in essence, or the parts that contribute to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
- the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.
Abstract
The invention relates to a screen projection method and a device, relating to the field of electronic devices. The method realizes the presentation of the display interfaces of a plurality of devices on the same device, that is, many-to-one screen projection. The specific solution comprises the following steps: a first terminal receives data from each of a plurality of second terminals; a plurality of first interfaces are displayed on the first terminal according to the data received from the plurality of second terminals, the plurality of first interfaces corresponding to the plurality of second terminals on a one-to-one basis; the content of a first interface is a mirror image of the content of a second interface displayed by the corresponding second terminal, or the content of the first interface is the same as part of the content of the second interface displayed by the corresponding second terminal.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202011425441 | 2020-12-08 | ||
| CN202011425441.8 | 2020-12-08 | ||
| CN202110182037.0A CN114610253A (zh) | 2020-12-08 | 2021-02-09 | 一种投屏方法及设备 |
| CN202110182037.0 | 2021-02-09 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022121775A1 (fr) | 2022-06-16 |
Family
ID=81857309
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2021/135158 Ceased WO2022121775A1 (fr) | 2020-12-08 | 2021-12-02 | Procédé de projection sur écran, et dispositif |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN114610253A (fr) |
| WO (1) | WO2022121775A1 (fr) |
Families Citing this family (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115052186B (zh) * | 2022-07-12 | 2023-09-15 | 北京字跳网络技术有限公司 | 投屏方法及相关设备 |
| CN115437592A (zh) * | 2022-08-03 | 2022-12-06 | 北京罗克维尔斯科技有限公司 | 多显示设备的显示方法、装置、设备、存储介质及车辆 |
| CN117675993A (zh) * | 2022-08-29 | 2024-03-08 | Oppo广东移动通信有限公司 | 跨设备接续方法、装置、存储介质及终端设备 |
| CN117896447A (zh) * | 2022-10-08 | 2024-04-16 | 广州视臻信息科技有限公司 | 一种数据传输方法、电子设备、传屏器及存储介质 |
| CN118230535B (zh) * | 2022-12-21 | 2025-09-30 | 华为技术有限公司 | 设备管理方法及电子设备 |
| CN118259995A (zh) * | 2022-12-28 | 2024-06-28 | 华为技术有限公司 | 跨设备分屏方法及相关装置 |
| CN117156190B (zh) * | 2023-04-21 | 2024-08-06 | 荣耀终端有限公司 | 投屏管理方法和装置 |
| CN116434791B (zh) * | 2023-06-12 | 2023-08-11 | 深圳福德源数码科技有限公司 | 一种用于音频播放器的配置方法及系统 |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6493008B1 (en) * | 1999-02-19 | 2002-12-10 | Canon Kabushiki Kaisha | Multi-screen display system and method |
| CN102740155A (zh) * | 2012-06-15 | 2012-10-17 | 宇龙计算机通信科技(深圳)有限公司 | 图像显示的方法及电子设备 |
| JP2014044738A (ja) * | 2013-11-05 | 2014-03-13 | Seiko Epson Corp | 画像表示装置が表示する分割画面に画像の割り当てを行う端末装置、端末装置の制御方法およびコンピュータープログラム |
| CN105516754A (zh) * | 2015-12-07 | 2016-04-20 | 小米科技有限责任公司 | 画面显示控制方法、装置及终端 |
| CN109275130A (zh) * | 2018-09-13 | 2019-01-25 | 锐捷网络股份有限公司 | 一种投屏方法、装置及存储介质 |
| CN109508162A (zh) * | 2018-10-12 | 2019-03-22 | 福建星网视易信息系统有限公司 | 一种投屏显示方法、系统及存储介质 |
| CN110191350A (zh) * | 2019-05-28 | 2019-08-30 | 上海哔哩哔哩科技有限公司 | 多端投屏方法、计算机设备及存储介质 |
| CN110515576A (zh) * | 2019-07-08 | 2019-11-29 | 华为技术有限公司 | 显示控制方法及装置 |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9176703B2 (en) * | 2012-06-29 | 2015-11-03 | Lg Electronics Inc. | Mobile terminal and method of controlling the same for screen capture |
| US9632648B2 (en) * | 2012-07-06 | 2017-04-25 | Lg Electronics Inc. | Mobile terminal, image display device and user interface provision method using the same |
| KR101218295B1 (ko) * | 2012-07-11 | 2013-01-04 | 주식회사 엘지유플러스 | 이동 단말기 및 이의 화면 공유 방법 |
| KR20140146759A (ko) * | 2013-06-18 | 2014-12-29 | 엘지전자 주식회사 | 휴대 단말기 및 그 제어 방법 |
| CN103593111A (zh) * | 2013-11-14 | 2014-02-19 | 三星电子(中国)研发中心 | 一种移动终端屏幕共享的方法和移动终端 |
| KR20150067521A (ko) * | 2013-12-10 | 2015-06-18 | 에스케이플래닛 주식회사 | 화면 공유 서비스를 위한 장치 및 방법, 이를 위한 화면 공유 시스템 |
| KR20170091303A (ko) * | 2016-02-01 | 2017-08-09 | 엘지전자 주식회사 | 단말기 및 그를 포함하는 디스플레이 시스템 |
| US11283912B2 (en) * | 2017-06-16 | 2022-03-22 | Huawei Technologies Co., Ltd. | Display method and device |
| CN109508161A (zh) * | 2017-09-15 | 2019-03-22 | 浙江绍兴苏泊尔生活电器有限公司 | 数据显示方法、装置和系统、烹饪器具 |
| CN109327728B (zh) * | 2018-11-23 | 2021-10-15 | 深圳市鹰硕技术有限公司 | 一种一对多同屏方法、装置和系统、同屏设备及存储介质 |
| CN109782949A (zh) * | 2019-01-03 | 2019-05-21 | 北京东科佳华科技有限公司 | 基于服务器及触控显示屏的智能显示系统 |
| CN110248224A (zh) * | 2019-05-24 | 2019-09-17 | 南京苏宁软件技术有限公司 | 投屏连接建立方法、装置、计算机设备和存储介质 |
| CN110381195A (zh) * | 2019-06-05 | 2019-10-25 | 华为技术有限公司 | 一种投屏显示方法及电子设备 |
| CN112968991B (zh) * | 2019-06-20 | 2022-07-29 | 华为技术有限公司 | 一种输入方法、电子设备和投屏系统 |
| CN111324327B (zh) * | 2020-02-20 | 2022-03-25 | 华为技术有限公司 | 投屏方法及终端设备 |
| CN111666055B (zh) * | 2020-04-24 | 2021-12-14 | 华为技术有限公司 | 数据的传输方法及装置 |
| CN111880870B (zh) * | 2020-06-19 | 2024-06-07 | 维沃移动通信有限公司 | 控制电子设备的方法、装置和电子设备 |
- 2021-02-09 CN CN202110182037.0A patent/CN114610253A/zh active Pending
- 2021-12-02 WO PCT/CN2021/135158 patent/WO2022121775A1/fr not_active Ceased
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115134341A (zh) * | 2022-06-27 | 2022-09-30 | 联想(北京)有限公司 | 显示方法和装置 |
| EP4529187A4 (fr) * | 2022-09-06 | 2025-08-06 | Huawei Tech Co Ltd | Procédé et système d'affichage de projection d'écran, et dispositif électronique |
| CN115607955A (zh) * | 2022-10-13 | 2023-01-17 | 蔚来汽车科技(安徽)有限公司 | 车机系统、用于实现扩展显示的方法和存储介质 |
| CN116679895A (zh) * | 2022-10-26 | 2023-09-01 | 荣耀终端有限公司 | 一种协同业务的调度方法、电子设备及协同系统 |
| CN116679895B (zh) * | 2022-10-26 | 2024-06-07 | 荣耀终端有限公司 | 一种协同业务的调度方法、电子设备及协同系统 |
| CN115866310A (zh) * | 2022-11-29 | 2023-03-28 | 深圳市酷开网络科技股份有限公司 | 一种同屏显示方法、装置、系统及存储介质 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN114610253A (zh) | 2022-06-10 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21902482 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 21902482 Country of ref document: EP Kind code of ref document: A1 |