CN113468349B - Multimedia file evaluation method, device and system - Google Patents
Multimedia file evaluation method, device and system
- Publication number
- CN113468349B CN202010246633.6A CN202010246633A
- Authority
- CN
- China
- Prior art keywords
- user
- multimedia file
- electronic device
- evaluation result
- wearable device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
- G06F16/48—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- Strategic Management (AREA)
- Finance (AREA)
- Development Economics (AREA)
- Accounting & Taxation (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- Entrepreneurship & Innovation (AREA)
- Library & Information Science (AREA)
- Databases & Information Systems (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Game Theory and Decision Science (AREA)
- Economics (AREA)
- Marketing (AREA)
- General Business, Economics & Management (AREA)
- Information Transfer Between Computers (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
Embodiments of the application provide a method, an apparatus and a system for evaluating multimedia files. In the method, an electronic device generates the user's evaluation result for a multimedia file from experience data collected while the user views the file, and shares the evaluation result with a server. Because the result is derived from each user's actual viewing experience, it is objective, authentic, accurate and fair. The method is low in cost, not limited to particular scenes, and easy to implement. By implementing the technical solutions provided by the embodiments of the application, other users can be given more reliable reference opinions and merchants can be encouraged to optimize their multimedia files, thereby improving user experience.
Description
Technical Field
The present application relates to the field of information processing and terminal technologies, and in particular, to a method, an apparatus, and a system for evaluating multimedia files.
Background
With the rapid development of Internet technology, the services that computer devices provide to users have become increasingly diverse; for example, computer devices can play multimedia files such as movies and television shows to meet users' entertainment needs. Users typically rely on other users' ratings when choosing a multimedia file to watch, and those ratings can also help merchants optimize the multimedia files themselves. User evaluation of multimedia files therefore plays an important role in the Internet field.
At present, user evaluations are obtained mainly in two ways: questionnaire surveys and audio-visual effect collection. With a questionnaire, there is a time lag between viewing the multimedia file and answering the questionnaire, so the user's impression may have changed, and there is no guarantee that the respondent actually watched the file; the accuracy and fairness of questionnaire results are therefore limited. Audio-visual effect collection relies on expensive acquisition systems, such as acoustic collection devices deployed in movie theatres, which record audience reactions such as cheering and analyze them to derive user evaluations. This approach is restricted by scene and equipment cost, raises the cost of viewing for users, offers limited accuracy, and cannot produce an individual evaluation for each user.
How to obtain accurate and objective user evaluations of multimedia files in a timely, convenient and efficient manner is a problem that remains to be solved.
Disclosure of Invention
Embodiments of the application provide a method, an apparatus and a system for evaluating a multimedia file, which can generate a user's evaluation result for the multimedia file from experience data collected while the user views it.
In a first aspect, an embodiment of the present application provides a method for evaluating a multimedia file, applied to an electronic device. In the method, the electronic device obtains related information of a multimedia file, obtains experience data of a user watching the multimedia file, generates the user's evaluation result for the multimedia file from the experience data, and sends the related information and the evaluation result to a server. The related information includes one or more of the name, duration or profile of the multimedia file, and the experience data includes one or more of the user's sign data, motion data or sound data.
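The following Java sketch illustrates, in simplified form, the flow described in the first aspect. It is illustrative only and is not part of the claimed embodiments; the types MediaInfo, ExperienceData, Evaluation and Server, and the threshold used, are assumptions introduced here for clarity rather than anything defined by the application.

```java
// Minimal sketch of the first-aspect flow, under assumed helper types.
import java.util.List;

public class EvaluationFlow {

    /** Related information of the multimedia file: name, duration, profile. */
    record MediaInfo(String name, long durationSeconds, String profile) {}

    /** Experience data sampled while the user views the file. */
    record ExperienceData(double heartRate, double skinConductance, String detectedMotion) {}

    /** The generated evaluation result, e.g. a score plus keywords. */
    record Evaluation(int score, List<String> keywords) {}

    interface Server { void upload(MediaInfo info, Evaluation evaluation); }

    // Steps of the first aspect: obtain media info, obtain experience data,
    // generate the evaluation result, send both to the server.
    void evaluateAndShare(MediaInfo info, List<ExperienceData> samples, Server server) {
        Evaluation evaluation = generateEvaluation(samples);
        server.upload(info, evaluation);
    }

    // Placeholder mapping from experience data to an evaluation result;
    // the real mapping is the per-user evaluation policy described later.
    Evaluation generateEvaluation(List<ExperienceData> samples) {
        double meanHeartRate = samples.stream()
                .mapToDouble(ExperienceData::heartRate).average().orElse(0);
        int score = meanHeartRate > 100 ? 5 : 3; // illustrative threshold only
        return new Evaluation(score, List.of(meanHeartRate > 100 ? "exciting" : "calm"));
    }
}
```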
By implementing the method of the first aspect, the electronic device can generate the user's evaluation result for a multimedia file from experience data collected while the user views the file. Because the result reflects each user's actual viewing experience, it is objective, authentic, accurate and fair. The method is also low in cost, not limited to particular scenes, and easy to implement. It can provide other users with more reliable reference opinions and help merchants optimize their multimedia files, thereby improving user experience.
In combination with the first aspect, the multimedia file may include audio and/or video content, such as movies, television shows, musical performances, songs, and the like.
In combination with the first aspect, the user's evaluation result for the multimedia file may take various forms, such as text, keywords, emoticons/stickers, scores, symbols, audio and video, and the like, which is not limited in the embodiments of the present application.
In combination with the first aspect, in some embodiments, the multimedia file viewed by the user may be played by the electronic device. The multimedia file played by the electronic device may be a local file stored in the electronic device, or may be an online file obtained by the electronic device from a network, for example, a streaming media file obtained from a server, etc.
In combination with the first aspect, in some embodiments, the multimedia file watched by the user may also be played by a large-screen device. The file played by the large-screen device may be stored locally on that device or may be an online file. An online file may be obtained from the network by the large-screen device itself, or obtained from the network by the electronic device and then sent to the large-screen device.
With reference to the first aspect, in some embodiments, experience data acquired by the electronic device may be collected by the wearable device. After the wearable device collects the experience data of the user watching the multimedia file, the electronic device may receive the experience data sent by the wearable device. Thus, the electronic device and the wearable device can jointly execute the evaluation method of the multimedia file provided by the embodiment of the application.
When experience data is collected by the wearable device, the electronic device may trigger the wearable device to collect the experience data in several ways:
(1) When the electronic device detects a first operation, it notifies the wearable device, in response to that operation, to collect experience data while the user views the multimedia file. That is, the user may manually trigger the collection when needed.
(2) When the electronic device recognizes that the user is in a scene of watching the multimedia file, it automatically triggers or notifies the wearable device to collect experience data while the user watches the file. That is, no user operation is needed, which is simpler and more convenient for the user.
The way the electronic device recognizes whether the user is in a viewing scene depends on which device plays the multimedia file. Several cases are briefly described below; an illustrative sketch follows the list:
1. The multimedia file is played by the electronic device itself. In this case, the electronic device directly knows that the user is currently in a viewing scene while it is playing the file.
2. The multimedia file is played by a large-screen device, for example a television or a projector. In this case, the electronic device can determine that the user is in a viewing scene when it receives a notification message from the large-screen device indicating that the large-screen device is playing the multimedia file.
3. The multimedia file is played by a large-screen device such as the audio-visual equipment in a movie theater. In this case, the electronic device can obtain the user's current geographical location and ticket-booking information, and determine from them whether the user is in a viewing scene.
4. The multimedia file is played by a large-screen device. In this case, the electronic device can collect surrounding environmental data and determine from it whether the user is in a viewing scene. The environmental data may include surrounding sound data, light and shadow data, image data, and the like.
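The four cases above can be summarized as a simple dispatch on the playback source. The sketch below is illustrative only; the PlaybackSource values and the boolean inputs are assumed names, not an API defined by the application.

```java
// Illustrative-only dispatch over the four scene-recognition cases above.
public class ViewingSceneDetector {

    enum PlaybackSource { SELF, LARGE_SCREEN_NOTIFYING, CINEMA, LARGE_SCREEN_SILENT }

    boolean isSelfPlaying;                 // case 1: this device is the player
    boolean largeScreenNotified;           // case 2: notification from the large-screen device
    boolean ticketMatchesCurrentLocation;  // case 3: booking info + geographical location
    boolean environmentLooksLikeViewing;   // case 4: ambient sound/light/image data

    boolean userIsViewing(PlaybackSource source) {
        switch (source) {
            case SELF:                   return isSelfPlaying;
            case LARGE_SCREEN_NOTIFYING: return largeScreenNotified;
            case CINEMA:                 return ticketMatchesCurrentLocation;
            case LARGE_SCREEN_SILENT:    return environmentLooksLikeViewing;
            default:                     return false;
        }
    }
}
```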
With reference to the first aspect, in other embodiments, experience data acquired by an electronic device may be collected by the electronic device. Thus, the electronic device can independently execute the evaluation method of the multimedia file provided by the embodiment of the application.
When experience data is collected by an electronic device, the electronic device may begin collecting the experience data in several cases:
(1) When the electronic device detects a second operation, it collects, in response to that operation, experience data while the user watches the multimedia file. That is, the user may manually trigger the collection when desired.
In some embodiments, after detecting the second operation, the electronic device collects the experience data only once it identifies that the user is in a viewing scene; that is, the sensors are not activated until the user is actually watching the multimedia file. The way the electronic device recognizes the viewing scene is described under point (2) above and is not repeated here.
(2) When the electronic device recognizes that the user is in a viewing scene, it automatically starts collecting experience data while the user watches the multimedia file. No user operation is required, which is simpler and more convenient for the user. Again, the way the electronic device recognizes the viewing scene is described under point (2) above and is not repeated here. An illustrative sketch of these two triggers follows.
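The sketch below shows the two collection triggers in simplified form; CollectionController and its callbacks are assumed names, not defined by the application. It only illustrates that manual arming (the second operation) and automatic start both wait for the viewing scene to be recognized before sensors are activated.

```java
// Illustrative sketch of manual vs. automatic start of experience-data collection.
public class CollectionController {

    private boolean armedByUser = false;   // set by the second operation (trigger 1)
    private boolean collecting = false;

    // Trigger (1): the user manually requests collection; sensors start only once
    // the viewing scene is actually recognized.
    void onSecondOperation() { armedByUser = true; }

    // Trigger (2): called whenever the device re-evaluates whether the user is viewing.
    void onViewingSceneRecognized(boolean automaticMode) {
        if (automaticMode || armedByUser) {
            collecting = true;             // activate the sensors and start sampling
        }
    }

    void onViewingSceneEnded() {
        collecting = false;                // stop sampling; hand data to evaluation
        armedByUser = false;
    }

    boolean isCollecting() { return collecting; }
}
```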
In combination with the first aspect, the way in which the electronic device obtains the related information of the multimedia file also differs according to which device plays the file. Several cases are briefly described below; an illustrative sketch follows the list:
(1) The multimedia file is played by the electronic device. In this case, if the multimedia file is a local file on the electronic device, the electronic device obtains the related information locally; if it is a streaming media file that the electronic device obtains from the network, the electronic device obtains the related information from the network.
(2) The multimedia file is played by a large screen device, which may include, for example, a television, a projector, and the like. In this case, if the multimedia file is a local file in the large-screen device, the electronic device obtains the relevant information of the multimedia file from the large-screen device. If the multimedia file is a streaming media file acquired from the network by the large screen device, the relevant information of the multimedia file is acquired from the network by the electronic device, or the relevant information of the multimedia file is acquired from the large screen device by the electronic device.
(3) The multimedia file is played by a large screen device, which may include, for example, a video-audio device in a movie theater, etc. In this case, the relevant information of the multimedia file is obtained by the electronic device from the ticket booking information of the user, or the relevant information of the multimedia file is obtained by the electronic device from a network according to the ticket booking information.
(4) The multimedia file is played by the large screen device. In this case, the related information of the multimedia file is acquired from the network by the electronic device according to the environmental data.
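The sketch below summarizes where the related information comes from in each of the four cases. It is illustrative only; the LocalStore, LargeScreen, BookingInfo and Network interfaces are hypothetical stand-ins for local storage, the large-screen device, the user's booking record and the network, not an API defined by the application.

```java
// Illustrative resolution of the multimedia file's related information per playback case.
public class MediaInfoResolver {

    interface LocalStore  { String lookup(String fileId); }
    interface LargeScreen { String queryNowPlaying(); }
    interface BookingInfo { String movieName(); }
    interface Network     { String queryByName(String name); String queryByEnvironment(byte[] env); }

    String resolve(int playbackCase, LocalStore local, LargeScreen screen,
                   BookingInfo booking, Network net, String fileId, byte[] environmentData) {
        switch (playbackCase) {
            case 1: return local.lookup(fileId);                     // played by this device (local file;
                                                                     // a streaming file would come from the network)
            case 2: return screen.queryNowPlaying();                 // played by a TV/projector
            case 3: return net.queryByName(booking.movieName());     // cinema: derived from booking info
            case 4: return net.queryByEnvironment(environmentData);  // derived from ambient data
            default: throw new IllegalArgumentException("unknown case " + playbackCase);
        }
    }
}
```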
In combination with the first aspect, in some embodiments the electronic device may further display a first user interface and refresh it after acquiring the experience data. The refreshed first user interface includes a first area that displays the experience data or processed experience data, so that the user can see his or her viewing experience in real time.
In some embodiments, after generating the user's evaluation result for the multimedia file from the experience data, the electronic device may also display the evaluation result in the first area, so that the user can see his or her own evaluation of the multimedia file.
With reference to the first aspect, in some embodiments the electronic device may generate the user's evaluation result for the multimedia file from the experience data and an evaluation policy. An evaluation policy describes the association between experience data and user experience, and different users correspond to different evaluation policies. Using a user-specific policy makes the resulting evaluation fit each user's actual situation more closely, so it is more objective and authentic. In some embodiments, the evaluation policy may be personalized based on one or more of the user's gender, age and physical constitution.
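As a concrete illustration of a per-user evaluation policy, the sketch below scores a viewing session from the user's heart-rate rise relative to that user's own resting baseline. The rule, thresholds and profile fields are assumptions made for illustration; the application does not prescribe any particular mapping.

```java
// Illustrative per-user evaluation policy based on a baseline-relative heart-rate rule.
public class EvaluationPolicy {

    final int age;
    final String gender;
    final double restingHeartRate;   // user-specific baseline

    EvaluationPolicy(int age, String gender, double restingHeartRate) {
        this.age = age;
        this.gender = gender;
        this.restingHeartRate = restingHeartRate;
    }

    // Different users get different policies: the same absolute heart rate can mean
    // different levels of engagement relative to each user's own baseline.
    int scoreFromHeartRate(double meanHeartRateWhileViewing) {
        double relativeRise = (meanHeartRateWhileViewing - restingHeartRate) / restingHeartRate;
        if (relativeRise > 0.30) return 5;   // strongly engaged
        if (relativeRise > 0.15) return 4;
        if (relativeRise > 0.05) return 3;
        return 2;                            // little physiological response
    }
}
```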
With reference to the first aspect, in some embodiments the electronic device may further correct the generated evaluation result in response to a received third operation. The evaluation result uploaded to the server may therefore be either the result generated by the electronic device or the corrected result. Allowing the user to correct the result keeps the final evaluation consistent with the user's actual experience and improves its authenticity.
In a second aspect, an embodiment of the present application provides a method for evaluating a multimedia file, applied to a wearable device. In the method, the wearable device collects experience data while a user watches the multimedia file and sends the experience data to an electronic device, which uses the data to generate the user's evaluation result for the multimedia file.
By implementing the method of the second aspect, the wearable device can collect experience data while the user views the multimedia file, so that the electronic device automatically generates the user's evaluation result from that data. Because the result reflects each user's actual viewing experience, it is objective, authentic, accurate and fair. The method is also low in cost, not limited to particular scenes, and easy to implement. It can provide other users with more reliable reference opinions and help merchants optimize their multimedia files, thereby improving user experience.
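A minimal sketch of the wearable-side behaviour of the second aspect is given below: sample sensors while the viewing scene is active and forward the samples to the paired electronic device. The SensorHub and Link interfaces are assumptions used only for illustration; in practice the link could be, for example, the Bluetooth or WLAN connection described later.

```java
// Illustrative wearable-side collection and forwarding of experience data.
import java.util.ArrayList;
import java.util.List;

public class WearableCollector {

    interface SensorHub { double heartRate(); double skinConductance(); }
    interface Link      { void send(List<double[]> samples); } // e.g. a Bluetooth channel

    private final List<double[]> samples = new ArrayList<>();

    // Called periodically while the viewing scene is active.
    void sample(SensorHub sensors) {
        samples.add(new double[] { sensors.heartRate(), sensors.skinConductance() });
    }

    // Called when the multimedia file ends (or periodically) to hand the data
    // to the electronic device, which generates the evaluation result.
    void flush(Link linkToElectronicDevice) {
        linkToElectronicDevice.send(new ArrayList<>(samples));
        samples.clear();
    }
}
```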
With reference to the second aspect, in some embodiments, the wearable device may collect experience data when the user views the multimedia file in three cases:
(1) When the wearable device detects a fourth operation, it starts, in response to that operation, to collect experience data while the user views the multimedia file.
(2) After the electronic device, in response to detecting the first operation, notifies the wearable device to collect experience data, the wearable device starts to collect experience data while the user watches the multimedia file.
In case (1) or (2), in some embodiments the wearable device may, after detecting the fourth operation or receiving the notification from the electronic device, first identify whether the user is in a viewing scene and collect the experience data only once it identifies that the user is. The wearable device may recognize the viewing scene in either of the following two ways:
1. The wearable device can collect surrounding environment data and judge whether the user is in a scene of watching the multimedia file according to the environment data. The environmental data may include surrounding sound data, light and shadow data, image data, and the like.
2. The electronic device identifies that the user is in a viewing scene and then notifies the wearable device. For how the electronic device identifies the viewing scene, refer to the related description of the first aspect, which is not repeated here.
(3) When the wearable device itself recognizes that the user is in a viewing scene, it starts to collect experience data while the user watches the multimedia file. The wearable device recognizes the viewing scene in the two ways described above.
In a third aspect, an embodiment of the present application provides an electronic device including one or more processors, a memory and a display screen, the memory and the display screen being coupled to the one or more processors. The memory is configured to store computer program code comprising computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to perform the method of the first aspect or any of its embodiments.
In a fourth aspect, an embodiment of the present application provides a wearable device including one or more processors and a memory coupled to the one or more processors. The memory is configured to store computer program code comprising computer instructions, and the one or more processors invoke the computer instructions to cause the wearable device to perform the method of the second aspect or any of its embodiments.
In a fifth aspect, an embodiment of the present application provides a communication system that includes a wearable device and an electronic device connected to each other. The wearable device is configured to collect experience data while a user watches a multimedia file and send the experience data to the electronic device. The electronic device is configured to obtain related information of the multimedia file, receive the experience data sent by the wearable device, generate the user's evaluation result for the multimedia file from the experience data, and send the evaluation result, or a corrected evaluation result, to a server. The experience data includes one or more of the user's sign data, motion data or sound data, and the related information of the multimedia file includes one or more of its name, duration or profile.
The electronic device in the fifth aspect may perform the steps performed by the electronic device when the evaluation method is carried out together with the wearable device, as described in the first aspect. The wearable device in the fifth aspect may be the wearable device described in the fourth aspect, and may perform the method of the second aspect or any of its implementations.
In a sixth aspect, embodiments of the present application provide a computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform a method as described in the first aspect and any possible implementation of the first aspect.
In a seventh aspect, embodiments of the present application provide a computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform a method as described in the second aspect and any possible implementation of the second aspect.
In an eighth aspect, an embodiment of the present application provides a computer readable storage medium, comprising instructions which, when executed on an electronic device, cause the electronic device to perform a method as described in the first aspect and any possible implementation manner of the first aspect.
In a ninth aspect, embodiments of the present application provide a computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform a method as described in the second aspect and any possible implementation of the second aspect.
By implementing the technical solutions provided by the embodiments of the application, the electronic device can generate the user's evaluation result for a multimedia file from experience data collected while the user views the file. Because the result reflects each user's actual viewing experience, it is objective, authentic, accurate and fair. The method is also low in cost, not limited to particular scenes, and easy to implement. It can provide other users with more reliable reference opinions and help merchants optimize their multimedia files, thereby improving user experience.
Drawings
Fig. 1 is a schematic structural diagram of a communication system according to an embodiment of the present application;
Fig. 2 is a schematic structural diagram of a wearable device according to an embodiment of the present application;
Fig. 3A is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application;
Fig. 3B is a schematic software structure diagram of an electronic device according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of a large-screen device according to an embodiment of the present application;
Figs. 5A-5B are schematic diagrams of turning on the "movie-viewing mode" according to an embodiment of the present application;
Figs. 5C-5E are schematic flow diagrams of an electronic device notifying a wearable device to turn on the "movie-viewing mode" according to an embodiment of the present application;
Figs. 6A-6I are a set of user interface diagrams implemented on a wearable device according to an embodiment of the present application;
Figs. 7A-7G are a set of user interface diagrams implemented on an electronic device according to an embodiment of the present application;
Figs. 7H-7J are another set of user interface diagrams implemented on an electronic device according to an embodiment of the present application;
Fig. 8 is a flowchart of a method for evaluating a multimedia file according to an embodiment of the present application;
Fig. 9 is a flowchart of another method for evaluating a multimedia file according to an embodiment of the present application;
Fig. 10 is a flowchart of another method for evaluating a multimedia file according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described in detail below with reference to the accompanying drawings. In the description of the embodiments, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" covers three cases: A alone, both A and B, and B alone. In addition, "a plurality of" means two or more.
The terms "first", "second" and the like are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features.
Embodiments of the application provide a method, an apparatus and a system for evaluating multimedia files. In the method, the wearable device can collect experience data in real time while the user watches a multimedia file, and the electronic device or the wearable device can automatically generate the user's evaluation result for the multimedia file from that data. After the user corrects or confirms the evaluation result, the electronic device or the wearable device may share it to a website or to other users.
The multimedia files may include audio and/or video content, such as movies, television shows, musical performances, songs, and the like.
The user's evaluation result for the multimedia file may take various forms, such as text, keywords, emoticons/stickers, scores, symbols, audio and video, and the like, which is not limited in the embodiments of the present application.
Experience data of a user viewing a multimedia file may include, but is not limited to, one or more of sign data, motion data, sound data, etc. of the user viewing the multimedia file. The sign data may include one or more of blood glucose, heart rate, blood pressure, body surface temperature, skin conductance signals, eye movement signals, or brain microcurrent signals. The motion data may include one or more of hand motion, limb motion, torso motion, head motion, and the like. The sound data may include voiceprints, intonation, mood, language, etc. of the sound.
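One possible way to group these three categories of experience data into a single sample structure is sketched below. The field names and types are illustrative assumptions, not definitions from the application.

```java
// Illustrative container for one experience-data sample taken during playback.
import java.util.List;

public class ExperienceSample {

    // Sign data: physiological readings
    double bloodGlucose, heartRate, systolicPressure, bodySurfaceTemperature,
           skinConductance, eyeMovement, brainMicroCurrent;

    // Motion data: hand/limb/torso/head movement, e.g. as accelerometer vectors
    List<float[]> handMotion, limbMotion, torsoMotion, headMotion;

    // Sound data: features extracted from the user's voice
    byte[] voiceprint;
    String intonation, mood, language;

    long timestampMillis;   // when the sample was taken during playback
}
```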
The multimedia file evaluation method provided by the embodiments of the application generates the user's evaluation result from each user's real-time, genuine feelings while viewing, and the result is therefore objective, accurate, authentic and fair. The method is also low in cost, not limited to particular scenes, and easy to implement. It can provide other users with more reliable reference opinions and help merchants optimize their multimedia files, thereby improving user experience.
In the multimedia file evaluation method provided by the embodiments of the application, when the "movie-viewing mode" of the wearable device is enabled, the wearable device can automatically collect the user's experience data when the user is in a scene of watching a multimedia file. The wearable device can then either generate the user's evaluation result for the multimedia file from the experience data itself, or send the experience data to the electronic device, which generates the evaluation result. How the wearable device collects the experience data, and how the wearable device or the electronic device generates the evaluation result from it, are described in detail in the subsequent method embodiments and are not repeated here.
In the following embodiments of the application, the "movie-viewing mode" may be a service or function provided by the wearable device and/or the electronic device. It supports automatically collecting the user's experience data when the user is in a scene of watching a multimedia file, and either generating the user's evaluation result from that data on the wearable device, or sending the data to the electronic device so that the electronic device generates the evaluation result.
It should be understood that "movie-viewing mode" is merely a term used in the embodiments; its meaning has been described above, and its name should not be construed as limiting the embodiments. In other embodiments of the application, the "movie-viewing mode" may also be called by other names, such as "intelligent film review".
Next, first, a communication system provided by an embodiment of the present application will be described.
Referring to fig. 1, fig. 1 illustrates the structure of a communication system 10 provided in an embodiment of the present application. As shown in fig. 1, communication system 10 may include a wearable device 100 and an electronic device 200. The wearable device 100 and the electronic device 200 may be connected and communicate through a wireless communication technology such as Bluetooth (BT), wireless local area network (WLAN) technology or wireless fidelity (Wi-Fi), or through a wired connection via a physical interface (e.g., a USB interface, an HDMI interface, etc.).
The electronic device 200 may be used to play multimedia files. The multimedia file may be a local file stored in the electronic device 200, or may be an online file obtained by the electronic device 200 from a network, for example, a streaming file obtained by the electronic device 200 from a server providing the online file, or the like.
In the embodiment of the present application, the electronic device 200 may be a portable terminal device, such as a mobile phone, a tablet computer, or a non-portable terminal device such as a Laptop computer (Laptop) with a touch-sensitive surface or a touch-sensitive panel, a desktop computer with a touch-sensitive surface or a touch-sensitive panel, etc. that is equipped with iOS, android, microsoft or other operating systems. In some embodiments, the electronic device 200 may be further configured to receive experience data sent by the wearable device 100, and automatically generate a result of evaluating the multimedia file by the user according to the experience data. In some embodiments, the electronic device 200 may also be used to share the user's evaluation of the multimedia file to a website or other user.
The wearable device 100 may be used to automatically collect user experience data when the user is in a scene of viewing a multimedia file. In some embodiments, the wearable device 100 may be further configured to automatically generate a user's evaluation result for the multimedia file according to the experience data. In some embodiments, the wearable device 100 may also be used to share the user's evaluation of the multimedia file to a website or other user.
In an embodiment of the present application, the wearable device 100 may be worn on the body of a user and configured with a plurality of sensors for collecting user experience data. The wearable device 100 may be smart glasses, a smart helmet, a smart watch, a bracelet, a wristband, etc. The smart glasses or smart helmet may be, for example, a virtual reality (VR) device or the like.
That is, the system formed by the electronic device 200 and the wearable device 100 can automatically generate the user's evaluation result for a multimedia file in a scene where the user views the file at home on a device such as a mobile phone or tablet computer.
Without limitation, in other embodiments of the present application, electronic device 200 in system 10 may not be used to play multimedia files and system 10 may also include other devices for playing multimedia files, such as large screen device 300. The multimedia files played by the large screen device 300 may be locally stored files, streaming media files obtained from a network such as a server providing online files, or the like. When the large screen device 300 plays the online file, the online file may be obtained by the large screen device 300 from the network, or may be obtained by the electronic device 200 from the network and then sent to the large screen device 300. The large screen device 300 may be a television, a projection device, a video playback device in a movie theater, etc. In this way, the system 10 can automatically generate a rating result for a multimedia file for a user in a scene where the user views the multimedia file at home using a television set or views a movie at a movie theater. In some embodiments, the large screen device 300 and the electronic device 200 may be connected and communicate via Bluetooth, WLAN, or other wireless communication technology.
Referring to fig. 2, fig. 2 shows a schematic structural diagram of a wearable device 100 according to an embodiment of the present application.
As shown in fig. 2, the wearable device 100 may include a processor 101, a memory 102, a wireless communication processing module 103, a sensor module 104, and a display screen 105. The processor 101, memory 102, wireless communication processing module 103, sensor module 104, and display screen 105 may be connected by a bus. Wherein:
The processor 101 may be used to read and execute computer readable instructions. In a specific implementation, the processor 101 may mainly include a controller, an operator, and a register. The controller is mainly responsible for instruction decoding and sending out control signals for operations corresponding to the instructions. The arithmetic unit is mainly responsible for performing fixed-point or floating-point arithmetic operations, shift operations, logic operations, and the like, and may also perform address operations and conversions. The register is mainly responsible for storing register operands, intermediate operation results and the like temporarily stored in the instruction execution process. In a specific implementation, the hardware architecture of the processor 101 may be an Application Specific Integrated Circuit (ASIC) architecture, a MIPS architecture, an ARM architecture, an NP architecture, or the like.
In some embodiments, the processor 101 may be configured to process experience data of the user collected by the sensor module 104, and generate a result of evaluating the multimedia file by the user according to the experience data.
Memory 102 is coupled to processor 101 for storing various software programs and/or sets of instructions. In particular implementations, memory 102 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 102 may store an operating system. The memory 102 may also store a communication program that can be used to communicate with other devices (e.g., the electronic device 200).
The wireless communication processing module 103 may include one or more of a Bluetooth (BT) communication processing module and a WLAN communication processing module. The wireless communication processing module 103 enables the wearable device 100 to establish a wireless communication connection with other devices (e.g., the electronic device 200) and communicate with them.
The wireless communication processing module 103 may also include a cellular mobile communication processing module (not shown). The cellular mobile communications processing module may communicate with other devices (e.g., servers) via cellular mobile communications technology.
The sensor module 104 may be used to collect experience data of a user viewing a multimedia file. The sensor module 104 may include three general categories of sensors: motion sensors, biological sensors, and environmental sensors.
The motion type sensor is used to collect motion data of the wearable device 100. The motion sensor may include an acceleration sensor, a gyroscopic sensor, a magnetometer, a pressure sensor, etc.
The acceleration sensor may detect the magnitude of acceleration of the wearable device 100 in various directions (typically along three axes), and can detect the magnitude and direction of gravity when the wearable device 100 is stationary. It can also be used to recognize the posture of the device, and is applied to landscape/portrait switching, pedometers, and other applications.
The gyroscopic sensor may be used to determine a motion pose of the wearable device 100. In some embodiments, the angular velocity of the wearable device 100 about three axes (i.e., x, y, and z axes) may be determined by a gyroscopic sensor.
The magnetic sensor includes a hall sensor. The wearable device 100 can detect the opening and closing of the flip holster using the magnetic sensor.
The pressure sensor is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor may be provided on the display screen. When a touch operation is applied to the display screen, the wearable device 100 detects the intensity of the touch operation through the pressure sensor, and may also calculate the location of the touch from the pressure sensor's detection signal.
The biometric sensor is used to collect vital sign data of a user wearing the wearable device 100. The biological sensor may include a blood glucose sensor, a blood pressure sensor, a heart sensor, a muscle sensor, a body temperature sensor, a brain wave sensor, a bone sensor, and the like.
The blood glucose sensor may be used to collect the user's blood glucose value. The blood pressure sensor may be used to collect the user's blood pressure value. The electrocardiograph may be used to collect the user's heart rate. The myoelectric sensor may be used to collect the electrical conductivity of the user's skin. The body temperature sensor may be used to collect the user's body surface temperature. The brain wave sensor may be used to collect micro-current values of the user's brain. The bone conduction sensor may acquire a vibration signal. In some embodiments, the processor 101 may parse heart rate information from the blood pressure pulse signal acquired by the bone conduction sensor to implement a heart rate detection function.
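A rough sketch of how heart rate might be derived from such a pulse-beat signal is given below: count threshold crossings over a sampling window. The threshold logic is a simplification for illustration and is not the signal processing actually performed by the processor 101.

```java
// Illustrative heart-rate estimate from a sampled pulse waveform.
public class HeartRateEstimator {

    // samples: pulse waveform; sampleRateHz: sampling frequency of the sensor
    static double estimateBpm(double[] samples, double sampleRateHz) {
        int beats = 0;
        boolean above = false;
        double threshold = mean(samples);   // crude adaptive threshold
        for (double s : samples) {
            if (!above && s > threshold) { beats++; above = true; }   // rising edge = one beat
            else if (above && s < threshold) { above = false; }
        }
        double seconds = samples.length / sampleRateHz;
        return seconds == 0 ? 0 : beats * 60.0 / seconds;
    }

    static double mean(double[] xs) {
        double sum = 0;
        for (double x : xs) sum += x;
        return xs.length == 0 ? 0 : sum / xs.length;
    }
}
```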
Environmental sensors are used to collect data about the surroundings of the wearable device 100, and may include a temperature sensor, a proximity light sensor, a barometric pressure sensor, an ambient light sensor, and a microphone.
A temperature sensor may be used to detect the external ambient temperature. An ambient light sensor may be used to sense ambient light level.
The barometric pressure sensor is used to measure air pressure. In some embodiments, the wearable device 100 calculates altitude from the barometric pressure value measured by the barometric pressure sensor, to aid positioning and navigation.
Microphones, also known as "microphones" and "microphones", are used to collect sound signals and may also be used to convert sound signals into electrical signals.
The proximity light sensor may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The wearable device 100 emits infrared light outwards through the light emitting diode. The wearable device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the wearable device 100. When insufficient reflected light is detected, the wearable device 100 may determine that there is no object in the vicinity of the wearable device 100.
The display screen 105 may be used to display images, videos, and the like. The display screen 105 may be a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AMOLED) display, a flexible light-emitting diode (FLED) display, a quantum dot light-emitting diode (QLED) display, or the like.
It is to be understood that the structure illustrated in fig. 2 does not constitute a specific limitation on the wearable device 100. In other embodiments of the application, the wearable device 100 may include more or fewer components than shown, for example it may also include some physical keys or the like. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Referring to fig. 3A, fig. 3A is a schematic structural diagram of an electronic device 200 according to an embodiment of the present application.
It should be understood that the electronic device 200 shown in fig. 3A is only one example, and that the electronic device 200 may have more or fewer components than shown in fig. 3A, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
As shown in fig. 3A, the electronic device 200 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and so on. A memory may also be provided in the processor 110 for storing instructions and data.
The charge management module 140 is configured to receive a charge input from a charger. The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 200 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 200 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example, the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied on the electronic device 200. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. That is, the electronic device 200 may be used to play multimedia files. In some embodiments, the modem processor may be a stand-alone device.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 200, including wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 of the electronic device 200 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 200 may communicate with a network and other devices via wireless communication techniques. The wireless communication techniques may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
The electronic device 200 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information. The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The electronic device 200 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like. The ISP is used to process data fed back by the camera 193. The camera 193 is used to capture still images or video.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 200.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 200 and data processing by executing instructions stored in the internal memory 121.
The electronic device 200 may implement audio functions through the audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, and application processor, etc.
The function of each sensor is similar to that of the wearable device 100 shown in fig. 2, and reference is made to the related description.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195 to enable contact and separation with the electronic device 200. The electronic device 200 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, micro SIM cards, and the like. The same SIM card interface 195 may be used to insert multiple cards simultaneously.
The software system of the electronic device 200 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the invention, taking an Android system with a layered architecture as an example, a software structure of the electronic device 200 is illustrated.
Fig. 3B is a software architecture block diagram of the electronic device 200 according to an embodiment of the invention.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3B, the application package may include a camera, gallery, calendar, call, map, navigation, WLAN, bluetooth, music, video, short message, an application for performing the method provided by the embodiment of the present application, and so on.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3B, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views.
The telephony manager is used to provide the communication functions of the electronic device 200, for example, management of the call status (including connected, hung up, and the like).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which automatically disappear after a short stay without requiring user interaction.
The Android runtime includes a core library and virtual machines. The Android runtime is responsible for scheduling and management of the Android system.
The core library comprises two parts, wherein one part is a function required to be called by java language, and the other part is an android core library.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. Such as surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and so on.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a large screen device 300 according to an embodiment of the present application.
As shown in fig. 4, the large screen apparatus 300 may include a processor 122, a memory 123, a wireless communication processing module 124, a power switch 125, a wired LAN communication processing module 126, an HDMI communication processing module 127, a USB communication processing module 128, and a display screen 129. Wherein:
Processor 122 may be used to read and execute computer readable instructions. In a specific implementation, processor 122 may mainly include a controller, an arithmetic unit, and registers. The controller is mainly responsible for instruction decoding and for sending out control signals for the operations corresponding to the instructions. The arithmetic unit is mainly responsible for performing fixed-point or floating-point arithmetic operations, shift operations, logic operations, and the like, and may also perform address operations and conversions. The registers are mainly responsible for temporarily storing operands, intermediate operation results, and the like during instruction execution. In a specific implementation, the hardware architecture of the processor 122 may be an application specific integrated circuit (ASIC) architecture, an MIPS architecture, an ARM architecture, an NP architecture, or the like.
In some embodiments, the processor 122 may be configured to parse signals received by the wireless communication processing module 124 and/or the wired LAN communication processing module 126, such as a broadcast probe request of the electronic device 200. The processor 122 may be configured to perform a corresponding processing operation according to the parsing result, for example, generating a probe response, or driving the display screen 129 to display content according to a display request or display instruction.
In some embodiments, the processor 122 may also be configured to generate signals, such as bluetooth broadcast signals, beacon signals, that are transmitted outwardly by the wireless communication processing module 124 and/or the wired LAN communication processing module 126.
Memory 123 is coupled to processor 122 for storing various software programs and/or sets of instructions. In a specific implementation, memory 123 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. Memory 123 may store an operating system, such as an embedded operating system like uCOS, VxWorks, or RTLinux. Memory 123 may also store communication programs that may be used to communicate with electronic device 200, one or more servers, or other devices.
The wireless communication processing module 124 may include one or more of a Bluetooth (BT) communication processing module 124A and a WLAN communication processing module 124B.
In some embodiments, one or more of the Bluetooth (BT) communication processing module, the WLAN communication processing module may listen for signals transmitted by other devices (e.g., electronic device 200), such as probe requests, scan signals, and the like.
In other embodiments, one or more of the Bluetooth (BT) communication processing module and the WLAN communication processing module may also transmit signals, such as broadcast Bluetooth signals, beacon signals, so that other devices (e.g., electronic device 200) may discover the large screen device 300 and establish wireless communication connections with other devices (e.g., electronic device 200).
The wireless communication processing module 124 may also include a cellular mobile communication processing module (not shown). The cellular mobile communications processing module may communicate with other devices (e.g., servers) via cellular mobile communications technology.
The power switch 125 may be used to control the power supplied by the power source to the large screen device 300.
The wired LAN communication processing module 126 may be used to communicate with other devices in the same LAN through a wired LAN, and may also be used to connect to the WAN through a wired LAN, and may communicate with devices in the WAN.
The HDMI communication processing module 127 may be used to communicate with other devices through an HDMI interface (not shown).
The USB communication processing module 128 may be used to communicate with other devices via a USB interface (not shown).
The display 129 may be used to display images, video, and the like. Display 129 may be a LCD, OLED, AMOLED, FLED, QLED or the like.
In some embodiments, the large screen device 300 may also include an audio module (not shown). The audio module may be configured to output audio signals through an audio output interface, so that the large screen device 300 supports audio playback. The audio module may also be used to receive audio data through an audio input interface. The large screen device 300 may be a media playing device such as a television.
In some embodiments, the large screen device 300 may also include a serial interface such as an RS-232 interface. The serial interface can be connected to other devices, such as audio playback devices, such as speakers, so that the display and the audio playback devices cooperate to play audio and video.
It will be appreciated that the configuration illustrated in fig. 4 does not constitute a particular limitation of the large screen apparatus 300. In other embodiments of the application, large screen device 300 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The method provided by the embodiment of the present application is described in detail below based on the communication system 10 shown in fig. 2 and the devices shown in fig. 2 to 4, and in conjunction with the user interfaces implemented on the wearable device 100 and the electronic device 200 provided by the embodiment of the present application.
The term "user interface" in the present description, claims and drawings is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of a user interface is a graphical user interface (graphic user interface, GUI), which refers to a graphically displayed user interface that is related to computer operations. The graphical user interface may include interface elements such as icons, windows, controls, etc., displayed in a display screen of the electronic device, where the controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, etc.
First, three ways of turning on the "movie mode" of the wearable device 100 provided in the embodiments of the present application are described.
(I) The wearable device 100 starts the "movie mode" under the trigger of a user operation.
The user operation for turning on the "movie mode" of the wearable device 100 may be an operation acting on the electronic device 200.
Illustratively, referring to fig. 5A, fig. 5A shows a user interface 50 displayed on an electronic device 200. The user interface 50 may be provided by an application installed on the electronic device 200 for managing a wearable device connected to the electronic device 200, such as a hand loop management application, or a "setup" application on the electronic device 200.
As shown in fig. 5A, user interface 50 may include a status bar 501, a return key 502, a current page indicator 503, a device status display area 504, controls 505, and other controls for opening corresponding functions of wearable device 100.
Status bar 501 may include a signal strength indicator for a mobile communication signal (which may also be referred to as a cellular signal), a carrier name (e.g., "China Mobile"), a signal strength indicator for a wireless fidelity (Wi-Fi) signal, a battery status indicator, a time indicator, and so forth.
The return key 502 is an application-level return key that can be used to return to the upper level of the menu. Those skilled in the art will appreciate that the logical upper level of a page is fixed and is determined at the time of application design.
The current page indicator 503 may be used to indicate the current page. For example, the text information "HUAWEI Band" may be used to indicate that the current page is used to manage the wearable device connected to the electronic device 200. The current page indicator 503 is not limited to text information and may also be an icon.
The device status display area 504 is used to display status information of the wearable device 100 connected to the electronic device 200, and may include, for example, a connection manner (such as a bluetooth icon shown in the drawing) of the wearable device 100 and the electronic device 200, a remaining power of the wearable device 100, a picture of the wearable device 100, and the like.
Control 505 may be used to monitor for user operations (e.g., touch operations, click operations, etc.). In response to a user operation acting on control 505, electronic device 200 may send a notification message to wearable device 100 through its connection with wearable device 100 to notify or instruct wearable device 100 to turn on the "movie mode". In some embodiments, referring to fig. 5B, after electronic device 200 detects a user operation on control 505, the display form of control 505 may also be changed to prompt the user that wearable device 100 has turned on the "movie mode".
It is understood that the user operation shown in fig. 5A and 5B that triggers the wearable device 100 to turn on the "movie mode" on the electronic device 200 is only an example, and the user operation may be implemented in other manners. For example, the user operation may also be a click operation on a "movie mode" switch control in a window that the electronic device 200 pops up after detecting a downward swipe gesture at the status bar 501.
Not limited to the operation on the electronic device 200, the user operation for turning on the "movie mode" of the wearable device 100 may also be an operation acting on the wearable device 100. For example, the user operation may be a double click operation on a physical key (e.g., a power key) of the wearable device 100, an operation on a switch control of "movie mode" in a user interface displayed on a display screen of the wearable device 100, or an operation to shake the wearable device 100, and the like, to which the embodiment of the present application is not limited.
In the embodiment of the present application, an operation for turning on the "movie mode" of the wearable device 100, which acts on the electronic device 200, may be referred to as a first operation.
In the embodiment of the present application, an operation for turning on the "movie mode" of the wearable device 100, which acts on the wearable device 100, may be referred to as a fourth operation.
(II) The wearable device 100 automatically starts the "viewing mode" when it recognizes that the user is in a scene of watching a multimedia file, without requiring any user operation.
Through the second mode of automatically starting the "movie watching mode", the wearable device 100 can automatically generate a corresponding evaluation result for the user when the user watches the multimedia file, so that the method is simpler and more convenient for the user.
In the embodiment of the present application, the wearable device 100 may detect environmental data and determine, according to the environmental data, whether the current user is in a scene of viewing the multimedia file. Specifically, the wearable device 100 may collect environmental data such as sound in the surrounding environment through the microphone, light and shadow through the ambient light sensor, and images through the camera, and determine, according to these data, whether the current user is in a scene of viewing the multimedia file.
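As an illustrative sketch only (not the concrete algorithm of this embodiment), the scene recognition described above could combine a few environment signals with simple thresholds. The signal fields, threshold values, and helper names below are assumptions introduced for illustration.

```kotlin
// Illustrative sketch: deciding whether the user is likely watching a multimedia
// file from environment data. Inputs and thresholds are assumptions.
data class EnvironmentSample(
    val ambientLightLux: Float,        // from the ambient light sensor
    val soundLevelDb: Float,           // from the microphone
    val soundtrackDetected: Boolean,   // e.g. audio matched a known film or show
    val screenDetectedInImage: Boolean // e.g. camera frame contains a bright screen
)

fun isViewingScene(sample: EnvironmentSample): Boolean {
    val darkRoom = sample.ambientLightLux < 50f   // dim environment, like a cinema
    val loudEnough = sample.soundLevelDb > 55f    // continuous playback audio
    // Either a recognized soundtrack, or a visible screen plus plausible light/sound.
    return sample.soundtrackDetected ||
        (sample.screenDetectedInImage && darkRoom && loudEnough)
}

fun main() {
    val sample = EnvironmentSample(12f, 62f, soundtrackDetected = false, screenDetectedInImage = true)
    println("Viewing scene: ${isViewingScene(sample)}")  // prints: Viewing scene: true
}
```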
In some embodiments, the wearable device 100 may also search the network according to the collected data to obtain related information of the multimedia file watched by the user. In an embodiment of the present application, the related information of the multimedia file may include, but is not limited to, the name, duration, content synopsis, and main cast and crew information of the multimedia file, the time node of the multimedia file currently being played, and the like.
In some embodiments, the wearable device 100 may also send the acquired related information of the multimedia file to the electronic device 200.
It can be appreciated that the second mode is applicable to any of the scenes where the user views the multimedia file, for example, where the user views the multimedia file played by the electronic device 200, or where the user views the multimedia file played by the large-screen device 300.
(III) Upon identifying that the user is in a scene of viewing the multimedia file, the electronic device 200 notifies the wearable device 100 to turn on the "viewing mode".
Specifically, the electronic device 200 may identify whether the current user is in a scene of viewing the multimedia file, and transmit a notification message to the wearable device 100 in case it is determined that the current user is in a scene of viewing the multimedia file. The notification message is used to notify the wearable device 100 to turn on the "movie mode".
After receiving the notification message, the wearable device 100 may turn on the "movie mode". In this way, when the user is in a scene of watching the multimedia file, the wearable device 100 can start the "movie mode" under the trigger of the electronic device 200, so that no user operation is required, which is simpler and more convenient for the user. How the electronic device 200 recognizes whether the current user is in a scene of viewing a multimedia file is described in detail below in connection with several different scenarios.
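The notify-and-enable exchange just described can be pictured with a small message handler. This is only a sketch; the message types and the Transport abstraction are assumptions, and in practice the devices would use their existing Bluetooth or Wi-Fi connection.

```kotlin
// Illustrative sketch of the "turn on movie mode" notification sent from the
// electronic device 200 to the wearable device 100. Message format is assumed.
sealed class DeviceMessage {
    object EnableMovieMode : DeviceMessage()
    data class MediaInfo(val name: String, val durationMin: Int, val playbackPositionMin: Int) : DeviceMessage()
}

interface Transport { fun send(msg: DeviceMessage) }

class WearableNotifier(private val toWearable: Transport) {
    // Called once the electronic device decides the user is watching a multimedia file.
    fun onViewingSceneDetected(mediaName: String, durationMin: Int, positionMin: Int) {
        toWearable.send(DeviceMessage.EnableMovieMode)
        toWearable.send(DeviceMessage.MediaInfo(mediaName, durationMin, positionMin))
    }
}

fun main() {
    val log = object : Transport { override fun send(msg: DeviceMessage) = println("-> wearable: $msg") }
    WearableNotifier(log).onViewingSceneDetected("Titanic", durationMin = 194, positionMin = 0)
}
```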
(1) In a scenario in which the electronic device 200 plays the multimedia file, the electronic device 200 may directly recognize that it is currently playing the multimedia file, thereby determining that the current user is in a scenario in which to view the multimedia file. Here, the multimedia file played by the electronic device 200 may be a local file or an online file.
In some embodiments, the electronic device 200 may also obtain information about the currently playing multimedia file. The related information of the multimedia file may be referred to the previous related description. Specifically, when the electronic device 200 plays the local file, the related information of the multimedia file may be stored in the electronic device 200, and the electronic device 200 may directly obtain the related information of the multimedia file. When the electronic device 200 plays an online file, the electronic device 200 may obtain information about the multimedia file from a network (e.g., at a video server that provides the online file).
In some embodiments, the electronic device 200 may also send the acquired information about the currently playing multimedia file to the wearable device 100.
Referring to fig. 5C, fig. 5C illustrates a flow of informing the wearable device 100 to turn on the "movie mode" after the electronic device 200 determines, in manner (1), that the current user is in a scene of viewing the multimedia file.
(2) In a scene where the large-screen device 300 (e.g., television, projector, etc.) plays a multimedia file, the electronic device 200 can learn, through a connection with the large-screen device 300, that the large-screen device 300 is currently playing the multimedia file, thereby determining that the current user is in a scene of viewing the multimedia file. Here, the multimedia file played by the large screen apparatus 300 may be a local file or an online file.
In some embodiments, the electronic device 200 may also obtain information about the multimedia file currently being played by the large screen device 300. The related information of the multimedia file may be referred to the previous related description.
When the large screen device 300 plays the local file, the related information of the multimedia file may be stored in the large screen device 300, and the electronic device 200 may obtain the related information of the multimedia file from the large screen device 300 through a connection with the large screen device 300.
When the large screen device 300 plays the online file, if the online file is obtained from the network by the large screen device 300, the large screen device 300 may also obtain the relevant information of the multimedia file from the network (such as a video server that provides the online file) and send the relevant information to the electronic device 200. If the online file is obtained by the electronic device 200 from the network (e.g., at a video server providing the online file) and sent to the large screen device 300, the electronic device 200 may obtain the relevant information of the multimedia file directly from the network (e.g., at the video server providing the online file).
In some embodiments, the electronic device 200 may also send the acquired information about the multimedia file played by the current large-screen device 300 to the wearable device 100.
Referring to fig. 5D, fig. 5D illustrates a flow of informing the wearable device 100 to turn on the "viewing mode" after the electronic device 200 determines, in manner (2), that the current user is in a scene of viewing the multimedia file.
(3) In a scenario where the large screen device 300 (e.g., an audio-visual device in a movie theater, etc.) plays a multimedia file, the electronic device 200 may learn, in conjunction with the location, ticket booking information of the user, etc., that the current user is in a scenario of viewing the multimedia file.
For example, the electronic device 200 may acquire geographical location information where the current user is located, and if the current user is in a movie theater and the ticket booking information of the user indicates corresponding movie information such as a name of a movie, an open time, etc., the electronic device 200 may determine that the current user is in a scene of viewing the multimedia file. The ticket booking information for the user may be stored locally on the electronic device 200, e.g., the ticket booking information for the user may be in a text message on the electronic device 200. The ticket booking information of the user may also be obtained by the electronic device 200 from the network, for example, the electronic device 200 may obtain the ticket booking information of the user from the network through ticket booking software.
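A minimal sketch of combining the user's location with parsed ticket booking information is given below. The Booking fields, the 200-meter cinema radius, and the assumed show-length window are illustrative assumptions, not values from this embodiment.

```kotlin
// Illustrative sketch: is the user at the cinema during a booked show?
import java.time.Duration
import java.time.LocalDateTime

data class Booking(val movieName: String, val startTime: LocalDateTime, val cinemaLat: Double, val cinemaLon: Double)

// Haversine distance between two coordinates, in meters.
fun distanceMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val r = 6_371_000.0
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val sinLat = Math.sin(dLat / 2)
    val sinLon = Math.sin(dLon / 2)
    val a = sinLat * sinLat +
        Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2)) * sinLon * sinLon
    return 2 * r * Math.asin(Math.sqrt(a))
}

fun isAtBookedShow(booking: Booking, userLat: Double, userLon: Double, now: LocalDateTime): Boolean {
    val nearCinema = distanceMeters(booking.cinemaLat, booking.cinemaLon, userLat, userLon) < 200.0
    val showEnd = booking.startTime.plus(Duration.ofHours(3))   // assumed upper bound on show length
    val duringShow = !now.isBefore(booking.startTime) && !now.isAfter(showEnd)
    return nearCinema && duringShow
}
```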
In some embodiments, the ticket booking information for the user includes information regarding the multimedia file currently being played by the large screen device 300. In some embodiments, the electronic device 200 may also search the network for information about the multimedia file currently played by the large screen device 300 according to the acquired ticket booking information.
In some embodiments, the electronic device 200 may also send the searched related information of the multimedia file played by the current large-screen device 300 to the wearable device 100.
Referring to fig. 5E, fig. 5E illustrates a flow of informing the wearable device 100 to turn on the "movie mode" after the electronic device 200 determines, in manner (3), that the current user is in a scene of viewing the multimedia file.
(4) The electronic device 200 may detect the environmental data and determine whether the current user is in a scene of viewing the multimedia file according to the environmental data.
Specifically, the electronic device 200 may collect environmental data such as sound in the surrounding environment through a microphone, light and shadow through an ambient light sensor, and images through a camera, and determine, according to these data, whether the current user is in a scene of viewing the multimedia file.
In some embodiments, the electronic device 200 may also search the network according to the collected data to obtain related information of the multimedia file watched by the user. In an embodiment of the present application, the related information of the multimedia file may include, but is not limited to, the name, duration, content synopsis, and main cast and crew information of the multimedia file, the time node of the multimedia file currently being played, and the like.
In some embodiments, the electronic device 200 may also send the acquired information about the multimedia file to the wearable device 100.
It will be appreciated that the above-described ways of identifying whether the current user is in a scene of viewing the multimedia file by the electronic device 200 of (1) - (4) can be implemented in combination, and the embodiment of the present application is not limited thereto.
It can be appreciated that when the "viewing mode" is turned on in the second or third manner, the wearable device 100 may also ask the user whether to turn on the "viewing mode" of the wearable device 100 before the "viewing mode" is turned on, and may turn on the "viewing mode" of the wearable device 100 after the user confirms that the "viewing mode" is turned on. By the method, more selectivity and awareness can be given to the user, and user experience can be improved.
After the wearable device 100 starts the "movie mode", an icon of the movie mode may be displayed on the display screen, for prompting the user that the "movie mode" is currently started. Referring to fig. 6A and 6B, fig. 6A illustrates a user interface 60 displayed by the wearable device 100 when locked. An icon 601 of the viewing mode is included in the user interface 60 for indicating that the "viewing mode" is currently turned on. Fig. 6B illustrates a main interface 61 displayed after the wearable device 100 is unlocked, where the user interface 61 includes icons of one or more applications installed on the wearable device 100, such as an icon of an instant messaging application, an icon of an application market, an icon of an address book, an icon 602 of a "view mode" may be displayed in the user interface 61 after the wearable device 100 is turned on the "view mode", and so on. The icons 601 and 602 of the "movie mode" may be used to monitor a user operation, and the electronic device 200 may display experience data of the user collected by the wearable device 100 in response to the user operation, and the description of the embodiments will be omitted herein.
In the case where the "movie mode" of the wearable device 100 is on, and the user is in a scene of viewing a multimedia file, the wearable device 100 may collect experience data when the user views the multimedia file.
After the wearable device 100 starts the "movie watching mode" through the above (first) manner, that is, under the trigger of the user operation, the wearable device 100 may collect experience data when the user views the multimedia file after determining that the user is in a scene of viewing the multimedia file. The manner in which the wearable device 100 determines that the user is in a scene of viewing the multimedia file may include two types of:
(1) The wearable device 100 may autonomously recognize whether the user is in a scene of viewing the multimedia file. Reference may be made specifically to the description related to the wearable device 100 in the above (second) manner as to whether the user is in a scene of viewing the multimedia file.
(2) The electronic device 200 may send a notification message to the wearable device 100 after determining that the current user is in a scene of viewing the multimedia file, so that the wearable device 100 knows that the current user is in a scene of viewing the multimedia file. Here, the electronic device 200 determines a manner in which the user is viewing the scene of the multimedia file, and reference is made to the related description in the above-described (third) manner.
It can be appreciated that, when the wearable device 100 determines that the user is in the scene of watching the multimedia file in the above (1) or (2), the related information of the multimedia file may also be obtained, and reference may be made to the related descriptions in the above (second) or (third) modes.
When the wearable device 100 starts the "viewing mode" in the second or third mode, because the user is already in the scene of watching the multimedia file before the wearable device 100 starts the "viewing mode", the wearable device can directly collect the experience data when the user watches the multimedia file after starting the "viewing mode".
In embodiments of the present application, wearable device 100 may continuously or periodically collect the experience data, without limitation. When the wearable device 100 periodically collects experience data, the processing pressure of the wearable device 100 can be reduced, and power consumption is saved.
In embodiments of the present application, experience data of a user viewing a multimedia file may include, but is not limited to, one or more of sign data, action data, sound data, etc. of the user viewing the multimedia file. The sign data may include one or more of blood glucose, heart rate, blood pressure, body surface temperature, skin conductance signals, eye movement signals, or brain microcurrent signals. The motion data may include one or more of hand motion, limb motion, torso motion, head motion, and the like. The sound data may include voiceprints, intonation, mood, language, etc. of the sound. Specifically, the wearable device 100 may collect acceleration through an acceleration sensor, collect angular velocity through a gyro sensor, collect a blood glucose value of a user through a blood glucose sensor, collect a blood pressure value of the user through a blood pressure sensor, collect a heart rate of the user through an electrocardio sensor, collect a conductive value of skin of the user through a myoelectric sensor, collect a body surface temperature of the user through a body temperature sensor, collect a micro-current value of brain of the user through a brain wave sensor, collect a vibration signal through a bone sensor, collect a sound of the user through a microphone, and the like.
In embodiments of the present application, wearable device 100 may record or store the collected experience data. In some embodiments, the wearable device may store the collected experience data in association with the time node of the multimedia file viewed by the current user, which may facilitate the accuracy of the subsequent generation of the user's evaluation result of the multimedia file.
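The association between collected experience data and the playback time node described above can be sketched with a small data model. The field names, units, and the ExperienceRecorder helper are assumptions used only to illustrate the idea.

```kotlin
// Illustrative data model: experience data samples tied to the time node of the
// multimedia file being viewed. Fields and units are assumptions.
data class ExperienceSample(
    val playbackTimeSec: Int,       // time node of the multimedia file when sampled
    val heartRateBpm: Float?,       // from the heart rate / ECG sensor
    val skinConductanceUs: Float?,  // from the skin conductance sensor
    val bodyTempC: Float?,          // from the body temperature sensor
    val motionMagnitude: Float?,    // derived from accelerometer / gyroscope data
    val soundLabel: String?         // e.g. "laughter", "silence", derived from the microphone
)

class ExperienceRecorder {
    private val samples = mutableListOf<ExperienceSample>()

    // Periodic collection: each sample is stored with its playback time node so the
    // later evaluation can align emotions with specific moments of the file.
    fun record(sample: ExperienceSample) { samples += sample }

    fun samplesBetween(fromSec: Int, toSec: Int): List<ExperienceSample> =
        samples.filter { it.playbackTimeSec in fromSec..toSec }
}
```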
In some embodiments of the application, wearable device 100 may display these experience data on a display screen when the user needs to view them. In addition, the wearable device 100 may display related information of the acquired multimedia file, such as a type, a name, etc., of the multimedia file on the display screen.
For example, referring to fig. 6A, 6B, and 6C, when the wearable device 100 detects a user operation acting on the icon 601 or 602 of the "movie mode", the user interface 62 may be displayed in response to the user operation.
In one possible implementation, as shown in fig. 6C, the user interface 62 displays experience data of the user collected by the wearable device 100, and the name of the multimedia file. In some embodiments, these experience data may be displayed in different time nodes. That is, the wearable device 100 may display the experience data collected at different time points, respectively.
In another possible implementation, as shown in fig. 6D, the user interface 62 may not directly display the experience data collected by the wearable device 100, but instead display data obtained after analysis of the collected experience data, such as the user's emotion changes and limb movements, for example the number of times the user was in various emotions, their duration, and the emotion at a specific point in time while viewing the multimedia file. In addition, the name of the multimedia file, such as the movie "Titanic", may also be displayed in the user interface 62. Here, the wearable device 100 may analyze the experience data according to an analysis model or algorithm that is pre-stored or obtained through big-data training, to obtain the user's emotion changes; the analysis model or algorithm may be obtained based on the correspondence between sign data, action data, sound data, and the user's emotions. Referring to table 1, table 1 exemplarily shows a possible correspondence between experience data and the user's emotions.
| Experience data | Emotion |
| --- | --- |
| Sound data characterizing large changes in the user's emotion | Actively engaged |
| Sound data characterizing little change in the user's emotion | Calm, not interested |
| Action data characterizing frequent limb movements of the user | Actively engaged |
| Action data characterizing few limb movements of the user | Calm, not interested |
| Heart rate continuously more than 30% above the resting heart rate and stable | Excited, moved |
| Heart rate variation of less than 5% | Calm, not interested |
| Skin impedance reduced by more than 30% | Relaxed and pleasant |
| Skin impedance reduced by less than 5% | Not interested |
| Blood pressure continuously more than 30% above the baseline blood pressure | Tense, engaged |
| Blood pressure change of less than 5% | Calm, not interested |
TABLE 1
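In the spirit of Table 1, the mapping from experience data to emotions can be sketched with simple threshold rules. The data class, field names, and emotion labels below are illustrative assumptions; only the 30% and 5% thresholds come from the table above.

```kotlin
// Illustrative sketch of Table 1: thresholds on physiological data mapped to emotions.
import kotlin.math.abs

data class Physio(
    val restingHr: Float, val hr: Float,       // resting and current heart rate
    val baseBp: Float, val bp: Float,          // baseline and current blood pressure
    val skinImpedanceDrop: Float               // relative drop in skin impedance, 0.0..1.0
)

fun inferEmotions(p: Physio): List<String> {
    val emotions = mutableListOf<String>()
    if (p.hr > p.restingHr * 1.30f) emotions += "excited / moved"              // HR > resting + 30%
    else if (abs(p.hr - p.restingHr) / p.restingHr < 0.05f) emotions += "calm / not interested"
    if (p.skinImpedanceDrop > 0.30f) emotions += "relaxed and pleasant"        // impedance down > 30%
    else if (p.skinImpedanceDrop < 0.05f) emotions += "not interested"
    if (p.bp > p.baseBp * 1.30f) emotions += "tense / engaged"                 // BP > baseline + 30%
    else if (abs(p.bp - p.baseBp) / p.baseBp < 0.05f) emotions += "calm / not interested"
    return emotions
}
```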
It can be appreciated that by displaying the experience data of the user on the wearable device 100, the user can grasp his own experience in real time in the process of watching the multimedia file, and the user experience is improved. Not limited to displaying the user's experience data on the wearable device 100, in some embodiments, the wearable device 100 may send the collected experience data to the electronic device 200 after the collected experience data, and the electronic device 200 displays the experience data or displays the analyzed data, which will be described in detail in the following embodiments, which will not be described herein.
In the embodiment of the present application, after acquiring the experience data when the user views the multimedia file, the wearable device 100 or the electronic device 200 may generate the user's evaluation result of the multimedia file. In the following, the wearable device 100 generating the user's evaluation result of the multimedia file is taken as an example for description; the manner in which the electronic device 200 generates the evaluation result is the same and is not described in detail later.
In some embodiments, the wearable device 100 may generate the user's evaluation result of the multimedia file in response to a user operation when the user operation is received. Illustratively, referring to FIG. 6C or FIG. 6D, a control 603 may also be included in the user interface 62. The wearable device 100 may generate the user's evaluation result of the multimedia file in response to a user operation (e.g., a click operation, a touch operation, etc.) detected on the control 603. The user operation may be input by the user at any time while viewing the multimedia file. That is, the evaluation result generated in the embodiment of the present application may be an evaluation of part or all of the multimedia file.
In other embodiments, the wearable device 100 may autonomously generate the user's evaluation result of the multimedia file when the user exits from viewing the multimedia file or has finished viewing it. The wearable device 100 may determine whether the user has exited or finished viewing the multimedia file in two ways. In the first way, the wearable device 100 detects sound data and determines, according to the sound data, whether the current user has exited or finished viewing the multimedia file. In the second way, the wearable device 100 learns from the electronic device 200 whether the current user has exited or finished viewing the multimedia file. In one embodiment, the electronic device 200 may send related information of the multimedia file to the wearable device 100 while playing the multimedia file, so that the wearable device 100 knows the playing status of the multimedia file. In another embodiment, the electronic device 200 may combine the location, the user's ticket booking information, the related information of the multimedia file, and the like to obtain the time at which the user finishes viewing the multimedia file, so as to determine whether the current user has finished viewing it. When the wearable device autonomously generates the user's evaluation result of the multimedia file, no user intervention is needed, which simplifies user operations and improves user experience.
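The two "finished viewing" checks described above can be pictured as follows; the function names, the one-minute tolerance, and the show-duration parameter are assumptions used only to show the idea.

```kotlin
// Illustrative checks for whether the user has finished viewing the multimedia file.
import java.time.LocalDateTime

// Based on playback status reported by the playing device.
fun hasFinishedViewing(playbackPositionSec: Int, durationSec: Int): Boolean =
    playbackPositionSec >= durationSec - 60     // assumed: within the last minute counts as finished

// Based on booking information: show start time plus the file's duration has passed.
fun hasShowEnded(showStart: LocalDateTime, durationMin: Long, now: LocalDateTime): Boolean =
    now.isAfter(showStart.plusMinutes(durationMin))
```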
Specifically, in the embodiment of the present application, the wearable device 100 may generate the evaluation result of the user on the multimedia file according to the acquired experience data and the evaluation policy when the user views the multimedia file. Wherein the evaluation policy indicates an association relationship or association criterion of the experience data and the evaluation result, in other words, the evaluation policy indicates a correspondence relationship between the experience data and the user's experience on the multimedia file. In general, the higher the interaction frequency between the user and the wearable device 100, the longer the interaction time, the higher the satisfaction of the user with the multimedia file.
Illustratively, referring to Table 2, Table 2 shows one possible evaluation strategy. In the evaluation strategy shown in Table 2, the user's evaluation result of the multimedia file is measured by a score out of a total of 10 points; a higher score indicates higher user satisfaction with the multimedia file. Table 2 gives a corresponding scoring specification and weight for each item of experience data. After the user's experience data is obtained, each item of experience data can be weighted according to Table 2 to obtain the user's total score for the multimedia file.
TABLE 2
For example, if the wearable device 100 collects the experience data of a user watching multimedia information and the scores corresponding to the items in Table 2 are 10, 10, 9, 7, and 8, then according to the evaluation policy shown in Table 2, the wearable device may generate the user's evaluation result of the multimedia file according to the formula (10×3 + 10×3 + 9×2 + 7×1 + 8×1) / 10 = 9.3.
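The weighted combination above can be sketched in a few lines; this is only an illustration of the calculation (Table 2's concrete scoring rows are not reproduced here), and the function and parameter names are assumed.

```kotlin
// Illustrative weighted scoring in the spirit of Table 2: per-item scores out of 10
// combined with weights, reproducing the example (10*3 + 10*3 + 9*2 + 7*1 + 8*1) / 10 = 9.3.
fun overallScore(itemScores: List<Double>, weights: List<Double>): Double {
    require(itemScores.size == weights.size) { "one weight per experience-data item" }
    return itemScores.zip(weights).sumOf { (score, weight) -> score * weight } / weights.sum()
}

fun main() {
    val scores = listOf(10.0, 10.0, 9.0, 7.0, 8.0)
    val weights = listOf(3.0, 3.0, 2.0, 1.0, 1.0)
    println(overallScore(scores, weights))   // 9.3
}
```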
The evaluation policy shown in Table 2 is merely an example. In a specific implementation, the evaluation policy according to which the wearable device 100 generates the user's evaluation result of the multimedia file may also be implemented in other forms. A few possible cases are given below.
In some embodiments of the application, the evaluation policy may also be personalized based on one or more of the user's gender, age, and physical constitution. For example, when the experience data includes skin impedance, the rate of change of skin impedance measured at the same emotion or satisfaction differs between female and male users, or between users of different ages. To compensate for such differences, the difference data may be obtained from a large amount of experimental data, and different scoring specifications may be set for the skin impedance item for users of different genders. In other words, in the embodiment of the application, different evaluation strategies can be used for different users to obtain the user's evaluation result of the multimedia file, so that the obtained evaluation result better fits the actual situation of each user, better matches the user's actual experience, and is objective and authentic.
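The personalization idea can be sketched as selecting a different scoring specification per user profile. The profile fields and the concrete thresholds below are illustrative assumptions, not values taken from this embodiment.

```kotlin
// Illustrative personalization: pick a different scoring specification for the
// skin-impedance item depending on the user's profile.
data class UserProfile(val gender: String, val age: Int)

// Impedance-drop ratio that maps to a full score for this user (assumed values).
fun impedanceFullScoreThreshold(profile: UserProfile): Float = when {
    profile.gender == "female" -> 0.25f   // assumed: a smaller drop already indicates a strong response
    profile.age >= 60          -> 0.20f   // assumed: older users show smaller changes
    else                       -> 0.30f   // default specification
}

fun impedanceItemScore(measuredDrop: Float, profile: UserProfile): Double {
    val full = impedanceFullScoreThreshold(profile)
    return (10.0 * (measuredDrop / full)).coerceIn(0.0, 10.0)
}
```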
In the embodiment of the present application, the wearable device 100 may store the evaluation policy in advance, may periodically acquire an updated evaluation policy from the network, and may correct the evaluation policy according to the actual situation of the user, which is not limited in the embodiment of the present application.
In the embodiment of the application, the evaluation result is not limited to a score, and can also be realized in other forms, such as a symbol (e.g., a star), an emoticon, text, or audio/video (e.g., a movie review). For example, the wearable device 100 may further determine keywords (such as "moving", "funny", "burning", etc.) that may be used to describe the experience or emotion of the user viewing the multimedia file according to the collected experience data and the evaluation policy, and use these keywords as the evaluation result. As another example, the wearable device 100 may further determine keywords (such as "comedy", "patriotic", "love", "family affection", etc.) for describing the multimedia file from the network according to the related information of the multimedia file, and automatically generate a piece of text describing the experience or emotion of the user viewing the multimedia file in combination with these keywords. For another example, the wearable device 100 may collect audio/video while the user views the multimedia file, or collect a section of audio/video recorded after the user views the multimedia file for evaluating it, and use the audio/video as the user's evaluation result of the multimedia file.
In practical applications, the wearable device 100 may also generate corresponding evaluation results for different time periods of the multimedia file (for example, the first 10 minutes, the middle 20 minutes, or the last 5 minutes), or for a time period containing a specific lead actor (for example, a time period in which star A appears). That is, the wearable device 100 may generate an evaluation result corresponding to a preset time period according to the experience data acquired when the user views that preset time period of the multimedia file. This improves the flexibility of the evaluation result and improves user experience.
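A sketch of per-segment evaluation is given below: samples are grouped by playback segment and each segment is scored separately. The Segment type, the generic accessors, and the segment labels are assumptions; the scoring function could be the weighted combination sketched earlier.

```kotlin
// Illustrative per-segment evaluation: group experience samples by playback segment
// (e.g. the first 10 minutes, the last 5 minutes, or scenes featuring a given actor).
data class Segment(val label: String, val fromSec: Int, val toSec: Int)

fun <S> segmentScores(
    samples: List<S>,
    timeOf: (S) -> Int,                   // playback time node of a sample, in seconds
    segments: List<Segment>,
    scoreOf: (List<S>) -> Double          // e.g. the weighted Table-2-style score
): Map<String, Double> =
    segments.associate { seg ->
        seg.label to scoreOf(samples.filter { timeOf(it) in seg.fromSec..seg.toSec })
    }
```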
In some embodiments of the present application, referring to fig. 6E, the wearable device 100 may also display a user interface 63 in the process of generating the evaluation result, prompting the user that the evaluation result is currently being generated, giving the user a better use experience.
In the embodiment of the present application, after the wearable device 100 generates the evaluation result according to the acquired experience data when the user views the multimedia file, the evaluation result may be displayed on the display screen.
In some embodiments, the wearable device 100 may display the generated evaluation result on the display screen in response to a user operation when the user operation is received. Illustratively, referring to fig. 6F, after generating the evaluation result, wearable device 100 may display user interface 64 on a display screen, and control 604 may be included in user interface 64. Referring to fig. 6F and 6G, the wearable device 100 may display the generated evaluation results in the user interface 64 in response to a user operation (e.g., a click operation, a touch operation, etc.) detected on the control 604.
In other embodiments, the wearable device 100 may also display the evaluation result directly on the display screen after generating the evaluation result. For example, the wearable device 100 may directly display the user interface as shown in fig. 6G after generating the evaluation result. By the method, user intervention is not needed, user operation can be simplified, and user experience is improved.
In an embodiment of the present application, the wearable device 100 may also modify or correct the generated evaluation result in response to a user operation. That is, after seeing the evaluation result displayed on the display screen of the wearable device 100, the user can modify the evaluation result by the user operation according to the self-demand. For example, referring to fig. 6G, a control 605 may also be included in the user interface 65, and the wearable device 100 may display a page for the user to modify the rating result and receive an operation for the user to change the rating result in response to a user operation received on the control 605. For example, when the evaluation result is a score, the user may increase or decrease the score. For another example, when the evaluation result is text, the user may add keywords to the text. The mode of modifying the evaluation result by the user is not particularly limited in the embodiment of the application. Through the correction of the evaluation result by the user, the final obtained evaluation result is consistent with the actual experience of the user, and the authenticity of the evaluation result is improved.
In the embodiment of the application, the user can share the related information and the evaluation result of the multimedia file to a website, friends in instant messaging software and the like through the wearable device 100. The evaluation result may be an evaluation result generated by the wearable device 100 or an evaluation result corrected by the user, which is not limited herein. Websites may include social networking websites, critique websites, video websites, and so forth.
Illustratively, referring to FIG. 6G, a control 606 for sharing related information and the evaluation result of the multimedia file to a website or friend may also be included in the user interface 65. In response to an operation (e.g., a touch operation or a click operation) received on control 606, wearable device 100 may display widget 607 in user interface 65. Referring to FIG. 6H, the widget 607 may include icons of one or more applications, such as icons 607A, 607B, and 607C. In some embodiments, in response to an operation (e.g., a touch operation or a click operation) detected on an icon of an application, the wearable device 100 may share related information and the evaluation result of the multimedia file to a website or friend. By way of example, referring to fig. 6I, fig. 6I illustrates one possible user interface 66 displayed on the display screen after the wearable device 100 shares related information and the evaluation result of the multimedia file to a website; the user interface 66 may be provided by a video application such as "HUAWEI Video". Not limited to the user operation received on the application icon, the wearable device 100 may also share related information and the evaluation result of the multimedia file to a website or friend in response to other forms of user operation, which is not limited by the embodiment of the present application.
As can be seen from the examples shown in the embodiments of fig. 6A to 6I, the multimedia file evaluation method provided by the embodiment of the present application can generate each user's evaluation result of the multimedia file according to the user's real experience while viewing it, and the evaluation result is objective, accurate, real, and fair. In addition, the method is low in cost, not limited by scene, convenient to implement, and simple and convenient for users.
In the embodiment of the application, after the user shares the evaluation result of the multimedia file, other users can conveniently learn about the multimedia file. In addition, after the user's evaluation result of the multimedia file is shared to a website, the merchant of the multimedia file can clip and correct the multimedia file according to the user's evaluation result, thereby improving user experience. Furthermore, after the website receives the user's evaluation results of the multimedia file, the user's preferences can be analyzed by means such as big data analysis, that is, a user profile is generated for the user, and suitable content can be recommended to the user according to the user profile, thereby improving user experience. For example, assuming that a video application (such as HUAWEI Video) receives the user's evaluation results of a plurality of multimedia files and analyzes that the user is interested in comedy, relevant comedy content can be recommended to the user the next time the user uses the video application and enters the viewing mode.
It will be appreciated that fig. 6C-6I above mainly describe a user interface implemented on the wearable device 100, and that in practical applications, some of the operations mentioned in the above embodiments may also be performed by the electronic device 200. Some of the operations that may be performed by the electronic device 200 are briefly described below in connection with a user interface implemented on the electronic device 200.
Referring to fig. 7A-7E, a user interface displayed on a display screen of the electronic device 200 is illustrated.
In some embodiments of the present application, after the "movie viewing mode" of the wearable device 100 is turned on, the wearable device 100 may collect experience data when the user views a multimedia file when determining that the user is in a scene of viewing the multimedia file, and send the collected experience data to the electronic device 200.
The wearable device 100 may also obtain information about the multimedia file viewed by the user. The manner in which the wearable device 100 obtains the relevant information of the multimedia file may refer to the relevant descriptions in the (second) and (third) manners above.
In some embodiments of the present application, after the electronic device 200 receives the experience data sent by the wearable device, the experience data may be displayed on a display screen.
For example, referring to fig. 7A and 7B, after receiving experience data sent by wearable device 100, electronic device 200 may add display area 506 in user interface 50. In one possible implementation, referring to fig. 7A, the display area 506 displays experience data of the user collected by the wearable device 100, and related information (e.g., type, name, etc.) of the multimedia file. In some embodiments, these experience data may be displayed in different time nodes. In another possible implementation, the experience data collected by the wearable device 100 may not be directly displayed in the display area 506, but the data obtained after analysis processing may be displayed, for example, the emotion change condition of the user, the limb action condition, such as the number of times the user is in various emotions, the duration, the emotion at a specific time point of viewing the multimedia file, and so on, which are obtained according to the collected experience data. Here, the manner in which the electronic device 200 analyzes the experience data to obtain the emotion is the same as the manner in which the wearable device 100 analyzes the experience data to obtain the emotion described above, and reference may be made to the related description.
In some embodiments of the present application, after the electronic device 200 obtains the experience data of the user when viewing the multimedia file, the evaluation result of the user on the multimedia file may be generated. The manner in which the electronic device 200 generates the evaluation result is the same as the manner in which the wearable device 100 generates the evaluation result of the user on the multimedia file, and the foregoing related description may be referred to.
In some embodiments, the electronic device 200 may generate a user's evaluation result of the multimedia file in response to a user operation when the user operation is received. For example, referring to fig. 7A or 7B, a control 506A may also be included in the region 506. The electronic device 200 can generate a user's evaluation of the multimedia file in response to a user operation (e.g., a click operation, a touch operation, etc.) detected on the control 506A.
In other embodiments, the electronic device 200 may autonomously generate the user's evaluation result of the multimedia file when the user exits from viewing the multimedia file or when the user has finished viewing the multimedia file.
The manner in which the electronic device 200 generates the result of the user's evaluation of the multimedia file is the same as that in which the wearable device 100 generates the result of the user's evaluation of the multimedia file, and the foregoing related description may be referred to.
In some embodiments of the present application, referring to fig. 7C, the electronic device 200 may further display, in the area 506, a prompt message in the process of generating the evaluation result, for prompting the user that the evaluation result is currently being generated, so as to give the user a better use experience. The prompt may be implemented as an icon, text, or other form, to which embodiments of the application are not limited.
In the embodiment of the present application, after the electronic device 200 generates the evaluation result according to the acquired experience data when the user views the multimedia file, the evaluation result may be displayed on the display screen.
In some embodiments, the electronic device 200 may display the generated evaluation result on the display screen in response to a user operation when the user operation is received. Illustratively, referring to fig. 7D, after generating the evaluation result, the electronic device 200 can display a control 506B in the region 506. Referring to fig. 7E, the electronic device 200 may display the generated evaluation result in the region 506 in response to a user operation (e.g., a click operation, a touch operation, etc.) detected on the control 506B. Without limitation, in other embodiments, the evaluation result may not be displayed in the area 506 of the user interface 50, but may be displayed in other interfaces, as the embodiments of the present application are not limited in this respect.
In other embodiments, the electronic device 200 may also display the evaluation result directly on the display screen after generating the evaluation result. For example, the electronic apparatus 200 may directly display the user interface as shown in fig. 7E after generating the evaluation result. By the method, user intervention is not needed, user operation can be simplified, and user experience is improved.
In an embodiment of the present application, the electronic device 200 may also modify or correct the generated evaluation result in response to a user operation. Illustratively, referring to FIG. 7E, a control 506C may also be included in the user interface 50, and the electronic device 200 may display a page for the user to modify the rating result and receive a user operation for modifying the rating result in response to a user operation received on the control 506C. The manner in which the user modifies the evaluation result on the electronic device 200 is not particularly limited in the embodiment of the present application. In the embodiment of the present application, the user operation for modifying or correcting the evaluation result may be referred to as a third user operation.
In the embodiment of the present application, the user may share the related information and the evaluation result of the multimedia file to a website, friends in the instant messaging software, and so on through the electronic device 200. The evaluation result may be an evaluation result generated by the electronic device 200 or an evaluation result corrected by the user, and is not limited thereto. For example, referring to FIG. 7E, a control 506D for sharing the results of the evaluation to a website or friend may also be displayed in the area 506 of the user interface 50. In response to an operation (e.g., touch operation, click operation) received on control 506D, electronic device 200 can display a widget 506E as shown in fig. 7F in user interface 50. The widget 506E may include icons for one or more applications. The electronic device 200 may send the related information and the evaluation result of the multimedia file to a corresponding website server or a server of the instant messaging software in response to an operation (such as a touch operation or a click operation) detected on an icon of the application program, so as to share the evaluation result to a website or a friend.
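Uploading the evaluation result to a website server, as described above, can be sketched as a simple HTTP request. The endpoint URL and the JSON field names below are hypothetical; the embodiment does not specify a wire format, and a real deployment would also handle authentication and errors.

```kotlin
// Illustrative upload of the evaluation result to a website / server.
import java.net.HttpURLConnection
import java.net.URL

fun shareEvaluation(mediaName: String, score: Double, keywords: List<String>) {
    val keywordJson = keywords.joinToString(prefix = "[", postfix = "]") { "\"$it\"" }
    val body = """{"media":"$mediaName","score":$score,"keywords":$keywordJson}"""

    // Hypothetical endpoint; a real website server would define its own API.
    val conn = URL("https://example.com/api/evaluations").openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.setRequestProperty("Content-Type", "application/json")
    conn.outputStream.use { it.write(body.toByteArray()) }
    println("server responded: ${conn.responseCode}")
    conn.disconnect()
}
```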
By way of example, referring to fig. 7G, fig. 7G illustrates one possible user interface 51 displayed on the display screen after the electronic device 200 shares related information and the evaluation result of a multimedia file to a website; the user interface 51 may be provided by a video application such as "HUAWEI Video".
In an embodiment of the present application, a user interface implemented on an electronic device for displaying the evaluation result may be referred to as a first user interface, such as the user interface 50 shown in fig. 7A to 7D. The area of the first user interface for displaying the evaluation result may also be referred to as a first area, such as the area 506 in the user interface 50 shown in fig. 7A to 7D.
Without being limited thereto, the electronic device 200 may also display the related information and the evaluation result of the multimedia file in user interfaces other than the user interface 50 shown in fig. 7A to 7D. For example, reference is made to fig. 7H to 7J, which illustrate the related information and evaluation result of a multimedia file displayed by the electronic device in another user interface.
Referring to fig. 7H, fig. 7H illustrates the user interface 70 displayed by the electronic device 200 after the screen is locked.
Referring to fig. 7I, after the "movie viewing mode" of the wearable device 100 is turned on, the electronic device 200 may acquire related information of a multimedia file viewed by a user and display the related information of the multimedia file in the user interface 70. The manner in which the electronic device 200 obtains the relevant information of the multimedia file may refer to the relevant description in the third manner in the foregoing embodiment. For example, the electronic device 200 may display the related information of the multimedia file in the user interface 70 after recognizing that the current user is in a scene of viewing the multimedia file through the detected environmental data and acquiring the related information of the multimedia file. As shown in fig. 7I, the user interface 70 displays information, such as a name, related to the multimedia file song "I am still". This may prompt the user that the current electronic device 200 has obtained information about the multimedia file being viewed by the user.
Referring to fig. 7J, after the electronic device 200 generates an evaluation result from the experience data acquired by the wearable device 100 while the user views the multimedia file, the evaluation result is displayed in the user interface 70. For the timing at which the electronic device 200 generates the evaluation result, and the timing and manner in which the electronic device 200 displays the evaluation result in the user interface 70, reference may be made to the similar descriptions of fig. 7A to 7E, which are not repeated herein.
It will be appreciated that the embodiments of fig. 6A to 6I and the embodiments of fig. 7A to 7F describe methods of evaluating a multimedia file executed on the wearable device 100 and on the electronic device 200, respectively; in a specific implementation, the various steps mentioned above may be combined. In other words, among the above-mentioned steps, some steps may be performed by the wearable device 100 and other steps may be performed by the electronic device 200. For example, the electronic device 200 may generate an evaluation result using the experience data transmitted by the wearable device 100 and then transmit the evaluation result to the wearable device 100, and the wearable device 100 displays the evaluation result on its display screen and shares it to a website. For another example, the wearable device 100 may collect the experience data and generate the evaluation result, and then send the evaluation result to the electronic device 200, and the electronic device 200 displays the evaluation result on its display screen and shares it to a website.
The method for evaluating the multimedia file according to the embodiment of the application is described in detail below with reference to a method flowchart.
Referring to fig. 8, fig. 8 shows a flow of a method for evaluating a multimedia file provided by an embodiment of the present application in which the wearable device 100 performs most of the operations. As shown in fig. 8, the method may include:
S110, the wearable device 100 turns on the "viewing mode".
The wearable device 100 may turn on the "viewing mode" in any of the (first) to (third) manners described above.
In some embodiments, the operation for turning on the "viewing mode" of the wearable device 100 may be an operation acting on the wearable device 100 or an operation acting on the electronic device 200, and reference may be made to the descriptions of the embodiments of fig. 5A and 5B.
In other embodiments, the wearable device 100 may also automatically turn on the "viewing mode" without a user operation.
Specifically, turning on the "viewing mode" may refer to activating various sensors of the wearable device 100. After the "viewing mode" is turned on, the wearable device 100 may use these sensors to collect the user's experience data when the user is in a scene of viewing a multimedia file.
S120, in a case where the user is in a scene of viewing the multimedia file, the wearable device 100 collects experience data when the user views the multimedia file.
Specifically, when the "movie mode" of the wearable device 100 is turned on, and when the user is in a scene of viewing the multimedia file, the wearable device 100 may collect experience data when the user views the multimedia file.
After the wearable device 100 starts the "movie watching mode" through the above (first) manner, that is, under the trigger of the user operation, the wearable device 100 may collect experience data when the user views the multimedia file after determining that the user is in a scene of viewing the multimedia file. The manner in which the wearable device 100 determines that the user is in a scene of viewing the multimedia file may include two types of:
(1) The wearable device 100 may autonomously recognize whether the user is in a scene of viewing a multimedia file. For details, reference may be made to the related description of the wearable device 100 in the above (second) manner.
(2) The electronic device 200 may send a notification message to the wearable device 100 after determining that the user is currently in a scene of viewing a multimedia file, so that the wearable device 100 learns that the user is currently in a scene of viewing a multimedia file. For the manner in which the electronic device 200 determines that the user is in a scene of viewing a multimedia file, reference may be made to the related description of the above (third) manner.
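Illustratively, the following sketch shows what the notification exchange of manner (2) might look like. The message fields, the queue-based channel standing in for the wireless link, and the function names are assumptions made for demonstration, not the disclosed protocol.

```python
# Illustrative sketch of manner (2): the field names and the queue-based channel
# are assumptions for demonstration, not the disclosed protocol.
import json
import queue

channel: "queue.Queue[str]" = queue.Queue()  # stand-in for the Bluetooth/Wi-Fi link


def electronic_device_notify(file_name: str, position_s: int) -> None:
    """Electronic device 200: tell the wearable the user is viewing a multimedia file."""
    channel.put(json.dumps({
        "event": "viewing_scene_detected",
        "file_name": file_name,
        "position_s": position_s,   # time node currently being played
    }))


def wearable_handle_notification() -> dict:
    """Wearable device 100: learn the current viewing scene and start collecting data."""
    msg = json.loads(channel.get())
    print(f"start collecting experience data for {msg['file_name']}")
    return msg


electronic_device_notify("I am still", position_s=125)
wearable_handle_notification()
```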
It can be appreciated that, when determining in the above manner (1) or (2) that the user is in a scene of viewing a multimedia file, the wearable device 100 may also obtain the related information of the multimedia file; reference may be made to the related descriptions of the above (second) or (third) manners.
When the wearable device 100 starts the "viewing mode" in the second or third mode, because the user is already in the scene of watching the multimedia file before the wearable device 100 starts the "viewing mode", the wearable device can directly collect the experience data when the user watches the multimedia file after starting the "viewing mode".
The manner in which the wearable device 100 collects experience data when the user views the multimedia file may refer to the foregoing related description, and will not be described herein.
In some embodiments, the wearable device 100 may display the collected experience data of the user on its display screen; see the related descriptions of fig. 6C and fig. 6D.
S130, the wearable device 100 generates an evaluation result of the user on the multimedia file according to the experience data.
In some embodiments, the wearable device 100 may generate the evaluation result of the multimedia file by the user in response to the user operation when receiving the user operation, and in particular, refer to the related descriptions of fig. 6C and 6D.
In other embodiments, the wearable device 100 may autonomously generate the user's evaluation result of the multimedia file when the user exits from viewing the multimedia file or when the user has finished viewing the multimedia file.
For the way in which the wearable device 100 generates the user's evaluation result for the multimedia file, reference may be made to the relevant description of the previous embodiments.
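As an illustration only, the sketch below shows one possible way of turning experience data into an evaluation result. It assumes the experience data is a series of heart-rate readings and that stronger deviation from a resting baseline indicates stronger engagement; the baseline, the thresholds, and the star mapping are assumptions and do not reflect the specific generation manner of the foregoing embodiments.

```python
# Minimal sketch, assuming the experience data is a list of heart-rate readings.
# The baseline, thresholds, and the 1-5 star mapping are illustrative assumptions.
from statistics import mean


def generate_evaluation(heart_rates: list[float], resting_rate: float = 70.0) -> dict:
    """Turn experience data collected during viewing into an evaluation result."""
    if not heart_rates:
        return {"stars": 0, "summary": "no experience data collected"}

    excitement = mean(hr - resting_rate for hr in heart_rates)  # average deviation
    if excitement > 20:
        stars, summary = 5, "highly engaging"
    elif excitement > 10:
        stars, summary = 4, "engaging"
    elif excitement > 0:
        stars, summary = 3, "mildly engaging"
    else:
        stars, summary = 2, "little reaction"
    return {"stars": stars, "summary": summary, "excitement": round(excitement, 1)}


# Example: average deviation 21.5 -> {'stars': 5, 'summary': 'highly engaging', ...}
print(generate_evaluation([82.0, 95.0, 101.0, 88.0]))
```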
In some embodiments of the present application, after the wearable device 100 generates the evaluation result of the multimedia file by the user, the evaluation result may be displayed on the display screen, and in particular, reference may be made to the related descriptions of fig. 6F and fig. 6G.
S140, in response to the received user operation, the wearable device 100 sends the related information of the multimedia file and the user's evaluation result of the multimedia file to a server or another device.
The evaluation result sent to the server or other devices may be an evaluation result directly generated by the wearable device 100, or may be a modified evaluation result, which is not limited in the embodiment of the present application.
Here, the server may be a server of a major website, and the other device may be a terminal device used by a friend of the user. That is, the wearable device 100 may share the user's evaluation of the multimedia file to a website or a friend.
Referring to fig. 9, fig. 9 shows a flow of a method for evaluating a multimedia file provided by an embodiment of the present application in which the electronic device 200 performs most of the operations. As shown in fig. 9, the method may include:
S210, the wearable device 100 turns on the "viewing mode".
S220, in the case that the user is in a scene of viewing the multimedia file, the wearable device 100 collects experience data when the user views the multimedia file.
Step S210 may refer to step S110 in the embodiment of fig. 8, and step S220 may refer to step S120 in the embodiment of fig. 8.
In the embodiment of the present application, the electronic device 200 may also obtain the related information of the multimedia file; for details, reference may be made to the foregoing related description. Here, the related information of the multimedia file may include, but is not limited to, the name, duration, content profile, and main creation information of the multimedia file, the time node of the multimedia file currently being played, and the like.
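By way of example only, the related information listed above could be represented by a simple record such as the following; the field names, types, and sample values are assumptions made for illustration.

```python
# Sketch of the "related information of the multimedia file" listed above; the
# field names, types, and sample values are assumptions made for illustration.
from dataclasses import dataclass


@dataclass
class MultimediaInfo:
    name: str                # name of the multimedia file
    duration_s: int          # total duration
    content_profile: str     # brief content summary
    main_creators: list      # main creation information (director, cast, artist, ...)
    current_position_s: int  # time node of the file currently being played


info = MultimediaInfo(
    name="I am still",
    duration_s=260,
    content_profile="pop song",
    main_creators=["(hypothetical artist)"],
    current_position_s=125,
)
print(info)
```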
S230, the wearable device 100 sends the collected experience data of the user viewing the multimedia file to the electronic device 200.
S240, the electronic device 200 receives the experience data and generates an evaluation result of the user on the multimedia file according to the experience data.
In the embodiment of the present application, after receiving the experience data sent by the wearable device 100, the electronic device 200 may display the collected experience data of the user on the display screen; see the related descriptions of fig. 7A and fig. 7B.
In some embodiments, the electronic device 200 may generate a user's evaluation result of the multimedia file in response to a user operation when receiving the user operation, and in particular, reference may be made to the related descriptions of fig. 7A-7D.
In other embodiments, the electronic device 200 may autonomously generate the user's evaluation result of the multimedia file when the user exits from viewing the multimedia file or when the user has finished viewing the multimedia file.
For the manner in which the electronic device 200 generates the user's evaluation result for the multimedia file, reference may be made to the relevant description of the previous embodiments.
In some embodiments of the present application, after the electronic device 200 generates the evaluation result of the multimedia file by the user, the evaluation result may be displayed on the display screen, and in particular, reference may be made to the related descriptions of fig. 7D and fig. 7E.
S250, in response to the received user operation, the electronic device 200 sends the related information of the multimedia file and the user's evaluation result of the multimedia file to a server or another device.
The evaluation result sent to the server or other devices may be an evaluation result directly generated by the electronic device 200 or may be a modified evaluation result, which is not limited in the embodiment of the present application.
Here, the server may be a server of a major website, and the other device may be a terminal device used by a friend of the user. That is, the electronic device 200 may share the user's evaluation of the multimedia file to a website or a friend.
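To make the division of labour of fig. 9 concrete, the following end-to-end sketch walks through S210 to S250. Everything in it (the class and function names, the placeholder samples, and the averaging rule) is an assumption introduced for illustration and is not the disclosed implementation.

```python
# End-to-end sketch of the fig. 9 division of labour (S210-S250). All names and
# the simple averaging rule are assumptions, not the disclosed implementation.
from statistics import mean


class Wearable:
    """S210-S230: turn on the viewing mode, collect experience data, send it on."""

    def __init__(self):
        self.viewing_mode = False

    def turn_on_viewing_mode(self):
        self.viewing_mode = True

    def collect_experience_data(self) -> list[float]:
        assert self.viewing_mode, "viewing mode must be on"
        return [82.0, 95.0, 101.0, 88.0]   # placeholder heart-rate samples


class ElectronicDevice:
    """S240-S250: generate the evaluation result and share it with a server."""

    def generate_evaluation(self, samples: list[float]) -> dict:
        return {"file": "I am still", "score": round(mean(samples) / 20, 1)}

    def share(self, evaluation: dict, server: str) -> None:
        print(f"sending {evaluation} to {server}")


wearable, phone = Wearable(), ElectronicDevice()
wearable.turn_on_viewing_mode()                    # S210
samples = wearable.collect_experience_data()       # S220; S230 is the hand-over below
result = phone.generate_evaluation(samples)        # S240
phone.share(result, server="video-site server")    # S250, after user confirmation
```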
The method provided by the embodiment of the present application is not limited to being executed jointly by the electronic device and the wearable device; the embodiment of the present application also provides an implementation in which the electronic device independently executes the method for evaluating a multimedia file. When the electronic device independently executes the method for evaluating a multimedia file, the electronic device may integrate part of the hardware of the wearable device in the above embodiments and perform the corresponding functions. For example, the electronic device may integrate various types of sensors for collecting the user's experience data.
Referring to fig. 10, fig. 10 shows a flow of the method in which the electronic device independently performs the evaluation of a multimedia file.
S310, the electronic device turns on the "viewing mode".
Here, the manner in which the electronic device turns on the "viewing mode" is similar to the manner in which the wearable device turns on the "viewing mode" in the above embodiments; reference is made to the related description.
In some embodiments, a user may manually turn on the "viewing mode" of the electronic device through a user operation. The operation for turning on the "viewing mode" of the electronic device may be an operation acting on the electronic device, and reference may be made to the descriptions of the embodiments of fig. 5A and 5B. In the embodiment of the present application, the operation acting on the electronic device for turning on its "viewing mode" may be referred to as a second operation.
In other embodiments, the electronic device may also automatically turn on the "viewing mode" without a user operation. The manner in which the electronic device automatically turns on the "viewing mode" is the same as the manner in which the wearable device automatically turns on the "viewing mode" in the above embodiments; reference may be made to the related description.
Specifically, turning on the "viewing mode" of the electronic device may refer to activating various sensors of the electronic device. After the 'viewing mode' is started, the electronic device can acquire experience data of the user by using the sensor when the user is in a scene of watching the multimedia file.
S320, in a case where the user is in a scene of viewing the multimedia file, the electronic device collects experience data while the user views the multimedia file.
Specifically, when the "viewing mode" of the electronic device is turned on and the user is in a scene of viewing a multimedia file, the electronic device may collect experience data while the user views the multimedia file.
After the electronic device turns on the "viewing mode" under the trigger of a user operation, the electronic device may collect experience data while the user views the multimedia file after determining that the user is in a scene of viewing the multimedia file. The manner in which the electronic device determines that the user is in a scene of viewing a multimedia file may include two types:
(1) The electronic device detects environmental data and determines, according to the environmental data, whether the user is currently in a scene of viewing a multimedia file. Reference is made in particular to the description of the (second) manner in the above embodiments.
(2) The electronic device recognizes that it is itself playing a multimedia file, or recognizes that a large-screen device is playing a multimedia file. Here, the manner in which the electronic device recognizes in manner (2) whether the user is currently in a scene of viewing a multimedia file is the same as the (third) manner in the above embodiments; reference is made to that description.
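A minimal sketch of manner (2) follows; the two checks (foreground player state and an active casting session) are stand-ins for whatever signals an implementation might actually use and are not disclosed APIs.

```python
# Sketch under assumptions: the checks below are stand-ins for implementation
# signals (foreground player state, casting session), not disclosed APIs.
def self_is_playing(foreground_app: str, playback_active: bool) -> bool:
    """Manner (2), first case: the electronic device itself is playing a multimedia file."""
    return playback_active and foreground_app in {"video_player", "music_player"}


def large_screen_is_playing(cast_session_active: bool) -> bool:
    """Manner (2), second case: a connected large-screen device is playing the file."""
    return cast_session_active


def user_in_viewing_scene(foreground_app: str, playback_active: bool,
                          cast_session_active: bool) -> bool:
    return (self_is_playing(foreground_app, playback_active)
            or large_screen_is_playing(cast_session_active))


print(user_in_viewing_scene("video_player", True, False))   # True
print(user_in_viewing_scene("browser", False, False))       # False
```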
It can be understood that the electronic device may also obtain the related information of the multimedia file; reference may be made to the related descriptions of the above (second) and (third) manners.
When the electronic device automatically turns on the "viewing mode", the user is already in a scene of viewing the multimedia file before the "viewing mode" is turned on, so the electronic device may directly collect experience data while the user views the multimedia file after the "viewing mode" is turned on.
The manner in which the electronic device collects experience data when the user views the multimedia file may refer to the foregoing related description, and will not be described herein.
In some embodiments, the electronic device may display the collected experience data of the user on the display screen, as described with respect to fig. 7A and 7B.
S330, the electronic device generates an evaluation result of the user on the multimedia file according to the collected experience data.
In some embodiments, the electronic device may generate a user's evaluation result of the multimedia file in response to a user operation when the user operation is received, and in particular, reference may be made to the related descriptions of fig. 7A-7D.
In other embodiments, the electronic device may autonomously generate the user's evaluation result of the multimedia file when the user exits from viewing the multimedia file or when the user has finished viewing the multimedia file.
The manner in which the electronic device generates the user's evaluation result for the multimedia file may be referred to the relevant description of the previous embodiments.
In some embodiments of the present application, after the electronic device generates the evaluation result of the user on the multimedia file, the evaluation result may be displayed on the display screen, and specifically, reference may be made to the related descriptions of fig. 7D and fig. 7E.
S340, in response to the received user operation, the electronic device sends the related information of the multimedia file and the user's evaluation result of the multimedia file to a server or another device.
The evaluation result sent to the server or other devices may be an evaluation result directly generated by the electronic device or may be a modified evaluation result, which is not limited in the embodiment of the present application.
Here, the server may be a server of a major website, and the other device may be a terminal device used by a friend of the user. That is, the electronic device may share the user's evaluation of the multimedia file to a website or a friend.
The embodiments of the present application may be arbitrarily combined to achieve different technical effects.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that all or part of the processes of the above method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when the program is executed, the processes of the above method embodiments may be included. The storage medium includes a ROM, a random access memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
In summary, the foregoing describes only exemplary embodiments of the present invention and is not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, improvement, or the like made according to the disclosure of the present invention shall fall within the protection scope of the present invention.
Claims (8)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010246633.6A CN113468349B (en) | 2020-03-31 | 2020-03-31 | Multimedia file evaluation method, device and system |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN113468349A CN113468349A (en) | 2021-10-01 |
| CN113468349B true CN113468349B (en) | 2025-08-05 |
Family
ID=77865715
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010246633.6A Active CN113468349B (en) | 2020-03-31 | 2020-03-31 | Multimedia file evaluation method, device and system |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN113468349B (en) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106060643A (en) * | 2016-06-28 | 2016-10-26 | 乐视控股(北京)有限公司 | Method and device for playing multimedia file and earphones |
| CN108319643A (en) * | 2017-12-22 | 2018-07-24 | 新华网股份有限公司 | Evaluation Method and System for Multimedia Information |
| CN108882045A (en) * | 2017-05-11 | 2018-11-23 | 昆山研达电脑科技有限公司 | A kind of film scoring apparatus and method based on wearable device |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103530788A (en) * | 2012-07-02 | 2014-01-22 | 纬创资通股份有限公司 | Multimedia evaluation system, multimedia evaluation device and multimedia evaluation method |
| CN108377422B (en) * | 2018-02-24 | 2020-05-19 | 腾讯科技(深圳)有限公司 | Multimedia content playing control method, device and storage medium |
- 2020-03-31: application CN202010246633.6A filed (CN); granted as patent CN113468349B, status active
Also Published As
| Publication number | Publication date |
|---|---|
| CN113468349A (en) | 2021-10-01 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |