CN110619239A - Application interface processing method and device, storage medium and terminal - Google Patents
Application interface processing method and device, storage medium and terminal
- Publication number
- CN110619239A (application number CN201910818791.1A)
- Authority
- CN
- China
- Prior art keywords
- application interface
- terminal
- application
- face image
- processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/70—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
- G06F21/82—Protecting input, output or interconnection devices
- G06F21/84—Protecting input, output or interconnection devices output devices, e.g. displays or monitors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Computer Security & Cryptography (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Embodiments of the present application disclose an application interface processing method and apparatus, a storage medium, and a terminal. The method comprises: detecting whether an application interface of a specified application is currently displayed; if so, acquiring a current face image through a camera of the terminal; recognizing the face image to obtain a recognition result; and processing the application interface according to the recognition result. With this scheme, when a user views application information on the terminal, the face image captured by the camera is recognized and verified, which effectively prevents the leakage of the user's private information and improves the information security of the terminal.
Description
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a method and an apparatus for processing an application interface, a storage medium, and a terminal.
Background
With the development of the internet and mobile communication networks, and the rapid growth of the processing and storage capabilities of terminals, a large number of applications have spread rapidly and come into wide use.
However, with the spread and use of applications, users' private information is also easily leaked. In the related art, users manually close or exit application interfaces, so interfaces are easily left open by mistake. For example, if an emergency occurs while a user is viewing private information in an application and the current interface cannot be closed in time, or the user forgets to close it and leaves the terminal unattended, the private information is easily leaked and the security of information in the terminal is poor.
Disclosure of Invention
Embodiments of the present application provide an application interface processing method and apparatus, a storage medium, and a terminal, which can improve the security of terminal information.
In a first aspect, an embodiment of the present application provides an application interface processing method, applied to a terminal, including:
detecting whether an application interface of a specified application is displayed currently;
if so, acquiring a current face image through a camera of the terminal;
recognizing the face image to obtain a recognition result;
and processing the application interface according to the recognition result.
In a second aspect, an embodiment of the present application provides an application interface processing apparatus, which is applied to a terminal, and includes:
the detection unit is used for detecting whether an application interface of a specified application is displayed at present;
the acquisition unit is used for acquiring a current face image through a camera of the terminal when the detection unit detects that the application interface of the specified application is displayed currently;
the recognition unit is used for recognizing the face image to obtain a recognition result;
and the processing unit is used for processing the application interface according to the recognition result.
In a third aspect, an embodiment of the present application further provides a computer-readable storage medium, where a plurality of instructions are stored in the storage medium, and the instructions are adapted to be loaded by a processor to perform the application interface processing method described above.
In a fourth aspect, an embodiment of the present application further provides a terminal, including a processor and a memory, where the processor is electrically connected to the memory, the memory is used to store instructions and data, and the processor is used to execute the application interface processing method.
In the embodiments of the present application, it is detected whether an application interface of a specified application is currently displayed on the terminal; if so, a current face image is acquired through the terminal camera and recognized, and the application interface is then processed according to the recognition result. With this scheme, when a user views application information on the terminal, identity verification can be performed on the face image captured by the camera, which effectively prevents the leakage of the user's private information and improves the information security of the terminal.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings described below are only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flow chart of an application interface processing method according to an embodiment of the present disclosure.
Fig. 2 is a schematic diagram of a terminal according to an embodiment of the present application.
Fig. 3 is a schematic diagram of another terminal provided in the embodiment of the present application.
Fig. 4 is another schematic flow chart of an application interface processing method provided in the embodiment of the present application.
Fig. 5 is a schematic structural diagram of an application interface processing apparatus according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Fig. 7 is another schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides an application interface processing method and device, a storage medium and a terminal. The details will be described below separately.
In an embodiment, an application interface processing method is provided, and is applied to terminal devices such as smart phones, tablet computers, and notebook computers. Referring to fig. 1, a specific flow of the application interface processing method may be as follows:
101. Detect whether an application interface of a specified application is currently displayed.
Specifically, the specified application may be a relatively private application selected in advance by the user from the applications installed on the terminal, such as email, text messages, contacts, or a notepad application. For convenience of setting, one or more types of applications may also be directly set as more private applications, where application types can be classified based on the functions of the applications. For example, application types may include office applications, social applications, game applications, financial applications, and the like, and the social and financial applications may be set as the specified (i.e., more private) applications.
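For illustration only, the following minimal sketch shows one way such a check could be organized, assuming a simple lookup table from package name to application type; the package names, type labels, and function names are hypothetical and not part of this application.

```python
# Illustrative sketch (assumptions, not from the patent): treat social and financial
# application types as "specified" (more private) applications and check whether a
# given application belongs to one of those types.
SPECIFIED_TYPES = {"social", "financial"}  # assumed types treated as more private

APP_TYPES = {  # hypothetical mapping from package name to application type
    "com.example.mail": "office",
    "com.example.chat": "social",
    "com.example.bank": "financial",
    "com.example.game": "game",
}

def is_specified_application(package_name):
    """Return True if the application belongs to a specified (more private) type."""
    return APP_TYPES.get(package_name) in SPECIFIED_TYPES
```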
102. Acquire a current face image through a camera of the terminal.
In this embodiment, the terminal has a camera that can capture the scene on the side where the terminal display screen is located. For example, the camera may be a front camera built into the terminal.
In some embodiments, referring to fig. 2, the camera may be disposed in the display screen at a position close to the center line L to better capture a picture in front of the terminal.
Referring to fig. 3, in some embodiments, the terminal may include a plurality of cameras, and the plurality of cameras may be respectively disposed in different areas of the terminal to better acquire pictures from different angles.
With continued reference to fig. 3, the camera may also be disposed below the display screen. The display screen can be specially processed so that the portion corresponding to the camera is light-transmissive; external light then passes through the display screen to the light-incident surface of the camera, allowing the camera to capture the external scene.
In this embodiment, the image acquired by the terminal contains a face. Because the method is aimed at terminal information security, i.e., at preventing other users from peeping at the screen or using the terminal without authorization, only camera images that contain a face image are considered.
103. Recognize the face image to obtain a recognition result.
In this embodiment, the face image may be recognized by an image recognition technique. Specifically, image features in the face image can be extracted, and corresponding recognition results can be obtained by analyzing the image features.
In some embodiments, the image features may include one or more of color features, texture features, shape features, and spatial relationship features. A color feature is a global feature that describes the surface properties of the scene corresponding to an image or an image region. A texture feature is also a global feature describing such surface properties. A shape feature is a local feature with two types of representation: a contour feature, which mainly concerns the outer boundary of an object, and a region feature, which concerns the entire shape region. Spatial relationship features refer to the mutual spatial positions or relative directional relationships among multiple targets segmented from the image; these relationships can be classified into connection/adjacency, overlap, inclusion/containment, and the like.
Image feature extraction uses a computer to extract image information and determine whether each point of an image belongs to an image feature. The result of feature extraction is a division of the points of the image into different subsets, which often correspond to isolated points, continuous curves, or continuous regions. Features are the starting point of many computer image analysis algorithms. One of the most important properties of feature extraction is repeatability: the features extracted from different images of the same scene should be the same.
In a specific implementation, image features of a local image region can be extracted using the Fourier transform, the windowed Fourier transform, the wavelet transform, the least-squares method, the boundary direction histogram method, texture feature extraction based on Tamura texture features, and the like.
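As a minimal sketch of the kind of feature extraction described above, the following example computes a normalized color histogram of an image region with OpenCV; the choice of a color histogram, the bin count, and the function name are illustrative assumptions rather than the specific methods enumerated in this application.

```python
# Illustrative sketch: turn an image (or local image region) into a comparable
# feature vector using a normalized color histogram. Assumes OpenCV is available.
import cv2

def color_histogram_feature(image_bgr, bins=32):
    """Return a flattened, normalized B/G/R color histogram as a 1-D feature vector."""
    hist = cv2.calcHist([image_bgr], [0, 1, 2], None,
                        [bins, bins, bins], [0, 256, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()
```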
104. Process the application interface according to the recognition result.
In some embodiments, the recognition result includes a user identity. The step of processing the application interface according to the recognition result may include the following processes:
(11) verifying the identity of the user;
(12) if the verification is passed, keeping the application interface unchanged;
(13) if the verification fails, superimposing a preset picture on the application interface to cover the content displayed by the application interface.
Specifically, when the user identity is verified, the user identity may be matched against a sample user identity, and whether the verification passes is determined based on the matching result. The user identity serves to identify and distinguish each face image.
The preset picture may be any picture set by the user. In some embodiments, the preset picture may display information prompting that a peeper is watching the terminal display screen, so as to alert the user.
In some embodiments, the preset picture may be a picture whose content is similar to that of the current application interface, so that a peeper takes the covering preset picture for the application interface actually being viewed by the user. It should be noted that different application interfaces may correspond to different preset pictures.
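A minimal sketch of the decision in steps (11) to (13) is given below, assuming the recognized user identity is represented as a feature (embedding) vector that can be compared with enrolled sample identities; the distance threshold and function names are illustrative assumptions.

```python
# Illustrative sketch: verify the user identity against enrolled sample identities
# and either keep the interface or cover it with the preset picture.
# The threshold and the cover callback are assumptions, not prescribed by the patent.
import numpy as np

MATCH_THRESHOLD = 0.6  # assumed maximum embedding distance for "verification passed"

def verify_identity(face_embedding, enrolled_embeddings):
    """Return True if the face embedding matches any enrolled sample identity."""
    return any(np.linalg.norm(face_embedding - sample) <= MATCH_THRESHOLD
               for sample in enrolled_embeddings)

def process_interface(face_embedding, enrolled_embeddings, show_cover_picture):
    """Keep the interface unchanged on success; otherwise superimpose the preset picture."""
    if verify_identity(face_embedding, enrolled_embeddings):
        return  # verification passed: keep the application interface unchanged
    show_cover_picture()  # verification failed: cover the displayed content
```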
In some embodiments, the face image includes a plurality of faces. The step of processing the application interface according to the recognition result may include the following processes:
(21) according to the recognition result, determining a target face which cannot pass identity authentication from the plurality of faces;
(22) determining the gaze deflection angle of the eyes in the target face;
(23) when the gaze deflection angle is within a preset angle range, processing the application interface.
Specifically, after a target face (i.e., the face of an unauthorized user) that cannot pass identity authentication is determined from the plurality of faces according to the recognition result, an eye region is determined from the target face. If the eye region is identified, the image features of the eye region can be further extracted and the gaze direction of the eyes analyzed, so that whether the unauthorized user is watching the display screen of the terminal can be determined from the analysis result.
In this embodiment, when determining the eye region from the face image, Dlib may be used to detect facial key points. Dlib is a C++ machine-learning library that contains many algorithms commonly used in machine learning. For example, with Dlib it is possible to detect 68 facial key points in total: the chin contour is marked by 17 key points, the left and right eyebrows by 5 key points each, the nose by 9 key points, the left and right eyes by 6 key points each, and the mouth by 20 key points. The eye region can then be determined from the face image according to the detected facial key points.
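The following sketch shows how the eye regions could be located with Dlib's publicly available 68-point landmark predictor, as referenced above; the model file path is an assumed placeholder, and the pretrained predictor data file must be obtained separately.

```python
# Illustrative sketch: locate eye landmarks with dlib's 68-point face landmark model.
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")  # assumed path

RIGHT_EYE = range(36, 42)  # subject's right eye in the 0-indexed 68-point convention
LEFT_EYE = range(42, 48)   # subject's left eye

def eye_landmarks(gray_image):
    """Return (right_eye_points, left_eye_points) for each face detected in the image."""
    results = []
    for face_rect in detector(gray_image):
        shape = predictor(gray_image, face_rect)
        right = [(shape.part(i).x, shape.part(i).y) for i in RIGHT_EYE]
        left = [(shape.part(i).x, shape.part(i).y) for i in LEFT_EYE]
        results.append((right, left))
    return results
```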
In some embodiments, when determining the gaze deflection angle of the eyes, information such as the pupil position and pupil area in the identified eye region may be compared with the corresponding information in a pre-acquired face image to obtain difference information. The gaze deflection angle of the eyes is then calculated from the difference information using a suitable algorithm.
In practical application, if the gaze deflection angle is within a preset angle range, the user's gaze can be considered to be on the display screen, and it can be determined that the user is currently viewing the terminal display screen. If the gaze deflection angle is outside the preset angle range, the user's gaze is not on the display screen, and it can be determined that the user is not currently watching the display screen. The application interface can be processed differently in the two cases.
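As an illustration of the idea, the sketch below estimates a horizontal gaze deflection angle from the offset of the pupil centre within the eye region and checks it against a preset angle range; the simple geometry and the threshold are assumptions, not the specific algorithm referred to above.

```python
# Illustrative sketch: approximate the gaze deflection angle from the pupil offset
# and decide whether the gaze falls within the preset angle range.
# The eye-depth constant and the preset range are assumed values.
import math

PRESET_ANGLE_RANGE = 30.0  # assumed half-angle (degrees) within which the gaze is "on screen"

def gaze_deflection_angle(pupil_x, eye_left_x, eye_right_x, eye_depth_px=40.0):
    """Approximate horizontal gaze angle (degrees) from the pupil offset in the eye region."""
    eye_centre_x = (eye_left_x + eye_right_x) / 2.0
    offset = pupil_x - eye_centre_x  # pixels the pupil deviates from the eye centre
    return math.degrees(math.atan2(offset, eye_depth_px))

def is_gaze_on_screen(angle_deg):
    """True if the deflection angle lies within the preset angle range."""
    return abs(angle_deg) <= PRESET_ANGLE_RANGE
```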
In some embodiments, if the gaze deflection angle is within the preset angle range, the application interface may be blurred (for example, mosaicked or Gaussian-blurred) or switched to another application interface. Because a peeper is usually at some distance from the display screen, blurring the current application interface prevents the peeper from clearly seeing the displayed content; alternatively, directly switching to another application interface prevents the peeper from realizing that the peeping behavior has been discovered.
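A minimal sketch of the two blurring options mentioned above is given below, assuming a screenshot of the current interface is available as an OpenCV image; the kernel and block sizes are illustrative.

```python
# Illustrative sketch: blur or pixelate a screenshot of the current application
# interface with OpenCV so that its content cannot be read from a distance.
import cv2

def blur_interface(screenshot_bgr, kernel=(31, 31)):
    """Gaussian-blur the interface screenshot (kernel size must be odd)."""
    return cv2.GaussianBlur(screenshot_bgr, kernel, 0)

def mosaic_interface(screenshot_bgr, block=16):
    """Pixelate (mosaic) the screenshot by downscaling and then upscaling it."""
    h, w = screenshot_bgr.shape[:2]
    small = cv2.resize(screenshot_bgr, (max(1, w // block), max(1, h // block)),
                       interpolation=cv2.INTER_LINEAR)
    return cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)
```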
In some embodiments, an application interface that is similar in layout to the currently displayed application interface but different in actual content may also be generated in real time, based on the layout of the currently displayed interface, to replace it, so that a peeper cannot perceive the change of interface.
In practical application, if no eye region can be identified, the unauthorized user is not currently watching the display screen of the terminal and no peeping behavior exists, so the current application interface does not need to be processed.
In some embodiments, after it is determined that the gaze deflection angle is within the preset angle range, the terminal may further be controlled to vibrate at a preset vibration frequency and time interval, so as to produce a distinctive vibration that prompts the user that a peeper is currently watching the display screen.
In practical applications, different interfaces of different applications present different content; even for a more private application, not every application interface contains private information. For example, a settings interface used to configure application functions usually does not contain private information. In implementation, therefore, whether an interface is a more private application interface can be determined based on what the interface actually presents. That is, referring to fig. 4, in some embodiments, after it is detected that an application interface of the specified application is currently displayed and before the current face image is acquired through the camera of the terminal, the following process may further be included:
105. Acquire the content displayed by the application interface.
106. Analyze the content to obtain an analysis result.
107. Determine the security level of the content according to the analysis result.
108. Judge whether the security level meets a preset condition; if yes, go to step 102; otherwise, end the process.
Specifically, a suitable classification algorithm may be used to analyze the content displayed on the application interface and identify the intention it expresses, i.e., what kind of interface it is, so as to obtain the degree of privacy of the content. The security level of the content is then determined based on the degree of privacy.
The security level can be divided into multiple levels according to actual requirements; the higher the security level, the higher the corresponding degree of privacy.
In practical applications, when determining whether the security level meets the preset condition, the preset condition can be considered to be met when the security level is sufficiently high. In a specific implementation, a preset level may be set, and if the security level is greater than the preset level, the security level is considered to satisfy the preset condition.
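For illustration, the sketch below maps an estimated degree of privacy to a discrete security level and applies the preset-condition check of step 108; the number of levels, the thresholds, and the preset level are assumptions.

```python
# Illustrative sketch: map a privacy degree to a security level and check the
# preset condition (step 108). All thresholds are assumed values.
PRESET_LEVEL = 2  # assumed preset level; higher levels trigger face acquisition

def security_level_from_privacy(privacy_degree):
    """Map an estimated privacy degree in [0, 1] to a discrete security level 1-4."""
    if privacy_degree < 0.25:
        return 1
    if privacy_degree < 0.5:
        return 2
    if privacy_degree < 0.75:
        return 3
    return 4

def meets_preset_condition(security_level):
    """Step 108: proceed to face acquisition only when the level exceeds the preset level."""
    return security_level > PRESET_LEVEL
```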
In some embodiments, analyzing the content to obtain the analysis result may include the following steps:
(31) performing feature extraction on the content to obtain content features, wherein the content features comprise: text features and/or image features;
(32) analyzing a first similarity between the text features and the sample text features and a second similarity between the image features and the sample image features;
(33) and generating an analysis result according to the first similarity and the second similarity.
Specifically, when analyzing the content of the application interface, text features, image features, and the like may be extracted for the content. Then, the similarity between the extracted features and preset sample features (i.e., sample text features, sample image features) is analyzed, and an analysis result can be obtained based on the magnitude of the similarity.
In some embodiments, the similarity value may be used directly as the analysis result, so that the corresponding security level can subsequently be determined directly from the similarity value. In practical application, the similarity values may be divided into different intervals, with each security level corresponding to a different similarity interval; the larger the similarity, the higher the corresponding security level.
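A minimal sketch of steps (31) to (33) follows, assuming the text and image features are already available as vectors; the use of cosine similarity and the equal weighting of the two similarities are illustrative assumptions.

```python
# Illustrative sketch: compute the first similarity (text vs. sample text features)
# and second similarity (image vs. sample image features), then combine them into
# a single analysis result. Assumes NumPy; the 50/50 weighting is an assumption.
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def analyze_content(text_feature, image_feature,
                    sample_text_feature, sample_image_feature, text_weight=0.5):
    """Return the combined similarity used as the analysis result (larger = more private)."""
    first_similarity = cosine_similarity(text_feature, sample_text_feature)
    second_similarity = cosine_similarity(image_feature, sample_image_feature)
    return text_weight * first_similarity + (1.0 - text_weight) * second_similarity
```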
As can be seen from the above, the application interface processing method provided in this embodiment detects whether an application interface of a specified application is currently displayed on the terminal; if so, a current face image is acquired through the terminal camera and recognized, and the application interface is then processed according to the recognition result. With this scheme, when a user views application information on the terminal, identity verification can be performed on the face image captured by the camera, which effectively prevents the leakage of the user's private information and improves the information security of the terminal.
In another embodiment of the present application, an application interface processing apparatus is further provided, where the application interface processing apparatus may be integrated in a terminal in a form of software or hardware, and the terminal may specifically include a mobile phone, a tablet computer, a notebook computer, and the like. As shown in fig. 5, the application interface processing apparatus 300 may include: a detection unit 301, a first acquisition unit 302, a recognition unit 303 and a processing unit 304, wherein:
a detecting unit 301, configured to detect whether an application interface of a specified application is currently displayed;
a first acquisition unit 302, configured to acquire a current face image through a camera of the terminal when the detecting unit detects that an application interface of the specified application is currently displayed;
the recognition unit 303 is configured to recognize the face image to obtain a recognition result;
and the processing unit 304 is configured to process the application interface according to the recognition result.
In some embodiments, the apparatus 300 may further include:
the second acquisition unit is used for acquiring the content displayed by the application interface after it is detected that an application interface of the specified application is currently displayed and before the current face image is acquired through the camera of the terminal;
the analysis unit is used for analyzing the content to obtain an analysis result;
a determining unit configured to determine a security level of the content according to the analysis result;
The first acquisition unit 302 is specifically configured to acquire the current face image through the camera of the terminal when the determining unit determines that the security level meets the preset condition.
In some embodiments, the analysis unit may be specifically configured to:
performing feature extraction on the content to obtain content features, wherein the content features comprise: text features and/or image features;
analyzing a first similarity between the text feature and a sample text feature and a second similarity between the image feature and the sample image feature;
and generating the analysis result according to the first similarity and the second similarity.
In some embodiments, the recognition result may include a user identity; the processing unit 304 may be specifically configured to:
verifying the user identity;
if the verification is passed, keeping the application interface unchanged;
and if the verification fails, overlapping a preset picture on the application interface so as to cover the content displayed by the application interface.
In some embodiments, the face image includes a plurality of faces; the processing unit 304 may include:
the first determining subunit is used for determining, from the plurality of faces according to the recognition result, a target face which cannot pass identity authentication;
the second determining subunit is used for determining the gaze deflection angle of the eyes in the target face;
and the processing subunit is used for processing the application interface when the second determining subunit determines that the gaze deflection angle is within the preset angle range.
In some embodiments, the processing subunit may be specifically configured to: when the gaze deflection angle is within a preset angle range, perform blurring processing on the application interface, or switch the application interface to another application interface.
In some embodiments, the apparatus 300 may further include:
and the prompting unit is used for controlling the terminal to vibrate at a preset vibration frequency and a preset time interval after it is determined that the gaze deflection angle is within the preset angle range.
As can be seen from the above, the application interface processing apparatus provided in the embodiment of the present application detects whether an application interface of a specified application is currently displayed on the terminal; if so, it acquires a current face image through the camera of the terminal, recognizes the face image to obtain a recognition result, and processes the application interface according to the recognition result. With this scheme, when a user views application information on the terminal, the face image captured by the camera is recognized and verified, which effectively prevents the leakage of the user's private information and improves the information security of the terminal.
In another embodiment of the present application, a terminal is further provided, where the terminal may be a terminal device such as a smart phone and a tablet computer. As shown in fig. 6, the terminal 400 includes a processor 401, a memory 402, and a camera 407. The processor 401 is electrically connected to the memory 402.
The processor 401 is the control center of the terminal 400. It connects the various parts of the entire terminal through various interfaces and lines, and performs the various functions of the terminal and processes data by running or loading applications stored in the memory 402 and calling data stored in the memory 402, thereby monitoring the terminal as a whole.
In this embodiment, the processor 401 in the terminal 400 loads instructions corresponding to the processes of one or more applications into the memory 402 and runs the applications stored in the memory 402, thereby implementing the following functions:
detecting whether an application interface of a specified application is displayed currently;
if so, acquiring a current face image through a camera of the terminal;
recognizing the face image to obtain a recognition result;
and processing the application interface according to the recognition result.
In some embodiments, after it is detected that an application interface of the specified application is currently displayed and before the current face image is acquired through the camera of the terminal, the processor 401 may perform the following operations:
acquiring the content displayed by the application interface;
analyzing the content to obtain an analysis result;
determining a security level of the content according to the analysis result;
when the security level meets a preset condition, the processor 401 may perform an operation of acquiring a current face image through a camera of the terminal.
In some embodiments, when the content is analyzed and the analysis result is obtained, the processor 401 may perform the following operations:
performing feature extraction on the content to obtain content features, wherein the content features comprise: text features and/or image features;
analyzing a first similarity between the text feature and a sample text feature and a second similarity between the image feature and the sample image feature;
and generating the analysis result according to the first similarity and the second similarity.
In some embodiments, the identification result comprises a user identity. When the application interface is processed according to the recognition result, the processor 401 may perform the following operations:
verifying the user identity;
if the verification is passed, keeping the application interface unchanged;
and if the verification fails, overlapping a preset picture on the application interface so as to cover the content displayed by the application interface.
In some embodiments, the face image includes a plurality of faces. When the application interface is processed according to the recognition result, the processor 401 may perform the following operations:
according to the recognition result, determining a target face which cannot pass identity verification from the plurality of faces;
determining the gaze deflection angle of the eyes in the target face;
and when the gaze deflection angle is within a preset angle range, processing the application interface.
In some embodiments, when the gaze deflection angle is within the preset angle range, the processor 401 may perform the following operations when processing the application interface:
and when the gaze deflection angle is within a preset angle range, performing blurring processing on the application interface, or switching the application interface to another application interface.
In some embodiments, after determining that the gaze deflection angle is within the preset angle range, the processor 401 may perform the following operations:
and controlling the terminal to generate vibration according to a preset vibration frequency and a preset time interval.
The memory 402 may be used to store applications and data. The memory 402 stores applications containing instructions executable in the processor. Applications may constitute various functional modules. The processor 401 executes various functional applications and data processing by running applications stored in the memory 402.
The camera 407 may be used to collect image information. In this embodiment, the camera 407 may specifically be a front camera, so as to obtain an image picture of a side where the terminal display screen is located. The front camera can be a single camera with one lens, or can be provided with two or more lenses.
In some embodiments, as shown in fig. 7, the terminal 400 further includes: display 403, control circuit 404, radio frequency circuit 405, input unit 406, sensor 408, and power supply 409. The processor 401 is electrically connected to the display 403, the control circuit 404, the rf circuit 405, the input unit 406, the sensor 408, and the power source 409.
The display screen 403 may be used to display information input by or provided to the user as well as various graphical user interfaces of the terminal, which may be constituted by images, text, icons, video, and any combination thereof.
The control circuit 404 is electrically connected to the display 403, and is configured to control the display 403 to display information.
The radio frequency circuit 405 is used for transmitting and receiving radio frequency signals so as to establish wireless communication with network devices or other terminals, and for exchanging signals with a server or other terminals.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. The input unit 406 may include a fingerprint recognition module.
The sensor 408 is used to collect external environmental information. The sensors 408 may include ambient light sensors, acceleration sensors, light sensors, motion sensors, and other sensors.
The power supply 409 is used to power the various components of the terminal 400. In some embodiments, the power source 409 may be logically connected to the processor 401 through a power management system, so that functions of managing charging, discharging, and power consumption are implemented through the power management system.
Although not shown in fig. 7, the terminal 400 may further include a speaker, a bluetooth module, and the like, which will not be described in detail herein.
As can be seen from the above, the terminal provided in the embodiment of the present application detects whether an application interface of a specified application is currently displayed; if so, it acquires a current face image through the terminal camera and recognizes the face image, and finally processes the application interface according to the recognition result. With this scheme, when a user views application information on the terminal, identity verification can be performed on the face image captured by the camera, which effectively prevents the leakage of the user's private information and improves the information security of the terminal.
In some embodiments, a computer-readable storage medium is also provided, having stored therein a plurality of instructions adapted to be loaded by a processor to perform any of the application interface processing methods described above.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable storage medium, and the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
The application interface processing method, the application interface processing device, the storage medium and the terminal provided by the embodiment of the application are described in detail, a specific example is applied in the description to explain the principle and the implementation of the application, and the description of the embodiment is only used for helping to understand the method and the core idea of the application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.
Claims (10)
1. An application interface processing method is applied to a terminal, and is characterized by comprising the following steps:
detecting whether an application interface of a specified application is displayed currently;
if so, acquiring a current face image through a camera of the terminal;
recognizing the face image to obtain a recognition result;
and processing the application interface according to the recognition result.
2. The application interface processing method according to claim 1, wherein after detecting that the application interface of the specified application is currently displayed, before acquiring the current face image through the camera of the terminal, the method further comprises:
acquiring the content displayed by the application interface;
analyzing the content to obtain an analysis result;
determining a security level of the content according to the analysis result;
and when the security level meets a preset condition, executing a step of acquiring a current face image through a camera of the terminal.
3. The application interface processing method according to claim 2, wherein the analyzing the content to obtain an analysis result comprises:
performing feature extraction on the content to obtain content features, wherein the content features comprise: text features and/or image features;
analyzing a first similarity between the text feature and a sample text feature and a second similarity between the image feature and the sample image feature;
and generating the analysis result according to the first similarity and the second similarity.
4. The application interface processing method according to claim 1, wherein the recognition result includes a user identity; the processing the application interface according to the recognition result comprises:
verifying the user identity;
if the verification is passed, keeping the application interface unchanged;
and if the verification fails, overlapping a preset picture on the application interface so as to cover the content displayed by the application interface.
5. The application interface processing method according to claim 1, wherein the face image includes a plurality of faces; the processing the application interface according to the recognition result comprises:
according to the recognition result, determining a target face which cannot pass identity verification from the plurality of faces;
determining the gaze deflection angle of the eyes in the target face;
and when the gaze deflection angle is within a preset angle range, processing the application interface.
6. The application interface processing method according to claim 5, wherein when the gaze deflection angle is within a preset angle range, the processing of the application interface comprises:
and when the gaze deflection angle is within a preset angle range, performing blurring processing on the application interface, or switching the application interface to another application interface.
7. The application interface processing method according to claim 5, wherein after determining that the gaze deflection angle is within a preset angle range, the method further comprises:
and controlling the terminal to generate vibration according to a preset vibration frequency and a preset time interval.
8. An application interface processing device applied to a terminal, comprising:
the detection unit is used for detecting whether an application interface of a specified application is displayed at present;
the acquisition unit is used for acquiring a current face image through a camera of the terminal when the detection unit detects that the application interface of the specified application is displayed currently;
the recognition unit is used for recognizing the face image to obtain a recognition result;
and the processing unit is used for processing the application interface according to the recognition result.
9. A computer-readable storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor to perform the application interface processing method of any of claims 1-7.
10. A terminal is characterized by comprising a processor and a memory, wherein the processor is electrically connected with the memory, and the memory is used for storing instructions and data; the processor is configured to perform the application interface processing method of any one of claims 1-7.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201910818791.1A CN110619239A (en) | 2019-08-30 | 2019-08-30 | Application interface processing method and device, storage medium and terminal |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN110619239A true CN110619239A (en) | 2019-12-27 |
Family
ID=68922898
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201910818791.1A Pending CN110619239A (en) | 2019-08-30 | 2019-08-30 | Application interface processing method and device, storage medium and terminal |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN110619239A (en) |
Patent Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104392167A (en) * | 2014-10-27 | 2015-03-04 | 东莞宇龙通信科技有限公司 | Method and device for privacy information detection warning and terminal |
| CN106126017A (en) * | 2016-06-20 | 2016-11-16 | 北京小米移动软件有限公司 | Intelligent identification Method, device and terminal unit |
| CN106326867A (en) * | 2016-08-26 | 2017-01-11 | 维沃移动通信有限公司 | Face recognition method and mobile terminal |
| CN106446634A (en) * | 2016-09-26 | 2017-02-22 | 维沃移动通信有限公司 | Method for privacy protection of mobile terminal and mobile terminal |
| CN106775390A (en) * | 2016-11-30 | 2017-05-31 | 努比亚技术有限公司 | Rimless terminal and unlocking method |
| CN106599286A (en) * | 2016-12-23 | 2017-04-26 | 北京奇虎科技有限公司 | Information monitoring rumor refuting realization method and apparatus, and mobile terminal |
| CN107105162A (en) * | 2017-04-28 | 2017-08-29 | 努比亚技术有限公司 | A kind of image-pickup method and mobile terminal |
| CN107169329A (en) * | 2017-05-24 | 2017-09-15 | 维沃移动通信有限公司 | A kind of method for protecting privacy, mobile terminal and computer-readable recording medium |
| CN108062490A (en) * | 2018-01-03 | 2018-05-22 | 深圳市金立通信设备有限公司 | Glance prevention method, terminal and computer-readable medium |
Non-Patent Citations (1)
| Title |
|---|
| 刘树勇 (Liu Shuyong), ed.: 《科协专发 人工智能》 [Artificial Intelligence], Popular Science Press (科学普及出版社), 30 April 2018 * |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111666014A (en) * | 2020-07-06 | 2020-09-15 | 腾讯科技(深圳)有限公司 | Message pushing method, device, equipment and computer readable storage medium |
| CN111666014B (en) * | 2020-07-06 | 2024-02-02 | 腾讯科技(深圳)有限公司 | Message pushing method, device, equipment and computer readable storage medium |
| CN112912882A (en) * | 2020-07-28 | 2021-06-04 | 深圳市大疆创新科技有限公司 | Control method, device and storage medium for terminal equipment |
| WO2022089187A1 (en) * | 2020-10-31 | 2022-05-05 | 华为技术有限公司 | Display method and electronic device |
| CN114185630A (en) * | 2021-11-29 | 2022-03-15 | 招联消费金融有限公司 | Screen recording method and device, computer equipment and storage medium |
| CN114185630B (en) * | 2021-11-29 | 2024-04-23 | 招联消费金融股份有限公司 | Screen recording method, device, computer equipment and storage medium |
| CN116167106A (en) * | 2023-04-25 | 2023-05-26 | 深圳市爱保护科技有限公司 | Smart watch display method and system, storage medium and smart terminal |
| CN116167106B (en) * | 2023-04-25 | 2023-08-01 | 深圳市爱保护科技有限公司 | Smart watch display method and system, storage medium and smart terminal |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11321575B2 (en) | Method, apparatus and system for liveness detection, electronic device, and storage medium | |
| CN110619239A (en) | Application interface processing method and device, storage medium and terminal | |
| Deb et al. | Look locally infer globally: A generalizable face anti-spoofing approach | |
| CN111368811B (en) | Living body detection method, living body detection device, living body detection equipment and storage medium | |
| TWI686774B (en) | Human face live detection method and device | |
| CN108280418A (en) | The deception recognition methods of face image and device | |
| WO2016084072A1 (en) | Anti-spoofing system and methods useful in conjunction therewith | |
| US10776646B2 (en) | Identification method and apparatus and computer-readable storage medium | |
| CN105117122A (en) | Terminal screenshot method and terminal | |
| CN110135262A (en) | The anti-peeping processing method of sensitive data, device, equipment and storage medium | |
| CN108334761B (en) | User authority identification method and device | |
| CN108197585A (en) | Recognition algorithms and device | |
| CN104573440A (en) | Data viewing method and device | |
| CN112381091A (en) | Video content identification method and device, electronic equipment and storage medium | |
| KR20210036039A (en) | Electronic device and image processing method thereof | |
| CN110597426A (en) | Bright screen processing method and device, storage medium and terminal | |
| CN115359539A (en) | Office place information security detection method, device, equipment and storage medium | |
| US9197851B2 (en) | Apparatus and method for modulating images for videotelephony | |
| CN111695509A (en) | Identity authentication method, identity authentication device, machine readable medium and equipment | |
| CN115758364A (en) | Security detection method, device, equipment and medium | |
| US12277804B2 (en) | Spoof detection using catadioptric spatiotemporal corneal reflection dynamics | |
| US20250037509A1 (en) | System and method for determining liveness using face rotation | |
| JP2009156948A (en) | Display control apparatus, display control method, and display control program | |
| CN114140839A (en) | Image sending method, device and equipment for face recognition and storage medium | |
| US20250104065A1 (en) | Method for processing a transaction, system and corresponding program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| RJ01 | Rejection of invention patent application after publication | ||
| RJ01 | Rejection of invention patent application after publication |
Application publication date: 20191227 |