WO2018161893A1 - User identity identification method and apparatus - Google Patents
User identity identification method and apparatus
- Publication number
- WO2018161893A1 (PCT application PCT/CN2018/078139, CN2018078139W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- stroke
- trajectory
- feature information
- information
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
Definitions
- the present application relates to the field of communications technologies, and in particular, to a user identity identification method and apparatus.
- smart devices such as smart watches, smart bracelets, smart oximeters, and smart blood pressure monitors sometimes need to record sensitive data, such as the user's own medical data and motion data.
- the smart device is therefore required to recognize the identity of the current user.
- however, most smart devices do not have a touch screen, most do not add extra expensive sensors (such as fingerprint readers) for identification, in order to reduce cost, and most do not have strong computing power.
- AirSig is an identity authentication method based on gesture recognition.
- AirSig allows a user to input a password on a smart device that lacks a touch screen, expensive sensors, and abundant computing resources: the user holds the smart device and writes a gesture in the air, and the smart device can identify the identity of the air-gesture initiator.
- AirSig identifies the user's identity by means of an accelerometer and a gyroscope installed in the smart device: the real-time data output by the accelerometer and the gyroscope is compared with pre-stored feature information template data.
- because the real-time data output by the accelerometer and the gyroscope is compared directly with the pre-stored feature information template data, the posture in which the user's hand holds the smart device while producing the real-time data must be consistent with the posture used when the pre-stored feature information template was recorded, or recognition accuracy suffers; the method is therefore limited by the posture in which the user holds the smart device.
- the present application provides a user identification method and apparatus to solve the related-art problem that recognition is limited by the posture in which the user holds the smart device.
- the application provides a user identity identification method, including:
- constructing a trajectory of the user gesture according to the rotational angular velocity measured in real time by a gyroscope in the terminal device, the trajectory being composed of the three-dimensional coordinate points of the user's arm at each moment; performing stroke segmentation processing on the trajectory and extracting feature information for each stroke; and performing user identification according to the extracted feature information of each stroke and the feature information of each stroke corresponding to one track in at least one set of pre-stored feature information templates, where each set of pre-stored feature information templates includes the feature information of each stroke corresponding to at least one track.
- the trajectory of the user gesture is constructed according to the rotational angular velocity measured in real time by the gyroscope in the terminal device, stroke segmentation processing is performed on the trajectory, and feature information is extracted for each stroke; finally, user identification is performed according to the extracted feature information of each stroke and the feature information of each stroke corresponding to one track in at least one set of pre-stored feature information templates.
- because the feature information is extracted from the user gesture track and compared with the pre-stored feature information templates, recognition of the user identity is not limited by the posture in which the user holds the terminal device: the user identity can still be recognized in various postures, such as sitting, standing, and lying down, improving recognition accuracy and user experience.
- the feature information includes shape information and speed information, or the feature information includes length information, angle information, and speed information, or the feature information includes shape information, speed information, and acceleration information.
- the trajectory of the user gesture is constructed according to the rotational angular velocity measured by the gyroscope in real time, including:
- the trajectory point is computed as P_t = P_{t⁻} · C_t, where P_{t⁻} is the three-dimensional coordinate point of the moment before the current moment, and the three-dimensional coordinate point at which the user gesture starts is taken as the origin.
- the segmenting processing of the trajectory includes:
- the size of each stroke is normalized, including:
- the rotation normalizes each stroke, including:
- the axis of each stroke is the line segment from its starting point to its ending point; the three coordinate axes of the initial absolute coordinate system are the same as those of the terminal device coordinate system at the moment the user gesture starts, and the origin of the initial absolute coordinate system is the three-dimensional coordinate point of the position where the user's elbow is located.
- the feature information extraction is performed for each stroke, including:
- the first-order derivative sequence with respect to time of the three-dimensional positions of each stroke, sampled at equal time intervals, is calculated to obtain the velocity information of each stroke.
- the feature information extraction is performed for each stroke, including:
- the acceleration information of each stroke is calculated according to the speed information of each stroke.
- the segmenting processing of the trajectory includes:
- the segmentation points in the trajectory after the rotation normalization and the size normalization are determined according to the two-dimensional curvature, and at least one stroke is obtained.
- the rotation normalization of the trajectory includes:
- the normalizing the size of the trajectory includes:
- the width W and height H of the trajectory are calculated; each u(i) in the two-dimensional coordinate sequence is divided by W, and each v(i) in the two-dimensional coordinate sequence is divided by H.
- the feature information extraction is performed for each stroke, including:
- the first-order derivative sequence with respect to time of the three-dimensional positions of each stroke, sampled at equal time intervals, is calculated to obtain the velocity information of each stroke.
- the user identity is determined according to the feature information of each of the extracted strokes and the feature information of each of the corresponding strokes in the at least one set of pre-stored feature information templates, including:
- whether to accept the user who initiated the user gesture is determined based on the calculated DTW (dynamic time warping) distance and a preset threshold.
- the preset threshold is the sum of the average of the DTW distances between the feature information of each stroke corresponding to all the tracks in a set of pre-stored feature information templates and the standard deviation of those DTW distances.
- each set of pre-stored feature information templates carries a user identifier.
- the application provides a user identity identification device, including:
- a trajectory construction module configured to construct a trajectory of the user gesture according to a real-time measured rotational angular velocity of the gyroscope in the terminal device, where the trajectory is composed of three-dimensional coordinate points of the user's arm at each moment;
- a stroke segmentation processing module is configured to perform stroke segmentation processing on the trajectory;
- the information extraction module is configured to extract feature information for each segment of the stroke;
- the identification module is configured to perform user identification according to the extracted feature information of each stroke and the feature information of each stroke corresponding to one track in at least one set of pre-stored feature information templates, where each set of pre-stored feature information templates includes the feature information of each stroke corresponding to at least one track.
- the trajectory of the user gesture is constructed according to the rotational angular velocity measured in real time by the gyroscope in the terminal device, stroke segmentation processing is performed on the trajectory, and feature information is extracted for each stroke; finally, user identification is performed according to the extracted feature information of each stroke and the feature information of each stroke corresponding to one track in at least one set of pre-stored feature information templates.
- because the feature information is extracted from the user gesture track and compared with the pre-stored feature information templates, recognition of the user identity is not limited by the posture in which the user holds the terminal device: the user identity can still be recognized in various postures, such as sitting, standing, and lying down, improving recognition accuracy and user experience.
- the feature information includes shape information and speed information, or the feature information includes length information, angle information, and speed information, or the feature information includes shape information, speed information, and acceleration information.
- the trajectory building module is specifically configured to:
- the trajectory point is computed as P_t = P_{t⁻} · C_t, where P_{t⁻} is the three-dimensional coordinate point of the moment before the current moment, and the three-dimensional coordinate point at which the user gesture starts is taken as the origin.
- the stroke segmentation processing module includes:
- a first determining unit configured to determine a segment point in the trajectory according to the three-dimensional curvature, to obtain at least one stroke
- the first normalization unit is used for size normalization and rotation normalization of each stroke.
- the first normalization unit is specifically configured to:
- the axis of each stroke is the line segment from its starting point to its ending point; the three coordinate axes of the initial absolute coordinate system are the same as those of the terminal device coordinate system at the moment the user gesture starts, and the origin of the initial absolute coordinate system is the three-dimensional coordinate point of the position where the user's elbow is located.
- the information extraction module is specifically configured to:
- the first-order derivative sequence with respect to time of the three-dimensional positions of each stroke, sampled at equal time intervals, is calculated to obtain the velocity information of each stroke.
- the information extraction module is specifically configured to:
- the acceleration information of each stroke is calculated according to the speed information of each stroke.
- the stroke segmentation processing module includes:
- a second normalization unit configured to perform rotation normalization and size normalization on the trajectory
- a second determining unit configured to determine a segmentation point in the trajectory after the rotation normalization and the size normalization according to the two-dimensional curvature, to obtain at least one stroke.
- the second normalization unit is specifically configured to:
- the width W and height H of the trajectory are calculated; each u(i) in the two-dimensional coordinate sequence is divided by W, and each v(i) in the two-dimensional coordinate sequence is divided by H.
- the information extraction module is specifically configured to:
- the first-order derivative sequence with respect to time of the three-dimensional positions of each stroke, sampled at equal time intervals, is calculated to obtain the velocity information of each stroke.
- the identification module is specifically used to:
- whether to accept the user who initiated the user gesture is determined based on the calculated DTW (dynamic time warping) distance and a preset threshold.
- the preset threshold is the sum of the average of the DTW distances between the feature information of each stroke corresponding to all the tracks in a set of pre-stored feature information templates and the standard deviation of those DTW distances.
- each set of pre-stored feature information templates carries a user identifier.
- the application provides a user identity recognition apparatus, including:
- the memory is used to store program instructions
- the processor is configured to perform the user identification method in the first aspect or any possible design of the first aspect.
- the present application provides a readable storage medium storing execution instructions; when at least one processor of the user identification device executes the instructions, the user identification device performs the user identification method in the first aspect or any possible design of the first aspect.
- the present application provides a program product comprising an execution instruction stored in a readable storage medium.
- at least one processor of the user identification device can read the execution instructions from the readable storage medium and execute them, so that the user identification device implements the user identification method in the first aspect or any possible design of the first aspect.
- FIG. 1 is a flowchart of Embodiment 1 of a method for identifying a user identity according to the present application
- FIG. 2 is a schematic diagram of a training process
- FIG. 3 is a schematic diagram of the authentication identification process
- FIG. 4 is a flowchart of Embodiment 2 of a method for identifying a user identity according to the present application
- FIG. 5 is a flowchart of Embodiment 3 of a method for identifying a user identity according to the present application
- FIG. 6 is a flowchart of Embodiment 4 of a method for identifying a user identity according to the present application
- FIG. 7 is a schematic structural diagram of Embodiment 1 of a user identity identification apparatus according to the present application.
- FIG. 8 is a schematic structural diagram of Embodiment 2 of a user identity identification apparatus according to the present application.
- FIG. 9 is a schematic structural diagram of Embodiment 3 of a user identity identification apparatus according to the present application.
- the user identification method and apparatus provided by this application can be applied to various terminal devices, such as a mobile phone, a smart watch, or a smart blood pressure meter, to identify a user's identity; the terminal device does not need a touch screen.
- the user holds the terminal device and writes a gesture in the air, and the terminal device can identify the identity of the air-gesture initiator.
- the present application draws the user gesture track by using the gyroscope installed in the terminal device, extracts feature information from the gesture track, and compares it with pre-stored feature templates to recognize the user's identity, rather than directly comparing the real-time accelerometer and gyroscope output with pre-stored feature information template data; recognition is therefore not limited by the posture in which the user holds the terminal device.
- the user identity can still be recognized in various postures, such as sitting, standing, and lying down.
- FIG. 1 is a flowchart of Embodiment 1 of a user identification method of the present application. As shown in FIG. 1 , the method in this embodiment may include:
- the terminal device is equipped with a gyroscope sensor; the user can hold the terminal device in the hand, or wear the device containing the gyroscope on the wrist, and then write in the air as input to the terminal device, and the gyroscope can acquire the hand's movement.
- even when users perform the same gesture, each user's trajectory is different: according to the principles of kinesiology, when a person wants to make a gesture, the brain first segments the gesture, and each segment approximates an arc; because nerves control muscles differently from person to person, no segment is a perfect arc, and the velocity from start to finish within each stroke follows a roughly normal distribution that carries personal characteristics. Handwriting is based on the same principle.
- to construct the trajectory of the user gesture, two coordinate systems are designed: the device coordinate system and the initial absolute coordinate system; the device coordinate system is the coordinate system of the terminal device itself, defined by the terminal device.
- the three coordinate axes of the initial absolute coordinate system are the same as those of the terminal device coordinate system at the start of the user gesture, and the origin of the initial absolute coordinate system is the three-dimensional coordinate point of the position where the user's elbow is located.
- the human arm is divided into an upper arm and a forearm, the upper arm connects the elbow and the shoulder, and the forearm connects the wrist and the elbow. When the user writes in the air, the movement is not very large.
- the main part of the displacement is the movement of the wrist relative to the elbow.
- This process can be regarded as the rigid body (forearm) moving around the elbow (the fulcrum).
- the gyroscope records the angular velocity of the rotation, so the motion trajectory can be calculated as long as the length of the rigid body is known.
- the trajectory of the user gesture is constructed according to the rotational angular velocity measured by the gyroscope in real time, which may be:
- the trajectory point is computed as P_t = P_{t⁻} · C_t, where P_{t⁻} is the three-dimensional coordinate point of the moment before the current moment, and the three-dimensional coordinate point at which the user gesture starts is taken as the origin.
- the user's forearm can be regarded as a rigid body moving around the elbow, that is, a straight line moves around the origin in the initial absolute coordinate system.
- the gyroscope can obtain the angular velocity of the current device rotation, and thus can calculate the rotation matrix of the device posture change from the previous moment to the current moment, and this rotation matrix also describes the posture change of the user's arm.
- the current rotation matrix C_t can be written, to first order, as C_t = C_{t⁻} · (I + [ω]_× · Δt), where (ω_x, ω_y, ω_z) is the rotational angular velocity at the current time t obtained by the gyroscope, [ω]_× is its skew-symmetric matrix, C_{t⁻} is the rotation matrix of the previous time t⁻, I is the unit matrix, and Δt is the sampling interval.
- in this way, the three-dimensional coordinate point of the user's arm at each moment is obtained, that is, the trajectory of the user's gesture is obtained.
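As a concrete illustration of this construction, here is a minimal Python sketch (assuming numpy, a fixed sampling period dt, and an assumed initial wrist position one forearm length along the X axis; none of these constants are fixed by the text). It applies the first-order update C_t = C_{t⁻} · (I + [ω]_× · Δt) and P_t = P_{t⁻} · C_t described above.

```python
import numpy as np

def skew(w):
    """Skew-symmetric (cross-product) matrix of an angular velocity (wx, wy, wz)."""
    wx, wy, wz = w
    return np.array([[0.0, -wz,  wy],
                     [ wz, 0.0, -wx],
                     [-wy,  wx, 0.0]])

def build_trajectory(gyro_samples, dt=0.01, forearm_len=0.3):
    """Integrate gyroscope readings (N x 3 array, rad/s) into a 3-D wrist trajectory.

    The initial absolute coordinate system has its origin at the elbow; the
    forearm is treated as a rigid body rotating about that origin.  The
    starting wrist position along +X is an assumption for illustration.
    """
    p = np.array([forearm_len, 0.0, 0.0])  # wrist position at gesture start
    trajectory = [p.copy()]
    for w in gyro_samples:
        C_t = np.eye(3) + skew(w) * dt     # per-step rotation, first order
        p = p @ C_t                        # P_t = P_{t-} * C_t (row-vector convention)
        trajectory.append(p.copy())
    return np.asarray(trajectory)
```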
- S102. Perform stroke segmentation processing on the trajectory, and perform feature information extraction on each stroke.
- the trajectory needs to be segmented; the principle of segmentation is that each stroke approximates an arc, and distinguishing different people requires extracting the features hidden in each stroke. Within a trajectory, a segmentation point is the junction of two strokes, that is, the turning point between them.
- normalization is then performed on each stroke so that the personal characteristics of each stroke can be extracted.
- stroke segmentation processing is performed on the trajectory, which may specifically be:
- the segmentation points in the trajectory are determined according to the three-dimensional curvature, at least one stroke is obtained, and then each of the strokes is subjected to size normalization and rotation normalization.
- a threshold on the curvature is set to determine whether a point is a segmentation point, as sketched below.
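A minimal sketch of this segmentation step, assuming a three-point (circumradius-based) discrete curvature estimate and an illustrative threshold value, neither of which the text pins down:

```python
import numpy as np

def curvature_3d(points):
    """Discrete curvature at each interior point of a 3-D polyline:
    kappa = 4 * triangle_area / (a * b * c) for consecutive point triples."""
    p0, p1, p2 = points[:-2], points[1:-1], points[2:]
    a = np.linalg.norm(p1 - p0, axis=1)
    b = np.linalg.norm(p2 - p1, axis=1)
    c = np.linalg.norm(p2 - p0, axis=1)
    area = 0.5 * np.linalg.norm(np.cross(p1 - p0, p2 - p0), axis=1)
    return 4.0 * area / np.maximum(a * b * c, 1e-12)

def segment_strokes(points, kappa_thresh=5.0):
    """Split a trajectory into strokes at high-curvature turning points."""
    kappa = curvature_3d(points)
    cut_idx = np.where(kappa > kappa_thresh)[0] + 1  # offset: interior points
    strokes, start = [], 0
    for cut in cut_idx:
        if cut - start > 2:              # skip degenerate fragments
            strokes.append(points[start:cut + 1])
            start = cut
    strokes.append(points[start:])
    return strokes
```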
- the size normalization of each stroke can be: divide each stroke by the total length of the trajectory, so that the track length of the entire gesture becomes unity.
- the rotation normalization of each stroke can be: rotate the axis of each stroke to be parallel to the X axis of the initial absolute coordinate system.
- the axis of each stroke is a line segment from the starting point to the ending point.
- the three coordinate axes of the initial absolute coordinate system are the same as those of the terminal device coordinate system at the start of the user gesture, and the origin of the initial absolute coordinate system is the three-dimensional coordinate point of the position where the user's elbow is located.
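In code, the two per-stroke normalizations might look as follows; the Rodrigues-based axis alignment is an assumed implementation detail (any rotation taking the stroke axis onto the X axis would serve):

```python
import numpy as np

def _skew(a):
    return np.array([[0.0, -a[2], a[1]],
                     [a[2], 0.0, -a[0]],
                     [-a[1], a[0], 0.0]])

def align_to_x(axis_vec):
    """Rotation matrix taking the unit vector axis_vec onto the X axis
    (Rodrigues' rotation formula)."""
    x = np.array([1.0, 0.0, 0.0])
    v = np.cross(axis_vec, x)
    s, c = np.linalg.norm(v), float(np.dot(axis_vec, x))
    if s < 1e-12:                             # already (anti-)parallel to X
        if c > 0:
            return np.eye(3)
        K = _skew(np.array([0.0, 0.0, 1.0]))  # 180 degrees about Z
        return np.eye(3) + 2.0 * (K @ K)
    K = _skew(v / s)
    return np.eye(3) + s * K + (1.0 - c) * (K @ K)

def normalize_stroke(stroke, gesture_length):
    """Divide a stroke by the whole gesture's track length, then rotate it so
    its axis (start-to-end segment) is parallel to the X axis."""
    pts = stroke / gesture_length
    axis_vec = pts[-1] - pts[0]
    axis_vec = axis_vec / np.linalg.norm(axis_vec)
    R = align_to_x(axis_vec)
    return (pts - pts[0]) @ R.T               # rotate about the stroke's start
```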
- the feature information includes shape information and speed information, or the feature information includes length information, angle information, and speed information, or the feature information includes shape information, speed information, and acceleration information.
- feature information is extracted for each stroke, which may be:
- the three-dimensional coordinate sequence of each stroke is extracted at equal time intervals to obtain the shape information of each stroke; the first-order derivative sequence of the three-dimensional positions with respect to time is calculated to obtain the velocity information of each stroke: v_t = (P_t − P_{t⁻}) / (t − t⁻), where P_t is the position at the current moment, P_{t⁻} is the position at the previous moment, t is the current moment, and t⁻ is the previous moment.
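A sketch of the shape and velocity extraction under the formula above, assuming linear interpolation onto a fixed number of equally spaced time samples (the resampling resolution is an assumption, not given in the text):

```python
import numpy as np

def extract_shape_velocity(stroke, t, n_samples=32):
    """Resample a stroke at equal time intervals, then compute the shape
    (position sequence) and velocity (first time-derivative) features.

    stroke: (N, 3) points; t: (N,) timestamps.
    """
    t_uniform = np.linspace(t[0], t[-1], n_samples)
    shape = np.column_stack([np.interp(t_uniform, t, stroke[:, k])
                             for k in range(3)])
    dt = t_uniform[1] - t_uniform[0]
    velocity = np.diff(shape, axis=0) / dt   # v_t = (P_t - P_{t-}) / (t - t-)
    return shape, velocity
```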
- alternatively, feature information is extracted for each stroke as length information, angle information, and velocity information, or as shape information, velocity information, and acceleration information, as described for the corresponding modules below.
- alternatively, the stroke segmentation processing of the trajectory may specifically be:
- perform rotation normalization and size normalization on the trajectory, then determine the segmentation points in the normalized trajectory according to the two-dimensional curvature to obtain at least one stroke.
- the rotation of the trajectory is normalized, which may be:
- u(i) is the Y-axis value of the i-th three-dimensional coordinate point of the trajectory projected onto the Y–Z plane, v(i) is the Z-axis value of that projected point, and N is the number of three-dimensional coordinate points constituting the trajectory.
- the axis of minimum moment of inertia of the projected trajectory is found, and the trajectory is rotated so that this axis is parallel to the Y axis of the Y–Z plane.
- the size normalization of the trajectory may be: the width W and height H of the projected trajectory are calculated; each u(i) in the two-dimensional coordinate sequence is divided by W, and each v(i) is divided by H. Both steps are sketched below.
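A minimal sketch of this 2-D normalization, assuming the minimum-inertia axis is obtained from central second moments (one standard way to realize it):

```python
import numpy as np

def normalize_projection(u, v):
    """Rotation- and size-normalize a trajectory projected on the Y-Z plane;
    u holds the Y values and v the Z values of the N projected points."""
    uc, vc = u - u.mean(), v - v.mean()
    # Orientation of the minimum-inertia (major) axis from second moments.
    theta = 0.5 * np.arctan2(2.0 * np.sum(uc * vc),
                             np.sum(uc ** 2) - np.sum(vc ** 2))
    c, s = np.cos(-theta), np.sin(-theta)    # rotate that axis onto Y
    ur, vr = c * uc - s * vc, s * uc + c * vc
    # Size normalization: divide by the trajectory's width W and height H.
    W = max(ur.max() - ur.min(), 1e-12)
    H = max(vr.max() - vr.min(), 1e-12)
    return ur / W, vr / H
```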
- the feature information of the stroke is extracted for each stroke.
- the pre-stored feature information template includes feature information of each segment of the stroke corresponding to the at least one track.
- user identification is then performed. If the terminal device has not previously been trained with the user's gestures, the terminal device will prompt the user to perform multiple training repetitions (for example, 5 to 10 times).
- for each repetition, the terminal device extracts the feature information corresponding to the user's gesture track according to the feature-extraction process described above, and stores the feature information of the multiple gesture tracks as a set of pre-stored feature information templates.
- the device can then enter the authentication state.
- a set of pre-stored feature information templates identifies one user; when multiple users use the device, multiple sets of pre-stored feature information templates can be stored, and each set can carry a user identifier to distinguish different users.
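A small data-structure sketch of such a template store (the class and method names are illustrative, not from the source):

```python
from collections import defaultdict

class TemplateStore:
    """Groups of pre-stored feature information templates keyed by a user
    identifier; each group holds the per-stroke features of several
    training tracks (e.g. the 5 to 10 repetitions mentioned above)."""

    def __init__(self):
        self._groups = defaultdict(list)

    def enroll(self, user_id, track_features):
        """track_features: the per-stroke feature sequences of one gesture track."""
        self._groups[user_id].append(track_features)

    def templates_for(self, user_id):
        return self._groups[user_id]
```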
- user identification is performed according to the extracted feature information of each stroke and the feature information of each corresponding stroke in the pre-stored feature information templates, which may be:
- DTW (dynamic time warping) distances are computed between the extracted features and the template features.
- the DTW distances of the shape information and of the velocity information between each stroke and the corresponding template stroke are calculated separately. Because the two kinds of feature information, velocity and shape, have different physical units, their DTW distances must each be normalized to the range 0–1 before being added to form the overall DTW distance.
- each gesture trajectory template determines whether to accept the current user according to the preset threshold: the user is accepted when the DTW distance between the currently authenticated trajectory and more than half of the templates is less than the preset threshold.
- the preset threshold can be set, for example, to the average of the DTW distances between the feature information of each stroke corresponding to all the tracks in the pre-stored feature information templates, plus one standard deviation.
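The matching step could be sketched as follows, assuming plain DTW with Euclidean local cost; the constants that scale each feature channel's distance into 0–1 are assumed inputs (e.g. derived from the training data), since the text does not fix them:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two feature sequences
    (rows are time steps), with Euclidean local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def track_distance(strokes_a, strokes_b):
    """Per-feature track distance: sum of per-stroke DTW distances."""
    return sum(dtw_distance(a, b) for a, b in zip(strokes_a, strokes_b))

def preset_threshold(pairwise_distances):
    """Mean of the DTW distances between the training tracks of a template
    group, plus one standard deviation."""
    d = np.asarray(pairwise_distances)
    return d.mean() + d.std()

def accept_user(sample, templates, threshold, norm_shape, norm_speed):
    """Majority vote over the stored templates.  sample and each template are
    lists of per-stroke (shape, speed) feature pairs; norm_shape/norm_speed
    scale each channel's DTW distance into [0, 1]."""
    votes = 0
    for tpl in templates:
        d_shape = track_distance([s for s, _ in sample],
                                 [s for s, _ in tpl]) / norm_shape
        d_speed = track_distance([v for _, v in sample],
                                 [v for _, v in tpl]) / norm_speed
        if d_shape + d_speed < threshold:
            votes += 1
    return votes > len(templates) / 2
```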
- take feature information including shape information, speed information, and acceleration information as an example.
- the user can start the gesture recognition process of the terminal device in an active or passive manner.
- once the terminal device prompts that the recognition process has started, the user can begin to make a gesture, and the terminal device starts to record the gyroscope data during the gesture.
- Users can customize the style of gestures, such as five-pointed stars, Chinese characters, and so on.
- the recognition may be ended in an active or passive manner.
- when the device recognizes that the gesture has ended, trajectory construction, personal feature information extraction, and recognition are started; if the user exists in the database, the identified user is shown on the display screen and the user is accepted to log in to the terminal device, and if the user does not exist in the database, login is forbidden.
- the user identification method provided in this embodiment constructs a trajectory of the user gesture according to the rotational angular velocity measured in real time by the gyroscope in the terminal device, performs stroke segmentation processing on the trajectory, extracts feature information for each stroke, and finally performs user identification according to the extracted feature information of each stroke and the feature information of each stroke corresponding to one track in at least one set of pre-stored feature information templates.
- because the feature information is extracted from the user gesture track and compared with the pre-stored feature information templates, recognition of the user identity is not limited by the posture in which the user holds the terminal device: the user identity can still be recognized in various postures, such as sitting, standing, and lying down, improving recognition accuracy and user experience.
- FIG. 2 is a schematic diagram of the training process, which includes:
- the terminal device prompts the user to perform multiple trainings (for example, 5 to 10 times), and stores feature information of multiple gesture trajectories as a pre-stored feature information template.
- FIG. 3 is a schematic diagram of the authentication and identification process, which includes:
- if the user is authenticated successfully, the corresponding command is executed.
- the following describes the user identification method of the present application in detail in three different usage scenarios.
- FIG. 4 is a flowchart of Embodiment 2 of the user identification method of the present application. As shown in FIG. 4, the method in this embodiment may include:
- the rotation matrix C_t of the terminal device posture change from the previous moment to the current moment is calculated according to the rotational angular velocity of the current time, and the trajectory point is updated as P_t = P_{t⁻} · C_t.
- the user's forearm can be regarded as a rigid body moving around the elbow, that is, a straight line moves around the origin in the initial absolute coordinate system.
- the gyroscope can obtain the angular velocity of the current device rotation, and thus can calculate the rotation matrix of the device posture change from the previous moment to the current moment, and this rotation matrix also describes the posture change of the user's arm.
- the current rotation matrix C_t can be written, to first order, as C_t = C_{t⁻} · (I + [ω]_× · Δt), where (ω_x, ω_y, ω_z) is the rotational angular velocity at the current time t obtained by the gyroscope, [ω]_× is its skew-symmetric matrix, C_{t⁻} is the rotation matrix of the previous time t⁻, I is the unit matrix, and Δt is the sampling interval.
- in this way, the three-dimensional coordinate point of the user's arm at each moment is obtained, that is, the trajectory of the user's gesture is obtained.
- S402. Determine a segmentation point in the trajectory according to the three-dimensional curvature, and obtain at least one stroke.
- each stroke is divided by the length of the trajectory.
- the track length of the entire gesture becomes unity.
- the axis of each stroke is a line segment from the starting point to the ending point.
- the DTW distances of the shape information and of the velocity information between each stroke and the corresponding template stroke are calculated separately. For each kind of feature information, since there are multiple strokes, the DTW distance between the gesture track and the corresponding stroke in the template is calculated stroke by stroke, and the DTW distances of all strokes are added as the DTW distance of that feature between the two tracks. It is worth noting that because the two kinds of feature information, velocity and shape, have different physical units, their DTW distances must each be normalized to the range 0–1 before being added to form the overall DTW distance.
- each gesture trajectory template determines whether to accept the current user according to the preset threshold: the user is accepted when the DTW distance between the currently authenticated trajectory and more than half of the templates is less than the preset threshold.
- the preset threshold can be set, for example, to the average of the DTW distances between the feature information of each stroke corresponding to all the tracks in the pre-stored feature information templates, plus one standard deviation.
- FIG. 5 is a flowchart of Embodiment 3 of the user identification method of the present application. As shown in FIG. 5, the method in this embodiment may include:
- the rotation of the trajectory is normalized, which may be:
- u(i) is the Y-axis value of the i-th three-dimensional coordinate point of the trajectory projected onto the Y–Z plane, v(i) is the Z-axis value of that projected point, and N is the number of three-dimensional coordinate points constituting the trajectory.
- the axis of minimum moment of inertia of the projected trajectory is found, and the trajectory is rotated so that this axis is parallel to the Y axis of the Y–Z plane.
- the size normalization of the trajectory may be: the width W and height H of the projected trajectory are calculated; each u(i) in the two-dimensional coordinate sequence is divided by W, and each v(i) is divided by H.
- each group of pre-stored feature information templates may carry a user identifier for distinguishing different users.
- FIG. 6 is a flowchart of Embodiment 4 of the user identification method of the present application. As shown in FIG. 6, the method in this embodiment may include:
- S601. Construct a trajectory of the user gesture according to the rotational angular velocity measured in real time by the gyroscope in the terminal device, the trajectory being composed of the three-dimensional coordinate points of the user's arm at each moment.
- S602. Determine a segmentation point in the trajectory according to the three-dimensional curvature, and obtain at least one stroke.
- each stroke is divided by the length of the trajectory.
- the track length of the entire gesture becomes unity.
- the axis of each stroke is a line segment from the starting point to the ending point.
- the acceleration information of each stroke is calculated according to the speed information of each stroke: a_t = (v_t − v_{t⁻}) / (t − t⁻), where v_t is the speed at the current moment and v_{t⁻} is the speed at the previous moment.
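With a uniform sampling period this is a single finite difference; a minimal sketch:

```python
import numpy as np

def acceleration(velocity, dt):
    """a_t = (v_t - v_{t-}) / (t - t-): first difference of the velocity
    sequence over a uniform sampling period dt."""
    return np.diff(velocity, axis=0) / dt
```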
- FIG. 7 is a schematic structural diagram of Embodiment 1 of a user identity identification apparatus according to the present application.
- the user identity identification apparatus may be implemented as part or all of a terminal device by using software, hardware, or a combination of the two.
- the apparatus of this embodiment of the present invention is shown in FIG. 7.
- the apparatus may include: a trajectory construction module 11, a stroke segmentation processing module 12, an information extraction module 13, and an identification module 14, wherein
- the trajectory construction module 11 is configured to construct a trajectory of the user gesture according to the rotational angular velocity measured in real time by the gyroscope in the terminal device, the trajectory being composed of the three-dimensional coordinate points of the user's arm at each moment.
- the stroke segmentation processing module 12 is configured to perform stroke segmentation processing on the trajectory.
- the information extraction module 13 is configured to perform feature information extraction for each piece of strokes.
- the identification module 14 is configured to perform user identification according to the extracted feature information of each stroke and the feature information of each stroke corresponding to one track in at least one set of pre-stored feature information templates, where each set of pre-stored feature information templates includes the feature information of each stroke corresponding to at least one track.
- the feature information includes shape information and speed information, or the feature information includes length information, angle information, and speed information, or the feature information includes shape information, speed information, and acceleration information.
- the trajectory construction module 11 is specifically configured to:
- compute the trajectory point as P_t = P_{t⁻} · C_t, where P_{t⁻} is the three-dimensional coordinate point of the moment before the current moment, and the three-dimensional coordinate point at which the user gesture starts is taken as the origin.
- the device in this embodiment may be used to implement the technical solution of the method embodiment shown in FIG. 1 , and the implementation principle is similar, and details are not described herein again.
- the user identity recognition apparatus constructs a trajectory of the user gesture according to the rotational angular velocity measured in real time by the gyroscope in the terminal device, performs stroke segmentation processing on the trajectory, extracts feature information for each stroke, and finally performs user identification according to the extracted feature information of each stroke and the feature information of each stroke corresponding to one track in at least one set of pre-stored feature information templates.
- because the feature information is extracted from the user gesture track and compared with the pre-stored feature information templates, recognition of the user identity is not limited by the posture in which the user holds the terminal device: the user identity can still be recognized in various postures, such as sitting, standing, and lying down, improving recognition accuracy and user experience.
- FIG. 8 is a schematic structural diagram of Embodiment 2 of the user identity identification apparatus of the present application. As shown in FIG. 8, the apparatus of this embodiment is based on the apparatus structure shown in FIG. 7; further, the stroke segmentation processing module 12 includes a first determining unit 121 and a first normalization unit 122. The first determining unit 121 is configured to determine the segmentation points in the trajectory according to the three-dimensional curvature to obtain at least one stroke, and the first normalization unit 122 is configured to perform size normalization and rotation normalization on each stroke.
- the first normalization unit 122 is specifically configured to:
- the axis of each stroke is the line segment from its starting point to its ending point; the three coordinate axes of the initial absolute coordinate system are the same as those of the terminal device coordinate system at the start of the user gesture, and the origin of the initial absolute coordinate system is the three-dimensional coordinate point of the position where the user's elbow is located.
- the information extraction module 13 is specifically configured to:
- the three-dimensional coordinate sequence of each stroke is extracted at equal time intervals to obtain the shape information of each stroke, and the first-order derivative sequence of the three-dimensional positions of each stroke with respect to time is calculated to obtain the velocity information of each stroke.
- the information extraction module 13 is specifically configured to:
- extract the three-dimensional coordinate sequence of each stroke at equal time intervals to obtain the shape information of each stroke, calculate the first-order derivative sequence of the three-dimensional positions of each stroke with respect to time to obtain the speed information of each stroke, and calculate the acceleration information of each stroke according to its speed information.
- the device in this embodiment may be used to implement the technical solution of the method embodiment shown in FIG. 1 , and the implementation principle is similar, and details are not described herein again.
- FIG. 9 is a schematic structural diagram of Embodiment 3 of the user identity identification apparatus of the present application.
- the apparatus of this embodiment is based on the apparatus structure shown in FIG. 7; further, the stroke segmentation processing module 12 includes a second normalization unit 123 and a second determining unit 124.
- the second normalization unit 123 is configured to perform rotation normalization and size normalization on the trajectory.
- the second determining unit 124 is configured to determine the segmentation points in the trajectory after rotation normalization and size normalization according to the two-dimensional curvature, to obtain at least one stroke.
- the second normalization unit 123 is specifically configured to:
- u(i) is the Y-axis value of the i-th three-dimensional coordinate point of the trajectory projected onto the Y–Z plane, v(i) is the Z-axis value of that projected point, and N is the number of three-dimensional coordinate points constituting the trajectory.
- the information extraction module 13 is specifically configured to:
- calculate the length of each stroke to obtain the length information of each stroke; calculate the angle between every two consecutive strokes, the angle being the angle between the minimum-inertia axes of the two strokes, to obtain the angle information between consecutive strokes; and calculate the first-order derivative sequence of the three-dimensional positions of each stroke, sampled at equal time intervals, with respect to time to obtain the velocity information of each stroke.
- the identification module 14 is specifically configured to:
- the dynamic time warping (DTW) distance between the extracted feature information of each stroke and the feature information of each corresponding stroke in a set of pre-stored feature information templates is calculated, and whether to accept the user who initiated the user gesture is determined according to the calculated DTW distance and the preset threshold.
- the preset threshold is the average of the DTW distances between the feature information of each stroke corresponding to all the tracks in the set of pre-stored feature information templates, plus one standard deviation.
- each set of pre-stored feature information templates carries a user identifier.
- the device in this embodiment may be used to implement the technical solution of the method embodiment shown in FIG. 1 , and the implementation principle is similar, and details are not described herein again.
- the application also provides a user identity recognition device, including: a memory and a processor;
- the memory is used to store program instructions
- the processor is configured to execute the user identity identification method in the foregoing method embodiment.
- the application further provides a readable storage medium storing execution instructions; when at least one processor of the user identification device executes the instructions, the user identification device performs the user identity identification method in the foregoing method embodiments.
- the application also provides a program product comprising an execution instruction stored in a readable storage medium.
- At least one processor of the user identification device can read the execution instruction from a readable storage medium, and the at least one processor executes the execution instruction such that the user identification device implements the user identification method in the above method embodiment.
- the aforementioned program can be stored in a computer readable storage medium.
- when executed, the program performs the steps of the foregoing method embodiments; the foregoing storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Security & Cryptography (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
- Image Analysis (AREA)
Abstract
The present invention relates to a user identification method and device. The method comprises: constructing a trajectory for a user gesture according to the rotational angular velocity measured in real time by a gyroscope of a terminal device, the trajectory being composed of the three-dimensional coordinates of the user's arm at each moment (S101); performing stroke segmentation processing on the trajectory and extracting feature information from each stroke segment (S102); and performing user identification according to the feature information extracted from each stroke segment and the feature information of each stroke segment corresponding to one trajectory in at least one group of pre-stored feature information templates, each group of pre-stored feature information templates comprising the feature information of each stroke segment corresponding to at least one trajectory (S103). User identification is therefore not limited by how the handheld terminal is held when making the recognized gestures, which increases identification accuracy and improves the user experience.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710128556.2A CN108536314A (zh) | 2017-03-06 | 2017-03-06 | 用户身份识别方法及装置 |
| CN201710128556.2 | 2017-03-06 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018161893A1 true WO2018161893A1 (fr) | 2018-09-13 |
Family
ID=63447267
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2018/078139 Ceased WO2018161893A1 (fr) | 2017-03-06 | 2018-03-06 | Procédé et dispositif d'identification d'utilisateur |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN108536314A (fr) |
| WO (1) | WO2018161893A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN117633276A (zh) * | 2024-01-25 | 2024-03-01 | 江苏欧帝电子科技有限公司 | 一种书写轨迹录播方法、系统及终端 |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109409316B (zh) * | 2018-11-07 | 2022-04-01 | 极鱼(北京)科技有限公司 | 空中签名方法及装置 |
| CN110490059A (zh) * | 2019-07-10 | 2019-11-22 | 广州幻境科技有限公司 | 一种可穿戴智能戒指的手势识别方法、系统及装置 |
| CN110942042B (zh) * | 2019-12-02 | 2022-11-08 | 深圳棒棒帮科技有限公司 | 一种三维手写签名认证方法、系统、存储介质及设备 |
| CN112598424A (zh) * | 2020-12-29 | 2021-04-02 | 武汉天喻聚联科技有限公司 | 一种基于动作口令的鉴权方法及系统 |
| CN116630993B (zh) * | 2023-05-12 | 2024-09-06 | 北京竹桔科技有限公司 | 身份信息记录方法、装置、计算机设备和存储介质 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103034429A (zh) * | 2011-10-10 | 2013-04-10 | 北京千橡网景科技发展有限公司 | 用于触摸屏的身份验证方法和装置 |
| CN103295028A (zh) * | 2013-05-21 | 2013-09-11 | 深圳Tcl新技术有限公司 | 手势操作控制方法、装置及智能显示终端 |
| CN103631501A (zh) * | 2013-10-11 | 2014-03-12 | 金硕澳门离岸商业服务有限公司 | 基于手势控制的数据传输方法 |
| CN105431813A (zh) * | 2013-05-20 | 2016-03-23 | 微软技术许可有限责任公司 | 基于生物计量身份归属用户动作 |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB0515796D0 (en) * | 2005-07-30 | 2005-09-07 | Mccarthy Peter | A motion capture and identification device |
| CN102810008B (zh) * | 2012-05-16 | 2016-01-13 | 北京捷通华声语音技术有限公司 | 一种空中输入系统、方法及空中输入采集设备 |
| CN103257711B (zh) * | 2013-05-24 | 2016-01-20 | 河南科技大学 | 空间手势输入方法 |
| CN103679213B (zh) * | 2013-12-13 | 2017-02-08 | 电子科技大学 | 一种3d手势识别方法 |
| CN103927532B (zh) * | 2014-04-08 | 2017-11-03 | 武汉汉德瑞庭科技有限公司 | 基于笔画特征的笔迹配准方法 |
| CN103984416B (zh) * | 2014-06-10 | 2017-02-08 | 北京邮电大学 | 一种基于加速度传感器的手势识别方法 |
| KR20160133305A (ko) * | 2015-05-12 | 2016-11-22 | 삼성전자주식회사 | 제스쳐 인식 방법, 컴퓨팅 장치 및 제어 장치 |
| CN105630174A (zh) * | 2016-01-22 | 2016-06-01 | 上海斐讯数据通信技术有限公司 | 一种智能终端拨号的系统和方法 |
| CN105912910A (zh) * | 2016-04-21 | 2016-08-31 | 武汉理工大学 | 基于手机传感的在线签名身份认证方法及系统 |
- 2017-03-06: CN CN201710128556.2A patent/CN108536314A/zh active Pending
- 2018-03-06: WO PCT/CN2018/078139 patent/WO2018161893A1/fr not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103034429A (zh) * | 2011-10-10 | 2013-04-10 | 北京千橡网景科技发展有限公司 | 用于触摸屏的身份验证方法和装置 |
| CN105431813A (zh) * | 2013-05-20 | 2016-03-23 | 微软技术许可有限责任公司 | 基于生物计量身份归属用户动作 |
| CN103295028A (zh) * | 2013-05-21 | 2013-09-11 | 深圳Tcl新技术有限公司 | 手势操作控制方法、装置及智能显示终端 |
| CN103631501A (zh) * | 2013-10-11 | 2014-03-12 | 金硕澳门离岸商业服务有限公司 | 基于手势控制的数据传输方法 |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN117633276A (zh) * | 2024-01-25 | 2024-03-01 | 江苏欧帝电子科技有限公司 | 一种书写轨迹录播方法、系统及终端 |
| CN117633276B (zh) * | 2024-01-25 | 2024-06-07 | 江苏欧帝电子科技有限公司 | 一种书写轨迹录播方法、系统及终端 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN108536314A (zh) | 2018-09-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2018161893A1 (fr) | Procédé et dispositif d'identification d'utilisateur | |
| Tian et al. | KinWrite: Handwriting-Based Authentication Using Kinect. | |
| US9965608B2 (en) | Biometrics-based authentication method and apparatus | |
| CN106897592B (zh) | 用户认证方法、用户认证设备以及书写工具 | |
| CN107679446B (zh) | 人脸姿态检测方法、装置及存储介质 | |
| CN106354252B (zh) | 一种基于stdw的连续字符手势轨迹识别方法 | |
| CN109829368B (zh) | 手掌特征的识别方法、装置、计算机设备及存储介质 | |
| US20150177842A1 (en) | 3D Gesture Based User Authorization and Device Control Methods | |
| TWI569176B (zh) | 手寫軌跡識別方法與系統 | |
| Zhang et al. | Fine-grained and real-time gesture recognition by using IMU sensors | |
| CN105980973A (zh) | 用户认证手势 | |
| JP2013206002A (ja) | 非接触生体認証装置 | |
| Lu et al. | Gesture on: Enabling always-on touch gestures for fast mobile access from the device standby mode | |
| US20160291704A1 (en) | Disambiguation of styli by correlating acceleration on touch inputs | |
| CN106411952A (zh) | 一种隔空动态手势用户身份认证方法及装置 | |
| Mendels et al. | User identification for home entertainment based on free-air hand motion signatures | |
| CN102411712B (zh) | 基于笔迹的身份识别的方法及终端 | |
| EP4544378A1 (fr) | Détection et suivi d'objet dans des dispositifs de réalité étendue | |
| CN110472389A (zh) | 基于触控手势的身份认证系统及其工作方法 | |
| KR102629007B1 (ko) | 사용자 인증 방법 및 시스템 | |
| Ciuffo et al. | Smartwatch-based transcription biometrics | |
| CN103440447A (zh) | 具有攻击者身份识别能力的在线笔迹身份认证方法 | |
| CN111754571B (zh) | 一种姿态识别方法、装置及其存储介质 | |
| US10180717B2 (en) | Information processing device, information processing method, and program | |
| Robledo-Moreno et al. | AirSignatureDB: Exploring In-Air Signature Biometrics in the Wild and its Privacy Concerns |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18764482; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18764482; Country of ref document: EP; Kind code of ref document: A1 |