WO2018121635A1 - Gaze point mapping function determination method and apparatus, and gaze point determination method and apparatus - Google Patents
Gaze point mapping function determination method and apparatus, and gaze point determination method and apparatus
- Publication number
- WO2018121635A1 (PCT application PCT/CN2017/119182; priority application CN2017119182W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- solution
- parameter
- determining
- gaze point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
Definitions
- the present invention relates to the field of human-computer interaction technology, and in particular, to a gaze point mapping function determining method and apparatus, and a gaze point determining method and apparatus.
- With the development of human-computer interaction, the technology of controlling a display screen through eye movement (referred to as eye control technology) has gradually matured. When this technology is applied and the user's eyes look at a location on the display screen, the icon at that location can be opened, closed, deleted, and so on.
- When implementing eye control technology, the user needs to gaze at several calibration points on the display for line-of-sight calibration. A VR (Virtual Reality) system acquires an eye image while the user gazes at each calibration point, and determines the gaze point mapping function from the image features of each eye image and the position of the calibration point corresponding to each eye image. The role of the gaze point mapping function is to establish the mapping between the user's eye image and the user's gaze point information. After the user completes line-of-sight calibration, when the user uses the VR system, the system can compute the user's gaze point information from the image features of the user's eye image and the determined gaze point mapping function.
- At least some embodiments of the present invention provide a gaze point mapping function determining method and apparatus and a gaze point determining method and apparatus, to at least partially solve the problem in the related art that determining a gaze point mapping function with multiple calibration points imposes a large workload on the user and harms the user experience.
- An embodiment of the present invention provides a gaze point mapping function determining method, the method comprising: combining all the parameters to be solved in a gaze point mapping function to obtain a mapping vector of the gaze point mapping function, and determining the priority solution parameters and secondary solution parameters corresponding to the mapping vector; acquiring an eye image of a first user, and solving the first user's priority solution parameters according to the eye image and the gaze point information corresponding to the eye image; determining the solution of the first user's secondary solution parameters according to the mapping vectors respectively solved for a plurality of second users; and determining the first user's gaze point mapping function according to the solution of the first user's priority solution parameters and the solution of the first user's secondary solution parameters.
- Determining the priority solution parameters and secondary solution parameters corresponding to the mapping vector includes: determining the first several parameters in the mapping vector as the priority solution parameters, and determining all the parameters in the mapping vector other than the priority solution parameters as the secondary solution parameters; or performing a coordinate transformation on the mapping vector, determining the first several parameters in the coordinate-transformed mapping vector as the priority solution parameters, and determining all the parameters in the coordinate-transformed mapping vector other than the priority solution parameters as the secondary solution parameters.
- Performing the coordinate transformation on the mapping vector includes: performing principal component analysis on the mapping vectors respectively solved for the plurality of second users; and performing the coordinate transformation on the mapping vector according to the result of the principal component analysis to obtain the coordinate-transformed mapping vector.
- Determining the solution of the first user's secondary solution parameters according to the mapping vectors respectively solved for the plurality of second users includes: determining the solution of each second user's secondary solution parameters according to the mapping vector respectively solved for that second user; and determining the expected value or mean of the solutions of the secondary solution parameters corresponding to all the second users as the solution of the first user's secondary solution parameters.
- Determining the first user's gaze point mapping function according to the solution of the first user's priority solution parameters and the solution of the first user's secondary solution parameters includes: determining the solution of the first user's mapping vector according to the solution of the first user's priority solution parameters and the solution of the first user's secondary solution parameters; and determining the first user's gaze point mapping function according to the solution of the first user's mapping vector.
- An embodiment of the present invention provides a gaze point determining method, the method comprising: acquiring an eye image of a user; and determining the user's gaze point information according to the image features of the eye image and the user's gaze point mapping function, wherein the user's gaze point mapping function is determined using the method described in the first aspect above.
- An embodiment of the present invention provides a gaze point mapping function determining apparatus, the apparatus including: a parameter determining module configured to combine all the parameters to be solved in the gaze point mapping function to obtain a mapping vector of the gaze point mapping function, and to determine the priority solution parameters and secondary solution parameters corresponding to the mapping vector; a first parameter solving module configured to acquire an eye image of a first user, and to solve the first user's priority solution parameters according to the eye image and the gaze point information corresponding to the eye image; a second parameter solving module configured to determine the solution of the first user's secondary solution parameters according to the mapping vectors respectively solved for a plurality of second users; and a function determining module configured to determine the first user's gaze point mapping function according to the solution of the first user's priority solution parameters and the solution of the first user's secondary solution parameters.
- The parameter determining module includes: a first determining submodule configured to determine the first several parameters in the mapping vector as the priority solution parameters, and to determine all the parameters in the mapping vector other than the priority solution parameters as the secondary solution parameters; or a second determining sub-module configured to perform a coordinate transformation on the mapping vector, determine the first several parameters in the coordinate-transformed mapping vector as the priority solution parameters, and determine all the parameters in the coordinate-transformed mapping vector other than the priority solution parameters as the secondary solution parameters.
- The second determining sub-module is specifically configured to perform principal component analysis on the mapping vectors respectively solved for the plurality of second users, and to perform the coordinate transformation on the mapping vector according to the result of the principal component analysis to obtain the coordinate-transformed mapping vector.
- An embodiment of the present invention provides a gaze point determining apparatus, the apparatus comprising: an image acquiring module configured to acquire an eye image of a user; and an information determining module configured to determine the user's gaze point information according to the image features of the eye image and the user's gaze point mapping function, wherein the user's gaze point mapping function is determined by the apparatus described in the third aspect above.
- When determining the first user's gaze point mapping function, the gaze point mapping function determining method and apparatus above divide the solving of the gaze point mapping function into a priority-parameter solving process and a secondary-parameter solving process: the priority solution parameters are solved from the first user's eye image and the corresponding gaze point information, and the solution of the secondary solution parameters is determined from the mapping vectors respectively solved for a plurality of second users. Since it is not necessary to obtain the solutions of all the parameters in the gaze point mapping function from the user's eye images, and the number of the user's eye images is not limited, the user does not need to gaze at multiple calibration points; the user's workload is small and the experience is good. This solves the problem in the related art that determining the gaze point mapping function with multiple calibration points imposes a large workload on the user and is detrimental to the user experience.
- FIG. 1 is a schematic flow chart of a method for determining a gaze point mapping function according to an embodiment of the present invention
- FIG. 2 is a schematic flowchart of a method for determining a fixation point according to an embodiment of the present invention
- FIG. 3 is a schematic diagram of a composition of a gaze point mapping function determining apparatus according to an embodiment of the present invention
- FIG. 4 is a schematic structural diagram of a fixation point determining apparatus according to an embodiment of the present invention.
- In view of the above, the present invention provides a gaze point mapping function determining method and apparatus and a gaze point determining method and apparatus, which are specifically described below by way of embodiments.
- FIG. 1 is a schematic flowchart of a method for determining a gaze point mapping function according to an embodiment of the present invention. As shown in FIG. 1, the method includes the following steps:
- Step S102: combine all the parameters to be solved in the gaze point mapping function to obtain a mapping vector of the gaze point mapping function, and determine the priority solution parameters and secondary solution parameters corresponding to the mapping vector.
- The role of the gaze point mapping function is to establish a mapping between the user's eye image and the user's gaze point information. Each user corresponds to his or her own gaze point mapping function; once a user's gaze point mapping function has been determined, the user's gaze point information can be determined from that function and the user's eye image.
- For different users, the values of these parameters differ, which is why each user's gaze point mapping function must be determined individually. The process of determining a certain user's gaze point mapping function is the process of solving for the value of each parameter for that user; therefore, in this embodiment, the parameters in the gaze point mapping function are referred to as parameters to be solved.
- each parameter in the mapping vector can also be referred to as an individual-specific parameter of the user.
- The gaze point mapping function may take a polynomial form, a Gaussian form, a 3D model form, or the like.
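As a concrete illustration of the polynomial form, the following sketch maps an eye-image feature (for example, a pupil-minus-spot offset) to screen coordinates through a second-order polynomial. The specific basis, the choice of feature, and all names here are assumptions made for illustration, not details taken from the patent:

```python
# Illustrative sketch: a second-order polynomial gaze point mapping function.
# The feature (px, py) could be pupil-minus-spot coordinates; the polynomial
# form and every name below are assumptions for illustration only.

def basis(px, py):
    """Second-order polynomial basis evaluated at one eye-image feature."""
    return [1.0, px, py, px * py, px * px, py * py]

def gaze_point(params, px, py):
    """Map an eye-image feature to screen coordinates (sx, sy).

    `params` is the mapping vector: 6 coefficients for sx followed by
    6 coefficients for sy, i.e. N = 12 parameters to be solved.
    """
    b = basis(px, py)
    sx = sum(a * t for a, t in zip(params[:6], b))
    sy = sum(a * t for a, t in zip(params[6:], b))
    return sx, sy
```

With this form the mapping vector has N = 12 parameters to be solved, six per screen axis.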
- The mapping vector X is assumed to be an N-dimensional vector; that is, there are N parameters to be solved. After the mapping vector is obtained, the priority solution parameters and secondary solution parameters corresponding to it are determined. The specific process is as follows.
- Mode 1: the first several parameters in the mapping vector are determined as the priority solution parameters, and all the parameters in the mapping vector other than the priority solution parameters are determined as the secondary solution parameters. That is, since the mapping vector is an N-dimensional vector, the first several of its N parameters become the priority solution parameters and the remainder become the secondary solution parameters.
- The number of priority solution parameters is denoted L, where L ≤ 2M is required; M indicates the number of calibration points on the display screen at which the user performs calibration, usually 9 or 16. The value of L is determined according to the number of equation conditions corresponding to the gaze point mapping function.
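Mode 1 can be sketched as a simple index split. The choice L = min(N, 2M) below is an assumption made for illustration; the patent only requires L ≤ 2M (each calibration point contributes two equation conditions, one per screen axis):

```python
# Mode 1 sketch: split an N-dimensional mapping vector's indices into the
# first L priority solution parameters and the remaining secondary ones.
# Choosing l = min(n, 2 * m) is an illustrative assumption; the constraint
# from the text is only L <= 2M.

def split_parameters(n, m):
    """Return (priority_idx, secondary_idx) for an n-dim mapping vector,
    given m calibration points."""
    l = min(n, 2 * m)  # at most 2M parameters are solvable from M points
    priority_idx = list(range(l))
    secondary_idx = list(range(l, n))
    return priority_idx, secondary_idx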
- The priority solution parameters and secondary solution parameters corresponding to the mapping vector can also be determined in the following manner (Mode 2): a coordinate transformation is performed on the mapping vector, the first several parameters in the coordinate-transformed mapping vector are determined as the priority solution parameters, and all the parameters in the coordinate-transformed mapping vector other than the priority solution parameters are determined as the secondary solution parameters.
- Both the coordinate-transformed mapping vector Y and the original mapping vector X are N-dimensional vectors; that is, Y contains the same number of parameters as X in Mode 1. The number of priority solution parameters determined in this mode is the same L as in Mode 1, and the number of secondary solution parameters is likewise the same as in Mode 1.
- A second user refers to a user whose gaze point mapping function has been determined in advance; each second user corresponds to a mapping vector that has already been solved.
- Principal component analysis is performed on the mapping vectors respectively solved for the plurality of second users, yielding the principal component analysis result. Assume this result is the matrix B; since the mapping vector X is an N-dimensional vector, B is an N × N matrix.
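A minimal sketch of this step, assuming numpy and standard PCA (mean-centering followed by an eigendecomposition of the covariance matrix of the second users' solved mapping vectors); the matrix B plays the role of the N × N principal component analysis result described above:

```python
import numpy as np

# Sketch of the Mode 2 coordinate transformation via principal component
# analysis. Rows of `solved_vectors` are mapping vectors already solved for
# the second users; B is the N x N matrix of principal directions, and
# y = B @ (x - mean) expresses a mapping vector in the PCA coordinates,
# ordered by decreasing variance across users.

def pca_transform(solved_vectors):
    x = np.asarray(solved_vectors, dtype=float)  # shape (num_users, N)
    mean = x.mean(axis=0)
    cov = np.cov(x - mean, rowvar=False)         # N x N covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]            # descending variance
    b = eigvecs[:, order].T                      # rows = principal directions
    return b, mean

def to_pca_coords(b, mean, x):
    """Express a mapping vector x in the transformed coordinates."""
    return b @ (np.asarray(x, dtype=float) - mean)
```

A mapping vector x is then expressed in the transformed coordinates as y = B(x − mean), with the first components capturing the largest variation across the second users.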
- step S104 is performed to start determining the gaze point mapping function of the first user.
- Step S104: acquire an eye image of the first user, and solve the first user's priority solution parameters according to the eye image and the gaze point information corresponding to the eye image.
- The number of eye images of the first user is not limited; one or more eye images may be used.
- image features of the eye image of the first user such as pupil coordinates, spot coordinates, and the like, are extracted, and the spot refers to an image formed by the light source in the eyeball of the first user.
- The eye image is captured while the first user gazes at a predetermined point on the screen, so the gaze point information corresponding to the first user's eye image is also known. In this step, therefore, the first user's priority solution parameters are solved according to the image features of the first user's eye image and the corresponding gaze point information. It can be understood that the solutions of the priority solution parameters differ for different users; the priority solution parameters are individual-specific parameters of each user.
- Since the priority solution parameters may be determined either in the mapping vector or in the coordinate-transformed mapping vector, the specific process of solving the first user's priority solution parameters can be implemented in various ways and is not specifically limited here.
- As defined above, the number of priority solution parameters is L, with L ≤ 2M, and the specific value of L can be determined by the number of equation conditions of the gaze point mapping function. Therefore, in this step, when the number of acquired eye images is too small to solve all the priority solution parameters, dimension reduction of the parameter space can be applied to the gaze point mapping function, so that solutions of all the priority solution parameters can still be obtained.
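One plausible way to carry out this solving step, assuming the gaze point mapping function is linear in its parameters (as a polynomial form is): hold the secondary parameters at known values and solve the priority parameters by least squares over the calibration samples. The linear model and all names here are illustrative assumptions; the patent does not prescribe a particular solver:

```python
import numpy as np

# Sketch of step S104: solve the first user's priority solution parameters
# by least squares from calibration samples, with the secondary parameters
# temporarily held at known values (e.g. a population average). The model
# targets = A_p @ theta_p + A_s @ theta_s is an assumption for illustration.

def solve_priority(a_priority, a_secondary, theta_secondary, targets):
    """a_priority: (num_eq, L) design columns for the priority parameters;
    a_secondary: (num_eq, N-L) design columns for the secondary parameters;
    theta_secondary: (N-L,) known secondary parameter values;
    targets: (num_eq,) gaze point coordinates from calibration samples."""
    a_p = np.asarray(a_priority, dtype=float)
    a_s = np.asarray(a_secondary, dtype=float)
    residual = np.asarray(targets, dtype=float) - a_s @ np.asarray(theta_secondary, dtype=float)
    theta_p, *_ = np.linalg.lstsq(a_p, residual, rcond=None)
    return theta_p
```

Because `lstsq` computes a minimum-norm least-squares solution, it also behaves gracefully when the system is underdetermined, which loosely corresponds to the dimension-reduction case mentioned above.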
- Step S106: determine the solution of the first user's secondary solution parameters according to the mapping vectors respectively solved for the plurality of second users.
- As stated above, a second user is a user whose gaze point mapping function has been determined in advance, and each second user corresponds to a mapping vector that has already been solved. The solution of the first user's secondary solution parameters is determined from these solved mapping vectors. The specific process can be: since each second user's mapping vector is solved, the solution of each second user's secondary solution parameters is known; in this step, the expected value or mean of the solutions of the secondary solution parameters over all second users is determined as the solution of the first user's secondary solution parameters.
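The mean variant of this step can be sketched in a few lines (an expected value under an explicit weighting would be a straightforward generalization):

```python
# Sketch of step S106: take the component-wise mean of the secondary
# solution parameters over all second users as the first user's secondary
# solution parameters.

def secondary_from_second_users(secondary_solutions):
    """secondary_solutions: list of per-user parameter lists, each of
    length N - L (the secondary part of a solved mapping vector)."""
    num_users = len(secondary_solutions)
    return [sum(vals) / num_users for vals in zip(*secondary_solutions)]
```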
- After the solution of the first user's priority solution parameters and the solution of the secondary solution parameters have been determined, step S108 is performed.
- Step S108: determine the first user's gaze point mapping function according to the solution of the first user's priority solution parameters and the solution of the first user's secondary solution parameters.
- The process of determining the first user's gaze point mapping function is the process of solving the first user's mapping vector, and the mapping vector consists of the priority solution parameters and the secondary solution parameters; once the solutions of both sets of parameters are known, the first user's gaze point mapping function can be determined. The specific process is: when the priority solution parameters and secondary solution parameters were determined directly in the mapping vector, the two together form the mapping vector, so the solution of the first user's priority solution parameters and the solution of the first user's secondary solution parameters are combined to obtain the solution of the first user's mapping vector. The solution of the first user's mapping vector is then substituted into the gaze point mapping function, yielding the first user's gaze point mapping function.
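A sketch of this combining step under the Mode 1 ordering (priority parameters first in the mapping vector); the linear gaze function at the end is purely illustrative, standing in for whatever functional form was chosen:

```python
# Sketch of step S108: concatenate the solved priority and secondary
# parameters back into the full mapping vector and substitute it into the
# gaze point mapping function.

def assemble_mapping_vector(priority_solution, secondary_solution):
    # Mode 1 ordering: priority parameters come first in the mapping vector.
    return list(priority_solution) + list(secondary_solution)

def make_gaze_function(mapping_vector):
    """Return the user's gaze point mapping function with the solved
    vector substituted in; the linear form here is illustrative only."""
    def gaze(features):
        return sum(a * f for a, f in zip(mapping_vector, features))
    return gaze
```

Under Mode 2, the coordinate-transformed vector would first be mapped back to the original parameter space (the inverse of the PCA transform) before substitution.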
- In summary, the solving of the gaze point mapping function is divided into a priority-parameter solving process and a secondary-parameter solving process: the priority solution parameters are solved from the first user's eye image and the corresponding gaze point information, and the solutions of the secondary solution parameters are determined from the mapping vectors respectively solved for the plurality of second users. Since the method of this embodiment does not require the solutions of all the parameters in the gaze point mapping function to be obtained from the user's eye images, and does not limit the number of the user's eye images, the user does not need to gaze at multiple calibration points; the user's workload is small and the experience is good, which solves the problem in the related art that determining the gaze point mapping function with multiple calibration points imposes a large workload on the user and is detrimental to the user experience.
- FIG. 2 is a schematic flowchart of a gaze point determining method according to an embodiment of the present invention. As shown in FIG. 2, the method includes the following steps:
- Step S202: acquire an eye image of the user.
- Step S204: determine the user's gaze point information according to the image features of the eye image and the user's gaze point mapping function, wherein the user's gaze point mapping function is determined using the gaze point mapping function determining method described above.
- the gaze point information of the user can be determined according to the image feature of the user's eye image and the gaze point mapping function of the user.
- the image feature of the user's eye image may be pupil coordinates, spot coordinates, and the like.
- The user's gaze point mapping function is determined by the gaze point mapping function determining method described above. Therefore, with the gaze point determining method of this embodiment, it is not necessary to obtain the solutions of all the parameters of the gaze point mapping function from the user's eye images, avoiding the problem in the related art that the user's workload is large, which is not conducive to the user experience.
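Putting the pieces together, steps S202 and S204 might look as follows; `extract_features` is a hypothetical stand-in for real pupil and spot detection, which the patent does not specify:

```python
# Sketch of the gaze point determining method (steps S202 and S204):
# extract image features from a captured eye image and evaluate the user's
# already determined gaze point mapping function on them.

def extract_features(eye_image):
    # Hypothetical feature extraction: here the "image" is already a pair
    # of pupil coordinates; a real system would run image processing
    # (pupil and spot localization) instead.
    px, py = eye_image
    return px, py

def determine_gaze_point(eye_image, mapping_function):
    """Return the user's gaze point for one eye image, given the user's
    gaze point mapping function."""
    px, py = extract_features(eye_image)
    return mapping_function(px, py)
```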
- FIG. 3 is a schematic diagram of a gaze point mapping function determining apparatus according to an embodiment of the present invention. As shown in FIG. 3, the apparatus includes:
- the parameter determining module 31 is configured to combine all the parameters to be solved in the gaze point mapping function to obtain a mapping vector of the gaze point mapping function, and determine a priority solution parameter and a secondary solution parameter corresponding to the mapping vector;
- The first parameter solving module 32 is configured to acquire an eye image of the first user, and to solve the first user's priority solution parameters according to the eye image and the gaze point information corresponding to the eye image;
- the second parameter solving module 33 is configured to determine a solution of the secondary solution parameter of the first user according to the mapping vector that is respectively solved by the plurality of second users;
- the function determination module 34 is configured to determine the gaze point mapping function of the first user based on the solution of the first user's priority solution parameter and the solution of the first user's secondary solution parameter.
- The parameter determining module 31 includes: a first determining submodule configured to determine the first several parameters in the mapping vector as the priority solution parameters, and to determine all the parameters in the mapping vector other than the priority solution parameters as the secondary solution parameters; or a second determining sub-module configured to perform a coordinate transformation on the mapping vector, determine the first several parameters in the coordinate-transformed mapping vector as the priority solution parameters, and determine all the parameters in the coordinate-transformed mapping vector other than the priority solution parameters as the secondary solution parameters.
- the second determining sub-module is configured to perform principal component analysis on the mapping vectors respectively solved by the plurality of second users, and perform coordinate transformation on the mapping vector according to the result of the principal component analysis to obtain a mapping vector after the coordinate transformation.
- The second parameter solving module 33 includes: a first solving sub-module configured to determine the solution of the secondary solution parameters corresponding to each second user according to the mapping vector respectively solved for that second user; and a second solving sub-module configured to determine the expected value or mean of the solutions of the secondary solution parameters corresponding to all the second users as the solution of the first user's secondary solution parameters.
- The function determining module 34 includes: a vector determining submodule configured to determine the solution of the first user's mapping vector according to the solution of the first user's priority solution parameters and the solution of the first user's secondary solution parameters; and a function determining submodule configured to determine the first user's gaze point mapping function according to the solution of the first user's mapping vector.
- When determining the first user's gaze point mapping function, the gaze point mapping function determining apparatus in the embodiment of the present invention divides the solving of the gaze point mapping function into a priority-parameter solving process and a secondary-parameter solving process: the priority solution parameters are solved from the first user's eye image and the corresponding gaze point information, and the solutions of the secondary solution parameters are determined from the mapping vectors respectively solved for the plurality of second users. Since the apparatus does not require the solutions of all the parameters in the gaze point mapping function to be obtained from the user's eye images, and does not limit the number of the user's eye images, the user does not need to gaze at multiple calibration points; the user's workload is small and the experience is good, which solves the problem in the related art that determining the gaze point mapping function with multiple calibration points imposes a large workload on the user and is detrimental to the user experience.
- FIG. 4 is a schematic diagram of a gaze point determining apparatus according to an embodiment of the present invention. As shown in FIG. 4, the apparatus includes:
- the image obtaining module 41 is configured to acquire an eye image of the user
- the information determining module 42 is configured to determine the gaze point information of the user according to the image feature of the eye image and the gaze point mapping function of the user, wherein the gaze point mapping function of the user is determined by the gaze point mapping function determining device described above.
- The user's gaze point mapping function is determined by the gaze point mapping function determining method described above. Therefore, with the gaze point determining apparatus of this embodiment, it is not necessary to obtain the solutions of all the parameters of the gaze point mapping function from the user's eye images, avoiding the problem in the related art that the user's workload is large, which is not conducive to the user experience.
- the gaze point mapping function determining apparatus and the gaze point determining apparatus provided by at least some embodiments of the present invention may be specific hardware on the device or software or firmware installed on the device.
- The implementation principles and technical effects of the apparatus provided by the embodiments of the present invention are the same as those of the foregoing method embodiments. For matters not mentioned in the apparatus embodiments, reference may be made to the corresponding content in the foregoing method embodiments.
- a person skilled in the art can clearly understand that for the convenience and brevity of the description, the specific working processes of the foregoing system, the device and the unit can refer to the corresponding processes in the foregoing method embodiments, and details are not described herein again.
- the disclosed apparatus and method may be implemented in other manners.
- the device embodiments described above are merely illustrative.
- The division into units is only a division by logical function; in actual implementation, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
- the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some communication interface, device or unit, and may be electrical, mechanical or otherwise.
- the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
- each functional unit in the embodiment provided by the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
- The functions, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, or the part contributing over the related art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the various embodiments of the present invention.
- The foregoing storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
- At least some embodiments of the present invention provide a gaze point mapping function determining method and apparatus and a gaze point determining method and apparatus, which have the following beneficial effects: since it is not necessary to obtain the solutions of all the parameters of the gaze point mapping function from the user's eye images, and the number of the user's eye images is not limited, the user does not need to gaze at multiple calibration points; the user's workload is small and the experience is good.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Image Processing (AREA)
Abstract
A gaze point mapping function determining method and apparatus, and a gaze point determining method and apparatus. The function determining method includes: combining all parameters to be solved in a gaze point mapping function to obtain a mapping vector of the gaze point mapping function, and determining priority-solution parameters and secondary-solution parameters corresponding to the mapping vector (S102); solving the priority-solution parameters of a first user according to an eye image of the first user and gaze point information corresponding to the eye image (S104); determining solutions of the secondary-solution parameters of the first user according to mapping vectors respectively solved for multiple second users (S106); and determining the gaze point mapping function of the first user according to the solutions of the priority-solution parameters of the first user and the solutions of the secondary-solution parameters of the first user (S108). The method solves the problem that determining the gaze point mapping function with multiple calibration points imposes a heavy workload on the user and degrades the user experience.
Description
The present invention relates to the technical field of human-computer interaction, and in particular to a gaze point mapping function determining method and apparatus, and a gaze point determining method and apparatus.
With the development of human-computer interaction technology, techniques for operating a display screen through eye movement (eye-control technology for short) have gradually matured. When such technology is applied and the user's eyes look at a certain position on the display screen, the icon at that position can be opened, closed, deleted, and so on.
To implement eye-control technology, the user needs to gaze at several calibration points on the display screen for gaze calibration. The principle of gaze calibration is as follows: a VR (Virtual Reality) system acquires an eye image of the user gazing at each calibration point on the display screen, and determines a gaze point mapping function according to the image features of each eye image and the position of the calibration point corresponding to each eye image. The gaze point mapping function establishes the mapping relationship between the user's eye images and the user's gaze point information. After the user completes gaze calibration, when the user uses the VR system, the VR system can calculate the user's gaze point information according to the image features of the user's eye images and the determined gaze point mapping function, thereby realizing eye control.
In the related art, the user needs to gaze at M calibration points on the display screen for calibration, and the VR system solves the N parameters to be solved in the gaze point mapping function according to the user's calibration eye images, where M is usually 9 or 16 and 2M >= N. Thus, in the related art, multiple calibration points are used to determine the gaze point mapping function, the user needs to gaze at multiple calibration points, the user's workload is heavy, and the user experience suffers.
Summary of the Invention
At least some embodiments of the present invention provide a gaze point mapping function determining method and apparatus, and a gaze point determining method and apparatus, so as to at least partially solve the problem in the related art that determining the gaze point mapping function with multiple calibration points imposes a heavy workload on the user and degrades the user experience.
One embodiment of the present invention provides a gaze point mapping function determining method. The method includes: combining all parameters to be solved in a gaze point mapping function to obtain a mapping vector of the gaze point mapping function, and determining priority-solution parameters and secondary-solution parameters corresponding to the mapping vector; acquiring an eye image of a first user, and solving the priority-solution parameters of the first user according to the eye image and gaze point information corresponding to the eye image; determining solutions of the secondary-solution parameters of the first user according to mapping vectors respectively solved for multiple second users; and determining the gaze point mapping function of the first user according to the solutions of the priority-solution parameters of the first user and the solutions of the secondary-solution parameters of the first user.
Optionally, determining the priority-solution parameters and secondary-solution parameters corresponding to the mapping vector includes: determining the several top-ranked parameters in the mapping vector as the priority-solution parameters, and determining all parameters in the mapping vector other than the priority-solution parameters as the secondary-solution parameters; or, performing a coordinate transformation on the mapping vector, determining the several top-ranked parameters in the coordinate-transformed mapping vector as the priority-solution parameters, and determining all parameters in the coordinate-transformed mapping vector other than the priority-solution parameters as the secondary-solution parameters.
Optionally, performing the coordinate transformation on the mapping vector includes: performing principal component analysis on the mapping vectors respectively solved for the multiple second users; and performing the coordinate transformation on the mapping vector according to the result of the principal component analysis to obtain the coordinate-transformed mapping vector.
Optionally, determining the solutions of the secondary-solution parameters of the first user according to the mapping vectors respectively solved for the multiple second users includes: determining, according to the mapping vector solved for each second user, the solutions of the secondary-solution parameters corresponding to that second user; and determining the expected value or mean of the solutions of the secondary-solution parameters corresponding to all the second users as the solutions of the secondary-solution parameters of the first user.
Optionally, determining the gaze point mapping function of the first user according to the solutions of the priority-solution parameters of the first user and the solutions of the secondary-solution parameters of the first user includes: determining the solution of the mapping vector of the first user according to the solutions of the priority-solution parameters of the first user and the solutions of the secondary-solution parameters of the first user; and determining the gaze point mapping function of the first user according to the solution of the mapping vector of the first user.
One embodiment of the present invention provides a gaze point determining method. The method includes: acquiring an eye image of a user; and determining gaze point information of the user according to image features of the eye image and a gaze point mapping function of the user, where the gaze point mapping function of the user is determined by the method described in the first aspect above.
One embodiment of the present invention provides a gaze point mapping function determining apparatus. The apparatus includes: a parameter determining module, configured to combine all parameters to be solved in a gaze point mapping function to obtain a mapping vector of the gaze point mapping function, and determine priority-solution parameters and secondary-solution parameters corresponding to the mapping vector; a first parameter solving module, configured to acquire an eye image of a first user and solve the priority-solution parameters of the first user according to the eye image and gaze point information corresponding to the eye image; a second parameter solving module, configured to determine solutions of the secondary-solution parameters of the first user according to mapping vectors respectively solved for multiple second users; and a function determining module, configured to determine the gaze point mapping function of the first user according to the solutions of the priority-solution parameters of the first user and the solutions of the secondary-solution parameters of the first user.
Optionally, the parameter determining module includes: a first determining submodule, configured to determine the several top-ranked parameters in the mapping vector as the priority-solution parameters and determine all parameters in the mapping vector other than the priority-solution parameters as the secondary-solution parameters; or, a second determining submodule, configured to perform a coordinate transformation on the mapping vector, determine the several top-ranked parameters in the coordinate-transformed mapping vector as the priority-solution parameters, and determine all parameters in the coordinate-transformed mapping vector other than the priority-solution parameters as the secondary-solution parameters.
Optionally, the second determining submodule is configured to: perform principal component analysis on the mapping vectors respectively solved for the multiple second users; and perform the coordinate transformation on the mapping vector according to the result of the principal component analysis to obtain the coordinate-transformed mapping vector.
One embodiment of the present invention provides a gaze point determining apparatus. The apparatus includes: an image acquisition module, configured to acquire an eye image of a user; and an information determining module, configured to determine gaze point information of the user according to image features of the eye image and a gaze point mapping function of the user, where the gaze point mapping function of the user is determined by the apparatus described in the third aspect above.
Through the gaze point mapping function determining method and apparatus and the gaze point determining method and apparatus provided in at least some embodiments of the present invention, when the gaze point mapping function of a first user is determined, the solving process of the gaze point mapping function is divided into a priority-solution parameter solving process and a secondary-solution parameter solving process: the priority-solution parameters are solved according to the first user's eye image and the corresponding gaze point information, and the solutions of the secondary-solution parameters are determined according to mapping vectors respectively solved for multiple second users. Since the solutions of all parameters in the gaze point mapping function need not be obtained from the user's eye images, and there is no limit on the number of the user's eye images, the user does not need to gaze at multiple calibration points; the user's workload is small and the experience is good, which solves the problem in the related art that determining the gaze point mapping function with multiple calibration points imposes a heavy workload on the user and degrades the user experience.
To make the above objects, features and advantages of the present invention clearer and easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
To describe the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings required for the embodiments are briefly introduced below. It should be understood that the following drawings show only some embodiments of the present invention and therefore should not be regarded as limiting the scope; those of ordinary skill in the art can obtain other related drawings from these drawings without creative effort.
FIG. 1 is a schematic flowchart of a gaze point mapping function determining method provided by an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a gaze point determining method provided by an embodiment of the present invention;
FIG. 3 is a schematic composition diagram of a gaze point mapping function determining apparatus provided by an embodiment of the present invention;
FIG. 4 is a schematic composition diagram of a gaze point determining apparatus provided by an embodiment of the present invention.
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the drawings herein can be arranged and designed in a variety of different configurations. Therefore, the following detailed description of the embodiments of the present invention provided in the drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Considering that, in the related art, multiple calibration points (possibly as many as 16) are used to determine the gaze point mapping function, so that the user must gaze at multiple calibration points, the user's workload is heavy and the user experience suffers, the present invention provides a gaze point mapping function determining method and apparatus and a gaze point determining method and apparatus, which are described in detail below through embodiments.
FIG. 1 is a schematic flowchart of the gaze point mapping function determining method provided by an embodiment of the present invention. As shown in FIG. 1, the method includes the following steps:
Step S102: combine all parameters to be solved in the gaze point mapping function to obtain the mapping vector of the gaze point mapping function, and determine the priority-solution parameters and secondary-solution parameters corresponding to the mapping vector.
The gaze point mapping function establishes the mapping relationship between a user's eye images and the user's gaze point information. Each user corresponds to one gaze point mapping function, and once the gaze point mapping function of a certain user has been determined, the gaze point information of that user can be determined according to the user's gaze point mapping function and the user's eye image.
The gaze point mapping function contains multiple parameters, and the values of these parameters differ from user to user. By determining the value of each parameter for a given user, that user's gaze point mapping function is determined. The process of determining a user's gaze point mapping function is therefore the process of solving for the value of each of that user's parameters; in this embodiment, these parameters are referred to as the parameters to be solved.
All the parameters to be solved in the gaze point mapping function are combined to obtain the mapping vector of the gaze point mapping function. Per the discussion above, the process of determining a user's gaze point mapping function is the process of solving that user's mapping vector. Since the value of each parameter in the mapping vector differs from user to user, each parameter in the mapping vector can also be called an individual-specific parameter of the user.
In this embodiment, the gaze point mapping function can take the form of a polynomial function, a Gaussian function, a 3D model function, or the like. For example, in polynomial form the gaze point mapping function is gaze = f(x, y) = a + bx + cy + dxy + ex²y + ..., and the mapping vector is X = [a, b, c, d, e, ...]. As another example, in 3D model form the gaze point mapping function is gaze = g(x, y) = G(x, y, alpha, beta, R, D, n, ...), and the mapping vector is X = [alpha, beta, R, D, n, ...], where [alpha, beta] is the visual-axis deviation angle, R is the corneal curvature radius, D is the distance from the pupil to the center of the corneal surface, and n is the refractive index of the aqueous humor.
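As an illustration, once a mapping vector is known, the polynomial form above can be evaluated directly. The sketch below truncates the expansion after the ex²y term and uses made-up coefficient values; it is not the patent's exact model:

```python
import numpy as np

# Evaluate the illustrative polynomial mapping gaze = a + b*x + c*y + d*x*y + e*x**2*y,
# truncated where the text writes "..."; all coefficient values are invented.
def gaze_point(X, x, y):
    a, b, c, d, e = X
    return a + b * x + c * y + d * x * y + e * x**2 * y

X = np.array([0.1, 0.5, -0.2, 0.03, 0.01])  # example mapping vector X = [a, b, c, d, e]
g = gaze_point(X, 2.0, 3.0)                 # (x, y): e.g. pupil-center image coordinates
```

Determining a user's function then amounts to fixing the entries of `X`; the function itself is just this evaluation.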
For ease of description, in this embodiment the mapping vector X is assumed to be an N-dimensional vector; that is, there are N parameters to be solved.
After the mapping vector is obtained, the priority-solution parameters and secondary-solution parameters corresponding to the mapping vector are determined as follows:
Manner 1: determine the several top-ranked parameters in the mapping vector as the priority-solution parameters, and determine all parameters in the mapping vector other than the priority-solution parameters as the secondary-solution parameters.
Specifically, the mapping vector is an N-dimensional vector. In this embodiment, the several top-ranked parameters of this N-dimensional vector are determined as the priority-solution parameters, and all other parameters in the mapping vector are determined as the secondary-solution parameters. The number of priority-solution parameters is set to L, with the requirement L < 2M, where M denotes the number of calibration points on the display screen at which the user is required to calibrate, usually 9 or 16. In a specific implementation, the value of L is determined according to the condition number of the equation system corresponding to the gaze point mapping function.
Considering that a coordinate transformation is usually needed in the mathematical solving process, in this embodiment the priority-solution parameters and secondary-solution parameters corresponding to the mapping vector can also be determined as follows:
Manner 2: perform a coordinate transformation on the mapping vector, determine the several top-ranked parameters in the coordinate-transformed mapping vector as the priority-solution parameters, and determine all parameters in the coordinate-transformed mapping vector other than the priority-solution parameters as the secondary-solution parameters.
In this manner, the coordinate-transformed mapping vector Y and the mapping vector X are both N-dimensional vectors; that is, the number of parameters in the coordinate-transformed mapping vector Y equals the number of parameters in the mapping vector X of Manner 1. The number of priority-solution parameters determined in this manner equals the number L of priority-solution parameters in Manner 1, and the number of secondary-solution parameters determined in this manner equals the number of secondary-solution parameters in Manner 1. Suppose the coordinate-transformed mapping vector is Y = [y1, y2, y3, y4, ..., yL, ..., yN]; the first L parameters of Y are determined as the priority-solution parameters, and the remaining parameters are determined as the secondary-solution parameters.
In Manner 2, the coordinate transformation on the mapping vector proceeds as follows:
(1) Perform principal component analysis on the mapping vectors respectively solved for multiple second users.
A second user is a user whose gaze point mapping function has been determined in advance, so each second user has a solved mapping vector. In this action (1), principal component analysis is performed on the mapping vectors respectively solved for the multiple second users to obtain a principal component analysis result. Suppose the principal component analysis result is a matrix B; since the mapping vector X is an N-dimensional vector, the matrix B is an N×N matrix.
(2) Perform the coordinate transformation on the mapping vector according to the result of the principal component analysis to obtain the coordinate-transformed mapping vector.
With the principal component analysis result B and the mapping vector X, the coordinate-transformed mapping vector is Y = BX.
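The transform Y = BX can be sketched as below. The solved vectors for the second users are synthetic random data, and eigendecomposition of the sample covariance is used as one possible PCA implementation; the patent does not prescribe a particular one:

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows: mapping vectors X already solved for 20 hypothetical second users.
solved = rng.normal(size=(20, 5))

# PCA via eigendecomposition of the sample covariance; the rows of B are the
# principal directions, so Y = B @ X expresses a mapping vector X in PCA
# coordinates ordered by explained variance.
centered = solved - solved.mean(axis=0)
cov = centered.T @ centered / (len(solved) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # largest-variance component first
B = eigvecs[:, order].T                    # N x N coordinate-transform matrix

X = solved[0]
Y = B @ X                                  # Y = BX, the transformed mapping vector
X_back = np.linalg.inv(B) @ Y              # the inverse transform recovers X
```

Because B here is orthogonal, the transform is invertible, which is what later allows the solved Y to be mapped back to X.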
In this embodiment, after the priority-solution parameters and secondary-solution parameters corresponding to the mapping vector are determined, step S104 is performed to begin determining the gaze point mapping function of the first user.
Step S104: acquire an eye image of the first user, and solve the priority-solution parameters of the first user according to the eye image and the gaze point information corresponding to the eye image.
Specifically, to determine the gaze point mapping function of the first user, an eye image of the first user needs to be acquired. In this embodiment, the number of eye images of the first user is not limited; one or more images suffice. After the eye image of the first user is acquired, image features of the eye image are extracted, such as pupil coordinates and glint coordinates, where a glint is the image of a light source formed on the first user's eyeball.
In this embodiment, the eye image is generated while the first user gazes at a gaze point on the screen. Since the gaze point is predetermined on the screen, the gaze point information of the first user's eye image is also available. Therefore, in this step, the priority-solution parameters of the first user are solved according to the image features of the first user's eye image and the gaze point information of the first user's eye image. It can be understood that the solutions of the priority-solution parameters differ across users; the priority-solution parameters are individual-specific parameters of each user.
Since the priority-solution parameters can be determined either in the mapping vector or in the coordinate-transformed mapping vector, the specific process of solving the priority-solution parameters of the first user can be implemented in multiple ways, which are not specifically limited here.
It should be noted that step S102 defines the number of priority-solution parameters as L, with L < 2M, and the specific value of L can be determined by the condition number of the equation system of the gaze point mapping function. Therefore, in this step, when the number of acquired eye images is too small to solve all the priority-solution parameters, the parameter-space dimensionality of the gaze point mapping function can be reduced so that solutions for all the priority-solution parameters are obtained.
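One way to realize this step, sketched under assumptions, is a least-squares solve in which the secondary-solution parameters are held fixed and their contribution is moved to the right-hand side, so that only the L priority-solution parameters remain unknown. The polynomial basis, the value L = 3, and all numeric values are illustrative, not taken from the patent:

```python
import numpy as np

# Illustrative polynomial feature basis for gaze = a + b*x + c*y + d*x*y + e*x**2*y.
def features(x, y):
    return np.array([1.0, x, y, x * y, x**2 * y])

# Synthetic "ground truth" mapping vector and three calibration samples for the
# first user; a real system extracts pupil/glint coordinates from eye images.
true_X = np.array([0.1, 0.5, -0.2, 0.03, 0.01])
pts = [(0.2, 0.1), (0.8, 0.4), (0.5, 0.9)]
A = np.vstack([features(x, y) for x, y in pts])
g = A @ true_X                             # gaze readings at the calibration points

L = 3                                      # number of priority parameters (L < 2M)
secondary = true_X[L:]                     # held fixed (e.g. population values)
# Move the fixed secondary contribution to the right-hand side, then solve the
# remaining L-unknown linear system by least squares.
rhs = g - A[:, L:] @ secondary
priority, *_ = np.linalg.lstsq(A[:, :L], rhs, rcond=None)
```

With fewer samples than L, `lstsq` still returns a minimum-norm solution, which corresponds to the dimensionality-reduction remark above.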
Step S106: determine the solutions of the secondary-solution parameters of the first user according to the mapping vectors respectively solved for multiple second users.
A second user is a user whose gaze point mapping function has been determined in advance, so each second user has a solved mapping vector. The specific process of determining the solutions of the secondary-solution parameters of the first user according to the mapping vectors respectively solved for the multiple second users can be:
(1) Determine, according to the mapping vector solved for each second user, the solutions of the secondary-solution parameters corresponding to that second user;
(2) Determine the expected value or mean of the solutions of the secondary-solution parameters corresponding to all the second users as the solutions of the secondary-solution parameters of the first user.
Specifically, since each second user has a solved mapping vector, the solutions of each second user's secondary-solution parameters are known. Therefore, in this step, the expected value or mean of the solutions of the secondary-solution parameters corresponding to all the second users is determined as the solutions of the secondary-solution parameters of the first user.
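Actions (1) and (2) amount to slicing the secondary entries out of each second user's solved vector and averaging them. A minimal sketch with invented values, assuming a 5-dimensional mapping vector with L = 3 priority parameters:

```python
import numpy as np

# Solved mapping vectors of three second users (invented values); with L = 3,
# the last two entries of each vector are that user's secondary parameters.
second_users = np.array([
    [0.11, 0.48, -0.19, 0.030, 0.012],
    [0.09, 0.52, -0.21, 0.028, 0.009],
    [0.10, 0.50, -0.20, 0.032, 0.012],
])
L = 3
# The first user's secondary-solution parameters are taken as the mean over the
# second users' secondary parameters (an expected value would work the same way).
secondary_for_first_user = second_users[:, L:].mean(axis=0)
```

The design choice here is that the secondary parameters are assumed to vary little across individuals, so a population statistic can stand in for a per-user calibration of them.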
After the solutions of the priority-solution parameters and the secondary-solution parameters of the first user are determined, step S108 is performed.
Step S108: determine the gaze point mapping function of the first user according to the solutions of the priority-solution parameters of the first user and the solutions of the secondary-solution parameters of the first user.
Since the process of determining the gaze point mapping function of the first user is the process of solving the first user's mapping vector, and the mapping vector corresponds to the priority-solution parameters and the secondary-solution parameters, the gaze point mapping function of the first user can be determined once the solutions of the first user's priority-solution parameters and secondary-solution parameters have been obtained.
The specific determination process is:
(1) Determine the solution of the first user's mapping vector according to the solutions of the first user's priority-solution parameters and the solutions of the first user's secondary-solution parameters.
When the priority-solution parameters and secondary-solution parameters are determined in the mapping vector, they jointly constitute the mapping vector, so combining the solutions of the first user's priority-solution parameters with the solutions of the first user's secondary-solution parameters yields the solution of the first user's mapping vector.
When the priority-solution parameters and secondary-solution parameters are determined in the coordinate-transformed mapping vector, they jointly constitute the coordinate-transformed mapping vector, so combining the solutions of the first user's priority-solution parameters with the solutions of the first user's secondary-solution parameters yields the solution of the first user's coordinate-transformed mapping vector. Applying the inverse coordinate transformation to this solution, as in X = B⁻¹Y, yields the solution of the first user's mapping vector.
(2) Determine the gaze point mapping function of the first user according to the solution of the first user's mapping vector.
After the solution of the first user's mapping vector is obtained, substituting it into the gaze point mapping function yields the gaze point mapping function of the first user.
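The combination and inverse transform of this step can be sketched as follows. A random orthogonal matrix stands in for the PCA transform B, and the parameter values are placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)
# A random orthogonal matrix as a stand-in for the PCA transform B (placeholder).
B, _ = np.linalg.qr(rng.normal(size=(5, 5)))

priority = np.array([1.0, 2.0, 3.0])       # solved from the first user's eye image
secondary = np.array([4.0, 5.0])           # taken from the second users' vectors
Y = np.concatenate([priority, secondary])  # solved vector in transformed coordinates
X = np.linalg.inv(B) @ Y                   # X = B^-1 Y: back to the original mapping vector
```

In the non-transformed case (Manner 1), the concatenation alone already gives the mapping vector and no inverse transform is applied.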
Through the gaze point mapping function determining method in this embodiment of the present invention, when the gaze point mapping function of the first user is determined, the solving process of the gaze point mapping function is divided into a priority-solution parameter solving process and a secondary-solution parameter solving process: the priority-solution parameters are solved according to the first user's eye image and the corresponding gaze point information, and the solutions of the secondary-solution parameters are determined according to the mapping vectors respectively solved for multiple second users. Since the method does not require the solutions of all parameters in the gaze point mapping function to be obtained from the user's eye images, and places no limit on the number of the user's eye images, the user does not need to gaze at multiple calibration points; the user's workload is small and the experience is good, which solves the problem in the related art that determining the gaze point mapping function with multiple calibration points imposes a heavy workload on the user and degrades the user experience.
On the basis of the above gaze point mapping function determining method, an embodiment of the present invention further provides a gaze point determining method. FIG. 2 is a schematic flowchart of the gaze point determining method provided by an embodiment of the present invention. As shown in FIG. 2, the method includes the following steps:
Step S202: acquire an eye image of a user;
Step S204: determine the gaze point information of the user according to the image features of the eye image and the gaze point mapping function of the user, where the gaze point mapping function of the user is determined by the gaze point mapping function determining method described above.
After the gaze point mapping function of the user has been determined by the above gaze point mapping function determining method, the gaze point information of the user can be determined according to the image features of the user's eye image and the user's gaze point mapping function. The image features of the user's eye image may be pupil coordinates, glint coordinates, and the like.
Since, in this embodiment, the gaze point mapping function of the user is determined by the above gaze point mapping function determining method, the gaze point determining method in this embodiment does not require the solutions of all parameters in the gaze point mapping function to be obtained from the user's eye images and places no limit on the number of the user's eye images; the user therefore does not need to gaze at multiple calibration points, the user's workload is small, and the experience is good, which solves the problem in the related art that determining the gaze point mapping function with multiple calibration points imposes a heavy workload on the user and degrades the user experience.
Corresponding to the above gaze point mapping function determining method, an embodiment of the present invention further provides a gaze point mapping function determining apparatus. FIG. 3 is a schematic composition diagram of the gaze point mapping function determining apparatus provided by an embodiment of the present invention. As shown in FIG. 3, the apparatus includes:
a parameter determining module 31, configured to combine all parameters to be solved in the gaze point mapping function to obtain the mapping vector of the gaze point mapping function, and determine the priority-solution parameters and secondary-solution parameters corresponding to the mapping vector;
a first parameter solving module 32, configured to acquire an eye image of a first user and solve the priority-solution parameters of the first user according to the eye image and the gaze point information corresponding to the eye image;
a second parameter solving module 33, configured to determine the solutions of the secondary-solution parameters of the first user according to the mapping vectors respectively solved for multiple second users; and
a function determining module 34, configured to determine the gaze point mapping function of the first user according to the solutions of the priority-solution parameters of the first user and the solutions of the secondary-solution parameters of the first user.
The parameter determining module 31 includes: a first determining submodule, configured to determine the several top-ranked parameters in the mapping vector as the priority-solution parameters and determine all parameters in the mapping vector other than the priority-solution parameters as the secondary-solution parameters; or, a second determining submodule, configured to perform a coordinate transformation on the mapping vector, determine the several top-ranked parameters in the coordinate-transformed mapping vector as the priority-solution parameters, and determine all parameters in the coordinate-transformed mapping vector other than the priority-solution parameters as the secondary-solution parameters.
The second determining submodule is configured to: perform principal component analysis on the mapping vectors respectively solved for the multiple second users; and perform the coordinate transformation on the mapping vector according to the result of the principal component analysis to obtain the coordinate-transformed mapping vector.
The second parameter solving module 33 includes: a first solving submodule, configured to determine, according to the mapping vector solved for each second user, the solutions of the secondary-solution parameters corresponding to that second user; and a second solving submodule, configured to determine the expected value or mean of the solutions of the secondary-solution parameters corresponding to all the second users as the solutions of the secondary-solution parameters of the first user.
The function determining module 34 includes: a vector determining submodule, configured to determine the solution of the first user's mapping vector according to the solutions of the first user's priority-solution parameters and the solutions of the first user's secondary-solution parameters; and a function determining submodule, configured to determine the gaze point mapping function of the first user according to the solution of the first user's mapping vector.
Through the gaze point mapping function determining apparatus in this embodiment of the present invention, when the gaze point mapping function of the first user is determined, the solving process of the gaze point mapping function is divided into a priority-solution parameter solving process and a secondary-solution parameter solving process: the priority-solution parameters are solved according to the first user's eye image and the corresponding gaze point information, and the solutions of the secondary-solution parameters are determined according to the mapping vectors respectively solved for multiple second users. Since the apparatus does not require the solutions of all parameters in the gaze point mapping function to be obtained from the user's eye images and places no limit on the number of the user's eye images, the user does not need to gaze at multiple calibration points; the user's workload is small and the experience is good, which solves the problem in the related art that determining the gaze point mapping function with multiple calibration points imposes a heavy workload on the user and degrades the user experience.
Corresponding to the above gaze point determining method, an embodiment of the present invention further provides a gaze point determining apparatus. FIG. 4 is a schematic composition diagram of the gaze point determining apparatus provided by an embodiment of the present invention. As shown in FIG. 4, the apparatus includes:
an image acquisition module 41, configured to acquire an eye image of a user; and
an information determining module 42, configured to determine the gaze point information of the user according to the image features of the eye image and the gaze point mapping function of the user, where the gaze point mapping function of the user is determined by the gaze point mapping function determining apparatus described above.
Since, in this embodiment, the gaze point mapping function of the user is determined by the above gaze point mapping function determining method, the gaze point determining apparatus in this embodiment does not require the solutions of all parameters in the gaze point mapping function to be obtained from the user's eye images and places no limit on the number of the user's eye images; the user therefore does not need to gaze at multiple calibration points, the user's workload is small, and the experience is good, which solves the problem in the related art that determining the gaze point mapping function with multiple calibration points imposes a heavy workload on the user and degrades the user experience.
The gaze point mapping function determining apparatus and gaze point determining apparatus provided by at least some embodiments of the present invention may be specific hardware on a device, or software or firmware installed on a device. The implementation principle and technical effects of the apparatuses provided by the embodiments of the present invention are the same as those of the foregoing method embodiments; for brevity, where the apparatus embodiments are silent, reference may be made to the corresponding content of the foregoing method embodiments. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the systems, apparatuses and units described above may refer to the corresponding processes in the above method embodiments and are not repeated here.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only a logical functional division, and there may be other division manners in actual implementation; as another example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some communication interfaces, apparatuses or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments provided by the present invention may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention in essence, or the part contributing to the related art, or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further defined and explained in subsequent drawings. In addition, the terms "first", "second", "third" and so on are used only to distinguish descriptions and shall not be understood as indicating or implying relative importance.
Finally, it should be noted that the embodiments described above are only specific implementations of the present invention, used to illustrate rather than limit its technical solutions, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that any person skilled in the art can still, within the technical scope disclosed by the present invention, modify the technical solutions recorded in the foregoing embodiments, readily conceive of changes, or make equivalent replacements of some of the technical features; such modifications, changes or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall all be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
As described above, at least some embodiments of the present invention provide a gaze point mapping function determining method and apparatus, and a gaze point determining method and apparatus, which have the following beneficial effects: since the solutions of all parameters in the gaze point mapping function need not be obtained from the user's eye images, and there is no limit on the number of the user's eye images, the user does not need to gaze at multiple calibration points; the user's workload is small and the experience is good.
Claims (10)
- A gaze point mapping function determining method, the method comprising: combining all parameters to be solved in a gaze point mapping function to obtain a mapping vector of the gaze point mapping function, and determining priority-solution parameters and secondary-solution parameters corresponding to the mapping vector; acquiring an eye image of a first user, and solving the priority-solution parameters of the first user according to the eye image and gaze point information corresponding to the eye image; determining solutions of the secondary-solution parameters of the first user according to mapping vectors respectively solved for multiple second users; and determining the gaze point mapping function of the first user according to the solutions of the priority-solution parameters of the first user and the solutions of the secondary-solution parameters of the first user.
- The method according to claim 1, wherein determining the priority-solution parameters and secondary-solution parameters corresponding to the mapping vector comprises: determining the several top-ranked parameters in the mapping vector as the priority-solution parameters, and determining all parameters in the mapping vector other than the priority-solution parameters as the secondary-solution parameters; or, performing a coordinate transformation on the mapping vector, determining the several top-ranked parameters in the coordinate-transformed mapping vector as the priority-solution parameters, and determining all parameters in the coordinate-transformed mapping vector other than the priority-solution parameters as the secondary-solution parameters.
- The method according to claim 2, wherein performing the coordinate transformation on the mapping vector comprises: performing principal component analysis on the mapping vectors respectively solved for the multiple second users; and performing the coordinate transformation on the mapping vector according to a result of the principal component analysis to obtain the coordinate-transformed mapping vector.
- The method according to claim 1, wherein determining the solutions of the secondary-solution parameters of the first user according to the mapping vectors respectively solved for the multiple second users comprises: determining, according to the mapping vector solved for each second user, the solutions of the secondary-solution parameters corresponding to that second user; and determining an expected value or mean of the solutions of the secondary-solution parameters corresponding to all the second users as the solutions of the secondary-solution parameters of the first user.
- The method according to any one of claims 1 to 4, wherein determining the gaze point mapping function of the first user according to the solutions of the priority-solution parameters of the first user and the solutions of the secondary-solution parameters of the first user comprises: determining a solution of the mapping vector of the first user according to the solutions of the priority-solution parameters of the first user and the solutions of the secondary-solution parameters of the first user; and determining the gaze point mapping function of the first user according to the solution of the mapping vector of the first user.
- A gaze point determining method, the method comprising: acquiring an eye image of a user; and determining gaze point information of the user according to image features of the eye image and a gaze point mapping function of the user, wherein the gaze point mapping function of the user is determined by the method according to any one of claims 1 to 5.
- A gaze point mapping function determining apparatus, the apparatus comprising: a parameter determining module, configured to combine all parameters to be solved in a gaze point mapping function to obtain a mapping vector of the gaze point mapping function, and determine priority-solution parameters and secondary-solution parameters corresponding to the mapping vector; a first parameter solving module, configured to acquire an eye image of a first user and solve the priority-solution parameters of the first user according to the eye image and gaze point information corresponding to the eye image; a second parameter solving module, configured to determine solutions of the secondary-solution parameters of the first user according to mapping vectors respectively solved for multiple second users; and a function determining module, configured to determine the gaze point mapping function of the first user according to the solutions of the priority-solution parameters of the first user and the solutions of the secondary-solution parameters of the first user.
- The apparatus according to claim 7, wherein the parameter determining module comprises: a first determining submodule, configured to determine the several top-ranked parameters in the mapping vector as the priority-solution parameters and determine all parameters in the mapping vector other than the priority-solution parameters as the secondary-solution parameters; or, a second determining submodule, configured to perform a coordinate transformation on the mapping vector, determine the several top-ranked parameters in the coordinate-transformed mapping vector as the priority-solution parameters, and determine all parameters in the coordinate-transformed mapping vector other than the priority-solution parameters as the secondary-solution parameters.
- The apparatus according to claim 8, wherein the second determining submodule is configured to: perform principal component analysis on the mapping vectors respectively solved for the multiple second users; and perform the coordinate transformation on the mapping vector according to a result of the principal component analysis to obtain the coordinate-transformed mapping vector.
- A gaze point determining apparatus, the apparatus comprising: an image acquisition module, configured to acquire an eye image of a user; and an information determining module, configured to determine gaze point information of the user according to image features of the eye image and a gaze point mapping function of the user, wherein the gaze point mapping function of the user is determined by the apparatus according to any one of claims 7 to 9.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/340,683 US10996745B2 (en) | 2016-12-28 | 2017-12-28 | Method and device for determining gaze point mapping function, and method and device for determining gaze point |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201611231596.1 | 2016-12-28 | ||
| CN201611231596.1A CN106598258B (zh) | 2016-12-28 | 2016-12-28 | Method and device for determining gaze point mapping function, and method and device for determining gaze point |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018121635A1 true WO2018121635A1 (zh) | 2018-07-05 |
Family
ID=58604460
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2017/119182 Ceased WO2018121635A1 (zh) | 2016-12-28 | 2017-12-28 | Method and device for determining gaze point mapping function, and method and device for determining gaze point |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US10996745B2 (zh) |
| CN (1) | CN106598258B (zh) |
| WO (1) | WO2018121635A1 (zh) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106598258B (zh) | 2016-12-28 | 2019-04-16 | Beijing 7invensun Technology Co., Ltd. | Method and device for determining gaze point mapping function, and method and device for determining gaze point |
| CN107392156B (zh) * | 2017-07-25 | 2020-08-25 | Beijing 7invensun Technology Co., Ltd. | Gaze estimation method and apparatus |
| CN109032351B (zh) * | 2018-07-16 | 2021-09-24 | Beijing 7invensun Technology Co., Ltd. | Gaze point function determining method, gaze point determining method, apparatus and terminal device |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102520796A (zh) * | 2011-12-08 | 2012-06-27 | South China University of Technology | Gaze tracking method based on a stepwise-regression-analysis mapping model |
| WO2014131690A1 (en) * | 2013-02-27 | 2014-09-04 | Thomson Licensing | Method and device for calibration-free gaze estimation |
| CN105224065A (zh) * | 2014-05-29 | 2016-01-06 | Beijing Samsung Telecommunication Technology Research Co., Ltd. | Gaze estimation device and method |
| CN106066696A (zh) * | 2016-06-08 | 2016-11-02 | South China University of Technology | Gaze tracking method under natural light based on projection mapping correction and gaze point compensation |
| CN106598258A (zh) * | 2016-12-28 | 2017-04-26 | Beijing 7invensun Technology Co., Ltd. | Method and device for determining gaze point mapping function, and method and device for determining gaze point |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE19953835C1 (de) * | 1999-10-30 | 2001-05-23 | Hertz Inst Heinrich | Computer-aided method for contactless, video-based gaze direction determination of a user's eye for eye-guided human-computer interaction, and device for carrying out the method |
| US7306337B2 (en) * | 2003-03-06 | 2007-12-11 | Rensselaer Polytechnic Institute | Calibration-free gaze tracking under natural head movement |
| WO2014125380A2 (en) * | 2013-02-14 | 2014-08-21 | The Eye Tribe Aps | Systems and methods of eye tracking calibration |
| US9179833B2 (en) * | 2013-02-28 | 2015-11-10 | Carl Zeiss Meditec, Inc. | Systems and methods for improved ease and accuracy of gaze tracking |
| GB201305726D0 (en) * | 2013-03-28 | 2013-05-15 | Eye Tracking Analysts Ltd | A method for calibration free eye tracking |
| CN104899565B (zh) * | 2015-06-01 | 2018-05-18 | Institute of Radiation Medicine, Academy of Military Medical Sciences of the Chinese PLA | Eye movement trajectory recognition method and device based on texture features |
| US10976813B2 (en) * | 2016-06-13 | 2021-04-13 | Apple Inc. | Interactive motion-based eye tracking calibration |
- 2016-12-28: CN application CN201611231596.1A filed, granted as CN106598258B (active)
- 2017-12-28: US application US16/340,683 filed, granted as US10996745B2 (active)
- 2017-12-28: PCT application PCT/CN2017/119182 filed, published as WO2018121635A1 (ceased)
Also Published As
| Publication number | Publication date |
|---|---|
| US20190310704A1 (en) | 2019-10-10 |
| US10996745B2 (en) | 2021-05-04 |
| CN106598258A (zh) | 2017-04-26 |
| CN106598258B (zh) | 2019-04-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11714592B2 (en) | Gaze-based user interactions | |
| US12315091B2 (en) | Methods for manipulating objects in an environment | |
| KR102796024B1 (ko) | 안경용 가상 피팅 시스템 및 방법 | |
| CN103885589B (zh) | 眼动追踪方法及装置 | |
| WO2020015468A1 (zh) | 一种图像传输方法、装置、终端设备及存储介质 | |
| CN110378914A (zh) | 基于注视点信息的渲染方法及装置、系统、显示设备 | |
| US10719193B2 (en) | Augmenting search with three-dimensional representations | |
| WO2019214442A1 (zh) | 一种设备控制方法、装置、控制设备及存储介质 | |
| WO2018076622A1 (zh) | 图像处理方法、装置及终端 | |
| US11270367B2 (en) | Product comparison techniques using augmented reality | |
| KR20210030207A (ko) | 인스턴스 검출 및 일반적인 장면 이해를 이용한 객체 검출 | |
| CN109002169A (zh) | 一种基于眼动识别的交互方法及装置 | |
| WO2018121635A1 (zh) | 注视点映射函数确定方法及装置、注视点确定方法及装置 | |
| CN105866950A (zh) | 数据处理的方法及装置 | |
| US20240248678A1 (en) | Digital assistant placement in extended reality | |
| CN117372475A (zh) | 眼球追踪方法和电子设备 | |
| KR102804309B1 (ko) | 아바타 등록을 위한 액세서리 검출 및 결정 | |
| CN105718036A (zh) | 可穿戴式设备的信息交互方法和系统 | |
| CN120017810A (zh) | 一种指示信息显示方法、装置、电子设备及存储介质 | |
| AU2022293326B2 (en) | Virtual reality techniques for characterizing visual capabilities | |
| US10409464B2 (en) | Providing a context related view with a wearable apparatus | |
| US12027166B2 (en) | Digital assistant reference resolution | |
| RU2818028C1 (ru) | Способ и устройство для калибровки при окулографии | |
| US11181973B2 (en) | Techniques related to configuring a display device | |
| WO2017156742A1 (zh) | 一种基于虚拟现实的画面显示方法及相关设备 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17888870 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17888870 Country of ref document: EP Kind code of ref document: A1 |