US10386482B2 - Device-free tracking system that accurately tracks hand movement - Google Patents
- Publication number
- US10386482B2 (application US15/414,084)
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- G—PHYSICS; G01—MEASURING; TESTING; G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/66—Sonar tracking systems
- G01S15/62—Sense-of-movement determination
- G01S15/34—Systems for measuring distance only using transmission of continuous, frequency-modulated waves while heterodyning the received signal, or a signal derived therefrom, with a locally-generated signal related to the contemporaneously transmitted signal
- G01S15/42—Simultaneous measurement of distance and other co-ordinates
- G01S15/46—Indirect determination of position data
- G01S15/582—Velocity or trajectory determination systems using transmission of interrupted pulse-modulated waves and based upon the Doppler effect resulting from movement of targets
- G01S15/586—Velocity or trajectory determination systems using transmission of continuous unmodulated, amplitude-, frequency-, or phase-modulated waves and based upon the Doppler effect resulting from movement of targets
- G01S15/87—Combinations of sonar systems
- G01S2015/465—Indirect determination of position data by trilateration, i.e. two transducers separately determine the distance to a target, and with the known baseline length between the transducers the position data of the target is determined
Definitions
- the present invention relates generally to controlling devices, and more particularly to a device-free tracking system that accurately tracks hand movement, thereby enabling control of a device by hand movement.
- Smart TVs, smart appliances, Virtual Reality (VR), and Augmented Reality (AR) are all becoming increasingly popular.
- the key to their success is an easy-to-use user interface for controlling the device (e.g., smart TVs, smart appliances and devices implementing VR/AR).
- currently, though, such devices lack an easy-to-use user interface.
- VR and AR provide an immersive experience, and open the door to new ways of training, education, meetings, advertising, travel, health care, emergency response, and scientific experiments.
- the current user interfaces of devices implementing VR/AR are rather limited: they rely on tapping, swiping, voice recognition, or steering the camera towards the hand to make sure the hand is within the view and line-of-sight of the camera while wearing the headset.
- a method for tracking movement of an object comprises transmitting audio signals from one or more speakers to the object.
- the method further comprises receiving samples of the audio signals reflected from the object over a period of time.
- the method additionally comprises mixing the received audio signals with the transmitted audio signals.
- the method comprises performing, by a processor, a fast Fourier transform on the mixed audio signals.
- the method comprises selecting, by the processor, one or more peak frequencies in a frequency domain of the fast Fourier transformed mixed audio signals.
- the method comprises estimating a velocity of the object.
- the method further comprises estimating, by the processor, a distance from a speaker of the controlled device to a microphone of the controlled device via the object based on the selected one or more peak frequencies and velocity of the object.
- FIG. 1 illustrates a system configured in accordance with an embodiment of the present invention
- FIG. 2 illustrates a hardware configuration of a device to be controlled by the movement of a hand in accordance with an embodiment of the present invention
- FIG. 3 illustrates the chirp signal using Frequency Modulated Continuous Wave (FMCW) in accordance with an embodiment of the present invention
- FIG. 4 is a flowchart of a method for controlling a device using hand movements by allowing the controlled device to continuously track the movement of the hand using acoustic signals in accordance with an embodiment of the present invention
- FIG. 5 is a flowchart of the sub-steps of estimating the distance between the speakers of the controlled device in accordance with an embodiment of the present invention
- FIG. 6 shows a snapshot of the cross-correlation result in accordance with an embodiment of the present invention
- FIG. 7 is a flowchart of the sub-steps of estimating the initial position of the object, such as a hand, in accordance with an embodiment of the present invention.
- FIG. 8A shows the simulation result of FMCW range detection when the transmitter and receiver are both static and 0.5 m apart in accordance with an embodiment of the present invention
- FIG. 8B illustrates the simulation result when the range is 0.5 m and the Doppler shift is 20 Hz in accordance with an embodiment of the present invention
- FIGS. 9A-9B show the FMCW signals with and without the peaks in the spectral points, respectively, while the hand is moving in accordance with an embodiment of the present invention
- FIG. 10 shows a snapshot of the received FMCW signal while the hand is moving towards the microphone in accordance with an embodiment of the present invention.
- FIG. 11 shows an example of the Doppler shift estimation in accordance with an embodiment of the present invention.
- the present invention provides a device-free motion tracking system that enables a new way for users to interact with the world by simply moving their hands. They can freely play video games, interact with VR/AR devices, and control smart appliances anywhere at any time.
- a tracking system uses widely available speakers and microphones on the controlled device (e.g., computers, game consoles, VR/AR headsets and smart devices, such as smartphones and smart watches).
- the tracking system utilizes a novel approach that estimates the distance from the hand to the speaker and the velocity of the hand using a single chirp signal. Such information may be used to accurately locate the moving hand. By accurately locating the moving hand, a user will be able to interact with and control devices by simply moving their hands.
- the device-free tracking system of the present invention is based on Frequency Modulated Continuous Wave (FMCW).
- the device to be controlled includes one or more speakers and one or more microphones which are used to emit and receive audio signals and track the hand movement.
- a pair of speakers and microphones may be co-located, and serve as an anchor point.
- Each speaker transmits chirp signals in an inaudible and non-overlapping spectrum band with a guard band in between; the microphone collects the signals from the corresponding spectrum band, mixes them with the transmitted signal, and uses the peak frequencies to estimate the distance and velocity, which are in turn used to track the hand movement.
- the process begins by detecting the start of a “chirp” signal using cross-correlation. Since the chirp signal is periodic, it only needs to be detected once, and audio samples can be fetched continuously from the subsequent sampling intervals (e.g., 100 ms). The distance between the speakers and the distance from the controlled device to the hand's initial position are estimated. Then audio signals are continuously fetched. A Fast Fourier Transform (FFT) is performed on the fetched audio signals to detect the peak frequencies. It has been observed that the mixed signal in FMCW has a fundamental frequency determined by the parameters of the chirp sequence. This property is leveraged to filter out the reflection from static objects and detect the reflection caused by the moving hand.
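- the per-interval processing just described (mix, FFT, peak detection, static-reflection filtering) can be sketched as below. The frame length, sampling rate, and the differencing of consecutive mixed frames used to suppress static reflections are illustrative assumptions rather than the patent's exact implementation:

```python
import numpy as np

def process_frame(received, transmitted, prev_mixed=None, fs=48000, n_peaks=3):
    """Mix one frame with the transmitted chirp, optionally subtract the
    previous mixed frame to cancel reflections from static objects, then FFT
    and return the strongest peak frequencies (Hz) plus the raw mixed frame."""
    mixed = received * transmitted            # heterodyne: produces the IF tones
    filtered = mixed if prev_mixed is None else mixed - prev_mixed
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
    peaks = freqs[np.argsort(spectrum)[::-1][:n_peaks]]
    return peaks, mixed

# a 100 Hz offset between two tones shows up as a 100 Hz peak after mixing
t = np.arange(4800) / 48000
peaks, _ = process_frame(np.cos(2 * np.pi * 17100 * t),
                         np.cos(2 * np.pi * 17000 * t))
assert any(abs(p - 100.0) < 1e-6 for p in peaks)
```

Static reflections produce the same mixed-signal tones from one chirp period to the next, which is why differencing consecutive frames is one plausible way to realize the filtering step.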
- the Doppler shift is estimated and used to select the appropriate FMCW peak for distance estimation (the distance between the controlled device and the hand). This distance estimate, together with the velocity estimate of the hand (obtained from the Doppler shift), is then used to continuously track the hand. A more detailed explanation of this process is provided below.
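- a minimal sketch of one way such fusion could work is shown below; the dead-reckoning step and the fusion weight alpha are illustrative assumptions, not the patent's actual tracking procedure:

```python
def fuse_distance(d_prev, v, dt, d_fmcw, alpha=0.5):
    """Blend the dead-reckoned distance (previous estimate advanced by the
    Doppler-derived velocity over one frame) with the FMCW distance estimate.
    alpha is an assumed fusion weight, not a value from the patent."""
    d_pred = d_prev + v * dt
    return alpha * d_fmcw + (1 - alpha) * d_pred

# hand at 0.50 m moving away at 0.2 m/s; 0.1 s later FMCW also reads 0.52 m
d = fuse_distance(0.50, 0.2, 0.1, 0.52)  # both estimates agree at 0.52 m
```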
- FIG. 1 illustrates a system 100 configured in accordance with an embodiment of the present invention.
- system 100 includes a device to be controlled 101 (referred to herein as “controlled device”) by a hand 102 of a user.
- Controlled device 101 may be any computing device that contains two speakers 103 A- 103 B (identified as “speaker A” and “speaker B,” respectively, in FIG. 1 ) and two microphones 104 A- 104 B (identified as “microphone A” and “microphone B,” respectively, in FIG. 1 ).
- Speakers 103 A- 103 B may collectively or individually be referred to as speakers 103 or speaker 103 , respectively.
- Microphones 104 A- 104 B may collectively or individually be referred to as microphones 104 or microphone 104, respectively. While FIG. 1 illustrates controlled device 101 as including two speakers 103 and two microphones 104, controlled device 101 of the present invention is not limited to including only two speakers 103 and two microphones 104. Instead, controlled device 101 may include one or more speakers 103 and one or more microphones 104. Some examples of controlled devices 101 include, but are not limited to, computers, video game consoles, VR/AR headsets, smartphones, smart watches, wearable devices, smart TVs and smart appliances. Controlled device 101 is configured to emit an audio signal through its speakers 103, whether audible or inaudible to humans.
- controlled device 101 is configured to receive an audio signal that is reflected by the user's hand 102 by microphones 104 . This reflected audio signal is used by controlled device 101 to continuously track hand 102 in real time as discussed further below. A more detailed description of a hardware configuration of an embodiment of controlled device 101 is provided below in connection with FIG. 2 .
- FIG. 2 is a functional block diagram of an example of a controlled device 101 ( FIG. 1 ).
- controlled device 101 includes one or more processors 201 .
- Processor 201 can include one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, micro-controllers, microprocessors, electronic devices, electronic units, or a combination thereof.
- Processor 201 is configured to store data received by one or more interfaces and process and store the data on a memory 202 .
- Memory 202 can be implemented within processor 201 or external to processor 201 .
- the term memory refers to any type of long term, short term, volatile, nonvolatile, or other memory and is not to be limited to any particular type of memory or number of memories or type of media upon which memory is stored.
- memory 202 stores an application, such as a program for controlling device 101 using hand movements by hand 102 ( FIG. 1 ) by allowing the controlled device 101 to continuously track the movement of hand 102 using acoustic signals.
- processor 201 is configured to execute the program instructions of applications stored in memory 202 .
- speakers 103 A, 103 B and microphones 104 A, 104 B are connected to controlled device 101 via a user interface adapter 203 .
- Speakers 103 A, 103 B are configured to generate an audio signal (audible or inaudible to humans) at various frequencies.
- microphones 104 A, 104 B are configured to receive an audio signal that is reflected by the user's hand 102 .
- Controlled device 101 of FIG. 2 is not to be limited in scope to the elements depicted in FIG. 2 and may include fewer or additional elements than depicted in FIG. 2 .
- the present invention may be a system, a method, and/or a computer program product.
- the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
- the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
- the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
- a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
- a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
- Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
- the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
- a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
- Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
- the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
- These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
- the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures.
- two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- FMCW indirectly estimates the propagation delay based on the frequency shift of the chirp signal, as shown in FIG. 3 .
- FIG. 3 illustrates the chirp signal using FMCW in accordance with an embodiment of the present invention.
- curve 301 shows a transmission chirp, whose frequency linearly increases over time.
- Curve 302 shows the received chirp.
- Let f_c, B, and T denote the carrier frequency, bandwidth, and duration of the chirp, respectively.
- the frequency of the signal at time t is given by
- f(t) = f_c + B·t/T.
- the phase of the signal is calculated by integrating f(t) over time, which is:
- u(t) = 2π·(f_c·t + B·t²/(2T)).
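- as a concrete illustration, the chirp cos(u(t)) can be synthesized directly from this phase expression; the carrier (17 kHz), bandwidth (2 kHz), duration (10 ms), and sampling rate (48 kHz) below are illustrative values, not parameters taken from the patent:

```python
import numpy as np

def make_chirp(fc=17000.0, B=2000.0, T=0.01, fs=48000):
    """Synthesize one FMCW chirp cos(u(t)), with the phase
    u(t) = 2*pi*(fc*t + B*t**2/(2*T)) integrated from f(t) = fc + B*t/T."""
    t = np.arange(int(T * fs)) / fs
    phase = 2 * np.pi * (fc * t + B * t**2 / (2 * T))
    return np.cos(phase)

chirp = make_chirp()
# a 10 ms chirp at 48 kHz spans 480 samples per chirp period
assert len(chirp) == 480
```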
- Let R, V_c, and V denote the distance between the transceiver and the target, the propagation speed, and the target's velocity, respectively. Then the reflected signal is delayed by τ_d = 2·(R + V·t)/V_c. (1)
- v_rx(t) = cos(2π·f_c·(t − τ_d) + π·B·(t − τ_d)²/T), where again the magnitude change is ignored.
- the mixed signal v_m(t) is called the Intermediate Frequency (IF) signal:
- v_m(t) = cos(2π·f_c·τ_d + (2π·B·τ_d/T)·t + π·B·τ_d²/T).
- By plugging Equation (1) into the above equation, v_m(t) becomes:
- v_m(t) = cos(4π·f_c·R/V_c + 2π·(2·f_c·V/V_c + 2·R·B/(V_c·T))·t + 4π·B·(R + V·t)²/(V_c²·T)). (2)
- f_IF = f_R + f_V (3)
- f_R = 2·R·B/(V_c·T) (4)
- f_V = 2·f_c·V/V_c (5)
- the frequency shift of the IF signal includes (i) a component f_R that is proportional to the distance to the target, and (ii) the Doppler shift f_V due to the movement of the target.
- the former depends on the distance and the latter depends on the velocity.
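- inverting Equations (4) and (5) gives the distance and velocity from the measured peak frequencies. A minimal sketch follows; the chirp parameters in the example are illustrative, not the patent's:

```python
V_C = 343.0  # propagation speed of sound in air, m/s

def distance_from_peak(f_R, B, T, v_c=V_C):
    """Invert Eq. (4): f_R = 2*R*B/(V_c*T)  =>  R = f_R*V_c*T/(2*B)."""
    return f_R * v_c * T / (2 * B)

def velocity_from_doppler(f_V, f_c, v_c=V_C):
    """Invert Eq. (5): f_V = 2*f_c*V/V_c  =>  V = f_V*V_c/(2*f_c)."""
    return f_V * v_c / (2 * f_c)

# a 20 Hz Doppler shift at a 17 kHz carrier corresponds to roughly 0.2 m/s
v = velocity_from_doppler(20.0, 17000.0)
```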
- FIG. 4 is a flowchart of a method 400 for controlling a device (e.g., device 101 of FIGS. 1 and 2 ) using hand movements by allowing controlled device 101 to continuously track the movement of an object, such as a hand (e.g., hand 102 of FIG. 1 ) using acoustic signals in accordance with an embodiment of the present invention.
- controlled device 101 estimates the distance between speaker 103 and microphone 104 of controlled device 101 .
- a further description of the steps involved in estimating the distance between speakers 103 of controlled device 101 is provided below in connection with FIG. 5 .
- FIG. 5 is a flowchart of the sub-steps of estimating the distance between speakers 103 of controlled device 101 (step 401 ) in accordance with an embodiment of the present invention.
- controlled device 101 transmits Frequency Modulated Continuous Waves (FMCW) audio signals (whether audible or inaudible to humans), including a first chirp signal, from speaker(s) 103 to an object, such as hand 102 .
- microphone 104 of controlled device 101 receives the samples of audio signals reflected from the object (e.g., hand 102 ) over a period of time.
- controlled device 101 detects the transmission of a first chirp signal from speaker 103 using cross correlation between the received and transmitted audio signals.
- controlled device 101 mixes the received audio signal with the transmitted audio signals.
- the program of the present invention must know the exact start time of the chirp signal transmission. This is challenging even when the audio signal transmission and reception are implemented in the same program running on the same machine, due to a random delay between the call to the audio play function and the actual play time of the audio file. To avoid this uncertainty, the beginning of the chirp signal is detected using cross-correlation between the received audio signal and the original chirp signal. As speaker 103 and microphone 104 are co-located, the signal is considered to be received as soon as it is transmitted.
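A minimal sketch of this cross-correlation detection, assuming a 48 kHz sampling rate and the 480-sample chirp of FIG. 6; the 137-sample play delay is an arbitrary illustration of the random delay described above:

```python
import numpy as np

# Detect the chirp start by cross-correlating the received samples with the
# known transmitted chirp; the offset of the correlation peak marks the
# first sample of the chirp.
fs, fc, B, T = 48_000, 17_000.0, 2_000.0, 0.01
n = int(T * fs)                          # 480 samples per chirp
u = np.arange(n) / fs
chirp = np.cos(2 * np.pi * (fc * u + B * u**2 / (2 * T)))

# Simulated reception: silence (the random play delay), then repeated chirps.
delay = 137
received = np.concatenate([np.zeros(delay), np.tile(chirp, 4)])

corr = np.correlate(received, chirp, mode="valid")
start = int(np.argmax(corr))             # first, strongest alignment
print(start)                             # prints 137
```

In a live system the correlation peak repeats every 480 samples (as in FIG. 6); locking onto the first peak synchronizes the transmitted and received streams.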
- FIG. 6 shows one snapshot of the cross-correlation result in accordance with an embodiment of the present invention.
- FIG. 6 shows a peak every 480 samples, which corresponds to an exemplary chirp duration.
- controlled device 101 performs fast Fourier Transform (FFT) on the mixed audio signals.
- controlled device 101 further filters out the reflection from the static objects.
- controlled device 101 selects a peak frequency in the frequency domain of the FFT mixed audio signals.
- controlled device 101 estimates the distance between speaker 103 and microphone 104 of controlled device 101 using the selected peak frequency.
- the transmitted and received signals are synchronized using the above procedure.
- controlled device 101 estimates the distance between speakers 103 of controlled device 101 using the estimated distance between speaker 103 and microphone 104 of controlled device 101 .
- a speaker 103 and a microphone 104 are co-located, and the distance between speakers 103 of controlled device 101 is equal to the distance between a speaker 103 and the microphone 104 co-located with another speaker 103 .
- all speakers 103 and microphones 104 are placed along one line, and the distance between any two speakers 103 can be estimated based on the distances between the pairs of a speaker 103 and a microphone 104 (e.g., estimating the distance between speaker 1 and speaker 2 based on the distances between speaker 1 and microphone 1 , speaker 2 and microphone 2 , speaker 1 and microphone 2 , and speaker 2 and microphone 1 ).
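As a sketch of this geometry: with perfectly co-located pairs, each of the two cross measurements (speaker 1 to microphone 2, speaker 2 to microphone 1) equals the speaker separation, so averaging them is one simple estimator. The helper name and the averaging rule are illustrative assumptions, not the patent's exact procedure.

```python
# Estimate the separation of two co-located speaker/microphone pairs placed
# along one line, from the two cross-pair distance measurements.
def speaker_separation(d_s1_m2: float, d_s2_m1: float) -> float:
    """Average of the two cross measurements of the speaker separation."""
    return (d_s1_m2 + d_s2_m1) / 2.0

# Two slightly noisy measurements of a 0.2 m separation.
print(speaker_separation(0.205, 0.195))  # ~0.2
```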
- controlled device 101 estimates the initial position of the object, such as hand 102 .
- a further description of the steps involved in estimating the initial position of the object, such as hand 102 is provided below in connection with FIG. 7 .
- FIG. 7 is a flowchart of the sub-steps of estimating the initial position of the object (step 402 ) in accordance with an embodiment of the present invention.
- controlled device 101 transmits Frequency Modulated Continuous Waves (FMCW) audio signals (whether audible or inaudible to humans), including a first chirp signal, from speaker(s) 103 to the object, such as hand 102 , as in step 501 .
- step 702 microphone 104 of controlled device 101 receives the samples of audio signals reflected from the object (e.g., hand 102 ) over a period of time as in step 502 .
- controlled device 101 detects the transmission of a first chirp signal from speaker 103 using cross correlation between the received and transmitted audio signals as in step 503 .
- controlled device 101 mixes the received audio signal with the transmitted audio signals as in step 504 .
- controlled device 101 performs fast Fourier Transform (FFT) on the mixed audio signals as in step 505 .
- controlled device 101 detects the initial hand gesture using pattern matching based on the frequency domain of the mixed audio signals.
- an initial gesture may correspond to a grabbing gesture (e.g., closing and opening a hand twice).
- the initial gesture can be recognized using pattern matching.
- the received signal reflected from the object (e.g., hand 102 ) is mixed with the transmitted signal, and fast Fourier transform (FFT) is performed on the mixed trace.
- the FFT of the mixed trace is recorded.
- the above process may be repeated to obtain multiple positive training traces while the user is performing the initial gesture.
- negative training traces from the FFT of the mixed traces are collected while the user is not performing the initial gesture.
- Recently received FFTs of the mixed trace are matched against the positive and negative training traces using pattern matching algorithms to detect the initial gesture.
- controlled device 101 filters out the reflection from the static objects. That is, the spectral points resulting from reflections off the static objects are filtered out.
- controlled device 101 selects a peak frequency in the frequency domain of the FFT mixed audio signals as in step 506 .
- controlled device 101 estimates the distance between speaker(s) 103 of controlled device 101 and the object, such as hand 102 , using the selected peak frequency.
- controlled device 101 estimates the initial position of the object, such as hand 102 , using the estimated distance between speaker(s) 103 and microphone(s) 104 of controlled device 101 and the estimated total distance from speaker 103 of controlled device 101 to microphone 104 of controlled device 101 via the object, such as hand 102 .
- the estimated distance is the sum of the distance from speaker 103 to the object (e.g., hand 102 ) and the distance from the object to microphone 104 . If the pair consisting of speaker 103 and microphone 104 is co-located, the distance between speaker 103 and the object is half the estimated distance. When the number of speakers 103 and the number of dimensions are both equal to 2, the position of hand 102 is estimated as the intersection of the circles centered at each of the speakers 103 , with each radius derived from the corresponding estimated distance.
- the object is located at the intersection of ellipses (2D) or ellipsoids (3D) whose foci are at speaker 103 and microphone 104 respectively, and the total distance to the foci is the corresponding estimated distance. Therefore, the object's position can be estimated based on the total distance to the foci.
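In the co-located case, each ellipse degenerates to a circle centered at a speaker with radius equal to half the round-trip distance, and the position is the circles' intersection. A 2D sketch with assumed speaker coordinates:

```python
import math

# Intersect two circles (speaker positions and half round-trip distances)
# to localize the object in 2D; the branch with y >= 0 is returned.
def circle_intersection(c1, r1, c2, r2):
    d = math.dist(c1, c2)
    a = (r1**2 - r2**2 + d**2) / (2 * d)          # offset along the center line
    h = math.sqrt(max(r1**2 - a**2, 0.0))         # perpendicular offset
    ex, ey = (c2[0] - c1[0]) / d, (c2[1] - c1[1]) / d
    return (c1[0] + a * ex - h * ey, c1[1] + a * ey + h * ex)

# Two speakers 0.4 m apart; the hand is actually at (0.1, 0.3).
s1, s2, hand = (0.0, 0.0), (0.4, 0.0), (0.1, 0.3)
r1, r2 = math.dist(s1, hand), math.dist(s2, hand)
print(circle_intersection(s1, r1, s2, r2))        # ~ (0.1, 0.3)
```

With more speakers than dimensions, the same constraints would be combined (e.g., by least squares) rather than intersected exactly.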
- controlled device 101 transmits Frequency Modulated Continuous Waves (FMCW) audio signals (whether audible or inaudible to humans) from speaker(s) 103 to the object, such as hand 102 .
- step 404 microphone 104 of controlled device 101 receives samples of the audio signals reflected from the object, such as hand 102 , that were emitted by speaker(s) 103 over a period of time.
- controlled device 101 mixes the received audio signals with the transmitted audio signals.
- controlled device 101 performs fast Fourier Transform (FFT) on the mixed audio signals.
- controlled device 101 selects one or more peak frequencies in the frequency domain of the FFT mixed audio signals.
- FFT is performed to determine the peak frequencies in the mixed signals. Then the spectral points are filtered out to remove reflections from the static objects. Next, one of the remaining peaks is selected based on its magnitude and/or consistency with the velocity estimated from the actual Doppler shift. The selected peak is rounded to the closest spectral point, and the rounded frequency is converted to the distance estimate using Equation (4). This procedure works because the initial gesture is slow, so its Doppler shift is well below 50 Hz (half the spectral-point spacing).
- the transmitted chirp signal has fundamental frequencies that are all multiples of the frequency 1/T. In other words, in the frequency domain, it has spectral points with an interval of 1/T Hz. For example, when the chirp interval is 0.01 s, it has spectral points every 100 Hz.
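Because static reflections sit exactly on this 1/T spectral grid, they can be suppressed by zeroing those bins before peak picking. A sketch over a synthetic spectrum, assuming a 10 Hz FFT resolution and the 100 Hz fundamental:

```python
import numpy as np

# Zero the FFT bins that fall on the spectral points (multiples of 1/T),
# where reflections from static objects concentrate, then pick the peak.
def remove_static(freqs, spec, fundamental=100.0, tol=1.0):
    spec = spec.copy()
    static = np.abs(freqs % fundamental) < tol
    static |= np.abs(freqs % fundamental - fundamental) < tol  # guard float drift
    spec[static] = 0.0
    return spec

freqs = np.arange(0, 2000, 10.0)
spec = np.zeros_like(freqs)
spec[freqs == 600.0] = 5.0     # strong static reflector, on a spectral point
spec[freqs == 620.0] = 3.0     # moving hand, Doppler-shifted off the grid
filtered = remove_static(freqs, spec)
print(freqs[np.argmax(filtered)])   # 620.0
```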
- the received signal ⁇ r (t) is a simple time shifted version of the transmitted signal ⁇ t (t), and has the same period.
- υm(t) = υr(t) · υt(t)
- FIG. 8A shows the simulation result of FMCW range detection when the transmitter and receiver are both static and 0.5 m apart in accordance with an embodiment of the present invention.
- the carrier frequency, bandwidth, and duration of the chirp signal are 17 kHz, 2 kHz, and 0.01 s, respectively.
- FIG. 8A is the FFT result of ⁇ m (t).
- ⁇ m (t) has power every 100 Hz.
- the peak at 0 Hz is caused by the self-interference that the transmitted signal is directly received without reflection.
- the self-interference can be filtered by ignoring the peak around 0 Hz.
- another peak at 600 Hz is detected and considered to be due to the reflection from the target.
- Using Equation (4), one can determine that the total distance is 1.02 m; the distance from the transceiver to the target is half of that, 0.51 m, since the total distance includes both the forward and reflected paths.
- FIG. 8B plots the simulation result when the range is 0.5 m and the Doppler shift is 20 Hz in accordance with an embodiment of the present invention.
- a combined frequency shift at 620 Hz is observed. It is important to decouple the overall shift into the distance-based shift and the Doppler shift to achieve high accuracy.
- FIGS. 9A-9B show the FMCW signals with and without the peaks in the spectral points, respectively, while the hand is moving in accordance with an embodiment of the present invention. Ignoring the spectral points, one can clearly observe the shift caused by the moving hand 102 .
- controlled device 101 estimates the “pseudo Doppler shift” in the frequency domain of the FFT mixed audio signals for each selected peak frequency, where the pseudo Doppler shift is the difference between the current peak frequency versus the closest spectral point below the current peak frequency in the FFT mixed audio signals.
- the pseudo Doppler shift is defined as the difference between the current peak frequency and the closest spectral point below it. If the fundamental frequency is 100 Hz and the peak frequency is 620 Hz, then the pseudo Doppler shift is 20 Hz. Similarly, a peak frequency at 660 Hz has a pseudo Doppler shift of 60 Hz; in this case, the actual Doppler shift may be 60 Hz or −40 Hz. As discussed further below, the pseudo Doppler shift is translated to the actual Doppler shift. The pseudo Doppler shift is defined with respect to the spectral point below the current peak frequency for consistency. For example, suppose there are two peaks at 660 Hz and 540 Hz. To combine the two estimates, one should combine 60 Hz with 40 Hz, instead of −40 Hz with 40 Hz.
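The definition above reduces to a modulo operation against the fundamental frequency; a sketch:

```python
# Pseudo Doppler shift: the offset of a peak from the nearest spectral
# point *below* it, with the 100 Hz fundamental (1/T) of the example.
def pseudo_doppler(peak_hz: float, fundamental: float = 100.0) -> float:
    return peak_hz % fundamental

print(pseudo_doppler(620.0))   # 20.0
print(pseudo_doppler(660.0))   # 60.0 (the actual shift may be 60 or -40 Hz)
```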
- controlled device 101 combines the multiple pseudo Doppler shifts into one pseudo Doppler shift.
- the frequency spectrum is divided into slots, where the n-th slot spans from $\left(\frac{n}{T} - \frac{1}{2T}\right)$ Hz to $\left(\frac{n}{T} + \frac{1}{2T}\right)$ Hz, and n is an integer smaller than B×T. For example, when T is 0.01 s, the n-th slot spans (n×100−50, n×100+50) Hz.
- the slot with the maximum peak is searched and denoted as k.
- the pseudo Doppler shift is computed in the slot k and its 4 nearby slots.
- σi and fd,i denote the peak magnitude and the pseudo Doppler shift of the i-th slot, respectively.
- the highest peak and its nearby peaks are used for the Doppler estimation of hand 102 , because the user's hand 102 is the closest moving object to speakers 103 when the user is facing controlled device 101 (a common usage scenario).
- FIG. 10 shows a snapshot of the received FMCW signal while hand 102 is moving towards microphone 104 in accordance with an embodiment of the present invention.
- the highest peak is in slot 6 .
- the peaks from slots 4 to 8 are combined to get the final estimate of the pseudo Doppler shift.
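One natural reading of this combination is a maximal-ratio-style weighted average, with each slot's pseudo Doppler shift weighted by its peak magnitude σi; the linear weighting is an assumption.

```python
import numpy as np

# Combine the pseudo Doppler shifts of the strongest slot and its neighbors,
# weighting each slot by its peak magnitude sigma_i.
def combine_shifts(sigmas, shifts):
    sigmas, shifts = np.asarray(sigmas), np.asarray(shifts)
    return float(np.sum(sigmas * shifts) / np.sum(sigmas))

# Slots 4..8 with the highest peak in slot 6 (as in FIG. 10).
sigmas = [1.0, 2.0, 4.0, 2.0, 1.0]
shifts = [18.0, 19.0, 20.0, 21.0, 22.0]
print(combine_shifts(sigmas, shifts))   # 20.0
```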
- controlled device 101 translates the pseudo Doppler shift to the actual Doppler shift.
- the frequency shift comes partly from the propagation delay and partly from the Doppler shift.
- the shift caused by the propagation delay should be multiples of the fundamental frequency and the difference between the spectral point and the frequency of the current peak is caused by the Doppler shift.
- the hand movement is typically within 1 m/s, and its corresponding frequency shift should be within 100 Hz.
- FIG. 11 shows an example of the Doppler shift estimation in accordance with an embodiment of the present invention. At 1.1 seconds, the actual Doppler shift is 59 Hz, but this scheme (see curve 1101 ) estimates −41 Hz, which results in a significant tracking error.
- a simple search is used based on the previous movement (see curve 1102 “search”). It tries each of the two choices every slot and picks the combination that minimizes the velocity change over all the slots.
- the “search” selects the correct Doppler shift. Therefore, the MRC output discussed above is used to estimate the pseudo Doppler shift and the “search” is applied to translate the pseudo Doppler shift to the actual Doppler shift.
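The "search" can be sketched as a scan over the two candidate actual shifts per slot (f or f minus the fundamental), keeping the combination with the smallest total velocity change; the previous-shift anchor and exhaustive enumeration are implementation assumptions (a dynamic program would scale better over long windows).

```python
from itertools import product

# Translate pseudo Doppler shifts to actual Doppler shifts: each slot has
# two candidates, and the combination minimizing the total change in shift
# (velocity) across slots is kept.
def resolve_doppler(pseudo, prev=0.0, fundamental=100.0):
    options = [(p, p - fundamental) for p in pseudo]
    def cost(seq):
        full = (prev,) + seq
        return sum(abs(b - a) for a, b in zip(full, full[1:]))
    return list(min(product(*options), key=cost))

# Shifts drift smoothly upward; a naive per-slot choice would jump to -41 Hz
# at the last slot, as in FIG. 11.
print(resolve_doppler([20.0, 40.0, 59.0], prev=15.0))   # [20.0, 40.0, 59.0]
```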
- controlled device 101 estimates the velocity of the movement of the object, such as hand 102 , using the actual Doppler shift.
- the velocity estimate of the movement of hand 102 is then estimated using the estimated actual Doppler shift in Equation (5) as discussed above.
- controlled device 101 selects one or more peak frequencies in the frequency domain of the FFT mixed signals based on the estimated velocity of the movement of the object.
- range estimation refers to the distance between hand 102 and controlled device 101 , specifically speaker 103 of controlled device 101 .
- Doppler estimation refers to the velocity of hand 102 relative to controlled device 101 , specifically relative to speaker 103 of controlled device 101 .
- a peak whose distance change is most consistent with the one estimated by the Doppler shift is found. This is based on the observation that Doppler estimation over a short interval is more reliable than the range estimation. Specifically, let v denote the velocity from the current Doppler estimation and t s denote the sampling interval. Then v·t s is the distance change during the current interval estimated using the velocity. Meanwhile, one can estimate the distance change from the FMCW.
- for each candidate peak i, the distance change from the previous position is computed (i.e., d i (t) − d(t−1)).
- the particular i that is selected is the one that minimizes |d i (t) − d(t−1) − v·t s |.
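This selection rule can be sketched as follows, with illustrative names and values:

```python
# Among candidate FMCW peak distances, pick the one whose implied distance
# change best matches the change predicted by the Doppler velocity (v * t_s).
def select_peak(dists, prev_dist, v, t_s):
    """Index of the candidate distance most consistent with the Doppler shift."""
    predicted = prev_dist + v * t_s
    return min(range(len(dists)), key=lambda i: abs(dists[i] - predicted))

# Hand 0.51 m away, moving at 0.5 m/s toward the device, 10 ms interval:
# the 0.505 m candidate matches the Doppler prediction best.
print(select_peak([0.60, 0.505, 0.43], prev_dist=0.51, v=-0.5, t_s=0.01))  # 1
```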
- controlled device 101 estimates the distance between the object, such as hand 102 , and speaker 103 of controlled device 101 based on the selected one or more peak frequencies using Equation (4).
- controlled device 101 computes the coordinates of the object, such as hand 102 , using the estimated velocity of the object, the estimated distance between speakers 103 and microphones 104 of controlled device 101 , the estimated total distance from speaker 103 of controlled device 101 to microphone 104 of controlled device 101 via the object, and the initial position of the object.
- controlled device 101 continues to receive samples of the audio signals reflected from the object, such as hand 102 , that were emitted by speaker(s) 103 over the next period of time (e.g., 100 ms) in step 404 .
- R k [t] and f D,k [t] denote the distance and Doppler shift from the anchor point k at time t, respectively.
- the distance measurement is obtained using the velocity from the Doppler shift and the previous position (which is derived from the initial position).
- the Doppler-based distance R D,k [t] is as follows: $R_{D,k}[t] = D_k[t-1] + \frac{V_c\, f_{D,k}[t]}{2 f_k}\, t_s$,
- the device-free motion tracking system of the present invention provides a new way for users to interact with the world by simply moving their hands. Users can freely play video games, interact with VR/AR devices, and control smart appliances anywhere at any time.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
The phase u(t) of the signal is calculated by integrating f(t) over time, which is: $u(t) = 2\pi\left(f_c t + \frac{B t^2}{2T}\right)$.
As a result, the transmitted chirp signal can be represented as υtx(t)=cos(u(t)), where its magnitude is assumed to be one for simplicity.
where again the magnitude change is ignored. Let R, Vc, and V denote the distance between the transceiver and the target, the propagation speed, and the target's velocity, respectively. Then, the reflected signal is delayed by: $\tau_d = \frac{2(R+Vt)}{V_c}$. (1)
Similarly, if both the number of speakers 103 and dimensions are 3, the solution is the intersection of the spheres centered at these speakers 103 with the corresponding radii. If there are more speakers 103 than the number of dimensions (i.e., there are more constraints than the unknowns), one can localize
where P is the coordinate of
where Dk[t−1] is the distance from the k-th speaker at the time slot t−1, fk is the carrier frequency of speaker 103, Vc is the propagation speed of the audio signal, and ts is the FMCW sampling interval.
where (xk, yk) is the position of the k-th speaker, and α and β are the constant weighting factors determined by the reliability of the range and Doppler shift estimation results, respectively. In the exemplary evaluation, α was set to 0.5 and β was set to 1 since the Doppler shift was found to be more accurate than FMCW.
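How α and β enter the estimate is not spelled out here; one plausible sketch is a normalized weighted mean of the FMCW range and the Doppler-based distance per speaker, using the exemplary weights 0.5 and 1. The normalization is an assumption about how the weights enter the objective.

```python
# Fuse the FMCW range estimate with the Doppler-based distance using the
# constant weights alpha and beta (0.5 and 1 in the exemplary evaluation,
# since the Doppler shift was found to be more accurate than FMCW).
def fuse_distance(r_fmcw, r_doppler, alpha=0.5, beta=1.0):
    return (alpha * r_fmcw + beta * r_doppler) / (alpha + beta)

print(fuse_distance(0.52, 0.505))   # ~0.51, pulled toward the Doppler estimate
```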
Claims (38)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/414,084 US10386482B2 (en) | 2016-01-25 | 2017-01-24 | Device-free tracking system that accurately tracks hand movement |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201662286521P | 2016-01-25 | 2016-01-25 | |
| US15/414,084 US10386482B2 (en) | 2016-01-25 | 2017-01-24 | Device-free tracking system that accurately tracks hand movement |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20170212235A1 US20170212235A1 (en) | 2017-07-27 |
| US10386482B2 true US10386482B2 (en) | 2019-08-20 |
Family
ID=59359036
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/414,084 Active 2037-12-02 US10386482B2 (en) | 2016-01-25 | 2017-01-24 | Device-free tracking system that accurately tracks hand movement |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US10386482B2 (en) |
Families Citing this family (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10386482B2 (en) * | 2016-01-25 | 2019-08-20 | Board Of Regents, The University Of Texas System | Device-free tracking system that accurately tracks hand movement |
| US10572001B2 (en) | 2016-12-09 | 2020-02-25 | Board Of Regents, The University Of Texas System | Controlling a device by tracking the movement of a finger |
| CN107607923B (en) * | 2017-08-21 | 2021-07-30 | 上海交通大学 | Vibration monitoring system and signal processing method based on LFMCW radar |
| US10641905B2 (en) * | 2017-09-15 | 2020-05-05 | Qualcomm Incorporated | Velocity bias compensation for swimmer position tracking |
| KR102649497B1 (en) | 2017-12-22 | 2024-03-20 | 레스메드 센서 테크놀로지스 리미티드 | Apparatus, system, and method for physiological sensing in vehicles |
| CN111629658B (en) | 2017-12-22 | 2023-09-15 | 瑞思迈传感器技术有限公司 | Apparatus, system, and method for motion sensing |
| EP3727134B8 (en) | 2017-12-22 | 2023-03-08 | ResMed Sensor Technologies Limited | Processor readable medium and corresponding method for health and medical sensing |
| EP4155782B1 (en) * | 2018-06-05 | 2024-04-03 | Google LLC | Systems and methods of ultrasonic sensing in smart devices |
| SG11202101826WA (en) | 2018-08-23 | 2021-03-30 | Univ Texas | Controlling a device by tracking movement of hand using acoustic signals |
| CN111965653A (en) * | 2020-07-13 | 2020-11-20 | 华中科技大学 | Distance measurement method and system based on linear sweep frequency sound wave |
| CN112180377B (en) * | 2020-09-22 | 2023-07-14 | 湖南大学 | A non-contact human-computer interaction positioning method, tracking method, terminal and readable storage medium |
| US11360214B2 (en) * | 2020-10-08 | 2022-06-14 | Aeva, Inc. | Techniques for ghosting mitigation in coherent lidar systems |
| US20230417890A1 (en) * | 2022-06-27 | 2023-12-28 | Samsung Electronics Co., Ltd. | System and method for measuring proximity between devices using acoustics |
| US12094488B2 (en) * | 2022-10-22 | 2024-09-17 | SiliconIntervention Inc. | Low power voice activity detector |
Patent Citations (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US3427617A (en) * | 1959-04-21 | 1969-02-11 | Hazeltine Research Inc | Signal transmitting and receiving system |
| US6002808A (en) | 1996-07-26 | 1999-12-14 | Mitsubishi Electric Information Technology Center America, Inc. | Hand gesture control system |
| US6236736B1 (en) | 1997-02-07 | 2001-05-22 | Ncr Corporation | Method and apparatus for detecting movement patterns at a self-service checkout terminal |
| US6918875B2 (en) * | 2000-11-02 | 2005-07-19 | Japan Science And Technology Corporation | Ultrasound measurement apparatus |
| US20120327125A1 (en) | 2011-06-23 | 2012-12-27 | Omek Interactive, Ltd. | System and method for close-range movement tracking |
| US8938124B2 (en) | 2012-05-10 | 2015-01-20 | Pointgrab Ltd. | Computer vision based tracking of a hand |
| US9170325B2 (en) * | 2012-08-30 | 2015-10-27 | Microsoft Technology Licensing, Llc | Distance measurements between computing devices |
| US20160063611A1 (en) * | 2014-08-30 | 2016-03-03 | Digimarc Corporation | Methods and arrangements including data migration among computing platforms, e.g. through use of steganographic screen encoding |
| US20170347951A1 (en) * | 2014-12-08 | 2017-12-07 | University Of Washington | Systems and methods of identifying motion of a subject |
| US20160321917A1 (en) * | 2015-04-30 | 2016-11-03 | Board Of Regents, The University Of Texas System | Utilizing a mobile device as a motion-based controller |
| US20170164321A1 (en) * | 2015-12-04 | 2017-06-08 | Board Of Regents, The University Of Texas System | Accurately tracking a mobile device to effectively enable mobile device to control another device |
| US10182414B2 (en) * | 2015-12-04 | 2019-01-15 | Board Of Regents, The University Of Texas System | Accurately tracking a mobile device to effectively enable mobile device to control another device |
| US20170212235A1 (en) * | 2016-01-25 | 2017-07-27 | Board Of Regents, The University Of Texas System | Device-free tracking system that accurately tracks hand movement |
| WO2018106872A1 (en) * | 2016-12-09 | 2018-06-14 | Board Of Regents, The University Of Texas System | Controlling a device by tracking the movement of a finger |
| US20180164874A1 (en) * | 2016-12-09 | 2018-06-14 | Board Of Regents, The University Of Texas System | Controlling a device by tracking the movement of a finger |
Non-Patent Citations (2)
| Title |
|---|
| Nandakumar et al., "Contactless Sleep Apnea Detection on Smartphones," The 13th International Conference on Mobile Systems, Applications, and Services, Florence, Italy, May 18-22, 2015, pp. 1-13. |
| Yun et al., "Turning a Mobile Device into a Mouse in the Air," The 13th International Conference on Mobile Systems, Applications, and Services, Florence, Italy, May 18-22, 2015, pp. 1-15. |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180157333A1 (en) * | 2016-12-05 | 2018-06-07 | Google Inc. | Information privacy in virtual reality |
| US10817066B2 (en) * | 2016-12-05 | 2020-10-27 | Google Llc | Information privacy in virtual reality |
Also Published As
| Publication number | Publication date |
|---|---|
| US20170212235A1 (en) | 2017-07-27 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10386482B2 (en) | Device-free tracking system that accurately tracks hand movement | |
| Yun et al. | Strata: Fine-grained acoustic-based device-free tracking | |
| Mao et al. | CAT: High-precision acoustic motion tracking | |
| Mao et al. | Rnn-based room scale hand motion tracking | |
| CN103229071B (en) | Systems and methods for object position estimation based on ultrasound reflection signals | |
| US10182414B2 (en) | Accurately tracking a mobile device to effectively enable mobile device to control another device | |
| Wang et al. | Device-free gesture tracking using acoustic signals | |
| US8681585B2 (en) | Multi-range object location estimation | |
| Rishabh et al. | Indoor localization using controlled ambient sounds | |
| EP2486474B1 (en) | User interfaces | |
| US10572001B2 (en) | Controlling a device by tracking the movement of a finger | |
| US20160321917A1 (en) | Utilizing a mobile device as a motion-based controller | |
| Wang et al. | {MAVL}: Multiresolution analysis of voice localization | |
| Shah et al. | Step-frequency radar with compressive sampling (SFR-CS) | |
| JP2010522879A (en) | System and method for positioning | |
| Zhuang et al. | ReflecTrack: Enabling 3D acoustic position tracking using commodity dual-microphone smartphones | |
| Cheng et al. | Push the limit of device-free acoustic sensing on commercial mobile devices | |
| Pfeil et al. | Robust acoustic positioning for safety applications in underground mining | |
| AlSharif et al. | Zadoff-Chu coded ultrasonic signal for accurate range estimation | |
| Cheng et al. | PD-FMCW: Push the limit of device-free acoustic sensing using phase difference in FMCW | |
| Khyam et al. | Pseudo-orthogonal chirp-based multiple ultrasonic transducer positioning | |
| US11474194B2 (en) | Controlling a device by tracking movement of hand using acoustic signals | |
| Lian et al. | Room-scale Location Trace Tracking via Continuous Acoustic Waves | |
| Ohara et al. | Preliminary investigation of position independent gesture recognition using wi-fi csi | |
| Yun | Towards accurate object tracking using acoustic signal |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: BOARD OF REGENTS, THE UNIVERSITY OF TEXAS SYSTEM, Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:QIU, LILI;YUN, SANGKI;SIGNING DATES FROM 20170117 TO 20170121;REEL/FRAME:041065/0799 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 4 |