
US20150205366A1 - Viewpoint Change on a Display Device Based on Movement of the Device - Google Patents


Info

Publication number
US20150205366A1
Authority
US
United States
Prior art keywords
image
display
orientation
time
display screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/672,856
Inventor
Barry Lee Petersen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CELSIA LLC
Original Assignee
CELSIA LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CELSIA LLC filed Critical CELSIA LLC
Priority to US14/672,856
Publication of US20150205366A1
Status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/10 Geometric effects
    • G06T 15/20 Perspective computation
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/60 Rotation of whole images or parts thereof
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00 Indexing scheme for image generation or computer graphics
    • G06T 2210/16 Cloth

Definitions

  • the change in position of the human figure may be as a result of a change in orientation of the display device. In the first orientation, shown in FIG. 5A, the arm is down 570. In FIG. 5B, a second orientation and a second position of the human figure are shown (here, with a raised arm 572).
  • upon returning to the first orientation, shown in FIG. 5C, the arm is now raised 574. In this manner, any feature of the human figure may be changed, such as a smile to a frown, an arm up to an arm down, indicia on clothing, and the like.
  • the human figure may be toggled. That is, for example, the device is moved from the orientation of FIG. 5A to that of FIG. 5B and back to the original orientation, and human figure 514 is shown. Do it again, and human figure 510 is shown.
  • the video or images could be continuously changing as time progresses, but the set of video images shown at every particular orientation of the device would be relative to each particular orientation.
  • in the first orientation, the person 510 might be speaking for one minute. If left in that orientation, the speaker would conduct his speech with his arm down 570 for the entirety of the speech.
  • if the user instead changes the device to a second orientation, as shown in FIG. 5B, the speaker may raise his hand 572 during the course of the one-minute speech.
  • upon returning to the first orientation, shown in FIG. 5C, his hand 574 is now raised until the one-minute speech is complete. In this way, the movement to the second orientation shown in FIG. 5B effects the change in the position of the human figure.
  • FIG. 6 shows an example of a display device tilted upwards and downwards with respect to the viewer in an embodiment of the disclosed technology.
  • the display device 600 is turned 90 degrees clockwise with respect to that of the display device shown in FIG. 2 , but the tilt is analogous to the directional change indicated by arrow 270 in FIG. 2E .
  • FIG. 7 shows an example of the changing orientation of a display with two human figures in an embodiment of the disclosed technology.
  • both human figures 710 and 720 are exhibited.
  • as the orientation changes (left/right, cycled through FIGS. 7A, 7B, 7C, 7D, and 7E), the human figures 710 and 720 are rotated; that is, the perspective from which they are viewed is rotated from the point of view of the viewer.
  • they are rotated about a center of gravity 715, which is a combined center of gravity of the figures shown, e.g., roughly at the point of contact of their hands.
  • Any of the techniques and embodiments shown and described with reference to FIGS. 1 through 6 may be employed with the two or more human figures 710 and 720 shown in FIG. 7.
  • FIG. 8 shows an example of the changing orientation of a display device as well as the changing position of the human figures exhibited on the display device in embodiments of the disclosed technology. That which is shown in FIGS. 8A through 8D is analogous to a combination of what has been shown and described with respect to FIG. 5 (change in position of the human figure) and FIG. 7 (using multiple human figures). In this manner, whether based on time or orientation change or both, the orientation of the human figures 810 and 820 and the position of the human figures on device 800 are changed.
  • any element of the displayed human figures may change, including their position, position of one figure relative to the other, color of clothing or indicia, addition or removal of a logo, change in facial expression, addition of props (e.g., a football), and the like.
  • FIG. 9 shows steps taken in a method of carrying out embodiments of the disclosed technology.
  • a sequence of images of a human figure is stored in various orientations. This is at a first moment or instant in time.
  • images may be stored around the left and right side of the person.
  • images are stored over a period of time. These images may be from one or multiple orientations, the human figure may be moving or changing in some manner (e.g., clothes changing color or styles to show different clothing options for sale), and the images may form a video.
  • in step 930, which can be carried out before, after, or both before and after step 960, orientation data is received in a continuous manner from an orientation sensor, such as an accelerometer, a compass, or another instrument for measuring orientation of, or relative to, a pre-existing magnetic field.
  • in step 950, it is determined how the display device of embodiments of the disclosed technology is moving or being reoriented, whether tilting up/down, tilting left/right, being rotated left/right, or being reoriented towards a different cardinal direction.
  • in step 940, orientation data is further received from a secondary orientation sensor, which could also be an accelerometer or a compass.
  • in step 960, an image of a human figure is exhibited, which is either a pre-defined first image or an image based on the orientation or position of the display device.
  • Steps 930 and 940 continue to be carried out, and in step 970 , upon receiving further movement data from the accelerometer (such as movement past a predefined threshold) or new orientation data from a compass, the image is changed.
  • the image may additionally change over time, such as with a video or sequence, in step 980 .
  • a change in time and a change in orientation would produce a combined change, in embodiments of the disclosed technology.
  • the human figure moves irrespective of a change in viewpoint (orientation) of the exhibited human figure.
  • a change in viewpoint may also occur, as determined by motion data received with regard to the display device; a sketch of this combined display loop appears below.
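  • The steps above can be gathered into a simple event loop. The following Python sketch is illustrative only: images[t][a] is assumed to hold the stored image of the figure at time-step t and viewing angle a, and read_orientation and show are hypothetical stand-ins for whatever sensor and display APIs the device provides.

        import time

        def run_display(images, read_orientation, show,
                        threshold=2.0, frame_period=0.04):
            """Sketch of the FIG. 9 method: continuously read orientation
            (step 930), change the exhibited view upon movement past a
            predefined threshold (steps 950/970), and advance frames over
            time (step 980), combining both kinds of change."""
            angle_idx, frame = 0, 0
            last_angle = read_orientation()
            last_frame_time = time.monotonic()
            while frame < len(images):
                angle = read_orientation()                   # step 930: continuous sensor data
                if abs(angle - last_angle) > threshold:      # step 970: threshold passed
                    angle_idx = int(angle) % len(images[0])  # one stored view per degree, for simplicity
                    last_angle = angle
                now = time.monotonic()
                if now - last_frame_time >= frame_period:    # step 980: time-based change
                    frame, last_frame_time = frame + 1, now
                if frame < len(images):
                    show(images[frame][angle_idx])           # step 960: exhibit the image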
  • FIG. 10 shows a high level block diagram of a specialized image input and display device on which embodiments of the disclosed technology may be carried out.
  • the device may comprise some or all of the high level elements shown in FIG. 10 and may comprise further devices or be part of a larger device.
  • Data bus 1070 transports data between the numbered elements shown in device 1000 .
  • Central processing unit 1040 receives and processes instructions such as code.
  • Volatile memory 1010 and non-volatile memory 1020 store data for processing by the central processing unit 1040 .
  • the data storage apparatus 1030 may be magnetic media (e.g., hard disk, video cassette), optical media (e.g., Blu-Ray or DVD) or another type of storage mechanism known in the art.
  • the data storage apparatus 1030 or the non-volatile memory 1020 stores data which is sent via bus 1070 to the video output 1060 .
  • a datum received from an accelerometer or compass 1090 is processed by the central processing unit 1040 to determine if a change in viewpoint or orientation has been made.
  • the displayed image is outputted via a video output 1060 , that is, a transmitter or video relay device which transmits video to a television screen, monitor, or other display device 1080 via cable or data bus 1065 .
  • the video output 1060 may also be an output over a packet-switched network 1065 such as the internet, where it is received and interpreted as video data by a recipient display 1080 .
  • the recipient display may be a liquid crystal display, cathode ray tube, or series of light-emitting diodes, or any other known display system.
  • An input/output device 1050, such as buttons on the device itself, an infrared signal receiver for use with a remote control, or a network input/output for control via a local or wide area network, receives and/or sends a signal via data pathway 1055 (e.g., an infrared signal, a signal over copper or fiber cable, a wireless network, etc.).
  • the input/output device receives input from a user, such as which image to display and how to interact with a detected object.
  • FIG. 11 shows a high-level block diagram of a computer that may be used to carry out the disclosed technology.
  • Computer 1100 contains a processor 1150 that controls the overall operation of the computer by executing computer program instructions which define such operation.
  • the computer program instructions may be stored in a storage device 1120 (e.g., magnetic disk, database) and loaded into memory 1130 when execution of the computer program instructions is desired.
  • the computer operation will be defined by computer program instructions stored in memory 1130 and/or storage 1120 , and the computer will be controlled by processor 1150 executing the computer program instructions.
  • Computer 1100 also includes one, or a plurality of, input network interfaces for communicating with other devices via a network (e.g., the internet).
  • Computer 1100 also includes one or more output network interfaces 1110 for communicating with other devices.
  • Computer 1100 also includes input/output 1140 , representing devices which allow for user interaction with the computer 1100 (e.g., display, keyboard, mouse, speakers, buttons, touch-sensitive screen, etc.).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Economics (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Embodiments of the disclosed technology comprise a handheld display device with a built-in accelerometer and, in some embodiments, a compass. The display of a human figure is changed based on a change in viewpoint/orientation of the device. That is, upon detecting a change in viewpoint (e.g., viewing angle, tilt, roll, or pitch of the device), the image of the person changes. This may be used with a still picture of a person, such as for the sale of clothing, or in conjunction with moving images, such as for a sports or exercise instructional video.

Description

    FIELD OF THE DISCLOSED TECHNOLOGY
  • The disclosed technology relates generally to viewing on a display device and, more specifically, to changing a viewing angle based on physical orientation of the device.
  • BACKGROUND OF THE DISCLOSED TECHNOLOGY
  • Shopping online is typically a glorified version of catalog shopping. On a catalog page, a picture of the item, or perhaps several pictures, is shown, in the case of clothing typically on a model. One can see the available sizes and prices as well. In an online catalog, generally the same information is available, but in some cases a person can choose a certain picture and zoom in. In some instances, videos are available. Someone wishing to learn a karate sequence might watch a video of it, see diagrams, and so forth. Still, the content shown is controlled, by and large, by the provider. The interactivity is limited to the familiar rewind, stop, play, and fast-forward features. While such features are useful, still pictures and even videos are a poor substitute for actually being there.
  • Using a mouse to click and drag around an object to transpose its position, or even rotate it, has limited translation to the real world and allows only a finite direction of change to what is viewed, based on how the mouse is currently mapped in an application. Generally, such movements either transpose the position of an object or of the camera on the XY axis, zoom in, or allow a user to choose a different picture on which the changes can be repeated.
  • What is needed is a way to make the user experience more real, such that a user feels more immersed in and in control of what s/he is watching, with the ability to control more axes of movement in a natural manner.
  • SUMMARY OF THE DISCLOSED TECHNOLOGY
  • It is an object of the disclosed technology to provide a simple and natural interface and control for viewing a human figure.
  • It is a further object of the disclosed technology to provide a handheld display device whereby a display becomes interactive when the device is moved.
  • It is a further object of the disclosed technology to take into account orientation and/or acceleration of a device to determine what is displayed on a device.
  • An embodiment of the disclosed technology is a display device with an orientation sensor. The orientation sensor, which for example could be an accelerometer or a compass, measures orientation of the display device relative to a fixed external directional vector and, in some embodiments, the rate of displacement of the device from the same directional vector. An accelerometer measures orientation or movement changes relative to gravity while a compass measures change in orientation or movement relative to a pole (e.g., relative to the north pole). Thus, depending on orientation of the device and direction of movement, the accelerometer, compass, or combination thereof determines a direction of movement of a display screen. The display device of this embodiment further has a storage device with data representative of a human figure, and a display exhibiting the human figure. The display of the human figure changes based on a direction of movement detected by the orientation sensor, and in some cases, also based on a direction of movement detected by a secondary orientation sensor.
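  • By way of illustration only (this sketch is not part of the original disclosure; the function name and the one-view-per-10-degrees granularity are assumptions), the mapping from a measured device orientation to a stored view of the figure can be as simple as:

        # Map a device heading measured against a fixed external directional
        # vector (e.g., magnetic north) onto one of several stored views.
        VIEWS_PER_REVOLUTION = 36  # one pre-rendered view per 10 degrees

        def view_index_for_heading(heading_degrees):
            step = 360 / VIEWS_PER_REVOLUTION
            return int(round(heading_degrees / step)) % VIEWS_PER_REVOLUTION

        print(view_index_for_heading(93))  # a device turned 93 degrees selects view 9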
  • The changing of the exhibited display may be a change in viewpoint of the human figure around a predefined center of gravity of the human figure, or a center of gravity of a plurality of human figures. The center of gravity may be an actual center of gravity, an estimated center of gravity, and/or a chosen point which is at the center, or approximately at the center, as defined in each case in a specific embodiment of the disclosed technology.
  • The display device may exhibit a display of a moving human figure, the human figure moving irrespective of, and in addition to, a change in viewpoint of the exhibited human figure. That is, a user of the display device may change the orientation of the device relative to his or her position, and the viewpoint of the human figure shown therein may continue to change with changes in device orientation around a vector orthogonal or partially orthogonal to the sensor's directional vector. The change in viewpoint may be non-linearly or linearly mapped to an amount of movement of the display device.
  • In a specific embodiment of the display device, upon a first rotation of the device at a first relative position, a first display of the human figure is exhibited. Upon a second rotation to a second relative position, a second display of the human figure is exhibited, such as frames in a continuous video. Upon a third rotation back to the first relative position, a third display of the human figure is exhibited. Thus, for example, a person with frontal face showing may be frowning in the first display at a first relative position of the device. When a user changes the position of the device relative to his or her position, the display of the human figure on the device is changed, such as, for example, to a view of the side of the same person's head. When returning to the first relative position showing the frontal view of the person's face again, the same person is smiling. This viewpoint change may be toggled during repeated successions of viewing and changing the view away from and back to the first relative position.
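  • A minimal sketch of this toggling behavior, assuming orientation has been reduced to a single angle and that "home" is the first relative position (the class and its parameters are hypothetical, not taken from the disclosure):

        class ToggledView:
            """Each full excursion away from the home orientation and back
            advances to the next stored state (e.g., frowning -> smiling)."""
            def __init__(self, states, home=0.0, tolerance=10.0):
                self.states, self.home, self.tol = states, home, tolerance
                self.index, self.away = 0, False

            def update(self, angle):
                at_home = abs(angle - self.home) <= self.tol
                if not at_home:
                    self.away = True     # the user has rotated away from home
                elif self.away:          # the user has returned: toggle the state
                    self.index = (self.index + 1) % len(self.states)
                    self.away = False
                return self.states[self.index]

        v = ToggledView(["frowning", "smiling"])
        print(v.update(0), v.update(45), v.update(0))  # frowning frowning smiling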
  • The human figure shown on the display device may be wearing clothes as part of an offer for sale of the clothes. As another usage, the display may be used to teach a user positioning during an exercise with defined positions, such as yoga, martial arts, sports, and pornography positions.
  • A further embodiment of the disclosed technology is a method of displaying a series of images on a handheld device. The method proceeds by storing a plurality of images representative of a human figure, measuring the orientation of the handheld device with an orientation sensor, exhibiting on the display a first image of the human figure, and changing the image exhibited based on a direction and rate of movement of the handheld device, as determined from the orientation sensor. The direction and rate of movement are determined by measuring changes in orientation over time compared to a fixed directional vector, such as acceleration due to gravity or magnetic north.
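  • The direction-and-rate determination of this method can be sketched as a difference of successive orientation samples (an illustrative fragment; the wrap-around handling is an implementation choice, not something the disclosure prescribes):

        def angular_rate(prev_angle, prev_time, cur_angle, cur_time):
            """Direction and rate of rotation, in degrees per second, from two
            orientation samples measured against a fixed directional vector
            (e.g., gravity or magnetic north). The sign gives the direction."""
            # Wrap into (-180, 180] so that crossing the 0/360 boundary is not
            # misread as a near-full rotation in the opposite direction.
            delta = (cur_angle - prev_angle + 180) % 360 - 180
            return delta / (cur_time - prev_time)

        print(angular_rate(350, 0.0, 10, 0.5))  # 350 -> 10 degrees in 0.5 s: 40.0 deg/s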
  • Further elements of the device of the disclosed technology are applicable to embodiments of the method of the disclosed technology.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an example of a display shown on a display device which changes based on an orientation change of the device.
  • FIG. 1A shows a first orientation of an example of a display device with images shown on the display that change based on an orientation change of the device relative to a stationary external environment.
  • FIG. 1B shows the display device at a second orientation, rotated to the left compared to the first orientation.
  • FIG. 1C shows the display device at a third orientation, rotated further to the left compared to the first and second orientations.
  • FIG. 2 demonstrates the use of an accelerometer to determine an orientation change in an embodiment of the disclosed technology.
  • FIG. 2A shows a familiar “portrait” configuration of an image to be reoriented on a display device.
  • FIG. 2B demonstrates the use of an accelerometer to change an orientation of the image of FIG. 2A relative to a three-dimensional plane, aligned parallel to the ground along one potential orthogonal axis relative to the accelerometer's gravitational orientation directional vector in an embodiment of the disclosed technology.
  • FIG. 2C shows a rotation around the top and bottom of the displayed person (of FIG. 2A) along the horizontal axis of the display device aligned with the external axis that is orthogonal to the orientation directional vector.
  • FIG. 2D shows an alternative rotation of the image along the vertical axis of the device.
  • FIG. 2E shows an alternative rotation of the image along the horizontal axis of the device.
  • FIG. 3 demonstrates the use of a compass to determine an orientation change in an embodiment of the disclosed technology.
  • FIG. 3A shows the familiar “portrait” configuration for illustrative purposes.
  • FIG. 3B demonstrates the use of a compass to determine an orientation change of the image of FIG. 3A displayed on a display device aligned parallel to the x-z plane of the three-dimensional external environment shown by the axes.
  • FIG. 4 demonstrates the use of an accelerometer and a compass to determine an orientation change in an embodiment of the disclosed technology.
  • FIG. 4A shows the display device and an image of a person rotating around the vertical axis at a first angle, orthogonal to the accelerometer's gravitational orientation directional vector with the image aligned to the compass's orientation directional vector.
  • FIG. 4B demonstrates the use of an accelerometer and a compass to change an orientation of the image of FIG. 4A.
  • FIG. 4C shows the display device and an image of a person rotating around the same axis at a second angle, orthogonal to the accelerometer's gravitational orientation directional vector with the image aligned to the compass's orientation directional vector.
  • FIG. 4D shows the display device and an image of a person rotating around the horizontal axis at a first angle, orthogonal to the accelerometer's gravitational orientation directional vector with the image aligned to the compass's orientation directional vector.
  • FIG. 4E demonstrates the use of an accelerometer and a compass to change an orientation of the image of FIG. 4D.
  • FIG. 4F shows the display device and an image of a person rotating around the same axis at a second angle, orthogonal to the accelerometer's gravitational orientation directional vector with the image aligned to the compass's orientation directional vector.
  • FIG. 5 shows an example of the changing position of a human figure in embodiments of the disclosed technology.
  • FIG. 5A shows a first position of a sequence of orientations of an example of changing the position of a human figure in embodiments of the disclosed technology.
  • FIG. 5B shows a second position of a sequence of orientations of an example of changing the position of a human figure in embodiments of the disclosed technology.
  • FIG. 5C shows a third position of a sequence of orientations of an example of changing the position of a human figure in embodiments of the disclosed technology.
  • FIG. 6 shows an example of a display device tilted upwards and downwards with respect to the viewer in an embodiment of the disclosed technology.
  • FIG. 6A shows an example of a display device at a first position with respect to the viewer in an embodiment of the disclosed technology.
  • FIG. 6B shows an example of a display device at a second position, tilted upwards with respect to the viewer in an embodiment of the disclosed technology.
  • FIG. 7 shows an example of changing orientation of a display with two human figures in an embodiment of the disclosed technology.
  • FIG. 7A shows a first portrait position in an example of changing orientation of a display with two human figures in an embodiment of the disclosed technology.
  • FIG. 7B shows a second position in an example of changing orientation of a display with two human figures in an embodiment of the disclosed technology.
  • FIG. 7C shows the image of FIG. 7A in a third position.
  • FIG. 7D shows the image of FIG. 7A in a fourth position.
  • FIG. 7E shows the image of FIG. 7A in a fifth position.
  • FIG. 8 shows an example of a changing orientation of a display device, as well as the changing position of the human figures exhibited on the display device in embodiments of the disclosed technology.
  • FIG. 8A shows a first position in an example of a changing orientation of a display device, as well as the changing position of the human figures exhibited on the display device in embodiments of the disclosed technology.
  • FIG. 8B shows the image of the human figures of FIG. 8A in a second position.
  • FIG. 8C shows the image of the human figures of FIG. 8A in a third position.
  • FIG. 8D shows the image of the human figures of FIG. 8A in a fourth position.
  • FIG. 9 shows steps taken in a method of carrying out embodiments of the disclosed technology.
  • FIG. 10 shows a high level block diagram of a specialized image input and display device on which embodiments of the disclosed technology may be carried out.
  • FIG. 11 shows a high-level block diagram of a computer that may be used to carry out the disclosed technology.
  • DETAILED DESCRIPTION OF EMBODIMENTS OF THE DISCLOSED TECHNOLOGY
  • Embodiments of the disclosed technology comprise a handheld display device with a built-in orientation sensor such as an accelerometer or a compass. The display of a human figure on the display screen changes based on a change in the orientation of the device with respect to an external orientation vector and the user's viewing angle. That is, upon detecting a change in the viewpoint (e.g., viewing angle, tilt, roll, or pitch of the device relative to the stationary user), the image of the person changes. This may be used with images of a non-moving person, such as for the sale of clothing, or in conjunction with images of a person moving, such as for a sports or exercise instructional video.
  • For purposes of this disclosure, an accelerometer is defined as a device which measures acceleration of a device relative to freefall. A single or multi-axis accelerometer may be used to carry out embodiments of the disclosed technology to sense orientation. An accelerometer measures the acceleration relative to a frame of reference. An accelerometer at rest relative to the earth's surface will indicate approximately 1 g upwards, because any point on the earth's surface is accelerating upwards relative to a local inertial frame. To obtain the acceleration due to motion with respect to the earth, the offset from gravity is subtracted, or the change in acceleration is measured to determine when an object is rotating about an axis. The combined measurements in relation to multiple axes are used to determine rotation which is unaligned with the earth.
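  • For instance, under the assumption of a three-axis accelerometer reporting in units of g with x and y in the screen plane (axis conventions vary between devices), the at-rest gravity reading yields the device's tilt; note that rotation about the gravity vector itself produces no change in the reading, which is one reason other embodiments add a compass:

        import math

        def tilt_angles(ax, ay, az):
            """Tilt of the device derived from a 3-axis accelerometer reading
            (in g). At rest the sensor sees only the ~1 g gravity vector, so
            that vector's direction in the device frame gives the device's
            orientation relative to the earth."""
            roll = math.degrees(math.atan2(ay, az))  # about the screen's horizontal axis
            pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))  # about its vertical axis
            return roll, pitch

        print(tilt_angles(0.0, 0.0, 1.0))  # flat and face-up: no tilt
        print(tilt_angles(0.0, 1.0, 0.0))  # stood upright on its bottom edge: 90-degree roll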
  • A compass, for purposes of this disclosure, is a device which determines the orientation of the display device of embodiments of the disclosed technology with respect to the plane of the earth's surface. It may detect true north, magnetic north, or any other direction. It may instead determine only a change in direction without knowing an actual direction, in embodiments of the disclosed technology. When determining a magnetic field, this may be by measuring the magnetic field directly or measuring another value and approximating the magnetic field/direction. A solid state compass, global positioning system, or other such device may be used for this purpose.
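  • A minimal heading computation for a solid-state compass might look like the following (illustrative only; axis conventions differ between devices, and a tilted device would first need tilt compensation using the accelerometer):

        import math

        def heading_degrees(mx, my):
            """Heading from the horizontal components of a magnetometer
            reading: 0 = magnetic north, increasing clockwise, assuming the
            device is held level and its x axis points 'forward'."""
            return math.degrees(math.atan2(my, mx)) % 360

        print(heading_degrees(1.0, 0.0))  # facing north under this convention: 0.0
        print(heading_degrees(0.0, 1.0))  # a quarter turn away: 90.0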
  • The display device used in embodiments of the disclosed technology is a display capable of changing an exhibited image and is a screen in two dimensions (e.g., a flat screen, CRT screen, LCD screen, plasma screen, etc.).
  • Embodiments of the disclosed technology will become clearer in view of the description of the following figures.
  • FIG. 1 shows an example of a display device with images shown on the display that change based on an orientation change of the device relative to a stationary external environment. FIG. 1A shows the display device at a first orientation. FIG. 1B shows the display device at a second orientation, rotated to the left compared to the first orientation. FIG. 1C shows the display device at a third orientation, rotated further to the left compared to the first and second orientations. In this example, as the display device is turned to the left, a clockwise rotation around a vertical axis, the image shown on the display changes and the person in the image appears to rotate to the left. FIG. 1A shows the initial image of the person 110 on the display device 100, such as when selecting a person to view. Such a selection might be made upon a decision to view a clothing item for possible purchase, or the like, as will be discussed below. The initial position of the display device is within an environment having an orientation relative to the external environment 120 where the x and y axes are parallel to the plane of the page on which the figure is printed, with the z axis pointing out towards the viewer. The orientation directional vector 130 is shown pointing into the plane of the paper and away from the viewer, and could represent, for example, either acceleration due to gravity or a magnetic field direction such as magnetic north. As the display device 100 is rotated to the left, the orientation of the device changes with respect to the viewer and the external environment. The image of the person presented on the display rotates to the left, along with the direction of rotation of the display device, in a second and third position of the person 112 and 114 and second and third orientation of the device relative to the external environment 120. In this example, the object on the display reacts just as a real object, such as a doll, would if held in the viewer's hands and rotated similarly, providing images that simulate the real-world rotation.
  • In embodiments of the disclosed technology, this is a linear or non-linear relationship. In a linear relationship, the rotation may be made to feel natural, that is, a person is looking around the object by tilting the screen. As the screen is tilted, so too is the image in an adjacent manner, whether it is rotated the same degree amount or a multiple of that amount. That is, by way of example, when the screen is rotated 30, 45, and 60 degrees, the person shown in the image is rotated 30, 45, and 60 degrees in a one-to-one linear correspondence, or 60, 90, and 120 degrees in a two-to-one linear correspondence, or −30, −45, and −60 in a negative one correspondence. In a non-linear example, when rotating the device/screen 30, 45, and 60 degrees, the person shown may be rotated 30, 90, and 180 degrees, respectively.
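  • These correspondences are easy to make concrete. In the sketch below, the gain values cover the linear cases, and the quadratic is just one curve that happens to reproduce the 30/90/180 non-linear example; the disclosure does not prescribe any particular function:

        def image_angle(device_angle, gain=1.0, nonlinear=False):
            """Map a measured device rotation (degrees) to the rotation of the
            displayed figure: gain=1 is the one-to-one case, gain=2 the
            two-to-one case, gain=-1 the negative one correspondence."""
            if not nonlinear:
                return gain * device_angle
            return device_angle * (device_angle - 15) / 15

        for d in (30, 45, 60):
            print(d, image_angle(d), image_angle(d, gain=2.0),
                  image_angle(d, nonlinear=True))
        # 30 30.0 60.0 30.0
        # 45 45.0 90.0 90.0
        # 60 60.0 120.0 180.0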
  • FIGS. 2 and 3 demonstrate the relationship of the display device to orientation directional vectors in embodiments of the disclosed technology. FIG. 2 demonstrates the use of an accelerometer to determine an orientation change in an embodiment of the disclosed technology. FIG. 3 demonstrates the use of a compass to determine an orientation change in an embodiment of the disclosed technology.
  • Referring first to FIG. 2A, the display device 200 (where possible, elements shown in FIG. 1 have been incremented by 100 in FIG. 2) displays an image of a person 210 on its display screen in the embodiment shown. Although the display screen of the device could actually have any dimension, FIG. 2A shows the familiar “portrait” configuration for illustrative purposes. The person image 210, in this example, is aligned and in a fixed position with respect to the top and bottom of the device. FIGS. 2B and 2C show the display device in an orientation relative to a three-dimensional plane 220, aligned parallel to the ground along one potential orthogonal axis 240 relative to the accelerometer's gravitational orientation directional vector, G, 230, in an example of a change in orientation of the display shown in FIG. 2A. The circular arrow 250 in FIGS. 2B and 2C shows the directions of user rotation around the center of the display device that is detected with an accelerometer in embodiments of the disclosed technology and used as a signal for choosing and changing images displayed on the device. The rotation along the vertical axis of the display device aligned with the external orthogonal axis shown in FIG. 2B represents changing the view around the right and left sides of the person, similar to the process shown and described with reference to FIG. 1.
  • FIG. 2C shows a rotation around the top and bottom of the displayed person (of FIG. 2A) along the horizontal axis of the display device aligned with the external axis that is orthogonal to the orientation directional vector. FIGS. 2D and 2E show the device with the same image alignment of FIG. 2A at a second orthogonal angular variation 260 and rotation directions 270. Again, FIG. 2D shows the rotation along the vertical axis of the device, while FIG. 2E shows the rotation along the horizontal axis of the device. Note that the axes of the display device 200 could actually be aligned along any axis orthogonal or partially orthogonal to the orientation directional vector, G, 230, and that the axis at angle 240 shown in FIGS. 2B and 2C, and the axis at angle 260 shown in FIGS. 2D and 2E are chosen simply for illustrative purposes. Also, the device may be rotated, in embodiments of the disclosed technology, around any axis orthogonal to the orientation directional vector G passing through the plane of the device, not just vertical (FIGS. 2B, 2D) or horizontal (FIGS. 2C, 2E), as long as the accelerometer can determine the angle of rotation. So long as image data is provided for such changes in orientation, a view in any direction around the person (or other object shown in the video display) is shown from any viewing direction by changing the device's orientation with respect to the orientation directional vector, G, 230 in the embodiment shown.
  • The rotations depicted by arrows 250 and 270 described above with reference to FIG. 2 are generally measurable with an accelerometer. The measured directional change of a point on the display device with respect to the gravity of the earth (e.g., acceleration of the display device relative to freefall) is determined, and, based on this determination, the image shown (e.g., person image 210) on the display device is changed accordingly. In such orientation changes as shown in FIG. 2, an accelerometer may be used without the aid of a compass or determination of an orientation relative to the surface of the earth.
  • FIG. 3 demonstrates the use of a compass to determine an orientation change in an embodiment of the disclosed technology. The elements of FIG. 3 are generally incremented by 100 compared to those of FIG. 2. FIG. 3 shows the orientation directional vector 330 obtained from a compass being used to generate the device orientation information that drives the image changing mechanism of the embodiment shown. Unlike an accelerometer, the compass is aligned only along the magnetic field lines of the earth with no relationship to mass and freefall, so only rotations 350 around the vertical axis are used for orientation determination with a compass. FIG. 3A shows a video display of an image of a person 310 on the screen of the display device 300 in the embodiment shown. The person image is aligned and in a fixed position with respect to the top and bottom of the device to simplify the understanding of embodiments of the disclosed technology. Although the display screen of the device could actually have any dimension, FIG. 3A shows the familiar “portrait” configuration for illustrative purposes. FIG. 3B shows the image of FIG. 3A displayed on a display device aligned parallel to the x-z plane of the three-dimensional external environment shown by the axes 320. When the display device turns with respect to a pole or cardinal direction, a compass or other magnetic field or directional indicator mechanism detects or measures this change (in terms of a relative change with respect to the original direction or a more absolute change with respect to the earth's magnetic fields). Rotations of the display device around the vertical axis, or comprising a vertical component to rotation about axis 340 orthogonal to the orientation directional vector, generate angle displacements that can be used to change the images of the person (or other object displayed). Similar to the images shown in FIG. 1, rotations 350 of the device 300 in either direction give the impression of looking around the person's figure. Unlike with an accelerometer, there is no measurable change for rotations around the horizontal (parallel to the x-axis) or other non-vertical axes passing through the screen of the device.
  • Using a combination of the methods and devices described with regard to FIGS. 2 and 3 and in the disclosure above, a change in orientation of the display device is detected, in embodiments of the disclosed technology, in any direction. These directional changes, which may be detected in embodiments of the disclosed technology, include rotations about any one of a combination of the X, Y, and Z axes (which may be defined, in this instance, as in any direction in which a change to orientation with respect to any of the X, Y, and Z axes is detectable) regardless of the starting orientation of the display device.
• FIG. 4 demonstrates the use of an accelerometer and a compass together to determine an orientation change in an embodiment of the disclosed technology. FIG. 4 shows an embodiment that employs the accelerometer in conjunction with a compass in order to obtain two orientation directional vectors, which can be used to fix the orientation of the person object in three-dimensional space so that the device can be used to explore the person as an object. The elements of FIG. 4 are generally incremented by 100 compared to those of FIG. 3. Similar to FIGS. 2B and 2C, FIGS. 4A and 4C show the display device 400 and an image of a person 410 rotating around the same axis at angle 440, orthogonal to the accelerometer's gravitational orientation directional vector 430. But unlike in FIGS. 2A and 2B, the person image in FIGS. 4A and 4C is not aligned to the top of the display device but rather to the compass's orientation directional vector 480. Thus, upon rotation of the device around the vertical z-axis in the external environment 420, the image object appears to remain stationary in three-dimensional space. Note that although the position of the person object is stationary, the view direction of the person object, or the image of the person, still changes with the orientation of the device in the external environment. FIG. 4B shows an intermediate position between FIGS. 4A and 4C as the display device rotates counterclockwise around the z-axis, with the person object aligned to the magnetic orientation directional vector 480. Rotations 450 around the axis at angle 440 orthogonal to the accelerometer orientation directional vector 430 still use the accelerometer-measured displacements to allow the viewer to observe the person image from the sides, as described with reference to FIG. 1 and FIGS. 2B and 2D. FIGS. 4D, 4E, and 4F show the same rotational concept as in FIG. 2E, wherein a rotation 470 of the device along a second independent orthogonal axis 460 allows the viewer to look at the person object from above and below. Again, the information obtained from the compass keeps the person (or other) image aligned to the magnetic orientation directional vector 480. The display device could be rotated around any orthogonal axis, such as 440 or 460, to obtain different directional views (images) of the person object. Note that when the display device moves to a vertical position, where the vertical axis of the display device is aligned to the z-axis of the external environment, rotations about that axis produce no measurable accelerometer change, and the system defaults to a purely compass-based orientation directional vector system as described with reference to FIG. 3.
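• A minimal sketch of this two-sensor scheme follows: the compass supplies an azimuth around the figure, the accelerometer supplies an elevation above or below it, and the pair indexes a two-dimensional grid of stored views. The even spacing, the -90 to +90 degree elevation range, and the function name are assumptions made for illustration.

    def select_view(azimuth_deg, elevation_deg, n_azimuth, n_elevation):
        # Azimuth (from the compass) wraps around the figure.
        az_idx = int(round((azimuth_deg % 360.0)
                           / (360.0 / n_azimuth))) % n_azimuth
        # Elevation (from the accelerometer) is clamped to the range over
        # which views were captured; assumes n_elevation >= 2 views spaced
        # evenly from -90 to +90 degrees.
        el = max(-90.0, min(90.0, elevation_deg))
        el_idx = int(round((el + 90.0) / (180.0 / (n_elevation - 1))))
        return az_idx, el_idx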
  • FIG. 5 shows an example of changing the position of a human figure in embodiments of the disclosed technology. FIGS. 5A through 5C show a sequence of orientations and displays on a display device 500. The figure may change position over time, irrespective of the orientation of the device, or may change position as a result of a change in orientation of the display device. A combination of these cases may also be employed where the person image changes over time in a specific device orientation, and the display orientation of the human figure also changes as a result of a change in orientation of the display device. Each of these cases will be described in more detail below.
• Referring now to the first case described above, that is, a human figure changing over time irrespective of the orientation of the device, this may be a series of sequential images or a video. In the example shown, the human figure's arm is in a down position 570 at a first time, shown in FIG. 5A, being raised 572 at a second time, in FIG. 5B, and pointing upwards 574 at a third time, shown in FIG. 5C. As can be seen in FIGS. 5A, 5B, and 5C, the orientation of the device changes from a first orientation in FIG. 5A to a second orientation in FIG. 5B, and back to the first orientation in FIG. 5C. However, the human figures 510 and 514, while in the same orientation (because the display devices are in the same orientation), are in different positions because a certain amount of time has elapsed, and over that period of time, the human figure 510/512/514 has moved. This may be used, for example, to show a football or other sports move. Here, the human figure is throwing a football and, to learn proper technique while watching the video/succession of images, a viewer can change the orientation of the device to see different angles of the human figure and learn how to emulate the move. Likewise, it may be a dance move, part of a pornographic video, or an advertisement, such as for the sale of clothing.
• In the second case described above, the change in position of the human figure may be a result of a change in orientation of the display device. In FIG. 5A, in a first orientation and with a first image being displayed on the display device, the arm is down 570. When rotating/tilting the display device, as shown in FIG. 5B, a second orientation and a second position of the human figure are shown (here, with a raised arm 572). When returning to the first orientation, in FIG. 5C, the arm is now pointing upwards 574. In this manner, any feature of the human figure may be changed, such as a smile to a frown, an arm up to an arm down, indicia on clothing, and the like. Each time a person cycles through from the orientation in FIG. 5A to that in FIG. 5B and back to the first orientation in FIG. 5C, the human figure may be toggled. That is, for example, after the change from the orientation of FIG. 5A to that of FIG. 5B and back to the original orientation, the human figure 514 is shown. Cycle through again, and the human figure 510 is shown.
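• The toggling behavior just described could be implemented as a small state machine that flips the displayed figure each time the device completes a cycle away from, and back to, the first orientation. The class below is an illustrative sketch, not a required implementation; detecting which orientation the device is in is left to the sensor logic described above.

    class OrientationToggle:
        def __init__(self, image_a, image_b):
            self.images = [image_a, image_b]
            self.current = 0
            self.was_in_second = False

        def update(self, in_second_orientation):
            # Remember that the device visited the second orientation...
            if in_second_orientation:
                self.was_in_second = True
            # ...and toggle the figure once it returns to the first.
            elif self.was_in_second:
                self.current ^= 1
                self.was_in_second = False
            return self.images[self.current]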
• In a combination of the two cases described above, the video or images could be continuously changing as time progresses, but the set of video images shown would depend on the particular orientation of the device. For example, in FIG. 5A, the first orientation, the person 510 might be speaking for one minute. If left in that orientation, the speaker would conduct his speech with his arm down 570 for the entirety of the speech. However, if during that one minute the user instead changes the device to a second orientation, as shown in FIG. 5B, the speaker may raise his hand 572 during the course of the one-minute speech. Upon returning to the first orientation, shown in FIG. 5C, his hand 574 remains raised until the one-minute speech is complete. In this way, the movement to the second position shown in FIG. 5B actually altered the progression of the video. Alternatively, upon returning to the first orientation from the second orientation of FIG. 5B, the speaker could instead have resumed his original hand position 570 of FIG. 5A, instead of maintaining the changed hand position 572, 574 shown in FIGS. 5B and 5C.
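• This combined behavior can be pictured as several parallel video tracks, one per device orientation, driven by a single shared clock, so that reorienting the device switches tracks mid-stream and thereby alters the progression that is seen. The sketch below assumes a hypothetical tracks mapping from orientation identifiers to lists of frames.

    class OrientationVideo:
        def __init__(self, tracks):
            # tracks: dict mapping an orientation identifier to a list of
            # frames; all tracks share the same timeline.
            self.tracks = tracks
            self.t = 0

        def next_frame(self, orientation_id):
            frames = self.tracks[orientation_id]
            # The clock advances regardless of orientation, so switching
            # orientations resumes the action already in progress rather
            # than restarting it.
            frame = frames[min(self.t, len(frames) - 1)]
            self.t += 1
            return frame

Under this design, the alternative behavior described above falls out naturally: the frame shown is always whatever the current orientation's track holds at the shared time, so returning to the first orientation shows that track's state at that moment.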
• FIG. 6 shows an example of a display device tilted upwards and downwards with respect to the viewer in an embodiment of the disclosed technology. The display device 600 is turned 90 degrees clockwise with respect to the display device shown in FIG. 2, but the tilt is analogous to the directional change indicated by arrow 270 in FIG. 2E.
• FIG. 7 shows an example of the changing orientation of a display with two human figures in an embodiment of the disclosed technology. On a display 700, both human figures 710 and 720 are exhibited. As the orientation changes (left/right, as cycled through FIGS. 7A, 7B, 7C, 7D, and 7E), the human figures 710 and 720 are rotated; that is, the perspective from which they are viewed is rotated from the point of view of the viewer. Here, they are rotated about a center of gravity 715, which is a combined center of gravity of the figures shown, e.g., roughly at the point of contact of their hands. Any of the techniques and embodiments shown and described with reference to FIGS. 1 through 6 may be employed with the two or more human figures 710 and 720 shown in FIG. 7.
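• The combined center of gravity about which such a multi-figure view is rotated might be computed as a weighted centroid of the individual figures' centers, as in the sketch below, which assumes each figure is summarized by a three-dimensional center point and a weight.

    def combined_center_of_gravity(figures):
        # figures: list of ((x, y, z), weight) pairs; returns the weighted
        # centroid about which the multi-figure view is rotated.
        total = sum(w for _, w in figures)
        return tuple(sum(c[i] * w for c, w in figures) / total
                     for i in range(3))

With equal weights, the two figures of FIG. 7 would yield a centroid roughly midway between them, e.g., near the point of contact of their hands.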
• FIG. 8 shows an example of the changing orientation of a display device as well as the changing position of the human figures exhibited on the display device in embodiments of the disclosed technology. That which is shown in FIGS. 8A through 8D is analogous to a combination of what has been shown and described with respect to FIG. 5 (change in position of a human figure) and FIG. 7 (using multiple human figures). In this manner, whether based on time or orientation change or both, the orientation of the human figures 810 and 820 and the position of the human figures on the device 800 are changed. It should be understood that any element of the displayed human figures may change, including their position, the position of one figure relative to the other, the color of clothing or indicia, the addition or removal of a logo, a change in facial expression, the addition of props (e.g., a football), and the like.
• FIG. 9 shows steps taken in a method of carrying out embodiments of the disclosed technology. In step 910, a sequence of images of a human figure in various orientations is stored. This is at a first moment or instant in time. Thus, for example, in an application where the image will change based on a clockwise-counterclockwise rotation (see, for example, FIG. 5), images may be stored from viewpoints around the left and right sides of the person. In optional step 920, images are stored over a period of time. These images may be from one or multiple orientations, the human figure may be moving or changing in some manner (e.g., clothes changing color or style to show different clothing options for sale), and the images may form a video. It is known, for example, in movies to have moveable or multiple cameras around a person conducting an action, so that the camera view can appear to swoop around the person at a moment in time or during a scene; embodiments of the present technology, by contrast, allow the viewer to actually change the viewing angle at the time of viewing. Thus, the experience becomes interactive with natural movement or changes of orientation of the viewer and/or handheld device.
• In step 930, which can be carried out before, after, or both before and after step 960, orientation data is received continuously from an orientation sensor, such as an accelerometer, a compass, or another instrument for measuring orientation of, or relative to, a pre-existing magnetic field. From this data, in step 950, it is determined how the display device of embodiments of the disclosed technology is moving or being reoriented, whether tilting up/down, tilting left/right, being rotated left/right, or being reoriented towards a different cardinal direction. To supplement this, in embodiments of the disclosed technology, in step 940, orientation data is further received from a secondary orientation sensor, which could also be an accelerometer or a compass. Then, in step 960, an image of a human figure is exhibited, which is either a pre-defined first image or an image based on the orientation or position of the display device. Steps 930 and 940 continue to be carried out, and in step 970, upon receiving further movement data from the accelerometer (such as movement past a predefined threshold) or new orientation data from a compass, the image is changed. The image may additionally change over time, such as with a video or image sequence, in step 980. A change in time and a change in orientation would produce a combined change, in embodiments of the disclosed technology. Thus, over time, the human figure moves irrespective of a change in viewpoint (orientation) of the exhibited human figure. A change in viewpoint may also occur, as determined by motion data received with regard to the display device.
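• In its simplest form, the selection of step 960 reduces to a nearest-neighbor lookup: given the measured angle, exhibit the stored view captured closest to it, changing the image only when the angle moves past the threshold of step 970. The sketch below is illustrative; the 30-degree spacing and the names are assumptions.

    def select_image(angle_deg, stored_angles):
        # Index of the stored view whose capture angle is nearest the
        # measured device angle (steps 960 and 970, sketched).
        return min(range(len(stored_angles)),
                   key=lambda i: abs(stored_angles[i] - angle_deg))

    # Example: views stored every 30 degrees around the figure (step 910).
    views = [0, 30, 60, 90, 120, 150, 180]
    print(select_image(47.0, views))  # -> 2 (the 60-degree view)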
• FIG. 10 shows a high-level block diagram of a specialized image input and display device on which embodiments of the disclosed technology may be carried out. The device may comprise some or all of the high-level elements shown in FIG. 10 and may comprise further devices or be part of a larger device. Data bus 1070 transports data between the numbered elements shown in device 1000. Central processing unit 1040 receives and processes instructions, such as code. Volatile memory 1010 and non-volatile memory 1020 store data for processing by the central processing unit 1040.
  • The data storage apparatus 1030 may be magnetic media (e.g., hard disk, video cassette), optical media (e.g., Blu-Ray or DVD) or another type of storage mechanism known in the art. The data storage apparatus 1030 or the non-volatile memory 1020 stores data which is sent via bus 1070 to the video output 1060.
• A datum received from an accelerometer or compass 1090 is processed by the central processing unit 1040 to determine if a change in viewpoint or orientation has been made. The displayed image, as described above, is outputted via a video output 1060, that is, a transmitter or video relay device which transmits video to a television screen, monitor, or other display device 1080 via cable or data bus 1065. The video output 1060 may also be an output over a packet-switched network 1065, such as the internet, where it is received and interpreted as video data by a recipient display 1080. The recipient display may be a liquid crystal display, a cathode ray tube, a series of light-emitting diodes, or any other known display system.
• An input/output device 1050, such as buttons on the device itself, an infrared signal receiver for use with a remote control, or a network input/output for control via a local or wide area network, receives and/or sends a signal via data pathway 1055 (e.g., infrared signal, signal over copper or fiber cable, wireless network, etc.). The input/output device, in embodiments of the disclosed technology, receives input from a user, such as which image to display and how to interact with a detected object.
• FIG. 11 shows a high-level block diagram of a computer that may be used to carry out the disclosed technology. Computer 1100 contains a processor 1150 that controls the overall operation of the computer by executing computer program instructions which define such operation. The computer program instructions may be stored in a storage device 1120 (e.g., magnetic disk, database) and loaded into memory 1130 when execution of the computer program instructions is desired. Thus, the computer's operation will be defined by computer program instructions stored in memory 1130 and/or storage 1120, and the computer will be controlled by processor 1150 executing the computer program instructions. Computer 1100 also includes one or a plurality of input network interfaces for communicating with other devices via a network (e.g., the internet). Computer 1100 also includes one or more output network interfaces 1110 for communicating with other devices. Computer 1100 also includes input/output 1140, representing devices which allow for user interaction with the computer 1100 (e.g., display, keyboard, mouse, speakers, buttons, touch-sensitive screen, etc.).
  • While the disclosed technology has been taught with specific reference to the above embodiments, a person having ordinary skill in the art will recognize that changes can be made in form and detail without departing from the spirit and the scope of the disclosed technology. The described embodiments are to be considered in all respects only as illustrative and not restrictive. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope. Combinations of any of the methods, systems, and devices described hereinabove are also contemplated and within the scope of the disclosed technology.

Claims (22)

1-46. (canceled)
47. A method of changing an image of an object displayed on a portable device having a processor, and having a display screen forming a plane, comprising, with the processor of the portable device:
(a) at a first time, causing the display screen to display a first image depicting an object in an initial orientation; and
(b) at a second time, later than the first time, based on a change between the first time and the second time in a reference angle formed between the plane of the display screen and a select direction, which reference angle is determined from data produced by a sensor of the portable device, causing the display screen to display a second image depicting at least a portion of the object in a modified orientation different from the initial orientation.
48. The method of claim 47 wherein the first image and second image comprise views of the object at different moments while said at least a portion of said object is in motion, and wherein part (b) comprises, at the second time, causing the display screen to display said at least a portion of the object in the modified orientation as a result of said motion of said at least a portion of the object.
49. The method of claim 47 wherein the second image depicts the entire object displayed in the modified orientation.
50. The method of claim 49 wherein the first image and second image comprise views of the object at different moments while said at least a portion of said object is in motion, and wherein part (b) comprises, at the second time, causing the display screen to display said at least a portion of the object in the modified orientation, further as a result of said motion of said at least a portion of the object.
51. The method of claim 47 wherein the display screen has edges and further comprising causing the display screen to display the first image and the second image while maintaining an alignment of the object with respect to the edges of the display screen between the first time and the second time.
52. The method of claim 47 wherein the display screen has edges and further comprising causing the display screen to display the first image and the second image with an alignment of the object that is rotated with respect to the edges of the display screen between the first time and the second time, based on a rotation of the portable device in the plane of the display screen, which rotation is determined from data produced by a supplemental sensor of the portable device.
53. The method of claim 47 wherein the select direction is gravitational down and wherein the sensor is an accelerometer.
54. The method of claim 47 wherein the select direction is magnetic north and wherein the sensor is a compass.
55. The method of claim 47 wherein part (a) comprises causing the display screen to display a first image of a series of images, which first image depicts the object in the initial orientation, and part (b) comprises causing the display screen to display a second image of the series of images, which second image depicts the object in the modified orientation.
56. The method of claim 47 wherein the first image and the second image represent different views of the object from different perspectives aimed at a center of gravity of the object.
57. The method of claim 47 wherein the object is a human figure.
58. The method of claim 47 wherein the object is a plurality of bodies in a scene.
59. The method of claim 47 wherein the modified orientation and the initial orientation differ by a rotation angle and the rotation angle is not linearly proportional to the change in the reference angle.
60. The method of claim 47 further comprising, at a third time, later than the second time, based on a change between the second time and the third time in the reference angle, causing the display screen to display a third image depicting said at least a portion of the object in a further modified orientation different from both the initial orientation and the modified orientation.
61. The method of claim 60 further comprising, at a fourth time, later than the third time, based on data indicating that a reference angle at the fourth time matches the reference angle at the first time, causing the display screen to display the first image.
62. A portable display device comprising:
(a) a display screen forming a plane;
(b) a sensor;
(c) a storage device containing image data of an object;
(d) a computer processor responsive to the sensor, coupled to the storage device, and controlling the display screen, which computer processor:
(i) determines, from data produced by the sensor, a reference angle formed between the plane of the display screen and a select direction;
(ii) at a first time, causes the display screen to display a first image depicting an object in an initial orientation; and
(iii) at a second time, later than the first time, based on a change between the first time and the second time in the reference angle, causes the display screen to display a second image depicting at least a portion of the object in a modified orientation different from the initial orientation.
63. The display device of claim 62 wherein the sensor is an accelerometer and the select direction is gravitational down.
64. The display device of claim 62 wherein the sensor is a compass and the select direction is magnetic north.
65. The display device of claim 62 wherein the image data of the object in the storage device comprises a series of images representing different views of the object rotated through different rotation angles.
66. The display device of claim 62 wherein the display screen has edges, further comprising a supplemental sensor, and wherein the computer processor (i) is responsive to the supplemental sensor, (ii) determines, from data produced by the supplemental sensor, a rotation of the portable device in the plane of the display screen, and (iii) causes the display screen to display the first image and the second image with the object aligned differently with respect to the edges of the display screen between the first time and the second time, which alignment difference is based on the rotation.
67. The display device of claim 66 wherein the sensor is an accelerometer and the supplemental sensor is a compass.
US14/672,856 2010-06-21 2015-03-30 Viewpoint Change on a Display Device Based on Movement of the Device Abandoned US20150205366A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/672,856 US20150205366A1 (en) 2010-06-21 2015-03-30 Viewpoint Change on a Display Device Based on Movement of the Device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US12/819,250 US8730267B2 (en) 2010-06-21 2010-06-21 Viewpoint change on a display device based on movement of the device
US14/243,961 US9122313B2 (en) 2010-06-21 2014-04-03 Viewpoint change on a display device based on movement of the device
US14/672,856 US20150205366A1 (en) 2010-06-21 2015-03-30 Viewpoint Change on a Display Device Based on Movement of the Device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/243,961 Continuation US9122313B2 (en) 2010-06-21 2014-04-03 Viewpoint change on a display device based on movement of the device

Publications (1)

Publication Number Publication Date
US20150205366A1 true US20150205366A1 (en) 2015-07-23

Family

ID=45328213

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/819,250 Active 2032-04-07 US8730267B2 (en) 2010-06-21 2010-06-21 Viewpoint change on a display device based on movement of the device
US14/243,961 Active 2030-07-24 US9122313B2 (en) 2010-06-21 2014-04-03 Viewpoint change on a display device based on movement of the device
US14/672,856 Abandoned US20150205366A1 (en) 2010-06-21 2015-03-30 Viewpoint Change on a Display Device Based on Movement of the Device

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US12/819,250 Active 2032-04-07 US8730267B2 (en) 2010-06-21 2010-06-21 Viewpoint change on a display device based on movement of the device
US14/243,961 Active 2030-07-24 US9122313B2 (en) 2010-06-21 2014-04-03 Viewpoint change on a display device based on movement of the device

Country Status (1)

Country Link
US (3) US8730267B2 (en)

Families Citing this family (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140053885A (en) * 2011-04-18 2014-05-08 아이시360, 인코포레이티드 Apparatus and method for panoramic video imaging with mobile computing devices
US20130093667A1 (en) * 2011-10-12 2013-04-18 Research In Motion Limited Methods and devices for managing views displayed on an electronic device
CN103474050B (en) * 2012-06-06 2017-03-15 富泰华工业(深圳)有限公司 Image display and method for displaying image
US8959359B2 (en) 2012-07-11 2015-02-17 Daon Holdings Limited Methods and systems for improving the security of secret authentication data during authentication transactions
US9262615B2 (en) 2012-07-11 2016-02-16 Daon Holdings Limited Methods and systems for improving the security of secret authentication data during authentication transactions
US8453207B1 (en) 2012-07-11 2013-05-28 Daon Holdings Limited Methods and systems for improving the security of secret authentication data during authentication transactions
US20150130800A1 (en) 2013-11-12 2015-05-14 Fyusion, Inc. Segmentation of surround view data
KR20150101915A (en) * 2014-02-27 2015-09-04 삼성전자주식회사 Method for displaying 3 dimension graphic user interface screen and device for performing the same
FR3025645A1 (en) * 2014-09-09 2016-03-11 Renault Sas METHOD FOR DISPLAYING AND ADJUSTING THE ORIENTATION OF A VIRTUAL IMAGE FOR A NOMADIC ELECTRONIC DISPLAY DEVICE
US10176592B2 (en) 2014-10-31 2019-01-08 Fyusion, Inc. Multi-directional structured image array capture on a 2D graph
US10726593B2 (en) * 2015-09-22 2020-07-28 Fyusion, Inc. Artificially rendering images using viewpoint interpolation and extrapolation
US10262426B2 (en) 2014-10-31 2019-04-16 Fyusion, Inc. System and method for infinite smoothing of image sequences
US10586378B2 (en) 2014-10-31 2020-03-10 Fyusion, Inc. Stabilizing image sequences based on camera rotation and focal length parameters
US10650574B2 (en) 2014-10-31 2020-05-12 Fyusion, Inc. Generating stereoscopic pairs of images from a single lens camera
US10726560B2 (en) 2014-10-31 2020-07-28 Fyusion, Inc. Real-time mobile device capture and generation of art-styled AR/VR content
US10719939B2 (en) 2014-10-31 2020-07-21 Fyusion, Inc. Real-time mobile device capture and generation of AR/VR content
US10275935B2 (en) 2014-10-31 2019-04-30 Fyusion, Inc. System and method for infinite synthetic image generation from multi-directional structured image array
US9940541B2 (en) 2015-07-15 2018-04-10 Fyusion, Inc. Artificially rendering images using interpolation of tracked control points
US10852902B2 (en) 2015-07-15 2020-12-01 Fyusion, Inc. Automatic tagging of objects on a multi-view interactive digital media representation of a dynamic entity
US12261990B2 (en) 2015-07-15 2025-03-25 Fyusion, Inc. System and method for generating combined embedded multi-view interactive digital media representations
US11095869B2 (en) 2015-09-22 2021-08-17 Fyusion, Inc. System and method for generating combined embedded multi-view interactive digital media representations
US10222932B2 (en) 2015-07-15 2019-03-05 Fyusion, Inc. Virtual reality environment based manipulation of multilayered multi-view interactive digital media representations
US10750161B2 (en) 2015-07-15 2020-08-18 Fyusion, Inc. Multi-view interactive digital media representation lock screen
US10242474B2 (en) 2015-07-15 2019-03-26 Fyusion, Inc. Artificially rendering images using viewpoint interpolation and extrapolation
US10147211B2 (en) 2015-07-15 2018-12-04 Fyusion, Inc. Artificially rendering images using viewpoint interpolation and extrapolation
US11006095B2 (en) 2015-07-15 2021-05-11 Fyusion, Inc. Drone based capture of a multi-view interactive digital media
US11783864B2 (en) 2015-09-22 2023-10-10 Fyusion, Inc. Integration of audio into a multi-view interactive digital media representation
EP3364270A4 (en) * 2015-10-15 2018-10-31 Sony Corporation Information processing device and information processing method
WO2017070704A2 (en) * 2015-10-23 2017-04-27 Gobiquity, Inc. Visual acuity testing method and product
US10881936B2 (en) 2016-06-20 2021-01-05 Coreyak Llc Exercise assembly for performing different rowing routines
US10556167B1 (en) 2016-06-20 2020-02-11 Coreyak Llc Exercise assembly for performing different rowing routines
US10155131B2 (en) 2016-06-20 2018-12-18 Coreyak Llc Exercise assembly for performing different rowing routines
US11202017B2 (en) 2016-10-06 2021-12-14 Fyusion, Inc. Live style transfer on a mobile device
US10999602B2 (en) 2016-12-23 2021-05-04 Apple Inc. Sphere projected motion estimation/compensation and mode decision
US10437879B2 (en) 2017-01-18 2019-10-08 Fyusion, Inc. Visual search using multi-view interactive digital media representations
US10353946B2 (en) 2017-01-18 2019-07-16 Fyusion, Inc. Client-server communication for live search using multi-view digital media representations
US20180227482A1 (en) 2017-02-07 2018-08-09 Fyusion, Inc. Scene-aware selection of filters and effects for visual digital media content
US11044464B2 (en) 2017-02-09 2021-06-22 Fyusion, Inc. Dynamic content modification of image and video based multi-view interactive digital media representations
US11259046B2 (en) 2017-02-15 2022-02-22 Apple Inc. Processing of equirectangular object data to compensate for distortion by spherical projections
US10924747B2 (en) 2017-02-27 2021-02-16 Apple Inc. Video coding techniques for multi-view video
US10356395B2 (en) 2017-03-03 2019-07-16 Fyusion, Inc. Tilts as a measure of user engagement for multiview digital media representations
US10440351B2 (en) 2017-03-03 2019-10-08 Fyusion, Inc. Tilts as a measure of user engagement for multiview interactive digital media representations
US10237477B2 (en) 2017-05-22 2019-03-19 Fyusion, Inc. Loop closure
US10200677B2 (en) 2017-05-22 2019-02-05 Fyusion, Inc. Inertial measurement unit progress estimation
US10313651B2 (en) 2017-05-22 2019-06-04 Fyusion, Inc. Snapshots at predefined intervals or angles
US11093752B2 (en) * 2017-06-02 2021-08-17 Apple Inc. Object tracking in multi-view video
US11069147B2 (en) 2017-06-26 2021-07-20 Fyusion, Inc. Modification of multi-view interactive digital media representation
US10754242B2 (en) 2017-06-30 2020-08-25 Apple Inc. Adaptive resolution and projection format in multi-direction video
CN107817895B (en) * 2017-09-26 2021-01-05 微幻科技(北京)有限公司 Scene switching method and device
US10356341B2 (en) 2017-10-13 2019-07-16 Fyusion, Inc. Skeleton-based effects and background replacement
US10687046B2 (en) 2018-04-05 2020-06-16 Fyusion, Inc. Trajectory smoother for generating multi-view interactive digital media representations
US10382739B1 (en) 2018-04-26 2019-08-13 Fyusion, Inc. Visual annotation using tagging sessions
US10592747B2 (en) 2018-04-26 2020-03-17 Fyusion, Inc. Method and apparatus for 3-D auto tagging
CN109525723B (en) * 2018-11-09 2021-08-06 Oppo广东移动通信有限公司 Electronic device, center of gravity adjustment method, device, terminal device, and computer-readable storage medium

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4669812A (en) 1983-09-12 1987-06-02 Hoebing John L Method and apparatus for 3-D image synthesis
US6009188A (en) 1996-02-16 1999-12-28 Microsoft Corporation Method and system for digital plenoptic imaging
US5808613A (en) 1996-05-28 1998-09-15 Silicon Graphics, Inc. Network navigator with enhanced navigational abilities
US5818420A (en) 1996-07-31 1998-10-06 Nippon Hoso Kyokai 3D object graphics display device, 3D object graphics display method, and manipulator for 3D object graphics display
US6049622A (en) 1996-12-05 2000-04-11 Mayo Foundation For Medical Education And Research Graphic navigational guides for accurate image orientation and navigation
EP0926629A1 (en) 1997-01-24 1999-06-30 Sony Corporation Pattern data generator, pattern data generating method and its medium
US6097389A (en) 1997-10-24 2000-08-01 Pictra, Inc. Methods and apparatuses for presenting a collection of digital media in a media container
JP2001126085A (en) 1999-08-16 2001-05-11 Mitsubishi Electric Corp Image generation system, image display system, computer-readable recording medium storing image generation program, and image generation method
AU2001286466A1 (en) 2000-08-11 2002-02-25 Holomage, Inc. Method of and system for generating and viewing multi-dimensional images
US6628279B1 (en) 2000-11-22 2003-09-30 @Last Software, Inc. System and method for three-dimensional modeling
GB0208909D0 (en) 2002-04-18 2002-05-29 Canon Europa Nv Three-dimensional computer modelling
US20050283371A1 (en) 2002-08-30 2005-12-22 Paolo Tiramani Method of selling pre-fabricated houses
JP2007535733A (en) 2004-03-03 2007-12-06 バーチャル アイリス スタジオ,インク. System that enables image distribution and interactive operation
US7262783B2 (en) 2004-03-03 2007-08-28 Virtual Iris Studios, Inc. System for delivering and enabling interactivity with images
US7542050B2 (en) 2004-03-03 2009-06-02 Virtual Iris Studios, Inc. System for delivering and enabling interactivity with images
EP2065783B1 (en) * 2007-11-30 2010-01-13 Telefonaktiebolaget LM Ericsson (publ) A portable electronic apparatus having more than one display area, and a method of controlling a user interface thereof
US20100259610A1 (en) 2009-04-08 2010-10-14 Celsia, Llc Two-Dimensional Display Synced with Real World Object Movement

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6683609B1 (en) * 1997-10-20 2004-01-27 Baron Services, Inc. Real-time three-dimensional weather data processing method and system
US7502036B2 (en) * 2004-03-03 2009-03-10 Virtual Iris Studios, Inc. System for delivering and enabling interactivity with images
US20080042973A1 (en) * 2006-07-10 2008-02-21 Memsic, Inc. System for sensing yaw rate using a magnetic field sensor and portable electronic devices using the same
US20100214216A1 (en) * 2007-01-05 2010-08-26 Invensense, Inc. Motion sensing and processing on mobile devices
US20090303204A1 (en) * 2007-01-05 2009-12-10 Invensense Inc. Controlling and accessing content using motion processing on mobile devices
WO2009084213A1 (en) * 2007-12-28 2009-07-09 Capcom Co., Ltd. Computer, program, and storage medium
US20100279770A1 * (en) 2007-12-28 2010-11-04 Capcom Co., Ltd. Computer, program, and storage medium
US20100136957A1 (en) * 2008-12-02 2010-06-03 Qualcomm Incorporated Method and apparatus for determining a user input from inertial sensors
US20100222179A1 (en) * 2009-02-27 2010-09-02 Sinclair Temple Presenting information to users during an activity, such as information from a previous or concurrent outdoor, physical activity
US20100302278A1 (en) * 2009-05-28 2010-12-02 Apple Inc. Rotation smoothing of a user interface
US20110285704A1 (en) * 2010-02-03 2011-11-24 Genyo Takeda Spatially-correlated multi-display human-machine interface
US9098248B2 (en) * 2010-09-07 2015-08-04 Sony Corporation Information processing apparatus, program, and control method
US20150362998A1 (en) * 2014-06-17 2015-12-17 Amazon Technologies, Inc. Motion control for managing content

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170003721A1 (en) * 2013-12-31 2017-01-05 Commissariat A L'energie Atomique Et Aux Energies Alternatives Method and apparatus for detecting a manipulation of a portable device
US10175777B2 * (en) 2013-12-31 2019-01-08 Commissariat A L'energie Atomique Et Aux Energies Alternatives Method and apparatus for detecting a manipulation of a portable device
WO2021087411A1 (en) * 2019-11-01 2021-05-06 Loop Now Technologies, Inc. Audio and video stream rendering modification based on device rotation metric

Also Published As

Publication number Publication date
US8730267B2 (en) 2014-05-20
US20140253436A1 (en) 2014-09-11
US9122313B2 (en) 2015-09-01
US20110310089A1 (en) 2011-12-22

Similar Documents

Publication Publication Date Title
US9122313B2 (en) Viewpoint change on a display device based on movement of the device
US11745097B2 (en) Spatially-correlated human-machine interface
US11989826B2 (en) Generating a three-dimensional model using a portable electronic device recording
US11173392B2 (en) Spatially-correlated human-machine interface
CN103119628B (en) Utilize three-dimensional user interface effect on the display of kinetic characteristic
US20100053322A1 (en) Detecting ego-motion on a mobile device displaying three-dimensional content
US20100259610A1 (en) Two-Dimensional Display Synced with Real World Object Movement
US20100188397A1 (en) Three dimensional navigation using deterministic movement of an electronic device
Billinghurst et al. Mobile collaborative augmented reality
Chow Low-cost multiple degrees-of-freedom optical tracking for 3D interaction in head-mounted display virtual reality
CN103632627A (en) Information display method and apparatus and mobile navigation electronic equipment
US20220343587A1 (en) Augmented reality wall with combined viewer and camera tracking
CN107506026A (en) The method, apparatus and head-mounted display apparatus of control application operation
JP4493082B2 (en) CG presentation device, program thereof, and CG display system
TWI621034B (en) Methods and systems for displaying reality information in a virtual reality environment, and related computer program products
CN117234333A (en) VR object selection method, device, electronic equipment and readable storage medium
TW201740346A (en) Corresponding method and system in between panorama image and message on Internet platform which is viewed at a viewing angle and operated in a cloud server
TW201727351A (en) Devices and methods for browsing photosphere photos
TW201738649A (en) Corresponding method between panorama video and message and system thereof having cloud software calculator to obtain an angle value and a first time point from the cloud database, and calculate and search a picture of panorama video corresponding to the message
TW202419142A (en) Augmented reality interaction system, augmented reality interaction method, server and mobile device
TW201931064A (en) Virtual reality navigation methods and systems with map index, and related computer program products
TW201810177A (en) Methods and systems for presenting data in a virtual environment, and related computer program products

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION