
WO2011127578A1 - Virtual product display - Google Patents


Info

Publication number
WO2011127578A1
Authority
WO
WIPO (PCT)
Prior art keywords
images
user
display surface
product
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CA2011/000413
Other languages
French (fr)
Inventor
Dean Stark
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of WO2011127578A1
Priority to US13/652,043, granted as patent US9733699B2
Legal status: Ceased

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1079Measuring physical dimensions, e.g. size of the entire body or parts thereof using optical or photographic means
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F19/00Advertising or display means not otherwise provided for
    • G09F19/12Advertising or display means not otherwise provided for using special optical effects
    • G09F19/14Advertising or display means not otherwise provided for using special optical effects displaying different signs depending upon the view-point of the observer
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F19/00Advertising or display means not otherwise provided for
    • G09F19/12Advertising or display means not otherwise provided for using special optical effects
    • G09F19/18Advertising or display means not otherwise provided for using special optical effects involving the use of optical projection means, e.g. projection of images on clouds
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F23/00Advertising on or in specific articles, e.g. ashtrays, letter-boxes
    • G09F23/06Advertising on or in specific articles, e.g. ashtrays, letter-boxes the advertising matter being combined with articles for restaurants, shops or offices

Definitions

  • the present invention relates generally to product display systems and, more specifically, to product display systems that provide virtual product visualization and interaction.
  • a product display system comprising: a display surface; a height sensor for determining a height of a user; an input device for receiving a product selection from the user, the product selection corresponding to one or more products; a computing device coupled to the height sensor and to the input device, the computing device comprising a frame buffer for outputting images to be displayed on the display surface, the computing device configured to: receive the product selection from the input device; receive the height from the height sensor and, based on the height, determine a viewing angle of the user in relation to the display surface; and write to the frame buffer one or more images comprising virtual versions of the one or more products, thereby causing the one or more images to be displayed on the display surface, the one or more images corresponding to the viewing angle.
  • a method of providing virtual product interaction to a user comprises: receiving a product selection from the user, the product selection corresponding to one or more products; determining a height of the user; based on the height, determining a viewing angle of the user in relation to a display surface; writing one or more images comprising virtual versions of the one or more products to a frame buffer, thereby causing the one or more images to be displayed on the display surface, the one or more images corresponding to the viewing angle.
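The claimed method reduces to a simple geometric step (height to viewing angle) followed by a lookup of pre-rendered images. The following Python sketch is purely illustrative: the angle bins, the 150 cm viewing distance, the 75 cm table height, and all names are assumptions, not values taken from the patent.

```python
import math

# Illustrative only: angle bins, distances, and file names below are
# assumed values, not specified by the patent.
RENDERED_SETS = {30: ["registry_30deg.png"], 45: ["registry_45deg.png"],
                 60: ["registry_60deg.png"]}

def viewing_angle_deg(height_cm, distance_cm=150.0, table_height_cm=75.0):
    """Approximate downward viewing angle from eye level to the table surface."""
    eye_height = height_cm - 10.0           # eyes sit a little below the crown
    return math.degrees(math.atan2(eye_height - table_height_cm, distance_cm))

def select_image_set(height_cm):
    """Pick the pre-rendered image set whose viewing angle is nearest the user's."""
    angle = viewing_angle_deg(height_cm)
    nearest = min(RENDERED_SETS, key=lambda a: abs(a - angle))
    return RENDERED_SETS[nearest]
```

A taller user yields a steeper angle and therefore a differently distorted image set; for example, a 175 cm user at these assumed distances sees the surface at roughly 31 degrees.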
  • a product display system comprising: a display surface; an input device for receiving information from the user indicative of at least one product of interest; a projector mounted above the display surface; a computing device coupled to the input device and the projector, the computing device comprising a frame buffer for outputting images to be displayed by the projector on the display surface.
  • the computing device is configured to: write to the frame buffer one or more two dimensional images comprising anamorphic projections of one or more items of interest to the user, thereby causing the one or more two dimensional images to be displayed on the display surface, and allowing the user to perceive the one or more items of interest as three dimensional objects.
  • FIG. 1 is a simplified perspective view of a product display system that provides virtual product interaction, exemplary of an embodiment of the present invention
  • FIG. 2 is a simplified block diagram of the product display system of FIG. 1;
  • FIG. 3 is a schematic diagram of the product display system of FIG. 1;
  • FIG. 4 is a simplified block diagram of the contents of the memory of FIG. 2;
  • FIG. 5 is a flow chart of steps performed during a user interaction with a product display system that provides virtual product interaction
  • FIGS. 6A and 6B are images of projections of products of interest, formed by the product display system of FIG. 1.
  • FIG. 1 illustrates a simplified perspective view of a product display system 100 which provides virtual product interaction.
  • Product display system 100 includes a display surface 110, a projector 120 for projecting images onto display surface 110, a user movement sensor 118, a user height sensor 119, speakers 116, scent dispensers 117, and a marker 115 indicating where a user 130 may stand in order to enjoy an optimal viewing angle and experience.
  • System 100 may further include a kiosk 112 providing a user interface 113.
  • Display surface 110 may be a conventional projection screen including a surface and a support structure used for displaying projected images for viewing by a user 130.
  • display surface 110 may for example be implemented using a sheet of white vellum paper or the like, interposed between two sheets of glass, or between a sheet and another rigid support surface.
  • Surface 110 may be supported by a base, and thereby form a table for the support of objects for sale and the like.
  • display surface 110 takes the form of a round glass table.
  • Projector 120 may be a conventional display projector for projecting an image onto display surface 110.
  • projector 120 may be a conventional 1080p, 720p, XVGA or similar home theatre or business projector available from Sony, Mitsubishi, Sanyo, Panasonic, Epson or other known manufacturers.
  • projector 120 may be focused to project exactly and only on surface 110, allowing projector 120 to fill surface 110 (and not the environment). This may be accomplished using an appropriate throw distance, focus, and optional mask.
  • projector 120 may be located above display surface 110, and display surface 110 is oriented substantially horizontally. However, it will be appreciated that projector 120 may be positioned in different locations, and that display surface 110 may be arranged in different orientations. For example, in some embodiments projector 120 may be positioned directly below display surface 110 and display surface 110 may be a rear-projection screen. Alternatively, projector 120 may be obliquely mounted at a location above, but not directly above, surface 110.
  • multiple projectors in multiple locations may be used.
  • a projector 120 mounted above surface 110 and a second projector mounted beneath surface 110. Both projectors may be operated concurrently to present images on surface 110.
  • display surface 110 may be a digital display screen forming part of a digital display, such as a Liquid Crystal Display (LCD) or the like (in such embodiments, projector 120 may not be required).
  • user movement sensor 118 is operable to detect hand and/or arm gestures of user 130 in the vicinity of display surface 110 to thereby allow user 130 to interact with product display system 100.
  • User height sensor 119 is operable to detect the height of user 130, and may for example comprise one or more cameras and machine vision software as is known in the art. Alternatively, height sensor 119 may be an infrared sensor, a sonic sensor, or other appropriate sensor for sensing the presence and height of an individual. As will be apparent, user height sensor 119 serves to approximate the viewing height of a user proximate surface 110.
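A camera-based height sensor of this kind commonly relies on the pinhole model: a person's height in the image scales with focal length over distance. The sketch below is a hypothetical illustration of that calculation; the calibration figures are invented, not taken from the patent.

```python
def estimate_height_cm(person_px, distance_cm, focal_px):
    """
    Pinhole-camera height estimate: an object of real height H at distance d
    spans (H * f / d) pixels, so H = pixels * d / f.  focal_px is the camera's
    focal length expressed in pixels (a calibration constant).
    """
    return person_px * distance_cm / focal_px
```

For example, a person spanning 600 px at 300 cm from a camera with a 1000 px focal length is estimated at 180 cm.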
  • Kiosk 112 may be a computer terminal that provides information access by way of user interface 113, which may be a conventional touch-screen display or the like.
  • user 130 may initiate an interaction with product display system 100 through user interface 113 of kiosk 112. Specifically, user interface 113 may present options to user 130, such as the following options: (1) browse products catalogue; (2) create a gift registry; and (3) view an existing gift registry. If user 130 selects one of the presented options, user interface 113 may then present a corresponding menu.
  • Product display system 100 may also use user movement sensor 118 to enable user 130 to virtually interact with the displayed images.
  • virtual “buttons” associated with any of numerous possible actions, such as rotating or otherwise manipulating a displayed product, may be displayed on display surface 110.
  • the user may interact with projected images of items.
  • User movement sensor 118 detects when user 130 has moved her hand over a virtual button (indicating that user 130 has "activated” the virtual button) and, in response, product display system 100 may invoke images, sounds, and/or scents to virtually animate the action corresponding with the selected virtual button, thereby providing an experience akin to interacting with the actual physical product.
  • projector 120 may project oblique anamorphic images onto display surface 110 to provide virtual 3-dimensional (3D) visualizations of products to user 130.
  • oblique anamorphic images are drawn in a particular distortion in order to create an impression of 3 dimensions when seen from specific viewpoints.
  • Anamorphic images (or anamorphosis) are more particularly described in the text "Hidden Images: Games of Perception, Anamorphic Art, Illusion: From the Renaissance to the Present" by Fred Leeman, Joost Elffers, Michael Schuyt, published by Harry N. Abrams, Inc., 1976, ISBN-13: 9780810990197, ISBN: 0810990199, the contents of which are hereby incorporated by reference.
  • realistic high-resolution oblique anamorphic images may be generated using a conventional rendering software application such as, for example, Autodesk's 3ds Max (also known as 3D Studio Max) and/or Autodesk's Maya.
  • Constructing an oblique anamorphic image is a geometrical exercise known to those ordinarily skilled in the art, and is more particularly described for example in J. L. Hunt, B. G. Nickel, and Christian Gigault's paper "Anamorphic images" published in the American Journal of Physics, March 2000, Volume 68, Issue 3, at pp. 232-237, the contents of which are hereby incorporated by reference.
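The construction described in those references amounts to intersecting the sight line from the viewer's eye through each virtual 3D point with the display plane. A minimal sketch of that geometry follows; the coordinate conventions (z measured upward from the table surface, in cm) are assumptions for illustration.

```python
def anamorphic_point(eye, p):
    """
    Flatten a virtual 3D point p onto the table plane z = 0 by intersecting
    the sight line from the viewer's eye through p with the plane.  Viewed
    from 'eye', the flattened point lines up with where p would appear, so
    the 2D image reads as a 3D object from that viewpoint.
    """
    ex, ey, ez = eye
    px, py, pz = p
    t = ez / (ez - pz)                  # ray parameter where z reaches 0
    return (ex + t * (px - ex), ey + t * (py - ey))
```

Points above the plane project away from the viewer, producing the characteristic stretch of oblique anamorphosis; points already on the plane map to themselves.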
  • multiple sets of oblique anamorphic images may be pre-generated and stored at display system 100 for later presentation on surface 110, where each set corresponds to a different viewing angle.
  • Each image may be a frame, and a set of images may be associated to provide full motion video at a suitable frame rate, such as 24 or 30 fps.
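Treating each set of frames as a looping clip, mapping elapsed time to the frame to display is a one-line computation; the frame count below is an assumed example, not from the patent.

```python
def frame_index(t_seconds, fps=24, n_frames=240):
    """Map elapsed time to the frame to display, looping over the set."""
    return int(t_seconds * fps) % n_frames
```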
  • Each set of oblique anamorphic images may display one or more items to be viewed by user, in motion. The items may for example move along a defined trajectory on surface 110.
  • the viewing angle of user 130 may be determined by product display system 100, and a set of oblique anamorphic images optimal for user's 130 viewing angle may then be retrieved for presentation on surface 110.
  • marker 115 may be used to indicate to user 130 where user 130 may stand in order to enjoy an optimal viewing angle.
  • Marker 115 may be projected onto the floor, or it may be an actual physical marker, for example painted on the floor.
  • User height sensor 119 may also be used to more accurately determine the viewing angle of user 130, enabling product display system 100 to adjust the displayed images accordingly. For example, based on height information received from user height sensor 119 and the position of user 130 as determined by marker 115, product display system 100 may cause images optimized for user's 130 viewing angle to be displayed on display surface 110.
  • FIG. 2 illustrates a simplified block diagram of a computing device 210 forming part of product display system 100.
  • computing device 210 includes central processing unit (CPU) 212, graphics subsystem 214 having a frame buffer 215, network interface 216, and a suitable combination of persistent storage memory 218, random access memory and read only memory.
  • Network interface 216 interconnects computing device 210 to a network such as a LAN or the Internet.
  • Projector 120 is coupled to computing device 210 by graphics subsystem 214.
  • Subsystem 214 may include a graphics processor, display interface, and frame buffer 215.
  • images written to frame buffer 215 are provided to projector 120 for projection onto display surface 110, using a conventional display interface, such as a VGA, DVI, HDMI, DisplayPort or similar interface. It will be appreciated that in embodiments where display surface 110 forms part of a digital display, the digital display (rather than projector 120) is coupled to computing device 210 via graphics subsystem 214.
  • Kiosk 112, user movement sensor 118, user height sensor 119, speakers 116, and scent dispensers 117 are coupled to computing device 210 via one or more input/output (I/O) peripheral interfaces 220. Additional input/output peripherals such as keyboard, monitor, mouse, and the like of computing device 210 are not specifically detailed herein. These may also interconnect to device 210 via I/O peripheral interfaces 220.
  • Computing device 210 may for example be a conventional x86 based, Windows NT, Windows Vista, Windows XP, Apple, Macintosh, Linux, Solaris or similar based computer, known to those of ordinary skill. As will become apparent, computing device 210 may further host software allowing it to function in manners exemplary of embodiments of the present invention.
  • A simplified organization of software components stored within persistent storage (i.e. memory 218) of computing device 210 is depicted in FIG. 3.
  • software components embodying depicted functional blocks may be loaded from a computer readable medium and stored within persistent storage memory 218 at computing device 210.
  • software components may include operating system (O/S) software 320, applications 322, kiosk application 324, gift registries 325, virtual interaction application (VIA) 326, rendered images 328, and sound recordings 329, exemplary of embodiments of the present invention.
  • O/S software 320 may, for example, be a Unix based operating system (e.g. Linux, FreeBSD, Solaris), a Microsoft Windows operating system, or the like. O/S software 320 may also include a TCP/IP stack allowing communication of computing device 210 with a network such as a LAN or the Internet via network interface 216.
  • Applications 322 may include a number of conventional retail applications, such as a point-of-sale application enabling user 130 to purchase products displayed through product display system 100.
  • applications 322 may include or be in communication with a gift registry application, suitable for the order and sale of items chosen by or on behalf of a gift recipient.
  • applications 322 may be in communication with a database storing items forming part of a gift list for the recipient. This database may similarly store lists of items chosen by or on behalf of various gift recipients serviced by the retail establishment operating system 100.
  • Kiosk application 324 provides a graphical user interface (GUI) via user interface 113 enabling user 130 to select from a number of functions including (1) browsing a products catalogue; (2) creating a gift registry; and (3) viewing an existing gift registry.
  • the GUI may also enable user 130 to initiate a virtual product interaction session, as described below.
  • VIA 326 serves to control the virtual product interaction sessions described above. Specifically, in response to inputs from kiosk 112, user movement sensor 118 and/or user height sensor 119, VIA 326 may send one or more rendered images 328 to projector 120 for projection onto display surface 110. VIA 326 may also at appropriate times provide sound outputs 329, in the form of synthesized or recorded sounds, to speakers 116, and cause scent dispensers 117 to dispense certain scents.
  • the flow chart of FIG. 4 illustrates a process by which product display system 100 may initiate a virtual product interaction session.
  • kiosk application 324 may receive an indication from user interface 113 that user 130 has selected the option to "view an existing gift registry" (step 402).
  • kiosk application 324 prompts user 130 via user interface 113 to enter account identification information (step 404).
  • account identification information is obtained (step 405)
  • kiosk application 324 retrieves the corresponding gift registry from gift registries 325, which may include for example a list of selected products (step 406).
  • Kiosk application 324 then causes the contents of the gift registry and related information (e.g. the list of selected products together with associated images and product information) to be displayed via user interface 113 (step 408).
  • VIA 326 may play a sound recording directing user 130 to stand at the location indicated by marker 115 (step 412).
  • VIA 326 may also use height sensor 119 to detect the height of user 130 (step 414).
  • VIA 326 determines the viewing angle of user 130 (step 415) and retrieves a set of rendered images from rendered images 328 that correspond to the viewing angle of user 130 (step 416).
  • VIA 326 then writes the selected rendered images to frame buffer 215, and the images are then projected by projector 120 onto display surface 110 (step 418).
  • the rendered images appear to user 130 as virtual versions of the products listed in the selected gift registry.
  • VIA 326 may also retrieve a sound recording from sound recordings 329 that corresponds to the images displayed on display surface 110, and send it to speakers 116. Similarly, VIA 326 may cause scent dispensers 117 to dispense certain scents corresponding to the images and/or sounds.
  • the series of images may depict items in the gift registry chosen by the user. All items, or the remaining items not yet purchased by users of the registry, may be presented in the series of anamorphic images. A user is thus given a 3D presentation of items in the registry.
  • the items may be animated, moving along a defined trajectory. For example, the items may be moved in a generally circular path on display surface 110. Possibly, a combination of all items in a gift list may be co-rendered, and images corresponding to different viewing positions for all the items in the list may be stored in images 328. For each gift list (i.e. for each gift registrant) a different collection of sets of images may be stored.
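A circular trajectory of the kind described can be parameterized directly; in this illustrative sketch the radius and rotation period are assumed values.

```python
import math

def circular_positions(n_items, t_seconds, radius_cm=40.0, period_s=12.0):
    """
    Evenly space n_items on a circle on the display surface and rotate the
    whole arrangement once per period; returns (x, y) offsets in cm from the
    table centre at elapsed time t_seconds.
    """
    phase = 2.0 * math.pi * (t_seconds % period_s) / period_s
    step = 2.0 * math.pi / n_items
    return [(radius_cm * math.cos(phase + k * step),
             radius_cm * math.sin(phase + k * step)) for k in range(n_items)]
```

At t = 0 the first item sits at (40, 0) and the rest follow at equal angular spacing; successive calls with increasing t rotate the whole gift-list arrangement around the table centre.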
  • FIGS. 6A and 6B provide example anamorphic images of various items in a gift list that may be stored at device 210 for display on surface 110. These images are of course only exemplary. Many more images corresponding to different items (or sets of items) and viewing angles may be stored at device 210. In the particular examples, the knife set in the center of the images is being highlighted. This may be done as a result of user interaction, or otherwise.
  • the flow chart of FIG. 5 illustrates a process by which product display system 100 may enable user 130 to interact with the system during a virtual product interaction session.
  • images displayed on display surface 110 appear to user 130 as virtual versions of the products listed in the selected gift registry. Additionally, the images may also depict virtual buttons associated with any of numerous possible actions, such as rotating or otherwise manipulating the displayed products (step 502).
  • VIA 326 may receive an indication from user movement sensor 118 that user 130 has moved her hand over a virtual button (step 504) or a depicted object. In response, VIA 326 retrieves a set of rendered images from rendered images 328 that correspond to the action associated with the selected virtual button (step 506). The selected rendered images are also selected based on the viewing angle of user 130.
  • VIA 326 then writes the selected rendered images to frame buffer 215 (step 508), and the images are then projected by projector 120 onto display surface 110.
  • the rendered images appear to user 130 as an animation corresponding to the action selected by user 130.
  • VIA 326 may also retrieve a sound recording from sound recordings 329 that corresponds to the images displayed on display surface 110, and send it to speakers 116.
  • VIA 326 may cause scent dispensers 117 to dispense certain scents corresponding to the images and/or sounds.
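Detecting that a hand has moved over a virtual button, as in step 504 above, is at its simplest a point-in-rectangle test against the button regions currently displayed. The layout below is hypothetical; it assumes the sensor reports hand positions in the same table-surface coordinates (cm) as the button rectangles.

```python
# Hypothetical layout: button names and coordinates are assumptions,
# not taken from the patent.
BUTTONS = {
    "rotate": (10.0, 10.0, 18.0, 14.0),   # (x, y, width, height) in cm
    "zoom":   (40.0, 10.0, 18.0, 14.0),
}

def activated_button(hand_x, hand_y):
    """Return the virtual button under the sensed hand position, if any."""
    for name, (bx, by, bw, bh) in BUTTONS.items():
        if bx <= hand_x <= bx + bw and by <= hand_y <= by + bh:
            return name
    return None
```

The returned name would then index the set of rendered images, sounds, and scents to invoke for that action.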
  • the user may interact with virtual objects presented in the images. That is, movement of a user 130 contacting an item in an image may be sensed, and a new series of images may be presented on display surface 110. For example, by using two projectors, touching an image of an item presented by one projector may cause VIA 326 to cause a second projector to present a series of anamorphic images corresponding to the touched item. Similarly, more information about the desired object may be provided, including for example its price, location, or the like.
  • items may be purchased by the interacting user.
  • Identification (including purchase identification) may be provided at kiosk 112, or otherwise.
  • the images may be rendered in real-time as required using conventional graphics processing technology such as a conventional graphics processing unit (GPU) and suitable graphics processing software such as a conventional gaming engine or the like.
  • display surface 110 may be a touch-screen digital display.
  • projector 120 and user movement sensor 118 may not be required, as images can be sent by VIA 326 directly to display surface 110 to be displayed by display surface 110, and inputs from user 130, such as touching a displayed virtual button, may be sensed by display surface 110 and indicated to VIA 326.
  • product display system 100 may include a user position sensor, such as a camera and video processing software, to dynamically detect the position of user 130 and thus enable VIA 326 to dynamically determine the viewing angle of user 130 and to display images optimized for that viewing angle.


Abstract

A product display system comprising a display surface; a height sensor for determining a height of a user; an input device for receiving a product selection from the user, the product selection corresponding to one or more products; a computing device coupled to the height sensor and to the input device, the computing device comprising a frame buffer for outputting images to be displayed on the display surface, the computing device being configured to: receive the product selection from the input device; receive the height from the height sensor and, based on the height, determine a viewing angle of the user in relation to the display surface; and write to the frame buffer one or more images comprising virtual versions of the one or more products, thereby causing the one or more images to be displayed on the display surface, the one or more images corresponding to said viewing angle.

Description

VIRTUAL PRODUCT DISPLAY
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Patent Application No. 61/323,638, filed April 13, 2010, the contents of which are hereby incorporated herein by reference.
FIELD OF THE INVENTION
[0002] The present invention relates generally to product display systems and, more specifically, to product display systems that provide virtual product visualization and interaction.
BACKGROUND
[0003] To promote products and generate sales, typical retail environments utilize floor and shelf space to display products to potential consumers. This type of display marketing helps promote the sales of the displayed products because potential customers often want to both see and interact with the actual products prior to making a purchasing decision.
[0004] Unfortunately, floor and shelf space in typical retail environments is often limited, so retailers often are only able to display a fraction of the set of products which are available for purchase. Furthermore, potential customers may have difficulty finding product displays of items they are looking for as they navigate the retail environment. Potential customers may also need to visit multiple locations of the merchant looking for particular items. Hence, potential customers may be deterred by inconveniences associated with locating products on display prior to making purchasing decisions, and sales may be lost.
[0005] The aforementioned difficulties are more acutely felt in a gift registry context, where potential customers typically want to see and interact with multiple products during the gift selection process.
[0006] To address these drawbacks, some retailers have installed kiosks in retail locations where images of products may be viewed. Unfortunately the images displayed on conventional kiosk displays provide limited, if any, interaction with, and views of, the displayed products. Hence, potential customers do not gain an experience akin to seeing and interacting with the actual product.
SUMMARY OF THE INVENTION
[0007] In accordance with an aspect of the present invention, there is provided a product display system. The system comprises: a display surface; a height sensor for determining a height of a user; an input device for receiving a product selection from the user, the product selection corresponding to one or more products; a computing device coupled to the height sensor and to the input device, the computing device comprising a frame buffer for outputting images to be displayed on the display surface, the computing device configured to: receive the product selection from the input device; receive the height from the height sensor and, based on the height, determine a viewing angle of the user in relation to the display surface; and write to the frame buffer one or more images comprising virtual versions of the one or more products, thereby causing the one or more images to be displayed on the display surface, the one or more images corresponding to the viewing angle.
[0008] In accordance with another aspect of the present invention, there is provided a method of providing virtual product interaction to a user. The method comprises: receiving a product selection from the user, the product selection corresponding to one or more products; determining a height of the user; based on the height, determining a viewing angle of the user in relation to a display surface; writing one or more images comprising virtual versions of the one or more products to a frame buffer, thereby causing the one or more images to be displayed on the display surface, the one or more images corresponding to the viewing angle.
[0009] In accordance with yet another aspect of the present invention, there is provided a product display system. The system comprises: a display surface; an input device for receiving information from the user indicative of at least one product of interest; a projector mounted above the display surface; a computing device coupled to the input device and the projector, the computing device comprising a frame buffer for outputting images to be displayed by the projector on the display surface. The computing device is configured to: write to the frame buffer one or more two dimensional images comprising anamorphic projections of one or more items of interest to the user, thereby causing the one or more two dimensional images to be displayed on the display surface, and allowing the user to perceive the one or more items of interest as three dimensional objects.
[0010] Other aspects and features of the present invention will become apparent to those ordinarily skilled in the art upon review of the following description of specific embodiments of the invention in conjunction with the accompanying figures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] In the figures which illustrate embodiments of the invention by example only,
[0012] FIG. 1 is a simplified perspective view of a product display system that provides virtual product interaction, exemplary of an embodiment of the present invention;
[0013] FIG. 2 is a simplified block diagram of the product display system of FIG. 1;
[0014] FIG. 3 is a simplified block diagram of the contents of the memory of FIG. 2;
[0015] FIG. 4 is a flow chart of steps performed in initiating a virtual product interaction session; and
[0016] FIG. 5 is a flow chart of steps performed during a user interaction with a product display system that provides virtual product interaction; and
[0017] FIGS. 6A and 6B are images of projections of products of interest, formed by the product display system of FIG. 1.
DETAILED DESCRIPTION
[0018] FIG. 1 illustrates a simplified perspective view of a product display system 100 which provides virtual product interaction. Product display system 100 includes a display surface 110, a projector 120 for projecting images onto display surface 110, a user movement sensor 118, a user height sensor 119, speakers 116, scent dispensers 117, and a marker 115 indicating where a user 130 may stand in order to enjoy an optimal viewing angle and experience. System 100 may further include a kiosk 112 providing a user interface 113.
[0019] Display surface 110 may be a conventional projection screen including a surface and a support structure used for displaying projected images for viewing by a user 130. In some embodiments, display surface 110 may for example be implemented using a sheet of white vellum paper, or the like, interposed between two sheets of glass, or between a sheet and another rigid support surface. Surface 110 may be supported by a base, and thereby form a table for the support of objects for sale and the like. In the depicted embodiment, display surface 110 takes the form of a round glass table.
[0020] Projector 120 may be a conventional display projector for projecting an image onto display surface 110. In some embodiments, projector 120 may be a conventional 1080p, 720p, XVGA or similar home theatre or business projector available from Sony, Mitsubishi, Sanyo, Panasonic, Epson or other known manufacturer. In an embodiment, projector 120 may be focused to project exactly and only on surface 110, allowing projector 120 to fill surface 110 (and not the environment). This may be accomplished using an appropriate throw distance, focus, and optional mask.
[0021] As shown in FIG. 1, projector 120 may be located above display surface 110, and display surface 110 is oriented substantially horizontally. However, it will be appreciated that projector 120 may be positioned in different locations, and that display surface 110 may be arranged in different orientations. For example, in some embodiments projector 120 may be positioned directly below display surface 110 and display surface 110 may be a rear-projection screen. Alternatively, projector 120 may be obliquely mounted at a location above, but not directly above, surface 110.
[0022] In some embodiments, multiple projectors in multiple locations may be used. For example, projector 120 may be mounted above surface 110 and a second projector mounted beneath surface 110. Both projectors may be operated concurrently to present images on surface 110.
[0023] It will also be appreciated that in some embodiments display surface 110 may be a digital display screen forming part of a digital display, such as a Liquid Crystal Display (LCD) or the like (in such embodiments, projector 120 may not be required).
[0024] As described in more detail below, user movement sensor 118 is operable to detect hand and/or arm gestures of user 130 in the vicinity of display surface 110 to thereby allow user 130 to interact with product display system 100.
[0025] User height sensor 119 is operable to detect the height of user 130, and may for example comprise one or more cameras and machine vision software as is known in the art. Alternatively, height sensor 119 may be an infrared sensor, a sonic sensor, or other appropriate sensor for sensing the presence and height of an individual. As will be apparent, user height sensor 119 serves to approximate the viewing height of a user proximate surface 110.
[0026] Kiosk 112 may be a computer terminal that provides information access by way of user interface 113, which may be a conventional touch-screen display or the like.
[0027] As will become apparent, user 130 may initiate an interaction with product display system 100 through user interface 113 of kiosk 112. Specifically, user interface
113 may present options to user 130, such as the following options: (1) browse products catalogue; (2) create a gift registry; and (3) view an existing gift registry. If user 130 selects one of the presented options, user interface 113 may then present a
conventional menu for browsing a product catalogue, creating a gift registry, or viewing an existing gift registry, including enabling user 130 to create a user profile or to login to an existing user profile. Additionally, user interface 113 may enable user 130 to initiate a virtual product interaction session. If user 130 selects this option, projector 120 displays substantially life-size, realistic images of selected products on display surface 110. Product display system 100 may use speakers 116 and scent dispensers 117 to enhance the sensory experience of user 130 by providing sounds and scents
associated with the images displayed on display surface 110.
[0028] Product display system 100 may also use user movement sensor 118 to enable user 130 to virtually interact with the displayed images. For example, virtual "buttons" associated with any of numerous possible actions, such as rotating or otherwise manipulating a displayed product, may be displayed on display surface 110. Alternatively, the user may interact with projected images of items. User movement sensor 118 detects when user 130 has moved her hand over a virtual button (indicating that user 130 has "activated" the virtual button) and, in response, product display system 100 may invoke images, sounds, and/or scents to virtually animate the action corresponding with the selected virtual button, thereby providing an experience akin to interacting with the actual physical product.
[0029] In some embodiments, projector 120 may project oblique anamorphic images onto display surface 110 to provide virtual 3-dimensional (3D) visualizations of products to user 130. As is known, oblique anamorphic images are drawn with a particular distortion in order to create an impression of 3 dimensions when seen from specific viewpoints. Anamorphic images (or anamorphosis) are more particularly described in the text "Hidden Images: Games of Perception, Anamorphic Art, Illusion: From the Renaissance to the Present" by Fred Leeman, Joost Elffers, Michael Schuyt, published by Harry N. Abrams, Inc., 1976, ISBN-13: 9780810990197, ISBN: 0810990199, the contents of which are hereby incorporated by reference. As will be appreciated, realistic high-resolution oblique anamorphic images may be generated using a conventional rendering software application such as, for example, Autodesk's 3ds Max (also known as 3D Studio Max) and/or Autodesk's Maya. Constructing an oblique anamorphic image is a geometrical exercise known to those ordinarily skilled in the art, and is more particularly described for example in J. L. Hunt, B. G. Nickel, and Christian Gigault's paper "Anamorphic images" published in the American Journal of Physics, March 2000, Volume 68, Issue 3, at pp. 232-237, the contents of which are hereby incorporated by reference.
[0030] In some embodiments, multiple sets of oblique anamorphic images may be pre-generated and stored at display system 100 for later presentation on surface 110, where each set corresponds to a different viewing angle. Each image may be a frame, and a set of images may be associated to provide full motion video at a suitable frame rate, such as 24 or 30 fps. Each set of oblique anamorphic images may display one or more items to be viewed by the user, in motion. The items may for example move along a defined trajectory on surface 110. As described in more detail below, the viewing angle of user 130 may be determined by product display system 100, and a set of oblique anamorphic images optimal for user's 130 viewing angle may then be retrieved for presentation on surface 110.
[0031] Because such images are best viewed from specific viewpoints, marker 115 may be used to indicate to user 130 where user 130 may stand in order to enjoy an optimal viewing angle. Marker 115 may be projected onto the floor, or it may be an actual physical marker, for example painted on the floor. User height sensor 119 may also be used to more accurately determine the viewing angle of user 130, enabling product display system 100 to adjust the displayed images accordingly. For example, based on height information received from user height sensor 119 and the position of user 130 as determined by marker 115, product display system 100 may cause images optimized for user's 130 viewing angle to be displayed on display surface 110.
[0032] FIG. 2 illustrates a simplified block diagram of a computing device 210 forming part of product display system 100. In particular, as illustrated computing device 210 includes central processing unit (CPU) 212, graphics subsystem 214 having a frame buffer 215, network interface 216, and a suitable combination of persistent storage memory 218, random access memory and read only memory. Network interface 216 interconnects computing device 210 to a network such as a LAN or the Internet. [0033] Projector 120 is coupled to computing device 210 by graphics subsystem 214. Subsystem 214 may include a graphics processor, display interface, and frame buffer 215. As is known in the art, images written to frame buffer 215 are provided to projector 120 for projection onto display surface 110, using a conventional display interface, such as a VGA, DVI, HDMI, DisplayPort or similar interface. It will be appreciated that in embodiments where display surface 110 forms part of a digital display, the digital display (rather than projector 120) is coupled to computing device 210 via graphics subsystem 214.
[0034] Kiosk 112, user movement sensor 118, user height sensor 119, speakers 116, and scent dispensers 117 are coupled to computing device 210 via one or more input/output (I/O) peripheral interfaces 220. Additional input/output peripherals such as keyboard, monitor, mouse, and the like of computing device 210 are not specifically detailed herein. These may also interconnect to device 210 via I/O peripheral interfaces 220.
[0035] Computing device 210 may for example be a conventional x86 based, Windows NT, Windows Vista, Windows XP, Apple, Macintosh, Linux, Solaris or similar based computer, known to those of ordinary skill. As will become apparent, computing device 210 may further host software allowing it to function in manners exemplary of embodiments of the present invention.
[0036] A simplified organization of software components stored within persistent storage (i.e. memory 218) of computing device 210 is depicted in FIG. 3. As will be appreciated, software components embodying depicted functional blocks may be loaded from a computer readable medium and stored within persistent storage memory 218 at computing device 210. As illustrated, software components may include operating system (O/S) software 320, applications 322, kiosk application 324, gift registries 325, virtual interaction application (VIA) 326, rendered images 328, and sound recordings 329, exemplary of embodiments of the present invention.
[0037] O/S software 320 may, for example, be a Unix based operating system (e.g. Linux, FreeBSD, Solaris), a Microsoft Windows operating system, or the like. O/S software 320 may also include a TCP/IP stack allowing communication of computing device 210 with a network such as a LAN or the Internet via network interface 216.
[0038] Applications 322 may include a number of conventional retail applications, such as a point-of-sale application enabling user 130 to purchase products displayed through product display system 100. In an embodiment, applications 322 may include or be in communication with a gift registry application, suitable for the order and sale of items chosen by or on behalf of a gift recipient. To that end, applications 322 may be in communication with a database storing items forming part of a gift list for the recipient. This database may similarly store lists of items chosen by or on behalf of various gift recipients serviced by the retail establishment operating system 100.
[0039] Kiosk application 324 provides a graphical user interface (GUI) via user interface 113 enabling user 130 to select from a number of functions including (1) browsing a products catalogue; (2) creating a gift registry; and (3) viewing an existing gift registry. The GUI may also enable user 130 to initiate a virtual product interaction session, as described below.
[0040] VIA 326 serves to control the virtual product interaction sessions described above. Specifically, in response to inputs from kiosk 112, user movement sensor 118 and/or user height sensor 119, VIA 326 may send one or more rendered images 328 to projector 120 for projection onto display surface 110. VIA 326 may also at appropriate times provide sound outputs 329, in the form of synthesized or recorded sounds, to speakers 116, and cause scent dispensers 117 to dispense certain scents.
[0041] A scenario exemplary of a user interaction with product display system 100 will now be described with reference to FIG. 1 and the flow charts of FIGS. 4 and 5.
[0042] The flow chart of FIG. 4 illustrates a process by which product display system 100 may initiate a virtual product interaction session. At the start, kiosk application 324 may receive an indication from user interface 113 that user 130 has selected the option to "view an existing gift registry" (step 402). In response, kiosk application 324 prompts user 130 via user interface 113 to enter account identification information (step 404). After account identification information is obtained (step 405), kiosk application 324 retrieves the corresponding gift registry from gift registries 325, which may include for example a list of selected products (step 406). Kiosk application 324 then causes the contents of the gift registry and related information (e.g. the list of selected products together with associated images and product information) to be displayed via user interface 113 (step 408). Additionally, user 130 is presented with an option to initiate a virtual product interaction session in association with the products listed in the selected gift registry. When the virtual product interaction session option is selected (step 410), VIA 326 may play a sound recording directing user 130 to stand at the location indicated by marker 115 (step 412). VIA 326 may also use height sensor 119 to detect the height of user 130 (step 414). Based on the height reading and on the position of user 130 as specified by marker 115, VIA 326 determines the viewing angle of user 130 (step 415) and retrieves a set of rendered images from rendered images 328 that correspond to the viewing angle of user 130 (step 416). VIA 326 then writes the selected rendered images to frame buffer 215, and the images are then projected by projector 120 onto display surface 110 (step 418). When displayed on display surface 110, the rendered images appear to user 130 as virtual versions of the products listed in the selected gift registry.
[0043] As will be appreciated, a series of rendered images may be used to depict the virtual products in animation - for example, the displayed products may appear to be rotating about the central axis of display surface 110. At the same time, VIA 326 may also retrieve a sound recording from sound recordings 329 that corresponds to the images displayed on display surface 110, and send it to speakers 116. Similarly, VIA 326 may cause scent dispensers 117 to dispense certain scents corresponding to the images and/or sounds.
[0044] Conveniently, the series of images may depict items in the gift registry chosen by the user. All items, or the remaining items not yet purchased by users of the registry, may be presented in the series of anamorphic images. A user is thus given a 3D presentation of items in the registry. Conveniently, the items may be animated, moving along a defined trajectory. For example, the items may be moved in a generally circular path on display surface 110. Possibly, a combination of all items in a gift list may be co-rendered, and images corresponding to different viewing positions for all the items in the list may be stored in images 328. For each gift list (i.e. for each gift registrant) a different collection of sets of images may be stored.
[0045] FIGS. 6A and 6B provide example anamorphic images of various items in a gift list that may be stored at device 210 for display on surface 110. These images are of course only exemplary. Many more images corresponding to different items (or sets of items) and viewing angles may be stored at device 210. In the particular examples, the knife set in the center of the images is being highlighted. This may be done as a result of user interaction, or otherwise.
[0046] The flow chart of FIG. 5 illustrates a process by which product display system 100 may enable user 130 to interact with the system during a virtual product interaction session. As noted, images displayed on display surface 110 appear to user 130 as virtual versions of the products listed in the selected gift registry. Additionally, the images may also depict virtual buttons associated with any of numerous possible actions, such as rotating or otherwise manipulating the displayed products (step 502). VIA 326 may receive an indication from user movement sensor 118 that user 130 has moved her hand over a virtual button (step 504) or a depicted object. In response, VIA 326 retrieves a set of rendered images from rendered images 328 that correspond to the action associated with the selected virtual button (step 506). The selected rendered images are also selected based on the viewing angle of user 130. VIA 326 then writes the selected rendered images to frame buffer 215 (step 508), and the images are then projected by projector 120 onto display surface 110. When displayed on display surface 110, the rendered images appear to user 130 as an animation corresponding to the action selected by user 130. As before, VIA 326 may also retrieve a sound recording from sound recordings 329 that corresponds to the images displayed on display surface 110, and send it to speakers 116. Similarly, VIA 326 may cause scent dispensers 117 to dispense certain scents corresponding to the images and/or sounds.
[0047] Optionally, the user may interact with virtual objects presented in the images. That is, movement of a user 130 contacting an item in an image may be sensed, and a new series of images may be presented on display surface 110. For example, by using two projectors, touching an image of an item presented by one projector may cause VIA 326 to cause a second projector to present a series of anamorphic images corresponding to the touched item. Similarly, more information about the desired object may be provided, including for example its price, location, or the like.
[0048] Also, in the event the items form part of a gift registry, items may be purchased by the interacting user. Identification (including purchase identification) may be provided at kiosk 112, or otherwise.
[0049] Though the images displayed on display surface 110 have been described as being pre-rendered and stored in persistent storage memory 218, it will be
appreciated that in some embodiments the images may be rendered in real-time as required using conventional graphics processing technology such as a conventional graphics processing unit (GPU) and suitable graphics processing software such as a conventional gaming engine or the like.
[0050] It will also be appreciated that in some embodiments, display surface 110 may be a touch-screen digital display. In such embodiments, projector 120 and user movement sensor 118 may not be required, as images can be sent by VIA 326 directly to display surface 110 to be displayed by display surface 110, and inputs from user 130, such as touching a displayed virtual button, may be sensed by display surface 110 and indicated to VIA 326.
[0051] It will further be appreciated that rather than use marker 115 to direct user 130 to a particular optimal viewing location, product display system 100 may include a user position sensor, such as a camera and video processing software, to dynamically detect the position of user 130 and thus enable VIA 326 to dynamically determine the viewing angle of user 130 and to display images optimized for that viewing angle.
[0052] Other modifications will be apparent to those skilled in the art and, therefore, the invention is defined in the claims.

Claims

WHAT IS CLAIMED IS:
1. A product display system comprising: a display surface; a height sensor for determining a height of a user; an input device for receiving a product selection from said user, said product selection corresponding to one or more products; a computing device coupled to said height sensor and to said input device, said computing device comprising a frame buffer for outputting images to be displayed on said display surface, said computing device configured to: receive said product selection from said input device; receive said height from said height sensor and, based on said height, determine a viewing angle of said user in relation to said display surface; and write to said frame buffer one or more images comprising virtual versions of said one or more products, thereby causing said one or more images to be displayed on said display surface, said one or more images
corresponding to said viewing angle.
2. The system of claim 1, wherein said one or more images comprise one or more oblique anamorphic images.
3. The system of claim 1, wherein said product display system further comprises a projector coupled to said frame buffer; said display surface comprises a projection screen; and said causing said one or more images to be displayed on said display surface comprises causing said one or more images to be sent to said projector for projection onto said display surface.
4. The system of claim 1, wherein said one or more images are pre-rendered using a rendering software application and stored in a memory, and wherein said computing device is further configured to retrieve said one or more images from said memory prior to said writing said one or more images to said frame buffer.
5. The system of claim 1, wherein said product display system further comprises one or more speakers, and said computing device is further configured to send a sound recording corresponding to said one or more images to said one or more speakers.
6. The system of claim 1, wherein said product display system further comprises one or more scent dispensers, and said computing device is further configured to cause a scent corresponding to said one or more images to be dispensed by said one or more scent dispensers.
7. The system of claim 1, wherein said one or more images further comprise one or more virtual buttons, each of said virtual buttons being associated with an action; said product display system further comprises a user movement sensor for detecting hand gestures of said user that correspond to activating said virtual buttons; and said computing device is further configured to: receive an indication from said user movement sensor that said user has activated a particular one of said virtual buttons; and responsive to said indication from said movement sensor, write to said frame buffer a plurality of action images comprising animations corresponding to said action associated with said virtual button, thereby causing said plurality of action images to be displayed on said display surface, said plurality of action images corresponding to said viewing angle.
8. A method of providing virtual product interaction to a user, said method comprising: receiving a product selection from said user, said product selection
corresponding to one or more products; determining a height of said user; based on said height, determining a viewing angle of said user in relation to a display surface; writing one or more images comprising virtual versions of said one or more products to a frame buffer, thereby causing said one or more images to be displayed on said display surface, said one or more images corresponding to said viewing angle.
9. The method of claim 8, wherein said one or more images comprise one or more oblique anamorphic images.
10. The method of claim 8, wherein said display surface comprises a projection screen; and said causing said one or more images to be displayed on said display surface comprises causing said one or more images to be sent to a projector for projection onto said display surface.
11. The method of claim 8, further comprising pre-rendering said one or more images using a rendering software application, storing said one or more images in a memory, and retrieving said one or more images from said memory prior to said writing said one or more images to said frame buffer.
12. The method of claim 8, further comprising sending a sound recording corresponding to said one or more images to one or more speakers.
13. The method of claim 8, further comprising causing a scent corresponding to said one or more images to be dispensed by one or more scent dispensers.
14. The method of claim 8, wherein said one or more images further comprise one or more virtual buttons, each of said virtual buttons being associated with an action; said method further comprising: detecting a hand gesture of said user that corresponds to activating one of said virtual buttons; responsive to said detecting said hand gesture, writing to said frame buffer a plurality of action images comprising animations corresponding to said action associated with said virtual button, thereby causing said plurality of action images to be displayed on said display surface, said plurality of action images corresponding to said viewing angle.
15. A product display system comprising: a display surface; an input device for receiving information from a user indicative of at least one product of interest; a projector mounted above said display surface; a computing device coupled to said input device and said projector, said computing device comprising a frame buffer for outputting images to be displayed by said projector on said display surface, said computing device configured to: write to said frame buffer one or more two dimensional images comprising anamorphic projections of one or more items of interest to said user, thereby causing said one or more two dimensional images to be displayed on said display surface, and allowing said user to perceive said one or more items of interest as three dimensional objects.
PCT/CA2011/000413 2010-04-13 2011-04-13 Virtual product display Ceased WO2011127578A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/652,043 US9733699B2 (en) 2010-04-13 2012-10-15 Virtual anamorphic product display with viewer height detection

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US32363810P 2010-04-13 2010-04-13
US61/323,638 2010-04-13

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/652,043 Continuation US9733699B2 (en) 2010-04-13 2012-10-15 Virtual anamorphic product display with viewer height detection

Publications (1)

Publication Number Publication Date
WO2011127578A1 true WO2011127578A1 (en) 2011-10-20

Family

ID=44798212

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2011/000413 Ceased WO2011127578A1 (en) 2010-04-13 2011-04-13 Virtual product display

Country Status (1)

Country Link
WO (1) WO2011127578A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013071904A1 (en) * 2011-11-15 2013-05-23 Seca Ag Height measuring device
EP2940995A1 (en) 2014-04-29 2015-11-04 Satavision OY Virtual vitrine
US9361628B2 (en) 2010-04-13 2016-06-07 Dean Stark Interactive video shelving system
US9367869B2 (en) 2012-02-13 2016-06-14 Dean Stark System and method for virtual display
US9563906B2 (en) 2011-02-11 2017-02-07 4D Retail Technology Corp. System and method for virtual shopping display
US9733699B2 (en) 2010-04-13 2017-08-15 Dean Stark Virtual anamorphic product display with viewer height detection
EP3678126A4 (en) * 2017-08-31 2020-08-26 Sony Corporation INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND PROGRAM
CN113450258A (en) * 2021-08-31 2021-09-28 贝壳技术有限公司 Visual angle conversion method and device, storage medium and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6183089B1 (en) * 1998-10-13 2001-02-06 Hossein Tajalli Tehrani Motion picture, TV and computer 3-D imaging system and method of use
WO2006047487A2 (en) * 2004-10-25 2006-05-04 The Trustees Of Columbia University In The City Of New York Systems and methods for displaying three-dimensional images
WO2006115261A1 (en) * 2005-04-21 2006-11-02 Canon Kabushiki Kaisha Image processing method and image processing apparatus
EP1837830A1 (en) * 2006-03-14 2007-09-26 Kaon Interactive Inc. Product visualization and interaction systems and methods thereof
US20080010169A1 (en) * 2006-07-07 2008-01-10 Dollens Joseph R Method and system for managing and displaying product images
WO2009032772A2 (en) * 2007-08-28 2009-03-12 Ngo Nghi Roger High product density interactive rotating kiosk with virtual tracking animation user interface

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6183089B1 (en) * 1998-10-13 2001-02-06 Hossein Tajalli Tehrani Motion picture, TV and computer 3-D imaging system and method of use
WO2006047487A2 (en) * 2004-10-25 2006-05-04 The Trustees Of Columbia University In The City Of New York Systems and methods for displaying three-dimensional images
WO2006115261A1 (en) * 2005-04-21 2006-11-02 Canon Kabushiki Kaisha Image processing method and image processing apparatus
EP1837830A1 (en) * 2006-03-14 2007-09-26 Kaon Interactive Inc. Product visualization and interaction systems and methods thereof
US20080010169A1 (en) * 2006-07-07 2008-01-10 Dollens Joseph R Method and system for managing and displaying product images
WO2009032772A2 (en) * 2007-08-28 2009-03-12 Ngo Nghi Roger High product density interactive rotating kiosk with virtual tracking animation user interface

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9361628B2 (en) 2010-04-13 2016-06-07 Dean Stark Interactive video shelving system
US9733699B2 (en) 2010-04-13 2017-08-15 Dean Stark Virtual anamorphic product display with viewer height detection
US9563906B2 (en) 2011-02-11 2017-02-07 4D Retail Technology Corp. System and method for virtual shopping display
JP2014534877A (en) * 2011-11-15 2014-12-25 ゼカ アーゲー Length measuring device
WO2013071904A1 (en) * 2011-11-15 2013-05-23 Seca Ag Height measuring device
US20140071270A1 (en) * 2011-11-15 2014-03-13 Seca Ag Device for height measurement
US9636047B2 (en) 2011-11-15 2017-05-02 Seca Ag Device for height measurement
CN103228212A (en) * 2011-11-15 2013-07-31 Seca AG Device for height measurement
US9367869B2 (en) 2012-02-13 2016-06-14 Dean Stark System and method for virtual display
EP2940995A1 (en) 2014-04-29 2015-11-04 Satavision OY Virtual vitrine
EP3678126A4 (en) * 2017-08-31 2020-08-26 Sony Corporation Information processing device, information processing method and program
US11460994B2 (en) 2017-08-31 2022-10-04 Sony Corporation Information processing apparatus and information processing method
CN113450258A (en) * 2021-08-31 2021-09-28 贝壳技术有限公司 Visual angle conversion method and device, storage medium and electronic equipment
CN113450258B (en) * 2021-08-31 2021-11-05 贝壳技术有限公司 Visual angle conversion method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
WO2011127578A1 (en) Virtual product display
US11854148B2 (en) Virtual content display opportunity in mixed reality
US9361628B2 (en) Interactive video shelving system
US8577762B2 (en) Detail-in-context lenses for interacting with objects in digital image presentations
US10341642B2 (en) Display device, control method, and control program for stereoscopically displaying objects
US9087413B2 (en) System for delivering and enabling interactivity with images
US9965697B2 (en) Head pose determination using a camera and a distance determination
US11495003B1 (en) Placing and manipulating multiple three-dimensional (3D) models using mobile augmented reality
TW201028888A (en) Method of presenting head-pose feedback to a user of an interactive display system
KR101724999B1 (en) Virtual shopping system comprising virtual shopping terminal and virtual shopping server
US9733699B2 (en) Virtual anamorphic product display with viewer height detection
US20050156027A1 (en) Method and apparatus for vending magic, pranks, and gags
EP2778887A2 (en) Interactive display device
KR101181740B1 (en) Smart show window
CN106815756A (en) A kind of exchange method of Virtual shop, subscriber terminal equipment and server
US20080010167A1 (en) Virtual-Product Presentation System
JP6094918B1 (en) Information terminal device, three-dimensional image generation server, three-dimensional image display system, three-dimensional image display method, and three-dimensional image display program
JP6583745B2 (en) Online shopping system
JP2021140761A (en) Distribution system, viewing apparatus, video generation apparatus, information processing method, and video generation method
JP6261880B2 (en) Image display device
US20170083952A1 (en) System and method of markerless injection of 3d ads in ar and user interaction
JP6065533B2 (en) Electronic signage apparatus and operation method
US12093995B1 (en) Card ecosystem guest interface in virtual reality retail environments
JP7445048B1 (en) Programs and game devices
WO2013108388A1 (en) Vending machine, method for controlling vending machine, and program for controlling vending machine

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11768318

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11768318

Country of ref document: EP

Kind code of ref document: A1