
WO2024205389A1 - Integration of a navigable street view and a venue interior view - Google Patents

Integration of a navigable street view and a venue interior view

Info

Publication number
WO2024205389A1
Authority
WO
WIPO (PCT)
Prior art keywords
view
user
venue
perspective
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/MY2023/050018
Other languages
English (en)
Inventor
Fook Wah CHEN
Fu Seng LOO
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Juda Universe Sdn Bhd
Original Assignee
Juda Universe Sdn Bhd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Juda Universe Sdn Bhd filed Critical Juda Universe Sdn Bhd
Priority to PCT/MY2023/050018
Publication of WO2024205389A1
Current legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/003: Navigation within 3D models or images
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/206: Instruments for performing navigational calculations specially adapted for indoor navigation

Definitions

  • the present invention relates to a method of integrating a navigable map street perspective view and a venue interior perspective view.
  • Open-source and online maps typically offer overhead views, where whole areas on a map can be viewed from a top-down perspective.
  • Online maps such as Google Maps now also offer a street perspective view, where a user can view objects and buildings from a street perspective, akin to what they would see were they to walk on the street itself. The user is able to walk freely along any street that is available in the virtual map.
  • Users can also visit virtual venues such as shopping centers to walk around virtually in those venues, and engage in activities that they would normally engage in when visiting actual shopping centers, such as shopping.
  • What is needed in the art is a way to integrate in a seamless fashion the street perspective view and an interior perspective view of a virtual venue such as a shopping center.
  • the present invention seeks to overcome the aforementioned disadvantages by providing a computer-based method for a user to navigate between a street perspective view of a virtual map and an interior perspective view of a virtual venue on the map.
  • a street perspective view of a virtual street in an area of the virtual map is displayed on a display seen by the user.
  • the user navigates around the virtual street towards a venue point that corresponds to a virtual venue.
  • the user engages with the venue point by walking into it or clicking on it.
  • the display then changes seamlessly to an interior of the virtual venue, as if the user had walked from an actual street into an actual venue.
  • the present invention thus relates to a computer-based method for a user to navigate between a street perspective view of a virtual map and an interior perspective view of a virtual venue on the map, comprising:
  • the street perspective view having a point of view that corresponds to a position and directional orientation of the user, the display thus changing in accordance with changes in said position and directional orientation as the user navigates around;
  • the interior perspective view having a point of view that corresponds to a position and directional orientation of the user, the display thus changing in accordance with changes in said position and directional orientation of the user within the virtual venue;
  • metadata including street names, building names, and venue point names in the street perspective and interior perspective views.
  • the user is able to control their position and directional orientation using their head movement, eye movement, hand gestures, speech, computer mouse, computer keyboard, or touchscreen display, or any combination thereof.
  • An image and movement recognition device is further provided, which is able to decipher images and movements of objects, and further translate said images and movements into predetermined movements of the user within the virtual map.
  • a file server stores colour and texture data used to render the said street perspective view, venue points, and interior perspective view.
  • a database server stores the metadata used to render the street names, building names, and venue point names in the street perspective and interior perspective views.
  • FIG. 1A shows a third person street perspective view of an embodiment of this invention.
  • FIG. 1B shows a first person street perspective view of an embodiment of this invention.
  • FIG. 2A shows a third person street perspective view of an embodiment of this invention.
  • FIG. 2B shows a first person street perspective view of an embodiment of this invention.
  • FIG. 3 shows a first person interior perspective view of an embodiment of this invention.
  • FIG. 4 shows a process flow chart of an embodiment of this invention.
  • FIG. 5 shows a network architecture diagram of an embodiment of this invention.
  • the map could be any open-source or online map that displays street views on a device such as a computer, tablet, phone or other multimedia device.
  • In FIGS. 1A and 1B there are also shown venue points (24A, 24B and 24C). These are user-engageable points located within the street perspective view of the map. The user is able to navigate along the street and around the map as if he were walking on an actual street. The way the user navigates around the map is discussed in more detail below.
  • In Figures 2A and 2B, the user has navigated in the virtual map towards a venue point (24B).
  • the user is now standing directly in front of the venue point (24B).
  • Similar to the difference between Figures 1A and 1B, Figure 2A shows a third person view, which shows the back of the virtual user (20), while Figure 2B shows a first person view, which shows on the display the view that would be seen by someone standing where the user is standing and looking in the direction the user is looking.
  • Using the same navigation controls as he used to navigate around the map, he moves the virtual user (20) forwards as if he were entering the virtual venue shown in the street perspective view. He also has the option of engaging with the venue point in any number of other predetermined ways, such as clicking or tapping on the venue point.
  • the main objective of the present invention is to provide a seamless transition between the street perspective view and a venue interior perspective view, and the user virtually walking into the venue directly from the street view is paramount in maintaining the illusion of a seamless transition.
  • FIG. 3 shows an interior perspective view of the inside of a virtual venue (30) that would be displayed to the user once he has entered the virtual venue.
  • the interior of the virtual venue is further provided with user engageable points (34A, 34B, 34C), which may include shopping items available for online purchase, cashiers for payment, help or information kiosks, and any other services that might be available in a real-world shopping center.
  • the virtual venue in this example is that of a store or shopping center.
  • the main inventive feature of this invention is the seamless transition between the street perspective view and the venue interior perspective view displayed to the user.
  • the virtual venue could be any venue that exists in the real world, and is not limited to retail establishments; it could, for example, be a hotel, children’s playground, police station, travel agent, or car dealership.
  • the user accesses an area of a virtual map (100) in a street perspective view (such as that shown in FIGS. 1A and 1B above).
  • the device displays to the user a street perspective view in accordance with the position and directional orientation of the user (102).
  • the user is able to navigate around the street view, and the display changes in accordance with changes in the said position and directional orientation of the user as he navigates around.
  • when the user reaches a pre-determined venue point (104), he is able to walk into or access said venue point or engage it in some other way (106).
  • the display changes immediately to an interior view of a virtual venue that corresponds to the venue point that was accessed by the user (108).
  • the user is thus able to seamlessly transition from a street perspective view of a virtual street on an online or other virtual map to an interior perspective view of a virtual venue; a minimal sketch of this transition logic is given after this list.
  • a map content server (210) that is updated regularly from open-source maps with map data such as user coordinates and 3D positions.
  • a map node (220) retrieves the coordinates, 3D position and directional orientation of the user from the map content server (210).
  • the map node (220) also retrieves colour and texture information for the view that it will render on the user display from a file server (212).
  • the map node (220) also retrieves metadata such as street names, building names and other information from a database server (216). Using all this retrieved data, the map node (220) renders a street perspective view on the user display for the user to view, based on the user’s 3D position and directional orientation on the virtual street.
  • the user is able to navigate along the street and around the map using a multitude of inputs, such as using their head movement, eye movement, hand gestures, speech, computer mouse, computer keyboard, or touchscreen display, or any combination thereof.
  • there is further provided an image and movement capture device, which functions to capture movement of the user’s head, eyes or hands and send that data to a motion capture node (222).
  • the motion capture node (222) is able to decipher the captured image and movement data into predetermined movements of the user within the virtual map, according to a predetermined set of rules. In this way, the user does not need to use a traditional mouse and keyboard or other input device to control movement within the virtual map; a sketch of such a rule mapping is given after this list.
  • a navigation node (224) assists in the movement of the user within the virtual map.
  • a store node (226) retrieves from the file server (212) the colour and texture information for the venue interior perspective view that it will render on the user display.
  • the 3D position and directional orientation of the user is fixed to an entrance of the venue, looking into the venue, as if the user had just walked into the venue.
  • the store node (226) also retrieves data about items that are to be offered in the venue from the database server (216), such as products or services for sale or otherwise being provided.
  • the store node (226) displays an interior perspective view of the venue from the perspective of the user having just entered the virtual venue. Thereafter, the user is again free to navigate around the venue interior just as he did in the street perspective view; a sketch of this retrieval flow is given after this list.
  • Machine learning server (214)
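
To make the flow-chart steps (100-108) concrete, the following is a minimal TypeScript sketch of the street-to-interior transition described above. All names here (Pose, VenuePoint, NavigationSession, entrancePose) are illustrative assumptions; the patent does not prescribe a data model or API.

```typescript
// Hypothetical sketch of the street-to-interior transition (steps 100-108).
// Type names and coordinate values are assumptions, not from the patent.

interface Pose {
  x: number;
  y: number;
  heading: number; // directional orientation, in degrees
}

interface VenuePoint {
  id: string;         // e.g. "24B" in FIGS. 1A-2B
  entrancePose: Pose; // pose fixed at the venue entrance, looking inward
}

type ViewMode = "street" | "interior";

class NavigationSession {
  mode: ViewMode = "street";
  pose: Pose = { x: 0, y: 0, heading: 0 };

  // Step 102: render whichever view matches the user's current position
  // and directional orientation.
  render(): void {
    console.log(`rendering ${this.mode} view at`, this.pose);
  }

  // Steps 104-108: when the user walks into or clicks a venue point, switch
  // immediately to the interior view, with the pose fixed to the venue
  // entrance so the cut reads as walking in directly from the street.
  engageVenuePoint(point: VenuePoint): void {
    this.mode = "interior";
    this.pose = { ...point.entrancePose };
    this.render();
  }
}

// Usage: approach venue point 24B on the street, then walk into it.
const session = new NavigationSession();
session.render(); // street perspective view (step 102)
session.engageVenuePoint({
  id: "24B",
  entrancePose: { x: 12, y: 48, heading: 90 },
}); // interior perspective view (step 108)
```

Fixing the entrance pose on engagement is what preserves the illusion the description emphasises: the interior view opens exactly where a walker entering from the street would stand.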
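The motion capture node (222) deciphers captured movements into predetermined movements within the virtual map "according to a predetermined set of rules". The sketch below shows one plausible shape for such a rule table; the gesture and action names are assumptions, since the patent does not enumerate them.

```typescript
// Hypothetical rule table for the motion capture node (222): each captured
// gesture maps to one predetermined movement in the virtual map.

type CapturedGesture =
  | "head_turn_left"
  | "head_turn_right"
  | "hand_push_forward"
  | "hand_pull_back";

type MapAction =
  | "rotate_left"
  | "rotate_right"
  | "move_forward"
  | "move_backward";

// The predetermined set of rules translating gestures into movements.
const gestureRules: Record<CapturedGesture, MapAction> = {
  head_turn_left: "rotate_left",
  head_turn_right: "rotate_right",
  hand_push_forward: "move_forward",
  hand_pull_back: "move_backward",
};

function decipherGesture(gesture: CapturedGesture): MapAction {
  return gestureRules[gesture];
}

console.log(decipherGesture("hand_push_forward")); // "move_forward"
```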
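Finally, the retrieval responsibilities of the map node (220) and store node (226) can be sketched as below. The interfaces for the map content server (210), file server (212), and database server (216) are assumptions introduced only to make the data flow concrete; the patent specifies what each server stores, not how it is queried.

```typescript
// Hypothetical sketch of the network-architecture data flow (reference
// numerals 210-226). Interface and method names are assumptions.

interface MapContentServer { // 210: user coordinates and 3D positions
  getUserPose(userId: string): Promise<{ x: number; y: number; z: number; heading: number }>;
}

interface FileServer { // 212: colour and texture data
  getTextures(areaOrVenueId: string): Promise<Uint8Array>;
}

interface DatabaseServer { // 216: metadata and venue items
  getMetadata(area: string): Promise<{ streetNames: string[]; buildingNames: string[] }>;
  getVenueItems(venueId: string): Promise<string[]>;
}

// 220: the map node combines pose, textures, and metadata to render the
// street perspective view on the user display.
async function renderStreetView(
  userId: string,
  area: string,
  mapContent: MapContentServer,
  files: FileServer,
  db: DatabaseServer,
): Promise<void> {
  const [pose, textures, meta] = await Promise.all([
    mapContent.getUserPose(userId),
    files.getTextures(area),
    db.getMetadata(area),
  ]);
  console.log("street view:", pose, `${textures.length} texture bytes`, meta.streetNames);
}

// 226: the store node does the same for the venue interior, additionally
// retrieving the items offered in the venue from the database server.
async function renderInteriorView(
  venueId: string,
  files: FileServer,
  db: DatabaseServer,
): Promise<void> {
  const [textures, items] = await Promise.all([
    files.getTextures(venueId),
    db.getVenueItems(venueId),
  ]);
  console.log("interior view:", `${textures.length} texture bytes`, items);
}
```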

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The invention concerns a computer-based method for a user to navigate between a street perspective view of a virtual map and an interior perspective view of a virtual venue on the map. A street perspective view of a virtual street in an area of the virtual map is displayed on a display seen by the user. The user navigates around the virtual street towards a venue point that corresponds to a virtual venue. The user engages with the venue point by walking into it or clicking on it. The display then changes seamlessly to an interior of the virtual venue, as if the user had walked from an actual street into an actual venue.
PCT/MY2023/050018 2023-03-29 2023-03-29 Integration of a navigable street view and a venue interior view Pending WO2024205389A1

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/MY2023/050018 (WO2024205389A1) 2023-03-29 2023-03-29 Integration of a navigable street view and a venue interior view

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/MY2023/050018 (WO2024205389A1) 2023-03-29 2023-03-29 Integration of a navigable street view and a venue interior view

Publications (1)

Publication Number Publication Date
WO2024205389A1 2024-10-03

Family

ID=92907212

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/MY2023/050018 Pending (WO2024205389A1) 2023-03-29 2023-03-29 Integration of a navigable street view and a venue interior view

Country Status (1)

Country Link
WO (1) WO2024205389A1

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000060440A1 * 1999-04-06 2000-10-12 Vergics Corporation Graph-based visual navigation through spatial environments
US20120127170A1 (en) * 2010-11-24 2012-05-24 Google Inc. Path Planning For Street Level Navigation In A Three-Dimensional Environment, And Applications Thereof
US20120162253A1 (en) * 2012-03-05 2012-06-28 David Collins Systems and methods of integrating virtual flyovers and virtual tours
US20130083055A1 (en) * 2011-09-30 2013-04-04 Apple Inc. 3D Position Tracking for Panoramic Imagery Navigation
US20130321461A1 (en) * 2012-05-29 2013-12-05 Google Inc. Method and System for Navigation to Interior View Imagery from Street Level Imagery
US20140053077A1 (en) * 2011-04-12 2014-02-20 Google Inc. Integrating Maps and Street Views
US20160299661A1 (en) * 2015-04-07 2016-10-13 Geopogo, Inc. Dynamically customized three dimensional geospatial visualization
US20220319108A1 (en) * 2021-03-31 2022-10-06 SY Interiors Pvt. Ltd Methods and systems for provisioning a virtual experience of a building based on user profile data



Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23931076

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE