WO2016089357A1 - Asset localization - Google Patents
Asset localization
- Publication number
- WO2016089357A1 (application PCT/US2014/067962)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- optical sensor
- data
- identifier
- view
- field
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/20—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S1/00—Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith
- G01S1/70—Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith using electromagnetic waves other than radio waves
Definitions
- the subject matter described herein relates to location tracking of assets, such as a person or product within various environments including clinical healthcare settings.
- patients within telemetry care units of hospitals are typically coupled to continuous health monitoring sensors (e.g., electrocardiogram (ECG) sensors, blood oxygen sensors, etc.) for monitoring their health status.
- a patient admitted to a telemetry care unit has the benefit of not being permanently tethered to a wall-mounted patient monitor and hence has the freedom of walking around the care unit. Patients can also wander to other areas within the hospital, such as the cafeteria or gift shop.
- a patient's physiological parameters are monitored such that if the parameters are outside a predetermined healthy range, the sensors generate an alarm. When an alarm occurs, because the patient is mobile and may be at an unknown location, healthcare staff may waste life-saving time searching for the patient.
- data is received including a trigger instruction to scan a field of view of the optical sensor for a data marker containing an identifier and located at a known location.
- An optical sensor scans the field of view to acquire the identifier from the data marker.
- the identifier is provided for determining a current position of the optical sensor using a record associating the identifier contained on the data marker and the known location.
- a method for implementation by an optical sensor in operation with at least one data processor forming part of at least one wearable computing device includes receiving data including a trigger instruction to scan a field of view of the optical sensor for a data marker containing an identifier and located at a known location.
- the trigger instruction is generated periodically, received from a network over a wireless link, or is generated in response to action by a wearer.
- the optical sensor scans the field of view to acquire the identifier from the data marker.
- the field of view of the optical sensor overlaps with a wearer's field of view when the wearable computing device is worn by the wearer.
- the identifier is provided for determining a current position of the optical sensor using a record associating the identifier contained on the data marker and the known location.
- a method for implementation by an optical sensor in operation with at least one data processor forming part of at least one computing system includes receiving data including a trigger instruction to scan a field of view of the optical sensor for a data marker containing an identifier and located at a known location.
- the optical sensor scans the field of view to acquire the identifier from the data marker.
- a current position of the optical sensor is determined.
- the trigger instruction can be generated periodically.
- the trigger instruction can be received from a network.
- the trigger instruction can be generated in response to action by a user.
- the optical sensor and the at least one data processor can form a wearable computing device.
- the field of view of the optical sensor can overlap with a wearer's field of view when the wearable computing device is worn.
- An audio or visual stimulus can be generated for a user to direct the field of view of the optical sensor towards the data marker.
- the data marker can be visible to the optical sensor and invisible to a human eye.
- a plurality of data markers can be located in predetermined positions within a healthcare facility for tracking the optical sensor and the at least one data processor.
- a current position of the optical sensor can be determined using a record associating the identifier contained on the data marker and the known location.
- the identifier can be unique to the known location.
- the data marker can be visible to the optical sensor and invisible to a human eye.
- Computer program products are also described that comprise non-transitory computer readable media storing instructions, which when executed by at least one data processor of one or more computing systems, cause at least one data processor to perform operations herein.
- computer systems are also described that may include one or more data processors and a memory coupled to the one or more data processors.
- the memory may temporarily or permanently store instructions that cause at least one processor to perform one or more of the operations described herein.
- methods can be implemented by one or more data processors either within a single computing system or distributed among two or more computing systems.
- the current subject matter described herein provides many technical advantages.
- the current subject matter enables either fully or partially automatic location tracking of assets, such as a patient, worker, or product, within a healthcare setting.
- the tracking can be accurate and a separate infrastructure may not be required to implement the current subject matter in existing healthcare facilities.
- the current subject matter can use power more efficiently by querying a wearable device to scan a code or other data marker only when triggered to do so.
- the usable battery life of optical sensors can be increased by controlling a rate at which the optical sensor scans its field of view.
- FIG. 1 is a process flow diagram of a method for tracking assets, such as a patient or healthcare worker within a healthcare facility;
- FIG. 2 is a system block diagram illustrating an example asset tracking system including a wearable device with an optical sensor, data markers, and a hospital network with central database;
- FIG. 3 is a data flow diagram illustrating an example data flow of the asset tracking system of FIG. 2; and
- FIG. 4 illustrates the wearable device and its field of view display at different steps of an example tracking process.
- FIG. 1 is a process flow diagram of a method 100 for tracking assets, such as a patient or healthcare worker within a healthcare facility. Data can be received at 110, including a trigger instruction to scan a field of view of an optical sensor for a data marker containing an identifier and located at a known location.
- the optical sensor can include a camera and, in some implementations, can be part of a wearable computing device that includes at least one data processor, such as a GOOGLE GLASS® or EPSON MOVERIO® device in which the field of view of the optical sensor overlaps the field of view of a wearer when the device is worn.
- the data marker can include a sticker with a barcode, such as a matrix barcode or two-dimensional barcode or QR CODE®, although other indicia such as plaintext are possible.
- An identifier, such as an alphanumeric code or binary number, can be encoded within the data marker.
- Multiple data markers, each having their own encoded identifier, can be located around a facility, such as on the walls, ceilings, and floors at known locations.
- the identifier for a given data marker can be unique in that it uniquely identifies the data marker within a given context.
- the identifier may be unique worldwide, within a hospital system, and/or within a clinical care unit.
- the identifier can be unique to the specific known location.
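One hypothetical way to achieve the uniqueness scopes above is to namespace each identifier by facility and unit. The format, function name, and codes below are illustrative assumptions, not taken from the patent:

```python
# Hypothetical identifier scheme illustrating uniqueness scopes.
# All names, formats, and codes here are illustrative, not from the patent.

def make_identifier(hospital: str, unit: str, room: str) -> str:
    """Compose a location identifier that is unique across a hospital system."""
    return f"{hospital}-{unit}-{room}"

# "ICU-312" alone is unique only within one facility; prefixing a facility
# code makes the identifier unique across the hospital system.
print(make_identifier("GH01", "ICU", "312"))  # GH01-ICU-312
```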
- the trigger instruction can be generated periodically.
- a wearable device can generate a trigger instruction according to a predefined schedule (e.g., every five minutes) and/or receive the trigger instruction from a remote network over a wireless connection.
- the trigger instruction can be generated in response to action by a user. For example, a voice command, gesture, or touch input by a user can cause generation of a trigger instruction by the wearable device.
- an audio or visual stimulus can be generated prompting the user or wearer to direct the field of view of the optical sensor towards the nearest data marker. By not continuously scanning the field of view for a data marker, the usable battery life of the wearable device can be extended.
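A minimal sketch of the trigger generation described above, combining a periodic timer with a user action; the five-minute interval matches the example given, but the function and variable names are illustrative assumptions:

```python
# Sketch of trigger generation (illustrative; the patent does not give code).
# A scan trigger fires when a periodic timer expires or when a user action
# (voice command, gesture, or touch) is reported.

SCAN_INTERVAL_S = 300  # e.g., every five minutes

def should_trigger(last_scan: float, now: float, user_action: bool) -> bool:
    """Return True when the wearable device should scan its field of view."""
    timer_expired = (now - last_scan) >= SCAN_INTERVAL_S
    return timer_expired or user_action

print(should_trigger(0.0, 301.0, False))  # True  (timer expired)
print(should_trigger(0.0, 10.0, True))    # True  (user action)
print(should_trigger(0.0, 10.0, False))   # False (nothing to do; saves battery)
```

Scanning only on a trigger, rather than continuously, is what extends the usable battery life noted above.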
- the optical sensor can scan its field of view at 120 to acquire the identifier from the data marker.
- the optical sensor can capture a visual, infrared, or other non-visible light image, process the image to identify the data marker, and extract the identifier using image processing techniques.
- the identifier can be provided at 130 for determining a current position of the optical sensor using a record associating the identifier and the known location.
- the record can include a lookup table associating identifiers with known locations and the record can be created during a record creation phase in which identifiers are associated with locations, such as room numbers, global positioning system coordinates, and the like.
- the identifier can be provided to a remote database, for example, by wirelessly transmitting the identifier over a wireless link to the remote database.
- the identifier can be provided (e.g., stored) on the wearable computing device for locally determining a current position of the optical sensor (e.g., determination is performed by the wearable computing device).
- a current position of the optical sensor can be determined at 140 using the record associating the identifier contained on the data marker and the known location. This current position can be used to track in real time the position of the optical sensor and, by association, the user of the optical sensor.
- the tracked position can be used for a number of purposes. For example, knowledge of current asset positions can be used for finding patients, healthcare workers, medical equipment, and other assets quickly, for example, during an urgent care situation. Additionally, tracking of positions can be used to prompt location-specific functionalities, such as applications that operate based on locational context. For example, a map can be displayed by a wearable device that shows the wearer's current location in real-time allowing the wearer to navigate through the facility.
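The record lookup described above can be sketched as a simple table mapping identifiers to known locations; the identifiers and locations below are illustrative assumptions:

```python
from typing import Optional

# Sketch of the record created during the record-creation phase: a lookup
# table associating each marker identifier with a known location.
# Identifiers and locations are illustrative, not from the patent.
RECORD = {
    "A1B2": "Room 101, Telemetry Care Unit",
    "C3D4": "Cafeteria, Ground Floor",
    "E5F6": "Hallway, 3rd Floor East",
}

def current_position(identifier: str) -> Optional[str]:
    """Resolve a scanned identifier to the known location of its data marker."""
    return RECORD.get(identifier)

print(current_position("C3D4"))  # Cafeteria, Ground Floor
print(current_position("ZZZZ"))  # None (unknown marker)
```

The same lookup works whether the record is the remote record on a central database or a local record on the wearable device.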
- the data marker can include infrared ink so that the data marker is visible to an infrared optical sensor and invisible to a human eye.
- the data marker can include printing or other indicia in ink that absorbs in the infrared spectrum and reflects the visible spectrum of light.
- tracking the optical sensor wearer can be passive and transparent to the user and occur in the background (e.g., the current subject matter may not require explicit input from the user or even knowledge on the part of the user).
- multiple data markers may be within the field of view.
- one of the data markers may be selected as the current position or a current position may be averaged or triangulated based on multiple data markers.
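Averaging a position from multiple visible markers, as mentioned above, might look like the following sketch; the marker coordinates are illustrative assumptions (x, y in metres on a floor plan):

```python
# Sketch of resolving a position when several data markers are in view.
# Coordinates and identifiers are illustrative, not from the patent.
MARKER_COORDS = {
    "A1B2": (2.0, 4.0),
    "C3D4": (6.0, 4.0),
    "E5F6": (4.0, 8.0),
}

def averaged_position(visible_ids):
    """Average the known coordinates of all markers seen in one scan."""
    coords = [MARKER_COORDS[i] for i in visible_ids if i in MARKER_COORDS]
    if not coords:
        return None
    xs, ys = zip(*coords)
    return (sum(xs) / len(coords), sum(ys) / len(coords))

print(averaged_position(["A1B2", "C3D4"]))  # (4.0, 4.0)
```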
- the trigger instruction to scan the field of view can be generated based on a push model (e.g., from a remote network), a pull model, an accelerometer, expiration of a timer (e.g., periodic), and the like.
- the wearable device 205 includes an optical sensor or camera 210, infrared emitter 215, field of view display 217, a microprocessor 220 including at least one data processor, and a wireless communications module 225.
- the wireless communications module 225 can include cellular, WI-FI, Bluetooth, and/or other wireless technology.
- the camera 210 is capable of acquiring images in both the visible and infrared spectrum in a field of view.
- the camera 210 field of view can overlap the field of view of the wearer of the wearable device 205 so that the camera 210 "sees" what the wearer can see.
- the infrared emitter 215 can provide illumination for scanning data markers 230 containing infrared ink.
- Field of view display 217 is an augmented reality display that can be semi-transparent, allowing the wearer to view the display and see through the display.
- the field of view display can display an indicator such as an icon that a data marker has been scanned.
- Field of view display 217 may obscure a subset of the field of view of the wearer.
- the hospital network 235 can include a central database 240 housing a remote record 243 or lookup table associating each identifier with the location that the associated data marker is located.
- the record table can be stored on the wearable device 205, as a local record 245.
- Each identifier (e.g., an alphanumeric code) is associated with the known location of its data marker.
- The following is an example record table (the identifiers and locations shown are illustrative):

| Identifier | Location |
|---|---|
| A1B2 | Room 101, Telemetry Care Unit |
| C3D4 | Cafeteria, Ground Floor |
| E5F6 | Hallway, 3rd Floor East |
- FIG. 3 is a data flow diagram illustrating an example data flow 300 of the asset tracking system 200 of FIG. 2.
- the hospital network 235 transmits a request for position to the wearable device at 305.
- the request for position can include a trigger instruction to scan the field of view of the camera 210 of the wearable device 205.
- the wearable device 205 can receive the request for position using the wireless communications module 225.
- the camera 210 can, in response to the trigger instruction, scan its field of view for a data marker 230 by acquiring an image at 310.
- the acquired image can be processed by the microprocessor 220 to identify the data marker 230 and extract the identifier. If no data marker 230 is identified, scanning of the field of view of the camera 210 can continue on a periodic basis until a data marker 230 is identified.
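The rescan-until-found behavior above can be sketched as follows; `capture_image` and `extract_identifier` are hypothetical stand-ins for the device's camera capture and image-processing steps, and are not named in the patent:

```python
# Sketch of the rescan loop: if no marker is found in a captured image,
# the camera rescans periodically until one is identified.
# capture_image and extract_identifier are hypothetical callables standing
# in for the camera and image-processing pipeline.

def scan_until_found(capture_image, extract_identifier, max_attempts=5):
    """Rescan the field of view until a data marker yields an identifier."""
    for _ in range(max_attempts):
        image = capture_image()
        identifier = extract_identifier(image)
        if identifier is not None:
            return identifier
        # On a real device, sleep here until the next scan period.
    return None

# Simulated camera that finds a marker on the third frame.
frames = iter([None, None, "A1B2"])
print(scan_until_found(lambda: next(frames), lambda img: img))  # A1B2
```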
- the identifier can be transmitted from the wearable device 205 to the hospital network 235.
- Central database 240 on the hospital network 235 can determine the wearable device 205 position at 320 using the remote record 243.
- the wearable device 205 position can be used to track the location of the wearable device 205 wearer.
- the wearable device 205 position can be transmitted from the hospital network 235 to the wearable device 205 to enable location-based functionality.
- FIG. 4 illustrates wearable device 205 at different steps of an example asset tracking process.
- the wearer points wearable device 205, including field of view display 217, towards a data marker 410. Since the field of view of camera 210 overlaps with the wearer's field of view, data marker 410 is within the camera's field of view.
- Data marker 410 may be attached to a wall, ceiling, or hallway, or to an asset such as a medical device. In this case, data marker 410 is a two-dimensional barcode.
- a trigger instruction can be generated in response to a verbal command from the wearer and wearable device 205 can scan the data marker 410. Wearable device 205 can capture the identifier contained within data marker 410 as described above.
- field of view display 217 can display an icon 430 indicating that the identifier from the data marker 410 has been captured.
- a wearer of the wearable device who is a healthcare worker (e.g., doctor, nurse, and the like) can use the current subject matter to track patients without requiring the patient to wear a wearable device.
- if the wearer of the wearable device knows the patient of interest, the wearer can select the patient's name from a drop-down list on the user interface, for example, to track the patient.
- the wearer can then scan a data marker near the patient location (in some implementations, the data marker can be scanned before association with a patient).
- the wearer can speak the patient's name and scan a data marker.
- Such a tracking scheme can also be used for other types of assets.
- an example implementation has been described using a wearable computing device such as a GOOGLE GLASS® or EPSON MOVERIO® device.
- the current subject matter is not limited to wearable devices but can include a mobile computing device, e.g., smartphone.
- the optical sensor and emitter are not limited to infrared wavelengths but can acquire images at any wavelength and/or over any range of wavelengths, with or without an emitter producing light at the corresponding wavelengths.
- an optical sensor can integrate into or attach to any non-person asset for tracking the asset.
- an optical sensor can integrate into a patient monitor for tracking the location of the patient monitor, which may move throughout a healthcare facility with a patient.
- the current subject matter can be used to track other assets as well, such as hospital beds, physiological parameter sensors, and the like.
- the current subject matter is not limited to healthcare settings but can be used in other settings as well.
- the current subject matter can allow firefighters wearing a wearable device and searching a building to track their current location, as well as rooms or areas they have previously visited.
- the wearable device may then provide location-based functionality, such as a localized map. This may be useful in an emergency where the firefighter lacks knowledge regarding the layout of the facility.
- Various implementations of the subject matter described herein may be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
- the subject matter described herein may be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) or E-ink (e.g., low power electronic paper display) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) or gestures and spoken commands, by which the user may provide input to the computer.
- feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
- the subject matter described herein may be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a client computer having a graphical user interface or a Web browser through which a user may interact with an implementation of the subject matter described herein), or any combination of such back-end, middleware, or front-end components.
- the components of the system may be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), and the Internet.
- the computing system may include clients and servers.
- a client and server are generally remote from each other and typically interact through a communication network.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- Radar, Positioning & Navigation (AREA)
- Biomedical Technology (AREA)
- Epidemiology (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Primary Health Care (AREA)
- Public Health (AREA)
- Remote Sensing (AREA)
- Human Computer Interaction (AREA)
- Electromagnetism (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Data is received including a trigger instruction to scan a field of view of an optical sensor for a data marker containing an identifier and located at a known location. An optical sensor scans the field of view to acquire the identifier from the data marker. The identifier is provided for determining a current position of the optical sensor using a record associating the identifier contained on the data marker with the known location. Related apparatus, systems, techniques, and articles are also described.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2014/067962 WO2016089357A1 (fr) | 2014-12-01 | 2014-12-01 | Asset localization |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2014/067962 WO2016089357A1 (fr) | 2014-12-01 | 2014-12-01 | Asset localization |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016089357A1 true WO2016089357A1 (fr) | 2016-06-09 |
Family
ID=52278754
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2014/067962 Ceased WO2016089357A1 (fr) | Asset localization | 2014-12-01 | 2014-12-01 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2016089357A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10489651B2 (en) | 2017-04-14 | 2019-11-26 | Microsoft Technology Licensing, Llc | Identifying a position of a marker in an environment |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080137912A1 (en) * | 2006-12-08 | 2008-06-12 | Electronics And Telecommunications Research Institute | Apparatus and method for recognizing position using camera |
| WO2013032690A2 (fr) * | 2011-08-26 | 2013-03-07 | Qualcomm Incorporated | Génération d'identifiant pour balise visuelle |
| US20130278631A1 (en) * | 2010-02-28 | 2013-10-24 | Osterhout Group, Inc. | 3d positioning of augmented reality information |
| WO2014020547A1 (fr) * | 2012-07-31 | 2014-02-06 | Indoorgo Navigation Systems Ltd. | Procédé et dispositif de navigation |
| EP2732761A1 (fr) * | 2012-11-14 | 2014-05-21 | Hill-Rom Services, Inc. | Système à réalité augmentée dans l'environnement de soins de patient |
| WO2014176054A1 (fr) * | 2013-04-22 | 2014-10-30 | Alcatel Lucent | Systèmes et procédés de localisation |
2014
- 2014-12-01 WO PCT/US2014/067962 patent/WO2016089357A1/fr not_active Ceased
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| Biswas et al. | Indoor navigation support system for patients with neurodegenerative diseases | |
| US9554705B2 (en) | System and device for medical monitoring | |
| KR102478783B1 (ko) | Method and apparatus for providing navigation information | |
| Villarrubia et al. | Monitoring and detection platform to prevent anomalous situations in home care | |
| Antunes et al. | A survey of sensors in healthcare workflow monitoring | |
| US20150379441A1 (en) | Shared asset management system | |
| US20080106374A1 (en) | Patient Room Information System | |
| US20120313775A1 (en) | System and method for rapid location of an alarm condition | |
| Manavi et al. | Review on emerging Internet of Things technologies to fight the COVID-19 | |
| Garcia-Requejo et al. | Activity monitoring and location sensory system for people with mild cognitive impairments | |
| Haider et al. | Automated robotic system for assistance of isolated patients of coronavirus (COVID-19) | |
| Reddy et al. | Smart assistance of elderly individuals in emergency situations at home | |
| Aranda et al. | Collection and analysis of physiological data in smart environments: a systematic mapping | |
| Udgata et al. | Advances in sensor technology and IOT framework to mitigate COVID-19 challenges | |
| US20170249823A1 (en) | System for Tracking Wellness and Scheduling of Caregiving | |
| TWM479469U (zh) | Monitoring system and medical monitoring system | |
| WO2016089357A1 (fr) | Asset localization | |
| Nair et al. | Internet of things in smart and intelligent healthcare systems | |
| Namboodiri et al. | Arduino-based smart walker support for the elderly | |
| Mallat et al. | Assistive Technology for Risks Affecting Elderly People in Outdoor Environment. | |
| KR102576646B1 (ko) | Robot for monitoring patients in a plurality of hospital rooms and operating method thereof | |
| Caporusso et al. | A pervasive solution for risk awareness in the context of fall prevention | |
| TWI656502B (zh) | Personal safety care method, device, and system | |
| Comai et al. | ALMA: An Indoor Localization and Navigation System for the Elderly | |
| Redwan et al. | Smart IoT-Driven Shopping Assistance System for Cognitive Impairment Patients: Ensuring Safe and Independent Shopping Experiences |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14821926; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 14821926; Country of ref document: EP; Kind code of ref document: A1 |